US20220292849A1 - Road Information Detection Method and Apparatus - Google Patents

Road Information Detection Method and Apparatus

Info

Publication number
US20220292849A1
US20220292849A1
Authority
US
United States
Prior art keywords
looking
lane line
information
lane
correspondence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/827,170
Inventor
Wei Zhou
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of US20220292849A1 publication Critical patent/US20220292849A1/en


Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06K9/6288
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • B60W2420/42
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/53Road markings, e.g. lane marker or crosswalk

Definitions

  • Embodiments of this application relate to the information processing field, and in particular, to a road information detection method and a road information detection apparatus.
  • In an assisted driving scenario and an automatic driving scenario, an intelligent vehicle needs to be aware of the driving environment while the vehicle is traveling.
  • Road information detection is an important function of the intelligent vehicle to identify a surrounding environment, and is also an important part of environment perception. Only with effective and accurate road information detection can assistance functions such as path planning, road deviation alarm, or lane keeping be properly supported.
  • Conventionally, a vehicle analyzes an image obtained by a front-looking camera apparatus to obtain lane line information, or analyzes an image obtained by a rear-looking camera apparatus to obtain lane line information, and the intelligent vehicle performs road information detection based on that lane line information.
  • In this case, the image obtained by the front-looking camera apparatus and the image obtained by the rear-looking camera apparatus are mutually redundant backups.
  • Because road information detection is performed directly on lane line information obtained by analyzing the image from a single camera apparatus, such as the front-looking camera apparatus, the reliability of the road information is not high, and the stability of the road information detection result is not high.
  • Embodiments of this application provide a road information detection method, so that a plurality of groups of lane line information may be fused to obtain target lane line information.
  • a first aspect of embodiments of this application provides a road information detection method.
  • a road information detection apparatus receives a plurality of pieces of image information, such as a front-looking image and a rear-looking image, where the front-looking image is obtained by a front-looking camera apparatus, and the rear-looking image is obtained by a rear-looking camera apparatus; the road information detection apparatus obtains front-looking lane line information based on the front-looking image, and obtains rear-looking lane line information based on the rear-looking image; and the road information detection apparatus performs fusion based on the obtained front-looking lane line information and the obtained rear-looking lane line information, to obtain target lane line information. Further optionally, the road information detection apparatus outputs the target lane line information.
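  • Purely as an illustration of this flow (receive the two images, obtain per-camera lane line information, fuse, and optionally output), the following Python sketch may help; the function and parameter names are hypothetical, and the detector and fusion routines are caller-supplied stand-ins, not the algorithms defined in this application:

      def road_information_detection(front_image, rear_image, detect, fuse):
          """Hypothetical sketch of the method flow of the first aspect.

          detect: maps one image to lane line information, e.g. a
              (coefficients, existence probability) pair.
          fuse: combines front- and rear-looking lane line information
              into target lane line information.
          """
          front_info = detect(front_image)      # front-looking lane line information
          rear_info = detect(rear_image)        # rear-looking lane line information
          target = fuse(front_info, rear_info)  # fusion step
          return target  # target lane line information, ready to be output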
  • Images corresponding to a plurality of camera apparatuses such as the front-looking camera apparatus and the rear-looking camera apparatus are separately analyzed to obtain a plurality of groups of lane line information, and the plurality of groups of lane line information are fused to obtain the target lane line information, so that reliability of the target lane line information is relatively high, and stability of a road information detection result is relatively high.
  • the road information detection apparatus may fuse the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, and the probability that the rear-looking lane line exists, to obtain the target lane line information.
  • the target lane line information may be a correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • the road information detection apparatus may alternatively obtain front-looking lane structure information based on the front-looking image and rear-looking lane structure information based on the rear-looking image; in that case, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain the target lane line information.
  • the road information detection apparatus may alternatively obtain the front-looking lane structure information based on the front-looking image, and obtain the rear-looking lane structure information based on the rear-looking image.
  • the target lane line information obtained with reference to the front-looking lane structure information and the rear-looking lane structure information is more stable.
  • the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists
  • the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists
  • the front-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists
  • the rear-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure in the traveling process of the vehicle and a probability that the rear-looking lane structure exists
  • the road information detection apparatus may receive radar data, where the radar data is obtained by a millimeter-wave radar, and obtain radar lane structure information based on the radar data; if the road information detection apparatus receives the radar data and obtains the radar lane structure information by analyzing the radar data, it may fuse the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain the target lane line information.
  • the road information detection apparatus may alternatively receive the radar data, and obtain the radar lane structure information based on the radar data.
  • the target lane line information obtained with reference to the radar lane structure information is more stable.
  • the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists
  • the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists
  • the radar lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists
  • the road information detection apparatus may fuse the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, the probability that the rear-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure, and the probability that the radar lane structure exists, to obtain the target lane line information.
  • the road information detection apparatus may further receive radar data in addition to the front-looking image and the rear-looking image, and the road information detection apparatus obtains the front-looking lane line information and front-looking lane structure information based on the front-looking image, obtains the rear-looking lane line information and rear-looking lane structure information based on the rear-looking image, and obtains radar lane structure information based on the radar data.
  • the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain the target lane line information.
  • the road information detection apparatus may alternatively obtain the front-looking lane structure information based on the front-looking image, and obtain the rear-looking lane structure information based on the rear-looking image.
  • the target lane line information obtained with reference to the front-looking lane structure information and the rear-looking lane structure information is more stable.
  • the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists
  • the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists
  • the front-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists
  • the rear-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure in the traveling process of the vehicle and a probability that the rear-looking lane structure exists
  • specific information included in the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information is provided, and a specific form of the target lane line information is provided, so that this solution is more implementable.
  • the front-looking lane structure information may be front-looking road edge information and/or front-looking vehicle track information; the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge and/or a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track; the probability that the front-looking lane structure exists may be a probability that the front-looking road edge exists and/or a probability that the front-looking vehicle track exists; the rear-looking lane structure information may be rear-looking road edge information; the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge; and the probability that the rear-looking lane structure exists may be a probability that the rear-looking road edge exists.
  • the radar lane structure information may be radar road edge information and/or radar vehicle track information;
  • the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a radar road edge and/or a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track;
  • the probability that the radar lane structure exists may be a probability that the radar road edge exists and/or a probability that the radar vehicle track exists.
  • a second aspect of embodiments of this application provides a road information detection apparatus, and the apparatus performs the method in the first aspect.
  • a third aspect of embodiments of this application provides a road information detection apparatus, and the apparatus performs the method in the first aspect.
  • a fourth aspect of embodiments of this application provides a computer storage medium.
  • the computer storage medium stores instructions. When the instructions are executed on a computer, the computer is enabled to perform the method in the first aspect.
  • a fifth aspect of embodiments of this application provides a computer program product.
  • When the computer program product is run on a computer, the computer is enabled to perform the method in the first aspect.
  • FIG. 1 is a schematic diagram of a framework of a road information detection apparatus
  • FIG. 2 is a schematic diagram of a road structure model according to an embodiment of this application.
  • FIG. 3 is a schematic diagram of an embodiment of a road information detection method according to an embodiment of this application.
  • FIG. 4 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application.
  • FIG. 5 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application.
  • FIG. 6 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application.
  • FIG. 7 is a schematic diagram of a structure of a road information detection apparatus according to an embodiment of this application.
  • FIG. 8 is a schematic diagram of another structure of a road information detection apparatus according to an embodiment of this application.
  • Embodiments of this application provide a road information detection method and a road information detection apparatus, to obtain more reliable lane line information.
  • a framework of a road information detection system in an embodiment of this application includes: a front-looking camera apparatus 101, a rear-looking camera apparatus 102, a radar 103, and a road information detection apparatus 104.
  • the radar may be a millimeter-wave radar or a laser radar.
  • the road information detection apparatus 104 is applied to an intelligent vehicle, and sensors of the intelligent vehicle include the front-looking camera apparatus 101 and the rear-looking camera apparatus 102.
  • the road information detection apparatus 104 is separately connected to the front-looking camera apparatus 101 and the rear-looking camera apparatus 102.
  • the front-looking camera apparatus 101 photographs road information to obtain a front-looking image, and transmits the front-looking image to the road information detection apparatus 104.
  • the rear-looking camera apparatus 102 photographs road information to obtain a rear-looking image, and transmits the rear-looking image to the road information detection apparatus 104.
  • the sensors of the intelligent vehicle may further include another sensor such as the radar 103.
  • the road information detection apparatus 104 is connected to the radar 103, and the radar 103 detects road information to obtain radar data and transmits the radar data to the road information detection apparatus 104.
  • a main function of the road information detection apparatus 104 is to receive the data transmitted by the sensors, separately analyze and process the different sensor data to obtain corresponding road information, fuse the obtained plurality of pieces of road information to obtain target lane line information, and output the target lane line information.
  • the road information detection apparatus 104 receives the front-looking image sent by the front-looking camera apparatus 101, and the road information obtained by analyzing the front-looking image may include front-looking lane line information and/or front-looking lane structure information.
  • the front-looking lane structure information may include information such as front-looking road edge information and/or front-looking vehicle track information.
  • the road information detection apparatus 104 receives the rear-looking image sent by the rear-looking camera apparatus 102, and the road information obtained by analyzing the rear-looking image may include rear-looking lane line information and/or rear-looking lane structure information.
  • the rear-looking lane structure information may include but is not limited to rear-looking road edge information.
  • the road information detection apparatus 104 receives the radar data sent by the radar 103, and the road information obtained by analyzing the radar data may include radar lane structure information.
  • the radar lane structure information may include information such as radar road edge information and/or radar vehicle track information.
  • the road information detection apparatus 104 analyzes, by using a technology such as an image detection technology or a visual recognition technology, the front-looking image photographed by the front-looking camera apparatus 101 and the rear-looking image photographed by the rear-looking camera apparatus 102, to obtain road information. The analysis technology is not limited herein.
  • the road information detection apparatus 104 analyzes, by using a method such as a clustering algorithm or a Hough transform, the radar data detected by the millimeter-wave radar, to obtain road information. The analysis method is not limited herein.
  • a road information detection apparatus may obtain a road structure model based on at least one of road information parameters such as an average curvature change rate of a road edge, an average curvature of a road edge, a heading angle of a road edge, and a horizontal offset of a road edge.
  • the road structure model is generally represented by a clothoid.
  • a correspondence between a horizontal offset and a longitudinal offset of the road structure model may be represented as:
  • X = (A3/6)·l³ + (A2/2)·l² + A1·l + A0,
  • that is, the horizontal offset X of the vehicle after it travels an arc length l is the sum of the third power of l multiplied by one sixth of A3, the second power of l multiplied by half of A2, l multiplied by A1, and A0, where A0 is a horizontal offset of a road edge, A1 is a heading angle of the road edge, A2 is average curvature of the road edge, and A3 is an average curvature change rate of the road edge.
  • a heading angle of the vehicle needs to be less than 10 degrees. Therefore, a sine value of the heading angle of the vehicle is approximately the heading angle itself (in radians), and a cosine value is approximately 1 (for example, sin 10° ≈ 0.1736, while 10° ≈ 0.1745 rad).
  • the heading angle θ(l) of the vehicle is an integral of the curvature over the arc length l; for a clothoid the curvature varies linearly with arc length, κ(s) = A2 + A3·s, so that:
  • θ(l) = A1 + ∫₀ˡ κ(s) ds = A1 + A2·l + (A3/2)·l²
  • a rectangular coordinate system is established by using a head direction of the vehicle as a positive direction of an x-axis, and the vehicle is located at (x0, y0) at an initial moment.
  • after the vehicle travels the arc length l, the following may be obtained:
  • x(l) = x0 + ∫₀ˡ cos θ(s) ds, y(l) = y0 + ∫₀ˡ sin θ(s) ds
  • curve coordinates that meet the assumption that the heading angle θ is less than 10 degrees may be represented as follows:
  • x(l) ≈ x0 + l, y(l) ≈ y0 + A1·l + (A2/2)·l² + (A3/6)·l³,
  • so that the curve may be written as y = C3·x³ + C2·x² + C1·x + C0, where:
  • C0 is the horizontal offset of the road edge
  • C1 is the heading angle of the road edge
  • C2 is half the average curvature of the road edge
  • C3 is one sixth of the average curvature change rate of the road edge.
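  • As a minimal runnable sketch (illustrative only, not part of this application), the cubic road structure model above can be evaluated as follows; road_model_coefficients and lateral_offset are hypothetical names, and the mapping assumes the coefficient relations just given (C2 = A2/2, C3 = A3/6):

      def road_model_coefficients(a0, a1, a2, a3):
          """Map clothoid parameters A0..A3 (offset, heading angle, average
          curvature, curvature change rate) to cubic coefficients C0..C3."""
          return (a0, a1, a2 / 2.0, a3 / 6.0)  # C0 = A0, C1 = A1, C2 = A2/2, C3 = A3/6

      def lateral_offset(z, coeffs):
          """Evaluate x = C3*z^3 + C2*z^2 + C1*z + C0 at longitudinal offset z."""
          c0, c1, c2, c3 = coeffs
          return c3 * z ** 3 + c2 * z ** 2 + c1 * z + c0

      # Example: slight heading with a gentle curve, evaluated 20 m ahead.
      coeffs = road_model_coefficients(a0=0.0, a1=0.01, a2=1e-4, a3=1e-6)
      print(lateral_offset(20.0, coeffs))  # ~0.22 m lateral offset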
  • the road information detection apparatus may fuse a plurality of pieces of road information to obtain target lane line information.
  • the road information detection apparatus fuses front-looking lane line information and rear-looking lane line information, to obtain the target lane line information.
  • an embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • a road information detection apparatus receives a front-looking image.
  • the road information detection apparatus is connected to a front-looking camera apparatus.
  • the connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, a front-looking image photographed by the front-looking camera apparatus. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, a front-looking image photographed by the front-looking camera apparatus.
  • the wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • the road information detection apparatus receives a rear-looking image.
  • the road information detection apparatus is connected to a rear-looking camera apparatus.
  • the connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, a rear-looking image photographed by the rear-looking camera apparatus. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, a rear-looking image photographed by the rear-looking camera apparatus.
  • the wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • the road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • the road information detection apparatus analyzes the front-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the front-looking lane line information.
  • the road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • the road information detection apparatus analyzes the rear-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the rear-looking lane line information.
  • a process in which the road information detection apparatus obtains the front-looking lane line information by using the front-looking image is described in steps 301 and 303
  • a process in which the road information detection apparatus obtains the rear-looking lane line information by using the rear-looking image is described in steps 302 and 304 .
  • the road information detection apparatus fuses the front-looking lane line information and the rear-looking lane line information, to obtain target lane line information.
  • the road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus, such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line by using the front-looking image.
  • the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line.
  • the rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • the front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line and the probability that the front-looking lane line exists
  • the rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line and the probability that the rear-looking lane line exists.
  • the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:
  • X_front,lane = C3_front,lane·Z³ + C2_front,lane·Z² + C1_front,lane·Z + C0_front,lane,
  • C0_front,lane is a horizontal offset of a road edge of the front-looking lane line
  • C1_front,lane is a heading angle of the road edge of the front-looking lane line
  • C2_front,lane is half the average curvature of the road edge of the front-looking lane line
  • C3_front,lane is one sixth of the average curvature change rate of the road edge of the front-looking lane line
  • X_front,lane is a horizontal offset of the vehicle on the front-looking lane line
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking lane line exists may be further obtained through detection, and is represented as P_front,lane.
  • the probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
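  • For concreteness, one possible container (purely illustrative; LaneLineInfo is a hypothetical name) for a detected lane line's correspondence and existence probability, matching the representation above:

      from dataclasses import dataclass

      @dataclass
      class LaneLineInfo:
          """Hypothetical container for one camera's lane line information."""
          c0: float  # horizontal offset of the road edge
          c1: float  # heading angle of the road edge
          c2: float  # half the average curvature of the road edge
          c3: float  # one sixth of the average curvature change rate
          p: float   # existence probability, e.g. P_front,lane

          def x(self, z: float) -> float:
              """The correspondence: lateral offset at longitudinal offset z."""
              return self.c3 * z ** 3 + self.c2 * z ** 2 + self.c1 * z + self.c0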
  • the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:
  • X_rear,lane = C3_rear,lane·Z³ + C2_rear,lane·Z² + C1_rear,lane·Z + C0_rear,lane,
  • C0_rear,lane is a horizontal offset of a road edge of the rear-looking lane line
  • C1_rear,lane is a heading angle of the road edge of the rear-looking lane line
  • C2_rear,lane is half the average curvature of the road edge of the rear-looking lane line
  • C3_rear,lane is one sixth of the average curvature change rate of the road edge of the rear-looking lane line
  • X_rear,lane is a horizontal offset of the vehicle on the rear-looking lane line
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as P_rear,lane.
  • the probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • The foregoing correspondence is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • the road information detection apparatus may fuse the front-looking lane line information and the rear-looking lane line information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information.
  • the fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm.
  • a probability-based fusion algorithm is used as an example for description.
  • a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:
  • X_fusion = C3_fusion·Z³ + C2_fusion·Z² + C1_fusion·Z + C0_fusion,
  • X_fusion is a horizontal offset of the vehicle obtained after fusion
  • Z is a longitudinal offset of the vehicle.
  • C0_fusion is a horizontal offset of a road edge of the lane line obtained after fusion
  • C1_fusion is a heading angle of the road edge of the lane line obtained after fusion
  • C2_fusion is half the average curvature of the road edge of the lane line obtained after fusion
  • C3_fusion is one sixth of the average curvature change rate of the road edge of the lane line obtained after fusion.
  • the road information detection apparatus may obtain (C3_fusion, C2_fusion, C1_fusion, C0_fusion) and P_fusion based on (C3_front,lane, C2_front,lane, C1_front,lane, C0_front,lane) and P_front,lane, and (C3_rear,lane, C2_rear,lane, C1_rear,lane, C0_rear,lane) and P_rear,lane.
  • an example fusion manner is as follows.
  • a threshold may be first set. If the probability of road information collected by a specific collector is less than the threshold, that road information is considered untrusted. For example, if P_front,lane is less than the threshold, C3_front,lane, C2_front,lane, C1_front,lane, and C0_front,lane are considered untrusted; only road information whose probability is greater than or equal to the threshold is considered trusted.
  • the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • all probabilities may be compared with a same threshold, or different thresholds may be used.
  • the probability P_front,lane that the front-looking lane line exists is compared with a first threshold
  • and the probability P_rear,lane that the rear-looking lane line exists is compared with a second threshold.
  • the first threshold and the second threshold are unequal.
  • the first threshold is 0.6
  • the second threshold is 0.7.
  • X_fusion/P_fusion = X_front,lane/P_front,lane + X_rear,lane/P_rear,lane,
  • where X_fusion is the horizontal offset obtained after fusion.
  • An analogous relationship is used to obtain each fused coefficient Ci_fusion, where i is an integer from 0 to 3.
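  • Because the fusion rule is only outlined above, the following sketch is one hedged reading of it: sources whose existence probability is below the threshold are discarded as untrusted, and the remaining coefficient vectors are combined by a probability-weighted average. The weighting and the choice of fused probability are assumptions of this sketch, not a formula mandated by this application:

      def fuse_lane_lines(sources, threshold=0.5):
          """Fuse [(coeffs, probability), ...] pairs, e.g. front- and
          rear-looking lane line information; coeffs is (C3, C2, C1, C0)."""
          trusted = [(c, p) for c, p in sources if p >= threshold]
          if not trusted:
              return None  # no trusted lane line information available
          total = sum(p for _, p in trusted)
          fused = tuple(
              sum(c[i] * p for c, p in trusted) / total  # weighted mean of each Ci
              for i in range(4)
          )
          p_fused = max(p for _, p in trusted)  # assumed fused probability
          return fused, p_fused

      # Example: front- and rear-looking detections as ((C3, C2, C1, C0), P).
      front = ((1e-6, 1e-4, 0.010, 0.00), 0.9)
      rear = ((2e-6, 1.2e-4, 0.012, 0.05), 0.8)
      print(fuse_lane_lines([front, rear]))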
  • the road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: the road information detection apparatus outputs the correspondence, obtained in step 305, between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle.
  • An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • the road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, front-looking lane structure information, and rear-looking lane structure information, to obtain the target lane line information.
  • As shown in FIG. 4, another embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • a road information detection apparatus receives a front-looking image.
  • the road information detection apparatus receives a rear-looking image.
  • the road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • the road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 401 to 404 in this embodiment are similar to steps 301 to 304 in the foregoing embodiment shown in FIG. 3 , and details are not described herein again.
  • the road information detection apparatus obtains front-looking lane structure information based on the front-looking image.
  • the road information detection apparatus analyzes the front-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the front-looking lane structure information.
  • the road information detection apparatus obtains rear-looking lane structure information based on the rear-looking image.
  • the road information detection apparatus analyzes the rear-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the rear-looking lane structure information.
  • the road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain target lane line information.
  • in other words, the road information detection apparatus fuses front-looking information, such as the front-looking lane line information and the front-looking lane structure information, with rear-looking information, such as the rear-looking lane line information and the rear-looking lane structure information.
  • the road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus, such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line by using the front-looking image.
  • the road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the image obtained by the camera apparatus.
  • vehicle track information is track information of an observed vehicle that is obtained by using images, for example, a traveling track, derived from a plurality of images, of a vehicle (the observed vehicle) that can be photographed by a camera. Traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, whether the observed vehicle is parked, traveling forward, or turning.
  • a road edge is generally a curbstone at the boundary of the road, and the road edge is generally parallel to the lane line.
  • the road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain other road edge information, such as a type of the road edge (for example, a stone road edge or a fence) or a height of the road edge.
  • the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line.
  • the rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • the front-looking lane structure information may include at least one of front-looking road edge information and/or front-looking vehicle track information in the traveling process of the vehicle.
  • the front-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge in the traveling process of the vehicle, a probability that the front-looking road edge exists, a type of the front-looking road edge, and/or a height of the front-looking road edge.
  • the front-looking vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle, a probability that the front-looking vehicle track exists, and/or a status of a front-looking observed vehicle.
  • the rear-looking lane structure information may include rear-looking road edge information.
  • the rear-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge, a probability that the rear-looking road edge exists, a type of the rear-looking road edge, and/or a height of the rear-looking road edge.
  • the front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists
  • the rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists
  • the front-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track are/is used as an example of the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure, and the probability that the front-looking road edge exists and/or the probability that the front-looking vehicle track exists are/is used as an example of the probability that the front-looking lane structure exists.
  • the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:
  • X_front,lane = C3_front,lane·Z³ + C2_front,lane·Z² + C1_front,lane·Z + C0_front,lane,
  • C0_front,lane is a horizontal offset of a road edge of the front-looking lane line
  • C1_front,lane is a heading angle of the road edge of the front-looking lane line
  • C2_front,lane is half the average curvature of the road edge of the front-looking lane line
  • C3_front,lane is one sixth of the average curvature change rate of the road edge of the front-looking lane line
  • X_front,lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking lane line exists may be further obtained through detection, and is represented as P_front,lane.
  • the probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:
  • X_rear,lane = C3_rear,lane·Z³ + C2_rear,lane·Z² + C1_rear,lane·Z + C0_rear,lane,
  • C0_rear,lane is a horizontal offset of a road edge of the rear-looking lane line
  • C1_rear,lane is a heading angle of the road edge of the rear-looking lane line
  • C2_rear,lane is half the average curvature of the road edge of the rear-looking lane line
  • C3_rear,lane is one sixth of the average curvature change rate of the road edge of the rear-looking lane line
  • X_rear,lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as P_rear,lane.
  • the probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • the front-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained.
  • the road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge may be represented as follows:
  • X_front,edge = C3_front,edge·Z³ + C2_front,edge·Z² + C1_front,edge·Z + C0_front,edge,
  • C0_front,edge is a horizontal offset of the road edge related to the front-looking road edge
  • C1_front,edge is a heading angle of the road edge related to the front-looking road edge
  • C2_front,edge is half the average curvature of the road edge related to the front-looking road edge
  • C3_front,edge is one sixth of the average curvature change rate of the road edge related to the front-looking road edge
  • X_front,edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking road edge exists may be further obtained through detection, and is represented as P_front,edge.
  • the probability that the front-looking road edge exists is used to indicate a probability that the road edge exists in the front-looking image.
  • a front-looking observed vehicle is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding vehicle track information is obtained.
  • the vehicle track may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • a correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an i-th observed vehicle in vehicle tracks collected by the front-looking camera apparatus in the traveling process of the vehicle may be represented as follows:
  • X_front,car(i) = C3_front,car(i)·Z³ + C2_front,car(i)·Z² + C1_front,car(i)·Z + C0_front,car(i),
  • C0_front,car(i) is a horizontal offset of a road edge related to the vehicle track of the i-th observed vehicle
  • C1_front,car(i) is a heading angle of the road edge related to the vehicle track of the i-th observed vehicle
  • C2_front,car(i) is half the average curvature of the road edge related to the vehicle track of the i-th observed vehicle
  • C3_front,car(i) is one sixth of the average curvature change rate of the road edge related to the vehicle track of the i-th observed vehicle
  • X_front,car(i) is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • a probability that the vehicle track of the i-th observed vehicle exists may be represented as P_front,car(i).
  • vehicle track information of each observed vehicle may be processed to obtain the front-looking vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle may be represented as follows:
  • X_front,car = C3_front,car·Z³ + C2_front,car·Z² + C1_front,car·Z + C0_front,car,
  • C0_front,car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle
  • C1_front,car is a heading angle of the road edge related to the vehicle track of the observed vehicle
  • C2_front,car is half the average curvature of the road edge related to the vehicle track of the observed vehicle
  • C3_front,car is one sixth of the average curvature change rate of the road edge related to the vehicle track of the observed vehicle
  • X_front,car is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • An example implementation is as follows: vehicle track information corresponding to the observed vehicle whose vehicle track has the highest existence probability is selected as the front-looking vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the front-looking vehicle track information. This is not limited herein (both options are sketched below).
  • a probability that the front-looking vehicle track exists may be represented as P front, car , and the probability that the front-looking vehicle track exists is used to indicate a probability that the vehicle track exists in the front-looking image.
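As a concrete illustration of the representation above, the following Python sketch evaluates the cubic road-structure model X = C3 · Z^3 + C2 · Z^2 + C1 · Z + C0 for a single track and combines the tracks of several observed vehicles either by selecting the most probable track or by probability-weighted averaging, the two strategies just described. The class name, the data layout, and the choice to carry forward the highest single-track probability are illustrative assumptions, not part of this application.

```python
# Minimal sketch (illustrative assumptions, not the application's prescribed
# implementation) of the cubic road-structure model and the two combination
# strategies described above.
from dataclasses import dataclass
from typing import List

@dataclass
class TrackCurve:
    c3: float  # six times the average curvature change rate
    c2: float  # twice the average curvature
    c1: float  # heading angle of the related road edge
    c0: float  # horizontal offset of the related road edge
    p: float   # probability that this vehicle track exists

def lateral_offset(t: TrackCurve, z: float) -> float:
    """Horizontal offset X at longitudinal offset Z (Horner's rule)."""
    return ((t.c3 * z + t.c2) * z + t.c1) * z + t.c0

def combine_tracks(tracks: List[TrackCurve], strategy: str = "weighted") -> TrackCurve:
    """Select the most probable track, or probability-weight all tracks."""
    if strategy == "best":
        return max(tracks, key=lambda t: t.p)
    total = sum(t.p for t in tracks)
    return TrackCurve(
        c3=sum(t.p * t.c3 for t in tracks) / total,
        c2=sum(t.p * t.c2 for t in tracks) / total,
        c1=sum(t.p * t.c1 for t in tracks) / total,
        c0=sum(t.p * t.c0 for t in tracks) / total,
        p=max(t.p for t in tracks),  # assumption: keep the best single-track probability
    )
```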
  • the rear-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained.
  • the road edge may be represented by using a curve of a specific road structure model.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge may be represented as follows:
  • X rear,edge = C3 rear,edge · Z^3 + C2 rear,edge · Z^2 + C1 rear,edge · Z + C0 rear,edge.
  • C0 rear,edge is a horizontal offset of a road edge related to the rear-looking road edge
  • C1 rear,edge is a heading angle of the road edge related to the rear-looking road edge
  • C2 rear,edge is twice average curvature of the road edge related to the rear-looking road edge
  • C3 rear,edge is six times an average curvature change rate of the road edge related to the rear-looking road edge
  • X rear,edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking road edge exists may be further obtained through detection, and is represented as P rear,edge .
  • the probability that the rear-looking road edge exists is used to indicate a probability that the road edge exists in the rear-looking image.
  • the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information.
  • the fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm.
  • a probability-based fusion algorithm is used as an example for a description.
  • a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:
  • X fusion = C3 fusion · Z^3 + C2 fusion · Z^2 + C1 fusion · Z + C0 fusion,
  • X is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • C0 fusion is a horizontal offset of a road edge of the lane line obtained after fusion
  • C1 fusion is a heading angle of the road edge of the lane line obtained after fusion
  • C2 fusion is twice average curvature of the road edge of the lane line obtained after fusion
  • C3 fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • the road information detection apparatus obtains (C3 fusion , C2 fusion , C1 fusion , C0 fusion ), P fusion based on (C3 front, lane , C2 front, lane , C1 front, lane , C0 front, lane ), P front, lane ; (C3 rear, lane , C2 rear, lane , C1 rear, lane , C0 rear, lane ), P rear, lane ; (C3 front, edge , C2 front, edge , C1 front, edge , C0 front, edge ), P front, edge ; (C3 front, car , C2 front, car , C1 front, car , C0 front, car ), P front, car ; and (C3 rear, edge , C2 rear, edge , C1 rear, edge , C0 rear, edge ), P rear, edge .
  • an example manner is as follows.
  • a threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if P front, lane is less than the threshold, C3 front, lane , C2 front, lane , C1 front, lane , and C0 front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted.
  • the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • all probabilities may be compared with a same threshold, or different thresholds may be used.
  • the probability P front, lane that the front-looking lane line information exists is compared with a first threshold
  • the probability P rear, lane that the rear-looking lane line information exists is compared with a second threshold.
  • the first threshold and the second threshold are unequal.
  • the first threshold is 0.6
  • the second threshold is 0.7.
  • If a probability of only one piece of collected road information is greater than or equal to the threshold, it indicates that the road information corresponding to that probability is trusted and all other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold is selected from all the foregoing road detection information, and the following example method is used.
  • X fusion · P fusion = X front, lane · P front, lane + X rear, lane · P rear, lane + X front, edge · P front, edge + X front, car · P front, car + X rear, edge · P rear, edge , where X fusion is a horizontal offset obtained after fusion.
  • The same method is used to obtain Ci fusion , where i is an integer from 0 to 3.
  • the threshold may not be set in actual application.
  • the road information detection apparatus determines that all parameters are valid parameters.
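The gating and weighting just described can be sketched as follows. Two reconstruction assumptions are made explicit here: the fusion relation X fusion · P fusion = Σ X_i · P_i is read as a probability-weighted average with P fusion equal to the sum of the trusted probabilities, and the per-source threshold defaults to the example value 0.5. The function name fuse_sources and the data layout are hypothetical.

```python
# Minimal sketch of the probability-based fusion described above; the weighting
# rule and the meaning of P_fusion are stated assumptions, not the application's
# normative algorithm.
from typing import Dict, Tuple

Coeffs = Tuple[float, float, float, float]  # (C3, C2, C1, C0)

def fuse_sources(sources: Dict[str, Tuple[Coeffs, float]],
                 thresholds: Dict[str, float]) -> Tuple[Coeffs, float]:
    """sources maps a name such as 'front_lane' to ((C3, C2, C1, C0), P)."""
    # Gate: a source whose existence probability is below its threshold is untrusted.
    trusted = [(c, p) for name, (c, p) in sources.items()
               if p >= thresholds.get(name, 0.5)]
    if not trusted:
        raise ValueError("no trusted road information in this cycle")
    # Assumption: P_fusion is the sum of the trusted probabilities, so each fused
    # coefficient is the probability-weighted average of the trusted coefficients.
    p_fusion = sum(p for _, p in trusted)
    fused = tuple(sum(p * c[i] for c, p in trusted) / p_fusion for i in range(4))
    return fused, p_fusion

# Example with unequal thresholds (0.6 front-looking, 0.7 rear-looking, as above).
fused, p_total = fuse_sources(
    {"front_lane": ((1e-5, 2e-4, 0.01, 1.8), 0.9),
     "rear_lane": ((1e-5, 2e-4, 0.02, 1.7), 0.4)},  # 0.4 < 0.7: untrusted
    {"front_lane": 0.6, "rear_lane": 0.7},
)
```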
  • the road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 407 , between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle.
  • An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • the road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, and radar lane structure information, to obtain the target lane line information.
  • Referring to FIG. 5 , another embodiment of a road information detection method in an embodiment of this application includes the following steps.
  • a road information detection apparatus receives a front-looking image.
  • the road information detection apparatus receives a rear-looking image.
  • Steps 501 and 502 in this embodiment are similar to steps 301 and 302 in the foregoing embodiment shown in FIG. 3 , and details are not described herein again.
  • the road information detection apparatus receives radar data.
  • the road information detection apparatus is connected to a millimeter-wave radar.
  • the connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, radar data detected by the millimeter-wave radar. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, radar data detected by the millimeter-wave radar.
  • the wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • the road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • the road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 504 and 505 in this embodiment are similar to steps 303 and 304 in the foregoing embodiment shown in FIG. 3 , and details are not described herein again.
  • the road information detection apparatus obtains radar lane structure information based on the radar data.
  • the road information detection apparatus obtains the radar lane structure information based on the radar data by using a method such as a clustering algorithm, a graph model method, Hough transform, or machine learning.
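As one hedged illustration of how the radar lane structure information might be obtained, the sketch below fits the cubic road-structure model to radar detections that have already been attributed to a road edge, for example by one of the clustering methods mentioned above. The function name, the synthetic data, and the use of ordinary least squares via np.polyfit are assumptions for illustration only.

```python
# Illustrative sketch: least-squares fit of the cubic road-structure model
# X = C3*Z^3 + C2*Z^2 + C1*Z + C0 to radar road-edge points. The clustering
# step that attributes detections to the edge is abstracted away.
import numpy as np

def fit_road_edge(z: np.ndarray, x: np.ndarray):
    """z: longitudinal offsets; x: horizontal offsets of radar edge points."""
    # np.polyfit returns the highest-order coefficient first: (C3, C2, C1, C0).
    c3, c2, c1, c0 = np.polyfit(z, x, deg=3)
    return c3, c2, c1, c0

# Synthetic edge points along a gentle curve, with measurement noise.
z = np.linspace(0.0, 80.0, 40)
x = 2.0 + 0.01 * z + 1e-4 * z**2 + np.random.normal(0.0, 0.05, z.size)
print(fit_road_edge(z, x))
```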
  • There is no fixed time sequence relationship between steps 501 and 504, steps 502 and 505, or steps 503 and 506.
  • the road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain target lane line information.
  • front-looking information such as the front-looking lane line information
  • rear-looking information such as the rear-looking lane line information
  • the road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus, such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line by using the front-looking image.
  • the road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the radar data obtained by the millimeter-wave radar.
  • the vehicle track information is track information of an observed vehicle that is obtained by using the radar data, for example, a traveling track of the observed vehicle that can be derived from signals obtained from radar data collected by the millimeter-wave radar a plurality of times.
  • Traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, the observed vehicle is in a state of parking, traveling forward, or turning.
  • a road edge is generally a boundary stone at the edge of a road. Generally, the road edge is parallel to the lane line.
  • the road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain road edge information such as a type of the road edge such as a stone road edge or a fence, or a height of the road edge.
  • the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line.
  • the rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • the radar lane structure information may include at least one of radar vehicle track information and/or radar road edge information in the traveling process of the vehicle
  • the radar vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle, a probability that the radar vehicle track exists, and/or a status of a radar observed vehicle.
  • the radar road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar road edge in the traveling process of the vehicle, a probability that the radar road edge exists, a type of the radar road edge, and/or a height of the radar road edge.
  • the front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists
  • the rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists
  • the radar lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the radar road edge and/or the correspondence between the horizontal offset and the longitudinal offset of radar vehicle track are/is used as an example of the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure, and the probability that the radar road edge exists and/or the probability that the radar vehicle track exists are/is used as the probability that the radar lane structure exists
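For concreteness, the sketch below shows one possible data layout for the information items enumerated above. All field names and types are hypothetical; the application does not prescribe any particular structure.

```python
# Hypothetical containers for lane line information and radar lane structure
# information; every field mirrors an item listed in the description above.
from dataclasses import dataclass
from typing import Optional, Tuple

Coeffs = Tuple[float, float, float, float]  # (C3, C2, C1, C0) of the cubic model

@dataclass
class LaneLineInfo:
    """Front- or rear-looking lane line information."""
    curve: Coeffs                        # horizontal/longitudinal offset correspondence
    probability: float                   # probability that the lane line exists
    lane_width: Optional[float] = None
    line_type: Optional[str] = None      # e.g. "one-way line", "deceleration line"
    color: Optional[str] = None          # e.g. "yellow", "white"
    line_width: Optional[float] = None

@dataclass
class RadarLaneStructureInfo:
    """Radar vehicle track information and/or radar road edge information."""
    track_curve: Optional[Coeffs] = None
    track_probability: Optional[float] = None
    vehicle_status: Optional[str] = None  # e.g. "parking", "traveling forward", "turning"
    edge_curve: Optional[Coeffs] = None
    edge_probability: Optional[float] = None
    edge_type: Optional[str] = None       # e.g. "stone road edge", "fence"
    edge_height: Optional[float] = None
```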
  • the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:
  • X front,lane = C3 front,lane · Z^3 + C2 front,lane · Z^2 + C1 front,lane · Z + C0 front,lane.
  • C0 front, lane is a horizontal offset of a road edge of the front-looking lane line
  • C1 front, lane is a heading angle of the road edge of the front-looking lane line
  • C2 front, lane is twice average curvature of the road edge of the front-looking lane line
  • C3 front, lane is six times an average curvature change rate of the road edge of the front-looking lane line
  • X front, lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking lane line exists may be further obtained through detection, and is represented as P front, lane .
  • the probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:
  • X rear,lane = C3 rear,lane · Z^3 + C2 rear,lane · Z^2 + C1 rear,lane · Z + C0 rear,lane.
  • C0 rear, lane is a horizontal offset of a road edge of the rear-looking lane line
  • C1 rear lane is a heading angle of the road edge of the rear-looking lane line
  • C2 rear lane is twice average curvature of the road edge of the rear-looking lane line
  • C3 rear lane is six times an average curvature change rate of the road edge of the rear-looking lane line
  • X rear, lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as P rear, lane .
  • the probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • the radar information is detected based on a road structure model by using a method such as deep learning, and corresponding road edge information is obtained.
  • the road edge may be represented by using a curve of a specific road structure model.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the radar road edge may be represented as follows:
  • X radar,edge = C3 radar,edge · Z^3 + C2 radar,edge · Z^2 + C1 radar,edge · Z + C0 radar,edge.
  • C0 radar edge is a horizontal offset of a road edge obtained based on the radar road edge information
  • C1 radar edge is a heading angle of the road edge obtained based on the radar road edge information
  • C2 radar edge is twice average curvature of the road edge obtained based on the radar road edge information
  • C3 radar edge is six times an average curvature change rate of the road edge obtained based on the radar road edge information
  • X radar edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the radar road edge exists may be further obtained through detection, and is represented as P radar, edge .
  • the probability that the radar road edge exists is used to indicate a probability that a road edge detected by the millimeter-wave radar exists.
  • track information of one or more observed vehicles that is collected by the millimeter-wave radar may be analyzed based on a road structure model by using a method such as deep learning, and a vehicle track may be represented by using a curve of a specific road structure model.
  • Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • a correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an i th observed vehicle in vehicle tracks collected by the millimeter-wave radar in the traveling process of the vehicle may be represented as follows:
  • X radar,car(i) = C3 radar,car(i) · Z^3 + C2 radar,car(i) · Z^2 + C1 radar,car(i) · Z + C0 radar,car(i).
  • C0 radar, car(i) is a horizontal offset of a road edge related to the vehicle track of the i th observed vehicle
  • C1 radar, car(i) is a heading angle of the road edge related to the vehicle track of the i th observed vehicle
  • C2 radar, car(i) is twice average curvature of the road edge related to the vehicle track of the i th observed vehicle
  • C3 radar, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the i th observed vehicle
  • X is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • a probability that a vehicle track of a target vehicle exists may be represented as P radar, car(i) .
  • vehicle track information of each target vehicle may be processed to obtain radar vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle may be represented as follows:
  • X radar,car = C3 radar,car · Z^3 + C2 radar,car · Z^2 + C1 radar,car · Z + C0 radar,car.
  • C0 radar car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle
  • C1 radar car is a heading angle of the road edge related to the vehicle track of the observed vehicle
  • C2 radar car is twice average curvature of the road edge related to the vehicle track of the observed vehicle
  • C3 radar car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle
  • X radar car is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • An implementation may be as follows. Radar vehicle track information corresponding to an observed vehicle with a highest existence probability of a vehicle track is selected as radar vehicle track information of a to-be-observed vehicle, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the radar vehicle track information. This is not limited herein.
  • a probability that the radar vehicle track exists may be represented as P radar, car and the probability that the radar vehicle track exists is used to indicate a probability that a vehicle track detected by the millimeter-wave radar exists.
  • the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information.
  • the fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm.
  • a probability-based fusion algorithm is used as an example for a description.
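Before the coefficient-level combination shown next, a Bayesian fusion algorithm could, for example, combine the per-sensor existence probabilities themselves. The sketch below is one common textbook form, a naive-Bayes odds product under an independence assumption with a configurable prior; it illustrates the kind of algorithm meant and is not a formula taken from this application.

```python
# Hedged sketch: combine per-sensor existence probabilities in odds form,
# assuming conditionally independent sensors and a common prior.
def combine_existence(probs, prior=0.5):
    """probs: per-source probabilities that the lane line exists."""
    prior_odds = prior / (1.0 - prior)
    odds = prior_odds
    for p in probs:
        p = min(max(p, 1e-6), 1.0 - 1e-6)      # keep the ratio finite
        odds *= (p / (1.0 - p)) / prior_odds   # each source's likelihood ratio
    return odds / (1.0 + odds)

print(combine_existence([0.9, 0.8, 0.7]))  # agreement pushes the result above 0.9
```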
  • a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:
  • X fusion = C3 fusion · Z^3 + C2 fusion · Z^2 + C1 fusion · Z + C0 fusion.
  • X is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • C0 fusion is a horizontal offset of a road edge of the lane line obtained after fusion
  • C1 fusion is a heading angle of the road edge of the lane line obtained after fusion
  • C2 fusion is twice average curvature of the road edge of the lane line obtained after fusion
  • C3 fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • the road information detection apparatus obtains (C3 fusion , C2 fusion , C1 fusion , C0 fusion ), P fusion based on (C3 front, lane , C2 front, lane , C1 front, lane , C0 front, lane ), P front, lane ; (C3 rear, lane , C2 rear, lane , C1 rear, lane , C0 rear, lane ), P rear, lane ; (C3 radar, car , C2 radar, car , C1 radar, car , C0 radar, car ), P radar, car ; and (C3 radar, edge , C2 radar, edge , C1 radar, edge , C0 radar, edge ), P radar, edge .
  • an example manner is as follows.
  • a threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if P front, lane is less than the threshold, C3 front, lane , C2 front, lane , C1 front, lane , and C0 front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted.
  • the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • all probabilities may be compared with a same threshold, or different thresholds may be used.
  • the probability P front, lane that the front-looking lane line information exists is compared with a first threshold
  • the probability P rear, lane that the rear-looking lane line information exists is compared with a second threshold.
  • the first threshold and the second threshold are unequal.
  • the first threshold is 0.6
  • the second threshold is 0.7.
  • If a probability of only one piece of collected road information is greater than or equal to the threshold, it indicates that the road information corresponding to that probability is trusted and all other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold is selected from all the foregoing road detection information, and the following example method is used.
  • X fusion · P fusion = X front, lane · P front, lane + X rear, lane · P rear, lane + X radar, car · P radar, car + X radar, edge · P radar, edge , where X fusion is a horizontal offset obtained after fusion.
  • The same method is used to obtain Ci fusion , where i is an integer from 0 to 3.
  • the threshold may not be set in actual application.
  • the road information detection apparatus determines that all parameters are valid parameters.
  • the road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 507 , between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle.
  • An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • the road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, front-looking lane structure information, rear-looking lane structure information, and radar lane structure information, to obtain the target lane line information.
  • Referring to FIG. 6 , another embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • a road information detection apparatus receives a front-looking image.
  • the road information detection apparatus receives a rear-looking image.
  • the road information detection apparatus receives radar data.
  • the road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • the road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 601 to 605 in this embodiment are similar to steps 501 to 505 in the foregoing embodiment shown in FIG. 5 , and details are not described herein again.
  • the road information detection apparatus obtains front-looking lane structure information based on the front-looking image.
  • the road information detection apparatus obtains rear-looking lane structure information based on the rear-looking image.
  • Steps 606 and 607 in this embodiment are similar to steps 405 and 406 in the foregoing embodiment shown in FIG. 4 , and details are not described herein again.
  • the road information detection apparatus obtains radar lane structure information based on the radar data.
  • Step 608 in this embodiment is similar to step 506 in the foregoing embodiment shown in FIG. 5 , and details are not described herein again.
  • There is no fixed time sequence relationship between steps 601, 604, and 606; steps 602, 605, and 607; or steps 603 and 608.
  • the road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain target lane line information.
  • Front-looking information such as the front-looking lane line information and the front-looking lane structure information
  • rear-looking information such as the rear-looking lane line information and the rear-looking lane structure information
  • radar information such as the radar lane line information and the radar lane structure information.
  • the road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus or radar data obtained by a millimeter-wave radar, for example, a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle, a probability that the lane line exists, and information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line.
  • the road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the image obtained by the camera apparatus or the radar data obtained by the millimeter-wave radar.
  • vehicle track information is traveling track information of an observed vehicle, and traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, the observed vehicle is in a state of parking, traveling forward, or turning.
  • a road edge is generally a boundary stone at the edge of a road. Generally, the road edge is parallel to the lane line.
  • the road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain road edge information such as a type of the road edge such as a stone road edge or a fence, or a height of the road edge.
  • the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line.
  • the rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • the front-looking lane structure information may include at least one of front-looking road edge information and/or front-looking vehicle track information in the traveling process of the vehicle.
  • the front-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge in the traveling process of the vehicle, a probability that the front-looking road edge exists, a type of the front-looking road edge, and/or a height of the front-looking road edge.
  • the front-looking vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle, a probability that the front-looking vehicle track exists, and/or a status of a front-looking observed vehicle.
  • the rear-looking lane structure information may include rear-looking road edge information.
  • the rear-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge in the traveling process of the vehicle, a probability that the rear-looking road edge exists, a type of the rear-looking road edge, and/or a height of the rear-looking road edge.
  • the radar lane structure information may include at least one of radar vehicle track information and/or radar road edge information in the traveling process of the vehicle
  • the radar vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle, a probability that the radar vehicle track exists, and/or a status of a radar observed vehicle.
  • the radar road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar road edge in the traveling process of the vehicle, a probability that the radar road edge exists, a type of the radar road edge, and/or a height of the radar road edge.
  • the front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists
  • the rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists
  • the front-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track are/is used as an example of the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure, and the probability that the front-looking road edge exists and/or the probability that the front-looking vehicle track exists are/is used as the probability that the front-looking lane structure exists
  • the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:
  • X front,lane = C3 front,lane · Z^3 + C2 front,lane · Z^2 + C1 front,lane · Z + C0 front,lane.
  • C0 front, lane is a horizontal offset of a road edge of the front-looking lane line
  • C1 front, lane is a heading angle of the road edge of the front-looking lane line
  • C2 front, lane is twice average curvature of the road edge of the front-looking lane line
  • C3 front, lane is six times an average curvature change rate of the road edge of the front-looking lane line
  • X front, lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking lane line exists may be further obtained through detection, and is represented as P front, lane .
  • the probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained.
  • the lane line may be represented by using a curve of a specific road structure model.
  • Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:
  • X rear,lane = C3 rear,lane · Z^3 + C2 rear,lane · Z^2 + C1 rear,lane · Z + C0 rear,lane.
  • C0 rear, lane is a horizontal offset of a road edge of the rear-looking lane line
  • C1 rear lane is a heading angle of the road edge of the rear-looking lane line
  • C2 rear lane is twice average curvature of the road edge of the rear-looking lane line
  • C3 rear lane is six times an average curvature change rate of the road edge of the rear-looking lane line
  • X rear, lane is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as P rear, lane .
  • the probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • the front-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained.
  • the road edge may be represented by using a curve of a specific road structure model.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge may be represented as follows:
  • X front,edge = C3 front,edge · Z^3 + C2 front,edge · Z^2 + C1 front,edge · Z + C0 front,edge.
  • C0 front, edge is a horizontal offset of a road edge related to the front-looking road edge
  • C1 front edge is a heading angle of the road edge related to the front-looking road edge
  • C2 front, edge is twice average curvature of the road edge related to the front-looking road edge
  • C3 front, edge is six times an average curvature change rate of the road edge related to the front-looking road edge
  • X front, edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the front-looking road edge exists may be further obtained through detection, and is represented as P front, edge
  • the probability that the front-looking road edge exists is used to indicate a probability that the road edge exists in the front-looking image.
  • a front-looking observed vehicle is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding vehicle track information is obtained.
  • a vehicle track may be represented by using a curve of a specific road structure model.
  • Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • a correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an i th observed vehicle in vehicle tracks collected by the front-looking camera apparatus in the traveling process of the vehicle may be represented as follows:
  • X front,car(i) = C3 front,car(i) · Z^3 + C2 front,car(i) · Z^2 + C1 front,car(i) · Z + C0 front,car(i).
  • C0 front, car(i) is a horizontal offset of a road edge related to the vehicle track of the i th observed vehicle
  • C1 front, car(i) is a heading angle of the road edge related to the vehicle track of the i th observed vehicle
  • C2 front, car(i) is twice average curvature of the road edge related to the vehicle track of the i th observed vehicle
  • C3 front, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the i th observed vehicle
  • X is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • a probability that the vehicle track of the i th observed vehicle exists may be represented as P front, car(i) .
  • vehicle track information of each observed vehicle may be processed to obtain the front-looking vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle may be represented as follows:
  • X front,car = C3 front,car · Z^3 + C2 front,car · Z^2 + C1 front,car · Z + C0 front,car.
  • C0 front car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle
  • C1 front car is a heading angle of the road edge related to the vehicle track of the observed vehicle
  • C2 front, car is twice average curvature of the road edge related to the vehicle track of the observed vehicle
  • C3 front car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle
  • X front, car is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • An implementation may be as follows: Vehicle track information corresponding to an observed vehicle with a highest existence probability of a vehicle track is selected as the front-looking vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the front-looking vehicle track information. This is not limited herein.
  • a probability that the front-looking vehicle track exists may be represented as P front, car , and the probability that the front-looking vehicle track exists is used to indicate a probability that the vehicle track exists in the front-looking image.
  • the rear-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained.
  • the road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge may be represented as follows:
  • X rear,edge = C3 rear,edge · Z^3 + C2 rear,edge · Z^2 + C1 rear,edge · Z + C0 rear,edge.
  • C0 rear, edge is a horizontal offset of a road edge related to the rear-looking road edge
  • C1 rear, edge is a heading angle of the road edge related to the rear-looking road edge
  • C2 rear, edge is twice average curvature of the road edge related to the rear-looking road edge
  • C3 rear, edge is six times an average curvature change rate of the road edge related to the rear-looking road edge
  • X rear, edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the rear-looking road edge exists may be further obtained through detection, and is represented as P rear, edge .
  • the probability that the rear-looking road edge exists is used to indicate a probability that the road edge exists in the rear-looking image.
  • the radar information is detected based on a road structure model by using a method such as deep learning, and corresponding road edge information is obtained.
  • a road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • the correspondence between the horizontal offset and the longitudinal offset of the radar road edge may be represented as follows:
  • X radar,edge = C3 radar,edge · Z^3 + C2 radar,edge · Z^2 + C1 radar,edge · Z + C0 radar,edge.
  • C0 radar edge is a horizontal offset of a road edge obtained based on the radar road edge information
  • C1 radar edge is a heading angle of the road edge obtained based on the radar road edge information
  • C2 radar edge is twice average curvature of the road edge obtained based on the radar road edge information
  • C3 radar edge is six times an average curvature change rate of the road edge obtained based on the radar road edge information
  • X radar edge is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • the probability that the radar road edge exists may be further obtained through detection, and is represented as P radar, edge .
  • the probability that the radar road edge exists is used to indicate a probability that a road edge detected by the millimeter-wave radar exists.
  • collected track information of one or more observed vehicles may be analyzed based on a road structure model by using a method such as deep learning, and a vehicle track may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset.
  • Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road.
  • An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • a correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an i th observed vehicle in vehicle tracks collected by the millimeter-wave radar in the traveling process of the vehicle may be represented as follows:
  • X radar,car(i) = C3 radar,car(i) · Z^3 + C2 radar,car(i) · Z^2 + C1 radar,car(i) · Z + C0 radar,car(i).
  • C0 radar, car(i) is a horizontal offset of a road edge related to the vehicle track of the i th observed vehicle
  • C1 radar, car(i) is a heading angle of the road edge related to the vehicle track of the i th observed vehicle
  • C2 radar, car(i) is twice average curvature of the road edge related to the vehicle track of the i th observed vehicle
  • C3 radar, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the i th observed vehicle
  • X is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • a probability that a vehicle track of a target vehicle exists may be represented as P radar, car(i) .
  • vehicle track information of each target vehicle may be processed to obtain the radar vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track may be represented as follows:
  • X radar,car = C3 radar,car · Z^3 + C2 radar,car · Z^2 + C1 radar,car · Z + C0 radar,car.
  • C0 radar,car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle
  • C1 radar,car is a heading angle of the road edge related to the vehicle track of the observed vehicle
  • C2 radar,car is twice average curvature of the road edge related to the vehicle track of the observed vehicle
  • C3 radar,car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle
  • X radar,car is a horizontal offset of the vehicle
  • Z is a longitudinal offset of the vehicle.
  • An implementation may be as follows. Radar vehicle track information corresponding to an observed vehicle with a highest existence probability of a vehicle track is selected as radar vehicle track information of a to-be-observed vehicle, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the radar vehicle track information. This is not limited herein.
  • a probability that the radar vehicle track exists may be represented as P radar, car , and the probability that the radar vehicle track exists is used to indicate a probability that a vehicle track detected by the millimeter-wave radar exists.
  • the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information.
  • the fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm.
  • a probability-based fusion algorithm is used as an example for a description.
  • a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:
  • X_fusion = C3_fusion × Z^3 + C2_fusion × Z^2 + C1_fusion × Z + C0_fusion, where:
  • a probability that the lane line exists is P_fusion, X_fusion is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle
  • C0_fusion is a horizontal offset of a road edge of the lane line obtained after fusion
  • C1_fusion is a heading angle of the road edge of the lane line obtained after fusion
  • C2_fusion is twice average curvature of the road edge of the lane line obtained after fusion
  • C3_fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • the road information detection apparatus obtains (C3_fusion, C2_fusion, C1_fusion, C0_fusion) and P_fusion based on the foregoing correspondences and existence probabilities of the front-looking and rear-looking lane lines, the lane structures, and the radar lane structure.
  • an example specific manner is as follows.
  • a threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if P_front,lane is less than the threshold, C3_front,lane, C2_front,lane, C1_front,lane, and C0_front,lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted.
  • the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • all probabilities may be compared with a same threshold, or different thresholds may be used.
  • the probability P_front,lane that the front-looking lane line information exists is compared with a first threshold
  • the probability P_rear,lane that the rear-looking lane line information exists is compared with a second threshold
  • the first threshold and the second threshold are unequal; for example, the first threshold is 0.6 and the second threshold is 0.7.
  • if a probability of only one piece of collected road information is greater than or equal to the threshold, this indicates that the road information corresponding to that probability is trusted and the other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold in all the foregoing road detection information is selected, and the following example method is used:
  • X_fusion / P_fusion = X_front,lane / P_front,lane + X_rear,lane / P_rear,lane + X_front,edge / P_front,edge + X_front,car / P_front,car + X_rear,edge / P_rear,edge + X_radar,car / P_radar,car + X_radar,edge / P_radar,edge.
  • X_fusion is a horizontal offset obtained after fusion.
  • Each Ci_fusion is obtained in the same manner, where i is an integer from 0 to 3.
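  • The following Python sketch illustrates this gating-plus-combination step. The relation above does not by itself fix how P_fusion is obtained, so taking it as the mean of the trusted probabilities is an assumption made only for this sketch:

```python
def fuse_offsets(sources: dict[str, tuple[float, float]],
                 threshold: float = 0.5) -> float:
    """sources maps a name (e.g. 'front_lane') to (X_i, P_i)."""
    trusted = {k: v for k, v in sources.items() if v[1] >= threshold}
    if not trusted:
        raise ValueError("no trusted road information")
    if len(trusted) == 1:
        # Only one source is trusted: use its offset directly.
        return next(iter(trusted.values()))[0]
    # P_fusion is not fixed by the relation above; the mean of the
    # trusted probabilities is an assumption made for this sketch.
    p_fusion = sum(p for _x, p in trusted.values()) / len(trusted)
    # X_fusion / P_fusion = sum_i (X_i / P_i)
    return p_fusion * sum(x / p for x, p in trusted.values())

x_fusion = fuse_offsets({"front_lane": (-1.8, 0.9),
                         "rear_lane": (-1.7, 0.8),
                         "radar_edge": (-2.0, 0.3)})  # radar source gated out
```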
  • alternatively, the threshold may not be set in actual application. In this case, the road information detection apparatus determines that all parameters are valid parameters.
  • the road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 609 , between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle.
  • An output manner may be an image, a video, voice, or text. The output manner is not limited herein.
  • the foregoing road information detection method may be deployed on the road information detection apparatus in a form of software.
  • an embodiment of a road information detection apparatus in embodiments of this application includes a receiving unit 701 , an obtaining unit 702 , a fusion unit 703 , and an output unit 704 .
  • the receiving unit 701 is configured to receive a front-looking image from at least one front-looking camera apparatus and a rear-looking image from at least one rear-looking camera apparatus, and is further configured to receive radar data.
  • the obtaining unit 702 is configured to obtain front-looking lane line information based on the front-looking image and rear-looking lane line information based on the rear-looking image; is further configured to obtain front-looking lane structure information based on the front-looking image and rear-looking lane structure information based on the rear-looking image; and is further configured to obtain radar lane structure information based on the radar data.
  • the fusion unit 703 is configured to obtain target lane line information based on the front-looking lane line information and the rear-looking lane line information.
  • the fusion unit 703 may be configured to obtain the target lane line information in any one or more of the manners described in the foregoing method embodiments.
  • the output unit 704 is configured to output the target lane line information.
  • operations performed by each unit of the road information detection apparatus are similar to those described in embodiments shown in FIG. 3 to FIG. 6. Details are not described herein again.
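  • A structural sketch of this unit decomposition follows; the method names and the trivial stand-in functions are illustrative assumptions, not interfaces defined by this application:

```python
# Structural sketch of the unit decomposition in FIG. 7 (receiving unit
# 701, obtaining unit 702, fusion unit 703, output unit 704).

class RoadInfoDetector:
    def __init__(self, detect, fuse):
        self.detect = detect  # image -> lane line information (stand-in)
        self.fuse = fuse      # (front, rear) -> target information (stand-in)

    def run(self, front_img, rear_img):
        # 701: receive the front-looking and rear-looking images
        # 702: obtain per-sensor lane line information
        front = self.detect(front_img)
        rear = self.detect(rear_img)
        # 703: fuse into target lane line information
        # 704: output (here simply returned; could be image, video, voice, or text)
        return self.fuse(front, rear)

# Usage with trivial stand-ins:
detector = RoadInfoDetector(detect=lambda img: img, fuse=lambda f, r: (f, r))
print(detector.run("front image", "rear image"))
```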
  • FIG. 8 is another schematic diagram of a structure of a road information detection apparatus according to an embodiment of this application.
  • the road information detection apparatus 800 may include one or more processors 801 and a memory 805 , and the memory 805 stores one or more application programs or data.
  • the memory 805 may be a volatile memory or a persistent memory.
  • the program stored in the memory 805 may include one or more modules, and each module may include a series of instruction operations for the road information detection apparatus.
  • the processor 801 may be configured to communicate with the memory 805 , and perform, on the road information detection apparatus 800 , the series of instruction operations in the memory 805 .
  • the road information detection apparatus 800 may further include one or more power supplies 802 , one or more wired or wireless network interfaces 803 , one or more input/output interfaces 804 , and/or one or more operating systems such as Windows Server®, Mac OS X®, Unix®, Linux®, and FreeBSD®.
  • the processor 801 may perform operations performed by the road information detection apparatuses in embodiments shown in FIG. 3 to FIG. 6 . Details are not described herein again.
  • An embodiment of this application provides a vehicle.
  • the vehicle includes at least one detection apparatus, for example, the road information detection apparatus 800 shown in FIG. 8 .
  • the vehicle may perform operations performed by the road information detection apparatuses in embodiments shown in FIG. 3 to FIG. 6 .
  • An embodiment of this application provides a detection system. The detection system includes a detection apparatus and at least one sensor.
  • the detection apparatus may be the road information detection apparatus 800 shown in FIG. 8, and the sensor may include at least one of a camera apparatus, a millimeter-wave radar, or the like.
  • An embodiment of this application further provides a chip system. The chip system includes a processor configured to support a road information detection apparatus in implementing functions in the foregoing aspects, for example, sending or processing data and/or information in the foregoing methods.
  • the chip system further includes a memory.
  • the memory is configured to store necessary program instructions and data.
  • the chip system may include a chip, or may include a chip and another discrete device.
  • when the chip system is a chip in the road information detection apparatus or the like, the chip includes a processing unit and a communication unit.
  • the processing unit may be, for example, a processor.
  • the communication unit may be, for example, an input/output interface, a pin, or a circuit.
  • the processing unit may execute computer-executable instructions stored in a storage unit, so that the chip in the road information detection apparatus performs the steps of the method performed by the road information detection apparatus in any embodiment in FIG. 3 to FIG. 6 .
  • the storage unit is a storage unit in the chip, for example, a register or a cache.
  • the storage unit may alternatively be a storage unit that is located outside the chip, in UE, a base station, or the like, for example, a read-only memory (ROM), another type of static storage device capable of storing static information and instructions, or a random-access memory (RAM).
  • An embodiment of this application further provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program.
  • when the computer program is executed by a computer, a method procedure related to the road information detection apparatus in any one of the foregoing method embodiments is implemented.
  • the computer may be the foregoing road information detection apparatus.
  • the processor in the road information detection apparatus, the chip system, or the like in the foregoing embodiments of this application, or the processor provided in the foregoing embodiments of this application may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or a transistor logic device, a discrete hardware component, or the like.
  • the general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
  • there may be one or more processors in the road information detection apparatus, the chip system, or the like in the foregoing embodiments of this application. The quantity may be adjusted based on an actual application scenario; this is merely an example for description and is not limited.
  • similarly, there may be one or more memories in embodiments of this application. The quantity may be adjusted based on an actual application scenario; this is merely an example for description and is not limited.
  • the memory, the readable storage medium, or the like in the road information detection apparatus, the chip system, or the like in the foregoing embodiments may be a volatile memory or a non-volatile memory, or may include both a volatile memory and a non-volatile memory.
  • the non-volatile memory may be a ROM, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically-erasable programmable read-only memory (EEPROM), or a flash memory.
  • the volatile memory may be a RAM, and is used as an external cache.
  • many forms of RAM are available, for example, a static random-access memory (SRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a double data rate synchronous dynamic random-access memory (DDR SDRAM), an enhanced synchronous dynamic random-access memory (ESDRAM), a SynchLink dynamic random-access memory (SLDRAM), and a direct Rambus random-access memory (DR RAM).
  • in embodiments of this application, the road information detection apparatus includes a processor (or a processing unit) and a memory. The processor may be integrated with the memory, or may be connected to the memory through an interface. This may be adjusted based on an actual application scenario, and is not limited.
  • An embodiment of this application further provides a computer program or a computer program product including the computer program.
  • when the computer program is executed on a computer, the computer is enabled to implement a method procedure related to any one of the foregoing method embodiments.
  • the computer may be the foregoing road information detection apparatus.
  • All or some of the foregoing embodiments in FIG. 3 to FIG. 6 may be implemented by using software, hardware, firmware, or any combination thereof.
  • when software is used to implement embodiments, all or some of embodiments may be implemented in a form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus.
  • the computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device, such as a server or a data center, integrating one or more usable media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a Digital Video Disk (DVD)), a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the described apparatus embodiment is merely an example.
  • unit division is merely logical function division and may be another form of division in actual implementation.
  • a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces.
  • the indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
  • the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, in other words, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of embodiments.
  • function units in embodiments of this application may be integrated into one processing unit, each of the units may exist alone physically, or two or more units may be integrated into one unit.
  • the integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software function unit.
  • when the integrated unit is implemented in the form of a software function unit and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium.
  • the computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in embodiments of this application.
  • the foregoing storage medium includes any medium that can store program code, such as a Universal Serial Bus (USB) flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.

Abstract

A road information detection method is applied to automatic driving or intelligent driving and includes receiving a front-looking image from at least one front-looking camera apparatus; receiving a rear-looking image from at least one rear-looking camera apparatus; obtaining front-looking lane line information based on the front-looking image; obtaining rear-looking lane line information based on the rear-looking image; and obtaining target lane line information based on the front-looking lane line information and the rear-looking lane line information.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation application of International Patent Application No. PCT/CN2020/108133, filed on Aug. 10, 2020, which claims priority to Chinese Patent Application No. 201911209196.4, filed on Nov. 30, 2019. The disclosures of the aforementioned applications are hereby incorporated by reference in their entireties.
  • TECHNICAL FIELD
  • Embodiments of this application relate to the information processing field, and in particular, to a road information detection method and a road information detection apparatus.
  • BACKGROUND
  • In an assisted driving scenario and an automatic driving scenario, an intelligent vehicle needs to be aware of a driving environment in a traveling process of the vehicle. Road information detection is an important function of the intelligent vehicle to identify a surrounding environment, and is also an important part of environment perception. Only through effective and accurate road information detection can auxiliary functions such as path planning, road deviation alarm, or lane line maintenance be better supported.
  • In a conventional technology, a vehicle analyzes an image obtained by a front-looking camera apparatus to obtain lane line information, or analyzes an image obtained by a rear-looking camera apparatus to obtain lane line information, and an intelligent vehicle performs road information detection based on the lane line information. The image obtained by the front-looking camera apparatus and the image obtained by the rear-looking camera apparatus are mutually redundant backups.
  • Road information detection is directly performed based on lane line information obtained by analyzing an image corresponding to one camera apparatus such as the front-looking camera apparatus. Consequently, the reliability of the road information is not high, and the stability of the road information detection result is not high.
  • SUMMARY
  • Embodiments of this application provide a road information detection method, so that a plurality of groups of lane line information may be fused to obtain target lane line information.
  • A first aspect of embodiments of this application provides a road information detection method.
  • A road information detection apparatus receives a plurality of pieces of image information, such as a front-looking image and a rear-looking image, where the front-looking image is obtained by a front-looking camera apparatus, and the rear-looking image is obtained by a rear-looking camera apparatus; the road information detection apparatus obtains front-looking lane line information based on the front-looking image, and obtains rear-looking lane line information based on the rear-looking image; and the road information detection apparatus performs fusion based on the obtained front-looking lane line information and the obtained rear-looking lane line information, to obtain target lane line information. Further optionally, the road information detection apparatus outputs the target lane line information.
  • Images corresponding to a plurality of camera apparatuses such as the front-looking camera apparatus and the rear-looking camera apparatus are separately analyzed to obtain a plurality of groups of lane line information, and the plurality of groups of lane line information are fused to obtain the target lane line information, so that reliability of the target lane line information is relatively high, and stability of a road information detection result is relatively high.
  • Based on the first aspect of embodiments of this application, in a first implementation of the first aspect of embodiments of this application, if the front-looking lane line information includes information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists, and the rear-looking lane line information includes information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists, the road information detection apparatus may fuse the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, and the probability that the rear-looking lane line exists, to obtain the target lane line information. The target lane line information may be a correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • In this embodiment of this application, specific information included in the front-looking lane line information and the rear-looking lane line information is provided, and a specific form of the target lane line information is provided, so that this solution is more implementable.
  • Based on the first aspect of embodiments of this application, in a second implementation of the first aspect of embodiments of this application, the road information detection apparatus may alternatively obtain front-looking lane structure information based on the front-looking image, and obtain rear-looking lane structure information based on the rear-looking image, and if the road information detection apparatus obtains the front-looking lane structure information based on the front-looking image, and obtains the rear-looking lane structure information based on the rear-looking image, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain the target lane line information.
  • In this embodiment of this application, the road information detection apparatus may alternatively obtain the front-looking lane structure information based on the front-looking image, and obtain the rear-looking lane structure information based on the rear-looking image. The target lane line information obtained with reference to the front-looking lane structure information and the rear-looking lane structure information is more stable.
  • Based on the second implementation of the first aspect of embodiments of this application, in a third implementation of the first aspect of embodiments of this application, the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists, the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists, the front-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, and the rear-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure in the traveling process of the vehicle and a probability that the rear-looking lane structure exists; and the road information detection apparatus fuses the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, the probability that the rear-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure, the probability that the front-looking lane structure exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane structure, and the probability that the rear-looking lane structure exists, to obtain a correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • In this embodiment of this application, specific information included in the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information is provided, and a specific form of the target lane line information is provided, so that this solution is more implementable.
  • Based on the first aspect of embodiments of this application, in a fourth implementation of the first aspect of embodiments of this application, the road information detection apparatus may receive radar data, where the radar data is obtained by a millimeter-wave radar; the road information detection apparatus obtains radar lane structure information based on the radar data; and if the road information detection apparatus receives the radar data and obtains the radar lane structure information by analyzing the radar data, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain the target lane line information.
  • In this embodiment of this application, the road information detection apparatus may alternatively receive the radar data, and obtain the radar lane structure information based on the radar data. The target lane line information obtained with reference to the radar lane structure information is more stable.
  • Based on the fourth implementation of the first aspect of embodiments of this application, in a fifth implementation of the first aspect of embodiments of this application, the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists, the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists, and the radar lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists; and the road information detection apparatus may fuse the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, the probability that the rear-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure, and the probability that the radar lane structure exists, to obtain a correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • In this embodiment of this application, specific information included in the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information is provided, and a specific form of the target lane line information is provided, so that this solution is more implementable.
  • Based on the first aspect of embodiments of this application, in a sixth implementation of the first aspect of embodiments of this application, the road information detection apparatus may further receive radar data in addition to the front-looking image and the rear-looking image, and the road information detection apparatus obtains the front-looking lane line information and front-looking lane structure information based on the front-looking image, obtains the rear-looking lane line information and rear-looking lane structure information based on the rear-looking image, and obtains radar lane structure information based on the radar data. In a condition of this implementation, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain the target lane line information.
  • In this embodiment of this application, the road information detection apparatus may alternatively obtain the front-looking lane structure information based on the front-looking image, and obtain the rear-looking lane structure information based on the rear-looking image. The target lane line information obtained with reference to the front-looking lane structure information and the rear-looking lane structure information is more stable.
  • Based on the sixth implementation of the first aspect of embodiments of this application, in a seventh implementation of the first aspect of embodiments of this application, the front-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of a vehicle and a probability that the front-looking lane line exists, the rear-looking lane line information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle and a probability that the rear-looking lane line exists, the front-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, the rear-looking lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure in the traveling process of the vehicle and a probability that the rear-looking lane structure exists, and the radar lane structure information may include information such as a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists; and the road information detection apparatus may fuse the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line, the probability that the front-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line, the probability that the rear-looking lane line exists, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure, the probability that the front-looking lane structure exists, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane structure, the probability that the rear-looking lane structure exists, the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure, and the probability that the radar lane structure exists, to obtain a correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • In this embodiment of this application, specific information included in the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information is provided, and a specific form of the target lane line information is provided, so that this solution is more implementable.
  • Based on the third implementation of the first aspect of embodiments of this application and the seventh implementation of the first aspect, in an eighth implementation of the first aspect of embodiments of this application, the front-looking lane structure information may be front-looking road edge information and/or front-looking vehicle track information; the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge and/or a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track; the probability that the front-looking lane structure exists may be a probability that the front-looking road edge exists and/or a probability that the front-looking vehicle track exists; the rear-looking lane structure information may be rear-looking road edge information; the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge; and the probability that the rear-looking lane structure exists may be a probability that the rear-looking road edge exists.
  • In this embodiment of this application, specific information content of the front-looking lane structure information and the rear-looking lane structure information is provided, so that this solution is more implementable.
  • Based on the fifth implementation of the first aspect of embodiments of this application and the seventh implementation of the first aspect, in a ninth implementation of the first aspect of embodiments of this application, the radar lane structure information may be radar road edge information and/or radar vehicle track information; the correspondence between the horizontal offset and the longitudinal offset of the radar lane structure may be a correspondence between a horizontal offset and a longitudinal offset of a radar road edge and/or a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track; and the probability that the radar lane structure exists may be a probability that the radar road edge exists and/or a probability that the radar vehicle track exists.
  • In this embodiment of this application, specific information content of the radar lane structure information is provided, so that this solution is more implementable.
  • A second aspect of embodiments of this application provides a road information detection apparatus, and the apparatus performs the method in the first aspect.
  • A third aspect of embodiments of this application provides a road information detection apparatus, and the apparatus performs the method in the first aspect.
  • A fourth aspect of embodiments of this application provides a computer storage medium. The computer storage medium stores instructions. When the instructions are executed on a computer, the computer is enabled to perform the method in the first aspect.
  • A fifth aspect of embodiments of this application provides a computer program product. When the computer program product is run on a computer, the computer is enabled to perform the method in the first aspect.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a schematic diagram of a framework of a road information detection apparatus;
  • FIG. 2 is a schematic diagram of a road structure model according to an embodiment of this application;
  • FIG. 3 is a schematic diagram of an embodiment of a road information detection method according to an embodiment of this application;
  • FIG. 4 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application;
  • FIG. 5 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application;
  • FIG. 6 is a schematic diagram of another embodiment of a road information detection method according to an embodiment of this application;
  • FIG. 7 is a schematic diagram of a structure of a road information detection apparatus according to an embodiment of this application; and
  • FIG. 8 is a schematic diagram of another structure of a road information detection apparatus according to an embodiment of this application.
  • DESCRIPTION OF EMBODIMENTS
  • Embodiments of this application provide a road information detection method and a road information detection apparatus, to obtain more reliable lane line information.
  • The following clearly describes the technical solutions in embodiments of this application with reference to the accompanying drawings in embodiments of this application. It is clear that the described embodiments are merely some but not all of embodiments of this application.
  • In the specification, claims, and accompanying drawings of this application, the terms “first”, “second”, “third”, “fourth”, and the like (if existent) are intended to distinguish between similar objects but do not necessarily indicate a specific order or sequence. It should be understood that the data termed in such a way are interchangeable in appropriate circumstances, so that embodiments described herein can be implemented in an order other than the content illustrated or described herein. In addition, terms such as “include”, “have”, and any variations thereof are intended to cover non-exclusive inclusion, for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those expressly listed steps or units, but may include other steps or units that are not expressly listed or inherent to such a process, method, product, or device.
  • Refer to FIG. 1. A framework of a road information detection system in an embodiment of this application includes a front-looking camera apparatus 101, a rear-looking camera apparatus 102, a radar 103, and a road information detection apparatus 104. Optionally, the radar may be a millimeter-wave radar or a laser radar.
  • In this embodiment of this application, the road information detection apparatus 104 is applied to an intelligent vehicle, and sensors of the intelligent vehicle include the front-looking camera apparatus 101 and the rear-looking camera apparatus 102. The road information detection apparatus 104 is separately connected to the front-looking camera apparatus 101 and the rear-looking camera apparatus 102. The front-looking camera apparatus 101 photographs road information to obtain a front-looking image, and transmits the front-looking image to the road information detection apparatus 104. The rear-looking camera apparatus 102 photographs road information to obtain a rear-looking image, and transmits the rear-looking image to the road information detection apparatus 104.
  • In addition to the front-looking camera apparatus 101 and the rear-looking camera apparatus 102, the sensors of the intelligent vehicle may further include another sensor such as the radar 103. The road information detection apparatus 104 is connected to the radar 103, and the radar 103 detects road information to obtain radar data, and transmits the radar data to the road information detection apparatus 104.
  • A main function of the road information detection apparatus 104 is to receive data transmitted by the sensors, separately analyze and process different sensor data to obtain corresponding road information, fuse the obtained plurality of pieces of road information to obtain target lane line information, and output the target lane line information.
  • The road information detection apparatus 104 receives the front-looking image sent by the front-looking camera apparatus 101 in the sensors, and road information obtained by analyzing the front-looking image may include road information such as front-looking lane line information and/or front-looking lane structure information. The front-looking lane structure information may include front-looking lane structure information such as front-looking road edge information and/or front-looking vehicle track information.
  • The road information detection apparatus 104 receives the rear-looking image sent by the rear-looking camera apparatus 102 in the sensors, and road information obtained by analyzing the rear-looking image may include road information such as rear-looking lane line information and/or rear-looking lane structure information. The rear-looking lane structure information may include but is not limited to rear-looking road edge information.
  • The road information detection apparatus 104 receives the radar data sent by the radar 103 in the sensors, and road information obtained by analyzing the radar data may include radar lane structure information. The radar lane structure information may include radar lane structure information such as radar road edge information and/or radar vehicle track information.
  • In this embodiment of this application, the road information detection apparatus 104 analyzes, by using a technology such as an image detection technology or a visual recognition technology, the front-looking image photographed by the front-looking camera apparatus 101 and the rear-looking image photographed by the rear-looking camera apparatus 102, to obtain road information. An obtaining technology is not limited herein. The road information detection apparatus 104 analyzes, by using a method such as a clustering algorithm or Hough transform, the radar data detected by the millimeter-wave radar, to obtain road information (one possible sketch is shown below). An obtaining technology is not limited herein.
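  • As a hedged example of turning radar detections into road information, the sketch below fits the cubic road structure model to stationary radar returns by least squares. This application does not prescribe this particular algorithm, and the point values are invented for illustration:

```python
import numpy as np

# (Z, X) positions of stationary radar returns along the left road edge;
# the values are invented for illustration.
z = np.array([5.0, 10.0, 20.0, 35.0, 50.0])
x = np.array([-1.9, -1.8, -1.6, -1.2, -0.6])

# Cubic least-squares fit; np.polyfit returns the highest-order
# coefficient first, matching X = C3*Z^3 + C2*Z^2 + C1*Z + C0.
c3, c2, c1, c0 = np.polyfit(z, x, deg=3)
print(c3, c2, c1, c0)
```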
  • With reference to the road structure model shown in FIG. 2, the following describes the road structure model in embodiments of this application.
  • A road information detection apparatus may obtain a road structure model based on at least one of road information parameters such as an average curvature change rate of a road edge, average curvature of a road edge, a heading angle of a road edge, and a horizontal offset of a road edge. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, the road structure model is generally represented by a clothoid. A correspondence between a horizontal offset and a longitudinal offset of the road structure model may be represented as:

  • y(l) = A0 + A1·l + (1/2)·A2·l^2 + (1/6)·A3·l^3.
  • It may be learned from the foregoing correspondence that, in a traveling process, the relationship between the horizontal offset of the vehicle and the arc length l traveled by the vehicle is that the horizontal offset is a sum of the third power of l multiplied by one sixth of A3, the second power of l multiplied by half of A2, l multiplied by A1, and A0, where A0 is a horizontal offset of a road edge, A1 is a heading angle of the road edge, A2 is average curvature of the road edge, and A3 is an average curvature change rate of the road edge.
  • It should be noted that a basic assumption of using a clothoid model is that a heading angle of the vehicle needs to be less than 10 degrees. Therefore, a sine value of the heading angle of the vehicle is approximately the heading angle of the vehicle, and a cosine value is approximately 1.
  • It is assumed that the change of the curvature C of this curve with the arc length l is a constant; in other words, the average curvature change rate of the road edge is approximately a constant A3.
  • In this case, the curvature C may be represented as a linear function of the arc length l: C(l) = A2 + A3·l.
  • The heading angle φ(l) of the vehicle is the integral of the curvature over the arc length l:

  • φ(l) = φ0 + ∫_0^l C(s) ds = φ0 + A2·l + (1/2)·A3·l^2 = A1 + A2·l + (1/2)·A3·l^2.
  • A rectangular coordinate system is established by using a head direction of the vehicle as a positive direction of an x-axis, and the vehicle is located at (x0, y0) at an initial moment. When the vehicle travels by the arc length l, the following may be obtained:

  • x(l) = x0 + ∫_0^l cos φ(s) ds; and y(l) = y0 + ∫_0^l sin φ(s) ds.
  • If x is 0 at the initial moment, that is, x0 is 0, curve coordinates that meet the assumption that the heading angle φ is less than 10 degrees may be represented as follows:

  • x(l) ≈ l; and y(l) = y0 + ∫_0^l φ(s) ds.
  • Therefore, an expression of the clothoid may be obtained:

  • y(l) = A0 + A1·l + (1/2)·A2·l^2 + (1/6)·A3·l^3 = A0 + A1·x + (1/2)·A2·x^2 + (1/6)·A3·x^3.
  • For brevity in subsequent description, the constants are simplified to obtain the following:

  • y(x) = C0 + C1·x + C2·x^2 + C3·x^3.
  • C0 is the horizontal offset of the road edge, C1 is the heading angle of the road edge, C2 is twice the average curvature of the road edge, and C3 is six times the average curvature change rate of the road edge.
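  • As a minimal numeric sketch of this simplified model (the coefficient values below are invented for illustration), the lateral offset is evaluated directly from the polynomial, and differentiating it shows how the heading angle and curvature relate to the coefficients:

```python
# Sketch of the simplified road structure model
# y(x) = C0 + C1*x + C2*x^2 + C3*x^3.
# Differentiating: y'(0) = C1 (heading angle) and y''(0) = 2*C2, so the
# curvature at the origin and twice C2 coincide; similarly y'''(x) = 6*C3
# relates C3 to the curvature change rate.

def y(x: float, c0: float, c1: float, c2: float, c3: float) -> float:
    return c0 + c1 * x + c2 * x**2 + c3 * x**3

# Illustrative coefficients for a gentle left-hand curve.
c0, c1, c2, c3 = -1.75, 0.01, 5e-4, 1e-6
print(y(30.0, c0, c1, c2, c3))  # lateral offset 30 m ahead
print(2 * c2)                   # curvature at x = 0
print(6 * c3)                   # curvature change rate
```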
  • In this embodiment of this application, the road information detection apparatus may fuse a plurality of pieces of road information to obtain target lane line information. Separate descriptions are provided below with reference to a framework of a road information detection apparatus and a road structure model.
  • 1. The road information detection apparatus fuses front-looking lane line information and rear-looking lane line information, to obtain the target lane line information.
  • Refer to FIG. 3. An embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • 301: A road information detection apparatus receives a front-looking image.
  • The road information detection apparatus is connected to a front-looking camera apparatus. The connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, a front-looking image photographed by the front-looking camera apparatus. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, a front-looking image photographed by the front-looking camera apparatus. The wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • 302: The road information detection apparatus receives a rear-looking image.
  • The road information detection apparatus is connected to a rear-looking camera apparatus. The connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, a rear-looking image photographed by the rear-looking camera apparatus. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, a rear-looking image photographed by the rear-looking camera apparatus. The wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • 303: The road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • The road information detection apparatus analyzes the front-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the front-looking lane line information. A technology is not limited herein.
  • 304: The road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • The road information detection apparatus analyzes the rear-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the rear-looking lane line information. A technology is not limited herein.
  • In this embodiment, a process in which the road information detection apparatus obtains the front-looking lane line information by using the front-looking image is described in steps 301 and 303, and a process in which the road information detection apparatus obtains the rear-looking lane line information by using the rear-looking image is described in steps 302 and 304. There is no fixed time sequence relationship between the two processes.
  • 305: The road information detection apparatus fuses the front-looking lane line information and the rear-looking lane line information, to obtain target lane line information.
  • Information obtained by using the front-looking image is referred to as front-looking information (for example, the front-looking lane line information), and information obtained by using the rear-looking image is referred to as rear-looking information (for example, the rear-looking lane line information). The road information detection apparatus may obtain, based on an image obtained by a camera apparatus, lane line information such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain, by using the front-looking image, information such as a width of a lane, a type of the lane line (for example, a one-way line or a deceleration line), a color of the lane line (for example, yellow or white), and a width of the lane line.
  • In this embodiment, the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line.
  • The rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • In this embodiment, detailed descriptions are provided by using an example in which the front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line and the probability that the front-looking lane line exists, and an example in which the rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line and the probability that the rear-looking lane line exists.
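  • For illustration, such lane line information could be carried in a structure like the following Python sketch; the field names and example values are assumptions, not a data format defined by this application:

```python
from dataclasses import dataclass

@dataclass
class LaneLineInfo:
    coeffs: tuple[float, float, float, float]  # (C3, C2, C1, C0) of the correspondence
    prob: float        # probability that the lane line exists
    lane_width: float  # width of the lane, in meters
    line_type: str     # e.g. "one-way line" or "deceleration line"
    color: str         # e.g. "yellow" or "white"
    line_width: float  # width of the lane line itself, in meters

front = LaneLineInfo((1e-6, 2e-4, 0.01, -1.8), 0.93, 3.5,
                     "one-way line", "white", 0.15)
```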
  • Based on image information collected by the front-looking camera apparatus, the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:

  • X_front,lane = C3_front,lane × Z^3 + C2_front,lane × Z^2 + C1_front,lane × Z + C0_front,lane.
  • In this embodiment of this application, C0_front,lane is a horizontal offset of a road edge of the front-looking lane line, C1_front,lane is a heading angle of the road edge of the front-looking lane line, C2_front,lane is twice average curvature of the road edge of the front-looking lane line, C3_front,lane is six times an average curvature change rate of the road edge of the front-looking lane line, X_front,lane is a horizontal offset of the vehicle on the front-looking lane line, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking lane line exists may be further obtained through detection, and is represented as P_front,lane. The probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • Based on image information collected by the rear-looking camera apparatus, the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:

  • X_rear,lane = C3_rear,lane × Z^3 + C2_rear,lane × Z^2 + C1_rear,lane × Z + C0_rear,lane.
  • In this embodiment of this application, C0_rear,lane is a horizontal offset of a road edge of the rear-looking lane line, C1_rear,lane is a heading angle of the road edge of the rear-looking lane line, C2_rear,lane is twice average curvature of the road edge of the rear-looking lane line, C3_rear,lane is six times an average curvature change rate of the road edge of the rear-looking lane line, X_rear,lane is a horizontal offset of the vehicle on the rear-looking lane line, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as Prear, lane. The probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • In this embodiment, the foregoing correspondence is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • In this embodiment, the road information detection apparatus may fuse the front-looking lane line information and the rear-looking lane line information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information. The fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm. In this embodiment, a probability-based fusion algorithm is used as an example for a description.
  • In an optional design, a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:

  • Xfusion = C3fusion × Z^3 + C2fusion × Z^2 + C1fusion × Z + C0fusion,
  • where a probability that a lane line exists is Pfusion, Xfusion is a horizontal offset of the vehicle obtained after fusion, and Z is a longitudinal offset of the vehicle.
  • In this embodiment of this application, C0fusion is a horizontal offset of a road edge of the lane line obtained after fusion, C1fusion is a heading angle of the road edge of the lane line obtained after fusion, C2fusion is twice average curvature of the road edge of the lane line obtained after fusion, and C3fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • The road information detection apparatus may obtain (C3fusion, C2fusion, C1fusion, C0fusion), Pfusion based on (C3front, lane, C2front, lane, C1front, lane, C0front, lane), Pfront, lane; and (C3rear, lane, C2rear, lane, C1rear, lane, C0rear, lane), Prear, lane.
  • In an optional manner, an example fusion manner is as follows.
  • A threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if Pfront, lane is less than the threshold, C3front, lane, C2front, lane, C1front, lane, and C0front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted. For example, the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • In this embodiment of this application, all probabilities may be compared with a same threshold, or different thresholds may be used. For example, the probability Pfront, lane that the front-looking lane line information exists is compared with a first threshold, and the probability Prear, lane that the rear-looking lane line information exists is compared with a second threshold. The first threshold and the second threshold are unequal. For example, the first threshold is 0.6, and the second threshold is 0.7.
  • If both Pfront, lane and Prear, lane are less than the specified threshold, it indicates that the road information is invalid.
  • If one of Pfront, lane and Prear, lane is less than the threshold, and the other is greater than or equal to the threshold, it indicates that one piece of road information is untrusted, and the other piece of road information is trusted. In this case, the trusted road information is directly used.
  • If both Pfront, lane and Prear, lane are greater than or equal to the threshold, it indicates that detection information from both sources has a certain degree of credibility, and the following example method may be used.
  • Pfusion = n/(1/Pfront, lane + 1/Prear, lane), where n = 2.
  • Then, the following correspondence is solved:
  • Xfusion/Pfusion = Xfront, lane/Pfront, lane + Xrear, lane/Prear, lane,
  • where Xfusion is a horizontal offset obtained after fusion.
  • Because both sides are cubic polynomials in Z, equating the coefficients of each power of Z yields:
  • Cifusion = Pfusion × (Cifront, lane/Pfront, lane + Cirear, lane/Prear, lane), where i is an integer from 0 to 3.
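  • As a concrete illustration of the example method above, the following Python sketch (the function name, data layout, and default threshold are assumptions for illustration, not the application's implementation) applies the threshold check, combines the two existence probabilities, and solves the correspondence by matching the coefficients of each power of Z:

    def fuse_lane_lines(front, p_front, rear, p_rear, threshold=0.5):
        # front, rear: coefficient tuples (C3, C2, C1, C0); p_*: existence probabilities.
        trusted = [(c, p) for c, p in ((front, p_front), (rear, p_rear)) if p >= threshold]
        if not trusted:
            return None, 0.0   # both sources untrusted: the road information is invalid
        if len(trusted) == 1:
            return trusted[0]  # exactly one trusted source: use it directly
        n = len(trusted)       # n = 2 when both sources are trusted
        p_fusion = n / sum(1.0 / p for _, p in trusted)
        # X_fusion / P_fusion = sum(X_i / P_i) holds for every Z, so each coefficient
        # satisfies Ci_fusion = P_fusion * sum(Ci / P_i).
        c_fusion = tuple(p_fusion * sum(c[i] / p for c, p in trusted) for i in range(4))
        return c_fusion, p_fusion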
  • 306: The road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 305, between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle. An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • 2. The road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, front-looking lane structure information, and rear-looking lane structure information, to obtain the target lane line information.
  • Referring to FIG. 4, another embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • 401: A road information detection apparatus receives a front-looking image.
  • 402: The road information detection apparatus receives a rear-looking image.
  • 403: The road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • 404: The road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 401 to 404 in this embodiment are similar to steps 301 to 304 in the foregoing embodiment shown in FIG. 3, and details are not described herein again.
  • 405: The road information detection apparatus obtains front-looking lane structure information based on the front-looking image.
  • The road information detection apparatus analyzes the front-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the front-looking lane structure information. A technology is not limited herein.
  • 406: The road information detection apparatus obtains rear-looking lane structure information based on the rear-looking image.
  • The road information detection apparatus analyzes the rear-looking image by using a technology such as an image detection technology or a visual recognition technology, to obtain the rear-looking lane structure information. A technology is not limited herein.
  • In this embodiment, there is no fixed time sequence relationship between steps 401, 403, and 405 and steps 402, 404, and 406.
  • 407: The road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain target lane line information.
  • Information obtained by using the front-looking image is referred to as front-looking information, such as the front-looking lane line information and the front-looking lane structure information, and information obtained by using the rear-looking image is referred to as rear-looking information, such as the rear-looking lane line information and the rear-looking lane structure information.
  • The road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus, such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line by using the front-looking image.
  • The road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the image obtained by the camera apparatus. The vehicle track information is track information of an observed vehicle, that is, a vehicle that can be photographed by a camera, and is obtained based on a plurality of images. Traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, whether the observed vehicle is parking, traveling forward, or turning. A road edge is generally a boundary stone at the edge of a road and is generally parallel to the lane line. The road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain road edge information such as a type of the road edge, for example, a stone road edge or a fence, or a height of the road edge.
  • In this embodiment, the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line. The rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • In this embodiment, the front-looking lane structure information may include at least one of front-looking road edge information and/or front-looking vehicle track information in the traveling process of the vehicle. The front-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge in the traveling process of the vehicle, a probability that the front-looking road edge exists, a type of the front-looking road edge, and/or a height of the front-looking road edge. The front-looking vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle, a probability that the front-looking vehicle track exists, and/or a status of a front-looking observed vehicle.
  • In this embodiment, the rear-looking lane structure information may include rear-looking road edge information. The rear-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge, a probability that the rear-looking road edge exists, a type of the rear-looking road edge, and/or a height of the rear-looking road edge.
  • In this embodiment, detailed descriptions are provided by using the following examples. The front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists. The rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists. The front-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track are/is used as an example of the correspondence of the front-looking lane structure, and the probability that the front-looking road edge exists and/or the probability that the front-looking vehicle track exists are/is used as an example of the probability that the front-looking lane structure exists. The rear-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure and a probability that the rear-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge is used as an example of the correspondence of the rear-looking lane structure, and the probability that the rear-looking road edge exists is used as an example of the probability that the rear-looking lane structure exists.
  • Based on image information collected by a front-looking camera apparatus, the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:

  • Xfront,lane = C3front,lane × Z^3 + C2front,lane × Z^2 + C1front,lane × Z + C0front,lane.
  • In this embodiment of this application, C0front, lane is a horizontal offset of a road edge of the front-looking lane line, C1front, lane is a heading angle of the road edge of the front-looking lane line, C2front, lane is twice average curvature of the road edge of the front-looking lane line, C3front, lane is six times an average curvature change rate of the road edge of the front-looking lane line, Xfront, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking lane line exists may be further obtained through detection, and is represented as Pfront, lane. The probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • Based on image information collected by a rear-looking camera apparatus, the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:

  • Xrear,lane = C3rear,lane × Z^3 + C2rear,lane × Z^2 + C1rear,lane × Z + C0rear,lane.
  • In this embodiment of this application, C0rear, lane is a horizontal offset of a road edge of the rear-looking lane line, C1rear, lane is a heading angle of the road edge of the rear-looking lane line, C2rear, lane is twice average curvature of the road edge of the rear-looking lane line, C3rear, lane is six times an average curvature change rate of the road edge of the rear-looking lane line, Xrear, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as Prear, lane. The probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • Based on image information collected by the front-looking camera apparatus, the front-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained. The road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge may be represented as follows:

  • Xfront,edge = C3front,edge × Z^3 + C2front,edge × Z^2 + C1front,edge × Z + C0front,edge.
  • In this embodiment of this application, C0front, edge is a horizontal offset of a road edge related to the front-looking road edge, C1front, edge is a heading angle of the road edge related to the front-looking road edge, C2front, edge is twice average curvature of the road edge related to the front-looking road edge, C3front, edge is six times an average curvature change rate of the road edge related to the front-looking road edge, Xfront, edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking road edge exists may be further obtained through detection, and is represented as Pfront, edge. The probability that the front-looking road edge exists is used to indicate a probability that the road edge exists in the front-looking image.
  • Based on the image information collected by the front-looking camera apparatus, a front-looking observed vehicle is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding vehicle track information is obtained. The vehicle track may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. A correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an ith observed vehicle in vehicle tracks collected by the front-looking camera apparatus in the traveling process of the vehicle may be represented as follows:

  • Xfront,car(i) = C3front,car(i) × Z^3 + C2front,car(i) × Z^2 + C1front,car(i) × Z + C0front,car(i).
  • In this embodiment of this application, C0front, car(i) is a horizontal offset of a road edge related to the vehicle track of the ith observed vehicle, C1front, car(i) is a heading angle of the road edge related to the vehicle track of the ith observed vehicle, C2front, car(i) is twice average curvature of the road edge related to the vehicle track of the ith observed vehicle, C3front, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the ith observed vehicle, Xfront, car(i) is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • A probability that the vehicle track of the ith observed vehicle exists may be represented as Pfront, car(i).
  • In an optional design, vehicle track information of each observed vehicle may be processed to obtain the front-looking vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle may be represented as follows:

  • Xfront,car = C3front,car × Z^3 + C2front,car × Z^2 + C1front,car × Z + C0front,car.
  • In this embodiment of this application, in parameters of the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track in the traveling process of the vehicle, C0front, car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle, C1front, car is a heading angle of the road edge related to the vehicle track of the observed vehicle, C2front, car is twice average curvature of the road edge related to the vehicle track of the observed vehicle, C3front, car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle, Xfront, car is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • An example implementation may be as follows. Vehicle track information of the observed vehicle whose vehicle track has the highest existence probability is selected as the front-looking vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the front-looking vehicle track information. This is not limited herein.
  • A probability that the front-looking vehicle track exists may be represented as Pfront, car, and the probability that the front-looking vehicle track exists is used to indicate a probability that the vehicle track exists in the front-looking image.
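  • Both processing options above (selecting the most probable track, or weighting and averaging all tracks) can be sketched in Python as follows; the helper and its weighting scheme are assumptions for illustration, since the application does not prescribe either form:

    def aggregate_tracks(tracks, use_weighted_average=False):
        # tracks: list of ((C3, C2, C1, C0), probability), one entry per observed vehicle.
        if use_weighted_average:
            # Probability-weighted average of the coefficients of all observed tracks.
            total_p = sum(p for _, p in tracks)
            coeffs = tuple(sum(c[i] * p for c, p in tracks) / total_p for i in range(4))
            return coeffs, max(p for _, p in tracks)  # one possible choice of fused probability
        # Otherwise select the track with the highest existence probability.
        return max(tracks, key=lambda t: t[1])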
  • Based on the image information collected by the rear-looking camera apparatus, the rear-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained. The road edge may be represented by using a curve of a specific road structure model. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge may be represented as follows:

  • Xrear,edge = C3rear,edge × Z^3 + C2rear,edge × Z^2 + C1rear,edge × Z + C0rear,edge.
  • In this embodiment of this application, C0rear,edge is a horizontal offset of a road edge related to the rear-looking road edge, C1rear,edge is a heading angle of the road edge related to the rear-looking road edge, C2rear,edge is twice average curvature of the road edge related to the rear-looking road edge, C3rear,edge is six times an average curvature change rate of the road edge related to the rear-looking road edge, Xrear,edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking road edge exists may be further obtained through detection, and is represented as Prear,edge. The probability that the rear-looking road edge exists is used to indicate a probability that the road edge exists in the rear-looking image.
  • In this embodiment, the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • In this embodiment, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information. The fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm. In this embodiment, a probability-based fusion algorithm is used as an example for a description.
  • In an optional design, a correspondence between a horizontal offset and a longitudinal offset of a target lane line obtained after fusion may be represented as follows:

  • Xfusion = C3fusion × Z^3 + C2fusion × Z^2 + C1fusion × Z + C0fusion,
  • where a probability that a lane line exists is Pfusion, Xfusion is a horizontal offset of the vehicle obtained after fusion, and Z is a longitudinal offset of the vehicle.
  • In this embodiment of this application, C0fusion is a horizontal offset of a road edge of the lane line obtained after fusion, C1fusion is a heading angle of the road edge of the lane line obtained after fusion, C2fusion is twice average curvature of the road edge of the lane line obtained after fusion, and C3fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • The road information detection apparatus obtains (C3fusion, C2fusion, C1fusion, C0fusion), Pfusion based on (C3front, lane, C2front, lane, C1front, lane, C0front, lane), Pfront, lane; (C3rear, lane, C2rear, lane, C1rear, lane, C0rear, lane), Prear, lane; (C3front, edge, C2front, edge, C1front, edge, C0front, edge), Pfront, edge; (C3front, car, C2front, car, C1front, car, C0front, car), Pfront, car; and (C3rear, edge, C2rear, edge, C1rear, edge, C0rear, edge), Prear, edge.
  • In an optional manner, an example manner is as follows.
  • A threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if Pfront, lane is less than the threshold, C3front, lane, C2front, lane, C1front, lane, and C0front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted. For example, the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • In this embodiment of this application, all probabilities may be compared with a same threshold, or different thresholds may be used. For example, the probability Pfront, lane that the front-looking lane line information exists is compared with a first threshold, and the probability Prear, lane that the rear-looking lane line information exists is compared with a second threshold. The first threshold and the second threshold are unequal. For example, the first threshold is 0.6, and the second threshold is 0.7.
  • If probabilities of all collected road information are less than the specified threshold, it indicates that the road information collected this time is invalid.
  • If a probability of only one piece of collected road information is greater than or equal to the threshold, it indicates that road information corresponding to a probability higher than the threshold is trusted, and other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold is selected from all the foregoing road detection information, and the following example method is used.
  • Pfusion = n/(1/Pfront, lane + 1/Prear, lane + 1/Pfront, edge + 1/Pfront, car + 1/Prear, edge), where n = 5.
  • Then, the following correspondence is solved:
  • Xfusion/Pfusion = Xfront, lane/Pfront, lane + Xrear, lane/Prear, lane + Xfront, edge/Pfront, edge + Xfront, car/Pfront, car + Xrear, edge/Prear, edge, where Xfusion is a horizontal offset obtained after fusion.
  • Equating the coefficients of each power of Z on both sides yields Cifusion, where i is an integer from 0 to 3.
  • It may be understood that the threshold may not be set in actual application. For example, the road information detection apparatus determines that all parameters are valid parameters.
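  • The same probability-based rule generalizes to any number of information sources; with a single trusted source it reduces to using that source directly. The following Python sketch (a generalization under the same assumptions as the earlier two-source sketch; the coefficient tuples and probabilities in the call are placeholders) filters out sources below the threshold and fuses the remainder, with n equal to the number of trusted sources:

    def fuse_sources(sources, threshold=0.5):
        # sources: list of ((C3, C2, C1, C0), probability), one entry per detector.
        trusted = [(c, p) for c, p in sources if p >= threshold]
        if not trusted:
            return None, 0.0  # all road information collected this time is invalid
        n = len(trusted)      # n = 5 when all five sources below are trusted
        p_fusion = n / sum(1.0 / p for _, p in trusted)
        c_fusion = tuple(p_fusion * sum(c[i] / p for c, p in trusted) for i in range(4))
        return c_fusion, p_fusion

    coeffs, p = fuse_sources([
        ((1e-6, 5e-4, 0.010, 1.80), 0.9),  # front-looking lane line
        ((9e-7, 4e-4, 0.012, 1.75), 0.8),  # rear-looking lane line
        ((1e-6, 5e-4, 0.011, 1.95), 0.7),  # front-looking road edge
        ((1e-6, 5e-4, 0.009, 1.70), 0.6),  # front-looking vehicle track
        ((1e-6, 4e-4, 0.010, 1.90), 0.7),  # rear-looking road edge
    ])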
  • 408: The road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 407, between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle. An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • 3. The road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, and radar lane structure information, to obtain the target lane line information.
  • Referring to FIG. 5, another embodiment of a road information detection method in an embodiment of this application includes the following steps.
  • 501: A road information detection apparatus receives a front-looking image.
  • 502: The road information detection apparatus receives a rear-looking image.
  • Steps 501 and 502 in this embodiment are similar to steps 301 and 302 in the foregoing embodiment shown in FIG. 3, and details are not described herein again.
  • 503: The road information detection apparatus receives radar data.
  • The road information detection apparatus is connected to a millimeter-wave radar. The connection may be a wired connection or a wireless connection, and this is not limited herein. If the connection is a wired connection, the road information detection apparatus may receive, by using a data transmission line, radar data detected by the millimeter-wave radar. If the connection is a wireless connection, the road information detection apparatus may receive, by using a wireless network, radar data detected by the millimeter-wave radar. The wireless network may be a public network wireless network or a dedicated network wireless network, and this is not limited herein.
  • 504: The road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • 505: The road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 504 and 505 in this embodiment are similar to steps 303 and 304 in the foregoing embodiment shown in FIG. 3, and details are not described herein again.
  • 506: The road information detection apparatus obtains radar lane structure information based on the radar data.
  • The road information detection apparatus obtains the radar lane structure information based on the radar data by using a method such as a clustering algorithm, a graph model method, Hough transform, or machine learning.
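  • As one hedged illustration of how such a method might produce the road structure model used below, the Python sketch that follows (the function, the use of numpy, and the synthetic data are assumptions, not the application's algorithm) fits the cubic model to radar returns presumed to lie on a road edge:

    import numpy as np

    def fit_road_edge(z, x):
        # Fit X = C3*Z^3 + C2*Z^2 + C1*Z + C0 to radar edge detections,
        # where z and x are the longitudinal and horizontal offsets of the returns.
        c3, c2, c1, c0 = np.polyfit(z, x, deg=3)
        return c3, c2, c1, c0

    # Synthetic returns along a gently curving edge (illustrative only).
    z = np.linspace(0.0, 60.0, 30)
    x = 1.8 + 0.01 * z + 5e-4 * z**2 + np.random.normal(0.0, 0.05, z.size)
    print(fit_road_edge(z, x))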
  • In this embodiment, there is no fixed time sequence relationship between steps 501 and 504, steps 502 and 505, and steps 503 and 506.
  • 507: The road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain target lane line information.
  • Information obtained by the road information detection apparatus by using the front-looking image is referred to as front-looking information, such as the front-looking lane line information, and information obtained by using the rear-looking image is referred to as rear-looking information, such as the rear-looking lane line information. The road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus, such as a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle and a probability that the lane line exists, and may further obtain information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line by using the front-looking image.
  • Information obtained by the road information detection apparatus by using the radar data is referred to as radar information, such as radar lane line information. The road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the radar data obtained by the millimeter-wave radar. The vehicle track information is track information of an observed vehicle, that is, a vehicle whose traveling track can be detected by the millimeter-wave radar, and is obtained based on radar data collected a plurality of times. Traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, whether the observed vehicle is parking, traveling forward, or turning. A road edge is generally a boundary stone at the edge of a road and is generally parallel to the lane line. The road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain road edge information such as a type of the road edge, for example, a stone road edge or a fence, or a height of the road edge.
  • In this embodiment, the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line. The rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • In this embodiment, the radar lane structure information may include at least one of radar vehicle track information and/or radar road edge information in the traveling process of the vehicle, and the radar vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle, a probability that the radar vehicle track exists, and/or a status of a radar observed vehicle. The radar road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar road edge in the traveling process of the vehicle, a probability that the radar road edge exists, a type of the radar road edge, and/or a height of the radar road edge.
  • In this embodiment, detailed descriptions are provided by using the following examples. The front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists. The rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists. The radar lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the radar road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the radar vehicle track are/is used as an example of the correspondence of the radar lane structure, and the probability that the radar road edge exists and/or the probability that the radar vehicle track exists are/is used as an example of the probability that the radar lane structure exists.
  • Based on image information collected by the front-looking camera apparatus, the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:

  • Xfront,lane = C3front,lane × Z^3 + C2front,lane × Z^2 + C1front,lane × Z + C0front,lane.
  • In this embodiment of this application, C0front, lane is a horizontal offset of a road edge of the front-looking lane line, C1front, lane is a heading angle of the road edge of the front-looking lane line, C2front, lane is twice average curvature of the road edge of the front-looking lane line, C3front, lane is six times an average curvature change rate of the road edge of the front-looking lane line, Xfront, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking lane line exists may be further obtained through detection, and is represented as Pfront, lane. The probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
  • Based on image information collected by the rear-looking camera apparatus, the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:

  • Xrear,lane = C3rear,lane × Z^3 + C2rear,lane × Z^2 + C1rear,lane × Z + C0rear,lane.
  • In this embodiment of this application, C0rear, lane is a horizontal offset of a road edge of the rear-looking lane line, C1rear, lane is a heading angle of the road edge of the rear-looking lane line, C2rear, lane is twice average curvature of the road edge of the rear-looking lane line, C3rear, lane is six times an average curvature change rate of the road edge of the rear-looking lane line, Xrear, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as Prear, lane. The probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • Based on the radar information collected by the millimeter-wave radar, the radar road edge is detected based on a road structure model by using a method such as deep learning, and corresponding road edge information is obtained. The road edge may be represented by using a curve of a specific road structure model. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle, may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the radar road edge may be represented as follows:

  • Xradar,edge = C3radar,edge × Z^3 + C2radar,edge × Z^2 + C1radar,edge × Z + C0radar,edge.
  • In this embodiment of this application, C0radar, edge is a horizontal offset of a road edge obtained based on the radar road edge information, C1radar, edge is a heading angle of the road edge obtained based on the radar road edge information, C2radar, edge is twice average curvature of the road edge obtained based on the radar road edge information, C3radar, edge is six times an average curvature change rate of the road edge obtained based on the radar road edge information, Xradar, edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the radar road edge exists may be further obtained through detection, and is represented as Pradar, edge. The probability that the radar road edge exists is used to indicate a probability that a road edge detected by the millimeter-wave radar exists.
  • Based on the radar information collected by the millimeter-wave radar, track information of one or more observed vehicles may be analyzed based on a road structure model by using a method such as deep learning, and a vehicle track may be represented by using a curve of a specific road structure model. Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. A correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an ith observed vehicle in vehicle tracks collected by the millimeter-wave radar in the traveling process of the vehicle may be represented as follows:

  • Xradar,car(i) = C3radar,car(i) × Z^3 + C2radar,car(i) × Z^2 + C1radar,car(i) × Z + C0radar,car(i).
  • In this embodiment of this application, C0radar, car(i) is a horizontal offset of a road edge related to the vehicle track of the ith observed vehicle, C1radar, car(i) is a heading angle of the road edge related to the vehicle track of the ith observed vehicle, C2radar, car(i) is twice average curvature of the road edge related to the vehicle track of the ith observed vehicle, C3radar, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the ith observed vehicle, Xradar, car(i) is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • A probability that the vehicle track of the ith observed vehicle exists may be represented as Pradar, car(i).
  • In an optional design, vehicle track information of each observed vehicle may be processed to obtain radar vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle may be represented as follows:

  • Xradar,car = C3radar,car × Z^3 + C2radar,car × Z^2 + C1radar,car × Z + C0radar,car.
  • In this embodiment of this application, in parameters of the correspondence between the horizontal offset and the longitudinal offset of the radar vehicle track in the traveling process of the vehicle, C0radar, car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle, C1radar, car is a heading angle of the road edge related to the vehicle track of the observed vehicle, C2radar, car is twice average curvature of the road edge related to the vehicle track of the observed vehicle, C3radar, car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle, Xradar, car is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • An example implementation may be as follows. Vehicle track information of the observed vehicle whose vehicle track has the highest existence probability is selected as the radar vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the radar vehicle track information. This is not limited herein.
  • A probability that the radar vehicle track exists may be represented as Pradar, car, and the probability that the radar vehicle track exists is used to indicate a probability that a vehicle track detected by the millimeter-wave radar exists.
  • In this embodiment, the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • In this embodiment, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information. The fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm. In this embodiment, a probability-based fusion algorithm is used as an example for a description.
  • In an optional design, a correspondence of a target lane line obtained after fusion may be represented as follows:

  • Xfusion = C3fusion × Z^3 + C2fusion × Z^2 + C1fusion × Z + C0fusion,
  • where a probability that a lane line exists is Pfusion, Xfusion is a horizontal offset of the vehicle obtained after fusion, and Z is a longitudinal offset of the vehicle.
  • In this embodiment of this application, C0fusion is a horizontal offset of a road edge of the lane line obtained after fusion, C1fusion is a heading angle of the road edge of the lane line obtained after fusion, C2fusion is twice average curvature of the road edge of the lane line obtained after fusion, and C3fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • The road information detection apparatus obtains (C3fusion, C2fusion, C1fusion, C0fusion), Pfusion based on (C3front, lane, C2front, lane, C1front, lane, C0front, lane), Pfront, lane; (C3rear, lane, C2rear, lane, C1rear, lane, C0rear, lane), Prear, lane; (C3radar, car, C2radar, car, C1radar, car, C0radar, car), Pradar, car; and (C3radar, edge, C2radar, edge, C1radar, edge, C0radar, edge), Pradar, edge.
  • In an optional manner, an example manner is as follows.
  • A threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if Pfront, lane is less than the threshold, C3front, lane, C2front, lane, C1front, lane, and C0front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted. For example, the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • In this embodiment of this application, all probabilities may be compared with a same threshold, or different thresholds may be used. For example, the probability Pfront, lane that the front-looking lane line information exists is compared with a first threshold, and the probability Prear, lane that the rear-looking lane line information exists is compared with a second threshold. The first threshold and the second threshold are unequal. For example, the first threshold is 0.6, and the second threshold is 0.7.
  • If probabilities of all collected road information are less than the specified threshold, it indicates that the road information collected this time is invalid.
  • If a probability of only one piece of collected road information is greater than or equal to the threshold, it indicates that road information corresponding to a probability higher than the threshold is trusted, and other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold is selected from all the foregoing road detection information, and the following example method is used.
  • Pfusion = n/(1/Pfront, lane + 1/Prear, lane + 1/Pradar, car + 1/Pradar, edge), where n = 4.
  • Then, the following correspondence is solved:
  • Xfusion/Pfusion = Xfront, lane/Pfront, lane + Xrear, lane/Prear, lane + Xradar, car/Pradar, car + Xradar, edge/Pradar, edge, where Xfusion is a horizontal offset obtained after fusion.
  • Equating the coefficients of each power of Z on both sides yields Cifusion, where i is an integer from 0 to 3.
  • It may be understood that the threshold may not be set in actual application. For example, the road information detection apparatus determines that all parameters are valid parameters.
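  • Under the same assumptions as the fuse_sources sketch shown in the embodiment of FIG. 4, the four sources of this embodiment would be fused as follows (the coefficient tuples and probabilities are placeholders):

    # Reuses the hypothetical fuse_sources helper defined earlier; here n = 4
    # when all four sources pass the threshold.
    coeffs, p = fuse_sources([
        ((1e-6, 5e-4, 0.010, 1.80), 0.9),  # front-looking lane line
        ((9e-7, 4e-4, 0.012, 1.75), 0.8),  # rear-looking lane line
        ((1e-6, 5e-4, 0.009, 1.70), 0.7),  # radar vehicle track
        ((1e-6, 4e-4, 0.010, 1.90), 0.6),  # radar road edge
    ])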
  • 508: The road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 507, between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle. An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • 4. The road information detection apparatus fuses front-looking lane line information, rear-looking lane line information, front-looking lane structure information, rear-looking lane structure information, and radar lane structure information, to obtain the target lane line information.
  • Referring to FIG. 6, another embodiment of a road information detection method in an embodiment of this application includes the following steps:
  • 601: A road information detection apparatus receives a front-looking image.
  • 602: The road information detection apparatus receives a rear-looking image.
  • 603: The road information detection apparatus receives radar data.
  • 604: The road information detection apparatus obtains front-looking lane line information based on the front-looking image.
  • 605: The road information detection apparatus obtains rear-looking lane line information based on the rear-looking image.
  • Steps 601 to 605 in this embodiment are similar to steps 501 to 505 in the foregoing embodiment shown in FIG. 5, and details are not described herein again.
  • 606: The road information detection apparatus obtains front-looking lane structure information based on the front-looking image.
  • 607: The road information detection apparatus obtains rear-looking lane structure information based on the rear-looking image.
  • Steps 606 and 607 in this embodiment are similar to steps 405 and 406 in the foregoing embodiment shown in FIG. 4, and details are not described herein again.
  • 608: The road information detection apparatus obtains radar lane structure information based on the radar data.
  • Step 608 in this embodiment is similar to step 506 in the foregoing embodiment shown in FIG. 5, and details are not described herein again.
  • In this embodiment, there is no fixed time sequence relationship among steps 601, 604, and 606; steps 602, 605, and 607; and steps 603 and 608.
  • 609: The road information detection apparatus fuses the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain target lane line information.
  • Information obtained by the road information detection apparatus by using the front-looking image is referred to as front-looking information, such as the front-looking lane line information and the front-looking lane structure information; information obtained by using the rear-looking image is referred to as rear-looking information, such as the rear-looking lane line information and the rear-looking lane structure information; and information obtained by using the radar data is referred to as radar information, such as the radar lane line information and the radar lane structure information.
  • The road information detection apparatus may obtain lane line information based on an image obtained by a camera apparatus or radar data obtained by a millimeter-wave radar, for example, a correspondence between a horizontal offset and a longitudinal offset of a lane line that changes with traveling of a vehicle, a probability that the lane line exists, and information such as a width of a lane, a type of the lane line such as a one-way line or a deceleration line, color of the lane line such as yellow or white, and a width of the lane line.
  • The road information detection apparatus may further obtain lane structure information such as vehicle track information or road edge information based on the image obtained by the camera apparatus or the radar data obtained by the millimeter-wave radar. The vehicle track information is traveling track information of an observed vehicle, and traveling tracks of a plurality of observed vehicles are fused to obtain a correspondence between a horizontal offset and a longitudinal offset of a vehicle track, a probability that the vehicle track exists, and a status of the observed vehicle, for example, whether the observed vehicle is parking, traveling forward, or turning. A road edge is generally a boundary stone at an edge of a road, and is generally parallel to the lane line. The road information detection apparatus may obtain a correspondence between a horizontal offset and a longitudinal offset of the road edge that changes with traveling of the vehicle and a probability that the road edge exists, and may further obtain road edge information such as a type of the road edge, for example, a stone road edge or a fence, or a height of the road edge.
  • In this embodiment, the front-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line in a traveling process of the vehicle, a probability that the front-looking lane line exists, a width of a front-looking lane, a type of the front-looking lane line, color of the front-looking lane line, and/or a width of the front-looking lane line. The rear-looking lane line information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line in the traveling process of the vehicle, a probability that the rear-looking lane line exists, a width of a rear-looking lane, a type of the rear-looking lane line, color of the rear-looking lane line, and/or a width of the rear-looking lane line.
  • In this embodiment, the front-looking lane structure information may include at least one of front-looking road edge information and/or front-looking vehicle track information in the traveling process of the vehicle. The front-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking road edge in the traveling process of the vehicle, a probability that the front-looking road edge exists, a type of the front-looking road edge, and/or a height of the front-looking road edge. The front-looking vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle, a probability that the front-looking vehicle track exists, and/or a status of a front-looking observed vehicle.
  • In this embodiment, the rear-looking lane structure information may include rear-looking road edge information. The rear-looking road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a rear-looking road edge in the traveling process of the vehicle, a probability that the rear-looking road edge exists, a type of the rear-looking road edge, and/or a height of the rear-looking road edge.
  • In this embodiment, the radar lane structure information may include at least one of radar vehicle track information and/or radar road edge information in the traveling process of the vehicle, and the radar vehicle track information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track in the traveling process of the vehicle, a probability that the radar vehicle track exists, and/or a status of a radar observed vehicle. The radar road edge information may include at least one of a correspondence between a horizontal offset and a longitudinal offset of a radar road edge in the traveling process of the vehicle, a probability that the radar road edge exists, a type of the radar road edge, and/or a height of the radar road edge.
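  • For readability only, the information items enumerated above may be sketched as the following illustrative data structures (in Python); all class and field names are assumptions and are not part of this application:

    # Illustrative containers for the lane line, road edge, and vehicle track
    # information described above; field names are assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Curve:
        """Correspondence between a horizontal offset X and a longitudinal
        offset Z: X = C3*Z^3 + C2*Z^2 + C1*Z + C0."""
        c3: float
        c2: float
        c1: float
        c0: float
        probability: float  # probability that the line/edge/track exists

    @dataclass
    class LaneLineInfo:
        curve: Curve
        lane_width: Optional[float] = None
        line_type: Optional[str] = None    # e.g. one-way line, deceleration line
        color: Optional[str] = None        # e.g. yellow, white
        line_width: Optional[float] = None

    @dataclass
    class RoadEdgeInfo:
        curve: Curve
        edge_type: Optional[str] = None    # e.g. stone road edge, fence
        height: Optional[float] = None

    @dataclass
    class VehicleTrackInfo:
        curve: Curve
        vehicle_status: Optional[str] = None  # e.g. parking, forward, turning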
  • In this embodiment, detailed descriptions are provided by using the following examples. The front-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line in the traveling process of the vehicle and the probability that the front-looking lane line exists. The rear-looking lane line information is the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line in the traveling process of the vehicle and the probability that the rear-looking lane line exists. The front-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure in the traveling process of the vehicle and a probability that the front-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track are/is used as an example of the correspondence of the front-looking lane structure, and the probability that the front-looking road edge exists and/or the probability that the front-looking vehicle track exists are/is used as an example of the probability that the front-looking lane structure exists. The rear-looking lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure in the traveling process of the vehicle and a probability that the rear-looking lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge is used as an example of the correspondence of the rear-looking lane structure, and the probability that the rear-looking road edge exists is used as an example of the probability that the rear-looking lane structure exists. The radar lane structure information is a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure in the traveling process of the vehicle and a probability that the radar lane structure exists, where the correspondence between the horizontal offset and the longitudinal offset of the radar road edge and/or the correspondence between the horizontal offset and the longitudinal offset of the radar vehicle track are/is used as an example of the correspondence of the radar lane structure, and the probability that the radar road edge exists and/or the probability that the radar vehicle track exists are/is used as an example of the probability that the radar lane structure exists.
  • Based on image information collected by a front-looking camera apparatus, the front-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking lane line may be represented as follows:

  • Xfront, lane = C3front, lane × Z^3 + C2front, lane × Z^2 + C1front, lane × Z + C0front, lane.
  • In this embodiment of this application, C0front, lane is a horizontal offset of a road edge of the front-looking lane line, C1front, lane is a heading angle of the road edge of the front-looking lane line, C2front, lane is twice average curvature of the road edge of the front-looking lane line, C3front, lane is six times an average curvature change rate of the road edge of the front-looking lane line, Xfront, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking lane line exists may be further obtained through detection, and is represented as Pfront, lane. The probability that the front-looking lane line exists is used to indicate a probability that the lane line exists in the front-looking image.
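  • As an illustration only, the cubic road structure model above can be evaluated at a given longitudinal offset as follows (the function name and the sample coefficients are assumptions):

    # Evaluate X = C3*Z^3 + C2*Z^2 + C1*Z + C0 in Horner form.
    def lateral_offset(c3, c2, c1, c0, z):
        return ((c3 * z + c2) * z + c1) * z + c0

    # e.g. horizontal offset of a front-looking lane line 20 m ahead
    # (made-up coefficients):
    x = lateral_offset(c3=1e-5, c2=2e-4, c1=0.01, c0=1.8, z=20.0)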
  • Based on image information collected by a rear-looking camera apparatus, the rear-looking lane line is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane line information is obtained. The lane line may be represented by using a curve of a specific road structure model. Different lane line representations may include a horizontal offset of a lane edge, a heading angle of the lane edge, average curvature of the lane line, and an average curvature change rate of the lane line. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking lane line may be represented as follows:

  • Xrear, lane = C3rear, lane × Z^3 + C2rear, lane × Z^2 + C1rear, lane × Z + C0rear, lane.
  • In this embodiment of this application, C0rear, lane is a horizontal offset of a road edge of the rear-looking lane line, C1rear, lane is a heading angle of the road edge of the rear-looking lane line, C2rear, lane is twice average curvature of the road edge of the rear-looking lane line, C3rear, lane is six times an average curvature change rate of the road edge of the rear-looking lane line, Xrear, lane is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking lane line exists may be further obtained through detection, and is represented as Prear, lane. The probability that the rear-looking lane line exists is used to indicate a probability that the lane line exists in the rear-looking image.
  • Based on the image information collected by the front-looking camera apparatus, the front-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding road edge information is obtained. The road edge may be represented by using a curve of a specific road structure model. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the front-looking road edge may be represented as follows:

  • Xfront, edge = C3front, edge × Z^3 + C2front, edge × Z^2 + C1front, edge × Z + C0front, edge.
  • In this embodiment of this application, C0front, edge is a horizontal offset of a road edge related to the front-looking road edge, C1front, edge is a heading angle of the road edge related to the front-looking road edge, C2front, edge is twice average curvature of the road edge related to the front-looking road edge, C3front, edge is six times an average curvature change rate of the road edge related to the front-looking road edge, Xfront, edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the front-looking road edge exists may be further obtained through detection, and is represented as Pfront, edge. The probability that the front-looking road edge exists is used to indicate a probability that the road edge exists in the front-looking image.
  • Based on the image information collected by the front-looking camera apparatus, a front-looking observed vehicle is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding vehicle track information is obtained. A vehicle track may be represented by using a curve of a specific road structure model. Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. A correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an ith observed vehicle in vehicle tracks collected by the front-looking camera apparatus in the traveling process of the vehicle may be represented as follows:

  • Xfront, car(i) = C3front, car(i) × Z^3 + C2front, car(i) × Z^2 + C1front, car(i) × Z + C0front, car(i).
  • In this embodiment of this application, C0front, car(i) is a horizontal offset of a road edge related to the vehicle track of the ith observed vehicle, C1front, car(i) is a heading angle of the road edge related to the vehicle track of the ith observed vehicle, C2front, car(i) is twice average curvature of the road edge related to the vehicle track of the ith observed vehicle, C3front, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the ith observed vehicle, Xfront, car(i) is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • A probability that the vehicle track of the ith observed vehicle exists may be represented as Pfront, car(i).
  • In an optional design, vehicle track information of each observed vehicle may be processed to obtain the front-looking vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a front-looking vehicle track in the traveling process of the vehicle may be represented as follows:

  • Xfront, car = C3front, car × Z^3 + C2front, car × Z^2 + C1front, car × Z + C0front, car.
  • In this embodiment of this application, in parameters of the correspondence between the horizontal offset and the longitudinal offset of the front-looking vehicle track, C0front, car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle, C1front, car is a heading angle of the road edge related to the vehicle track of the observed vehicle, C2front, car is twice average curvature of the road edge related to the vehicle track of the observed vehicle, C3front, car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle, Xfront, car is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • An implementation may be as follows: Vehicle track information corresponding to an observed vehicle with a highest existence probability of a vehicle track is selected as the front-looking vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the front-looking vehicle track information. This is not limited herein.
  • A probability that the front-looking vehicle track exists may be represented as Pfront, car, and the probability that the front-looking vehicle track exists is used to indicate a probability that the vehicle track exists in the front-looking image.
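  • The two options described above may be sketched as follows (illustrative Python; the exact weighting and the combined probability used in an actual implementation are assumptions and are not limited by this example):

    # Option 1: keep the track of the observed vehicle with the highest
    # existence probability. tracks: list of (P_front,car(i), [C3, C2, C1, C0]).
    def pick_most_probable(tracks):
        return max(tracks, key=lambda t: t[0])

    # Option 2: probability-weighted average of all observed vehicles'
    # coefficients; returning the mean probability is an assumption, since
    # the combined probability is not specified here.
    def weighted_average(tracks):
        total_p = sum(p for p, _ in tracks)
        coeffs = [sum(p * c[i] for p, c in tracks) / total_p for i in range(4)]
        return total_p / len(tracks), coeffs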
  • Based on the image information collected by the rear-looking camera apparatus, the rear-looking road edge is detected based on a road structure model by using a method such as image detection or deep learning, and corresponding lane edge information is obtained. The road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained.
  • In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the rear-looking road edge may be represented as follows:

  • Xrear, edge = C3rear, edge × Z^3 + C2rear, edge × Z^2 + C1rear, edge × Z + C0rear, edge.
  • In this embodiment of this application, C0rear, edge is a horizontal offset of a road edge related to the rear-looking road edge, C1rear, edge is a heading angle of the road edge related to the rear-looking road edge, C2rear, edge is twice average curvature of the road edge related to the rear-looking road edge, C3rear, edge is six times an average curvature change rate of the road edge related to the rear-looking road edge, Xrear, edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the rear-looking road edge exists may be further obtained through detection, and is represented as Prear, edge. The probability that the rear-looking road edge exists is used to indicate a probability that the road edge exists in the rear-looking image.
  • Based on the radar information collected by the millimeter-wave radar, the radar information is detected based on a road structure model by using a method such as deep learning, and corresponding road edge information is obtained. A road edge may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different road edge representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. In the traveling process of the vehicle, the correspondence between the horizontal offset and the longitudinal offset of the radar road edge may be represented as follows:

  • Xradar, edge = C3radar, edge × Z^3 + C2radar, edge × Z^2 + C1radar, edge × Z + C0radar, edge.
  • In this embodiment of this application, C0radar, edge is a horizontal offset of a road edge obtained based on the radar road edge information, C1radar, edge is a heading angle of the road edge obtained based on the radar road edge information, C2radar, edge is twice average curvature of the road edge obtained based on the radar road edge information, C3radar, edge is six times an average curvature change rate of the road edge obtained based on the radar road edge information, Xradar, edge is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In addition, the probability that the radar road edge exists may be further obtained through detection, and is represented as Pradar, edge. The probability that the radar road edge exists is used to indicate a probability that a road edge detected by the millimeter-wave radar exists.
  • Based on the radar information collected by the millimeter-wave radar, collected track information of one or more observed vehicles may be analyzed based on a road structure model by using a method such as deep learning, and may be represented by using a curve of a specific road structure model and a correspondence between a horizontal offset and a longitudinal offset. Different vehicle track representations may include a horizontal offset of a road edge, a heading angle of the road edge, average curvature of a road, and an average curvature change rate of the road. An optional formula representation method for the road structure model is provided below. However, this application is not limited thereto, and the road structure model may be determined or indicated by at least one of the foregoing parameters.
  • In an optional design, road curve information detected in the traveling process of the vehicle is fused, so that a road curve obtained after fusion, that is, a correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle may be obtained. A correspondence between a horizontal offset and a longitudinal offset of a vehicle track of an ith observed vehicle in vehicle tracks collected by the millimeter-wave radar in the traveling process of the vehicle may be represented as follows:

  • Xradar, car(i) = C3radar, car(i) × Z^3 + C2radar, car(i) × Z^2 + C1radar, car(i) × Z + C0radar, car(i).
  • In this embodiment of this application, C0radar, car(i) is a horizontal offset of a road edge related to the vehicle track of the ith observed vehicle, C1radar, car(i) is a heading angle of the road edge related to the vehicle track of the ith observed vehicle, C2radar, car(i) is twice average curvature of the road edge related to the vehicle track of the ith observed vehicle, C3radar, car(i) is six times an average curvature change rate of the road edge related to the vehicle track of the ith observed vehicle, Xradar, car(i) is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • A probability that the vehicle track of the ith observed vehicle exists may be represented as Pradar, car(i).
  • In an optional design, vehicle track information of each observed vehicle may be processed to obtain the radar vehicle track information, and a correspondence between a horizontal offset and a longitudinal offset of a radar vehicle track may be represented as follows:

  • Xradar, car = C3radar, car × Z^3 + C2radar, car × Z^2 + C1radar, car × Z + C0radar, car.
  • In this embodiment of this application, in parameters of the correspondence between the horizontal offset and the longitudinal offset of the radar vehicle track, C0radar,car is a horizontal offset of a road edge related to a vehicle track of an observed vehicle, C1radar,car is a heading angle of the road edge related to the vehicle track of the observed vehicle, C2radar,car is twice average curvature of the road edge related to the vehicle track of the observed vehicle, C3radar,car is six times an average curvature change rate of the road edge related to the vehicle track of the observed vehicle, Xradar,car is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • An implementation may be as follows: Radar vehicle track information corresponding to an observed vehicle with a highest existence probability of a vehicle track is selected as the radar vehicle track information, or vehicle track information of all observed vehicles may be weighted and averaged to obtain the radar vehicle track information. This is not limited herein.
  • A probability that the radar vehicle track exists may be represented as Pradar, car, and the probability that the radar vehicle track exists is used to indicate a probability that a vehicle track detected by the millimeter-wave radar exists.
  • In this embodiment, the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle is merely used as an example for description. It may be understood that in actual application, a correspondence may have another form, for example, an equivalent variant of the foregoing correspondence between a horizontal offset and a longitudinal offset in the traveling process of the vehicle. This is not limited herein.
  • In this embodiment, the road information detection apparatus may fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information by using at least one of a plurality of fusion algorithms, to obtain the target lane line information. The fusion algorithm may be a Bayesian fusion algorithm, or may be another fusion algorithm such as a multi-hypothesis check algorithm. In this embodiment, a probability-based fusion algorithm is used as an example for description.
  • In an optional design, a correspondence of a target lane line obtained after fusion may be represented as follows:

  • Xfusion = C3fusion × Z^3 + C2fusion × Z^2 + C1fusion × Z + C0fusion.
  • A probability that the target lane line exists is Pfusion, Xfusion is a horizontal offset of the vehicle, and Z is a longitudinal offset of the vehicle.
  • In this embodiment of this application, C0fusion is a horizontal offset of a road edge of the lane line obtained after fusion, C1fusion is a heading angle of the road edge of the lane line obtained after fusion, C2fusion is twice average curvature of the road edge of the lane line obtained after fusion, and C3fusion is six times an average curvature change rate of the road edge of the lane line obtained after fusion.
  • The road information detection apparatus obtains (C3fusion, C2fusion, C1fusion, C0fusion), Pfusion based on
  • (C3front, lane, C2front, lane, C1front, lane, C0front, lane), Pfront, lane; (C3rear, lane, C2rear, lane, C1rear, lane, C0rear, lane), Prear, lane; (C3front, edge, C2front, edge, C1front, edge, C0front, edge), Pfront, edge; (C3front, car, C2front, car, C1front, car, C0front, car), Pfront, car; (C3rear, edge, C2rear, edge, C1rear, edge, C0rear, edge), Prear, edge; (C3radar, car, C2radar, car, C1radar, car, C0radar, car), Pradar, car; and (C3radar, edge, C2radar, edge, C1radar, edge, C0radar, edge), Pradar, edge.
  • In an optional manner, an example is as follows.
  • A threshold may be first set. If a probability of road information collected by a specific collector is less than the threshold, the road information is considered as untrusted. For example, if Pfront, lane is less than the threshold, C3front, lane, C2front, lane, C1front, lane, and C0front, lane are considered as untrusted, and only road information whose probability is greater than or equal to the threshold is considered as trusted. For example, the threshold herein may be set to 0.5. It may be understood that the threshold may be another value such as 0.6, and this is not limited herein.
  • In this embodiment of this application, all probabilities may be compared with a same threshold, or different thresholds may be used. For example, the probability Pfront, lane that the front-looking lane line information exists is compared with a first threshold, and the probability Prear, lane that the rear-looking lane line information exists is compared with a second threshold. The first threshold and the second threshold are unequal. For example, the first threshold is 0.6, and the second threshold is 0.7.
  • If probabilities of all collected road information are less than the specified threshold, it indicates that the road information collected this time is invalid.
  • If a probability of only one piece of collected road information is greater than or equal to the threshold, it indicates that the road information corresponding to that probability is trusted, and other road information is untrusted. In this case, the trusted road information is directly used.
  • Detection information whose probability is greater than or equal to the threshold is selected from all the foregoing road detection information, and the following example method is used.
  • Pfusion = n/(1/Pfront, lane + 1/Prear, lane + 1/Pfront, edge + 1/Pfront, car + 1/Prear, edge + 1/Pradar, car + 1/Pradar, edge), where n = 7.
  • Then, this correspondence is solved.
  • Xfusion/Pfusion=Xfront, lane/Pfront, lane+Xrear, lane/Prear, lane+Xfront, edge/Pfront, edge+Xfront, car/Pfront, car+Xrear, edge/Prear, edge+Xradar, car/Pradar, car+Xradar, edge/Pradar, edge.
  • Xfusion is a horizontal offset obtained after fusion.
  • The following is obtained:
  • Cifusion, where i is an integer from 0 to 3.
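  • For illustration only, the threshold gating and the seven-source fusion described in this step may be sketched as follows (illustrative Python; the source names, the threshold value, and the return conventions are assumptions):

    # Sketch of the threshold gating plus fusion described above.
    def fuse_with_threshold(sources, threshold=0.5):
        """sources: dict mapping a source name to (probability, [C3, C2, C1, C0])."""
        trusted = {k: v for k, v in sources.items() if v[0] >= threshold}
        if not trusted:
            return None                          # all road information invalid
        if len(trusted) == 1:
            return next(iter(trusted.values()))  # use the single trusted source
        n = len(trusted)                         # n = 7 when all sources are trusted
        p_fusion = n / sum(1.0 / p for p, _ in trusted.values())
        c_fusion = [p_fusion * sum(c[i] / p for p, c in trusted.values())
                    for i in range(4)]
        return p_fusion, c_fusion

    # The seven sources of this embodiment would be, for example:
    # "front_lane", "rear_lane", "front_edge", "front_car",
    # "rear_edge", "radar_car", "radar_edge" (coefficients omitted here).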
  • It may be understood that the threshold may not be set in actual application. For example, the road information detection apparatus determines that all parameters are valid parameters.
  • 610: The road information detection apparatus outputs the target lane line information.
  • That the road information detection apparatus outputs the target lane line information includes: The road information detection apparatus outputs the correspondence, obtained in step 609, between the horizontal offset and the longitudinal offset of the target lane line in the traveling process of the vehicle. An output manner may be an image, a video, voice, or text. An output manner is not limited herein.
  • The foregoing describes the road information detection method in embodiments of this application, and the following describes a device in embodiments of this application. A road information detection apparatus may be deployed on a detection device in a form of software.
  • Referring to FIG. 7, an embodiment of a road information detection apparatus in embodiments of this application includes a receiving unit 701, an obtaining unit 702, a fusion unit 703, and an output unit 704.
  • The receiving unit 701 is configured to receive a front-looking image from at least one front-looking camera apparatus and a rear-looking image from at least one rear-looking camera apparatus, and is further configured to receive radar data.
  • The obtaining unit 702 is configured to obtain front-looking lane line information based on the front-looking image and obtain rear-looking lane line information based on the rear-looking image, is further configured to obtain front-looking lane structure information based on the front-looking image and obtain rear-looking lane structure information based on the rear-looking image, and is further configured to obtain radar lane structure information based on the radar data.
  • The fusion unit 703 is configured to obtain target lane line information based on the front-looking lane line information and the rear-looking lane line information.
  • The fusion unit 703 may be configured to obtain the target lane line information in any one or more of the following manners.
  • (1) fusing a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line, a probability that the front-looking lane line exists, a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line, and a probability that the rear-looking lane line exists, to obtain a correspondence between a horizontal offset and a longitudinal offset of a target lane line;
  • (2) fusing the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain the target lane line information;
  • (3) fusing the correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line, the probability that the front-looking lane line exists, the correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line, the probability that the rear-looking lane line exists, a correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure, a probability that the front-looking lane structure exists, a correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure, and a probability that the rear-looking lane structure exists, to obtain the correspondence between a horizontal offset and a longitudinal offset of a target lane line;
  • (4) fusing the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain the target lane line information;
  • (5) fusing the correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line, the probability that the front-looking lane line exists, the correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line, the probability that the rear-looking lane line exists, a correspondence between a horizontal offset and a longitudinal offset of a radar lane structure, and a probability that the radar lane structure exists, to obtain the correspondence between a horizontal offset and a longitudinal offset of a target lane line;
  • (6) fusing the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain the target lane line information; and
  • (7) fusing the correspondence between a horizontal offset and a longitudinal offset of a front-looking lane line, the probability that the front-looking lane line exists, the correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane line, the probability that the rear-looking lane line exists, the correspondence between a horizontal offset and a longitudinal offset of a front-looking lane structure, the probability that the front-looking lane structure exists, the correspondence between a horizontal offset and a longitudinal offset of a rear-looking lane structure, the probability that the rear-looking lane structure exists, the correspondence between a horizontal offset and a longitudinal offset of a radar lane structure, and the probability that the radar lane structure exists, to obtain the correspondence between a horizontal offset and a longitudinal offset of a target lane line.
  • The output unit 704 is configured to output the target lane line information.
  • In this embodiment, operations performed by the units of the road information detection apparatus are similar to those described in embodiments shown in FIG. 3 to FIG. 6. Details are not described herein again.
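  • As a non-limiting illustration of how the units of FIG. 7 cooperate, the following sketch (in Python) mirrors the receiving, obtaining, fusion, and output units; all class and method names are assumptions, not the actual implementation:

    # Illustrative decomposition mirroring units 701 to 704 of FIG. 7.
    class RoadInformationDetector:
        def detect(self, front_image, rear_image, radar_data):
            # receiving unit 701: front-looking image, rear-looking image, radar data
            front_lane = self.obtain_lane_line_info(front_image)        # unit 702
            rear_lane = self.obtain_lane_line_info(rear_image)
            front_struct = self.obtain_lane_structure_info(front_image)
            rear_struct = self.obtain_lane_structure_info(rear_image)
            radar_struct = self.obtain_radar_structure_info(radar_data)
            target = self.fuse(front_lane, rear_lane, front_struct,     # unit 703
                               rear_struct, radar_struct)
            self.output(target)                                         # unit 704

        # Placeholders: an actual implementation would use image detection or
        # deep learning here, and a fusion algorithm such as the
        # probability-based one sketched earlier.
        def obtain_lane_line_info(self, image): raise NotImplementedError
        def obtain_lane_structure_info(self, image): raise NotImplementedError
        def obtain_radar_structure_info(self, radar_data): raise NotImplementedError
        def fuse(self, *information): raise NotImplementedError
        def output(self, target): raise NotImplementedError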
  • FIG. 8 is another schematic diagram of a structure of a road information detection apparatus according to an embodiment of this application. The road information detection apparatus 800 may include one or more processors 801 and a memory 805, and the memory 805 stores one or more application programs or data.
  • The memory 805 may be a volatile memory or a persistent memory. The program stored in the memory 805 may include one or more modules, and each module may include a series of instruction operations for the road information detection apparatus. Further, the processor 801 may be configured to communicate with the memory 805, and perform, on the road information detection apparatus 800, the series of instruction operations in the memory 805.
  • Further, optionally, the road information detection apparatus 800 may further include one or more power supplies 802, one or more wired or wireless network interfaces 803, one or more input/output interfaces 804, and/or one or more operating systems such as Windows Server®, Mac OS X®, Unix®, Linux®, and FreeBSD®.
  • The processor 801 may perform operations performed by the road information detection apparatuses in embodiments shown in FIG. 3 to FIG. 6. Details are not described herein again.
  • An embodiment of this application provides a vehicle. The vehicle includes at least one detection apparatus, for example, the road information detection apparatus 800 shown in FIG. 8. The vehicle may perform operations performed by the road information detection apparatuses in embodiments shown in FIG. 3 to FIG. 6.
  • An embodiment of this application provides a detection system. The detection system includes a detection apparatus and at least one sensor. The detection apparatus may be the road information detection apparatus 800 shown in FIG. 8, and the sensor may include at least one of a camera apparatus, a millimeter-wave radar, and/or the like.
  • This application provides a chip system. The chip system includes a processor configured to support a road information detection apparatus in implementing functions in the foregoing aspects, for example, sending or processing data and/or information in the foregoing methods. In a possible design, the chip system further includes a memory. The memory is configured to store necessary program instructions and data. The chip system may include a chip, or may include a chip and another discrete device.
  • In another possible design, when the chip system is a chip in the road information detection apparatus and the like, the chip includes a processing unit and a communication unit. The processing unit may be, for example, a processor. The communication unit may be, for example, an input/output interface, a pin, or a circuit. The processing unit may execute computer-executable instructions stored in a storage unit, so that the chip in the road information detection apparatus performs the steps of the method performed by the road information detection apparatus in any embodiment in FIG. 3 to FIG. 6. Optionally, the storage unit is a storage unit in the chip, for example, a register or a cache. The storage unit may alternatively be a storage unit that is in UE, a base station, and the like and that is located outside the chip, for example, a read-only memory (ROM), another type of static storage device capable of storing static information and instructions, or a random-access memory (RAM).
  • An embodiment of this application further provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program. When the computer program is executed by a computer, a method procedure related to the road information detection apparatus in any one of the foregoing method embodiments is implemented. Correspondingly, the computer may be the foregoing road information detection apparatus.
  • It should be understood that the processor in the road information detection apparatus, the chip system, or the like in the foregoing embodiments of this application, or the processor provided in the foregoing embodiments of this application may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate or a transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
  • It should be further understood that there may be one or more processors in the road information detection apparatus, the chip system, or the like in the foregoing embodiments of this application. The quantity may be adjusted based on an actual application scenario. This is merely an example for description herein, and is not limited. There may be one or more memories in embodiments of this application. This may be adjusted based on an actual application scenario. This is merely an example for description herein, and is not limited.
  • It should be further understood that in embodiments of this application, the memory, the readable storage medium, or the like in the road information detection apparatus, the chip system, or the like in the foregoing embodiments may be a volatile memory or a non-volatile memory, or may include both a volatile memory and a non-volatile memory. The non-volatile memory may be a ROM, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically-erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a RAM, and is used as an external cache. Through example but not limitative descriptions, many forms of RAMs are available, for example, a static random-access memory (SRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a double data rate synchronous dynamic random-access memory (DDR SDRAM), an enhanced synchronous dynamic random-access memory (ESDRAM), a SynchLink dynamic random-access memory (SLDRAM), and a direct Rambus random-access memory (DR RAM).
  • It should be further noted that when the road information detection apparatus includes a processor (or a processing unit) and a memory, the processor in this application may be integrated with the memory, or may be connected to the memory through an interface. This may be adjusted based on an actual application scenario, and is not limited.
  • An embodiment of this application further provides a computer program or a computer program product including the computer program. When the computer program is executed on a computer, the computer is enabled to implement a method procedure related to any one of the foregoing method embodiments. Correspondingly, the computer may be the foregoing road information detection apparatus.
  • All or some of the foregoing embodiments in FIG. 3 to FIG. 6 may be implemented by using software, hardware, firmware, or any combination thereof. When software is used to implement embodiments, all or some of embodiments may be implemented in a form of a computer program product.
  • The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the procedure or functions according to embodiments of this application are all or partially generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a Digital Video Disk (DVD)), a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
  • It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.
  • In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, unit division is merely logical function division and may be other division during actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
  • The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, in other words, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of embodiments.
  • In addition, function units in embodiments of this application may be integrated into one processing unit, each of the units may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in a form of hardware, or may be implemented in a form of a software function unit.
  • When the integrated unit is implemented in the form of a software function unit and sold or used as an independent product, the integrated unit may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the conventional technology, or all or some of the technical solutions may be implemented in a form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a Universal Serial Bus (USB) flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (20)

1. A road information detection method, comprising:
receiving a front-looking image from at least one front-looking camera apparatus;
receiving a rear-looking image from at least one rear-looking camera apparatus;
obtaining front-looking lane line information based on the front-looking image;
obtaining rear-looking lane line information based on the rear-looking image; and
obtaining target lane line information based on the front-looking lane line information and the rear-looking lane line information.
2. The road information detection method of claim 1, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, and wherein obtaining the target lane line information comprises:
fusing the first correspondence, the first probability, the second correspondence, and the second probability; and
obtaining a correspondence of a target lane line in response to the fusing.
3. The road information detection method of claim 1, further comprising:
obtaining front-looking lane structure information based on the front-looking image, and obtaining rear-looking lane structure information based on the rear-looking image;
wherein obtaining the target lane line information comprises fusing the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information.
4. The road information detection method of claim 3, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the front-looking lane structure information comprises a third correspondence of a front-looking lane structure and a third probability that the front-looking lane structure exists, wherein the rear-looking lane structure information comprises a fourth correspondence of a rear-looking lane structure and a fourth probability that the rear-looking lane structure exists, and wherein the road information detection method further comprises further fusing the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, the third probability, the fourth correspondence, and the fourth probability to obtain a correspondence of a target lane line.
5. The road information detection method of claim 1, further comprising:
receiving radar data from at least one radar; and
obtaining radar lane structure information based on the radar data,
wherein obtaining the target lane line information further comprises fusing the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information.
6. The road information detection method of claim 5, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the radar lane structure information comprises a third correspondence of a radar lane structure and a third probability that the radar lane structure exists, and wherein fusing the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information further comprises further fusing the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, and the third probability, to obtain a correspondence of a target lane line.
7. The road information detection method of claim 1, further comprising:
obtaining front-looking lane structure information based on the front-looking image;
obtaining rear-looking lane structure information based on the rear-looking image;
receiving radar data from at least one radar;
obtaining radar lane structure information based on the radar data; and
wherein obtaining the target lane line information further comprises fusing the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information.
8. The road information detection method of claim 7, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the front-looking lane structure information comprises a third correspondence of a front-looking lane structure and a third probability that the front-looking lane structure exists, wherein the rear-looking lane structure information comprises a fourth correspondence of a rear-looking lane structure and a fourth probability that the rear-looking lane structure exists, wherein the radar lane structure information comprises a fifth correspondence of a radar lane structure and a fifth probability that the radar lane structure exists, and wherein the road information detection method further comprises further fusing the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, the third probability, the fourth correspondence, the fourth probability, the fifth correspondence, and the fifth probability, to obtain a correspondence of a target lane line.
9. The road information detection method of claim 4, wherein the front-looking lane structure information comprises at least one of front-looking road edge information or front-looking vehicle track information, wherein the third correspondence comprises at least one of a fifth correspondence of a front-looking road edge or a sixth correspondence of a front-looking vehicle track, wherein the third probability comprises at least one of a seventh probability that the front-looking road edge exists or an eighth probability that the front-looking vehicle track exists, wherein the rear-looking lane structure information comprises rear-looking road edge information, wherein the fourth correspondence comprises a ninth correspondence of a rear-looking road edge, and wherein the fourth probability that the rear-looking lane structure exists comprises a tenth probability that the rear-looking road edge exists.
10. The road information detection method of claim 6, wherein the radar lane structure information comprises at least one of radar road edge information or radar vehicle track information, wherein the third correspondence comprises at least one of a fourth correspondence of a radar road edge or a fifth correspondence of a radar vehicle track, and wherein the third probability comprises at least one of a fourth probability that the radar road edge exists or a fifth probability that the radar vehicle track exists.
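Claims 9 and 10 treat observed vehicle tracks as lane structure evidence alongside road edges. As a non-authoritative illustration of how a tracked vehicle trajectory might yield a correspondence plus an existence probability, the following Python sketch fits a cubic centerline to accumulated track points and scores the fit quality; the function, parameter names, and the residual-based scoring rule are all assumptions, not taken from the claims.

```python
import numpy as np

def track_to_lane_structure(track_xy: np.ndarray, min_points: int = 8):
    """Fit a cubic centerline to a vehicle track and score its reliability.

    track_xy is an (N, 2) array of (longitudinal, lateral) positions in the
    ego frame, accumulated from successive camera or radar detections.
    Returns (coefficients, existence_probability).
    """
    if len(track_xy) < min_points:
        return None, 0.0  # too few observations: no usable evidence

    x, y = track_xy[:, 0], track_xy[:, 1]
    coeffs = np.polyfit(x, y, deg=3)  # lateral offset as a cubic in x

    # Crude existence probability: smooth, well-fit tracks score near 1.0.
    residuals = y - np.polyval(coeffs, x)
    rms = float(np.sqrt(np.mean(residuals ** 2)))
    return coeffs, 1.0 / (1.0 + rms)

# Example: a leading vehicle drifting gently leftward over 12 observations.
xs = np.linspace(5.0, 60.0, 12)
track = np.column_stack([xs, 1.8 + 0.002 * xs])
print(track_to_lane_structure(track))
```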
11. The road information detection method of claim 2, wherein the first correspondence comprises a horizontal offset that corresponds to a longitudinal offset.
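Claim 11 pins the correspondence down as a horizontal offset expressed as a function of a longitudinal offset, which in lane modeling is commonly a low-order polynomial. The sketch below shows one plausible encoding of that correspondence together with the probability-weighted fusion of a front-looking and a rear-looking estimate recited in claims 5, 6, and 13. The claims do not fix a fusion rule, so the weighted coefficient average and the max rule for the fused probability are assumptions, and every name is hypothetical.

```python
# Minimal sketch, assuming the correspondence is a cubic polynomial that
# maps a longitudinal offset x to a horizontal offset y, and that fusion
# is a probability-weighted average of the polynomial coefficients.
from dataclasses import dataclass
import numpy as np

@dataclass
class LaneLineInfo:
    coeffs: np.ndarray   # [c0, c1, c2, c3]: y = c0 + c1*x + c2*x^2 + c3*x^3
    probability: float   # probability that this lane line exists

def fuse_lane_lines(front: LaneLineInfo, rear: LaneLineInfo) -> LaneLineInfo:
    """Fuse front- and rear-looking estimates of the same lane line."""
    weights = np.array([front.probability, rear.probability])
    if weights.sum() == 0.0:
        raise ValueError("neither source believes the lane line exists")
    weights = weights / weights.sum()
    fused_coeffs = weights[0] * front.coeffs + weights[1] * rear.coeffs
    # One simple choice: keep the stronger of the two existence beliefs.
    fused_probability = max(front.probability, rear.probability)
    return LaneLineInfo(fused_coeffs, fused_probability)

# Example: a confident front estimate curving slightly left, a weaker rear one.
front = LaneLineInfo(np.array([1.8, 0.01, 2e-4, 1e-6]), probability=0.9)
rear  = LaneLineInfo(np.array([1.7, 0.00, 1e-4, 0.0]),  probability=0.6)
target = fuse_lane_lines(front, rear)
print(target.coeffs, target.probability)
```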
12. A road information detection apparatus, comprising:
a processor; and
a non-transitory storage medium in communication with the processor and configured to store program instructions that when executed by the processor cause the road information detection apparatus to be configured to:
receive a front-looking image from at least one front-looking camera apparatus;
receive a rear-looking image from at least one rear-looking camera apparatus;
obtain front-looking lane line information based on the front-looking image;
obtain rear-looking lane line information based on the rear-looking image; and
obtain target lane line information based on the front-looking lane line information and the rear-looking lane line information.
13. The road information detection apparatus of claim 12, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, and wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, and wherein the program instructions further cause the road information detection apparatus to be configured to obtain the target lane line information by:
fusing the first correspondence, the first probability, the second correspondence, and the second probability; and
obtaining a correspondence of a target lane line in response to fusing the first correspondence, the first probability, the second correspondence, and the second probability.
14. The road information detection apparatus of claim 12, wherein the program instructions further cause the road information detection apparatus to be configured to:
obtain front-looking lane structure information based on the front-looking image;
obtain rear-looking lane structure information based on the rear-looking image; and
fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, and the rear-looking lane structure information, to obtain the target lane line information.
15. The road information detection apparatus of claim 14, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the front-looking lane structure information comprises a third correspondence of a front-looking lane structure and a third probability that the front-looking lane structure exists, wherein the rear-looking lane structure information comprises a fourth correspondence of a rear-looking lane structure and a fourth probability that the rear-looking lane structure exists, and wherein the program instructions further cause the road information detection apparatus to be configured to further fuse the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, the third probability, the fourth correspondence, and the fourth probability, to obtain a correspondence of a target lane line.
16. The road information detection apparatus of claim 12, wherein the program instructions further cause the road information detection apparatus to be configured to:
receive radar data;
obtain radar lane structure information based on the radar data; and
fuse the front-looking lane line information, the rear-looking lane line information, and the radar lane structure information, to obtain the target lane line information.
17. The road information detection apparatus of claim 16, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the radar lane structure information comprises a third correspondence of a radar lane structure and a third probability that the radar lane structure exists, and wherein the program instructions further cause the road information detection apparatus to further fuse the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, and the third probability, to obtain a correspondence of a target lane line.
18. The road information detection apparatus of claim 12, wherein the program instructions further cause the road information detection apparatus to:
obtain front-looking lane structure information based on the front-looking image;
obtain rear-looking lane structure information based on the rear-looking image;
receive radar data;
obtain radar lane structure information based on the radar data; and
fuse the front-looking lane line information, the rear-looking lane line information, the front-looking lane structure information, the rear-looking lane structure information, and the radar lane structure information, to obtain the target lane line information.
19. The road information detection apparatus of claim 18, wherein the front-looking lane line information comprises a first correspondence of a front-looking lane line and a first probability that the front-looking lane line exists, wherein the rear-looking lane line information comprises a second correspondence of a rear-looking lane line and a second probability that the rear-looking lane line exists, wherein the front-looking lane structure information comprises a third correspondence of a front-looking lane structure and a third probability that the front-looking lane structure exists, wherein the rear-looking lane structure information comprises a fourth correspondence of a rear-looking lane structure and a fourth probability that the rear-looking lane structure exists, wherein the radar lane structure information comprises a fifth correspondence of a radar lane structure and a fifth probability that the radar lane structure exists, and wherein the program instructions further cause the road information detection apparatus to further fuse the first correspondence, the first probability, the second correspondence, the second probability, the third correspondence, the third probability, the fourth correspondence, the fourth probability, the fifth correspondence, and the fifth probability, to obtain a correspondence of a target lane line.
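Claims 14 through 19 repeat the same fusion pattern over a growing set of sources: camera lane lines, camera lane structures, and radar lane structure, each carrying its own correspondence and existence probability. A minimal sketch of that N-source generalization follows; the gating threshold that discards low-confidence sources is an assumption added for illustration, and all names are hypothetical.

```python
import numpy as np

def fuse_sources(correspondences, probabilities, gate: float = 0.3):
    """Probability-weighted fusion over any number of sources.

    correspondences: list of polynomial coefficient arrays of equal shape.
    probabilities: matching list of existence probabilities in [0, 1].
    gate: sources below this probability are ignored entirely.
    """
    kept = [(c, p) for c, p in zip(correspondences, probabilities) if p >= gate]
    if not kept:
        return None, 0.0  # no source is confident enough

    coeffs = np.stack([c for c, _ in kept])
    weights = np.array([p for _, p in kept])
    fused = (weights[:, None] * coeffs).sum(axis=0) / weights.sum()
    return fused, float(weights.max())

# Example: the radar source falls below the gate and is excluded.
fused, prob = fuse_sources(
    [np.array([1.8, 0.0, 2e-4, 0.0]),   # front-looking lane line
     np.array([1.7, 0.0, 1e-4, 0.0]),   # rear-looking lane line
     np.array([1.9, 0.0, 0.0,  0.0])],  # radar lane structure
    [0.9, 0.6, 0.2],
)
print(fused, prob)
```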
20. A computer program product comprising computer-executable instructions that are stored on a non-transitory computer storage medium and that, when executed by a computer, cause the computer to:
receive a front-looking image from at least one front-looking camera apparatus and a rear-looking image from at least one rear-looking camera apparatus;
obtain front-looking lane line information based on the front-looking image;
obtain rear-looking lane line information based on the rear-looking image; and
obtain target lane line information based on the front-looking lane line information and the rear-looking lane line information.
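Claim 20 restates the basic receive-detect-fuse pipeline as a computer program product. The end-to-end sketch below wires the steps together in claim order; the camera reads and the detector are stubs, the fusion rule is the same assumed weighted average as in the earlier sketches, and every name is hypothetical.

```python
# End-to-end sketch of the claim-20 flow. Only the data flow is illustrative;
# a real system would run a lane detection network in detect_lane_lines().
from dataclasses import dataclass

@dataclass
class LaneLineInfo:
    coeffs: tuple        # cubic correspondence, lowest-order term first
    probability: float   # existence probability

def detect_lane_lines(image: bytes) -> LaneLineInfo:
    """Stub standing in for a per-camera lane line detector."""
    return LaneLineInfo((1.8, 0.0, 0.0, 0.0), 0.8)

def fuse(front: LaneLineInfo, rear: LaneLineInfo) -> LaneLineInfo:
    """Probability-weighted coefficient average (assumes one confident source)."""
    w = front.probability + rear.probability
    coeffs = tuple(
        (front.probability * f + rear.probability * r) / w
        for f, r in zip(front.coeffs, rear.coeffs)
    )
    return LaneLineInfo(coeffs, max(front.probability, rear.probability))

def run(front_image: bytes, rear_image: bytes) -> LaneLineInfo:
    # receive images -> per-camera detection -> fusion into the target lane line
    return fuse(detect_lane_lines(front_image), detect_lane_lines(rear_image))

print(run(b"front-frame", b"rear-frame"))
```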
US17/827,170 2019-11-30 2022-05-27 Road Information Detection Method and Apparatus Pending US20220292849A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201911209196.4 2019-11-30
CN201911209196.4A CN112885074B (en) 2019-11-30 2019-11-30 Road information detection method and device
PCT/CN2020/108133 WO2021103651A1 (en) 2019-11-30 2020-08-10 Road information detection method and apparatus

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/108133 Continuation WO2021103651A1 (en) 2019-11-30 2020-08-10 Road information detection method and apparatus

Publications (1)

Publication Number Publication Date
US20220292849A1 true US20220292849A1 (en) 2022-09-15

Family ID=76039440

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/827,170 Pending US20220292849A1 (en) 2019-11-30 2022-05-27 Road Information Detection Method and Apparatus

Country Status (4)

Country Link
US (1) US20220292849A1 (en)
EP (1) EP4053816B1 (en)
CN (1) CN112885074B (en)
WO (1) WO2021103651A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023092451A1 (en) * 2021-11-26 2023-06-01 华为技术有限公司 Method and apparatus for predicting drivable lane
CN113942503B (en) * 2021-12-02 2023-06-27 天津天瞳威势电子科技有限公司 Lane keeping method and device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6930593B2 (en) * 2003-02-24 2005-08-16 Iteris, Inc. Lane tracking system employing redundant image sensing devices
JP4988786B2 (en) * 2009-04-09 2012-08-01 株式会社日本自動車部品総合研究所 Boundary line recognition device
EP2279889B1 (en) * 2009-07-08 2015-09-09 Volvo Car Corporation Method and system for shoulder departure assistance in an automotive vehicle
US9090263B2 (en) * 2010-07-20 2015-07-28 GM Global Technology Operations LLC Lane fusion system using forward-view and rear-view cameras
US8706417B2 (en) * 2012-07-30 2014-04-22 GM Global Technology Operations LLC Anchor lane selection method using navigation input in road change scenarios
CN104751151B (en) * 2015-04-28 2017-12-26 苏州安智汽车零部件有限公司 A kind of identification of multilane in real time and tracking
DE102015209467A1 (en) * 2015-05-22 2016-11-24 Continental Teves Ag & Co. Ohg Method of estimating lanes
CN107862290B (en) * 2017-11-10 2021-09-24 智车优行科技(北京)有限公司 Lane line detection method and system
CN108162867A (en) * 2017-12-21 2018-06-15 宁波吉利汽车研究开发有限公司 A kind of lane recognition system and lane recognition method
CN110286671B (en) * 2019-04-29 2022-03-29 北京工业大学 Automatic driving vehicle path generation method based on clothoid curve
CN110334634A (en) * 2019-06-28 2019-10-15 广州鹰瞰信息科技有限公司 A kind of detection method and prior-warning device of lane line classification
CN110516652B (en) * 2019-08-30 2023-04-18 北京百度网讯科技有限公司 Lane detection method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
EP4053816B1 (en) 2024-04-24
CN112885074B (en) 2023-01-13
EP4053816A1 (en) 2022-09-07
EP4053816A4 (en) 2022-12-07
WO2021103651A1 (en) 2021-06-03
CN112885074A (en) 2021-06-01

Similar Documents

Publication Publication Date Title
US20220292849A1 (en) Road Information Detection Method and Apparatus
WO2021143286A1 (en) Method and apparatus for vehicle positioning, controller, smart car and system
US20200116498A1 (en) Visual assisted distance-based slam method and mobile robot using the same
US10074021B2 (en) Object detection apparatus, object detection method, and program
US20220373353A1 (en) Map Updating Method and Apparatus, and Device
US11126875B2 (en) Method and device of multi-focal sensing of an obstacle and non-volatile computer-readable storage medium
JP2015026234A (en) Rear-sideways warning device for vehicles, rear-sideways warning method for vehicles, and three-dimensional object detecting device
CN108154149B (en) License plate recognition method based on deep learning network sharing
WO2020248910A1 (en) Target detection method and device
US11250274B2 (en) In-vehicle device and control method
US20200082181A1 (en) Method and system for sensing an obstacle, and storage medium
JP2014137288A (en) Device and method for monitoring surroundings of vehicle
US20220196408A1 (en) Lane Line Information Determining Method and Apparatus
US11281930B2 (en) System and method for object detection
CN111080784A (en) Ground three-dimensional reconstruction method and device based on ground image texture
CN113450388B (en) Target tracking method and device and electronic equipment
WO2021185104A1 (en) Method and device for determining lane line information
US11080561B2 (en) Training and verification of learning models using high-definition map information and positioning information
JP2023534850A (en) Labeling information determination method and apparatus
KR101875517B1 (en) Method and apparatus for processing a image
KR102426562B1 (en) Appratus and method for detecting object
Wang, PhD Forum: Real-time lane-vehicle detection for advanced driver assistance on mobile devices
US11580661B2 (en) Device, method and system for estimating elevation in images from camera devices
CN116257273B (en) Updating method, terminal and computer storage medium of obstacle detection model
KR20200034118A (en) Object detection apparatus and method based on sensor fusion

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION