WO2021110166A1 - Road structure detection method and device - Google Patents
Road structure detection method and device
- Publication number
- WO2021110166A1 (application PCT/CN2020/134225, CN2020134225W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane
- width
- information
- road
- boundary
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/105—Speed
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0098—Details of control systems ensuring comfort, safety or stability not otherwise provided for
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W2050/0001—Details of the control system
- B60W2050/0043—Signal treatments, identification of variables or parameters, parameter estimation or state estimation
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/05—Type of road, e.g. motorways, local streets, paved or unpaved roads
- B60W2552/10—Number of lanes
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Definitions
- This application relates to the field of automatic driving, and in particular to a road structure detection method and device.
- In autonomous driving, an intelligent vehicle needs to perceive its surrounding environment. It can detect and classify objects in the surrounding environment through a variety of sensors and transmit this information to the planning and control module, which forms the vehicle's driving strategy and completes the entire automatic driving process.
- A particularly important aspect of perception is detecting the road structure, so that the intelligent vehicle can adjust its driving strategy to the road structure, avoid obstacles on the road, and achieve better automatic driving.
- One existing road structure detection technique uses a high-precision map to obtain lane information and then determines the road structure from it.
- However, a high-precision map usually covers a limited area. When the driving area is not covered by the map, the intelligent vehicle cannot obtain the corresponding information and thus cannot determine the road structure.
- the present application provides a road structure detection method and device, which can realize road structure detection without relying on high-precision maps.
- The present application provides a road structure detection method, which can be executed by a smart vehicle (or a component of the vehicle) or by another device with a control function (or a component of that device).
- The method includes: determining the boundary information of the own lane and the boundary information of the adjacent lane, and determining road structure information according to these two pieces of boundary information.
- The boundary information of the own lane is used to characterize the position of the boundary of the own lane;
- the boundary information of the adjacent lane is used to characterize the position of the boundary of the adjacent lane;
- the boundary information includes lane line information and/or lane edge position information;
- the road structure information includes the position information of the merging point of the own lane and the adjacent lane and/or the position information of the separation point of the own lane and the adjacent lane.
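As a rough sketch of the claimed flow (not the patent's reference implementation), the merge or separation point can be located where the own-lane boundary and the inferred adjacent-lane boundary converge. The polyline representation and the helper name `find_merge_point` below are illustrative assumptions:

```python
# Hedged sketch: boundaries are polylines sampled at shared longitudinal
# positions x; a merge point is where the lateral gap between them closes.

def find_merge_point(own_boundary, adjacent_boundary, gap_threshold=0.5):
    """Return the first (x, y) where the two boundaries come within
    gap_threshold of each other, or None if they never converge."""
    for (x1, y1), (x2, y2) in zip(own_boundary, adjacent_boundary):
        if abs(y1 - y2) <= gap_threshold:
            return (x1, (y1 + y2) / 2.0)
    return None

# Example: own right boundary at y = 0; the adjacent lane's left boundary
# converges from 3.5 m away (the lanes merge ahead of the vehicle).
own = [(x, 0.0) for x in range(0, 50, 5)]
adj = [(x, max(0.0, 3.5 - x / 10)) for x in range(0, 50, 5)]
merge = find_merge_point(own, adj)  # candidate merge point roughly 30 m ahead
```

A separation point could be found symmetrically, scanning for where a previously closed gap opens beyond the threshold.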
- Paint markings may be used to demarcate the boundaries between lanes, or protrusions (such as guardrails or concrete piers) may be used to delimit them; other ways of dividing the boundary are of course also possible.
- the road structure detection method provided by the embodiment of the present application does not need to rely on a high-precision map, and can complete road structure detection in an area not covered by a high-precision map.
- the road structure detection method provided by the present application has a wider application range, and can be applied to areas covered by high-precision maps and areas not covered by high-precision maps.
- The lane line may be drawn on the road surface with paint.
- A visual sensor (such as a camera) can then collect visual information of the lane line to determine its position.
- the edge of the lane can be divided by protrusions (such as guardrails).
- Radar or a similar component can emit specific waves and receive the waves reflected by the protrusions at the edge of the lane.
- The position of the lane edge is then obtained from the characteristics of the emitted and reflected waves (such as their phase difference or frequency difference).
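The text only names phase or frequency difference as usable wave characteristics. As one concrete (and purely illustrative) instance, an FMCW radar maps the beat-frequency difference between emitted and reflected chirps to range via R = c·Δf / (2S), where S is the chirp slope; all numbers below are assumed example values, not parameters from the patent:

```python
# Illustrative FMCW relation between frequency difference and distance.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(delta_f_hz, chirp_slope_hz_per_s):
    """Distance of a reflector (e.g. a guardrail) derived from the beat
    frequency between the emitted and reflected chirps."""
    return C * delta_f_hz / (2.0 * chirp_slope_hz_per_s)

chirp_slope = 150e6 / 10e-6       # 150 MHz swept in 10 us -> 1.5e13 Hz/s
r = fmcw_range(5e6, chirp_slope)  # a 5 MHz beat corresponds to ~50 m
```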
- the visual information of the edge of the lane such as the guardrail can also be collected through a camera to determine the position information of the guardrail.
- the boundary information of the adjacent lane is determined according to the detected driving trajectory information and/or road boundary information, and the road boundary information is used to characterize the position of the road boundary.
- Specifically, the boundary information of the adjacent lane can be determined based on the detected driving trajectory information alone, based on the road boundary information alone, or based on both together.
- the visual sensor of the vehicle can be used to collect the visual information of the vehicle in the adjacent lane to determine the detected driving trajectory information.
- The vehicle's radar (or a component with a similar function) can also determine the position and speed of vehicles in other lanes by transmitting and receiving lasers, millimeter waves, etc., and thereby determine the driving trajectory information of those lanes.
- other methods can also be used to determine the trajectory information of other lanes.
- components such as a camera can be used to collect visual information of the road boundary line to determine the location of the road boundary, that is, road boundary information.
- the camera can also be used to determine the road boundary information.
- Components such as radar can also be used to determine road boundary information.
- The adjacent lane boundary information is inferred from the road boundary information and/or the detected vehicle trajectory information, instead of being detected directly by the camera.
- This avoids the impact of the camera's limited detection performance (such as a small detection distance) and of environmental factors (such as haze) on the detection result.
- the boundary information of the adjacent lane includes at least one of the following:
- the location of the merging point or the separation point needs to be determined according to the boundary information of the own lane and the boundary information of the right adjacent lane.
- the boundary information of the right adjacent lane needs to be determined first.
- the boundary information of the right adjacent lane refers to the left lane line (RL) position of the right adjacent lane.
- the position of RL can be derived from road boundary information. Specifically, the first position of RL is obtained by shifting the road boundary to the left by a first width. Alternatively, the RL position is derived from the driving trajectory of other lanes. Specifically, the second position of the RL is obtained by shifting the driving trajectory to the left by a second width. Alternatively, the RL position can be derived from the road boundary position and the trajectory of other lanes.
- The third position of RL (which may be referred to as the fusion position) is obtained by fusing the first position of RL and the second position of RL through a preset algorithm.
- The deviation between the fusion position and the actual position of the merging point (and/or separation point) can be corrected to improve the accuracy of the final position result.
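The "preset algorithm" for fusing the first and second RL positions is left open by the text. One plausible choice is an inverse-variance weighted average, sketched below; the measurement variances are assumed values for illustration:

```python
# One possible fusion rule (an assumption, not the patent's algorithm):
# weight each lateral-position estimate by the inverse of its variance.

def fuse_positions(pos_a, var_a, pos_b, var_b):
    """Fuse two scalar position estimates; the estimate with the smaller
    variance receives the larger weight."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * pos_a + w_b * pos_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# First RL position (from the road boundary) and second RL position
# (from a detected driving trajectory), in metres of lateral offset:
rl_fused, rl_var = fuse_positions(3.6, 0.04, 3.4, 0.16)
```

The fused variance is smaller than either input variance, which is why fusing the two sources can beat using either one alone.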
- the boundary information of the right adjacent lane refers to the position of the left edge of the right adjacent lane.
- The position of the left edge can be derived from the position of the road boundary: the seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width. Alternatively, the eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width. Alternatively, the position of the left edge can be derived from both: the ninth position of the left edge is obtained by fusing the seventh position and the eighth position of the left edge through a preset algorithm.
- the boundary information of the left adjacent lane refers to the right lane line (LR) position of the left adjacent lane.
- The fourth position of LR is obtained by shifting the road boundary to the right by the third width; or, the fifth position of LR is obtained by shifting the driving trajectory to the right by the fourth width; or, the sixth position of LR is obtained by fusing the fourth position of LR and the fifth position of LR through a preset algorithm.
- the boundary information of the left adjacent lane refers to the position of the right edge of the left adjacent lane.
- The tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by the seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by the eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position and the eleventh position of the right edge through a preset algorithm.
- the first width is an integer multiple of the lane width
- the second width is an odd multiple of the half-lane width
- the third width is an integer multiple of the lane width
- the fourth width is an odd multiple of the half-lane width
- the fifth width is the lane width
- the sixth width is an odd multiple of the half-lane width
- the seventh width is an integer multiple of the half-lane width
- the eighth width is an odd multiple of the half-lane width.
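These multiples follow from geometry: a road boundary sits a whole number of lane widths from any lane line, while a detected trajectory runs roughly along a lane centre, i.e. an odd number of half lane widths from a lane line. A minimal sketch, assuming a 3.5 m lane width and a sign convention where y grows to the right (so "shift left" subtracts):

```python
# Assumed lane width and sign convention for illustration only.
LANE_WIDTH = 3.5  # metres

def shift_left(y, width):
    return y - width

# First RL position: road boundary shifted left by an integer multiple of
# the lane width (here 2 lanes lie between RL and the road edge at y=7.0).
rl_first = shift_left(7.0, 2 * LANE_WIDTH)

# Second RL position: a driving trajectory runs near the lane centre
# (y=1.75 for a vehicle in the right adjacent lane), so it is shifted left
# by an odd multiple of half the lane width (here 1 * LANE_WIDTH / 2).
rl_second = shift_left(1.75, 1 * (LANE_WIDTH / 2))
```

Both shifts land on the same lateral position for RL, which is exactly what makes the two estimates fusable.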
- The distance between the merging point and a reference point is less than or equal to a first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold, where the reference point includes the vehicle.
- When the initially determined distance between the merge point and the vehicle satisfies the first threshold, the preliminarily determined position of the merge point is deemed accurate, and there is no need to adjust it further.
- Otherwise, the translation widths applied to the road boundary and/or the driving trajectory when determining the adjacent lane boundary information (the first width and/or the second width and/or the third width and/or the fourth width and/or the fifth width and/or the sixth width and/or the seventh width and/or the eighth width) may be inaccurate, which can make the initially determined position of the merge point and/or separation point inaccurate. In this case, the position of the merging point and/or the separation point needs to be further adjusted to improve the accuracy of the road structure detection result.
- the first width and/or the second width and/or the third width and/or the fourth width and/or the fifth width and/or the sixth width and/or the seventh width and/or are adjusted according to the first threshold value Eighth width; based on the adjusted first width and/or second width and/or third width and/or fourth width and/or fifth width and/or sixth width and/or seventh width and/or first width Eight widths, adjust the merging point; and/or adjust the first width and/or the second width and/or the third width and/or the fourth width and/or the fifth width and/or the sixth width according to the first threshold / Or seventh width and / or eighth width; based on the adjusted first width and / or second width and / or third width and / or fourth width and / or fifth width and / or sixth width and / Or seventh width and/or eighth width, adjust the separation point.
- Adjusting the translation width applied to the road boundary position and/or to the driving trajectory when determining the adjacent lane boundary information makes the obtained boundary information of the adjacent lane more accurate.
- The position of the merge point (and/or the separation point) determined in this way is therefore also more accurate.
- As a result, a more precise driving strategy can be formulated to guide the vehicle and improve driving safety.
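The width adjustment described above can be sketched as a simple search: nudge the translation width and recompute the merge-point distance until the point satisfies the threshold. The linear `merge_distance_for` model and the step size are stand-in assumptions; in practice this step would re-run the boundary shift and intersection:

```python
# Hedged sketch of the width-adjustment idea.

def merge_distance_for(width):
    # Toy model: a larger shift width pushes the computed merge point
    # farther away (20 m per metre of width, purely illustrative).
    return 20.0 * width

def adjust_width(width, threshold, step=0.1, max_iter=50):
    """Decrease the translation width until the recomputed merge point
    lies within the threshold, or give up after max_iter steps."""
    for _ in range(max_iter):
        if merge_distance_for(width) <= threshold:
            return width
        width -= step
    return width

# Initial width of 3.5 m puts the merge point 70 m away; with a 60 m
# threshold the width is walked down until the point falls inside it.
adjusted = adjust_width(width=3.5, threshold=60.0)
```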
- the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
- the boundary information of the adjacent lane is determined according to the road prior data, and the road prior data includes the lane width.
- the present application provides a road structure detection device.
- The device may be a vehicle, or a device that supports the vehicle in realizing driving functions and is used in conjunction with the vehicle.
- It may also be a component in a vehicle, such as a chip system in the vehicle, or an operating system and/or a driver running on the computer system of the vehicle.
- It may also be other equipment (such as a server) or a chip in such equipment.
- the device includes a determination module and an adjustment module, and these modules can execute the road structure detection method in any of the design examples in the first aspect, specifically:
- The determination module is used to determine the boundary information of the own lane, which characterizes the position of the boundary of the own lane, and to determine the boundary information of the adjacent lane, which characterizes the position of the boundary of the adjacent lane.
- the boundary information includes lane line information and/or lane edge position information.
- the determining module is also used to determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane.
- the road structure information includes the location information of the merge point of the own lane and the adjacent lane and/or the position information of the separation point of the own lane and the adjacent lane.
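The module split described here can be pictured as a thin skeleton; all class and method names, and the toy numeric rules inside them, are illustrative assumptions rather than the patent's design:

```python
# Illustrative skeleton of the determination/adjustment module split.

class DeterminationModule:
    def own_lane_boundary(self, sensor_data):
        # Would run lane-line / lane-edge detection on camera or radar data.
        return sensor_data["own_lane"]

    def adjacent_lane_boundary(self, sensor_data, lane_width):
        # Infer from the road boundary (and/or trajectories) by shifting it
        # a multiple of the (half-)lane width.
        return sensor_data["road_boundary"] - lane_width

    def road_structure(self, own, adjacent):
        # Merge/separation point placeholder: midpoint of the two boundaries.
        return {"merge_point": (own + adjacent) / 2.0}

class AdjustmentModule:
    def adjust(self, width, within_threshold):
        # Shrink the translation width when the computed point is implausible.
        return width if within_threshold else width - 0.1

det = DeterminationModule()
data = {"own_lane": 0.0, "road_boundary": 7.0}
own = det.own_lane_boundary(data)
adjacent = det.adjacent_lane_boundary(data, 3.5)
structure = det.road_structure(own, adjacent)
```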
- the boundary information of the adjacent lane is determined based on the detected driving trajectory information and/or road boundary information, and the road boundary information is used to characterize the position of the road boundary.
- the boundary information of adjacent lanes includes at least one of the following:
- The first position of RL is obtained by shifting the road boundary to the left by the first width; or, the second position of RL is obtained by shifting the driving trajectory to the left by the second width; or, the third position of RL is obtained by fusing the first position of RL and the second position of RL through a preset algorithm.
- The fourth position of LR is obtained by shifting the road boundary to the right by the third width; or, the fifth position of LR is obtained by shifting the driving trajectory to the right by the fourth width; or, the sixth position of LR is obtained by fusing the fourth position of LR and the fifth position of LR through a preset algorithm.
- The seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by the fifth width; or, the eighth position of the left edge is obtained by shifting the driving trajectory to the left by the sixth width; or, the ninth position of the left edge is obtained by fusing the seventh position and the eighth position of the left edge through a preset algorithm.
- The tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by the seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by the eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position and the eleventh position of the right edge through a preset algorithm.
- the first width is an integer multiple of the lane width
- the second width is an odd multiple of the half-lane width
- the third width is an integer multiple of the lane width
- the fourth width is an odd multiple of the half-lane width
- the fifth width is the lane width
- the sixth width is an odd multiple of the half-lane width
- the seventh width is an integer multiple of the half-lane width
- the eighth width is an odd multiple of the half-lane width.
- the distance between the merging point and the reference point is less than or equal to the first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold, and the reference point includes the vehicle.
- The adjustment module is used to adjust the first width and/or second width and/or third width and/or fourth width and/or fifth width and/or sixth width and/or seventh width and/or eighth width according to the first threshold, and to adjust the merging point and/or the separation point based on the adjusted widths.
- the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
- the boundary information of the adjacent lane is determined based on the road prior data, and the road prior data includes the lane width.
- an embodiment of the present application provides a road structure detection device, which has the function of realizing the road structure detection method of any one of the above-mentioned first aspects.
- This function can be realized by hardware, or by hardware executing corresponding software.
- the hardware or software includes one or more modules corresponding to the above-mentioned functions.
- a road structure detection device which includes a processor.
- the processor is configured to couple with the memory, and after reading the instructions in the memory, execute the road structure detection method according to any one of the above-mentioned first aspects according to the instructions.
- the memory may be an external memory of the device.
- the external memory is coupled with the processor.
- the memory may also refer to the memory included in the device. That is, the device optionally includes a memory.
- The device may also include a communication interface, used for communication between the device and other equipment.
- the communication interface can be, for example, but not limited to, a transceiver, a transceiver circuit, and the like.
- an embodiment of the present application also provides a computer-readable storage medium, including instructions, which when run on a computer, cause the computer to execute the method of the first aspect.
- the embodiments of the present application also provide a computer program product, including instructions, which when run on a computer, cause the computer to execute the method of the first aspect.
- an embodiment of the present application provides a road structure detection device.
- the detection device may be a sensor device, such as a radar device.
- the device may also be a chip system, and the chip system may include a processor and a memory, which is used to implement the functions of the method in the first aspect described above.
- the chip system can be composed of chips, and can also include chips and other discrete devices.
- a road structure detection device may be a circuit system.
- the circuit system includes a processing circuit configured to execute the road structure detection method according to any one of the above-mentioned first aspects.
- embodiments of the present application provide a system, which includes the device of any one of the second to fourth aspects, and/or the chip system of the seventh aspect, and/or the circuit system of the eighth aspect, and/or the readable storage medium of the fifth aspect, and/or the computer program product of the sixth aspect, and/or one or more types of sensors, and/or a smart car.
- one or more types of sensors can be, but not limited to, visual sensors (such as cameras, etc.), radars or other sensors with similar functions.
- embodiments of the present application provide a smart car, which includes the device of any one of the second to fourth aspects, and/or the chip system of the seventh aspect, and/or the circuit system of the eighth aspect, and/or the readable storage medium of the fifth aspect, and/or the computer program product of the sixth aspect.
- FIG. 1 is a schematic structural diagram of an autonomous vehicle provided by an embodiment of the application.
- FIG. 2 is a schematic structural diagram of an autonomous vehicle provided by an embodiment of the application.
- FIG. 3 is a schematic structural diagram of a computer system provided by an embodiment of this application.
- FIG. 4 is a schematic structural diagram of a neural network processor provided by an embodiment of the application.
- FIG. 5 is a schematic diagram of the application of a cloud-side commanded autonomous vehicle provided by an embodiment of this application.
- FIG. 6 is a schematic diagram of an application of a cloud-side commanded autonomous vehicle provided by an embodiment of the application.
- FIG. 7 is a schematic structural diagram of a computer program product provided by an embodiment of this application.
- FIG. 8 is a schematic diagram of a scene of a road structure detection method provided by an embodiment of the application.
- FIG. 9 is a schematic flowchart of a road structure detection method provided by an embodiment of the application.
- FIG. 12 is a schematic structural diagram of a road structure detection device provided by an embodiment of the application.
- FIG. 13 is a schematic structural diagram of a road structure detection device provided by an embodiment of the application.
- the road structure detection method provided by the embodiment of the present application is applied to a smart vehicle, or applied to other devices with control functions (such as a cloud server).
- the vehicle can implement the road structure detection method provided by the embodiments of the present application through its components (including hardware and software), obtain lane line information of its own lane, and border information of adjacent lanes, and determine road structure information based on the two.
- alternatively, other devices, such as a server, may determine the road structure information.
- the road structure information is used to formulate driving strategies for the target vehicle.
- Fig. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
- the vehicle 100 is configured in a fully or partially autonomous driving mode.
- while in the automatic driving mode, the vehicle 100 can control itself: it can determine the current state of the vehicle and its surrounding environment through human operation, determine the possible behavior of at least one other vehicle in the surrounding environment, and determine a confidence level corresponding to the likelihood that the other vehicle will perform the possible behavior, so as to control the vehicle 100 based on the determined information.
- the vehicle 100 can be set to operate without human interaction.
- the vehicle 100 may include various subsystems, such as a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108 and a power supply 110, a computer system 112, and a user interface 116.
- the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
- each of the subsystems and elements of the vehicle 100 may be wired or wirelessly interconnected.
- the travel system 102 may include components that provide power movement for the vehicle 100.
- the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
- the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, such as a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine.
- the engine 118 converts the energy source 119 into mechanical energy.
- Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
- the energy source 119 may also provide energy for other systems of the vehicle 100.
- the transmission device 120 can transmit mechanical power from the engine 118 to the wheels 121.
- the transmission device 120 may include a gearbox, a differential, and a drive shaft.
- the transmission device 120 may also include other devices, such as a clutch.
- the drive shaft may include one or more shafts that can be coupled to one or more wheels 121.
- the sensor system 104 may include several sensors that sense information about the environment around the vehicle 100.
- the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS), a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, and a radar 126, a laser rangefinder 128, and a camera 130.
- the sensor system 104 may also include sensors that monitor the internal systems of the vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and identification are key functions for the safe operation of the autonomous vehicle 100.
- Different types of sensors have different characteristics. Millimeter-wave radar, for example, can work around the clock and has good range and speed accuracy, but its classification and recognition performance is relatively poor.
- A camera has high resolution and strong target recognition and classification capability, but because depth information is lost, its ranging and speed-measurement performance may be poor.
- Lidar provides good depth information and can also measure range and speed, but its detection distance is limited. It can be seen that these different types of sensors have different characteristics; under different functional requirements, different sensors need to be fused to achieve better performance.
- the positioning system 122 can be used to estimate the geographic location of the vehicle 100.
- the IMU 124 is used to sense changes in the position and orientation of the vehicle 100 based on inertial acceleration.
- the IMU 124 may be a combination of an accelerometer and a gyroscope.
- the radar 126 can also be called a detector, a detection device, or a radio signal sending device.
- the radio signal can be used to sense objects in the surrounding environment of the vehicle 100.
- the radar 126 may also be used to sense the speed and/or direction of an object. Its working principle is to detect a target object by transmitting a signal (also called a detection signal) and receiving the reflected signal returned by the target object (also referred to herein as the echo signal of the target object, a two-way echo signal, etc.).
- the radar has a variety of different radar waveforms according to different purposes, including but not limited to pulsed millimeter waves, stepped frequency modulated continuous waves, and linear frequency modulated continuous waves.
- the linear frequency modulation continuous wave is more common and the technology is more mature.
- a linear frequency-modulated continuous wave has a large time-bandwidth product and usually offers high ranging accuracy and resolution. It supports adaptive cruise control (ACC), autonomous emergency braking (AEB), lane change assist (LCA), blind spot monitoring (BSD), and other assisted driving functions.
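As an illustrative sketch (not part of the patent text), the ranging principle of a linear FMCW radar follows from the beat frequency between the transmitted and received chirps; the chirp parameters below are assumed example values:

```python
# Illustrative sketch of linear FMCW ranging: the round-trip delay is
# tau = f_beat * T / B for a chirp of duration T sweeping bandwidth B,
# and the target range is c * tau / 2.
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, chirp_duration_s, bandwidth_hz):
    return C * beat_freq_hz * chirp_duration_s / (2.0 * bandwidth_hz)

# Assumed example: 1 MHz beat frequency, 100 us chirp, 300 MHz sweep bandwidth.
r = fmcw_range(1.0e6, 100e-6, 300e6)
```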
- the laser rangefinder 128 can use laser light to sense objects in the environment where the vehicle 100 is located.
- the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
- the camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100.
- the camera 130 may be a still camera or a video camera.
- the control system 106 may control the operation of the vehicle 100 and its components.
- the control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
- the steering system 132 is operable to adjust the forward direction of the vehicle 100.
- it may be a steering wheel system.
- the throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the vehicle 100.
- the braking unit 136 is used to control the vehicle 100 to decelerate.
- the braking unit 136 may use friction to slow down the wheels 121.
- the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current.
- the braking unit 136 may also take other forms to slow down the rotation speed of the wheels 121 to control the speed of the vehicle 100.
- the computer vision system 140 may be operable to process and analyze the images captured by the camera 130 in order to identify objects and/or features in the surrounding environment of the vehicle 100.
- the objects and/or features may include traffic signals, road boundaries and obstacles.
- the computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision technologies.
- the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
- the route control system 142 is used to determine the travel route of the vehicle 100.
- the route control system 142 may combine data from sensors, the positioning system 122, and one or more predetermined maps to determine a travel route for the vehicle 100.
- the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise surpass potential obstacles in the environment of the vehicle 100.
- the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
- the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripheral devices 108.
- the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
- the peripheral device 108 provides a means for the user of the vehicle 100 to interact with the user interface 116.
- the onboard computer 148 may provide information to the user of the vehicle 100.
- the user interface 116 can also operate the on-board computer 148 to receive user input.
- the on-board computer 148 can be operated through a touch screen.
- the peripheral device 108 may provide a means for the vehicle 100 to communicate with other devices or users located in the vehicle.
- the microphone 150 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100.
- the speaker 152 may output audio to the user of the vehicle 100.
- the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network.
- the wireless communication system 146 may use third-generation (3G) cellular communication, such as code division multiple access (CDMA), evolution-data only (EVDO), global system for mobile communications (GSM), or general packet radio service (GPRS); fourth-generation (4G) cellular communication, such as long term evolution (LTE); or fifth-generation (5G) cellular communication.
- the wireless communication system 146 may also use wireless fidelity (WiFi) to communicate with a wireless local area network (WLAN).
- the wireless communication system 146 may directly communicate with the device using an infrared link, Bluetooth, or ZigBee.
- other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices.
- the power supply 110 may provide power to various components of the vehicle 100.
- the power source 110 may be a rechargeable lithium ion or lead-acid battery.
- One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100.
- the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
- the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as a data storage device 114.
- the computer system 112 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
- the processor 113 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor may be a dedicated device such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor.
- although FIG. 1 functionally illustrates the processor, memory, and other elements as being within the same physical housing, those of ordinary skill in the art should understand that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be located within the same physical housing.
- the memory may be a hard drive, or other storage medium located in a different physical enclosure.
- a reference to a processor or computer system will be understood to include a reference to a collection of processors or computer systems or memories that may operate in parallel, or a reference to a collection of processors or computer systems or memories that may not operate in parallel.
- some components, such as steering components and deceleration components, may each have their own processor that only performs calculations related to component-specific functions.
- the processor may be located far away from the vehicle and wirelessly communicate with the vehicle, and this type of processor may be referred to as a remote processor.
- some of the processes described herein are executed on a processor disposed in the vehicle, while others are executed by a remote processor, including taking the steps necessary to perform a single manipulation.
- the data storage device 114 may include instructions 115 (eg, program logic), which may be executed by the processor 113 to perform various functions of the vehicle 100, including those described above.
- the data storage device 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the traveling system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
- the data storage device 114 may also store data, such as road maps, route information, the location, direction, and speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 112 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
- the data storage device 114 obtains the lane line information of the own lane from the sensor system 104 or other components of the vehicle 100, and may also obtain the detected driving trajectory information and/or road boundary information from the aforementioned components.
- the road boundary information is used to characterize the location of the road boundary.
- the data storage device 114 may also store the information obtained above.
- the vehicle can obtain the distance to other vehicles and the speed of other vehicles based on the ranging and speed-measurement functions of the radar 126.
- the processor 113 can obtain the information from the data storage device 114, and determine the road structure information based on the information.
- Road structure information can be used to assist the vehicle in determining a driving strategy to control the vehicle to drive.
- the user interface 116 is used to provide information to or receive information from a user of the vehicle 100.
- the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as one or more of the wireless communication system 146, the onboard computer 148, the microphone 150, and the speaker 152.
- the computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may use input from the control system 106 to control the steering system 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
- one or more of these components described above may be installed or associated with the vehicle 100 separately.
- the data storage device 114 may exist partially or completely separately from the vehicle 100.
- the above-mentioned components may be communicatively coupled together in a wired and/or wireless manner.
- FIG. 1 should not be construed as a limitation to the embodiments of the present application.
- a smart car traveling on a road can recognize objects in its surrounding environment to determine the current speed adjustment.
- the object may be other vehicles, traffic control equipment, or other types of objects.
- each recognized object can be considered independently, and based on the respective characteristics of the object, such as its current speed, acceleration, distance from the vehicle, etc., can be used to determine the speed to be adjusted by the smart car.
- the vehicle 100, or a computing device associated with the vehicle 100, may predict the behavior of the identified object based on the characteristics of the identified object and the state of the surrounding environment (for example, traffic, rain, ice on the road, etc.).
- the behaviors of the recognized objects may depend on one another; therefore, all recognized objects can also be considered together to predict the behavior of a single recognized object.
- the vehicle 100 can adjust its speed based on the predicted behavior of the identified object. In other words, the smart car can determine what stable state the vehicle will need to adjust to (for example, accelerating, decelerating, or stopping) based on the predicted behavior of the object. In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 on the road on which it is traveling, the curvature of the road, and the proximity of static and dynamic objects.
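The speed-adjustment decision described above can be sketched as a hypothetical rule; the gap and closing-speed thresholds here are illustrative assumptions, not values specified by the patent:

```python
# Hypothetical decision sketch: choose a speed adjustment (accelerate,
# decelerate, or maintain) from the predicted state of an object ahead.
# The thresholds are assumed example values.
def speed_adjustment(gap_m, closing_speed_mps, min_gap_m=30.0):
    if gap_m < min_gap_m and closing_speed_mps > 0:
        return "decelerate"   # object ahead is close and we are closing in
    if gap_m > 2 * min_gap_m and closing_speed_mps <= 0:
        return "accelerate"   # ample gap and the object is not approaching
    return "maintain"
```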
- the computing device can also provide instructions to modify the steering angle of the vehicle 100 so that the smart car follows a given trajectory and/or maintains contact with objects near the smart car (for example, on a road).
- the above-mentioned vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, and trolley, etc.
- the embodiments of the present application do not particularly limit this.
- the smart vehicle may also include a hardware structure and/or software module, and the above functions are implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain function among the above-mentioned functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraint conditions of the technical solution.
- the vehicle may include the following modules:
- the environment perception module 201 is used to obtain information of vehicles, pedestrians, and road objects recognized by roadside sensors and/or on-board sensors.
- the roadside sensor and the vehicle-mounted sensor can be a camera (camera), lidar, millimeter wave radar, etc.
- the data acquired by the environmental perception module can be the original collected video stream, radar point cloud data, or analyzed structured data on the position, speed, steering angle, and size of people, vehicles, and objects.
- the environmental perception module can process these data into recognizable, structured data such as the position, speed, steering angle, and size of people, vehicles, and objects, and transmit these data to the control module 202 so that the control module 202 can generate a driving strategy.
- the environment perception module 201 includes a camera or a radar, which is used to obtain the boundary information of the lane where the vehicle is located. It is also used to obtain road boundary information and/or detected vehicle trajectory information. Wherein, road boundary information and/or detected vehicle trajectory information is used to determine adjacent lane boundary information. The boundary information of the own lane and the boundary information of the adjacent lane are used to determine the road structure information.
- the control module 202 may be a traditional control module of the vehicle. Its function is to determine road structure information based on the data input by the environment perception module 201 (own-lane boundary information, driving trajectory information, and/or road boundary information). It is also used to fuse the adjacent-lane boundary information determined according to the driving trajectory information with the adjacent-lane boundary information determined according to the road boundary information, to obtain more accurate adjacent-lane boundary information. It is further used to generate a driving strategy according to the road structure information, output an action instruction corresponding to the driving strategy, and send the action instruction to the control module 203. The action instruction is used to instruct the control module 203 to control the driving of the vehicle.
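The fusion step described above can be sketched as a weighted combination of the two boundary estimates; the lateral-offset representation and the confidence weights are assumptions for illustration, not values from the patent:

```python
# Hypothetical fusion sketch: combine the adjacent-lane boundary estimated
# from driving trajectories with the one estimated from the road boundary,
# weighting each lateral offset by an assumed confidence value.
def fuse_boundaries(offsets_a, conf_a, offsets_b, conf_b):
    total = conf_a + conf_b
    return [(conf_a * a + conf_b * b) / total for a, b in zip(offsets_a, offsets_b)]

from_trajectory = [3.4, 3.5, 3.6]   # lateral offsets (m) from trajectory data
from_road_edge = [3.6, 3.7, 3.8]    # lateral offsets (m) from the road boundary
fused = fuse_boundaries(from_trajectory, 0.5, from_road_edge, 0.5)
```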
- the control module 202 may be a collection of components or subsystems with control and processing functions. For example, it may be the processor 113 shown in FIG. 1, or some functional modules in the processor, or similar components or similar subsystems.
- the control module 203 is used to receive action instructions from the control module 202 to control the vehicle to complete the driving operation.
- the control module 203 may be a collection of components or subsystems with control and processing functions. For example, it may be the processor 113 shown in FIG. 1, or a similar component or a similar subsystem.
- the above modules can also be integrated into one module.
- the integrated module is used to provide the above-mentioned multiple functions.
- Vehicle-mounted communication module (not shown in Figure 2): used for information exchange between the vehicle and other vehicles.
- the vehicle-mounted communication module may be, for example, but not limited to, a component in the wireless communication system 146 as shown in FIG. 1.
- the storage component (not shown in FIG. 2) is used to store the executable code of each of the above-mentioned modules. Running these executable codes can implement part or all of the method flow in the embodiments of the present application.
- the storage component may be, for example, but not limited to, the component in the data storage device 114 as shown in FIG. 1.
- the computer system 112 shown in FIG. 1 includes a processor 303, and the processor 303 is coupled to a system bus 305.
- the processor 303 may be one or more processors, where each processor may include one or more processor cores.
- a display adapter (video adapter) 307, the display adapter 307 can drive the display 309, and the display 309 is coupled to the system bus 305.
- the system bus 305 is coupled with an input/output (I/O) bus (BUS) 313 through a bus bridge 311.
- the I/O interface 315 and the I/O bus 313 are coupled.
- the I/O interface 315 communicates with various I/O devices, such as an input device 317 (such as a keyboard, a mouse, a touch screen, etc.), a media tray 321 (such as a CD-ROM, a multimedia interface, etc.).
- the transceiver 323 can send and/or receive radio communication signals.
- the camera 355 can capture static and dynamic digital video images.
- the external interface 325 may be, for example, a Universal Serial Bus (USB) interface.
- the processor 303 may be any traditional processor, including a Reduced Instruction Set Computer (RISC) processor, a Complex Instruction Set Computer (CISC) processor, or a combination of the foregoing.
- the processor may be a dedicated device such as an application specific integrated circuit (ASIC).
- the processor 303 may be a neural network processor or a combination of a neural network processor and the foregoing traditional processors.
- the computer system 112 may be located remotely from the vehicle and may communicate wirelessly with the vehicle 100.
- some of the processes described herein may be configured to be executed on a processor in the vehicle, and other processes may be executed by a remote processor, including taking actions required to perform a single manipulation.
- the computer system 112 may communicate with a software deployment server 349 through a network interface 329.
- the network interface 329 is a hardware network interface, such as a network card.
- the network (network) 327 may be an external network, such as the Internet, or an internal network, such as an Ethernet or a virtual private network (VPN).
- the network 327 may also be a wireless network, such as a WiFi network, a cellular network, and so on.
- the hard disk drive interface 331 and the system bus 305 are coupled.
- the hard disk drive interface 331 and the hard disk drive 333 are connected.
- the system memory 335 is coupled to the system bus 305.
- the data running in the system memory 335 may include an operating system (OS) 337 and application programs 343 of the computer system 112.
- the operating system includes but is not limited to Shell 339 and kernel (kernel) 341.
- Shell 339 is an interface between the user and the kernel of the operating system.
- the shell is the outermost layer of the operating system. The shell manages the interaction between the user and the operating system: waiting for the user's input, interpreting the user's input to the operating system, and processing the output of various operating systems.
- the kernel 341 is composed of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel usually runs processes, provides inter-process communication, and provides CPU time-slice management, interrupt handling, memory management, I/O management, and other functions.
- the application program 343 includes programs related to controlling car driving, for example, a program that manages the interaction between the car and an obstacle on the road, a program that controls the route or speed of the car, and a program that controls the interaction between the car and other cars on the road.
- the application program 343 may also exist on the system of the deploying server 349. In one embodiment, when the application program 343 needs to be executed, the computer system 112 may download the application program 343 from the deploying server 349.
- the application program 343 may be an application program that controls the vehicle to determine the road structure information based on the lane line information of the own lane and the boundary information of the adjacent lanes (determined based on the road boundary information and/or the detected driving trajectory information).
- the processor 303 of the computer system 112 calls the application program 343 to obtain the final road structure.
- the sensor 353 is associated with the computer system 112.
- the sensor 353 is used to detect the environment around the computer system 112.
- the sensor 353 can detect animals, cars, obstacles, and pedestrian crossings.
- the sensor can also detect the surrounding environment of the above-mentioned animals, cars, obstacles and crosswalks.
- the environment around an animal includes, for example, other animals that appear around it, weather conditions, and the brightness of the surrounding environment.
- the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, etc.
- the road structure detection method in the embodiments of the present application may also be executed by a chip system.
- the chip system can be located in the vehicle or in another location, such as a server. Refer to FIG. 4, which is an exemplary structure diagram of a chip system provided by an embodiment of the present application.
- a neural network processor (neural-network processing unit, NPU) 50 can be mounted on a host CPU (host CPU) as a coprocessor, and the host CPU assigns tasks to the NPU.
- the core part of the NPU is the arithmetic circuit 503.
- the arithmetic circuit 503 is controlled by the controller 504, so that the arithmetic circuit 503 can extract matrix data in the memory and perform a multiplication operation.
- the arithmetic circuit 503 includes multiple processing units (Process Engine, PE). In some implementations, the arithmetic circuit 503 is a two-dimensional systolic array. The arithmetic circuit 503 may also be a one-dimensional systolic array, or other electronic circuits capable of performing mathematical operations such as multiplication and addition. In some implementations, the arithmetic circuit 503 is a general-purpose matrix processor.
- the arithmetic circuit 503 obtains the data corresponding to the weight matrix B from the weight memory 502 and caches it on each PE in the arithmetic circuit 503.
- the arithmetic circuit 503 fetches the data corresponding to the input matrix A from the input memory 501, and performs matrix operations according to the input matrix A and the weight matrix B, and the partial or final result of the matrix operation can be stored in an accumulator 508.
- the arithmetic circuit 503 can be used to implement a feature extraction model (such as a convolutional neural network model): image data is input into the model, and the features of the image are obtained through the model's operations. The image features are then output to a classifier, which outputs the classification probability of the object in the image.
- the unified memory 506 is used to store input data and output data.
- the weight data in the external memory is sent directly to the weight memory 502 through a direct memory access controller (DMAC) 505.
- the input data in the external memory can be transferred to the unified memory 506 through the DMAC, or transferred to the input memory 501.
- the bus interface unit (BIU) 510 is used for interaction between the advanced extensible interface (AXI) bus, the DMAC, and the instruction fetch buffer 509. It also enables the instruction fetch buffer 509 to obtain instructions from an external memory, and enables the DMAC 505 to obtain the original data of the input matrix A or the weight matrix B from the external memory.
- the DMAC is mainly used to transfer the input data in the external storage to the unified storage 506, or to transfer the weight data to the weight storage 502, or to transfer the input data to the input storage 501.
- the vector calculation unit 507 may include a plurality of operation processing units. It is used to further process the output of the arithmetic circuit 503 when needed, for example vector multiplication, vector addition, exponential operations, logarithmic operations, and size comparison. It is mainly used for non-convolution/fully-connected (FC) layer computations in neural networks, such as pooling, batch normalization, and local response normalization.
- the vector calculation unit 507 stores the processed output vector to the unified memory 506.
- the vector calculation unit 507 may apply a nonlinear function to the output of the arithmetic circuit 503, such as a vector of accumulated values, to generate the activation value.
- the vector calculation unit 507 generates a normalized value, a combined value, or both.
- the processed output vector can also be used as an activation input of the arithmetic circuit 503, for example, for use in a subsequent layer in a neural network.
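- as a minimal sketch of this step (ReLU is an assumed example of a nonlinear function; the unit may apply others):

```python
# Hedged sketch: applying a nonlinear function (here ReLU, an assumed
# choice) to a vector of accumulated values, as the vector calculation
# unit 507 does to the output of the arithmetic circuit 503.
def relu_vector(accumulated):
    """Element-wise ReLU over a list of accumulator outputs."""
    return [v if v > 0 else 0 for v in accumulated]

activations = relu_vector([-1.5, 0.0, 2.5])
print(activations)  # [0, 0, 2.5]
```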
- the controller 504 is connected to an instruction fetch buffer 509, and the instructions used by the controller 504 can be stored in the instruction fetch buffer 509.
- the unified memory 506, the input memory 501, the weight memory 502, and the instruction fetch memory 509 are all on-chip memories.
- the external memory is a memory external to the NPU hardware architecture.
- the host CPU and the NPU cooperate to implement the algorithms corresponding to the functions required by the vehicle 100 in Figure 1, the algorithms corresponding to the functions required by the vehicle shown in Figure 2, or the algorithm shown in Figure 3.
- the main CPU and NPU work together to implement the corresponding algorithms for the functions required by the server.
- the computer system 112 may also receive information from other computer systems or transfer information to other computer systems.
- the sensor data collected from the sensor system 104 of the vehicle 100 can be transferred to another computer, and the data can be processed by the other computer.
- data from the computer system 112 may be transmitted to the computer system 720 on the cloud side via the network for further processing.
- the network and intermediate nodes can include various configurations and protocols, including the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local area networks, private networks using one or more companies' proprietary communication protocols, Ethernet, WiFi, the hypertext transfer protocol (HTTP), and various combinations of the foregoing. This communication can be performed by any device capable of transferring data to and from other computers, such as modems and wireless interfaces.
- the computer system 720 may include a server with multiple computers, such as a load balancing server group. In order to receive, process, and transmit data from the computer system 112, the computer system 720 exchanges information with different nodes of the network.
- the server 720 may have a configuration similar to the computer system 112 and have a processor 730, a memory 740, instructions 750, and data 760.
- the data 760 of the server 720 may include weather-related information.
- the server 720 may receive, monitor, store, update, and transmit various information related to weather.
- the information may include precipitation, cloud, and/or temperature information and/or humidity information in the form of reports, radar information, forecasts, etc., for example.
- the cloud service center may receive information (such as data collected by vehicle sensors or other information) from vehicles 513 and 512 in its operating environment 500 via a network 511 such as a wireless communication network.
- the cloud service center 520 controls the vehicles 513 and 512 by running its stored programs related to controlling car driving according to the received data.
- the program related to controlling the driving of a car can be: a program that manages the interaction between the car and an obstacle on the road, or a program that controls the route or speed of the car, or a program that controls the interaction between the car and other cars on the road.
- the cloud service center 520 may provide a part of the map to the vehicles 513 and 512 through the network 511.
- operations can be divided between different locations.
- multiple cloud service centers can receive, confirm, combine, and/or send information reports.
- information reports and/or sensor data can also be sent between vehicles.
- Other configurations are also possible.
- the cloud service center 520 sends the vehicle a suggested solution regarding possible driving situations in the environment (e.g., informing it of an obstacle ahead and how to circumvent it). For example, the cloud service center 520 may assist the vehicle in determining how to proceed when facing a specific obstacle in the environment.
- the cloud service center 520 sends a response to the vehicle indicating how the vehicle should travel in a given scene.
- the cloud service center 520 can confirm the existence of a temporary stop sign in front of the road based on the collected sensor data. For example, based on the “lane closed” sign and the sensor data of construction vehicles, it can be determined that the lane is closed due to construction.
- the cloud service center 520 sends a suggested operation mode for the vehicle to pass through the obstacle (for example, instructing the vehicle to change lanes on another road).
- the operation steps used for the vehicle can be added to the driving information map.
- this information can be sent to other vehicles in the area that may encounter the same obstacle, so as to assist other vehicles not only to recognize the closed lanes but also to know how to pass.
- the disclosed methods may be implemented as computer program instructions in a machine-readable format, encoded on a computer-readable storage medium, or encoded on other non-transitory media or articles.
- Figure 7 schematically illustrates a conceptual partial view of an example computer program product arranged in accordance with at least some of the embodiments shown herein, the example computer program product including a computer program for executing a computer process on a computing device.
- the example computer program product 600 is provided using a signal bearing medium 601.
- the signal-bearing medium 601 may include one or more program instructions 602 which, when run by one or more processors, can provide all or part of the functions described above with respect to FIGS. 1 to 6, or all or part of the functions described in subsequent embodiments.
- one or more features in S901 to S903 may be undertaken by one or more instructions associated with the signal bearing medium 601.
- the program instructions 602 in FIG. 7 also describe example instructions.
- when the technical solutions of the embodiments of the present application are executed by a vehicle or a component in the vehicle, the computer program product may be a program product used by the vehicle or its components.
- when the technical solutions of the embodiments of the present application are executed by another device other than the vehicle, such as a server, the computer program product may be a program product used by that other device.
- the signal-bearing medium 601 may include a computer-readable medium 603, such as, but not limited to, a hard disk drive, compact disk (CD), digital video disk (DVD), digital tape, memory, read-only memory (ROM), or random access memory (RAM), etc.
- the signal bearing medium 601 may include a computer recordable medium 604, such as, but not limited to, memory, read/write (R/W) CD, R/W DVD, and so on.
- the signal-bearing medium 601 may include a communication medium 605, such as, but not limited to, digital and/or analog communication media (eg, fiber optic cables, waveguides, wired communication links, wireless communication links, etc.).
- the signal bearing medium 601 may be communicated by a wireless communication medium 605 (for example, a wireless communication medium that complies with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard or other transmission protocols).
- the one or more program instructions 602 may be, for example, computer-executable instructions or logic-implemented instructions.
- a computing device such as that described with respect to FIGS. 1 to 6 may be configured to provide various operations, functions, or actions in response to the program instructions 602 communicated to the computing device through one or more of the computer-readable medium 603, the computer-recordable medium 604, and/or the communication medium 605. It should be understood that the arrangement described here is for illustrative purposes only.
- the road structure detection method provided by the embodiment of the present application is applied in automatic/semi-automatic driving or other driving scenarios. Specifically, it is applied in the scene of determining road structure information.
- the road structure information includes the location information of the merge point between the own lane and an adjacent lane and/or the separation point between the own lane and an adjacent lane. Exemplarily, the method is applied to the scene shown in Figure 8(a) or Figure 8(b). As shown in Figure 8(a), there is a merge point A between the own lane in which the vehicle 801 is located and an adjacent lane (that is, a lane next to the own lane), here the right adjacent lane.
- the location of the merge point A is the intersection of the right lane line of the own lane (HR) and the left lane line of the right adjacent lane (RL).
- lane lines generally refer to flat markings drawn on the road surface with a coating material (such as paint).
- as shown in FIG. 8(b), there is a separation point B between the own lane where the vehicle 802 is located and the right adjacent lane.
- the position of the separation point B is related to the position of the guardrail (or cement pier) on the edge of the lane and the guardrail (or cement pier) on the edge of the adjacent lane.
- the position of the separation point B is the intersection of the two guardrails.
- the location of the separation point B is the intersection of the two cement piers.
- the difference between the scenes shown in Figure 8(a) and Figure 8(b) lies in which factors determine the location of the merge point or separation point. In one case, the location of the merge point or separation point is determined by the intersection of lane lines painted on the road surface. In the other case, it is determined by the position of protrusions on the road surface.
- the embodiment of the present application provides a road structure detection method, which can be applied to the device shown in FIG. 1 to FIG. 6 or the computer program product shown in FIG. 7 or other devices far away from the vehicle.
- for brevity, the entity executing the technical solution is not repeated below. Referring to Figure 9, the method includes the following steps:
- S901: Determine boundary information of the own lane. The boundary information of the own lane is used to characterize the position of the boundary of the own lane (i.e., the current lane); it includes the lane line information of the own lane and/or the position information of the edge of the own lane.
- protrusions at lane edges can be used to divide lanes.
- the position information of the edge of the own lane includes, but is not limited to, the position information of the protrusion on the edge of the own lane.
- the protrusions on the edge of the lane include, but are not limited to, guardrails and concrete piers on the edge of the lane.
- a visual sensor (such as a camera), radar, or similar components can be used to obtain the position of the edge of the lane.
- S902 Determine boundary information of adjacent lanes.
- the adjacent lane boundary information is used to characterize the position of the boundary of the adjacent lane; the boundary information of the adjacent lane includes the lane line information of the adjacent lane and/or the position information of the edge of the adjacent lane.
- the position information of the edge of the adjacent lane may refer to the position information of the protrusion on the edge of the adjacent lane.
- Protrusions include, but are not limited to, guardrails and concrete piers adjacent to the edge of the lane.
- the adjacent lane boundary information is determined based on the detected vehicle trajectory information and/or road boundary information, and road prior data. That is, the adjacent lane boundary information is determined based on the road prior data and the detected vehicle trajectory information. Alternatively, the adjacent lane boundary information is determined based on road prior data and road boundary information. Alternatively, the adjacent lane boundary information is determined based on the road prior data, the detected vehicle trajectory information, and the road boundary information.
- the road boundary information is used to characterize the position of the road edge (edge).
- the visual sensor of the vehicle can be used to collect the visual information of the vehicle in the adjacent lane to determine the detected driving trajectory information.
- the vehicle's radar or components with similar functions can also be used to determine the position and speed of the vehicle in other lanes by means of receiving and transmitting lasers, millimeter waves, etc., to determine the driving trajectory information of the other lane.
- other methods may also be used to determine the driving trajectory information of other lanes, which is not limited in the embodiment of the present application.
- components such as a camera can be used to collect visual information of the road boundary line to determine the location of the road boundary, that is, road boundary information.
- the camera can also be used to determine the road boundary information.
- Components such as radar can also be used to determine road boundary information. The embodiment of the present application does not limit the manner of determining road boundary information.
- the road prior data includes lane width.
- in different scenarios, the lane width may differ. For example, the width of each lane of urban roads is 3.5 meters; the width of each lane of the diversion lanes at intersections is 2.3-2.5 meters; the width of each lane of arterial roads (including expressways) is 3.75 meters.
- the lane width used to determine the adjacent lane boundary information in the embodiment of the present application needs to be related to the scene where the vehicle is currently located.
- the sensor of the own vehicle such as a camera, can be used to collect the lane width in the current scene.
- the own vehicle can also directly obtain the lane width in the current scene from other devices in the network (such as a server).
- the embodiment of the present application does not limit the way of acquiring the lane width.
- the method of determining the boundary information of adjacent lanes is described in detail as follows, divided into two cases according to whether lane lines or protrusions are used to delimit the merge point or separation point:
- Case 1: painted lane lines delimit the merge or separation point. If the adjacent lane is the right adjacent lane, the boundary information of the adjacent lane includes the position information of the RL; if the adjacent lane is the left adjacent lane, the boundary information of the adjacent lane includes the position information of the right lane line of the left adjacent lane (LR).
- an example of LR is shown in Figure 8(a). It should be noted that in Figure 8(a) both RL and LR are marked, but in actual application scenarios LR and RL may not both exist on the road at the same time. In other words, in actual scenarios there may be only RL or only LR.
- RL and/or LR are determined according to road boundary information and road prior data. Specifically, the road boundary is shifted to the left by a first width to obtain the first position of the RL; and the road boundary is shifted to the right by a third width to obtain the fourth position of the LR.
- the first width is an integer multiple of the lane width
- the third width is an integer multiple of the lane width
- the road boundary information can be described by a cubic equation (because the road surface is usually flat, it can also be described by a quadratic equation or another possible equation). The road boundary information (the points constituting the road boundary) can be mapped to the world coordinate system, and a series of operations can be performed on these points in the world coordinate system to obtain the position of RL or LR. Exemplarily, taking the determination of the RL position as an example, see Figure 10(a): after the points on the road boundary are mapped to the world coordinate system, all points on the road boundary are translated to the left (that is, toward the side close to the origin o).
- since the vehicle does not know in advance how many lanes lie between the road boundary and its own lane, the translation is first performed at the smallest granularity, that is, by one lane width, to obtain a preliminary RL.
- the preliminary determined RL is used to subsequently determine the position of the preliminary merging point or the separation point.
- the RL that is initially determined needs to be adjusted again until the final location of the merging point or the separation point meets the preset conditions.
- as for how to determine whether the location of the merge point or separation point meets the preset condition, and how to adjust the preliminarily determined RL, these are explained in detail below.
- the determination method of LR can be referred to the determination method of RL, which will not be elaborated here.
- the road boundary information can also be mapped to the image plane (also called the image coordinate system), and a series of operations such as translation performed on the road boundary points in the image plane to determine the position of RL or LR.
- the embodiment of the present application does not limit the type of the mapped coordinate system.
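- the translation step described above can be sketched as follows (a simplified illustration: points are (x, y) world coordinates with x longitudinal and y lateral, "left" is taken as decreasing y, and the 3.75 m arterial-road lane width from the prior data is an assumed default):

```python
# Hedged sketch of deriving a preliminary RL by translating road-boundary
# points one lane width to the left in the world coordinate system.
LANE_WIDTH = 3.75  # arterial-road prior; an assumed value

def shift_left(points, width=LANE_WIDTH):
    """Translate sampled (x, y) points laterally by `width` meters.

    The sign convention (left = smaller y) is an assumption."""
    return [(x, y - width) for (x, y) in points]

boundary = [(0.0, 7.5), (10.0, 7.6), (20.0, 7.9)]  # illustrative samples
preliminary_rl = shift_left(boundary)
print(preliminary_rl[0])  # (0.0, 3.75)
```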
- RL and/or LR are determined according to the detected vehicle trajectory information and road prior data. Specifically, the vehicle trajectory is shifted to the left by a second width to obtain the second position of the RL; and the vehicle trajectory is shifted to the right by a fourth width to obtain the fifth position of the LR.
- the second width is an odd multiple of the half-lane width
- the fourth width is an odd multiple of the half-lane width
- the detected vehicle trajectory information can be described by a cubic equation (because the road is usually a flat surface, it can also be described by a quadratic equation or other possible equations).
- the driving trajectory information (points constituting the driving trajectory) can be mapped to the world coordinate system, and a series of operations such as translation can be performed on these points in the world coordinate system to obtain the position of RL or LR.
- exemplarily, taking the determination of the RL position as an example, see Figure 10(b): after the points on the driving trajectory are mapped to the world coordinate system, all points on the trajectory are translated to the left by half the lane width to obtain the RL.
- the determination method of LR can be referred to the determination method of RL, which will not be elaborated here.
- the second width or the fourth width is set taking the observed vehicle driving at the center of the lane as an example. Considering that the observed vehicle may not actually be at the center of the lane, the second width or the fourth width can also be adjusted to other values.
- the driving trajectory information may also be mapped to the image plane, and a series of operations such as translation are performed on the points on the driving trajectory on the image plane to determine the position of the RL or LR.
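- the half-lane-width translation can be sketched similarly (assumed names and a 3.75 m lane width; an odd multiple of the half lane width selects lanes further out, consistent with the text):

```python
# Hedged sketch: derive RL from a detected trajectory in an adjacent lane.
# If the observed vehicle drives at lane center, RL lies an odd multiple
# of the half lane width to its left (1 for the adjacent lane, 3 for one
# lane further out, and so on).
def rl_from_trajectory(traj, lane_width=3.75, n_half_widths=1):
    assert n_half_widths % 2 == 1, "must be an odd multiple"
    offset = n_half_widths * lane_width / 2.0
    return [(x, y - offset) for (x, y) in traj]

traj = [(0.0, 5.6), (10.0, 5.7)]  # illustrative trajectory samples
rl = rl_from_trajectory(traj)     # each point shifted left by 1.875 m
```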
- RL and/or LR are determined according to road boundary information, detected vehicle trajectory information, and road prior data. Specifically, the first position and the second position are merged through a preset algorithm to obtain the third position of the RL; the fourth position and the fifth position are merged through the preset algorithm to obtain the sixth position of the LR.
- there may be some deviation between the RL determined based on the road boundary information and road prior data and the RL determined based on the adjacent-lane trajectory and road prior data. For example, because the radar detection range is relatively small, the trajectory of a nearby vehicle may be detected accurately, so the RL position obtained from it is more accurate; meanwhile, the detected long-distance road boundary information may not be very accurate, so the RL position obtained from it may be less accurate. In this case, to improve the accuracy of the determined RL position, the two types of RL position information can be fused. The fusion algorithm adopted can be, but is not limited to, weighted summation.
- the weight of the first position of the RL (determined according to the road boundary information) and the weight of the second position of the RL (determined according to the trajectory information) may be set according to the actual scene. For example, when the detection distance of the radar itself is small, or the detection distance is reduced by hazy weather, and the second position is likely closer to the accurate position of the RL, the weight of the second position is set greater than the weight of the first position. Since the weights can be determined in real time according to factors such as the actual scene or the performance of the vehicle itself, the accuracy of the fused position information can be improved.
- the weights of the first position and the second position can also be preset, for example according to historical or other data. Presetting the weights avoids having to adjust them according to the actual scene, which is relatively simple to implement.
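- the weighted summation described above can be sketched as follows (the weights 0.4/0.6 and the pointwise pairing of samples are assumptions for illustration):

```python
# Hedged sketch of the fusion step: two RL estimates, one from the road
# boundary and one from the detected trajectory, are combined pointwise
# by weighted summation.
def fuse_positions(pos_a, pos_b, w_a=0.4, w_b=0.6):
    """Weighted sum of two sampled lane-line estimates (paired by index)."""
    assert abs(w_a + w_b - 1.0) < 1e-9
    return [(x_a, w_a * y_a + w_b * y_b)
            for (x_a, y_a), (_, y_b) in zip(pos_a, pos_b)]

first_pos = [(0.0, 3.8), (10.0, 3.9)]   # RL from road boundary
second_pos = [(0.0, 3.7), (10.0, 3.7)]  # RL from trajectory
fused = fuse_positions(first_pos, second_pos)
```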
- Case 2: a protruding object delimits the merge or separation point. If the adjacent lane is the right adjacent lane, the boundary information of the adjacent lane includes the position information of the left edge of the right adjacent lane; if the adjacent lane is the left adjacent lane, the boundary information of the adjacent lane includes the position information of the right edge of the left adjacent lane.
- the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to the road boundary information and road prior data. Specifically, the road boundary is shifted to the left by a fifth width to obtain the seventh position of the left edge of the right adjacent lane; the road boundary is shifted to the right by a seventh width to obtain the tenth position of the right edge of the left adjacent lane.
- the fifth width is an integer multiple of the lane width
- the seventh width is an integer multiple of the lane width.
- the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to the detected vehicle trajectory information and road prior data. Specifically, the vehicle trajectory is shifted to the left by a sixth width to obtain the eighth position of the left edge of the right adjacent lane; and the vehicle trajectory is shifted to the right by an eighth width to obtain the eleventh position of the right edge of the left adjacent lane.
- the sixth width is an odd multiple of the half-lane width
- the eighth width is an odd multiple of the half-lane width
- the driving trajectory information can also be mapped to the image plane to determine the position of the left edge of the adjacent lane to the right or the right edge of the adjacent lane to the left.
- the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to road boundary information, detected vehicle trajectory information, and road prior data. Specifically, the seventh position and the eighth position of the left edge of the right adjacent lane are fused by a preset algorithm to obtain the ninth position of the left edge of the right adjacent lane; the tenth position and the eleventh position of the right edge of the left adjacent lane are fused by the preset algorithm to obtain the twelfth position of the right edge of the left adjacent lane.
- the lanes mentioned in the embodiments of the present application are not only ordinary lanes, but can also be road structures such as emergency parking strips.
- since the width of an emergency parking strip is not much different from an ordinary lane width, it can be processed according to the ordinary lane width.
- alternatively, the actual emergency parking strip width may be used to perform the above translation operation to determine the adjacent lane boundary information.
- in the embodiments of the present application, the adjacent lane boundary information is inferred from the road boundary information and/or the detected driving trajectory information, instead of being detected directly through the camera. This avoids inaccuracies caused by the poor detection performance of the camera (such as a small detection distance) or by environmental factors (such as haze).
- S903 Determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane.
- in one implementation, S903 is specifically implemented as: determining the location information of the merge point or separation point according to the lane line information of the own lane (HR) and the lane line information of the adjacent lane. Taking the merge point between the own lane and the right adjacent lane as an example, see Figure 10(a) or Figure 10(b): after the positions of HR and RL are determined, the intersection of HR and RL is the position of merge point A.
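- with both lane lines described as cubic equations (as the text suggests), the merge point can be found as a root of their difference. A minimal sketch (the coefficients, search range, and bisection method are illustrative assumptions):

```python
# Hedged sketch: merge point A as the intersection of HR and RL, each
# modeled as y(x) = c0 + c1*x + c2*x^2 + c3*x^3 in the world frame.
def poly_eval(c, x):
    return c[0] + c[1] * x + c[2] * x ** 2 + c[3] * x ** 3

def intersection_x(hr, rl, lo=0.0, hi=200.0, iters=60):
    """Bisection on hr(x) - rl(x); assumes one sign change in [lo, hi]."""
    f = lambda x: poly_eval(hr, x) - poly_eval(rl, x)
    if f(lo) * f(hi) > 0:
        return None  # no crossing detected in the search range
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

hr = [0.0, 0.0, 0.0, 0.0]    # own lane's right line, taken as y = 0
rl = [-2.0, 0.04, 0.0, 0.0]  # converging RL; crosses y = 0 at x = 50
x_merge = intersection_x(hr, rl)  # ≈ 50.0
```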
- in another implementation, S903 is specifically implemented as: determining the position information of the merge point or separation point according to the information of the protrusions on the edge of the own lane and the information of the protrusions on the edge of the adjacent lane.
- taking the merge point between the own lane and the right adjacent lane as an example, see Figure 10(c) or Figure 10(d): after the positions of the right-edge guardrail (or cement pier, etc.) of the own lane and the left-edge guardrail (or cement pier, etc.) of the right adjacent lane are determined, the intersection of the two guardrails (or cement piers) is the location of merge point A.
- the preset condition may be: the distance between the preliminarily determined merge point (such as merge point A determined in Figure 10(a)-Figure 10(d)) and the reference point is less than or equal to a first threshold.
- the reference point is a position reference, which is used to judge whether the position of the merging point (or the separation point) determined initially is accurate.
- the reference point may be the current vehicle (such as the vehicle 801 in (a) of FIG. 8).
- the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
- the positions of merge point A and the vehicle are preliminarily determined in Figure 10(a)-Figure 10(d). If the distance between the two is less than or equal to the first threshold, the preliminarily determined position of the merge point is deemed accurate, and there is no need to further adjust it. Similarly, when a separation point appears on the road, the distance between the separation point and the vehicle is also required to be less than or equal to the first threshold; when the preliminarily determined distance satisfies this, the preliminarily determined separation point position is deemed accurate and needs no further adjustment.
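- the distance check can be sketched as follows (taking the ego vehicle as the reference point and Euclidean distance, both consistent with the text but assumptions in detail):

```python
import math

# Hedged sketch of the preset condition: a preliminary merge/separation
# point is accepted only if its distance to the reference point (here the
# ego vehicle at the origin) is within the first threshold.
def meets_preset_condition(point, reference, first_threshold):
    dist = math.hypot(point[0] - reference[0], point[1] - reference[1])
    return dist <= first_threshold

ego = (0.0, 0.0)
print(meets_preset_condition((50.0, 0.0), ego, first_threshold=80.0))  # True
```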
- if the preset condition is not met, the width used for translation when determining the adjacent lane boundary information is inaccurate.
- in this case, at least one of the aforementioned widths should be adjusted. Specifically, the at least one width is adjusted according to the first threshold, and the merge point and/or separation point is re-determined based on the adjusted width.
- as shown in Figure 11(a), when there are multiple lanes between the own lane of the vehicle 801 and the road boundary (Figure 11(a) shows two lanes), the points on the road boundary are translated to the left as a whole by one lane width, and the resulting right lane line of the right adjacent lane is taken as the preliminarily determined RL.
- the preliminarily determined merge point is point B in Figure 11(a). The distance between point B and the vehicle is relatively large, and as Figure 11(a) shows, point B is not a true merge point. Therefore, the translation width needs to be adjusted from one lane width according to a certain compensation step, where the step can be one lane width.
- that is, the translation width is adjusted from one lane width to two lane widths: the points on the road boundary are translated to the left by one lane width a second time. As shown in Figure 11 (a), after this adjustment the actual RL is obtained.
- if the position of the intersection of the RL and the HR, that is, the position of the merging point, meets the preset condition, that position is regarded as the final merge-point position.
- the adjusted first width is actually two lane widths.
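The compensation procedure above can be sketched roughly as follows. This is an illustrative reading of the patent, not its implementation: lines are sampled as (x, y) points with x the longitudinal distance from the vehicle and y the lateral offset (positive to the left), and the lane width, threshold, and road geometry are assumed values:

```python
LANE_WIDTH = 3.5          # assumed nominal lane width in metres
FIRST_THRESHOLD = 60.0    # assumed sensing-range threshold in metres

def shift_left(line, width):
    """Translate a sampled line of (x, y) points to the left (+y) by `width`."""
    return [(x, y + width) for x, y in line]

def intersection_x(line_a, line_b):
    """Longitudinal position where line_a first reaches line_b, or None."""
    for (xa, ya), (_, yb) in zip(line_a, line_b):
        if ya >= yb:
            return xa
    return None

def find_merge_point(road_boundary, hr_line, step=LANE_WIDTH, max_lanes=4):
    """Grow the translation width one compensation step (a lane width) at a
    time until the RL/HR intersection lies within the first threshold."""
    for n in range(1, max_lanes + 1):
        rl = shift_left(road_boundary, n * step)   # candidate RL
        x = intersection_x(rl, hr_line)
        if x is not None and x <= FIRST_THRESHOLD:
            return n, x                            # lane widths used, merge point
    return None

xs = range(0, 151, 5)
hr_line = [(x, 0.0) for x in xs]                   # own lane's right line (HR)
boundary = [(x, -10.5 + 0.07 * x) for x in xs]     # converging road boundary
print(find_merge_point(boundary, hr_line))         # (2, 50): two lane widths
```

With this geometry, a one-lane-width translation yields an intersection 100 m ahead (rejected against the 60 m threshold), so the loop adds one compensation step and accepts the two-lane-width result, mirroring the adjustment from one to two lane widths in Figure 11 (a).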
- widths other than the first width can also be adjusted.
- similar to the adjustment of the first width: in the current scene, as shown in Figure 11 (a), there is also a merging point between the left adjacent lane and the own lane.
- accordingly, the width by which the road boundary needs to be translated, that is, the third width, is two lane widths, so that the actual LR position is obtained.
- point C is not a true merging point. Therefore, the translation width needs to be adjusted from half a lane width by a certain compensation step, where the step may be one lane width.
- as shown in Figure 11 (b), after this adjustment the position of the intersection (B) of the obtained lane line 1102 and the HR still does not meet the preset condition; therefore, the translation width needs to be adjusted again.
- in summary, the road structure detection method provided in the embodiments of the present application can determine road structure information based on the boundary information of the own lane and the boundary information of the adjacent lanes, where the boundary information includes lane line information and/or lane edge position information. That is, in the embodiments of the present application, road structure information can be determined without a high-precision map, so the vehicle can still determine the road structure in areas without HD map coverage.
- the embodiment of the present application may divide the road structure detection device into functional modules according to the foregoing method examples.
- for example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module.
- the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules.
- the division of modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
- Fig. 12 shows a possible schematic diagram of the structure of the road structure detection device involved in the above-mentioned embodiment.
- the road structure detection device 16 includes a determination module 161 and an adjustment module 162.
- the road structure detection device 16 may also include other modules (such as a storage module), or the road structure detection device may include fewer modules.
- the determining module is used to determine the boundary information of the own lane, which characterizes the position of the boundary of the current lane, and to determine the boundary information of the adjacent lane, which characterizes the position of the boundary of the adjacent lane.
- the boundary information includes lane line information and/or lane edge position information.
- the determining module is further configured to determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane, where the road structure information includes position information of the merging point of the own lane and the adjacent lane and/or of the separation point between the own lane and the adjacent lane.
- the boundary information of the adjacent lane is determined according to the detected driving trajectory information and/or road boundary information, and the road boundary information is used to characterize the position of the road boundary.
- the boundary information of the adjacent lane includes at least one of the following:
- the first position of the RL is obtained by shifting the road boundary to the left by a first width; or, the second position of the RL is obtained by shifting the driving trajectory to the left by a second width; or, the third position of the RL is obtained by fusing the first position of the RL and the second position of the RL through a preset algorithm.
- the fourth position of the LR is obtained by shifting the road boundary to the right by a third width; or, the fifth position of the LR is obtained by shifting the driving trajectory to the right by a fourth width; or, the sixth position of the LR is obtained by fusing the fourth position of the LR and the fifth position of the LR through a preset algorithm.
- the seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width; or, the eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width; or, the ninth position of the left edge is obtained by fusing the seventh position of the left edge and the eighth position of the left edge through a preset algorithm.
- the tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by a seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by an eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position of the right edge and the eleventh position of the right edge through a preset algorithm.
- the first width, the third width, the fifth width, and the seventh width are each an integer multiple of the lane width; the second width, the fourth width, the sixth width, and the eighth width are each an odd multiple of the half-lane width.
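The width rules above can be illustrated as follows. Boundary shifts are integer multiples of the lane width, while trajectory shifts are odd multiples of the half-lane width, since a driving trajectory runs along a lane's centre line. This is a sketch; the helper names and the 3.5 m lane width are assumptions:

```python
LANE_WIDTH = 3.5  # assumed nominal lane width in metres

def boundary_shift(n_lanes):
    """Width by which the road boundary is translated: an integer multiple
    of the lane width (the first/third/fifth/seventh widths)."""
    return n_lanes * LANE_WIDTH

def trajectory_shift(k):
    """Width by which a driving trajectory is translated: an odd multiple
    of the half-lane width, because a trajectory follows a lane centre
    (the second/fourth/sixth/eighth widths)."""
    return (2 * k - 1) * (LANE_WIDTH / 2)

# One lane between the road boundary and the target lane line:
print(boundary_shift(1))    # 3.5
# Trajectory in the adjacent lane, shifted to its near lane line:
print(trajectory_shift(1))  # 1.75
```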
- the distance between the merge point and a reference point is less than or equal to a first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold, where the reference point includes the vehicle.
- the adjustment module is configured to adjust at least one of the first through eighth widths according to the first threshold, and to adjust the merging point and/or the separation point based on the adjusted width or widths.
- the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
- the boundary information of the adjacent lane is determined according to road prior data, and the road prior data includes the lane width.
- the present application also provides a road structure detection device 10 including a processor 1001.
- the road structure detection device 10 may further include a memory 1002.
- the processor 1001 and the memory 1002 are connected (for example, connected to each other through a bus 1004).
- the road structure detection device 10 may further include a transceiver 1003, which is connected to the processor 1001 and the memory 1002, and the transceiver is used to receive/send data.
- the processor 1001 can execute operations of any one of the implementation solutions and various feasible implementation manners corresponding to FIG. 9. For example, it is used to perform operations of the determining module 161 and the adjusting module 162, and/or other operations described in the embodiments of the present application.
- the processor 1001 is also used to control the sensor 1005 so that the sensor 1005 obtains some sensing information.
- the sensor 1005 may be included in the road structure detection device 10, or it may be an external sensor.
- when the road structure detection device 10 includes the sensor 1005, that is, when the sensor 1005 is a built-in sensor of the device, all the data processing functions in the foregoing method embodiment can optionally be integrated into the sensor 1005; in this case, the road structure detection device 10 may not include the processor 1001.
- the sensor 1005 may be used to perform the foregoing method embodiments, that is, to perform operations of the determining module 161 and the adjusting module 162, and/or other operations described in the embodiments of the present application.
- the data processing may be a fusion operation.
- the sensor 1005 can perform a fusion operation on the adjacent lane boundary information determined by the road boundary information and the adjacent lane boundary information determined by the driving trajectory information, so as to improve the accuracy of the adjacent lane boundary information.
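The fusion operation mentioned above can be sketched as a simple point-wise weighted average of the two estimates. The patent leaves the "preset algorithm" unspecified, so the weighting scheme, function name, and point representation below are assumptions for illustration only:

```python
def fuse_positions(pos_from_boundary, pos_from_trajectory,
                   w_boundary=0.5, w_trajectory=0.5):
    """Fuse two estimates of the same lane line point by point with a
    weighted average; the weights could come from sensor confidence."""
    total = w_boundary + w_trajectory
    return [
        (xb, (w_boundary * yb + w_trajectory * yt) / total)
        for (xb, yb), (_, yt) in zip(pos_from_boundary, pos_from_trajectory)
    ]

a = [(0.0, -3.0), (10.0, -4.0)]   # RL estimated from the road boundary
b = [(0.0, -4.0), (10.0, -3.0)]   # RL estimated from a driving trajectory
print(fuse_positions(a, b))       # [(0.0, -3.5), (10.0, -3.5)]
```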
- the data processing can also be other data processing procedures in the foregoing method embodiments. For example, when the distance between the preliminarily determined merge point (or separation point) and the vehicle is greater than the first threshold, the sensor 1005 can be used to adjust the width by which the road boundary and/or the driving trajectory needs to be translated.
- the embodiment of the present application does not limit the specific data processing functions that the sensor 1005 can perform.
- the sensor 1005 may be a visual sensor (such as a camera), a sensor such as a radar, or another sensor with similar functions.
- the sensor 1005 may not integrate data processing functions, or may integrate only a part of them.
- in that case, the sensor 1005 combined with the processor can execute the foregoing method embodiment, and the road structure detection device 10 needs to include both the sensor 1005 and the processor 1001.
- when the sensor 1005 is a traditional sensor without data processing functions, the sensor 1005 is used only for data collection, that is, to determine road boundary information and/or driving trajectory information.
- the processor is used to determine the adjacent lane boundary information according to the road boundary information and/or the driving trajectory information, and to determine the location of the merge point (and/or separation point) according to the adjacent lane boundary information and the own lane boundary information. Among them, when the adjacent lane boundary information is determined according to the road boundary information and the driving trajectory information, a fusion operation is involved.
- alternatively, the sensor 1005 may have a part of the data processing functions, and the processor may have another part.
- the sensor 1005 can perform data processing according to the position of the road boundary collected by the sensor 1005 to obtain adjacent lane boundary information.
- the processor can be used for other data processing functions such as fusion operations.
- the embodiment of the present application does not limit the specific division of labor between the sensor 1005 and the processor 1001 (that is, which part of the data is processed separately).
- This application also provides a road structure detection device, which includes a non-volatile storage medium and a central processing unit.
- the non-volatile storage medium stores an executable program; the central processing unit is connected to the non-volatile storage medium and executes the executable program to implement the road structure detection method of the embodiments of the present application.
- the present application further provides a computer-readable storage medium.
- the computer-readable storage medium includes one or more program codes, and the one or more programs include instructions; when the processor executes the program codes, the road structure detection device executes the road structure detection method shown in FIG. 9.
- in another embodiment of the present application, a computer program product includes computer-executable instructions stored in a computer-readable storage medium. At least one processor of the road structure detection device can read the computer-executable instructions from the computer-readable storage medium and execute them, so that the road structure detection device implements the steps of the corresponding road structure detection method shown in FIG. 9.
- all or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof. When implemented with a software program, they may appear in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions.
- the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
- Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
- computer instructions may be transmitted from one website, computer, server, or data center to another by wire (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, or microwave).
- the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
- the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
- the disclosed device and method may be implemented in other ways.
- the device embodiments described above are merely illustrative.
- the division of the modules or units is only a logical function division; in actual implementation there may be other division methods. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
- the units described as separate parts may or may not be physically separate.
- the parts displayed as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
- the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
- the technical solutions of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
- the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.
Claims (20)
- A road structure detection method, characterized by comprising: determining boundary information of an own lane, the boundary information of the own lane being used to characterize a position of a boundary of the current lane; determining boundary information of an adjacent lane, the boundary information of the adjacent lane being used to characterize a position of a boundary of the adjacent lane, wherein the boundary information includes lane line information and/or position information of a lane edge; and determining road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane, the road structure information including position information of a merging point of the own lane and the adjacent lane and/or a separation point of the own lane and the adjacent lane.
- The road structure detection method according to claim 1, characterized in that the boundary information of the adjacent lane is determined according to detected driving trajectory information and/or road boundary information, the road boundary information being used to characterize a position of a road boundary.
- The road structure detection method according to claim 1 or 2, characterized in that the boundary information of the adjacent lane includes at least one of the following: position information of a left lane line RL of a right adjacent lane; position information of a right lane line LR of a left adjacent lane; position information of a left edge of the right adjacent lane; position information of a right edge of the left adjacent lane.
- The road structure detection method according to claim 3, characterized in that a first position of the RL is obtained by shifting the road boundary to the left by a first width; or, a second position of the RL is obtained by shifting the driving trajectory to the left by a second width; or, a third position of the RL is obtained by fusing the first position of the RL and the second position of the RL through a preset algorithm; a fourth position of the LR is obtained by shifting the road boundary to the right by a third width; or, a fifth position of the LR is obtained by shifting the driving trajectory to the right by a fourth width; or, a sixth position of the LR is obtained by fusing the fourth position of the LR and the fifth position of the LR through a preset algorithm; a seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width; or, an eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width; or, a ninth position of the left edge is obtained by fusing the seventh position of the left edge and the eighth position of the left edge through a preset algorithm; a tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by a seventh width; or, an eleventh position of the right edge is obtained by shifting the driving trajectory to the right by an eighth width; or, a twelfth position of the right edge is obtained by fusing the tenth position of the right edge and the eleventh position of the right edge through a preset algorithm; wherein the first width is an integer multiple of a lane width, the second width is an odd multiple of a half-lane width, the third width is an integer multiple of the lane width, the fourth width is an odd multiple of the half-lane width, the fifth width is an integer multiple of the lane width, the sixth width is an odd multiple of the half-lane width, the seventh width is an integer multiple of the lane width, and the eighth width is an odd multiple of the half-lane width.
- The road structure detection method according to any one of claims 1 to 4, characterized in that a distance between the merging point and a reference point is less than or equal to a first threshold, and/or a distance between the separation point and the reference point is less than or equal to the first threshold, the reference point including the current vehicle.
- The road structure detection method according to claim 4 or 5, characterized in that the method further comprises: adjusting at least one of the widths according to the first threshold; and adjusting the merging point and/or the separation point according to the at least one width.
- The road structure detection method according to claim 5 or 6, characterized in that the first threshold is determined by a sensing range of a sensor, or the first threshold is a pre-configured value.
- The road structure detection method according to any one of claims 1 to 7, characterized in that the boundary information of the adjacent lane is determined according to road prior data, the road prior data including a lane width.
- A road structure detection apparatus, characterized by comprising: a processor, configured to determine boundary information of an own lane, the boundary information of the own lane being used to characterize a position of a boundary of the current lane, and to determine boundary information of an adjacent lane, the boundary information of the adjacent lane being used to characterize a position of a boundary of the adjacent lane, wherein the boundary information includes lane line information and/or position information of a lane edge; the processor being further configured to determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane, the road structure information including position information of a merging point of the own lane and the adjacent lane and/or a separation point of the own lane and the adjacent lane.
- The road structure detection apparatus according to claim 9, characterized in that the boundary information of the adjacent lane is determined according to detected driving trajectory information and/or road boundary information, the road boundary information being used to characterize a position of a road boundary.
- The road structure detection apparatus according to claim 9 or 10, characterized in that the boundary information of the adjacent lane includes at least one of the following: position information of a left lane line RL of a right adjacent lane; position information of a right lane line LR of a left adjacent lane; position information of a left edge of the right adjacent lane; position information of a right edge of the left adjacent lane.
- The road structure detection apparatus according to claim 11, characterized in that a first position of the RL is obtained by shifting the road boundary to the left by a first width; or, a second position of the RL is obtained by shifting the driving trajectory to the left by a second width; or, a third position of the RL is obtained by fusing the first position of the RL and the second position of the RL through a preset algorithm; a fourth position of the LR is obtained by shifting the road boundary to the right by a third width; or, a fifth position of the LR is obtained by shifting the driving trajectory to the right by a fourth width; or, a sixth position of the LR is obtained by fusing the fourth position of the LR and the fifth position of the LR through a preset algorithm; a seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width; or, an eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width; or, a ninth position of the left edge is obtained by fusing the seventh position of the left edge and the eighth position of the left edge through a preset algorithm; a tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by a seventh width; or, an eleventh position of the right edge is obtained by shifting the driving trajectory to the right by an eighth width; or, a twelfth position of the right edge is obtained by fusing the tenth position of the right edge and the eleventh position of the right edge through a preset algorithm; wherein the first width is an integer multiple of a lane width, the second width is an odd multiple of a half-lane width, the third width is an integer multiple of the lane width, the fourth width is an odd multiple of the half-lane width, the fifth width is an integer multiple of the lane width, the sixth width is an odd multiple of the half-lane width, the seventh width is an integer multiple of the lane width, and the eighth width is an odd multiple of the half-lane width.
- The road structure detection apparatus according to any one of claims 9 to 12, characterized in that a distance between the merging point and a reference point is less than or equal to a first threshold, and/or a distance between the separation point and the reference point is less than or equal to the first threshold, the reference point including a vehicle.
- The road structure detection apparatus according to claim 12 or 13, characterized in that the processor is further configured to adjust at least one of the widths according to the first threshold, and to adjust the merging point and/or the separation point according to the at least one width.
- The road structure detection apparatus according to claim 13 or 14, characterized in that the first threshold is determined by a sensing range of a sensor, or the first threshold is a pre-configured value.
- The road structure detection apparatus according to any one of claims 9 to 15, characterized in that the boundary information of the adjacent lane is determined according to road prior data, the road prior data including a lane width.
- A computer-readable storage medium, characterized by comprising a program or instructions which, when run on a computer, implement the road structure detection method according to any one of claims 1 to 8.
- A chip system, characterized by comprising a processor and a communication interface, the processor being configured to implement the road structure detection method according to any one of claims 1 to 8.
- A circuit system, characterized in that the circuit system comprises a processing circuit configured to perform the road structure detection method according to any one of claims 1 to 8.
- A road structure detection apparatus, characterized by comprising a processor and a memory, the memory being configured to store computer-executable instructions, wherein when the apparatus runs, the processor executes the computer-executable instructions stored in the memory, so that the apparatus performs the road structure detection method according to any one of claims 1 to 8.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112022010922A BR112022010922A2 (pt) | 2019-12-06 | 2020-12-07 | Método de detecção de estrutura de estrada, dispositivo, meio de armazenamento legível por computador e sistema de chip |
EP20895891.8A EP4059799A4 (en) | 2019-12-06 | 2020-12-07 | METHOD AND DEVICE FOR RECOGNIZING ROAD STRUCTURES |
US17/833,456 US20220309806A1 (en) | 2019-12-06 | 2022-06-06 | Road structure detection method and apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911245257.2 | 2019-12-06 | ||
CN201911245257.2A CN113022573B (zh) | 2019-12-06 | 2019-12-06 | 道路结构检测方法及装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/833,456 Continuation US20220309806A1 (en) | 2019-12-06 | 2022-06-06 | Road structure detection method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021110166A1 true WO2021110166A1 (zh) | 2021-06-10 |
Family
ID=76221510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/134225 WO2021110166A1 (zh) | 2019-12-06 | 2020-12-07 | 道路结构检测方法及装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220309806A1 (zh) |
EP (1) | EP4059799A4 (zh) |
CN (1) | CN113022573B (zh) |
BR (1) | BR112022010922A2 (zh) |
WO (1) | WO2021110166A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4020428A4 (en) * | 2019-08-28 | 2022-10-12 | Huawei Technologies Co., Ltd. | LANE RECOGNITION METHOD AND APPARATUS, AND COMPUTER DEVICE |
CN116030286B (zh) * | 2023-03-29 | 2023-06-16 | 高德软件有限公司 | 边界车道线匹配方法、装置、电子设备及存储介质 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104149783A (zh) * | 2014-08-27 | 2014-11-19 | 刘红华 | 一种数字公路及其自动驾驶车辆 |
CN104554259A (zh) * | 2013-10-21 | 2015-04-29 | 财团法人车辆研究测试中心 | 主动式自动驾驶辅助系统与方法 |
JP2015102893A (ja) * | 2013-11-21 | 2015-06-04 | 日産自動車株式会社 | 合流支援システム |
JP2018044833A (ja) * | 2016-09-14 | 2018-03-22 | 日産自動車株式会社 | 自動運転支援方法および装置 |
DE102018007298A1 (de) * | 2018-09-14 | 2019-03-28 | Daimler Ag | Verfahren zur Routenplanung |
CN110361021A (zh) * | 2018-09-30 | 2019-10-22 | 长城汽车股份有限公司 | 车道线拟合方法及系统 |
CN110361015A (zh) * | 2018-09-30 | 2019-10-22 | 长城汽车股份有限公司 | 道路特征点提取方法及系统 |
CN110386065A (zh) * | 2018-04-20 | 2019-10-29 | 比亚迪股份有限公司 | 车辆盲区的监控方法、装置、计算机设备及存储介质 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5066123B2 (ja) * | 2009-03-24 | 2012-11-07 | 日立オートモティブシステムズ株式会社 | 車両運転支援装置 |
JP5852920B2 (ja) * | 2012-05-17 | 2016-02-03 | クラリオン株式会社 | ナビゲーション装置 |
WO2016027270A1 (en) * | 2014-08-18 | 2016-02-25 | Mobileye Vision Technologies Ltd. | Recognition and prediction of lane constraints and construction areas in navigation |
US9494438B1 (en) * | 2015-12-15 | 2016-11-15 | Honda Motor Co., Ltd. | System and method for verifying map data for a vehicle |
US10670416B2 (en) * | 2016-12-30 | 2020-06-02 | DeepMap Inc. | Traffic sign feature creation for high definition maps used for navigating autonomous vehicles |
JP6653300B2 (ja) * | 2017-09-15 | 2020-02-26 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
JP6614509B2 (ja) * | 2017-10-05 | 2019-12-04 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
CN109186615A (zh) * | 2018-09-03 | 2019-01-11 | 武汉中海庭数据技术有限公司 | 基于高精度地图的车道边线距离检测方法、装置及存储介质 |
CN110160552B (zh) * | 2019-05-29 | 2021-05-04 | 百度在线网络技术(北京)有限公司 | 导航信息确定方法、装置、设备和存储介质 |
- 2019-12-06: CN application CN201911245257.2A, publication CN113022573B (zh), status: Active
- 2020-12-07: EP application EP20895891.8A, publication EP4059799A4 (en), status: Pending
- 2020-12-07: WO application PCT/CN2020/134225, publication WO2021110166A1 (zh)
- 2020-12-07: BR application BR112022010922A, publication BR112022010922A2 (pt)
- 2022-06-06: US application US17/833,456, publication US20220309806A1 (en), status: Pending
Non-Patent Citations (1)
- See also references of EP4059799A4
Also Published As
Publication number | Publication date |
---|---|
US20220309806A1 (en) | 2022-09-29 |
BR112022010922A2 (pt) | 2022-09-06 |
CN113022573B (zh) | 2022-11-04 |
EP4059799A4 (en) | 2023-03-01 |
CN113022573A (zh) | 2021-06-25 |
EP4059799A1 (en) | 2022-09-21 |
Legal Events
- 121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20895891; Country of ref document: EP; Kind code of ref document: A1)
- REG: Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112022010922)
- ENP: Entry into the national phase (Ref document number: 2020895891; Country of ref document: EP; Effective date: 20220615)
- NENP: Non-entry into the national phase (Ref country code: DE)
- ENP: Entry into the national phase (Ref document number: 112022010922; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20220603)
Ref document number: 112022010922 Country of ref document: BR Kind code of ref document: A2 Effective date: 20220603 |