WO2021110166A1 - Road structure detection method and device - Google Patents

Road structure detection method and device (道路结构检测方法及装置)

Info

Publication number
WO2021110166A1
Authority
WO
WIPO (PCT)
Prior art keywords: lane, width, information, road, boundary
Prior art date
Application number
PCT/CN2020/134225
Other languages
English (en)
French (fr)
Inventor
周伟
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to BR112022010922A (publication BR112022010922A2)
Priority to EP20895891.8A (publication EP4059799A4)
Publication of WO2021110166A1
Priority to US17/833,456 (publication US20220309806A1)

Classifications

    • B60W40/06 Road conditions
    • B60W40/105 Speed
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W60/001 Planning or execution of driving tasks
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G06V10/82 Arrangements for image or video recognition or understanding using neural networks
    • G06V20/588 Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • B60W2050/0043 Signal treatments, identification of variables or parameters, parameter estimation or state estimation
    • B60W2050/0062 Adapting control system settings
    • B60W2050/0075 Automatic parameter input, automatic initialising or calibrating means
    • B60W2552/05 Type of road, e.g. motorways, local streets, paved or unpaved roads
    • B60W2552/10 Number of lanes
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk

Definitions

  • This application relates to the field of automatic driving, and in particular to a road structure detection method and device.
  • In autonomous driving, an intelligent vehicle needs to perceive its surrounding environment. The vehicle can detect and classify objects in its surroundings through a variety of sensors and transmit this information to the planning and control module, which forms the driving strategy for the intelligent vehicle and completes the entire automatic driving process.
  • A particularly important part of this perception is detecting the road structure, so that the intelligent vehicle can adjust its driving strategy according to the road structure, avoid obstacles on the road, and achieve better automatic driving.
  • An existing road structure detection technique uses a high-precision map to obtain lane information and the like, and then determines the road structure from it.
  • However, a high-precision map usually covers a limited area. When the driving area is not covered by the high-precision map, the intelligent vehicle cannot obtain the corresponding information and therefore cannot determine the road structure.
  • the present application provides a road structure detection method and device, which can realize road structure detection without relying on high-precision maps.
  • the present application provides a road structure detection method, which can be executed by a smart vehicle (or a component in a vehicle) or another device with a control function (or a component in another device).
  • the method includes: determining the boundary information of the own lane and the boundary information of the adjacent lane, and determining the road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane.
  • the boundary information of the current lane is used to characterize the position of the boundary of the current lane;
  • the boundary information of the adjacent lane is used to characterize the position of the boundary of the adjacent lane;
  • the boundary information includes lane line information and/or lane edge position information;
  • The road structure information includes the location information of the merging point of the own lane and the adjacent lane, and/or the location information of the separation point of the own lane and the adjacent lane.
  • Paint markings may be used to demarcate the boundaries between lanes, or protrusions (such as guardrails or concrete piers) may be used to delimit the boundaries. Of course, other ways of dividing the boundaries are also possible.
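  • To make these terms concrete, the following is a minimal Python sketch of how the boundary information and the road structure information could be represented in code; the class and field names are illustrative assumptions and are not defined in the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y) in the vehicle coordinate frame, in metres


@dataclass
class LaneBoundaryInfo:
    """Boundary information of a lane: lane line and/or lane edge positions."""
    lane_line: Optional[List[Point]] = None  # e.g. a painted marking seen by a camera
    lane_edge: Optional[List[Point]] = None  # e.g. a guardrail detected by radar


@dataclass
class RoadStructureInfo:
    """Road structure information: merging and/or separation point locations."""
    merge_point: Optional[Point] = None       # where the own lane and an adjacent lane merge
    separation_point: Optional[Point] = None  # where the own lane and an adjacent lane separate
```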
  • the road structure detection method provided by the embodiment of the present application does not need to rely on a high-precision map, and can complete road structure detection in an area not covered by a high-precision map.
  • the road structure detection method provided by the present application has a wider application range, and can be applied to areas covered by high-precision maps and areas not covered by high-precision maps.
  • The lane line may be drawn on the road surface with paint, in which case a visual sensor (such as a camera) can be used to collect visual information of the lane line and determine its position.
  • The lane edge may instead be delimited by protrusions (such as guardrails). In that case, a radar or a component with a similar function can emit a specific wave, receive the wave reflected by the protrusion at the lane edge, and use the characteristics of the transmitted and reflected waves (such as the phase difference or frequency difference) to obtain the position of the lane edge.
  • The visual information of a lane edge such as a guardrail can also be collected by a camera to determine the position information of the guardrail.
  • The boundary information of the adjacent lane is determined according to the detected driving trajectory information and/or the road boundary information, where the road boundary information is used to characterize the position of the road boundary.
  • That is, the boundary information of the adjacent lane can be determined based on the detected driving trajectory information alone, based on the road boundary information alone, or based on both the detected driving trajectory information and the road boundary information.
  • the visual sensor of the vehicle can be used to collect the visual information of the vehicle in the adjacent lane to determine the detected driving trajectory information.
  • the vehicle's radar or components with similar functions can also be used to determine the position and speed of the vehicle in other lanes by means of receiving and transmitting lasers, millimeter waves, etc., to determine the driving trajectory information of the other lane.
  • other methods can also be used to determine the trajectory information of other lanes.
  • Components such as a camera can be used to collect visual information of the road boundary line and thereby determine the location of the road boundary, that is, the road boundary information.
  • Components such as radar can also be used to determine the road boundary information.
  • In the embodiments of this application, the adjacent-lane boundary information is inferred from the road boundary information and/or the detected vehicle trajectory information, rather than being detected directly by a camera.
  • This avoids the problems caused by the limited detection performance of a camera (such as a short detection range) and by environmental factors (such as haze).
  • the boundary information of the adjacent lane includes at least one of the following:
  • For example, to determine the location of the merging point or the separation point between the own lane and the right adjacent lane, the boundary information of the right adjacent lane needs to be determined first.
  • In one case, the boundary information of the right adjacent lane refers to the position of the left lane line (RL) of the right adjacent lane.
  • The position of RL can be derived from the road boundary information: the first position of RL is obtained by shifting the road boundary to the left by a first width. Alternatively, the position of RL can be derived from the driving trajectory of another lane: the second position of RL is obtained by shifting the driving trajectory to the left by a second width. Alternatively, the position of RL can be derived from both the road boundary position and the driving trajectory of another lane (a sketch of this lateral-offset-and-fuse procedure is given after the width definitions below).
  • In that case, a third position of RL (which can be referred to as the fusion position) is obtained by fusing the first position of RL and the second position of RL through a preset algorithm.
  • The deviation between the fusion position and the actual merging point (and/or separation point) position can then be corrected to improve the accuracy of the final position result.
  • In another case, the boundary information of the right adjacent lane refers to the position of the left edge of the right adjacent lane.
  • The position of the left edge can be derived from the road boundary position: the seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width. Alternatively, the position of the left edge can be derived from the driving trajectory of another lane: the eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width. Alternatively, the position of the left edge can be derived from both, in which case a ninth position of the left edge is obtained by fusing the seventh position and the eighth position of the left edge through a preset algorithm.
  • Similarly, the boundary information of the left adjacent lane may refer to the position of the right lane line (LR) of the left adjacent lane.
  • The fourth position of LR is obtained by shifting the road boundary to the right by a third width; or, the fifth position of LR is obtained by shifting the driving trajectory to the right by a fourth width; or, the sixth position of LR is obtained by fusing the fourth position of LR and the fifth position of LR through a preset algorithm.
  • Alternatively, the boundary information of the left adjacent lane refers to the position of the right edge of the left adjacent lane.
  • The tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by a seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by an eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position and the eleventh position of the right edge through a preset algorithm.
  • The first width and the third width are integer multiples of the lane width; the second width, the fourth width, the sixth width, and the eighth width are odd multiples of half the lane width; the fifth width is the lane width; and the seventh width is an integer multiple of half the lane width.
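  • The following is a minimal Python sketch, under simplifying assumptions, of the lateral-offset-and-fuse procedure described above: the road boundary and a detected driving trajectory are each shifted sideways by a width derived from the lane width, and the two candidate positions of the adjacent-lane boundary are fused. The function names, the assumed lane width, the pure sideways shift, and the simple averaging fusion are illustrative choices, not the patent's "preset algorithm".

```python
import numpy as np

LANE_WIDTH = 3.5  # metres; road prior data (an assumed typical value)


def shift_left(polyline: np.ndarray, width: float) -> np.ndarray:
    """Shift a polyline of (x, y) points to the left by `width` metres.

    Simplification: the road is assumed to run roughly along +x in the
    vehicle frame, so a left shift is a pure +y translation.  A fuller
    implementation would offset each point along the local normal.
    """
    shifted = polyline.copy()
    shifted[:, 1] += width
    return shifted


def fuse(candidate_a: np.ndarray, candidate_b: np.ndarray) -> np.ndarray:
    """Fuse two candidate boundary estimates (here: a plain average)."""
    return (candidate_a + candidate_b) / 2.0


# Example inputs: the road boundary and a driving trajectory detected in the
# right adjacent lane, both as (x, y) polylines in the vehicle frame.
road_boundary = np.array([[x, -5.25] for x in range(0, 60, 5)], dtype=float)
right_lane_trajectory = np.array([[x, -3.5] for x in range(0, 60, 5)], dtype=float)

# First position of RL: road boundary shifted left by the first width
# (an integer multiple of the lane width; here 1 * LANE_WIDTH).
rl_first = shift_left(road_boundary, 1 * LANE_WIDTH)

# Second position of RL: driving trajectory shifted left by the second width
# (an odd multiple of half the lane width; here 1 * LANE_WIDTH / 2).
rl_second = shift_left(right_lane_trajectory, LANE_WIDTH / 2)

# Third (fusion) position of RL.
rl_fused = fuse(rl_first, rl_second)
print(rl_fused[:3])
```

  • In this idealized example both candidates coincide; in practice the road boundary and the detected trajectory are noisy and disagree, and the fusion step reconciles the two estimates.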
  • The distance between the merging point and a reference point is less than or equal to a first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold, where the reference point includes the vehicle.
  • When the initially determined distance between the merging point (and/or separation point) and the vehicle satisfies this condition, the preliminarily determined position is deemed to be accurate, and there is no need to adjust it further.
  • Otherwise, it may be that the widths by which the road boundary and/or the driving trajectory were translated when determining the adjacent-lane boundary information (that is, one or more of the first to eighth widths) are not accurate, which in turn makes the initially determined location of the merging point and/or separation point inaccurate. In this case, the position of the merging point and/or the separation point needs to be further adjusted to improve the accuracy of the road structure detection result.
  • To do so, one or more of the first to eighth widths are adjusted according to the first threshold, and the merging point and/or the separation point is then re-determined based on the adjusted width or widths (an illustrative sketch of this adjustment follows the note on road prior data below).
  • In other words, adjusting the width by which the road boundary position and/or the driving trajectory is translated changes the adjacent-lane boundary information derived from them, so that the obtained adjacent-lane boundary information is more accurate.
  • The position of the merging point (and/or the separation point) determined in this way is therefore also more accurate.
  • Based on it, a more precise driving strategy can be formulated to guide the driving of the vehicle and improve driving safety.
  • the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
  • the boundary information of the adjacent lane is determined according to the road prior data, and the road prior data includes the lane width.
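  • As a rough illustration of the width-adjustment idea above, the sketch below checks an estimated merging point against the first threshold and, if needed, corrects the translation width. The threshold value, the estimation of the merging point as the closest approach of the two boundaries, and the halving-style adjustment rule are all illustrative assumptions rather than the patent's specific algorithm.

```python
import numpy as np

FIRST_THRESHOLD = 80.0  # metres; assumed to be the sensing range of the sensor
LANE_WIDTH = 3.5        # metres; road prior data


def estimate_merge_point(own_boundary: np.ndarray,
                         adjacent_boundary: np.ndarray) -> np.ndarray:
    """Illustrative merging-point estimate: the point where the two boundary
    polylines come closest to each other."""
    diffs = own_boundary[:, None, :] - adjacent_boundary[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    i, j = np.unravel_index(np.argmin(dists), dists.shape)
    return (own_boundary[i] + adjacent_boundary[j]) / 2.0


def adjust_width(width: float, merge_point: np.ndarray) -> float:
    """Illustrative adjustment rule: if the estimated merging point lies
    farther from the vehicle (at the origin) than the first threshold, the
    translation width is reduced by half a lane width and the boundary is
    re-derived; otherwise the initial estimate is deemed accurate."""
    distance = float(np.linalg.norm(merge_point))
    if distance <= FIRST_THRESHOLD:
        return width
    return max(width - LANE_WIDTH / 2.0, 0.0)


# Tiny usage example with synthetic boundaries that converge ahead of the vehicle.
own_right_line = np.array([[x, -1.75] for x in range(0, 100, 5)], dtype=float)
adjacent_left_line = np.array([[x, -5.25 + 0.05 * x] for x in range(0, 100, 5)], dtype=float)

merge = estimate_merge_point(own_right_line, adjacent_left_line)
print(merge, adjust_width(LANE_WIDTH, merge))
```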
  • the present application provides a road structure detection device.
  • The device may be a vehicle, or an apparatus that supports the vehicle in realizing its driving functions and is used in conjunction with the vehicle.
  • It may also be a component in the vehicle, such as the chip system in the vehicle, or the operating system and/or a driver running on the computer system of the vehicle.
  • It may also be other equipment (such as a server) or a chip in that equipment.
  • the device includes a determination module and an adjustment module, and these modules can execute the road structure detection method in any of the design examples in the first aspect, specifically:
  • The determination module is used to determine the boundary information of the own lane, which characterizes the position of the boundary of the own lane, and to determine the boundary information of the adjacent lane, which characterizes the position of the boundary of the adjacent lane.
  • the boundary information includes lane line information and/or lane edge position information.
  • the determining module is also used to determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane.
  • the road structure information includes the location information of the merge point of the own lane and the adjacent lane and/or the position information of the separation point of the own lane and the adjacent lane.
  • the boundary information of the adjacent lane is determined based on the detected driving trajectory information and/or road boundary information, and the road boundary information is used to characterize the position of the road boundary.
  • the boundary information of adjacent lanes includes at least one of the following:
  • The first position of RL is obtained by shifting the road boundary to the left by the first width; or, the second position of RL is obtained by shifting the driving trajectory to the left by the second width; or, the third position of RL is obtained by fusing the first position of RL and the second position of RL through a preset algorithm.
  • The fourth position of LR is obtained by shifting the road boundary to the right by the third width; or, the fifth position of LR is obtained by shifting the driving trajectory to the right by the fourth width; or, the sixth position of LR is obtained by fusing the fourth position of LR and the fifth position of LR through a preset algorithm.
  • The seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by the fifth width; or, the eighth position of the left edge is obtained by shifting the driving trajectory to the left by the sixth width; or, the ninth position of the left edge is obtained by fusing the seventh position of the left edge and the eighth position of the left edge through a preset algorithm.
  • The tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by the seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by the eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position of the right edge and the eleventh position of the right edge through a preset algorithm.
  • The first width and the third width are integer multiples of the lane width; the second width, the fourth width, the sixth width, and the eighth width are odd multiples of half the lane width; the fifth width is the lane width; and the seventh width is an integer multiple of half the lane width.
  • the distance between the merging point and the reference point is less than or equal to the first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold, and the reference point includes the vehicle.
  • The adjustment module is used to adjust one or more of the first width to the eighth width according to the first threshold, and to adjust the merging point and/or the separation point based on the adjusted width or widths.
  • the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
  • the boundary information of the adjacent lane is determined based on the road prior data, and the road prior data includes the lane width.
  • an embodiment of the present application provides a road structure detection device, which has the function of realizing the road structure detection method of any one of the above-mentioned first aspects.
  • This function can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • a road structure detection device which includes a processor.
  • the processor is configured to couple with the memory, and after reading the instructions in the memory, execute the road structure detection method according to any one of the above-mentioned first aspects according to the instructions.
  • the memory may be an external memory of the device.
  • the external memory is coupled with the processor.
  • the memory may also refer to the memory included in the device. That is, the device optionally includes a memory.
  • Optionally, the device may also include a communication interface used for communication between the device and other equipment.
  • the communication interface can be, for example, but not limited to, a transceiver, a transceiver circuit, and the like.
  • an embodiment of the present application also provides a computer-readable storage medium, including instructions, which when run on a computer, cause the computer to execute the method of the first aspect.
  • the embodiments of the present application also provide a computer program product, including instructions, which when run on a computer, cause the computer to execute the method of the first aspect.
  • an embodiment of the present application provides a road structure detection device.
  • the detection device may be a sensor device, such as a radar device.
  • the device may also be a chip system, and the chip system may include a processor and a memory, which is used to implement the functions of the method in the first aspect described above.
  • the chip system can be composed of chips, and can also include chips and other discrete devices.
  • a road structure detection device may be a circuit system.
  • the circuit system includes a processing circuit configured to execute the road structure detection method according to any one of the above-mentioned first aspects.
  • Embodiments of the present application provide a system, which includes the device of any one of the second to fourth aspects, and/or the chip system of the seventh aspect, and/or the circuit system of the eighth aspect, and/or the readable storage medium of the fifth aspect, and/or the computer program product of the sixth aspect, and/or one or more types of sensors, and/or a smart car.
  • one or more types of sensors can be, but not limited to, visual sensors (such as cameras, etc.), radars or other sensors with similar functions.
  • embodiments of the present application provide a smart car, which includes the device of any one of the second to fourth aspects, and/or the chip system of the seventh aspect, and/or the circuit system of the eighth aspect, And/or the readable storage medium in the fifth aspect, and/or the computer program product in the sixth aspect.
  • FIG. 1 is a schematic structural diagram of an autonomous vehicle provided by an embodiment of the application.
  • FIG. 2 is a schematic structural diagram of an autonomous vehicle provided by an embodiment of the application.
  • FIG. 3 is a schematic structural diagram of a computer system provided by an embodiment of this application.
  • FIG. 4 is a schematic structural diagram of a neural network processor provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of the application of a cloud-side commanded autonomous vehicle provided by an embodiment of this application;
  • FIG. 6 is a schematic diagram of an application of a cloud-side commanded autonomous vehicle provided by an embodiment of the application
  • FIG. 7 is a schematic structural diagram of a computer program product provided by an embodiment of this application.
  • FIG. 8 is a schematic diagram of a scene of a road structure detection method provided by an embodiment of the application.
  • FIG. 9 is a schematic flowchart of a road structure detection method provided by an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of a road structure detection device provided by an embodiment of the application.
  • FIG. 13 is a schematic structural diagram of a road structure detection device provided by an embodiment of the application.
  • the road structure detection method provided by the embodiment of the present application is applied to a smart vehicle, or applied to other devices with control functions (such as a cloud server).
  • the vehicle can implement the road structure detection method provided by the embodiments of the present application through its components (including hardware and software), obtain lane line information of its own lane, and border information of adjacent lanes, and determine road structure information based on the two.
  • When the method is executed by another device (such as a server), the determined road structure information is used to formulate driving strategies for the target vehicle.
  • Fig. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • The vehicle 100 can control itself while in the automatic driving mode: it can determine the current state of the vehicle and its surrounding environment through human operation, determine the possible behavior of at least one other vehicle in the surrounding environment, determine the confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information.
  • The vehicle 100 can also be set to operate without human interaction.
  • the vehicle 100 may include various subsystems, such as a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108 and a power supply 110, a computer system 112, and a user interface 116.
  • the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each of the subsystems and elements of the vehicle 100 may be wired or wirelessly interconnected.
  • the travel system 102 may include components that provide power movement for the vehicle 100.
  • the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
  • the engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, such as a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine.
  • the engine 118 converts the energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy for other systems of the vehicle 100.
  • the transmission device 120 can transmit mechanical power from the engine 118 to the wheels 121.
  • the transmission device 120 may include a gearbox, a differential, and a drive shaft.
  • the transmission device 120 may also include other devices, such as a clutch.
  • the drive shaft may include one or more shafts that can be coupled to one or more wheels 121.
  • the sensor system 104 may include several sensors that sense information about the environment around the vehicle 100.
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS), a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, and a radar 126, a laser rangefinder 128, and a camera 130.
  • the sensor system 104 may also include sensors of the internal system of the monitored vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and identification are key functions for the safe operation of the autonomous vehicle 100.
  • Different types of sensors have different characteristics. For millimeter-wave radar, it can work around the clock and has good range and speed accuracy, but the classification and recognition effect is not good.
  • the camera has a strong resolution and a strong target recognition and classification effect, but because of the loss of depth information, the performance of distance measurement and speed measurement may be poor.
  • Lidar has good depth information and can also perform range and speed measurement, but the detection distance is not far. It can be seen that these different types of sensors have different characteristics. Under different functional requirements, different sensors are required for fusion processing to achieve better performance.
  • the positioning system 122 can be used to estimate the geographic location of the vehicle 100.
  • the IMU 124 is used to sense changes in the position and orientation of the vehicle 100 based on inertial acceleration.
  • the IMU 124 may be a combination of an accelerometer and a gyroscope.
  • the radar 126 can also be called a detector, a detection device, or a radio signal sending device.
  • the radio signal can be used to sense objects in the surrounding environment of the vehicle 100.
  • The radar 126 may also be used to sense the speed and/or direction of an object. Its working principle is to transmit a signal (also called a detection signal) and receive the reflected signal returned by a target object (also referred to herein as the echo signal of the target object, the two-way echo signal, and the like), thereby detecting the corresponding target object.
  • the radar has a variety of different radar waveforms according to different purposes, including but not limited to pulsed millimeter waves, stepped frequency modulated continuous waves, and linear frequency modulated continuous waves.
  • the linear frequency modulation continuous wave is more common and the technology is more mature.
  • The linear frequency-modulated continuous wave has a large time-bandwidth product and usually provides high ranging accuracy and resolution. It supports adaptive cruise control (ACC), automatic emergency braking (Autonomous Emergency Braking, AEB), lane change assist (Lane Change Assist, LCA), blind spot monitoring (Blind Spot Monitoring, BSD), and other assisted driving functions.
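  • For reference, the following short Python sketch shows the standard FMCW (linear frequency-modulated continuous wave) relations between the measured beat frequency and the target range. These are textbook formulas for a sawtooth chirp, not taken from the patent, and the numeric parameters are purely illustrative.

```python
C = 3.0e8  # speed of light, m/s


def fmcw_range(beat_frequency_hz: float, chirp_duration_s: float,
               bandwidth_hz: float) -> float:
    """Target range from the beat frequency of a sawtooth FMCW chirp:
    R = c * f_b * T / (2 * B)."""
    return C * beat_frequency_hz * chirp_duration_s / (2.0 * bandwidth_hz)


def fmcw_range_resolution(bandwidth_hz: float) -> float:
    """Range resolution of an FMCW radar: dR = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)


# Illustrative automotive-style parameters.
T = 50e-6       # chirp duration: 50 microseconds
B = 300e6       # sweep bandwidth: 300 MHz
f_beat = 1.0e6  # measured beat frequency: 1 MHz

print(fmcw_range(f_beat, T, B))   # -> 25.0 m
print(fmcw_range_resolution(B))   # -> 0.5 m
```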
  • the laser rangefinder 128 can use laser light to sense objects in the environment where the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • the camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100.
  • the camera 130 may be a still camera or a video camera.
  • the control system 106 may control the operation of the vehicle 100 and its components.
  • the control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
  • the steering system 132 is operable to adjust the forward direction of the vehicle 100.
  • it may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the vehicle 100.
  • the braking unit 136 is used to control the vehicle 100 to decelerate.
  • the braking unit 136 may use friction to slow down the wheels 121.
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current.
  • the braking unit 136 may also take other forms to slow down the rotation speed of the wheels 121 to control the speed of the vehicle 100.
  • the computer vision system 140 may be operable to process and analyze the images captured by the camera 130 in order to identify objects and/or features in the surrounding environment of the vehicle 100.
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • the computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision technologies.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
  • the route control system 142 is used to determine the travel route of the vehicle 100.
  • the route control system 142 may combine data from sensors, the positioning system 122, and one or more predetermined maps to determine a travel route for the vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise surpass potential obstacles in the environment of the vehicle 100.
  • The control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
  • the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripheral devices 108.
  • the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
  • the peripheral device 108 provides a means for the user of the vehicle 100 to interact with the user interface 116.
  • the onboard computer 148 may provide information to the user of the vehicle 100.
  • the user interface 116 can also operate the on-board computer 148 to receive user input.
  • the on-board computer 148 can be operated through a touch screen.
  • the peripheral device 108 may provide a means for the vehicle 100 to communicate with other devices or users located in the vehicle.
  • the microphone 150 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100.
  • the speaker 152 may output audio to the user of the vehicle 100.
  • the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network.
  • The wireless communication system 146 may use third-generation (3G) cellular communication, such as code division multiple access (CDMA), evolution-data only (EVDO), the global system for mobile communications (GSM), or the general packet radio service (GPRS); fourth-generation (4G) cellular communication, such as long term evolution (LTE); or fifth-generation (5G) cellular communication. The wireless communication system 146 may also use wireless fidelity (WiFi) to communicate with a wireless local area network (WLAN).
  • the wireless communication system 146 may directly communicate with the device using an infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols may also be used, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices.
  • the power supply 110 may provide power to various components of the vehicle 100.
  • the power source 110 may be a rechargeable lithium ion or lead-acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100.
  • the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
  • the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as a data storage device 114.
  • the computer system 112 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 113 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor may be a dedicated device such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, memory, and other elements as being in the same physical enclosure, those of ordinary skill in the art should understand that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be located in the same physical housing.
  • For example, the memory may be a hard drive or another storage medium located in a different physical enclosure.
  • Therefore, a reference to a processor or computer system will be understood to include a reference to a collection of processors, computer systems, or memories that may or may not operate in parallel.
  • Some components, such as the steering component and the deceleration component, may each have their own processor that only performs calculations related to that component's specific function.
  • In various aspects described herein, the processor may be located far away from the vehicle and communicate with it wirelessly; such a processor may be referred to as a remote processor.
  • In other aspects, some of the processes described herein are executed on a processor disposed in the vehicle while others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
  • the data storage device 114 may include instructions 115 (eg, program logic), which may be executed by the processor 113 to perform various functions of the vehicle 100, including those described above.
  • The data storage device 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
  • the data storage device 114 may also store data, such as road maps, route information, the location, direction, and speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 112 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • In the embodiments of the present application, the data storage device 114 obtains the lane line information of the own lane from the sensor system 104 or other components of the vehicle 100, and may also obtain the detected driving trajectory information and/or the road boundary information from those components.
  • the road boundary information is used to characterize the location of the road boundary.
  • the data storage device 114 may also store the information obtained above.
  • For example, the vehicle can obtain the distance to other vehicles and the speed of other vehicles based on the speed and distance measurement functions of the radar 126.
  • the processor 113 can obtain the information from the data storage device 114, and determine the road structure information based on the information.
  • Road structure information can be used to assist the vehicle in determining a driving strategy to control the vehicle to drive.
  • the user interface 116 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as one or more of the wireless communication system 146, the onboard computer 148, the microphone 150, and the speaker 152.
  • the computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may use input from the control system 106 to control the steering system 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • the data storage device 114 may exist partially or completely separately from the vehicle 100.
  • the above-mentioned components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation to the embodiments of the present application.
  • A smart car traveling on a road can recognize objects in its surrounding environment to determine an adjustment to its current speed.
  • the object may be other vehicles, traffic control equipment, or other types of objects.
  • Each recognized object can be considered independently, and its respective characteristics, such as its current speed, acceleration, and distance from the vehicle, can be used to determine the speed to which the smart car should be adjusted.
  • The vehicle 100, or a computing device associated with the vehicle 100, may predict the behavior of an identified object based on the characteristics of the object and the state of the surrounding environment (for example, traffic, rain, ice on the road, and so on).
  • The recognized objects may also depend on each other's behavior, so all of the recognized objects can be considered together to predict the behavior of a single recognized object.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object. In other words, the smart car can determine what stable state the vehicle will need to adjust to (for example, accelerating, decelerating, or stopping) based on the predicted behavior of the object. In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 on the road on which it is traveling, the curvature of the road, and the proximity of static and dynamic objects.
  • the computing device can also provide instructions to modify the steering angle of the vehicle 100 so that the smart car follows a given trajectory and/or maintains contact with objects near the smart car (for example, on a road).
  • The above-mentioned vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, trolley, or the like; this is not particularly limited in the embodiments of the present application.
  • the smart vehicle may also include a hardware structure and/or software module, and the above functions are implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain function among the above-mentioned functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraint conditions of the technical solution.
  • the vehicle may include the following modules:
  • the environment perception module 201 is used to obtain information of vehicles, pedestrians, and road objects recognized by roadside sensors and/or on-board sensors.
  • the roadside sensor and the vehicle-mounted sensor can be a camera (camera), lidar, millimeter wave radar, etc.
  • the data acquired by the environmental perception module can be the original collected video stream, radar point cloud data, or analyzed structured data on the position, speed, steering angle, and size of people, vehicles, and objects.
  • Alternatively, the environment perception module can process these data into recognizable, structured data such as the positions, speeds, steering angles, and sizes of people, vehicles, and objects, and transmit these data to the control module 202 so that the control module 202 can generate a driving strategy.
  • the environment perception module 201 includes a camera or a radar, which is used to obtain the boundary information of the lane where the vehicle is located. It is also used to obtain road boundary information and/or detected vehicle trajectory information. Wherein, road boundary information and/or detected vehicle trajectory information is used to determine adjacent lane boundary information. The boundary information of the own lane and the boundary information of the adjacent lane are used to determine the road structure information.
  • The control module 202 may be a conventional control module of the vehicle. Its function is to determine the road structure information based on the data input by the environment perception module 201 (the own-lane boundary information, the driving trajectory information and/or the road boundary information). It is also used to fuse the adjacent-lane boundary information determined according to the driving trajectory information with the adjacent-lane boundary information determined according to the road boundary information to obtain more accurate adjacent-lane boundary information. It is further used to generate a driving strategy according to the road structure information, output an action instruction corresponding to the driving strategy, and send the action instruction to the control module 203; the action instruction is used to instruct the control module 203 to control the driving of the vehicle.
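  • The following is a minimal, hypothetical Python sketch of the data flow between the environment perception module 201, the control module 202, and the control module 203 described above; the class names, fields, and the trivial strategy logic are illustrative placeholders, not the vehicle's actual interfaces.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]


@dataclass
class PerceptionOutput:
    """Structured data produced by the environment perception module 201."""
    own_lane_boundary: List[Point]
    detected_trajectory: Optional[List[Point]] = None
    road_boundary: Optional[List[Point]] = None


@dataclass
class ActionInstruction:
    """Instruction sent from control module 202 to control module 203."""
    target_speed_mps: float
    steering_angle_deg: float


def control_module_202(perception: PerceptionOutput) -> ActionInstruction:
    """Determine the road structure from the perception data and derive a
    (placeholder) driving strategy."""
    # ... adjacent-lane boundary and merging/separation points would be derived here ...
    merging_ahead = perception.detected_trajectory is not None  # placeholder check
    if merging_ahead:
        return ActionInstruction(target_speed_mps=15.0, steering_angle_deg=0.0)
    return ActionInstruction(target_speed_mps=20.0, steering_angle_deg=0.0)


def control_module_203(instruction: ActionInstruction) -> None:
    """Execute the action instruction (here: just report it)."""
    print(f"drive at {instruction.target_speed_mps} m/s, "
          f"steer {instruction.steering_angle_deg} deg")


control_module_203(control_module_202(PerceptionOutput(own_lane_boundary=[(0.0, 1.75)])))
```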
  • the control module 202 may be a collection of components or subsystems with control and processing functions. For example, it may be the processor 113 shown in FIG. 1, or some functional modules in the processor, or similar components or similar subsystems.
  • the control module 203 is used to receive action instructions from the control module 202 to control the vehicle to complete the driving operation.
  • the control module 203 may be a collection of components or subsystems with control and processing functions. For example, it may be the processor 113 shown in FIG. 1, or a similar component or a similar subsystem.
  • the above modules can also be integrated into one module.
  • the integrated module is used to provide the above-mentioned multiple functions.
  • Vehicle-mounted communication module (not shown in Figure 2): used for information exchange between the vehicle and other vehicles.
  • the vehicle-mounted communication module may be, for example, but not limited to, a component in the wireless communication system 146 as shown in FIG. 1.
  • the storage component (not shown in FIG. 2) is used to store the executable code of each of the above-mentioned modules. Running these executable codes can implement part or all of the method flow in the embodiments of the present application.
  • the storage component may be, for example, but not limited to, the component in the data storage device 114 as shown in FIG. 1.
  • the computer system 112 shown in FIG. 1 includes a processor 303, and the processor 303 is coupled to a system bus 305.
  • the processor 303 may be one or more processors, where each processor may include one or more processor cores.
  • A display adapter (video adapter) 307 can drive a display 309, and the display 309 is coupled to the system bus 305.
  • the system bus 305 is coupled with an input/output (I/O) bus (BUS) 313 through a bus bridge 311.
  • the I/O interface 315 and the I/O bus 313 are coupled.
  • The I/O interface 315 communicates with a variety of I/O devices, such as an input device 317 (for example, a keyboard, a mouse, or a touch screen), a media tray 321 (for example, a CD-ROM or a multimedia interface), a transceiver 323 (which can send and/or receive radio communication signals), a camera 355 (which can capture static and dynamic digital video images), and an external Universal Serial Bus (USB) interface 325.
  • the processor 303 may be any traditional processor, including a Reduced Instruction Set Computer (RISC) processor, a Complex Instruction Set Computer (CISC) processor, or a combination of the foregoing.
  • the processor may be a dedicated device such as an application specific integrated circuit (ASIC).
  • the processor 303 may be a neural network processor or a combination of a neural network processor and the foregoing traditional processors.
  • the computer system 112 may be located remotely from the vehicle and may communicate wirelessly with the vehicle 100.
  • some of the processes described herein may be configured to be executed on a processor in the vehicle, and other processes may be executed by a remote processor, including taking actions required to perform a single manipulation.
  • the computer system 112 may communicate with a software deployment server 349 through a network interface 329.
  • the network interface 329 is a hardware network interface, such as a network card.
  • The network 327 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (VPN).
  • the network 327 may also be a wireless network, such as a WiFi network, a cellular network, and so on.
  • the hard disk drive interface 331 and the system bus 305 are coupled.
  • the hard disk drive interface 331 and the hard disk drive 333 are connected.
  • the system memory 335 is coupled to the system bus 305.
  • the data running in the system memory 335 may include an operating system (OS) 337 and application programs 343 of the computer system 112.
  • the operating system includes but is not limited to Shell 339 and kernel (kernel) 341.
  • Shell 339 is an interface between the user and the kernel of the operating system.
  • the shell is the outermost layer of the operating system. The shell manages the interaction between the user and the operating system: it waits for the user's input, interprets the user's input to the operating system, and processes the output of the operating system.
  • the kernel 341 consists of the parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel usually runs processes, provides inter-process communication, and handles CPU time-slice management, interrupts, memory management, I/O management, and other functions.
  • the application program 343 includes programs related to controlling car driving, for example, a program that manages the interaction between the car and an obstacle on the road, a program that controls the route or speed of the car, and a program that controls the interaction between the car and other cars on the road.
  • the application program 343 may also exist on the system of the deploying server 349. In one embodiment, when the application program 343 needs to be executed, the computer system 112 may download it from the deploying server 349.
  • the application program 343 may be an application program that controls the vehicle to determine road structure information based on the lane line information of the own lane and the boundary information of adjacent lanes (the latter determined based on road boundary information and/or detected driving trajectory information).
  • the processor 303 of the computer system 112 calls the application program 343 to obtain the final road structure.
  • the sensor 353 is associated with the computer system 112.
  • the sensor 353 is used to detect the environment around the computer system 112.
  • the sensor 353 can detect animals, cars, obstacles, and pedestrian crossings.
  • the sensor can also detect the surrounding environment of the above-mentioned animals, cars, obstacles and crosswalks.
  • the environment around an animal includes, for example, other animals that appear around it, weather conditions, and the brightness of the surrounding environment.
  • the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, etc.
  • the road structure detection method in the embodiments of the present application may also be executed by a chip system.
  • the chip system can be located in the vehicle or in another location, such as a server. Refer to FIG. 4, which is an exemplary structure diagram of a chip system provided by an embodiment of the present application.
  • a neural network processor (neural-network processing unit, NPU) 50 can be mounted on a host CPU (host CPU) as a coprocessor, and the host CPU assigns tasks to the NPU.
  • the core part of the NPU is the arithmetic circuit 503.
  • the arithmetic circuit 503 is controlled by the controller 504, so that the arithmetic circuit 503 can extract matrix data in the memory and perform a multiplication operation.
  • the arithmetic circuit 503 includes multiple processing units (Process Engine, PE). In some implementations, the arithmetic circuit 503 is a two-dimensional systolic array. The arithmetic circuit 503 may also be a one-dimensional systolic array, or other electronic circuits capable of performing mathematical operations such as multiplication and addition. In some implementations, the arithmetic circuit 503 is a general-purpose matrix processor.
  • the arithmetic circuit 503 obtains the data corresponding to the weight matrix B from the weight memory 502 and caches it on each PE in the arithmetic circuit 503.
  • the arithmetic circuit 503 fetches the data corresponding to the input matrix A from the input memory 501, and performs matrix operations according to the input matrix A and the weight matrix B, and the partial or final result of the matrix operation can be stored in an accumulator 508.
  • the arithmetic circuit 503 can be used to implement a feature extraction model (such as a convolutional neural network model), and input image data into the convolutional neural network model, and the features of the image can be obtained through the operation of the model. Furthermore, the image features are output to the classifier, and the classifier outputs the classification probability of the object in the image.
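  • As an illustrative sketch only (not the NPU implementation), the core operation performed by the arithmetic circuit 503 together with the accumulator 508 can be emulated as a tiled matrix multiplication whose partial sums are accumulated; all sizes below are arbitrary example values.

      import numpy as np

      def matmul_with_accumulator(A, B, tile=4):
          # A: data from the input memory 501, B: weights from the weight memory 502.
          M, K = A.shape
          K2, N = B.shape
          assert K == K2
          acc = np.zeros((M, N))            # plays the role of the accumulator 508
          for k0 in range(0, K, tile):      # process the inner dimension tile by tile
              acc += A[:, k0:k0 + tile] @ B[k0:k0 + tile, :]
          return acc

      A = np.random.rand(8, 16)
      B = np.random.rand(16, 8)
      assert np.allclose(matmul_with_accumulator(A, B), A @ B)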
  • the unified memory 506 is used to store input data and output data.
  • the weight data in the external memory is directly sent to the weight memory 502 through a direct memory access controller (DMAC) 505.
  • the input data in the external memory can be transferred to the unified memory 506 through the DMAC, or transferred to the input memory 501.
  • the bus interface unit (BIU) 510 is used for the interaction among the advanced extensible interface (AXI) bus, the DMAC 505, and the instruction fetch buffer 509. It is also used by the instruction fetch buffer 509 to obtain instructions from an external memory, and by the DMAC 505 to obtain the original data of the input matrix A or the weight matrix B from the external memory.
  • the DMAC is mainly used to transfer the input data in the external memory to the unified memory 506, to transfer the weight data to the weight memory 502, or to transfer the input data to the input memory 501.
  • the vector calculation unit 507 may include a plurality of operation processing units. When needed, it performs further processing on the output of the arithmetic circuit 503, such as vector multiplication, vector addition, exponential operations, logarithmic operations, and magnitude comparison. It is mainly used for network calculations at non-convolution/non-FC layers of a neural network, such as pooling, batch normalization, and local response normalization.
  • the vector calculation unit 507 stores the processed output vector to the unified memory 506.
  • the vector calculation unit 507 may apply a nonlinear function to the output of the arithmetic circuit 503, such as a vector of accumulated values, to generate the activation value.
  • the vector calculation unit 507 generates a normalized value, a combined value, or both.
  • the processed output vector can also be used as an activation input of the arithmetic circuit 503, for example, for use in a subsequent layer in a neural network.
  • the controller 504 is connected to an instruction fetch buffer 509, and the instructions used by the controller 504 can be stored in the instruction fetch buffer 509.
  • the unified memory 506, the input memory 501, the weight memory 502, and the instruction fetch memory 509 are all on-chip memories.
  • the external memory is a memory external to the NPU hardware architecture.
  • the host CPU and the NPU cooperate to implement the algorithms corresponding to the functions required by the vehicle 100 in FIG. 1, the algorithms corresponding to the functions required by the vehicle shown in FIG. 2, or the algorithm shown in FIG. 3.
  • the main CPU and NPU work together to implement the corresponding algorithms for the functions required by the server.
  • the computer system 112 may also receive information from other computer systems or transfer information to other computer systems.
  • the sensor data collected from the sensor system 104 of the vehicle 100 can be transferred to another computer, and the data can be processed by the other computer.
  • data from the computer system 112 may be transmitted to the computer system 720 on the cloud side via the network for further processing.
  • the network and intermediate nodes can include various configurations and protocols, including the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local area networks, private networks using proprietary communication protocols of one or more companies, Ethernet, WiFi, the hypertext transfer protocol (HTTP), and various combinations of the foregoing. Such communication can be performed by any device capable of transferring data to and from other computers, such as modems and wireless interfaces.
  • the computer system 720 may include a server with multiple computers, such as a load balancing server group. In order to receive, process, and transmit data from the computer system 112, the computer system 720 exchanges information with different nodes of the network.
  • the server 720 may have a configuration similar to the computer system 112 and have a processor 730, a memory 740, instructions 750, and data 760.
  • the data 760 of the server 720 may include weather-related information.
  • the server 720 may receive, monitor, store, update, and transmit various information related to weather.
  • the information may include precipitation, cloud, and/or temperature information and/or humidity information in the form of reports, radar information, forecasts, etc., for example.
  • the cloud service center may receive information (such as data collected by vehicle sensors or other information) from vehicles 513 and 512 in its operating environment 500 via a network 511 such as a wireless communication network.
  • the cloud service center 520 controls the vehicles 513 and 512 by running its stored programs related to controlling car driving according to the received data.
  • the program related to controlling the driving of a car can be: a program that manages the interaction between the car and an obstacle on the road, or a program that controls the route or speed of the car, or a program that controls the interaction between the car and other cars on the road.
  • the cloud service center 520 may provide a part of the map to the vehicles 513 and 512 through the network 511.
  • operations can be divided between different locations.
  • multiple cloud service centers can receive, confirm, combine, and/or send information reports.
  • information reports and/or sensor data can also be sent between vehicles.
  • Other configurations are also possible.
  • the cloud service center 520 sends a suggested solution to the vehicle regarding possible driving situations in the environment (for example, informing the vehicle of an obstacle ahead and how to avoid it). For example, the cloud service center 520 may assist the vehicle in determining how to proceed when facing a specific obstacle in the environment.
  • the cloud service center 520 sends a response to the vehicle indicating how the vehicle should travel in a given scene.
  • the cloud service center 520 can confirm the existence of a temporary stop sign in front of the road based on the collected sensor data. For example, based on the “lane closed” sign and the sensor data of construction vehicles, it can be determined that the lane is closed due to construction.
  • the cloud service center 520 sends a suggested operation mode for the vehicle to pass the obstacle (for example, instructing the vehicle to change lanes to another road).
  • the operation steps used for the vehicle can be added to the driving information map.
  • this information can be sent to other vehicles in the area that may encounter the same obstacle, so as to assist other vehicles not only to recognize the closed lanes but also to know how to pass.
  • the disclosed methods may be implemented as computer program instructions in a machine-readable format, encoded on a computer-readable storage medium, or encoded on other non-transitory media or articles.
  • Figure 7 schematically illustrates a conceptual partial view of an example computer program product arranged in accordance with at least some of the embodiments shown herein, the example computer program product including a computer program for executing a computer process on a computing device.
  • the example computer program product 600 is provided using a signal bearing medium 601.
  • the signal-bearing medium 601 may include one or more program instructions 602 which, when run by one or more processors, can provide all or part of the functions described above with respect to FIGS. 1 to 6, or all or part of the functions described in subsequent embodiments.
  • one or more features in S901 to S903 may be undertaken by one or more instructions associated with the signal bearing medium 601.
  • the program instructions 602 in FIG. 7 also describe example instructions.
  • when the technical solutions of the embodiments of the present application are executed by a vehicle or a component of the vehicle, the computer program product may be a program product used by the vehicle or its components.
  • when the technical solutions of the embodiments of the present application are executed by a device other than the vehicle, such as a server, the computer program product may be a program product used by that other device.
  • the signal-bearing medium 601 may include a computer-readable medium 603, such as, but not limited to, a hard disk drive, a compact disc (CD), a digital video disc (DVD), a digital tape, a memory, a read-only memory (ROM), or a random access memory (RAM).
  • the signal bearing medium 601 may include a computer recordable medium 604, such as, but not limited to, memory, read/write (R/W) CD, R/W DVD, and so on.
  • the signal-bearing medium 601 may include a communication medium 605, such as, but not limited to, digital and/or analog communication media (eg, fiber optic cables, waveguides, wired communication links, wireless communication links, etc.).
  • the signal-bearing medium 601 may be conveyed by a wireless form of the communication medium 605 (for example, a wireless communication medium that complies with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standard or another transmission protocol).
  • the one or more program instructions 602 may be, for example, computer-executable instructions or logic-implemented instructions.
  • a computing device such as that described with respect to FIGS. 1 to 6 may be configured to provide various operations, functions, or actions in response to the program instructions 602 conveyed to the computing device through one or more of the computer-readable medium 603, the computer recordable medium 604, and/or the communication medium 605. It should be understood that the arrangement described here is for illustrative purposes only.
  • the road structure detection method provided by the embodiment of the present application is applied in automatic/semi-automatic driving or other driving scenarios. Specifically, it is applied in the scene of determining road structure information.
  • the road structure information includes the location information of the merge point of the own lane and an adjacent lane and/or the separation point of the own lane and the adjacent lane. Exemplarily, the method is applied to the scene shown in FIG. 8(a) or FIG. 8(b). As shown in FIG. 8(a), there is a merge point A between the own lane in which the vehicle 801 is located and an adjacent lane (that is, a lane next to the own lane), in this case the right adjacent lane.
  • the location of the merging point A is the intersection of the right lane line (HR) of the own lane and the right left lane line (RL) of the right adjacent lane.
  • lane lines generally refer to flat markings drawn on the road surface using a coating such as paint.
  • in FIG. 8(b), there is a separation point B between the own lane where the vehicle 802 is located and the right adjacent lane.
  • the position of the separation point B is related to the position of the guardrail (or cement pier) on the edge of the own lane and the guardrail (or cement pier) on the edge of the adjacent lane.
  • the position of the separation point B is the intersection of the two guardrails.
  • the location of the separation point B is the intersection of the two cement piers.
  • the difference between the scenes shown in FIG. 8(a) and FIG. 8(b) lies in which factors determine the location of the merge point or the separation point. In one case, the location of the merge point or the separation point is determined by the intersection of lane lines painted on the road surface. In the other case, it is determined by the positions of protrusions on the road surface.
  • the embodiment of the present application provides a road structure detection method, which can be applied to the device shown in FIG. 1 to FIG. 6 or the computer program product shown in FIG. 7 or other devices far away from the vehicle.
  • for brevity, the entity executing the technical solution is omitted below. Referring to FIG. 9, the method includes the following steps:
  • S901: Determine the boundary information of the own lane. The boundary information of the own lane is used to characterize the position of the boundary of the own lane (i.e., the current lane); it includes lane line information of the own lane and/or position information of the edge of the own lane.
  • protrusions can be used to delimit the lane edges between lanes.
  • the position information of the edge of the own lane includes, but is not limited to, the position information of the protrusion on the edge of the own lane.
  • the protrusions on the edge of the lane include, but are not limited to, guardrails and concrete piers on the edge of the lane.
  • a visual sensor (such as a camera) can be used to collect visual information of the lane lines to determine their positions. Radar or similar components can be used to obtain the position of the edge of the lane.
  • S902 Determine boundary information of adjacent lanes.
  • the adjacent lane boundary information is used to characterize the position of the boundary of the adjacent lane; the boundary information of the adjacent lane includes the lane line information of the adjacent lane and/or the position information of the edge of the adjacent lane.
  • the position information of the edge of the adjacent lane may refer to the position information of the protrusion on the edge of the adjacent lane.
  • protrusions include, but are not limited to, guardrails and concrete piers on the edge of the adjacent lane.
  • the adjacent lane boundary information is determined based on the detected vehicle trajectory information and/or road boundary information, and road prior data. That is, the adjacent lane boundary information is determined based on the road prior data and the detected vehicle trajectory information. Alternatively, the adjacent lane boundary information is determined based on road prior data and road boundary information. Alternatively, the adjacent lane boundary information is determined based on the road prior data, the detected vehicle trajectory information, and the road boundary information.
  • the road boundary information is used to characterize the position of the road edge (edge).
  • the visual sensor of the vehicle can be used to collect the visual information of the vehicle in the adjacent lane to determine the detected driving trajectory information.
  • the vehicle's radar or components with similar functions can also be used to determine the position and speed of the vehicle in other lanes by means of receiving and transmitting lasers, millimeter waves, etc., to determine the driving trajectory information of the other lane.
  • other methods may also be used to determine the driving trajectory information of other lanes, which is not limited in the embodiment of the present application.
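  • As a hedged illustration of how the detected driving trajectory information might be represented, the observed positions of a vehicle in another lane can be fitted with a cubic curve, consistent with the cubic-equation description used later in this embodiment; the point values and the use of a least-squares fit are assumptions made only for this example.

      import numpy as np

      # (x, y) positions of a vehicle detected in another lane, accumulated over
      # successive radar/camera frames (hypothetical values in the vehicle frame).
      ys = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
      xs = np.array([3.6, 3.7, 3.9, 4.2, 4.6, 5.1])

      # Cubic model x = c3*y**3 + c2*y**2 + c1*y + c0 as the trajectory description.
      coeffs = np.polyfit(ys, xs, deg=3)
      trajectory = np.poly1d(coeffs)
      print(trajectory(12.0))  # lateral offset of the trajectory 12 m ahead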
  • when the road boundary line is marked with a planar pattern such as paint, components such as a camera can be used to collect visual information of the road boundary line to determine the location of the road boundary, that is, the road boundary information.
  • when the road boundary is delimited by protrusions (such as guardrails or cement piers), the camera can likewise be used to determine the road boundary information.
  • components such as radar can also be used to determine the road boundary information. The embodiment of the present application does not limit the manner of determining the road boundary information.
  • the road prior data includes lane width.
  • in different scenarios, the lane width may be different.
  • the width of each lane of urban roads is 3.5 meters
  • the width of each lane of the diversion lanes of intersections is 2.3-2.5 meters
  • the width of each lane of arterial roads (including expressways) is 3.75 meters.
  • the lane width used to determine the adjacent lane boundary information in the embodiment of the present application needs to be related to the scene where the vehicle is currently located.
  • the sensor of the own vehicle such as a camera, can be used to collect the lane width in the current scene.
  • the own vehicle can also directly obtain the lane width in the current scene from other devices in the network (such as a server).
  • the embodiment of the present application does not limit the way of acquiring the lane width.
  • the method of determining the boundary information of adjacent lanes is described in detail below. The description is divided into two cases according to whether the merge point or the separation point is delimited by lane lines or by protrusions:
  • Case 1: When painted lane lines delimit the merge point or separation point, if the adjacent lane is the right adjacent lane, the boundary information of the adjacent lane includes the position information of the left lane line (RL) of the right adjacent lane; if the adjacent lane is the left adjacent lane, the boundary information of the adjacent lane includes the position information of the right lane line (LR) of the left adjacent lane.
  • An example of LR is shown in (a) of FIG. 8. It should be noted that in Figure 8(a), RL and LR are marked at the same time, but in actual application scenarios, there may be cases where LR and RL do not exist on the road at the same time. In other words, in actual scenarios, there may only be RL or only LR.
  • RL and/or LR are determined according to road boundary information and road prior data. Specifically, the road boundary is shifted to the left by a first width to obtain the first position of the RL; and the road boundary is shifted to the right by a third width to obtain the fourth position of the LR.
  • the first width is an integer multiple of the lane width
  • the third width is an integer multiple of the lane width
  • the road boundary information can be described by a cubic equation (because the road surface is usually flat, it can also be described by a quadratic equation or another possible equation). The road boundary information (the points constituting the road boundary) can be mapped to the world coordinate system, and a series of operations can be performed on these points in the world coordinate system to obtain the position of the RL or LR. Exemplarily, taking the determination of the RL position as an example, see FIG. 10(a): after the points on the road boundary are mapped to the world coordinate system, all points on the road boundary are shifted to the left (that is, translated toward the side close to the origin o).
  • since the vehicle does not know in advance how many lanes exist between the road boundary and its own lane, the translation is first performed with the smallest granularity as a unit, that is, the road boundary is first translated by one lane width to obtain a preliminary RL.
  • the preliminary determined RL is used to subsequently determine the position of the preliminary merging point or the separation point.
  • the RL that is initially determined needs to be adjusted again until the final location of the merging point or the separation point meets the preset conditions.
  • the method of initially determining the RL, how to determine whether the location of the merge point or the separation point meets the preset condition, and how to adjust the RL are explained in detail below.
  • the determination method of LR can be referred to the determination method of RL, which will not be elaborated here.
  • alternatively, the road boundary information can be mapped to the image plane (also called the image coordinate system), and a series of operations such as translation can be performed on the points of the road boundary in the image plane to determine the position of the RL or LR.
  • the embodiment of the present application does not limit the type of the mapped coordinate system.
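  • The following is a minimal sketch of the translation operation described above, assuming the road boundary has already been sampled as points in the world coordinate system and that the road runs roughly along the y axis, so that "shifting left by one lane width" can be approximated as a lateral offset; a more rigorous implementation would shift each point along the local normal of the boundary curve. All numeric values are invented for illustration.

      import numpy as np

      LANE_WIDTH = 3.75  # prior lane width in metres (example value for arterial roads)

      def shift_boundary_left(points, n_lanes, lane_width=LANE_WIDTH):
          # points: (N, 2) array of (x, y) road-boundary points in the world frame,
          # with the x axis pointing to the right of the ego vehicle.
          pts = np.asarray(points, dtype=float).copy()
          pts[:, 0] -= n_lanes * lane_width  # translate toward the origin side, i.e. to the left
          return pts

      # Road boundary sampled 7.5 m to the right over 60 m ahead (invented values).
      boundary = np.column_stack([np.full(20, 7.5), np.linspace(0.0, 60.0, 20)])
      preliminary_RL = shift_boundary_left(boundary, n_lanes=1)  # first width = one lane width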
  • RL and/or LR are determined according to the detected vehicle trajectory information and road prior data. Specifically, the vehicle trajectory is shifted to the left by a second width to obtain the second position of the RL; and the vehicle trajectory is shifted to the right by a fourth width to obtain the fifth position of the LR.
  • the second width is an odd multiple of the half-lane width
  • the fourth width is an odd multiple of the half-lane width
  • the detected vehicle trajectory information can be described by a cubic equation (because the road is usually a flat surface, it can also be described by a quadratic equation or other possible equations).
  • the driving trajectory information (points constituting the driving trajectory) can be mapped to the world coordinate system, and a series of operations such as translation can be performed on these points in the world coordinate system to obtain the position of RL or LR.
  • exemplarily, taking the determination of the RL position as an example, see FIG. 10(b): after the points on the driving trajectory are mapped to the world coordinate system, all points on the driving trajectory are shifted to the left by half a lane width to obtain the RL.
  • the determination method of LR can be referred to the determination method of RL, which will not be elaborated here.
  • the second width or the fourth width is set by taking, as an example, an observed vehicle driving at the center of its lane. Considering that the observed vehicle may not actually be at the center of the lane, the second width or the fourth width can also be adjusted to other values.
  • the driving trajectory information may also be mapped to the image plane, and a series of operations such as translation are performed on the points on the driving trajectory on the image plane to determine the position of the RL or LR.
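  • A companion sketch for the trajectory-based case, under the same simplifying assumptions as the previous sketch and assuming the observed vehicle drives at the center of its lane, so the preliminary RL is obtained by shifting the trajectory left by an odd multiple of half the lane width.

      import numpy as np

      LANE_WIDTH = 3.75  # prior lane width in metres (example value)

      def shift_trajectory_left(points, n_half_lanes, lane_width=LANE_WIDTH):
          # points: (N, 2) array of (x, y) trajectory points of a vehicle in the
          # right adjacent lane; the x axis points to the right of the ego vehicle.
          pts = np.asarray(points, dtype=float).copy()
          pts[:, 0] -= n_half_lanes * (lane_width / 2.0)  # odd multiple of half-lane width
          return pts

      trajectory_pts = np.column_stack([np.full(20, 5.6), np.linspace(0.0, 60.0, 20)])
      RL_from_trajectory = shift_trajectory_left(trajectory_pts, n_half_lanes=1)  # second width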
  • RL and/or LR are determined according to road boundary information, detected vehicle trajectory information, and road prior data. Specifically, the first position and the second position are merged through a preset algorithm to obtain the third position of the RL; the fourth position and the fifth position are merged through the preset algorithm to obtain the sixth position of the LR.
  • there may be some deviation between the RL determined based on the road boundary information and road prior data and the RL determined based on the trajectory of the adjacent lane and road prior data. For example, because the radar detection range is relatively small, the trajectory of a nearby vehicle may be detected accurately, so the RL position obtained from it is more accurate, whereas the detected long-distance road boundary information may be less accurate, so the RL position obtained from it may also be less accurate. In this case, in order to improve the accuracy of the determined RL position, the two types of RL position information can be fused. The fusion algorithm can be, but is not limited to, a weighted summation.
  • the weight of the first position of the RL, determined according to the road boundary information, and the weight of the second position of the RL, determined according to the driving trajectory information, may be set according to the actual scene. For example, when the detection distance of the radar itself is small, or the detection distance is reduced by haze, and the second position is likely to be closer to the accurate position of the RL, the weight of the second position is set to be greater than the weight of the first position. In this case, since the weights can be determined in real time according to factors such as the actual scene or the performance of the vehicle itself, the accuracy of the position obtained by the fusion can be improved.
  • the weights of the first position and the second position can also be preset. For example, the weights of the first position and the second position are set in advance according to some historical data or other data. The way of preset weights does not need to adjust the weights of the first position and the second position according to the actual scene, which is relatively simple to implement.
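  • A hedged sketch of the weighted-summation fusion mentioned above; the weight values are placeholders and would in practice be set according to the scene or preset, as described.

      import numpy as np

      def fuse_positions(pos_from_boundary, pos_from_trajectory, w_boundary=0.4, w_trajectory=0.6):
          # Weighted summation of the two candidate RL positions, point by point.
          pos_from_boundary = np.asarray(pos_from_boundary, dtype=float)
          pos_from_trajectory = np.asarray(pos_from_trajectory, dtype=float)
          w_sum = w_boundary + w_trajectory
          return (w_boundary * pos_from_boundary + w_trajectory * pos_from_trajectory) / w_sum

      first_position = np.array([[3.7, 10.0], [3.8, 20.0], [4.0, 30.0]])   # RL from road boundary
      second_position = np.array([[3.6, 10.0], [3.7, 20.0], [3.8, 30.0]])  # RL from driving trajectory
      third_position = fuse_positions(first_position, second_position)     # fused RL position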
  • Case 2: When protrusions delimit the merge point or the separation point, if the adjacent lane is the right adjacent lane, the boundary information of the adjacent lane includes the position information of the left edge of the right adjacent lane; if the adjacent lane is the left adjacent lane, the boundary information of the adjacent lane includes the position information of the right edge of the left adjacent lane.
  • the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to the road boundary information and road prior data. Specifically, the road boundary is shifted to the left by a fifth width to obtain the seventh position of the left edge of the right adjacent lane; and the road boundary is shifted to the right by a seventh width to obtain the tenth position of the right edge of the left adjacent lane.
  • the fifth width is an integer multiple of the lane width
  • the seventh width is an integer multiple of the lane width.
  • the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to the detected vehicle trajectory information and road prior data. Specifically, the vehicle trajectory is shifted to the left by a sixth width to obtain the eighth position of the left edge of the right adjacent lane; and the vehicle trajectory is shifted to the right by an eighth width to obtain the eleventh position of the right edge of the left adjacent lane.
  • the sixth width is an odd multiple of the half-lane width
  • the eighth width is an odd multiple of the half-lane width
  • the driving trajectory information can also be mapped to the image plane to determine the position of the left edge of the adjacent lane to the right or the right edge of the adjacent lane to the left.
  • the positions of the left edge of the right adjacent lane and/or the right edge of the left adjacent lane are determined according to road boundary information, detected vehicle trajectory information, and road prior data. Specifically, the seventh position and the eighth position of the left edge of the right adjacent lane are fused through a preset algorithm to obtain the ninth position of the left edge of the right adjacent lane; the tenth position and the eleventh position of the right edge of the left adjacent lane are fused through the preset algorithm to obtain the twelfth position of the right edge of the left adjacent lane.
  • the lanes mentioned in the embodiments of the present application are not limited to ordinary lanes, but can also refer to road structures such as emergency parking strips.
  • since the width of an emergency parking strip does not differ much from the ordinary lane width, it can be processed according to the ordinary lane width.
  • alternatively, the actual width of the emergency parking strip may be used to perform the above translation operation to determine the adjacent lane boundary information.
  • in the technical solutions of the present application, the adjacent lane boundary information is inferred based on the road boundary information and/or the detected driving trajectory information, instead of being detected directly through the camera.
  • this avoids, to some extent, the problem that, when the lane line positions of the adjacent lane are collected directly by a camera, poor camera detection performance (such as a small detection distance) or environmental factors (such as haze) make the collected lane line positions inaccurate or prevent them from being collected at all.
  • S903 Determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane.
  • S903 is specifically implemented as: determining the location information of the merge point or the separation point according to the lane line information of the own lane (for example, the right lane line HR of the own lane) and the lane line information of the adjacent lane. Taking the merge point between the current lane and the right adjacent lane as an example, see FIG. 10(a) or FIG. 10(b): after the positions of HR and RL are determined, the intersection of HR and RL is the position of merge point A.
  • alternatively, S903 is specifically implemented as: determining the position information of the merge point or the separation point according to the information of the protrusions on the edge of the own lane and the information of the protrusions on the edge of the adjacent lane.
  • taking the merge point between the own lane and the right adjacent lane as an example, see FIG. 10(c) or FIG. 10(d): after the positions of the right-edge guardrail (or cement pier, etc.) of the own lane and the left-edge guardrail (or cement pier, etc.) of the right adjacent lane are determined, the intersection of the two guardrails (or cement piers) is the location of merge point A.
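  • As an illustrative sketch with invented coefficients, if both HR and RL are described by cubic (here quadratic, a special case) curves x = f(y) in the vehicle coordinate system, the merge point A can be found as the nearest forward intersection of the two curves.

      import numpy as np

      hr = np.poly1d([0.0005, 0.0, 1.875])    # x_HR(y): right lane line of the own lane
      rl = np.poly1d([-0.0010, 0.0, 5.625])   # x_RL(y): left lane line of the right adjacent lane

      diff = hr - rl
      forward_roots = [r.real for r in diff.roots if abs(r.imag) < 1e-9 and r.real > 0]
      if forward_roots:
          y_merge = min(forward_roots)
          merge_point_A = (hr(y_merge), y_merge)  # position of merge point A in the vehicle frame
          print(merge_point_A)
      else:
          print("no merge point within the modelled range")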
  • the preset condition may be: the distance between the initially determined merge point (such as the merge point A determined in FIG. 10(a)-FIG. 10(d)) and a reference point is less than or equal to a first threshold.
  • the reference point is a position reference, which is used to judge whether the position of the merging point (or the separation point) determined initially is accurate.
  • the reference point may be the current vehicle (such as the vehicle 801 in (a) of FIG. 8).
  • the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
  • the positions of merge point A and the vehicle are preliminarily determined in FIG. 10(a)-FIG. 10(d). If the distance between the two is less than or equal to the first threshold, the initially determined position of the merge point is deemed accurate, and there is no need to further adjust it. Similarly, when a separation point appears on the road, the distance between the separation point and the vehicle is also required to be less than or equal to the first threshold; when the preliminarily determined distance is less than or equal to the first threshold, the preliminarily determined separation point position is deemed accurate and needs no further adjustment.
  • conversely, if the distance is greater than the first threshold, the width by which the road boundary and/or the driving trajectory was translated when determining the adjacent lane boundary information may be inaccurate.
  • in that case, at least one of the aforementioned widths should be adjusted. Specifically, the at least one width is adjusted according to the first threshold, and the merge point and/or the separation point are adjusted based on the adjusted at least one width.
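  • The following sketch illustrates the adjustment loop described above, under the same polynomial representation and with invented numbers: the road boundary is shifted left by one additional lane width per step until the intersection of the candidate RL with HR lies within the first threshold, the longitudinal distance being used as a simple proxy for the distance to the vehicle.

      import numpy as np

      LANE_WIDTH = 3.75       # prior lane width (example value)
      FIRST_THRESHOLD = 80.0  # e.g. derived from the sensor sensing range, in metres

      def forward_intersection(hr, candidate_rl):
          roots = (hr - candidate_rl).roots
          valid = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0]
          return min(valid) if valid else None

      def find_merge_point(hr, road_boundary, max_lanes=4):
          for n_lanes in range(1, max_lanes + 1):                  # first width = n_lanes * lane width
              candidate_rl = road_boundary - n_lanes * LANE_WIDTH  # shift boundary left, lane by lane
              y = forward_intersection(hr, candidate_rl)
              if y is not None and y <= FIRST_THRESHOLD:           # longitudinal distance as proxy
                  return n_lanes, (hr(y), y)
          return None, None                                        # no valid merge point found

      hr = np.poly1d([0.00002, 0.0, 1.875])              # x_HR(y), invented coefficients
      road_boundary = np.poly1d([-0.00120, 0.0, 9.375])  # road boundary, invented coefficients
      print(find_merge_point(hr, road_boundary))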
  • referring to FIG. 11(a), when there are multiple lanes between the own lane of the vehicle 801 and the road boundary (FIG. 11(a) shows two lanes), the points on the road boundary are shifted to the left as a whole by one lane width.
  • the resulting line, which is actually the right lane line of the right adjacent lane, is regarded as the preliminarily determined RL.
  • the location of the merge point is then initially determined as point B in FIG. 11(a).
  • the distance between the preliminarily determined point B and the vehicle is relatively large, and, as shown in FIG. 11(a), point B is not a true merge point. Therefore, the translation width needs to be adjusted from one lane width according to a certain compensation step, where the step can be one lane width.
  • that is, the translation width is adjusted from one lane width to two lane widths; in other words, the points on the road boundary are shifted to the left by two lane widths in total (shifted left by one lane width the first time and then left by another lane width).
  • as shown in FIG. 11(a), after this adjustment, the actual RL is obtained.
  • if the position of the intersection of the RL and the HR, that is, the position of the merge point, meets the preset condition, that position is regarded as the final merge point position.
  • the adjusted first width is actually two lane widths.
  • widths other than the first width can also be adjusted.
  • for example, in the current scene, as shown in FIG. 11(a), there is also a merge point between the left adjacent lane and the own lane.
  • in that case, the width by which the road boundary needs to be translated, that is, the third width, is two lane widths, so as to obtain the actual LR position.
  • similarly, if point C is not a true merge point, the translation width needs to be adjusted from half a lane width according to a certain compensation step, where the step can be one lane width.
  • referring to FIG. 11(b), if after this adjustment the position of the intersection (B) of the obtained lane line 1102 and the HR still does not meet the preset condition, the translation width needs to be adjusted again.
  • the road structure detection method provided by the embodiments of the present application can determine road structure information based on the boundary information of the own lane and the boundary information of the adjacent lanes, where the boundary information includes lane line information and/or lane edge position information. That is to say, in the embodiment of the present application, the road structure information can be determined without using a high-precision map. Therefore, in areas where there is no HD map coverage, the vehicle can still determine the road structure.
  • the embodiment of the present application may divide the road structure detection device into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules.
  • the division of modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • Fig. 12 shows a possible schematic diagram of the structure of the road structure detection device involved in the above-mentioned embodiment.
  • the road structure detection device 16 includes a determination module 161 and an adjustment module 162.
  • the road structure detection device 16 may also include other modules (such as a storage module), or the road structure detection device may include fewer modules.
  • the determining module is used to determine the boundary information of the own lane, which is used to characterize the position of the boundary of the current lane, and to determine the boundary information of the adjacent lane, which is used to characterize the position of the boundary of the adjacent lane.
  • the boundary information includes lane line information and/or lane edge position information.
  • the determining module is further configured to determine road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane, where the road structure information includes the merging point of the own lane and the adjacent lane and/or the Location information of the separation point between the own lane and the adjacent lane.
  • the boundary information of the adjacent lane is determined according to the detected driving trajectory information and/or road boundary information, and the road boundary information is used to characterize the position of the road boundary.
  • the boundary information of the adjacent lane includes at least one of the following:
  • the first position of the RL is obtained by shifting the road boundary to the left by a first width; or, the second position of the RL is obtained by shifting the driving trajectory to the left by a second width; or, the third position of the RL is obtained by fusing the first position of the RL and the second position of the RL through a preset algorithm.
  • the fourth position of the LR is obtained by shifting the road boundary to the right by a third width; or, the fifth position of the LR is obtained by shifting the driving track to the right by a fourth width; or, the sixth position of the LR is obtained by the The fourth position of the LR and the fifth position of the LR are fused through a preset algorithm.
  • the seventh position of the left edge of the right adjacent lane is obtained by shifting the road boundary to the left by a fifth width; or, the eighth position of the left edge is obtained by shifting the driving trajectory to the left by a sixth width; or, the ninth position of the left edge is obtained by fusing the seventh position of the left edge and the eighth position of the left edge through a preset algorithm.
  • the tenth position of the right edge of the left adjacent lane is obtained by shifting the road boundary to the right by the seventh width; or, the eleventh position of the right edge is obtained by shifting the driving trajectory to the right by the eighth width; or, the twelfth position of the right edge is obtained by fusing the tenth position of the right edge and the eleventh position of the right edge through a preset algorithm.
  • the first width is an integer multiple of a lane width
  • the second width is an odd multiple of a half-lane width
  • the third width is an integer multiple of the lane width
  • the fourth width is an odd multiple of the half-lane width.
  • the fifth width is an integer multiple of the lane width
  • the sixth width is an odd multiple of the half-lane width
  • the seventh width is an integer multiple of the lane width
  • the eighth width is an odd multiple of the half-lane width.
  • the distance between the merge point and a reference point is less than or equal to a first threshold, and/or the distance between the separation point and the reference point is less than or equal to the first threshold. The reference point includes the vehicle.
  • the adjustment module is configured to adjust, according to the first threshold, one or more of the first width, the second width, the third width, the fourth width, the fifth width, the sixth width, the seventh width, and the eighth width, and to adjust the merge point based on the adjusted width or widths; and/or to adjust, according to the first threshold, one or more of the first width, the second width, the third width, the fourth width, the fifth width, the sixth width, the seventh width, and the eighth width, and to adjust the separation point based on the adjusted width or widths.
  • the first threshold is determined by the sensing range of the sensor, or the first threshold is a pre-configured value.
  • the boundary information of the adjacent lane is determined according to road prior data, and the road prior data includes the lane width.
  • the present application also provides a road structure detection device 10 including a processor 1001.
  • the road structure detection device 10 may further include a memory 1002.
  • the processor 1001 and the memory 1002 are connected (for example, connected to each other through a bus 1004).
  • the road structure detection device 10 may further include a transceiver 1003, which is connected to the processor 1001 and the memory 1002, and the transceiver is used to receive/send data.
  • the processor 1001 can execute operations of any one of the implementation solutions and various feasible implementation manners corresponding to FIG. 9. For example, it is used to perform operations of the determining module 161 and the adjusting module 162, and/or other operations described in the embodiments of the present application.
  • the processor 1001 is also used to control the sensor 1005 so that the sensor 1005 obtains some sensing information.
  • the sensor 1005 may be included in the road structure detection device 10. It can also be an external sensor.
  • when the road structure detection device 10 includes the sensor 1005 (that is, the sensor 1005 is a built-in sensor of the road structure detection device 10), optionally, all the data processing functions in the foregoing method embodiments can be integrated into the sensor 1005. In this case, the road structure detection device 10 may not need to include the processor 1001.
  • the sensor 1005 may be used to perform the foregoing method embodiments, that is, to perform operations of the determining module 161 and the adjusting module 162, and/or other operations described in the embodiments of the present application.
  • the data processing may be a fusion operation.
  • the sensor 1005 can perform a fusion operation on the adjacent lane boundary information determined by the road boundary information and the adjacent lane boundary information determined by the driving trajectory information, so as to improve the accuracy of the adjacent lane boundary information.
  • the data processing can also be other data processing procedures in the foregoing method embodiments. For example, when the distance between the initially determined merge point (or separation point) and the reference point is greater than the first threshold, the sensor 1005 can be used to adjust the width by which the road boundary and/or the driving trajectory needs to be translated.
  • the embodiment of the present application does not limit the specific data processing functions that the sensor 1005 can perform.
  • the sensor 1005 may refer to a visual sensor (such as a camera), a sensor such as a radar, or another sensor with a similar function.
  • the sensor 1005 may not integrate data processing functions, or may integrate only part of the data processing functions.
  • the sensor 1005 combined with the processor can execute the foregoing method embodiment, and the road structure detection device 10 needs to include the sensor 1005 and the processor 1001.
  • the sensor 1005 may be a traditional sensor that does not have data processing functions.
  • in that case, the sensor 1005 is used to determine the road boundary information and/or the driving trajectory information.
  • the processor is used to determine the adjacent lane boundary information according to the road boundary information and/or the driving trajectory information, and to determine the location of the merge point (and/or separation point) according to the adjacent lane boundary information and the own lane boundary information. Among them, when the adjacent lane boundary information is determined according to the road boundary information and the driving trajectory information, a fusion operation is involved.
  • alternatively, the sensor 1005 may have part of the data processing functions,
  • and the processor may have the other part of the data processing functions.
  • the sensor 1005 can perform data processing according to the position of the road boundary collected by the sensor 1005 to obtain adjacent lane boundary information.
  • the processor can be used for other data processing functions such as fusion operations.
  • the embodiment of the present application does not limit the specific division of labor between the sensor 1005 and the processor 1001 (that is, which part of the data processing each performs).
  • This application also provides a road structure detection device, which includes a non-volatile storage medium and a central processing unit.
  • the non-volatile storage medium stores an executable program.
  • the central processing unit is connected to the non-volatile storage medium and executes the executable program to implement the road structure detection method of the embodiments of the present application.
  • the present application further provides a computer-readable storage medium.
  • the computer-readable storage medium includes one or more program codes.
  • the one or more programs include instructions.
  • when the processor executes the program codes, the road structure detection device executes the road structure detection method shown in FIG. 9.
  • another embodiment of the present application provides a computer program product, which includes computer-executable instructions stored in a computer-readable storage medium. At least one processor of the road structure detection device can read the computer-executable instructions from the computer-readable storage medium and execute them, so that the road structure detection device implements the steps of the corresponding road structure detection method shown in FIG. 9.
  • the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • when implemented by a software program, the embodiments may appear in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • the disclosed device and method may be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate parts may or may not be physically separate.
  • the parts displayed as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed across multiple different places. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

A road structure detection method and apparatus, relating to the technical field of automated driving, capable of detecting a road structure without relying on a high-precision map. The method includes: determining boundary information of the own lane and boundary information of an adjacent lane, and determining road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane. The boundary information of the own lane is used to characterize the position of the boundary of the current lane; the boundary information of the adjacent lane is used to characterize the position of the boundary of the adjacent lane; the boundary information includes lane line information and/or position information of a lane edge; and the road structure information includes position information of a merge point (A) of the own lane and the adjacent lane and/or a separation point (B) of the own lane and the adjacent lane.

Description

Road structure detection method and apparatus
This application claims priority to Chinese Patent Application No. 201911245257.2, filed with the China National Intellectual Property Administration on December 6, 2019 and entitled "Road structure detection method and apparatus", which is incorporated herein by reference in its entirety.
Technical field
This application relates to the field of automated driving, and in particular to a road structure detection method and apparatus.
Background
In automated driving, an intelligent vehicle needs to perceive its surrounding environment. The intelligent vehicle can detect and classify the environment around the vehicle through a variety of sensors, and transmit this information to planning and control modules to form a driving strategy for the intelligent vehicle and complete the entire automated driving process.
During driving, an important aspect is detecting the road structure, so that the intelligent vehicle can adjust its driving strategy according to the road structure, avoid obstacles on the road, and achieve better automated driving. One existing road structure detection technique uses a high-precision map to obtain lane information and then determine the road structure. However, the coverage of high-precision maps is usually limited; when the driving area is not covered by a high-precision map, the intelligent vehicle cannot obtain the corresponding information and therefore cannot determine the road structure.
Summary
This application provides a road structure detection method and apparatus capable of detecting a road structure without relying on a high-precision map.
To achieve the above objective, this application adopts the following technical solutions:
In a first aspect, this application provides a road structure detection method, which may be executed by an intelligent vehicle (or a component in the vehicle) or by another device with a control function (or a component in that other device). The method includes: determining boundary information of the own lane and boundary information of an adjacent lane, and determining road structure information according to the boundary information of the own lane and the boundary information of the adjacent lane. The boundary information of the own lane is used to characterize the position of the boundary of the current lane; the boundary information of the adjacent lane is used to characterize the position of the boundary of the adjacent lane; the boundary information includes lane line information and/or position information of a lane edge; and the road structure information includes position information of a merge point of the own lane and the adjacent lane and/or a separation point of the own lane and the adjacent lane.
Exemplarily, boundaries between lanes may be delimited by paint markings, or by protrusions (such as guardrails or cement piers). Other ways of delimiting boundaries are of course also possible.
It can be seen that the road structure detection method provided by the embodiments of this application does not rely on a high-precision map and can complete road structure detection in areas not covered by a high-precision map. In other words, the road structure detection method provided by this application has a wider scope of application and can be applied both in areas covered by a high-precision map and in areas without such coverage.
In a possible implementation, lane lines may be drawn on the road surface with paint, so that a visual sensor (such as a camera) can collect visual information of the lane lines to determine their positions. Lane edges may be delimited by protrusions (such as guardrails), so that a radar or a similar component can emit specific waves, obtain the waves reflected by the protrusions at the lane edge, and obtain the position of the lane edge through characteristics of the emitted and reflected waves (such as the phase difference and frequency difference). Of course, visual information of lane-edge objects such as guardrails can also be collected by a camera or the like to determine the position information of such objects.
In a possible implementation, the boundary information of the adjacent lane is determined according to detected driving trajectory information and/or road boundary information, where the road boundary information is used to characterize the position of the road boundary.
That is, the boundary information of the adjacent lane may be determined according to the detected driving trajectory information, or according to the road boundary information, or according to both the detected driving trajectory information and the road boundary information.
A visual sensor of the vehicle, such as a camera, may be used to collect visual information of vehicles in the adjacent lane to determine the detected driving trajectory information. A radar of the vehicle or a component with a similar function may also be used to determine information such as the position and speed of vehicles in other lanes by transmitting and receiving lasers, millimeter waves, and the like, so as to determine the driving trajectory information of those lanes. Of course, other methods may also be used to determine the driving trajectory information of other lanes.
When the road boundary line is marked with a planar pattern such as paint, a component such as a camera may be used to collect visual information of the road boundary line to determine the position of the road boundary, that is, the road boundary information. When the road boundary is delimited by protrusions (guardrails, cement piers, etc.), a camera may likewise be used to determine the road boundary information, and a component such as a radar may also be used.
It can be seen that in the technical solutions of this application, the adjacent lane boundary information is inferred from the road boundary information and/or the detected driving trajectory information, rather than being detected directly by a camera. This can, to some extent, avoid the problem that, when the lane line positions of the adjacent lane are collected directly by a camera, poor camera detection performance (such as a small detection distance) or environmental factors (such as haze) cause the collected lane line positions of the adjacent lane to be inaccurate, or prevent the lane line positions from being collected at all.
In a possible implementation, the boundary information of the adjacent lane includes at least one of the following:
position information of the left lane line RL of the right adjacent lane;
position information of the right lane line LR of the left adjacent lane;
position information of the left edge of the right adjacent lane;
position information of the right edge of the left adjacent lane.
在一种可能的实现方式中,当本车道与右邻车道之间存在合并点或分离点时,需根据上述本车道的边界信息和右邻车道的边界信息确定合并点或分离点位置。这里,需要先确定右邻车道的边界信息。
当采用涂料标记方式划分车道边界时,右邻车道的边界信息,指的是右邻车道的左车道线(RL)位置。其中,RL的位置可由道路边界信息推导而来,具体的,RL的第一位置由道路边界向左平移第一宽度得到。或者,RL位置由其他车道的行车轨迹推导而来,具体的,RL的第二位置由行车轨迹向左平移第二宽度得到。或者,RL位置可由道路边界位置和其他车道的行车轨迹联合推导而来,具体的,RL的第三位置(可称为融合位置)由RL的第一位置和RL的第二位置经预设算法融合得到。这样,当第一位置和第二位置中存在准确度较差的位置时,能够校正融合位置与实际合并点(和/或分离点)位置之间的偏差,提升最终确定的位置结果的准确度。
当采用凸起物或类似方式划分车道边界时，右邻车道的边界信息，指的是右邻车道的左边缘位置。其中，该左边缘的位置可由道路边界位置推导而来，具体的，右邻车道的左边缘的第七位置由道路边界向左平移第五宽度得到。或者，该左边缘的位置可由其他车道的行车轨迹推导而来，具体的，左边缘的第八位置由行车轨迹向左平移第六宽度得到。或者，该左边缘的位置可由道路边界位置以及其他车道的行车轨迹两者推导而来，具体的，左边缘的第九位置由左边缘的第七位置和左边缘的第八位置经预设算法融合得到。
类似的,当采用涂料标记方式划分车道边界时,左邻车道的边界信息,指的是左邻车道的右车道线(LR)位置。LR的第四位置由道路边界向右平移第三宽度得到;或者,LR的第五位置由行车轨迹向右平移第四宽度得到;或者,LR的第六位置由LR的第四位置和LR的第五位置经预设算法融合得到。
当采用凸起物或类似方式划分车道边界时,左邻车道的边界信息,指的是左邻车道的右边缘位置。左邻车道的右边缘的第十位置由道路边界向右平移第七宽度得到;或者,右边缘的第十一位置由行车轨迹向右平移第八宽度得到;或者,右边缘的第十二位置由右边缘的第十位置和右边缘的第十一位置经预设算法融合得到。
其中,第一宽度为车道宽度的整数倍,第二宽度为半车道宽度的奇数倍,第三宽度为车道宽度的整数倍,第四宽度为半车道宽度的奇数倍,第五宽度为车道宽度的整数倍,第六宽度为半车道宽度的奇数倍,第七宽度为车道宽度的整数倍,第八宽度为半车道宽度的奇数倍。
在一种可能的实现方式中,合并点与参考点之间的距离小于或等于第一阈值,和/或,分离点与参考点之间的距离小于或等于第一阈值,参考点包括车辆。
当初步确定的合并点与车辆之间的距离小于或等于第一阈值,视为初步确定的合并点位置准确,无需再进一步调整合并点的位置。
在一种可能的实现方式中，当初步确定的合并点与车辆之间的距离大于第一阈值，其可能是在确定邻车道边界信息时，由道路边界和/或行车轨迹所需平移的第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度不准确，可能导致初步确定的合并点和/或分离点位置不准确。此种情况下，需进一步调整合并点和/或分离点位置，以提升道路结构检测结果的准确性。具体的，根据第一阈值调整第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度；基于调整的第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度，调整合并点；和/或，根据第一阈值调整第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度；基于调整的第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度，调整分离点。
上述技术方案中,调整道路边界位置所需平移的宽度,和/或行车轨迹所需平移的宽度,并基于该调整,确定邻车道的边界信息,使得到的邻车道的边界信息更为精准,由此确定的合并点(和/或分离点)位置更为准确。基于精准的合并点(和/或分离点)位置,可以制定更为精准的驾驶策略,以指导车辆驾驶,提升驾驶的安全性。
在一种可能的实现方式中,第一阈值是通过传感器的感知范围确定的,或者,第一阈值为预配置的数值。
在一种可能的实现方式中,邻车道的边界信息是根据道路先验数据确定的,道路先验数据包括车道宽度。
第二方面,本申请提供一种道路结构检测装置,该装置可以为车辆,也可以是能够支持车辆实现驾驶功能的装置,其可以和车辆匹配使用,例如可以是车辆中的装置(比如是车辆中的芯片系统,或者车辆的计算机系统上运行的操作系统和/或驱动等),还可以是其他设备(比如服务器)或设备中的芯片等。该装置包括确定模块、调整模块,这些模块可以执行上述第一方面任一种设计示例中的道路结构检测方法,具体的:
确定模块,用于确定本车道的边界信息,本车道的边界信息用于表征当前车道的边界的位置;确定邻车道的边界信息,邻车道的边界信息用于表征相邻车道的边界的位置。其中,边界信息包括车道线信息和/或车道边缘的位置信息。
确定模块,还用于根据本车道的边界信息和邻车道的边界信息确定道路结构信息,道路结构信息包括本车道与邻车道的合并点和/或本车道与邻车道的分离点的位置信息。
在一种可能的设计中,邻车道的边界信息是根据检测到的行车轨迹信息和/或道路边界信息确定的,道路边界信息用于表征道路边界的位置。
在一种可能的设计中,邻车道的边界信息包括以下至少一个:
右邻车道的左车道线RL的位置信息;
左邻车道的右车道线LR的位置信息;
右邻车道的左边缘的位置信息;
左邻车道的右边缘的位置信息。
在一种可能的设计中,RL的第一位置由道路边界向左平移第一宽度得到;或者,RL的第二位置由行车轨迹向左平移第二宽度得到;或者,RL的第三位置由RL的第一位置和RL的第二位置经预设算法融合得到。
LR的第四位置由道路边界向右平移第三宽度得到;或者,LR的第五位置由行车轨迹向右平移第四宽度得到;或者,LR的第六位置由LR的第四位置和LR的第五位置经预设算法融合得到。
右邻车道的左边缘的第七位置由道路边界向左平移第五宽度得到;或者,左边缘的第八位置由行车轨迹向左平移第六宽度得到;或者,左边缘的第九位置由左边缘的第七位置和左边缘的第八位置经预设算法融合得到。
左邻车道的右边缘的第十位置由道路边界向右平移第七宽度得到;或者,右边缘的第十一位置由行车轨迹向右平移第八宽度得到;或者,右边缘的第十二位置由右边缘的第十位置和右边缘的第十一位置经预设算法融合得到。
其中,第一宽度为车道宽度的整数倍,第二宽度为半车道宽度的奇数倍,第三宽度为车道宽度的整数倍,第四宽度为半车道宽度的奇数倍,第五宽度为车道宽度的整数倍,第六宽度为半车道宽度的奇数倍,第七宽度为车道宽度的整数倍,第八宽度为半车道宽度的奇数倍。
在一种可能的设计中,合并点与参考点之间的距离小于或等于第一阈值,和/或,分离点与参考点之间的距离小于或等于第一阈值,参考点包括车辆。
在一种可能的设计中,调整模块,用于根据第一阈值调整第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度;基于调整的第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和 /或第六宽度和/或第七宽度和/或第八宽度,调整合并点;和/或,用于根据第一阈值调整第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度;基于调整的第一宽度和/或第二宽度和/或第三宽度和/或第四宽度和/或第五宽度和/或第六宽度和/或第七宽度和/或第八宽度,调整分离点。
在一种可能的设计中,第一阈值是通过传感器的感知范围确定的,或者,第一阈值为预配置的数值。
在一种可能的设计中,邻车道的边界信息是根据道路先验数据确定的,道路先验数据包括车道宽度。
第三方面,本申请实施例提供一种道路结构检测装置,该装置具有实现上述第一方面中任一项的道路结构检测方法的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。该硬件或软件包括一个或多个与上述功能相对应的模块。
第四方面,提供一种道路结构检测装置,包括:处理器。处理器用于与存储器耦合,并读取存储器中的指令之后,根据指令执行如上述第一方面中任一项的道路结构检测方法。
其中,存储器可以是装置的外置存储器。该外置存储器与处理器耦合。存储器也可以指装置中包括的存储器。即该装置可选的包括存储器。
另外,该装置还可以包括通信接口。用以该装置和其他设备之间通信。通信接口可以比如但不限于是收发器,收发电路等。
第五方面,本申请实施例中还提供一种计算机可读存储介质,包括指令,当其在计算机上运行时,使得计算机执行第一方面的方法。
第六方面,本申请实施例中还提供一种计算机程序产品,包括指令,当其在计算机上运行时,使得计算机执行第一方面的方法。
第七方面,本申请实施例提供了一种道路结构检测装置,该检测装置可以为传感器装置,例如雷达装置。该装置还可以为芯片系统,该芯片系统包括处理器,还可以包括存储器,用于实现上述第一方面方法的功能。该芯片系统可以由芯片构成,也可以包含芯片和其他分立器件。
第八方面,提供一种道路结构检测装置,该装置可以为电路系统,电路系统包括处理电路,处理电路被配置为执行如上述第一方面中任一项的道路结构检测方法。
第九方面,本申请实施例提供了一种系统,系统包括第二至第四方面中任一方面的装置,和/或第七方面的芯片系统,和/或第八方面的电路系统,和/或第五方面中的可读存储介质,和/或第六方面中的计算机程序产品,和/或一个或多个类型的传感器,和/或智能车。
其中,一个或多个类型的传感器可以但不限于是视觉传感器(比如摄像头等),雷达或其他具有类似功能的传感器。
第十方面,本申请实施例提供了一种智能车,其中包括第二至第四方面中任一方面的装置,和/或第七方面的芯片系统,和/或第八方面的电路系统,和/或第五方面中的可读存储介质,和/或第六方面中的计算机程序产品。
附图说明
图1为本申请实施例提供的一种自动驾驶汽车的结构示意图;
图2为本申请实施例提供的一种自动驾驶汽车的结构示意图;
图3为本申请实施例提供的一种计算机系统的结构示意图;
图4为本申请实施例提供的一种神经网络处理器的结构示意图;
图5为本申请实施例提供的一种云侧指令自动驾驶汽车的应用示意图;
图6为本申请实施例提供的一种云侧指令自动驾驶汽车的应用示意图;
图7为本申请实施例提供的一种计算机程序产品的结构示意图;
图8为本申请实施例提供的道路结构检测方法的场景示意图;
图9为本申请实施例提供的道路结构检测方法的流程示意图;
图10、图11为本申请实施例提供的道路结构检测方法的场景示意图;
图12为本申请实施例提供的道路结构检测装置的结构示意图;
图13为本申请实施例提供的道路结构检测装置的结构示意图。
具体实施方式
本申请的说明书以及附图中的术语“第一”和“第二”等是用于区别不同的对象,或者用于区别对同一对象的不同处理,而不是用于描述对象的特定顺序。此外,本申请的描述中所提到的术语“包括”和“具有”以及它们的任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括其他没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其它步骤或单元。需要说明的是,本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请实施例提供的道路结构检测方法应用在智能车辆上,或者应用于具有控制功能的其他设备(比如云端服务器)中。车辆可通过其包含的组件(包括硬件和软件)实施本申请实施例提供的道路结构检测方法,获取本车道的车道线信息,以及邻车道的边界信息,并根据这两者确定道路结构信息。或者,其他设备(比如服务器)通过包含的组件可以用于实施本申请实施例的道路结构检测方法,获取本车道的车道线信息,以及邻车道的边界信息,并根据这两者确定道路结构信息,向目标车辆发送该道路结构信息。该道路结构信息用于目标车辆制定驾驶策略。
图1是本申请实施例提供的车辆100的功能框图。在一个实施例中,将车辆100配置为完全或部分地自动驾驶模式。例如,车辆100可以在处于自动驾驶模式的同时控制自身,并且可通过人为操作来确定车辆及其周边环境的当前状态,确定周边环境中的至少一个其他车辆的可能行为,并确定该其他车辆执行可能行为的可能性相对应的置信水平,基于所确定的信息来控制车辆100。在车辆100处于自动驾驶模式时,可以将车辆100置为在没有和人交互的情况下操作。
车辆100可包括各种子系统,例如行进系统102、传感器系统104、控制系统106、一个或多个外围设备108以及电源110、计算机系统112和用户接口116。可选地,车辆100可包括更多或更少的子系统,并且每个子系统可包括多个元件。另外,车辆100的每个子系统和元件可以通过有线或者无线互连。
行进系统102可包括为车辆100提供动力运动的组件。在一个实施例中,行进系统102可包括引擎118、能量源119、传动装置120和车轮/轮胎121。
引擎118可以是内燃引擎、电动机、空气压缩引擎或其他类型的引擎组合,例如汽油发动机和电动机组成的混动引擎,内燃引擎和空气压缩引擎组成的混动引擎。引擎118将能量源119转换成机械能量。
能量源119的示例包括汽油、柴油、其他基于石油的燃料、丙烷、其他基于压缩气体的燃料、乙醇、太阳能电池板、电池和其他电力来源。能量源119也可以为车辆100的其他系统提供能量。
传动装置120可以将来自引擎118的机械动力传送到车轮121。传动装置120可包括变速箱、差速器和驱动轴。在一个实施例中,传动装置120还可以包括其他器件,比如离合器。其中,驱动轴可包括可耦合到一个或多个车轮121的一个或多个轴。
传感器系统104可包括感测关于车辆100周边的环境的信息的若干个传感器。例如,传感器系统104可包括定位系统122(定位系统可以是全球定位系统(global positioning system,GPS),也可以是北斗系统或者其他定位系统)、惯性测量单元(inertial measurement unit,IMU)124、雷达126、激光测距仪128以及相机130。传感器系统104还可包括被监视车辆100的内部系统的传感器(例如,车内空气质量监测器、燃油量表、机油温度表等)。来自这些传感器中的一个或多个的传感器数据可用于检测对象及其相应特性(位置、形状、方向、速度等)。这种检测和识别是自主车辆100的安全操作的关键功能。
不同类型的传感器有着不同的特点。对于毫米波雷达而言,可以全天候工作,同时有着良好的测距测速精度,但是分类识别效果不好。摄像头具有强大的分辨率,有着很强的目标识别分类效果,但是因为丧失了深度信息,所以对于测距测速等性能可能不佳。激光雷达具有较好的深度信息,同时也可以进行测距测速,但是检测的距离不远。可以看到,这些不同类型的传感器有着不同的特征,在不同功能需求下,需要不同的传感器进行融合处理,来实现更好的性能。
定位系统122可用于估计车辆100的地理位置。IMU 124用于基于惯性加速度来感测车辆100的位置和朝向变化。在一个实施例中,IMU 124可以是加速度计和陀螺仪的组合。
雷达126,或称为雷达装置,也可以称为探测器、探测装置或者无线电信号发送装置。可利用无线电信号来感测车辆100的周边环境内的物体。在一些实施例中,除了感测物体以外,雷达126还可用于感测物体的速度和/或前进方向。其工作原理是通过发射信号(或者称为探测信号),并接收经过目标物体反射的反射信号(在本文中,也称为目标物体的回波信号,双程回波信号等),来探测相应的目标物体。
其中，雷达根据不同的用途有多种不同的雷达波形，包括但不限于脉冲毫米波、步进调频连续波、线性调频连续波。其中，线性调频连续波较为常见、技术较为成熟。线性调频连续波具有较大的时带积，通常具有较高的测距精度和测距分辨率。其支持自适应巡航控制(Adaptive Cruise Control,ACC)、自动紧急制动(Autonomous Emergency Braking,AEB)、变道辅助(Lane Change Assist,LCA)、盲点监测(Blind Spot Monitoring,BSD)等辅助驾驶功能。
激光测距仪128可利用激光来感测车辆100所位于的环境中的物体。在一些实施例中,激光测距仪128可包括一个或多个激光源、激光扫描器以及一个或多个检测器,以及其他系统组件。
相机130可用于捕捉车辆100的周边环境的多个图像。相机130可以是静态相机或视频相机。
控制系统106可控制车辆100及其组件的操作。控制系统106可包括各种元件,其中包括转向系统132、油门134、制动单元136、计算机视觉系统140、路线控制系统142以及障碍规避系统144。
转向系统132可操作来调整车辆100的前进方向。例如在一个实施例中可以为方向盘系统。
油门134用于控制引擎118的操作速度并进而控制车辆100的速度。
制动单元136用于控制车辆100减速。制动单元136可使用摩擦力来减慢车轮121。在其他实施例中,制动单元136可将车轮121的动能转换为电流。制动单元136也可采取其他形式来减慢车轮121转速从而控制车辆100的速度。
计算机视觉系统140可以操作来处理和分析由相机130捕捉的图像,以便识别车辆100周边环境中的物体和/或特征。所述物体和/或特征可包括交通信号、道路边界和障碍物。计算机视觉系统140可使用物体识别算法、运动中恢复结构(structure from motion,SFM)算法、视频跟踪和其他计算机视觉技术。在一些实施例中,计算机视觉系统140可以用于为环境绘制地图、跟踪物体、估计物体的速度等等。
路线控制系统142用于确定车辆100的行驶路线。在一些实施例中,路线控制系统142可结合来自传感器、定位系统122和一个或多个预定地图的数据以为车辆100确定行驶路线。
障碍规避系统144用于识别、评估和避免或者以其他方式越过车辆100的环境中的潜在障碍物。
当然,在一个实例中,控制系统106可以增加或替换地包括除了所示出和描述的那些以外的组件。或者也可以减少一部分上述示出的组件。
车辆100通过外围设备108与外部传感器、其他车辆、其他计算机系统或用户之间进行交互。外围设备108可包括无线通信系统146、车载电脑148、麦克风150和/或扬声器152。
在一些实施例中,外围设备108提供车辆100的用户与用户接口116交互的手段。例如,车载电脑148可向车辆100的用户提供信息。用户接口116还可操作车载电脑148来接收用户的输入。车载电脑148可以通过触摸屏进行操作。在其他情况中,外围设备108可提供用于车辆100与位于车内的其它设备或用户通信的手段。例如,麦克风150可从车辆100的用户接收音频(例如,语音命令或其他音频输入)。类似地,扬声器152可向车辆100的用户输出音频。
无线通信系统146可以直接地或者经由通信网络来与一个或多个设备无线通信。例如,无线通信系统146可使用第三代(3rd-generation,3G)蜂窝通信,例如码分多址(code division multiple access,CDMA)、数据演化(evolution data only,EVDO)、全球移动通信系统(global system for mobile communications,GSM),通用无线分组 业务(general packet radio service,GPRS),或者第四代(the 4th generation,4G)蜂窝通信,例如长期演进(long term evolution,LTE)。或者第五代(5th-Generation,5G)蜂窝通信。无线通信系统146可利用无线保真(wireless fidelity,WiFi)与无线局域网(wireless local area network,WLAN)通信。在一些实施例中,无线通信系统146可利用红外链路、蓝牙或紫蜂(ZigBee)与设备直接通信。其他无线协议,例如各种车辆通信系统,例如,无线通信系统146可包括一个或多个专用短程通信(dedicated short range communications,DSRC)设备。
电源110可向车辆100的各种组件提供电力。在一个实施例中,电源110可以为可再充电锂离子或铅酸电池。这种电池的一个或多个电池组可被配置为电源,从而为车辆100的各种组件提供电力。在一些实施例中,电源110和能量源119可一起实现,例如一些全电动车中那样。
车辆100的部分或所有功能受计算机系统112控制。计算机系统112可包括至少一个处理器113,处理器113执行存储在例如数据存储装置114这样的非暂态计算机可读介质中的指令115。计算机系统112还可以是采用分布式方式控制车辆100的个体组件或子系统的多个计算设备。
处理器113可以是任何常规的处理器,诸如商业可获得的中央处理单元(Central Processing Unit,CPU)。替选地,该处理器可以是诸如专用集成电路(Application Specific Integrated Circuit,ASIC)或其它基于硬件的处理器的专用设备。尽管图1功能性地图示了处理器、存储器、和在相同物理外壳中的其它元件,但是本领域的普通技术人员应该理解该处理器、计算机系统、或存储器实际上可以包括可以在相同的物理外壳内的多个处理器、计算机系统、或存储器,或者包括可以不在相同的物理外壳内的多个处理器、计算机系统、或存储器。例如,存储器可以是硬盘驱动器,或位于不同于物理外壳内的其它存储介质。因此,对处理器或计算机系统的引用将被理解为包括对可以并行操作的处理器或计算机系统或存储器的集合的引用,或者可以不并行操作的处理器或计算机系统或存储器的集合的引用。不同于使用单一的处理器来执行此处所描述的步骤,诸如转向组件和减速组件的一些组件每个都可以具有其自己的处理器,所述处理器只执行与特定于组件的功能相关的计算。
在此处所描述的各个方面中,处理器可以位于远离该车辆的位置并且与该车辆进行无线通信,这类处理器可以称为远程处理器。在其它方面中,此处所描述的过程中的一些在布置于车辆内的处理器上执行而其它则由远程处理器执行,包括采取执行单一操纵的必要步骤。
在一些实施例中,数据存储装置114可包含指令115(例如,程序逻辑),指令115可被处理器113执行来执行车辆100的各种功能,包括以上描述的那些功能。数据存储装置114也可包含额外的指令,包括向行进系统102、传感器系统104、控制系统106和外围设备108中的一个或多个发送数据、从其接收数据、与其交互和/或对其进行控制的指令。
除了指令115以外,数据存储装置114还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆100在自主、半自主和/或手动模式中操作期间被车辆100和计算机系统112使用。
比如，在本申请实施例中，数据存储装置114可从传感器系统104或车辆100的其他组件获取本车道的车道线信息，还可以从上述组件获取检测到的行车轨迹信息和/或道路边界信息。其中，道路边界信息用于表征道路边界的位置。数据存储装置114还可以存储上述获取的信息。再比如，车辆基于雷达126的测速、测距功能，得到其他车辆与自身之间的距离、其他车辆的速度等。如此，处理器113可从数据存储装置114获取这些信息，并基于这些信息，确定道路结构信息。道路结构信息可以用于辅助车辆确定驾驶策略，以控制车辆进行驾驶。
用户接口116,用于向车辆100的用户提供信息或从其接收信息。可选地,用户接口116可包括在外围设备108的集合内的一个或多个输入/输出设备,例如无线通信系统146、车载电脑148、麦克风150和扬声器152中的一个或多个。
计算机系统112可基于从各种子系统(例如,行进系统102、传感器系统104和控制系统106)以及从用户接口116接收的输入来控制车辆100的功能。例如,计算机系统112可利用来自控制系统106的输入,以便控制转向系统132,从而规避由传感器系统104和障碍规避系统144检测到的障碍物。在一些实施例中,计算机系统112可操作来对车辆100及其子系统的许多方面提供控制。
可选地,上述这些组件中的一个或多个可与车辆100分开安装或关联。例如,数据存储装置114可以部分或完全地与车辆100分开存在。上述组件可以按有线和/或无线方式来通信地耦合在一起。
可选地,上述组件只是一个示例。实际应用中,上述各个模块中的组件有可能根据实际需要增添或者删除,图1不应理解为对本申请实施例的限制。
在道路行进的智能汽车,如上面的车辆100,可以识别其周围环境内的物体以确定对当前速度的调整。所述物体可以是其它车辆、交通控制设备、或者其它类型的物体。在一些示例中,可以独立地考虑每个识别的物体,并且基于物体的各自的特性,诸如它的当前速度、加速度、与车辆的间距等,可以用来确定智能汽车所要调整的速度。
可选地,车辆100或者与车辆100相关联的计算设备(如图1的计算机系统112、计算机视觉系统140、数据存储装置114)可以基于所识别的物体的特性和周围环境的状态(例如,交通、雨、道路上的冰、等等)来预测所述识别的物体的行为。可选地,每一个所识别的物体都依赖于彼此的行为,因此,还可以将所识别的所有物体全部一起考虑来预测单个识别的物体的行为。车辆100能够基于预测的所述识别的物体的行为来调整它的速度。换句话说,智能汽车能够基于所预测的物体的行为来确定车辆将需要调整到(例如,加速、减速、或者停止)什么稳定状态。在这个过程中,也可以考虑其它因素来确定车辆100的速度,诸如,车辆100在行驶的道路中的横向位置、道路的曲率、静态和动态物体的接近度等等。
除了提供调整智能汽车的速度的指令之外,计算设备还可以提供修改车辆100的转向角的指令,以使得智能汽车遵循给定的轨迹和/或维持与智能汽车附近的物体(例如,道路上的相邻车道中的轿车)的安全横向和纵向距离。
上述车辆100可以为轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等，本申请实施例不做特别的限定。
在本申请的另一些实施例中,智能车辆还可以包括硬件结构和/或软件模块,以硬件结构、软件模块、或硬件结构加软件模块的形式来实现上述各功能。上述各功能中的某个功能以硬件结构、软件模块、还是硬件结构加软件模块的方式来执行,取决于技术方案的特定应用和设计约束条件。
参见图2,示例性的,车辆中可以包括以下模块:
环境感知模块201,用于获取路侧传感器和/或车载传感器识别的车辆、行人、路面物体的信息。路侧传感器与车载传感器可以是摄像头(相机)、激光雷达、毫米波雷达等。环境感知模块获取到的数据可以是原始采集的视频流、雷达的点云数据或者是经过分析的结构化的人、车、物的位置、速度、转向角度、尺寸大小数据。对于原始的视频流数据、雷达的点云数据,环境感知模块可以将这些数据处理成可识别的结构化的人、车、物的位置、速度、转向角度、尺寸大小等数据,并向控制模块202传递这些数据,以便于控制模块202生成驾驶策略。在本申请实施例中,环境感知模块201包括摄像头或者雷达,用于获取车辆所在本车道的边界信息。还用于获取道路边界信息和/或检测到的行车轨迹信息。其中,道路边界信息和/或检测到的行车轨迹信息,用于确定邻车道边界信息。本车道边界信息和邻车道边界信息,用于确定道路结构信息。
控制模块202:该模块可以是车辆所具备的传统控制模块,其作用是根据环境感知模块201输入的数据(本车道边界信息、行车轨迹信息和/或道路边界信息),确定道路结构信息。还用于将根据行车轨迹信息确定的邻车道边界信息,和根据道路边界信息确定的邻车道边界信息进行融合,得到更为准确的邻车道边界信息。还用于根据道路结构信息生成驾驶策略,并输出驾驶策略对应的动作指令,向控制模块203发送该动作指令。该动作指令用于指示控制模块203对车辆进行驾驶控制。控制模块202可以是具有控制、处理功能的组件或子系统的集合。比如,可以是图1所示的处理器113,或者处理器中的部分功能模块,或者类似组件或类似子系统。
控制模块203:用于从控制模块202接收动作指令,以控制车辆完成驾驶操作。控制模块203可以是具有控制、处理功能的组件或子系统的集合。比如,可以是图1所示的处理器113,或者类似组件或类似子系统。
当然,还可以将上述各个模块集成在一个模块中。该集成模块用于提供上述多个功能。
车载通信模块(图2中并未示出):用于车辆和其他车之间的信息交互。车载通信模块可以比如但不限于为如图1所示的无线通信系统146中的组件。
存储组件(图2中并未示出),用于存储上述各个模块的可执行代码。运行这些可执行代码可实现本申请实施例的部分或全部方法流程。存储组件可以比如但不限于为如图1所示的数据存储装置114中的组件。
在本申请实施例的一种可能的实现方式中,如图3所示,图1所示的计算机系统112包括处理器303,处理器303和系统总线305耦合。处理器303可以是一个或者多个处理器,其中每个处理器都可以包括一个或多个处理器核。显示适配器(video adapter)307,显示适配器307可以驱动显示器309,显示器309和系统总线305耦合。 系统总线305通过总线桥311和输入输出(I/O)总线(BUS)313耦合。I/O接口315和I/O总线313耦合。I/O接口315和多种I/O设备进行通信,比如输入设备317(如:键盘,鼠标,触摸屏等),多媒体盘(media tray)321,(例如,CD-ROM,多媒体接口等)。收发器323(可以发送和/或接收无线电通信信号),摄像头355(可以捕捉静态和动态数字视频图像)和外部通用串行总线(Universal Serial Bus,USB)接口325。其中,可选地,和I/O接口315相连接的接口可以是USB接口。
其中,处理器303可以是任何传统处理器,包括精简指令集计算(Reduced Instruction Set Computer,RISC)处理器、复杂指令集计算(Complex Instruction Set Computer,CISC)处理器或上述的组合。可选地,处理器可以是诸如专用集成电路(application specific integrated circuit,ASIC)的专用装置。可选地,处理器303可以是神经网络处理器或者是神经网络处理器和上述传统处理器的组合。
可选地,在本文所述的各种实施例中,计算机系统112可位于远离车辆的地方,并且可与车辆100无线通信。在其它方面,本文所述的一些过程可设置在车辆内的处理器上执行,其它一些过程由远程处理器执行,包括采取执行单个操纵所需的动作。
计算机系统112可以通过网络接口329和软件部署服务器(deploying server)349通信。网络接口329是硬件网络接口,比如,网卡。网络(network)327可以是外部网络,比如因特网,也可以是内部网络,比如以太网或者虚拟私人网络(virtual private network,VPN)。可选地,网络327还可以为无线网络,比如WiFi网络,蜂窝网络等。
硬盘驱动器接口331和系统总线305耦合。硬盘驱动器接口331和硬盘驱动器333相连接。系统内存335和系统总线305耦合。运行在系统内存335的数据可以包括计算机系统112的操作系统(OS)337和应用程序343。
操作系统包括但不限于Shell 339和内核(kernel)341。Shell 339是介于使用者和操作系统的内核(kernel)间的一个接口。shell是操作系统最外面的一层。shell管理使用者与操作系统之间的交互:等待使用者的输入,向操作系统解释使用者的输入,并且处理各种各样的操作系统的输出结果。
内核341由操作系统中用于管理存储器、文件、外设和系统资源的那些部分组成。直接与硬件交互,操作系统内核通常运行进程,并提供进程间的通信,提供CPU时间片管理、中断、内存管理、IO管理等等功能。
应用程序343包括控制汽车驾驶相关的程序,比如,管理汽车和路上障碍物交互的程序,控制汽车路线或者速度的程序,控制汽车和路上其他汽车交互的程序。应用程序343也存在于deploying server 349的系统上。在一个实施例中,在需要执行应用程序343时,计算机系统112可以从deploying server 349下载应用程序343。
又比如,应用程序343可以是控制车辆根据本车道的车道线信息、邻车道的边界信息(根据道路边界信息和/或检测到的行车轨迹信息确定),确定道路结构信息的应用程序。计算机系统112的处理器303调用该应用程序343,得到最终的道路结构。
传感器353和计算机系统112关联。传感器353用于探测计算机系统112周围的环境。举例来说，传感器353可以探测动物，汽车，障碍物和人行横道等。进一步传感器还可以探测上述动物，汽车，障碍物和人行横道等物体周围的环境。比如:动物周围的环境，例如，动物周围出现的其他动物，天气条件，周围环境的光亮度等。可选地，如果计算机系统112位于汽车上，传感器可以是摄像头，红外线感应器，化学检测器，麦克风等。
在本申请的另一些实施例中,本申请实施例的道路结构检测方法还可以由芯片系统执行。芯片系统可以位于车辆,也可以位于其他位置,比如,位于服务器。参见图4,是本申请实施例提供的一种芯片系统的示例性结构图。
神经网络处理器(neural-network processing unit,NPU)50可以作为协处理器挂载到主CPU(host CPU)上,由host CPU为NPU分配任务。NPU的核心部分为运算电路503。示例性的,通过控制器504控制运算电路503,从而运算电路503可提取存储器中的矩阵数据并进行乘法运算。
在一些实现中,运算电路503内部包括多个处理单元(Process Engine,PE)。在一些实现中,运算电路503是二维脉动阵列。运算电路503还可以是一维脉动阵列,或者能够执行例如乘法和加法这样的数学运算的其它电子线路。在一些实现中,运算电路503是通用的矩阵处理器。
举例来说,假设有输入矩阵A,权重矩阵B,输出矩阵C。运算电路503从权重存储器502中获取权重矩阵B相应的数据,并缓存在运算电路503中每一个PE上。运算电路503从输入存储器501中取输入矩阵A相应的数据,并根据输入矩阵A和权重矩阵B进行矩阵运算,得到矩阵运算的部分结果或最终结果可保存在累加器(accumulator)508中。
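上述"取数、相乘并在累加器中累加部分结果"的过程，可用如下Python草图帮助理解(仅为示意性的软件模拟，并非运算电路503或NPU 50的实际实现，函数名为本文假设)：

    import numpy as np

    def matmul_with_accumulator(A: np.ndarray, B: np.ndarray) -> np.ndarray:
        # 模拟"从输入存储器取A、从权重存储器取B、部分结果存入累加器"的过程
        M, K = A.shape
        K2, N = B.shape
        assert K == K2
        acc = np.zeros((M, N))             # 对应累加器508中保存的部分结果
        for k in range(K):                 # 每处理一个维度，累加一次部分结果
            acc += np.outer(A[:, k], B[k, :])
        return acc                         # 累加完成后即为矩阵运算的最终结果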
又比如,运算电路503可用于实现特征提取模型(如卷积神经网络模型),并将图像数据输入卷积神经网络模型,通过该模型的运算,得到图像的特征。进而,将图像特征输出到分类器,由分类器输出图像中物体的分类概率。
统一存储器506用于存放输入数据以及输出数据。外部存储器中的权重数据直接通过存储单元访问控制器(Direct Memory Access Controller,DMAC)505被送往到权重存储器502中。外部存储器中的输入数据可通过DMAC被搬运到统一存储器506中,或者被搬运到输入存储器501中。
总线接口单元(Bus Interface Unit,BIU)510,用于高级可拓展接口(advanced extensible interface,AXI)总线与DMAC和取指存储器(instruction fetch buffer)509的交互。还用于取指存储器509从外部存储器获取指令,还用于存储单元访问控制器505从外部存储器获取输入矩阵A或者权重矩阵B的原数据。
DMAC主要用于将外部存储器中的输入数据搬运到统一存储器506,或将权重数据搬运到权重存储器502中,或将输入数据搬运到输入存储器501中。
向量计算单元507可包括多个运算处理单元。用于在需要的情况下,可以对运算电路503的输出做进一步处理,如向量乘,向量加,指数运算,对数运算,大小比较等等。主要用于神经网络中非卷积/FC层网络计算,如池化(pooling),批归一化(batch normalization),局部响应归一化(local response normalization)等。
在一些实现中，向量计算单元507将经处理的输出向量存储到统一存储器506。例如，向量计算单元507可以将非线性函数应用到运算电路503的输出，例如累加值的向量，用以生成激活值。在一些实现中，向量计算单元507生成归一化的值、合并值，或二者均有。在一些实现中，处理过的输出向量还能够用作运算电路503的激活输入，例如用于在神经网络中的后续层中的使用。
控制器504连接取指存储器(instruction fetch buffer)509,控制器504使用的指令可存储在取指存储器509中。
作为一种可能的实现方式,统一存储器506,输入存储器501,权重存储器502以及取指存储器509均为片上(On-Chip)存储器。外部存储器私有于该NPU硬件架构。
结合图1至图3,主CPU和NPU共同配合,可实现图1中车辆100所需功能的相应算法,也可实现图2所示车辆所需功能的相应算法,也可以实现图3所示计算机系统112所需功能的相应算法。当图4所示芯片系统位于除车辆之外的位置,比如服务器,主CPU和NPU共同配合,可以实现服务器所需功能的相应算法。
在本申请的另一些实施例中,计算机系统112还可以从其它计算机系统接收信息或转移信息到其它计算机系统。或者,从车辆100的传感器系统104收集的传感器数据可以被转移到另一个计算机,由另一计算机对此数据进行处理。如图5所示,来自计算机系统112的数据可以经由网络被传送到云侧的计算机系统720用于进一步的处理。网络以及中间节点可以包括各种配置和协议,包括因特网、万维网、内联网、虚拟专用网络、广域网、局域网、使用一个或多个公司的专有通信协议的专用网络、以太网、WiFi和超文本传输协议(hypertext transfer protocol,HTTP)、以及前述的各种组合。这种通信可以由能够传送数据到其它计算机和从其它计算机传送数据的任何设备执行,诸如调制解调器和无线接口。
在一个示例中,计算机系统720可以包括具有多个计算机的服务器,例如负载均衡服务器群。为了从计算机系统112接收、处理并传送数据,计算机系统720与网络的不同节点交换信息。该服务器720可以具有类似于计算机系统112的配置,并具有处理器730、存储器740、指令750、和数据760。
在一个示例中,服务器720的数据760可以包括提供天气相关的信息。例如,服务器720可以接收、监视、存储、更新、以及传送与天气相关的各种信息。该信息可以包括例如以报告形式、雷达信息形式、预报形式等的降水、云、和/或温度信息和/或湿度信息。
参见图6,为车辆和云服务中心(云服务器)交互的示例。云服务中心可以经诸如无线通信网络的网络511,从其操作环境500内的车辆513、512接收信息(诸如车辆传感器收集到数据或者其它信息)。
云服务中心520根据接收到的数据,运行其存储的控制汽车驾驶相关的程序对车辆513、512进行控制。控制汽车驾驶相关的程序可以为:管理汽车和路上障碍物交互的程序,或者控制汽车路线或者速度的程序,或者控制汽车和路上其他汽车交互的程序。
示例性的，云服务中心520通过网络511可将地图的部分提供给车辆513、512。在其它示例中，可以在不同位置之间划分操作。例如，多个云服务中心可以接收、证实、组合和/或发送信息报告。在一些示例中还可以在车辆之间发送信息报告和/或传感器数据。其它配置也是可能的。
在一些示例中,云服务中心520向车辆发送关于环境内可能的驾驶情况所建议的解决方案(如,告知前方障碍物,并告知如何绕开它))。例如,云服务中心520可以辅助车辆确定当面对环境内的特定障碍时如何行进。云服务中心520向车辆发送指示该车辆应当在给定场景中如何行进的响应。例如,云服务中心520基于收集到的传感器数据,可以确认道路前方具有临时停车标志的存在,又比如,基于“车道封闭”标志和施工车辆的传感器数据,确定该车道由于施工而被封闭。相应地,云服务中心520发送用于车辆通过障碍的建议操作模式(例如:指示车辆变道另一条道路上)。云服务中心520观察其操作环境500内的视频流,并且已确认车辆能安全并成功地穿过障碍时,对该车辆所使用的操作步骤可以被添加到驾驶信息地图中。相应地,这一信息可以发送到该区域内可能遇到相同障碍的其它车辆,以便辅助其它车辆不仅识别出封闭的车道还知道如何通过。
在一些实施例中,所公开的方法可以实施为以机器可读格式,被编码在计算机可读存储介质上的或者被编码在其它非瞬时性介质或者制品上的计算机程序指令。图7示意性地示出根据这里展示的至少一些实施例而布置的示例计算机程序产品的概念性局部视图,示例计算机程序产品包括用于在计算设备上执行计算机进程的计算机程序。在一个实施例中,示例计算机程序产品600是使用信号承载介质601来提供的。信号承载介质601可以包括一个或多个程序指令602,其当被一个或多个处理器运行时可以提供以上针对图1至图6描述的全部功能或者部分功能,或者可以提供后续实施例中描述的全部或部分功能。例如,参考图9中所示的实施例,S901至S903中的一个或多个特征可以由与信号承载介质601相关联的一个或多个指令来承担。此外,图7中的程序指令602也描述示例指令。
在一些实施例中,当本申请实施例的技术方案由车辆或车辆中的组件执行,计算机程序产品可以为车辆或其组件所使用的程序产品。当本申请实施例的技术方案由车辆之外的其他装置执行,比如服务器,计算机程序产品可以为该其他装置所使用的程序产品。
在一些示例中,信号承载介质601可以包含计算机可读介质603,诸如但不限于,硬盘驱动器、紧密盘(CD)、数字视频光盘(DVD)、数字磁带、存储器、只读存储记忆体(read-only memory,ROM)或随机存储记忆体(random access memory,RAM)等等。在一些实施方式中,信号承载介质601可以包含计算机可记录介质604,诸如但不限于,存储器、读/写(R/W)CD、R/W DVD、等等。在一些实施方式中,信号承载介质601可以包含通信介质605,诸如但不限于,数字和/或模拟通信介质(例如,光纤电缆、波导、有线通信链路、无线通信链路、等等)。因此,例如,信号承载介质601可以由无线形式的通信介质605(例如,遵守电气和电子工程师协会(institute of electrical and electronics engineers,IEEE)802.11标准或者其它传输协议的无线通信介质)来传达。一个或多个程序指令602可以是,例如,计算机可执行指令或者逻辑实施指令。在一些示例中,诸如针对图1至图6描述的计算设备可以被配置为,响应于通过计算机可读介质603、和/或计算机可记录介质604、和/或通信介质605中的一个或多个传达到计算设备的程序指令602,提供各种操作、功能、或者动作。应该理解,这里描述的布置仅仅是用于示例的目的。因而,本领域技术人员将理解,其它布置和 其它元素(例如,机器、接口、功能、顺序、和功能组等等)能够被取而代之地使用,并且一些元素可以根据所期望的结果而一并省略。另外,所描述的元素中的许多是可以被实现为离散的或者分布式的组件的、或者以任何适当的组合和位置来结合其它组件实施的功能实体。
本申请实施例提供的道路结构检测方法应用在自动/半自动驾驶或其他驾驶场景中。具体的,应用在确定道路结构信息的场景中。道路结构信息包括本车道与邻车道的合并点和/或本车道与邻车道的分离点的位置信息。示例性的,应用在图8中(a)或图8中(b)所示的场景中。如图8中(a)所示,车辆801所在本车道和邻车道(即靠近本车道的车道),即本车道和右邻车道之间存在合并点A。其中,合并点A的位置,即本车道的右车道线(host right lane line,HR)与右邻车道的左车道线(right left lane line,RL)的交叉位置。在本申请实施例中,除非另有说明,车道线通常指使用涂料(比如油漆等)在路面上画出的平面图案。如图8中(b)所示,车辆802所在本车道和右邻车道之间存在分离点B。其中,分离点B的位置,与本车道边缘侧的护栏等凸起物(或水泥墩)和邻车道边缘侧的护栏(或水泥墩)等凸起物的位置有关。示例性的,当两条车道通过护栏隔离时,分离点B的位置为两个护栏的交叉位置。当两条车道通过水泥墩隔离时,分离点B的位置为两个水泥墩的交叉位置。图8中(a)与图8中(b)所示场景的区别在于,合并点或分离点位置是根据什么因素确定的,一种情况下,合并点或分离点的位置由平面的车道线交叉位置确定,一种情况下,合并点或分离点的位置由路面凸起物的位置确定。
下面结合各个附图详细描述本申请实施例的道路结构检测方法。
本申请实施例提供一种道路结构检测方法,该方法可应用于图1至图6中的装置或图7所示的计算机程序产品或由远离车辆的其他装置中,在下述阐述技术方案时,省略了技术方案的执行主体。参见图9,该方法包括如下步骤:
S901、确定本车道的边界信息。
其中,所述本车道边界信息用于表征本车道(即当前车道)的边界的位置;本车道的边界信息包括本车道的车道线信息,和/或所述本车道边缘的位置信息。作为一种可能的实现方式,车道之间可以采用凸起物划分车道边缘。本车道边缘的位置信息包括但不限于本车道边缘侧凸起物的位置信息。本车道边缘侧的凸起物包括但不限于本车道边缘侧的护栏、水泥墩。
作为一种可能的实现方式,可以利用视觉传感器(比如摄像头)采集车道线的视觉信息,以确定车道线位置。可以利用雷达或者类似组件获取车道边缘的位置。
S902、确定邻车道边界信息。
所述邻车道边界信息用于表征相邻车道的边界的位置;邻车道的边界信息包括所述邻车道的车道线信息,和/或所述邻车道边缘的位置信息。当车道之间采用凸起物划分车道边缘时,邻车道边缘的位置信息,可以指邻车道边缘侧凸起物的位置信息。凸起物包括但不限于邻车道边缘侧的护栏、水泥墩。
作为一种可能的实现方式,所述邻车道边界信息是根据检测到的行车轨迹信息和/或道路边界信息,以及道路先验数据确定的。即邻车道边界信息是根据道路先验数据和检测到的行车轨迹信息确定的。或者,邻车道边界信息是根据道路先验数据和道路 边界信息确定的。或者,邻车道边界信息是根据道路先验数据、检测到的行车轨迹信息,以及道路边界信息确定的。
其中,所述道路边界信息用于表征道路边界(edge)的位置。
这里,示例性的说明几种确定行车轨迹信息、道路边界信息的方式。其中,可以利用车辆的视觉传感器,比如摄像头采集邻车道的车辆的视觉信息,以确定检测到的行车轨迹信息。还可以利用车辆的雷达或者具有相似功能的组件,利用收发激光、毫米波等方式,确定其他车道中车辆的位置、速度等信息,以确定该其他车道的行车轨迹信息。当然,也可以采用其他方式确定其他车道的行车轨迹信息,本申请实施例对此不限制。当采用涂料等平面图案标记道路边界线时,可以采用摄像头等组件采集道路边界线的视觉信息,以确定道路边界位置,即道路边界信息。当采用凸起物(护栏、或水泥墩等)等划分道路边界时,同样可以采用摄像头确定道路边界信息。也可以采用雷达等组件确定道路边界信息。本申请实施例对确定道路边界信息的方式不做限定。
在本申请实施例中,所述道路先验数据包括车道宽度。其中,不同场景中,车道宽度可能不同。比如,城市道路每车道宽度为3.5米,交叉路口分流车道每车道为2.3-2.5米,干线公路(包括高速公路)每车道宽为3.75米。也就是说,本申请实施例中确定邻车道边界信息所使用的车道宽度,需根据车辆当前所处场景有关。其中,可以利用自车的传感器,比如摄像头采集当前场景下的车道宽度。自车也可以从网络中的其他设备(比如服务器)中直接获取当前场景下的车道宽度。本申请实施例对获取车道宽度的方式不做限定。
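作为示意，下面给出一种按场景选取车道宽度先验值的极简写法(仅为便于理解的草图，字典、函数名与场景名称均为本文假设，数值取自上文示例，实际应用中应按所处地区的道路规范与感知结果配置)：

    # 车道宽度先验(单位:米)，数值取自上文示例
    LANE_WIDTH_PRIOR = {
        "urban": 3.5,       # 城市道路
        "diverge": 2.4,     # 交叉路口分流车道(2.3~2.5米，此处取中间值)
        "highway": 3.75,    # 干线公路(包括高速公路)
    }

    def get_lane_width(scene, default=3.5):
        # 未知场景回退到默认值
        return LANE_WIDTH_PRIOR.get(scene, default)

例如，车辆处于高速公路场景时，调用get_lane_width("highway")即可得到3.75米，用于后续的平移操作。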
如下具体说明确定邻车道边界信息的方式。其中,如下根据是以车道线还是以凸起物划分合并点或分离点,分为两种情况分别进行阐述:
情况1:当采用涂料标记车道线这种方式划分合并点或分离点时,若邻车道为右邻车道,所述邻车道的边界信息包括RL的位置信息;若邻车道为左邻车道,邻车道的边界信息包括左邻车道的右车道线(left right lane line,LR)的位置信息。如图8的(a)中示出了LR的示例。需要说明的是,图8的(a)中,将RL、LR同时标注出来,而实际应用场景中,可能有些情况下LR、RL不同时存在于道路中。也就是说,实际场景中,可能只存在RL或只存在LR。可以理解的是,本申请实施例仅为一种举例,本领域技术人员可以应用本方法检测其他的道路边界信息,例如与本车道不相邻的其他车道之间的合并点和分离点,进而可以提供充足的道路信息,以供决策执行模块预测行驶路径。
在情况1的一种可能的实现方式中,根据道路边界信息和道路先验数据确定RL和/或LR。具体的,将所述道路边界向左平移第一宽度得到所述RL的第一位置;将所述道路边界向右平移第三宽度得到所述LR的第四位置。
第一宽度为车道宽度的整数倍,所述第三宽度为车道宽度的整数倍。
通常,可以将道路边界信息可以用三次方程(由于路面通常是平面,也可以用二次方程或其他可能的方程)来描述。可以将道路边界信息(构成该道路边界的点)映射到世界坐标系,并在世界坐标系中对这些点进行一系列操作,以得到RL或LR的位置。示例性的,以确定RL位置为例,参见图10中(a),在将道路边界上的点映射到世界坐标系之后,对道路边界上的全部点向左平移(即向靠近原点o的一侧平移)。 由于车辆事先并不感知道路边界和本车道之间存在几个车道,因此,在进行平移操作时,先以最小粒度为单位,即先平移一个车道宽度得到初步确定的RL。该初步确定的RL用于后续确定初步的合并点或分离点位置。当初步确定的合并点或分离点位置满足预设条件时,无需调整初步确定的RL。反之,当初步确定的合并点或分离点位置不满足预设条件时,还需再调整初步确定的RL,直至最终合并点或分离点位置满足预设条件。这里先介绍初步确定RL的方式,具体如何判断合并点或分离点位置是否满足预设条件,以及具体如何调整RL等,将在下文给予详细说明。
LR的确定方式可参见RL的确定方式,这里不再具体阐述。
在另一些实施例中,也可以将道路边界信息映射到图像平面(或称为图像坐标系),通过在图像平面上对道路边界上的点进行诸如平移的一系列操作,以确定RL或LR的位置。
当然,本申请实施例并不限制所映射的坐标系的类型。
在情况1的又一种可能的实现方式中,根据检测到的行车轨迹信息和道路先验数据确定RL和/或LR。具体的,将所述行车轨迹向左平移第二宽度得到RL的第二位置;将所述行车轨迹向右平移第四宽度得到LR的第五位置。
其中,所述第二宽度为半车道宽度的奇数倍,所述第四宽度为半车道宽度的奇数倍。
这里,可以将检测到的行车轨迹信息用三次方程(由于道路通常是平面,也可以用二次方程或其他可能的方程)来描述。可以将该行车轨迹信息(构成该行车轨迹的点)映射到世界坐标系,并在世界坐标系中对这些点进行诸如平移的一系列操作,以得到RL或LR的位置。示例性的,以确定RL位置为例,参见图10中(b),在将行车轨迹上的点映射到世界坐标系之后,对该行车轨迹上的全部点向左平移半个车道宽度,即可得到RL。LR的确定方式可参见RL的确定方式,这里不再具体阐述。
需要说明的是,本申请实施例中,以观测车位于车道中心为例,来设置第二宽度或第四宽度。考虑到观测车实际上可能不是在车道中心,第二宽度或第四宽度的实际宽度还可以调整为其他宽度。
在另一些实施例中,也可以将行车轨迹信息映射到图像平面,通过在图像平面上对行车轨迹上的点进行诸如平移的一系列操作,以确定RL或LR的位置。
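上述两种平移操作可用如下草图帮助理解(仅为简化示意，假设在车体坐标系下x为前向、y向左为正，直接对y叠加偏移；更严格的做法是沿曲线法向平移，函数名与数值均为本文假设)：

    import numpy as np

    def lateral_shift(points, offset):
        # points为N x 2的点列，列为[x, y]；offset > 0表示向左平移
        shifted = np.asarray(points, dtype=float).copy()
        shifted[:, 1] += offset
        return shifted

    lane_width = 3.75
    # 把检测到的道路边界向左平移一个车道宽度，得到RL的第一位置(初步RL)
    road_edge = np.array([[0.0, -7.5], [10.0, -7.4], [20.0, -7.2]])
    rl_from_edge = lateral_shift(road_edge, +lane_width)
    # 把右邻车道的行车轨迹向左平移半个车道宽度，得到RL的第二位置
    track = np.array([[0.0, -5.6], [10.0, -5.5], [20.0, -5.4]])
    rl_from_track = lateral_shift(track, +lane_width / 2)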
在情况1的再一种可能的实现方式中,根据道路边界信息、检测到的行车轨迹信息和道路先验数据确定RL和/或LR。具体的,将第一位置和第二位置经预设算法融合得到RL的第三位置;将第四位置和第五位置经预设算法融合得到LR的第六位置。
以确定RL为例,容易理解的是,在上述两种情况中,根据道路边界信息和道路先验数据确定的RL,与根据邻车道行车轨迹和道路先验数据确定的RL之间,可能存在一些偏差。比如,由于雷达探测距离较小,可能探测的近距离的行车轨迹较为准确,由此得到的RL位置较为准确,探测的远距离的道路边界信息不是很准确,由此得到的RL位置可能不是很准确。这种情况下,为了提升所确定RL位置的准确性,可以将两种RL位置信息进行融合。其中,采用的融合算法可以但不限于是加权求和。其中,根据道路边界信息确定的RL的第一位置的权重、根据行车轨迹信息确定的RL的第二位置的权重,可以根据实际场景设定。比如,当雷达本身探测距离较小,或出现雾霾 天气导致探测距离较小,第二位置可能更接近RL的准确位置时,设置第二位置的权重大于第一位置的权重。这种情况下,由于可以根据实际场景或者车辆本身性能等因素实时确定权重,所以能够提升融合得到的位置信息的准确性。当然,第一位置和第二位置的权重,还可以预先设置好。比如,根据一些历史数据或其他数据提前设定第一位置和第二位置的权重。预设值权重的方式,无需根据实际场景调整第一位置和第二位置的权重,实现起来较为简单。
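对上述加权求和的融合方式，这里给出一个极简草图(仅为示意，权重取值为假设值，实际权重可按传感器性能或环境因素实时确定，也可预先配置)：

    import numpy as np

    def fuse_positions(pos_a, pos_b, w_a=0.5, w_b=0.5):
        # pos_a、pos_b为同一车道线按相同x采样得到的两个N x 2点列
        pos_a = np.asarray(pos_a, dtype=float)
        pos_b = np.asarray(pos_b, dtype=float)
        assert pos_a.shape == pos_b.shape
        return (w_a * pos_a + w_b * pos_b) / (w_a + w_b)

    # 例如雾霾天气下由行车轨迹推得的第二位置更可信时，可取
    # rl_fused = fuse_positions(rl_from_edge, rl_from_track, w_a=0.3, w_b=0.7)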
情况2:当采用凸起物划分合并点或分离点时,若邻车道为右邻车道,所述邻车道的边界信息包括右邻车道的左边缘的位置信息;若邻车道为左邻车道,邻车道的边界信息包括左邻车道的右边缘的位置信息。
在情况2的一种可能的实现方式中,根据道路边界信息和道路先验数据确定右邻车道的左边缘和/或左邻车道的右边缘的位置。具体的,将所述道路边界向左平移第五宽度得到所述右邻车道的左边缘的第七位置;将所述道路边界向右平移第七宽度得到所述左邻车道的右边缘的第十位置。
第五宽度为车道宽度的整数倍,所述第七宽度为车道宽度的整数倍。
示例性的,以确定右邻车道的左边缘的位置为例,参见图10中(c),在将道路边界上的点映射到世界坐标系之后,对道路边界上的全部点向左平移一个车道宽度,即可得到右邻车道的左边缘的位置。左邻车道的右边缘位置的确定方式可参见右邻车道的左边缘位置的确定方式,这里不再具体阐述。
在情况2的又一种可能的实现方式中,根据检测到的行车轨迹信息和道路先验数据确定右邻车道的左边缘和/或左邻车道的右边缘的位置。具体的,将所述行车轨迹向左平移第六宽度得到右邻车道的左边缘的第八位置;将所述行车轨迹向右平移第八宽度得到左邻车道的右边缘的第十一位置。
其中,所述第六宽度为半车道宽度的奇数倍,所述第八宽度为半车道宽度的奇数倍。
示例性的,以确定右邻车道的左边缘凸起物(比如护栏)位置为例,参见图10中(d),在将行车轨迹上的点映射到世界坐标系之后,对该行车轨迹上的全部点向左平移半个车道宽度,即可得到右邻车道的左边缘凸起物的位置。左邻车道的右边缘凸起物位置的确定方式可参见右邻车道的左边缘凸起物位置的确定方式,这里不再具体阐述。
在另一些实施例中,也可以将行车轨迹信息映射到图像平面,以确定右邻车道的左边缘或左邻车道的右边缘的位置。
在情况2的再一种可能的实现方式中,根据道路边界信息、检测到的行车轨迹信息和道路先验数据确定右邻车道的左边缘和/或左邻车道的右边缘的位置。具体的,将右邻车道的左边缘的第一位置和第二位置经预设算法融合得到右邻车道的左边缘的第三位置;将左邻车道的右边缘的第四位置和第五位置经预设算法融合得到左邻车道的右边缘的第六位置。
根据右邻车道的左边缘的第一位置、第二位置和融合算法得到第三位置的具体方式,可参见上述根据RL的第一位置、第二位置和融合算法得到第三位置的过程,这里不再赘述。
需要说明的是,本申请实施例中所提及的车道,不仅指普通车道,还可以指诸如紧急停车带等道路结构。当路面出现紧急停车带时,因为紧急停车带和普通车道宽度相差不大,可以按照普通车道宽度进行处理。或者,可以采用实际的紧急停车带宽度进行上述平移操作,以确定邻车道边界信息。
可见,本申请实施例中,邻车道边界信息是根据道路边界信息和/或检测到的行车轨迹信息推断出的,而并非直接通过摄像头检测邻车道边界信息。这样一来,能够一定程度上避免直接通过摄像头采集邻车道的车道线位置时,因摄像头检测性能不佳(比如检测距离较小)或者环境因素(比如雾霾),导致采集的邻车道的车道线位置不准确,或者无法采集到邻车道的车道线位置的问题。
S903、根据本车道的边界信息和邻车道边界信息确定道路结构信息。
当采用涂料标记车道线的方式划分合并点或分离点时,S903具体实现为:根据本车道的车道线信息(HR)和邻车道的车道线信息,确定合并点或分离点的位置信息。以本车道和右邻车道之间存在合并点为例,参见图10中(a)或图10中(b),在确定HR和RL的位置之后,HR和RL的交叉位置即合并点A的位置。
当采用凸起物的方式划分合并点或分离点时,S903具体实现为:根据本车道边缘侧的凸起物信息和邻车道边缘侧的凸起物信息,确定合并点或分离点的位置信息。以本车道和右邻车道之间存在合并点为例,参见图10中(c)或图10中(d),在确定本车道的右边缘护栏(或水泥墩等)和右邻车道的左边缘护栏(或水泥墩等)的位置之后,这两个护栏(或水泥墩)的交叉位置即合并点A的位置。
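以车道线用三次方程描述为例，求HR与RL交叉位置(即初步合并点A)的过程可用如下草图表示(仅为示意性的求交方式，系数与探测范围均为本文假设，并非本申请限定的实现)：

    import numpy as np

    def find_merge_point(coef_hr, coef_rl, x_max=200.0):
        # coef_*为升幂排列的三次多项式系数[c0, c1, c2, c3]，即y = c0 + c1*x + c2*x^2 + c3*x^3
        diff = np.asarray(coef_hr, dtype=float) - np.asarray(coef_rl, dtype=float)
        roots = np.roots(diff[::-1])           # np.roots使用降幂系数
        real = [r.real for r in roots
                if abs(r.imag) < 1e-6 and 0.0 <= r.real <= x_max]
        if not real:
            return None                        # 感知范围内无交点，即不存在合并点/分离点
        x = min(real)                          # 取离本车最近的交点
        y = float(np.polyval(np.asarray(coef_hr, dtype=float)[::-1], x))
        return x, y

    # 示例：HR为y = -1.75，RL自-5.25米处以每米0.05米的速率向本车道靠拢
    # find_merge_point([-1.75, 0, 0, 0], [-5.25, 0.05, 0, 0])  # 约在(70.0, -1.75)处相交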
在本申请实施例中,在初步确定合并点或分离点的位置的基础上,还需判断初步确定的合并点(或分离点)位置是否符合预设条件。具体的,以判断初步确定的合并点位置是否符合预设条件为例,预设条件可以是:初步确定的所述合并点(比如图10中(a)-图10中(d)中确定的合并点A)与参考点之间的距离小于或等于第一阈值。其中,参考点为一个位置基准,用于判断初步确定的合并点(或分离点)位置是否准确。当初步确定的合并点(或分离点)位置与该参考点之间的距离小于第一阈值时,可以认为初步确定的合并点(或分离点)位置准确。例如,参考点可以是当前车辆(比如图8中(a)的车辆801)。所述第一阈值是通过传感器的感知范围确定的,或者,所述第一阈值为预配置的数值。
在本申请实施例中,当初步确定的合并点与车辆之间的距离小于或等于第一阈值,比如图10中(a)-图10中(d)初步确定的合并点A位置与车辆之间的距离小于或等于第一阈值,视为初步确定的合并点位置准确,无需再进一步调整合并点的位置。类似的,当道路上出现分离点的场景中,同样要求所述分离点与车辆之间的距离小于或等于第一阈值。当初步确定的分离点与车辆之间的距离小于或等于第一阈值,视初步确定的分离点位置准确,无需再进一步调整分离点位置。
反之,当初步确定的合并点与车辆之间的距离大于第一阈值,其可能是在确定邻车道边界信息时调整的宽度不准确。此种情况,应当调整上述至少一个宽度。具体的,根据所述第一阈值调整上述至少一个宽度;基于调整的至少一个宽度,调整合并点和/或分离点。
比如，参见图11中(a)，当车辆801所在本车道和道路边界之间存在多个车道(图11中(a)所示为两个)，将道路边界上的点整体向左平移一个车道宽度，实际得到的是右邻车道的右车道线，并将右邻车道的右车道线视为初步确定的RL。这样一来，基于初步确定的RL(实际是右邻车道的右车道线)和本车道的右车道线初步确定合并点位置如图11中(a)所示B点位置。该初步确定的B点位置与车辆之间的距离较大，并且如图11中(a)所示，B点并非真实合并点。因此，需要将平移宽度由一个车道宽度按照一定步长进行调整，这里步长可以是一个车道宽度。这里需要将平移宽度由一个车道宽度调整至两个车道宽度，即将道路边界上的点调整为向左平移两个车道宽度，即在首次向左平移一个车道宽度后再向左平移一个车道宽度，得到调整后的RL。如图11中(a)所示，经过该次调整，可得到实际的RL。该RL与HR的交叉点位置，即合并点位置满足预设条件，则该合并点位置视为最终的合并点位置。这里，调整之后的第一宽度实际为两个车道宽度。
当然,还可以调整第一宽度之外的其他宽度。比如,当前场景中,如图11中(a)所示,左邻车道与本车道之间也存在合并点,则除了调整上述第一宽度为两个车道宽度之外,可能还需调整左侧道路边界所需平移的宽度,即第三宽度为两个车道宽度,以得到实际的LR位置。
又比如，参见图11中(b)，当车辆801所在本车道和检测到的行车轨迹之间存在多个车道(图11中(b)所示为两个)，将行车轨迹的点整体向左平移半个车道宽度，实际得到的是车道线1101，并将车道线1101视为初步确定的RL。这样一来，基于初步确定的RL(实际是车道线1101)和本车道的右车道线初步确定合并点位置如图11中(b)所示C点位置。该初步确定的C点位置与车辆之间的距离较大，如图11中(b)所示，C点并非真实合并点。因此，需要将平移宽度由半个车道宽度按照一定步长进行调整，这里步长可以是一个车道宽度。这里需要将平移宽度由半个车道宽度调整至1.5个车道宽度，即将行车轨迹上的点调整为向左平移1.5个车道宽度，即在首次向左平移半个车道宽度后再向左平移一个车道宽度，得到车道线1102。如图11中(b)所示，经过该次调整，得到车道线1102与HR的交叉点(B)位置仍不满足预设条件，因此，还需调整平移宽度。这里，将行车轨迹上的点调整为向左平移2.5个车道宽度，即在向左平移1.5个车道宽度后再向左平移一个车道宽度，如图11中(b)所示，经过该次调整，得到了实际RL，该RL与HR的交叉点A位置满足预设条件，则该A点位置视为最终的合并点位置。
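上述"按步长逐次增大平移宽度，直至初步合并点满足预设条件"的调整过程，可概括为如下草图(仅为示意，estimate_merge_point为本文假设的回调，表示按给定平移宽度推导邻车道边界并求其与本车道边界交点的过程)：

    import math

    def refine_offset(estimate_merge_point, base_offset, lane_width, threshold, max_lanes=4):
        # estimate_merge_point(offset): 给定平移宽度offset，返回初步合并点(x, y)，无交点时返回None
        offset = base_offset
        for _ in range(max_lanes):
            point = estimate_merge_point(offset)
            if point is not None and math.hypot(point[0], point[1]) <= threshold:
                return point, offset       # 合并点与本车距离不超过第一阈值，视为最终结果
            offset += lane_width           # 否则按一个车道宽度的步长继续向左平移
        return None, offset                # 超出预设车道数仍未满足条件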
本申请实施例提供的道路结构检测方法,能够基于本车道的边界信息和邻车道边界信息确定道路结构信息,其中,边界信息包括车道线信息和/或车道边缘的位置信息。也就是说,本申请实施例中,无需借助高精地图即可确定道路结构信息。因此,在没有高精地图覆盖的区域,车辆仍能够确定道路结构。
本申请实施例可以根据上述方法示例对道路结构检测装置进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图12示出上述实施例中所涉及 的道路结构检测装置的一种可能的结构示意图。如图12所示,道路结构检测装置16包括确定模块161、调整模块162。当然,道路结构检测装置16还可以包括其他模块(比如存储模块),或者道路结构检测装置可以包括更少的模块。
确定模块,用于确定本车道的边界信息,所述本车道的边界信息用于表征当前车道的边界的位置;确定邻车道的边界信息,所述邻车道的边界信息用于表征相邻车道的边界的位置。其中,所述边界信息包括车道线信息和/或车道边缘的位置信息。
确定模块,还用于根据所述本车道的边界信息和所述邻车道的边界信息确定道路结构信息,所述道路结构信息包括所述本车道与所述邻车道的合并点和/或所述本车道与所述邻车道的分离点的位置信息。
在一种可能的设计中,所述邻车道的边界信息是根据检测到的行车轨迹信息和/或道路边界信息确定的,所述道路边界信息用于表征道路边界的位置。
在一种可能的设计中,所述邻车道的边界信息包括以下至少一个:
右邻车道的左车道线RL的位置信息;
左邻车道的右车道线LR的位置信息;
右邻车道的左边缘的位置信息;
左邻车道的右边缘的位置信息。
在一种可能的设计中,所述RL的第一位置由道路边界向左平移第一宽度得到;或者,所述RL的第二位置由行车轨迹向左平移第二宽度得到;或者,RL的第三位置由所述RL的第一位置和所述RL的第二位置经预设算法融合得到。
所述LR的第四位置由道路边界向右平移第三宽度得到;或者,所述LR的第五位置由行车轨迹向右平移第四宽度得到;或者,所述LR的第六位置由所述LR的第四位置和所述LR的第五位置经预设算法融合得到。
所述右邻车道的左边缘的第七位置由道路边界向左平移第五宽度得到;或者,所述左边缘的第八位置由行车轨迹向左平移第六宽度得到;或者,所述左边缘的第九位置由所述左边缘的第七位置和所述左边缘的第八位置经预设算法融合得到。
所述左邻车道的右边缘的第十位置由道路边界向右平移第七宽度得到;或者,所述右边缘的第十一位置由行车轨迹向右平移第八宽度得到;或者,所述右边缘的第十二位置由所述右边缘的第十位置和所述右边缘的第十一位置经预设算法融合得到。
其中,所述第一宽度为车道宽度的整数倍,所述第二宽度为半车道宽度的奇数倍,所述第三宽度为所述车道宽度的整数倍,所述第四宽度为所述半车道宽度的奇数倍,所述第五宽度为所述车道宽度的整数倍,所述第六宽度为所述半车道宽度的奇数倍,所述第七宽度为所述车道宽度的整数倍,所述第八宽度为所述半车道宽度的奇数倍。
在一种可能的设计中,所述合并点与参考点之间的距离小于或等于第一阈值,和/或,所述分离点与所述参考点之间的距离小于或等于所述第一阈值,所述参考点包括车辆。
在一种可能的设计中,调整模块,用于根据所述第一阈值调整所述第一宽度和/或所述第二宽度和/或所述第三宽度和/或所述第四宽度和/或所述第五宽度和/或所述第六宽度和/或所述第七宽度和/或所述第八宽度;基于调整的所述第一宽度和/或所述第二宽度和/或第三宽度和/或第四宽度和/或所述第五宽度和/或所述第六宽度和/或所述 第七宽度和/或所述第八宽度,调整所述合并点;和/或,用于根据所述第一阈值调整所述第一宽度和/或所述第二宽度和/或第三宽度和/或第四宽度和/或所述第五宽度和/或所述第六宽度和/或所述第七宽度和/或所述第八宽度;基于调整的所述第一宽度和/或所述第二宽度和/或第三宽度和/或第四宽度和/或所述第五宽度和/或所述第六宽度和/或所述第七宽度和/或所述第八宽度,调整所述分离点。
在一种可能的设计中,所述第一阈值是通过传感器的感知范围确定的,或者,所述第一阈值为预配置的数值。
在一种可能的设计中,所述邻车道的边界信息是根据道路先验数据确定的,所述道路先验数据包括车道宽度。
参见图13,本申请还提供一种道路结构检测装置10,包括处理器1001。
可选的,道路结构检测装置10还可包括存储器1002。
处理器1001与存储器1002相连接(如通过总线1004相互连接)。
可选的,道路结构检测装置10还可包括收发器1003,收发器1003连接处理器1001和存储器1002,收发器用于接收/发送数据。
处理器1001,可以执行图9所对应的任意一个实施方案及其各种可行的实施方式的操作。比如,用于执行确定模块161、调整模块162的操作,和/或本申请实施例中所描述的其他操作。处理器1001,还用于控制传感器1005,使得传感器1005获取一些感应信息。该传感器1005可以包括在道路结构检测装置10内。也可以是外置传感器。
当道路结构检测装置10包括传感器1005，即传感器1005为道路结构检测装置10的内置传感器时，可选的，传感器1005中可以集成上述方法实施例中的全部数据处理功能。这种情况下，道路结构检测装置10可以不包括处理器1001。传感器1005可以用于执行上述方法实施例，即用于执行确定模块161、调整模块162的操作，和/或本申请实施例中所描述的其他操作。在本申请实施例中，数据处理可以是融合运算。即传感器1005可以将由道路边界信息确定的邻车道边界信息，以及由行车轨迹信息确定的邻车道边界信息进行融合运算，以提升邻车道边界信息的准确性。数据处理还可以是上述方法实施例中的其他数据处理过程，比如当初步确定的合并点（或分离点）与车辆之间的距离大于第一阈值，传感器1005可以用于调整道路边界位置和/或行车轨迹所需平移的宽度。本申请实施例对传感器1005可以执行的具体数据处理功能不进行限制。
如上文,传感器1005可以指视觉传感器(如摄像头)或者雷达之类的传感器。还可以指具有类似功能的其他传感器。
当然,传感器1005还可以不集成数据处理功能,或集成一部分数据处理功能。这种情况下,传感器1005联合处理器可以执行上述方法实施例,道路结构检测装置10需包括传感器1005和处理器1001。
在一种示例中，传感器1005为传统传感器，其并不具备数据处理功能。作为一种可能的实现方式，传感器1005用于确定道路边界信息，和/或行车轨迹信息，即传感器1005仅用于获取上述信息，不执行进一步的数据处理。处理器用于根据道路边界信息，和/或行车轨迹信息，确定邻车道边界信息，并根据邻车道边界信息和本车道边界信息，确定合并点（和/或分离点）位置。其中，当根据道路边界信息以及行车轨迹信息确定邻车道边界信息时，涉及融合运算。
在另一示例中，也可以是传感器1005具备一部分数据处理功能，处理器具备一部分数据处理功能。比如，传感器1005可以根据其采集的道路边界位置，进行数据处理，得到邻车道边界信息。这种情况下，处理器可以用于融合运算等其他数据处理功能。本申请实施例对传感器1005和处理器1001的具体分工（即具体分别处理哪部分数据）不做限定。
关于处理器、存储器、总线和收发器的具体介绍,可参见上文,这里不再赘述。
本申请还提供一种道路结构检测装置,包括非易失性存储介质,以及中央处理器,非易失性存储介质存储有可执行程序,中央处理器与非易失性存储介质连接,并执行可执行程序以实现本申请实施例的道路结构检测方法。
本申请另一实施例还提供一种计算机可读存储介质,该计算机可读存储介质包括一个或多个程序代码,该一个或多个程序包括指令,当处理器在执行该程序代码时,该道路结构检测装置执行如图9所示的道路结构检测方法。
在本申请的另一实施例中,还提供一种计算机程序产品,该计算机程序产品包括计算机执行指令,该计算机执行指令存储在计算机可读存储介质中。道路结构检测装置的至少一个处理器可以从计算机可读存储介质读取该计算机执行指令,至少一个处理器执行该计算机执行指令使得道路结构检测装置实施执行图9所示的道路结构检测方法中相应步骤。
在上述实施例中,可以全部或部分的通过软件,硬件,固件或者其任意组合来实现。当使用软件程序实现时,可以全部或部分地以计算机程序产品的形式出现。计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。
计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质,(例如,软盘,硬盘、磁带)、光介质(例如,DVD)或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些 接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (20)

  1. 一种道路结构检测方法,其特征在于,包括:
    确定本车道的边界信息,所述本车道的边界信息用于表征当前车道的边界的位置;
    确定邻车道的边界信息,所述邻车道的边界信息用于表征相邻车道的边界的位置;
    其中,所述边界信息包括车道线信息和/或车道边缘的位置信息;
    根据所述本车道的边界信息和所述邻车道的边界信息确定道路结构信息,所述道路结构信息包括所述本车道与所述邻车道的合并点和/或所述本车道与所述邻车道的分离点的位置信息。
  2. 根据权利要求1所述的道路结构检测方法,其特征在于,
    所述邻车道的边界信息是根据检测到的行车轨迹信息和/或道路边界信息确定的,所述道路边界信息用于表征道路边界的位置。
  3. 根据权利要求1或2所述的道路结构检测方法,其特征在于,
    所述邻车道的边界信息包括以下至少一个:
    右邻车道的左车道线RL的位置信息;
    左邻车道的右车道线LR的位置信息;
    右邻车道的左边缘的位置信息;
    左邻车道的右边缘的位置信息。
  4. 根据权利要求3所述的道路结构检测方法,其特征在于,
    所述RL的第一位置由道路边界向左平移第一宽度得到;或者,所述RL的第二位置由行车轨迹向左平移第二宽度得到;或者,RL的第三位置由所述RL的第一位置和所述RL的第二位置经预设算法融合得到;
    所述LR的第四位置由道路边界向右平移第三宽度得到;或者,所述LR的第五位置由行车轨迹向右平移第四宽度得到;或者,所述LR的第六位置由所述LR的第四位置和所述LR的第五位置经预设算法融合得到;
    所述右邻车道的左边缘的第七位置由道路边界向左平移第五宽度得到;或者,所述左边缘的第八位置由行车轨迹向左平移第六宽度得到;或者,所述左边缘的第九位置由所述左边缘的第七位置和所述左边缘的第八位置经预设算法融合得到;
    所述左邻车道的右边缘的第十位置由道路边界向右平移第七宽度得到;或者,所述右边缘的第十一位置由行车轨迹向右平移第八宽度得到;或者,所述右边缘的第十二位置由所述右边缘的第十位置和所述右边缘的第十一位置经预设算法融合得到;
    其中,所述第一宽度为车道宽度的整数倍,所述第二宽度为半车道宽度的奇数倍,所述第三宽度为所述车道宽度的整数倍,所述第四宽度为所述半车道宽度的奇数倍,所述第五宽度为所述车道宽度的整数倍,所述第六宽度为所述半车道宽度的奇数倍,所述第七宽度为所述车道宽度的整数倍,所述第八宽度为所述半车道宽度的奇数倍。
  5. 根据权利要求1至4中任一项所述的道路结构检测方法,其特征在于,
    所述合并点与参考点之间的距离小于或等于第一阈值,和/或,所述分离点与所述参考点之间的距离小于或等于所述第一阈值,所述参考点包括当前车辆。
  6. 根据权利要求4或5所述的道路结构检测方法,其特征在于,所述方法还包括:
    根据所述第一阈值调整至少一个所述宽度;
    根据至少一个所述宽度调整所述合并点和/或所述分离点。
  7. 根据权利要求5或6所述的道路结构检测方法,其特征在于,所述第一阈值是通过传感器的感知范围确定的,或者,所述第一阈值为预配置的数值。
  8. 根据权利要求1至7中任一项所述的道路结构检测方法,其特征在于,所述邻车道的边界信息是根据道路先验数据确定的,所述道路先验数据包括车道宽度。
  9. 一种道路结构检测装置,其特征在于,包括:
    处理器,用于确定本车道的边界信息,所述本车道的边界信息用于表征当前车道的边界的位置;确定邻车道的边界信息,所述邻车道的边界信息用于表征相邻车道的边界的位置;其中,所述边界信息包括车道线信息和/或车道边缘的位置信息;
    所述处理器,还用于根据所述本车道的边界信息和所述邻车道的边界信息确定道路结构信息,所述道路结构信息包括所述本车道与所述邻车道的合并点和/或所述本车道与所述邻车道的分离点的位置信息。
  10. 根据权利要求9所述的道路结构检测装置,其特征在于,所述邻车道的边界信息是根据检测到的行车轨迹信息和/或道路边界信息确定的,所述道路边界信息用于表征道路边界的位置。
  11. 根据权利要求9或10所述的道路结构检测装置,其特征在于,所述邻车道的边界信息包括以下至少一个:
    右邻车道的左车道线RL的位置信息;
    左邻车道的右车道线LR的位置信息;
    右邻车道的左边缘的位置信息;
    左邻车道的右边缘的位置信息。
  12. 根据权利要求11所述的道路结构检测装置,其特征在于,
    所述RL的第一位置由道路边界向左平移第一宽度得到;或者,所述RL的第二位置由行车轨迹向左平移第二宽度得到;或者,RL的第三位置由所述RL的第一位置和所述RL的第二位置经预设算法融合得到;
    所述LR的第四位置由道路边界向右平移第三宽度得到;或者,所述LR的第五位置由行车轨迹向右平移第四宽度得到;或者,所述LR的第六位置由所述LR的第四位置和所述LR的第五位置经预设算法融合得到;
    所述右邻车道的左边缘的第七位置由道路边界向左平移第五宽度得到;或者,所述左边缘的第八位置由行车轨迹向左平移第六宽度得到;或者,所述左边缘的第九位置由所述左边缘的第七位置和所述左边缘的第八位置经预设算法融合得到;
    所述左邻车道的右边缘的第十位置由道路边界向右平移第七宽度得到;或者,所述右边缘的第十一位置由行车轨迹向右平移第八宽度得到;或者,所述右边缘的第十二位置由所述右边缘的第十位置和所述右边缘的第十一位置经预设算法融合得到;
    其中,所述第一宽度为车道宽度的整数倍,所述第二宽度为半车道宽度的奇数倍,所述第三宽度为所述车道宽度的整数倍,所述第四宽度为所述半车道宽度的奇数倍,所述第五宽度为所述车道宽度的整数倍,所述第六宽度为所述半车道宽度的奇数倍,所述第七宽度为所述车道宽度的整数倍,所述第八宽度为所述半车道宽度的奇数倍。
  13. 根据权利要求9至12中任一项所述的道路结构检测装置,其特征在于,
    所述合并点与参考点之间的距离小于或等于第一阈值,和/或,所述分离点与所述参考点之间的距离小于或等于所述第一阈值,所述参考点包括车辆。
  14. 根据权利要求12或13所述的道路结构检测装置,其特征在于,所述处理器,还用于根据所述第一阈值调整至少一个所述宽度;以及用于根据至少一个所述宽度调整所述合并点和/或所述分离点。
  15. 根据权利要求13或14所述的道路结构检测装置,其特征在于,所述第一阈值是通过传感器的感知范围确定的,或者,所述第一阈值为预配置的数值。
  16. 根据权利要求9至15中任一项所述的道路结构检测装置,其特征在于,所述邻车道的边界信息是根据道路先验数据确定的,所述道路先验数据包括车道宽度。
  17. 一种计算机可读存储介质,其特征在于,包括程序或指令,当所述程序或指令在计算机上运行时,如权利要求1至8中任一项所述的道路结构检测方法被实现。
  18. 一种芯片系统,其特征在于,包括处理器和通信接口,所述处理器用于实现权利要求1至8任意一项所述的道路结构检测方法。
  19. 一种电路系统,其特征在于,所述电路系统包括处理电路,所述处理电路配置为执行如权利要求1至8任意一项所述的道路结构检测方法。
  20. 一种道路结构检测装置,其特征在于,包括处理器和存储器;
    所述存储器用于存储计算机执行指令,当所述装置运行时,所述处理器执行所述存储器存储的所述计算机执行指令,以使所述装置执行如权利要求1-8中任意一项所述的道路结构检测方法。
PCT/CN2020/134225 2019-12-06 2020-12-07 道路结构检测方法及装置 WO2021110166A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
BR112022010922A BR112022010922A2 (pt) 2019-12-06 2020-12-07 Método de detecção de estrutura de estrada, dispositivo, meio de armazenamento legível por computador e sistema de chip
EP20895891.8A EP4059799A4 (en) 2019-12-06 2020-12-07 METHOD AND DEVICE FOR RECOGNIZING ROAD STRUCTURES
US17/833,456 US20220309806A1 (en) 2019-12-06 2022-06-06 Road structure detection method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911245257.2 2019-12-06
CN201911245257.2A CN113022573B (zh) 2019-12-06 2019-12-06 道路结构检测方法及装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/833,456 Continuation US20220309806A1 (en) 2019-12-06 2022-06-06 Road structure detection method and apparatus

Publications (1)

Publication Number Publication Date
WO2021110166A1 true WO2021110166A1 (zh) 2021-06-10

Family

ID=76221510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/134225 WO2021110166A1 (zh) 2019-12-06 2020-12-07 道路结构检测方法及装置

Country Status (5)

Country Link
US (1) US20220309806A1 (zh)
EP (1) EP4059799A4 (zh)
CN (1) CN113022573B (zh)
BR (1) BR112022010922A2 (zh)
WO (1) WO2021110166A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4020428A4 (en) * 2019-08-28 2022-10-12 Huawei Technologies Co., Ltd. LANE RECOGNITION METHOD AND APPARATUS, AND COMPUTER DEVICE
CN116030286B (zh) * 2023-03-29 2023-06-16 高德软件有限公司 边界车道线匹配方法、装置、电子设备及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104149783A (zh) * 2014-08-27 2014-11-19 刘红华 一种数字公路及其自动驾驶车辆
CN104554259A (zh) * 2013-10-21 2015-04-29 财团法人车辆研究测试中心 主动式自动驾驶辅助系统与方法
JP2015102893A (ja) * 2013-11-21 2015-06-04 日産自動車株式会社 合流支援システム
JP2018044833A (ja) * 2016-09-14 2018-03-22 日産自動車株式会社 自動運転支援方法および装置
DE102018007298A1 (de) * 2018-09-14 2019-03-28 Daimler Ag Verfahren zur Routenplanung
CN110361021A (zh) * 2018-09-30 2019-10-22 长城汽车股份有限公司 车道线拟合方法及系统
CN110361015A (zh) * 2018-09-30 2019-10-22 长城汽车股份有限公司 道路特征点提取方法及系统
CN110386065A (zh) * 2018-04-20 2019-10-29 比亚迪股份有限公司 车辆盲区的监控方法、装置、计算机设备及存储介质

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5066123B2 (ja) * 2009-03-24 2012-11-07 日立オートモティブシステムズ株式会社 車両運転支援装置
JP5852920B2 (ja) * 2012-05-17 2016-02-03 クラリオン株式会社 ナビゲーション装置
WO2016027270A1 (en) * 2014-08-18 2016-02-25 Mobileye Vision Technologies Ltd. Recognition and prediction of lane constraints and construction areas in navigation
US9494438B1 (en) * 2015-12-15 2016-11-15 Honda Motor Co., Ltd. System and method for verifying map data for a vehicle
US10670416B2 (en) * 2016-12-30 2020-06-02 DeepMap Inc. Traffic sign feature creation for high definition maps used for navigating autonomous vehicles
JP6653300B2 (ja) * 2017-09-15 2020-02-26 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
JP6614509B2 (ja) * 2017-10-05 2019-12-04 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
CN109186615A (zh) * 2018-09-03 2019-01-11 武汉中海庭数据技术有限公司 基于高精度地图的车道边线距离检测方法、装置及存储介质
CN110160552B (zh) * 2019-05-29 2021-05-04 百度在线网络技术(北京)有限公司 导航信息确定方法、装置、设备和存储介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104554259A (zh) * 2013-10-21 2015-04-29 财团法人车辆研究测试中心 主动式自动驾驶辅助系统与方法
JP2015102893A (ja) * 2013-11-21 2015-06-04 日産自動車株式会社 合流支援システム
CN104149783A (zh) * 2014-08-27 2014-11-19 刘红华 一种数字公路及其自动驾驶车辆
JP2018044833A (ja) * 2016-09-14 2018-03-22 日産自動車株式会社 自動運転支援方法および装置
CN110386065A (zh) * 2018-04-20 2019-10-29 比亚迪股份有限公司 车辆盲区的监控方法、装置、计算机设备及存储介质
DE102018007298A1 (de) * 2018-09-14 2019-03-28 Daimler Ag Verfahren zur Routenplanung
CN110361021A (zh) * 2018-09-30 2019-10-22 长城汽车股份有限公司 车道线拟合方法及系统
CN110361015A (zh) * 2018-09-30 2019-10-22 长城汽车股份有限公司 道路特征点提取方法及系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4059799A4

Also Published As

Publication number Publication date
US20220309806A1 (en) 2022-09-29
BR112022010922A2 (pt) 2022-09-06
CN113022573B (zh) 2022-11-04
EP4059799A4 (en) 2023-03-01
CN113022573A (zh) 2021-06-25
EP4059799A1 (en) 2022-09-21

Similar Documents

Publication Publication Date Title
WO2021135371A1 (zh) 一种自动驾驶方法、相关设备及计算机可读存储介质
WO2021000800A1 (zh) 道路可行驶区域推理方法及装置
WO2022001773A1 (zh) 轨迹预测方法及装置
WO2021103511A1 (zh) 一种设计运行区域odd判断方法、装置及相关设备
CN113792566B (zh) 一种激光点云的处理方法及相关设备
CN113968216B (zh) 一种车辆碰撞检测方法、装置及计算机可读存储介质
WO2021196879A1 (zh) 车辆驾驶行为的识别方法以及识别装置
EP4029750A1 (en) Data presentation method and terminal device
WO2021147748A1 (zh) 一种自动驾驶方法及相关设备
US12001517B2 (en) Positioning method and apparatus
US20220309806A1 (en) Road structure detection method and apparatus
WO2022062825A1 (zh) 车辆的控制方法、装置及车辆
WO2022051951A1 (zh) 车道线检测方法、相关设备及计算机可读存储介质
CN112512887A (zh) 一种行驶决策选择方法以及装置
EP4307251A1 (en) Mapping method, vehicle, computer readable storage medium, and chip
CN113885045A (zh) 车道线的检测方法和装置
WO2022052881A1 (zh) 一种构建地图的方法及计算设备
WO2021217447A1 (zh) 一种车辆通过道闸横杆的方法及装置
CN115398272A (zh) 检测车辆可通行区域的方法及装置
CN116135654A (zh) 一种车辆行驶速度生成方法以及相关设备
WO2022022284A1 (zh) 目标物的感知方法及装置
WO2021159397A1 (zh) 车辆可行驶区域的检测方法以及检测装置
CN114764980B (zh) 一种车辆转弯路线规划方法及装置
CN115508841A (zh) 一种路沿检测的方法和装置
WO2022001432A1 (zh) 推理车道的方法、训练车道推理模型的方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20895891

Country of ref document: EP

Kind code of ref document: A1

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112022010922

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 2020895891

Country of ref document: EP

Effective date: 20220615

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 112022010922

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20220603