WO2021217646A1 - Method and device for detecting a free space for a vehicle

Method and device for detecting a free space for a vehicle

Info

Publication number
WO2021217646A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
boundary grid
grid unit
boundary
unit
Application number
PCT/CN2020/088473
Other languages
English (en)
Chinese (zh)
Inventor
李选富
陈海
吴祖光
郑凯
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Priority to CN202080099748.5A
Priority to PCT/CN2020/088473
Publication of WO2021217646A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles

Definitions

  • This application relates to the field of automatic driving, and in particular to a method and device for detecting a passable area of a vehicle.
  • Mobile robots, such as autonomous vehicles, can use the data obtained by visual sensors (such as cameras) and radar sensors (such as lidar and millimeter-wave radar) to detect the passable area (free space) in the surrounding environment, that is, the area where the autonomous vehicle can travel. The self-driving vehicle can then plan its driving path according to the detected passable area, so as to realize automatic driving of the vehicle.
  • The other approach is to use the data obtained by the sensors to determine the passable area, then detect road edge information, such as road edges and lane lines, from the information obtained by the sensors, and combine the location of the vehicle with a high-precision map to determine the structure of the roads surrounding the vehicle. Finally, according to the detected road edge information and the surrounding road structure, the passable area is cut, so as to remove narrow passable regions caused by areas outside the road, green belts, and the like.
  • the present application provides a method and device for detecting a vehicle passable area.
  • In this application, a boundary grid unit is determined from a plurality of grid units, and a target boundary grid unit is then determined according to the boundary grid unit, so that areas through which the vehicle actually cannot pass are cut out of the vehicle's passable area and the boundary of the passable area is obtained, thereby improving the accuracy of detecting the passable area of the vehicle.
  • the present application provides a method for detecting a vehicle passable area, which relates to the field of automatic driving.
  • the method includes: determining at least two boundary grid units from a plurality of grid units, where each of the at least two boundary grid units is the grid unit in which the obstacle closest to the vehicle at a given azimuth angle is located, and the multiple grid units are obtained by dividing the surrounding area where the vehicle is located.
  • a target boundary grid unit is determined, and the target boundary grid unit refers to a grid unit where the boundary of the vehicle's passable area is located.
  • the area between the target boundary grid unit and the vehicle is determined as the vehicle's passable area.
  • In the above technical solution, the present application can determine at least two boundary grid units from multiple grid units and then determine the target boundary grid unit based on the at least two boundary grid units, thereby updating the boundary grid units to obtain the target boundary grid unit. The area between the target boundary grid unit and the vehicle is then determined as the vehicle's passable area.
  • In this way, the target boundary grid unit can be used to cut the area between the boundary grid units and the vehicle, removing areas through which the vehicle actually cannot pass and improving the accuracy of vehicle passable area detection.
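  • As an illustration of how boundary grid units could be selected, the following Python sketch (not code from this application; the function name and the 0.5° / 0.2 m bin sizes are assumptions drawn from the examples later in the text) keeps, for each azimuth bin, the grid cell of the obstacle closest to the vehicle:

    import math

    ANGLE_STEP_DEG = 0.5   # assumed angular interval
    RANGE_STEP_M = 0.2     # assumed radial interval

    def boundary_cells(obstacle_points):
        # obstacle_points: iterable of (azimuth_deg, range_m) in the vehicle's
        # polar frame; returns {azimuth bin: range bin of the nearest obstacle}.
        nearest = {}
        for azimuth_deg, range_m in obstacle_points:
            a_bin = int(math.floor(azimuth_deg / ANGLE_STEP_DEG)) % int(360 / ANGLE_STEP_DEG)
            r_bin = int(math.floor(range_m / RANGE_STEP_M))
            if a_bin not in nearest or r_bin < nearest[a_bin]:
                nearest[a_bin] = r_bin
        return nearest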
  • determining the target boundary grid unit according to the at least two boundary grid units includes: determining the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit.
  • the first boundary grid unit is any one of the at least two boundary grid units; the distance between the second boundary grid unit and the vehicle is less than or equal to the distance between the first boundary grid unit and the vehicle; and, in a preset direction, the second boundary grid unit is the boundary grid unit closest to the first boundary grid unit.
  • The present application can determine any one of the at least two boundary grid units as the first boundary grid unit, and for each first boundary grid unit, the unique second boundary grid unit corresponding to it can be determined. Therefore, determining the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit allows the boundary grid units to be updated, removing as far as possible, from the determined area, the areas through which the vehicle actually cannot pass, and improving the accuracy of vehicle passable area detection.
  • determining the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit includes: comparing the distance between the first boundary grid unit and the second boundary grid unit with a preset distance threshold to determine the target boundary grid unit.
  • The solution of determining the target boundary grid unit according to the preset distance threshold in this application can ensure, as far as possible, that the vehicle can actually pass through the passable area determined according to the target boundary grid units, which improves the accuracy of vehicle passable area detection.
  • Determining the target boundary grid unit by comparing the distance between the first boundary grid unit and the second boundary grid unit with the preset distance threshold includes the following. When the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, the first boundary grid unit and the third boundary grid unit are determined as target boundary grid units, where the distance between the third boundary grid unit and the vehicle is equal to the distance between the first boundary grid unit and the vehicle, and the third boundary grid unit is located between the first boundary grid unit and the second boundary grid unit in the preset direction. When the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, the first boundary grid unit and the second boundary grid unit are determined as target boundary grid units.
  • In this way, after determining the vehicle's passable area according to the boundary grid units, the present application also eliminates, according to the preset distance threshold, regions of the passable area whose width is less than the preset distance threshold, that is, regions through which the vehicle actually cannot pass; the boundary grid units are thereby updated to the target boundary grid units used to determine the vehicle's passable area, improving the accuracy of vehicle passable area detection.
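  • The threshold comparison described above can be sketched as follows. This is a hedged illustration, not the claimed procedure itself: it assumes the azimuth-bin representation from the previous sketch, takes a counterclockwise scan as the preset direction, and omits wrap-around at 360° for brevity:

    import math

    def trim_narrow_passages(nearest, width_threshold_m,
                             angle_step_deg=0.5, range_step_m=0.2):
        bins = sorted(nearest)          # boundary cells in the preset direction
        target = dict(nearest)          # start from the raw boundary cells
        for i, a_bin in enumerate(bins):
            r1 = nearest[a_bin] * range_step_m      # first boundary cell's range
            for j in range(i + 1, len(bins)):
                r2 = nearest[bins[j]] * range_step_m
                if r2 > r1:
                    continue            # second cell must be no farther than the first
                # straight-line gap between the two cells (law of cosines)
                dtheta = math.radians((bins[j] - a_bin) * angle_step_deg)
                gap = math.sqrt(r1 * r1 + r2 * r2 - 2.0 * r1 * r2 * math.cos(dtheta))
                if gap <= width_threshold_m:
                    # passage too narrow: the "third" cells between the two, at the
                    # first cell's distance, become target boundary cells
                    for k_bin in bins[i + 1:j]:
                        target[k_bin] = min(target[k_bin], nearest[a_bin])
                break                   # only the closest qualifying second cell counts
        return target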
  • a grid map is generated according to the passable area and the impassable area of the vehicle.
  • the impassable area refers to an area where a grid unit whose distance from the vehicle is greater than a first distance is located, and the first distance refers to the distance between the target boundary grid unit and the vehicle.
  • the grid map generated according to the passable area and the impassable area of the vehicle can more intuitively indicate the passable area and the impassable area of the vehicle, and it is convenient to view the passable area and the impassable area of the vehicle.
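  • A toy rendering of such a grid map, under the same assumed representation as the sketches above (0 marks a passable cell between the vehicle and the target boundary, 1 an impassable cell at or beyond it; the 0/1 encoding is illustrative):

    def grid_map(target, num_range_bins):
        rows = {}
        for a_bin, boundary_bin in target.items():
            b = min(boundary_bin, num_range_bins)   # clamp to the map's extent
            rows[a_bin] = [0] * b + [1] * (num_range_bins - b)
        return rows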
  • position information of obstacles around the vehicle is determined first.
  • the position information of the obstacle may include the azimuth angle of the obstacle relative to the vehicle, and the distance between the obstacle and the vehicle.
  • determining the location information of the obstacles around the vehicle includes: determining the location information of the obstacles around the vehicle according to external data collected by sensors on the vehicle.
  • the external data refers to the data outside the vehicle collected by the sensor.
  • In this way, this application eliminates from the data collected by the sensors on the vehicle any points that belong to the vehicle itself, keeping only external data. This reduces the possibility that a point on the vehicle collected by a sensor is determined to be the obstacle closest to the vehicle at that point's azimuth angle and thus used as the boundary of the vehicle's passable area, improving the accuracy of detecting the vehicle's passable area.
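  • One simple way to realize this filtering, sketched under the assumption of an axis-aligned box around the ego vehicle (the footprint dimensions below are placeholders, not values from this application):

    # assumed ego footprint in the vehicle's rectangular frame, in meters
    HALF_LENGTH_M, HALF_WIDTH_M, MAX_HEIGHT_M = 2.5, 1.0, 2.0

    def keep_external_points(points_xyz):
        # drop returns that fall on the vehicle body itself, keeping external data
        return [
            (x, y, z) for (x, y, z) in points_xyz
            if not (abs(x) <= HALF_LENGTH_M and abs(y) <= HALF_WIDTH_M
                    and 0.0 <= z <= MAX_HEIGHT_M)
        ]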
  • the sensor used to collect external data includes one or more of lidar, millimeter wave radar, or vision sensor.
  • The detection method of the passable area of a vehicle in this application has good versatility because it can use data collected by multiple kinds of sensors to determine the passable area of the vehicle, which can further improve the accuracy of detecting the passable area of the vehicle.
  • the aforementioned preset direction is a clockwise direction or a counterclockwise direction.
  • In the process of determining the target boundary grid unit, if at the same azimuth angle as a boundary grid unit there is a target boundary grid unit whose distance from the vehicle is smaller, then that boundary grid unit is determined to be a boundary grid unit that does not need to be processed. That is to say, in the subsequent process of determining target boundary grid units, this boundary grid unit can be regarded as a grid unit in the vehicle's impassable area, and it is no longer used as the first boundary grid unit to determine the target boundary grid unit.
  • The present application can thereby reduce the number of boundary grid units that need to be processed in the course of determining the target boundary grid unit, reducing the amount of calculation in detecting the passable area of the vehicle and improving the efficiency of the detection.
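  • Under the same assumed representation as the earlier sketches, this skipping rule could look like the following, where boundary cells lying behind a closer target boundary cell at the same azimuth are dropped from further processing:

    def prune_processed(nearest, target):
        # keep only boundary cells not dominated by a closer target boundary cell
        return {a_bin: r_bin for a_bin, r_bin in nearest.items()
                if r_bin <= target.get(a_bin, r_bin)}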
  • the present application provides a device for detecting a vehicle passable area, which relates to the field of automatic driving.
  • the device includes: a processing unit configured to determine at least two boundary grid units from a plurality of grid units, where each of the at least two boundary grid units is the grid unit in which the obstacle closest to the vehicle at a given azimuth angle is located, and the multiple grid units are obtained by dividing the surrounding area where the vehicle is located.
  • the processing unit is further configured to determine a target boundary grid unit based on the at least two boundary grid units, where the target boundary grid unit refers to a grid unit where the boundary of the passable area of the vehicle is located.
  • the determining unit is used to determine the area between the target boundary grid unit and the vehicle as the passable area of the vehicle.
  • the processing unit being configured to determine the target boundary grid unit based on the at least two boundary grid units includes: the processing unit being configured to determine the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit.
  • the first boundary grid unit is any one of the at least two boundary grid units; the distance between the second boundary grid unit and the vehicle is less than or equal to the distance between the first boundary grid unit and the vehicle; and, in a preset direction, the second boundary grid unit is the boundary grid unit closest to the first boundary grid unit.
  • the processing unit being configured to determine the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit includes: the processing unit being configured to compare the distance between the first boundary grid unit and the second boundary grid unit with a preset distance threshold to determine the target boundary grid unit.
  • the processing unit being configured to determine the target boundary grid unit by comparing the distance between the first boundary grid unit and the second boundary grid unit with the preset distance threshold includes the following. The processing unit is configured to, when the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, determine the first boundary grid unit and the third boundary grid unit as target boundary grid units, where the distance between the third boundary grid unit and the vehicle is equal to the distance between the first boundary grid unit and the vehicle, and the third boundary grid unit is located between the first boundary grid unit and the second boundary grid unit in the preset direction. The processing unit is configured to, when the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, determine the first boundary grid unit and the second boundary grid unit as target boundary grid units.
  • after the processing unit determines the target boundary grid unit based on the at least two boundary grid units, the generating unit is configured to generate a grid map based on the passable area and the impassable area of the vehicle.
  • the impassable area refers to an area where a grid unit whose distance from the vehicle is greater than a first distance is located, and the first distance refers to the distance between the target boundary grid unit and the vehicle.
  • before the processing unit is configured to determine at least two boundary grid units from a plurality of grid units, the processing unit is configured to determine location information of obstacles around the vehicle.
  • the processing unit being configured to determine the location information of obstacles around the vehicle includes: the processing unit being configured to determine the location information of the obstacles around the vehicle according to external data collected by sensors on the vehicle.
  • the external data refers to data outside the vehicle collected by the sensor.
  • the aforementioned sensor includes one or more of laser radar, millimeter wave radar, or vision sensor.
  • the aforementioned preset direction is a clockwise direction or a counterclockwise direction.
  • the present application provides a device for detecting a vehicle passable area, the device comprising: a processor and a memory; wherein the memory is used to store computer program instructions, and the processor runs the computer program instructions to make the device for detecting a vehicle passable area execute the method for detecting a vehicle passable area described in the first aspect.
  • the present application provides a computer-readable storage medium, including computer instructions, which, when executed by a processor, cause the device for detecting a vehicle passable area to execute the method for detecting a vehicle passable area described in the first aspect.
  • the present application provides a computer program product which, when run on a processor, causes the device for detecting a vehicle passable area to execute the method for detecting a vehicle passable area described in the first aspect.
  • FIG. 1 is a first structural diagram of a vehicle provided by an embodiment of the application
  • FIG. 2 is a second structural diagram of a vehicle provided by an embodiment of the application.
  • FIG. 3 is a schematic structural diagram of a computer system provided by an embodiment of this application.
  • FIG. 4 is a schematic diagram 1 of the application of a cloud-side command automatic driving vehicle provided by an embodiment of this application;
  • FIG. 5 is a second schematic diagram of the application of a cloud-side command automatic driving vehicle provided by an embodiment of this application;
  • FIG. 6 is a schematic structural diagram of a computer program product provided by an embodiment of the application.
  • FIG. 7 is a schematic flowchart of a method for detecting a vehicle passable area provided by an embodiment of this application.
  • FIG. 8 is a schematic diagram of a plurality of grid units provided by an embodiment of this application.
  • FIG. 9 is a first schematic diagram of a boundary grid unit provided by an embodiment of this application.
  • FIG. 10 is a second schematic diagram of a boundary grid unit provided by an embodiment of this application.
  • FIG. 11 is a schematic diagram of a distance between boundary grid cells according to an embodiment of the application.
  • FIG. 12 is a first schematic diagram of a grid diagram provided by an embodiment of this application.
  • FIG. 13 is a second schematic diagram of a grid diagram provided by an embodiment of this application.
  • FIG. 14 is a schematic diagram of a concentric circle provided by an embodiment of the application.
  • FIG. 15 is a schematic diagram of a device for detecting a vehicle passable area provided by an embodiment of the application.
  • the embodiments of the present application provide a method and device for detecting a vehicle passable area.
  • the method is applied in a vehicle or other equipment (such as a cloud server, a mobile phone terminal, etc.) having a function of controlling a vehicle.
  • the vehicle or other equipment can implement the method for detecting the passable area of the vehicle provided by the embodiments of the present application through its components (including hardware and software), detecting the surrounding environment of the vehicle according to the data collected by sensors to determine the vehicle's passable area, so that the vehicle can plan its travel path according to the passable area.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the application.
  • the vehicle 100 may be an autonomous driving vehicle.
  • the vehicle 100 detects the passable area of the vehicle according to the data collected by the sensor, and determines the passable area of the vehicle, so that the vehicle can plan the travel path of the vehicle according to the passable area.
  • the vehicle 100 may include various subsystems, such as a travel system 110, a sensor system 120, a control system 130, one or more peripheral devices 140 and a power supply 150, a computer system 160, and a user interface 170.
  • the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each of the subsystems and elements of the vehicle 100 may be wired or wirelessly interconnected.
  • the travel system 110 may include components that provide power movement for the vehicle 100.
  • the travel system 110 may include an engine 111, a transmission 112, an energy source 113 and wheels 114.
  • the engine 111 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, such as a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air compression engine.
  • the engine 111 converts the energy source 113 into mechanical energy.
  • Examples of the energy source 113 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 113 may also provide energy for other systems of the vehicle 100.
  • the transmission device 112 can transmit the mechanical power from the engine 111 to the wheels 114.
  • the transmission 112 may include a gearbox, a differential, and a drive shaft.
  • the transmission device 112 may also include other components, such as a clutch.
  • the drive shaft may include one or more shafts that may be coupled to one or more wheels 114.
  • the sensor system 120 may include several sensors that sense information about the environment around the vehicle 100.
  • the sensor system 120 may include a positioning system 121 (the positioning system may be a global positioning system (GPS), a Beidou system or other positioning systems), an inertial measurement unit (IMU) 122, and a radar 123, Lidar 124, and camera 125.
  • the sensor system 120 may also include sensors that monitor the internal systems of the vehicle 100 (for example, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, direction, speed, etc.). Such detection and recognition is a key function for the safe operation of the automatic driving of the vehicle 100.
  • the positioning system 121 can be used to estimate the geographic location of the vehicle 100.
  • the IMU 122 is used to sense the position and orientation change of the vehicle 100 based on the inertial acceleration.
  • the IMU 122 may be a combination of an accelerometer and a gyroscope.
  • the radar 123 may use radio signals to sense objects in the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing the object, the radar 123 may also be used to sense the speed and/or direction of the object.
  • the lidar 124 can use laser light to sense objects in the environment where the vehicle 100 is located.
  • the lidar 124 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • the camera 125 may be used to capture multiple images of the surrounding environment of the vehicle 100 and multiple images in the cockpit of the vehicle.
  • the camera 125 may be a still camera or a video camera.
  • the control system 130 may control the operation of the vehicle 100 and its components.
  • the control system 130 may include various components, including a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
  • the steering system 131 is operable to adjust the forward direction of the vehicle 100.
  • it may be a steering wheel system.
  • the throttle 132 is used to control the operating speed of the engine 111 and thereby control the speed of the vehicle 100.
  • the braking unit 133 is used to control the vehicle 100 to decelerate.
  • the braking unit 133 may use friction to slow down the wheels 114.
  • the braking unit 133 may convert the kinetic energy of the wheels 114 into electric current.
  • the braking unit 133 may also take other forms to slow down the rotation speed of the wheels 114 to control the speed of the vehicle 100.
  • the computer vision system 134 may be operable to process and analyze the images captured by the camera 125 in order to identify objects and/or features in the surrounding environment of the vehicle 100 as well as the physical and facial features of the driver in the cockpit of the vehicle.
  • the objects and/or features may include traffic signals, road conditions, and obstacles, and the driver's physical and facial features include the driver's behavior, line of sight, expression, and the like.
  • the computer vision system 134 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision technologies.
  • the computer vision system 134 can be used to map the environment, track objects, estimate the speed of objects, determine driver behavior, face recognition, and so on.
  • the route control system 135 is used to determine the travel route of the vehicle 100.
  • the route control system 135 may combine data from sensors, the positioning system 121, and one or more predetermined maps to determine a travel route for the vehicle 100.
  • the obstacle avoidance system 136 is used to identify, evaluate, and avoid or otherwise negotiate potential obstacles in the environment of the vehicle 100.
  • of course, the control system 130 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
  • the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through the peripheral device 140.
  • the peripheral device 140 may include a wireless communication system 141, an onboard computer 142, a microphone 143, and/or a speaker 144.
  • the peripheral device 140 provides a means for the user of the vehicle 100 to interact with the user interface 170.
  • the onboard computer 142 may provide information to the user of the vehicle 100.
  • the user interface 170 can also operate the on-board computer 142 to receive user input.
  • the on-board computer 142 can be operated through a touch screen.
  • the peripheral device 140 may provide a means for the vehicle 100 to communicate with other devices located in the vehicle.
  • the microphone 143 may receive audio (eg, voice commands or other audio input) from the user of the vehicle 100.
  • the speaker 144 may output audio to the user of the vehicle 100.
  • the wireless communication system 141 may wirelessly communicate with one or more devices directly or via a communication network.
  • the wireless communication system 141 may use 3G cellular communication, such as CDMA, EVDO, or GSM/GPRS; 4G cellular communication, such as LTE; or 5G cellular communication.
  • the wireless communication system 141 may use WiFi to communicate with a wireless local area network (WLAN).
  • the wireless communication system 141 may directly communicate with the device using an infrared link, Bluetooth, or ZigBee.
  • the wireless communication system 141 may also support other wireless protocols, such as various vehicle communication systems; for example, it may include one or more dedicated short range communications (DSRC) devices.
  • the power supply 150 may provide power to various components of the vehicle 100.
  • the power source 150 may be a rechargeable lithium ion or lead-acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100.
  • the power source 150 and the energy source 113 may be implemented together, such as in some all-electric vehicles.
  • the computer system 160 may include at least one processor 161 that executes instructions 1621 stored in a non-transitory computer readable medium such as a data storage device 162.
  • the computer system 160 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 161 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, the memory, and other elements in the same physical housing, those of ordinary skill in the art should understand that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories within the same physical housing, or multiple processors, computer systems, or memories that are not located in the same physical housing.
  • the memory may be a hard drive, or other storage medium located in a different physical enclosure.
  • a reference to a processor or computer system will be understood to include a reference to a collection of processors or computer systems or memories that may operate in parallel, or a reference to a collection of processors or computer systems or memories that may not operate in parallel.
  • some components, such as steering components and deceleration components, may each have their own processor that only performs calculations related to component-specific functions.
  • the processor may be located away from the vehicle and wirelessly communicate with the vehicle.
  • some of the processes described herein are executed on a processor disposed in the vehicle and others are executed by a remote processor, including taking the necessary steps to perform a single manipulation.
  • the data storage device 162 may include instructions 1621 (eg, program logic), which may be executed by the processor 161 to perform various functions of the vehicle 100, including those described above.
  • the data storage device 162 may also contain additional instructions, including sending data to, receiving data from, interacting with, and/or performing data on one or more of the traveling system 110, the sensor system 120, the control system 130, and the peripheral device 140. Control instructions.
  • the data storage device 162 may also store data, such as road maps, route information, the location, direction, and speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 160 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • the data storage device 162 may store obstacle information obtained by the vehicle based on the sensors in the sensor system 120, such as the locations of obstacles like other vehicles, road edges, and green belts, the distance between each obstacle and the vehicle, and the distances between obstacles.
  • the data storage device 162 may also obtain environmental information from the sensor system 120 or other components of the vehicle 100. The environmental information may be, for example, whether there are green belts, lanes, pedestrians, and the like near the current environment of the vehicle, or what the vehicle calculates, through a machine learning algorithm, to be present near its current environment.
  • the data storage device 162 can also store the state information of the vehicle itself and the state information of other vehicles that interact with it. The state information of the vehicle includes, but is not limited to, the vehicle's position, speed, acceleration, heading angle, and so on.
  • the processor 161 can obtain this information from the data storage device 162, determine the passable area of the vehicle based on the environmental information of the environment in which the vehicle is located, the state information of the vehicle itself, the state information of other vehicles, and so on, and, based on the passable area, determine the final driving strategy to control the vehicle 100 to drive automatically.
  • the user interface 170 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 170 may interact with one or more input/output devices in the set of peripheral devices 140, such as one or more of the wireless communication system 141, the onboard computer 142, the microphone 143, and the speaker 144.
  • the computer system 160 may control the vehicle 100 based on information acquired from various subsystems (for example, the traveling system 110, the sensor system 120, and the control system 130) and the information received from the user interface 170. For example, the computer system 160 may control the steering system 131 to change the forward direction of the vehicle according to the information from the control system 130, so as to avoid obstacles detected by the sensor system 120 and the obstacle avoidance system 136. In some embodiments, the computer system 160 can control many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • the data storage device 162 may exist partially or completely separately from the vehicle 100.
  • the above-mentioned components may be coupled together for communication in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation to the embodiments of the present application.
  • An autonomous vehicle traveling on a road can determine the current speed adjustment instruction according to other vehicles in its surrounding environment.
  • the objects in the environment around the vehicle 100 may be other types of objects such as traffic control equipment or green belts.
  • each object in the surrounding environment may be considered independently, and the speed adjustment instruction of the vehicle 100 may be determined based on the respective characteristics of the object, such as its current speed, acceleration, and distance from the vehicle.
  • the vehicle 100 as an autonomous vehicle, or the computer equipment associated with it (such as the computer system 160, the computer vision system 134, and the data storage device 162 in FIG. 1), can obtain the state of the surrounding environment based on the identified measurement data (for example, traffic, rain, ice on the road, etc.), and determine the relative positions of obstacles and vehicles in the surrounding environment at the current moment.
  • the boundaries of the passable area formed by the obstacles depend on one another, so all of the acquired measurement data can also be used together to determine the boundary of the vehicle's passable area, removing actually impassable regions from the passable area.
  • the vehicle 100 can adjust its driving strategy based on the detected passable area of the vehicle.
  • an autonomous vehicle can determine what stable state the vehicle needs to adjust to (for example, accelerating, decelerating, turning, or stopping, etc.) based on the detected passable area of the vehicle. In this process, other factors may also be considered to determine the speed adjustment instruction of the vehicle 100, such as the lateral position of the vehicle 100 on the traveling road, the curvature of the road, the proximity of static and dynamic objects, and so on.
  • the computer device can also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from nearby objects (such as cars in adjacent lanes).
  • the above-mentioned vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, tram, golf cart, train, trolley, etc.; the embodiments of this application do not particularly limit this.
  • the autonomous driving vehicle may further include a hardware structure and/or software module, which implements the above-mentioned functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a certain function among the above-mentioned functions is executed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraint conditions of the technical solution.
  • the vehicle may include the following modules:
  • Environment perception module 201: used to obtain information about vehicles, pedestrians, and road objects in the surrounding environment of the vehicle through roadside sensors and on-board sensors.
  • the roadside sensor and the vehicle-mounted sensor may be laser radar, millimeter wave radar, vision sensor, and so on.
  • the environment perception module 201 obtains the raw video stream data collected by the sensors and the point cloud data of the radars, processes these raw data to obtain recognizable structured data of people, vehicles, and objects, such as their locations and sizes, determines the positions of these people, vehicles, and objects relative to the vehicle, and then determines information such as the boundary of the vehicle's passable area.
  • the environmental perception module 201 can determine the boundary of the passable area of the vehicle based on the data collected by all sensors, by a certain type of sensor, or by a single sensor, so as to obtain one or more boundaries of the vehicle's passable area.
  • the environment sensing module 201 is also used to send the location information of people, vehicles, and objects and the boundaries of the vehicle's passable area, determined according to the data obtained by the sensors, to the passable area detection module 202.
  • the passable area detection module 202 is used to obtain one or more boundaries of the passable area of the vehicle and the determined position information of people, vehicles, objects, etc. from the environment perception module 201, and to use the obtained position information to cut the one or more boundaries of the passable area, removing areas where the vehicle actually cannot pass.
  • the passable area detection module 202 is also used to perform fusion processing on the one or more boundaries of the vehicle's passable area, or on the one or more boundaries that have been cut, to finally obtain the passable area of the vehicle.
  • the passable area detection module 202 is also used to obtain vehicle positioning information and map data, and to determine the road on which the vehicle is located and the structure of that road according to the map and the vehicle positioning information, so as to cut the boundary of the vehicle's passable area and obtain a passable area suitable for the road on which the vehicle is currently located.
  • the passable area detection module 202 is also used to send the finally obtained vehicle passable area to the simulation display module 204 and the planning control module 203.
  • the simulation display module 204 is used to receive the information on the vehicle's passable area sent by the passable area detection module 202 and display the passable area in the form of a grid map or the like, so that the passable area and the impassable area of the vehicle can be viewed more conveniently and intuitively.
  • the planning control module 203 is used to receive the passable area of the vehicle sent by the passable area detection module 202, plan the travel path of the vehicle according to the received passable area, generate a driving strategy, output action instructions corresponding to the strategy, and control the vehicle to drive automatically according to those instructions.
  • This module is a traditional control module of autonomous vehicles.
  • the vehicle positioning module 205 is used to determine the positioning information of the vehicle and send the positioning information of the vehicle to the passable area detection module 202. In a possible implementation manner, the vehicle positioning module 205 is also used to send the positioning information of the vehicle to the environment perception module 201.
  • the vehicle-mounted communication module 206 (not shown in FIG. 2) is used for information exchange between the own vehicle and other vehicles.
  • the storage component 207 (not shown in FIG. 2) is used to store the executable codes of the foregoing modules, and running these executable codes can implement part or all of the method procedures of the embodiments of the present application.
  • the computer system 160 shown in FIG. 1 includes a processor 301 coupled to a system bus 302; the processor 301 may be one or more processors, each of which may include one or more processor cores.
  • the video adapter 303 can drive the display 324, and the display 324 is coupled to the system bus 302.
  • the system bus 302 is coupled with the input/output (I/O) bus (BUS) 305 through the bus bridge 304, the I/O interface 306 is coupled with the I/O bus 305, and the I/O interface 306 communicates with various I/O devices, For example, an input device 307 (such as a keyboard, a mouse, a touch screen, etc.), a media tray 308 (such as a CD-ROM, a multimedia interface, etc.).
  • the transceiver 309 can send and/or receive radio communication signals
  • the camera 310 can capture static and dynamic digital video images
  • the interface connected to the I/O interface 306 may be a USB interface.
  • the processor 301 may be any traditional processor, including a reduced instruction set computer (RISC) processor, a complex instruction set computer (CISC) processor, or a combination of the foregoing.
  • the processor 301 may also be a dedicated device such as an application specific integrated circuit ASIC.
  • the processor 301 may also be a neural network processor or a combination of a neural network processor and the foregoing traditional processors.
  • the computer system 160 may be located far away from the autonomous driving vehicle and wirelessly communicate with the autonomous driving vehicle 100.
  • some of the processes described in this application may be configured to be executed on a processor in an autonomous vehicle, and other processes may be executed by a remote processor, including taking actions required to perform a single manipulation.
  • the computer system 160 can communicate with a software deployment server (deploying server) 313 through a network interface 312.
  • the network interface 312 may be a hardware network interface, such as a network card.
  • the network (Network) 314 may be an external network, such as the Internet, or an internal network, such as an Ethernet or a virtual private network (VPN).
  • the network 314 may also be a wireless network, such as a WiFi network, a cellular network, and the like.
  • the hard disk drive interface 315 and the system bus 302 are coupled.
  • the hard disk drive interface 315 and the hard disk drive 316 are connected.
  • the system memory 317 and the system bus 302 are coupled.
  • the data running in the system memory 317 may include an operating system (OS) 318 and application programs 319 of the computer system 160.
  • the operating system (OS) 318 includes but is not limited to Shell 320 and kernel 321.
  • Shell 320 is an interface between the user and the kernel 321 of the operating system 318.
  • Shell 320 is the outermost layer of operating system 318. The shell manages the interaction between the user and the operating system 318: waiting for the user's input, interpreting the user's input to the operating system 318, and processing various output results of the operating system 318.
  • the kernel 321 is composed of parts of the operating system 318 for managing memory, files, peripherals, and system resources, and directly interacts with the hardware.
  • the kernel 321 of the operating system 318 generally runs processes, provides communication between processes, and provides functions such as CPU time slice management, interruption, memory management, and IO management.
  • Application programs 319 include programs 323 related to autonomous driving, such as programs that manage the interaction between the autonomous vehicle and obstacles on the road, programs that control the driving route or speed of the autonomous vehicle, and programs that control interaction between the autonomous vehicle and other autonomous vehicles on the road.
  • the application 319 also exists on the system of the deploying server 313. In one embodiment, when the application program 319 needs to be executed, the computer system 160 may download the application program 319 from the deploying server 313.
  • the application program 319 may be an application program that controls the vehicle to determine a driving strategy according to the above-mentioned passable area of the vehicle and a traditional control module.
  • the processor 301 of the computer system 160 calls the application 319 to obtain the driving strategy.
  • the sensor 322 is associated with the computer system 160.
  • the sensor 322 is used to detect the environment around the computer system 160.
  • the sensor 322 can detect animals, cars, obstacles, and/or pedestrian crossings.
  • the sensor 322 can also detect the environment around the aforementioned objects such as animals, cars, obstacles and/or pedestrian crossings.
  • the environment around the animal for example, other animals that appear around the animal, weather conditions, and the brightness of the environment around the animal.
  • the sensor 322 may be at least one of a camera, an infrared sensor, a chemical detector, a microphone, and other devices.
  • the computer system 160 may also receive information from other computer systems or transfer information to other computer systems.
  • the sensor data collected from the sensor system 120 of the vehicle 100 may be transferred to another computer, and the data can be processed by the other computer.
  • data from the computer system 160 may be transmitted to the computer system 410 on the cloud side via the network for further processing.
  • the network and intermediate nodes can include various configurations and protocols, including the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local area networks, private networks using the proprietary communication protocols of one or more companies, Ethernet, WiFi and HTTP, and various combinations of the foregoing. This communication can be performed by any device capable of transferring data to and from other computers, such as modems and wireless interfaces.
  • the computer system 410 may include a server with multiple computers, such as a load balancing server group.
  • the server 420 exchanges information with different nodes of the network.
  • the computer system 410 may have a configuration similar to that of the computer system 160, and has a processor 430, a memory 440, instructions 450, and data 460.
  • the data 460 of the server 420 may include weather-related information.
  • the server 420 can receive, monitor, store, update, and transmit various information related to target objects in the surrounding environment.
  • the information may include, for example, target category, target shape information, and target tracking information in a report form, radar information form, forecast form, etc.
  • the cloud service center may receive information (such as data collected by vehicle sensors or other information) from vehicles 513 and vehicles 512 in its operating environment 500 via a network 511 such as a wireless communication network.
  • the vehicle 513 and the vehicle 512 may be self-driving vehicles.
  • the cloud service center 520 controls the vehicle 513 and the vehicle 512 by running its stored programs related to controlling the automatic driving of the automobile according to the received data.
  • Programs related to controlling auto-driving cars can be: programs that manage the interaction between autonomous vehicles and road obstacles, or programs that control the route or speed of autonomous vehicles, or programs that control interaction between autonomous vehicles and other autonomous vehicles on the road.
  • the cloud service center 520 may provide a part of the map to the vehicle 513 and the vehicle 512 through the network 511.
  • operations can be divided between different locations.
  • multiple cloud service centers can receive, confirm, combine, and/or send information reports.
  • information reports and/or sensor data can also be sent between vehicles.
  • Other configurations are also possible.
  • the cloud service center 520 sends the autonomous vehicle suggested solutions regarding possible driving situations in the environment (e.g., informing it of an obstacle ahead and telling it how to avoid the obstacle). For example, the cloud service center 520 may assist the vehicle in determining how to proceed when facing a specific obstacle in the environment.
  • the cloud service center 520 sends a response to the autonomous vehicle indicating how the vehicle should travel in a given scene.
  • the cloud service center 520 can confirm the existence of a temporary stop sign in front of the road based on the collected sensor data. For example, based on the “lane closed” sign and the sensor data of construction vehicles, it can be determined that the lane is closed due to construction.
  • the cloud service center 520 sends a suggested operation mode for the vehicle to pass through the obstacle (for example, instructing the vehicle to change lanes on another road).
  • the operation steps used for the autonomous driving vehicle can be added to the driving information map.
  • this information can be sent to other vehicles in the area that may encounter the same obstacle, so as to assist other vehicles not only to recognize the closed lanes but also to know how to pass.
  • the disclosed methods may be implemented as computer program instructions in a machine-readable format, encoded on a computer-readable storage medium, or encoded on other non-transitory media or articles.
  • Figure 6 schematically illustrates a conceptual partial view of an example computer program product arranged in accordance with at least some of the embodiments shown herein, the example computer program product including a computer program for executing a computer process on a computing device.
  • the example computer program product 600 is provided using a signal bearing medium 601.
  • the signal bearing medium 601 may include one or more program instructions 602, which, when run by one or more processors, may provide all or part of the functions described above with respect to FIGS. 2 to 5, or may provide descriptions in subsequent embodiments. All or part of the function.
  • one or more features in S701 to S704 may be undertaken by one or more instructions associated with the signal bearing medium 601.
  • the program instructions 602 in FIG. 6 also describe example instructions.
  • the signal-bearing medium 601 may include a computer-readable medium 603, such as, but not limited to, a hard disk drive, a compact disk (CD), a digital video disk (DVD), a digital tape, memory, read-only memory (ROM), or random access memory (RAM), etc.
  • the signal bearing medium 601 may include a computer recordable medium 604, such as, but not limited to, memory, read/write (R/W) CD, R/W DVD, and so on.
  • the signal-bearing medium 601 may include a communication medium 605, such as, but not limited to, digital and/or analog communication media (eg, fiber optic cables, waveguides, wired communication links, wireless communication links, etc.).
  • the signal bearing medium 601 may be conveyed by a wireless form of the communication medium 605 (for example, a wireless communication medium that complies with the IEEE 802.11 standard or another transmission protocol).
  • the one or more program instructions 602 may be, for example, computer-executable instructions or logic-implemented instructions.
  • computing devices such as those described with respect to FIGS. 2 to 6 may be configured to provide various operations, functions, or actions in response to the program instructions 602 conveyed to the computing device by one or more of the computer-readable medium 603, the computer-recordable medium 604, and/or the communication medium 605. It should be understood that the arrangement described here is for illustrative purposes only.
  • this application proposes a method for detecting the area where a vehicle can pass.
  • the execution subject of this method can be a vehicle or another device with the function of controlling the vehicle, such as an autonomous vehicle, or a processor in the vehicle or other device with the function of controlling the vehicle, such as the processor 161, the processor 301, or the processor 430 mentioned above. As shown in Figure 7, the method includes steps S701-S704:
  • S701. Divide the surrounding area where the vehicle is located to obtain multiple grid units. Specifically, the position of the center point (or center of mass) of the vehicle is determined as the position of the vehicle, and the direction of travel of the vehicle is determined. Then a polar coordinate system is established with the direction of travel of the vehicle as the polar axis, that is, as the 0° angle direction, and with the position of the vehicle as the pole; generally, the counterclockwise direction is set as the positive direction of the polar coordinate system. Finally, the surrounding area where the vehicle is located, that is, the preset area in the above-mentioned polar coordinate system, is divided according to the preset angular interval and the preset radial distance interval to obtain a plurality of grid units.
  • the polar coordinates of a certain point on each grid unit can be determined as the identifier of the grid unit.
  • the surrounding area where the vehicle is located may be a circular area with the vehicle as the center and the preset radius as the radius.
  • the angle in the identification of each grid unit is the azimuth angle of the grid unit relative to the vehicle.
  • in a possible implementation, the preset angular interval remains unchanged, and the preset radial distance interval remains unchanged as the value of the radial distance increases.
  • the preset radial distance interval changes as the value of the radial distance increases, so that the area of each grid unit is substantially the same.
  • the region in the polar coordinate system is divided to obtain multiple grid cells.
  • the identifier of the grid unit is (0.2m, 0.5°) or (0.4m, 0.5°), and so on.
  • dividing the preset area in the polar coordinate system can obtain multiple grid units as shown in FIG. 8.
  • O is the pole of the polar coordinate system, that is, the position of the vehicle, and 0°, 90°, 180° and 270° are the polar angles of the polar coordinate system.
  • for example, when the radial distance is less than or equal to 30m, the preset angular interval is 0.5° and the preset radial distance interval is 0.2m, so that the circular area with a radius of 30m in the polar coordinate system is divided into multiple grid units; when the radial distance is greater than 30m and less than or equal to 60m, the preset angular interval is 0.5° and the preset radial distance interval is 0.5m; and when the radial distance is greater than 60m, the preset angular interval is 0.5° and the preset radial distance interval is 1m, so that the area with a radial distance greater than 60m in the polar coordinate system is also divided into multiple grid units.
  • in this case, a grid unit may be identified as (28.8m, 0.5°), (30m, 0.5°), (30.5m, 0.5°), (55.5m, 0.5°), (60m, 0.5°), (61m, 0.5°), and so on.
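  • The following sketch computes such an identifier with the example intervals above; using the cell's lower edge as the identifying point is an assumption, since the text allows any point on the grid unit. For instance, cell_id(55.7, 123.4) returns (55.5, 123.0):

    import math

    def cell_id(range_m, azimuth_deg):
        a = math.floor(azimuth_deg / 0.5) * 0.5      # 0.5-degree angular bins
        if range_m <= 30.0:                          # 0.2 m bins within 30 m
            r = math.floor(range_m / 0.2) * 0.2
        elif range_m <= 60.0:                        # 0.5 m bins from 30 m to 60 m
            r = 30.0 + math.floor((range_m - 30.0) / 0.5) * 0.5
        else:                                        # 1 m bins beyond 60 m
            r = 60.0 + math.floor((range_m - 60.0) / 1.0) * 1.0
        return (round(r, 1), a)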
  • S702 Determine a boundary grid unit according to the data collected by the sensor.
  • In one possible implementation, the sensor is a sensor on the vehicle.
  • In another possible implementation, in addition to a sensor on the vehicle, the sensor may also include a sensor outside the vehicle, such as a roadside sensor.
  • Optionally, the aforementioned sensors for collecting data are one or more of a lidar sensor, a millimeter-wave radar sensor, or a vision sensor. It should be noted that using data collected by multiple types of sensors to determine the passable area makes the detection method in this application broadly applicable, and can further improve the accuracy of detecting the vehicle's passable area.
  • the position information of the obstacles in the surrounding area of the vehicle is determined according to the data collected by the sensor, and the boundary grid unit is determined according to the position information and the multiple grid units obtained in step S701.
  • The following uses a lidar sensor as an example to introduce the specific implementation process of determining the location information of obstacles based on the data collected by the sensor in this step:
  • First, the coordinates of the echo point corresponding to each laser line datum in the lidar sensor's rectangular coordinate system are determined from the laser line data collected by the lidar sensor. Then, according to the position of the lidar sensor in the vehicle's rectangular coordinate system, these coordinates are converted into coordinates in the vehicle's rectangular coordinate system, so as to determine the location information of obstacles in the surrounding environment where the vehicle is located. The position information includes the azimuth angle of the obstacle relative to the vehicle and the distance between the obstacle and the vehicle.
  • Specifically, the data collected by the lidar sensor includes multiple laser line data, and each laser line datum includes the vertical direction angle ω of the laser emission, the horizontal direction angle α, and the scanning distance d.
  • The vertical direction angle ω is the angle between the laser line and the horizontal plane of the lidar sensor, that is, the angle between the laser line and the xoy plane of the lidar sensor's rectangular coordinate system. The horizontal direction angle α is the angle between the projection of the laser line onto that xoy plane and the 0° direction of the lidar sensor. The scanning distance d is the straight-line distance between the echo point on the laser line and the lidar sensor.
  • The rectangular coordinate system of the lidar sensor takes the position of the lidar sensor as the origin, the 0° direction of the laser lines emitted by the lidar sensor as the positive direction of the y-axis, the rotation axis of the lidar as the z-axis, and the direction passing through the origin and perpendicular to the yoz plane as the x-axis.
  • The coordinates of the echo point in the lidar sensor's rectangular coordinate system are then:
  x = sin(α)·cos(ω)·d
  y = cos(α)·cos(ω)·d
  z = sin(ω)·d
  • The vehicle's Cartesian coordinate system takes the position of the center of the vehicle as the origin, the direction of travel of the vehicle as the positive x-axis, the direction perpendicular to the plane of the vehicle as the positive z-axis, and the straight line perpendicular to the xoz plane and passing through the origin as the y-axis.
  • P represents the coordinates of the echo point corresponding to each laser line datum in the lidar sensor's Cartesian coordinate system, and P_ego represents the corresponding coordinates in the vehicle's Cartesian coordinate system; the conversion is performed according to the lidar sensor's position in the vehicle's coordinate system. The xoy plane of the vehicle coordinate system is the plane where the polar coordinate system mentioned in step S701 is located.
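  • The conversion described above can be sketched as follows; the rotation matrix R and translation vector t describing the lidar's mounting pose in the vehicle frame are assumed to be known from calibration:

```python
import numpy as np

def laser_to_lidar_xyz(omega: float, alpha: float, d: float) -> np.ndarray:
    """Echo point in the lidar's rectangular coordinate system (angles in radians),
    following x = sin(alpha)cos(omega)d, y = cos(alpha)cos(omega)d, z = sin(omega)d."""
    return np.array([
        np.sin(alpha) * np.cos(omega) * d,
        np.cos(alpha) * np.cos(omega) * d,
        np.sin(omega) * d,
    ])

def lidar_to_vehicle(p: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """P_ego = R @ P + t: rigid transform into the vehicle frame, with R (3x3)
    and t (3,) the lidar's assumed mounting pose in the vehicle frame."""
    return R @ p + t
```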
  • After the conversion, the height of each echo point is determined according to its coordinates in the vehicle coordinate system. The echo point is then projected into the polar coordinate system to obtain its polar coordinates, that is, the position information of the echo point relative to the vehicle, and the grid unit where the echo point is located is determined. Then, according to the height, slope, and height difference of the echo points at the same azimuth angle in the polar coordinate system, and the distance between each echo point and the vehicle, it is determined whether an echo point at that azimuth angle is an obstacle and, if so, which grid unit the obstacle occupies (that is, the location information of the obstacle). Alternatively, methods such as conditional random fields, Markov random fields, or neural networks can be applied to the data collected by the lidar sensor to detect whether there are obstacles at each azimuth angle of the polar coordinate system and to determine their location information.
  • After the location information of the obstacles is determined from the data collected by the sensors, that is, the positions of the obstacles within the multiple grid units, a schematic diagram of the obstacle position information such as FIG. 9 can be obtained. In FIG. 9, black circles 1 to 6 represent obstacle 1 to obstacle 6, respectively, and the identifier of the grid unit where each of obstacle 1 to obstacle 6 is located is the position information of that obstacle.
  • In one possible implementation, for each azimuth angle, the grid unit where the obstacle closest to the vehicle at that azimuth is located is determined as the boundary grid unit.
  • In FIG. 9, obstacle 2 and obstacle 3 are located at the same azimuth angle, and at that azimuth the obstacle closest to the vehicle is obstacle 2; obstacle 5 and obstacle 6 are located at the same azimuth angle, and at that azimuth the obstacle closest to the vehicle is obstacle 5.
  • The grid units where obstacle 1, obstacle 2, obstacle 4, and obstacle 5 are located are therefore the boundary grid units.
  • Grid unit 7 to grid unit 34 are the grid units farthest from the vehicle at each azimuth angle other than the azimuth angles where obstacle 1 to obstacle 6 are located.
  • In another possible implementation, for each azimuth angle, if there is an obstacle at the azimuth angle, the grid unit where the obstacle closest to the vehicle is located is determined as the boundary grid unit; if there is no obstacle at the azimuth angle, the grid unit farthest from the vehicle at that azimuth is determined as the boundary grid unit.
  • In this implementation, if the grid unit farthest from the vehicle is used as the boundary grid unit at azimuth angles where there is no obstacle, then in addition to the grid units where obstacle 1, obstacle 2, obstacle 4, and obstacle 5 are located, the boundary grid units also include grid unit 7 to grid unit 34.
  • The following description of the embodiments of this application takes the case where a boundary grid unit exists at each azimuth angle in the polar coordinate system as an example: the boundary grid unit is either the grid unit where the obstacle closest to the vehicle at that azimuth is located or, where the azimuth has no obstacle, the grid unit farthest from the vehicle at that azimuth.
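  • A minimal sketch of this per-azimuth rule, assuming obstacles have already been binned by azimuth and using the maximum sensing range for azimuths without obstacles:

```python
def boundary_distances(obstacles, num_azimuths: int, max_range: float):
    """Distance to the nearest obstacle in each azimuth bin, or max_range when
    a bin holds no obstacle (the farthest cell then acts as the boundary cell).
    `obstacles` is an iterable of (azimuth_bin, distance) pairs."""
    boundary = [max_range] * num_azimuths
    for az, dist in obstacles:
        if dist < boundary[az]:
            boundary[az] = dist
    return boundary
```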
  • S703 Determine the target boundary grid unit according to the at least two boundary grid units.
  • the target boundary grid unit is a grid unit where the boundary of the vehicle's passable area is located.
  • the target boundary grid unit is determined according to the distance between the first boundary grid unit and the second boundary grid unit.
  • The first boundary grid unit is any one of the above at least two boundary grid units. The second boundary grid unit is, among the boundary grid units other than the first boundary grid unit whose distance to the vehicle is less than or equal to the distance between the first boundary grid unit and the vehicle, the one closest to the first boundary grid unit in a preset direction.
  • each first boundary grid unit has a unique second boundary grid unit corresponding to it.
  • The direction of the arrow shown in the figure indicates that the preset direction is counterclockwise. If the boundary grid unit where obstacle 2 is located is taken as the first boundary grid unit, it can be determined that the distance between the boundary grid unit where obstacle 1 is located and the vehicle is equal to the distance between the first boundary grid unit and the vehicle, and that the boundary grid unit where obstacle 1 is located is the boundary grid unit closest to the first boundary grid unit in the counterclockwise direction. Therefore, the boundary grid unit where obstacle 1 is located is the second boundary grid unit corresponding to the first boundary grid unit. For the same reason, if the boundary grid unit where obstacle 5 is located is taken as the first boundary grid unit, the boundary grid unit where obstacle 4 is located can be determined as its corresponding second boundary grid unit.
  • the target boundary grid unit is determined by comparing the distance between the first boundary grid unit and the second boundary grid unit with a preset distance threshold.
  • Optionally, the preset distance threshold is the width of the vehicle, or another value greater than the width of the vehicle that is determined according to the width of the vehicle; the preset distance threshold may be set by the user or determined by the execution subject of the method.
  • Because the preset distance threshold is determined according to the width of the vehicle, when the target boundary grid unit is determined according to the preset distance threshold, areas that the vehicle actually cannot pass can be eliminated as much as possible, thereby improving the accuracy of vehicle passable area detection.
  • Specifically, when the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, the first boundary grid unit, the second boundary grid unit, and a third boundary grid unit are determined as target boundary grid units. The distance between the third boundary grid unit and the vehicle is equal to the distance between the first boundary grid unit and the vehicle, and the third boundary grid unit is located, in the preset direction, between the first boundary grid unit and the second boundary grid unit. When the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, only the first boundary grid unit and the second boundary grid unit are determined as target boundary grid units. In other words, when the distance is less than or equal to the preset distance threshold, the third boundary grid unit must first be determined according to the first boundary grid unit and the second boundary grid unit before the three of them are determined as target boundary grid units.
  • If the boundary grid unit where obstacle 4 is located is the first boundary grid unit, the boundary grid unit where obstacle 2 is located can be determined as the second boundary grid unit corresponding to it, and the distance between the first boundary grid unit and the second boundary grid unit is d.
  • The distance between grid unit 35 and the vehicle is equal to the distance between the first boundary grid unit (the boundary grid unit where obstacle 4 is located) and the vehicle, and grid unit 35 is located, in the preset direction (counterclockwise), between the first boundary grid unit and the second boundary grid unit, so grid unit 35 is a third boundary grid unit; for the same reason, grid unit 36 and grid unit 37 are also third boundary grid units. If d is less than or equal to the preset distance threshold, the first boundary grid unit, the second boundary grid unit, and the third boundary grid units 35, 36, and 37 in this example are determined as target boundary grid units. If d is greater than the preset distance threshold, only the first boundary grid unit and the second boundary grid unit in this example are determined as target boundary grid units.
  • the distance between the first boundary grid unit and the second boundary grid unit may be a linear distance or an arc distance.
  • The arc distance is the length of an arc on the circle whose radius is the distance between the first boundary grid unit and the vehicle and whose center is the pole; the two ends of the arc are the two closest endpoints on the first boundary grid unit and the second boundary grid unit.
  • For example, if the boundary grid unit where obstacle 4 is located is the first boundary grid unit, the boundary grid unit where obstacle 2 is located is determined as the corresponding second boundary grid unit. Circle a is the circle whose radius is the distance between the first boundary grid unit and the vehicle and whose center is the pole, so the distance between the first boundary grid unit and the second boundary grid unit can be determined as d or d'.
  • Here d is the straight-line distance, and d' is the arc distance: an arc on circle a connecting the two closest endpoints of the first boundary grid unit and the second boundary grid unit.
  • It should be noted that the preset distance threshold used for comparison with the straight-line distance differs from that used for comparison with the arc distance.
  • For example, the preset distance threshold for comparison with the straight-line distance is a first preset distance threshold, and the preset distance threshold for comparison with the arc distance is a second preset distance threshold.
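  • For reference, if the two boundary grid units lie at radius r and are separated by an angle of delta_theta degrees, standard circle geometry gives both distances; a small illustrative helper:

```python
import math

def chord_and_arc(r: float, delta_theta_deg: float) -> tuple:
    """Straight-line (chord) and arc distance between two boundary cells at
    radius r separated by delta_theta degrees (standard circle geometry)."""
    dt = math.radians(delta_theta_deg)
    return 2.0 * r * math.sin(dt / 2.0), r * dt
```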
  • When the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, the gap between the two boundary grid units is too small for the vehicle to pass. Determining the first boundary grid unit, the second boundary grid unit, and the third boundary grid unit as target boundary grid units therefore cuts the area enclosed by the boundary grid units so as to eliminate the area that the vehicle actually cannot pass, improving the accuracy of vehicle passable area detection.
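  • A sketch of step S703 over a one-dimensional per-azimuth boundary array (described further below); the scan direction, the use of the arc distance, and the helper names are assumptions:

```python
import math

def cut_narrow_gaps(boundary, angular_step_deg, width_threshold):
    """Clamp azimuth gaps narrower than the vehicle-width threshold.
    `boundary[i]` is the boundary distance at azimuth bin i, counterclockwise."""
    n = len(boundary)
    out = list(boundary)
    for i in range(n):                 # each cell acts as the first boundary cell
        r1 = boundary[i]
        for k in range(1, n):          # scan in the preset (counterclockwise) direction
            j = (i + k) % n
            if boundary[j] <= r1:      # closest cell no farther than the first one
                arc = r1 * math.radians(k * angular_step_deg)  # arc distance at radius r1
                if arc <= width_threshold:
                    for m in range(1, k):                      # third boundary cells
                        out[(i + m) % n] = min(out[(i + m) % n], r1)
                break                  # only the closest such cell is the second one
    return out
```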
  • S704 Determine an area between the target boundary grid unit and the vehicle as a passable area of the vehicle.
  • In a possible implementation, the area between the target boundary grid unit and the vehicle is determined as the vehicle's passable area, and the area where grid units whose distance from the vehicle is greater than the first distance are located is determined as the vehicle's impassable area.
  • the first distance is the distance between the target boundary grid unit and the vehicle.
  • a grid map can be generated according to the passable area and the impassable area of the vehicle.
  • The grid map may also be referred to as an occupancy grid map (OGM). The grid map represents the vehicle's passable and impassable areas more intuitively and makes it convenient for users to view them.
  • Optionally, the lower boundary of the target boundary grid unit, that is, the boundary on the side of the target boundary grid unit close to the vehicle, is regarded as the boundary of the vehicle's passable area; the area between this boundary and the vehicle is the vehicle's passable area, and the other side of the boundary, that is, the side away from the vehicle, is the vehicle's impassable area.
  • The target boundary grid units include grid unit 1 to grid unit 32; the shaded part in the figure is the vehicle's impassable area, and the unshaded part, that is, the area between the target boundary grid units and the vehicle, is the vehicle's passable area.
  • The grid diagram used to represent the vehicle's passable and impassable areas may also take the form shown in FIG. 13.
  • the black area in the figure is used to indicate the impassable area of the vehicle
  • point A indicates the vehicle
  • the white grid area is used to indicate the passable area of the vehicle
  • The white line is used to indicate the boundary of the vehicle's passable area: according to the distance and azimuth of the boundary relative to the vehicle, the boundary between the vehicle's passable area and the vehicle's impassable area in FIG. 12 is projected onto the grid shown in FIG. 13, finally yielding the boundary of the vehicle's passable area.
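  • Projecting the per-azimuth boundary onto a Cartesian occupancy grid in this way could be sketched as follows; the grid dimension and cell size are illustrative parameters, and `boundary` is assumed to hold one distance per azimuth bin:

```python
import math
import numpy as np

def boundary_to_occupancy(boundary, angular_step_deg, cell_size, grid_dim):
    """Mark cells beyond the passable-area boundary as occupied (1) and cells
    inside it as free (0), with the vehicle at the grid centre."""
    grid = np.zeros((grid_dim, grid_dim), dtype=np.uint8)
    c = grid_dim // 2                                  # vehicle at the grid centre
    for ix in range(grid_dim):
        for iy in range(grid_dim):
            x = (ix - c) * cell_size                   # vehicle frame: x forward
            y = (iy - c) * cell_size
            r = math.hypot(x, y)
            az = int(math.degrees(math.atan2(y, x)) % 360.0 // angular_step_deg)
            if r > boundary[az]:
                grid[ix, iy] = 1                       # impassable side of the boundary
    return grid
```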
  • Through the above steps, the present application determines at least two boundary grid units from multiple grid units and then determines the target boundary grid units based on the at least two boundary grid units, thereby updating the boundary grid units to obtain the target boundary grid units; the area between the target boundary grid units and the vehicle is determined as the vehicle's passable area. Compared with directly using the area between the boundary grid units and the vehicle as the passable area, the target boundary grid units cut that area to remove regions the vehicle actually cannot pass, improving the accuracy of vehicle passable area detection.
  • the present application also provides a method for detecting a vehicle passable area.
  • the data collected by the sensor is first filtered, and the data inside the vehicle (including the vehicle housing) collected by the sensor is removed.
  • the location information of the obstacles around the vehicle is determined.
  • the boundary grid unit is determined according to the location information of the obstacles around the vehicle.
  • Because the data collected by the sensor that falls within the vehicle housing is eliminated, the grid unit where a point on the vehicle or its housing is located is prevented from being treated as the grid unit where an obstacle is located, which reduces the possibility of using a point on the vehicle as the boundary of the vehicle's passable area and improves the accuracy of detecting the passable area.
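  • Such filtering could be sketched as follows, assuming an axis-aligned bounding box for the vehicle in the vehicle frame (x forward, z up); the box dimensions are illustrative inputs:

```python
def remove_self_points(points, half_length, half_width, z_min, z_max):
    """Drop echo points that fall inside the vehicle's own bounding box.
    `points` is an iterable of (x, y, z) tuples in the vehicle frame."""
    return [
        (x, y, z) for (x, y, z) in points
        if not (abs(x) <= half_length and abs(y) <= half_width and z_min <= z <= z_max)
    ]
```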
  • Optionally, after the boundary grid unit at each azimuth angle is determined in step S702, the distance between the boundary grid unit and the vehicle at each azimuth angle is stored, as array elements arranged in a preset direction, in a one-dimensional boundary array.
  • After the target boundary grid units are determined, the array elements in the one-dimensional boundary array are updated according to the distance between the target boundary grid unit and the vehicle at each azimuth angle.
  • the preset direction is counterclockwise or clockwise.
  • For example, the positive direction of the polar coordinate system, that is, the counterclockwise direction, is selected as the order in which the array elements are stored in the one-dimensional boundary array.
  • the number of array elements in the one-dimensional boundary array is the same as the number of azimuth angles obtained when the grid unit is divided according to the preset angle interval.
  • For example, when the preset angular interval is 0.5°, there are 720 azimuth angles in total; the distance between the boundary grid unit and the vehicle at each azimuth is stored as an array element in the one-dimensional boundary array, and the subscripts of the array elements are 0 to 719.
  • Based on the boundary grid unit determined in step S702, in another possible implementation, for the same azimuth angle: if there is an obstacle at the azimuth angle, the grid unit where the obstacle closest to the vehicle is located is determined as the boundary grid unit at that azimuth; if there is no obstacle at the azimuth angle, it is determined that there is no boundary grid unit at that azimuth.
  • In this case, for an azimuth with no boundary grid unit, the farthest distance that can be monitored at that azimuth can be stored as the array element, or infinity (INF) can be stored as the array element for that azimuth, or no array element is stored for that azimuth.
  • For example, when the preset angular interval is 5°, the distances between the boundary grid units and the vehicle are stored counterclockwise as array elements in a one-dimensional boundary array, and the subscripts of the array elements are 0 to 71, so each array element can be denoted a0 to a71. Suppose the azimuth angles corresponding to a1 and a71 have no boundary grid units.
  • In this example, the two boundary grid units at the azimuth angles corresponding to a0 and a2 can be taken as the first boundary grid unit and the second boundary grid unit, respectively. If the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, the first boundary grid unit, the second boundary grid unit, and the third boundary grid unit are determined as the target boundary grid units, where the third boundary grid unit is the grid unit at the azimuth angle corresponding to a1 whose distance to the vehicle equals that of the first boundary grid unit. The one-dimensional boundary array updated according to the distances between the target boundary grid units and the vehicle is then as shown in Table 4 below.
  • Finally, the distance between each target boundary grid unit and the vehicle is stored as an array element in the one-dimensional boundary array. The one-dimensional boundary array can simply and intuitively reflect the boundary of the vehicle's passable area at each azimuth angle.
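  • A small illustrative snippet of the 72-bin example above; all distance values are made up for illustration:

```python
INF = float("inf")

# 72 azimuth bins at a 5 deg interval, counterclockwise; bins without a
# boundary cell hold INF. Distance values are purely illustrative.
boundary = [INF] * 72
boundary[0] = 12.5   # boundary cell at the azimuth of a0 (first boundary cell)
boundary[2] = 12.0   # boundary cell at the azimuth of a2 (second boundary cell)

# If the gap between the a0 and a2 cells is narrower than the width threshold,
# the empty bin a1 is clamped to the first cell's radius (a third boundary cell):
boundary[1] = min(boundary[1], boundary[0])   # a1 becomes 12.5
```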
  • In a possible implementation, each boundary grid unit is used in turn as the first boundary grid unit to determine its corresponding second boundary grid unit, and the target boundary grid units are determined according to the distance between the first boundary grid unit and the second boundary grid unit and the preset distance threshold.
  • Concentric circles a to c, that is, the circles formed by the solid lines shown in the figure, can be determined according to the boundary grid units where obstacle 1, obstacle 2, obstacle 4, and obstacle 5 are located, together with boundary grid unit 7 to boundary grid unit 34; the target boundary grid units are then determined circle by circle. That is, in order of the radius of the concentric circles from small to large, the boundary grid units on concentric circle a, concentric circle b, and concentric circle c are analyzed in turn.
  • Taking the boundary grid units on concentric circle a as an example, any one of them can be determined as the first boundary grid unit.
  • If the boundary grid unit where obstacle 5 is located is the first boundary grid unit, the second boundary grid unit corresponding to it is the boundary grid unit where obstacle 4 is located; since the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, the first boundary grid unit and the second boundary grid unit are determined to be target boundary grid units.
  • If the boundary grid unit where obstacle 4 is located is taken as the first boundary grid unit, the second boundary grid unit corresponding to it is the boundary grid unit where obstacle 2 is located; since the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, the first boundary grid unit and the second boundary grid unit are likewise determined to be target boundary grid units.
  • If the boundary grid unit where obstacle 2 is located is the first boundary grid unit and the preset direction is counterclockwise, the boundary grid unit where obstacle 1 is located is the second boundary grid unit corresponding to it. Because grid unit E is located, in the preset direction, between the first boundary grid unit and the second boundary grid unit, and the distance between grid unit E and the vehicle equals the distance between the first boundary grid unit and the vehicle, grid unit E is the third boundary grid unit, and the first boundary grid unit, the second boundary grid unit, and the third boundary grid unit are determined as target boundary grid units.
  • At this point, the process of determining target boundary grid units based on the boundary grid units on concentric circle a is complete; target boundary grid units are then determined in the same way based on the boundary grid units on concentric circle b and concentric circle c.
  • In each such case, if the distance between the first boundary grid unit and the second boundary grid unit is greater than the preset distance threshold, only the first boundary grid unit and the second boundary grid unit are determined as target boundary grid units.
  • It should be noted that a boundary grid unit located at the same azimuth angle as a third boundary grid unit may be left out of consideration, or marked as not requiring processing, so as to reduce the workload of determining the target boundary grid units and improve the efficiency of that process, thereby improving the efficiency of vehicle passable area detection.
  • For example, boundary grid unit 9 and the third boundary grid unit E are located at the same azimuth angle, and the distance between boundary grid unit 9 and the vehicle is greater than the distance between grid unit E and the vehicle, so boundary grid unit 9 is marked as not requiring processing. Subsequently, when the target boundary grid units are determined based on the boundary grid units on concentric circle c, boundary grid unit 9 is no longer processed.
  • In another possible implementation, the first boundary grid unit and the second boundary grid unit are directly determined as the target boundary grid units; that is, the boundary grid units are connected directly to form the boundary of the passable area.
  • After the target boundary grid units are determined, a grid unit that is located at the same azimuth angle as a boundary grid unit and whose distance from the vehicle is greater than the distance between that boundary grid unit and the vehicle, as well as the boundary grid unit itself, is determined as an impassable grid unit, that is, part of the vehicle's impassable area. The area where all other grid units are located is the vehicle's passable area. If the number of boundary grid units determined in step S702 is 0, and no boundary grid unit is determined at azimuth angles without obstacles, then the entire surrounding area of the vehicle, that is, the preset area, is the vehicle's passable area.
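  • The concentric-circle ordering with the skip marking could be sketched as follows; this is an assumed reading of the procedure, not a normative algorithm:

```python
import math

def process_by_radius(boundary, angular_step_deg, width_threshold):
    """Visit boundary cells from the smallest radius outwards; once a gap is
    clamped, the affected azimuths are marked as not requiring processing."""
    n = len(boundary)
    out = list(boundary)
    done = [False] * n
    for i in sorted(range(n), key=lambda k: boundary[k]):
        if done[i]:
            continue
        r1 = boundary[i]
        for k in range(1, n):
            j = (i + k) % n
            if boundary[j] <= r1:          # closest second boundary cell
                arc = r1 * math.radians(k * angular_step_deg)
                if arc <= width_threshold:
                    for m in range(1, k):  # third boundary cells in the gap
                        idx = (i + m) % n
                        out[idx] = min(out[idx], r1)
                        done[idx] = True   # skip these azimuths later
                break
    return out
```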
  • FIG. 15 is a schematic diagram of a possible structure of an apparatus for detecting a vehicle passable area provided by an embodiment of this application; the apparatus includes a processing unit 1501, a determining unit 1502, and a generating unit 1503.
  • the device for detecting a vehicle passable area may also include other modules, or the device for detecting a vehicle passable area may include fewer modules.
  • the processing unit 1501 is configured to determine at least two boundary grid units from a plurality of grid units, where the at least two boundary grid units are grid units where the obstacle closest to the vehicle at the same azimuth angle is located, The multiple grid units are obtained by dividing the surrounding area where the vehicle is located.
  • Optionally, that the processing unit 1501 is configured to determine at least two boundary grid units from a plurality of grid units includes: the processing unit 1501 is configured to determine the location information of obstacles around the vehicle, and to determine the at least two boundary grid units according to the location information.
  • That the processing unit 1501 is used to determine the location information of obstacles around the vehicle includes: the processing unit 1501 is used to determine the location information of the obstacles around the vehicle according to external data collected by sensors on the vehicle.
  • the external data refers to data outside the vehicle collected by the sensor.
  • the aforementioned sensor includes one or more of laser radar, millimeter wave radar, or vision sensor.
  • the processing unit 1501 is further configured to determine a target boundary grid unit based on at least two boundary grid units, where the target boundary grid unit refers to a grid unit where the boundary of the vehicle's passable area is located.
  • That the processing unit 1501 is configured to determine a target boundary grid unit based on at least two boundary grid units includes: the processing unit 1501 is configured to determine the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit.
  • The first boundary grid unit is any one of the at least two boundary grid units; the second boundary grid unit is, among the boundary grid units whose distance to the vehicle is less than or equal to the distance between the first boundary grid unit and the vehicle, the one closest to the first boundary grid unit in a preset direction.
  • the aforementioned preset direction is a clockwise direction or a counterclockwise direction.
  • That the processing unit 1501 is configured to determine the target boundary grid unit according to the distance between the first boundary grid unit and the second boundary grid unit includes: the processing unit 1501 is configured to determine the target boundary grid unit by comparing the distance between the first boundary grid unit and the second boundary grid unit with a preset distance threshold.
  • That the processing unit 1501 is configured to determine the target boundary grid unit by comparing the distance between the first boundary grid unit and the second boundary grid unit with the preset distance threshold includes: the processing unit 1501 is configured to, when the distance between the first boundary grid unit and the second boundary grid unit is less than or equal to the preset distance threshold, determine the first boundary grid unit, the second boundary grid unit, and the third boundary grid unit as the target boundary grid units.
  • the distance between the third boundary grid unit and the vehicle is equal to the distance between the first boundary grid unit and the vehicle, and the third boundary grid unit is located between the first boundary grid unit and the second boundary grid unit in the preset direction between.
  • the processing unit 1501 is configured to determine the first boundary grid unit and the second boundary grid unit as the target boundary grid when the distance between the first boundary grid unit and the second boundary grid unit is greater than a preset distance threshold. unit.
  • After the processing unit 1501 determines the target boundary grid units according to the at least two boundary grid units, the generating unit 1503 is configured to generate a grid map according to the vehicle's passable area and impassable area.
  • the impassable area refers to an area where a grid unit whose distance from the vehicle is greater than a first distance is located, and the first distance refers to the distance between the target boundary grid unit and the vehicle.
  • the determining unit 1502 is configured to determine the area between the target boundary grid unit and the vehicle as a passable area of the vehicle.
  • An embodiment of the present application provides a computer-readable storage medium storing one or more programs. The one or more programs include instructions that, when executed by a computer, cause the computer to perform the method of detecting a vehicle passable area executed in steps S701 to S704 of the foregoing embodiment.
  • The embodiment of the present application also provides a computer program product containing instructions which, when run on a computer, cause the computer to execute the method of detecting a vehicle passable area in steps S701 to S704 of the foregoing embodiment.
  • An embodiment of the present application provides an apparatus for detecting a vehicle passable area, including a processor and a memory, where the memory is used to store computer program instructions and the processor is used to run the computer program instructions so that the apparatus performs the method of detecting a vehicle passable area in steps S701 to S704 of the foregoing embodiment.
  • If the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to the field of automatic driving and provides a method and device for detecting a free space for a vehicle, which are used to determine a target boundary grid cell on the basis of at least two boundary grid cells, so as to crop the free space for the vehicle and improve the accuracy of free-space detection for the vehicle. The method comprises the steps of: determining at least two boundary grid cells from a plurality of grid cells, the two or more boundary grid cells being grid cells where an obstacle closest to the vehicle at the same azimuth angle is located, and the plurality of grid cells being obtained by dividing a surrounding region where the vehicle is located; determining a target boundary grid cell from the two or more boundary grid cells, the target boundary grid cell referring to a grid cell where the boundary of the free space for the vehicle is located; and determining the space between the target boundary grid cell and the vehicle as the free space for the vehicle.
PCT/CN2020/088473 2020-04-30 2020-04-30 Procédé et dispositif de détection d'un espace libre pour un véhicule WO2021217646A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080099748.5A CN115398272A (zh) 2020-04-30 2020-04-30 检测车辆可通行区域的方法及装置
PCT/CN2020/088473 WO2021217646A1 (fr) 2020-04-30 2020-04-30 Procédé et dispositif de détection d'un espace libre pour un véhicule

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/088473 WO2021217646A1 (fr) 2020-04-30 2020-04-30 Procédé et dispositif de détection d'un espace libre pour un véhicule

Publications (1)

Publication Number Publication Date
WO2021217646A1 true WO2021217646A1 (fr) 2021-11-04

Family

ID=78373178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/088473 WO2021217646A1 (fr) 2020-04-30 2020-04-30 Procédé et dispositif de détection d'un espace libre pour un véhicule

Country Status (2)

Country Link
CN (1) CN115398272A (fr)
WO (1) WO2021217646A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116486377B (zh) * 2023-04-26 2023-12-26 小米汽车科技有限公司 可行驶区域的生成方法及装置
CN116343159B (zh) * 2023-05-24 2023-08-01 之江实验室 一种非结构化场景可通行区域检测方法、装置及存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110006430A (zh) * 2019-03-26 2019-07-12 智慧航海(青岛)科技有限公司 一种航迹规划算法的优化方法
CN110208819A (zh) * 2019-05-14 2019-09-06 江苏大学 一种多个障碍物三维激光雷达数据的处理方法
CN110262518A (zh) * 2019-07-22 2019-09-20 上海交通大学 基于轨迹拓扑地图和避障的车辆导航方法、系统及介质
US20190385457A1 (en) * 2019-08-07 2019-12-19 Lg Electronics Inc. Obstacle warning method for vehicle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110006430A (zh) * 2019-03-26 2019-07-12 智慧航海(青岛)科技有限公司 一种航迹规划算法的优化方法
CN110208819A (zh) * 2019-05-14 2019-09-06 江苏大学 一种多个障碍物三维激光雷达数据的处理方法
CN110262518A (zh) * 2019-07-22 2019-09-20 上海交通大学 基于轨迹拓扑地图和避障的车辆导航方法、系统及介质
US20190385457A1 (en) * 2019-08-07 2019-12-19 Lg Electronics Inc. Obstacle warning method for vehicle

Also Published As

Publication number Publication date
CN115398272A (zh) 2022-11-25

Similar Documents

Publication Publication Date Title
US20210262808A1 (en) Obstacle avoidance method and apparatus
WO2021135371A1 (fr) Procédé de conduite automatique, dispositif associé et support de stockage lisible par ordinateur
WO2022001773A1 (fr) Procédé et appareil de prédiction de trajectoire
WO2021102955A1 (fr) Procédé et appareil de planification de trajet pour véhicule
CN112230642B (zh) 道路可行驶区域推理方法及装置
US20220289252A1 (en) Operational Design Domain Odd Determining Method and Apparatus and Related Device
WO2022021910A1 (fr) Procédé et appareil de détection de collision de véhicule et support de stockage lisible par ordinateur
WO2021212379A1 (fr) Procédé et appareil de détection de ligne de délimitation de voie
EP4029750A1 (fr) Procédé de présentation de données et dispositif terminal
EP3933439A1 (fr) Procédé de localisation et dispositif de localisation
WO2022001366A1 (fr) Procédé et appareil de détection de ligne de voie
WO2022062825A1 (fr) Procédé, dispositif de commande de véhicule et véhicule
EP4307251A1 (fr) Procédé de mappage, véhicule, support d'informations lisible par ordinateur, et puce
WO2021217646A1 (fr) Procédé et dispositif de détection d'un espace libre pour un véhicule
CN112810603B (zh) 定位方法和相关产品
WO2022052881A1 (fr) Procédé de construction de carte et dispositif informatique
WO2022052872A1 (fr) Procédé et appareil de conduite autonome
CN115330923B (zh) 点云数据渲染方法、装置、车辆、可读存储介质及芯片
US20220309806A1 (en) Road structure detection method and apparatus
WO2022022284A1 (fr) Procédé et appareil de détection d'objet cible
WO2021159397A1 (fr) Procédé de détection et dispositif de détection de région pouvant être parcourue par un véhicule
CN114764980B (zh) 一种车辆转弯路线规划方法及装置
CN114576148B (zh) 液冷系统、应用在液冷系统的控制方法、控制装置及车辆
WO2024055252A1 (fr) Procédé et appareil de fusion de données, et dispositif de conduite intelligente
WO2022033089A1 (fr) Procédé et dispositif permettant de déterminer des informations tridimensionnelles d'un objet qui doit subir une détection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933535

Country of ref document: EP

Kind code of ref document: A1