WO2021097823A1 - Method and device for use in determining a space passable by a vehicle - Google Patents

Method and device for use in determining a space passable by a vehicle

Info

Publication number
WO2021097823A1
Authority
WO
WIPO (PCT)
Prior art keywords
gaussian
vehicle
parameter
target position
gaussian mixture
Prior art date
Application number
PCT/CN2019/120348
Other languages
English (en)
Chinese (zh)
Inventor
周鹏
吴祖光
郑佳
王岩岩
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN201980101534.4A priority Critical patent/CN114556251B/zh
Priority to PCT/CN2019/120348 priority patent/WO2021097823A1/fr
Publication of WO2021097823A1 publication Critical patent/WO2021097823A1/fr

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Definitions

  • This application relates to the field of autonomous driving, and more specifically to a method and device for determining the free space of a vehicle.
  • Autonomous driving is a mainstream application in the field of artificial intelligence.
  • Autonomous driving technology relies on the collaboration of computer vision, radar, monitoring devices, and global positioning systems to allow motor vehicles to drive autonomously without active human operation.
  • Self-driving vehicles use various computing systems to help transport passengers from one location to another. Some autonomous vehicles may require initial or continuous input from an operator (such as a navigator, driver, or passenger). A self-driving vehicle allows the operator to switch from manual mode to self-driving mode or to a mode in between. Since autonomous driving technology does not require a human to drive the motor vehicle, it can in theory effectively avoid human driving errors, reduce the occurrence of traffic accidents, and improve highway transportation efficiency. Therefore, autonomous driving technology is receiving more and more attention.
  • Passable space is a way of describing the surrounding environment of a vehicle.
  • The passable space generally contains information such as other vehicles, pedestrians, and roadsides, and is mainly used to clearly describe the freely drivable space near the autonomous vehicle.
  • the present application provides a method and device for determining a vehicle passable space, which can improve the flexibility of application scenarios.
  • In a first aspect, a method for determining a vehicle passable space is provided. The method includes: obtaining a first Gaussian mixture parameter of a target position; obtaining a first measurement value, where the first measurement value is the distance between the vehicle and the target position measured by the vehicle at a first moment; passing the first Gaussian mixture parameter and the first measurement value through a probability hypothesis density (PHD) model to obtain a second Gaussian mixture parameter; and determining, according to the second Gaussian mixture parameter, the state value of the grid corresponding to the target position, where the state value of the grid indicates whether the target position is passable.
  • the PHD model is a Bayesian statistical algorithm, through which the analysis accuracy can be improved.
  • The vehicle takes the first Gaussian mixture parameter and the first measurement value as input and obtains the second Gaussian mixture parameter after passing them through the PHD model.
  • In other words, the vehicle obtains a higher-precision second Gaussian mixture parameter from the measurement value and the first Gaussian mixture parameter, so that the vehicle can obtain a more accurate grid state value, thereby more accurately determining whether the target position is passable and further improving the safety of vehicle travel.
  • In addition, the embodiment of the present application may not be limited by sensor sensitivity requirements, thereby improving the flexibility of application scenarios.
  • the PHD model includes a detection probability model, and the detection probability model satisfies the Rayleigh distribution.
  • In general, a sensor's detection effect is better in the middle range and poorer at the nearest and farthest ranges.
  • The Rayleigh distribution therefore better matches the measurement behavior of the sensor, so the output second Gaussian mixture parameter is more accurate and the resulting grid state value is more accurate, which helps to determine more accurately whether the target position is passable and further improves the safety of vehicle travel, as illustrated by the sketch below.
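  • As an illustration only (not taken from the patent), a range-dependent detection probability with a Rayleigh shape can be written as follows; the scale parameter sigma and the peak probability p_max are assumed values chosen for the example.

```python
import numpy as np

def detection_probability(distance, sigma=30.0, p_max=0.95):
    """Detection probability that follows the shape of a Rayleigh density:
    low at very short and very long ranges, peaking at distance == sigma.
    sigma and p_max are illustrative values, not taken from the patent."""
    rayleigh = (distance / sigma**2) * np.exp(-(distance**2) / (2 * sigma**2))
    peak = 1.0 / (sigma * np.exp(0.5))  # density value at its maximum, distance == sigma
    return p_max * rayleigh / peak      # rescaled so the maximum equals p_max

# Detection probability at a near, mid-range, and far target
print(detection_probability(np.array([5.0, 30.0, 120.0])))
```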
  • Determining the state value of the grid corresponding to the target position according to the second Gaussian mixture parameter includes: pruning the second Gaussian mixture parameter to obtain a third Gaussian mixture parameter, where the number of Gaussian elements in the third Gaussian mixture parameter is smaller than the number of Gaussian elements in the second Gaussian mixture parameter; and determining the state value of the grid corresponding to the target position according to the third Gaussian mixture parameter.
  • In addition, the vehicle can merge some Gaussian elements, for example Gaussian elements that are very close to each other. This reduces the number of Gaussian elements with small weights, thereby reducing the complexity of subsequent calculations and reducing the impact of clutter. A sketch of both steps follows.
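  • A minimal sketch of this pruning and merging, in the style of common Gaussian-mixture PHD implementations (the pruning threshold and merging distance are assumed values, not values from the patent):

```python
import numpy as np

def prune_and_merge(weights, means, covs, prune_thresh=1e-5, merge_dist=4.0):
    """Prune low-weight Gaussian elements and merge elements that lie close
    together. means are 2-D position vectors and covs their 2x2 covariances.
    The thresholds are illustrative, not taken from the patent."""
    # 1. Pruning: keep only elements whose weight exceeds the threshold.
    keep = [i for i, w in enumerate(weights) if w > prune_thresh]
    weights = [weights[i] for i in keep]
    means = [np.asarray(means[i], dtype=float) for i in keep]
    covs = [np.asarray(covs[i], dtype=float) for i in keep]

    # 2. Merging: greedily group elements within a Mahalanobis distance of the
    #    strongest remaining element and replace each group by one element.
    merged_w, merged_m, merged_c = [], [], []
    remaining = set(range(len(weights)))
    while remaining:
        j = max(remaining, key=lambda i: weights[i])
        inv_cj = np.linalg.inv(covs[j])
        group = [i for i in remaining
                 if (means[i] - means[j]) @ inv_cj @ (means[i] - means[j]) <= merge_dist]
        w = sum(weights[i] for i in group)
        m = sum(weights[i] * means[i] for i in group) / w
        c = sum(weights[i] * (covs[i] + np.outer(means[i] - m, means[i] - m))
                for i in group) / w
        merged_w.append(w)
        merged_m.append(m)
        merged_c.append(c)
        remaining -= set(group)
    return merged_w, merged_m, merged_c
```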
  • Optionally, a fourth Gaussian mixture parameter is obtained through the PHD model according to a second measurement value at a second moment and the third Gaussian mixture parameter. In this way, Gaussian elements whose weights are greater than a preset threshold can be found in the fourth Gaussian mixture parameter, which improves the accuracy of the state value, allows the vehicle to determine more accurately whether the target position is passable, and further improves the safety of vehicle travel.
  • Determining the state value of the grid corresponding to the target position according to the third Gaussian mixture parameter includes: when the weight of a second Gaussian element in the third Gaussian mixture parameter is greater than a preset threshold, determining the weight of the second Gaussian element as the state value of the grid corresponding to the target position.
  • In this way, the vehicle can set a preset threshold (for example, a weight threshold) in advance and use the weight of a Gaussian element in the fourth Gaussian mixture parameter that is greater than the preset threshold as the state value of the grid corresponding to the target position. This improves the accuracy of the state value, allows the vehicle to determine more accurately whether the target location is passable, and further improves the safety of vehicle passage, as sketched below.
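  • For example (the weight threshold and the cell size below are illustrative assumptions), the surviving Gaussian elements can be mapped to grid state values as follows:

```python
def grid_state_values(weights, means, cell_size=1.0, weight_thresh=0.5):
    """Assign grid-cell state values from Gaussian elements whose weight
    exceeds a preset threshold. Each qualifying element contributes its
    weight to the cell containing its mean position; cells not listed are
    treated as passable. Threshold and cell size are illustrative."""
    state = {}
    for w, (x, y) in zip(weights, means):
        if w > weight_thresh:
            cell = (int(x // cell_size), int(y // cell_size))
            state[cell] = max(w, state.get(cell, 0.0))
    return state

# Example: two elements, only the first exceeds the threshold
print(grid_state_values([0.9, 0.2], [(12.3, 4.8), (40.1, 7.5)]))
```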
  • In a second aspect, a device for determining a vehicle's passable space is provided. The device includes: a memory for storing a program; and a processor for executing the program stored in the memory, where, when the program stored in the memory is executed, the processor is configured to perform the method in the first aspect or any one of its possible implementations.
  • In a third aspect, a computer-readable medium is provided. The computer-readable medium stores instructions to be executed by an apparatus for processing data, and the instructions are used to perform the method in the first aspect or any one of its possible implementations.
  • In a fourth aspect, a computer program product containing instructions is provided.
  • When the computer program product runs on a computer, the computer performs the method in the first aspect or any one of its possible implementations.
  • In a fifth aspect, a chip is provided. The chip includes a processor and a data interface.
  • The processor reads, through the data interface, instructions stored in a memory, and performs the method in the first aspect or any one of its possible implementations.
  • the chip may further include a memory in which instructions are stored, and the processor is configured to execute instructions stored on the memory.
  • the processor is configured to execute the method in any one of the implementation manners in the first aspect.
  • the aforementioned chip may specifically be a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).
  • In a sixth aspect, a terminal is provided, and the terminal includes the device of the second aspect described above.
  • In a seventh aspect, a vehicle is provided, and the vehicle includes the device of the second aspect described above.
  • The vehicle takes the first Gaussian mixture parameter and the first measurement value as input and obtains the second Gaussian mixture parameter after passing them through the PHD model.
  • In other words, the vehicle obtains a higher-precision second Gaussian mixture parameter from the first measurement value and the first Gaussian mixture parameter, so that the vehicle can obtain a more accurate grid state value, thereby more accurately determining whether the target position is passable and further improving the safety of vehicle travel.
  • In addition, the embodiments of the present application may not be limited by sensor sensitivity requirements, thereby improving the flexibility of application scenarios.
  • Fig. 1 is a functional block diagram of a vehicle according to an embodiment of the present application
  • Fig. 2 is a schematic diagram of an automatic driving system according to an embodiment of the present application.
  • Fig. 3 is a schematic flowchart of a method for determining a vehicle passable space according to an embodiment of the present application;
  • Fig. 4 is a schematic block diagram of a device for determining a vehicle passable space according to an embodiment of the present application.
  • Fig. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present invention.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • The vehicle 100 can control itself while in the automatic driving mode, and can, through human operation, determine the current state of the vehicle and its surrounding environment, determine the possible behavior of at least one other vehicle in the surrounding environment, determine a confidence level corresponding to the possibility that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information.
  • The vehicle 100 can be set to operate without human interaction.
  • the travel system 102 may include components that provide power movement for the vehicle 100.
  • the propulsion system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121.
  • The engine 118 may be an internal combustion engine, an electric motor, an air-compression engine, or a combination of other types of engines, such as a hybrid engine composed of a gasoline engine and an electric motor, or a hybrid engine composed of an internal combustion engine and an air-compression engine.
  • the engine 118 converts the energy source 119 into mechanical energy.
  • the transmission device 120 can transmit mechanical power from the engine 118 to the wheels 121.
  • the transmission device 120 may include a gearbox, a differential, and a drive shaft.
  • the transmission device 120 may also include other devices, such as a clutch.
  • the drive shaft may include one or more shafts that can be coupled to one or more wheels 121.
  • the laser rangefinder 128 can use laser light to sense objects in the environment where the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • the steering system 132 is operable to adjust the forward direction of the vehicle 100.
  • it may be a steering wheel system in one embodiment.
  • the throttle 134 is used to control the operating speed of the engine 118 and thereby control the speed of the vehicle 100.
  • the braking unit 136 is used to control the vehicle 100 to decelerate.
  • the braking unit 136 may use friction to slow down the wheels 121.
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electric current.
  • the braking unit 136 may also take other forms to slow down the rotation speed of the wheels 121 to control the speed of the vehicle 100.
  • the computer vision system 140 may be operable to process and analyze the images captured by the camera 130 in order to identify objects and/or features in the surrounding environment of the vehicle 100.
  • the objects and/or features may include traffic signals, road boundaries, and obstacles.
  • the computer vision system 140 may use object recognition algorithms, Structure from Motion (SFM) algorithms, video tracking, and other computer vision technologies.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and so on.
  • the route control system 142 is used to determine the travel route of the vehicle 100.
  • the route control system 142 may combine data from the sensor 138, the GPS 122, and one or more predetermined maps to determine the driving route for the vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise cross over potential obstacles in the environment of the vehicle 100.
  • Of course, the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
  • the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripheral devices 108.
  • the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
  • the peripheral device 108 provides a means for the user of the vehicle 100 to interact with the user interface 116.
  • the onboard computer 148 may provide information to the user of the vehicle 100.
  • the user interface 116 can also operate the onboard computer 148 to receive user input.
  • the on-board computer 148 can be operated through a touch screen.
  • the peripheral device 108 may provide a means for the vehicle 100 to communicate with other devices located in the vehicle.
  • the microphone 150 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100.
  • the speaker 152 may output audio to the user of the vehicle 100.
  • the wireless communication system 146 may wirelessly communicate with one or more devices directly or via a communication network.
  • The wireless communication system 146 may use 3G cellular communication, such as CDMA, EVDO, or GSM/GPRS; 4G cellular communication, such as LTE; or 5G cellular communication.
  • the wireless communication system 146 may use WiFi to communicate with a wireless local area network (WLAN).
  • The wireless communication system 146 may communicate directly with a device using an infrared link, Bluetooth, or ZigBee, or using other wireless protocols, such as various vehicle communication systems.
  • For example, the wireless communication system 146 may include one or more dedicated short-range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • the power supply 110 may provide power to various components of the vehicle 100.
  • the power source 110 may be a rechargeable lithium ion or lead-acid battery.
  • One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100.
  • the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
  • the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as the data storage device 114.
  • the computer system 112 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 113 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor may be a dedicated device such as an ASIC or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in the same block, those of ordinary skill in the art should understand that the processor, computer, or memory may actually include multiple processors, computers, or memories that may or may not be stored within the same physical housing.
  • For example, the memory may be a hard disk drive or another storage medium located in a housing different from that of the computer 110. Therefore, a reference to a processor or computer will be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described here, some components, such as the steering component and the deceleration component, may each have their own processor that performs only calculations related to the component-specific function.
  • the processor may be located away from the vehicle and wirelessly communicate with the vehicle.
  • some of the processes described herein are executed on a processor arranged in the vehicle and others are executed by a remote processor, including taking the necessary steps to perform a single manipulation.
  • the data storage device 114 may include instructions 115 (eg, program logic), which may be executed by the processor 113 to perform various functions of the vehicle 100, including those functions described above.
  • The data storage device 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the propulsion system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
  • the data storage device 114 may also store data, such as road maps, route information, the location, direction, and speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 100 and the computer system 112 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • the user interface 116 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and a speaker 152.
  • the computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • The data storage device 114 may exist partially or completely separately from the vehicle 100.
  • the aforementioned components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation to the embodiment of the present invention.
  • An autonomous vehicle traveling on a road can recognize objects in its surrounding environment to determine the adjustment to the current speed.
  • the object may be other vehicles, traffic control equipment, or other types of objects.
  • Each recognized object can be considered independently, and the respective characteristics of the object, such as its current speed, acceleration, and distance from the vehicle, can be used to determine the speed to which the self-driving car is to adjust.
  • Optionally, the self-driving vehicle 100, or the computing device associated with the self-driving vehicle 100, may predict the behavior of the identified object based on the characteristics of the identified object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • Optionally, the recognized objects depend on each other's behavior, so all of the recognized objects can also be considered together to predict the behavior of a single recognized object.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
  • In other words, an autonomous vehicle can determine, based on the predicted behavior of the object, what steady state the vehicle needs to adjust to (for example, accelerating, decelerating, or stopping).
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 on the road on which it is traveling, the curvature of the road, the proximity of static and dynamic objects, and so on.
  • The computing device can also provide instructions to modify the steering angle of the vehicle 100 so that the self-driving car follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects near the self-driving car (for example, cars in adjacent lanes on the road).
  • The above-mentioned vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, trolley, or the like, which is not specifically limited in the embodiments of the present invention.
  • FIG. 2 shows a schematic diagram of the automated driving system.
  • the computer system 101 includes a processor 103, and the processor 103 is coupled to a system bus 105.
  • the processor 103 may be one or more processors, where each processor may include one or more processor cores.
  • the system bus 105 is coupled with an input/output (I/O) bus 113 through a bus bridge 111.
  • the I/O interface 115 is coupled to the I/O bus.
  • The I/O interface 115 communicates with a variety of I/O devices, such as an input device 117 (for example, a keyboard, a mouse, or a touch screen), a media tray 121 (for example, a CD-ROM or a multimedia interface), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture static and dynamic digital video images), and an external USB interface 125.
  • the interface connected to the I/O interface 115 may be a USB interface.
  • The processor 103 may be any conventional processor, including a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, or a combination of the foregoing.
  • The processor may be a dedicated device such as an application-specific integrated circuit (ASIC).
  • the processor 103 may be a neural network processor or a combination of a neural network processor and the foregoing traditional processors.
  • The computer system 101 may be located far away from the autonomous driving vehicle and may communicate wirelessly with the autonomous driving vehicle.
  • some of the processes described herein are executed on a processor provided in an autonomous vehicle, and others are executed by a remote processor, including taking actions required to perform a single manipulation.
  • the hard disk drive interface is coupled to the system bus 105.
  • The hard disk drive interface is connected to a hard disk drive.
  • the system memory 135 is coupled to the system bus 105.
  • the data running in the system memory 135 may include the operating system 137 and application programs 143 of the computer 101.
  • The kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources, and it interacts directly with the hardware. The operating system kernel usually runs processes, provides inter-process communication, and provides CPU time-slice management, interrupt handling, memory management, I/O management, and so on.
  • The application programs 141 include programs related to controlling the self-driving car, such as a program that manages the interaction between the autonomous vehicle and obstacles on the road, a program that controls the route or speed of the autonomous vehicle, and a program that controls the interaction between the autonomous vehicle and other autonomous vehicles on the road.
  • The application program 141 also exists on the system of the deploying server 149. In one embodiment, when the application program 141 needs to be executed, the computer system 101 may download the application program 141 from the deploying server 149.
  • A grid map (gridmap) discretizes an area. For example, a 100 m x 100 m area divided into 1 m x 1 m grid cells yields 100 x 100, that is, 10,000 cells in total.
  • Each grid cell corresponds to a state value, and according to the state value it is determined whether the area covered by the corresponding cell is freespace. A minimal grid-map sketch follows.
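  • As an illustration (the initial state value of 0.0 and the origin convention below are assumptions made for this example):

```python
import numpy as np

AREA_SIZE = 100.0  # metres; the discretized area is AREA_SIZE x AREA_SIZE
CELL_SIZE = 1.0    # metres; each grid cell is CELL_SIZE x CELL_SIZE
N = int(AREA_SIZE / CELL_SIZE)  # 100 cells per side, 10,000 cells in total
grid_map = np.zeros((N, N))     # one state value per cell, initialized to 0.0

def world_to_cell(x, y, origin=(0.0, 0.0)):
    """Convert a position in metres to the indices of the grid cell containing it."""
    return int((x - origin[0]) // CELL_SIZE), int((y - origin[1]) // CELL_SIZE)
```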
  • Generally, there are two ways to determine the state value corresponding to each grid cell: one is based on the occupancy rate, and the other is based on deep learning.
  • Determining the state value corresponding to each grid cell based on the occupancy rate is done as follows:
  • A grid map is initialized; the sensor receives obstacle information, and the obstacle information is projected into the grid map through an algorithm.
  • In this approach, the current measurement value is treated as independent of the historical measurement values, as illustrated in the sketch below.
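  • Reusing grid_map and world_to_cell from the sketch above, a simplified illustration of this measurement-independent update (the occupied value of 1.0 is an assumption, not a value from the patent):

```python
def update_grid_with_measurements(grid_map, measurements):
    """Project obstacle detections (x, y positions in metres) into the grid map.
    Because each measurement is treated independently of earlier measurements,
    the update simply marks the cell containing each detection as occupied."""
    for x, y in measurements:
        i, j = world_to_cell(x, y)
        if 0 <= i < grid_map.shape[0] and 0 <= j < grid_map.shape[1]:
            grid_map[i, j] = 1.0  # 1.0 = occupied, 0.0 = free/unknown (illustrative)
    return grid_map

# Example: two obstacle detections reported by the sensor
update_grid_with_measurements(grid_map, [(12.4, 55.0), (70.2, 10.9)])
```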
  • Determining the state value corresponding to each grid based on deep learning is as follows:
  • The neural network may be, for example, a residual network (ResNet).
  • The image preprocessing module preprocesses the image: specifically, it scales the image, transforms it into the input required by the neural network, and sends the scaled image to the neural network input port.
  • The image is processed in the image coordinate system, and the neural network outputs the freespace area in the picture, as sketched below.
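  • A small sketch of the preprocessing step (the target resolution, the [0, 1] normalization, and the use of OpenCV are assumptions made for illustration):

```python
import numpy as np
import cv2  # OpenCV is assumed to be available for image resizing

def preprocess_image(image, target_size=(512, 256)):
    """Scale an image to the resolution expected by the freespace network and
    normalize pixel values to [0, 1]; the network then predicts the freespace
    region in image coordinates."""
    resized = cv2.resize(image, target_size)   # target_size is (width, height)
    return resized.astype(np.float32) / 255.0  # HWC float tensor for the network
```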
  • Fig. 3 shows a schematic flowchart of a method for determining a vehicle passable space according to an embodiment of the present application.
  • the execution subject of the embodiments of the present application may be a vehicle, or a terminal in the vehicle, or a processing module in the vehicle.
  • For example, the execution subject may be the computer system 112 shown in FIG. 1. For convenience of description, the following embodiments are described with the vehicle as the execution subject, but the present application is not limited to this.
  • the vehicle obtains the first Gaussian mixture parameter of the target position.
  • the first Gaussian mixture parameter of the target position may be obtained empirically, or may be obtained from the last measurement of the vehicle, which is not limited in this application.
  • the location of the target may be the location of obstacles (for example, other vehicles, pedestrians, roadsides, etc.).
  • There may be one target position or there may be multiple target positions.
  • the embodiment of the present application takes the location of a certain target as an example for description, but the present application is not limited to this.
  • the number of targets is the number of Gaussian elements in the first Gaussian mixture function.
  • the vehicle obtains a first measurement value, where the first measurement value is the distance between the vehicle and the target position measured by the vehicle at the first moment.
  • the vehicle can measure the distance between the vehicle and the target location through a sensor system (for example, the sensor system 104 shown in FIG. 1).
  • the vehicle uses the target position as the center of mass coordinates to measure the distance between the target position and the vehicle.
  • the vehicle obtains the second Gaussian mixture parameter through a probability hypothesis density (PHD) model using the first Gaussian mixture parameter and the first measured value.
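  • The patent does not spell out the update equations at this point. Purely as an illustration, a one-dimensional Gaussian-mixture PHD measurement update in the spirit of a standard GM-PHD filter is sketched below; the detection probability, measurement noise, and clutter density are assumed values, and the model actually used in this application may differ (for example, by using the Rayleigh-shaped detection probability discussed earlier).

```python
import numpy as np

def gm_phd_update(weights, means, variances, measurements,
                  p_d=0.9, meas_var=0.25, clutter_density=1e-3):
    """One measurement update of a 1-D Gaussian-mixture PHD filter.
    The inputs describe the prior mixture (the "first Gaussian mixture parameter");
    the returned arrays describe the updated mixture (the "second" one).
    p_d, meas_var and clutter_density are illustrative assumptions."""
    weights = np.asarray(weights, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)

    # Missed-detection components keep the prior means with reduced weights.
    new_w = [(1.0 - p_d) * weights]
    new_m = [means]
    new_v = [variances]

    # For each range measurement, perform a per-component Kalman update.
    for z in measurements:
        s = variances + meas_var                 # innovation variance
        k = variances / s                        # Kalman gain
        lik = np.exp(-0.5 * (z - means) ** 2 / s) / np.sqrt(2.0 * np.pi * s)
        w = p_d * weights * lik
        w = w / (clutter_density + w.sum())      # normalize against clutter
        new_w.append(w)
        new_m.append(means + k * (z - means))
        new_v.append((1.0 - k) * variances)

    return np.concatenate(new_w), np.concatenate(new_m), np.concatenate(new_v)

# Example: a prior mixture with one Gaussian element and one range measurement.
w2, m2, v2 = gm_phd_update(weights=[0.8], means=[20.0], variances=[4.0],
                           measurements=[19.3])
print(w2, m2, v2)
```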
  • the apparatus 400 may include units for performing various operations in the foregoing method embodiments.
  • each unit in the device 400 is designed to implement the corresponding process of any of the foregoing methods.
  • the device 400 includes a transceiver module 410 and a processing module 420.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present application relates to the field of artificial intelligence, and specifically provides a method and device for use in determining a space passable by a vehicle. A vehicle, taking a first Gaussian mixture parameter and a first measured value as inputs, obtains a second Gaussian mixture parameter by means of a probability hypothesis density (PHD) model. In other words, the vehicle obtains a second Gaussian mixture parameter of increased precision from the first measured value and the first Gaussian mixture parameter; the vehicle can thus obtain a more accurate grid state value, which makes it possible to determine accurately whether a target location is passable, thereby increasing the safety of vehicle travel. Moreover, the embodiments of the present invention are not limited by a sensor sensitivity requirement, thereby increasing the flexibility of application scenarios.
PCT/CN2019/120348 2019-11-22 2019-11-22 Method and device for use in determining a space passable by a vehicle WO2021097823A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980101534.4A CN114556251B (zh) 2019-11-22 2019-11-22 Method and device for determining a vehicle passable space
PCT/CN2019/120348 WO2021097823A1 (fr) 2019-11-22 2019-11-22 Method and device for use in determining a space passable by a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/120348 WO2021097823A1 (fr) 2019-11-22 2019-11-22 Method and device for use in determining a space passable by a vehicle

Publications (1)

Publication Number Publication Date
WO2021097823A1 (fr) 2021-05-27

Family

ID=75981151

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/120348 WO2021097823A1 (fr) 2019-11-22 2019-11-22 Method and device for use in determining a space passable by a vehicle

Country Status (2)

Country Link
CN (1) CN114556251B (fr)
WO (1) WO2021097823A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108594812A (zh) * 2018-04-16 2018-09-28 电子科技大学 Smooth trajectory planning method for intelligent vehicles on structured roads
CN108764373A (zh) * 2018-06-08 2018-11-06 北京领骏科技有限公司 Sensor data filtering and fusion method for automatic driving
US20190080604A1 (en) * 2017-09-08 2019-03-14 Connaught Electronics Ltd. Freespace detection in a driver assistance system of a motor vehicle with a neural network
CN109901574A (zh) * 2019-01-28 2019-06-18 华为技术有限公司 Automatic driving method and device
CN109937343A (zh) * 2017-06-22 2019-06-25 百度时代网络技术(北京)有限公司 Evaluation framework for predicted trajectories in autonomous driving vehicle traffic prediction

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020095292A (ja) * 2017-02-24 2020-06-18 株式会社日立製作所 Congestion prediction system and pedestrian simulation device
US10474157B2 (en) * 2017-06-06 2019-11-12 Baidu Usa Llc Data-based control error detection and parameter compensation system
DE102017211613A1 (de) * 2017-07-07 2019-01-10 Robert Bosch Gmbh Method for verifying a digital map of a more highly automated vehicle (HAF), in particular a highly automated vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109937343A (zh) * 2017-06-22 2019-06-25 百度时代网络技术(北京)有限公司 Evaluation framework for predicted trajectories in autonomous driving vehicle traffic prediction
US20190080604A1 (en) * 2017-09-08 2019-03-14 Connaught Electronics Ltd. Freespace detection in a driver assistance system of a motor vehicle with a neural network
CN108594812A (zh) * 2018-04-16 2018-09-28 电子科技大学 Smooth trajectory planning method for intelligent vehicles on structured roads
CN108764373A (zh) * 2018-06-08 2018-11-06 北京领骏科技有限公司 Sensor data filtering and fusion method for automatic driving
CN109901574A (zh) * 2019-01-28 2019-06-18 华为技术有限公司 Automatic driving method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG YOU, ZHOU PENG, PU DONG-BING, HE JI-ZHE: "Vehicle detection and tracking based on DM6437", XINXI JISHU = INFORMATION TECHNOLOGY, XINXI CHANYEBU DIANZI XINXI ZHONGXIN, CN, no. 3, 1 March 2013 (2013-03-01), CN, pages 78 - 80, XP055813184, ISSN: 1009-2552, DOI: 10.13274/j.cnki.hdzj.2013.03.037 *

Also Published As

Publication number Publication date
CN114556251B (zh) 2023-11-17
CN114556251A (zh) 2022-05-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19953304

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19953304

Country of ref document: EP

Kind code of ref document: A1