WO2022061725A1 - Procédé et appareil d'observation d'élément de circulation - Google Patents


Info

Publication number: WO2022061725A1
Authority: WO (WIPO, PCT)
Prior art keywords: observation data, traffic, traffic element, vehicle, multiple sets
Application number: PCT/CN2020/117785
Other languages: English (en), Chinese (zh)
Inventors: 卢远志, 陈灿平, 陈保成, 赵剑
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司
Priority to PCT/CN2020/117785: WO2022061725A1
Priority to CN202080004590.9A: CN112639910B
Priority to EP20954577.1A: EP4207133A4
Publication of WO2022061725A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0112: Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G1/096725: Systems involving transmission of highway information, e.g. weather, speed limits, where the received information generates an automatic action on the vehicle control
    • G08G1/0129: Traffic data processing for creating historical data or processing based on historical data
    • G08G1/0133: Traffic data processing for classifying traffic situation
    • G08G1/0141: Measuring and analyzing of parameters relative to traffic conditions for traffic information dissemination
    • G08G1/09623: Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G08G1/096758: Systems involving transmission of highway information where no selection takes place on the transmitted or the received information
    • G08G1/096775: Systems involving transmission of highway information where the origin of the information is a central station
    • G08G1/164: Anti-collision systems; centralised systems, e.g. external to vehicles

Definitions

  • the present application relates to the field of autonomous driving, and more particularly, to a method and apparatus for observing traffic elements.
  • Autonomous driving is a mainstream application in the field of artificial intelligence.
  • Autonomous driving technology relies on the cooperation of computer vision, radar, monitoring devices, and global positioning systems to allow motor vehicles to drive autonomously without active human operation.
  • Autonomous vehicles use various computing systems to help transport passengers from one location to another. Some autonomous vehicles may require initial or continuous input from an operator, such as a pilot, driver, or passenger.
  • An autonomous vehicle permits the operator to switch from a manual operating mode to an autonomous driving mode, or to a mode in between. Because autonomous driving technology does not require a human to drive the vehicle, it can in theory effectively avoid human driving errors, reduce the occurrence of traffic accidents, and improve the efficiency of highway transportation. Autonomous driving technology is therefore receiving more and more attention.
  • In the process of fusing multiple sets of first observation data, a cloud server may misassociate the data: multiple sets of observation data of the same traffic element may be identified as observation data of different traffic elements, or observation data of different traffic elements may be identified as observation data of the same traffic element, resulting in inaccurate second observation data of the traffic element.
  • the present application provides a method and device for observing traffic elements, so as to improve the accuracy of acquiring observation data of traffic elements.
  • In a first aspect, a method for observing traffic elements is provided, comprising: receiving multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle in the multiple vehicles collects one set of first observation data in the multiple sets, and the first observation data indicates the change of the coordinate position of the traffic element over time and/or the change of the speed of the traffic element over time; performing time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple sets of processed observation data; and determining, based on the processed multiple sets of observation data, second observation data of the traffic element observed by the multiple vehicles.
  • Determining the second observation data of the traffic element from the processed observation data improves the accuracy of the acquired observation data. It avoids the prior-art approach of directly fusing the multiple sets of first observation data to determine the second observation data, which results in inaccurate second observation data of the traffic element.
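As one illustration of the final step, once the per-vehicle sets have been time-synchronized and spatially corrected, they can be combined into a single set of second observation data. The patent does not prescribe a specific fusion rule; the simple averaging below, and all function and variable names, are assumptions made for the sketch.

```python
# Hypothetical sketch: combine processed (aligned) per-vehicle observation
# sets into "second observation data" by averaging the samples at each
# common time point. The averaging rule is an illustrative assumption,
# not the patent's prescribed method.

def fuse(processed_sets):
    """Average aligned samples across vehicles, one value per time point."""
    n = len(processed_sets)
    return [sum(values) / n for values in zip(*processed_sets)]
```

For example, two already-aligned tracks `[0.0, 2.0]` and `[2.0, 4.0]` would fuse into a single averaged track per time point.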
  • In some implementations, performing the time synchronization processing on the first observation data sent by the multiple vehicles to obtain the multiple sets of processed observation data includes: determining the time deviation of the multiple sets of first observation data; and adjusting each set of first observation data in the multiple sets based on the time deviation, to obtain the processed observation data, where each set of observation data in the processed observation data is time-synchronized.
  • Adjusting each set of first observation data based on the time deviation of the multiple sets, so that each set of processed observation data is time-synchronized, improves the accuracy of the observation data of the traffic element.
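The time-synchronization step above can be sketched as follows. The patent does not specify how the time deviation is estimated or applied; the sketch below assumes the deviation of each set is already known, subtracts it from that set's timestamps, and then linearly interpolates every set onto a common time base. All names and the interpolation choice are illustrative assumptions.

```python
# Hypothetical sketch of time synchronization: each vehicle reports
# (timestamp, position) samples for the same traffic element. Each set is
# shifted by its estimated clock offset and resampled at common timestamps.

def interpolate(samples, t):
    """Linearly interpolate the position at time t from sorted (time, pos) samples."""
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return p0 + w * (p1 - p0)
    raise ValueError("t outside sample range")

def synchronize(observation_sets, offsets, common_times):
    """Shift each set by its estimated time deviation, then resample
    all sets at the same common timestamps."""
    synced = []
    for samples, dt in zip(observation_sets, offsets):
        shifted = [(t - dt, p) for t, p in samples]
        synced.append([interpolate(shifted, t) for t in common_times])
    return synced
```

After this step, the samples of all vehicles refer to the same time instants, so a per-timestamp comparison or fusion becomes meaningful.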
  • In some implementations, performing the spatial correction processing on the multiple sets of first observation data to obtain the multiple sets of processed observation data includes: determining, in a preset coordinate system, the coordinates of the traffic element at the different time points indicated by each set of first observation data; and representing the coordinate values of the traffic element that fall within each preset coordinate range by the target coordinate value corresponding to that range, to obtain the coordinates of the traffic element within each preset coordinate range in the coordinate system.
  • The processed observation data includes the coordinates of the traffic element within each preset coordinate range in the coordinate system.
  • Representing the coordinate values of the traffic element within each preset coordinate range by the target coordinate value corresponding to that range aligns the observations of different vehicles in space, which improves the accuracy of the observation data of the traffic element.
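The spatial-correction step can be illustrated with a minimal sketch. The patent leaves the preset coordinate ranges and target coordinate values unspecified; the sketch assumes a uniform grid where each range is a grid cell and the target value is the cell centre. The names and the cell size are assumptions.

```python
import math

# Hypothetical sketch of spatial correction: divide the preset coordinate
# system into uniform cells (the "preset coordinate ranges") and replace
# every observed coordinate by the target coordinate value of its cell
# (here, the cell centre). Cell size is an illustrative assumption.

def snap_to_cell(x, y, cell=1.0):
    """Replace (x, y) by the centre of the grid cell containing it."""
    cx = (math.floor(x / cell) + 0.5) * cell
    cy = (math.floor(y / cell) + 0.5) * cell
    return cx, cy

def spatial_correct(track, cell=1.0):
    """Apply the per-range target coordinate to every point of a trajectory."""
    return [snap_to_cell(x, y, cell) for x, y in track]
```

With this quantization, slightly different coordinates reported by different vehicles for the same traffic element map to the same target value, so they can be recognized as observations of one element.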
  • The first observation data includes at least one of: the type of the target, the motion state of the target, the motion trajectory of the target, and the size of the target.
  • When the target object is a traffic light, the first observation data further includes timing information of the traffic light.
  • the multiple sets of first observation data are collected by on-board sensors in each of the multiple vehicles, and processed by a multi-domain controller.
  • Since the first observation data is collected by the vehicle-mounted sensors and processed by the multi-domain controller, no additional data collection devices or data processing devices need to be added, which helps avoid increasing the cost.
  • a device for observing traffic elements is provided, and the device may be a computing device or a chip in the computing device.
  • the apparatus may include a processing unit and a receiving unit.
  • the processing unit may be a processor and the receiving unit may be a communication interface.
  • the apparatus may further include a storage unit, and when the apparatus is a computing device, the storage unit may be a memory.
  • the storage unit is used to store instructions, and the processing unit executes the instructions stored in the storage unit to cause the computing device to perform the method in the first aspect.
  • the processing unit may be a processor, and the receiving unit may be an input/output interface, a pin or a circuit, etc.; the processing unit executes the instructions stored in the storage unit, to cause the computing device to perform the method of the first aspect.
  • The storage unit may be a storage unit in the chip (for example, a register, a cache, etc.), or a storage unit located outside the chip in the computing device (for example, a read-only memory, a random access memory, etc.).
  • The above-mentioned memory is coupled with the processor; it can be understood that the memory may be located inside the processor, or located outside the processor and thus independent of the processor.
  • a computer program product comprising: computer program code, when the computer program code is run on a computer, causing the computer to perform the methods of the above aspects.
  • The above computer program code may be stored in whole or in part on a first storage medium, where the first storage medium may be packaged together with the processor or packaged separately from the processor, which is not specifically limited in this embodiment of the present application.
  • a computer-readable medium stores program codes, which, when executed on a computer, cause the computer to execute the methods in the above-mentioned aspects.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a system 300 including an autonomous driving vehicle and a cloud service center to which the embodiments of the present application are applicable.
  • FIG. 4 is a flowchart of a traffic element observation method according to an embodiment of the present application.
  • Figure 5 shows a simulation diagram of the observation data collected before time synchronization processing.
  • FIG. 6 shows a simulation diagram of the observation data collected after time synchronization processing.
  • FIG. 7 is a flowchart of a traffic element observation method according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of an observation device of a traffic element according to an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • While in an autonomous driving mode, the vehicle 100 can control itself without human manipulation: it can determine the current state of the vehicle and its surrounding environment, determine the likely behavior of at least one other vehicle in that environment, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information.
  • the vehicle 100 may be placed to operate without human interaction.
  • Vehicle 100 may include various subsystems, such as travel system 102 , sensor system 104 , control system 106 , one or more peripherals 108 and power supply 110 , computer system 112 , and user interface 116 .
  • vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. Additionally, each of the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
  • the travel system 102 may include components that provide powered motion for the vehicle 100 .
  • travel system 102 may include engine 118 , energy source 119 , transmission 120 , and wheels/tires 121 .
  • Engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a gasoline engine and electric motor hybrid engine, an internal combustion engine and an air compression engine hybrid engine.
  • Engine 118 converts energy source 119 into mechanical energy.
  • Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity.
  • the energy source 119 may also provide energy to other systems of the vehicle 100 .
  • Transmission 120 may transmit mechanical power from engine 118 to wheels 121 .
  • Transmission 120 may include a gearbox, a differential, and a driveshaft.
  • transmission 120 may also include other devices, such as clutches.
  • the drive shaft may include one or more axles that may be coupled to one or more wheels 121 .
  • the sensor system 104 may include several sensors that sense information about the environment surrounding the vehicle 100 .
  • the sensor system 104 may include a positioning system 122 (the positioning system may be a global positioning system (GPS) system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 124, Radar 126 , laser rangefinder 128 and camera 130 .
  • the sensor system 104 may also include sensors of the internal systems of the vehicle 100 being monitored (eg, an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). This detection and identification is a critical function for the safe operation of the autonomous vehicle 100 .
  • the positioning system 122 may be used to estimate the geographic location of the vehicle 100 .
  • the IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration.
  • IMU 124 may be a combination of an accelerometer and a gyroscope.
  • Radar 126 may utilize radio signals to sense objects within the surrounding environment of vehicle 100 .
  • the radar 126 may also be used to sense one or more of the target's speed, position, and heading.
  • the laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located.
  • the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
  • Camera 130 may be used to capture multiple images of the surrounding environment of vehicle 100 .
  • Camera 130 may be a still camera or a video camera.
  • Control system 106 controls the operation of the vehicle 100 and its components.
  • Control system 106 may include various elements including steering system 132 , throttle 134 , braking unit 136 , computer vision system 140 , route control system 142 , and obstacle avoidance system 144 .
  • the steering system 132 is operable to adjust the heading of the vehicle 100 .
  • it may be a steering wheel system.
  • the throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100 .
  • the braking unit 136 is used to control the deceleration of the vehicle 100 .
  • the braking unit 136 may use friction to slow the wheels 121 .
  • the braking unit 136 may convert the kinetic energy of the wheels 121 into electrical current.
  • the braking unit 136 may also take other forms to slow the wheels 121 to control the speed of the vehicle 100 .
  • Computer vision system 140 may be operable to process and analyze images captured by camera 130 in order to identify objects and/or features in the environment surrounding vehicle 100 .
  • the objects and/or features may include traffic signals, road boundaries and obstacles.
  • Computer vision system 140 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques.
  • the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, and the like.
  • the route control system 142 is used to determine the travel route of the vehicle 100 .
  • route control system 142 may combine data from sensors, GPS 122, and one or more predetermined maps to determine a driving route for vehicle 100.
  • the obstacle avoidance system 144 is used to identify, evaluate and avoid or otherwise traverse potential obstacles in the environment of the vehicle 100 .
  • control system 106 may additionally or alternatively include components other than those shown and described. Alternatively, some of the components shown above may be reduced.
  • Peripherals 108 may include a wireless communication system 146 , an onboard computer 148 , a microphone 150 and/or a speaker 152 .
  • peripherals 108 provide a means for a user of vehicle 100 to interact with user interface 116 .
  • the onboard computer 148 may provide information to the user of the vehicle 100 .
  • User interface 116 may also operate on-board computer 148 to receive user input.
  • the onboard computer 148 can be operated via a touch screen.
  • peripheral devices 108 may provide a means for vehicle 100 to communicate with other devices located within the vehicle.
  • microphone 150 may receive audio (eg, voice commands or other audio input) from a user of vehicle 100 .
  • speakers 152 may output audio to a user of vehicle 100 .
  • Wireless communication system 146 may wirelessly communicate with one or more devices, either directly or via a communication network.
  • Wireless communication system 146 may use third generation (3G) cellular communication, such as code division multiple access (CDMA) or Global System for Mobile Communications (GSM)/GPRS; fourth generation (4G) communication, such as LTE; or fifth generation (5G) communication.
  • the wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi.
  • the wireless communication system 146 may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
  • Other wireless protocols may also be used, such as various vehicle communication systems; for example, wireless communication system 146 may include one or more dedicated short range communications (DSRC) devices, which may include public and/or private data communications between vehicles and/or roadside stations.
  • the power supply 110 may provide power to various components of the vehicle 100 .
  • the power source 110 may be a rechargeable lithium-ion or lead-acid battery.
  • One or more battery packs of such a battery may be configured as a power source to provide power to various components of the vehicle 100 .
  • power source 110 and energy source 119 may be implemented together, such as in some all-electric vehicles.
  • Computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as data memory 114 .
  • Computer system 112 may also be multiple computing devices that control individual components or subsystems of vehicle 100 in a distributed fashion.
  • the processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • Although FIG. 1 functionally illustrates the processor, memory, and other elements of the computer 110 in the same block, one of ordinary skill in the art will understand that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical enclosure.
  • the memory may be a hard drive or other storage medium located within an enclosure other than computer 110 .
  • reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel.
  • some components such as the steering and deceleration components may each have their own processor that only performs computations related to component-specific functions .
  • a processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed within the vehicle while others are performed by a remote processor, including taking steps necessary to perform a single maneuver.
  • the memory 114 may contain instructions 115 (eg, program logic) executable by the processor 113 to perform various functions of the vehicle 100 , including those described above.
  • Memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of travel system 102, sensor system 104, control system 106, and peripherals 108.
  • memory 114 may store data such as road maps, route information, vehicle location, direction, speed, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous and/or manual modes.
  • the above-mentioned processor 113 may also execute the planning scheme for the longitudinal motion parameters of the vehicle according to the embodiments of the present application, so as to help the vehicle to plan the longitudinal motion parameters.
  • For the specific longitudinal motion parameter planning method, reference may be made to the description of FIG. 3 below; details are not repeated here for brevity.
  • a user interface 116 for providing information to or receiving information from a user of the vehicle 100 .
  • user interface 116 may include one or more input/output devices within the set of peripheral devices 108 , such as wireless communication system 146 , onboard computer 148 , microphone 150 and speaker 152 .
  • Computer system 112 may control functions of vehicle 100 based on input received from various subsystems (eg, travel system 102 , sensor system 104 , and control system 106 ) and from user interface 116 .
  • computer system 112 may utilize input from control system 106 in order to control steering unit 132 to avoid obstacles detected by sensor system 104 and obstacle avoidance system 144 .
  • computer system 112 is operable to provide control of various aspects of vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • memory 114 may exist partially or completely separate from vehicle 100 .
  • the above-described components may be communicatively coupled together in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation on the embodiments of the present application.
  • An autonomous vehicle traveling on a road can recognize objects within its surroundings to determine adjustments to current speed.
  • the objects may be other vehicles, traffic control equipment, or other types of objects.
  • Each identified object may be considered independently, and the object's respective characteristics, such as its current speed, acceleration, and distance from the vehicle, may be used to determine the speed to which the autonomous vehicle is to adjust.
  • The autonomous vehicle 100, or a computing device associated with the autonomous vehicle 100 (e.g., computer system 112, computer vision system 140, or memory 114 of FIG. 1), may predict the behavior of an identified object based on the characteristics of the object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.).
  • The behavior of each identified object depends on the behavior of the others, so the behavior of a single identified object can also be predicted by considering all of the identified objects together.
  • the vehicle 100 can adjust its speed based on the predicted behavior of the identified object.
  • The autonomous vehicle can determine that the vehicle will need to adjust (e.g., accelerate, decelerate, or stop) to a steady state based on the predicted behavior of the object.
  • other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, the proximity of static and dynamic objects, and the like.
  • The computing device may also provide instructions to modify the steering angle of the vehicle 100, so that the autonomous vehicle follows a given trajectory and/or maintains safe lateral and longitudinal distances from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
  • The above-mentioned vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a tram, a golf cart, a train, a cart, or the like; the embodiments of the present application are not particularly limited in this regard.
  • FIG. 2 is a schematic diagram of a suitable automatic driving system according to an embodiment of the present application.
  • the computer system 101 includes a processor 103 , and the processor 103 is coupled to a system bus 105 .
  • the processor 103 may be one or more processors, each of which may include one or more processor cores.
  • a video adapter 107 which can drive a display 109, is coupled to the system bus 105.
  • the system bus 105 is coupled to an input/output (I/O) bus 113 through a bus bridge 111 .
  • I/O interface 115 is coupled to the I/O bus.
  • I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen), a media tray 121 (e.g., CD-ROM, multimedia interface), a transceiver 123 (which can transmit and/or receive radio communication signals), a camera 155 (which can capture static images and dynamic digital video images), and an external USB interface 125.
  • Optionally, the interface connected to the I/O interface 115 may be a USB interface.
  • The processor 103 may be any conventional processor, including a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, or a combination of the above.
  • the processor may be a special purpose device such as an application specific integrated circuit ASIC.
  • the processor 103 may be a neural network processor or a combination of a neural network processor and the above-mentioned conventional processors.
  • computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle.
  • some of the processes described herein are performed on a processor disposed within the autonomous vehicle, others are performed by a remote processor, including taking actions required to perform a single maneuver.
  • Network interface 129 is a hardware network interface, such as a network card.
  • the network 127 may be an external network, such as the Internet, or an internal network, such as an Ethernet network or a virtual private network (Virtual Private Network, VPN).
  • the network 127 may also be a wireless network, such as a Wi-Fi network, a cellular network, and the like.
  • the hard disk drive interface is coupled to the system bus 105 .
  • the hard drive interface is connected to the hard drive.
  • System memory 135 is coupled to system bus 105 . Data running in system memory 135 may include operating system 137 and application programs 143 of computer 101 .
  • the operating system includes a shell 139 and a kernel 141 .
  • Shell 139 is an interface between the user and the kernel of the operating system.
  • Shell 139 is the outermost layer of the operating system.
  • Shell 139 manages the interaction between the user and the operating system: waiting for user input, interpreting user input to the operating system, and processing various operating system outputs.
  • Kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with hardware, the operating system kernel usually runs processes and provides inter-process communication, providing CPU time slice management, interrupts, memory management, IO management, and more.
  • Application 143 includes programs that control the autonomous driving of the vehicle, such as programs that manage the interaction between the autonomous vehicle and obstacles on the road, programs that control the route or speed of the autonomous vehicle, and programs that control the interaction between the autonomous vehicle and other autonomous vehicles on the road.
  • Application 143 also exists on the system of software deploying server 149 .
  • computer system 101 may download application 143 from software deploying server 149 when application 143 needs to be executed.
  • the above-mentioned application program may further include an application program corresponding to the target object perception scheme provided by the embodiments of the present application; the target object perception scheme of the embodiments of the present application will be described in detail below and, for the sake of brevity, will not be repeated here.
  • Sensor 153 is associated with computer system 101 .
  • the sensor 153 is used to detect the environment around the computer 101 .
  • the sensor 153 can detect objects, such as animals, vehicles, obstacles, etc., and can further detect the surrounding environment of those objects, such as the environment around an animal, other animals appearing around the animal, weather conditions, the brightness of the surrounding environment, etc.
  • the sensors may be lidars, cameras, infrared sensors, chemical detectors, microphones, and the like.
  • The vehicles and driving systems to which the embodiments of the present application are applicable are described above with reference to FIG. 1 and FIG. 2.
  • Next, with reference to FIG. 3, a system including a vehicle and a cloud service center is used as an example to introduce the applicable scenarios of the embodiments of the present application.
  • FIG. 3 is a schematic diagram of a system 300 including an autonomous driving vehicle and a cloud service center to which the embodiments of the present application are applicable.
  • Cloud service center 310 may receive information from autonomous vehicle 330 and autonomous vehicle 331 via network 320, such as a wireless communication network.
  • the above-mentioned received information may be the position of the target, the speed of the target, etc. sent by the autonomous vehicle 330 and/or the autonomous vehicle 331 .
  • the target objects may be traffic elements collected during the driving of the vehicle that collects data, such as other vehicles, pedestrians, traffic lights, and the like.
  • the cloud service center 310 controls the autonomous vehicles 330 and 331 by running a program related to controlling the autonomous driving of the vehicle stored in the cloud service center 310 according to the received data.
  • Programs related to controlling the autonomous driving of vehicles can be programs that manage the interaction between the autonomous vehicle and obstacles on the road, programs that control the route or speed of the autonomous vehicle, and programs that control the interaction between the autonomous vehicle and other autonomous vehicles on the road.
  • the network 320 provides portions of the map to the autonomous vehicle 330 or 331.
  • operations may be divided between different locations or centers.
  • multiple cloud service centers may receive, validate, combine, and/or transmit information reports.
  • Information reports and/or sensor data may also be sent between autonomous vehicles in some examples. Other configurations are also possible.
  • the hub sends the autonomous vehicle suggested solutions regarding possible driving situations within the system 300 (e.g., informing it of an obstacle ahead and how to get around it).
  • a cloud service center can assist the vehicle in determining how to proceed when faced with certain obstacles within the environment.
  • the cloud service center sends a response to the autonomous vehicle indicating how the vehicle should behave in a given scenario.
  • the cloud service center can confirm the presence of a temporary stop sign ahead on the road based on the collected sensor data, and can also determine that a lane is closed due to construction based on a "lane closed" sign and sensor data from the construction vehicles.
  • the cloud service center sends a suggested operating mode for the autonomous vehicle to pass the obstacle (e.g., instructing the vehicle to change lanes onto another road).
  • when the cloud service center observes the video stream within its operating environment and has confirmed that the self-driving vehicle can safely and successfully traverse the obstacle, the operating steps used by the self-driving vehicle can be added to the driving information map. Accordingly, this information can be sent to other vehicles in the area that may encounter the same obstacle, in order to assist those vehicles not only in recognizing the closed lane but also in knowing how to pass it.
  • a scheme of collaboratively collecting observation data of traffic elements in a traffic scene is usually adopted.
  • however, the collected data has certain errors in time and space. In the process of data fusion, this may lead to the observation data of the same traffic element being identified as observation data of different traffic elements, or the observation data of different traffic elements being identified as observation data of the same traffic element, resulting in inaccurate observations of the traffic elements.
  • therefore, the present application provides a new traffic element observation solution: the observation data of traffic elements collected by multiple vehicles are synchronized in time and/or calibrated in space, and the adjusted observation data collected by each vehicle are fused to obtain the final observation result of the traffic element, so as to improve the accuracy of the observation result of the traffic element.
  • the following describes the observation method of the traffic element according to the embodiment of the present application with reference to FIG. 4 .
  • FIG. 4 is a flowchart of a method for observing traffic elements according to an embodiment of the present application.
  • the method shown in FIG. 4 may be executed by the cloud service center 310 shown in FIG. 3, and may also be executed by other computing devices, which is not limited in this embodiment of the present application.
  • the method shown in FIG. 4 includes steps 410 to 430 .
  • Receive multiple sets of first observation data of traffic elements sent by multiple vehicles, where each of the multiple vehicles collects one set of first observation data from the multiple sets of first observation data, and the first observation data is used to indicate the change of the coordinate position of a traffic element over time and/or the change of the velocity of a traffic element over time.
  • the above traffic elements may include dynamic obstacles or static obstacles in the traffic scene. A dynamic obstacle may be a vehicle other than the vehicles that collect the first observation data, or a pedestrian in the traffic scene; a static obstacle may be a traffic light, etc.
  • the first observation data may include at least one of the type of the target, the motion state of the target, the motion trajectory of the target, and the size of the target.
  • the types of objects may include vehicles, pedestrians, bicycles, and the like.
  • the motion state of the target can include static and dynamic.
  • the movement trajectory of the target object may include the speed trajectory of the target object and the space trajectory of the target object.
  • the size of the target may include the length of the target and the width of the target.
  • when the target object is a traffic light, the first observation data further includes timing information of the traffic light.
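The contents listed above (type, motion state, motion trajectory, size, and optional traffic-light timing) can be pictured as a single record per reporting vehicle. The following is an illustrative sketch only; all field and type names are assumptions, since the embodiments do not fix a data schema.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple


class TargetType(Enum):
    VEHICLE = "vehicle"
    PEDESTRIAN = "pedestrian"
    BICYCLE = "bicycle"
    TRAFFIC_LIGHT = "traffic_light"


@dataclass
class FirstObservation:
    """One set of first observation data reported by a single vehicle.

    Field names are hypothetical, chosen to mirror the contents listed
    in the text: target type, motion state, trajectories, and size.
    """
    target_type: TargetType
    is_dynamic: bool  # motion state: dynamic vs. static
    # space trajectory: (timestamp, lon, lat) samples over time
    space_trajectory: List[Tuple[float, float, float]] = field(default_factory=list)
    # speed trajectory: (timestamp, speed) samples over time
    speed_trajectory: List[Tuple[float, float]] = field(default_factory=list)
    length_m: float = 0.0  # target size
    width_m: float = 0.0
    light_timing_s: Optional[float] = None  # only set for traffic lights


obs = FirstObservation(TargetType.VEHICLE, True,
                       space_trajectory=[(0.0, 116.30, 39.98)],
                       speed_trajectory=[(0.0, 12.5)],
                       length_m=4.6, width_m=1.8)
```

A traffic-light observation would additionally set `light_timing_s`; for other targets it stays `None`.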
  • the above-mentioned multiple sets of first observation data are collected by on-board sensors in each of the multiple vehicles, and processed by a multi-domain controller (MDC).
  • the observation accuracy of the vehicles is relatively high, and can be about 3 cm to 4 cm.
  • for example, the distance between vehicle #1 and the traffic element is 200 meters, and the distance between vehicle #2 and the traffic element is 100 meters; at this time, the accuracy of the observation data of the traffic element collected by vehicle #1 may be low.
  • the observation data of the traffic elements collected by the vehicle #2 can be used to compensate the accuracy of the observation data of the traffic elements collected by the vehicle #1.
  • the embodiment of the present application does not specifically limit the distance between every two vehicles in the above-mentioned plurality of vehicles.
  • the above-mentioned multiple vehicles are smart vehicles.
  • the above-mentioned time synchronization processing and spatial correction processing can be selected based on the type of observation data. If the observation data contains position information of the traffic element, spatial correction processing can be performed on that type of observation data; for example, when the observation data is the coordinate position of the traffic element, both spatial correction processing and time synchronization processing can be performed on it. If the observation data contains time information, time synchronization processing can be performed on that type of observation data; for example, if the observation data is a speed curve of the traffic element, time synchronization processing can be performed on it.
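The selection rule above can be pictured as a small dispatch over the contents of an observation. The dictionary keys (`coordinates` for position information, `timestamps` for time information) are illustrative assumptions, not names from the embodiments:

```python
def processing_steps(observation: dict) -> set:
    """Select which processing to apply based on what an observation contains.

    'coordinates' stands for position information; 'timestamps' stands for
    time information. Both keys are hypothetical placeholders.
    """
    steps = set()
    if "coordinates" in observation:
        # position information calls for spatial correction processing
        steps.add("spatial_calibration")
    if "timestamps" in observation:
        # time information calls for time synchronization processing
        steps.add("time_synchronization")
    return steps


# Coordinate positions sampled over time need both kinds of processing:
coords = processing_steps({"coordinates": [], "timestamps": []})
# A speed curve only carries time information, so only synchronization applies:
speeds = processing_steps({"timestamps": [], "speeds": []})
```

Here `coords` contains both processing steps, while `speeds` contains only time synchronization, matching the two examples in the text.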
  • the above step 420 includes: determining a time deviation between multiple sets of first observation data; and adjusting each group of first observation data in the multiple sets of first observation data based on the time deviation, so as to obtain the processed multiple sets of first observation data.
  • the time deviation between the above-mentioned multiple sets of first observation data may be the average value of the time deviations between each two sets of first observation data in the multiple sets, may be the minimum value of those pairwise time deviations, or may be the maximum value of those pairwise time deviations, which is not specifically limited in this embodiment of the present application.
  • the above time offset can also be determined based on the deviation between the longitude and latitude of the traffic element or the heading angle of the traffic element collected by each vehicle.
  • the method for determining the time deviation of multiple sets of first observation data is described below by taking the calculation of the deviation between the longitude and latitude of the traffic element or the heading angle of the traffic element collected by each vehicle as an example.
  • the time offset Δt_offset of the above-mentioned multiple sets of first observation data can be determined from the accumulated longitude deviation Σ_{t=1}^{n} (lon_it − lon_jt)², where n represents the total number of times at which the target traffic element is observed by the multiple vehicles, i represents the i-th vehicle among the multiple vehicles, j represents the j-th vehicle among the multiple vehicles, t represents the t-th time among the n times, lon_it represents the longitude of the target traffic element collected by the i-th vehicle at the t-th time, and lon_jt represents the longitude of the target traffic element collected by the j-th vehicle at the t-th time.
  • the time offset Δt_offset of the above-mentioned multiple sets of first observation data can also be determined from the accumulated latitude deviation Σ_{t=1}^{n} (lat_it − lat_jt)², where n, i, j and t are as defined above, lat_it represents the latitude of the target traffic element collected by the i-th vehicle at the t-th time, and lat_jt represents the latitude of the target traffic element collected by the j-th vehicle at the t-th time.
  • the time offset Δt_offset of the above-mentioned multiple sets of first observation data can also be determined from the accumulated heading-angle-rate deviation Σ_{t=1}^{n} (yaw_it − yaw_jt)², where n, i, j and t are as defined above, yaw_it represents the turning rate of the heading angle of the target traffic element collected by the i-th vehicle at the t-th time, and yaw_jt represents the turning rate of the heading angle of the target traffic element collected by the j-th vehicle at the t-th time.
  • the above three methods for determining the time offset may be used independently in different scenarios, or may be combined to determine the time offset, which is not specifically limited in this embodiment of the present application.
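The deviation-based determination of the time offset described above can be sketched as a search over candidate offsets between two vehicles' observation series of the same traffic element. The least-squares criterion, the uniform sampling, and the unit of the returned offset (sample steps) are assumptions of this illustration:

```python
def estimate_time_offset(series_i, series_j, max_shift):
    """Estimate the sampling-time offset between two vehicles' observation
    series of the same traffic element (e.g. its longitude over time).

    Tries each candidate shift in [-max_shift, max_shift] and keeps the one
    that minimizes the mean squared deviation over the overlapping samples.
    """
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(series_i[t], series_j[t + shift])
                 for t in range(len(series_i))
                 if 0 <= t + shift < len(series_j)]
        if not pairs:
            continue
        cost = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift


# Vehicle #2's samples lag vehicle #1's by two sampling steps:
lon_i = [116.300 + 0.001 * t for t in range(10)]
lon_j = [116.300 + 0.001 * (t - 2) for t in range(10)]
offset = estimate_time_offset(lon_i, lon_j, max_shift=4)
```

The same routine could be run on latitude or heading-angle-rate series, and the three estimates averaged or otherwise combined, mirroring the text's note that the methods may be used independently or together.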
  • FIG. 5 shows a simulation diagram of the observation data collected before time synchronization processing.
  • FIG. 5 shows curve 1 of the observation data of traffic element 1 collected by vehicle #1 over time and curve 2 of the observation data of traffic element 1 collected by vehicle #2 over time; it can be seen that, before time synchronization processing, the observation data corresponding to the two curves at the same time are different.
  • FIG. 6 shows a simulation diagram of the observation data collected after time synchronization processing. It can be seen that after the time synchronization processing method of the embodiment of the present application, the above-mentioned curve 1 and curve 2 basically overlap.
  • the above step 420 includes: determining, in a preset coordinate system, the coordinates at different time points of the traffic element indicated by each set of first observation data in the multiple sets of first observation data; and representing the coordinate value of the traffic element contained in each preset coordinate range in the coordinate system by the target coordinate value corresponding to that coordinate range, to obtain the coordinates of the traffic element within each preset coordinate range in the coordinate system, where the processed observation data includes the coordinates of the traffic element within each preset coordinate range in the coordinate system.
  • the coordinates of the traffic elements indicated by each set of first observation data in the multiple sets of first observation data are located in grid #1 in the preset coordinate system, and the target coordinate value corresponding to grid #1 is (x, y), it can be determined that at the qth moment, the coordinates of the traffic elements indicated by each set of first observation data are the target coordinate values (x, y) corresponding to grid #1.
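Replacing coordinate values by the target coordinate value of their preset coordinate range amounts to snapping coordinates to a grid, so that slightly different measurements of the same traffic element collapse to one value. A minimal sketch, in which the cell size and the choice of the cell centre as the target coordinate value are assumptions:

```python
def snap_to_grid(lon, lat, cell=0.0001):
    """Represent a coordinate by the target coordinate value of the preset
    coordinate range (grid cell) that contains it.

    Here the target coordinate value is assumed to be the cell centre, and
    the cell size of 0.0001 degrees is purely illustrative.
    """
    gx = int(lon // cell)  # index of the cell along longitude
    gy = int(lat // cell)  # index of the cell along latitude
    return ((gx + 0.5) * cell, (gy + 0.5) * cell)


# Two vehicles' slightly different measurements of the same traffic element
# fall into the same cell, and are therefore represented by the same value:
a = snap_to_grid(116.30001, 39.98002)
b = snap_to_grid(116.30003, 39.98004)
```

After snapping, `a` and `b` are identical, so the fusion step no longer sees them as two different traffic elements.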
  • the above-mentioned change of the coordinate position of the traffic element over time can be determined based on the current state of the traffic element collected by the multiple vehicles (including the position and speed of the traffic element) and the Kalman filter algorithm.
  • the above Kalman filter algorithm can be divided into two stages: prediction stage and update stage.
  • the prediction stage is used to predict the state of the traffic element at the k-th time based on the state of the traffic element at the k-1 time.
  • the update stage is used to update the variables in the Kalman filter algorithm based on the predicted state of the traffic element at time k.
  • Prediction stage: the state of the traffic element at time k is predicted as x̂_k = F_k x̂_{k−1} + B_k u_k with first covariance matrix P_k = F_k P_{k−1} F_k^T + Q_k, where x̂_k represents the state vector of the traffic element at time k predicted based on the state vector of the traffic element at time k−1, x̂_{k−1} represents the state vector containing the position and speed of the traffic element at time k−1, and F_k is the state transition matrix.
  • the control vector u_k and control matrix B_k can reflect the influence of external factors on the state of the traffic element at time k.
  • the second covariance matrix Q_k can reflect the influence of external uncertainty on the state of the traffic element at time k.
  • Update stage: based on the measurement z_k, the measurement matrix H_k, and the predicted state x̂_k, the measurement residual vector at time k can be determined as y_k = z_k − H_k x̂_k, with residual covariance S_k = H_k P_k H_k^T + R_k.
  • the above-mentioned third covariance matrix R k may be set based on the noise of the above-mentioned sensor.
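As a concrete illustration of the two stages, the following is a minimal constant-velocity Kalman filter for a state vector containing position and speed. It is a sketch under assumed noise values `q` and `r`, not the embodiments' implementation; `F`, `Q`, `H`, and `R` play the roles of the state transition matrix, the second covariance matrix, the measurement matrix, and the third covariance matrix described above.

```python
def kf_step(x, P, z, dt=1.0, q=0.1, r=0.25):
    """One Kalman cycle for a 2-state [position, velocity] model.

    Prediction stage: x' = F x, P' = F P F^T + Q with F = [[1, dt], [0, 1]]
    and Q = q * I (no control input in this sketch).
    Update stage: residual y = z - H x', gain K = P' H^T / (H P' H^T + R),
    with a position-only measurement, H = [1, 0].
    """
    # --- prediction stage ---
    px = x[0] + dt * x[1]
    vx = x[1]
    p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # --- update stage ---
    y = z - px                  # measurement residual vector (scalar here)
    s = p00 + r                 # residual covariance S = H P' H^T + R
    k0, k1 = p00 / s, p10 / s   # Kalman gain K = P' H^T / S
    x_new = [px + k0 * y, vx + k1 * y]
    P_new = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x_new, P_new


# Track a traffic element moving at 1 m/s from noiseless position samples:
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for t in range(1, 51):
    x, P = kf_step(x, P, z=float(t))
```

After the loop, the estimated position approaches the true 50 m and the estimated speed approaches 1 m/s, illustrating how the filter recovers the change of the coordinate position of the traffic element over time.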
  • Based on the processed multiple sets of observation data, determine the second observation data of the traffic element observed by the multiple vehicles. The second observation data can be used as the final observation data of the traffic element.
  • the above-mentioned multiple vehicles can also report one or more items of information such as the position of the ego vehicle, the speed of the ego vehicle, and the state of the ego vehicle, so that the cloud computing server can perform further processing based on the above-mentioned second observation data and the above-mentioned ego-vehicle information.
  • FIG. 7 is a flowchart of a traffic element observation method according to an embodiment of the present application. The method shown in FIG. 7 includes steps 710 to 750 .
  • Vehicle #1, Vehicle #2, and Vehicle #3 respectively send their collected data to the cloud server.
  • observation data #1 is the data collected by vehicle #1 including traffic element #1
  • observation data #2 is the data collected by vehicle #2 including traffic element #1
  • observation data #3 is collected by vehicle #3 including traffic element #1 Data for element #1.
  • when the distance between every two of vehicle #1, vehicle #2, and vehicle #3 is less than or equal to 100 meters, and the distance between any of the above vehicles and traffic element #1 is less than or equal to 100 meters, the accuracy of the data of traffic element #1 collected by vehicle #1, vehicle #2, and vehicle #3 is high.
  • the aforementioned vehicle #1, vehicle #2, and vehicle #3 are smart vehicles.
  • the cloud computing server performs time synchronization processing on the observation data in the above data, and obtains the processed observation data #1, the processed observation data #2, and the processed observation data #3.
  • the observation result #1 of the vehicle #1 on the traffic element #1 includes information such as the relative position and the relative speed of the vehicle #1 and the traffic element #1.
  • the observation result #2 of the vehicle #2 on the traffic element #1 includes information such as the relative position and relative speed of the vehicle #2 and the traffic element #1.
  • the observation result #3 of the vehicle #3 on the traffic element #1 includes information such as the relative position and the relative speed of the vehicle #3 and the traffic element #1.
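Once the three vehicles' data are synchronized and calibrated, their per-vehicle observation results (relative position of each vehicle to traffic element #1) can be fused into a single result. The following is a deliberately simple sketch of such a fusion step, assuming a common planar frame in meters and equal weighting of the three vehicles; real fusion could weight by distance or sensor accuracy instead:

```python
def fuse_observations(reports):
    """Fuse per-vehicle observation results into one fused position.

    Each report is (ego_position, relative_position_of_element), both as
    (x, y) pairs in a shared planar frame; the fused estimate is the mean
    of the implied absolute positions of the traffic element.
    """
    absolute = [(ex + rx, ey + ry) for (ex, ey), (rx, ry) in reports]
    n = len(absolute)
    return (sum(p[0] for p in absolute) / n,
            sum(p[1] for p in absolute) / n)


# Vehicles #1-#3 observe traffic element #1 from different ego positions:
reports = [((0.0, 0.0), (50.0, 10.0)),
           ((30.0, 0.0), (20.2, 9.8)),
           ((0.0, 20.0), (49.8, -9.9))]
fused = fuse_observations(reports)
```

All three implied absolute positions cluster near (50, 10), so the fused result lands close to that point despite small per-vehicle deviations.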
  • The observation method of the traffic element according to the embodiments of the present application is described above with reference to FIGS. 1 to 7; the apparatus of the embodiments of the present application is described below with reference to FIGS. 8 to 9. It should be understood that the apparatuses shown in FIG. 8 to FIG. 9 can implement each step in the foregoing method; for brevity, details are not described herein again.
  • FIG. 8 is a schematic diagram of an observation device of a traffic element according to an embodiment of the present application.
  • the apparatus 800 shown in FIG. 8 includes: a receiving unit 810 and a processing unit 820 .
  • a receiving unit 810 configured to receive multiple sets of first observation data of traffic elements sent by multiple vehicles, where each vehicle in the multiple vehicles collects a set of first observation data in the multiple sets of first observation data,
  • the first observation data is used to indicate the time change of the coordinate position of the traffic element and/or the time change of the speed of the traffic element;
  • a processing unit 820 configured to perform time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple sets of processed observation data
  • the processing unit 820 is further configured to determine the second observation data of the traffic element observed by the plurality of vehicles based on the plurality of sets of observation data after processing.
  • the processing unit 820 is further configured to: determine the time offsets of the multiple sets of first observation data; and adjust each set of first observation data in the multiple sets of first observation data based on the time offsets, to obtain the processed observation data, where the sets of observation data in the processed observation data are time-synchronized.
  • the processing unit 820 is further configured to: determine, in a preset coordinate system, the coordinates at different time points of the traffic element indicated by each set of first observation data in the multiple sets of first observation data; and represent the coordinate value of the traffic element contained in each preset coordinate range in the coordinate system by the target coordinate value corresponding to that coordinate range, to obtain the coordinates of the traffic element within each preset coordinate range in the coordinate system, where the processed observation data includes the coordinates of the traffic element within each preset coordinate range in the coordinate system.
  • the first observation data includes at least one of the type of the target, the motion state of the target, the motion trajectory of the target, and the size of the target.
  • when the target object is a traffic light, the first observation data further includes timing information of the traffic light.
  • the multiple sets of first observation data are collected by on-board sensors in each of the multiple vehicles, and processed by a multi-domain controller.
  • the receiving unit 810 may be a communication interface 930
  • the processing unit 820 may be a processor 920
  • the computing device may further include a memory 910, as shown in FIG. 9 .
  • FIG. 9 is a schematic block diagram of a computing device according to another embodiment of the present application.
  • the computing device 900 shown in FIG. 9 may include: a memory 910 , a processor 920 , and a communication interface 930 .
  • the memory 910, the processor 920, and the communication interface 930 are connected through an internal connection path; the memory 910 is used to store instructions, and the processor 920 is used to execute the instructions stored in the memory 910 to control the communication interface 930 to receive/send signals.
  • the memory 910 can either be coupled with the processor 920 through an interface, or can be integrated with the processor 920 .
  • the above-mentioned communication interface 930 uses a transceiver apparatus, such as but not limited to a transceiver, to implement communication between the computing device 900 and other devices or a communication network.
  • the above-mentioned communication interface 930 may also include an input/output interface.
  • each step of the above-mentioned method may be completed by an integrated logic circuit of hardware in the processor 920 or an instruction in the form of software.
  • the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • the software module may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory 910, and the processor 920 reads the information in the memory 910, and completes the steps of the above method in combination with its hardware. To avoid repetition, detailed description is omitted here.
  • the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory may include a read-only memory and a random access memory, and provide instructions and data to the processor.
  • a portion of the processor may also include non-volatile random access memory.
  • the processor may also store device type information.
  • the size of the sequence numbers of the above-mentioned processes does not mean the sequence of execution; the execution sequence of each process should be determined by its functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed system, apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium.
  • the technical solutions of the present application, in essence, or the part contributing to the prior art, or a part of the technical solutions, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program codes.

Abstract

Traffic element observation method and related device, capable of improving the accuracy of acquiring observation data of a traffic element. The method comprises: receiving multiple sets of first observation data of a traffic element transmitted by multiple vehicles (410), wherein each of the multiple vehicles collects one set of first observation data from the multiple sets of first observation data, and the first observation data is used to indicate a change of the coordinate position of the traffic element over time and/or a change of the speed of the traffic element over time; performing time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple sets of processed observation data (420); and determining, based on the multiple sets of processed observation data, second observation data of the traffic element observed by the multiple vehicles (430).
PCT/CN2020/117785 2020-09-25 2020-09-25 Procédé et appareil d'observation d'élément de circulation WO2022061725A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/117785 WO2022061725A1 (fr) 2020-09-25 2020-09-25 Procédé et appareil d'observation d'élément de circulation
CN202080004590.9A CN112639910B (zh) 2020-09-25 2020-09-25 交通元素的观测方法和装置
EP20954577.1A EP4207133A4 (fr) 2020-09-25 2020-09-25 Procédé et appareil d'observation d'élément de circulation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117785 WO2022061725A1 (fr) 2020-09-25 2020-09-25 Procédé et appareil d'observation d'élément de circulation

Publications (1)

Publication Number Publication Date
WO2022061725A1 true WO2022061725A1 (fr) 2022-03-31

Family

ID=75291151

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/117785 WO2022061725A1 (fr) 2020-09-25 2020-09-25 Traffic element observation method and apparatus

Country Status (3)

Country Link
EP (1) EP4207133A4 (fr)
CN (1) CN112639910B (fr)
WO (1) WO2022061725A1 (fr)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002157696A (ja) * 2000-11-22 2002-05-31 Natl Inst For Land & Infrastructure Management Mlit Obstacle avoidance method in an AHS having dedicated lanes
JP4271720B1 (ja) * 2008-04-24 2009-06-03 本田技研工業株式会社 Vehicle periphery monitoring device
CN104217615A (zh) * 2014-09-16 2014-12-17 武汉理工大学 Pedestrian anti-collision system and method based on vehicle-road cooperation
CN105741546A (zh) * 2016-03-18 2016-07-06 重庆邮电大学 Intelligent vehicle target tracking system and method fusing roadside equipment and vehicle sensors
CN109556615A (zh) * 2018-10-10 2019-04-02 吉林大学 Driving map generation method based on multi-sensor fusion cognition for autonomous driving
CN109872530A (zh) * 2017-12-05 2019-06-11 广州腾讯科技有限公司 Road condition information generation method, in-vehicle terminal, and server
CN111243281A (zh) * 2018-11-09 2020-06-05 杭州海康威视系统技术有限公司 Road multi-video joint detection system and detection method
CN111383287A (zh) * 2020-02-13 2020-07-07 湖北亿咖通科技有限公司 Extrinsic parameter calibration method and device for a vehicle-mounted sensor
CN111459168A (zh) * 2020-04-23 2020-07-28 上海交通大学 Fused trajectory prediction method and system for street-crossing pedestrians around autonomous vehicles
CN111596090A (zh) * 2020-06-17 2020-08-28 中国第一汽车股份有限公司 Vehicle speed measurement method and device, vehicle, and medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140078304A1 (en) * 2012-09-20 2014-03-20 Cloudcar, Inc. Collection and use of captured vehicle data
CN109900490B (zh) * 2017-12-11 2020-11-03 上海交通大学 Vehicle motion state detection method and system based on autonomous and cooperative sensors
DE102018216809A1 (de) * 2018-09-28 2020-04-02 Robert Bosch Gmbh Method, device, and sensor system for environment sensing for a vehicle
CN111402574B (zh) * 2018-12-13 2023-04-07 阿里巴巴集团控股有限公司 Vehicle detection method, apparatus, device, and storage medium
US11436923B2 (en) * 2019-01-25 2022-09-06 Cavh Llc Proactive sensing systems and methods for intelligent road infrastructure systems
CN110208158A (zh) * 2019-06-13 2019-09-06 上汽大众汽车有限公司 Online calibration method and system for vehicle environment detection sensors
CN111193568A (zh) * 2019-12-19 2020-05-22 北汽福田汽车股份有限公司 Time synchronization method, apparatus, system, storage medium, and vehicle
CN111209956A (zh) * 2020-01-02 2020-05-29 北京汽车集团有限公司 Sensor data fusion method, and vehicle environment map generation method and system
CN111222568A (zh) * 2020-01-03 2020-06-02 北京汽车集团有限公司 Vehicle networking data fusion method and device
CN111220998B (зh) * 2020-02-26 2022-10-28 江苏大学 Multi-target cooperative tracking method based on vehicle-to-vehicle communication

Also Published As

Publication number Publication date
EP4207133A4 (fr) 2023-11-01
CN112639910A (zh) 2021-04-09
CN112639910B (zh) 2022-05-17
EP4207133A1 (fr) 2023-07-05

Similar Documents

Publication Publication Date Title
WO2022027304A1 (fr) Autonomous vehicle testing method and apparatus
CN110379193B (zh) Behavior planning method and behavior planning device for autonomous driving vehicle
WO2021135371A1 (fr) Automatic driving method, related device, and computer-readable storage medium
WO2021102955A1 (fr) Path planning method and apparatus for vehicle
CN110550029A (zh) Obstacle avoidance method and device
CN113168708B (zh) Lane line tracking method and device
CN113460042A (zh) Vehicle driving behavior recognition method and recognition device
WO2022062825A1 (fr) Vehicle control method, control device, and vehicle
US20230048680A1 Method and apparatus for passing through barrier gate crossbar by vehicle
WO2022051951A1 (fr) Lane line detection method, related device, and computer-readable storage medium
EP4307251A1 (fr) Mapping method, vehicle, computer-readable storage medium, and chip
CN113498529A (зh) Target tracking method and apparatus
CN112810603B (zh) Positioning method and related product
WO2021163846A1 (fr) Target tracking method and target tracking apparatus
CN113954858A (zh) Vehicle driving route planning method and intelligent vehicle
CN113859265B (zh) Reminding method and device for use while driving
WO2022022284A1 (fr) Target object detection method and apparatus
CN114764980B (zh) Vehicle turning route planning method and apparatus
CN113799794B (zh) Method and apparatus for planning longitudinal motion parameters of a vehicle
WO2021159397A1 (fr) Detection method and detection device for vehicle travelable region
WO2022061725A1 (fr) Traffic element observation method and apparatus
CN113022573B (zh) Road structure detection method and apparatus
WO2022041820A1 (fr) Method and apparatus for lane-change trajectory planning
WO2022127502A1 (fr) Control method and device
CN114556251B (zh) Method and apparatus for determining vehicle passable space

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20954577

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020954577

Country of ref document: EP

Effective date: 20230330

NENP Non-entry into the national phase

Ref country code: DE