CN112639910B - Method and device for observing traffic elements

Info

Publication number
CN112639910B
Authority
CN
China
Prior art keywords
observation data
traffic
vehicle
vehicles
traffic element
Prior art date
Legal status
Active
Application number
CN202080004590.9A
Other languages
Chinese (zh)
Other versions
CN112639910A
Inventor
卢远志 (Lu Yuanzhi)
陈灿平 (Chen Canping)
陈保成 (Chen Baocheng)
赵剑 (Zhao Jian)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN112639910A (application)
Application granted
Publication of CN112639910B (grant)


Classifications

    • G PHYSICS; G08 SIGNALLING; G08G TRAFFIC CONTROL SYSTEMS; G08G1/00 Traffic control systems for road vehicles
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G1/0129 Traffic data processing for creating historical data or processing based on historical data
    • G08G1/0133 Traffic data processing for classifying traffic situation
    • G08G1/0141 Measuring and analyzing of parameters relative to traffic conditions for specific applications, for traffic information dissemination
    • G08G1/09623 Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G08G1/096725 Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G1/096758 Systems involving transmission of highway information where no selection takes place on the transmitted or the received information
    • G08G1/096775 Systems involving transmission of highway information where the origin of the information is a central station
    • G08G1/164 Anti-collision systems; centralised systems, e.g. external to vehicles

Abstract

The application provides a method for observing traffic elements and related devices. The method includes: receiving multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle of the multiple vehicles acquires one of the multiple sets of first observation data, and the first observation data indicates the change of the coordinate position of the traffic element over time and/or the change of the speed of the traffic element over time; performing time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple processed sets of observation data; and determining second observation data of the traffic element observed by the multiple vehicles based on the processed sets of observation data. This improves the accuracy of the acquired observation data of traffic elements.

Description

Method and device for observing traffic elements
Technical Field
The present application relates to the field of autonomous driving, and more particularly, to a method and apparatus for observing traffic elements.
Background
Autonomous driving is a mainstream application in the field of artificial intelligence. Autonomous driving technology relies on the cooperation of computer vision, radar, monitoring devices, global positioning systems, and the like, so that a motor vehicle can drive itself without active human operation. Autonomous vehicles use various computing systems to help transport passengers from one location to another. Some autonomous vehicles may require initial or continuous input from an operator, such as a pilot, driver, or passenger. An autonomous vehicle permits the operator to switch from a manual operation mode to an autonomous mode or an intermediate mode. Because autonomous driving does not require a human to drive the motor vehicle, it can in theory effectively avoid human driving errors, reduce traffic accidents, and improve road transportation efficiency. Autonomous driving technology is therefore receiving increasing attention.
With the development of autonomous driving technology, functions such as vehicle path planning and obstacle avoidance are becoming increasingly important, and these functions depend on the basic capability of collecting observation data of traffic elements. At present, to improve the accuracy of such observation data, a scheme in which multiple vehicles cooperatively acquire the observation data of traffic elements is generally adopted: multiple vehicles each observe the same traffic element, each vehicle sends its first observation data to a cloud server, and the cloud server fuses the first observation data uploaded by the vehicles to obtain the second observation data of the traffic element.
However, in this scheme of cooperatively acquiring observation data of traffic elements by multiple vehicles, the observation data acquired by each vehicle contains a certain error. When fusing the multiple sets of first observation data, the cloud server may therefore recognize multiple sets of observation data of the same traffic element as observation data of different traffic elements, or recognize observation data of different traffic elements as observation data of the same traffic element, so that the obtained second observation data of the traffic element is inaccurate.
Disclosure of Invention
The application provides a method and apparatus for observing traffic elements, which improve the accuracy of acquiring observation data of traffic elements.
In a first aspect, a method for observing traffic elements is provided, including: receiving multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle of the multiple vehicles acquires one of the multiple sets of first observation data, and the first observation data indicates the change of the coordinate position of the traffic element over time and/or the change of the speed of the traffic element over time; performing time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple processed sets of observation data; and determining second observation data of the traffic element observed by the multiple vehicles based on the processed sets of observation data.
In this embodiment of the application, the processed sets of observation data are obtained by performing time synchronization processing and/or spatial correction processing on the multiple sets of first observation data, and the second observation data of the traffic element observed by the multiple vehicles is determined based on the processed sets, which improves the accuracy of acquiring the observation data of the traffic element. This avoids the prior-art problem in which the multiple sets of first observation data are fused directly to determine the second observation data of the traffic element, yielding inaccurate second observation data.
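To make the three steps of the first aspect concrete, the following is a minimal, self-contained sketch of the pipeline in Python. The names and the simple mean-offset synchronization, grid snapping, and averaging fusion are illustrative assumptions, not the application's actual implementation; the detailed description below gives the least-squares offset estimation and Kalman filtering that the application actually discusses.

```python
import numpy as np

def observe_traffic_element(first_obs: dict, cell: float = 0.5) -> np.ndarray:
    """Sketch of the first-aspect method. first_obs maps a vehicle id to an
    (n, 3) array with rows (t, x, y) for one traffic element; all series are
    assumed equally long and uniformly sampled. Returns the fused
    "second observation data" as an (n, 3) array."""
    ref = next(iter(first_obs.values()))  # one series as the time reference
    processed = []
    for obs in first_obs.values():
        obs = obs.copy()
        # Time synchronization: shift timestamps by the mean offset
        # against the reference series (a simple stand-in estimator).
        obs[:, 0] -= np.mean(obs[:, 0] - ref[:, 0])
        # Spatial correction: snap coordinates to preset grid-cell centers.
        obs[:, 1:] = (np.floor(obs[:, 1:] / cell) + 0.5) * cell
        processed.append(obs)
    # Fusion: average the processed sets to obtain the second observation data.
    return np.mean(processed, axis=0)
```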
In a possible implementation, if the first observation data indicates the change of the speed of the traffic element over time, performing time synchronization processing on the first observation data of the traffic element sent by the multiple vehicles to obtain the processed sets of observation data includes: determining a time offset of the multiple sets of first observation data; and adjusting each set of first observation data among the multiple sets based on the time offset to obtain the processed observation data, where the sets of processed observation data are synchronized in time.
In this embodiment of the application, each set of first observation data among the multiple sets is adjusted based on the time offset of the multiple sets of first observation data to obtain the processed observation data, where the sets of processed observation data are synchronized in time, which improves the accuracy of acquiring the observation data of the traffic element.
In a possible implementation, if the first observation data indicates the change of the coordinate position of the traffic element over time, performing spatial correction processing on the multiple sets of first observation data to obtain the processed sets of observation data includes: determining the coordinates, in a preset coordinate system, of the traffic element indicated by each set of first observation data at different time points; and representing the coordinate values of the traffic element that fall within each preset coordinate range of the coordinate system by the target coordinate value corresponding to that coordinate range, to obtain the coordinates of the traffic element within each preset coordinate range of the coordinate system, where the processed observation data includes these coordinates.
In this embodiment of the application, the coordinate values of the traffic element that fall within each preset coordinate range of the coordinate system are represented by the target coordinate value corresponding to that coordinate range, yielding the coordinates of the traffic element within each preset coordinate range, which improves the accuracy of acquiring the observation data of the traffic element.
In a possible implementation manner, when the traffic element is a target object, the first observation data includes at least one of a type of the target object, a motion state of the target object, a motion trajectory of the target object, and a size of the target object.
In a possible implementation manner, when the target object is a traffic light, the first observation data further includes time service information of the traffic light.
In one possible implementation, the plurality of sets of first observation data are acquired by an on-board sensor in each of the plurality of vehicles and processed by a multi-domain controller.
In this embodiment of the application, the first observation data is acquired by on-board sensors and processed by a multi-domain controller, so that no additional data acquisition device or data processing device is required, which helps avoid added cost.
In a second aspect, an apparatus for observing traffic elements is provided, where the apparatus may be a computing device or a chip within the computing device.
The apparatus may include a processing unit and a receiving unit. When the apparatus is a computing device, the processing unit may be a processor and the receiving unit may be a communication interface. Optionally, the apparatus may further comprise a storage unit, which may be a memory when the apparatus is a computing device. The storage unit is configured to store instructions, and the processing unit executes the instructions stored by the storage unit to cause the computing device to perform the method of the first aspect.
When the apparatus is a chip within a computing device, the processing unit may be a processor, and the receiving unit may be an input/output interface, a pin, a circuit, or the like; the processing unit executes the instructions stored by the storage unit to cause the computing device to perform the method of the first aspect.
Optionally, the storage unit may be a storage unit (e.g., a register, a cache, etc.) inside the chip, or may be a storage unit (e.g., a read-only memory, a random access memory, etc.) inside the computing device and outside the chip.
The memory is coupled to the processor. It should be understood that the memory may be located within the processor, or the memory may be located outside the processor and thus be independent of it.
In a third aspect, a computer program product is provided, the computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method of the above-mentioned aspects.
It should be noted that all or part of the computer program code may be stored in a first storage medium, where the first storage medium may be packaged together with the processor or packaged separately from the processor; this is not specifically limited in the embodiments of the present application.
In a fourth aspect, a computer-readable medium is provided, which stores program code, which, when run on a computer, causes the computer to perform the method of the above-mentioned aspects.
Drawings
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of an applicable automatic driving system according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a system 300 including an autonomous driving vehicle and a cloud service center to which embodiments of the application are applicable.
Fig. 4 is a flowchart of an observation method of a traffic element according to an embodiment of the present application.
Fig. 5 shows a simulation diagram of observation data acquired before the time synchronization process.
Fig. 6 shows a simulation diagram of the observation data acquired after the time synchronization process.
Fig. 7 is a flowchart of an observation method of a traffic element according to an embodiment of the present application.
Fig. 8 is a schematic view of an observation apparatus of a traffic element according to an embodiment of the present application.
FIG. 9 is a schematic block diagram of a computing device of another embodiment of the present application.
Detailed Description
The technical solutions in the present application are described below with reference to the accompanying drawings. For ease of understanding, an intelligent driving scenario is taken as an example in conjunction with fig. 1 to 3, and the scenarios to which the embodiments of the present application apply are described below.
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application. In one embodiment, the vehicle 100 is configured in a fully or partially autonomous driving mode. For example, while in the autonomous driving mode, the vehicle 100 may control itself, determine the current state of the vehicle and its surroundings, determine a possible behavior of at least one other vehicle in the surroundings, determine a confidence level corresponding to the likelihood that the other vehicle performs the possible behavior, and control the vehicle 100 based on the determined information. While the vehicle 100 is in the autonomous driving mode, it may be placed into operation without human interaction.
The vehicle 100 may include various subsystems such as a travel system 102, a sensor system 104, a control system 106, one or more peripherals 108, as well as a power supply 110, a computer system 112, and a user interface 116. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 102 may include components that provide powered motion to the vehicle 100. In one embodiment, the travel system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels/tires 121. The engine 118 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine of a gasoline engine and an electric motor, or a hybrid engine of an internal combustion engine and an air compression engine. The engine 118 converts the energy source 119 into mechanical energy.
Examples of energy sources 119 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 119 may also provide energy to other systems of the vehicle 100.
The transmission 120 may transmit mechanical power from the engine 118 to the wheels 121. The transmission 120 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 120 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 121.
The sensor system 104 (also referred to as a "collection device") may include several sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 104 may include a positioning system 122 (which may be a Global Positioning System (GPS), a BeiDou system, or another positioning system), an inertial measurement unit (IMU) 124, a radar 126, a laser rangefinder 128, and a camera 130. The sensor system 104 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function for the safe operation of the autonomous vehicle 100.
The positioning system 122 may be used to estimate the geographic location of the vehicle 100. The IMU 124 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, IMU 124 may be a combination of an accelerometer and a gyroscope.
The radar 126 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing the target object, the radar 126 may be used to sense one or more of a speed, a position, and a heading of the target object.
The laser rangefinder 128 may utilize laser light to sense objects in the environment in which the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 130 may be used to capture multiple images of the surrounding environment of the vehicle 100. The camera 130 may be a still camera or a video camera.
The control system 106 is for controlling the operation of the vehicle 100 and its components. Control system 106 may include various elements including a steering system 132, a throttle 134, a brake unit 136, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
The steering system 132 is operable to adjust the heading of the vehicle 100. For example, in one embodiment it may be a steering wheel system.
The throttle 134 is used to control the operating speed of the engine 118 and thus the speed of the vehicle 100.
The brake unit 136 is used to control the deceleration of the vehicle 100. The brake unit 136 may use friction to slow the wheel 121. In other embodiments, the brake unit 136 may convert the kinetic energy of the wheel 121 into an electric current. The brake unit 136 may take other forms to slow the rotational speed of the wheels 121 to control the speed of the vehicle 100.
The computer vision system 140 may be operable to process and analyze images captured by the camera 130 to identify objects and/or features in the environment surrounding the vehicle 100. The objects and/or features may include traffic signals, road boundaries, and obstacles. The computer vision system 140 may use object recognition algorithms, Structure From Motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 140 may be used to map an environment, track objects, estimate the speed of objects, and so forth.
The route control system 142 is used to determine a travel route of the vehicle 100. In some embodiments, the route control system 142 may combine data from the sensors, the GPS 122, and one or more predetermined maps to determine a travel route for the vehicle 100.
Obstacle avoidance system 144 is used to identify, assess, and avoid or otherwise negotiate potential obstacles in the environment of vehicle 100.
Of course, in one example, the control system 106 may additionally or alternatively include components other than those shown and described, or some of the components shown above may be omitted.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 108. The peripheral devices 108 may include a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and/or speakers 152.
In some embodiments, the peripheral devices 108 provide a means for a user of the vehicle 100 to interact with the user interface 116. For example, the onboard computer 148 may provide information to a user of the vehicle 100. The user interface 116 may also operate the in-vehicle computer 148 to receive user input. The in-vehicle computer 148 may be operated via a touch screen. In other cases, the peripheral devices 108 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 150 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 152 may output audio to a user of the vehicle 100.
The wireless communication system 146 may communicate wirelessly with one or more devices, directly or via a communication network. For example, the wireless communication system 146 may use 3G cellular communication such as Code Division Multiple Access (CDMA) or Global System for Mobile Communications (GSM)/GPRS, fourth-generation (4G) communication such as LTE, or fifth-generation (5G) communication. The wireless communication system 146 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 146 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee. Other wireless protocols are also possible, such as various vehicle communication systems; for example, the wireless communication system 146 may include one or more dedicated short-range communications (DSRC) devices, which may carry public and/or private data communications between vehicles and/or roadside stations.
The power supply 110 may provide power to various components of the vehicle 100. In one embodiment, power source 110 may be a rechargeable lithium ion or lead acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to various components of the vehicle 100. In some embodiments, the power source 110 and the energy source 119 may be implemented together, such as in some all-electric vehicles.
Some or all of the functionality of the vehicle 100 is controlled by the computer system 112. The computer system 112 may include at least one processor 113, the processor 113 executing instructions 115 stored in a non-transitory computer readable medium, such as data storage 114. The computer system 112 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
The processor 113 may be any conventional processor, such as a commercially available central processing unit (CPU). Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Although fig. 1 functionally illustrates the processor, the memory, and other elements of the computer 110 in the same block, those of ordinary skill in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or another storage medium located in a housing different from that of the computer 110. Thus, a reference to a processor or computer will be understood to include a reference to a collection of processors, computers, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the deceleration component, may each have their own processor that performs only the computations related to that component's function.
In various aspects described herein, the processor may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to execute a single maneuver.
In some embodiments, the memory 114 may include instructions 115 (e.g., program logic), and the instructions 115 may be executed by the processor 113 to perform various functions of the vehicle 100, including those described above. The memory 114 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
In addition to instructions 115, memory 114 may also store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, among other information. Such information may be used by the vehicle 100 and the computer system 112 during operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
In some embodiments, the processor 113 may further execute the planning scheme for vehicle longitudinal motion parameters according to the embodiment of the present application to help the vehicle plan the longitudinal motion parameters, where the specific longitudinal motion parameter planning method may refer to the description of fig. 3 below, and for brevity, details are not described herein again.
A user interface 116 for providing information to and receiving information from a user of the vehicle 100. Optionally, the user interface 116 may include one or more input/output devices within the collection of peripheral devices 108, such as a wireless communication system 146, an in-vehicle computer 148, a microphone 150, and a speaker 152.
The computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (e.g., the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control over many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 100. For example, the memory 114 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 1 should not be construed as limiting the embodiment of the present invention.
Autonomous vehicles traveling on the road, such as vehicle 100 above, may identify objects within their surrounding environment to determine an adjustment to the current speed. The object may be another vehicle, a traffic control device, or another type of object. In some examples, each identified object may be considered independently and may be used to determine the speed at which the autonomous vehicle is to be adjusted based on the respective characteristics of the object, such as its current speed, acceleration, separation from the vehicle, and the like.
Optionally, the autonomous vehicle 100 or a computing device associated with it (e.g., the computer system 112, the computer vision system 140, or the memory 114 of fig. 1) may predict the behavior of the identified objects based on the characteristics of those objects and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, since the identified objects' behaviors depend on one another, the behavior of a single identified object may also be predicted by considering all identified objects together. The vehicle 100 is able to adjust its speed based on the predicted behavior of the identified objects. In other words, the autonomous vehicle can determine, based on the predicted behavior of an object, what stable state the vehicle needs to adjust to (e.g., accelerate, decelerate, or stop). In this process, other factors may also be considered to determine the speed of the vehicle 100, such as the lateral position of the vehicle 100 in the road on which it is traveling, the curvature of the road, and the proximity of static and dynamic objects.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., cars in adjacent lanes on the road).
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, an amusement car, a playground vehicle, construction equipment, a trolley, a golf cart, a train, a trolley, etc., and the embodiment of the present invention is not particularly limited.
A scenario in which the embodiment of the present application is applied is described above with reference to fig. 1, and an automatic driving system in which the embodiment of the present application is applied is described below with reference to fig. 2.
FIG. 2 is a schematic diagram of an automatic driving system to which an embodiment of the application is applicable. Computer system 101 includes a processor 103 coupled to a system bus 105. The processor 103 may be one or more processors, each of which may include one or more processor cores. A display adapter (video adapter) 107 may drive a display 109, which is coupled to the system bus 105. The system bus 105 is coupled via a bus bridge 111 to an input/output (I/O) bus 113, and an I/O interface 115 is coupled to the I/O bus. The I/O interface 115 communicates with various I/O devices, such as an input device 117 (e.g., keyboard, mouse, touch screen), a media tray 121 (e.g., CD-ROM, multimedia interface), a transceiver 123 (which can send and/or receive radio communication signals), a camera 155 (which can capture static and moving digital video images), and an external USB interface 125. Optionally, the interface connected to the I/O interface 115 may be a USB interface.
The processor 103 may be any conventional processor, including a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, or a combination thereof. Alternatively, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC). Alternatively, the processor 103 may be a neural network processor or a combination of a neural network processor and the conventional processors described above.
Optionally, in various embodiments described herein, computer system 101 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle. In other aspects, some processes described herein are performed on a processor disposed within an autonomous vehicle, others being performed by a remote processor, including taking the actions required to perform a single maneuver.
Computer 101 may communicate with software deploying server 149 via network interface 129. The network interface 129 is a hardware network interface, such as a network card. The Network 127 may be an external Network, such as the internet, or an internal Network, such as an ethernet or Virtual Private Network (VPN). Optionally, the network 127 may also be a wireless network, such as a Wi-Fi network, a cellular network, or the like.
A hard drive interface is coupled to the system bus 105 and connects to a hard disk drive. System memory 135 is also coupled to the system bus 105. Data running in the system memory 135 may include the operating system 137 and the application programs 143 of the computer 101.
The operating system includes a shell 139 and a kernel 141. The shell 139 is an interface between the user and the kernel of the operating system, and is the outermost layer of the operating system. The shell 139 manages the interaction between the user and the operating system: it waits for user input, interprets the input for the operating system, and processes the output results of the operating system.
The kernel 141 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel typically runs processes and provides inter-process communication, CPU time-slice management, interrupt handling, memory management, I/O management, and the like.
The application programs 143 include programs related to controlling the automatic driving of a vehicle, such as programs that manage the interaction of an autonomous vehicle with obstacles on the road, programs that control the route or speed of an autonomous vehicle, and programs that control the interaction of an autonomous vehicle with other autonomous vehicles on the road. The application program 143 also exists on the system of the software deploying server 149. In one embodiment, the computer system 101 may download the application program 143 from the software deploying server 149 when the application program 143 needs to be executed.
In some embodiments, the application programs may further include an application program corresponding to a sensing scheme for the target object provided in the embodiments of the present application, where the sensing scheme for the target object in the embodiments of the present application will be specifically described below, and is not described herein again for brevity.
Sensor 153 is associated with the computer system 101 and is used to detect the environment surrounding the computer 101. For example, the sensor 153 may detect objects such as animals, vehicles, and obstacles, and may further detect the environment around such objects, for example: the environment surrounding an animal, other animals present around it, weather conditions, and the brightness of the surrounding environment. Optionally, if the computer 101 is located on an autonomous vehicle, the sensor may be a laser radar, a camera, an infrared sensor, a chemical detector, a microphone, or the like.
The vehicle and the driving system to which the embodiment of the present application is applied are described above with reference to fig. 1 and fig. 2, and a system including a vehicle and a cloud service center is taken as an example and a scenario to which the embodiment of the present application is applied is described below with reference to fig. 3.
Fig. 3 is a schematic diagram of a system 300 including an autonomous driving vehicle and a cloud service center to which embodiments of the application are applicable. Cloud service center 310 may receive information from autonomous vehicle 330 and autonomous vehicle 331 via network 320, such as a wireless communication network.
Optionally, the received information may be the position of a target object, the speed of the target object, and the like, sent by autonomous vehicle 330 and/or autonomous vehicle 331. The target object may be a traffic element observed by the data-collecting vehicle while driving, such as another vehicle, a pedestrian, or a traffic light.
The cloud service center 310 runs the stored programs related to controlling the automatic driving of the vehicle to control the automatic driving vehicles 330 and 331 according to the received data. The programs associated with controlling the automatic driving of the vehicle may be programs that manage the interaction of the automatically driven vehicle with obstacles on the road, programs that control the route or speed of the automatically driven vehicle, and programs that control the interaction of the automatically driven vehicle with other automatically driven vehicles on the road.
The network 320 provides portions of a map to the autonomous vehicle 330 or 331. In other examples, operations may be divided between different locations or centers; for example, multiple cloud service centers may receive, validate, combine, and/or send information reports. In some examples, information reports and/or sensor data may also be sent between autonomous vehicles. Other configurations are also possible.
In some examples, the center sends suggested solutions to the autonomous vehicles regarding possible driving conditions within the system 300 (e.g., informing of an obstacle ahead and how to bypass it). For example, the cloud service center may assist a vehicle in determining how to travel when facing a particular obstacle in the environment, and sends a response to the autonomous vehicle indicating how the vehicle should travel in the given scenario. For example, based on the collected sensor data, the cloud service center may confirm the presence of a temporary stop sign in front of the road, and may also determine that a lane is closed due to construction based on a "lane closed" sign and sensor data from a construction vehicle on the lane. Accordingly, the cloud service center sends a suggested mode of operation for the autonomous vehicle to pass the obstacle (e.g., instructing the vehicle to change lanes onto another road). When the cloud service center observes the video stream within its operating environment and has confirmed that the autonomous vehicle can safely and successfully traverse the obstacle, the operational steps used by that vehicle may be added to the driving information map. Accordingly, this information may be sent to other vehicles in the area that may encounter the same obstacle, to help them not only recognize the closed lane but also know how to pass it.
At present, to improve the accuracy of the acquired observation data of traffic elements in a traffic scene, a scheme in which multiple vehicles cooperatively acquire the observation data is generally adopted. In such a multi-vehicle cooperative scheme, the observation data acquired by each vehicle contains certain errors in time and space. During data fusion, observation data of the same traffic element may therefore be recognized as observation data of different traffic elements, or observation data of different traffic elements may be recognized as observation data of the same traffic element, so that the observation result of the traffic element is inaccurate.
To avoid these problems, the present application provides a new observation scheme for traffic elements: the observation data of the traffic elements collected by multiple vehicles is synchronized in time and/or calibrated in space, and the adjusted observation data is then fused to obtain the final observation result of the traffic elements, thereby improving its accuracy. A method for observing a traffic element according to an embodiment of the present application is described below with reference to fig. 4.
Fig. 4 is a flowchart of an observation method of a traffic element according to an embodiment of the present application, where the method shown in fig. 4 may be executed by the cloud service center 310 shown in fig. 3, and may also be executed by another computing device, and the present application is not limited thereto. The method shown in fig. 4 includes steps 410 to 430.
410: Receive multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle of the multiple vehicles collects one of the multiple sets of first observation data, and the first observation data indicates the change of the coordinate position of the traffic element over time and/or the change of the speed of the traffic element over time.
The traffic element may include a dynamic obstacle or a static obstacle in the traffic scene. The dynamic obstacle may be a vehicle other than those collecting the first observation data, or a pedestrian in the traffic scene; the static obstacle may be a traffic light or the like.
Optionally, when the traffic element is a target object, the first observation data may include at least one of the type of the target object, the motion state of the target object, the motion trajectory of the target object, and the size of the target object. The type of the target object may include a vehicle, a pedestrian, a bicycle, and so on. The motion state of the target object may be static or dynamic. The motion trajectory of the target object may include a velocity trajectory and a spatial trajectory. The size of the target object may include its length and width.
Optionally, when the target object is a traffic light, the first observation data further includes time service information of the traffic light.
Optionally, the plurality of sets of first observation data are acquired by an on-board sensor in each of the plurality of vehicles and processed by a multi-domain controller (MDC).
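A sketch of how one set of first observation data might be laid out, based on the fields listed above; the structure and field names are hypothetical illustrations, not a format defined by the application.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class FirstObservation:
    """One vehicle's first observation data for a single traffic element."""
    object_type: str                              # "vehicle", "pedestrian", "bicycle", ...
    motion_state: str                             # "static" or "dynamic"
    trajectory: List[Tuple[float, float, float]]  # (t, x, y) spatial trajectory
    speeds: List[Tuple[float, float]]             # (t, v) velocity trajectory
    size: Tuple[float, float]                     # (length, width)
    light_timing: Optional[dict] = None           # time service info if a traffic light
```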
Optionally, when the distance between a vehicle and the traffic element is within about 100 meters, the vehicle's observation accuracy is high, around 3 cm to 4 cm. For example, if the distance between vehicle #1 and the traffic element is 200 meters and the distance between vehicle #2 and the traffic element is 100 meters, the accuracy of the observation data collected by vehicle #1 may be low; with the method of the embodiments of the present application, the observation data of the traffic element collected by vehicle #2 can compensate for the accuracy of the observation data collected by vehicle #1. Of course, the distance between each pair of the multiple vehicles is not specifically limited in the embodiments of the present application.
Optionally, the plurality of vehicles are smart vehicles.
420: Perform time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple processed sets of observation data.
Whether time synchronization processing or spatial correction processing is applied may be selected based on the type of observation data. If the observation data contains position information of the traffic element, spatial correction processing may be applied to that type of data; for example, when the observation data is the coordinate position of the traffic element, both spatial correction processing and time synchronization processing may be applied. If the observation data contains time information, time synchronization processing may be applied; for example, if the observation data is a speed curve of the traffic element, that type of data may be time-synchronized.
Optionally, step 420 includes: determining a time offset between the multiple sets of first observation data; and adjusting each set of first observation data among the multiple sets based on the time offset to obtain the multiple processed sets of observation data, where the processed sets are synchronized in time.
The time offset between the multiple sets of first observation data may be the average, the minimum, or the maximum of the pairwise time offsets between every two sets of first observation data among the multiple sets; this is not specifically limited in the embodiments of the present application.
Alternatively, since, for the same traffic element, the longitude, latitude, and heading angle collected by each vehicle at the same moment should be identical, the time offset may also be determined from the deviation between the longitudes, latitudes, or heading angles of the traffic element collected by the different vehicles. The following describes how to determine the time offset of the multiple sets of first observation data, taking as an example the least-squares calculation of the deviation between the longitudes, the latitudes, or the heading-angle rates of the traffic element collected by each vehicle.
The time offset $\delta_{\mathrm{offset}}$ of the multiple sets of first observation data may be determined by the least-squares formula

$$\delta_{\mathrm{offset}} = \arg\min_{\delta} \sum_{t=1}^{n} \left( \mathrm{lon}_{i,t} - \mathrm{lon}_{j,t+\delta} \right)^{2},$$

where $n$ is the total number of instants at which the multiple vehicles observe the target traffic element, $i$ denotes the $i$-th vehicle and $j$ the $j$-th vehicle among the multiple vehicles, $t$ denotes the $t$-th of the $n$ instants, $\mathrm{lon}_{i,t}$ is the longitude of the target traffic element collected by the $i$-th vehicle at the $t$-th instant, and $\mathrm{lon}_{j,t}$ is the longitude of the target traffic element collected by the $j$-th vehicle at the $t$-th instant.

The time offset $\delta_{\mathrm{offset}}$ may alternatively be determined by

$$\delta_{\mathrm{offset}} = \arg\min_{\delta} \sum_{t=1}^{n} \left( \mathrm{lat}_{i,t} - \mathrm{lat}_{j,t+\delta} \right)^{2},$$

where $\mathrm{lat}_{i,t}$ is the latitude of the target traffic element collected by the $i$-th vehicle at the $t$-th instant, and $\mathrm{lat}_{j,t}$ is the latitude of the target traffic element collected by the $j$-th vehicle at the $t$-th instant.

The time offset $\delta_{\mathrm{offset}}$ may also be determined by

$$\delta_{\mathrm{offset}} = \arg\min_{\delta} \sum_{t=1}^{n} \left( \mathrm{yaw}_{i,t} - \mathrm{yaw}_{j,t+\delta} \right)^{2},$$

where $\mathrm{yaw}_{i,t}$ is the heading-angle rate of the target traffic element collected by the $i$-th vehicle at the $t$-th instant, and $\mathrm{yaw}_{j,t}$ is the heading-angle rate of the target traffic element collected by the $j$-th vehicle at the $t$-th instant.
It should be noted that, depending on the scenario, the above three ways of determining the time offset may be used individually or combined; this is not specifically limited in the embodiments of the present application.
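As an illustration of the least-squares determination above, the following sketch estimates the offset between two vehicles' longitude series by searching candidate shifts and keeping the one that minimizes the mean squared difference. It is a minimal sketch under the assumption of uniformly sampled, discrete series; the function name and parameters are hypothetical, and latitude or heading-angle-rate series can be substituted for longitude.

```python
import numpy as np

def estimate_time_offset(lon_i: np.ndarray, lon_j: np.ndarray,
                         max_shift: int = 50) -> int:
    """Least-squares time offset (in samples) between two vehicles'
    longitude series of the same traffic element: the shift minimizing
    mean_t (lon_i[t] - lon_j[t + shift])^2."""
    best_shift, best_cost = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            a, b = lon_i[:len(lon_i) - shift], lon_j[shift:]
        else:
            a, b = lon_i[-shift:], lon_j[:len(lon_j) + shift]
        n = min(len(a), len(b))
        if n == 0:
            continue
        cost = np.mean((a[:n] - b[:n]) ** 2)  # normalize so overlaps compare fairly
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift
```

The three estimates (longitude, latitude, heading-angle rate) can then be used individually or combined, for example by averaging, as noted above.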
The simulation results of the time synchronization processing according to the embodiment of the present application are described below with reference to fig. 5 and 6. Fig. 5 shows a simulation of the observation data before time synchronization processing: curve 1 shows the observation data of traffic element 1 collected by vehicle #1 over time, and curve 2 shows the observation data of traffic element 1 collected by vehicle #2 over time; before time synchronization, the two curves differ at the same instant. Fig. 6 shows a simulation of the observation data after time synchronization processing: after the time synchronization processing method of the embodiment of the present application, curve 1 and curve 2 substantially coincide.
Optionally, if the first observation data indicates the change of the coordinate position of the traffic element over time, step 420 includes: determining, in a preset coordinate system, the change over time of the coordinates of the traffic element indicated by each set of first observation data among the multiple sets; and representing the coordinate values of the traffic element that fall within each preset coordinate range of the coordinate system by the target coordinate value corresponding to that coordinate range, to obtain the coordinates of the traffic element within each preset coordinate range, where the processed observation data includes these coordinates.
For example, if at the q-th instant the coordinates of the traffic element indicated by each set of first observation data lie in grid cell #1 of the preset coordinate system, and the target coordinate value corresponding to grid cell #1 is (x, y), then it may be determined that, at the q-th instant, the coordinates of the traffic element indicated by each set of first observation data are the target coordinate value (x, y) of grid cell #1.
The grid in the coordinate system may be divided in advance, and the embodiment of the present application does not limit this.
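A minimal sketch of this spatial correction, under the assumptions of a uniform square grid and the cell center as each cell's target coordinate value (both illustrative choices; the application only requires preset coordinate ranges and target values):

```python
import numpy as np

def snap_to_grid(coords: np.ndarray, cell: float = 0.5) -> np.ndarray:
    """Replace each (x, y) coordinate with the target coordinate value of
    the preset grid cell containing it, taken here as the cell center."""
    cells = np.floor(coords / cell)   # index of the cell containing each point
    return (cells + 0.5) * cell       # cell-center target coordinate values

# Slightly different observations of the same element map to one value:
obs_vehicle1 = np.array([[12.31, 4.02]])
obs_vehicle2 = np.array([[12.44, 4.12]])
print(snap_to_grid(obs_vehicle1))  # [[12.25 4.25]]
print(snap_to_grid(obs_vehicle2))  # [[12.25 4.25]]
```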
Optionally, the change of the coordinate position of the traffic element with time may be determined based on the current state of the traffic element (including the position and the speed of the traffic element) collected by a plurality of vehicles and a kalman filter algorithm.
The Kalman filtering algorithm can be divided into two phases: a prediction phase and an update phase. The prediction phase predicts the state of the traffic element at the k-th instant based on its state at the (k-1)-th instant. The update phase updates the variables of the Kalman filter based on the predicted state of the traffic element at the k-th instant.
In the prediction phase, the state of the traffic element at the k-th time is predicted by the formulas

$$\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1} + B_k u_k$$

and

$$P_{k|k-1} = F_k P_{k-1|k-1} F_k^{\mathrm{T}} + Q_k,$$

where $\hat{x}_{k|k-1}$ denotes the state vector of the traffic element at the k-th time predicted from the state vector of the traffic element at the (k-1)-th time, $\hat{x}_{k-1|k-1}$ denotes the state vector containing the position and speed of the traffic element at the (k-1)-th time, $P_{k|k-1}$ denotes the first covariance matrix at the k-th time predicted from the covariance matrix at the (k-1)-th time, $P_{k-1|k-1}$ denotes the covariance matrix at the (k-1)-th time, $u_k$ denotes a preset control vector, $B_k$ denotes a preset control matrix, $Q_k$ denotes a preset second covariance matrix, and $F_k$ denotes the prediction matrix used to predict the state of the traffic element at the k-th time from its state at the (k-1)-th time.
It should be noted that the control vector and the control matrix may reflect the influence of external factors on the state of the traffic element at the k-th time, and the second covariance matrix may reflect the influence of external uncertainties on the state of the traffic element at the k-th time.
In the update phase, the measurement residual vector $\tilde{y}_k$ at the k-th time, the measurement residual covariance matrix $S_k$ at the k-th time, and the Kalman gain $K_k$ at the k-th time can be determined by the formulas

$$\tilde{y}_k = z_k - H_k \hat{x}_{k|k-1},$$

$$S_k = H_k P_{k|k-1} H_k^{\mathrm{T}} + R_k,$$

and

$$K_k = P_{k|k-1} H_k^{\mathrm{T}} S_k^{-1},$$

where $H_k$ denotes the sensor reading matrix used for the collected state of the traffic element, $R_k$ denotes a preset third covariance matrix, and $z_k$ denotes the mean of the distribution of the sensor readings.
The third covariance matrix $R_k$ may be set based on the noise of the sensors described above.
Based on the above measurement residual vector $\tilde{y}_k$, measurement residual covariance matrix $S_k$, and Kalman gain $K_k$, the state of the traffic element at the k-th time and the covariance matrix at the k-th time are updated by the formulas

$$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \tilde{y}_k$$

and

$$P_{k|k} = (I - K_k H_k) P_{k|k-1}.$$
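The prediction and update formulas above map directly to code. The following is a minimal sketch, assuming a constant-velocity state model with state vector (x, y, vx, vy), position-only sensor readings, and illustrative noise matrices; the control term $B_k u_k$ is omitted for simplicity, so this is a sketch of the two phases rather than the embodiment's exact configuration:

```python
import numpy as np

dt = 0.1  # assumed sampling interval (seconds)

# Prediction matrix F_k for a constant-velocity model, state [x, y, vx, vy]
F = np.array([[1., 0., dt, 0.],
              [0., 1., 0., dt],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])
# Sensor reading matrix H_k: the sensors observe position only
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])
Q = 0.01 * np.eye(4)   # second covariance matrix Q_k (process noise, illustrative)
R = 0.25 * np.eye(2)   # third covariance matrix R_k (sensor noise, illustrative)

def predict(x, P):
    """Prediction phase: state and covariance at time k from time k-1."""
    return F @ x, F @ P @ F.T + Q

def update(x_pred, P_pred, z):
    """Update phase: fold the sensor reading z_k into the predicted state."""
    y = z - H @ x_pred                       # measurement residual vector
    S = H @ P_pred @ H.T + R                 # measurement residual covariance S_k
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain K_k
    return x_pred + K @ y, (np.eye(4) - K @ H) @ P_pred

# One filter step: predict, then correct with a position reading z_k
x, P = np.array([0., 0., 10., 0.]), np.eye(4)
x_pred, P_pred = predict(x, P)
x, P = update(x_pred, P_pred, z=np.array([1.1, 0.05]))
```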
430, based on the processed plurality of sets of observation data, second observation data of the traffic element observed by the plurality of vehicles is determined. The second observation data may be the final observation result of the traffic element.
Of course, the plurality of vehicles may also report one or more items of information such as the position of the vehicle, the speed of the vehicle, and the state of the vehicle. The cloud computing server can then determine information such as the relative position and relative speed between each vehicle and the traffic element based on the second observation data and the information of the vehicle, and feed the calculated information back to the vehicle, so that the vehicle can adjust its driving route, driving speed, and the like accordingly.
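As a small illustration of this feedback computation, assuming the vehicle and the traffic element are expressed in a common map coordinate system, the relative position and relative speed might be computed as plain vector differences; the function and field names below are hypothetical, not part of the embodiment:

```python
import numpy as np

def relative_state(vehicle_pos, vehicle_vel, element_pos, element_vel):
    """Relative position and speed between a vehicle and a traffic element,
    all inputs being 2-D vectors in a common map coordinate system."""
    rel_pos = element_pos - vehicle_pos
    rel_vel = element_vel - vehicle_vel
    return {"relative_position": rel_pos,
            "distance": float(np.linalg.norm(rel_pos)),
            "relative_speed": float(np.linalg.norm(rel_vel))}

# Example: vehicle #1 at the origin, traffic element #1 ahead and to the side
print(relative_state(np.array([0.0, 0.0]), np.array([10.0, 0.0]),
                     np.array([30.0, 4.0]), np.array([8.0, 0.0])))
```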
The following describes an observation method of a traffic element according to an embodiment of the present application, taking processing of observation data collected by vehicle #1, vehicle #2, and vehicle #3 as an example, with reference to fig. 7. Fig. 7 is a flowchart of an observation method of a traffic element according to an embodiment of the present application. The method shown in fig. 7 includes steps 710 to 750.
710, the vehicle #1, the vehicle #2, and the vehicle #3 respectively transmit the collected data to the cloud server.
The data uploaded by each vehicle include the observation data collected by the vehicle, the driving track of the vehicle, and the information of the vehicle. Observation data #1 is the data containing traffic element #1 collected by vehicle #1, observation data #2 is the data containing traffic element #1 collected by vehicle #2, and observation data #3 is the data containing traffic element #1 collected by vehicle #3.
Optionally, when the distance between every two of vehicle #1, vehicle #2, and vehicle #3 is less than or equal to 100 m, and the distance between any one of these vehicles and traffic element #1 is less than or equal to 100 m, the data of traffic element #1 collected by vehicle #1, vehicle #2, and vehicle #3 are more accurate.
Optionally, the vehicle #1, the vehicle #2, and the vehicle #3 are smart vehicles.
720, the cloud computing server performs time synchronization processing on the observation data in the uploaded data to obtain processed observation data #1, processed observation data #2, and processed observation data #3.
It should be noted that, for the procedure of the time synchronization processing, reference may be made to the description above; for brevity, details are not repeated here.
730, according to the position information of traffic element #1 carried in processed observation data #1, processed observation data #2, and processed observation data #3, multiple sets of coordinate positions of traffic element #1 in the high-definition map are determined.
740, spatial correction processing is performed on the multiple sets of coordinate positions to obtain the corrected coordinate position of traffic element #1.
It should be noted that, for the procedure of the spatial correction processing, reference may be made to the description above; for brevity, details are not repeated here.
750, the observation result of each vehicle for traffic element #1 is determined based on the corrected coordinate position of traffic element #1, the processed speed curve of traffic element #1, the driving track of each vehicle, and the information of the vehicles, and each vehicle's observation result is sent to that vehicle.
Observation result #1 of vehicle #1 for traffic element #1 includes information such as the relative position and relative speed of vehicle #1 and traffic element #1. Observation result #2 of vehicle #2 for traffic element #1 includes information such as the relative position and relative speed of vehicle #2 and traffic element #1. Observation result #3 of vehicle #3 for traffic element #1 includes information such as the relative position and relative speed of vehicle #3 and traffic element #1.
The method for observing traffic elements according to the embodiment of the present application is described above with reference to fig. 1 to 7, and the apparatus according to the embodiment of the present application is described below with reference to fig. 8 to 9. It should be understood that the apparatus shown in fig. 8 to 9 can implement the steps of the above method, and for brevity, the description is omitted here.
Fig. 8 is a schematic view of an observation apparatus of a traffic element according to an embodiment of the present application. The apparatus 800 shown in fig. 8 comprises: a receiving unit 810 and a processing unit 820.
A receiving unit 810, configured to receive multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle in the multiple vehicles acquires one set of the multiple sets of first observation data, and the first observation data is used to indicate a change of a coordinate position of the traffic element with time and/or a change of a speed of the traffic element with time;
a processing unit 820, configured to perform time synchronization processing and/or spatial correction processing on the multiple sets of first observation data to obtain multiple sets of processed observation data;
the processing unit 820 is further configured to determine second observation data of the traffic element observed by the plurality of vehicles based on the processed plurality of sets of observation data.
Optionally, as an embodiment, the processing unit 820 is further configured to: determining a time offset for the plurality of sets of first observations; and adjusting each group of first observation data in the multiple groups of first observation data based on the time deviation of the multiple groups of first observation data to obtain the processed observation data, wherein the time of each group of observation data in the processed observation data is synchronous.
Optionally, as an embodiment, the processing unit 820 is further configured to: determining coordinates of the traffic element indicated by each group of first observation data in the multiple groups of first observation data at different time points in a preset coordinate system; and expressing the coordinate values of the traffic elements contained in each preset coordinate range in the coordinate system by using target coordinate values corresponding to the coordinate ranges to obtain the coordinates of the traffic elements in each preset coordinate range in the coordinate system, wherein the processed observation data comprise the coordinates of the traffic elements in each preset coordinate range in the coordinate system.
Optionally, as an embodiment, when the traffic element is a target object, the first observation data includes at least one of a type of the target object, a motion state of the target object, a motion trajectory of the target object, and a size of the target object.
Optionally, as an embodiment, when the target object is a traffic light, the first observation data further includes time service information of the traffic light.
Optionally, as an embodiment, the multiple sets of first observation data are acquired by a vehicle-mounted sensor in each of the plurality of vehicles and are obtained through processing by a multi-domain controller (MDC).
In an alternative embodiment, the receiving unit 810 may be a communication interface 930, the processing unit 820 may be a processor 920, and the computing device may further include a memory 910, as specifically shown in fig. 9.
Fig. 9 is a schematic block diagram of a computing device according to another embodiment of the present application. The computing device 900 shown in fig. 9 may include: a memory 910, a processor 920, and a communication interface 930. The memory 910, the processor 920, and the communication interface 930 are connected via an internal connection path; the memory 910 is configured to store instructions, and the processor 920 is configured to execute the instructions stored in the memory 910 to control the communication interface 930 to receive and send the data involved in the above method. Optionally, the memory 910 may be coupled to the processor 920 via an interface, or may be integrated with the processor 920.
It should be noted that the communication interface 930 implements communication between the computing device 900 and other devices or communication networks by using a transceiver apparatus such as, but not limited to, a transceiver. The communication interface 930 may also include an input/output interface.
In implementation, the steps of the above method may be performed by an integrated logic circuit of hardware in the processor 920 or by instructions in the form of software. The method disclosed in the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or EPROM, or a register. The storage medium is located in the memory 910; the processor 920 reads the information in the memory 910 and completes the steps of the above method in combination with its hardware. To avoid repetition, details are not described here.
It should be understood that, in the embodiment of the present application, the processor may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will also be appreciated that in embodiments of the present application, the memory may comprise both read-only memory and random access memory, and may provide instructions and data to the processor. A portion of the processor may also include non-volatile random access memory. For example, the processor may also store information of the device type.
It should be understood that the term "and/or" herein is merely one type of association relationship that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. An observation method of a traffic element, applied to a cloud service center, the method comprising the following steps:
receiving a plurality of sets of first observation data of a traffic element sent by a plurality of vehicles, wherein each vehicle in the plurality of vehicles collects one set of first observation data in the plurality of sets of first observation data, the distance between every two vehicles in the plurality of vehicles is smaller than or equal to a first distance, the first observation data is used for indicating the change of the coordinate position of the traffic element with time and/or the change of the speed of the traffic element with time, and the plurality of sets of first observation data are collected by an on-board sensor in each vehicle in the plurality of vehicles and are obtained through the processing of a multi-domain controller (MDC);
carrying out time synchronization processing and/or space correction processing on the multiple groups of first observation data to obtain multiple groups of processed observation data;
determining second observation data of the traffic element observed by the plurality of vehicles based on the processed plurality of sets of observation data.
2. The method of claim 1, wherein if the first observation data is used to indicate the change of the speed of the traffic element with time, the performing time synchronization processing on the multiple sets of first observation data to obtain the multiple sets of processed observation data comprises:
determining a time offset for the plurality of sets of first observations;
and adjusting each group of first observation data in the multiple groups of first observation data based on the time deviation of the multiple groups of first observation data to obtain the processed observation data, wherein the time of each group of observation data in the processed observation data is synchronous.
3. The method of claim 1 or 2, wherein if the first observation data is used to indicate the change of the coordinate position of the traffic element with time, the performing spatial correction processing on the multiple sets of first observation data to obtain the multiple sets of processed observation data comprises:
determining coordinates of the traffic element indicated by each group of first observation data in the multiple groups of first observation data at different time points in a preset coordinate system;
and expressing the coordinate values of the traffic elements contained in each preset coordinate range in the coordinate system by using target coordinate values corresponding to the preset coordinate ranges to obtain the coordinates of the traffic elements in each preset coordinate range in the coordinate system, wherein the processed observation data comprise the coordinates of the traffic elements in each preset coordinate range in the coordinate system.
4. The method of claim 1 or 2, wherein when the traffic element is an object, the first observation data includes at least one of a type of the object, a motion state of the object, a motion trajectory of the object, and a size of the object.
5. The method of claim 4, wherein when the object is a traffic light, the first observation data further includes time service information of the traffic light.
6. An observation device of a traffic element, comprising:
a receiving unit, configured to receive multiple sets of first observation data of a traffic element sent by multiple vehicles, where each vehicle in the multiple vehicles acquires one set of the multiple sets of first observation data, a distance between every two vehicles in the multiple vehicles is less than or equal to a first distance, the first observation data is used to indicate a change in a coordinate position of the traffic element over time and/or a change in a speed of the traffic element over time, and the multiple sets of first observation data are acquired by an on-board sensor in each vehicle in the multiple vehicles and processed by a multi-domain controller MDC;
the processing unit is used for carrying out time synchronization processing and/or space correction processing on the multiple groups of first observation data to obtain multiple groups of processed observation data;
the processing unit is further configured to determine second observation data of the traffic element observed by the plurality of vehicles based on the processed plurality of sets of observation data.
7. The apparatus of claim 6, wherein the processing unit is further configured to:
determining a time offset for the plurality of sets of first observations;
and adjusting each group of first observation data in the multiple groups of first observation data based on the time deviation of the multiple groups of first observation data to obtain the processed observation data, wherein the time of each group of observation data in the processed observation data is synchronous.
8. The apparatus of claim 6 or 7, wherein the processing unit is further configured to:
determining coordinates of the traffic element indicated by each group of first observation data in the multiple groups of first observation data at different time points in a preset coordinate system;
and expressing the coordinate values of the traffic elements contained in each preset coordinate range in the coordinate system by using target coordinate values corresponding to the preset coordinate ranges to obtain the coordinates of the traffic elements in each preset coordinate range in the coordinate system, wherein the processed observation data comprise the coordinates of the traffic elements in each preset coordinate range in the coordinate system.
9. The apparatus of claim 6 or 7, wherein when the traffic element is an object, the first observation data includes at least one of a type of the object, a motion state of the object, a motion trajectory of the object, and a size of the object.
10. The apparatus of claim 9, wherein when the object is a traffic light, the first observation data further includes time service information of the traffic light.
11. A computing device comprising at least one processor and memory, the at least one processor coupled with the memory to read and execute instructions in the memory to perform the method of any of claims 1-5.
12. A computer-readable medium, characterized in that the computer-readable medium has stored program code which, when run on a computer, causes the computer to perform the method according to any one of claims 1-5.
CN202080004590.9A 2020-09-25 2020-09-25 Method and device for observing traffic elements Active CN112639910B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117785 WO2022061725A1 (en) 2020-09-25 2020-09-25 Traffic element observation method and apparatus

Publications (2)

Publication Number Publication Date
CN112639910A CN112639910A (en) 2021-04-09
CN112639910B true CN112639910B (en) 2022-05-17

Family

ID=75291151

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080004590.9A Active CN112639910B (en) 2020-09-25 2020-09-25 Method and device for observing traffic elements

Country Status (3)

Country Link
EP (1) EP4207133A4 (en)
CN (1) CN112639910B (en)
WO (1) WO2022061725A1 (en)


Also Published As

Publication number Publication date
EP4207133A1 (en) 2023-07-05
EP4207133A4 (en) 2023-11-01
CN112639910A (en) 2021-04-09
WO2022061725A1 (en) 2022-03-31


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant