CN109635861B - Data fusion method and device, electronic equipment and storage medium


Info

Publication number
CN109635861B
Authority
CN
China
Prior art keywords
current sensor, data, sensor data, current, fusion
Prior art date
Legal status
Active
Application number
CN201811479927.2A
Other languages
Chinese (zh)
Other versions
CN109635861A (en)
Inventor
王军
张晔
王亮
袁庭荣
程凯
徐铎
林坚
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811479927.2A
Publication of CN109635861A
Application granted
Publication of CN109635861B

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features

Abstract

The embodiment of the invention discloses a data fusion method, a data fusion device, electronic equipment and a storage medium. The method comprises the following steps: acquiring at least one item of current sensor data output by a current sensor in the unmanned vehicle at the current moment; determining, according to the at least one item of current sensor data output by the current sensor at the current moment and each item of predetermined historical track data, the target track data correlated with each item of current sensor data; and fusing, at the current moment, each item of current sensor data with the target track data correlated with it. The method improves the robustness of data fusion while ensuring its real-time performance.

Description

Data fusion method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of data processing, in particular to a data fusion method and device, electronic equipment and a storage medium.
Background
An autonomous vehicle, also referred to as an unmanned vehicle, senses its surroundings through various sensors and controls the steering and speed of the vehicle according to the sensed road, vehicle position, obstacle information and the like, so that the vehicle can travel safely and reliably on the road. An autonomous vehicle must therefore know the driving environment around it in real time while it travels. The acquisition of this environmental information relies on the various sensors installed on the vehicle; typical sensors on current autonomous vehicles include laser radar, cameras and millimeter-wave radar. While the vehicle is running, after each sensor collects data, the sensing module of each sensor performs detection and tracking, and the output results of the sensors may conflict: for a certain obstacle, for example, the laser radar may identify it as a motor vehicle while the camera identifies it as a bicycle. To improve the ability to identify target obstacles, data fusion technology is used, making full use of the complementarity of the sensors to form a comprehensive perceptual description of the environment.
In the existing data fusion method, the sensor data output by each sensor in the unmanned vehicle is generally obtained first, and the sensor data output by all the sensors are then fused directly. When the number of sensors in the unmanned vehicle is large, the ambiguity of the sensor data output by each sensor is large, and the fusion logic is complex, the existing method therefore reduces both the robustness and the real-time performance of data fusion.
Disclosure of Invention
In view of this, embodiments of the present invention provide a data fusion method, an apparatus, an electronic device, and a storage medium, which can not only improve robustness of data fusion, but also ensure real-time performance of data fusion.
In a first aspect, an embodiment of the present invention provides a data fusion method, where the method includes:
acquiring at least one current sensor data output by a current sensor in the unmanned vehicle at the current moment;
determining target track data of each current sensor data which are correlated with each other according to at least one current sensor data output by the current sensor at the current moment and each predetermined historical track data;
and fusing the current sensor data and the target track data of which the current sensor data are mutually associated at the current moment.
In the above embodiment, the acquiring at least one current sensor data output by a current sensor in the unmanned vehicle at the current time includes:
acquiring data of each sensor output by the current sensor in the unmanned vehicle in a current period;
extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period;
and acquiring at least one piece of current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each piece of sensor data.
In the above embodiment, the determining, according to at least one current sensor data output by the current sensor at the current time and each predetermined historical trajectory data, target trajectory data in which each current sensor data is associated with each other includes:
calculating the similarity of each current sensor data and each historical track data output by the current sensor at the current moment;
and determining target track data of the current sensor data which are correlated with each other according to the similarity of each current sensor data output by the current sensor at the current moment and each historical track data.
In the above embodiment, the calculating the similarity between each current sensor data output by the current sensor at the current time and each historical track data includes:
extracting vehicle running data corresponding to each current sensor data from each current sensor data output by the current sensor at the current moment;
splitting vehicle running data corresponding to each current sensor data into M vehicle running subdata; wherein M is a natural number greater than 1;
calculating the similarity of M vehicle driving subdata corresponding to each current sensor data and each historical track data;
and determining the similarity of each current sensor data and each historical track data output by the current sensor at the current moment according to the similarity of the M pieces of vehicle driving sub data corresponding to each current sensor data and each historical track data.
In the above embodiment, the fusing, at the current time, the target trajectory data in which the current sensor data and the current sensor data are associated with each other includes:
respectively extracting characteristic information of the current moment from each current sensor data and target track data of each current sensor data correlated with each other; wherein the feature information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information;
determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the characteristic information of the target track data, correlated with the current sensor data, of the unmanned vehicle at the current moment; wherein the fusion information includes: speed information, appearance information, or category information;
and fusing the current sensor data and the target track data which is associated with the current sensor data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
In a second aspect, an embodiment of the present invention provides a data fusion apparatus, where the apparatus includes an acquisition module, a determination module and a fusion module; wherein:
the acquisition module is used for acquiring at least one piece of current sensor data output by a current sensor in the unmanned vehicle at the current moment;
the determining module is used for determining target track data of each current sensor data which are correlated with each other according to at least one current sensor data output by the current sensor at the current moment and each predetermined historical track data;
and the fusion module is used for fusing the current sensor data and the target track data which is associated with the current sensor data at the current moment.
In the above embodiment, the obtaining module is specifically configured to obtain sensor data output by the current sensor in the unmanned vehicle in a current period; extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one piece of current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each piece of sensor data.
In the above embodiment, the determining module includes: a calculation submodule and a determination submodule; wherein:
the calculation submodule is used for calculating the similarity of each current sensor data and each historical track data output by the current sensor at the current moment;
and the determining submodule is used for determining target track data of the current sensor data correlated with each other according to the similarity of each current sensor data and each historical track data output by the current sensor at the current moment.
In the above embodiment, the calculation submodule is specifically configured to extract vehicle driving data corresponding to each current sensor data from each current sensor data output by the current sensor at the current time; splitting vehicle running data corresponding to each current sensor data into M vehicle running subdata; wherein M is a natural number greater than 1; calculating the similarity of M vehicle driving subdata corresponding to each current sensor data and each historical track data; and determining the similarity of each current sensor data and each historical track data output by the current sensor at the current moment according to the similarity of the M pieces of vehicle driving sub data corresponding to each current sensor data and each historical track data.
In the above embodiment, the fusion module is specifically configured to extract feature information of a current time from each current sensor data and target trajectory data in which each current sensor data is associated with each other; wherein the feature information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information; determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the characteristic information of the target track data, correlated with the current sensor data, of the unmanned vehicle at the current moment; wherein the fusion information includes: speed information, appearance information, or category information; and fusing the current sensor data and the target track data which is associated with the current sensor data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
one or more processors;
a memory for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the data fusion method according to any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention provides a storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the data fusion method according to any embodiment of the present invention.
The embodiments of the invention provide a data fusion method and device, electronic equipment and a storage medium. At least one item of current sensor data output by a current sensor in the unmanned vehicle at the current moment is first obtained; the target track data correlated with each item of current sensor data is then determined according to the at least one item of current sensor data output by the current sensor at the current moment and each item of predetermined historical track data; and each item of current sensor data is fused, at the current moment, with the target track data correlated with it. That is, in the technical solution of the present invention, the target track data correlated with each item of current sensor data is determined before fusion. The existing data fusion method, by contrast, directly fuses the sensor data output by all sensors; when the number of sensors in the unmanned vehicle is large, the ambiguity of the sensor data output by each sensor is large, and the fusion logic is complex, this reduces both the robustness and the real-time performance of data fusion. Compared with the prior art, the data fusion method and device, electronic equipment and storage medium provided by the embodiments of the invention can therefore improve the robustness of data fusion while ensuring its real-time performance; moreover, the technical solution of the embodiments of the invention is simple and convenient to implement, easy to popularize, and applicable to a wide range of scenarios.
Drawings
Fig. 1 is a schematic flow chart of a data fusion method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a data fusion method according to a second embodiment of the present invention;
fig. 3 is a schematic flow chart of a data fusion method according to a third embodiment of the present invention;
fig. 4 is a first structural diagram of a data fusion device according to a fourth embodiment of the present invention;
fig. 5 is a second schematic structural diagram of a data fusion apparatus according to a fourth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein merely illustrate the invention and do not limit it. It should further be noted that, for convenience of description, only the parts related to the present invention, rather than the entire structure, are shown in the drawings.
Example one
Fig. 1 is a flowchart of a data fusion method according to an embodiment of the present invention, where the method may be executed by a data fusion apparatus or an electronic device, where the apparatus or the electronic device may be implemented by software and/or hardware, and the apparatus or the electronic device may be integrated in any intelligent device with a network communication function. As shown in fig. 1, the data fusion method may include the steps of:
s101, acquiring at least one current sensor data output by a current sensor in the unmanned vehicle at the current moment.
In particular embodiments of the present invention, an electronic device may obtain at least one current sensor data output by a current sensor in an unmanned vehicle at a current time. Specifically, the electronic device may first acquire sensor data output by a current sensor in the unmanned vehicle in a current period; then extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each sensor data.
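As an illustrative sketch only (the patent does not prescribe an implementation), the timestamp-based selection in this step might look as follows in Python; the frame layout, the field names "timestamp" and "data", and the matching tolerance are assumptions introduced here for illustration:

    def select_current_data(period_frames, current_time, tolerance=0.05):
        """Keep only the sensor data whose extracted timestamp matches the
        current moment, out of all data output in the current period.

        period_frames: assumed list of dicts like {"timestamp": 12.30, "data": ...};
        tolerance: assumed matching window in seconds.
        """
        current_data = []
        for frame in period_frames:
            ts = frame["timestamp"]  # timestamp extracted from the sensor data
            if abs(ts - current_time) <= tolerance:
                current_data.append(frame)
        return current_data

    # Usage: at least one item of current sensor data at t = 12.30 s
    frames = [{"timestamp": 12.25, "data": "old"}, {"timestamp": 12.30, "data": "new"}]
    print(select_current_data(frames, current_time=12.30))  # keeps the 12.30 frame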
S102, determining target track data of the current sensor data which are mutually related according to at least one current sensor data output by the current sensor at the current moment and each predetermined historical track data.
In a specific embodiment of the present invention, the electronic device may determine target trajectory data in which each current sensor data is associated with each other according to at least one current sensor data output by the current sensor at the current time and each predetermined historical trajectory data. Specifically, the electronic device may first calculate the similarity between each current sensor data output by the current sensor at the current time and each historical track data; and then determining target track data of the current sensor data which are correlated with each other according to the similarity of each current sensor data output by the current sensor at the current moment and each historical track data.
S103, fusing the current sensor data and the target track data of the current sensor data correlated with each other at the current moment.
In a specific embodiment of the present invention, the electronic device may fuse each current sensor data and target trajectory data in which each current sensor data is associated with each other at the current time. Specifically, the electronic device may first extract feature information of a current time from each current sensor data and target trajectory data in which each current sensor data is associated with each other; wherein the characteristic information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information; then determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the characteristic information of the target track data, correlated with the current sensor data, of the unmanned vehicle at the current moment; wherein the fusion information includes: speed information, appearance information, or category information; and fusing the current sensor data and the target track data correlated with the current sensor data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
The data fusion method provided by the embodiment of the invention first obtains at least one item of current sensor data output by a current sensor in the unmanned vehicle at the current moment; then determines, according to the at least one item of current sensor data output by the current sensor at the current moment and each item of predetermined historical track data, the target track data correlated with each item of current sensor data; and finally fuses, at the current moment, each item of current sensor data with the target track data correlated with it. That is, in the technical solution of the present invention, the target track data correlated with each item of current sensor data is determined before fusion. The existing data fusion method, by contrast, directly fuses the sensor data output by all sensors; when the number of sensors in the unmanned vehicle is large, the ambiguity of the sensor data output by each sensor is large, and the fusion logic is complex, this reduces both the robustness and the real-time performance of data fusion. Compared with the prior art, the data fusion method provided by the embodiment of the invention can therefore improve the robustness of data fusion while ensuring its real-time performance; moreover, the technical solution of the embodiment of the invention is simple and convenient to implement, easy to popularize, and applicable to a wide range of scenarios.
Example two
Fig. 2 is a schematic flow chart of a data fusion method according to a second embodiment of the present invention. As shown in fig. 2, the data fusion method may include the steps of:
s201, acquiring at least one current sensor data output by a current sensor in the unmanned vehicle at the current moment.
In particular embodiments of the present invention, an electronic device may obtain at least one current sensor data output by a current sensor in an unmanned vehicle at a current time. Specifically, the electronic device may first acquire sensor data output by a current sensor in the unmanned vehicle in a current period; then extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each sensor data.
S202, calculating the similarity of each current sensor data and each historical track data output by the current sensor at the current moment.
In a specific embodiment of the present invention, the electronic device may calculate the similarity between each current sensor data and each historical track data output by the current sensor at the current time. Specifically, suppose that the current sensor outputs X current sensor data at the current time, which are: current sensor data 1, current sensor data 2, …, current sensor data X; the electronic equipment determines Y pieces of historical track data in advance, which are respectively as follows: historical track data 1, historical track data 2, …, and historical track data Y; wherein X and Y are both natural numbers equal to or greater than 1. In this step, the electronic device may calculate the similarity between the current sensor data 1 and the historical track data 1, the similarity between the current sensor data 1 and the historical track data 2, …, and the similarity between the current sensor data 1 and the historical track data Y; the similarity between the current sensor data 2 and the historical track data 1, the similarity between the current sensor data 2 and the historical track data 2, …, and the similarity between the current sensor data 2 and the historical track data Y can also be calculated; and so on; the similarity of the current sensor data X and the historical track data 1, the similarity of the current sensor data X and the historical track data 2, …, and the similarity of the current sensor data X and the historical track data Y may also be calculated.
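A minimal sketch of this X-by-Y similarity computation, assuming the similarity measure itself is supplied as a function (the patent does not fix one):

    import numpy as np

    def similarity_matrix(current_data, history_tracks, similarity_fn):
        """Similarity between every current sensor data item (X of them)
        and every historical track data item (Y of them)."""
        sim = np.zeros((len(current_data), len(history_tracks)))
        for i, datum in enumerate(current_data):        # current sensor data 1..X
            for j, track in enumerate(history_tracks):  # historical track data 1..Y
                sim[i, j] = similarity_fn(datum, track)
        return sim

    # Usage with a placeholder similarity function (assumed, for illustration)
    cur = [{"pos": 0.0}, {"pos": 5.0}]
    tracks = [{"pos": 4.9}, {"pos": 0.2}]
    print(similarity_matrix(cur, tracks,
                            lambda d, t: 1.0 / (1.0 + abs(d["pos"] - t["pos"]))))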
Preferably, in an embodiment of the present invention, the electronic device may extract vehicle driving data corresponding to each current sensor data from each current sensor data output by the current sensor at the current time; split the vehicle driving data corresponding to each current sensor data into M pieces of vehicle driving sub-data, where M is a natural number greater than 1; calculate the similarity between the M pieces of vehicle driving sub-data corresponding to each current sensor data and each historical track data; and determine, according to these similarities, the similarity between each current sensor data output by the current sensor at the current time and each historical track data. Specifically, the electronic device may extract the vehicle driving data corresponding to each current sensor data from each current sensor data according to a predetermined data storage format, and split the vehicle driving data corresponding to each current sensor data into M pieces of vehicle driving sub-data according to a predetermined data splitting mode. For example, assume that the vehicle driving data corresponding to one current sensor data includes 10 objects: 3 pedestrians, 3 vehicles and 4 trees. In this step, the electronic device can split this vehicle driving data into 3 pieces of vehicle driving sub-data: vehicle driving sub-data 1, which includes the 3 pedestrians; vehicle driving sub-data 2, which includes the 3 vehicles; and vehicle driving sub-data 3, which includes the 4 trees. The electronic device then calculates the similarity between the M pieces of vehicle driving sub-data corresponding to each current sensor data and each historical track data, and determines from these the similarity between each current sensor data output by the current sensor at the current time and each historical track data.
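The category-wise split in the example above (pedestrians, vehicles, trees) could be sketched as follows; the object representation and the choice of splitting by category are assumptions, since the patent only requires a predetermined splitting mode:

    from collections import defaultdict

    def split_driving_data(objects):
        """Split one item of vehicle driving data into M pieces of vehicle
        driving sub-data, here grouped by object category."""
        groups = defaultdict(list)
        for obj in objects:                    # e.g. {"category": "pedestrian", ...}
            groups[obj["category"]].append(obj)
        return list(groups.values())           # M sub-data groups, M > 1 expected

    # Usage: 3 pedestrians + 3 vehicles + 4 trees -> M = 3 sub-data groups
    objects = ([{"category": "pedestrian"}] * 3 +
               [{"category": "vehicle"}] * 3 +
               [{"category": "tree"}] * 4)
    print(len(split_driving_data(objects)))  # 3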
S203, determining target track data of the current sensor data correlated with each other according to the similarity of each current sensor data and each historical track data output by the current sensor at the current moment.
In a specific embodiment of the present invention, the electronic device may determine, according to the similarity between each current sensor data output by the current sensor at the current time and each historical track data, the target track data associated with each current sensor data. Specifically, suppose that the current sensor outputs X current sensor data at the current time, which are: current sensor data 1, current sensor data 2, …, current sensor data X; and the electronic equipment determines Y pieces of historical track data in advance, which are respectively: historical track data 1, historical track data 2, …, historical track data Y; wherein X and Y are both natural numbers equal to or greater than 1. In this step, the electronic device may determine the target track data 1 associated with current sensor data 1 according to the similarity between current sensor data 1 and historical track data 1, the similarity between current sensor data 1 and historical track data 2, …, and the similarity between current sensor data 1 and historical track data Y; may likewise determine the target track data 2 associated with current sensor data 2 according to the similarity between current sensor data 2 and historical track data 1, the similarity between current sensor data 2 and historical track data 2, …, and the similarity between current sensor data 2 and historical track data Y; and so on; and may determine the target track data X associated with current sensor data X according to the similarity between current sensor data X and historical track data 1, the similarity between current sensor data X and historical track data 2, …, and the similarity between current sensor data X and historical track data Y.
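One possible way to turn these similarities into the association, sketched under the assumption that a globally optimal one-to-one assignment (the Hungarian algorithm) and a minimum-similarity threshold are acceptable choices; the patent itself only requires that the target track data be determined from the similarities:

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(sim, min_similarity=0.5):
        """Map each current sensor datum i to its target track j by an
        optimal one-to-one assignment over the similarity matrix sim."""
        rows, cols = linear_sum_assignment(-sim)  # negate to maximize similarity
        return {int(i): int(j) for i, j in zip(rows, cols)
                if sim[i, j] >= min_similarity}

    # Usage: datum 0 is associated with track 1, datum 1 with track 0
    sim = np.array([[0.2, 0.9],
                    [0.8, 0.3]])
    print(associate(sim))  # {0: 1, 1: 0}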
And S204, fusing the current sensor data and the target track data of which the current sensor data are mutually related at the current moment.
In a specific embodiment of the present invention, the electronic device may fuse each current sensor data and target trajectory data in which each current sensor data is associated with each other at the current time. Specifically, the electronic device may first extract feature information of a current time from each current sensor data and target trajectory data in which each current sensor data is associated with each other; wherein the characteristic information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information; then determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the characteristic information of the target track data, correlated with the current sensor data, of the unmanned vehicle at the current moment; wherein the fusion information includes: speed information, appearance information, or category information; and fusing the current sensor data and the target track data correlated with the current sensor data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
The data fusion method provided by the embodiment of the invention first obtains at least one item of current sensor data output by a current sensor in the unmanned vehicle at the current moment; then determines, according to the at least one item of current sensor data output by the current sensor at the current moment and each item of predetermined historical track data, the target track data correlated with each item of current sensor data; and finally fuses, at the current moment, each item of current sensor data with the target track data correlated with it. That is, in the technical solution of the present invention, the target track data correlated with each item of current sensor data is determined before fusion. The existing data fusion method, by contrast, directly fuses the sensor data output by all sensors; when the number of sensors in the unmanned vehicle is large, the ambiguity of the sensor data output by each sensor is large, and the fusion logic is complex, this reduces both the robustness and the real-time performance of data fusion. Compared with the prior art, the data fusion method provided by the embodiment of the invention can therefore improve the robustness of data fusion while ensuring its real-time performance; moreover, the technical solution of the embodiment of the invention is simple and convenient to implement, easy to popularize, and applicable to a wide range of scenarios.
Example three
Fig. 3 is a schematic flow chart of a data fusion method according to a third embodiment of the present invention. As shown in fig. 3, the data fusion method may include the steps of:
s301, at least one piece of current sensor data output by a current sensor in the unmanned vehicle at the current moment is obtained.
In particular embodiments of the present invention, an electronic device may obtain at least one current sensor data output by a current sensor in an unmanned vehicle at a current time. Specifically, the electronic device may first acquire sensor data output by a current sensor in the unmanned vehicle in a current period; then extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each sensor data.
S302, calculating the similarity of each current sensor data and each historical track data output by the current sensor at the current moment.
In a specific embodiment of the present invention, the electronic device may calculate the similarity between each current sensor data and each historical track data output by the current sensor at the current time. Specifically, suppose that the current sensor outputs X current sensor data at the current time, which are: current sensor data 1, current sensor data 2, …, current sensor data X; the electronic equipment determines Y pieces of historical track data in advance, which are respectively as follows: historical track data 1, historical track data 2, …, and historical track data Y; wherein X and Y are both natural numbers equal to or greater than 1. In this step, the electronic device may calculate the similarity between the current sensor data 1 and the historical track data 1, the similarity between the current sensor data 1 and the historical track data 2, …, and the similarity between the current sensor data 1 and the historical track data Y; the similarity between the current sensor data 2 and the historical track data 1, the similarity between the current sensor data 2 and the historical track data 2, …, and the similarity between the current sensor data 2 and the historical track data Y can also be calculated; and so on; the similarity of the current sensor data X and the historical track data 1, the similarity of the current sensor data X and the historical track data 2, …, and the similarity of the current sensor data X and the historical track data Y may also be calculated.
Preferably, in an embodiment of the present invention, the electronic device may extract vehicle driving data corresponding to each current sensor data from each current sensor data output by the current sensor at the current time; split the vehicle driving data corresponding to each current sensor data into M pieces of vehicle driving sub-data, where M is a natural number greater than 1; calculate the similarity between the M pieces of vehicle driving sub-data corresponding to each current sensor data and each historical track data; and determine, according to these similarities, the similarity between each current sensor data output by the current sensor at the current time and each historical track data. Specifically, the electronic device may extract the vehicle driving data corresponding to each current sensor data from each current sensor data according to a predetermined data storage format, and split the vehicle driving data corresponding to each current sensor data into M pieces of vehicle driving sub-data according to a predetermined data splitting mode. For example, assume that the vehicle driving data corresponding to one current sensor data includes 10 objects: 3 pedestrians, 3 vehicles and 4 trees. In this step, the electronic device can split this vehicle driving data into 3 pieces of vehicle driving sub-data: vehicle driving sub-data 1, which includes the 3 pedestrians; vehicle driving sub-data 2, which includes the 3 vehicles; and vehicle driving sub-data 3, which includes the 4 trees. The electronic device then calculates the similarity between the M pieces of vehicle driving sub-data corresponding to each current sensor data and each historical track data, and determines from these the similarity between each current sensor data output by the current sensor at the current time and each historical track data.
And S303, determining target track data of the current sensor data correlated with each other according to the similarity of each current sensor data and each historical track data output by the current sensor at the current moment.
In a specific embodiment of the present invention, the electronic device may determine, according to the similarity between each current sensor data output by the current sensor at the current time and each historical track data, the target track data associated with each current sensor data. Specifically, suppose that the current sensor outputs X current sensor data at the current time, which are: current sensor data 1, current sensor data 2, …, current sensor data X; and the electronic equipment determines Y pieces of historical track data in advance, which are respectively: historical track data 1, historical track data 2, …, historical track data Y; wherein X and Y are both natural numbers equal to or greater than 1. In this step, the electronic device may determine the target track data 1 associated with current sensor data 1 according to the similarity between current sensor data 1 and historical track data 1, the similarity between current sensor data 1 and historical track data 2, …, and the similarity between current sensor data 1 and historical track data Y; may likewise determine the target track data 2 associated with current sensor data 2 according to the similarity between current sensor data 2 and historical track data 1, the similarity between current sensor data 2 and historical track data 2, …, and the similarity between current sensor data 2 and historical track data Y; and so on; and may determine the target track data X associated with current sensor data X according to the similarity between current sensor data X and historical track data 1, the similarity between current sensor data X and historical track data 2, …, and the similarity between current sensor data X and historical track data Y.
S304, respectively extracting the characteristic information of the current moment from the current sensor data and the target track data of which the current sensor data are mutually associated.
In a specific embodiment of the present invention, the electronic device may respectively extract feature information of the current time from each current sensor data and the target trajectory data associated with each current sensor data; wherein the characteristic information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information. Specifically, the electronic device may extract the feature information 1 at the current time from the current sensor data 1 and the target trajectory data 1 associated with it; extract the feature information 2 at the current time from the current sensor data 2 and the target trajectory data 2 associated with it; and so on; and extract the feature information X at the current time from the current sensor data X and the target trajectory data X associated with it.
S305, determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the feature information of the target track data, correlated with the current sensor data, at the current moment.
In a specific embodiment of the present invention, the electronic device may determine the fusion information of the unmanned vehicle at the current time according to the feature information, at the current time, of each current sensor data and of the target trajectory data associated with it; wherein the fusion information includes: speed information, appearance information, or category information. Specifically, the electronic device may determine the fusion information 1 of the unmanned vehicle at the current time according to the feature information 1, at the current time, of the current sensor data 1 and the target trajectory data 1 associated with it; determine the fusion information 2 of the unmanned vehicle at the current time according to the feature information 2, at the current time, of the current sensor data 2 and the target trajectory data 2 associated with it; and so on; and determine the fusion information X of the unmanned vehicle at the current time according to the feature information X, at the current time, of the current sensor data X and the target trajectory data X associated with it.
S306, fusing the current sensor data and the target track data of the current sensor data correlated with each other at the current moment according to the fusion information of the unmanned vehicle at the current moment.
In a specific embodiment of the present invention, the electronic device may fuse, at the current time, each current sensor data with the target trajectory data associated with it according to the fusion information of the unmanned vehicle at the current time. Specifically, the electronic device may fuse, at the current time, the current sensor data 1 with the target trajectory data 1 associated with it according to the fusion information 1 of the unmanned vehicle at the current time; fuse the current sensor data 2 with the target trajectory data 2 associated with it according to the fusion information 2; and so on; and fuse the current sensor data X with the target trajectory data X associated with it according to the fusion information X of the unmanned vehicle at the current time.
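Drawing steps S304 to S306 together, a minimal per-pair fusion sketch follows; the feature dictionary, the averaging of speed, and the preference rules for appearance and category are all illustrative assumptions, as the patent names the categories of feature and fusion information but not the combination rules:

    def extract_features(item):
        # Hypothetical helper: assumes appearance, motion (speed) and category
        # features are already stored on the datum or on the track.
        return {"speed": item["speed"], "appearance": item["appearance"],
                "category": item["category"]}

    def fuse_pair(current_datum, target_track):
        """Fuse one current sensor datum with the target track associated with it."""
        f_now = extract_features(current_datum)    # feature info at the current moment
        f_track = extract_features(target_track)
        fusion_info = {
            "speed": 0.5 * (f_now["speed"] + f_track["speed"]),  # blended estimate
            "appearance": f_now["appearance"],     # trust the newest observation
            "category": f_track["category"],       # trust the accumulated track
        }
        target_track.setdefault("states", []).append(fusion_info)  # extend the track
        return fusion_info

    # Usage
    track = {"speed": 4.8, "appearance": "sedan", "category": "motor vehicle"}
    datum = {"speed": 5.2, "appearance": "dark sedan", "category": "motor vehicle"}
    print(fuse_pair(datum, track)["speed"])  # 5.0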
The data fusion method provided by the embodiment of the invention first obtains at least one item of current sensor data output by a current sensor in the unmanned vehicle at the current moment; then determines, according to the at least one item of current sensor data output by the current sensor at the current moment and each item of predetermined historical track data, the target track data correlated with each item of current sensor data; and finally fuses, at the current moment, each item of current sensor data with the target track data correlated with it. That is, in the technical solution of the present invention, the target track data correlated with each item of current sensor data is determined before fusion. The existing data fusion method, by contrast, directly fuses the sensor data output by all sensors; when the number of sensors in the unmanned vehicle is large, the ambiguity of the sensor data output by each sensor is large, and the fusion logic is complex, this reduces both the robustness and the real-time performance of data fusion. Compared with the prior art, the data fusion method provided by the embodiment of the invention can therefore improve the robustness of data fusion while ensuring its real-time performance; moreover, the technical solution of the embodiment of the invention is simple and convenient to implement, easy to popularize, and applicable to a wide range of scenarios.
Example four
Fig. 4 is a schematic view of a first structure of a data fusion apparatus according to a fourth embodiment of the present invention. As shown in fig. 4, the data fusion apparatus according to the embodiment of the present invention may include: an acquisition module 401, a determination module 402 and a fusion module 403; wherein:
the acquiring module 401 is configured to acquire at least one current sensor data output by a current sensor in the unmanned vehicle at a current moment;
the determining module 402 is configured to determine target trajectory data, in which current sensor data are associated with each other, according to at least one current sensor data output by the current sensor at a current time and each predetermined historical trajectory data;
the fusion module 403 is configured to fuse the current sensor data and target trajectory data of the current sensor data correlated with each other at a current time.
Further, the obtaining module 401 is specifically configured to obtain sensor data output by the current sensor in the unmanned vehicle in a current period; extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one piece of current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each piece of sensor data.
Fig. 5 is a second schematic structural diagram of a data fusion apparatus according to a fourth embodiment of the present invention. As shown in fig. 5, the determining module 402 includes: a calculation sub-module 4021 and a determination sub-module 4022; wherein:
the calculating submodule 4021 is configured to calculate similarity between each current sensor data output by the current sensor at the current time and each historical track data;
the determining sub-module 4022 is configured to determine target trajectory data, in which current sensor data are associated with each other, according to similarity between each current sensor data output by the current sensor at the current time and each historical trajectory data.
Further, the calculating submodule 4021 is specifically configured to extract vehicle driving data corresponding to each current sensor data from each current sensor data output by the current sensor at the current time; splitting vehicle running data corresponding to each current sensor data into M vehicle running subdata; wherein M is a natural number greater than 1; calculating the similarity of M vehicle driving subdata corresponding to each current sensor data and each historical track data; and determining the similarity of each current sensor data and each historical track data output by the current sensor at the current moment according to the similarity of the M pieces of vehicle driving sub data corresponding to each current sensor data and each historical track data.
Further, the fusion module 403 is specifically configured to extract feature information of the current time from each current sensor data and target trajectory data of each current sensor data correlated with each other; wherein the feature information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information; determining fusion information of the unmanned vehicle at the current moment according to the current sensor data and the characteristic information of the target track data, correlated with the current sensor data, of the unmanned vehicle at the current moment; wherein the fusion information includes: speed information, appearance information, or category information; and fusing the current sensor data and the target track data which is associated with the current sensor data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
The data fusion device can execute the method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the data fusion method provided in any embodiment of the present invention.
Example five
Fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. Fig. 6 illustrates a block diagram of an exemplary electronic device suitable for implementing embodiments of the present invention. The electronic device 12 shown in fig. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present invention.
As shown in FIG. 6, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in fig. 6, and commonly referred to as a "hard drive"). Although not shown in fig. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing, such as implementing the data fusion method provided by the embodiments of the present invention, by executing programs stored in the system memory 28.
Example six
The sixth embodiment of the invention provides a computer storage medium.
The computer-readable storage media of embodiments of the invention may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A method of data fusion, the method comprising:
acquiring at least one piece of current sensor data output by a current sensor in an unmanned vehicle at a current moment;
determining, for each piece of current sensor data, target track data correlated with that current sensor data according to the at least one piece of current sensor data output by the current sensor at the current moment and each piece of predetermined historical track data;
determining fusion information of the unmanned vehicle at the current moment according to each piece of current sensor data and the target track data correlated with it; and fusing each piece of current sensor data with its correlated target track data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
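For illustration only, a minimal Python sketch of the flow in claim 1 (this and the later sketches are editorial illustrations, not part of the claims). The association, fusion-information, and fusion steps are injected as callables, and every name here (SensorDatum, Track, fuse_at_current_moment) is an assumption; the claim specifies behavior, not an implementation.

```python
from dataclasses import dataclass, field

# Hypothetical containers; the claim does not fix any data layout.
@dataclass
class SensorDatum:
    timestamp: float    # time the measurement was produced
    measurements: dict  # raw per-sensor readings

@dataclass
class Track:
    track_id: int
    history: list = field(default_factory=list)  # past states of the tracked object

def fuse_at_current_moment(current_data, tracks, associate, compute_info, fuse):
    """Claim 1 as a pipeline: associate each current sensor datum with a
    target track, derive fusion information, then fuse accordingly."""
    results = []
    for datum in current_data:
        track = associate(datum, tracks)          # step 2: data-to-track association
        if track is None:
            continue                              # unmatched data is skipped in this sketch
        info = compute_info(datum, track)         # step 3a: fusion information
        results.append(fuse(datum, track, info))  # step 3b: fuse per that information
    return results
```

In practice, unmatched data would more likely seed a new track than be dropped; the claim does not address that case.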
2. The method of claim 1, wherein the acquiring at least one piece of current sensor data output by a current sensor in the unmanned vehicle at the current moment comprises:
acquiring each piece of sensor data output by the current sensor in the unmanned vehicle in a current period;
extracting, from each piece of sensor data output by the current sensor in the current period, a timestamp corresponding to that sensor data;
and acquiring at least one piece of current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each piece of sensor data.
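A sketch of the timestamp selection in claim 2, assuming each output carries a timestamp attribute and that "at the current moment" means within a small tolerance of it; both assumptions go beyond the claim.

```python
def select_current_data(period_outputs, current_time, tolerance=0.05):
    """Keep only the sensor outputs from the current period whose extracted
    timestamp falls within `tolerance` seconds of the current moment."""
    current = []
    for output in period_outputs:
        ts = output.timestamp  # timestamp extracted from the sensor data
        if abs(ts - current_time) <= tolerance:
            current.append(output)
    return current
```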
3. The method of claim 1, wherein the determining, for each piece of current sensor data, target track data correlated with that current sensor data according to the at least one piece of current sensor data output by the current sensor at the current moment and each piece of predetermined historical track data comprises:
calculating a similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data;
and determining, for each piece of current sensor data, the correlated target track data according to the similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data.
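Claim 3 leaves the matching rule open; the sketch below reads it as a greedy best-score assignment with an assumed similarity threshold.

```python
def associate_by_similarity(current_data, tracks, similarity, threshold=0.5):
    """Score every (current datum, historical track) pair and assign each
    datum the best-scoring track, if that score clears the threshold."""
    assignments = {}
    for i, datum in enumerate(current_data):
        scored = [(similarity(datum, track), track) for track in tracks]
        if not scored:
            continue  # no historical tracks yet
        best_score, best_track = max(scored, key=lambda pair: pair[0])
        if best_score >= threshold:
            assignments[i] = best_track.track_id
    return assignments
```

A production tracker would typically solve the assignment globally (for example with the Hungarian algorithm) rather than greedily.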
4. The method of claim 3, wherein the calculating a similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data comprises:
extracting, from each piece of current sensor data output by the current sensor at the current moment, vehicle driving data corresponding to that current sensor data;
splitting the vehicle driving data corresponding to each piece of current sensor data into M pieces of vehicle driving sub-data, wherein M is a natural number greater than 1;
calculating a similarity between the M pieces of vehicle driving sub-data corresponding to each piece of current sensor data and each piece of historical track data;
and determining the similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data according to the similarity between the M pieces of vehicle driving sub-data corresponding to that current sensor data and each piece of historical track data.
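A sketch of the segmented similarity in claim 4. The even split rule and the mean used to combine per-segment scores are both assumptions; the claim fixes only that M is greater than 1 and that per-segment similarities feed the overall one.

```python
def split_into_m(data, m):
    """Split a sequence into exactly M near-equal segments (split rule assumed)."""
    n = len(data)
    bounds = [round(k * n / m) for k in range(m + 1)]
    return [data[bounds[k]:bounds[k + 1]] for k in range(m)]

def segmented_similarity(driving_data, track, segment_similarity, m=4):
    """Score each of the M vehicle-driving sub-sequences against a historical
    track and combine the per-segment scores into one similarity value."""
    assert m > 1, "claim 4 requires M to be a natural number greater than 1"
    segments = split_into_m(driving_data, m)
    scores = [segment_similarity(seg, track) for seg in segments]
    return sum(scores) / len(scores)  # plain mean; weighting is another option
```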
5. The method of claim 1, wherein the determining fusion information of the unmanned vehicle at the current moment according to each piece of current sensor data and the target track data correlated with it comprises:
extracting feature information at the current moment from each piece of current sensor data and from the target track data correlated with it, respectively; wherein the feature information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information;
and determining the fusion information of the unmanned vehicle at the current moment according to the feature information of each piece of current sensor data and of its correlated target track data; wherein the fusion information includes: speed information, appearance information, or category information.
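One possible reading of claim 5 in code. Which feature kinds feed which piece of fusion information is an assumption (the claim only enumerates the kinds), as are the injected extract and combine callables.

```python
FEATURE_KINDS = ("appearance", "motion", "semantic", "scene")  # listed in claim 5

def fusion_info(datum, track, extract, combine):
    """Extract each kind of feature from a current sensor datum and from its
    correlated target track, then combine the pairs into fusion information."""
    paired = {kind: (extract(datum, kind), extract(track, kind))
              for kind in FEATURE_KINDS}
    return {
        "speed": combine(paired["motion"]),
        "appearance": combine(paired["appearance"]),
        "category": combine(paired["semantic"] + paired["scene"]),
    }
```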
6. A data fusion apparatus, the apparatus comprising: an acquisition module, a determination module, and a fusion module; wherein:
the acquisition module is used for acquiring at least one piece of current sensor data output by a current sensor in an unmanned vehicle at a current moment;
the determination module is used for determining, for each piece of current sensor data, target track data correlated with that current sensor data according to the at least one piece of current sensor data output by the current sensor at the current moment and each piece of predetermined historical track data;
the fusion module is used for determining fusion information of the unmanned vehicle at the current moment according to each piece of current sensor data and the target track data correlated with it, and for fusing each piece of current sensor data with its correlated target track data at the current moment according to the fusion information of the unmanned vehicle at the current moment.
7. The apparatus of claim 6, wherein:
the acquisition module is specifically used for acquiring data of each sensor output by the current sensor in the unmanned vehicle in a current period; extracting a timestamp corresponding to each sensor data from each sensor data output by the current sensor in the current period; and acquiring at least one piece of current sensor data output by the current sensor at the current moment according to the timestamp corresponding to each piece of sensor data.
8. The apparatus of claim 6, wherein the determination module comprises: a calculation submodule and a determination submodule; wherein:
the calculation submodule is used for calculating a similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data;
and the determination submodule is used for determining, for each piece of current sensor data, the correlated target track data according to the similarity between each piece of current sensor data output by the current sensor at the current moment and each piece of historical track data.
9. The apparatus of claim 8, wherein:
the calculation submodule is specifically configured to extract vehicle driving data corresponding to each current sensor data from each current sensor data output by the current sensor at the current time; splitting vehicle running data corresponding to each current sensor data into M vehicle running subdata; wherein M is a natural number greater than 1; calculating the similarity of M vehicle driving subdata corresponding to each current sensor data and each historical track data; and determining the similarity of each current sensor data and each historical track data output by the current sensor at the current moment according to the similarity of the M pieces of vehicle driving sub data corresponding to each current sensor data and each historical track data.
10. The apparatus of claim 6, wherein:
the fusion module is specifically used for extracting feature information at the current moment from each piece of current sensor data and from the target track data correlated with it, respectively, wherein the feature information includes: appearance feature information, motion feature information, semantic feature information, or scene feature information; and determining the fusion information of the unmanned vehicle at the current moment according to the feature information of each piece of current sensor data and of its correlated target track data, wherein the fusion information includes: speed information, appearance information, or category information.
11. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the data fusion method of any one of claims 1-5.
12. A storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the data fusion method according to any one of claims 1-5.
CN201811479927.2A 2018-12-05 2018-12-05 Data fusion method and device, electronic equipment and storage medium Active CN109635861B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811479927.2A CN109635861B (en) 2018-12-05 2018-12-05 Data fusion method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109635861A (en) 2019-04-16
CN109635861B (en) 2021-01-01

Family

ID=66071274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811479927.2A Active CN109635861B (en) 2018-12-05 2018-12-05 Data fusion method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109635861B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114879696A (en) * 2019-05-15 2022-08-09 百度在线网络技术(北京)有限公司 Track matching method, device, equipment and medium
CN110751205A (en) * 2019-10-17 2020-02-04 北京百度网讯科技有限公司 Object association method, device, equipment and medium
CN112665590B (en) * 2020-12-11 2023-04-21 国汽(北京)智能网联汽车研究院有限公司 Vehicle track determination method and device, electronic equipment and computer storage medium
CN113094000A (en) * 2021-05-10 2021-07-09 宝能(广州)汽车研究院有限公司 Vehicle signal storage method and device, storage equipment and storage medium
CN115205820A (en) * 2022-07-05 2022-10-18 安徽蔚来智驾科技有限公司 Object association method, computer device, computer-readable storage medium, and vehicle

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108919802A * 2018-07-04 2018-11-30 百度在线网络技术(北京)有限公司 Unmanned vehicle traveling method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9963215B2 (en) * 2014-12-15 2018-05-08 Leidos, Inc. System and method for fusion of sensor data to support autonomous maritime vessels
MX2017014648A (en) * 2015-05-15 2018-04-11 Airfusion Inc Portable apparatus and method for decision support for real time automated multisensor data fusion and analysis.
CN108828527B (en) * 2018-06-19 2021-04-16 驭势(上海)汽车科技有限公司 Multi-sensor data fusion method and device, vehicle-mounted equipment and storage medium

Also Published As

Publication number Publication date
CN109635861A (en) 2019-04-16

Similar Documents

Publication Publication Date Title
CN109635861B (en) Data fusion method and device, electronic equipment and storage medium
CN109343061B (en) Sensor calibration method and device, computer equipment, medium and vehicle
CN109145680B (en) Method, device and equipment for acquiring obstacle information and computer storage medium
US11599120B2 (en) Obstacle detecting method, apparatus, device and computer storage medium
CN109188438B (en) Yaw angle determination method, device, equipment and medium
US20190080186A1 (en) Traffic light state recognizing method and apparatus, computer device and readable medium
CN109116374B (en) Method, device and equipment for determining distance of obstacle and storage medium
CN109931945B (en) AR navigation method, device, equipment and storage medium
CN109606384B (en) Vehicle control method, device, equipment and storage medium
CN110097121B (en) Method and device for classifying driving tracks, electronic equipment and storage medium
CN109558854B (en) Obstacle sensing method and device, electronic equipment and storage medium
CN112798004B (en) Positioning method, device and equipment for vehicle and storage medium
CN112200142A (en) Method, device, equipment and storage medium for identifying lane line
CN113537362A (en) Perception fusion method, device, equipment and medium based on vehicle-road cooperation
CN115965657B (en) Target tracking method, electronic device, storage medium and vehicle
CN112650300A (en) Unmanned aerial vehicle obstacle avoidance method and device
CN114194217A (en) Vehicle automatic driving method, device, electronic equipment and storage medium
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN110363193B (en) Vehicle weight recognition method, device, equipment and computer storage medium
JP2023038164A (en) Obstacle detection method, device, automatic driving vehicle, apparatus, and storage medium
CN109300322B (en) Guideline drawing method, apparatus, device, and medium
CN112102648B (en) Vacant parking space pushing method, device, equipment and storage medium
CN115908498B (en) Multi-target tracking method and device based on category optimal matching
CN109270566B (en) Navigation method, navigation effect testing method, device, equipment and medium
CN110440788B (en) Navigation prompting method crossing single-line road, server and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant