CN117372477A - Target tracking matching method, device, equipment and medium - Google Patents

Target tracking matching method, device, equipment and medium

Info

Publication number
CN117372477A
Authority
CN
China
Prior art keywords
target
matched
historical
matching
targets
Prior art date
Legal status
Pending
Application number
CN202311401429.7A
Other languages
Chinese (zh)
Inventor
薛鸿
陈博
徐名源
朱亚旋
邱璆
王佑星
张达明
宋楠楠
Current Assignee
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Original Assignee
Faw Nanjing Technology Development Co ltd
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by Faw Nanjing Technology Development Co ltd and FAW Group Corp
Priority to CN202311401429.7A
Publication of CN117372477A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/62 Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target tracking and matching method, apparatus, device, and medium. A historical target in a historical image frame is obtained; targets to be matched are identified in a current image frame and classified, based on their confidence, into a first target to be matched and a second target to be matched; position matching is performed on the first target to be matched against the historical targets to obtain a matching result for the first target to be matched and a set of unmatched historical targets; and feature matching is performed between the unmatched historical targets and the second target to be matched to obtain a matching result for the second target to be matched. Embodiments of the invention eliminate redundant feature extraction and matching, solve the problem of unnecessary waste of computing resources in purely feature-based target tracking and matching strategies, optimize the tracking and matching strategy, reduce the amount of computation, and improve target tracking and matching efficiency.

Description

Target tracking matching method, device, equipment and medium
Technical Field
The present invention relates to the field of autopilot technology, and in particular, to a target tracking matching method, apparatus, device, and medium.
Background
Autonomous driving is a leading-edge technology and a focus of the current mobility field. To ensure the maturity and reliability of autonomous driving technology, it is generally necessary to track obstacles and plan the driving route according to the position tracks of those obstacles.
At present, traditional matching rules include position-based matching, feature-based matching, and the like. Feature matching obtains target features from a re-identification (ReID) model and performs target tracking matching based on those features.
However, autonomous driving scenarios place high demands on target tracking and matching speed, while the current practice of obtaining target features through a ReID model and matching on them requires a large amount of computation and consumes considerable time.
Disclosure of Invention
The invention provides a target tracking matching method, apparatus, device, and medium that further optimize the target tracking matching strategy while preserving tracking and matching accuracy, so as to solve the time-consumption problem of feature-based matching, reduce the amount of computation, and improve matching efficiency.
According to an aspect of the present invention, there is provided a target tracking matching method, including:
acquiring a historical target in a historical image frame;
identifying a target to be matched in a current image frame, and classifying the target to be matched based on the confidence of the target to be matched to obtain a first target to be matched and a second target to be matched;
performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target;
and performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
According to another aspect of the present invention, there is provided a target tracking matching apparatus, comprising:
the historical target acquisition module is used for acquiring historical targets in the historical image frames;
the target to be matched identification module is used for identifying targets to be matched in the current image frame, classifying the targets to be matched based on the confidence of the targets to be matched, and obtaining a first target to be matched and a second target to be matched;
the position matching module is used for carrying out position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target;
and the feature matching module is used for performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the target tracking matching method according to any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to implement the target tracking matching method according to any of the embodiments of the present invention when executed.
According to the technical scheme of the embodiments of the invention, a historical target in a historical image frame is acquired; targets to be matched are identified in a current image frame and classified, based on their confidence, into a first target to be matched and a second target to be matched; position matching is performed on the first target to be matched against the historical targets to obtain a matching result for the first target to be matched and the unmatched historical targets; and feature matching is performed between the unmatched historical targets and the second target to be matched to obtain a matching result for the second target to be matched. In this way, redundant feature extraction and matching are removed, the problem of unnecessary waste of computing resources in feature-based target tracking matching strategies is solved, the tracking matching strategy is optimized, the amount of computation is reduced, and tracking matching efficiency is improved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a target tracking matching method according to an embodiment of the present invention;
fig. 2 is a flowchart of a target tracking matching method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a target tracking matching device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device implementing a target tracking matching method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a detailed description of embodiments of the present invention will be provided below, with reference to the accompanying drawings, wherein it is apparent that the described embodiments are only some, but not all, embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above-described drawings are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a target tracking and matching method according to an embodiment of the present invention, where the method may be performed by a target tracking and matching device, and the target tracking and matching device may be implemented in hardware and/or software, and the target tracking and matching device may be configured in an electronic device such as a computer or a server. As shown in fig. 1, the method includes:
S110, acquiring a historical target in the historical image frame.
In this embodiment, while the vehicle is driving, images of the vehicle's environment are acquired in real time, and a driving-environment video is formed according to the acquisition timestamps of the images. An image frame is a frame extracted from the driving-environment video according to its timestamp, for example a frame of a video of an autonomous driving scene. The image frames are ordered by their timestamps to obtain an image frame sequence; the ordering may be chronological or reverse-chronological, which this embodiment does not limit. A historical image frame is the earlier of any two consecutive image frames in the sequence. A target is an object that is tracked and matched in the video; taking a video of an autonomous driving scene as an example, a target is an obstacle that the autonomous vehicle needs to avoid while driving. Historical targets are the one or more targets tracked in the historical image frame.
Specifically, the target tracking matching apparatus may be connected to a camera device by wire and/or wirelessly to obtain video collected by the camera device in real time, or may retrieve video stored locally and/or on a server. Every frame extracted from the video may be used as an image frame, or one frame may be extracted at fixed frame intervals. The image frames are stored as an image frame sequence in timestamp order; from this sequence, any two consecutive image frames are obtained, the earlier one serving as the historical image frame, and target detection is performed on the historical image frame, manually and/or by algorithm, to obtain one or more historical targets.
For example, the target tracking matching apparatus may obtain video collected in real time by connecting to a camera of an autonomous vehicle, and extract each image frame of the video in timestamp order to form an image frame sequence in which the storage positions of the frames are numbered sequentially starting from 0. Assuming that the frame with sequence number 3 is a historical image frame, target detection is performed on that historical image frame to obtain several historical targets.
S120, identifying targets to be matched in the current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched.
In this embodiment, the current image frame is the next image frame adjacent to the historical image frame in the image frame sequence. The targets to be matched are one or more targets in the current image frame. The confidence of a target to be matched is the confidence of its recognition result, expressed as a probability value. The confidence of the first target to be matched is higher than that of the second target to be matched.
Specifically, any two consecutive image frames are obtained from the image frame sequence, the later one serving as the current image frame, and target detection is performed on the current image frame to obtain one or more targets to be matched along with the confidence of each. The targets to be matched are then classified according to a confidence threshold criterion. A high-confidence target to be matched can be considered unlikely to be occluded and/or to overlap other targets, so position-based matching against the historical targets has a high success rate; such a target is taken as a first target to be matched. A low-confidence target to be matched is likely to be occluded and/or to overlap other targets, so position-based matching against the historical targets has a low success rate; such a target is taken as a second target to be matched.
For example, the target tracking matching apparatus may obtain video collected in real time by connecting to a camera of an autonomous vehicle, and extract each image frame of the video in timestamp order to form an image frame sequence in which the storage positions of the frames are numbered sequentially starting from 0. Assume that the frame with sequence number 3 is the historical image frame and the frame with sequence number 4 is the current image frame, and that target detection on the current image frame yields two targets to be matched, the first having higher confidence than the second. The two targets are binary-classified against the confidence threshold criterion: the higher-confidence target becomes a first target to be matched, used for position matching with the historical targets, and the lower-confidence target becomes a second target to be matched, used for feature matching with the historical targets.
Optionally, performing object detection on the historical image frame and the current image frame based on the object detection model to obtain an object to be matched in the current image frame and the confidence of the object to be matched; and determining the target to be matched with the confidence coefficient being greater than or equal to the confidence coefficient threshold value as a first target to be matched, and determining the target to be matched with the confidence coefficient being smaller than the confidence coefficient threshold value as a second target to be matched.
In this embodiment, the confidence threshold is a threshold of confidence that determines that the target to be matched belongs to the first target to be matched or the second target to be matched.
Specifically, based on an object detection model (for example, a model based on traditional machine learning and a model based on deep learning) for performing object detection on historical image frames, object detection is performed on the current image frame, so as to obtain one or more objects to be matched and a confidence of each object to be matched. And comparing the confidence coefficient of the target to be matched with the confidence coefficient threshold value, determining the target to be matched with the confidence coefficient greater than or equal to the confidence coefficient threshold value as a first target to be matched, and determining the target to be matched with the confidence coefficient smaller than the confidence coefficient threshold value as a second target to be matched.
For example, assuming that the confidence coefficient threshold is 0.2, performing object detection on the current image frame based on a YOLO model for performing object detection on the historical image frame to obtain two objects to be matched, and the confidence coefficient of the two objects to be matched is 0.15 and 0.78 respectively, comparing the confidence coefficient of the two objects to be matched with the confidence coefficient threshold, taking the object to be matched with the confidence coefficient of 0.78 as a first object to be matched, and taking the object to be matched with the confidence coefficient of 0.15 as a second object to be matched.
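The confidence-based split described above can be written as a short Python sketch. This is illustrative, not the patent's implementation: the detection dictionary layout and helper name are assumptions, and the 0.2 threshold is taken from the example in the text.

```python
# Hypothetical sketch of the confidence-based classification step.
CONF_THRESHOLD = 0.2  # threshold value from the worked example (illustrative)

def split_by_confidence(detections, threshold=CONF_THRESHOLD):
    """Split detector outputs into first (high-confidence, position-matched)
    and second (low-confidence, feature-matched) targets to be matched."""
    first, second = [], []
    for det in detections:
        # ">= threshold" keeps boundary detections in the position-matching path
        (first if det["conf"] >= threshold else second).append(det)
    return first, second

# The two detections from the example: confidences 0.78 and 0.15
detections = [{"id": 0, "conf": 0.78}, {"id": 1, "conf": 0.15}]
first, second = split_by_confidence(detections)
```

The 0.78 detection lands in `first` and proceeds to position matching; the 0.15 detection lands in `second` and proceeds to feature matching.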
S130, performing position matching on the first target to be matched based on the historical targets to obtain a matching result of the first target to be matched and a non-matched historical target.
In this embodiment, position matching is matching based on the current position data of the first target to be matched and the historical position information of the historical target. The current position data may be the bounding-box position information of the first target to be matched in the current image frame, and the historical position information may be the bounding-box position information of the historical target in the historical image frame. The bounding box may be the two-dimensional or three-dimensional minimum bounding box of the first target to be matched or the historical target, i.e. a rectangle or a cuboid. The bounding-box position information may include the center-point coordinates and vertex coordinates of the box, as well as its length, width, and/or height. The matching result of the first target to be matched records whether each first target to be matched matched a historical target, and which first target to be matched matched which historical target.
Specifically, based on the current position data of the first targets to be matched and the historical position information of the historical targets, each first target to be matched is position-matched against all historical targets one by one to obtain a matching result. A historical target may match a first target to be matched, a first target to be matched may fail to match any historical target, or a historical target may fail to match any first target to be matched. Historical targets that match no first target to be matched are treated as unmatched historical targets for the subsequent matching operation.
For example, suppose there are 2 historical targets in the historical image frame, historical target 1 and historical target 2, and 2 first targets to be matched in the current image frame, first target to be matched 1 and first target to be matched 2. The 2 historical targets are position-matched one by one against the 2 first targets to be matched, yielding, for example, the result that historical target 1 matches first target to be matched 1 while historical target 2 matches neither first target to be matched; historical target 2 is then treated as an unmatched historical target.
Optionally, the current position data and current shape data of the first targets to be matched in the current image frame are obtained; the current position data and/or current shape data of each first target to be matched are matched against the historical position information and/or historical shape data of the historical targets to obtain matching degree data between the first target to be matched and each historical target; and the matching result of the first target to be matched is determined based on that matching degree data.
In this embodiment, the current shape data are data representing the shape of the first target to be matched, the historical shape data are data representing the shape of the historical target, and the matching degree data are data representing the degree of match between the first target to be matched and the historical target.
Specifically, using the same target detection method, the current position data and current shape data of the first targets to be matched in the current image frame and the historical position information and/or historical shape data of the historical targets are obtained. The current position data of each first target to be matched are matched against the historical position information of all historical targets to obtain position matching data, and/or the current shape data are matched against the historical shape data to obtain shape matching data. The position matching data and/or shape matching data are then combined with certain weights to obtain matching degree data between each first target to be matched and every historical target. Based on these matching degree data, the first targets to be matched and the historical targets are matched by maximum bipartite-graph matching, which yields an optimal assignment and thus the matching result between the first targets to be matched and the historical targets.
Take as an example bounding boxes of the first targets to be matched and the historical targets that include both a 2D minimum bounding box and a 3D minimum bounding box. The position information of a historical target and a first target to be matched is based on the center-point coordinates of the bounding box, and their shape data are based on the width, height, and/or length of the bounding box and its vertex coordinates. Matching scores based on center-point Gaussian motion, shape change, and shape overlap (Intersection over Union, IoU) are calculated for each first target to be matched against all historical targets using the following formulas.
(1) A matching score sm based on Gaussian motion of the 2D bounding-box center point is computed as: sm = gaussian(c.x_det, c.x_tra, w_det) * gaussian(c.y_det, c.y_tra, h_det), where c.x_det and c.y_det are the center-point coordinates of the 2D bounding box of a first target to be matched, c.x_tra and c.y_tra are the center-point coordinates of the 2D bounding box of a historical target, and w_det and h_det are the width and height of the 2D bounding box of the first target to be matched. The gaussian function is calculated as: gaussian(x, mu, sigma) = exp(-(x - mu)^2 / (2 * sigma^2)).
(2) A matching score ss based on the 2D bounding-box shape change is computed as: ss = (w_tra - w_det) * (h_tra - h_det) / (w_tra * h_tra), where w_tra and h_tra are the width and height of the 2D bounding box of the historical target.
(3) A matching score so is obtained from the 2D bounding-box shape IoU; note that IoU is a standard measure of detection accuracy on a given dataset. It is computed as: so = IOU(box_det, box_tra), where box_det = [xmin_det, ymin_det, xmax_det, ymax_det], box_tra = [xmin_tra, ymin_tra, xmax_tra, ymax_tra], and IOU(box1, box2) = area(box1 ∩ box2) / area(box1 ∪ box2). Here xmin_det, ymin_det, xmax_det, and ymax_det are the corner coordinates of the 2D bounding box of the first target to be matched, and xmin_tra, ymin_tra, xmax_tra, and ymax_tra are the corner coordinates of the 2D bounding box of the historical target.
(4) Similarly, center-point Gaussian motion, shape change, and IoU are computed from the 3D bounding boxes to obtain matching scores sm_3d, ss_3d, and so_3d respectively; the calculations follow (1), (2), and (3).
The matching degree data for position matching are obtained as a weighted sum of the matching scores: score1 = 0.2*sm + 0.2*ss + 0.2*so + 0.2*sm_3d + 0.1*ss_3d + 0.1*so_3d. Note that the weight values may be set according to the actual application and scene.
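The score formulas above can be sketched in Python. This is an illustrative reconstruction, not the patent's implementation: the dictionary layout (`cx`, `cy`, `w`, `h`, `box`) is an assumption, the Gaussian kernel form is reconstructed from the partly garbled formula, and only the 2D terms are computed here; the 3D terms sm_3d, ss_3d, so_3d are computed analogously and passed in.

```python
import math

def gaussian(x, mu, sigma):
    # Assumed reconstruction of the garbled formula:
    # gaussian(x, mu, sigma) = exp(-(x - mu)^2 / (2 * sigma^2))
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def motion_score(det, tra):
    # sm: Gaussian motion of the 2D box center; det/tra are dicts with
    # center coordinates (cx, cy) and box size (w, h)
    return (gaussian(det["cx"], tra["cx"], det["w"])
            * gaussian(det["cy"], tra["cy"], det["h"]))

def shape_score(det, tra):
    # ss: 2D box shape change
    return (tra["w"] - det["w"]) * (tra["h"] - det["h"]) / (tra["w"] * tra["h"])

def iou_score(box_det, box_tra):
    # so: IoU of two axis-aligned boxes given as [xmin, ymin, xmax, ymax]
    ix1 = max(box_det[0], box_tra[0]); iy1 = max(box_det[1], box_tra[1])
    ix2 = min(box_det[2], box_tra[2]); iy2 = min(box_det[3], box_tra[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_det = (box_det[2] - box_det[0]) * (box_det[3] - box_det[1])
    area_tra = (box_tra[2] - box_tra[0]) * (box_tra[3] - box_tra[1])
    union = area_det + area_tra - inter
    return inter / union if union > 0 else 0.0

def position_match_score(det, tra, sm3d=0.0, ss3d=0.0, so3d=0.0):
    # score1 = 0.2*sm + 0.2*ss + 0.2*so + 0.2*sm_3d + 0.1*ss_3d + 0.1*so_3d
    sm = motion_score(det, tra)
    ss = shape_score(det, tra)
    so = iou_score(det["box"], tra["box"])
    return 0.2 * sm + 0.2 * ss + 0.2 * so + 0.2 * sm3d + 0.1 * ss3d + 0.1 * so3d
```

The Gaussian motion term rewards detections whose center lies close to the track's previous center, with a tolerance that scales with the box size, which matches the formula's use of w_det and h_det as sigma.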
The matching degree data obtained above is the matching score between one historical target in the historical image frame and one first target to be matched in the current image frame. To determine the matching result between the two frames, the matching degree data between every first target to be matched in the current image frame and every historical target in the historical image frame must be computed. For example, in an autonomous driving scene with 3 obstacles in the historical image frame (3 historical targets) and 4 high-confidence obstacles in the current image frame (4 first targets to be matched), 12 position-based matching degree values must be computed. These form a matching matrix, matrix1, on which maximum bipartite-graph matching is computed with the Hungarian algorithm to match the first targets to be matched with the historical targets, giving the matching result that determines whether each first target to be matched matches each historical target.
It should be noted that the Hungarian algorithm is a combinatorial optimization algorithm that solves the assignment problem in polynomial time; it inspired the later primal-dual methods.
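As a minimal stand-in for the Hungarian matching step, the assignment over the score matrix can be brute-forced for small target counts. This is an illustrative sketch, not the patent's implementation; in practice `scipy.optimize.linear_sum_assignment` (which minimizes cost, so scores would be negated) solves the same problem in polynomial time.

```python
from itertools import permutations

def best_assignment(score_matrix):
    """Maximum-score one-to-one assignment between historical targets (rows)
    and first targets to be matched (columns). Brute force over permutations,
    a stand-in for the Hungarian algorithm; feasible only for small counts.
    Assumes len(rows) <= len(columns); transpose the matrix otherwise."""
    n_rows = len(score_matrix)
    n_cols = len(score_matrix[0])
    best_total, best_pairs = float("-inf"), []
    for cols in permutations(range(n_cols), n_rows):
        pairs = list(zip(range(n_rows), cols))
        total = sum(score_matrix[r][c] for r, c in pairs)
        if total > best_total:
            best_total, best_pairs = total, pairs
    return best_pairs, best_total
```

For the 3-by-4 example in the text, the 12-entry matrix1 would be passed in directly and each of the 3 historical targets would receive at most one first target to be matched.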
Optionally, when the matching degree data between a first target to be matched and any historical target satisfy the matching condition, the historical track of the matched historical target is updated based on the position information of the first target to be matched in the current image frame; when the matching degree data between a first target to be matched and all historical targets fail to satisfy the matching condition, a track for a newly added target is created based on the position information of that first target to be matched.
In this embodiment, the matching condition is a condition that the matching degree data needs to satisfy for the first target to be matched to be determined as matching the historical target, for example, the matching degree data exceeding a preset matching degree threshold. The historical track is a target motion track composed of the position information of a historical target in each historical image frame, in the order of the historical image frames. The track of the newly added target is the track of a new target, namely a first target to be matched in the current image frame that matches no historical target; the starting point of the track is the position of the newly added target in the current image frame.
Specifically, if the matching degree data of the first target to be matched and any historical target meets the matching condition, the first target to be matched and that historical target can be considered matched, and the position information of the first target to be matched is the position information of the historical target in the current image frame; the position information of the first target to be matched is then used to update the historical track of the matched historical target. If the matching degree data of the first target to be matched and every historical target does not meet the matching condition, the first target to be matched is considered to match no historical target; the first target to be matched is taken as a newly added target, and a track of the newly added target is created with the position information of the first target to be matched as its starting point. If the matching degree data of a historical target and every first target to be matched does not meet the matching condition, that historical target can be considered to match no first target to be matched, and it is treated as an unmatched historical target.
In an automatic driving scene, for example, there are 2 historical targets in the historical image frame and 2 first targets to be matched in the current image frame. If the matching degree data of historical target 1 and first target to be matched 1 meets the matching condition, first target to be matched 1 is used to update the tracking track of historical target 1; if historical target 2 and first target to be matched 2 do not meet the matching condition, a new tracking track is created with first target to be matched 2 as its starting point, and historical target 2 is treated as an unmatched historical target.
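A minimal sketch of the track update/creation logic described above; the threshold value, function name, and data layout are assumptions for illustration, not the original implementation:

```python
MATCH_THRESHOLD = 0.5  # hypothetical preset matching-degree threshold

def update_tracks(tracks, pairs, scores, detections):
    """tracks: {track_id: [positions...]}, pairs: [(track_id, det_idx)] from
    the bipartite matching, scores: {(track_id, det_idx): matching degree},
    detections: [position] of first targets to be matched.
    Returns the ids of historical targets left unmatched."""
    hist_ids = set(tracks)
    matched_tracks, matched_dets = set(), set()
    for track_id, det_idx in pairs:
        if scores[(track_id, det_idx)] >= MATCH_THRESHOLD:
            # Matching condition met: extend the historical track.
            tracks[track_id].append(detections[det_idx])
            matched_tracks.add(track_id)
            matched_dets.add(det_idx)
    # Detections matching no historical target start new tracks.
    next_id = max(hist_ids, default=-1) + 1
    for det_idx, pos in enumerate(detections):
        if det_idx not in matched_dets:
            tracks[next_id] = [pos]  # newly added target's track starts here
            next_id += 1
    return hist_ids - matched_tracks  # unmatched historical targets

# Scene from the text: 2 historical targets, 2 first targets to be matched;
# only the pair (target 1, detection 1) passes the matching condition.
tracks = {0: [(0.0, 0.0)], 1: [(5.0, 5.0)]}
scores = {(0, 0): 0.9, (1, 1): 0.2}
unmatched = update_tracks(tracks, [(0, 0), (1, 1)], scores, [(0.5, 0.1), (9.0, 9.0)])
```

After the call, track 0 is extended, detection 2 becomes a new track, and historical target 1 is reported as unmatched and handed on to the feature-matching stage.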
And S140, performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
In this embodiment, feature matching is the matching of the second target to be matched with the unmatched historical target based on features. The matching result of the second target to be matched indicates whether each second target to be matched matches an unmatched historical target, and which second target to be matched matches which unmatched historical target.
Specifically, features of the unmatched historical targets and the second targets to be matched are extracted, and each second target to be matched is feature-matched one by one against all the unmatched historical targets to obtain the matching result of the second targets to be matched and the unmatched historical targets; the matching result may be that a second target to be matched matches an unmatched historical target, or that a second target to be matched matches none of the unmatched historical targets.
In an automatic driving scenario, there are 4 obstacles in the historical image frame, that is, 4 historical targets are determined in the historical image frame, of which 3 are unmatched historical targets; there are 5 obstacles in the current image frame, of which 4 are low-confidence obstacles, that is, 4 second targets to be matched are determined in the current image frame. Features of the 3 unmatched historical targets and the 4 second targets to be matched are extracted, and the 3 unmatched historical targets and the 4 second targets to be matched are feature-matched one by one to obtain the matching result of the second targets to be matched, for example: unmatched historical target 1 matches second target to be matched 1, unmatched historical target 2 matches second target to be matched 2, and unmatched historical target 3 matches neither second target to be matched 3 nor second target to be matched 4.
According to the technical scheme, a historical target in a historical image frame is obtained; identifying targets to be matched in a current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched; performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target; and performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched, so that redundant feature extraction and matching can be removed, the problem that unnecessary computing resources are wasted in a feature-based target tracking matching strategy is solved, the target tracking matching strategy is optimized, the calculated amount is reduced, and the target tracking matching efficiency is improved.
Example two
Fig. 2 is a flowchart of a target tracking matching method according to a second embodiment of the present invention, where the technical solution of the embodiment of the present invention is further optimized based on any of the foregoing embodiments. As shown in fig. 2, the method includes:
S210, acquiring a historical target in the historical image frame.
S220, identifying the target to be matched in the current image frame, classifying the target to be matched based on the confidence of the target to be matched, and obtaining a first target to be matched and a second target to be matched.
And S230, carrying out position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target.
S240, respectively determining first characteristic data of a second target to be matched and second characteristic data of a history target which is not matched based on the characteristic extraction module.
In this embodiment, the feature extraction module is a module for extracting first feature data of a second object to be matched and second feature data of a history object that is not matched.
Specifically, an initial feature extraction model is trained with a sample set stored locally and/or on a server to obtain a trained feature extraction model, and the trained feature extraction model is used as the feature extraction module for extracting the first feature data of the second target to be matched and the second feature data of the unmatched historical target, wherein each sample object in the sample set corresponds to the feature of that sample object. The unmatched historical target and the second target to be matched are input into the feature extraction module to obtain the first feature data corresponding to the second target to be matched and the second feature data corresponding to the unmatched historical target.
Illustratively, a target re-identification (REID) model is built with resnet18 as the backbone and used as the initial feature extraction model; an ROI-id sequence training data set is built based on the NuScenes data set, the initial feature extraction model is trained on it, and the trained REID model is used as the feature extraction module to extract the first feature data of the second target to be matched and the second feature data of the unmatched historical target.
Optionally, the second characteristic data of the unmatched historical targets includes characteristic data of the unmatched historical targets in a plurality of historical image frames.
In this embodiment, the plurality of historical image frames are a plurality of consecutive image frames preceding the historical image frame in the image frame sequence.
Specifically, according to the order in the image frame sequence, the characteristic data of the unmatched historical targets in the historical image frames and the previous plurality of historical image frames is used as second characteristic data of the unmatched historical targets.
For example, assuming that the image frame with sequence number 4 is the current image frame and the image frame with sequence number 3 is the historical image frame, the feature data corresponding to the same unmatched historical target in the three image frames with sequence numbers 1, 2 and 3 is stored, in order of sequence number, as a feature sequence serving as the second feature data of the unmatched historical target.
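The per-target feature history described above can be kept in a small fixed-length buffer; the following sketch assumes a buffer of three frames and an illustrative class name, neither of which is specified in the original:

```python
from collections import deque

class FeatureHistory:
    """Keeps the last `maxlen` per-frame feature vectors of one target,
    in image-frame order, to serve as its second feature data."""
    def __init__(self, maxlen=3):
        self.features = deque(maxlen=maxlen)  # (frame index, feature vector)

    def add(self, frame_index, feature):
        self.features.append((frame_index, feature))

    def second_feature_data(self):
        # Feature sequence ordered by frame sequence number.
        return [f for _, f in sorted(self.features)]

# Frames 1, 2, 3 as in the example; frame 4 would roll the oldest entry out.
hist = FeatureHistory(maxlen=3)
for idx, feat in [(1, [0.1]), (2, [0.2]), (3, [0.3])]:
    hist.add(idx, feat)
```

The `deque(maxlen=...)` discards the oldest frame automatically, so the buffer always holds the most recent frames without explicit bookkeeping.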
According to this technical scheme, the feature data in a plurality of historical image frames is used as the second feature data of the unmatched historical target, which avoids the matching error caused when the historical image frame happens to be an interrupted frame at a break point of the same historical target's track among the plurality of historical image frames, and enhances the stability of target tracking matching.
S250, determining matching degree data of a second target to be matched and a history target which is not matched based on the first characteristic data and the second characteristic data.
Specifically, matching is carried out on each first characteristic data and all second characteristic data to obtain matching degree data of each second target to be matched and all unmatched historical targets.
Illustratively, the matching degree data of the second target to be matched and the unmatched historical target is obtained based on the cosine distance, with the formula:

match(feature_det, feature_tra) = Σ_k ( feature_det(k) × feature_tra(k) ) / ( ‖feature_det‖ × ‖feature_tra‖ )

wherein feature_det and feature_tra respectively represent the target feature of the current-frame target and of the historical target track, and k represents the dimension of the vectors.
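A self-contained sketch of the cosine-distance matching score above, in pure Python; the function name and test vectors are illustrative:

```python
import math

def cosine_match(feature_det, feature_tra):
    """Cosine similarity between a current-frame target feature and a
    historical track feature; higher means a better match."""
    dot = sum(d * t for d, t in zip(feature_det, feature_tra))
    norm_det = math.sqrt(sum(d * d for d in feature_det))
    norm_tra = math.sqrt(sum(t * t for t in feature_tra))
    return dot / (norm_det * norm_tra)

score_same = cosine_match([1.0, 0.0, 1.0], [2.0, 0.0, 2.0])  # parallel vectors
score_diff = cosine_match([1.0, 0.0, 0.0], [0.0, 1.0, 0.0])  # orthogonal vectors
```

Parallel feature vectors score 1.0 and orthogonal vectors score 0.0, so thresholding this score gives a natural matching condition.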
S260, determining a matching result of the second target to be matched based on matching degree data of the second target to be matched and a plurality of unmatched historical targets.
Specifically, based on the matching degree data of the second target to be matched and each unmatched historical target, the second targets to be matched and the unmatched historical targets are matched by maximum bipartite matching; by transforming the known matching degree data, the assignment relations can be measured accurately and an optimal matching scheme solved, obtaining the matching result of the second targets to be matched and the unmatched historical targets.
In an automatic driving scene, 3 historical targets in the historical image frame are unmatched historical targets and there are 4 second targets to be matched in the current image frame, so 12 pieces of feature-based matching degree data need to be calculated; a matching matrix matrix2 is obtained from the matching degree data, maximum bipartite matching is computed through the Hungarian algorithm to match the second targets to be matched with the unmatched historical targets, and the matching result of the second targets to be matched and the unmatched historical targets is obtained, that is, whether each second target to be matched matches an unmatched historical target, and which one, is determined.
Optionally, in the case that the matching degree data of the second target to be matched and any unmatched historical target meets the matching condition, the historical track of the historical target matched with the second target to be matched is updated based on the position information of the second target to be matched in the current image frame; and in the case that the matching degree data of the second target to be matched and each of the unmatched historical targets does not meet the matching condition, the second target to be matched is discarded.
In this embodiment, the matching condition is a condition that the matching data needs to satisfy when it is determined that the second target to be matched matches the history target that is not matched, for example, the matching degree data exceeds a preset matching degree threshold.
Specifically, if the matching degree data of the second target to be matched and an unmatched historical target meets the matching condition, the second target to be matched and that unmatched historical target can be considered matched, and the position information of the second target to be matched is the position information of the unmatched historical target in the current image frame; the position information of the second target to be matched is then used to update the historical track of the matched historical target. If the matching degree data of the second target to be matched and every unmatched historical target does not meet the matching condition, the second target to be matched is considered to match no unmatched historical target, and the second target to be matched is discarded.
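The update-or-discard rule for second targets to be matched can be sketched as follows; the threshold and names are hypothetical:

```python
MATCH_THRESHOLD = 0.5  # hypothetical preset matching-degree threshold

def resolve_second_targets(tracks, pairs, scores, detections):
    """For each (unmatched-historical-track, second-target) pair from the
    bipartite matching: extend the track if the score passes the matching
    condition. Second targets matching nothing are simply discarded."""
    kept = []
    for track_id, det_idx in pairs:
        if scores[(track_id, det_idx)] >= MATCH_THRESHOLD:
            tracks[track_id].append(detections[det_idx])
            kept.append(det_idx)
    return kept  # indices of second targets that updated a track

# One unmatched historical track; detection 0 passes, detection 1 matched
# nothing and is dropped without creating a new track.
tracks = {7: [(0.0, 0.0)]}
kept = resolve_second_targets(tracks, [(7, 0)], {(7, 0): 0.8},
                              [(0.2, 0.1), (9.9, 9.9)])
```

Note the asymmetry with the first stage: low-confidence detections never start new tracks, which is what keeps spurious detections from polluting the track set.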
According to the technical scheme, a historical target in a historical image frame is obtained; identifying targets to be matched in the current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched; performing position matching on the first target to be matched based on the historical targets to obtain a matching result of the first target to be matched and a non-matched historical target; determining first characteristic data of a second target to be matched and second characteristic data of a history target which is not matched based on the characteristic extraction module respectively; determining matching degree data of a second target to be matched and a history target which is not matched based on the first characteristic data and the second characteristic data; and determining a matching result of the second target to be matched based on matching degree data of the second target to be matched and a plurality of unmatched historical targets, and matching the unmatched historical targets with low confidence based on the characteristics, so that calculation resources are saved, and matching efficiency is improved.
Example III
Fig. 3 is a schematic structural diagram of a target tracking matching device according to a third embodiment of the present invention.
As shown in fig. 3, the apparatus includes:
a historical target acquisition module 310, configured to acquire a historical target in a historical image frame;
the target to be matched identification module 320 is configured to identify a target to be matched in a current image frame, and classify the target to be matched based on a confidence level of the target to be matched, so as to obtain a first target to be matched and a second target to be matched;
the location matching module 330 is configured to perform location matching on the first target to be matched based on the historical target, so as to obtain a matching result of the first target to be matched and a non-matched historical target;
and the feature matching module 340 is configured to perform feature matching on the unmatched historical target and the second target to be matched, so as to obtain a matching result of the second target to be matched.
According to the technical scheme, a historical target in a historical image frame is obtained; identifying targets to be matched in a current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched; performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target; and performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched, so that redundant feature extraction and matching can be removed, the problem that unnecessary computing resources are wasted in a feature-based target tracking matching strategy is solved, the target tracking matching strategy is optimized, the calculated amount is reduced, and the target tracking matching efficiency is improved.
On the basis of the above embodiment, optionally, the target identifying module 320 to be matched includes:
the confidence degree determining unit is used for carrying out target detection on the historical image frame and the current image frame based on a target detection model to obtain a target to be matched in the current image frame and the confidence degree of the target to be matched;
and the target to be matched determining unit is used for determining the target to be matched with the confidence coefficient larger than or equal to a confidence coefficient threshold value as the first target to be matched, and determining the target to be matched with the confidence coefficient smaller than the confidence coefficient threshold value as the second target to be matched.
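The confidence-based split performed by these two units can be sketched as follows; the threshold value and detection layout are hypothetical:

```python
CONF_THRESHOLD = 0.6  # hypothetical confidence threshold

def classify_detections(detections):
    """Split detections into first (confidence >= threshold) and second
    (confidence < threshold) targets to be matched.
    Each detection: (position, confidence)."""
    first, second = [], []
    for det in detections:
        (first if det[1] >= CONF_THRESHOLD else second).append(det)
    return first, second

dets = [((1, 2), 0.9), ((3, 4), 0.4), ((5, 6), 0.6)]
first_targets, second_targets = classify_detections(dets)
```

Detections exactly at the threshold go to the first group, matching the "greater than or equal to" rule in the text.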
Based on the above embodiment, optionally, the location matching module 330 includes:
a position data obtaining unit, configured to obtain current position data and current form data of the first target to be matched in the current image frame;
the position data matching unit is used for matching with the historical position information and/or the historical form data of the historical target based on the current position information and/or the current form data of the first target to be matched to obtain matching degree data of the first target to be matched and the historical target;
And the position matching result determining unit is used for determining the matching result of the first target to be matched based on the matching degree data of the first target to be matched and each history target.
On the basis of the above embodiment, optionally, the first matching result determining unit is specifically configured to:
updating the historical track of the historical target matched with the first target to be matched based on the position information of the first target to be matched in the current image frame under the condition that the matching data of the first target to be matched and any historical target meet the matching condition;
and generating a track of a new target based on the position information of the first target to be matched under the condition that the matching degree data of the first target to be matched and the historical targets do not meet the matching condition.
Based on the above embodiment, optionally, the feature matching module 340 includes:
the characteristic data determining unit is used for respectively determining the first characteristic data of the second target to be matched and the second characteristic data of the unmatched historical target based on the characteristic extracting module;
a feature data matching unit, configured to determine matching degree data of the second target to be matched and the unmatched historical target based on the first feature data and the second feature data;
And the characteristic matching result determining unit is used for determining the matching result of the second target to be matched based on the matching degree data of the second target to be matched and the plurality of unmatched historical targets.
Optionally, the second feature data of the unmatched historical targets includes feature data of the unmatched historical targets in a plurality of historical image frames.
On the basis of the above embodiment, optionally, the feature matching result determining unit is specifically configured to:
the method comprises the steps that under the condition that matching data of a second target to be matched and any unmatched historical target meet matching conditions, a historical track of the historical target matched with the second target to be matched is updated based on position information of the second target to be matched in the current image frame;
and discarding the second target to be matched under the condition that the matching degree data of the second target to be matched and the plurality of unmatched historical targets do not meet the matching condition.
The target tracking matching device provided by the embodiment of the invention can execute the target tracking matching method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 4 is a schematic structural diagram of an electronic device implementing a target tracking matching method according to an embodiment of the present invention. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13, in which a computer program executable by the at least one processor is stored; the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the target tracking matching method.
In some embodiments, the target tracking matching method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the target tracking matching method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the target tracking matching method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described here above can be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems On Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
The computer program for implementing the target tracking matching method of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
Example five
The fifth embodiment of the present invention also provides a computer readable storage medium, where computer instructions are stored, where the computer instructions are configured to cause a processor to execute a target tracking matching method, where the method includes:
acquiring a historical target in a historical image frame; identifying targets to be matched in a current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched; performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target; and performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, apparatus, or devices, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of difficult management and weak service scalability in traditional physical hosts and VPS (virtual private server) services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A target tracking matching method, comprising:
acquiring a historical target in a historical image frame;
identifying targets to be matched in a current image frame, and classifying the targets to be matched based on the confidence of the targets to be matched to obtain a first target to be matched and a second target to be matched;
performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and a non-matched historical target;
and performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
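Outside the claim language, the overall two-stage flow of claim 1 can be sketched in Python. Everything concrete here — the dict fields (`conf`, `x`, `feat`), the scalar position model, cosine similarity, and the threshold values — is an illustrative assumption, not something the claims specify:

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def two_stage_match(history, detections, conf_thr=0.5, pos_tol=1.0, feat_thr=0.9):
    # Step 1: split detections into a high-confidence (first) set
    # and a low-confidence (second) set.
    first = [d for d in detections if d["conf"] >= conf_thr]
    second = [d for d in detections if d["conf"] < conf_thr]

    matches, unmatched_hist = [], list(history)

    # Step 2: position-match the first set against the historical targets.
    for det in first:
        best = min(unmatched_hist, key=lambda h: abs(h["x"] - det["x"]), default=None)
        if best is not None and abs(best["x"] - det["x"]) <= pos_tol:
            matches.append((best["id"], det))
            unmatched_hist.remove(best)

    # Step 3: feature-match the leftover historical targets against the second set.
    for det in second:
        best = max(unmatched_hist, key=lambda h: cosine(h["feat"], det["feat"]), default=None)
        if best is not None and cosine(best["feat"], det["feat"]) >= feat_thr:
            matches.append((best["id"], det))
            unmatched_hist.remove(best)

    return matches
```

In a real tracker the scalar position would be a bounding box and the greedy loops would typically be replaced by a Hungarian assignment; the sketch only mirrors the claimed ordering: position matching first for confident detections, appearance matching second for the rest.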
2. The method of claim 1, wherein the identifying the target to be matched in the current image frame and classifying the target to be matched based on the confidence of the target to be matched to obtain a first target to be matched and a second target to be matched comprises:
performing target detection on the historical image frame and the current image frame based on a target detection model to obtain a target to be matched in the current image frame and confidence of the target to be matched;
and determining the target to be matched with the confidence coefficient being greater than or equal to a confidence coefficient threshold value as the first target to be matched, and determining the target to be matched with the confidence coefficient being smaller than the confidence coefficient threshold value as the second target to be matched.
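The classification of claim 2 is a plain threshold split. A minimal sketch (the field name `conf` and the default threshold value are assumptions for illustration):

```python
def split_by_confidence(detections, conf_thr=0.5):
    # Detections at or above the threshold form the first (high-confidence)
    # set; the remainder form the second (low-confidence) set.
    first = [d for d in detections if d["conf"] >= conf_thr]
    second = [d for d in detections if d["conf"] < conf_thr]
    return first, second
```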
3. The method of claim 1, wherein the performing location matching on the first target to be matched based on the historical target comprises:
acquiring current position data and current form data of the first target to be matched in the current image frame;
matching the current position data and/or current form data of the first target to be matched with the historical position data and/or historical form data of the historical target to obtain matching degree data of the first target to be matched and the historical target;
and determining a matching result of the first target to be matched based on the matching degree data of the first target to be matched and each history target.
4. The method of claim 3, wherein the determining a matching result of the first target to be matched based on matching degree data of the first target to be matched and each of the history targets comprises:
updating the historical track of the historical target matched with the first target to be matched based on the position information of the first target to be matched in the current image frame under the condition that the matching degree data of the first target to be matched and any historical target meets the matching condition;
and generating a track of a new target based on the position information of the first target to be matched under the condition that the matching degree data of the first target to be matched and the historical targets does not meet the matching condition.
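Claims 3 and 4 together describe position matching followed by a track update or a new-track creation. One hypothetical realization, using bounding-box IoU as the matching degree and a greedy assignment (the claims fix neither choice):

```python
def iou(a, b):
    # IoU of two axis-aligned boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def position_match(tracks, detections, iou_thr=0.3):
    # Greedily match each high-confidence detection to the history track
    # with the largest overlap; matched tracks have their trajectory
    # extended (claim 4, first branch), unmatched detections start new
    # tracks (claim 4, second branch).
    unmatched = list(tracks)
    next_id = max((t["id"] for t in tracks), default=0) + 1
    for det in detections:
        best = max(unmatched, key=lambda t: iou(t["box"], det["box"]), default=None)
        if best is not None and iou(best["box"], det["box"]) >= iou_thr:
            best["trajectory"].append(det["box"])  # update historical track
            best["box"] = det["box"]
            unmatched.remove(best)
        else:  # no sufficiently overlapping track: create a new target track
            tracks.append({"id": next_id, "box": det["box"],
                           "trajectory": [det["box"]]})
            next_id += 1
    return unmatched  # history targets left over for the feature stage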
5. The method according to claim 1, wherein the performing feature matching on the unmatched historical target and the second target to be matched to obtain a matching result of the second target to be matched includes:
determining first characteristic data of the second target to be matched and second characteristic data of the unmatched historical target respectively based on a characteristic extraction module;
determining matching degree data of the second target to be matched and the unmatched historical target based on the first characteristic data and the second characteristic data;
and determining a matching result of the second target to be matched based on the matching degree data of the second target to be matched and the plurality of unmatched historical targets.
6. The method of claim 5, wherein the second characteristic data of the unmatched historical targets comprises characteristic data of the unmatched historical targets in a plurality of historical image frames.
7. The method of claim 5, wherein the determining the matching result of the second target to be matched based on matching degree data of the second target to be matched and a plurality of the unmatched historical targets comprises:
updating the historical track of the historical target matched with the second target to be matched based on the position information of the second target to be matched in the current image frame under the condition that the matching degree data of the second target to be matched and any unmatched historical target meets the matching condition;
and discarding the second target to be matched under the condition that the matching degree data of the second target to be matched and the plurality of unmatched historical targets does not meet the matching condition.
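Claims 5 through 7 describe appearance-based matching for the low-confidence set. A hedged sketch using cosine similarity over per-frame feature vectors; modeling claim 6's multi-frame features as a list and taking the best-frame score are assumptions of this sketch:

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def feature_match(unmatched_tracks, low_conf_dets, sim_thr=0.8):
    # Compare each low-confidence detection's feature vector against a
    # track's features from several historical frames (best frame wins);
    # detections below the threshold are discarded (claim 7).
    results = []
    for det in low_conf_dets:
        scored = [(max(cosine(f, det["feat"]) for f in t["feats"]), t)
                  for t in unmatched_tracks]
        if scored:
            sim, best = max(scored, key=lambda s: s[0])
            if sim >= sim_thr:
                results.append((best["id"], det))
                unmatched_tracks.remove(best)
                continue
        # no track is similar enough: drop the detection
    return results
```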
8. A target tracking matching apparatus, comprising:
the historical target acquisition module is used for acquiring historical targets in the historical image frames;
the target to be matched identification module is used for identifying targets to be matched in the current image frame, classifying the targets to be matched based on the confidence of the targets to be matched, and obtaining a first target to be matched and a second target to be matched;
the position matching module is used for performing position matching on the first target to be matched based on the historical target to obtain a matching result of the first target to be matched and unmatched historical targets;
and the feature matching module is used for performing feature matching on the unmatched historical targets and the second target to be matched to obtain a matching result of the second target to be matched.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the target tracking matching method of any one of claims 1-7.
10. A computer-readable storage medium storing computer instructions which, when executed, cause a processor to implement the target tracking matching method of any one of claims 1-7.
CN202311401429.7A 2023-10-26 2023-10-26 Target tracking matching method, device, equipment and medium Pending CN117372477A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311401429.7A CN117372477A (en) 2023-10-26 2023-10-26 Target tracking matching method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN117372477A true CN117372477A (en) 2024-01-09

Family

ID=89402023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311401429.7A Pending CN117372477A (en) 2023-10-26 2023-10-26 Target tracking matching method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117372477A (en)

Similar Documents

Publication Publication Date Title
CN113902897B (en) Training of target detection model, target detection method, device, equipment and medium
CN112597837B (en) Image detection method, apparatus, device, storage medium, and computer program product
CN113392794B (en) Vehicle line crossing identification method and device, electronic equipment and storage medium
CN113205041A (en) Structured information extraction method, device, equipment and storage medium
CN115346171A (en) Power transmission line monitoring method, device, equipment and storage medium
CN116740355A (en) Automatic driving image segmentation method, device, equipment and storage medium
CN116309963B (en) Batch labeling method and device for images, electronic equipment and storage medium
CN115953434B (en) Track matching method, track matching device, electronic equipment and storage medium
CN112308917A (en) Vision-based mobile robot positioning method
CN114429631B (en) Three-dimensional object detection method, device, equipment and storage medium
CN117372477A (en) Target tracking matching method, device, equipment and medium
CN115330841A (en) Method, apparatus, device and medium for detecting projectile based on radar map
CN115436900A (en) Target detection method, device, equipment and medium based on radar map
CN114694138B (en) Road surface detection method, device and equipment applied to intelligent driving
CN117351043A (en) Tracking matching method and device, electronic equipment and storage medium
CN113360688B (en) Method, device and system for constructing information base
CN114049615B (en) Traffic object fusion association method and device in driving environment and edge computing equipment
CN117392631B (en) Road boundary extraction method and device, electronic equipment and storage medium
CN114155508B (en) Road change detection method, device, equipment and storage medium
CN116258769B (en) Positioning verification method and device, electronic equipment and storage medium
CN115857502B (en) Driving control method and electronic device
CN114037865B (en) Image processing method, apparatus, device, storage medium, and program product
CN118035788A (en) Target vehicle relative position classification method, device, equipment and storage medium
CN117710459A (en) Method, device and computer program product for determining three-dimensional information
CN116883654A (en) Training method of semantic segmentation model, semantic segmentation method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination