CN115685182A - Method and device for detecting object, electronic equipment and storage medium

Info

Publication number
CN115685182A
CN115685182A (application CN202110867696.8A)
Authority
CN
China
Prior art keywords
data
point cloud
detection data
detection
millimeter wave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110867696.8A
Other languages
Chinese (zh)
Inventor
关喜嘉
王邓江
邓永强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wanji Technology Co Ltd
Original Assignee
Beijing Wanji Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wanji Technology Co Ltd filed Critical Beijing Wanji Technology Co Ltd
Priority to CN202110867696.8A
Publication of CN115685182A
Legal status: Pending

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a method and a device for detecting an object, an electronic device and a storage medium, and belongs to the technical field of intelligent monitoring. The method comprises the following steps: acquiring target detection data of a laser radar sensor; and determining, through a first target detection model based on the target detection data, first object information of the object detected by the laser radar sensor. The first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information. The point cloud data marked with the second object information is obtained by labeling edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data in first detection data that failed to match any object in the millimeter wave radar data, and the first detection data is detected by the laser radar sensor and matched in time with the millimeter wave radar data. The method and the device can accurately detect remote objects, so that the effectiveness of object detection can be improved.

Description

Method and device for detecting object, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of intelligent monitoring technologies, and in particular, to a method and an apparatus for detecting an object, an electronic device, and a storage medium.
Background
Laser radar (lidar) sensors are being applied more and more widely in the technical field of intelligent monitoring. At present, a detection model to be trained can be trained in advance based on a large number of training samples, so as to obtain a target detection model with a detection function. In use, detection data are obtained through the laser radar sensor, and the object detected by the laser radar sensor can then be determined through the target detection model based on the detection data.
However, due to the limited detection distance and the characteristics of the laser radar sensor, its detection range is restricted, and the detection data at the far end of that range are sparse. As a result, when the detection data are processed by the target detection model, objects at the far end cannot be determined, which affects the effectiveness of object detection.
Disclosure of Invention
The embodiments of the application provide a method and a device for detecting an object, an electronic device and a storage medium, which can solve the problem in the related art that a remote object cannot be determined when detection data are processed through a target detection model, thereby affecting the effectiveness of object detection. The technical scheme is as follows:
in a first aspect, a method for detecting an object is provided, the method comprising:
acquiring current detection data of a laser radar sensor to obtain target detection data;
determining, by a first object detection model, first object information of an object currently detected by the lidar sensor based on the target detection data;
the first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information, the point cloud data marked with the second object information being obtained by labeling edge point cloud data based on millimeter wave radar data, the edge point cloud data comprising point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data being detected by the laser radar sensor and matched in time with the millimeter wave radar data.
In this way, the edge point cloud data in the first detection data are labeled by using the second object information that describes a remote object in the millimeter wave radar data, and the target detection model is update-trained based on the point cloud data marked with the second object information, so that the resulting first target detection model can accurately detect remote objects and the effectiveness of object detection can be improved.
As an example of the present application, the manner of acquiring the point cloud data marked with the second object information includes:
in the operation process of the millimeter wave radar sensor and the laser radar sensor, acquiring two frames of detection data matched with time, wherein the two frames of detection data comprise first detection data and second detection data, and the second detection data is obtained by detection of the millimeter wave radar sensor;
processing the second detection data through a second target detection model to obtain the millimeter wave radar data, wherein the second target detection model is used for determining an object detected by the millimeter wave radar sensor based on the second detection data;
and determining the point cloud data marked with the second object information based on the first detection data and the millimeter wave radar data.
In this way, two frames of detection data matched in time are acquired and the data are labeled based on these two frames, which ensures the effectiveness of the subsequent update training of the model on the labeled point cloud data, so that the update-trained model can effectively detect remote objects.
As an example of the present application, the determining the point cloud data labeled with the second object information based on the first detection data and the millimeter wave radar data includes:
processing the first detection data through the target detection model, and outputting information of at least one first object under a point cloud coordinate system;
mapping the millimeter wave radar data to the point cloud coordinate system to obtain information of at least one second object in the point cloud coordinate system;
performing object matching based on the information of the at least one first object under the point cloud coordinate system and the information of the at least one second object under the point cloud coordinate system;
and determining the point cloud data marked with the second object information according to the result of object matching.
In this way, the detection data of the laser radar sensor and the millimeter wave radar sensor are both mapped into the same coordinate system, and object matching is performed according to the mapped object information to determine the unmatched objects. Remote objects can thereby be selected, and their point cloud data can be labeled effectively.
As an example of the application, the determining the point cloud data labeled with the second object information according to the result of the object matching includes:
determining an object three-dimensional frame under the point cloud coordinate system according to the result of object matching, wherein the object three-dimensional frame refers to a three-dimensional frame of a second object which is not matched with the at least one first object in the at least one second object;
if the number of point clouds included in the object three-dimensional frame in the first detection data is larger than a point threshold value, marking the point cloud data included in the object three-dimensional frame based on data corresponding to the object three-dimensional frame in the millimeter wave radar data to obtain the point cloud data marked with the second object information.
In this way, an object three-dimensional frame of the remote object is determined, and when the number of point clouds included in the object three-dimensional frame is larger than the point threshold, the point cloud data within the object three-dimensional frame are labeled, so that the millimeter wave radar data are used to label an object that is detected by the laser radar sensor but contains only a small amount of point cloud data.
As an example of the present application, the acquiring two frames of detection data with matched time includes:
acquiring a timestamp of detection data of the millimeter wave radar sensor to obtain a first timestamp, and acquiring a timestamp of detection data of the laser radar sensor to obtain a second timestamp;
and if the time difference between the first timestamp and the second timestamp is less than a duration threshold, determining the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor as two frames of detection data matched with the time.
As an example of the present application, the method further comprises:
if the time difference between the first time stamp and the second time stamp is greater than or equal to the duration threshold, acquiring a time stamp of next frame detection data of the millimeter wave radar sensor to obtain a third time stamp;
comparing the second timestamp to the third timestamp;
and if the time difference between the second timestamp and the third timestamp is smaller than the duration threshold, determining the detection data corresponding to the third timestamp and the detection data corresponding to the second timestamp as the two frames of detection data with matched time.
As an example of the present application, the method further comprises:
if the time difference between the second timestamp and the third timestamp is greater than or equal to the duration threshold, acquiring next frame detection data of the laser radar sensor;
and determining the detection data corresponding to the first timestamp and the next frame of detection data of the laser radar sensor as the two frames of detection data matched with the time.
In a second aspect, there is provided an apparatus for detecting an object, the apparatus comprising:
the acquisition module is used for acquiring the current detection data of the laser radar sensor to obtain target detection data;
a determining module, configured to determine, based on the target detection data, first object information of an object currently detected by the lidar sensor through a first object detection model;
the first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information, the point cloud data marked with the second object information being obtained by labeling edge point cloud data based on millimeter wave radar data, the edge point cloud data comprising point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data being detected by the laser radar sensor and matched in time with the millimeter wave radar data.
As an example of the present application, the obtaining module is further configured to:
in the operation process of the millimeter wave radar sensor and the laser radar sensor, acquiring two frames of detection data matched with time, wherein the two frames of detection data comprise first detection data and second detection data, and the second detection data is obtained by detection of the millimeter wave radar sensor;
processing the second detection data through a second target detection model to obtain the millimeter wave radar data, wherein the second target detection model is used for determining an object detected by the millimeter wave radar sensor based on the second detection data;
and determining the point cloud data marked with the second object information based on the first detection data and the millimeter wave radar data.
As an example of the present application, the obtaining module is configured to:
processing the first detection data through the target detection model, and outputting information of at least one first object under a point cloud coordinate system;
mapping the millimeter wave radar data to the point cloud coordinate system to obtain information of at least one second object in the point cloud coordinate system;
performing object matching based on the information of the at least one first object under the point cloud coordinate system and the information of the at least one second object under the point cloud coordinate system;
and determining the point cloud data marked with the second object information according to the result of the object matching.
As an example of the present application, the obtaining module is configured to:
determining an object three-dimensional frame under the point cloud coordinate system according to the result of object matching, wherein the object three-dimensional frame refers to a three-dimensional frame of a second object which is not matched with the at least one first object in the at least one second object;
if the number of point clouds included in the object three-dimensional frame in the first detection data is larger than a point threshold value, marking the point cloud data included in the object three-dimensional frame based on data corresponding to the object three-dimensional frame in the millimeter wave radar data to obtain the point cloud data marked with the second object information.
As an example of the present application, the obtaining module is configured to:
acquiring a timestamp of detection data of the millimeter wave radar sensor to obtain a first timestamp, and acquiring a timestamp of detection data of the laser radar sensor to obtain a second timestamp;
and if the time difference between the first timestamp and the second timestamp is less than a duration threshold, determining the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor as two frames of detection data matched with the time.
As an example of the present application, the obtaining module is configured to:
if the time difference between the first time stamp and the second time stamp is greater than or equal to the duration threshold, acquiring a time stamp of next frame detection data of the millimeter wave radar sensor to obtain a third time stamp;
comparing the second timestamp to the third timestamp;
and if the time difference between the second timestamp and the third timestamp is smaller than the duration threshold, determining the detection data corresponding to the third timestamp and the detection data corresponding to the second timestamp as the two frames of detection data with matched time.
As an example of the present application, the obtaining module is configured to:
if the time difference between the second timestamp and the third timestamp is greater than or equal to the duration threshold, acquiring next frame detection data of the laser radar sensor;
and determining the detection data corresponding to the first timestamp and the next frame of detection data of the laser radar sensor as the two frames of detection data matched with the time.
In a third aspect, a computer-readable storage medium is provided, having instructions stored thereon, which when executed by a processor, implement the method of any of the first aspect described above.
In a fourth aspect, an electronic device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the method of any one of the above first aspects when executing the computer program.
In a fifth aspect, there is provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first aspects described above.
It is to be understood that, for the beneficial effects of the second aspect to the fifth aspect, reference may be made to the relevant description in the first aspect, and details are not described herein again.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the method comprises the steps of obtaining target detection data of a laser radar sensor, and then determining first object information of an object currently detected by the laser radar sensor through a first object detection model based on the target detection data. The first target detection model is subjected to at least one updating training, wherein the updating training is to update parameters of the target detection model with a detection function based on point cloud data marked with second object information, the point cloud data marked with the second object information is obtained by marking edge point cloud data based on millimeter wave radar data, the edge point cloud data comprises point cloud data which is failed to be matched with a millimeter wave radar data object in first detection data, and the first detection data is obtained by detecting with a laser radar sensor and is matched with the millimeter wave radar data in time. Therefore, the edge point cloud data in the first detection data is marked by utilizing the second object information used for describing the remote object in the millimeter wave radar data, and the target detection model is updated based on the point cloud data marked with the second object information, so that the obtained first target detection model can accurately detect the remote object, and the effectiveness of object detection can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a flow chart illustrating a method of detecting an object in accordance with an exemplary embodiment;
FIG. 2 is a flow chart illustrating a method of obtaining point cloud data marked with second object information according to an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating a structure of an apparatus for detecting an object according to an exemplary embodiment;
fig. 4 is a schematic structural diagram of an electronic device shown in accordance with an exemplary embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that reference to "a plurality" in this application means two or more. In the description of this application, "/" indicates an "or" relationship; for example, A/B may indicate A or B. "And/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, for the convenience of clearly describing the technical solutions of the present application, the terms "first", "second", and the like are used to distinguish identical or similar items having substantially the same functions and effects. Those skilled in the art will appreciate that the terms "first", "second", and the like do not denote any order, quantity, or importance.
Before describing the method for detecting an object in detail, the terminology used in the embodiments of the present application is briefly described.
Point cloud data: the raw data output by the laser radar sensor, a continuous three-dimensional data stream composed of unordered three-dimensional points.
Target mapping matrix: the method is used for unifying the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor to the same coordinate system.
Next, an execution body related to the embodiment of the present application will be described.
By way of example and not limitation, the method for detecting an object provided by the embodiment of the present application may be performed by an electronic device on which a laser radar sensor and a millimeter wave radar sensor are configured or connected. Illustratively, the lidar sensor may include, but is not limited to, any of 8-line lidar, 16-line lidar, 24-line lidar, 32-line lidar, 64-line lidar, 128-line lidar. The millimeter wave radar sensor may include, but is not limited to, any one of 77GHz millimeter wave radar, 24GHz millimeter wave radar.
In implementation, the laser radar sensor and the millimeter wave radar sensor can be installed according to actual requirements, for example fixed on a roadside mast (a crossbar or an upright pole), so that objects to be detected are detected by the laser radar sensor and the millimeter wave radar sensor respectively. As an example, the objects to be detected may include, but are not limited to, vehicles, pedestrians, non-motorized vehicles, and trees.
As an example of the present application, before the laser radar sensor and the millimeter wave radar sensor start normal operation, their sampling frequencies may be adjusted in advance so that their detection data correspond, that is, so that the laser radar sensor and the millimeter wave radar sensor work as synchronously as possible. In other words, at the same or similar times, the object detected by the laser radar sensor is consistent with the object detected by the millimeter wave radar sensor. For example, the sampling frequencies of the laser radar sensor and the millimeter wave radar sensor may be adjusted to be the same, or to be in a multiple relationship, and so on, which is not limited in the embodiments of the present application.
In some embodiments, the electronic device may include, but is not limited to, a wearable device, a terminal device, a vehicle-mounted device, a camera device, or a roadside base station. Illustratively, the wearable device may include, but is not limited to, a smart watch, a smart bracelet, or smart earphones. In addition, the terminal device may include, but is not limited to, a mobile phone, a tablet computer, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA).
The road side base station is an important infrastructure for intelligent transportation vehicle-road cooperation and is a service station integrating sensing, computing and communication capabilities. In one embodiment, the roadside base station may also be referred to as a smart base station or a roadside fusion sensing system.
After introducing the technical terms and executive bodies related to the embodiments of the present application, the method for detecting an object provided by the embodiments of the present application will be described in detail with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flowchart illustrating a method for detecting an object according to an exemplary embodiment, which may be applied to the electronic device, and by way of example and not limitation, the method for detecting an object may include the following steps:
step 101: and acquiring current detection data of the laser radar sensor to obtain target detection data.
In one embodiment, the lidar sensor performs the detection operation at a first sampling frequency, where the first sampling frequency may be set according to actual requirements, which is not limited in this application. For example, in monitoring scenes with many moving objects, such as stations and roads, the first sampling frequency may be set higher; in monitoring scenarios where object mobility is low, the first sampling frequency may be set lower.
In addition, the millimeter wave radar sensor performs a detection operation at the second sampling frequency. The second sampling frequency can be set according to actual requirements. Illustratively, the second sampling frequency may be the same as the first sampling frequency, or the second sampling frequency may be in a multiple relationship with the first sampling frequency.
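As an illustrative sketch only (the function and its name are assumptions, not part of the application), such a relationship between the two sampling frequencies could be checked as follows:

```python
def frequencies_compatible(first_sampling_frequency: float,
                           second_sampling_frequency: float) -> bool:
    """Return True if the two sampling frequencies are equal or in an integer
    multiple relationship (an illustrative check; names are assumptions)."""
    if first_sampling_frequency <= 0 or second_sampling_frequency <= 0:
        return False
    ratio = max(first_sampling_frequency, second_sampling_frequency) / \
        min(first_sampling_frequency, second_sampling_frequency)
    return abs(ratio - round(ratio)) < 1e-6
```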
Step 102: first object information of an object currently detected by the lidar sensor is determined by a first object detection model based on the object detection data.
In implementation, the specific implementation of step 102 may include: calling the first target detection model, inputting the target detection data into the first target detection model, and outputting the first object information of the object currently detected by the laser radar sensor.
In one embodiment, the first object information may include, but is not limited to, at least one of an object position, an object category, an object color, an object size, and an object heading angle of the object detected by the lidar sensor.
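For illustration only, the first object information could be organized as a simple data structure; the field names below are assumptions and are not defined by the application:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class FirstObjectInfo:
    """Hypothetical container for the first object information (illustrative only)."""
    position: Tuple[float, float, float]  # object position in the point cloud coordinate system
    size: Tuple[float, float, float]      # object size: length, width, height
    heading_angle: float                  # object heading angle
    category: str                         # object category, e.g. "vehicle" or "pedestrian"
    color: str = ""                       # optional object color
```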
The first target detection model has undergone at least one round of update training. The update training updates parameters of the target detection model with a detection function based on point cloud data marked with second object information. The point cloud data marked with the second object information is obtained by labeling edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data is detected by the laser radar sensor and matched in time with the millimeter wave radar data.
In one embodiment, the target detection model is obtained by training a network model to be trained based on a large number of lidar data samples. Each lidar data sample may be obtained by labeling the detection data of the lidar sensor in advance, and for example, at least one of information such as an object size, an object position, an object type, and an object color of an object corresponding to the detection data of the lidar sensor may be determined by a user, and then, the detection data of the lidar sensor is labeled to obtain the lidar data sample.
As an example, the network model to be trained may be a deep learning model or the like.
And then, inputting a large number of laser radar data samples into the network model to be trained for training, and determining the trained network model as a target detection model when the training meets a first training end condition. Wherein, the first training end condition can be set according to actual requirements.
Because the target detection model may not accurately determine the object at the far end when processing the detection data at the far end in the detection range of the lidar sensor, in the embodiment of the present application, the first target detection model is obtained by performing at least one update training on the target detection model, so that the first target detection model can determine the object at the far end in the detection range of the lidar sensor as accurately as possible.
For this purpose, millimeter wave radar data may be acquired. The millimeter wave radar data includes object information describing the objects detected by the millimeter wave radar sensor. The object information may include, but is not limited to, at least one of object position, object size, object category, object color, and object heading angle. In this way, the second object information in the millimeter wave radar data that describes a remote object is used to label the edge point cloud data in the first detection data, obtaining point cloud data marked with the second object information. The target detection model is then update-trained based on the point cloud data marked with the second object information, so that the resulting first target detection model can accurately detect remote objects and the effective detection range of the laser radar sensor is extended.
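A minimal sketch of one such round of update training, assuming a PyTorch-style model, optimizer and loss function (the names and training details are assumptions, not the application's specific procedure):

```python
def update_train(model, labeled_point_clouds, optimizer, loss_fn, epochs=1):
    """Update the parameters of a target detection model with a detection
    function using point cloud data marked with second object information.
    `labeled_point_clouds` yields (points, label) pairs; a generic sketch."""
    model.train()
    for _ in range(epochs):
        for points, label in labeled_point_clouds:
            prediction = model(points)         # forward pass on the labeled point cloud
            loss = loss_fn(prediction, label)  # compare with the second object information label
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```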
As an example of the present application, the millimeter wave radar data is obtained by processing second detection data through a second target detection model, and the second detection data is obtained by detecting with a millimeter wave radar sensor. The second target detection model is used for determining the object detected by the millimeter wave radar sensor based on the second detection data.
The second target detection model may be obtained by training a data processing model to be trained based on a large number of millimeter wave radar data samples, each millimeter wave radar data sample may be detection data of a millimeter wave radar sensor with a label, and for example, object information of an object corresponding to the detection data of the millimeter wave radar sensor may be determined by a user and the detection data of the millimeter wave radar sensor may be labeled.
And then, inputting a large number of millimeter wave radar data samples into a data processing model to be trained for training, and determining the data processing model obtained by training as a second target detection model when a second training end condition is met. Wherein, the second training end condition can be set according to actual requirements.
As an example, the data processing model to be trained may be a DBF (Digital Beam-Forming) model.
It should be noted that after the first object information of the object currently detected by the laser radar sensor is determined through the first object detection model based on the target detection data, the current first object detection model is taken as the target detection model with the detection function; that is, the current first object detection model is subsequently update-trained based on the target detection data and the detection data of the millimeter wave radar sensor matched with it. In other words, after each determination of the detected object based on the first object detection model, update training of the first object detection model continues, so that the model becomes increasingly accurate.
In the embodiment of the application, target detection data of the laser radar sensor is acquired, and first object information of the object currently detected by the laser radar sensor is then determined through the first object detection model based on the target detection data. The first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information. The point cloud data marked with the second object information is obtained by labeling edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data is detected by the laser radar sensor and matched in time with the millimeter wave radar data. In this way, the edge point cloud data in the first detection data are labeled with the second object information that describes a remote object in the millimeter wave radar data, and the target detection model is update-trained based on the point cloud data marked with the second object information, so that the resulting first target detection model can accurately detect remote objects and the effectiveness of object detection can be improved.
Next, a description will be given of a manner of acquiring point cloud data labeled with second object information according to an embodiment of the present application. Referring to fig. 2, fig. 2 is a flowchart illustrating a method for acquiring point cloud data marked with second object information according to an exemplary embodiment, which may be performed by the electronic device, by way of example and not limitation, and may include the following steps:
step 201: in the operation process of the millimeter wave radar sensor and the laser radar sensor, two frames of detection data matched with time are obtained, the two frames of detection data comprise first detection data and second detection data, and the second detection data are obtained by detection of the millimeter wave radar sensor.
As described above, since the sampling frequencies of the millimeter wave radar sensor and the laser radar sensor are adjusted, in order to ensure the effectiveness of the subsequent update training of the target detection model with the detection function, two frames of detection data with matched time are obtained in the process of labeling data, and the first detection data and the second detection data are obtained.
In one embodiment, the specific implementation of acquiring the time-matched two frames of probe data may include: the method comprises the steps of obtaining a timestamp of detection data of a millimeter wave radar sensor to obtain a first timestamp, obtaining a timestamp of detection data of a laser radar sensor to obtain a second timestamp. And if the time difference between the first timestamp and the second timestamp is less than the time length threshold value, determining the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor as two frames of detection data matched in time.
The duration threshold may be set by a user according to actual needs, or may be set by default by the electronic device, which is not limited in the embodiment of the present application.
If the time difference between the first timestamp and the second timestamp is smaller than the duration threshold, it can be determined that the detection data corresponding to the first timestamp and the detection data corresponding to the second timestamp were acquired at the same or similar times. In this case, they can be determined as two frames of detection data matched in time: the detection data corresponding to the first timestamp is determined as the second detection data, and the detection data corresponding to the second timestamp is determined as the first detection data.
In one embodiment, if the time difference between the first time stamp and the second time stamp is greater than or equal to the duration threshold, the time stamp of the next frame of detection data of the millimeter wave radar sensor is obtained, and a third time stamp is obtained. The second timestamp is compared to the third timestamp. And if the time difference between the second timestamp and the third timestamp is less than the time length threshold, determining the detection data corresponding to the third timestamp and the detection data corresponding to the second timestamp as two frames of detection data with matched time.
If the time difference between the first timestamp and the second timestamp is greater than or equal to the duration threshold, it can be determined that the detection data corresponding to the first timestamp and the detection data corresponding to the second timestamp were not acquired at the same or similar times. In this case, the timestamp of the next frame of detection data of the millimeter wave radar sensor can be acquired to obtain a third timestamp. If the time difference between the second timestamp and the third timestamp is smaller than the duration threshold, the detection data corresponding to the second timestamp and the detection data corresponding to the third timestamp were acquired at the same or similar times; therefore, the detection data corresponding to the second timestamp can be determined as the first detection data and the detection data corresponding to the third timestamp as the second detection data.
In one embodiment, if the time difference between the second time stamp and the third time stamp is greater than or equal to the duration threshold, the next frame of detection data of the lidar sensor is acquired. And determining the detection data corresponding to the first timestamp and the next frame of detection data of the laser radar sensor as two frames of detection data matched in time.
If the time difference between the second timestamp and the third timestamp is greater than or equal to the time length threshold, it is indicated that the detection data corresponding to the second timestamp and the detection data corresponding to the third timestamp are not acquired at the same time, or are not acquired at similar times. In this case, since the millimeter wave radar sensor and the laser radar sensor are operated synchronously, the next frame of detection data of the laser radar sensor and the detection data corresponding to the first timestamp should be acquired at the same or similar time, and therefore, the electronic device may directly determine the next frame of detection data of the laser radar sensor as the first detection data and determine the detection data corresponding to the first timestamp as the second detection data. That is, two frames of detection data that are time matched during operation of the millimeter wave radar sensor and the lidar sensor may typically be separated by at most one frame.
It should be noted that the above description takes as an example the case in which, when the time difference between the first timestamp and the second timestamp is greater than or equal to the duration threshold, the third timestamp of the next frame of detection data of the millimeter wave radar sensor is obtained first. In another embodiment, when the time difference between the first timestamp and the second timestamp is greater than or equal to the duration threshold, the timestamp of the next frame of detection data of the laser radar sensor may be obtained first to obtain a fourth timestamp; the fourth timestamp is then compared with the first timestamp, and if the difference between the fourth timestamp and the first timestamp is greater than or equal to the duration threshold, the next frame of detection data of the millimeter wave radar sensor is obtained, the next frame of detection data of the millimeter wave radar sensor is determined as the second detection data, and the detection data corresponding to the second timestamp is determined as the first detection data. That is, when the time difference between the first timestamp and the second timestamp is greater than or equal to the duration threshold, which radar sensor's next timestamp is acquired first is not limited in the embodiments of the present application.
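The timestamp matching described above can be sketched as follows (function and variable names are assumptions; only the case in which the next millimeter wave radar timestamp is checked first is shown):

```python
def match_time(lidar_timestamp: float, radar_timestamp: float,
               next_radar_timestamp: float, duration_threshold: float) -> str:
    """Decide which millimeter wave radar frame is time-matched with the
    current lidar frame. Returns 'current', 'next', or 'wait_for_next_lidar'
    when neither radar frame is close enough (a simplified sketch)."""
    if abs(lidar_timestamp - radar_timestamp) < duration_threshold:
        return "current"            # first and second timestamps are matched
    if abs(lidar_timestamp - next_radar_timestamp) < duration_threshold:
        return "next"               # second and third timestamps are matched
    # Neither radar frame matches: keep the radar frame corresponding to the
    # first timestamp and pair it with the next frame of lidar detection data.
    return "wait_for_next_lidar"
```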
Step 202: and processing the second detection data through a second target detection model to obtain millimeter wave radar data, wherein the second target detection model is used for determining the object detected by the millimeter wave radar sensor based on the second detection data.
As described above, the millimeter wave radar data includes the object information of the object detected by the millimeter wave radar sensor, and it is understood that the second object information is included in the millimeter wave radar data.
Step 203: and determining point cloud data marked with second object information based on the first detection data and the millimeter wave radar data.
As an example of the present application, the specific implementation of determining the point cloud data labeled with the second object information based on the first detection data and the millimeter wave radar data may include the following 2031 to 2034:
2031: and processing the first detection data through the target detection model, and outputting information of at least one first object under the point cloud coordinate system.
In implementation, the specific implementation of 2031 may include: calling the target detection model, inputting the first detection data into the target detection model for processing, and outputting, by the target detection model, information of at least one first object in the point cloud coordinate system. The at least one first object refers to the objects detected by the lidar sensor.
2032: and mapping the millimeter wave radar data to a point cloud coordinate system to obtain information of at least one second object in the point cloud coordinate system.
In one embodiment, the millimeter wave radar data may be mapped to the point cloud coordinate system through the target mapping matrix, so as to obtain information of the at least one second object in the point cloud coordinate system. The target mapping matrix may be obtained by a calibration method, and exemplarily, a coordinate transformation matrix including a plurality of degrees of freedom such as translation and rotation is obtained by the calibration method, and the coordinate transformation matrix is determined as the target mapping matrix.
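A minimal sketch of this mapping, assuming the target mapping matrix is a 4 x 4 homogeneous transform obtained by calibration and the radar object positions are given as N x 3 coordinates (both assumptions for illustration):

```python
import numpy as np

def map_to_point_cloud_frame(radar_positions: np.ndarray,
                             target_mapping_matrix: np.ndarray) -> np.ndarray:
    """Map millimeter wave radar object positions (N x 3, radar frame) into the
    point cloud coordinate system using a 4 x 4 homogeneous transform."""
    ones = np.ones((radar_positions.shape[0], 1))
    homogeneous = np.hstack([radar_positions, ones])   # N x 4 homogeneous coordinates
    mapped = homogeneous @ target_mapping_matrix.T     # apply the calibration transform
    return mapped[:, :3]                               # back to N x 3 coordinates
```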
The at least one second object refers to an object detected by the millimeter wave radar sensor, and it is understood that the at least one second object may include a part or all of the first object, that is, some of the at least one second object and the at least one first object are the same object.
2033: and performing object matching based on the information of the at least one first object under the point cloud coordinate system and the information of the at least one second object under the point cloud coordinate system.
As an example of the present application, an IOU (Intersection over Union) calculation may be performed based on the information of the at least one first object in the point cloud coordinate system and the information of the at least one second object in the point cloud coordinate system, so as to determine the IOU value between each first object of the at least one first object and each second object of the at least one second object, thereby obtaining an object matching result.
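For illustration, a simplified axis-aligned bird's-eye-view IOU between two boxes could be computed as below; the application does not fix the exact IOU formulation (rotated or full 3D IOU could equally be used):

```python
def bev_iou(box_a, box_b) -> float:
    """IOU of two axis-aligned bird's-eye-view boxes given as
    (x_min, y_min, x_max, y_max); a simplified illustrative version."""
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```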
2034: and determining point cloud data marked with second object information according to the object matching result.
As an example of the present application, determining a specific implementation of the point cloud data labeled with the second object information according to the result of the object matching may include: and determining an object three-dimensional frame under the point cloud coordinate system according to the object matching result, wherein the object three-dimensional frame refers to the three-dimensional frame of the second object which is not matched with the at least one first object in the at least one second object. And if the number of the point clouds included in the object three-dimensional frame in the first detection data is larger than the point threshold value, marking the point cloud data included in the object three-dimensional frame based on the data corresponding to the object three-dimensional frame in the millimeter wave radar data to obtain the point cloud data marked with the second object information.
The point threshold value may be set by a user according to actual needs, or may be set by default by the electronic device, which is not limited in the embodiment of the present application.
For any one of the at least one first object, when the IOU value between the any one first object and a certain second object of the at least one second object is greater than or equal to the matching degree threshold, it may be determined that the any one first object matches the certain second object, otherwise, if the IOU value between the any one first object and the certain second object is less than the matching degree threshold, it is determined that the any one first object does not match the certain second object. In this way, a second object of the at least one second object that does not match the at least one first object can be determined.
The matching degree threshold may be set by a user according to actual needs, or may be set by default by the electronic device, which is not limited in the embodiment of the present application.
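Based on that rule, the unmatched second objects can be picked out from a matrix of IOU values; the sketch below assumes `iou_matrix[i][j]` holds the IOU between the i-th first object and the j-th second object (names are assumptions):

```python
import numpy as np

def find_unmatched_second_objects(iou_matrix: np.ndarray,
                                  matching_threshold: float) -> list:
    """Return indices of second objects whose IOU with every first object is
    below the matching degree threshold (illustrative sketch)."""
    unmatched = []
    for j in range(iou_matrix.shape[1]):                  # iterate over second objects
        if (iou_matrix[:, j] < matching_threshold).all():
            unmatched.append(j)                           # no first object matches this second object
    return unmatched
```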
Then, the object three-dimensional frame corresponding to the unmatched second object in the point cloud coordinate system is determined. Because the detection range of the millimeter wave radar sensor is wide, the millimeter wave radar data contains more second object information corresponding to the object three-dimensional frame, whereas the detection range of the laser radar sensor is limited, so the first detection data may contain only a small amount of point cloud data corresponding to the object three-dimensional frame. If the number of point clouds included in the determined object three-dimensional frame is larger than the point threshold, it indicates that some characteristics of the second object corresponding to the object three-dimensional frame have been detected, but the detected point cloud data are sparse. In this case, the point cloud data included in the object three-dimensional frame can be labeled with the data corresponding to the object three-dimensional frame in the millimeter wave radar data, so as to obtain the point cloud data marked with the second object information.
In one embodiment, if the number of point clouds included in the determined object three-dimensional frame is less than or equal to the point threshold, this indicates that the object three-dimensional frame contains only a very small amount of point cloud data, for example only a single-digit number of points, and the second object corresponding to the object three-dimensional frame can be ignored. In this case, the point cloud data within the object three-dimensional frame may not be labeled.
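A minimal sketch of the labeling step, assuming the object three-dimensional frame is approximated by an axis-aligned box given as (x_min, y_min, z_min, x_max, y_max, z_max); the function and parameter names are assumptions:

```python
import numpy as np

def label_points_in_box(points: np.ndarray, box: np.ndarray,
                        second_object_info: dict, point_threshold: int):
    """Label the point cloud data inside an unmatched object three-dimensional
    frame with the corresponding second object information, or return None if
    the frame contains too few points (illustrative sketch)."""
    lower, upper = box[:3], box[3:]
    mask = np.all((points >= lower) & (points <= upper), axis=1)
    inside = points[mask]
    if inside.shape[0] > point_threshold:
        # Enough points: annotate them with the millimeter wave radar object information.
        return {"points": inside, "label": second_object_info}
    # Too few points: the second object is ignored and no training sample is produced.
    return None
```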
It should be noted that the point cloud data labeled with the second object information for performing update training on the target detection model may be one or more groups. Under the condition of multiple groups, each group of point cloud data marked with the second object information can be obtained through the method.
In addition, it should be noted that after a group of point cloud data marked with a second object is obtained, the target detection model with the detection function may be updated and trained immediately, or after a plurality of groups of point cloud data marked with a second object are obtained, the target detection model with the detection function may be updated and trained, and the time for updating and training is not limited in the embodiment of the present application.
In addition, as an example of the present application, the first target detection model may not be updated and trained again when a preset condition is met. The preset condition may be set according to actual requirements, for example, the preset condition may be that the detection performance of the first object detection model reaches a certain preset standard, or the number of times of updating training of the first object detection model reaches a preset number of times, and the like.
In the embodiment of the application, during the operation of the laser radar sensor and the millimeter wave radar sensor, two frames of detection data matched in time are obtained, yielding first detection data and second detection data. The edge point cloud data in the first detection data are then labeled based on the second object information of the unmatched objects in the millimeter wave radar data, obtaining point cloud data marked with the second object information. The target detection model is update-trained based on the point cloud data marked with the second object information to obtain a first target detection model. Afterwards, target detection data of the laser radar sensor is acquired, and first object information of the object currently detected by the laser radar sensor is determined through the first target detection model based on the target detection data. Since the first target detection model can accurately detect remote objects, the effectiveness of object detection can be improved.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 3 is a schematic structural diagram illustrating an apparatus for detecting an object according to an exemplary embodiment. The apparatus may be implemented as part or all of an electronic device in software, hardware, or a combination of the two. The apparatus for detecting an object may include:
an obtaining module 310, configured to obtain current detection data of a laser radar sensor to obtain target detection data;
a determining module 320, configured to determine, based on the target detection data, first object information of an object currently detected by the lidar sensor through a first target detection model;
the first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information, the point cloud data marked with the second object information being obtained by labeling edge point cloud data based on millimeter wave radar data, the edge point cloud data comprising point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data being detected by the laser radar sensor and matched in time with the millimeter wave radar data.
As an example of the present application, the obtaining module 310 is further configured to:
acquiring two frames of detection data matched with time in the operation process of the millimeter wave radar sensor and the laser radar sensor, wherein the two frames of detection data comprise the first detection data and the second detection data, and the second detection data is obtained by detection of the millimeter wave radar sensor;
processing the second detection data through a second target detection model to obtain the millimeter wave radar data, wherein the second target detection model is used for determining an object detected by the millimeter wave radar sensor based on the second detection data;
and determining the point cloud data marked with the second object information based on the first detection data and the millimeter wave radar data.
As an example of the present application, the obtaining module 310 is configured to:
processing the first detection data through the target detection model, and outputting information of at least one first object under a point cloud coordinate system;
mapping the millimeter wave radar data to the point cloud coordinate system to obtain information of at least one second object in the point cloud coordinate system;
performing object matching based on the information of the at least one first object under the point cloud coordinate system and the information of the at least one second object under the point cloud coordinate system;
and determining the point cloud data marked with the second object information according to the result of the object matching.
As an example of the present application, the obtaining module 310 is configured to:
determining an object three-dimensional frame under the point cloud coordinate system according to the result of object matching, wherein the object three-dimensional frame refers to a three-dimensional frame of a second object which is not matched with the at least one first object in the at least one second object;
if the number of point clouds included in the object three-dimensional frame in the first detection data is larger than a point threshold value, marking the point cloud data included in the object three-dimensional frame based on the data corresponding to the object three-dimensional frame in the millimeter wave radar data to obtain the point cloud data marked with the second object information.
As an example of the present application, the obtaining module 310 is configured to:
acquiring a timestamp of detection data of the millimeter wave radar sensor to obtain a first timestamp, and acquiring a timestamp of detection data of the laser radar sensor to obtain a second timestamp;
and if the time difference between the first timestamp and the second timestamp is less than a duration threshold, determining the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor as two frames of detection data matched with the time.
As an example of the present application, the obtaining module 310 is configured to:
if the time difference between the first time stamp and the second time stamp is greater than or equal to the duration threshold, acquiring a time stamp of next frame detection data of the millimeter wave radar sensor to obtain a third time stamp;
comparing the second timestamp to the third timestamp;
and if the time difference between the second timestamp and the third timestamp is smaller than the duration threshold, determining the detection data corresponding to the third timestamp and the detection data corresponding to the second timestamp as the two frames of detection data with matched time.
As an example of the present application, the obtaining module 310 is configured to:
if the time difference between the second timestamp and the third timestamp is greater than or equal to the duration threshold, acquiring next frame detection data of the laser radar sensor;
and determining the detection data corresponding to the first timestamp and the next frame of detection data of the laser radar sensor as the two frames of detection data matched with the time.
In the embodiment of the application, target detection data of the lidar sensor is acquired, and first object information of the object currently detected by the lidar sensor is then determined through a first object detection model based on the target detection data. The first target detection model has undergone at least one round of update training, the update training being to update parameters of the target detection model with a detection function based on point cloud data marked with second object information. The point cloud data marked with the second object information is obtained by labeling edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data in the first detection data that failed to match any object in the millimeter wave radar data, and the first detection data is detected by the laser radar sensor and matched in time with the millimeter wave radar data. In this way, the edge point cloud data in the first detection data are labeled with the second object information that describes a remote object in the millimeter wave radar data, and the target detection model is update-trained based on the point cloud data marked with the second object information, so that the resulting first target detection model can accurately detect remote objects and the effectiveness of object detection can be improved.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 4, the electronic device 4 of this embodiment includes: at least one processor 40 (only one is shown in fig. 4), a memory 41, and a computer program 42 stored in the memory 41 and executable on the at least one processor 40; when the processor 40 executes the computer program 42, the steps of any of the method embodiments described above are implemented.
The electronic device 4 may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The electronic device may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of the electronic device 4 and does not constitute a limitation thereof; the electronic device 4 may include more or fewer components than those shown, may combine some of the components, or may include different components, such as an input-output device or a network access device.
The processor 40 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
In some embodiments, the memory 41 may be an internal storage unit of the electronic device 4, such as a hard disk or a memory of the electronic device 4. In other embodiments, the memory 41 may be an external storage device provided on the electronic device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash memory card (Flash Card). Further, the memory 41 may include both an internal storage unit and an external storage device of the electronic device 4. The memory 41 is used for storing an operating system, application programs, a boot loader, data, and other programs, such as the program code of the computer program. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It should be noted that the information interaction between the above devices/units, their execution processes, and other details are based on the same concept as the method embodiments of the present application; for their specific functions and technical effects, reference may be made to the method embodiment section, and they are not described again here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division into functional units and modules is illustrated as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist physically on its own, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of the present application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described again here.
The above-mentioned embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method of detecting an object, the method comprising:
acquiring current detection data of a laser radar sensor to obtain target detection data;
determining, by a first target detection model, first object information of an object currently detected by the laser radar sensor based on the target detection data;
wherein the first target detection model has been subjected to at least one update training; the update training is to update parameters of a target detection model having a detection function based on point cloud data marked with second object information; the point cloud data marked with the second object information is obtained by marking edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data, in first detection data, that failed to be matched with any object in the millimeter wave radar data; and the first detection data is detected by the laser radar sensor and is time-matched with the millimeter wave radar data.
2. The method of claim 1, wherein the acquiring of the point cloud data marked with the second object information comprises:
during operation of the millimeter wave radar sensor and the laser radar sensor, acquiring two time-matched frames of detection data, wherein the two frames of detection data comprise the first detection data and second detection data, and the second detection data is detected by the millimeter wave radar sensor;
processing the second detection data through a second target detection model to obtain the millimeter wave radar data, wherein the second target detection model is used for determining an object detected by the millimeter wave radar sensor based on the second detection data;
and determining the point cloud data marked with the second object information based on the first detection data and the millimeter wave radar data.
3. The method of claim 2, wherein the determining the point cloud data tagged with the second object information based on the first detection data and the millimeter wave radar data comprises:
processing the first detection data through the target detection model, and outputting information of at least one first object in a point cloud coordinate system;
mapping the millimeter wave radar data to the point cloud coordinate system to obtain information of at least one second object in the point cloud coordinate system;
performing object matching based on the information of the at least one first object in the point cloud coordinate system and the information of the at least one second object in the point cloud coordinate system;
and determining the point cloud data marked with the second object information according to the result of the object matching.
4. The method of claim 3, wherein the determining the point cloud data labeled with the second object information according to the result of the object matching comprises:
determining an object three-dimensional frame in the point cloud coordinate system according to the result of the object matching, wherein the object three-dimensional frame refers to a three-dimensional frame of a second object, among the at least one second object, that is not matched with any of the at least one first object;
if the number of points included in the object three-dimensional frame in the first detection data is larger than a point threshold, marking the point cloud data included in the object three-dimensional frame based on the data corresponding to the object three-dimensional frame in the millimeter wave radar data, so as to obtain the point cloud data marked with the second object information.
5. The method of claim 2, wherein the acquiring two time-matched frames of detection data comprises:
acquiring a timestamp of detection data of the millimeter wave radar sensor to obtain a first timestamp, and acquiring a timestamp of detection data of the laser radar sensor to obtain a second timestamp;
and if the time difference between the first timestamp and the second timestamp is less than a duration threshold, determining the detection data of the millimeter wave radar sensor and the detection data of the laser radar sensor as the two time-matched frames of detection data.
6. The method of claim 5, wherein the method further comprises:
if the time difference between the first timestamp and the second timestamp is greater than or equal to the duration threshold, acquiring a timestamp of a next frame of detection data of the millimeter wave radar sensor to obtain a third timestamp;
comparing the second timestamp to the third timestamp;
and if the time difference between the second timestamp and the third timestamp is smaller than the duration threshold, determining the detection data corresponding to the third timestamp and the detection data corresponding to the second timestamp as the two time-matched frames of detection data.
7. The method of claim 6, wherein the method further comprises:
if the time difference between the second timestamp and the third timestamp is greater than or equal to the duration threshold, acquiring a next frame of detection data of the laser radar sensor;
and determining the detection data corresponding to the first timestamp and the next frame of detection data of the laser radar sensor as the two time-matched frames of detection data.
8. An apparatus for detecting an object, the apparatus comprising:
an acquisition module, configured to acquire current detection data of a laser radar sensor to obtain target detection data;
a determining module, configured to determine, based on the target detection data, first object information of an object currently detected by the laser radar sensor through a first target detection model;
wherein the first target detection model has been subjected to at least one update training; the update training is to update parameters of a target detection model having a detection function based on point cloud data marked with second object information; the point cloud data marked with the second object information is obtained by marking edge point cloud data based on millimeter wave radar data; the edge point cloud data comprises point cloud data, in first detection data, that failed to be matched with any object in the millimeter wave radar data; and the first detection data is detected by the laser radar sensor and is time-matched with the millimeter wave radar data.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium having instructions stored thereon, wherein the instructions, when executed by a processor, implement the method of any of claims 1 to 7.
CN202110867696.8A 2021-07-29 2021-07-29 Method and device for detecting object, electronic equipment and storage medium Pending CN115685182A (en)

Priority Applications (1)

Application Number: CN202110867696.8A
Priority Date / Filing Date: 2021-07-29
Title: Method and device for detecting object, electronic equipment and storage medium

Publications (1)

Publication Number: CN115685182A
Publication Date: 2023-02-03

Family ID: 85058825

Country Status (1)

CN: CN115685182A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination