CN116664638A - Determination method and device of perspective matrix, electronic equipment and storage medium - Google Patents


Info

Publication number: CN116664638A
Application number: CN202211500222.0A
Authority: CN (China)
Prior art keywords: target, perspective matrix, matrix, target point, determining
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 闫夏卿, 陈家兴
Current assignee: Zhejiang Uniview Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Zhejiang Uniview Technologies Co Ltd
Application filed by Zhejiang Uniview Technologies Co Ltd
Priority to CN202211500222.0A
Publication of CN116664638A

Classifications

    • G01S13/66 — Radar-tracking systems; analogous systems
    • G01S13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/33 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods


Abstract

The application discloses a method and device for determining a perspective matrix, an electronic device, and a storage medium. The method comprises the following steps: acquiring a pre-calibrated initial perspective matrix between a video coordinate system and a radar coordinate system; if a target point is determined to meet a target tracking condition, determining the deviation value of the target point matched against the initial perspective matrix; recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix; and repeatedly determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating the intermediate perspective matrix according to those deviation values, until the intermediate perspective matrix is determined to meet a matrix accuracy condition, whereupon it is taken as the target perspective matrix. By repeatedly recalibrating the intermediate perspective matrix until it meets the matrix accuracy condition, the application determines the target perspective matrix, reduces the calibration cost of the perspective matrix, and improves its accuracy.

Description

Method and device for determining a perspective matrix, electronic device, and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and apparatus for determining a perspective matrix, an electronic device, and a storage medium.
Background
The radar-video all-in-one unit represents a major breakthrough in road traffic management technology: it can fuse target information acquired by radar with target information acquired by video. However, the target position coordinates acquired by the radar and the target pixel coordinates detected in the video lie in two different coordinate systems, so fusing the two kinds of information requires determining a conversion relationship between them, namely a perspective matrix.
In the prior art, the perspective matrix is obtained by calibration: when the radar-video all-in-one unit is installed, an engineering survey determines a number of position points and the pixel coordinates corresponding to those points in the video, from which the perspective matrix is computed. However, this method requires determining as many position points as possible, and the calibration is labor-intensive.
Disclosure of Invention
The application provides a method and device for determining a perspective matrix, an electronic device, and a storage medium, so as to reduce the calibration cost of the perspective matrix and improve its accuracy.
In a first aspect, an embodiment of the present application provides a method for determining a perspective matrix, where the method includes:
acquiring a pre-calibrated initial perspective matrix between a video coordinate system and a radar coordinate system;
if it is determined that a target point meets a target tracking condition, determining a deviation value of the target point matched against the initial perspective matrix;
recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix;
repeatedly performing the operations of determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating the intermediate perspective matrix according to those deviation values, until the intermediate perspective matrix is determined to meet a matrix accuracy condition, whereupon it is taken as the target perspective matrix.
In a second aspect, an embodiment of the present application further provides a device for determining a perspective matrix, where the device includes:
a first matrix determining module, configured to acquire a pre-calibrated initial perspective matrix between a video coordinate system and a radar coordinate system;
a deviation value determining module, configured to determine a deviation value of a target point matched against the initial perspective matrix if the target point is determined to meet a target tracking condition;
a second matrix determining module, configured to recalibrate the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix;
and a target matrix determining module, configured to repeatedly perform the operations of determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating the intermediate perspective matrix according to those deviation values, until the intermediate perspective matrix is determined to meet a matrix accuracy condition, whereupon it is taken as the target perspective matrix.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements the method for determining a perspective matrix according to any embodiment of the present application.
In a fourth aspect, embodiments of the present application further provide a storage medium storing computer-executable instructions that, when executed by a computer processor, perform the method for determining a perspective matrix according to any embodiment of the present application.
According to the technical scheme of the application, a pre-calibrated initial perspective matrix between the video coordinate system and the radar coordinate system is first acquired. Target points are then acquired, and for each target point determined to meet the target tracking condition, the deviation value of that point matched against the initial perspective matrix is determined, which ensures that only stably tracked target points are used. Next, the initial perspective matrix is recalibrated according to the deviation value of each target point to obtain an intermediate perspective matrix; this keeps the target-point errors small, reduces the number and duration of repeated calibrations, avoids inaccuracy of the intermediate perspective matrix caused by chance behavior of individual target points, and thus makes the obtained intermediate perspective matrix more accurate. Finally, the operations of determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating that matrix accordingly are repeated until the intermediate perspective matrix meets the matrix accuracy condition and is taken as the target perspective matrix, thereby reducing the calibration cost of the perspective matrix and improving its accuracy.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of the present application; a person skilled in the art may obtain other drawings from these drawings without inventive effort.
FIG. 1 is a flow chart of a method for determining a perspective matrix according to a first embodiment of the present application;
FIG. 2 is a flowchart of a method for determining a perspective matrix according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a device for determining a perspective matrix according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a method for determining a perspective matrix according to an embodiment of the present application. The method may be performed by a device for determining a perspective matrix, which may be implemented in hardware and/or software and configured in an electronic device. The method is applicable to automatically and accurately calibrating the conversion relationship (i.e., the perspective matrix) between targets detected by video and targets detected by radar.
As shown in fig. 1, the method includes:
s110, acquiring an initial perspective matrix between a video coordinate system and a radar coordinate system, which are calibrated in advance.
In this scheme, a millimeter-wave radar-video all-in-one unit simultaneously acquires video information and radar information of the calibration points. The coordinate system of a calibration point in the video information is the video coordinate system, and its coordinate system in the radar information is the radar coordinate system; fusing the two kinds of information therefore requires determining a transformation relationship between them, namely the perspective matrix.
Optionally, the initial perspective matrix is obtained by calibrating in advance the radar coordinates and the video coordinates of at least four calibration points.
For example, during primary calibration, calibration points may be set at four positions a certain distance away on the installation site of the millimeter-wave radar-video all-in-one unit, for instance at 25 meters and 70 meters, on the left and right sides of the picture. The video coordinates and radar coordinates of these four calibration points are then obtained, the four points are calibrated, and an initial perspective matrix between the video coordinate system and the radar coordinate system is obtained. The perspective transformation may be expressed as:

[X]   [a1 a2 a3]   [x]
[Y] = [b1 b2 b3] · [y]
[Z]   [c1 c2 c3]   [1]

where X, Y, Z represent (homogeneous) radar coordinates and x, y represent video coordinates.

Transforming the above formula yields:

X/Z = (a1·x + a2·y + a3) / (c1·x + c2·y + c3)
Y/Z = (b1·x + b2·y + b3) / (c1·x + c2·y + c3)

where c3 = 1, so the perspective matrix has 8 unknowns in total: a1, a2, a3, b1, b2, b3, c1, and c2. A solution for the initial perspective matrix can be obtained from four points, each point contributing two equations.
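As a sketch of how the eight unknowns could be solved from four calibration-point pairs (a minimal illustration, not the patent's implementation; the function and variable names are hypothetical):

```python
import numpy as np

def solve_perspective_matrix(video_pts, radar_pts):
    # Each pair (x, y) -> (X, Y) gives two linear equations in the
    # eight unknowns a1..a3, b1..b3, c1, c2 (with c3 fixed to 1):
    #   a1*x + a2*y + a3 - c1*x*X - c2*y*X = X
    #   b1*x + b2*y + b3 - c1*x*Y - c2*y*Y = Y
    A, b = [], []
    for (x, y), (X, Y) in zip(video_pts, radar_pts):
        A.append([x, y, 1, 0, 0, 0, -x * X, -y * X])
        A.append([0, 0, 0, x, y, 1, -x * Y, -y * Y])
        b.extend([X, Y])
    a1, a2, a3, b1, b2, b3, c1, c2 = np.linalg.solve(
        np.array(A, dtype=float), np.array(b, dtype=float))
    return np.array([[a1, a2, a3], [b1, b2, b3], [c1, c2, 1.0]])
```

Four point pairs in general position yield exactly eight equations, so a direct solve suffices; with more than four pairs the system becomes overdetermined and must be fitted by least squares instead.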
And S120, if the target point is determined to meet the target tracking condition, determining a deviation value of the target point matched with the initial perspective matrix.
The target tracking condition may include the following three sub-conditions: the target point appears continuously in the video coordinate system for a preset number of frames; a position point matching the target point can be determined in the radar coordinate system; and the deviation values between the converted coordinates of the target point's video coordinates and the radar coordinates of the matched position point follow a normal distribution.
Video frames and radar frames are acquired synchronously, i.e., at the same moments. The number of consecutive frames in which the target point must appear in the video coordinate system is specified so that enough video coordinates of the target point can be collected; the position points in the radar coordinate system are then determined from the converted coordinates of those video coordinates, and the deviation values are computed. A position point matching the target point must be determinable in the radar coordinate system because, if the radar coordinate system contains no such point, the error of the initial perspective matrix is too large and this abnormal condition must be excluded. The deviation values must follow a normal distribution because, if they do not, the error is unstable; the error of the perspective matrix must remain within a random-error range.
Specifically, the target points in a number of video frame images are acquired, analyzed, and processed, and only the target points meeting the target tracking condition are retained. This ensures that the determined target points are accurate and prevents inaccurate target points from biasing the subsequent calculation, so that the deviation value of each target point matched against the initial perspective matrix can be determined accurately.
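As an illustration, the per-frame deviation of a target point could be taken as the distance between its transformed video coordinate and the matched radar coordinate. The Euclidean metric and the names here are assumptions for the sketch; the patent does not fix the metric:

```python
import numpy as np

def deviation_values(M, video_coords, radar_coords):
    # Transform each video coordinate through the perspective matrix M,
    # then measure its distance to the matched radar coordinate.
    devs = []
    for (x, y), (X, Y) in zip(video_coords, radar_coords):
        h = M @ np.array([x, y, 1.0])
        devs.append(float(np.hypot(h[0] / h[2] - X, h[1] / h[2] - Y)))
    return devs
```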
S130, recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix.
Specifically, the deviation values of multiple target points may be determined in step S120; these deviation values are then calculated and analyzed to recalibrate the initial perspective matrix.
Optionally, the deviation values of the target points should themselves follow a normal distribution. Target points whose deviation value falls outside μ ± 3σ of that distribution are considered to have excessive error and may be removed; the retained target points are used to recalibrate the initial perspective matrix.
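The μ ± 3σ screening described above can be sketched as follows (a minimal illustration; names are hypothetical):

```python
import numpy as np

def within_3sigma(deviations):
    # Mark the deviation values that lie inside mu +/- 3*sigma of the
    # sample; target points outside this band are removed.
    d = np.asarray(deviations, dtype=float)
    mu, sigma = d.mean(), d.std()
    return np.abs(d - mu) <= 3.0 * sigma
```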
In a possible embodiment, recalibrating the initial perspective matrix according to the deviation value of each target point optionally comprises steps A1-A2:
Step A1: determining the standard deviation of each target point's deviation values from those deviation values.
Step A2: recalibrating the initial perspective matrix according to the target points whose standard deviation is smaller than a preset standard deviation.
The preset standard deviation may be a boundary value set according to the actual situation, chosen so that the retained target points are more accurate.
Specifically, the deviation values of multiple target points are obtained, and the standard deviation of each target point's deviation values is computed and compared with the preset standard deviation. Target points whose standard deviation is smaller than the preset standard deviation are tracked stably and are therefore retained; this makes the selection of target points more accurate, and the retained points can then be used to recalibrate the initial perspective matrix.
With this technical scheme, the target points suitable for recalibration are screened out by comparing the standard deviation of each target point's deviation values against the preset standard deviation. The target-point errors are thus smaller and more accurate, the initial perspective matrix can be correctly recalibrated using these points, the number and duration of repeated calibrations are reduced, inaccuracy of the intermediate perspective matrix caused by chance behavior of individual target points is avoided, and the resulting intermediate perspective matrix is more accurate.
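Steps A1-A2 can be sketched as follows, assuming each target point carries a list of per-frame deviation values (the data layout, names, and threshold are hypothetical):

```python
import numpy as np

def select_stable_targets(per_target_deviations, preset_std):
    # Step A1: compute the standard deviation of each target point's
    # deviation values. Step A2: keep only the stably tracked points
    # whose standard deviation is below the preset threshold.
    return [i for i, devs in enumerate(per_target_deviations)
            if np.std(devs) < preset_std]
```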
Optionally, after the target points for recalibration are obtained, each may be compared with the original calibration points, where:
a. if the video coordinates of the target point fall within a preset range of the video coordinates of an original calibration point, the target point replaces that original calibration point and calibration is performed again, obtaining an intermediate perspective matrix;
b. if the video coordinates of the target point fall outside the preset range of the video coordinates of every original calibration point, the target point is treated as an extension calibration point, and the intermediate perspective matrix is computed from it together with the original calibration points.
It should be noted that, because the number of points used to compute the intermediate perspective matrix has now grown, the number of equations exceeds the number of unknowns and no exact solution can be obtained directly; a solution must instead be fitted using the least squares method.
For the overdetermined system Xθ = Y, the least squares solution can be derived as:
θ = (X^T X)^(-1) X^T Y
the number of extension index points is not limited in this embodiment, and alternatively, the total number of extension index points may be limited to 16.
S140, repeatedly performing the operations of determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating the intermediate perspective matrix according to those deviation values, until the intermediate perspective matrix is determined to meet the matrix accuracy condition, whereupon it is taken as the target perspective matrix.
The matrix accuracy condition is the condition under which the perspective matrix no longer needs to be recalibrated, i.e., it has reached the required accuracy.
Specifically, after the intermediate perspective matrix is obtained, whether it meets the matrix accuracy condition must be judged. If it does, the matrix is sufficiently accurate; it is taken as the target perspective matrix and calibration ends. If it does not, new target points must be acquired, the deviation values of those target points matched against the intermediate perspective matrix are determined again, and the intermediate perspective matrix is recalibrated according to those deviation values, until the matrix accuracy condition is met. In these repeated operations, the target points must be reselected for computing the deviation values, which ensures that the accuracy of the final matrix does not depend on the overall bias of any particular batch of data.
Optionally, the intermediate perspective matrix may be determined to meet the matrix accuracy condition as follows: calculate the difference between the deviation target value of the target points matched against the previous intermediate perspective matrix and the deviation target value of the target points matched against the current intermediate perspective matrix; if the difference is smaller than or equal to a preset difference, the current intermediate perspective matrix is determined to meet the matrix accuracy condition. The deviation target value comprises the mean and the standard deviation of the deviation values.
In this embodiment, if the deviation target value of the target points of the current intermediate perspective matrix is smaller than that of the intermediate perspective matrix calibrated in the previous round, and the deviation target values obtained in the two rounds are close, the repeated calibration process is converging and the accuracy of the current intermediate perspective matrix is already good; the repeated calibration can then stop, and the current intermediate perspective matrix is taken as the final target perspective matrix.
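The convergence check could look like the following sketch, where the deviation target value is represented by the mean and standard deviation of the per-point deviations as described above (the threshold values and names are hypothetical):

```python
import numpy as np

def accuracy_condition_met(prev_deviations, curr_deviations,
                           max_mean_diff=0.5, max_std_diff=0.5):
    # Compare the deviation target values (mean and standard deviation)
    # of two successive calibration rounds; if both differences are
    # within the preset thresholds, the repeated calibration has
    # converged and recalibration can stop.
    prev = np.asarray(prev_deviations, dtype=float)
    curr = np.asarray(curr_deviations, dtype=float)
    return (abs(curr.mean() - prev.mean()) <= max_mean_diff
            and abs(curr.std() - prev.std()) <= max_std_diff)
```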
Optionally, in the scheme of the application, radar coordinates may be converted into video coordinates, and video coordinates may likewise be converted into radar coordinates.
Alternatively, taking the conversion of radar coordinates to video coordinates as an example, the method may proceed as follows: acquire a pre-calibrated initial perspective matrix between the video coordinate system and the radar coordinate system; determine the radar coordinates of a target point in a first number of consecutive radar frame images; transform the first number of radar coordinates of the target point through the initial perspective matrix to obtain converted coordinates matching each radar coordinate; determine, from each converted coordinate of the target point and the position points in the first number of video frame images, the position point in each video frame image matching the converted coordinates of the target point; if the deviation values between the first number of converted coordinates of the target point and the first number of video coordinates of the matched position points follow a normal distribution, the target point meets the target tracking condition, and the deviation value of the target point matched against the initial perspective matrix is determined; recalibrate the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix; and repeat the operations of determining the deviation values against the intermediate perspective matrix and recalibrating it according to those deviation values, until the intermediate perspective matrix meets the matrix accuracy condition and is taken as the target perspective matrix.
According to the technical scheme of the application, a pre-calibrated initial perspective matrix between the video coordinate system and the radar coordinate system is first acquired. Target points are then acquired, and for each target point determined to meet the target tracking condition, the deviation value of that point matched against the initial perspective matrix is determined, which ensures that only stably tracked target points are used. Next, the initial perspective matrix is recalibrated according to the deviation value of each target point to obtain an intermediate perspective matrix; this keeps the target-point errors small, reduces the number and duration of repeated calibrations, avoids inaccuracy of the intermediate perspective matrix caused by chance behavior of individual target points, and thus makes the obtained intermediate perspective matrix more accurate. Finally, the operations of determining the deviation values of the target points matched against the intermediate perspective matrix and recalibrating that matrix accordingly are repeated until the intermediate perspective matrix meets the matrix accuracy condition and is taken as the target perspective matrix, thereby reducing the calibration cost of the perspective matrix and improving its accuracy.
Example 2
Fig. 2 is a flowchart of a method for determining a perspective matrix according to an embodiment of the present application. On the basis of the foregoing embodiment, this embodiment further details how, if a target point is determined to meet the target tracking condition, the deviation value of the target point matched against the initial perspective matrix is determined.
As shown in fig. 2, the method includes:
s210, acquiring an initial perspective matrix between a video coordinate system and a radar coordinate system, which are calibrated in advance.
The specific process of obtaining the initial perspective matrix through calibration is described in the above embodiment and is not repeated here.
S220, determining that the target point meets the target tracking condition.
Specifically: a first number of consecutive video frame images are obtained, and the video coordinates of the target point in those images are determined; according to the initial perspective matrix, the first number of video coordinates of the target point are transformed to obtain a converted coordinate matching each video coordinate; meanwhile, according to each converted coordinate of the target point and the position points in the first number of radar frame images, the position point matching the converted coordinate of the target point is determined in each radar frame image; finally, if the deviation values between the first number of converted coordinates of the target point and the first number of radar coordinates of the matched position point follow a normal distribution, the target point is determined to meet the target tracking condition.
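Transforming a video coordinate through the perspective matrix amounts to a homogeneous multiplication followed by a perspective divide (a minimal sketch; names are hypothetical):

```python
import numpy as np

def transform_coordinate(M, x, y):
    # Apply the 3x3 perspective matrix to the video coordinate (x, y),
    # then divide by the homogeneous component to obtain the converted
    # coordinate in the radar coordinate system.
    h = M @ np.array([x, y, 1.0])
    return h[0] / h[2], h[1] / h[2]
```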
In one possible embodiment, determining the location point in each radar frame image that matches the transformed coordinates of the target point based on each transformed coordinate of the target point and each location point in the first number of radar frame images may include steps B1-B2:
Step B1, from the position points of the radar frame image matched with the target video frame image, determining the position points whose distance from the conversion coordinate of the target point in the target video frame image is smaller than or equal to a preset distance.
Step B2, if the number of times a target position point occurs in the first number of radar frame images satisfies a confidence condition, taking the target position point as the position point matched with the target point.
The confidence condition may be that the target position point occurs the greatest number of times in the first number of radar frame images.
Specifically, the video coordinates of the target point in the first number of video frame images are acquired, the first number of conversion coordinates of the target point are determined through the initial perspective matrix, and all position points in the first number of radar frame images are acquired. For each target video frame image, the distance between the conversion coordinate of the target point and each position point of the matched radar frame image is evaluated, and the position points whose distance is smaller than or equal to the preset distance are retained; zero, one, or several such position points may exist per frame, so several different candidate position points may accumulate across frames. If the number of times a target position point occurs in the first number of radar frame images satisfies the confidence condition, that is, it occurs the greatest number of times, it is taken as the position point matched with the target point. Determining the matched position point in this way is more accurate and avoids errors in calculating the deviation value of the target point caused by chance matches.
Alternatively, if a plurality of target position points satisfy the confidence condition by occurring the same number of times in the first number of radar frame images, the target position point closest to the conversion coordinates of the target point is selected as the matched position point. "Closest" may mean the smallest average distance between the coordinates of the target position point and the conversion coordinates of the target point over the first number of radar frame images, or the smallest single such distance, which is not limited in this embodiment.
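One way to realize steps B1-B2 in code: keep every radar position point within a preset distance of the frame's conversion coordinate, then pick the point identifier seen most often, breaking ties by smallest mean distance as the text allows. The dictionary representation of radar frames, the `max_dist` value, and the identifiers are assumptions for illustration:

```python
import numpy as np
from collections import defaultdict

def match_position_point(converted, radar_frames, max_dist=2.0):
    """Pick the radar position point matched with the target point.

    converted:    (F, 2) conversion coordinates, one per frame.
    radar_frames: list of F dicts {point_id: (x, y)} of position points.
    A point must fall within max_dist of the conversion coordinate; the
    id seen most often wins, ties broken by smallest mean distance.
    """
    hits = defaultdict(list)  # point_id -> list of observed distances
    for c, frame in zip(converted, radar_frames):
        for pid, xy in frame.items():
            d = float(np.hypot(*(np.asarray(xy) - c)))
            if d <= max_dist:
                hits[pid].append(d)
    if not hits:
        return None  # tracking failed over this window
    # Most occurrences first; among ties, smallest average distance.
    return min(hits, key=lambda pid: (-len(hits[pid]), np.mean(hits[pid])))

# Illustrative three-frame window: point "a" stays close to the track.
converted = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
frames = [{"a": (0.1, 0.0), "b": (0.5, 0.0)},
          {"a": (1.1, 0.0)},
          {"a": (2.1, 0.0), "b": (9.0, 9.0)}]
matched = match_position_point(converted, frames)  # -> "a"
```

Returning `None` when no candidate survives mirrors the document's handling of tracking failure, where repeated failures mark the initial perspective matrix as abnormal.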
Optionally, if multiple consecutive target points cannot be successfully tracked after pre-calibration, that is, no matched position point can be confirmed for them, the initial perspective matrix is considered abnormal and needs to be pre-calibrated again.
According to the above technical scheme, candidate position points possibly matching the target point are first screened out by retaining, among the position points of the radar frame image matched with the target video frame image, those whose distance from the corresponding conversion coordinate is smaller than or equal to the preset distance. The number of occurrences of each target position point in the first number of radar frame images is then counted, and when it satisfies the confidence condition, that is, it is the greatest, that point is taken as the position point matched with the target point. This achieves an accurate determination of the matched position point and avoids the inaccuracy that chance matches would otherwise introduce.
S230, determining a deviation value of the target point matched with the initial perspective matrix.
Specifically, the first number of conversion coordinates of the target point and the first number of radar coordinates of its matched position point are acquired, and the average of the deviation values between them is determined as the deviation value average of the target point.
If any deviation value of the target point falls outside the normal-distribution range μ ± 3σ, it is eliminated and the average of the deviation values is recalculated, reducing the influence of gross errors. Finally, among the remaining deviation values, the one satisfying the deviation value average condition is determined as the deviation value matched with the initial perspective matrix. The deviation value average condition may be that, within the normal distribution, the deviation value has the smallest difference from the deviation value average; that is, the deviation value closest to the average is taken as the deviation value of the target point matching the initial perspective matrix.
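The μ ± 3σ elimination and the closest-to-average selection can be sketched as follows. The sample deviations are illustrative, and the helper name is an assumption:

```python
import numpy as np

def matched_deviation(deviations):
    """Average the per-frame deviation values, drop any value outside
    mu +/- 3*sigma, re-average, and return the remaining deviation
    closest to the cleaned average (the deviation value average
    condition)."""
    d = np.asarray(deviations, dtype=float)
    mu, sigma = d.mean(), d.std()
    kept = d[np.abs(d - mu) <= 3 * sigma] if sigma > 0 else d
    mean = kept.mean()
    return kept[np.argmin(np.abs(kept - mean))]

# Illustrative per-frame deviations with one gross error.
devs = [0.9, 1.0, 1.1, 0.95, 1.05, 1.0, 0.9, 1.1, 1.0, 0.95, 1.05, 100.0]
best = matched_deviation(devs)  # -> 1.0 (the 100.0 gross error is rejected)
```

Note that the 3σ rule can only reject an outlier when enough frames are available (with n samples a single outlier's z-score is at most (n − 1)/√n), so the first number of frames should not be too small.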
S240, recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix.
The process of selecting, according to the deviation values of the target points, the target points used for recalibration, and the process of obtaining the perspective matrix by recalibrating with those target points, have been described in the foregoing embodiments and are not repeated here.
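Although the selection of recalibration points is detailed in the earlier embodiment, its core filter, keeping target points whose deviation values have a small standard deviation, can be sketched as follows. The threshold 0.5 and the target identifiers are illustrative assumptions:

```python
import numpy as np

def select_recalibration_points(point_deviations, max_std=0.5):
    """Keep the target points whose deviation values are stable
    (standard deviation below a preset threshold); only these are
    used to recalibrate the perspective matrix."""
    return [pid for pid, devs in point_deviations.items()
            if np.std(devs) < max_std]

# Illustrative per-target deviation histories.
tracks = {
    "target_1": [1.0, 1.1, 0.9, 1.0],   # stable -> kept
    "target_2": [0.5, 3.0, 0.2, 2.4],   # jittery -> excluded
}
stable = select_recalibration_points(tracks)  # -> ["target_1"]
```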
S250, repeatedly performing the operations of determining the deviation value matching each target point to the intermediate perspective matrix and recalibrating the intermediate perspective matrix according to those deviation values, until the intermediate perspective matrix is determined to meet a matrix precision condition, and then taking the intermediate perspective matrix as the target perspective matrix.
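The matrix precision condition used as the stopping criterion can be read, per the apparatus embodiment, as requiring that the deviation value target values (average and standard deviation) change by no more than a preset difference between successive matrices. A minimal sketch; the tolerance 0.05 is an illustrative assumption:

```python
def precision_condition_met(prev_stats, cur_stats, max_delta=0.05):
    """Stop iterating once the deviation value target values (average
    and standard deviation) barely change between the previous
    perspective matrix and the current intermediate one."""
    (mu_prev, sd_prev), (mu_cur, sd_cur) = prev_stats, cur_stats
    return (abs(mu_prev - mu_cur) <= max_delta
            and abs(sd_prev - sd_cur) <= max_delta)

# Illustrative: a 0.02 change in both statistics is within tolerance.
converged = precision_condition_met((1.00, 0.30), (0.98, 0.28))  # -> True
```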
According to the technical scheme of this embodiment, an initial perspective matrix between a video coordinate system and a radar coordinate system, obtained through pre-calibration, is first acquired. The position point matched with the conversion coordinates of the target point is then determined in each radar frame image according to each conversion coordinate of the target point and the position points in the first number of radar frame images, accurately identifying the matched position point and avoiding the inaccuracy caused by chance matches. The target point is determined to meet the target tracking condition when the deviation values between its first number of conversion coordinates and the first number of radar coordinates of its matched position point conform to a normal distribution, after which the deviation value matched with the initial perspective matrix is determined, ensuring that tracking of the target point is stable. The initial perspective matrix is recalibrated according to the deviation values of the target points to obtain an intermediate perspective matrix. Finally, the operations of determining the deviation values matching the target points to the intermediate perspective matrix and recalibrating it are performed repeatedly until the intermediate perspective matrix meets the matrix precision condition, and it is taken as the target perspective matrix, reducing the calibration cost of the perspective matrix and improving its precision.
Example III
Fig. 3 is a schematic structural diagram of a perspective matrix determining apparatus according to an embodiment of the present application. As shown in fig. 3, the apparatus includes:
a first matrix determining module 310, configured to obtain an initial perspective matrix between a video coordinate system and a radar coordinate system that are calibrated in advance;
a deviation value determining module 320, configured to determine a deviation value of the target point matching the initial perspective matrix if it is determined that the target point meets a target tracking condition;
a second matrix determining module 330, configured to recalibrate the initial perspective matrix according to the deviation values of the target points, to obtain an intermediate perspective matrix;
and the target matrix determining module 340 is configured to repeatedly perform the operation of determining the deviation value of the matching of the target point and the intermediate perspective matrix, and recalibrate the intermediate perspective matrix according to the deviation value of each target point until it is determined that the intermediate perspective matrix meets the matrix precision condition, and take the intermediate perspective matrix as the target perspective matrix.
Optionally, the initial perspective matrix is obtained by calibrating in advance according to radar coordinates and video coordinates of at least four calibration points.
Optionally, the deviation value determining module includes a first condition determining unit, specifically configured to:
determining video coordinates of the target point in a first number of consecutive video frame images;
according to the initial perspective matrix, carrying out coordinate transformation on a first number of video coordinates of the target point to obtain transformed coordinates matched with each video coordinate;
determining a position point matched with the conversion coordinates of the target point in each radar frame image according to each conversion coordinate of the target point and each position point in the first number of radar frame images;
if the deviation values of the first number of conversion coordinates of the target point and the first number of radar coordinates of the position point matched with the target point are determined to be in accordance with normal distribution, the target point is determined to meet the target tracking condition.
Optionally, the first condition determining unit includes a position point determining unit, specifically configured to:
determine, from the position points of the radar frame image matched with the target video frame image, the position points whose distance from the corresponding conversion coordinate is smaller than or equal to a preset distance;
and if the number of times the target position point occurs in the first number of radar frame images satisfies the confidence condition, take the target position point as the position point matched with the target point.
Optionally, the deviation value determining module includes a deviation value determining unit, specifically configured to:
determining an average value of the deviation values of the target points according to the first number of conversion coordinates of the target points and the deviation values of the first number of radar coordinates of the position points matched with the target points;
and determining the deviation value meeting the average value condition of the deviation values as the deviation value matched with the initial perspective matrix in the deviation values of the target points.
Optionally, the second matrix determining module is specifically configured to:
determining the standard deviation of the deviation values of each target point according to those deviation values;
and recalibrating the initial perspective matrix according to the target point with the standard deviation smaller than the preset standard deviation.
Optionally, the target matrix determining module includes a second condition determining unit, specifically configured to:
calculating a difference value between the deviation value target value of each target point matched with the previous perspective matrix of the intermediate perspective matrix and the deviation value target value of each target point matched with the intermediate perspective matrix;
and if the difference value is smaller than or equal to a preset difference value, determining that the intermediate perspective matrix meets the matrix precision condition;
wherein the deviation value target value comprises a deviation value average value and a deviation value standard deviation.
The device for determining the perspective matrix provided by the embodiment of the application can execute the method for determining the perspective matrix provided by any embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 4 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Examples of the processor 11 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various processors running machine learning model algorithms, digital signal processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the various methods and processes described above, such as the method for determining a perspective matrix.
In some embodiments, the method of determining the perspective matrix may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the above-described method of determining a perspective matrix may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the method of determination of the perspective matrix in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present application may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present application, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, a host product in a cloud computing service system that overcomes the defects of difficult management and weak service scalability found in traditional physical hosts and VPS (virtual private server) services.
It should be appreciated that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present application are achieved; this is not limited herein.
The above embodiments do not limit the scope of the present application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application should be included in the scope of the present application.

Claims (10)

1. A method for determining a perspective matrix, comprising:
acquiring an initial perspective matrix between a video coordinate system and a radar coordinate system, which are calibrated in advance;
if the target point is determined to meet the target tracking condition, determining a deviation value of the target point matched with the initial perspective matrix;
recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix;
repeatedly executing the operation of determining the deviation value of the matching of the target point and the intermediate perspective matrix, and recalibrating the intermediate perspective matrix according to the deviation value of each target point until the intermediate perspective matrix is determined to meet the matrix precision condition, wherein the intermediate perspective matrix is used as a target perspective matrix.
2. The method of claim 1, wherein the initial perspective matrix is pre-calibrated based on radar coordinates and video coordinates of at least four calibration points.
3. The method of claim 1, wherein determining that the target point satisfies the target tracking condition comprises:
determining video coordinates of the target point in a first number of consecutive video frame images;
according to the initial perspective matrix, carrying out coordinate transformation on a first number of video coordinates of the target point to obtain transformed coordinates matched with each video coordinate;
determining a position point matched with the conversion coordinates of the target point in each radar frame image according to each conversion coordinate of the target point and each position point in the first number of radar frame images;
if the deviation values of the first number of conversion coordinates of the target point and the first number of radar coordinates of the position point matched with the target point are determined to be in accordance with normal distribution, the target point is determined to meet the target tracking condition.
4. A method according to claim 3, wherein determining a location point in each radar frame image that matches the transformed coordinates of the target point based on each transformed coordinate of the target point and each location point in the first number of radar frame images comprises:
determining, from the position points of the radar frame image matched with the target video frame image, the position points whose distance from the corresponding conversion coordinate of the target point is smaller than or equal to a preset distance;
and if the number of times the target position point occurs in the first number of radar frame images is determined to satisfy a confidence condition, taking the target position point as the position point matched with the target point.
5. A method according to claim 3, wherein determining a deviation value for the target point matching the initial perspective matrix comprises:
determining an average value of the deviation values of the target points according to the first number of conversion coordinates of the target points and the deviation values of the first number of radar coordinates of the position points matched with the target points;
and determining the deviation value meeting the average value condition of the deviation values as the deviation value matched with the initial perspective matrix in the deviation values of the target points.
6. The method of claim 1, wherein recalibrating the initial perspective matrix based on the deviation values of each target point comprises:
determining standard deviation of the deviation value of each target point according to the deviation value of each target point;
and recalibrating the initial perspective matrix according to the target point with the standard deviation smaller than the preset standard deviation.
7. The method of claim 1, wherein determining that the intermediate perspective matrix satisfies a matrix accuracy condition comprises:
calculating a difference value between a deviation value target value of each target point matched with the previous perspective matrix of the intermediate perspective matrix and a deviation value target value of each target point matched with the intermediate perspective matrix;
if the difference value is smaller than or equal to the preset difference value, determining that the intermediate perspective matrix meets a matrix precision condition;
wherein the deviation value target value comprises a deviation value average value and a deviation value standard deviation.
8. A device for determining a perspective matrix, comprising:
the first matrix determining module is used for acquiring an initial perspective matrix between a video coordinate system and a radar coordinate system which are calibrated in advance;
the deviation value determining module is used for determining a deviation value of the target point matched with the initial perspective matrix if the target point is determined to meet the target tracking condition;
the second matrix determining module is used for recalibrating the initial perspective matrix according to the deviation value of each target point to obtain an intermediate perspective matrix;
and the target matrix determining module is used for repeatedly executing the operation of determining the deviation value of the matching of the target point and the intermediate perspective matrix, and recalibrating the intermediate perspective matrix according to the deviation value of each target point until the intermediate perspective matrix is determined to meet the matrix precision condition, and taking the intermediate perspective matrix as a target perspective matrix.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of determining a perspective matrix according to any of claims 1-7 when executing the program.
10. A storage medium storing computer executable instructions which, when executed by a computer processor, are adapted to perform the method of determining a perspective matrix according to any of claims 1-7.
CN202211500222.0A 2022-11-28 2022-11-28 Determination method and device of perspective matrix, electronic equipment and storage medium Pending CN116664638A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211500222.0A CN116664638A (en) 2022-11-28 2022-11-28 Determination method and device of perspective matrix, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116664638A true CN116664638A (en) 2023-08-29

Family

ID=87717704



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination