CN111028287A - Method and device for determining transformation matrix of radar coordinates and camera coordinates - Google Patents

Method and device for determining transformation matrix of radar coordinates and camera coordinates

Info

Publication number
CN111028287A
CN111028287A (application CN201811173018.6A; granted as CN111028287B)
Authority
CN
China
Prior art keywords
coordinates
camera
coordinate
sample
radar
Prior art date
Legal status
Granted
Application number
CN201811173018.6A
Other languages
Chinese (zh)
Other versions
CN111028287B (en)
Inventor
汤琦 (Tang Qi)
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811173018.6A
Publication of CN111028287A
Application granted
Publication of CN111028287B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/41: Details of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10: Complex mathematical operations
    • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization

Abstract

The invention discloses a method and a device for determining a conversion matrix of radar coordinates and camera coordinates, and belongs to the field of intelligent monitoring. The method comprises the following steps: acquiring sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bullet camera at a plurality of different time points; for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range, wherein the detection area of the radar bullet camera is divided into a plurality of sub-areas and each sub-area corresponds to one preset coordinate range; and determining a conversion matrix of the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates. By adopting the invention, the accuracy of coordinate conversion through the conversion matrix can be improved.

Description

Method and device for determining transformation matrix of radar coordinates and camera coordinates
Technical Field
The invention relates to the field of intelligent monitoring, in particular to a method and a device for determining a conversion matrix of radar coordinates and camera coordinates.
Background
The radar bullet camera is a device formed by combining a radar component and a bullet camera, the bullet camera being a fixed monitoring camera; the combined radar bullet camera can be used for intelligent monitoring, detection, speed measurement, and the like.
Taking an intelligent monitoring scene as an example: during monitoring, if a moving object (i.e., a detected object) appears in the detection range of the radar bullet camera, the radar bullet camera can determine the radar coordinates of the detected object in the radar coordinate system established by the radar component and the first camera coordinates of the detected object in the camera coordinate system established by the bullet camera. The determined radar coordinates are then converted into second camera coordinates through a prestored conversion relation, and the second camera coordinates are compared with the first camera coordinates; if the second camera coordinates match the first camera coordinates, the detected object is determined to be a monitoring target. In this way, errors caused by detecting the object with the bullet camera alone can be reduced, and the accuracy of detecting the detected object is improved.
The conversion relation stored in advance in the radar component may be a product operation of the radar coordinates and a conversion matrix, and the conversion matrix may be calculated in advance and then stored in the radar component. When the conversion matrix is calculated, it can be solved from a plurality of randomly selected sample radar coordinates to be used and the corresponding sample camera coordinates. Each sample radar coordinate and the corresponding sample camera coordinate correspond to a calibration feature point, which is usually a point calibrated on the detected object in a camera image; all calibration feature points have the same physical meaning. For example, each calibration feature point may be the midpoint of the lower edge line of the detection frame of a target person in a camera image.
In the process of implementing the invention, the inventor finds that the prior art has at least the following problems:
when a sample radar coordinate to be used and the corresponding sample camera coordinate are selected randomly, the selected sample camera coordinates to be used may be concentrated in one part of the detection area of the camera. Because the bullet camera exhibits obvious edge distortion, fitting such concentrated sample camera coordinates to be used and the corresponding sample radar coordinates may yield a poor fit; when coordinates located at the edge are converted through the conversion matrix obtained by this fitting, a large error may be produced, and the accuracy of the coordinate conversion is therefore low.
Disclosure of Invention
In order to solve the problems of the prior art, embodiments of the present invention provide a method and apparatus for determining a transformation matrix of radar coordinates and camera coordinates. The technical scheme is as follows:
in a first aspect, there is provided a method of determining a transformation matrix of radar coordinates and camera coordinates, the method comprising:
acquiring sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bullet camera at a plurality of different time points;
for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range, wherein a detection area of the radar bullet camera is divided into a plurality of sub-areas, and each sub-area corresponds to one preset coordinate range;
and determining a conversion matrix of the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.
Optionally, for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range, where the selecting includes:
for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range;
determining the total number of the selected sample camera coordinates to be used, if the total number is smaller than a preset number, calculating the difference value between the total number and the preset number, and selecting the sample camera coordinates to be used with the number being the difference value from the unselected sample camera coordinates.
Optionally, the selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range includes:
and randomly selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range.
Optionally, the selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range includes:
and selecting the sample camera coordinate to be used with the minimum distance from the center position of the coordinate range from the sample camera coordinates in the coordinate range.
Optionally, the selecting, from the unselected sample camera coordinates, the sample camera coordinates to be used whose number is the difference value includes:
selecting, from the preset coordinate ranges, a number of coordinate ranges equal to the difference value, wherein each selected coordinate range comprises at least two sample camera coordinates;
in each selected coordinate range, an unselected sample camera coordinate is determined as a sample camera coordinate to be used.
In a second aspect, there is provided an apparatus for determining a transformation matrix of radar coordinates and camera coordinates, the apparatus comprising:
the acquisition module is used for acquiring sample camera coordinates and corresponding sample radar coordinates of a detected object detected by the radar bullet camera at a plurality of different time points;
the selection module is used for, for each preset coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range if a sample camera coordinate exists in the coordinate range, wherein the detection area of the radar bullet camera is divided into a plurality of sub-areas and each sub-area corresponds to one preset coordinate range;
and the determining module is used for determining the conversion matrix of the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.
Optionally, the selecting module is configured to:
for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range;
determining the total number of the selected sample camera coordinates to be used, if the total number is smaller than a preset number, calculating the difference value between the total number and the preset number, and selecting the sample camera coordinates to be used with the number being the difference value from the unselected sample camera coordinates.
Optionally, the selecting module is configured to:
and randomly selecting a sample camera coordinate to be used from the sample camera coordinates in the coordinate range.
Optionally, the selecting module is configured to:
and selecting the sample camera coordinate to be used with the minimum distance from the center position of the coordinate range from the sample camera coordinates in the coordinate range.
Optionally, the selecting module is configured to:
selecting, from the preset coordinate ranges, a number of coordinate ranges equal to the difference value, wherein each selected coordinate range comprises at least two sample camera coordinates;
in each selected coordinate range, an unselected sample camera coordinate is determined as a sample camera coordinate to be used.
In a third aspect, a computer device is provided, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the bus; the memory is used for storing a computer program; and the processor is used for executing the program stored in the memory to implement the method for determining a conversion matrix of radar coordinates and camera coordinates described in the first aspect above.
In a fourth aspect, there is provided a computer-readable storage medium having stored therein at least one instruction, which is loaded and executed by a processor to implement the method of determining a conversion matrix of radar coordinates and camera coordinates described in the first aspect above.
The technical scheme provided by the embodiment of the invention provides at least the following beneficial effects:
In the embodiment of the invention, the sample camera coordinates are selected according to the preset coordinate ranges, so that the selected sample camera coordinates are distributed more uniformly. This greatly reduces the situation in which the selected sample camera coordinates are all located at the central position of the display picture of the camera detection area, and reduces as far as possible the overfitting caused by an uneven selection of sample camera coordinates. The fitting error is thereby reduced, so the accuracy of the calculated conversion matrix can be improved, and with it the accuracy of coordinate conversion.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart of a method for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
FIG. 2 is an interface schematic diagram of a method for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
FIG. 3 is an interface schematic diagram of a method for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a method for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a method for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an apparatus for determining a transformation matrix of radar coordinates and camera coordinates according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The embodiment of the invention provides a method for determining a conversion matrix of radar coordinates and camera coordinates, which can be implemented by a computer device. The computer device may be a processing unit in the radar bullet camera, such as a processing unit shared by the entire radar bullet camera device, or a processing unit of the radar component in the radar bullet camera. The computer device may also be an electronic device with computing capability other than the radar bullet camera, such as the processing unit of a terminal; in this case, the computer device performs data interaction with the radar bullet camera to be calibrated, preferably via a network. The radar bullet camera can be formed by combining two independent devices, a radar component and a bullet camera, in which case the radar component and the bullet camera need to keep a fixed relative shooting angle. The radar bullet camera can also be one complete device, of which the radar component and the bullet camera are both parts. The radar component may be a two-dimensional radar, i.e., the radar coordinates obtained by the radar component are two-dimensional coordinates. The present embodiment is described by taking as an example a radar bullet camera formed by combining two complete devices, namely a radar component and a bullet camera.
As shown in fig. 1, the processing flow of the method may include the following steps:
in step 101, sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bolt face camera at a plurality of different time points are acquired.
In one possible embodiment, the radar bullet camera may be used for intelligent monitoring, speed measurement, and the like. In use, the radar component gives the radar coordinates of a detected object in the radar coordinate system it establishes, and the bullet camera gives the first camera coordinates in the camera coordinate system it establishes; the radar coordinates are converted into second camera coordinates through the stored conversion relation. The radar bullet camera then compares the first camera coordinates with the second camera coordinates, and if the second camera coordinates match the first camera coordinates, the detected object is determined to be a monitoring target. Through the mutual verification of the two measurement components, the accuracy of detecting the monitored target can be improved.
It should be noted that, if the radar bullet camera is formed by combining two complete devices, namely a radar component and a bullet camera, then during detection the radar component gives the radar coordinates of the detected object and the bullet camera gives the first camera coordinates. The radar component converts the radar coordinates of the detected object into second camera coordinates and sends them to the bullet camera; the bullet camera compares the first camera coordinates of the detected object with the received second camera coordinates, and if the second camera coordinates match the first camera coordinates, the detected object is determined to be a monitoring target.
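As a rough illustration of the comparison step above, the match check between the first camera coordinates (from the bullet camera) and the second camera coordinates (converted from radar coordinates) might be sketched as follows; the pixel tolerance is an illustrative assumption, not a value specified by the patent:

```python
import math

def coordinates_match(first, second, tolerance=20.0):
    """Return True when the first camera coordinates (from the bullet
    camera) and the second camera coordinates (converted from radar
    coordinates) lie within a pixel tolerance of each other.
    The tolerance value is illustrative."""
    return math.dist(first, second) <= tolerance

# A detected object is confirmed as a monitoring target only when both
# measurement components agree on its position.
print(coordinates_match((320.0, 240.0), (325.0, 243.0)))  # True
print(coordinates_match((320.0, 240.0), (500.0, 400.0)))  # False
```

The choice of a distance threshold is one simple matching criterion; a real system might additionally compare motion direction or timestamps.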
When the radar coordinates are converted into second camera coordinates, the conversion needs to be performed through a conversion matrix, which may be pre-calculated and stored in the radar bullet camera. The conversion matrix may be calculated from sample camera coordinates and the corresponding sample radar coordinates, so these may be determined before the conversion matrix is computed.
After the radar bullet camera is installed, a detected object appearing in its shooting range is identified by both the radar component and the bullet camera, each of which outputs a detection result. The detection result output by the radar component is usually a radar image displayed on a first display screen of the computer device, as shown in fig. 2; the radar image includes a point representing the detected object and the coordinates of that point. The detection result output by the bullet camera is usually the camera image captured by the bullet camera, displayed on a second display screen of the computer device, as shown in fig. 3; the camera image may further include a detection frame identifying the detected object, and the coordinates of the four vertices of the detection frame are output.
A technician may mark a calibration feature point in the camera image. The calibration feature point may be calibrated manually: for example, the technician clicks a point in the camera image with a mouse, and the computer device automatically provides the coordinates of that point. The calibration feature point may also be generated automatically by the computer device through an algorithm: for example, the midpoint of the line connecting the lower-left vertex and the lower-right vertex of the detection frame may be selected automatically. After the calibration feature point is selected, its camera coordinates are determined as a sample camera coordinate, the sample radar coordinates determined by the radar component are obtained, and the sample camera coordinate of the calibration feature point and the corresponding sample radar coordinate are stored as a coordinate pair.
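The automatically generated calibration feature point described above, the midpoint of the line connecting the lower-left and lower-right vertices of the detection frame, can be computed directly from the frame. A minimal sketch, assuming the frame is given as axis-aligned bounds in image coordinates with y increasing downward:

```python
def calibration_feature_point(box):
    """box: (x_min, y_min, x_max, y_max) of the detection frame in image
    coordinates (y grows downward). Returns the midpoint of the lower
    edge line, i.e. the midpoint between the lower-left vertex
    (x_min, y_max) and the lower-right vertex (x_max, y_max)."""
    x_min, y_min, x_max, y_max = box
    return ((x_min + x_max) / 2.0, y_max)

# Detection frame of a target person: the calibration feature point sits
# at the middle of its bottom edge.
print(calibration_feature_point((100, 50, 180, 250)))  # (140.0, 250)
```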
And repeating the process, obtaining coordinate pairs formed by the sample camera coordinates detected at a plurality of different time points and the corresponding sample radar coordinates, and storing the obtained coordinate pairs. When calculating the transformation matrix, a plurality of coordinate pairs stored in advance are fetched for subsequent use.
In step 102, for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, at least one sample camera coordinate to be used is selected from the sample camera coordinates in the coordinate range.
The detection area of the radar bullet camera is divided into a plurality of sub-areas, and each sub-area corresponds to one preset coordinate range.
In one possible embodiment, for subsequent use, the technician may divide the detection area that can be captured by the radar bullet camera into a plurality of sub-areas in advance and then record the coordinate range of each sub-area. Preferably, a coordinate range may be represented by the coordinates of the four vertices of the corresponding sub-area, as shown in fig. 4. The detection area of the radar bullet camera is the overlap between the area that can be detected by the radar component and the area that can be captured by the bullet camera; generally, it can be taken to be the area captured by the bullet camera, i.e., the image frame of the bullet camera. The sub-areas may be divided evenly or unevenly. A technician can mark out the sub-areas manually, for example by dragging a mouse on the display screen to draw virtual lines as the edge lines of the sub-areas within the area captured by the bullet camera; alternatively, program code for dividing the area automatically may be stored in the computer device in advance and run to divide the sub-areas. The sub-areas divided in the above manner are all rectangular. The above embodiments are merely examples of ways to divide the sub-areas and are not intended to limit the present invention.
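The automatic, even division into rectangular sub-areas described above can be sketched as follows; the frame size and grid shape in the example are illustrative assumptions, not values from the patent:

```python
def divide_detection_area(width, height, rows, cols):
    """Divide a width x height camera frame into rows x cols rectangular
    sub-areas and return the coordinate range of each sub-area as
    (x_min, y_min, x_max, y_max)."""
    cell_w = width / cols
    cell_h = height / rows
    ranges = []
    for r in range(rows):
        for c in range(cols):
            ranges.append((c * cell_w, r * cell_h,
                           (c + 1) * cell_w, (r + 1) * cell_h))
    return ranges

# A 1920x1080 frame divided evenly into a 3x4 grid gives 12 ranges.
grid = divide_detection_area(1920, 1080, 3, 4)
print(len(grid))   # 12
print(grid[0])     # (0.0, 0.0, 480.0, 360.0)
```

An uneven division would simply store a hand-drawn list of such ranges instead of generating them.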
After dividing the detection area of the radar bullet camera into sub-areas, the technician stores the coordinate range corresponding to each sub-area. When the coordinate ranges need to be used, the plurality of preset coordinate ranges are obtained, and each coordinate range is matched against the sample camera coordinates. If at least one sample camera coordinate exists in a certain coordinate range, as shown in fig. 5, the number of sample camera coordinates in that range can be recorded, and one of them is selected as a sample camera coordinate to be used. If no sample camera coordinate exists in a certain coordinate range, the number for that range can be recorded as 0, the range is skipped, and the next coordinate range is matched against the sample camera coordinates.
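The matching of sample camera coordinates against the preset coordinate ranges, recording a count per range and selecting one coordinate from each non-empty range, can be sketched as below; the random choice mirrors the first selection mode described later, and the example data are illustrative:

```python
import random

def select_per_range(ranges, samples, rng=random):
    """ranges: list of (x_min, y_min, x_max, y_max); samples: list of
    (x, y) sample camera coordinates. Returns (counts, selected), where
    counts[i] is the number of samples falling in ranges[i] and selected
    holds one randomly chosen sample per non-empty range."""
    counts = [0] * len(ranges)
    buckets = [[] for _ in ranges]
    for pt in samples:
        for i, (x0, y0, x1, y1) in enumerate(ranges):
            if x0 <= pt[0] < x1 and y0 <= pt[1] < y1:
                counts[i] += 1
                buckets[i].append(pt)
                break  # each sample belongs to exactly one sub-area
    selected = [rng.choice(b) for b in buckets if b]  # skip empty ranges
    return counts, selected

ranges = [(0, 0, 50, 50), (50, 0, 100, 50)]
samples = [(10, 10), (20, 30), (60, 20)]
counts, selected = select_per_range(ranges, samples)
print(counts)  # [2, 1]
```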
Optionally, in order to distribute the sample camera coordinates to be used more uniformly over the detection area of the radar bullet camera, one sample camera coordinate to be used may first be selected in each coordinate range. The corresponding processing steps may be as follows: for each preset coordinate range, if a sample camera coordinate exists in the coordinate range, select one sample camera coordinate to be used from the sample camera coordinates in that range; then determine the total number of selected sample camera coordinates to be used, and if the total number is smaller than a preset number, calculate the difference between the total number and the preset number and select that many additional sample camera coordinates to be used from the unselected sample camera coordinates.
In a possible embodiment, each preset coordinate range is matched against the plurality of sample camera coordinates; if at least one of the sample camera coordinates falls within the coordinate range, one sample camera coordinate to be used is selected from them. If none of the sample camera coordinates falls within the coordinate range, the coordinate range is skipped and the next coordinate range is matched against the sample camera coordinates.
After all the coordinate ranges have been matched against the sample camera coordinates, the total number of currently selected sample camera coordinates to be used is determined and compared with the preset number. If the total number equals the preset number, the selection has already reached the preset number and no further selection is needed. If the total number is smaller than the preset number, further selection is needed: the difference between the total number and the preset number is calculated, and this difference is the number of additional sample camera coordinates to be selected. After the difference is obtained, that many sample camera coordinates are selected from the unselected sample camera coordinates as sample camera coordinates to be used. Finally, all selected sample camera coordinates together form the set of sample camera coordinates to be used, and the selection step is complete.
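The top-up step above, which fills the gap between the per-range selection and the preset number from coordinates not yet selected, might look like this sketch; it assumes the per-range buckets and the already-selected coordinates are passed in, and takes at most one extra coordinate per range, matching the variant described below:

```python
def top_up_selection(buckets, selected, preset_number):
    """buckets: per-range lists of sample camera coordinates; selected:
    coordinates already chosen (one per non-empty range). If fewer than
    preset_number were chosen, take the difference from ranges that still
    hold unselected coordinates (i.e. ranges with at least two samples),
    one extra coordinate per such range."""
    difference = preset_number - len(selected)
    result = list(selected)
    for bucket in buckets:
        if difference <= 0:
            break
        leftovers = [pt for pt in bucket if pt not in result]
        if leftovers:  # this range has an unselected coordinate
            result.append(leftovers[0])
            difference -= 1
    return result

buckets = [[(10, 10), (20, 30)], [(60, 20)]]
already = [(10, 10), (60, 20)]
print(top_up_selection(buckets, already, 3))  # adds (20, 30)
```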
It should be noted that when the technician sets the preset number, it is usually set equal to the number of coordinate ranges, so the total number of sample camera coordinates to be used selected in the above step is usually not greater than the preset number. If for some reason the total number selected in the above step is greater than the preset number, the preset number of sample camera coordinates is selected randomly from the currently selected sample camera coordinates to be used.
Optionally, after the difference between the total number of sample camera coordinates to be used and the preset number has been calculated in the above steps, the following selection method may be adopted so that the distribution of the selected sample camera coordinates is more uniform: select, from the preset coordinate ranges, a number of coordinate ranges equal to the difference, each selected coordinate range including at least two sample camera coordinates, and in each selected coordinate range determine one unselected sample camera coordinate as a sample camera coordinate to be used.
In a possible embodiment, after the difference between the total number of selected sample camera coordinates to be used and the preset number has been calculated through the above steps, coordinate ranges whose number equals the difference may be selected from those coordinate ranges that include at least two sample camera coordinates; the ranges may be picked randomly, or a technician may manually pick more representative ones. Then, one unselected sample camera coordinate is selected from each chosen coordinate range as a sample camera coordinate to be used.
Optionally, when one sample camera coordinate to be used is selected from the sample camera coordinates in one coordinate range in the above steps, the selection modes are various, and several selection modes are exemplified below.
In the first mode, one sample camera coordinate to be used is randomly selected from sample camera coordinates in a coordinate range.
In a possible embodiment, for a certain coordinate range in which sample camera coordinates exist, one sample camera coordinate is randomly selected as a sample camera coordinate to be used from among at least one sample camera coordinate in the coordinate range. Thus, the randomly selected sample camera coordinates are more representative.
In the second mode, the sample camera coordinate to be used with the minimum distance from the center position of the coordinate range is selected from the sample camera coordinates in the coordinate range.
In one possible embodiment, for a coordinate range in which sample camera coordinates exist, the center position of the coordinate range is determined. The center position can be calculated from the coordinate range; for example, if the four vertices identifying the coordinate range are (x1, y1), (x1, y2), (x2, y1) and (x2, y2), the center position of the coordinate range may be ((x1 + x2)/2, (y1 + y2)/2).
After the center position is determined, the distance between each sample camera coordinate in the range and the center position is calculated, and the sample camera coordinate with the smallest distance is selected as the sample camera coordinate to be used. Sample camera coordinates selected in this way are more evenly distributed.
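The center-of-range computation and the nearest-to-center selection just described can be sketched as follows. This is a minimal illustration; the function names and the half-open in-range test are my assumptions:

```python
import math

def center_of_range(x1, y1, x2, y2):
    # Center of a rectangular coordinate range with vertices
    # (x1, y1), (x1, y2), (x2, y1), (x2, y2).
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def select_nearest_to_center(samples, x1, y1, x2, y2):
    # From the sample camera coordinates that fall inside the range,
    # pick the one with the smallest distance to the range's center.
    cx, cy = center_of_range(x1, y1, x2, y2)
    in_range = [(u, v) for (u, v) in samples
                if x1 <= u < x2 and y1 <= v < y2]
    if not in_range:
        return None  # no sample camera coordinate in this range
    return min(in_range, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
```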
In step 103, a conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera is determined based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.
In one possible embodiment, after the sample camera coordinates to be used and the corresponding sample radar coordinates have been selected through the above steps, the conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera is calculated from them.

Optionally, the conversion matrix may be calculated as follows:
Assume the sample radar coordinates are (x, y), the sample camera coordinates are (u, v), and the conversion matrix is T, a 3 × 3 homography matrix to be solved. The conversion of sample radar coordinates into sample camera coordinates by T can be written as formula (1):

$$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = T \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} t_{11} & t_{12} & t_{13} \\ t_{21} & t_{22} & t_{23} \\ t_{31} & t_{32} & t_{33} \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \quad (1)$$

where s is a nonzero scale factor.
The 9 unknowns of the matrix T can be reduced to 8 by eliminating the scale factor: dividing the first row of (1) by the third expresses u in terms of x and y, as formula (2):

$$u = \frac{t_{11}x + t_{12}y + t_{13}}{t_{31}x + t_{32}y + t_{33}} \quad (2)$$

and v is obtained analogously from the second row.
Through the above transformation, $t_{33}$ can be set to 1, and the remaining 8 unknowns are fitted from the selected sample radar coordinates and the corresponding sample camera coordinates. Let $T_i = [t_{i1}\ t_{i2}\ t_{i3}]'$, $U = [u_1\ u_2 \ldots u_n]$, $V = [v_1\ v_2 \ldots v_n]$, $I_{n\times 1} = [1\ 1 \ldots 1]'$, $X = [x_1\ x_2 \ldots x_n]'$ and $Y = [y_1\ y_2 \ldots y_n]'$. With $t_{33} = 1$, each pair of sample coordinates yields two linear equations,

$$t_{11}x_i + t_{12}y_i + t_{13} - u_i(t_{31}x_i + t_{32}y_i) = u_i,$$
$$t_{21}x_i + t_{22}y_i + t_{23} - v_i(t_{31}x_i + t_{32}y_i) = v_i,$$

which stack into the linear system $A\,h = b$ with

$$A = \begin{bmatrix} X & Y & I_{n\times 1} & 0 & 0 & 0 & -\operatorname{diag}(U)X & -\operatorname{diag}(U)Y \\ 0 & 0 & 0 & X & Y & I_{n\times 1} & -\operatorname{diag}(V)X & -\operatorname{diag}(V)Y \end{bmatrix}, \qquad b = \begin{bmatrix} U' \\ V' \end{bmatrix}.$$

The transformation matrix $T = [T_1'\ T_2'\ T_3']'$ (with $T_3 = [t_{31}\ t_{32}\ 1]'$) is then obtained from the least-squares solution of formula (3):

$$h = [t_{11}\ t_{12}\ t_{13}\ t_{21}\ t_{22}\ t_{23}\ t_{31}\ t_{32}]' = (A'A)^{-1}A'\,b \quad (3)$$
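With $t_{33}$ fixed to 1, each radar-camera sample pair contributes two linear equations in the remaining eight unknowns, which can be fitted by least squares. A minimal NumPy sketch follows; the function names and the use of `numpy.linalg.lstsq` are assumptions, not taken from the patent:

```python
import numpy as np

def fit_homography(radar_pts, camera_pts):
    # radar_pts, camera_pts: lists of (x, y) and (u, v) pairs.
    # With t33 fixed to 1, each pair gives two linear equations:
    #   t11*x + t12*y + t13 - u*(t31*x + t32*y) = u
    #   t21*x + t22*y + t23 - v*(t31*x + t32*y) = v
    A, b = [], []
    for (x, y), (u, v) in zip(radar_pts, camera_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                            rcond=None)
    return np.append(h, 1.0).reshape(3, 3)  # restore t33 = 1

def radar_to_camera(T, x, y):
    # Apply the homography: homogeneous multiply, then divide by scale.
    p = T @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

At least four non-degenerate point pairs (no three collinear) are needed for a unique solution; with more pairs the system is solved in the least-squares sense.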
Preferably, after conversion matrices are obtained through the calculation in the above steps, each calculated conversion matrix may be evaluated in order to improve the accuracy of the finally determined matrix. The corresponding processing is as follows: the sample radar coordinates are converted into sample camera coordinates by the conversion matrix, and the converted coordinates are compared with the pre-stored sample camera coordinates corresponding to the same sample radar coordinates. Let the converted sample camera coordinates be $(u_1', v_1'), (u_2', v_2'), \ldots, (u_n', v_n')$ and the pre-stored sample camera coordinates be $(u_1, v_1), (u_2, v_2), \ldots, (u_n, v_n)$. The error between each converted sample camera coordinate and its pre-stored counterpart is calculated, and the errors are combined into one error vector, as in formula (4):

$$D = \left[\,\sqrt{(u_1'-u_1)^2 + (v_1'-v_1)^2}\ \ \ldots\ \ \sqrt{(u_n'-u_n)^2 + (v_n'-v_n)^2}\,\right] \quad (4)$$
After the error vector is obtained, its average value and standard deviation are calculated, and a performance evaluation value of the conversion matrix is computed from them. The performance evaluation value represents the difference between the converted sample camera coordinates and the pre-stored ones: the smaller the value, the smaller the difference, i.e. the higher the accuracy of the conversion matrix. Preferably, the performance evaluation value is the average of the error vector plus three times its standard deviation, as in formula (5):

E = avg(D) + 3 × std(D)    (5)
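The evaluation of a candidate matrix by the error vector of formula (4) and the score of formula (5) can be sketched as follows. The function name is hypothetical, and the per-sample error is assumed to be the Euclidean distance between converted and stored coordinates:

```python
import numpy as np

def evaluate_transform(T, radar_pts, camera_pts):
    # Build the error vector D: per-sample distance between the converted
    # camera coordinates and the pre-stored camera coordinates, then score
    # the matrix as E = avg(D) + 3 * std(D).
    errors = []
    for (x, y), (u, v) in zip(radar_pts, camera_pts):
        p = T @ np.array([x, y, 1.0])
        u2, v2 = p[0] / p[2], p[1] / p[2]
        errors.append(np.hypot(u2 - u, v2 - v))
    D = np.asarray(errors)
    return float(np.mean(D) + 3.0 * np.std(D))
```

A smaller value indicates a conversion matrix whose converted coordinates sit closer to the stored sample camera coordinates.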
A performance evaluation value is calculated and stored for the conversion matrix according to the above steps. The processing of steps 101 to 103 is then repeated according to a random sample consensus (RANSAC) algorithm: each repeated trial yields a conversion matrix, whose performance evaluation value is calculated and stored as above. Once the number of repeated trials reaches a preset number, the repetition stops, the performance evaluation values of all obtained conversion matrices are compared, and the conversion matrix with the smallest value is selected as the final conversion matrix and stored in the radar component. In use, the radar coordinates obtained by the radar component can then be converted into camera coordinates according to formula (1).
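The repeated-trial selection described above can be sketched as a small RANSAC-style loop. The function names are hypothetical: `trial_fn` stands for one pass of the sample-selection and fitting steps, and `score_fn` for the evaluation by formulas (4) and (5):

```python
import random

def ransac_select(trial_fn, score_fn, num_trials, rng=random):
    # Repeat sample selection + fitting a preset number of times and keep
    # the conversion matrix with the smallest performance evaluation value.
    best_T, best_E = None, float("inf")
    for _ in range(num_trials):
        T = trial_fn(rng)   # one trial: select samples, fit a matrix
        E = score_fn(T)     # performance evaluation value of the matrix
        if E < best_E:
            best_T, best_E = T, E
    return best_T, best_E
```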
In the embodiment of the invention, sample camera coordinates are selected according to preset coordinate ranges, so the selected sample camera coordinates are distributed more uniformly. This greatly reduces the chance that all selected coordinates lie near the center of the display picture of the camera detection area, mitigates the overfitting caused by non-uniformly distributed samples, and thereby reduces the fitting error, improving the accuracy of the calculated conversion matrix and hence of the coordinate conversion.
Based on the same technical concept, an embodiment of the present invention further provides an apparatus for determining a transformation matrix of radar coordinates and camera coordinates, which may be the computer device in the foregoing embodiments. As shown in Fig. 6, the apparatus includes an obtaining module 610, a selecting module 620 and a determining module 630.
The obtaining module 610 is configured to acquire sample camera coordinates and corresponding sample radar coordinates of a detected object detected by the radar bullet camera at a plurality of different time points;
the selecting module 620 is configured to, for each preset coordinate range, select at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range if sample camera coordinates exist in that range, wherein a detection area of the radar bullet camera is divided into a plurality of sub-areas and each sub-area corresponds to one preset coordinate range;
the determining module 630 is configured to determine a conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.
Optionally, the selecting module 620 is configured to:

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, select one sample camera coordinate to be used from the sample camera coordinates in the coordinate range;

determine the total number of the selected sample camera coordinates to be used, and if the total number is smaller than a preset number, calculate the difference between the total number and the preset number and select, from the unselected sample camera coordinates, a number of sample camera coordinates to be used equal to the difference.

Optionally, the selecting module 620 is configured to:

randomly select one sample camera coordinate to be used from the sample camera coordinates in the coordinate range.

Optionally, the selecting module 620 is configured to:

select, from the sample camera coordinates in the coordinate range, the sample camera coordinate with the smallest distance to the center position of the coordinate range as the sample camera coordinate to be used.

Optionally, the selecting module 620 is configured to:

select a number of coordinate ranges equal to the difference from the preset coordinate ranges, wherein each selected coordinate range contains at least two sample camera coordinates;

and in each selected coordinate range, determine one unselected sample camera coordinate as a sample camera coordinate to be used.
In the embodiment of the invention, sample camera coordinates are selected according to preset coordinate ranges, so the selected sample camera coordinates are distributed more uniformly. This greatly reduces the chance that all selected coordinates lie near the center of the display picture of the camera detection area, mitigates the overfitting caused by non-uniformly distributed samples, and thereby reduces the fitting error, improving the accuracy of the calculated conversion matrix and hence of the coordinate conversion.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
It should be noted that the apparatus provided in the foregoing embodiment is illustrated, when determining the conversion matrix of radar coordinates and camera coordinates, by the division into the functional modules described above. In practical applications, the functions may be assigned to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus embodiment and the method embodiment for determining the transformation matrix of radar coordinates and camera coordinates belong to the same concept; the specific implementation process is detailed in the method embodiment and is not repeated here.
Fig. 7 is a schematic structural diagram of a computer device 700 according to an embodiment of the present invention. The computer device 700 may vary considerably in configuration and performance, and may include one or more processors (CPUs) 701 and one or more memories 702, where the memory 702 stores at least one instruction that is loaded and executed by the processor 701 to implement the following method steps of determining a conversion matrix of radar coordinates and camera coordinates:
acquiring sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bullet camera at a plurality of different time points;

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range, wherein a detection area of the radar bullet camera is divided into a plurality of sub-areas, and each sub-area corresponds to one preset coordinate range;

and determining a conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.
Optionally, the at least one instruction is loaded and executed by the processor 701 to implement the following method steps:

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range;

determining the total number of the selected sample camera coordinates to be used, and if the total number is smaller than a preset number, calculating the difference between the total number and the preset number and selecting, from the unselected sample camera coordinates, a number of sample camera coordinates to be used equal to the difference.

Optionally, the at least one instruction is loaded and executed by the processor 701 to implement the following method step:

randomly selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range.

Optionally, the at least one instruction is loaded and executed by the processor 701 to implement the following method step:

selecting, from the sample camera coordinates in the coordinate range, the sample camera coordinate with the smallest distance to the center position of the coordinate range as the sample camera coordinate to be used.

Optionally, the at least one instruction is loaded and executed by the processor 701 to implement the following method steps:

selecting a number of coordinate ranges equal to the difference from the preset coordinate ranges, wherein each selected coordinate range contains at least two sample camera coordinates;

and in each selected coordinate range, determining one unselected sample camera coordinate as a sample camera coordinate to be used.
In the embodiment of the invention, sample camera coordinates are selected according to preset coordinate ranges, so the selected sample camera coordinates are distributed more uniformly. This greatly reduces the chance that all selected coordinates lie near the center of the display picture of the camera detection area, mitigates the overfitting caused by non-uniformly distributed samples, and thereby reduces the fitting error, improving the accuracy of the calculated conversion matrix and hence of the coordinate conversion.
In an exemplary embodiment, a computer-readable storage medium is further provided, in which at least one instruction is stored. The at least one instruction is loaded and executed by a processor to implement the method for determining a transformation matrix of radar coordinates and camera coordinates in the above embodiments. For example, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (12)

1. A method of determining a transformation matrix of radar coordinates and camera coordinates, the method comprising:

acquiring sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bullet camera at a plurality of different time points;

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range, wherein a detection area of the radar bullet camera is divided into a plurality of sub-areas, and each sub-area corresponds to one preset coordinate range;

and determining a conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera based on each selected sample camera coordinate to be used and the corresponding sample radar coordinates.

2. The method according to claim 1, wherein, for each preset coordinate range, if sample camera coordinates exist in the coordinate range, selecting at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range comprises:

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range;

determining the total number of the selected sample camera coordinates to be used, and if the total number is smaller than a preset number, calculating the difference between the total number and the preset number and selecting, from the unselected sample camera coordinates, a number of sample camera coordinates to be used equal to the difference.

3. The method of claim 2, wherein selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range comprises:

randomly selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range.

4. The method of claim 2, wherein selecting one sample camera coordinate to be used from the sample camera coordinates in the coordinate range comprises:

selecting, from the sample camera coordinates in the coordinate range, the sample camera coordinate with the smallest distance to the center position of the coordinate range as the sample camera coordinate to be used.

5. The method according to claim 2, wherein selecting, from the unselected sample camera coordinates, a number of sample camera coordinates to be used equal to the difference comprises:

selecting a number of coordinate ranges equal to the difference from the preset coordinate ranges, wherein each selected coordinate range contains at least two sample camera coordinates;

and in each selected coordinate range, determining one unselected sample camera coordinate as a sample camera coordinate to be used.
6. An apparatus for determining a transformation matrix of radar coordinates and camera coordinates, the apparatus comprising:

an acquisition module, configured to acquire sample camera coordinates and corresponding sample radar coordinates of a detected object detected by a radar bullet camera at a plurality of different time points;

a selection module, configured to, for each preset coordinate range, select at least one sample camera coordinate to be used from the sample camera coordinates in the coordinate range if sample camera coordinates exist in that range, wherein a detection area of the radar bullet camera is divided into a plurality of sub-areas, and each sub-area corresponds to one preset coordinate range;

and a determining module, configured to determine a conversion matrix between the radar coordinates and the camera coordinates of the radar bullet camera based on the selected sample camera coordinates to be used and the corresponding sample radar coordinates.

7. The apparatus of claim 6, wherein the selection module is configured to:

for each preset coordinate range, if sample camera coordinates exist in the coordinate range, select one sample camera coordinate to be used from the sample camera coordinates in the coordinate range;

determine the total number of the selected sample camera coordinates to be used, and if the total number is smaller than a preset number, calculate the difference between the total number and the preset number and select, from the unselected sample camera coordinates, a number of sample camera coordinates to be used equal to the difference.

8. The apparatus of claim 7, wherein the selection module is configured to:

randomly select one sample camera coordinate to be used from the sample camera coordinates in the coordinate range.

9. The apparatus of claim 7, wherein the selection module is configured to:

select, from the sample camera coordinates in the coordinate range, the sample camera coordinate with the smallest distance to the center position of the coordinate range as the sample camera coordinate to be used.

10. The apparatus of claim 7, wherein the selection module is configured to:

select a number of coordinate ranges equal to the difference from the preset coordinate ranges, wherein each selected coordinate range contains at least two sample camera coordinates;

and in each selected coordinate range, determine one unselected sample camera coordinate as a sample camera coordinate to be used.
11. A computer device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with one another through the bus; the memory is configured to store a computer program; and the processor is configured to execute the program stored in the memory to perform the method steps of any one of claims 1-5.
12. A computer-readable storage medium having stored therein at least one instruction, which is loaded and executed by a processor, to implement a method of determining a transformation matrix of radar and camera coordinates as claimed in any one of claims 1 to 5.
CN201811173018.6A 2018-10-09 2018-10-09 Method and device for determining a transformation matrix of radar coordinates and camera coordinates Active CN111028287B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811173018.6A CN111028287B (en) 2018-10-09 2018-10-09 Method and device for determining a transformation matrix of radar coordinates and camera coordinates

Publications (2)

Publication Number Publication Date
CN111028287A true CN111028287A (en) 2020-04-17
CN111028287B CN111028287B (en) 2023-10-20

Family

ID=70190603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811173018.6A Active CN111028287B (en) 2018-10-09 2018-10-09 Method and device for determining a transformation matrix of radar coordinates and camera coordinates

Country Status (1)

Country Link
CN (1) CN111028287B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112305529A (en) * 2020-10-19 2021-02-02 杭州海康威视数字技术股份有限公司 Parameter calibration method, target object tracking method, device and system

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5617085A (en) * 1995-11-17 1997-04-01 Mitsubishi Denki Kabushiki Kaisha Method and apparatus for monitoring the surroundings of a vehicle and for detecting failure of the monitoring apparatus
US20070146197A1 (en) * 2005-12-23 2007-06-28 Barco Orthogon Gmbh Radar scan converter and method for transforming
WO2012151777A1 (en) * 2011-05-09 2012-11-15 上海芯启电子科技有限公司 Multi-target tracking close-up shooting video monitoring system
CN102800096A (en) * 2012-07-19 2012-11-28 北京航空航天大学 Robustness estimation algorithm of camera parameter
CN103198487A (en) * 2013-04-15 2013-07-10 厦门博聪信息技术有限公司 Automatic calibration method for video monitoring system
CN103425626A (en) * 2012-05-22 2013-12-04 杭州普维光电技术有限公司 Method and device for converting coordinates between video cameras
US20130335259A1 (en) * 2011-03-10 2013-12-19 Panasonic Corporation Object detection device and object detection method
CN103955931A (en) * 2014-04-29 2014-07-30 江苏物联网研究发展中心 Image matching method and device
CN104142157A (en) * 2013-05-06 2014-11-12 北京四维图新科技股份有限公司 Calibration method, device and equipment
CN106296708A (en) * 2016-08-18 2017-01-04 宁波傲视智绘光电科技有限公司 Car tracing method and apparatus
CN107464264A (en) * 2016-06-02 2017-12-12 南京理工大学 A kind of camera parameter scaling method based on GPS
JP2017219377A (en) * 2016-06-06 2017-12-14 三菱電機株式会社 Monitoring device, monitoring method, and airport monitoring system
CN107481283A (en) * 2017-08-01 2017-12-15 深圳市神州云海智能科技有限公司 A kind of robot localization method, apparatus and robot based on CCTV camera
CN107481270A (en) * 2017-08-10 2017-12-15 上海体育学院 Table tennis target following and trajectory predictions method, apparatus, storage medium and computer equipment
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
CN108037505A (en) * 2017-12-08 2018-05-15 吉林大学 A kind of night front vehicles detection method and system
CN108449574A (en) * 2018-03-15 2018-08-24 南京慧尔视防务科技有限公司 A kind of security detection method and system based on microwave
CN108596081A (en) * 2018-04-23 2018-09-28 吉林大学 A kind of traffic detection method merged based on radar and video camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhang Zhiyuan, "Large-Field-of-View Image Matching Parameter Calibration Method Based on Non-Metric Correction", vol. 38, no. 8, pages 2-3 *

Also Published As

Publication number Publication date
CN111028287B (en) 2023-10-20

Similar Documents

Publication Publication Date Title
CN111179358B (en) Calibration method, device, equipment and storage medium
CN108230397A (en) Multi-lens camera is demarcated and bearing calibration and device, equipment, program and medium
US9767383B2 (en) Method and apparatus for detecting incorrect associations between keypoints of a first image and keypoints of a second image
US20230214989A1 (en) Defect detection method, electronic device and readable storage medium
CN108074237B (en) Image definition detection method and device, storage medium and electronic equipment
EP4220547A1 (en) Method and apparatus for determining heat data of global region, and storage medium
CN110879131B (en) Imaging quality testing method and imaging quality testing device for visual optical system, and electronic apparatus
JP7368924B2 (en) Hardware accelerator for computation of gradient oriented histograms
CN109931906A (en) Video camera distance measuring method, device and electronic equipment
US5625762A (en) Method for extracting three-dimensional color vector
JPWO2016208404A1 (en) Information processing apparatus and method, and program
JP2024507089A (en) Image correspondence analysis device and its analysis method
CN111028287B (en) Method and device for determining a transformation matrix of radar coordinates and camera coordinates
CN109496326B (en) Image processing method, device and system
CN116977671A (en) Target tracking method, device, equipment and storage medium based on image space positioning
CN115690747B (en) Vehicle blind area detection model test method and device, electronic equipment and storage medium
CN112200002A (en) Body temperature measuring method and device, terminal equipment and storage medium
CN115330657B (en) Ocean exploration image processing method and device and server
CN113205591A (en) Method and device for acquiring three-dimensional reconstruction training data and electronic equipment
CN111028264B (en) Rotation robust three-dimensional object detection optimization method and device
CN111383262B (en) Occlusion detection method, occlusion detection system, electronic terminal and storage medium
CN111429399A (en) Straight line detection method and device
CN113538578B (en) Target positioning method, device, computer equipment and storage medium
CN116503387B (en) Image detection method, device, equipment, system and readable storage medium
CN116228683A (en) Method, device and medium for detecting image before camera calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant