CN114241058A - Target object data processing method and device, binocular calibration system, storage medium and equipment - Google Patents


Info

Publication number
CN114241058A
CN114241058A (application number CN202111444526.5A)
Authority
CN
China
Prior art keywords
coordinate
sensor
coordinate system
image
plane data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111444526.5A
Other languages
Chinese (zh)
Inventor
丁大山
李若岱
肖宣煜
马堃
Current Assignee
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202111444526.5A
Publication of CN114241058A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods

Abstract

In embodiments of the disclosure, first plane data of a target object in a first coordinate system is acquired based on a first sensor, and a coordinate conversion rule between the first coordinate system and a second coordinate system is determined. The coordinate conversion rule is obtained based on a calibration object having a temperature difference. The first plane data is converted into the second coordinate system based on the coordinate conversion rule to obtain second plane data, where the second coordinate system corresponds to a second sensor and one of the first sensor and the second sensor is a thermal imager. Third plane data of the target object in the second coordinate system is acquired based on the second sensor, and data processing is performed on the second plane data and the third plane data. In this way, the homography matrix between the sensors can be obtained through the calibration object with the temperature difference, and points in the camera can then be mapped into the thermal imager.

Description

Target object data processing method and device, binocular calibration system, storage medium and equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a data processing method and apparatus for a target object, a binocular calibration system, a storage medium, and a device.
Background
On conventional binocular camera equipment, there is often a need to acquire the relative positional relationship between different cameras through binocular calibration. In general, binocular calibration extracts corresponding feature points from the images of the different cameras and computes a homography matrix (H); the corresponding position on one camera of a point observed on the other camera can then be calculated from the H matrix.
This general calibration method is suitable for ordinary RGB/IR binocular cameras, but not for the combination of an RGB/IR camera and a thermal imager. In temperature-measurement access products, points in the RGB/IR camera need to be mapped into the thermal imager, so feature points must be extracted from both the RGB/IR image and the thermal image to complete calibration. However, because the imaging principles of the RGB/IR camera and the thermal imager differ, the conventional method cannot extract the feature points of the calibration plate from the thermal image; the homography matrix therefore cannot be obtained, and points in the RGB/IR camera cannot be mapped into the thermal imager.
Disclosure of Invention
The present disclosure provides a data processing technical solution for a target object.
The present disclosure provides a data processing method of a target object, which includes:
acquiring first plane data of a target object in a first coordinate system based on a first sensor;
determining a coordinate conversion rule between the first coordinate system and a second coordinate system, where the coordinate conversion rule is obtained based on a calibration object having a temperature difference; the calibration object comprises a blackbody radiation source, a cover is arranged on the edge of the blackbody radiation source so that the blackbody radiation source and the cover differ in temperature and form the temperature difference, and the color of the cover differs from that of the blackbody radiation source;
converting the first plane data into the second coordinate system based on the coordinate conversion rule to obtain second plane data, where the second coordinate system corresponds to a second sensor and one of the first sensor and the second sensor is a thermal imager;
acquiring third plane data of the target object under a second coordinate system based on the second sensor;
and performing data processing on the second plane data and the third plane data.
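The conversion step above can be sketched as follows. This is a minimal illustration, assuming (as the Background suggests) that the coordinate conversion rule takes the form of a 3x3 homography matrix H applied to 2-D pixel coordinates in homogeneous form; the matrix values and points below are hypothetical.

```python
import numpy as np

def convert_plane_data(points_xy, H):
    """Map 2-D points from the first coordinate system into the second
    coordinate system with a 3x3 homography H (homogeneous coordinates)."""
    pts = np.asarray(points_xy, dtype=float)              # shape (N, 2)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])  # (N, 3)
    mapped = homog @ H.T                                  # apply the conversion rule
    return mapped[:, :2] / mapped[:, 2:3]                 # de-homogenize

# Hypothetical conversion matrix: a pure translation by (5, -3) pixels.
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0, 1.0]])
first_plane = [(10.0, 20.0), (0.0, 0.0)]
second_plane = convert_plane_data(first_plane, H)  # points in the second system
```

The same function applies unchanged to any 3x3 H, including ones with perspective terms in the last row.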
In some possible embodiments, the method further comprises obtaining the coordinate conversion rule between the first coordinate system and the second coordinate system;
obtaining the coordinate conversion rule between the first coordinate system and the second coordinate system comprises the following steps:
acquiring a first image containing a calibration object at a first position by using a first sensor;
acquiring a second image containing the calibration object at a second position by using a second sensor;
determining a first coordinate set of N setting points contained in the calibration object from the first image;
determining a second coordinate set of the N set points from the second image; n is an integer of 4 or more;
and determining a coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set.
Thus, after the coordinate conversion rule between the first coordinate system and the second coordinate system is obtained, the conversion between the coordinate systems of the subsequent target object is paved.
In some possible embodiments, the coordinate conversion rule comprises a coordinate conversion matrix; determining the coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set comprises the following steps:
determining N pairs of matching coordinates from the first coordinate set and the second coordinate set based on the same set point;
and substituting the N pairs of matched coordinates into a function containing a coordinate conversion matrix with parameters to be determined, so as to obtain the coordinate conversion matrix with determined parameters.
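Substituting the matched coordinate pairs into the function of the coordinate conversion matrix can be sketched as a linear system. This is a hedged illustration: it assumes the conversion matrix is a 3x3 homography with its last element fixed to 1, so each matched pair yields two linear equations; the sample points are hypothetical.

```python
import numpy as np

def solve_conversion_matrix(src_pts, dst_pts):
    """Recover the 3x3 conversion matrix from N >= 4 matched pairs.
    Each pair (x, y) -> (u, v) gives two linear equations once the
    last matrix element is fixed to 1 (h33 = 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

# Four hypothetical set points (e.g. rectangle corners) in each image;
# here the second view is the first scaled by 2 and shifted by (10, 5).
src = [(0, 0), (100, 0), (100, 50), (0, 50)]
dst = [(10, 5), (210, 5), (210, 105), (10, 105)]
H = solve_conversion_matrix(src, dst)
```

With N > 4 pairs the least-squares solve averages out detection noise, which is why more set points than the minimum may be used.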
In some possible embodiments, when the first sensor is a camera, determining a first set of coordinates of N set points included in the calibration object from the first image includes:
performing grayscale conversion on the first image to obtain a first grayscale image;
performing binarization on the first grayscale image to obtain a processed first grayscale image;
performing feature detection on the processed first grayscale image to determine the N set points;
and determining a first coordinate set of the N set points in the first image.
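The grayscale-conversion and binarization steps above can be sketched as follows. This is a simplified, assumed pipeline: the luminance weights and the fixed threshold are illustrative choices, and real feature detection is replaced here by taking the bounding-box corners of the dark (blackbody) region.

```python
import numpy as np

def binarize_first_image(rgb, threshold=128.0):
    """Grayscale conversion followed by binarization; the BT.601 luma
    weights and fixed threshold are illustrative, not prescribed."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    return (gray >= threshold).astype(np.uint8)    # 1 = bright (white cover)

def dark_region_corners(binary):
    """Stand-in for feature detection: take the four set points as the
    bounding-box corners of the dark (blackbody) region."""
    ys, xs = np.nonzero(binary == 0)
    return [(xs.min(), ys.min()), (xs.max(), ys.min()),
            (xs.max(), ys.max()), (xs.min(), ys.max())]

# Synthetic first image: bright cover everywhere, dark 4x4 blackbody patch.
img = np.full((8, 8, 3), 255.0)
img[2:6, 2:6] = 20.0
corners = dark_region_corners(binarize_first_image(img))  # first coordinate set
```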
In some possible embodiments, when the second sensor is a thermal imager, determining a second set of coordinates for the N set points from the second image comprises:
determining a first temperature corresponding to the temperature difference;
performing image binarization segmentation on the second image by taking the first temperature as a temperature threshold value to obtain a processed second image;
and acquiring the positions of the N set points in the processed second image to obtain the second coordinate set of the N set points.
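The temperature-threshold segmentation can be sketched as below, assuming the thermal imager yields a per-pixel temperature map; the temperatures and the chosen first temperature are hypothetical values lying between the cover and blackbody temperatures.

```python
import numpy as np

def segment_thermal(temps, first_temperature):
    """Binarize a per-pixel temperature map: pixels at or above the
    first temperature (chosen between the cover and blackbody
    temperatures) become foreground."""
    return (np.asarray(temps, dtype=float) >= first_temperature).astype(np.uint8)

# Hypothetical 4x4 map: a 35 C blackbody patch inside a 25 C paper cover.
temps = np.full((4, 4), 25.0)
temps[1:3, 1:3] = 35.0
mask = segment_thermal(temps, first_temperature=30.0)  # processed second image
```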
In some possible embodiments, before the positions of the N set points in the processed second image are acquired to obtain the second coordinate set of the N set points, the method further includes:
and carrying out noise interference mitigation processing on the processed second image.
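One possible form of the noise-interference mitigation is sketched below; the patent does not prescribe a specific filter, so a simple 3x3 majority vote over the binarized thermal image is assumed here (it suppresses isolated hot or cold pixels, at the cost of slightly eroding region corners).

```python
import numpy as np

def despeckle(binary):
    """3x3 majority vote over a binary image: a pixel stays foreground
    only if at least 5 of the 9 pixels in its neighborhood are set,
    which removes isolated noise pixels."""
    padded = np.pad(binary, 1, mode="edge")
    out = np.zeros_like(binary)
    h, w = binary.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = 1 if padded[y:y + 3, x:x + 3].sum() >= 5 else 0
    return out

# A solid 3x3 foreground block plus one isolated noise pixel.
img = np.zeros((6, 6), dtype=np.uint8)
img[1:4, 1:4] = 1
img[5, 5] = 1                 # speckle to be removed
clean = despeckle(img)
```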
In some possible embodiments, the first position and the second position are the same position;
or the first position and the second position are the two fixed positions of the first sensor and the second sensor in the device, respectively.
The present disclosure provides a data processing apparatus of a target object, including:
the first data acquisition module is used for acquiring first plane data of the target object in a first coordinate system based on the first sensor;
the conversion rule determining module is used for determining a coordinate conversion rule between the first coordinate system and the second coordinate system; the coordinate conversion rule is obtained based on a calibration object with temperature difference;
the second data acquisition module is used for converting the first plane data into the second coordinate system based on the coordinate conversion rule to obtain second plane data; the second coordinate system corresponds to the second sensor; one of the first sensor and the second sensor is a thermal imager;
the third data acquisition module is used for acquiring third plane data of the target object under a second coordinate system based on the second sensor;
and the processing module is used for carrying out data processing on the second plane data and the third plane data.
In some possible embodiments, the apparatus further includes a conversion rule obtaining device, including:
the first image acquisition module is used for acquiring a first image containing a calibration object at a first position by using a first sensor;
the second image acquisition module is used for acquiring a second image containing the calibration object at a second position by using a second sensor;
the first coordinate set acquisition module is used for determining a first coordinate set of N setting points contained in the calibration object from the first image;
the second coordinate set acquisition module is used for determining a second coordinate set of the N setting points from the second image; n is an integer of 4 or more;
and the conversion rule calculation module is used for determining a coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set.
In some possible embodiments, the coordinate transformation rule comprises a coordinate transformation matrix; a conversion rule calculation module to:
determining N pairs of matching coordinates from the first coordinate set and the second coordinate set based on the same set point;
and substituting the N pairs of matched coordinates into a function containing a coordinate conversion matrix with parameters to be determined, so as to obtain the coordinate conversion matrix with determined parameters.
In some possible embodiments, when the first sensor is a camera, the first coordinate set acquisition module is configured to:
performing grayscale conversion on the first image to obtain a first grayscale image;
performing binarization on the first grayscale image to obtain a processed first grayscale image;
performing feature detection on the processed first grayscale image to determine the N set points;
and determining a first coordinate set of the N set points in the first image.
In some possible embodiments, when the second sensor is a thermal imager, the second coordinate set acquisition module is configured to:
determining a first temperature corresponding to the temperature difference;
performing image binarization segmentation on the second image by taking the first temperature as a temperature threshold value to obtain a processed second image;
and acquiring the positions of the N set points of the processed second image to obtain a second coordinate set of the N set points.
In some possible embodiments, the second coordinate set obtaining module is configured to: and carrying out noise interference mitigation processing on the processed second image.
In some possible embodiments, the first position and the second position are the same position;
or the first position and the second position are the two fixed positions of the first sensor and the second sensor in the device, respectively.
The present disclosure provides a binocular calibration system, which includes a calibration object, a first sensor, a second sensor and a data processor; one sensor of the first sensor and the second sensor is a thermal imager;
the calibration object comprises a blackbody radiation source; a cover is arranged on the edge of the blackbody radiation source so that the blackbody radiation source and the cover differ in temperature and form a temperature difference; and the color of the cover differs from that of the blackbody radiation source;
the method comprises the steps that a first sensor obtains first plane data of a target object under a first coordinate system;
the data processor determines a coordinate transformation rule between a first coordinate system and a second coordinate system; converting the first plane data into a second coordinate system based on a coordinate conversion rule to obtain second plane data; the coordinate conversion rule is obtained based on a calibration object with temperature difference; the second coordinate system corresponds to the second sensor;
the second sensor acquires third plane data of the target object under a second coordinate system;
the data processor performs data processing on the second plane data and the third plane data.
The present disclosure provides an electronic device comprising at least one processor, and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the at least one processor implements the data processing method of a target object according to any one of the first aspect by executing the instructions stored by the memory.
The present disclosure provides a computer-readable storage medium, in which at least one instruction or at least one program is stored, the at least one instruction or the at least one program being loaded by a processor and executed to implement the data processing method of a target object of any one of the first aspect.
The present disclosure provides a computer program product containing instructions which, when run on a computer, cause the computer to perform the data processing method of a target object according to any one of the first aspect of the present disclosure.
In the embodiment of the disclosure, first plane data of a target object in a first coordinate system is acquired based on a first sensor, and a coordinate conversion rule between the first coordinate system and a second coordinate system is determined. The coordinate conversion rule is obtained based on a calibration object having a temperature difference: a cover is arranged on the edge of a blackbody radiation source so that the blackbody radiation source and the cover differ in temperature and form the temperature difference, and the color of the cover differs from that of the blackbody radiation source. The first plane data is converted into the second coordinate system based on the coordinate conversion rule to obtain second plane data, where the second coordinate system corresponds to a second sensor and one of the first sensor and the second sensor is a thermal imager. Third plane data of the target object in the second coordinate system is acquired based on the second sensor, and data processing is performed on the second plane data and the third plane data. In this way, the homography matrix between the sensors can be obtained through the calibration object with the temperature difference, and points in the camera can then be mapped into the thermal imager.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
To illustrate the embodiments of the present specification or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present specification, and that those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 shows a schematic diagram of an application environment in accordance with an embodiment of the present disclosure;
FIG. 2 shows a flow diagram of a method of data processing of a target object according to an embodiment of the present disclosure;
FIG. 3 illustrates a flow diagram for determining coordinate transformation rules, according to an embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of a blackbody radiation source according to an embodiment of the present disclosure;
FIG. 5 illustrates a flow diagram for determining a first set of coordinates in accordance with an embodiment of the present disclosure;
FIG. 6 illustrates a flow chart for determining a second set of coordinates according to an embodiment of the present disclosure;
FIG. 7 illustrates a flow diagram for determining coordinate transformation rules, according to an embodiment of the present disclosure;
FIG. 8 illustrates a schematic diagram of a coordinate transformation rule according to an embodiment of the present disclosure;
FIG. 9 shows a block diagram of a data processing apparatus of a target object according to an embodiment of the present disclosure;
FIG. 10 shows a block diagram of an electronic device in accordance with an embodiment of the disclosure;
FIG. 11 shows a block diagram of another electronic device in accordance with an embodiment of the disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the drawings in those embodiments. It is obvious that the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by a person skilled in the art without inventive effort on the basis of the embodiments in the present description belong to the protection scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the disclosure described herein can operate in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, including at least one of A, B and C may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Referring to fig. 1, fig. 1 is a schematic view of an application environment according to an embodiment of the present disclosure. The schematic view includes an electronic device 101; the electronic device 101 shown here is a face temperature identifier, but it may be any electronic device integrating a camera and a thermal imager. The electronic device 101 comprises a first sensor 1011 and a second sensor 1012. Optionally, in this embodiment of the disclosure, the data acquired by the first sensor 1011 and the second sensor 1012 are two-dimensional data and/or three-dimensional data. Two-dimensional data is taken as an example below; data in other forms can be handled in the same manner as two-dimensional data, and is therefore not described again.
Specifically, the electronic device 101 may obtain first plane data of the target object in a first coordinate system based on the first sensor, determine a coordinate transformation rule between the first coordinate system and a second coordinate system, where the coordinate transformation rule is obtained based on the calibration object having the temperature difference, transform the first plane data to the second coordinate system based on the coordinate transformation rule to obtain second plane data, where the second coordinate system corresponds to the second sensor, where one of the first sensor and the second sensor is a thermal imager, obtain third plane data of the target object in the second coordinate system based on the second sensor, and perform data processing on the second plane data and the third plane data.
The technical solution provided by the embodiment of the present disclosure may be applied to the extension of application scenarios such as data processing and target identification of a target object of an image or a video, and the embodiment of the present disclosure does not limit this.
Alternatively, the electronic device in the embodiments of the present disclosure may be a User Equipment (UE), a mobile device, a User terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, an in-vehicle device, a wearable device, or the like. In some possible implementations, the data processing method of the target object may be implemented by a processor calling computer readable instructions stored in a memory. The following describes a data processing method of a target object according to an embodiment of the present disclosure, taking an electronic device as an execution subject. The data processing method of the target object may be implemented by means of a processor calling computer readable instructions stored in a memory.
Fig. 2 shows a flowchart of a data processing method of a target object according to an embodiment of the present disclosure, and as shown in fig. 2, the method includes:
in step S201, first plane data of the target object in the first coordinate system is acquired based on the first sensor.
The first plane data may be data on an image of a plane of the target object captured by the first sensor. Since the image is two-dimensional and the plane can be regarded as a set of points on the image, the first plane data may include a plurality of items of first coordinate data. In the embodiment of the present disclosure, the first coordinate data may be two-dimensional coordinate data.
Optionally, the first coordinate system may take the first pixel at the upper-left corner of the image captured by the first sensor as the origin, with the long side of the image as the abscissa (horizontal) axis and the short side as the ordinate (vertical) axis.
Alternatively, the target object may be a human face.
In step S203, a coordinate conversion rule between the first coordinate system and the second coordinate system is determined, where the coordinate conversion rule is obtained based on a calibration object having a temperature difference, and the calibration object includes a blackbody radiation source. A cover is arranged on the edge of the blackbody radiation source so that the blackbody radiation source and the cover differ in temperature and form the temperature difference, and the color of the cover differs from that of the blackbody radiation source.
In the embodiment of the present disclosure, the coordinate conversion rule between the first coordinate system and the second coordinate system must already have been obtained before it can be determined in this step; only then can the first plane data subsequently be converted into the second coordinate system based on the rule. Therefore, the embodiment of the present disclosure also provides a way to obtain the coordinate conversion rule between the first coordinate system and the second coordinate system.
Fig. 3 shows a flowchart of determining a coordinate transformation rule according to an embodiment of the present disclosure, and as shown in fig. 3, the method includes:
in step S301, a first image including a calibration object is acquired at a first position using a first sensor.
In the embodiment of the present disclosure, taking an electronic device that is a face temperature identifier as an example, the face temperature identifier includes two sensors: a camera and a thermal imager (thermal infrared imager). Optionally, the camera may be an RGB/IR camera.
In the embodiment of the disclosure, because the imaging principles of the RGB/IR camera and the thermal imager differ, if the conventional method is used directly and calibration is performed with a common calibration object, the thermal imager cannot extract the feature points of the calibration plate from its image, since such a calibration object presents no temperature difference. Therefore, the calibration object must be set up so that it has a temperature difference.
In the embodiment of the disclosure, the calibration object includes a blackbody radiation source. A cover is arranged on the edge of the blackbody radiation source so that the blackbody radiation source and the cover differ in temperature and form a temperature difference, and the color of the cover differs from that of the blackbody radiation source. The cover may be paper, a coating, plastic, metal, or another material whose color differs from that of the blackbody radiation source. Optionally, the cover may be pasted, painted, snapped or otherwise fixed onto the blackbody radiation source.
In some alternative embodiments, taking the cover as white paper as an example, the calibration object may be a blackbody radiation source as shown in fig. 4, with white paper pasted on its edge. Pasting white paper on the edge serves two purposes: first, the temperature of the blackbody radiation source differs from that of the white paper, forming a temperature difference that is visible after the thermal imager generates an image; second, the white paper is attached evenly and neatly around the blackbody radiation source, and the color difference between the white paper and the blackbody radiation source facilitates detection of the set points in the image.
In step S303, a second image including the calibration object is acquired at a second position using a second sensor.
In some possible embodiments of the present disclosure, the first position and the second position may be the same position, that is, the camera and the thermal imager acquire the first image and the second image containing the calibration object at the same position.
In other possible embodiments, continuing with the face temperature identifier described above: since it includes both the first sensor and the second sensor, the first position and the second position are the two fixed positions of the first sensor and the second sensor in the device, respectively. For example, the center point of the camera and the center point of the thermal imager may lie on the same horizontal plane, with the center point of the camera located 2 cm directly to the right of the center point of the thermal imager. Of course, the first position and the second position may also stand in other relative positions.
In the embodiment of the present disclosure, step S201 may be expressed in more detail as acquiring the first plane data of the target object in the first coordinate system based on the first sensor at the first position. That is, in the case that the first sensor is a camera, the electronic device acquires, based on the first position of the camera in the electronic device, the first plane data of the target object in the currently captured camera image in the first coordinate system.
Likewise, in the embodiment of the present disclosure, step S207 may be expressed in more detail as acquiring the third plane data of the target object in the second coordinate system based on the second sensor at the second position. That is, in the case that the second sensor is a thermal imager, the electronic device acquires, based on the second position of the thermal imager in the electronic device, the third plane data of the target object in the currently captured thermal imaging image in the second coordinate system.
In step S305, a first coordinate set of N set points included in the calibration object is determined from the first image.
In order to make the set points easier to detect in the camera image and the thermal image acquired by the camera and the thermal imager, and to simplify the calibration process of the two sensors, the description continues with the example of a blackbody radiation source whose edge is pasted with white paper. When the peripheral edge of the blackbody radiation source is bordered with white paper, the white paper forms a rectangular frame as shown in fig. 4. In the following, how to determine the first coordinate set of the N set points included in the calibration object from the first image is described with reference to the blackbody radiation source.
Fig. 5 shows a flowchart for determining a first set of coordinates, according to an embodiment of the present disclosure, as shown in fig. 5, the method comprising:
in step S3051, the first image is subjected to grayscale conversion to obtain a first grayscale image.
In step S3053, a binarization process is performed on the first grayscale image to obtain a processed first grayscale image.
In order that the subsequent feature detection performed by the electronic device can clearly detect the rectangular frame formed by the white paper arranged at the edge, grayscale conversion may be performed on the first image to obtain the first grayscale image, and binarization processing may then be performed on the first grayscale image to obtain the processed first grayscale image. In this way, it is possible to clearly distinguish where the blackbody radiation source is and where the white paper is.
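As a minimal sketch of steps S3051 and S3053, the grayscale conversion and binarization can be written with NumPy alone; a real implementation would typically use OpenCV (`cv2.cvtColor`, `cv2.threshold`). The 8x8 synthetic image and the fixed threshold of 128 are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB image to grayscale (ITU-R BT.601 weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def binarize(gray, threshold=128):
    """Binarize a grayscale image: 255 where brighter than the threshold, else 0."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

# Synthetic first image: dark "blackbody" square on bright "white paper".
img = np.full((8, 8, 3), 240, dtype=np.uint8)   # white paper region
img[2:6, 2:6] = 30                               # blackbody region
binary = binarize(to_grayscale(img))
```

After binarization the white paper is 255 and the blackbody region is 0, which is exactly the clear separation the paragraph above describes.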
In step S3055, feature detection is performed on the processed first grayscale image, and N set points are determined.
In the embodiment of the present disclosure, the electronic device may perform contour detection on the processed first grayscale image, filter the contours according to factors such as shape and area, finally obtain the central rectangle shown in fig. 4, and extract the coordinates of its four corner points, where the four corner points are the N set points.
In step S3057, a first set of coordinates of the N set points in the first image is determined.
Thus, the electronic device may obtain the first coordinate set A1, A2, A3 and A4 of the four corner points in the first image according to the previously established first coordinate system.
In this way, the selected calibration object makes it possible to determine the first coordinate set of the N set points contained in the calibration object from the first image, laying the foundation for subsequently calculating the coordinate conversion rule.
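The corner extraction of steps S3055 and S3057 can be sketched as follows. This is a deliberate simplification of contour detection: it assumes the blackbody appears as a single axis-aligned dark rectangle in the binarized image, whereas a real implementation would typically use `cv2.findContours` with the shape and area filtering described above:

```python
import numpy as np

def rectangle_corners(binary):
    """Return the four corner coordinates (x, y) of the dark rectangular
    region in a binarized image, in the order top-left, top-right,
    bottom-right, bottom-left. Assumes a single axis-aligned rectangle."""
    ys, xs = np.nonzero(binary == 0)        # dark pixels = blackbody region
    return [(xs.min(), ys.min()), (xs.max(), ys.min()),
            (xs.max(), ys.max()), (xs.min(), ys.max())]

# Synthetic processed first grayscale image: dark rectangle on white.
binary = np.full((10, 10), 255, dtype=np.uint8)
binary[3:7, 2:8] = 0
corners = rectangle_corners(binary)        # the N = 4 set points
```

The returned coordinates are expressed in the image's pixel coordinate system (origin at the top-left pixel), matching the first coordinate system described in the embodiment.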
In step S307, a second coordinate set of N set points is determined from the second image, N being an integer equal to or greater than 4.
Fig. 6 shows a flowchart for determining a second set of coordinates, according to an embodiment of the present disclosure, as shown in fig. 6, the method comprising:
in step S3071, a first temperature corresponding to the temperature difference is determined.
For example, the blackbody radiation source typically has a temperature of 25 degrees celsius while the white paper is at 10 degrees celsius, so a temperature difference of 15 degrees celsius exists between the two. The first temperature corresponding to this temperature difference may be the temperature of the blackbody radiation source, and the second temperature may be the temperature of the white paper. When the second image is acquired based on the thermal imaging sensor, regions at different temperatures appear with different intensities in the image. As such, the electronic device may determine the first temperature corresponding to the temperature difference, such as 25 degrees celsius, or may determine the second temperature corresponding to the temperature difference, such as 10 degrees celsius.
In step S3073, the second image is subjected to image binarization segmentation using the first temperature as a temperature threshold, so as to obtain a processed second image.
In the embodiment of the disclosure, after the electronic device determines that the first temperature is 25 degrees celsius or the second temperature is 10 degrees celsius, the first temperature or the second temperature may be set as a temperature threshold, and the second image is binarized and segmented to extract four corner points of a rectangle, that is, N set points.
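The temperature-threshold segmentation of steps S3071 and S3073 can be sketched as follows. The 25 °C / 10 °C values follow the example above; the midpoint threshold of 17.5 °C and the synthetic frame are illustrative assumptions:

```python
import numpy as np

def segment_by_temperature(temps, threshold):
    """Binarize a thermal image (per-pixel temperatures in deg C):
    pixels at or above the threshold belong to the blackbody,
    the rest to the white paper."""
    return (temps >= threshold).astype(np.uint8)

# Synthetic thermal frame: 25 deg C blackbody patch on 10 deg C paper,
# thresholded midway between the two temperatures.
frame = np.full((6, 6), 10.0)
frame[2:5, 1:5] = 25.0
mask = segment_by_temperature(frame, threshold=17.5)
```

The resulting mask isolates the blackbody region, after which the same corner extraction as for the first image yields the N set points.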
In step S3075, the position of the N setting points is obtained for the processed second image, so as to obtain a second coordinate set of the N setting points.
In this way, the electronic device may obtain the second coordinate set B1, B2, B3 and B4 of the N set points in the second image, laying the groundwork for subsequently calculating the coordinate conversion rule.
The second coordinate system is established by taking the first pixel point at the upper left corner of the image captured by the second sensor (the thermal imager) as the origin, the long edge of the image as the abscissa and the short edge as the ordinate.
In the embodiment of the disclosure, before performing position acquisition of the N set points on the processed second image to obtain the second coordinate set, the electronic device may further perform noise interference mitigation processing on the processed second image. This is because noise arises from several sources. First, since the blackbody radiation source is at a distance from the thermal imager, there is considerable thermal noise in the space between them, which interferes with the image captured by the thermal imager. Second, the thermal imager itself is an electronic device: when capturing an image, it must convert an analog signal (temperature) in the space into a digital signal (voltage, current, etc.), and electronic noise is introduced during this conversion.
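One common form of such noise mitigation is a median filter, which suppresses isolated hot or cold pixels; the patent does not prescribe a specific filter, so the 3x3 NumPy sketch below is an illustrative assumption (border pixels are left unfiltered for simplicity):

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter: replaces each interior pixel with the median of
    its neighborhood, removing isolated impulse noise such as hot pixels."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = np.median(img[y - 1:y + 2, x - 1:x + 2])
    return out

noisy = np.full((5, 5), 10.0)
noisy[2, 2] = 80.0          # isolated hot-pixel artifact
clean = median_filter3(noisy)
```

After filtering, the 80 °C outlier is replaced by the 10 °C neighborhood median, so the later corner detection is not thrown off by a single noisy reading.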
In step S309, a coordinate transformation rule between the first coordinate system and the second coordinate system is determined according to the first coordinate set and the second coordinate set.
Optionally, the coordinate conversion rule may take the form of a coordinate conversion matrix.
Fig. 7 shows a flowchart of determining a coordinate transformation rule according to an embodiment of the present disclosure, and as shown in fig. 7, the method includes:
in step S3091, N pairs of matching coordinates are determined from the first coordinate set and the second coordinate set based on the same set point.
The explanation is continued based on the above-described first coordinate set (A1, A2, A3 and A4) and second coordinate set (B1, B2, B3 and B4). Fig. 8 is a schematic diagram of a coordinate conversion rule according to an embodiment of the present disclosure. As shown in fig. 8, the electronic device may determine N pairs of matching coordinates from the first coordinate set and the second coordinate set based on the same set point (corner point), thereby obtaining 4 pairs of matching coordinates (A1 and B1, A2 and B2, A3 and B3, A4 and B4).
In step S3093, the N pairs of matching coordinates are substituted into a function of a coordinate conversion matrix including parameters to be determined, so as to obtain a coordinate conversion matrix after the parameters are determined.
Since a homography exists between the data of the same planar object observed by the camera and the thermal imager, the coordinate conversion relationship of the object between the first coordinate system and the second coordinate system can be expressed by the following formula (1):

    p' = H · p … … formula (1)

wherein p = (x, y, w)^T is the homogeneous form of a two-dimensional point in the first coordinate system, p' = (x', y', w')^T is the homogeneous form of the same point in the second coordinate system, and H is the coordinate conversion matrix to be determined, which can be expressed as

    H = | h11 h12 h13 |
        | h21 h22 h23 |
        | h31 h32 h33 |

For the electronic device, this homography matrix needs to be calibrated, that is, a set of matching points in the first coordinate system and the second coordinate system is collected. Each pair of matching points (p_i, p'_i) satisfies:

    p'_i = H · p_i … … formula (2)

From the correspondence between plane coordinates and homogeneous coordinates, a point (x, y) in the two-dimensional plane corresponds to the homogeneous coordinates (x·w, y·w, w), where x and y are the plane coordinates and w is a scale factor. Formula (2) can therefore be expanded as:

    x'_i = (h11·x_i + h12·y_i + h13) / (h31·x_i + h32·y_i + h33)
    y'_i = (h21·x_i + h22·y_i + h23) / (h31·x_i + h32·y_i + h33) … … formula (3)

Further transformation gives:

    (h31·x_i + h32·y_i + h33)·x'_i = h11·x_i + h12·y_i + h13
    (h31·x_i + h32·y_i + h33)·y'_i = h21·x_i + h22·y_i + h23 … … formula (4)

That is to say, each pair of matching points yields 2 linear equations, so only 4 pairs of non-collinear matching points are needed to solve for the unique solution (up to scale) of the coordinate conversion matrix. Thus, substituting at least 4 pairs of matching coordinates into the function of the coordinate conversion matrix containing the parameters to be determined allows the parameters to be solved, and the usable coordinate conversion matrix to be obtained.
In this way, the conversion rule between the camera and the thermal imager can be determined, in preparation for subsequently converting data in the coordinate system of one sensor into the coordinate system of the other sensor.
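The solution of formula (4) can be sketched as a linear least-squares problem: fixing h33 = 1 (a normalization assumption, valid whenever h33 is nonzero) leaves 8 unknowns, and each matching pair contributes the two equations of formula (4). The point pairs below are synthetic illustrative values, not measurements from the embodiment; a real implementation would typically call `cv2.findHomography`:

```python
import numpy as np

def solve_homography(src, dst):
    """Solve the coordinate conversion matrix H from >= 4 non-collinear
    matching point pairs, using the two linear equations of formula (4)
    per pair, with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y])
        b.append(xp)
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y])
        b.append(yp)
    h = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                        rcond=None)[0]
    return np.append(h, 1.0).reshape(3, 3)

# Four synthetic corner pairs related by half-scale plus a (10, 20) shift.
src = [(0, 0), (100, 0), (100, 100), (0, 100)]
dst = [(10, 20), (60, 20), (60, 70), (10, 70)]
H = solve_homography(src, dst)

# Map an arbitrary point through H and de-homogenize (formula (3)).
p = H @ np.array([50.0, 50.0, 1.0])
mapped = p[:2] / p[2]
```

With these pairs the recovered matrix maps (50, 50) to (35, 45), consistent with the assumed half-scale-plus-offset transform.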
In step S205, converting the first planar data into a second coordinate system based on the coordinate conversion rule to obtain second planar data; the second coordinate system corresponds to the second sensor; one of the first sensor and the second sensor is a thermal imager.
In step S207, third plane data of the target object in the second coordinate system is acquired based on the second sensor.
In step S209, data processing is performed on the second plane data and the third plane data.
The method and the device can map points in the camera image into the thermal imager image, or map points in the thermal imager image into the camera image. Assuming that the second sensor is a thermal imager, the data of the camera (the first plane data) can be mapped into the thermal imager's coordinate system to become the second plane data. The third plane data may then be acquired by the thermal imager itself. Because the two sensors emphasize different aspects of the data they acquire, combining the data acquired by the camera with the data acquired by the thermal imager allows the target object to be processed better and more comprehensively.
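That mapping step can be sketched as follows. The homography H and the face bounding box are illustrative assumptions standing in for the calibrated conversion matrix and the detected first plane data:

```python
import numpy as np

def map_points(H, points):
    """Map (x, y) points from the first coordinate system (camera) into
    the second coordinate system (thermal imager) via homography H."""
    pts = np.hstack([np.asarray(points, float), np.ones((len(points), 1))])
    proj = pts @ H.T
    return proj[:, :2] / proj[:, 2:3]   # de-homogenize

# Assumed calibration result: half-scale plus a (10, 20) pixel offset.
H = np.array([[0.5, 0.0, 10.0],
              [0.0, 0.5, 20.0],
              [0.0, 0.0, 1.0]])

# First plane data: a face bounding box detected in the camera image...
box_cam = [(200, 100), (400, 300)]
# ...becomes second plane data in the thermal image, where the
# temperatures (third plane data) for that region can be read out.
box_thermal = map_points(H, box_cam)
```

Reading the thermal values inside `box_thermal` then combines the camera's localization with the thermal imager's temperature data, as described above.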
In the above embodiments, the first sensor is defined as a camera and the second sensor is defined as a thermal imager. Of course, the first sensor may also be a thermal imager, the second sensor may be a camera, and other steps may refer to the above description, which is not repeated herein.
In the embodiment of the present disclosure, the first coordinate set and the second coordinate set in step S305 and step S307 may also be manually labeled by a technician.
In summary, the embodiment of the disclosure introduces a calibration object with a temperature difference, so that the coordinates of the set points can be acquired more easily by the first sensor and the second sensor during binocular calibration, laying the groundwork for subsequently determining the homography matrix; points in the camera can then be mapped into the thermal imager, or points in the thermal imager mapped into the camera.
Fig. 9 shows a block diagram of a data processing apparatus of a target object according to an embodiment of the present disclosure, and as shown in fig. 9, the data processing apparatus of the target object includes:
a first data obtaining module 901, configured to obtain first plane data of a target object in a first coordinate system based on a first sensor;
a transformation rule determining module 902, configured to determine a coordinate transformation rule between the first coordinate system and the second coordinate system; the coordinate conversion rule is obtained based on a calibration object with temperature difference;
a second data obtaining module 903, configured to convert the first plane data to a second coordinate system based on the coordinate conversion rule, so as to obtain second plane data; the second coordinate system corresponds to the second sensor; one of the first sensor and the second sensor is a thermal imager;
a third data acquiring module 904, configured to acquire third plane data of the target object in the second coordinate system based on the second sensor;
and the processing module 905 is configured to perform data processing on the second plane data and the third plane data.
In some possible embodiments, the apparatus further includes a conversion rule obtaining device, including:
the first image acquisition module is used for acquiring a first image containing a calibration object at a first position by using a first sensor;
the second image acquisition module is used for acquiring a second image containing the calibration object at a second position by using a second sensor;
the first coordinate set acquisition module is used for determining a first coordinate set of N setting points contained in the calibration object from the first image;
the second coordinate set acquisition module is used for determining a second coordinate set of the N setting points from the second image; n is an integer of 4 or more;
and the conversion rule calculation module is used for determining a coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set.
In some possible embodiments, the coordinate transformation rule comprises a coordinate transformation matrix; a conversion rule calculation module to:
determining N pairs of matching coordinates from the first coordinate set and the second coordinate set based on the same set point;
and substituting the N pairs of matched coordinates into a function of a coordinate conversion matrix containing the parameters to be determined to obtain the coordinate conversion matrix with the determined parameters.
In some possible embodiments, when the first sensor is a camera, the first coordinate set acquisition module is configured to:
carrying out gray level conversion on the first image to obtain a first gray level image;
carrying out binarization processing on the first gray level image to obtain a processed first gray level image;
performing feature detection on the processed first gray level image, and determining N set points;
a first set of coordinates of the N set points in the first image is determined.
In some possible embodiments, when the second sensor is a thermal imager, the second coordinate set acquisition module is configured to:
determining a first temperature corresponding to the temperature difference;
performing image binarization segmentation on the second image by taking the first temperature as a temperature threshold value to obtain a processed second image;
and acquiring the positions of the N set points of the processed second image to obtain a second coordinate set of the N set points.
In some possible embodiments, the second coordinate set obtaining module is configured to: and carrying out noise interference mitigation processing on the processed second image.
In some possible embodiments, the calibration object includes a blackbody radiation source; wherein, the edge of the black body radiation source is provided with white paper; so that the temperature of the black body radiation source is different from that of the white paper, and a temperature difference is formed.
In some possible embodiments, the first location and the second location are the same location;
or, the first position and the second position are two fixed positions of the first sensor and the second sensor in the device.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
The embodiment of the present disclosure also provides a system, which includes a calibration object, a first sensor, a second sensor, and a data processor; one sensor of the first sensor and the second sensor is a thermal imager;
the calibration object comprises a blackbody radiation source; wherein the edge of the blackbody radiation source is provided with a cover, so that the temperature of the blackbody radiation source differs from that of the cover, forming a temperature difference; and the color of the cover is different from that of the blackbody radiation source;
the method comprises the steps that a first sensor obtains first plane data of a target object under a first coordinate system;
the data processor determines a coordinate transformation rule between a first coordinate system and a second coordinate system; converting the first plane data into a second coordinate system based on a coordinate conversion rule to obtain second plane data; the coordinate conversion rule is obtained based on a calibration object with temperature difference; the second coordinate system corresponds to the second sensor;
the second sensor acquires third plane data of the target object under a second coordinate system;
the data processor performs data processing on the second plane data and the third plane data.
The embodiment of the present disclosure also provides a computer-readable storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded by a processor and when executed, implements the above method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
The electronic device may be provided as a terminal, server, or other form of device.
The disclosed embodiments provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the data processing method of the disclosed target object.
FIG. 10 shows a block diagram of an electronic device in accordance with an embodiment of the disclosure. For example, the electronic device 1000 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or other such terminal.
Referring to fig. 10, electronic device 1000 may include one or more of the following components: processing component 1002, memory 1004, power component 1006, multimedia component 1008, audio component 1010, input/output (I/O) interface 1012, sensor component 1014, and communications component 1016.
The processing component 1002 generally controls overall operation of the electronic device 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 1002 may include one or more processors 1020 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 1002 may include one or more modules that facilitate interaction between processing component 1002 and other components. For example, the processing component 1002 may include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support operations at the electronic device 1000. Examples of such data include instructions for any application or method operating on the electronic device 1000, contact data, phonebook data, messages, images, videos, and so forth. The memory 1004 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 1006 provides power to the various components of the electronic device 1000. The power components 1006 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the electronic device 1000.
The multimedia component 1008 includes a screen that provides an output interface between the electronic device 1000 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1008 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 1000 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1010 is configured to output and/or input audio signals. For example, the audio component 1010 may include a Microphone (MIC) configured to receive external audio signals when the electronic device 1000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 1004 or transmitted via the communication component 1016. In some embodiments, audio component 1010 also includes a speaker for outputting audio signals.
I/O interface 1012 provides an interface between processing component 1002 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1014 includes one or more sensors for providing various aspects of status assessment for the electronic device 1000. For example, the sensor assembly 1014 may detect an open/closed state of the electronic device 1000, the relative positioning of components, such as a display and keypad of the electronic device 1000, the sensor assembly 1014 may also detect a change in position of the electronic device 1000 or a component of the electronic device 1000, the presence or absence of user contact with the electronic device 1000, orientation or acceleration/deceleration of the electronic device 1000, and a change in temperature of the electronic device 1000. The sensor assembly 1014 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1016 is configured to facilitate wired or wireless communication between the electronic device 1000 and other devices. The electronic device 1000 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1016 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1016 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 1004, is also provided that includes computer program instructions executable by the processor 1020 of the electronic device 1000 to perform the above-described methods.
FIG. 11 shows a block diagram of another electronic device in accordance with an embodiment of the disclosure. For example, the electronic device 1100 may be provided as a server. Referring to fig. 11, electronic device 1100 includes a processing component 1122 that further includes one or more processors and memory resources, represented by memory 1132, for storing instructions, such as application programs, that are executable by processing component 1122. The application programs stored in memory 1132 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1122 is configured to execute instructions to perform the above-described method.
The electronic device 1100 may also include a power component 1126 configured to perform power management of the electronic device 1100, a wired or wireless network interface 1150 configured to connect the electronic device 1100 to a network, and an input/output (I/O) interface 1158. The electronic device 1100 may operate based on an operating system stored in memory 1132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1132, is also provided that includes computer program instructions executable by the processing component 1122 of the electronic device 1100 to perform the methods described above.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (11)

1. A method of data processing of a target object, the method comprising:
acquiring first plane data of a target object in a first coordinate system based on a first sensor;
determining a coordinate conversion rule between the first coordinate system and a second coordinate system; the coordinate conversion rule is obtained based on a calibration object with a temperature difference; the calibration object comprises a blackbody radiation source; the edge of the blackbody radiation source is provided with a cover so that the temperature of the blackbody radiation source differs from that of the cover, forming a temperature difference; and the cover is a different color than the blackbody radiation source;
converting the first plane data into the second coordinate system based on the coordinate conversion rule to obtain second plane data; the second coordinate system corresponds to a second sensor; one of the first sensor and the second sensor is a thermal imager;
acquiring third plane data of the target object under the second coordinate system based on the second sensor;
and performing data processing on the second plane data and the third plane data.
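As an illustrative sketch only (not part of the claims), the conversion of the first plane data into the second coordinate system can be performed by applying a 3x3 coordinate conversion matrix in homogeneous coordinates; the matrix H and the point values below are hypothetical:

```python
import numpy as np

def convert_plane_data(points, H):
    """Map 2D points from the first coordinate system into the second
    coordinate system using a 3x3 coordinate conversion matrix H."""
    pts = np.asarray(points, dtype=float)             # (N, 2) coordinates
    homo = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous form
    mapped = homo @ H.T                               # apply the conversion
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

# A pure-translation conversion matrix that shifts every point by (5, -3)
H = np.array([[1.0, 0.0, 5.0],
              [0.0, 1.0, -3.0],
              [0.0, 0.0, 1.0]])
print(convert_plane_data([[10.0, 20.0]], H))  # [[15. 17.]]
```

In this sketch the conversion rule is assumed to take the form of a single matrix; the claims only introduce a matrix form in claim 3 below.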
2. The method according to claim 1, further comprising the step of obtaining the coordinate conversion rule between the first coordinate system and the second coordinate system;
the obtaining of the coordinate conversion rule between the first coordinate system and the second coordinate system includes:
acquiring a first image containing the calibration object at a first position by using the first sensor;
acquiring a second image containing the calibration object at a second position by using the second sensor;
determining a first coordinate set of N set points contained in the calibration object from the first image;
determining a second coordinate set of the N set points from the second image; N is an integer greater than or equal to 4;
and determining a coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set.
3. The method of claim 2, wherein the coordinate conversion rule comprises a coordinate conversion matrix; and the determining a coordinate conversion rule between the first coordinate system and the second coordinate system according to the first coordinate set and the second coordinate set comprises:
determining N pairs of matching coordinates from the first coordinate set and the second coordinate set based on the same set point;
and substituting the N pairs of matched coordinates into a function of a coordinate conversion matrix containing parameters to be determined, to obtain the coordinate conversion matrix with determined parameters.
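One conventional way to realize the parameter-solving step above is the direct linear transform (DLT), which recovers a 3x3 conversion matrix from N >= 4 matched coordinate pairs via a singular value decomposition; this is a sketch of one standard technique, not necessarily the exact function contemplated by the claim:

```python
import numpy as np

def estimate_conversion_matrix(src, dst):
    """Solve for the 3x3 conversion matrix H (dst ~ H @ src in homogeneous
    coordinates) from N >= 4 matched pairs using the direct linear transform."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The nine parameters are the right singular vector of the design matrix
    # associated with the smallest singular value (its null space for exact data).
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the overall scale so that H[2, 2] == 1

# Four matched pairs related by a pure translation of (+2, +3):
# the recovered H is (numerically) [[1, 0, 2], [0, 1, 3], [0, 0, 1]]
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(2, 3), (3, 3), (2, 4), (3, 4)]
H = estimate_conversion_matrix(src, dst)
```

With exactly four non-degenerate pairs the null space is one-dimensional and the matrix is determined up to scale, which is why the claim requires N to be at least 4.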
4. The method according to claim 2, wherein when the first sensor is a camera, the determining a first set of coordinates of N set points included in the calibration object from the first image comprises:
performing grayscale conversion on the first image to obtain a first grayscale image;
performing binarization processing on the first grayscale image to obtain a processed first grayscale image;
performing feature detection on the processed first grayscale image to determine the N set points;
and determining a first coordinate set of the N set points in the first image.
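The grayscale-conversion and binarization steps for the visible-light (camera) image can be sketched as follows; the luminance weights and the threshold value of 128 are conventional choices, not values taken from this disclosure:

```python
import numpy as np

def binarize_first_image(rgb, threshold=128):
    """Grayscale-convert an RGB image (ITU-R BT.601 luminance weights) and
    binarize it, as a preprocessing step before feature detection."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return (gray >= threshold).astype(np.uint8)  # 1 = bright, 0 = dark

# Hypothetical 4x4 image: top half white, bottom half black
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:2] = 255
mask = binarize_first_image(img)
print(mask.sum())  # 8 bright pixels
```

The subsequent feature detection (e.g., corner detection on the binary mask) would then yield the N set points whose coordinates form the first coordinate set.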
5. The method of claim 2, wherein, when the second sensor is the thermal imager, the determining a second coordinate set of the N set points from the second image comprises:
determining a first temperature corresponding to the temperature difference;
performing image binarization segmentation on the second image by taking the first temperature as a temperature threshold to obtain a processed second image;
and acquiring the positions of the N set points of the processed second image to obtain a second coordinate set of the N set points.
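The temperature-threshold segmentation of the thermal image can be sketched as follows; the temperature map and the 32 C threshold are hypothetical values chosen to sit between the heated blackbody radiation source and the cooler covering:

```python
import numpy as np

def segment_second_image(temps_c, temp_threshold):
    """Binarize a per-pixel temperature map: pixels hotter than the
    threshold (the blackbody radiation source) map to 1, cooler pixels
    (the covering) map to 0."""
    return (np.asarray(temps_c, dtype=float) > temp_threshold).astype(np.uint8)

# Hypothetical 3x3 temperature map: a 40 C source pixel in a 25 C covering
temps = np.full((3, 3), 25.0)
temps[1, 1] = 40.0
seg = segment_second_image(temps, temp_threshold=32.0)
print(seg[1, 1], seg.sum())  # 1 1
```

Because the covering gives the calibration object both a temperature contrast (for the thermal imager) and a color contrast (for the camera), the same set points are detectable in both modalities.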
6. The method according to claim 5, wherein, before the acquiring of the positions of the N set points of the processed second image to obtain the second coordinate set of the N set points, the method further comprises:
and carrying out noise interference mitigation processing on the processed second image.
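One simple form the noise-interference mitigation could take is a 3x3 median filter over the binarized thermal image, which removes isolated hot or cold speckles; this is an assumed implementation, not one specified by the claim:

```python
import numpy as np

def median_filter_3x3(mask):
    """Apply a 3x3 median filter to a binary image, suppressing isolated
    speckle noise while preserving larger connected regions."""
    padded = np.pad(mask, 1, mode="edge")
    out = np.empty_like(mask)
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + 3, j:j + 3])
    return out

noisy = np.zeros((5, 5), dtype=np.uint8)
noisy[2, 2] = 1                        # an isolated one-pixel speck
print(median_filter_3x3(noisy).sum())  # 0: the speck is removed
```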
7. The method of claim 2, wherein the first position and the second position are the same position;
or the first position and the second position are two fixed positions of the first sensor and the second sensor in a device.
8. A data processing apparatus of a target object, comprising:
the first data acquisition module is used for acquiring first plane data of the target object in a first coordinate system based on the first sensor;
the conversion rule determining module is used for determining a coordinate conversion rule between the first coordinate system and a second coordinate system; the coordinate conversion rule is obtained based on a calibration object with a temperature difference; the calibration object comprises a blackbody radiation source; wherein a covering is pasted on the edge of the blackbody radiation source so that the temperature of the blackbody radiation source differs from that of the covering, forming a temperature difference; and the covering is a different color than the blackbody radiation source;
the second data acquisition module is used for converting the first plane data into the second coordinate system based on the coordinate conversion rule to obtain second plane data; the second coordinate system corresponds to a second sensor; one of the first sensor and the second sensor is a thermal imager;
a third data acquisition module, configured to acquire third planar data of the target object in the second coordinate system based on the second sensor;
and the processing module is used for carrying out data processing on the second plane data and the third plane data.
9. A binocular calibration system, comprising a calibration object, a first sensor, a second sensor and a data processor; wherein one of the first sensor and the second sensor is a thermal imager;
the calibration object comprises a blackbody radiation source; the edge of the blackbody radiation source is provided with a cover, so that the temperature of the blackbody radiation source is different from that of the cover, and a temperature difference is formed; and the cover is a different color than the blackbody radiation source;
the first sensor acquires first plane data of a target object in a first coordinate system;
the data processor determines a coordinate conversion rule between the first coordinate system and a second coordinate system, and converts the first plane data into the second coordinate system based on the coordinate conversion rule to obtain second plane data; the coordinate conversion rule is obtained based on the calibration object with the temperature difference; the second coordinate system corresponds to the second sensor;
the second sensor acquires third plane data of the target object under the second coordinate system;
and the data processor performs data processing on the second plane data and the third plane data.
10. A computer-readable storage medium, in which at least one instruction or at least one program is stored, the at least one instruction or the at least one program being loaded and executed by a processor to implement the data processing method of a target object according to any one of claims 1 to 7.
11. An electronic device, comprising at least one processor and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the at least one processor implements the data processing method of a target object according to any one of claims 1 to 7 by executing the instructions stored in the memory.
CN202111444526.5A 2021-11-30 2021-11-30 Target object data processing method and device, binocular calibration system, storage medium and equipment Pending CN114241058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111444526.5A CN114241058A (en) 2021-11-30 2021-11-30 Target object data processing method and device, binocular calibration system, storage medium and equipment


Publications (1)

Publication Number Publication Date
CN114241058A true CN114241058A (en) 2022-03-25

Family

ID=80752234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111444526.5A Pending CN114241058A (en) 2021-11-30 2021-11-30 Target object data processing method and device, binocular calibration system, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN114241058A (en)

Similar Documents

Publication Publication Date Title
US10452890B2 (en) Fingerprint template input method, device and medium
CN107692997B (en) Heart rate detection method and device
CN109344832B (en) Image processing method and device, electronic equipment and storage medium
CN109819229B (en) Image processing method and device, electronic equipment and storage medium
CN112465843A (en) Image segmentation method and device, electronic equipment and storage medium
CN112001321A (en) Network training method, pedestrian re-identification method, network training device, pedestrian re-identification device, electronic equipment and storage medium
CN114019473A (en) Object detection method and device, electronic equipment and storage medium
CN112184787A (en) Image registration method and device, electronic equipment and storage medium
CN112219224A (en) Image processing method and device, electronic equipment and storage medium
CN113139471A (en) Target detection method and device, electronic equipment and storage medium
CN111523346A (en) Image recognition method and device, electronic equipment and storage medium
CN114187498A (en) Occlusion detection method and device, electronic equipment and storage medium
CN112541971A (en) Point cloud map construction method and device, electronic equipment and storage medium
CN113689361B (en) Image processing method and device, electronic equipment and storage medium
CN111583142A (en) Image noise reduction method and device, electronic equipment and storage medium
CN113261011A (en) Image processing method and device, electronic equipment and storage medium
CN111192218A (en) Image processing method and device, electronic equipment and storage medium
CN107730443B (en) Image processing method and device and user equipment
CN113538310A (en) Image processing method and device, electronic equipment and storage medium
CN112102300A (en) Counting method and device, electronic equipment and storage medium
CN111861942A (en) Noise reduction method and device, electronic equipment and storage medium
CN111784773A (en) Image processing method and device and neural network training method and device
CN110333903B (en) Method and device for determining page loading duration
CN114519794A (en) Feature point matching method and device, electronic equipment and storage medium
CN114241058A (en) Target object data processing method and device, binocular calibration system, storage medium and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination