CN113643386A - Calibration method and device, electronic equipment and computer readable storage medium - Google Patents

Info

Publication number
CN113643386A
CN113643386A (application CN202111191277.3A)
Authority
CN
China
Prior art keywords
image
point
positioning
area
geometric relationship
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111191277.3A
Other languages
Chinese (zh)
Other versions
CN113643386B (en)
Inventor
何文博
杨帆
陈朝军
李若岱
马堃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yuanluobu Intelligent Technology Co ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202111191277.3A priority Critical patent/CN113643386B/en
Publication of CN113643386A publication Critical patent/CN113643386A/en
Application granted granted Critical
Publication of CN113643386B publication Critical patent/CN113643386B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10024 Color image
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G06T 2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a calibration method and apparatus, an electronic device, and a computer-readable storage medium. The method comprises the following steps: acquiring a first image, where the first image is obtained by photographing a target scene with a thermal imaging device, and the target scene includes a thermal radiation object and at least one detection point; and obtaining at least one second position according to a first position and a first geometric relationship, where the first position is the position of the radiation area of the thermal radiation object in the first image, the first geometric relationship is the geometric relationship between the radiation area and the at least one detection point, and the at least one second position is the position of the at least one detection point in the first image.

Description

Calibration method and device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of computer vision technologies, and in particular, to a calibration method and apparatus, an electronic device, and a computer-readable storage medium.
Background
An object point is a point in the real world, and the pixel in an image that corresponds to an object point is called an image point. Determining the position of an image point in an image has very wide applications.
In the prior art, an object point is detected by performing object detection on an image, and the position of the corresponding image point in the image is then determined. However, because object detection on thermal images yields results of low accuracy, current techniques determine the position of an image point in a thermal image with low accuracy.
Disclosure of Invention
The application provides a calibration method and device, electronic equipment and a computer readable storage medium.
In a first aspect, a calibration method is provided, the method including:
acquiring a first image, where the first image is obtained by photographing a target scene with a thermal imaging device, and the target scene includes a thermal radiation object and at least one detection point;
obtaining at least one second position according to a first position and a first geometric relationship, where the first position is the position of the radiation area of the thermal radiation object in the first image, the first geometric relationship is the geometric relationship between the radiation area and the at least one detection point, and the at least one second position is the position of the at least one detection point in the first image.
With reference to any embodiment of the present application, before obtaining at least one second position according to the first position and the first geometric relationship, the method further includes:
acquiring a temperature threshold;
determining, from the first image, a first pixel region whose temperature is greater than or equal to the temperature threshold;
and obtaining the first position according to the position of the first pixel region in the first image.
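As a rough sketch of this embodiment, assuming the thermal frame is available as a per-pixel temperature array (the temperature values, the threshold, and the bounding-box representation of the first position below are all invented for illustration; a real thermal imaging device would supply radiometric data through its own SDK):

```python
import numpy as np

# Invented stand-in for a radiometric thermal frame (degrees Celsius per pixel).
temps = np.array([
    [20.0, 21.0, 22.0, 20.5],
    [20.0, 60.0, 61.0, 21.0],
    [20.5, 59.5, 62.0, 20.0],
    [21.0, 20.0, 21.0, 20.5],
])
temperature_threshold = 50.0      # the acquired threshold (illustrative value)

# First pixel region: every pixel whose temperature is >= the threshold.
mask = temps >= temperature_threshold

# Take the region's bounding box as one possible form of the "first position".
rows, cols = np.nonzero(mask)
top, left = int(rows.min()), int(cols.min())
bottom, right = int(rows.max()), int(cols.max())
print((top, left, bottom, right))  # -> (1, 1, 2, 2)
```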
With reference to any embodiment of the present application, the obtaining at least one second position according to the first position and the first geometric relationship includes:
performing the step of deriving at least one second position from the first position and the first geometrical relationship in case the radiation area satisfies a first condition, the first condition comprising at least one of: the area of the radiation area in the first image is within a first area range, and the size of the radiation area in the first image meets a first size requirement.
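A minimal sketch of such a validity check on the detected region, with invented area and size bounds (a real system would derive the first area range and first size requirement from the known calibration board and imaging distance):

```python
import numpy as np

def region_is_valid(mask, area_range=(3, 1000), max_side=50):
    """Hypothetical first-condition check on a boolean radiation-region mask."""
    area = int(mask.sum())
    if not (area_range[0] <= area <= area_range[1]):
        return False                     # area outside the first area range
    rows, cols = np.nonzero(mask)
    height = int(rows.max() - rows.min() + 1)
    width = int(cols.max() - cols.min() + 1)
    return height <= max_side and width <= max_side  # first size requirement

mask = np.zeros((10, 10), dtype=bool)
mask[2:5, 3:6] = True            # a 3x3 hot blob, area 9
print(region_is_valid(mask))     # -> True
print(region_is_valid(np.ones((10, 10), dtype=bool), area_range=(3, 50)))  # -> False
```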
With reference to any embodiment of the present application, the obtaining at least one second position according to the first position and the first geometric relationship includes:
obtaining at least four second positions according to the first position and the first geometric relationship, wherein the at least four second positions are positions of at least four detection points in the first image;
The method further comprises the following steps: acquiring a second image, wherein the second image is obtained by shooting the target scene by visible light imaging equipment;
determining the positions of the at least four detection points in the second image to obtain at least four third positions;
and obtaining a conversion relation between the pixel coordinate system of the first image and the pixel coordinate system of the second image according to the at least four second positions and the at least four third positions.
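With at least four point correspondences, the conversion relationship between the two pixel coordinate systems can be modeled as a homography. The sketch below estimates one by the standard direct linear transform; the point values are invented, and this is an illustration of the general technique rather than the patent's specific procedure (`cv2.findHomography` would be a robust production alternative):

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate H such that dst ~ H @ src (homogeneous) via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)         # null-space vector = homography entries
    return H / H[2, 2]

# At least four second positions (thermal image) and the matching third
# positions (visible image); here the true mapping is a translation by
# (+5, +10), so the result is easy to check by eye.
second = [(0, 0), (10, 0), (10, 10), (0, 10)]
third = [(5, 10), (15, 10), (15, 20), (5, 20)]
H = homography_from_points(second, third)

p = np.array([3.0, 4.0, 1.0])        # a thermal-image point, homogeneous
q = H @ p
print(float(q[0] / q[2]), float(q[1] / q[2]))  # ≈ 8.0 14.0
```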
With reference to any embodiment of the present application, the at least four detection points are all first positioning points of the radiation region, and the first positioning points include one of the following: corner point, geometric center;
obtaining at least four second positions according to the first position and the first geometric relationship, including:
determining the positions of at least four first positioning points corresponding to the at least four detection points in the first image according to the first positions to obtain at least four fourth positions;
and obtaining the at least four second positions according to the at least four fourth positions.
With reference to any embodiment of the present application, the target scene further includes at least one positioning graph, and the determining the positions of the at least four detection points in the second image to obtain at least four third positions includes:
Determining the position of the at least one positioning graph in the second image to obtain at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship and the at least one fifth position, wherein the second geometric relationship is the geometric relationship between the at least one positioning pattern and the radiation area.
With reference to any embodiment of the present application, the first geometric relationship includes that the at least four detection points are all first positioning points of the radiation region, and the first positioning points include one of the following: corner point, geometric center; the second geometric relationship comprises a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning graph and at least one first positioning point of the radiation area;
obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship, and the at least one fifth position, including:
and determining the positions of at least four first positioning points corresponding to the at least four detection points in the second image according to the third geometric relationship and the at least one fifth position to obtain the at least four third positions.
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
With reference to any embodiment of the present application, the obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship, and the at least one fifth position includes:
obtaining a sixth position of the radiation area in the second image according to the second geometric relation and the at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship and the sixth position.
In combination with any embodiment of the present application, the at least one positioning graphic includes a nested graphic, and the at least one fifth location includes a seventh location of the nested graphic in the second image;
the determining the position of the at least one positioning graph in the second image to obtain at least one fifth position includes:
performing edge detection processing on the second image to obtain at least one first contour in the second image;
determining a second profile having a first number of sub-profiles from the at least one first profile;
and determining the second contour as the contour of the nested graph, and obtaining the seventh position according to the position of the contour of the nested graph in the second image.
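The hierarchy-based selection can be illustrated as follows. Contours are simplified to axis-aligned bounding boxes and containment stands in for the parent-child hierarchy that a real edge/contour detector (e.g. `cv2.findContours` with `RETR_TREE`) would return; all shapes and the expected count are invented:

```python
def contains(outer, inner):
    """True if bounding box `inner` lies strictly inside `outer` (boxes as (l, t, r, b))."""
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and outer[2] >= inner[2] and outer[3] >= inner[3] and outer != inner)

def find_nested_contour(contours, expected_children):
    """Return the first contour with exactly `expected_children` sub-contours."""
    for c in contours:
        children = [d for d in contours if contains(c, d)]
        if len(children) == expected_children:
            return c
    return None

# Three concentric boxes plus one isolated box: the outermost concentric
# box has two sub-contours, the isolated box has none.
contours = [(0, 0, 10, 10), (2, 2, 8, 8), (4, 4, 6, 6), (20, 20, 25, 25)]
print(find_nested_contour(contours, expected_children=2))  # -> (0, 0, 10, 10)
```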
In combination with any embodiment of the present application, the determining the second contour as a contour of the nested figure includes:
determining the second contour as a contour of the nested graphic if the second contour satisfies a second condition, the second condition including at least one of: the area enclosed by the outline is within the second area range, and the size of the outline meets the second size requirement.
In combination with any of the embodiments of the present application, the second geometric relationship includes a third geometric relationship between the at least one positioning graph and the at least one first positioning point of the radiation region;
obtaining a sixth position of the radiation region in the second image according to the second geometric relationship and the at least one fifth position includes:
obtaining at least one eighth position of at least one first positioning point of the radiation region in the second image according to the third geometric relation and the at least one fifth position;
and obtaining a sixth position of the radiation area in the second image according to the at least one eighth position.
With reference to any embodiment of the present application, the third geometric relationship includes a third condition, and a second positioning point of the at least one positioning graph, a third positioning point of the at least one positioning graph, and a fourth positioning point are located on a same straight line, where the second positioning point is different from the third positioning point, the second positioning point includes one of the following: corner point, geometric center; the third positioning point includes one of the following: corner point, geometric center; and the fourth positioning point is any one of the at least one first positioning point;
The third condition includes one of the following: a first ratio between a first distance and a second distance, a second ratio between the first distance and a third distance, and a third ratio between the second distance and the third distance; the first distance is the distance between the second positioning point and the third positioning point, the second distance is the distance between the fourth positioning point and the second positioning point, and the third distance is the distance between the fourth positioning point and the third positioning point;
the at least one eighth position comprises a ninth position of the fourth positioning point in the second image, and the obtaining at least one eighth position of the at least one first positioning point of the radiation region in the second image according to the third geometric relationship and the at least one fifth position includes:
determining a tenth position of the second positioning point in the second image and an eleventh position of the third positioning point in the second image according to the at least one fifth position;
and obtaining the ninth position according to the third geometric relationship, the tenth position and the eleventh position.
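Given the collinearity and a known distance ratio, recovering the fourth positioning point reduces to linear extrapolation along the line through the second and third positioning points. The sketch below assumes the fourth point lies on the far side of the second point; both the ratio value and that direction convention are invented assumptions for illustration:

```python
# The "tenth" and "eleventh" positions (second and third positioning points
# in the second image) plus a known ratio determine the "ninth" position of
# the fourth positioning point along the same line.
def point_from_ratio(p2, p3, ratio_d2_over_d1):
    # d1 = |p2->p3|; the fourth point sits at ratio_d2_over_d1 * d1 from p2,
    # on the opposite side of p2 from p3 (assumed direction).
    dx, dy = p3[0] - p2[0], p3[1] - p2[1]
    return (p2[0] - ratio_d2_over_d1 * dx, p2[1] - ratio_d2_over_d1 * dy)

p2, p3 = (10.0, 10.0), (14.0, 13.0)   # tenth and eleventh positions
p4 = point_from_ratio(p2, p3, 0.5)    # ninth position
print(p4)  # -> (8.0, 8.5)
```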
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
In combination with any embodiment of the present application, the thermal imaging device and the visible light imaging device belong to an electronic device, and the method further includes:
the electronic equipment acquires a third image of a person to be detected by using the visible light imaging equipment and acquires a fourth image of the person to be detected by using the thermal imaging equipment;
performing skin detection processing on the third image, and determining a twelfth position of the skin area of the person to be detected in the third image;
determining a second pixel area corresponding to the skin area in the fourth image according to the conversion relation and the twelfth position;
and obtaining the body temperature of the person to be detected according to the temperature of the second pixel area.
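An end-to-end sketch of this body-temperature step, assuming the conversion relationship is a simple translation homography and using an invented thermal frame (a real system would use the homography estimated during calibration, an actual skin-detection result, and radiometric thermal data):

```python
import numpy as np

# Invented conversion relationship: visible-image pixel -> thermal-image pixel.
H = np.array([[1.0, 0.0, -5.0],
              [0.0, 1.0, -10.0],
              [0.0, 0.0, 1.0]])

temps = np.full((60, 60), 22.0)   # fake thermal frame, degrees Celsius
temps[20, 30] = 36.6              # one warm "skin" pixel

# Twelfth position: a skin pixel found in the visible (third) image, as (x, y).
skin_pixel_visible = (35.0, 30.0)
x, y, w = H @ np.array([skin_pixel_visible[0], skin_pixel_visible[1], 1.0])
u, v = int(round(x / w)), int(round(y / w))   # second pixel area location

print(temps[v, u])  # -> 36.6 (the estimated body temperature)
```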
In a second aspect, there is provided a calibration apparatus, the apparatus comprising:
an acquisition unit configured to acquire a first image, the first image being obtained by shooting a target scene by a thermal imaging apparatus, the target scene including a thermal radiation object and at least one detection point;
the processing unit is used for obtaining at least one second position according to the first position and the first geometric relation; the first position is a position of a radiation area of the heat radiating object in the first image, the first geometric relationship is a geometric relationship between the radiation area and the at least one detection point, and the at least one second position is a position of the at least one detection point in the first image.
With reference to any embodiment of the present application, the obtaining unit is further configured to obtain a temperature threshold;
the processing unit is further used for determining a first pixel area with the temperature greater than or equal to the temperature threshold value from the first image;
the processing unit is further configured to obtain the first position according to a position of the first pixel region in the first image.
With reference to any one of the embodiments of the present application, the processing unit is configured to, in a case that the radiation region satisfies a first condition, perform a step of obtaining at least one second position according to a first position and a first geometric relationship, where the first condition includes at least one of: the area of the radiation area in the first image is within a first area range, and the size of the radiation area in the first image meets a first size requirement.
With reference to any embodiment of the present application, the number of the detection points is greater than or equal to 4, and the processing unit is configured to:
obtaining at least four second positions according to the first position and the first geometric relationship, wherein the at least four second positions are positions of at least four detection points in the first image;
The acquisition unit is further used for acquiring a second image, and the second image is obtained by shooting the target scene by a visible light imaging device;
the processing unit is further configured to determine positions of the at least four detection points in the second image to obtain at least four third positions;
the processing unit is further configured to obtain a conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image according to the at least four second positions and the at least four third positions.
With reference to any embodiment of the present application, the at least four detection points are all first positioning points of the radiation region, and the first positioning points include one of the following: corner point, geometric center;
the processing unit is configured to:
determining the positions of at least four first positioning points corresponding to the at least four detection points in the first image according to the first positions to obtain at least four fourth positions;
and obtaining the at least four second positions according to the at least four fourth positions.
With reference to any embodiment of the present application, the target scene further includes at least one positioning graph, and the processing unit is configured to:
Determining the position of the at least one positioning graph in the second image to obtain at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship and the at least one fifth position, wherein the second geometric relationship is the geometric relationship between the at least one positioning pattern and the radiation area.
With reference to any embodiment of the present application, the first geometric relationship includes that the at least four detection points are all first positioning points of the radiation region, and the first positioning points include one of the following: corner point, geometric center; the second geometric relationship comprises a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning graph and at least one first positioning point of the radiation area;
the processing unit is configured to:
and determining the positions of at least four first positioning points corresponding to the at least four detection points in the second image according to the third geometric relationship and the at least one fifth position to obtain the at least four third positions.
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
In combination with any embodiment of the present application, the processing unit is configured to:
obtaining a sixth position of the radiation area in the second image according to the second geometric relation and the at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship and the sixth position.
In combination with any embodiment of the present application, the at least one positioning graphic includes a nested graphic, and the at least one fifth location includes a seventh location of the nested graphic in the second image;
the processing unit is configured to:
performing edge detection processing on the second image to obtain at least one first contour in the second image;
determining a second profile having a first number of sub-profiles from the at least one first profile;
and determining the second contour as the contour of the nested graph, and obtaining the seventh position according to the position of the contour of the nested graph in the second image.
In combination with any embodiment of the present application, the processing unit is configured to:
determining the second contour as a contour of the nested graphic if the second contour satisfies a second condition, the second condition including at least one of: the area enclosed by the outline is within the second area range, and the size of the outline meets the second size requirement.
In combination with any of the embodiments of the present application, the second geometric relationship includes a third geometric relationship between the at least one positioning graph and the at least one first positioning point of the radiation region;
the processing unit is configured to:
obtaining at least one eighth position of at least one first positioning point of the radiation region in the second image according to the third geometric relation and the at least one fifth position;
and obtaining a sixth position of the radiation area in the second image according to the at least one eighth position.
With reference to any embodiment of the present application, the third geometric relationship includes a third condition, and a second positioning point of the at least one positioning graph, a third positioning point of the at least one positioning graph, and a fourth positioning point are located on a same straight line, where the second positioning point is different from the third positioning point, the second positioning point includes one of the following: corner point, geometric center; the third positioning point includes one of the following: corner point, geometric center; and the fourth positioning point is any one of the at least one first positioning point;
The third condition includes one of the following: a first ratio between a first distance and a second distance, a second ratio between the first distance and a third distance, and a third ratio between the second distance and the third distance; the first distance is the distance between the second positioning point and the third positioning point, the second distance is the distance between the fourth positioning point and the second positioning point, and the third distance is the distance between the fourth positioning point and the third positioning point;
the at least one eighth position comprises a ninth position of the fourth positioning point in the second image, and the processing unit is configured to:
determining a tenth position of the second positioning point in the second image and an eleventh position of the third positioning point in the second image according to the at least one fifth position;
and obtaining the ninth position according to the third geometric relationship, the tenth position and the eleventh position.
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
In combination with any embodiment of the present application, the thermal imaging device and the visible light imaging device belong to a calibration device, and the calibration device uses the visible light imaging device to acquire a third image of a person to be detected and uses the thermal imaging device to acquire a fourth image of the person to be detected;
The processing unit is further configured to perform skin detection processing on the third image, and determine a twelfth position of the skin area of the person to be detected in the third image;
the processing unit is further configured to determine a second pixel region corresponding to the skin region in the fourth image according to the conversion relation and the twelfth position;
the processing unit is further configured to obtain the body temperature of the person to be detected according to the temperature of the second pixel region.
In a third aspect, an electronic device is provided, which includes: a processor and a memory for storing computer program code comprising computer instructions, the electronic device performing the method of the first aspect and any one of its possible implementations as described above, if the processor executes the computer instructions.
In a fourth aspect, another electronic device is provided, including: a processor, transmitting means, input means, output means, and a memory for storing computer program code comprising computer instructions, which, when executed by the processor, cause the electronic device to perform the method of the first aspect and any one of its possible implementations.
In a fifth aspect, there is provided a computer-readable storage medium having stored therein a computer program comprising program instructions which, if executed by a processor, cause the processor to perform the method of the first aspect and any one of its possible implementations.
In a sixth aspect, a computer program product is provided, comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of the first aspect and any one of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
In this application, the calibration apparatus may determine, based on the first geometric relationship, the relationship between the position of the radiation area in the first image and the position of the at least one detection point in the first image, and may further determine the position of the at least one detection point in the thermal image based on the position of the radiation area in the thermal image and the first geometric relationship, thereby improving the accuracy of the position of the at least one detection point in the thermal image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings required to be used in the embodiments or the background art of the present application will be described below.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic diagram of a pixel coordinate system according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of a calibration method according to an embodiment of the present application;
fig. 3 is a schematic diagram of a pixel region corresponding to a radiation region according to an embodiment of the present disclosure;
FIG. 4 is a schematic view of a positioning diagram provided in an embodiment of the present application;
FIG. 5 is a schematic view of another positioning diagram provided in the embodiments of the present application;
FIG. 6 is a schematic view of a radiation area and a positioning pattern according to an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating a nested graph according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another nested configuration provided by embodiments of the present application;
FIG. 9 is a schematic view of a calibration plate according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a calibration apparatus provided in an embodiment of the present application;
Fig. 11 is a schematic hardware structure diagram of a calibration apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one" means one or more, "a plurality" means two or more, and "at least two" means two or more. The term "and/or" describes an association relationship between associated objects and means that three relationships may exist; for example, "A and/or B" may mean: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" may indicate that the associated objects are in an "or" relationship, covering any combination of the items, including a single item or multiple items. For example, at least one of a, b, or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be single or plural. The character "/" may also represent division in a mathematical operation, e.g., a/b means a divided by b, and 6/3 = 2.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
An object point is a point in the real world, and the pixel in an image corresponding to the object point is referred to as an image point. Determining the position of an image point in an image has a wide range of applications. For example, the position of person a in the real world can be represented by point b, so the position of person a in the image can be obtained by determining the position of the image point corresponding to point b in the image. As another example, suppose person a moves from point b to point c, i.e., the moving distance of person a is the distance from point b to point c. By determining the position of the image point corresponding to point b in the image and the position of the image point corresponding to point c in the image, the moving distance of person a in the image can be determined.
In current techniques, the position of a pixel in a visible light image can be determined accurately, but the accuracy of determining the position of a pixel in a thermal image is low. Based on this, the embodiments of the present application provide a technical solution to improve the accuracy of the position of an image point in a thermal image.
In the embodiment of the present application, positions in an image all refer to positions in the pixel coordinate system of the image. The abscissa of the pixel coordinate system represents the column number of a pixel point, and the ordinate of the pixel coordinate system represents the row number of a pixel point. For example, in the image shown in fig. 1, a pixel coordinate system XOY is constructed with the upper left corner of the image as the origin of coordinates O, the direction parallel to the rows of the image as the direction of the X axis, and the direction parallel to the columns of the image as the direction of the Y axis. The units of the abscissa and the ordinate are pixel points. For example, in fig. 1 the pixel point A11 has coordinates (1, 1), the pixel point A23 has coordinates (3, 2), the pixel point A42 has coordinates (2, 4), and the pixel point A34 has coordinates (4, 3).
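As a minimal sketch (a hypothetical helper, not part of the embodiments), the pixel-coordinate convention above — abscissa equals column number, ordinate equals row number — can be expressed as:

```python
def pixel_coords(row, col):
    """Map a pixel point's 1-based (row, col) grid index to its coordinates
    (x, y) in the pixel coordinate system XOY of fig. 1: the abscissa x is
    the column number, and the ordinate y is the row number."""
    return (col, row)

# Pixel point A23 sits in row 2, column 3, so its coordinates are (3, 2).
```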
For convenience, the following description will use [ a, b ] to denote a value range greater than or equal to a and less than or equal to b, use (c, d) to denote a value range greater than c and less than or equal to d, and use [ e, f) to denote a value range greater than or equal to e and less than f.
The execution subject of the embodiments of the present application is a calibration apparatus, where the calibration apparatus may be any electronic device that can execute the technical solutions disclosed in the method embodiments of the present application. Optionally, the calibration apparatus may be one of the following: a mobile phone, a computer, a tablet computer, or a wearable smart device.
It should be understood that the method embodiments of the present application may also be implemented by means of a processor executing computer program code. The embodiments of the present application will be described below with reference to the drawings. Referring to fig. 2, fig. 2 is a schematic flow chart of a calibration method according to an embodiment of the present disclosure.
201. Acquire a first image, where the first image is obtained by photographing a target scene with a thermal imaging device, and the target scene includes a heat-radiating object and at least one detection point.
In an embodiment of the present application, the thermal imaging device is an imaging device that can generate a thermal image. For example, the thermal imaging device is a thermal imaging camera. As another example, the thermal imaging apparatus is a thermal imager.
In the embodiment of the present application, the target scene may be any scene. For example, the target scene is an office. As another example, the target scene is a campus.
In the embodiment of the present application, the heat radiating object is an object that can radiate heat outward. Alternatively, the heat radiating object is a black body. The detection point is any point in the target scene. The target scene comprises at least one detection point, namely at least one detection point is at least one object point in the target scene. The target scene also includes thermal radiating objects.
In one implementation of acquiring the first image, the calibration apparatus receives the first image input by a user through an input component. The input component includes: a keyboard, a mouse, a touch screen, a touch pad, or an audio input device.
In another implementation of acquiring the first image, the calibration apparatus receives the first image sent by a terminal. The terminal may be any one of the following: a mobile phone, a computer, a tablet computer, or a server.
202. Obtain at least one second position based on a first position and a first geometric relationship, where the first position is the position of a radiation area of the heat-radiating object in the first image, the first geometric relationship is a geometric relationship between the radiation area and the at least one detection point, and the at least one second position is the position of the at least one detection point in the first image.
In the embodiment of the present application, the first position is the position of the radiation area of the heat-radiating object in the first image, that is, the pixel region corresponding to the radiation area in the first image can be determined according to the first position. Optionally, the shape of the radiation area is fixed; that is, the temperature within the radiation area in the target scene is affected by the thermal radiation of the heat-radiating object, while the temperature of the area outside the radiation area in the target scene is not affected by that thermal radiation.
For example, in a target scene, a heat radiating object is placed in a container having a fixed shape, and the outer surface material of the container is a heat insulating material. Thus, the radiation area of the heat radiating object is a space inside the container.
For another example, a calibration plate is placed between the heat-radiating object and the thermal imaging device, the calibration plate including a preset region and a non-preset region. The heat radiated by the heat-radiating object can penetrate the preset region but cannot penetrate the non-preset region. In this case, the radiation area of the heat-radiating object is the preset region of the calibration plate.
In an embodiment of the present application, the first geometric relationship is a geometric relationship between a radiation area of the heat-radiating object and the at least one detection point. For example, in the case where the radiation area is rectangular and the number of detection points is 4, the first geometric relationship includes four detection points, which are four corner points of the radiation area, respectively. For another example, in the case where the radiation area is a circle and the number of detection points is 1, the first geometric relationship includes the detection points being the center of the radiation area. For another example, where the radiation area is trapezoidal and the number of detection spots is 2, the first geometric relationship includes one of the two detection spots being located 30 centimeters from the upper base of the radiation area and the other of the two detection spots being located 84 centimeters from the lower base of the radiation area.
In this embodiment of the application, the second position is a position of the detection point in the first image, that is, a position of the image point corresponding to the detection point in the first image. The calibration device can respectively determine the position of each detection point in the first image according to the first position and the first geometric relationship to obtain at least one second position. I.e. the at least one second position is the position of the at least one detection point in the first image.
For example, the at least one detection point includes detection point a. And the calibration device determines the position of the detection point a in the first image according to the first position and the first geometric relation. At this time, the at least one second position includes a position of the detection point a in the first image.
For example, the at least one detection point includes a detection point a and a detection point b. And the calibration device determines the position of the detection point a in the first image and the position of the detection point b in the first image according to the first position and the first geometric relation. At this time, the at least one second position includes a position of the detection point a in the first image and a position of the detection point b in the first image.
In a possible implementation manner, in a case that the detection point is a corner point of the radiation region, the calibration device determines a position of the corner point of the radiation region in the first image according to the first position, and further determines a position of the detection point in the first image.
In another possible implementation manner, in the case that the detection point is the geometric center of the radiation area, the calibration device determines the position of the geometric center of the radiation area in the first image according to the first position, and further determines the position of the detection point in the first image.
In yet another possible implementation, the radiation area includes a corner point a and a corner point b. The first geometric relationship comprises that the detection point, the corner point a and the corner point B are on the same straight line, and the ratio of the distance A to the distance B is 1/2, wherein the distance A is the distance between the corner point a and the detection point, and the distance B is the distance between the corner point a and the corner point B. The calibration device determines the position of the corner point a in the first image and the position of the corner point B in the first image according to the first position, and obtains the distance B according to the position of the corner point a in the first image and the position of the corner point B in the first image. And the calibration device further determines the position of the detection point in the first image according to the distance B, the ratio of the distance A to the distance B being 1/2 and the position of the corner point a in the first image.
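The ratio-based implementation above can be sketched as follows (a hypothetical helper; the only assumption is linear interpolation along the segment from corner a toward corner b):

```python
def point_from_ratio(corner_a, corner_b, ratio):
    """Position of the point on the straight line through corner_a and corner_b
    such that distance(corner_a, point) / distance(corner_a, corner_b) == ratio,
    measured from corner_a toward corner_b."""
    ax, ay = corner_a
    bx, by = corner_b
    return (ax + ratio * (bx - ax), ay + ratio * (by - ay))
```

With a ratio of 1/2, as in the example above, the detection point is simply the midpoint of the two corner points.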
In an embodiment of the application, the calibration device can determine the relationship between the position of the radiation area in the first image and the position of the at least one detection point in the first image according to the first geometric relationship. It can therefore determine the position of the at least one detection point in a thermal image according to the position of the radiation area in the thermal image and the first geometric relationship, thereby improving the accuracy of the position of the at least one detection point in the thermal image.
As an alternative embodiment, before executing step 202, the calibration apparatus further executes the following steps:
1. Obtain a temperature threshold.
2. Determine, from the first image, a first pixel region having a temperature greater than or equal to the temperature threshold.
Since the temperature of the radiation area is raised by the heat radiated by the heat-radiating object, it is higher than the temperature of the non-radiation area; accordingly, in the first image the temperature of the radiation area is higher than that of the non-radiation area. Therefore, whether a pixel in the first image belongs to the radiation area or the non-radiation area can be distinguished according to its temperature.
In the embodiment of the application, the temperature threshold is the basis for distinguishing pixels by temperature: whether a pixel belongs to the radiation area or the non-radiation area can be determined according to the temperature of the pixel. Specifically, a pixel temperature greater than or equal to the temperature threshold indicates a high-temperature pixel, so the pixel can be determined to belong to the radiation area; a pixel temperature less than the temperature threshold indicates a low-temperature pixel, so the pixel can be determined to belong to the non-radiation area.
The calibration device distinguishes pixels in the first image based on the temperature threshold, and takes a pixel area including pixels with high temperature as a first pixel area.
3. Obtain the first position according to the position of the first pixel region in the first image.
Since the temperature of the radiation area is higher than that of the non-radiation area in the target scene, the temperature of the pixel region corresponding to the radiation area is higher than that of the pixel region corresponding to the non-radiation area in the first image. Therefore, the calibration device can determine the position of the radiation area in the first image (i.e. the position of the pixel region corresponding to the radiation area in the first image) according to the position of the first pixel region in the first image, and thus obtain the first position.
In one possible implementation, the calibration means takes the position of the first pixel region in the first image as the first position.
In another possible implementation manner, the calibration device determines the position of the contour of the first pixel region in the first image according to the position of the first pixel region in the first image. The calibration device shifts the contour of the first pixel region by 1 pixel in the positive direction of the horizontal axis of the pixel coordinate system of the first image, and obtains a first shifted contour. The calibration means takes the position of the pixel region surrounded by the first moved contour in the first image as the first position.
In another possible implementation manner, the calibration device determines the position of the contour of the first pixel region in the first image according to the position of the first pixel region in the first image. The calibration device moves the contour of the first pixel region by 1 pixel in the positive direction of the longitudinal axis of the pixel coordinate system of the first image to obtain a second moved contour. The calibration means takes the position of the pixel region surrounded by the second moved contour in the first image as the first position.
The calibration device determines the pixel area with high temperature in the first image by executing the steps 1 to 3 according to the temperature threshold, and further determines the position of the radiation area in the first image according to the position of the pixel area with high temperature in the first image.
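Steps 1 to 3 above can be sketched with NumPy (a minimal illustration assuming the first image is available as a per-pixel temperature map; the function and variable names are hypothetical):

```python
import numpy as np

def first_pixel_region(temperature_map, temperature_threshold):
    """Step 2: mark every pixel whose temperature is greater than or equal to
    the temperature threshold as belonging to the first pixel region."""
    return temperature_map >= temperature_threshold

def region_position(mask):
    """Step 3 (simplest variant): take the position of the first pixel region
    itself as the first position, here represented as the set of (x, y)
    pixel coordinates of the region, with 0-based indexing."""
    rows, cols = np.nonzero(mask)
    return list(zip(cols.tolist(), rows.tolist()))
```

In the variants that shift the contour by one pixel along an axis, the resulting positions would simply be offset by 1 in x or in y.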
As an alternative embodiment, the calibration means performs step 202 in the case where the above-mentioned radiation region satisfies the first condition.
In an embodiment of the present application, the first condition includes at least one of: the area of the radiation area in the first image is within a first area range, and the size of the radiation area in the first image meets a first size requirement.
During acquisition of the first image by the thermal imaging device, there may be factors that distort the radiation area in the first image. If the radiation area in the first image is distorted, determining the position of a detection point in the first image according to the first geometric relationship and the first position tends to yield a position of low accuracy.
For example, in a target scene, the radiation area is square. Due to distortion of the thermal imaging device, the shape of the pixel region in the first image corresponding to the radiation region is an irregular shape as shown in fig. 3.
Therefore, to improve the accuracy of the position of the detection point in the first image, the position of the detection point in the first image should be determined from the first geometric relationship and the first position only when the radiation area in the first image is not distorted.
Consider that if the radiation area is distorted, its shape changes, and hence its area in the image changes. Therefore, whether the radiation area is distorted can be determined based on the area of the radiation area in the first image. Specifically, in the target scene the shape of the radiation area is fixed, i.e. its real-world area is constant, so the expected area of the radiation area in the first image (hereinafter referred to as the first reference area) can be determined from that real-world area. A reasonable range for the area of the radiation area in a thermal image (i.e. the first area range described above) can then be determined based on the first reference area, and whether the radiation area is distorted can be judged based on the first area range and the area of the radiation area in the first image.
Specifically, in the case where the area of the radiation region in the first image is within the first area range, it is determined that the radiation region is not distorted, and further, the position of the detection point in the first image can be determined from the first geometric relationship and the first position. In the case where the area of the radiation region in the first image is not within the first area range, it is determined that the radiation region is distorted, and the step of determining the position of the detection point in the first image from the first geometric relationship and the first position is not performed.
Likewise, if the radiation area is distorted, its shape changes, and hence its size changes. Therefore, whether the radiation area is distorted can be determined based on the size of the radiation area in the first image. Specifically, in the target scene the shape of the radiation area is fixed, i.e. its real-world size is constant, so the expected size of the radiation area in the first image (hereinafter referred to as the first reference size) can be determined from that real-world size. From the first reference size, a reasonable range for the size of the radiation area in a thermal image (i.e. the first size requirement described above) can be determined.
For example, the radiation area is rectangular, and the aspect ratio of the radiation area in a thermal image is determined from the first reference size to lie in the range [1.13, 1.28]. The first size requirement then includes that the aspect ratio of the radiation area should be within [1.13, 1.28].
After determining the first size requirement, it is further determined whether the radiation region is distorted based on the first size requirement and the size of the radiation region in the first image. Specifically, in the case where the size of the radiation area in the first image satisfies the first size requirement, it is determined that the radiation area is not distorted, and the position of the detection point in the first image can be determined from the first geometric relationship and the first position. And under the condition that the size of the radiation area in the first image does not meet the first size requirement, determining that the radiation area is distorted, and further not performing the step of determining the position of the detection point in the first image according to the first geometric relation and the first position.
Optionally, the calibration device may perform the step of determining the position of the at least one detection point in the first image according to the first position and the first geometric relationship, when the area of the radiation region in the first image is within the first area range and the size of the radiation region in the first image meets the first size requirement.
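The first condition can be checked as follows (a sketch only; the concrete numeric bounds, and the use of aspect ratio as the first size requirement, are illustrative assumptions taken from the example above):

```python
def satisfies_first_condition(area, width, height,
                              first_area_range=(90.0, 110.0),
                              aspect_ratio_range=(1.13, 1.28)):
    """Return True only if the radiation area's area in the first image lies
    within the first area range AND its aspect ratio (width / height) meets
    the first size requirement; only then is step 202 performed."""
    area_ok = first_area_range[0] <= area <= first_area_range[1]
    size_ok = aspect_ratio_range[0] <= width / height <= aspect_ratio_range[1]
    return area_ok and size_ok
```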
As an alternative embodiment, the number of detection points is greater than or equal to 4. The calibration means performs the following steps in the process of performing step 202:
4. Obtain at least four second positions according to the first position and the first geometric relationship.
In this step, since the number of the detection points is greater than or equal to 4, and the calibration device can respectively determine the position of each detection point in the first image according to the first position and the first geometric relationship, the calibration device obtains at least four second positions by executing step 4. I.e. the at least four second positions are the positions of the at least four detection points in the first image.
For example, the at least four detection points include a detection point a, a detection point b, a detection point c, and a detection point d. And the calibration device determines the position of the detection point a in the first image, the position of the detection point b in the first image, the position of the detection point c in the first image and the position of the detection point d in the first image according to the first position and the first geometric relation. At this time, the at least four second positions include a position of the detection point a in the first image, a position of the detection point b in the first image, a position of the detection point c in the first image, and a position of the detection point d in the first image.
After completing step 4, the calibration device further executes the following steps:
5. and acquiring a second image, wherein the second image is obtained by shooting the target scene by visible light imaging equipment.
In the embodiment of the present application, the image acquired by the visible light imaging device is a visible light image, that is, the second image is a visible light image. For example, the visible light imaging device is an RGB camera, and the second image is an RGB image.
6. Determine the positions of the at least four detection points in the second image to obtain at least four third positions.
In this step, the third position is the position of the detection point in the second image, and the calibration device determines the position of each detection point in the second image respectively to obtain at least four third positions.
For example, the at least four detection points include a detection point a, a detection point b, a detection point c, and a detection point d. The calibration means determines the position of the detection point a in the second image, the position of the detection point b in the second image, the position of the detection point c in the second image, and the position of the detection point d in the second image. At this time, the at least four third positions include a position of the detection point a in the second image, a position of the detection point b in the second image, a position of the detection point c in the second image, and a position of the detection point d in the second image.
In one possible implementation, the detection points are corner points. And the calibration device determines the position of the detection point in the second image by detecting the corner point of the second image.
In another possible implementation, the detection point is a circle center. And the calibration device determines the position of the detection point in the second image by performing circle center detection on the second image.
In yet another possible implementation manner, the at least four detection points include a circle center and a corner point. And the calibration device determines the positions of at least four detection points in the second image by carrying out corner detection and circle center detection on the second image. For example, the at least four detection points include a detection point a, a detection point b, a detection point c, and a detection point d. And the calibration device determines the position of the detection point a in the second image and the position of the detection point c in the second image by performing corner detection on the second image, and determines the position of the detection point b in the second image and the position of the detection point d in the second image by performing circle center detection on the second image.
7. Obtain a conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image according to the at least four second positions and the at least four third positions.
In the embodiment of the present application, based on the conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image, the coordinates in the pixel coordinate system of the first image may be converted into the coordinates in the pixel coordinate system of the second image, or the coordinates in the pixel coordinate system of the second image may be converted into the coordinates in the pixel coordinate system of the first image based on the conversion relationship.
The calibration device can determine the positions of at least four pairs of matching point pairs according to at least four second positions and at least four third positions. In the implementation of the present application, the matching point pair includes two pixels, one of the pixels belongs to the first image, the other pixel belongs to the second image, and both the pixels correspond to the same detection point. The positions of a pair of matching point pairs include the positions of the two pixels in the matching point pair in the image.
For example, the at least one detection point includes a detection point a, a pixel corresponding to the detection point a in the first image is a pixel b, and a pixel corresponding to the detection point a in the second image is a pixel c. At this time, the pixel b and the pixel c are two pixels in a pair of matching point pairs whose positions include the position of the pixel b in the first image and the position of the pixel c in the second image.
The calibration device can construct a conversion equation from a pair of matching points and the conversion relationship. For example, a matching point pair includes a pixel a in the first image and a pixel b in the second image. The position of pixel a in the first image is p1, and the position of pixel b in the second image is p2. If the conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image is p3, then p1 × p3 = p2, or p2 × p3 = p1.
The calibration device can construct at least four conversion equations according to the at least four pairs of matching point pairs and the conversion relation, further can simultaneously establish the at least four conversion equations to obtain a conversion equation set, and obtains the conversion relation by solving the conversion equation set.
By performing steps 4 to 7, the calibration device can obtain the conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image, i.e. the conversion relationship between the pixel coordinate system of the visible light image and the pixel coordinate system of the thermal image. Since the technical solutions provided above improve the accuracy of the positions of the at least four detection points in the thermal image, and the conversion relationship is obtained based on those positions, the conversion relationship between the pixel coordinate system of the visible light image and the pixel coordinate system of the thermal image obtained according to the embodiments of the present application is likewise of improved accuracy.
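The system of conversion equations in steps 4 to 7 can be solved with a direct linear transform, modelling the conversion relationship as a 3×3 homography (a common choice assumed here rather than stated in the source; the function and variable names are hypothetical):

```python
import numpy as np

def conversion_relationship(thermal_points, visible_points):
    """Estimate the 3x3 matrix H mapping pixel coordinates of the first
    (thermal) image to the second (visible light) image from at least four
    matching point pairs; each pair contributes two conversion equations."""
    A = []
    for (x, y), (u, v) in zip(thermal_points, visible_points):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The stacked equations A @ h = 0 are solved (up to scale) by the right
    # singular vector of A associated with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]
```

In practice a library routine such as OpenCV's `cv2.findHomography` performs the same estimation with additional outlier handling.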
As an optional implementation manner, the at least four detection points are first positioning points of the radiation area, where each first positioning point is one of the following: a corner point or the geometric center of the radiation area.
In this embodiment, any one of the at least four detection points may be one of a corner point and a geometric center. For example, the at least four detection points include a detection point a, a detection point b, a detection point c, and a detection point d, where the detection point a, the detection point b, and the detection point c are all corner points of the radiation area (it should be understood that the detection point a, the detection point b, and the detection point c are different corner points), and the detection point d is a geometric center of the radiation area.
For another example, the at least four detecting points include a detecting point a, a detecting point b, a detecting point c, and a detecting point d, where the detecting point a, the detecting point b, the detecting point c, and the detecting point d are all corner points of the radiation area (it should be understood that the detecting point a, the detecting point b, the detecting point c, and the detecting point d are different corner points).
The calibration means performs the following steps in the process of performing step 4:
8. Determine, according to the first position, the positions in the first image of at least four first positioning points corresponding to the at least four detection points, to obtain at least four fourth positions.
In this embodiment of the application, the fourth position is a position of the first positioning point corresponding to the detection point in the first image, and the at least four fourth positions include positions of the first positioning point corresponding to the at least four detection points in the first image.
For example, the radiation area is a rectangle, and four corner points of the rectangle are a corner point a, a corner point b, a corner point c, and a corner point d, respectively. The at least four detection points include a detection point A, a detection point B, a detection point C and a detection point D. If the detection point a is the corner point a, the detection point B is the corner point B, the detection point C is the corner point C, and the detection point D is the corner point D, the at least four fourth positions include the position of the corner point a in the first image, the position of the corner point B in the first image, the position of the corner point C in the first image, and the position of the corner point D in the first image.
For another example, the radiation region is a hexagon, and six corner points of the hexagon are respectively a corner point a, a corner point b, a corner point c, a corner point d, a corner point e and a corner point f. The at least four detection points include a detection point A, a detection point B, a detection point C and a detection point D. If the detection point a is the corner point a, the detection point B is the corner point B, the detection point C is the corner point C, and the detection point D is the corner point f, the at least four fourth positions include the position of the corner point a in the first image, the position of the corner point B in the first image, the position of the corner point C in the first image, and the position of the corner point f in the first image.
The calibration device can determine the outline of the radiation region in the first image according to the first position, and further can determine the position of a corner point corresponding to the detection point in the radiation region in the first image according to the outline of the radiation region in the first image, or determine the position of a geometric center corresponding to the detection point in the radiation region in the first image.
For example, if the radiation area is rectangular, the at least four first localization points corresponding to the at least four detection points include four corner points of the radiation area. The calibration device can determine the positions of the four corner points of the radiation region in the first image according to the first position. At this time, the at least four fourth locations include locations of four corner points of the radiation area in the first image.
For another example, if the radiation region is rectangular, the at least four first positioning points corresponding to the at least four detection points include the three corner points of the radiation region other than a maximum corner point, and the geometric center of the radiation region, where the maximum corner point is the corner point having both the maximum abscissa and the maximum ordinate among the four corner points. The calibration device can determine, according to the first position, the positions in the first image of the three corner points of the radiation area other than the maximum corner point, and the position of the geometric center of the radiation area in the first image. At this time, the at least four fourth positions include the positions in the first image of the three corner points other than the maximum corner point and the position of the geometric center of the radiation area in the first image.
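As an illustrative sketch (the function name, coordinate layout and values below are assumptions for illustration, not part of the embodiment), selecting the three corners other than the maximum corner point together with the geometric center can be expressed as:

```python
# Hypothetical sketch: given the four corner points of a rectangular
# radiation area in pixel coordinates, drop the "maximum corner point"
# (largest abscissa and ordinate) and substitute the geometric center,
# yielding four first positioning points.

def pick_localization_points(corners):
    """corners: list of four (x, y) tuples of a rectangular region."""
    # The maximum corner point has both the largest x and the largest y.
    max_corner = max(corners, key=lambda p: (p[0], p[1]))
    remaining = [p for p in corners if p != max_corner]
    # Geometric center of the rectangle = mean of its corner coordinates.
    cx = sum(p[0] for p in corners) / 4.0
    cy = sum(p[1] for p in corners) / 4.0
    return remaining + [(cx, cy)]

corners = [(0, 0), (100, 0), (0, 50), (100, 50)]
points = pick_localization_points(corners)
# (100, 50) is dropped; the geometric center (50.0, 25.0) is appended
```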
9. Obtaining at least four second positions according to the at least four fourth positions.
In a possible implementation manner, the calibration device uses the position of the first positioning point corresponding to the detection point in the first image as the position of the detection point in the first image, i.e., uses the fourth position as the second position.
For example, the at least four detection points include a detection point a, wherein the detection point a is a corner point a of the radiation area. At this time, the calibration device may use the position of the corner point a in the first image as the position of the detection point a in the first image.
In another possible implementation, the calibration device determines the abscissa and the ordinate (hereinafter referred to as reference abscissa and reference ordinate) of the first localization point corresponding to the detection point in the first image. And taking the sum of the reference abscissa and the first constant as the abscissa of the detection point in the first image, and taking the reference ordinate as the ordinate of the detection point in the first image to obtain the position of the detection point in the first image, wherein the first constant is a rational number.
For example, the detection point a is a corner point a of the radiation area. The calibration means determines the position of the corner point a in the first image as (x 1, y 1) depending on the first position. If the first constant is c1, the calibration apparatus determines that the position of the detection point a in the first image is (x 1+ c1, y 1).
In yet another possible implementation, the calibration device determines an abscissa and an ordinate (hereinafter referred to as a reference abscissa and a reference ordinate) of the first localization point corresponding to the detection point in the first image. And taking the sum of the reference ordinate and a second constant as the ordinate of the detection point in the first image, and taking the reference abscissa as the abscissa of the detection point in the first image to obtain the position of the detection point in the first image, wherein the second constant is a rational number.
For example, the detection point b is the geometric center b of the radiation area. The calibration means determines the position of the geometric center b of the irradiated region in the first image as (x 2, y 2) from the first position. If the second constant is c2, the calibration means determines that the position of the detection point b in the first image is (x 2, y2+ c 2).
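The two offset variants described above can be sketched as follows (the function names and sample values are assumptions for illustration only):

```python
# Sketch of the two offset rules above: shift the reference abscissa by a
# first constant c1, or the reference ordinate by a second constant c2, to
# derive the detection-point position from the first positioning point.

def offset_abscissa(anchor, c1):
    x, y = anchor
    return (x + c1, y)      # sum of reference abscissa and first constant

def offset_ordinate(anchor, c2):
    x, y = anchor
    return (x, y + c2)      # sum of reference ordinate and second constant

pos_a = offset_abscissa((120.0, 80.0), 5.0)    # → (125.0, 80.0)
pos_b = offset_ordinate((64.0, 32.0), -2.5)    # → (64.0, 29.5)
```

Both constants are rational numbers, so negative offsets (as in `pos_b`) are equally valid.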
In this embodiment, when the detection point is a first positioning point, the calibration device determines the position of the detection point in the first image by executing step 8 and step 9, thereby reducing the amount of data processing and increasing the processing speed.
As an optional implementation, the target scene further comprises at least one positioning graph. In the embodiment of the present application, the positioning pattern may be any pattern. For example, the positioning pattern is rectangular. For another example, the positioning pattern is circular. For another example, the positioning pattern is the pattern shown in fig. 4. For another example, the positioning pattern is the pattern shown in fig. 5.
It will be appreciated that, in the case where the number of positioning patterns is greater than 1, different positioning patterns may be the same pattern or may be different patterns. For example, the at least one positioning pattern includes a pattern a, a pattern b, and a pattern c, wherein the pattern a and the pattern b are rectangles, and the pattern c is the pattern shown in fig. 5.
In this embodiment, the calibration means performs the following steps in performing step 6:
10. Determining the position of the at least one positioning graph in the second image to obtain at least one fifth position.
In the embodiment of the present application, the fifth position is a position of the positioning pattern in the second image, that is, a pixel area covered by the positioning pattern in the second image can be determined according to the fifth position.
The calibration device determines the position of each positioning pattern in the second image to obtain a fifth position; determining the positions of the at least one positioning pattern in this way yields at least one fifth position. For example, if the at least one positioning graphic comprises graphic a and graphic b, then the at least one fifth location comprises the location of graphic a in the second image and the location of graphic b in the second image.
In a possible implementation manner, the calibration device determines the position of the at least one positioning pattern in the second image by performing positioning pattern detection processing on the second image, so as to obtain at least one fifth position.
Optionally, the positioning pattern detection processing may be implemented by a positioning pattern detection model, and the positioning pattern detection model is obtained by training a deep learning model by using a plurality of images with label information as training data. The annotation information of the images in the training data includes: the type of graphics in the image and the location of the graphics.
In another possible implementation manner, the calibration device processes the second image by using a positioning pattern detection algorithm, determines the position of the at least one positioning pattern in the second image, and obtains at least one fifth position. Wherein the positioning pattern detection algorithm comprises at least one of: a rectangle detection algorithm, a circle detection algorithm, a diamond detection algorithm, and a hexagon detection algorithm.
For example, the at least one positioning pattern includes a circle and a rectangle. And the calibration device processes the second image by using a rectangle detection algorithm, and determines a rectangle in the second image to obtain a target rectangle. And the calibration device processes the second image by using a circle detection algorithm, and determines the circle in the second image to obtain a target circle. At this time, the at least one fifth position includes a position of the target rectangle in the second image and a position of the target circle in the second image.
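As a much-simplified stand-in (an assumption, not the patent's detection algorithms, which operate on image pixels, e.g. via Hough transforms), the decision step of rectangle versus circle detection can be illustrated on a polygonal contour approximation:

```python
import math

# Hypothetical sketch: classify a closed polygonal contour approximation by
# its vertex count and geometry -- four ~90-degree angles suggest a
# rectangle; many vertices roughly equidistant from the centroid suggest a
# circle. Real rectangle/circle detectors are far more robust than this.

def classify_contour(vertices):
    """vertices: list of (x, y) points approximating a closed contour."""
    n = len(vertices)
    if n == 4:
        def angle(a, b, c):
            v1 = (a[0] - b[0], a[1] - b[1])
            v2 = (c[0] - b[0], c[1] - b[1])
            dot = v1[0] * v2[0] + v1[1] * v2[1]
            return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))
        angles = [angle(vertices[i - 1], vertices[i], vertices[(i + 1) % 4])
                  for i in range(4)]
        if all(abs(a - 90) < 5 for a in angles):
            return "rectangle"
    if n >= 8:
        cx = sum(p[0] for p in vertices) / n
        cy = sum(p[1] for p in vertices) / n
        radii = [math.hypot(p[0] - cx, p[1] - cy) for p in vertices]
        if max(radii) - min(radii) < 0.1 * max(radii):
            return "circle"
    return "other"

shape = classify_contour([(0, 0), (4, 0), (4, 2), (0, 2)])   # "rectangle"
```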
11. Obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship and the at least one fifth position, wherein the second geometric relationship is the geometric relationship between the at least one positioning pattern and the irradiation region.
In an embodiment of the application, the second geometric relationship is a geometric relationship between the at least one positioning pattern and the irradiation region. For example, the irradiation area is circular and the at least one positioning pattern comprises a square. The second geometric relationship includes the radiation area being the inscribed circle of the square positioning pattern.
For another example, the irradiation area is circular, and the at least one positioning pattern includes a graph a and a graph b. The second geometric relationship includes that the center of the radiation area, the geometric center of the graph a and the geometric center of the graph b lie on the same straight line, and the ratio of a first reference distance to a second reference distance is 1/2, wherein the first reference distance is the distance between the center of the radiation area and the geometric center of the graph a, and the second reference distance is the distance between the center of the radiation area and the geometric center of the graph b.
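The collinearity-plus-ratio relationship in the example above can be sketched numerically (assuming, in addition to the stated relationship, that the center lies between the two pattern centers, so it divides the segment internally):

```python
# Hedged sketch: the radiation-area center lies on the line through the
# geometric centers of patterns a and b with |center-a| : |center-b| = 1:2.
# Assuming internal division, the center sits at parameter t = 1/3 from a.

def center_from_two_patterns(a, b, ratio=0.5):
    """a, b: (x, y) geometric centers of the two positioning patterns.
    ratio = |center-a| / |center-b|; internal division assumed."""
    t = ratio / (1.0 + ratio)            # 1/3 when ratio = 1/2
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

c = center_from_two_patterns((0.0, 0.0), (9.0, 3.0))
# internal division at t = 1/3 gives (3.0, 1.0), which is twice as far
# from (9, 3) as it is from (0, 0)
```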
For another example, the radiation area is rectangular, and the at least one positioning pattern includes a pattern a, a pattern B, a pattern C, and a pattern D, where the radiation area includes a corner point a, a corner point B, a corner point C, and a corner point D.
The second geometric relationship comprises that the corner point A, the corner point C, the geometric center of the graph a and the geometric center of the graph c lie on the same straight line; that the corner point B, the corner point D, the geometric center of the graph b and the geometric center of the graph d lie on the same straight line; that the ratio of a third reference distance to a fourth reference distance is 1/2; and that the ratio of a fifth reference distance to a sixth reference distance is 1/3. The third reference distance is the distance between the corner point A and the geometric center of the graph a, the fourth reference distance is the distance between the corner point C and the geometric center of the graph c, the fifth reference distance is the distance between the corner point B and the geometric center of the graph b, and the sixth reference distance is the distance between the corner point D and the geometric center of the graph d.
In one possible embodiment, the calibration device determines the position of the irradiated region in the second image as a function of the second geometric relationship and the at least one fifth position. And determining the positions of at least four detection points in the second image according to the positions of the radiation areas in the second image and the first geometric relation to obtain at least four third positions.
Since the position of a positioning pattern can be detected from the second image (see the two implementation manners in step 10), the calibration device can determine, by performing step 10 and step 11, the positions of the at least four detection points in the second image according to the first geometric relationship, the second geometric relationship and the position of the at least one positioning pattern in the second image. This is useful when determining the position of the radiation region by performing pattern detection (such as rectangle detection or circle detection) directly on the second image would incur a large error, and it thereby improves the accuracy of the position of the radiation region in the second image. Optionally, the greater the number of positioning patterns, the higher the accuracy of the position of the irradiated region in the second image.
Taking the at least four detection points in the first image as boundary points determines a pixel region (hereinafter referred to as a first reference region). The closer the first reference region is to the full pixel area of the first image, the more accurate the conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image obtained according to the positions of the at least four detection points in the first image and in the second image. Since the first reference region belongs to the first image, its area is smaller than or equal to the area of the first image. Therefore, the larger the area of the first reference region, the more accurate the obtained conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image.
Similarly, taking at least four detection points in the second image as boundary points, the larger the area of the determined pixel region (hereinafter referred to as a second reference region) is, the more accurate the obtained conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image is according to the positions of the at least four detection points in the first image and the positions of the at least four detection points in the second image.
For example, if the at least four detection points include all image points on the boundary of the first image, the first reference region is the first image. The at least four detection points include all image points on the boundary of the second image, and then the second reference area is the second image. At this time, according to the positions of the at least four detection points in the first image and the positions of the at least four detection points in the second image, the obtained conversion relation between the pixel coordinate system of the first image and the pixel coordinate system of the second image is the most accurate.
When the at least four detection points are boundary points of the radiation area, the first reference area is a pixel area corresponding to the radiation area in the first image, and the second reference area is a pixel area corresponding to the radiation area in the second image. Therefore, the larger the area of the radiation region is, the larger the area of the first reference region and the area of the second reference region are, so that the more accurate the obtained conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image is according to the positions of the at least four detection points in the first image and the positions of the at least four detection points in the second image.
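The conversion relationship between the two pixel coordinate systems can be illustrated as a homography estimated from four point correspondences. This is an illustrative sketch under the common projective-mapping assumption, not the patent's specific algorithm; all names and sample coordinates are hypothetical:

```python
# Sketch: estimate the 3x3 homography mapping pixel coordinates of the first
# image to those of the second image from four detection-point
# correspondences, by solving the 8-unknown linear system (h22 fixed to 1)
# with plain Gauss-Jordan elimination.

def solve(A, b):
    """Solve the square linear system A x = b with partial pivoting."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def homography(src, dst):
    """src, dst: four (x, y) -> (u, v) correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, p):
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

src = [(0, 0), (100, 0), (100, 50), (0, 50)]   # detection points, image 1
dst = [(10, 20), (110, 25), (112, 78), (8, 72)]  # same points, image 2
H = homography(src, dst)
mapped = apply_h(H, (100, 50))   # reproduces the corresponding dst point
```

With exactly four correspondences the fit is exact, which is why spreading the detection points over a large reference region (as argued above) matters: it conditions the mapping over the whole image rather than a small patch.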
Because the target scene comprises the radiation area and the at least one positioning graph, and the size of the target scene is fixed, the area of the radiation area can be increased by reducing the area of the at least one positioning graph, and the accuracy of the conversion relation between the pixel coordinate system of the first image and the pixel coordinate system of the second image can be improved.
Therefore, as an optional implementation manner, the area of the radiation region in the embodiment of the present application is larger than the area of the positioning pattern (it should be understood that the area of the radiation region is larger than the area of the positioning pattern, which means that the area of the radiation region is larger than the area of any one positioning pattern in at least one positioning pattern), so that by reducing the area of the positioning pattern, the area of the radiation region can be increased, and further, the accuracy of the conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image is improved.
For example, in the image shown in fig. 6, the rectangular area ABCD is a radiation area, and 61, 62, 63, 64 are four positioning patterns, wherein the area of the radiation area is larger than the area of the positioning pattern 61, the area of the radiation area is larger than the area of the positioning pattern 62, the area of the radiation area is larger than the area of the positioning pattern 63, and the area of the radiation area is larger than the area of the positioning pattern 64.
In an alternative embodiment, the first geometric relationship includes that the at least four detection points are all first positioning points of the radiation area, where a first positioning point is one of the following: a corner point, the geometric center.
In this embodiment, the at least four detection points are all first positioning points, where a first positioning point is either a corner point of the radiation region or the geometric center of the radiation region. For example, the at least four detection points include a detection point a, a detection point b, a detection point c, and a detection point d, where the detection point a, the detection point b, and the detection point c are all corner points of the radiation area (it should be understood that the detection point a, the detection point b, and the detection point c are different corner points), and the detection point d is the geometric center of the radiation area.
For another example, the at least four detecting points include a detecting point a, a detecting point b, a detecting point c, and a detecting point d, where the detecting point a, the detecting point b, the detecting point c, and the detecting point d are all corner points of the radiation area (it should be understood that the detecting point a, the detecting point b, the detecting point c, and the detecting point d are different corner points).
The second geometric relationship comprises a third geometric relationship, wherein the third geometric relationship is a geometric relationship between the at least one positioning pattern and at least one first positioning point of the irradiation region. For example, the at least one positioning pattern comprises a circle a and a circle b, and the at least one first positioning point of the irradiation region comprises the geometric center of the irradiation region. The third geometric relationship includes the geometric center of the irradiation region being an intersection of circle a and circle b.
For another example, the at least one positioning pattern comprises concentric circles a and circles b, and the at least one first positioning point of the irradiation region comprises a corner c of the irradiation region and a corner d of the irradiation region. The third geometric relation comprises that an angular point c of the radiation area coincides with the center of the concentric circle a, and an angular point d of the radiation area coincides with the center of the circle b.
For another example, the at least one positioning pattern comprises concentric circles a and concentric circles b, and the at least one first positioning point of the irradiation region comprises a corner point c of the irradiation region and a geometric center d of the irradiation region. The third geometric relation comprises that the angular point c of the radiation area coincides with the center of the concentric circle a, and the geometric center d of the radiation area coincides with the center of the concentric circle b.
In this embodiment, the calibration means performs the following steps in the process of performing step 11:
12. Determining the positions of at least four first positioning points corresponding to the at least four detection points in the second image according to the third geometric relationship and the at least one fifth position, to obtain the at least four third positions.
In one possible implementation, the at least one positioning pattern includes a circle a and a circle b, and the third geometric relationship includes that the geometric center of the irradiation region is an intersection of the circle a and the circle b. The calibration device determines the position of the intersection of circle a and circle b in the second image and thus the position of the geometric center of the radiation region in the second image, depending on the at least one fifth position.
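The circle-circle intersection used in this implementation can be sketched with standard plane geometry (names and sample values are illustrative; two circles generally intersect in two points, and which one is the geometric center would have to be disambiguated by the scene layout — here both are returned):

```python
import math

# Sketch: intersection points of two circles detected in the second image,
# one of which is taken as the geometric center of the radiation area.

def circle_intersections(c0, r0, c1, r1):
    x0, y0 = c0
    x1, y1 = c1
    d = math.hypot(x1 - x0, y1 - y0)
    if d > r0 + r1 or d < abs(r0 - r1) or d == 0:
        return []                                 # no intersection
    a = (r0 * r0 - r1 * r1 + d * d) / (2 * d)     # distance from c0 to chord
    h = math.sqrt(max(r0 * r0 - a * a, 0.0))      # half-chord length
    xm = x0 + a * (x1 - x0) / d                   # chord midpoint
    ym = y0 + a * (y1 - y0) / d
    return [(xm + h * (y1 - y0) / d, ym - h * (x1 - x0) / d),
            (xm - h * (y1 - y0) / d, ym + h * (x1 - x0) / d)]

pts = circle_intersections((0, 0), 5, (8, 0), 5)
# circles of radius 5 centered at (0,0) and (8,0) meet at (4, ±3)
```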
In another possible implementation manner, the at least one positioning pattern includes concentric circles a and circles b, the third geometric relationship includes that the corner point c of the radiation area coincides with the center of the concentric circle a, and the corner point d of the radiation area coincides with the center of the circle b. The calibration device can determine the position of the center of the concentric circle a in the second image according to at least one fifth position, and further determine the position of the corner point c of the radiation area in the second image. The calibration device can determine the position of the center of the circle b in the second image according to the at least one fifth position, and further determine the position of the corner point d of the radiation area in the second image.
In yet another possible implementation manner, the at least one positioning pattern includes concentric circles a and concentric circles b, the third geometric relationship includes that the corner point c of the radiation region coincides with the center of the concentric circle a, and the geometric center d of the radiation region coincides with the center of the concentric circle b. The calibration device can determine the position of the center of the concentric circle a in the second image according to at least one fifth position, and further determine the position of the corner point c of the radiation area in the second image. The calibration device can determine the position of the corner point d of the radiation area in the second image according to at least one fifth position and the position of the center of the concentric circle b in the second image.
As an alternative embodiment, the calibration means performs the following steps in the process of performing step 11:
13. Obtaining a sixth position of the radiation area in the second image according to the second geometric relationship and the at least one fifth position.
In the embodiment of the present application, the position of the radiation region in the second image is a sixth position. The calibration device may determine the sixth position based on the position of the at least one positioning graphic in the second image and the second geometric relationship, i.e. the sixth position may be determined based on the at least one fifth position and the second geometric relationship.
14. Obtaining the at least four third positions according to the first geometric relationship and the sixth position.
It is to be understood that in step 202, the number of detection points is greater than or equal to 1, the first geometrical relationship being a geometrical relationship between the radiation area and the at least one detection point. In this step, the number of the detection points is greater than or equal to 4, and the first geometric relationship is a geometric relationship between the radiation area and at least four detection points.
The calibration device can respectively determine the position of each detection point in the second image according to the first geometric relationship and the sixth position to obtain at least four third positions.
The implementation manner for determining the position of the detection point in the second image in this step may refer to the implementation manner for determining the position of the detection point in the first image in step 202, and will not be described herein again. Specifically, the sixth position in this step corresponds to the first position in step 202, and the third position in this step corresponds to the second position in step 202.
As an alternative embodiment, at least one positioning graphic comprises a nested graphic. In the embodiment of the application, the nested graph comprises at least two outlines, and one root outline exists in the nested graph. The root contour includes contours where there is no parent contour and there is a child contour. A sub-profile of a profile refers to a profile that lies within the range encompassed by the profile.
For example, in the nested figure shown in FIG. 7, the sub-profile of profile 71 includes profile 72 and profile 73, the sub-profile of profile 72 includes profile 73, and there is no sub-profile of profile 73, when the root profile is profile 71. For another example, in the nested graph shown in fig. 8, the sub-profile of the profile 81 includes a profile 82 and a profile 83, and neither the profile 82 nor the profile 83 has a sub-profile, and in this case, the root profile is the profile 81. As another example, the nested figures are concentric circles.
In this embodiment, the calibration means performs the following steps in the process of performing step 10:
15. Performing edge detection processing on the second image to obtain at least one first contour in the second image.
In an embodiment of the present application, an edge detection process may be used to detect the position of a contour in an image. Optionally, the edge detection process may be implemented by one of the following methods: canny edge detection algorithm, sobel (sobel) operator, roberts edge detection operator, laplacian of gaussian (LOG) edge detection operator.
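To illustrate one of the listed operators, the Sobel gradient magnitude can be computed at a single pixel of a tiny grayscale image (represented here, as an assumption, by nested lists; real implementations convolve the whole image):

```python
# Sketch of the Sobel operator: horizontal and vertical gradient kernels
# whose combined magnitude responds strongly on intensity edges.

KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel

def sobel_magnitude(img, y, x):
    """Gradient magnitude at pixel (y, x); needs a 1-pixel border."""
    gx = sum(KX[j][i] * img[y + j - 1][x + i - 1]
             for j in range(3) for i in range(3))
    gy = sum(KY[j][i] * img[y + j - 1][x + i - 1]
             for j in range(3) for i in range(3))
    return (gx * gx + gy * gy) ** 0.5

# A vertical step edge between columns 2 and 3:
img = [[0, 0, 0, 9, 9, 9]] * 3
edge = sobel_magnitude(img, 1, 3)   # large response on the edge
flat = sobel_magnitude(img, 1, 1)   # zero response in the flat region
```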
16. Determining, from the at least one first contour, a second contour whose number of sub-contours is a first value.
In the embodiment of the present application, the first value is a positive integer. The calibration device determines, from the at least one first contour, a second contour whose number of sub-contours equals the first value.
17. Determining the second contour as the contour of the nested graph, and obtaining a seventh position according to the position of the contour of the nested graph in the second image.
In an embodiment of the application, the seventh position belongs to at least one fifth position, and the seventh position is a position of the nested graphic in the second image.
Because the nested graph has the root contour, the root contour is the contour of the nested graph. Therefore, the calibration device determines the position of the root contour from the second image, i.e., the position of the contour of the nested graphic, and thus the position of the nested graphic in the second image, i.e., the seventh position.
In an embodiment of the application, the root contour is determined from the second image in dependence on the number of sub-contours of the root contour. Specifically, the second contour is determined to be a contour of the nested graph, that is, the contour with the number of sub-contours being a first value is determined to be a root contour of the nested graph.
For example, the positioning pattern is the pattern shown in fig. 5, and the number of sub-contours of the contour of the positioning pattern is 1, that is, the first value is 1. For another example, the positioning pattern is the pattern shown in fig. 7, and the number of sub-contours of the contour of the positioning pattern is 2, that is, the first value is 2.
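Steps 15 to 17 can be sketched as follows, under the simplifying assumption that each detected contour is represented by its axis-aligned bounding box and that a sub-contour is one strictly contained in its parent (the names and coordinates are illustrative only):

```python
# Sketch: find the root contour of the nested graph as the contour whose
# number of sub-contours equals the expected first value.

def contains(outer, inner):
    """True if box `inner` lies strictly inside box `outer`."""
    ox0, oy0, ox1, oy1 = outer
    ix0, iy0, ix1, iy1 = inner
    return ox0 < ix0 and oy0 < iy0 and ix1 < ox1 and iy1 < oy1

def find_root_contour(contours, first_value):
    for c in contours:
        subs = [o for o in contours if o is not c and contains(c, o)]
        if len(subs) == first_value:
            return c
    return None

# Like fig. 8: one outer contour with two children side by side, plus an
# unrelated stray contour elsewhere in the image.
contours = [(0, 0, 10, 10), (1, 1, 4, 9), (6, 1, 9, 9), (20, 20, 25, 25)]
root = find_root_contour(contours, first_value=2)   # the outer box
```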
It should be understood that the nested figures in the embodiments of the present application are examples only, and it should not be understood that at least one positioning figure includes only one nested figure. In practical applications, the at least one positioning graphic may include two or more nested graphics, and the at least one positioning graphic may include one or more nested graphics, as well as non-nested graphics. For example, the at least one positioning graphic includes a nested graphic a and a non-nested graphic b.
This embodiment should also not be understood as meaning that the calibration means only determines the position of the nested pattern in the second image in the process of performing step 10. In a possible implementation manner, in the case that the at least one positioning pattern includes at least one target pattern other than the nested pattern, the calibration apparatus performs not only steps 15 to 17 but also the following step in the process of performing step 10: determining the position of the at least one target pattern in the second image to obtain at least one thirteenth position. At this time, the at least one fifth position includes the seventh position and the at least one thirteenth position.
Optionally, in a case that at least one target pattern includes one or more nested patterns, the calibration device may determine the positions of all nested patterns in the second image or determine the positions of some nested patterns in one or more nested patterns in the second image based on the technical solutions provided in steps 15 to 17, respectively.
For example, the at least one positioning pattern includes a nested pattern a and a nested pattern b. The calibration device can respectively determine the position of the nested graph a in the second image and the position of the nested graph b in the second image based on the technical schemes provided in the steps 15 to 17.
For another example, the at least one positioning pattern includes a nested pattern a and a nested pattern b. The calibration device can determine the position of the nested pattern a in the second image based on the technical scheme provided in steps 15 to 17, and determine the position of the nested pattern b in the second image by other means (such as the implementation manner provided in step 10).
Optionally, in the case that the at least one target pattern includes at least one nested pattern and at least one non-nested pattern, the calibration device determines the positions of all nested patterns in the second image based on the technical solutions provided in steps 15 to 17, and determines the positions of the non-nested patterns in the second image according to other manners (such as the implementation manner provided in step 10).
Alternatively, the calibration device determines the positions of some of the nested patterns in the second image based on the technical solutions provided in steps 15 to 17, and determines the positions of the remaining nested patterns and of the non-nested patterns in the second image in other manners (such as the implementation manner provided in step 10).
For example, the at least one target graph includes a nested graph a, a nested graph b, and a non-nested graph c. The calibration device can determine the position of the nested graph a in the second image based on the technical solutions provided in the steps 15 to 17, and determine the position of the nested graph b in the second image and the position of the non-nested graph c in the second image based on other manners (such as the implementation manner provided in the step 10).
Optionally, in a case that the at least one target pattern includes at least one non-nested pattern, the calibration device determines a position of the at least one target pattern in the second image according to another implementation.
The at least one first contour obtained by performing the edge detection processing on the second image includes not only the contour of the positioning pattern but also a contour other than the contour of the positioning pattern. In this embodiment, the nested graph includes the root contour, so that the calibration device can determine the root contour of the positioning graph from the contours of the second image according to the number of the sub-contours, and further determine the position of the contour of the positioning graph in the second image, thereby improving the accuracy of the position of the contour of the positioning graph in the second image, and further improving the accuracy of the position of the positioning graph in the second image.
As an alternative embodiment, the calibration device determines the second contour as the contour of the nested figure when the second contour satisfies a second condition, wherein the second condition includes at least one of: the area enclosed by the outline is within the second area range, and the size of the outline meets the second size requirement.
During the process of acquiring the second image by the visible light imaging device, factors causing distortion of the nested graphics in the second image may exist, and in the case of distortion of the nested graphics of the second image, errors are prone to occur in the position of the nested graphics in the second image. In addition, in the target scene, there may be other patterns including the root contour (hereinafter referred to as interference patterns) in addition to the nested patterns, and the number of sub-contours of the root contour of the interference pattern may also be the first value, so that the calibration apparatus may misidentify the interference pattern as the nested patterns, thereby causing low accuracy in the position of the nested patterns.
Therefore, to improve the accuracy of the position of the nested pattern in the second image, the position of the nested pattern in the second image should be determined only in the case that the nested pattern in the second image is not distorted.
It is considered that if the nested pattern is distorted, the shape of the nested pattern changes, and the area of the nested pattern changes. Further, the probability that the area of the interference pattern is the same as the area of the nested pattern is low. Therefore, whether the second contour is the contour of the nested graph can be determined according to the area surrounded by the second contour, and whether the nested graph corresponding to the second contour is distorted can be determined.
Specifically, in the target scene, the shape of the nested graphics is fixed, that is, the area of the nested graphics is constant, and thus the area of the nested graphics in the second image (hereinafter referred to as a second reference area) can be determined according to the area of the nested graphics. The reasonable range of the area of the nested graphics in the visible light image (i.e. the second area range) can be determined according to the second reference area, and then whether the second outline is the outline of the nested graphics can be determined according to the second area range and the area enclosed by the second outline, i.e. whether the graphics corresponding to the second outline are the nested graphics.
Specifically, under the condition that the area enclosed by the second contour is within the second area range, the second contour is determined to be the contour of the nested graph, the nested graph corresponding to the second contour is determined to be undistorted, and then the position of the nested graph in the second image can be obtained according to the position of the contour of the nested graph in the second image. And under the condition that the area enclosed by the second outline is out of the second area range, determining that the second outline is not the outline of the nested graph or determining that the nested graph corresponding to the second outline is distorted, and further not executing the step of obtaining the position of the nested graph in the second image according to the position of the outline of the nested graph in the second image.
It is considered that if the nested pattern is distorted, the shape of the nested pattern changes, and the size of the nested pattern changes. Further, the probability that the size of the interference pattern is the same as that of the nested pattern is low. Therefore, whether the second contour is the contour of the nested graph can be determined according to the size of the second contour, and whether the nested graph corresponding to the second contour is distorted can be determined.
Specifically, in the target scene, the shape of the nested figure is fixed, that is, the size of the nested figure is constant, and thus the size of the nested figure in the second image (hereinafter referred to as a second reference size) can be determined depending on the size of the nested figure. From the second reference size, a reasonable range of sizes of nested graphics in the visible image can be determined (i.e., the second size requirement described above).
For example, the outline of the nested pattern is rectangular, the length-width ratio range of the nested pattern in the visible light image is determined to be (1.12, 1.48) according to the second reference size, and the second size requirement is that the length-width ratio of the nested pattern should be within (1.12, 1.48).
After determining the second size requirement, it is further determined whether the second contour is a contour of the nested graphic based on the second size requirement and a size of the second contour in the second image, and it is determined whether the nested graphic corresponding to the second contour is distorted.
Specifically, under the condition that the size of the second contour in the second image meets the second size requirement, the second contour is determined to be the contour of the nested graph, the nested graph corresponding to the second contour is determined to be undistorted, and then the position of the nested graph in the second image can be obtained according to the position of the contour of the nested graph in the second image. And under the condition that the size of the second contour in the second image does not meet the second size requirement, determining that the second contour is not the contour of the nested graph or determining that the nested graph corresponding to the second contour is distorted, and further not performing the step of obtaining the position of the nested graph in the second image according to the position of the contour of the nested graph in the second image.
Optionally, the calibration device may determine that the second contour is a contour of the nested figure when an area surrounded by the second contour is within a second area range and a size of the second contour meets a second size requirement.
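As a minimal sketch, the combined check of the second condition described above can be expressed as a single predicate. The area range values below are illustrative assumptions; the aspect-ratio range is taken from the example given earlier, and the function name is hypothetical:

```python
def satisfies_second_condition(contour_area, aspect_ratio,
                               area_range=(400.0, 2500.0),
                               ratio_range=(1.12, 1.48)):
    """Return True when a candidate second contour passes both checks of
    the second condition: the enclosed area lies within the second area
    range, and the aspect ratio meets the second size requirement."""
    area_ok = area_range[0] <= contour_area <= area_range[1]
    ratio_ok = ratio_range[0] < aspect_ratio < ratio_range[1]
    return area_ok and ratio_ok
```

A contour failing either check is rejected as distorted or as an interference pattern, and the step of obtaining the position of the nested pattern from that contour is skipped.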
As an alternative embodiment, the second geometric relationship comprises a third geometric relationship, wherein the third geometric relationship is a geometric relationship between the at least one positioning graph and the at least one first positioning point of the radiation area. For the meaning of the third geometric relationship in this embodiment, reference may be made to the foregoing explanation, which is not repeated here.
In this embodiment, the calibration means performs the following steps in the process of performing step 13:
18. and obtaining at least one eighth position of the at least one first positioning point of the radiation area in the second image according to the third geometric relation and the at least one fifth position.
In the embodiment of the present application, the eighth position is a position of the first localization point of the irradiation region in the second image. The calibration device can respectively determine the position of each first positioning point in the second image according to the third geometric relationship and the at least one fifth position to obtain at least one eighth position.
19. And obtaining a sixth position of the radiation area in the second image according to the at least one eighth position.
In a possible implementation, the radiation area is rectangular, and the at least one eighth position includes positions of four corner points of the radiation area in the second image and positions of a geometric center of the radiation area in the second image. The calibration device may determine the position of the radiation region in the second image, i.e. the sixth position, according to the positions of the four corner points of the radiation region in the second image and the position of the geometric center of the radiation region in the second image.
In another possible implementation, the radiation area is circular, and the at least one eighth position includes a position of a center of the radiation area in the second image. The calibration means may determine the position of the radiation area in the second image, i.e. the sixth position, based on the position of the centre of the radiation area in the second image and the radius of the radiation area (optionally, the radius of the radiation area may be obtained before performing step 19).
In yet another possible implementation, the radiation area is square, and the at least one eighth position includes a position of one corner point of the radiation area in the second image and a side length of the radiation area. The calibration device may determine the position of the radiation region in the second image, i.e. the sixth position, according to the position of one corner point of the radiation region in the second image and the side length of the radiation region.
As an optional implementation manner, the third geometric relationship includes a third condition and the constraint that a second positioning point of the at least one positioning graph, a third positioning point of the at least one positioning graph, and a fourth positioning point are located on the same straight line, where the second positioning point includes one of: a corner point and a geometric center; the third positioning point includes one of: a corner point and a geometric center; and the fourth positioning point is any one of the at least one first positioning point of the radiation area.
In this embodiment of the present application, the second positioning point and the third positioning point may belong to the same positioning graph. For example, the at least one positioning graph includes a rectangle a, the second positioning point is a corner point b of the rectangle a, and the third positioning point is a geometric center of the rectangle a.
The second anchor point and the third anchor point may also belong to different anchor patterns. For example, the at least one positioning pattern comprises a rectangle a and a circle b, the second positioning point is the geometric center of the rectangle a, and the third positioning point is the geometric center of the circle b.
The third condition includes one of the following: a first ratio between a first distance and a second distance, a second ratio between the first distance and a third distance, and a third ratio between the second distance and the third distance, wherein the first distance is the distance between the second positioning point and the third positioning point, the second distance is the distance between the fourth positioning point and the second positioning point, and the third distance is the distance between the fourth positioning point and the third positioning point.
In this embodiment, the calibration means performs the following steps in the course of performing step 18:
20. and determining a tenth position of the second positioning point in the second image and an eleventh position of the third positioning point in the second image according to the at least one fifth position.
For example, the at least one positioning graph includes a rectangle a, the second positioning point is the largest corner point of the rectangle a, and the third positioning point is the geometric center of the rectangle a, where the largest corner point is the corner point with both the largest abscissa and the largest ordinate among the four corner points.
In the at least one fifth position, the position of the rectangle a in the second image includes: the maximum corner point is (3, 4), and the minimum corner point is (1, 2), where the minimum corner point is the corner point with both the smallest abscissa and the smallest ordinate among the four corner points. The calibration device determines, according to the position of the rectangle a in the second image, that the position of the maximum corner point of the rectangle a in the second image is (3, 4) and that the position of the geometric center of the rectangle a in the second image is (2, 3). In this case, the tenth position is (3, 4), and the eleventh position is (2, 3).
For another example, the at least one positioning pattern includes a rectangle a and a circle b, the second positioning point is the largest corner point of the rectangle a, and the third positioning point is the center of the circle b.
In the at least one fifth position, the position of the rectangle a in the second image includes: the maximum corner point is (3, 4) and the minimum corner point is (1, 2); the position of the circle b in the second image includes: the center is (5, 7) and the radius is 3. The calibration device determines the position of the largest corner point of the rectangle a in the second image as (3, 4) according to the position of the rectangle a in the second image, and determines the position of the center of the circle b in the second image as (5, 7) according to the position of the circle b in the second image. In this case, the tenth position is (3, 4), and the eleventh position is (5, 7).
For another example, the at least one positioning graph includes a rectangle a and a rectangle b, the second positioning point is the largest corner point of the rectangle a, and the third positioning point is the smallest corner point of the rectangle b.
In the at least one fifth position, the position of the rectangle a in the second image includes: the maximum corner point is (5, 4) and the minimum corner point is (1, 2); the position of the rectangle b in the second image includes: the largest corner point is (6, 9) and the smallest corner point is (4, 7). The calibration device determines the position of the maximum corner point of the rectangle a in the second image as (5, 4) according to the position of the rectangle a in the second image, and determines the position of the minimum corner point of the rectangle b in the second image as (4, 7) according to the position of the rectangle b in the second image. In this case, the tenth position is (5, 4), and the eleventh position is (4, 7).
21. And determining a ninth position of the fourth positioning point in the second image according to the third geometric relationship, the tenth position and the eleventh position.
In the embodiment of the present application, the ninth location belongs to at least one eighth location, and the ninth location is a location of the fourth anchor point in the second image.
In one possible implementation, the third geometric relationship includes a first ratio between the first distance and the second distance. The calibration device obtains a first distance according to the tenth position and the eleventh position, and further obtains a second distance according to the first ratio and the first distance. And obtaining the position of the fourth positioning point in the second image, namely a ninth position, according to the second distance and the tenth position.
In another possible implementation, the third geometric relationship includes a second ratio between the first distance and the third distance. The calibration device obtains a first distance according to the tenth position and the eleventh position, and further obtains a third distance according to the second ratio and the first distance. And obtaining the position of the fourth positioning point in the second image, namely a ninth position, according to the third distance and the eleventh position.
In yet another possible implementation, the third geometric relationship includes a third ratio between the second distance and the third distance. And the calibration device obtains the position of the fourth positioning point in the second image, namely the ninth position, according to the tenth position, the eleventh position and the third ratio.
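The first-ratio variant of step 21 can be sketched as follows. The function name is hypothetical, and the direction in which the fourth positioning point lies is an assumption not fixed by the text (here: on the extension of the line beyond the second positioning point, on the far side from the third):

```python
import math

def fourth_point_from_first_ratio(p2, p3, first_ratio):
    """Locate the fourth positioning point on the straight line through
    the second positioning point p2 and the third positioning point p3.

    first_ratio = first distance / second distance, where the first
    distance is |p2 - p3| and the second distance is |p4 - p2|.
    Assumption: p4 lies on the extension beyond p2, away from p3."""
    d1 = math.dist(p2, p3)            # first distance, from the tenth and eleventh positions
    d2 = d1 / first_ratio             # second distance, from the first ratio
    ux = (p2[0] - p3[0]) / d1         # unit vector pointing from p3 to p2
    uy = (p2[1] - p3[1]) / d1
    return (p2[0] + ux * d2, p2[1] + uy * d2)
```

The second-ratio and third-ratio variants differ only in which distance is recovered from the ratio and which known point the offset is applied to.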
It should be understood that the second positioning point, the third positioning point, and the fourth positioning point in the embodiment of the present application are only examples, and it should not be understood that the calibration device only determines the position of the fourth positioning point in the second image during the process of executing step 18. In a possible implementation manner, in a case that the at least one first positioning point includes at least one fifth positioning point except the fourth positioning point, the calibration apparatus not only performs steps 20 to 21, but also performs the following steps in the process of performing step 18: and determining the position of the at least one fifth positioning point in the second image to obtain at least one fourteenth position. At this time, the at least one eighth position includes a ninth position and at least one fourteenth position.
The calibration device determines the position of the fourth positioning point in the second image by executing the steps 20-21, so that the data processing amount can be reduced, and the processing speed can be increased. In addition, because the second positioning point, the third positioning point and the fourth positioning point are located on the same straight line, and the position of the fourth positioning point in the second image is determined according to the position of the second positioning point in the second image and the position of the third positioning point in the second image, the accuracy of the position of the fourth positioning point in the image can be improved, and the accuracy of the position of the radiation region in the second image can be further improved.
Based on the technical scheme provided by the embodiment of the application, the embodiment of the application also provides several possible application scenarios:
Scene A: non-contact temperature measurement using electronic devices that include both a visible light imaging device and a thermal imaging device is performed in more and more scenes. For example, the electronic device including the visible light imaging device and the thermal imaging device is a temperature measurement terminal. Optionally, the calibration apparatus is an electronic device including a visible light imaging device and a thermal imaging device.
Specifically, the electronic device acquires a third image of the person to be detected by using the visible light imaging device, and acquires a fourth image of the person to be detected by using the thermal imaging device.
In the embodiment of the present application, the third image is a visible light image including a person to be detected, and the fourth image is a thermal image including a person to be detected. Optionally, the electronic device may acquire the fourth image of the person to be detected by using the thermal imaging device in the process of acquiring the third image of the person to be detected by using the visible light imaging device.
And the electronic equipment performs skin detection processing on the third image and determines a twelfth position of the skin area of the person to be detected in the third image.
In an embodiment of the application, the skin detection process is used to detect a skin region in a visible light image. Alternatively, the skin detection process may be implemented by a skin detection model, wherein the skin detection model is a computer vision model for detecting skin regions. For example, the skin detection model is a neural network, wherein the neural network can detect skin regions.
In this embodiment of the application, the twelfth position is a position of the skin area in the third image, that is, the pixel area covered by the skin of the person to be detected can be determined from the third image according to the twelfth position.
Optionally, the skin area of the person to be detected is a face area of the person to be detected. At this time, the skin detection processing is face detection processing.
Optionally, in a case that the number of people in the third image is greater than 1, the electronic device regards the person with the largest area of the covered pixel region as the person to be detected.
The electronic device determines a second pixel region corresponding to the skin region of the person to be detected in the fourth image according to the conversion relationship obtained in step 202 and the twelfth position.
Specifically, the electronic device converts the twelfth position according to the conversion relationship to obtain the position of the skin area of the person to be detected in the fourth image (hereinafter referred to as a fifteenth position). And determining a pixel area corresponding to the skin area of the person to be detected in the fourth image, namely the second pixel area according to the fifteenth position.
After the second pixel area is determined, the electronic device can obtain the body temperature of the person to be detected according to the temperature of the second pixel area. In one possible implementation manner, the electronic device takes the average temperature of the second pixel region as the body temperature of the person to be detected. For example, the second pixel region includes a pixel a and a pixel b, and the electronic device determines a temperature of the pixel a (hereinafter referred to as a first temperature) and a temperature of the pixel b (hereinafter referred to as a second temperature) from the fourth image and calculates an average value of the first temperature and the second temperature as the body temperature of the person to be detected.
In another possible implementation manner, the electronic device determines an average temperature of the second pixel region, and takes a sum of the average temperature and a third constant as the body temperature of the person to be detected, wherein the third constant is a rational number. For example, the second pixel region includes a pixel a and a pixel b, and the electronic device determines a temperature of the pixel a (hereinafter, referred to as a first temperature) and a temperature of the pixel b (hereinafter, referred to as a second temperature) from the fourth image and calculates an average value of the first temperature and the second temperature as an average temperature of the second pixel region. The electronic device calculates the sum of the average temperature and the third constant as the body temperature of the person to be detected.
In yet another possible implementation manner, the electronic device uses the temperature of any pixel in the second pixel region as the body temperature of the person to be detected. For example, the second pixel region includes a pixel a and a pixel b, and the electronic device determines a temperature of the pixel a (hereinafter referred to as a first temperature) and a temperature of the pixel b (hereinafter referred to as a second temperature) from the fourth image. The electronic device takes either the first temperature or the second temperature as the body temperature of the person to be detected.
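The three implementations above for deriving a body temperature from the second pixel region can be sketched together; the function and mode names are hypothetical:

```python
def body_temperature(temps, mode="mean", third_constant=0.0):
    """Estimate the body temperature from the per-pixel temperatures of
    the second pixel region: the plain average, the average plus a
    rational constant, or the temperature of a single pixel (here the
    first pixel stands in for 'any pixel')."""
    if mode == "mean":
        return sum(temps) / len(temps)
    if mode == "mean_offset":
        return sum(temps) / len(temps) + third_constant
    if mode == "single":
        return temps[0]
    raise ValueError(f"unknown mode: {mode}")
```

For a region containing pixels at 36.0 and 37.0 degrees, the mean variant yields 36.5.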
Scene B: in scene A, the basis on which the electronic device determines the skin area of the person to be detected from the fourth image is the twelfth position and the conversion relationship. Therefore, the accuracy of the conversion relationship has a great influence on the accuracy of the skin region of the person to be detected determined from the fourth image, and thereby on the accuracy of the body temperature of the person to be detected. It is therefore very important how to determine the conversion relationship, that is, the conversion relationship between the pixel coordinate system of the visible light image and the pixel coordinate system of the thermal image.
Based on the inventive concept and technical solution provided by the embodiment of the present application, the embodiment of the present application provides a calibration board 90 as shown in fig. 9, where the calibration board 90 is used to calibrate a conversion relationship between a pixel coordinate system of a visible light image collected by a visible light imaging device and a pixel coordinate system of a thermal image collected by a thermal imaging device.
The calibration board 90 includes four positioning patterns and a hollowed-out area, wherein the four positioning patterns are all in the shape of the Chinese character "回" (hui, a square nested within a square), the four positioning patterns are respectively a positioning pattern a, a positioning pattern b, a positioning pattern c and a positioning pattern d, and the hollowed-out area is the area enclosed by the rectangle ABCD. In the calibration board 90, the hollowed-out area either contains no object or is made of a heat conductive material, and the area other than the hollowed-out area includes a heat insulating material.
As shown in fig. 9, an angular point E of the positioning pattern 91, an angular point F of the positioning pattern 91, an angular point a of the hollow area, an angular point C of the hollow area, an angular point K of the positioning pattern 94, and an angular point J of the positioning pattern 94 are all in the same straight line. The corner point G of the positioning pattern 92, the corner point H of the positioning pattern 92, the corner point B of the hollowed-out region, the corner point D of the hollowed-out region, the corner point M of the positioning pattern 93, and the corner point L of the positioning pattern 93 are all in the same straight line.
Namely, the second geometric relationship includes that the corner point E of the positioning pattern 91, the corner point F of the positioning pattern 91, the corner point a of the hollowed-out region, the corner point C of the hollowed-out region, the corner point K of the positioning pattern 94, and the corner point J of the positioning pattern 94 are all in the same straight line, and the corner point G of the positioning pattern 92, the corner point H of the positioning pattern 92, the corner point B of the hollowed-out region, the corner point D of the hollowed-out region, the corner point M of the positioning pattern 93, and the corner point L of the positioning pattern 93 are all in the same straight line.
Before the electronic device uses the visible light imaging device and the thermal imaging device to shoot the calibration scene, the calibration board 90 is placed in the calibration scene, and the thermal radiation object is placed on one side of the calibration board 90, so that the calibration board 90 is located between the radiation object and the electronic device.
The electronic device shoots the calibration scene by using the visible light imaging device to obtain a fifth image, and shoots the calibration scene by using the thermal imaging device to obtain a sixth image. Since the hollowed-out area is a heat conductive material, and the area other than the hollowed-out area is a heat insulating material, the radiation area of the heat radiation object in the calibration board 90 is the hollowed-out area.
Therefore, the electronic device determines the pixel area corresponding to the hollow area from the sixth image according to the temperature threshold. Specifically, the electronic device determines that a pixel region with a temperature greater than or equal to a temperature threshold in the sixth image is a pixel region corresponding to the hollowed-out region, and then determines positions of four corner points of the pixel region corresponding to the hollowed-out region in the sixth image, that is, determines a position of the corner point a in the sixth image, a position of the corner point B in the sixth image, a position of the corner point C in the sixth image, and a position of the corner point D in the sixth image.
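The thresholding step above can be sketched as follows, assuming the sixth image is given as a 2-D list of per-pixel temperatures and the pixel region corresponding to the hollowed-out area is axis-aligned; the function name and corner ordering are assumptions:

```python
def hollow_region_corners(thermal, threshold):
    """Find the pixel region whose temperature is greater than or equal
    to the temperature threshold, and return the four corner points of
    its axis-aligned bounding box as (row, col) pairs, ordered
    top-left, top-right, bottom-left, bottom-right."""
    rows = [r for r, row in enumerate(thermal)
            if any(t >= threshold for t in row)]
    cols = [c for c in range(len(thermal[0]))
            if any(row[c] >= threshold for row in thermal)]
    r0, r1, c0, c1 = min(rows), max(rows), min(cols), max(cols)
    return (r0, c0), (r0, c1), (r1, c0), (r1, c1)
```

The four returned corners correspond to the positions of the corner points A, B, C and D of the hollowed-out area in the sixth image.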
In addition, based on the technical scheme provided in the foregoing, the fifth image may be converted into a grayscale image, and edge detection processing may then be performed on the grayscale image of the fifth image through the Canny edge detection algorithm to determine the contours in the fifth image, thereby obtaining the first contour set. A second contour set is derived from the first contour set by determining the contours whose number of sub-contours is 1. The contours in the second contour set that meet the second condition are then determined to obtain four third contours. The positions of the four positioning patterns in the fifth image are determined according to the positions of the four third contours in the fifth image. According to the positions of the four positioning patterns in the fifth image, the position of the corner E in the fifth image, the position of the corner F in the fifth image, the position of the corner G in the fifth image, the position of the corner H in the fifth image, the position of the corner K in the fifth image, the position of the corner J in the fifth image, the position of the corner M in the fifth image, and the position of the corner L in the fifth image are determined. The electronic device may further determine the position of the corner A in the fifth image, the position of the corner B in the fifth image, the position of the corner C in the fifth image, and the position of the corner D in the fifth image according to the second geometric relationship, the position of the corner E in the fifth image, the position of the corner F in the fifth image, the position of the corner G in the fifth image, the position of the corner H in the fifth image, the position of the corner K in the fifth image, the position of the corner J in the fifth image, the position of the corner M in the fifth image, and the position of the corner L in the fifth image.
The electronic device can further obtain a conversion relation between the pixel coordinate system of the fifth image and the pixel coordinate system of the sixth image according to the position of the corner point a in the fifth image, the position of the corner point B in the fifth image, the position of the corner point C in the fifth image, the position of the corner point D in the fifth image, the position of the corner point a in the sixth image, the position of the corner point B in the sixth image, the position of the corner point C in the sixth image, and the position of the corner point D in the sixth image.
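The conversion relationship between the two pixel coordinate systems can be obtained from the four corner correspondences as a 3x3 projective transform; in practice a library routine such as OpenCV's getPerspectiveTransform would typically be used, but as a sketch the standard 8x8 linear system can be solved directly in pure Python (function names are hypothetical):

```python
def homography_from_points(src, dst):
    """Solve the 3x3 projective transform H (with H[2][2] fixed to 1)
    mapping the four corner points of the hollowed-out region in the
    fifth image (src) to the corresponding corners in the sixth image
    (dst), via Gauss-Jordan elimination on the 8x8 system."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))  # partial pivoting
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i and M[r][i]:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    h = [M[i][8] / M[i][i] for i in range(n)] + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_homography(H, pt):
    """Map a point of the fifth image into the sixth image under H."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

The same `apply_homography` step is what scene A uses to carry the twelfth position from the visible light image into the thermal image.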
Optionally, in the calibration board 90 shown in fig. 9, the area of the hollow area is larger than the area of any one positioning pattern. In this way, the accuracy of the conversion relationship between the pixel coordinate system of the fifth image and the pixel coordinate system of the sixth image can be improved.
It will be understood by those skilled in the art that, in the method of the present application, the order in which the steps are written does not imply a strict order of execution and does not limit the implementation process in any way; the specific order of execution of the steps should be determined by their functions and possible internal logic.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a calibration device according to an embodiment of the present application, where the calibration device 1 includes: an acquisition unit 11 and a processing unit 12. Optionally, the calibration apparatus 1 further includes: a thermal imaging device 13 and a visible light imaging device 14, wherein:
An acquisition unit 11 configured to acquire a first image, the first image being obtained by shooting a target scene by a thermal imaging device 13, the target scene including a thermal radiation object and at least one detection point;
a processing unit 12, configured to obtain at least one second position according to the first position and the first geometric relationship; the first position is a position of a radiation area of the heat radiating object in the first image, the first geometric relationship is a geometric relationship between the radiation area and the at least one detection point, and the at least one second position is a position of the at least one detection point in the first image.
With reference to any embodiment of the present application, the obtaining unit 11 is further configured to obtain a temperature threshold;
the processing unit 12 is further configured to determine, from the first image, a first pixel region with a temperature greater than or equal to the temperature threshold;
the processing unit 12 is further configured to obtain the first position according to a position of the first pixel region in the first image.
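A minimal sketch of determining the first pixel region and the first position by temperature thresholding follows; representing the first position as a centroid plus bounding box is an illustrative choice, since the disclosure does not fix the representation:

```python
import numpy as np

def first_pixel_region_position(temp_map, threshold):
    """Return the centroid and bounding box of the first pixel region.

    temp_map is a 2-D array of per-pixel temperatures (e.g. read out from
    the thermal imaging device). Pixels with temperature greater than or
    equal to the threshold form the first pixel region; its centroid and
    bounding box stand in for the 'first position'.
    """
    ys, xs = np.nonzero(temp_map >= threshold)
    if ys.size == 0:
        return None  # no pixel reaches the temperature threshold
    centroid = (float(xs.mean()), float(ys.mean()))
    bbox = (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))
    return centroid, bbox
```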
With reference to any one of the embodiments of the present application, the processing unit 12 is configured to perform the step of obtaining at least one second position according to the first position and the first geometric relationship in a case where the radiation area satisfies a first condition, where the first condition includes at least one of the following: the area of the radiation area in the first image is within a first area range, and the size of the radiation area in the first image meets a first size requirement.
With reference to any embodiment of the present application, the number of the detection points is greater than or equal to 4, and the processing unit 12 is configured to:
obtaining at least four second positions according to the first position and the first geometric relationship, wherein the at least four second positions are positions of at least four detection points in the first image;
the acquiring unit 11 is further configured to acquire a second image, where the second image is obtained by shooting the target scene with a visible light imaging device 14;
the processing unit 12 is further configured to determine positions of the at least four detection points in the second image, so as to obtain at least four third positions;
the processing unit 12 is further configured to obtain a conversion relationship between the pixel coordinate system of the first image and the pixel coordinate system of the second image according to the at least four second positions and the at least four third positions.
With reference to any embodiment of the present application, the at least four detection points are all first positioning points of the radiation area, and the first positioning point includes one of the following: a corner point, a geometric center;
the processing unit 12 is configured to:
determining the positions of at least four first positioning points corresponding to the at least four detection points in the first image according to the first positions to obtain at least four fourth positions;
And obtaining the at least four second positions according to the at least four fourth positions.
With reference to any embodiment of the present application, the target scene further includes at least one positioning graph, and the processing unit 12 is configured to:
determining the position of the at least one positioning graph in the second image to obtain at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship and the at least one fifth position, wherein the second geometric relationship is the geometric relationship between the at least one positioning pattern and the radiation area.
With reference to any embodiment of the present application, the first geometric relationship includes that the at least four detection points are all first positioning points of the radiation area, and the first positioning point includes one of the following: a corner point, a geometric center; the second geometric relationship includes a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning pattern and at least one first positioning point of the radiation area;
the processing unit 12 is configured to:
and determining the positions of at least four first positioning points corresponding to the at least four detection points in the second image according to the third geometric relationship and the at least one fifth position to obtain the at least four third positions.
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
In combination with any embodiment of the present application, the processing unit 12 is configured to:
obtaining a sixth position of the radiation area in the second image according to the second geometric relation and the at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship and the sixth position.
In combination with any embodiment of the present application, the at least one positioning graphic includes a nested graphic, and the at least one fifth location includes a seventh location of the nested graphic in the second image;
the processing unit 12 is configured to:
performing edge detection processing on the second image to obtain at least one first contour in the second image;
determining a second profile having a first number of sub-profiles from the at least one first profile;
and determining the second contour as the contour of the nested graph, and obtaining the seventh position according to the position of the contour of the nested graph in the second image.
In combination with any embodiment of the present application, the processing unit 12 is configured to:
determine the second contour as the contour of the nested graphic in a case where the second contour satisfies a second condition, the second condition including at least one of the following: the area enclosed by the contour is within a second area range, and the size of the contour meets a second size requirement.
In combination with any of the embodiments of the present application, the second geometric relationship includes a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning pattern and at least one first positioning point of the radiation area;
the processing unit 12 is configured to:
obtaining at least one eighth position of at least one first positioning point of the radiation region in the second image according to the third geometric relation and the at least one fifth position;
and obtaining a sixth position of the radiation area in the second image according to the at least one eighth position.
With reference to any embodiment of the present application, the third geometric relationship includes a third condition, and a second positioning point of the at least one positioning pattern, a third positioning point of the at least one positioning pattern, and a fourth positioning point are located on a same straight line, where the second positioning point is different from the third positioning point, and the second positioning point includes one of the following: a corner point, a geometric center; the third positioning point includes one of the following: a corner point, a geometric center; the fourth positioning point is any one of the at least one first positioning point;
The third condition includes one of: a first ratio between a first distance and a second distance, a second ratio between the first distance and a third distance, a third ratio between the second distance and the third distance; the first distance is the distance between the second positioning point and the third positioning point, the second distance is the distance between the fourth positioning point and the second positioning point, and the third distance is the distance between the fourth positioning point and the third positioning point;
said at least one eighth position comprises a ninth position of said fourth anchor point in said second image, said processing unit 12 is configured to:
determining a tenth position of the second positioning point in the second image and an eleventh position of the third positioning point in the second image according to the at least one fifth position;
and obtaining the ninth position according to the third geometric relationship, the tenth position and the eleventh position.
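The collinearity and distance-ratio condition above lets the fourth positioning point be placed from the second and third positioning points alone. A sketch follows; the direction convention (the fourth point lying on the far side of the second point from the third) is an illustrative assumption, since the disclosure only requires collinearity and a fixed distance ratio:

```python
import numpy as np

def locate_fourth_point(p2, p3, ratio):
    """Place the fourth positioning point on the line through p2 and p3.

    p2 and p3 are the tenth and eleventh positions (second and third
    positioning points in the second image); ratio is the known second
    ratio r = |p4 - p2| / |p2 - p3| from the third geometric relationship.
    """
    p2 = np.asarray(p2, dtype=float)
    p3 = np.asarray(p3, dtype=float)
    first_distance = np.linalg.norm(p2 - p3)
    direction = (p2 - p3) / first_distance  # unit vector from p3 toward p2
    # extrapolate beyond p2 by ratio * first_distance
    return p2 + ratio * first_distance * direction
```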
In combination with any one of the embodiments of the present application, an area of the radiation region is larger than an area of the positioning pattern.
With reference to any embodiment of the present application, the thermal imaging device 13 and the visible light imaging device 14 belong to a calibration device, and the calibration device uses the visible light imaging device 14 to acquire a third image of a person to be detected and uses the thermal imaging device 13 to acquire a fourth image of the person to be detected;
The processing unit 12 is further configured to perform skin detection processing on the third image, and determine a twelfth position of the skin area of the person to be detected in the third image;
the processing unit 12 is further configured to determine a second pixel region corresponding to the skin region in the fourth image according to the conversion relationship and the twelfth position;
the processing unit 12 is further configured to obtain the body temperature of the person to be detected according to the temperature of the second pixel region.
In this application, the calibration device may determine the relationship between the position of the radiation area in the first image and the position of the at least one detection point in the first image based on the first geometric relationship, and may further determine the position of the at least one detection point in the thermal image based on the position of the radiation area in the thermal image and the first geometric relationship, thereby improving the accuracy of the position of the at least one detection point in the thermal image.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present application may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Fig. 11 is a schematic hardware structure diagram of a calibration apparatus according to an embodiment of the present application. The calibration device 2 comprises a processor 21, a memory 22, an input device 23, and an output device 24. The processor 21, the memory 22, the input device 23 and the output device 24 are coupled by a connector, which includes various interfaces, transmission lines or buses, etc., and the embodiment of the present application is not limited thereto. It should be appreciated that in various embodiments of the present application, coupled refers to being interconnected in a particular manner, including being directly connected or indirectly connected through other devices, such as through various interfaces, transmission lines, buses, and the like.
The processor 21 may be one or more graphics processing units (GPUs); in the case that the processor 21 is one GPU, the GPU may be a single-core GPU or a multi-core GPU. Alternatively, the processor 21 may be a processor group composed of a plurality of GPUs coupled to one another through one or more buses. Alternatively, the processor may be another type of processor; the embodiments of the present application are not limited in this respect.
The memory 22 may be used to store computer program instructions, including various types of program code for executing the technical solutions of the present application. Optionally, the memory includes, but is not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or compact disc read-only memory (CD-ROM), and is used for storing related instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It is understood that, in the embodiment of the present application, the memory 22 may be used to store not only the relevant instructions, but also relevant data, for example, the memory 22 may be used to store the first image obtained through the input device 23, or the memory 22 may also be used to store at least one second position obtained through the processor 21, and so on, and the embodiment of the present application is not limited to the data specifically stored in the memory.
It will be appreciated that fig. 11 shows only a simplified design of the calibration arrangement. In practical applications, the calibration devices may also respectively include other necessary components, including but not limited to any number of input/output devices, processors, memories, etc., and all calibration devices that can implement the embodiments of the present application are within the protection scope of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It is also clear to those skilled in the art that the descriptions of the various embodiments of the present application have different emphasis, and for convenience and brevity of description, the same or similar parts may not be repeated in different embodiments, so that the parts that are not described or not described in detail in a certain embodiment may refer to the descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in, or transmitted via, a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital versatile disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by hardware related to instructions of a computer program, which may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. And the aforementioned storage medium includes: various media that can store program codes, such as a read-only memory (ROM) or a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (18)

1. A calibration method, characterized in that the method comprises:
acquiring a first image, wherein the first image is obtained by shooting a target scene by thermal imaging equipment, and the target scene comprises a thermal radiation object and at least one detection point;
obtaining at least one second position according to the first position and the first geometric relation; the first position is a position of a radiation area of the heat radiating object in the first image, the first geometric relationship is a geometric relationship between the radiation area and the at least one detection point, and the at least one second position is a position of the at least one detection point in the first image.
2. The method of claim 1, wherein prior to deriving the at least one second location from the first location and the first geometric relationship, the method further comprises:
acquiring a temperature threshold;
determining a first pixel region from the first image having a temperature greater than or equal to the temperature threshold;
and obtaining the first position according to the position of the first pixel region in the first image.
3. The method of claim 2, wherein said deriving at least one second location from the first location and the first geometric relationship comprises:
performing the step of deriving at least one second position from the first position and the first geometrical relationship in case the radiation area satisfies a first condition, the first condition comprising at least one of: the area of the radiation area in the first image is within a first area range, and the size of the radiation area in the first image meets a first size requirement.
4. A method according to any one of claims 1 to 3, wherein the number of detection points is greater than or equal to 4, and said deriving at least one second position from the first position and the first geometric relationship comprises:
Obtaining at least four second positions according to the first position and the first geometric relationship, wherein the at least four second positions are positions of at least four detection points in the first image;
the method further comprises the following steps: acquiring a second image, wherein the second image is obtained by shooting the target scene by visible light imaging equipment;
determining the positions of the at least four detection points in the second image to obtain at least four third positions;
and obtaining a conversion relation between the pixel coordinate system of the first image and the pixel coordinate system of the second image according to the at least four second positions and the at least four third positions.
5. The method of claim 4, wherein the at least four detection points are each a first positioning point of the radiation area, the first positioning point comprising one of: a corner point, a geometric center;
obtaining at least four second positions according to the first position and the first geometric relationship, including:
determining the positions of at least four first positioning points corresponding to the at least four detection points in the first image according to the first positions to obtain at least four fourth positions;
And obtaining the at least four second positions according to the at least four fourth positions.
6. The method of claim 4, wherein the target scene further comprises at least one positioning graph, and wherein the determining the positions of the at least four detection points in the second image to obtain at least four third positions comprises:
determining the position of the at least one positioning graph in the second image to obtain at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship and the at least one fifth position, wherein the second geometric relationship is the geometric relationship between the at least one positioning pattern and the radiation area.
7. The method of claim 6, wherein the first geometric relationship comprises that each of the at least four detection points is a first positioning point of the radiation area, the first positioning point comprising one of: a corner point, a geometric center; the second geometric relationship comprises a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning pattern and at least one first positioning point of the radiation area;
Obtaining the at least four third positions according to the first geometric relationship, the second geometric relationship, and the at least one fifth position, including:
and determining the positions of the at least four first positioning points corresponding to the at least four detection points in the second image according to the third geometric relationship and the at least one fifth position to obtain the at least four third positions.
8. The method of claim 7, wherein the area of the radiation area is larger than the area of the positioning pattern.
9. The method of claim 6, wherein said deriving the at least four third positions from the first geometric relationship, the second geometric relationship, and the at least one fifth position comprises:
obtaining a sixth position of the radiation area in the second image according to the second geometric relation and the at least one fifth position;
and obtaining the at least four third positions according to the first geometric relationship and the sixth position.
10. The method of claim 9, wherein the at least one positioning graphic comprises a nested graphic, and the at least one fifth location comprises a seventh location of the nested graphic in the second image;
The determining the position of the at least one positioning graph in the second image to obtain at least one fifth position includes:
performing edge detection processing on the second image to obtain at least one first contour in the second image;
determining a second profile having a first number of sub-profiles from the at least one first profile;
and determining the second contour as the contour of the nested graph, and obtaining the seventh position according to the position of the contour of the nested graph in the second image.
11. The method of claim 10, wherein the determining the second contour as a contour of the nested graphic comprises:
determining the second contour as a contour of the nested graphic if the second contour satisfies a second condition, the second condition including at least one of: the area enclosed by the outline is within the second area range, and the size of the outline meets the second size requirement.
12. The method of claim 9, wherein the second geometric relationship comprises a third geometric relationship, and the third geometric relationship is a geometric relationship between the at least one positioning pattern and at least one first positioning point of the radiation area;
Obtaining a sixth position of the radiation region in the second image according to the second geometric relationship and the at least one fifth position includes:
obtaining at least one eighth position of at least one first positioning point of the radiation region in the second image according to the third geometric relation and the at least one fifth position;
and obtaining a sixth position of the radiation area in the second image according to the at least one eighth position.
13. The method of claim 12, wherein the third geometric relationship comprises a third condition, and a second positioning point of the at least one positioning pattern, a third positioning point of the at least one positioning pattern, and a fourth positioning point are located on a same straight line, the second positioning point being different from the third positioning point, the second positioning point comprising one of: a corner point, a geometric center; the third positioning point comprising one of: a corner point, a geometric center; the fourth positioning point being any one of the at least one first positioning point;
the third condition includes one of: a first ratio between a first distance and a second distance, a second ratio between the first distance and a third distance, a third ratio between the second distance and the third distance; the first distance is the distance between the second positioning point and the third positioning point, the second distance is the distance between the fourth positioning point and the second positioning point, and the third distance is the distance between the fourth positioning point and the third positioning point;
The at least one eighth position comprises a ninth position of the fourth localization point in the second image, the deriving at least one eighth position of the at least one first localization point of the radiation region in the second image in accordance with the third geometrical relationship and the at least one fifth position comprises:
determining a tenth position of the second positioning point in the second image and an eleventh position of the third positioning point in the second image according to the at least one fifth position;
and obtaining the ninth position according to the third geometric relationship, the tenth position and the eleventh position.
14. The method of claim 13, wherein the area of the radiation area is larger than the area of the positioning pattern.
15. The method of claim 4, wherein the thermal imaging device and the visible light imaging device belong to an electronic device, the method further comprising:
the electronic equipment acquires a third image of a person to be detected by using the visible light imaging equipment and acquires a fourth image of the person to be detected by using the thermal imaging equipment;
performing skin detection processing on the third image, and determining a twelfth position of the skin area of the person to be detected in the third image;
Determining a second pixel area corresponding to the skin area in the fourth image according to the conversion relation and the twelfth position;
and obtaining the body temperature of the person to be detected according to the temperature of the second pixel area.
16. A calibration arrangement, characterized in that the arrangement comprises:
an acquisition unit configured to acquire a first image, the first image being obtained by shooting a target scene by a thermal imaging apparatus, the target scene including a thermal radiation object and at least one detection point;
the processing unit is used for obtaining at least one second position according to the first position and the first geometric relation; the first position is a position of a radiation area of the heat radiating object in the first image, the first geometric relationship is a geometric relationship between the radiation area and the at least one detection point, and the at least one second position is a position of the at least one detection point in the first image.
17. An electronic device, comprising: a processor and a memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 1 to 15.
18. A computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions which, when executed by a processor, cause the processor to carry out the method of any one of claims 1 to 15.
CN202111191277.3A 2021-10-13 2021-10-13 Calibration method and device, electronic equipment and computer readable storage medium Active CN113643386B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111191277.3A CN113643386B (en) 2021-10-13 2021-10-13 Calibration method and device, electronic equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN113643386A true CN113643386A (en) 2021-11-12
CN113643386B CN113643386B (en) 2022-02-22

Family

ID=78426572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111191277.3A Active CN113643386B (en) 2021-10-13 2021-10-13 Calibration method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113643386B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093407A1 (en) * 2021-11-25 2023-06-01 上海商汤智能科技有限公司 Calibration method and apparatus, and electronic device and computer-readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007257133A (en) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd Object detection system
CN101487809B (en) * 2008-12-12 2011-03-30 北京理工大学 Zero point calibration method and its use in optical micro-scanning micro-thermal imaging system
CN101551275B (en) * 2009-04-30 2011-07-27 上海航遥信息技术有限公司 Technical method of vehicular multispectral scanner for monitoring industrial warm discharge water
CN102610052A (en) * 2003-05-14 2012-07-25 Vfs技术有限公司 Particle detector
CN104254869A (en) * 2012-02-29 2014-12-31 前视红外系统股份公司 A method and system for projecting a visible representation of infrared radiation
US20150216498A1 (en) * 2012-08-20 2015-08-06 Orangedental Gmbh & Co. Kg Geometric Characterization and Calibration of a Cone-Beam Computer Tomography Apparatus
US20170243326A1 (en) * 2016-02-19 2017-08-24 Seek Thermal, Inc. Pixel decimation for an imaging system
CN110879080A (en) * 2019-11-15 2020-03-13 武汉华中天经通视科技有限公司 High-precision intelligent measuring instrument and measuring method for high-temperature forge piece
US20210244374A1 (en) * 2018-07-30 2021-08-12 Xenselab Llc Apparatus and methods for x-ray imaging


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Lin et al.: "Image measurement technology and its application in non-destructive testing", Electronic Measurement Technology (《电子测量技术》) *


Also Published As

Publication number Publication date
CN113643386B (en) 2022-02-22

Similar Documents

Publication Publication Date Title
US10924729B2 (en) Method and device for calibration
US9704246B2 (en) Image processing apparatus, image processing method, and storage medium
CN108734744A A remote large-field-of-view binocular calibration method based on a total station
US20040155877A1 (en) Image processing apparatus
JP2007129709A (en) Method for calibrating imaging device, method for calibrating imaging system including arrangement of imaging devices, and imaging system
Suran QR Code Image Correction based on Corner Detection and Convex Hull Algorithm.
CN108074237B (en) Image definition detection method and device, storage medium and electronic equipment
CN108765584A Laser point cloud data set augmentation method, apparatus and readable storage medium
CN110807807B (en) Monocular vision target positioning pattern, method, device and equipment
CN113345044B Floor plan generation method and device
CN113643386B (en) Calibration method and device, electronic equipment and computer readable storage medium
CN112287798A (en) Temperature measuring method and device, electronic equipment and storage medium
CN112053427A (en) Point cloud feature extraction method, device, equipment and readable storage medium
CN111353325A (en) Key point detection model training method and device
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN112197708B (en) Measuring method and device, electronic device and storage medium
JP7432793B1 (en) Mapping methods, devices, chips and module devices based on three-dimensional point clouds
US9319666B1 (en) Detecting control points for camera calibration
CN112102391A (en) Measuring method and device, electronic device and storage medium
CN110298797B (en) Millimeter wave image processing method based on convolutional neural network
Ma et al. Spatial perception of tagged cargo using fused RFID and CV data in intelligent storage
CN116053549A (en) Battery cell positioning method, device and system
CN114136462A (en) Calibration method and device, electronic equipment and computer readable storage medium
CN112150527A (en) Measuring method and device, electronic device and storage medium
CN116503387B (en) Image detection method, device, equipment, system and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240416

Address after: 200030, Units 6-77, 6th Floor, No. 1900 Hongmei Road, Xuhui District, Shanghai

Patentee after: Shanghai Yuanluobu Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518000 Room 201, building A, 1 front Bay Road, Shenzhen Qianhai cooperation zone, Shenzhen, Guangdong

Patentee before: SHENZHEN SENSETIME TECHNOLOGY Co.,Ltd.

Country or region before: China