CN112529947B - Calibration method and device, electronic equipment and storage medium


Info

Publication number
CN112529947B
Authority
CN
China
Prior art keywords
face
calibration
area
thermal infrared
image
Prior art date
Legal status
Active
Application number
CN202011420092.0A
Other languages
Chinese (zh)
Other versions
CN112529947A (en)
Inventor
张国庆
张弼坤
丁诚诚
林佩材
Current Assignee
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202011420092.0A priority Critical patent/CN112529947B/en
Publication of CN112529947A publication Critical patent/CN112529947A/en
Priority to PCT/CN2021/096072 priority patent/WO2022121243A1/en
Priority to TW110125930A priority patent/TW202223739A/en
Application granted granted Critical
Publication of CN112529947B publication Critical patent/CN112529947B/en


Classifications

    • G06T 7/33: Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/451: Execution arrangements for user interfaces
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components
    • G06V 40/161: Human faces: detection; localisation; normalisation
    • G06V 40/168: Human faces: feature extraction; face representation
    • G06T 2207/20228: Disparity calculation for image-based rendering


Abstract

The application discloses a calibration method and device, electronic equipment and a storage medium. The method includes: entering a calibration user interface in response to receiving a calibration trigger instruction, the interface including a thermal infrared image and a first face outline frame; moving the first face outline frame according to an operation instruction in response to receiving the operation instruction for moving the first face outline frame; and outputting calibration completion information when the first face outline frame coincides with the first face region in the thermal infrared image.

Description

Calibration method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to a calibration method and apparatus, an electronic device, and a storage medium.
Background
Binocular stereo vision, an important form of machine vision, is based on the principle of parallax: imaging devices acquire two images of a measured object from different positions (hereinafter referred to as binocular images).
In some application scenarios, the pixel regions corresponding to the same object must be determined from the binocular images; when parallax exists between the binocular images, the accuracy of this determination is low. Therefore, calibrating the parallax between binocular images is of great importance.
Disclosure of Invention
The application provides a calibration method and device, electronic equipment and a storage medium.
The application provides a calibration method, which comprises the following steps:
responding to a received calibration triggering instruction, and entering a calibration user interface, wherein the interface comprises a thermal infrared image and a first face outline frame;
responding to receiving an operation instruction for moving the first face outline frame, and moving the first face outline frame according to the operation instruction;
and outputting calibration completion information under the condition that the first face outline frame is overlapped with the first face area in the thermal infrared image.
In combination with any one of the embodiments of the present application, the method further includes:
and in response to receiving a calibration completion instruction, displaying a coincidence result of the first face outline frame and the first face area in the calibration user interface, wherein the coincidence result includes, but is not limited to, a coincidence ratio and a forehead region of the face.
In combination with any one of the embodiments of the present application, before the displaying, in the calibration user interface, a result of the overlapping of the first face outline box and the first face area, the method further includes:
obtaining a visible light image;
performing face recognition processing on the visible light image to obtain the position of the reference pixel point area in the visible light image; the reference pixel point area is an area corresponding to the first face area in the visible light image;
determining the position of the first face region in the thermal infrared image according to the reference parallax and the position of the reference pixel point region in the visible light image; the reference parallax is a parallax between the visible light image and the thermal infrared image;
and obtaining a coincidence result of the first face outline frame and the first face area according to the position of the first face area in the thermal infrared image and the termination position.
In combination with any one of the embodiments of the present application, the responding to receiving an operation instruction for moving the first face contour frame, moving the first face contour frame according to the operation instruction includes:
and moving the first face outline frame along the sliding direction of the object on the user interface under the condition that a moving instruction of the object sliding on the user interface is detected.
In combination with any one of the embodiments of the present application, the calibration method is applied to a calibration device;
Before the operation instruction for moving the first face outline box is received, the method further comprises:
displaying at least one virtual direction button in the calibration user interface; the at least one virtual direction button includes at least one of: an up virtual button, a down virtual button, a left virtual button, a right virtual button;
the responding to the receiving of the operation instruction for moving the first face outline box, moving the first face outline box according to the operation instruction comprises the following steps:
and under the condition that it is detected that an object touches the at least one virtual direction button through the screen of the calibration device, moving the first face outline frame in the direction indicated by the touched virtual direction button.
In combination with any embodiment of the present application, the first face contour frame is obtained according to at least one reference first face contour frame; the at least one reference first face outline frame is obtained by carrying out face detection processing on at least one face image; the acquisition conditions of the at least one face image are actual acquisition conditions; the actual acquisition conditions are image acquisition conditions under the application environment of the calibration device.
In combination with any one of the embodiments of the present application, the method further includes:
and resetting the first face outline frame to an initial position under the condition that an instruction for resetting the first face outline frame is received.
In combination with any one of the embodiments of the present application, the first face contour frame includes at least one face key point;
the overlapping of the first face outline frame and the first face area in the thermal infrared image comprises: the first face contour frame coincides with a boundary of the first face region and the at least one face key point coincides with a corresponding face key point in the first face region.
In combination with any of the embodiments of the present application, the calibration user interface further includes a second display area different from the first display area, and the method further includes:
displaying a superposition effect preview image of the first face outline frame and the first face area in the second display area; the overlapping effect preview image comprises an overlapping effect image of a second face outline frame and the second face area; a temperature measuring area is marked in the second face outline frame; the position of the second face outline frame in the second display area corresponds to the position of the first face outline frame in the first display area; the second face region corresponds to the first face region.
In combination with any one of the embodiments of the present application, before outputting calibration completion information when the first face contour frame coincides with the first face region in the thermal infrared image, the method further includes:
in response to receiving an enlargement instruction for the thermal infrared image and the first face contour frame, enlarging the thermal infrared image and the first face contour frame according to the enlargement instruction;
and in response to receiving a zoom-out instruction for the thermal infrared image and the first face outline box, zooming out the thermal infrared image and the first face outline box according to the zoom-out instruction.
In some embodiments, the present application also provides a calibration device comprising:
the first processing unit is used for responding to the received calibration triggering instruction and entering a calibration user interface, wherein the interface comprises a thermal infrared image and a first face outline frame;
the second processing unit is used for responding to the received operation instruction for moving the first face outline frame and moving the first face outline frame according to the operation instruction;
and the output unit is used for outputting calibration completion information under the condition that the first face outline frame is overlapped with the first face area in the thermal infrared image.
In combination with any one of the embodiments of the present application, the calibration device further includes:
and the first display unit is used for displaying, in response to receiving a calibration completion instruction, a coincidence result of the first face outline frame and the first face area in the calibration user interface, wherein the coincidence result includes, but is not limited to, a coincidence ratio and a forehead region of the face.
In combination with any one of the embodiments of the present application, the calibration device further includes:
the acquisition unit is used for acquiring a visible light image before the coincidence result of the first face outline frame and the first face area is displayed in the calibration user interface;
the third processing unit is used for carrying out face recognition processing on the visible light image to obtain the position of the reference pixel point area in the visible light image; the reference pixel point area is an area corresponding to the first face area in the visible light image;
a fourth processing unit, configured to determine a position of the first face region in the thermal infrared image according to a reference parallax and a position of the reference pixel point region in the visible light image; the reference parallax is a parallax between the visible light image and the thermal infrared image;
and a fifth processing unit, configured to obtain a coincidence result of the first face outline frame and the first face region according to the position of the first face region in the thermal infrared image and the termination position.
In combination with any one of the embodiments of the present application, the second processing unit is configured to:
and moving the first face outline frame along the sliding direction of the object on the user interface under the condition that a moving instruction of the object sliding on the user interface is detected.
In combination with any one of the embodiments of the present application, the calibration device further includes: a second display unit, configured to display at least one virtual direction button in the calibration user interface before the operation instruction for moving the first face contour frame is received; the at least one virtual direction button includes at least one of: an up virtual button, a down virtual button, a left virtual button, a right virtual button;
the second processing unit is used for:
and under the condition that it is detected that an object touches the at least one virtual direction button through the screen of the calibration device, moving the first face outline frame in the direction indicated by the touched virtual direction button.
In combination with any embodiment of the present application, the first face contour frame is obtained according to at least one reference first face contour frame; the at least one reference first face outline frame is obtained by carrying out face detection processing on at least one face image; the acquisition conditions of the at least one face image are actual acquisition conditions; the actual acquisition conditions are image acquisition conditions under the application environment of the calibration device.
In combination with any one of the embodiments of the present application, the calibration device further includes:
and the resetting unit is used for resetting the first face outline frame to an initial position under the condition that an instruction for resetting the first face outline frame is received.
In combination with any one of the embodiments of the present application, the first face contour frame includes at least one face key point;
the overlapping of the first face outline frame and the first face area in the thermal infrared image comprises: the first face contour frame coincides with a boundary of the first face region and the at least one face key point coincides with a corresponding face key point in the first face region.
In combination with any one of the embodiments of the present application, the calibration user interface further includes a second display area different from the first display area, and the first processing unit is further configured to display, in the second display area, a coincidence effect preview image of the first face outline frame and the first face area; the coincidence effect preview image includes a coincidence effect image of a second face outline frame and the second face area; a temperature measurement region is marked in the second face outline frame; the position of the second face outline frame in the second display area corresponds to the position of the first face outline frame in the first display area; and the second face region corresponds to the first face region.
In combination with any one of the embodiments of the present application, before the calibration completion information is output in the case where the first face contour frame coincides with the first face region in the thermal infrared image, the second processing unit is further configured to respond to receiving an amplifying instruction for the thermal infrared image and the first face contour frame, and amplify the thermal infrared image and the first face contour frame according to the amplifying instruction;
and in response to receiving a zoom-out instruction for the thermal infrared image and the first face outline box, zoom out the thermal infrared image and the first face outline box according to the zoom-out instruction.
In some embodiments, the present application further provides an electronic device, including: a processor and a memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform a method as described in the first aspect and any one of its possible implementations.
In some embodiments, the present application also provides another electronic device, including: a processor, transmission means, input means, output means and memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to carry out the method as described in the first aspect and any one of its possible implementations.
In some embodiments, the present application also provides a computer readable storage medium having stored therein a computer program comprising program instructions which, when executed by a processor, cause the processor to perform a method as described in the first aspect and any one of the possible implementations thereof.
In some embodiments, the present application also provides a computer program product comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of the first aspect described above and any one of its possible implementations.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly describe the technical solutions in the embodiments or the background of the present application, the following description will describe the drawings that are required to be used in the embodiments or the background of the present application.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the technical aspects of the application.
FIG. 1 is a schematic flow chart of a calibration method according to an embodiment of the present application;
fig. 2 is a schematic view of a first face contour frame according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a calibration user interface provided in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a calibration device according to an embodiment of the present disclosure;
fig. 5 is a schematic hardware structure of a calibration device according to an embodiment of the present application.
Detailed Description
In order to make the present application solution better understood by those skilled in the art, the following description will clearly and completely describe the technical solution in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terms first, second and the like in the description and in the claims of the present application and in the above-described figures, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
It should be understood that, in the present application, "at least one (item)" means one or more and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or similar expressions refers to any combination of these items, including any combination of single items or plural items. For example, at least one of a, b or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c may be singular or plural. The character "/" may also represent division in mathematical operations; for example, a/b = a divided by b, and 6/3 = 2.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
A person's body temperature is prone to become abnormal during illness, so body temperature can serve as a basis for deciding whether further examination for illness is needed. In some special situations, a safe distance must be kept between people, which means body temperature cannot be measured by contact methods.
For example, respiratory infectious diseases are highly contagious and can pose a major hazard to public safety when infected people are present in public places or crowds. To reduce the transmission of respiratory infectious diseases, close contact between people should be reduced. In such cases, measuring body temperature by contact is undesirable.
When body temperature cannot be measured by contact, it can only be measured in a non-contact manner. In the existing non-contact approach, a temperature measurement terminal photographs the face of the subject with an RGB camera and with a thermal imaging camera, obtaining an RGB image and a temperature heat map respectively. The terminal determines a first face region from the RGB image by face detection, then takes the pixel region at the corresponding position in the temperature heat map as the target pixel region, and derives the subject's body temperature from the temperature of the target pixel region.
Because the RGB camera and the thermal imaging camera are installed at different positions, the position of the subject's face in the RGB image differs from its position in the temperature heat map. The target pixel region determined from the heat map may therefore deviate from the subject's first face region, making the measured body temperature inaccurate.
In view of this, the embodiments of the present application provide a calibration method to determine the displacement difference (parallax) between the images captured by the RGB camera and the thermal imaging camera, thereby improving the accuracy of the measured body temperature.
The execution subject of the embodiments of the present application is a calibration device. Optionally, the calibration device may be one of the following: a cell phone, a computer, a server, a tablet computer. The embodiments of the present application are described below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flow chart of a calibration method according to an embodiment of the present application.
101. And responding to the received calibration triggering instruction, and entering a calibration user interface.
In the embodiment of the application, the calibration trigger instruction instructs the calibration device to start the calibration program. In one possible implementation, the calibration device has a communication connection with a display, through which it shows an information box asking whether to start the calibration program. The user can input a calibration trigger instruction to the calibration device through this information box.
For example, the information box contains the options "start the calibration program" and "do not start the calibration program". The user can input a calibration trigger instruction to the calibration device by clicking "start the calibration program".
In another possible implementation, the user inputs the calibration trigger instruction to the calibration device by inputting voice data carrying information to start the calibration procedure to the calibration device.
In another possible implementation manner, the calibration device receives the calibration trigger instruction sent by a terminal. Optionally, the terminal may be any of the following: a cell phone, a computer, a tablet computer, a server, a wearable device.
In response to receiving the calibration trigger instruction, the calibration device enters the calibration user interface to perform subsequent calibration processing. Optionally, the calibration device enters the calibration user interface by switching to that interface after receiving the calibration trigger instruction.
102. And responding to the received operation instruction for moving the first face outline frame, and moving the first face outline frame according to the operation instruction.
In an embodiment of the present application, the calibration user interface includes a first display area, where the first display area includes a thermal infrared image and a first face outline box. The first display area may be a part of the calibration user interface, or the first display area may be the whole area of the calibration user interface.
In this embodiment of the present application, the first face contour box may be a face contour shown in fig. 2. Optionally, the calibration device acquires the first face contour frame before executing step 102. In one implementation of acquiring the first face contour frame, the calibration device receives the first face contour frame input by the user through the input component. Optionally, the input assembly includes: a keyboard, a mouse, a touch screen, a touch pad, an audio input device, and the like.
In another implementation manner of acquiring the first face outline frame, the calibration device receives the first face outline frame sent by the terminal.
In the embodiment of the application, the initial position of the first face outline box in the calibration user interface is fixed. The calibration device responds to the received calibration triggering instruction, displays a calibration user interface, and displays a first face outline frame at an initial position in the calibration user interface.
In this embodiment of the present application, the operation instruction for moving the first face outline box is used to instruct the calibration device to move the first face outline box on the calibration user interface.
In one possible implementation, the user inputs an operation instruction for moving the first face contour frame to the calibration device through the input component, so that the calibration device moves the first face contour frame according to the operation instruction. For example, the user inputs an operation instruction for moving the first face contour frame to the left by 10 pixel units to the calibration device through the input component, and the calibration device moves the first face contour frame to the left by 10 pixel units in the calibration user interface when receiving the operation instruction.
103. And outputting calibration completion information under the condition that the first face outline frame is overlapped with the first face area in the thermal infrared image.
In the embodiment of the application, the thermal infrared image is a face image acquired by a thermal imaging device. Optionally, after entering the calibration user interface, the calibration device displays the thermal infrared image in the calibration user interface.
In this embodiment of the present application, the initial position of the first face outline box in the calibration user interface is the position of the first face area in the visible light image when the visible light image is displayed in the calibration user interface. For example, if the visible light image contains a first face region a, and the position of region a when the visible light image is displayed in the calibration user interface is A, then the initial position of the first face outline frame is A.
In the embodiment of the application, the thermal infrared image and the visible light image are binocular images, namely, the thermal infrared image and the visible light image are two images obtained by shooting the same face at the same time by different imaging devices, wherein the imaging device for collecting the visible light image is a visible light imaging device. Alternatively, the visible light imaging device is an RGB imaging device.
For example, the temperature measurement terminal includes an RGB camera and a thermal imaging camera. The terminal photographs Zhang San's face with the RGB camera to obtain an RGB image and, while the RGB camera captures the RGB image, photographs the face with the thermal imaging camera to obtain a thermal infrared image.
Because the installation position of the visible light imaging device may differ from that of the thermal infrared imaging device, the position of the subject's face in the visible light image differs from its position in the thermal infrared image. Therefore, before the subject's body temperature is determined based on the visible light image and the thermal infrared image, the parallax between the two images needs to be obtained by calibrating them.
Because the initial position of the first face outline frame in the calibration user interface is the position of the first face area in the visible light image, the calibration between the visible light image and the thermal infrared image can be completed by moving the first face outline frame to enable the first face outline frame to coincide with the first face area in the thermal infrared image.
For example, suppose the calibration device moves the first face outline frame 5 pixel units to the right in the calibration user interface and the frame then coincides with the first face region in the thermal infrared image. This means that moving the visible light image 5 pixel units to the right would make it coincide with the thermal infrared image, i.e., the parallax between the visible light image and the thermal infrared image is 5 pixel units.
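As a concrete illustration, the parallax can be read off directly from the frame's displacement. This is a minimal sketch under assumed conventions; the (x, y) coordinate format and names such as parallax_from_displacement are illustrative, not from the patent:

```python
# Minimal sketch: parallax as the displacement of the face outline frame
# between its initial position (the face position in the visible light
# image) and the termination position where it coincides with the face
# region in the thermal infrared image.
def parallax_from_displacement(initial_pos, termination_pos):
    dx = termination_pos[0] - initial_pos[0]
    dy = termination_pos[1] - initial_pos[1]
    return (dx, dy)  # in pixel units

# The example above: moving 5 pixel units to the right.
print(parallax_from_displacement((120, 80), (125, 80)))  # -> (5, 0)
```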
When the calibration device determines that the first face outline frame coincides with the first face region in the thermal infrared image, it outputs calibration completion information, informing the user that the calibration between the thermal infrared image and the visible light image is complete.
In one possible implementation, the calibration device has a communication connection with the display. The calibration device enables the display to output calibration completion information through the communication connection.
For example, the calibration device may display calibration complete information in a calibration user interface; for another example, the calibration device may switch the calibration user interface to another interface and display calibration completion information in the switched interface.
In another possible implementation, the calibration device may output calibration complete information by outputting voice data.
In yet another possible implementation, the calibration device may output calibration completion information by controlling the indicator light to blink.
In the embodiment of the application, the calibration device determines that calibration between the visible light image and the thermal infrared image is complete when the first face outline frame coincides with the first face region in the thermal infrared image.
Based on the technical solution provided by the embodiments of the present application, the calibration device displays the calibration user interface so that the user can, by moving the face outline frame, determine whether it coincides with the first face region in the thermal infrared image and thereby complete the calibration between the visible light image and the thermal infrared image, reducing the operational complexity for the user to obtain the parallax between the two images.
As an alternative embodiment, the calibration device further performs the following steps:
1. When the first face outline frame coincides with the first face region in the thermal infrared image, take the position of the first face outline frame in the interface as the termination position.
2. Determine the displacement difference between the initial position and the termination position of the first face outline frame in the interface as the parallax between the thermal infrared image and the visible light image.
In the embodiment of the present application, the parallax between homonymous points refers to the difference between the positions, in their respective images, of two pixel points that are homonymous (corresponding) points in the binocular images.
In the embodiment of the present application, the parallax between the binocular images is obtained from the parallaxes of all homonymous points in the binocular images. Optionally, the average of the parallaxes of all homonymous points is taken as the parallax between the binocular images.
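A minimal sketch of the averaging option, assuming homonymous points are supplied as coordinate pairs (the data layout and names are assumptions for illustration):

```python
# Hypothetical sketch: take the parallax between binocular images as the
# mean of the parallaxes of all homonymous-point pairs.
def mean_parallax(pairs):
    """pairs: iterable of ((x1, y1), (x2, y2)) homonymous-point coordinates."""
    dxs = [x2 - x1 for (x1, _), (x2, _) in pairs]
    dys = [y2 - y1 for (_, y1), (_, y2) in pairs]
    n = len(dxs)
    return (sum(dxs) / n, sum(dys) / n)

print(mean_parallax([((10, 5), (15, 5)), ((40, 30), (44, 31))]))  # -> (4.5, 0.5)
```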
In the embodiment of the application, the calibration device obtains the parallax between the visible light image and the thermal infrared image by determining the displacement difference, which reduces both the amount of data processing required and the operational complexity for the user to obtain the parallax.
As an alternative embodiment, the calibration device further performs the following steps:
3. In response to receiving a calibration completion instruction, display the coincidence result of the first face outline frame and the first face area in the first display area.
In the embodiment of the application, the coincidence result includes, but is not limited to, a coincidence ratio and a forehead region of the face. The coincidence ratio may be one of the following: a numerical value, or a coincidence effect diagram between the first face outline frame and the first face region in the thermal infrared image.
The forehead region of the face refers to the forehead region within the first face outline frame. When body temperature is measured based on the visible light image and the thermal infrared image, it is usually determined by measuring the temperature of the person's forehead. Displaying the forehead region in the calibration user interface therefore lets the user see the degree of coincidence between the forehead region in the face outline frame and the forehead region in the thermal infrared image, which further helps improve the accuracy of the body temperature obtained from the thermal infrared image and the visible light image.
As an alternative embodiment, the calibration means further performs the following steps before performing the displaying of the result of the coincidence of the first face contour frame and the first face region in the calibration user interface:
4. Obtain a visible light image.
As described above, the visible light image and the thermal infrared image may be binocular images. In one implementation of obtaining a visible light image, the calibration device receives the visible light image input by a user through the input assembly.
In another implementation of obtaining the visible light image, the calibration device receives the visible light image sent by the terminal.
In yet another implementation of acquiring a visible light image, the calibration device is loaded with a visible light imaging device. The calibration device collects visible light images by using visible light imaging equipment.
5. Perform face recognition processing on the visible light image to obtain the position of the reference pixel region in the visible light image.
In this embodiment of the present application, the reference pixel region is the region in the visible light image corresponding to the first face region, i.e., the reference pixel region and the first face region in the thermal infrared image correspond to the same person's face.
The calibration device can determine the position of the first face region in the visible light image by performing face recognition processing on the visible light image. When the visible light image contains exactly one first face region, the calibration device takes that region as the reference pixel region. When the visible light image contains more than one first face region, the calibration device takes the first face region with the largest area as the reference pixel region.
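A minimal sketch of that selection rule, assuming detected faces come back as (x, y, w, h) boxes (the box format and names are assumptions):

```python
# Sketch: pick the reference pixel region -- the only detected face box,
# or the one with the largest area when several faces are detected.
def reference_region(face_boxes):
    if not face_boxes:
        return None  # no face detected in the visible light image
    return max(face_boxes, key=lambda b: b[2] * b[3])  # largest w * h

print(reference_region([(10, 10, 50, 60), (200, 40, 80, 90)]))  # -> (200, 40, 80, 90)
```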
6. Determine the position of the first face region in the thermal infrared image according to the reference parallax and the position of the reference pixel region in the visible light image.
In the embodiment of the present application, the reference parallax is a parallax between a visible light image and a thermal infrared image. The calibration device can determine the position of the first face region in the thermal infrared image according to the reference parallax and the position of the reference pixel point region in the visible light image.
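For illustration, a minimal sketch of that mapping, assuming the region is an (x, y, w, h) box and the parallax a pixel offset (both conventions, and the sign of the offset, are assumptions):

```python
# Sketch: shift the reference pixel region by the reference parallax to
# locate the first face region in the thermal infrared image.
def predict_thermal_position(visible_box, reference_parallax):
    x, y, w, h = visible_box
    dx, dy = reference_parallax
    return (x + dx, y + dy, w, h)

print(predict_thermal_position((120, 80, 60, 70), (5, 0)))  # -> (125, 80, 60, 70)
```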
7. Obtain the coincidence result of the first face outline frame and the first face region according to the position of the first face region in the thermal infrared image and the termination position.
In one possible implementation manner, the calibration device determines, as the coincidence result, the area coincidence ratio between the first face outline frame and the first face region in the thermal infrared image according to the position of the first face region in the thermal infrared image and the termination position.
In another possible implementation manner, the calibration device obtains, as the coincidence result, a coincidence effect diagram between the first face outline frame and the first face region in the thermal infrared image according to the position of the first face region in the thermal infrared image and the termination position.
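The patent does not fix the exact formula for the area coincidence ratio; intersection-over-union is one natural choice. A sketch under that assumption, with boxes in (x, y, w, h) form:

```python
# Sketch: intersection-over-union of two axis-aligned boxes as one
# possible area coincidence ratio (an illustrative choice, not the
# patent's prescribed metric).
def coincidence_ratio(box_a, box_b):
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    ih = max(0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

print(coincidence_ratio((125, 80, 60, 70), (125, 80, 60, 70)))  # -> 1.0
```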
As an alternative embodiment, the calibration device performs the following steps in performing step 102:
8. When a movement instruction generated by an object sliding on the user interface is detected, move the first face outline frame along the object's sliding direction on the user interface.
In this step, the calibration device is in communication connection with a touch display. The calibration device detects a movement instruction by detecting the object's sliding operation on the touch display.
When the movement instruction is detected, the calibration device moves the first face outline frame along the object's sliding direction on the user interface. For example, the user slides the first face outline frame with a finger on the touch display; as another example, the user slides it with a stylus.
Optionally, the position where the object contacts the touch display may or may not be located on the first face outline frame. For example, when the touch display area is so large that one-handed operation is difficult, an area reachable by the user's fingers (such as the lower-left or lower-right corner of the touch display) can be used as a control area, and the user can move the first face outline frame by sliding an object within this control area.
As an alternative embodiment, the calibration device further performs the following steps before performing step 102:
9. Display at least one virtual direction button in the calibration user interface.
In an embodiment of the present application, the at least one virtual direction button includes at least one of: an up virtual button, a down virtual button, a left virtual button, and a right virtual button.
The virtual direction buttons are used to move the first face outline frame. Specifically, a touch on the up virtual button indicates that the first face outline frame should move toward the top of the calibration user interface; a touch on the down virtual button indicates that it should move toward the bottom; a touch on the left virtual button indicates that it should move to the left; and a touch on the right virtual button indicates that it should move to the right.
After step 9 is performed, the calibration device performs the following step in the process of performing step 102:
10. When it is detected that an object touches the at least one virtual direction button through the screen of the calibration device, move the first face outline frame in the direction indicated by the touched virtual direction button.
In one possible implementation, the at least one virtual direction button includes an up virtual button. When the calibration device detects that an object touches the up virtual button, it moves the first face outline frame n pixel units toward the top of the calibration user interface. The specific value of n can be set according to actual requirements.
For example, if n=1, the object touches the up virtual button once, and the calibration device moves the first face contour frame 1 pixel unit towards the upper side of the calibration user interface; if n=3, the object touches the up virtual button once, and the calibration device moves the first face contour frame 3 pixel units to the upper side of the calibration user interface.
In another possible implementation, the at least one virtual direction button includes a down virtual button. When the calibration device detects that an object touches the down virtual button, it moves the first face outline frame m pixel units toward the bottom of the calibration user interface. The specific value of m can be set according to actual requirements.
For example, if m=1, the object touches the down virtual button once, the calibration device moves the first face contour frame 1 pixel unit towards the underside of the calibration user interface; if m=2, the object touches the down virtual button once and the calibration device moves the first face contour frame 2 pixel units down the calibration user interface.
In another possible implementation, the at least one virtual direction button includes a left virtual button. When the calibration device detects that an object touches the left virtual button, it moves the first face outline frame i pixel units toward the left of the calibration user interface. The specific value of i can be set according to actual requirements.
For example, if i=1, then the object touches the virtual left button once, the calibration device moves the first face outline frame 1 pixel unit to the left of the calibration user interface; if i=2, the object touches the left virtual button once, and the calibration device moves the first face contour frame to the left of the calibration user interface by 2 pixel units.
In another possible implementation, the at least one virtual direction button includes a right virtual button. When the calibration device detects that an object touches the right virtual button, it moves the first face outline frame j pixel units toward the right of the calibration user interface. The specific value of j can be set according to actual requirements.
For example, if j=1, then the object touches the virtual right button once, the calibration device moves the first face outline frame 1 pixel unit to the right of the calibration user interface; if j=2, the object touches the virtual right button once, and the calibration device moves the first face contour frame 2 pixel units to the right of the calibration user interface.
The user can control the movement of the first face outline frame by touching the virtual direction button, so that the first face outline frame is overlapped with the first face area in the thermal infrared image.
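A minimal sketch of this button-driven movement (screen coordinates with y increasing downward; the step sizes and names are illustrative assumptions, not the patent's interface):

```python
# Sketch: each press of a virtual direction button moves the frame a fixed
# number of pixel units (the n/m/i/j values above). Illustrative only.
STEP = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

def on_button_press(position, direction, units=1):
    """Return the frame's new top-left position after one button press."""
    dx, dy = STEP[direction]
    return (position[0] + dx * units, position[1] + dy * units)

pos = (120, 80)
pos = on_button_press(pos, "right", units=2)  # j = 2 -> (122, 80)
pos = on_button_press(pos, "up")              # n = 1 -> (122, 79)
print(pos)
```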
As an alternative embodiment, the first face contour is derived from at least one reference first face contour. The at least one reference first face contour frame is obtained by performing face detection processing on the at least one face image. The acquisition conditions of at least one face image are actual acquisition conditions, wherein the actual acquisition conditions are image acquisition conditions under the application environment of the calibration device.
In the embodiment of the application, the face detection process is used for processing the face image to obtain the face outline and the face key points. The electronic equipment can obtain at least one face outline and at least one face key point by carrying out face detection processing on one face image. The electronic device performs face detection processing on at least one face image to obtain at least one reference first face outline frame.
The electronic device may further obtain a first face contour frame according to at least one reference first face contour frame. For example, the electronic device obtains the first face contour frame by fitting at least one reference first face contour frame. For another example, the electronic device may average at least one reference first face contour frame to obtain the first face contour frame. For another example, the electronic device may use any one of the at least one reference first face contour as the first face contour.
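A minimal sketch of the averaging option, assuming each reference frame is a list of (x, y) contour points in a shared order (a data-layout assumption for illustration):

```python
# Sketch: derive the first face outline frame as the point-wise average of
# several reference frames. Assumes all frames share the same point order.
def average_contour(reference_frames):
    n = len(reference_frames)
    return [
        (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
        for pts in zip(*reference_frames)
    ]

f1 = [(0, 0), (10, 0), (10, 12)]
f2 = [(2, 0), (12, 2), (12, 14)]
print(average_contour([f1, f2]))  # -> [(1.0, 0.0), (11.0, 1.0), (11.0, 13.0)]
```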
In the embodiment of the application, the acquisition conditions include at least one of the following factors: brightness and shooting angle. For example, the acquisition conditions in a waiting hall differ from outdoor acquisition conditions. As another example, a waiting hall may contain calibration device 1 and calibration device 2 whose cameras have different shooting angles; the acquisition conditions of calibration device 1 then differ from those of calibration device 2.
Clearly, face images of the same person captured under different acquisition conditions differ. For example, face image 1 is obtained by photographing Zhang San's face under acquisition condition 1, and face image 2 is obtained by photographing Zhang San's face under acquisition condition 2; the face contour of Zhang San in face image 1 then differs from the face contour of Zhang San in face image 2.
In the embodiment of the application, the image acquisition condition under the application environment of the calibration device is referred to as an actual acquisition condition. For example, the calibration device is applied to the hall of company a, and then the actual acquisition condition is the acquisition condition when the calibration device acquires an image in the hall of company a.
When the acquisition conditions of the at least one face image are the same as the actual acquisition conditions, the first face outline frame obtained from the at least one reference first face outline frame is closer to the face contour the calibration device would capture under the actual acquisition conditions, which improves the calibration accuracy between the visible light image and the thermal infrared image.
As an optional implementation manner, the first face outline frame further includes at least one of the following face key points: eye key points and forehead key points. When the first face outline frame includes at least one face key point, the user can judge whether the first face outline frame coincides with the first face region in the thermal infrared image not only by whether the boundary of the face contour in the frame coincides with the boundary of the first face region, but also by whether the at least one face key point in the frame coincides with the corresponding face key point in the first face region.
For example, assume the at least one face key point includes an eye key point. When judging whether the first face outline frame coincides with the first face region in the thermal infrared image, the user checks both whether the face contour boundary in the frame coincides with the boundary of the first face region and whether the eye key point in the frame coincides with the eye key point in the first face region.
As an alternative embodiment, the calibration device resets the first face contour frame to its initial position upon receiving an instruction to reset the first face contour frame. In this embodiment, the user can return the first face contour frame to its initial position by inputting such a reset instruction to the calibration device.
As an alternative embodiment, the calibration user interface further comprises a second display area, the calibration means further performing the steps of:
11. Display a superposition effect preview image of the first face outline frame and the first face area in the second display area.
In this embodiment of the present application, the first display area is different from the second display area. Optionally, there is no intersection between the second display area and the first display area.
In this embodiment of the present application, the overlapping effect preview image includes an overlapping effect image of a second face contour frame and a second face area. A temperature measuring area is marked inside the second face contour frame. Optionally, the temperature measuring area may be a spot marked with a particular color. For example, assume the temperature measuring area is a green dot; the green dot then indicates the temperature measurement spot of the thermal infrared imaging device.
Alternatively, the temperature measuring region may be a region marked with a frame, and in this case, the temperature measuring region is a temperature measuring region of the thermal infrared imaging apparatus.
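For illustration only, the following sketch draws such an overlay preview: the second face contour frame plus a green dot marking the temperature measurement spot. The use of OpenCV, the canvas size, and the spot position are assumptions:

```python
import cv2
import numpy as np

preview = np.zeros((240, 320, 3), dtype=np.uint8)  # overlay preview canvas
second_face_frame = (60, 40, 160, 160)             # (left, top, right, bottom)

# Draw the second face contour frame as a thin white rectangle.
cv2.rectangle(preview, second_face_frame[:2], second_face_frame[2:],
              (255, 255, 255), 1)

# Mark the thermometry spot in green, here near the top (forehead) of the frame.
spot = ((60 + 160) // 2, 40 + 20)
cv2.circle(preview, spot, 3, (0, 255, 0), -1)      # BGR green, filled dot
```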
In this embodiment of the present application, a position of the second face contour frame in the second display area corresponds to a position of the first face contour frame in the first display area. If the first face outline frame moves leftwards in the first display area, the second face outline frame moves leftwards in the second display area; if the first face contour frame moves upwards in the first display area, the second face contour frame moves upwards in the second display area.
For example, the user may move the first face contour frame upward within the first display area by clicking the up virtual button, at which time the second face contour frame moves upward within the second display area.
Optionally, the ratio between the moving distance of the first face outline frame in the first display area and the moving distance of the second face outline frame in the second display area is a preview ratio.
For example, assume that the preview scale is 2. If the user clicks the upward virtual button, the first face outline frame moves upward by 4 pixel units in the first display area, and at this time, the second face outline frame moves upward by 2 pixel units in the second display area.
In this embodiment of the present application, the second face area corresponds to the first face area, that is, in the overlapping effect preview image, the second face area is a preview image of the first face area. Optionally, the ratio between the size of the first face region and the size of the second face region is a preview ratio.
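A minimal sketch of this preview-ratio mapping, using the preview scale of 2 from the example above; the function name and fixed scale constant are assumptions:

```python
PREVIEW_SCALE = 2.0  # ratio of first-display-area distance to second-display-area distance

def preview_move(dx: float, dy: float, scale: float = PREVIEW_SCALE):
    """Map a movement of the first face contour frame in the first display
    area to the movement of the second face contour frame in the preview."""
    return dx / scale, dy / scale

# The text's example: the up button moves the frame 4 pixel units upward,
# so the preview frame moves 2 pixel units upward.
print(preview_move(0, -4))  # (0.0, -2.0)
```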
By executing step 11, the calibration device can display the position of the temperature measuring area through the superposition effect preview image while the first face outline frame is being moved. In this way, the user can know which part of the temperature measurement object is measured when the thermal infrared imaging device acquires the thermal infrared image. For example, if the temperature measuring area falls within the right forehead of the second face area, the temperature obtained from the visible light image and the thermal infrared image is the temperature of the right forehead of the temperature measurement object.
As an alternative embodiment, before performing step 103, the calibration device further performs the following steps:
12. In response to receiving an amplifying instruction for the thermal infrared image and the face outline frame, amplify the thermal infrared image and the face outline frame according to the amplifying instruction.
In the embodiment of the present application, the amplifying instruction is used to instruct the calibration device to amplify the thermal infrared image and the face outline frame. Optionally, the calibration device simultaneously amplifies the thermal infrared image and the first face outline frame according to the amplifying instruction.
13. In response to receiving a shrinking instruction for the thermal infrared image and the face outline frame, reduce the thermal infrared image and the face outline frame according to the shrinking instruction.
In the embodiment of the present application, the shrinking instruction is used to instruct the calibration device to reduce the thermal infrared image and the face outline frame. Optionally, the calibration device simultaneously reduces the thermal infrared image and the first face outline frame according to the shrinking instruction.
The user inputs an amplifying instruction or a shrinking instruction to the calibration device in order to better observe the superposition effect between the first face outline frame and the first face area.
Optionally, the user may input an amplifying instruction to the calibration device by touching a zoom-in button in the first display area, and may input a shrinking instruction by touching a zoom-out button in the first display area.
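The joint scaling described in steps 12 and 13 can be sketched as follows, assuming both the displayed image region and the frame are scaled about the display origin by a fixed step so that their relative alignment is preserved; the step value and all names are assumptions:

```python
def zoom_box(box, factor):
    """Scale a (left, top, right, bottom) box about the display origin."""
    return tuple(c * factor for c in box)

def apply_zoom(thermal_box, face_frame, instruction, step=1.25):
    # An amplifying instruction scales both boxes up by `step`; a shrinking
    # instruction scales both down by the same step.
    factor = step if instruction == "amplify" else 1.0 / step
    return zoom_box(thermal_box, factor), zoom_box(face_frame, factor)

# Example: amplifying the displayed thermal image region and the frame together.
print(apply_zoom((0, 0, 320, 240), (100, 80, 220, 240), "amplify"))
```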
Based on the technical scheme disclosed by the embodiment of the application, the embodiment of the application also provides a possible application scene.
As described above, when the temperature measurement terminal performs non-contact temperature measurement, it determines a first face region from the RGB image and then determines the pixel region corresponding to the first face region in the thermal infrared image as the target pixel point area. The temperature measurement terminal obtains the body temperature of the object to be measured according to the temperature of the target pixel point area. However, if there is a large parallax between the RGB image and the thermal infrared image, the obtained body temperature of the object to be measured has a large error, and an unreasonable body temperature may even be obtained.
For example, measuring Zhang San's body temperature with the temperature measurement terminal may yield 30 °C; in another measurement it may yield 43 °C. Obviously, both 30 °C and 43 °C are unreasonable body temperatures.
When the obtained body temperature is obviously unreasonable, the reason may be that the displacement between the RGB camera and the thermal imaging camera is large, resulting in a large parallax between the RGB image and the thermal infrared image. In this case, the user can determine the parallax between the RGB image and the thermal infrared image using the technical solution provided by the embodiments of the present application, thereby improving the accuracy of the body temperature of the object to be measured.
It should be understood that, when acquiring Zhang San's face image, the temperature measurement terminal displays the acquired first face outline frame on its display, so that Zhang San can adjust the position of his face until his face lies within the displayed first face outline frame. The position of the first face outline frame in the acquisition user interface of the display is the same as the position of the first face outline frame in the calibration user interface.
Under the condition that the temperature measurement terminal determines that Zhang San's face is within the first face outline frame, it uses the RGB camera to acquire an image containing Zhang San's face as the RGB image, and uses the thermal imaging camera to acquire an image containing Zhang San's face as the thermal infrared image.
Based on the technical solution provided by the embodiments of the present application, together with the RGB image and the thermal infrared image, the temperature measurement terminal can complete the calibration between the RGB image and the thermal infrared image.
Specifically, the user moves the first face outline frame until it coincides with the first face area in the thermal infrared image, and then inputs to the temperature measurement terminal an instruction indicating that the first face outline frame coincides with the first face area in the thermal infrared image. The temperature measurement terminal can then determine the parallax between the RGB image and the thermal infrared image. Based on this parallax, the temperature measurement terminal can determine the target pixel point area in the thermal infrared image, thereby improving the accuracy of the body temperature of the object to be measured.
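Putting the pieces together, the following sketch shows how the termination position could yield the reference parallax, how that parallax maps the RGB face region to the target pixel point area, and how a temperature is then read from that area. The function names, the use of the region maximum as the temperature statistic, and the synthetic data are assumptions:

```python
import numpy as np

def reference_parallax(initial_pos, termination_pos):
    """Parallax = how far the user moved the frame to reach coincidence."""
    return (termination_pos[0] - initial_pos[0],
            termination_pos[1] - initial_pos[1])

def target_region(rgb_face_box, parallax):
    """Shift the RGB face box by the parallax to locate the thermal region."""
    dx, dy = parallax
    x1, y1, x2, y2 = rgb_face_box
    return (x1 + dx, y1 + dy, x2 + dx, y2 + dy)

def region_temperature(thermal_map: np.ndarray, box):
    """Read a body-temperature estimate (here, the region maximum)."""
    x1, y1, x2, y2 = (int(v) for v in box)
    return float(thermal_map[y1:y2, x1:x2].max())

# Usage: the user moved the frame 12 px right and 5 px down to coincidence.
parallax = reference_parallax((100, 80), (112, 85))
roi = target_region((100, 80, 220, 240), parallax)
thermal_map = np.full((480, 640), 36.5)  # synthetic per-pixel temperatures
print(region_temperature(thermal_map, roi))  # 36.5
```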
For example, FIG. 3 illustrates a calibration user interface displaying a first face outline frame and a thermal infrared image. In this interface, the dashed box is the first face outline frame. Four virtual direction buttons are arranged at the bottom of the interface, used to move the first face outline frame up, down, left, and right, respectively. The user can move the first face outline frame by touching these virtual direction buttons until it coincides with the first face area in the thermal infrared image.
It will be appreciated by those skilled in the art that, in the methods of the specific embodiments described above, the written order of the steps does not imply a strict order of execution; the actual order of execution should be determined by the functions of the steps and their possible internal logic.
The foregoing details the method of embodiments of the present application, and the apparatus of embodiments of the present application is provided below.
Referring to fig. 4, fig. 4 is a schematic structural diagram of a calibration device according to an embodiment of the present application, where the calibration device 1 includes: a first processing unit 11, a second processing unit 12, an output unit 13, a first display unit 14, an acquisition unit 15, a third processing unit 16, a fourth processing unit 17, a fifth processing unit 18, a second display unit 19, and a reset unit 20, wherein:
The first processing unit 11 is configured to enter a calibration user interface in response to a received calibration trigger instruction, where the interface includes a thermal infrared image and a first face outline frame;
a second processing unit 12, configured to, in response to receiving an operation instruction to move the first face contour frame, move the first face contour frame according to the operation instruction;
and the output unit 13 is used for outputting calibration completion information under the condition that the first face outline frame is overlapped with the first face area in the thermal infrared image.
In combination with any of the embodiments of the present application, the calibration device 1 further comprises:
a first display unit 14, configured to display, in response to receiving an instruction indicating that calibration is complete, a coincidence result of the first face outline frame and the first face area in the calibration user interface, where the coincidence result includes, but is not limited to, a coincidence ratio and a face forehead area.
In combination with any one of the embodiments of the present application, the calibration device further includes:
an obtaining unit 15, configured to obtain a visible light image before displaying a superposition result of the first face outline frame and the first face area in the calibration user interface;
the third processing unit 16 is configured to perform face recognition processing on the visible light image to obtain a position of the reference pixel point area in the visible light image; the reference pixel point area is an area corresponding to the first face area in the visible light image;
A fourth processing unit 17, configured to determine a position of the first face region in the thermal infrared image according to a reference parallax and a position of the reference pixel point region in the visible light image; the reference parallax is a parallax between the visible light image and the thermal infrared image;
and a fifth processing unit 18, configured to obtain a superposition result of the first face contour frame and the first face region according to the position of the first face region in the thermal infrared image and the termination position.
In combination with any one of the embodiments of the present application, the second processing unit 12 is configured to:
move the first face outline frame along the sliding direction of the object on the user interface under the condition that a movement instruction of an object sliding on the user interface is detected.
In combination with any one of the embodiments of the present application, the calibration device further includes: a second display unit 19, configured to display at least one virtual direction button in the calibration user interface before the operation instruction for moving the first face contour frame is received; the at least one virtual direction button includes at least one of: an up virtual button, a down virtual button, a left virtual button, a right virtual button;
The second processing unit 12 is configured to:
move the first face outline frame according to the direction indicated by the touched virtual direction button under the condition that an object is detected touching the at least one virtual direction button through the screen of the calibration device.
In combination with any embodiment of the present application, the first face contour frame is obtained according to at least one reference first face contour frame; the at least one reference first face outline frame is obtained by carrying out face detection processing on at least one face image; the acquisition conditions of the at least one face image are actual acquisition conditions; the actual acquisition conditions are image acquisition conditions under the application environment of the calibration device.
In combination with any of the embodiments of the present application, the calibration device 1 further comprises:
and a resetting unit 20, configured to reset the first face contour frame to an initial position when receiving an instruction to reset the first face contour frame.
In combination with any one of the embodiments of the present application, the first face contour frame includes at least one face key point;
the overlapping of the first face outline frame and the first face area in the thermal infrared image comprises: the first face contour frame coincides with a boundary of the first face region and the at least one face key point coincides with a corresponding face key point in the first face region.
In combination with any one of the embodiments of the present application, the calibration user interface further includes a second display area different from the first display area, and the first processing unit 11 is further configured to display, in the second display area, a superposition effect preview image of the first face contour frame and the first face area; the superposition effect preview image comprises an overlapping effect image of a second face outline frame and the second face area; a temperature measuring area is marked in the second face outline frame; the position of the second face outline frame in the second display area corresponds to the position of the first face outline frame in the first display area; the second face region corresponds to the first face region.
In combination with any one of the embodiments of the present application, the second processing unit 12 is further configured to, before outputting calibration completion information if the first face contour frame coincides with the first face region in the thermal infrared image, respond to receiving an instruction for amplifying the thermal infrared image and the first face contour frame, and amplify the thermal infrared image and the first face contour frame according to the instruction for amplifying;
and in response to receiving a zoom-out instruction for the thermal infrared image and the first face outline box, zoom out the thermal infrared image and the first face outline box according to the zoom-out instruction.
In some embodiments, functions or modules included in the apparatus provided in the embodiments of the present application may be used to perform the methods described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
Fig. 5 is a schematic hardware structure of a calibration device according to an embodiment of the present application. The calibration device 2 comprises a processor 21, a memory 22, an input device 23 and an output device 24. The processor 21, memory 22, input device 23, and output device 24 are coupled by connectors, including various interfaces, transmission lines or buses, etc., as not limited in this application. It should be understood that in various embodiments of the present application, coupled is intended to mean interconnected by a particular means, including directly or indirectly through other devices, e.g., through various interfaces, transmission lines, buses, etc.
The processor 21 may be one or more graphics processors (graphics processing unit, GPUs), which may be single-core GPUs or multi-core GPUs in the case where the processor 21 is a GPU. Alternatively, the processor 21 may be a processor group formed by a plurality of GPUs, and the plurality of processors are coupled to each other through one or more buses. In the alternative, the processor may be another type of processor, and the embodiment of the present application is not limited.
Memory 22 may be used to store computer program instructions as well as various types of computer program code for performing aspects of the present application. Optionally, the memory includes, but is not limited to, a random access memory (random access memory, RAM), a read-only memory (ROM), an erasable programmable read-only memory (erasable programmable read only memory, EPROM), or a portable read-only memory (compact disc read-only memory, CD-ROM) for associated instructions and data.
The input means 23 are for inputting data and/or signals and the output means 24 are for outputting data and/or signals. The input device 23 and the output device 24 may be separate devices or may be an integral device.
It can be understood that, in the embodiment of the present application, the memory 22 may be used to store not only related instructions, but also related data, for example, the memory 22 may be used to store calibration triggering instructions obtained through the input device 23, or the memory 22 may also be used to store parallax between the visible light image and the thermal infrared image obtained through the processor 21, etc., where the embodiment of the present application does not limit the data specifically stored in the memory.
It will be appreciated that figure 5 shows only a simplified design of a calibration device. In practical applications, the calibration device may also include other necessary elements, including but not limited to any number of input/output devices, processors, memories, etc., and all calibration devices that can implement the embodiments of the present application are within the scope of protection of the present application.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein. It will be further apparent to those skilled in the art that the descriptions of the various embodiments herein are provided with emphasis, and that the same or similar parts may not be explicitly described in different embodiments for the sake of convenience and brevity of description, and thus, parts not described in one embodiment or in detail may be referred to in the description of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in or transmitted across a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line (digital subscriber line, DSL)), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital versatile disk (digital versatile disc, DVD)), or a semiconductor medium (e.g., a Solid State Disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: a read-only memory (ROM) or a random access memory (random access memory, RAM), a magnetic disk or an optical disk, or the like.

Claims (11)

1. A method of calibration, the method comprising:
responding to a received calibration triggering instruction, and entering a calibration user interface, wherein a first display area of the interface comprises a thermal infrared image and a first face outline frame; the initial position of the first face outline frame in the user interface is the position of a reference pixel point area in a visible light image, and the reference pixel point area is an area corresponding to the first face area in the visible light image;
responding to receiving an operation instruction for moving the first face outline frame, and moving the first face outline frame according to the operation instruction;
outputting calibration completion information under the condition that the first face outline frame is overlapped with a first face area in the thermal infrared image, and taking the position of the first face outline frame in the interface as a termination position; the calibration completion information comprises a reference parallax between the visible light image and the thermal infrared image;
obtaining a visible light image;
performing face recognition processing on the visible light image to obtain the position of the reference pixel point area in the visible light image;
determining the position of the first face region in the thermal infrared image according to the reference parallax and the position of the reference pixel point region in the visible light image;
obtaining a superposition result of the first face outline frame and the first face area according to the position of the first face area in the thermal infrared image and the termination position;
and in response to receiving the instruction of calibration completion, displaying the superposition result of the first face outline frame and the first face area in the first display area, wherein the superposition result comprises the superposition ratio and the face forehead area.
2. The method of claim 1, wherein the moving the first face contour box in accordance with the operation instruction in response to receiving the operation instruction to move the first face contour box comprises:
moving the first face outline frame along the sliding direction of the object on the user interface under the condition that a moving instruction of the object sliding on the user interface is detected.
3. A method according to claim 1 or 2, characterized in that the calibration method is applied to a calibration device;
before the operation instruction for moving the first face outline box is received, the method further comprises:
displaying at least one virtual direction button in the calibration user interface; the at least one virtual direction button includes at least one of: an up virtual button, a down virtual button, a left virtual button, a right virtual button;
the responding to the receiving of the operation instruction for moving the first face outline box, moving the first face outline box according to the operation instruction comprises the following steps:
moving the first face outline frame according to the direction indicated by the touched virtual direction button under the condition that an object is detected touching the at least one virtual direction button through the screen of the calibration device.
4. The method of claim 2, wherein the first face contour is derived from at least one reference first face contour; the at least one reference first face outline frame is obtained by carrying out face detection processing on at least one face image; the acquisition conditions of the at least one face image are actual acquisition conditions; the actual acquisition conditions are image acquisition conditions under the application environment of the calibration device.
5. The method according to claim 1 or 2, characterized in that the method further comprises:
and resetting the first face outline frame to an initial position under the condition that an instruction for resetting the first face outline frame is received.
6. The method according to claim 1 or 2, wherein the first face contour box comprises at least one face key point;
the overlapping of the first face outline frame and the first face area in the thermal infrared image comprises: the first face contour frame coincides with a boundary of the first face region and the at least one face key point coincides with a corresponding face key point in the first face region.
7. The method of claim 1 or 2, wherein the calibration user interface further comprises a second display area different from the first display area, the method further comprising:
displaying a superposition effect preview image of the first face outline frame and the first face area in the second display area; the overlapping effect preview image comprises an overlapping effect image of a second face outline frame and a second face area; a temperature measuring area is marked in the second face outline frame; the position of the second face outline frame in the second display area corresponds to the position of the first face outline frame in the first display area; the second face region corresponds to the first face region.
8. The method according to claim 1 or 2, wherein, in case the first face contour box coincides with a first face region in the thermal infrared image, the method further comprises, before outputting calibration complete information:
in response to receiving an enlargement instruction for the thermal infrared image and the first face contour frame, enlarging the thermal infrared image and the first face contour frame according to the enlargement instruction;
and in response to receiving a zoom-out instruction for the thermal infrared image and the first face outline box, zooming out the thermal infrared image and the first face outline box according to the zoom-out instruction.
9. A calibration device, characterized in that it comprises:
the first processing unit is used for responding to the received calibration triggering instruction and entering a calibration user interface, and a first display area of the interface comprises a thermal infrared image and a first face outline frame; the initial position of the first face outline frame in the user interface is the position of a reference pixel point area in a visible light image, and the reference pixel point area is an area corresponding to the first face area in the visible light image;
The second processing unit is used for responding to the received operation instruction for moving the first face outline frame and moving the first face outline frame according to the operation instruction;
the output unit is used for outputting calibration completion information under the condition that the first face outline frame is overlapped with a first face area in the thermal infrared image, and taking the position of the first face outline frame in the interface as a termination position; the calibration completion information comprises a reference parallax between the visible light image and the thermal infrared image;
obtaining a visible light image;
performing face recognition processing on the visible light image to obtain the position of the reference pixel point area in the visible light image;
determining the position of the first face region in the thermal infrared image according to the reference parallax and the position of the reference pixel point region in the visible light image;
obtaining a superposition result of the first face outline frame and the first face area according to the position of the first face area in the thermal infrared image and the termination position;
and in response to receiving the instruction of calibration completion, displaying the superposition result of the first face outline frame and the first face area in the first display area, wherein the superposition result comprises the superposition ratio and the face forehead area.
10. An electronic device, comprising: a processor and a memory for storing computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any one of claims 1 to 8.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1 to 8.
CN202011420092.0A 2020-12-07 2020-12-07 Calibration method and device, electronic equipment and storage medium Active CN112529947B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202011420092.0A CN112529947B (en) 2020-12-07 2020-12-07 Calibration method and device, electronic equipment and storage medium
PCT/CN2021/096072 WO2022121243A1 (en) 2020-12-07 2021-05-26 Calibration method and apparatus, and electronic device, storage medium, and program product
TW110125930A TW202223739A (en) 2020-12-07 2021-07-14 Calibration method, electronic equipment and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011420092.0A CN112529947B (en) 2020-12-07 2020-12-07 Calibration method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112529947A CN112529947A (en) 2021-03-19
CN112529947B true CN112529947B (en) 2023-08-01

Family

ID=74998028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011420092.0A Active CN112529947B (en) 2020-12-07 2020-12-07 Calibration method and device, electronic equipment and storage medium

Country Status (3)

Country Link
CN (1) CN112529947B (en)
TW (1) TW202223739A (en)
WO (1) WO2022121243A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112529947B (en) * 2020-12-07 2023-08-01 Beijing Sensetime Technology Development Co Ltd Calibration method and device, electronic equipment and storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1811771B1 (en) * 2006-01-20 2009-04-29 Fluke Corporation Camera with visible light and infrared image blending
US10015474B2 (en) * 2014-04-22 2018-07-03 Fluke Corporation Methods for end-user parallax adjustment
CN105931240B (en) * 2016-04-21 2018-10-19 西安交通大学 Three dimensional depth sensing device and method
CN111212593A (en) * 2017-10-11 2020-05-29 深圳传音通讯有限公司 Photographing temperature measurement method and photographing temperature measurement system based on intelligent terminal
CN110288534B (en) * 2019-06-28 2024-01-16 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and storage medium
CN110909634A (en) * 2019-11-07 2020-03-24 深圳市凯迈生物识别技术有限公司 Visible light and double infrared combined rapid in vivo detection method
CN110991266B (en) * 2019-11-13 2024-02-20 北京智芯原动科技有限公司 Binocular face living body detection method and device
CN111209822A (en) * 2019-12-30 2020-05-29 南京华图信息技术有限公司 Face detection method of thermal infrared image
CN111507200A (en) * 2020-03-26 2020-08-07 北京迈格威科技有限公司 Body temperature detection method, body temperature detection device and dual-optical camera
CN111739069B (en) * 2020-05-22 2024-04-26 北京百度网讯科技有限公司 Image registration method, device, electronic equipment and readable storage medium
CN112001886A (en) * 2020-07-17 2020-11-27 深圳市优必选科技股份有限公司 Temperature detection method, device, terminal and readable storage medium
CN112529947B (en) * 2020-12-07 2023-08-01 北京市商汤科技开发有限公司 Calibration method and device, electronic equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Silhouette-based features for visible-infrared registration; Guillaume-Alexandre Bilodeau et al.; IEEE Xplore; full text *
Design and Implementation of an Automatic Registration Algorithm for Visible and Infrared Images; 王俊影; 李扬; 袁浩期; 简单; 郝敏; Mechanical & Electrical Engineering Technology (No. 11); full text *

Also Published As

Publication number Publication date
TW202223739A (en) 2022-06-16
CN112529947A (en) 2021-03-19
WO2022121243A1 (en) 2022-06-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40039110

Country of ref document: HK

GR01 Patent grant