WO2022241964A1 - Temperature measurement method, computer device and computer-readable storage medium - Google Patents

Temperature measurement method, computer device and computer-readable storage medium

Info

Publication number
WO2022241964A1
WO2022241964A1 · PCT/CN2021/113612
Authority
WO
WIPO (PCT)
Prior art keywords
temperature measurement
temperature
infrared
distance
type
Prior art date
Application number
PCT/CN2021/113612
Other languages
English (en)
French (fr)
Inventor
郑勇
管建强
林维上
戴志涛
Original Assignee
深圳市沃特沃德信息有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市沃特沃德信息有限公司 filed Critical 深圳市沃特沃德信息有限公司
Publication of WO2022241964A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025Living bodies
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/80Calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image

Definitions

  • the present application relates to the technical field of temperature measurement, in particular to a temperature measurement method, computer equipment and a computer-readable storage medium.
  • when existing temperature measurement equipment measures the temperature of an object, it can usually only perform effective temperature measurement on a preset temperature measurement area.
  • for example, a forehead thermometer gun can accurately obtain body temperature when measuring the forehead area of the human body; but if it is used to measure other areas of the human body (such as the arm or the ear) or other objects (such as hot water), its built-in temperature compensation and distance compensation algorithms, which are designed only for the forehead area, mean that the converted temperature value is not the real temperature of those other areas or objects, so accuracy and versatility are relatively low.
  • the main purpose of this application is to provide a temperature measurement method, computer equipment and a computer-readable storage medium, aiming to overcome the drawback that existing temperature measurement equipment can only measure a single area or object and therefore has low versatility.
  • to this end, the present application provides a temperature measurement method, including: collecting a temperature measurement area image and temperature measurement data of a measured object; identifying the temperature measurement area image to obtain the corresponding temperature measurement type; selecting a temperature measurement algorithm corresponding to the temperature measurement type; and calculating a temperature value according to the temperature measurement data and the temperature measurement algorithm, the temperature value being used as the temperature measurement result of the measured object.
  • the present application also provides a computer device, including a memory and a processor, where the memory stores a computer program and the processor implements the temperature measurement method with the same steps when executing the computer program.
  • the present application also provides a computer-readable storage medium on which a computer program is stored, where the temperature measurement method with the same steps is implemented when the computer program is executed by a processor.
  • in the temperature measurement method, computer device and computer-readable storage medium provided in this application, the temperature measurement system first collects the temperature measurement area image and temperature measurement data of the object to be measured, and then identifies the temperature measurement area image to obtain the temperature measurement type corresponding to it.
  • the temperature measurement system then selects the temperature measurement algorithm corresponding to that temperature measurement type and calculates the temperature value of the currently measured object according to the temperature measurement data and the selected algorithm.
  • in other words, the temperature measurement type of the current measured object is identified from the captured temperature measurement area image, and the matching temperature measurement algorithm is selected accordingly.
  • because the algorithm matches the temperature measurement type of the measured object, substituting the collected temperature measurement data yields the object's temperature value; the method therefore has good versatility and accuracy and is convenient for users to operate, as sketched below.
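  • The overall flow (steps S1-S4) can be summarized in a short sketch. This is a minimal illustration only: the callables passed in stand for the camera and infrared-sensor drivers, the visual recognition model and the algorithm mapping table described in this application, and are not part of the original disclosure.

```python
from typing import Callable, Dict

# Minimal sketch of the four-step flow (S1-S4). Every callable is a
# hypothetical stand-in for a component described in the text.
def measure_temperature(
    capture_region_image: Callable[[], object],             # S1: temperature measurement area image
    read_ir_temperature: Callable[[], float],               # S1: raw surface-temperature reading
    recognize_type: Callable[[object], str],                # S2: image -> temperature measurement type
    algorithm_table: Dict[str, Callable[[float], float]],   # S3: type -> conversion algorithm
) -> float:
    image = capture_region_image()
    raw = read_ir_temperature()
    mtype = recognize_type(image)
    algorithm = algorithm_table[mtype]
    return algorithm(raw)                                   # S4: reported temperature value

# Usage with trivial stand-ins: a non-living object uses the reading directly.
result = measure_temperature(
    capture_region_image=lambda: "region-image",
    read_ir_temperature=lambda: 36.2,
    recognize_type=lambda img: "non_living",
    algorithm_table={"non_living": lambda t: t},
)
print(result)  # 36.2
```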
  • Fig. 1 is a schematic diagram of the steps of the temperature measurement method in an embodiment of the present application
  • Fig. 2 is an imaging layout diagram involved in a temperature measurement method in an embodiment of the present application
  • Fig. 3 is a block diagram of the overall structure of the temperature measuring device in an embodiment of the present application.
  • Fig. 4 is a schematic block diagram of a computer device according to an embodiment of the present application.
  • a temperature measurement method is provided in an embodiment of the present application, including:
  • S1: collect the temperature measurement area image and temperature measurement data of the measured object;
  • S2: identify the temperature measurement area image to obtain the corresponding temperature measurement type;
  • S3: select the temperature measurement algorithm corresponding to the temperature measurement type;
  • S4: calculate a temperature value according to the temperature measurement data and the temperature measurement algorithm, and use the temperature value as the temperature measurement result of the measured object.
  • in this embodiment, the temperature measurement method is described by taking its application to a mobile terminal as an example.
  • an infrared temperature measurement sensor and a camera are deployed on the mobile terminal, and the infrared temperature measurement sensor and the camera face the same direction.
  • when measuring temperature, the user aims the mobile terminal at the object to be measured; the temperature measurement system collects the temperature measurement data of the object through the infrared temperature measurement sensor on the mobile terminal, and turns on the camera to collect an image of the temperature measurement area at which the infrared temperature measurement sensor is aimed.
  • within the field of view of the camera, the center of the field of view and the center O' of the camera's imaging surface both lie on the imaging optical axis (the point where the optical axis passes through the imaging surface is defined as the center O' of the imaging surface).
  • the temperature measurement system defines the measurement point at which the infrared temperature measurement sensor is aimed on the measured object when collecting data as the first infrared measurement point P, and the point corresponding to P on the imaging surface of the camera as the second infrared measurement point P'.
  • using the similar-triangle principle of lens imaging, combined with the relative positional relationship between the first infrared measurement point P and the lens center O of the camera and with the image distance of the camera, the temperature measurement system solves for the pixel coordinates of the second infrared measurement point P' on the imaging surface.
  • after determining the pixel coordinates of P', the temperature measurement system takes P' as the center point and, through the camera algorithm, intercepts the area within N*N pixels around P' as the camera's automatic recognition area, capturing the temperature measurement area image.
  • the temperature measurement system processes the temperature measurement area image through visual recognition technology, and recognizes the temperature measurement type corresponding to the temperature measurement area image.
  • the temperature measurement type includes the temperature measurement area type and/or the temperature measurement object type: the temperature measurement area type includes the forehead, arm, auricle (ear) and so on, and the temperature measurement object type includes living things such as people, cats and dogs as well as inanimate objects such as ice cream and hot coffee.
  • the temperature measurement system stores a mapping table between temperature measurement types and temperature measurement algorithms, and selects the corresponding temperature measurement algorithm from this table according to the current temperature measurement type; the temperature measurement algorithm includes temperature measurement rules and a temperature conversion formula.
  • the temperature measurement system processes the temperature measurement data according to the temperature measurement algorithm to obtain the temperature value corresponding to the measured object, which is the current temperature measurement result of the measured object.
  • it should be noted that in this embodiment the optical axis refers to the main optical axis of the lens, that is, the straight line passing through the optical center perpendicular to the lens, which can also be described as the straight line on which the line connecting the two focal points of the lens lies.
  • in this embodiment, the temperature measurement system recognizes the temperature measurement type of the current measured object from the captured temperature measurement area image and then selects the corresponding temperature measurement algorithm; because the algorithm corresponds to the temperature measurement type of the measured object, substituting the collected temperature measurement data yields the object's temperature value.
  • the application of the temperature measurement method is therefore no longer limited to a specific temperature measurement object or area (for example, only the forehead area, or only pets), and it offers good versatility, intelligence, flexibility and accuracy while remaining convenient for users to operate.
  • the temperature measurement method is applied to a mobile terminal, and an infrared temperature measurement sensor and a camera are deployed on the mobile terminal, and the step of collecting the temperature measurement area image and temperature measurement data of the object to be measured includes:
  • S101 Use an infrared temperature measurement sensor to collect the temperature measurement data of the measured object, and use the camera to collect images;
  • S102: determine the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera, where the first infrared measurement point P is the measurement point at which the infrared temperature measurement sensor is aimed on the measured object when collecting data;
  • S103: intercept the image of a preset area on the imaging surface to obtain the temperature measurement area image, where the preset area is the area on the imaging surface within a range of N*N pixels centered on the pixel coordinates, and N is not 0.
  • in this embodiment, the infrared temperature measurement sensor and the camera are arranged on the same face of the mobile terminal, facing the same direction, and the distance between them lies within a preset range; the smaller this distance, the better.
  • the temperature measurement system collects the temperature measurement data of the measured object through the infrared temperature measurement sensor, and this temperature measurement data is the surface temperature of the measured object.
  • at the same time, the temperature measurement system turns on the camera and uses it to collect images; after the camera is turned on, the lens center O of the camera and the center O' of the imaging surface both lie on the imaging optical axis.
  • the temperature measurement system constructs a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin.
  • the imaging surface is located on one side of the camera lens (the image side), while the measured object is located on the other side of the lens (the object side).
  • the temperature measurement system obtains the relative position information between the lens center O of the camera and the first infrared measurement point P (this relative position information represents the coordinates of P in the three-dimensional Cartesian coordinate system), as well as the image distance of the camera; the image distance of the camera is the distance between the imaging surface and the lens plane of the camera, that is, it characterizes the first vertical distance from the second infrared measurement point P' to the plane formed by the Y-axis and Z-axis of the three-dimensional Cartesian coordinate system, the imaging surface being parallel to the lens plane of the camera.
  • according to the similar-triangle principle of lens imaging, combined with the coordinates of the first infrared measurement point P and the image distance of the camera, the second vertical distance from the second infrared measurement point P' to the plane formed by the X-axis and Z-axis of the three-dimensional Cartesian coordinate system, and the third vertical distance to the plane formed by the X-axis and Y-axis, are calculated.
  • the temperature measurement system obtains the pixel size of the photosensitive chip of the camera, calculates from the pixel size and the second vertical distance the first number of pixels in the vertical direction between the center O' of the imaging surface and the second infrared measurement point P', and calculates from the pixel size and the third vertical distance the second number of pixels in the horizontal direction between them, thereby completing the coordinate system conversion; combining the first and second pixel counts gives the pixel coordinates of the second infrared measurement point P' on the imaging surface.
  • the temperature measurement system takes the pixel coordinates of the second infrared measurement point P' as the reference and, with P' as the center point, intercepts through the camera algorithm the area within N*N pixels around P' on the imaging surface as the camera's automatic recognition area, capturing the temperature measurement area image.
  • through the above processing and calculation, the temperature measurement system locates the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera, eliminating the error caused by the offset between the mounting positions of the infrared temperature measurement sensor and the camera and improving the accuracy of the temperature measurement area.
  • intercepting the temperature measurement area image centered on the pixel coordinates of P' allows the subsequent steps to identify, in a targeted way, the object in the infrared temperature measurement area and its temperature measurement type without having to recognize every object in the full frame; this narrows the area that needs to be identified, strengthens targeting accuracy and improves recognition efficiency, and working at the pixel level further improves the accuracy of the captured temperature measurement area image and hence of the temperature value obtained later (a cropping sketch is given below).
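  • A minimal sketch of the interception step (S103), assuming the imaging surface is available as a NumPy array and that the (n2, n1) offset of P' relative to the imaging-surface center O' maps to array columns and rows as commented; these conventions are assumptions made for illustration, not details given in the disclosure.

```python
import numpy as np

# Crop an N*N pixel region centred on the pixel coordinate of the second
# infrared measurement point P'. The image is assumed to be an array of
# shape (height, width, channels).
def crop_measurement_region(image: np.ndarray, n1: int, n2: int, N: int) -> np.ndarray:
    height, width = image.shape[:2]
    row_c, col_c = height // 2, width // 2   # centre O' of the imaging surface
    row_p = row_c - n1                       # assumed: +Y' points towards smaller row indices
    col_p = col_c + n2                       # assumed: +Z' points towards larger column indices
    half = N // 2
    # clip so the crop stays inside the frame
    top = max(row_p - half, 0)
    left = max(col_p - half, 0)
    bottom = min(top + N, height)
    right = min(left + N, width)
    return image[top:bottom, left:right]

frame = np.zeros((480, 640, 3), dtype=np.uint8)
region = crop_measurement_region(frame, n1=12, n2=-5, N=64)
print(region.shape)  # (64, 64, 3)
```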
  • the step of determining the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera includes:
  • S1021: construct a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin, and obtain the relative position information between the lens center O and the first infrared measurement point P, as well as the image distance of the camera; the X-axis of this coordinate system passes through the lens center O and points in the same direction as the optical axis of the camera lens, the plane formed by the Y-axis and Z-axis is perpendicular to the X-axis, and the image distance characterizes the first vertical distance from the second infrared measurement point P' to the plane formed by the Y-axis and Z-axis;
  • S1022: according to the relative position information and the image distance, calculate the second vertical distance from the second infrared measurement point P' to the plane formed by the X-axis and Z-axis of the three-dimensional Cartesian coordinate system, and the third vertical distance to the plane formed by the X-axis and Y-axis;
  • S1023: obtain the pixel size of the photosensitive chip of the camera, calculate from the pixel size and the second vertical distance the first number of pixels between the center O' of the imaging surface and the second infrared measurement point P' in the Y-axis direction, and calculate from the pixel size and the third vertical distance the second number of pixels between them in the Z-axis direction;
  • S1024: obtain the pixel coordinates of the second infrared measurement point P' on the imaging surface according to the first number of pixels and the second number of pixels.
  • as shown in Fig. 2, the temperature measurement system constructs a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin, where the X-axis passes through the lens center O and points in the same direction as the optical axis of the camera lens (the straight line on which the line connecting the two focal points of the lens lies), and the plane formed by the Y-axis and Z-axis is perpendicular to the X-axis, that is, it is the plane in which the lens lies.
  • the temperature measurement system obtains the relative position information between the lens center O of the camera and the first infrared measurement point P, as well as the image distance of the camera.
  • the relative position information represents the coordinates of the first infrared measurement point P in the three-dimensional Cartesian coordinate system, assumed to be P(x1, y1, z1).
  • the value of x1 is the distance between the plane in which the infrared temperature measurement sensor lies and the surface of the temperature measurement area of the measured object; this distance can be obtained by a distance measuring sensor.
  • preferably, the distance measuring sensor is arranged on the mobile terminal in the same plane as the infrared temperature measurement sensor, so that the distance from the distance measuring sensor to the surface of the temperature measurement area equals the distance from the infrared temperature measurement sensor to that surface; alternatively, the distance measuring sensor may be integrated into the infrared temperature measurement sensor, or the infrared temperature measurement sensor may combine the functions of infrared temperature measurement and infrared distance measurement.
  • the values of y1 and z1 are the offsets between the center of the infrared temperature measurement sensor and the lens center O of the camera; these offsets are known design parameters once the infrared temperature measurement sensor and the camera are deployed on the mobile terminal.
  • assuming the coordinates of the second infrared measurement point P' in the three-dimensional Cartesian coordinate system are (x2, y2, z2), the image distance of the camera is the vertical distance between the imaging surface and the lens plane, so the image distance characterizes the first vertical distance from P' to the plane formed by the Y-axis and Z-axis, that is, the value of the image distance is the value of x2.
  • according to the similar-triangle principle of lens imaging, in magnitude y2 = y1 · x2 / x1 and z2 = z1 · x2 / x1; substituting the values of x1, y1, z1 and x2 gives the values of y2 and z2, and hence the coordinates (x2, y2, z2) of the second infrared measurement point P' in the three-dimensional Cartesian coordinate system.
  • the camera is equipped with a photosensitive chip, and the pixel size of the photosensitive chip can be entered into the internal database of the temperature measurement system by the developer in advance, or can be calculated from the image sensor size and pixel resolution of the camera.
  • the photosensitive chip is preferably a CMOS photosensitive chip.
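  • As a small illustration of the relation just stated, the pixel size can be derived from the sensor dimensions and pixel resolution; the numeric values below are illustrative assumptions, not figures from the disclosure.

```python
# Pixel size from image sensor dimensions and resolution (illustrative values).
sensor_width_mm, sensor_height_mm = 6.4, 4.8   # assumed CMOS sensor size
cols, rows = 3200, 2400                        # assumed pixel resolution
pixel_size_mm = sensor_width_mm / cols         # equals sensor_height_mm / rows for square pixels
print(pixel_size_mm)  # 0.002 (i.e. 2 um)
```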
  • the temperature measurement system substitutes the pixel size and the second vertical distance into the first preset formula to calculate the first number of pixels between the second infrared measurement point P' and the center O' of the imaging surface in the Y-axis direction of the three-dimensional Cartesian coordinate system (that is, the Y'-axis direction of the two-dimensional Cartesian coordinate system on the imaging surface); the first preset formula is n1 = y2 / S, where n1 is the first number of pixels, S is the pixel size, and y2 is taken as the magnitude of the second vertical distance. It likewise substitutes the pixel size and the third vertical distance into the second preset formula to calculate the second number of pixels between P' and O' in the Z-axis direction (the Z'-axis direction of the two-dimensional Cartesian coordinate system); the second preset formula is n2 = z2 / S, where n2 is the second number of pixels.
  • because the center of the field of view of the camera and the center O' of the imaging surface both lie on the imaging optical axis, the center O' of the imaging surface lies on the X-axis of the three-dimensional Cartesian coordinate system; the calculated first and second pixel counts are therefore equivalent to converting the coordinates (x2, y2, z2) of P', defined in the three-dimensional Cartesian coordinate system with the lens center O as origin, into the two-dimensional Cartesian coordinate system established on the imaging surface with its center O' as origin (whose horizontal and vertical axes are Z' and Y' respectively).
  • in other words, the pixel coordinates of the second infrared measurement point P' on the imaging surface are composed of the first and second pixel counts; the pixel coordinates are (n2, n1), with the pixel count as the unit.
  • in practical applications, the infrared temperature measurement sensor is preferably arranged on the Z-axis or the Y-axis of the three-dimensional Cartesian coordinate system whose origin is the lens center O of the camera, which reduces the amount of calculation needed for the pixel coordinates of the second infrared measurement point P'.
  • when the infrared temperature measurement sensor is on the Z-axis, n1 is directly taken as 0 and the pixel coordinates are (n2, 0); when it is on the Y-axis, n2 is directly taken as 0 and the pixel coordinates are (0, n1). Placing the infrared temperature measurement sensor as in these two cases improves the data processing speed of the temperature measurement and the aesthetics of the design. A code sketch of the whole localization step follows.
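  • A sketch of steps S1021-S1024 using the similar-triangle relation described above; the use of magnitudes and the rounding to whole pixels are assumptions made for illustration, and the example values are not from the disclosure.

```python
from typing import Tuple

# Locate the second infrared measurement point P' on the imaging surface.
# (x1, y1, z1) is the first infrared measurement point P in the camera-centred
# coordinate system, x2 is the image distance, pixel_size is the pixel pitch
# of the photosensitive chip (same length unit throughout).
def locate_p_prime(x1: float, y1: float, z1: float,
                   x2: float, pixel_size: float) -> Tuple[int, int]:
    # similar-triangle relation of lens imaging (magnitudes)
    y2 = y1 * x2 / x1            # second vertical distance (to the X-Z plane)
    z2 = z1 * x2 / x1            # third vertical distance (to the X-Y plane)
    n1 = round(abs(y2) / pixel_size)   # first pixel count, along Y'
    n2 = round(abs(z2) / pixel_size)   # second pixel count, along Z'
    return n2, n1                # pixel coordinates of P' are (n2, n1)

# Example: sensor 10 mm from the lens centre along Y (y1 = 10, z1 = 0),
# object 300 mm away (x1 = 300), image distance 4 mm, 2 um pixels.
print(locate_p_prime(x1=300.0, y1=10.0, z1=0.0, x2=4.0, pixel_size=0.002))
# -> (0, 67)
```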
  • in summary, from the relative position information of the first infrared measurement point P with respect to the camera center (which may be a three-dimensional coordinate position) and the image distance of the camera, the pixel coordinates of the second infrared measurement point P' on the imaging surface can be obtained according to the principle of similar triangles.
  • the above "relative position information of the first infrared measurement point P" can be obtained directly from the mounting dimensions of the infrared temperature measurement sensor relative to the camera center and the measurable distance between the infrared temperature measurement sensor and the first infrared measurement point P.
  • the embodiment above (including the specific direction conventions of the coordinate system) is only one concrete scheme described for ease of understanding.
  • the temperature measurement type includes a temperature measurement area type and/or a temperature measurement object type
  • the step of identifying the temperature measurement area image and obtaining the temperature measurement type corresponding to the temperature measurement area image includes:
  • S201: perform visual recognition processing on the temperature measurement area image, and judge whether the temperature measurement area type and/or temperature measurement object type corresponding to the image can be recognized;
  • S202: if the temperature measurement area type and/or temperature measurement object type corresponding to the image cannot be recognized, expand the preset area by a preset multiple, capture a new temperature measurement area image, and perform visual recognition processing on the new image.
  • the temperature measurement type includes the temperature measurement area type and/or the temperature measurement object type
  • after acquiring the temperature measurement area image, the temperature measurement system uses visual recognition technology to identify it; the principle of this visual recognition is the same as that of existing object recognition technology and is not described in detail here.
  • if, after the visual recognition processing, the temperature measurement object type and/or temperature measurement area type of the measured object can be obtained directly from the temperature measurement area image, the corresponding temperature measurement type is generated and associated with the image for the subsequent algorithm selection.
  • if the visual recognition cannot obtain the temperature measurement object type and/or temperature measurement area type from the image, the preset area on the imaging surface is expanded by a preset multiple (for example, if the pixel area of the initial preset area is 8*8 and the preset multiple is 2, the pixel area of the new preset area is 16*16) so that an image with a larger pixel area is collected; a new temperature measurement area image is obtained after interception and visual recognition is performed on it again. If the temperature measurement area type and/or temperature measurement object type still cannot be obtained, a temperature measurement area image with an even larger pixel area is captured according to the preset multiple, and so on.
  • this continues until the visual recognition can identify the temperature measurement area type and/or temperature measurement object type corresponding to the image, which avoids the situation where the selected preset area is too small and the captured image is too partial to be recognized accurately; it ensures the smooth realization of the temperature measurement method, provides a high degree of intelligence, and does not require the user to adjust the size of the preset area manually (a loop sketch is given below).
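  • A sketch of this expand-and-retry loop (steps S201-S202); the `recognize` and `capture` callables are hypothetical stand-ins for the visual recognition model and the cropping step, and the attempt limit is an added safety assumption not present in the original description.

```python
from typing import Callable, Optional

def identify_measurement_type(
    capture: Callable[[int], object],              # N -> temperature measurement area image
    recognize: Callable[[object], Optional[str]],  # image -> type, or None if not recognized
    initial_n: int = 8,
    multiple: int = 2,
    max_attempts: int = 5,
) -> Optional[str]:
    n = initial_n
    for _ in range(max_attempts):
        region_image = capture(n)
        mtype = recognize(region_image)
        if mtype is not None:
            return mtype
        n *= multiple          # e.g. 8*8 -> 16*16 when the multiple is 2
    return None

# Usage with stand-ins: recognition succeeds once the region reaches 32*32.
mtype = identify_measurement_type(
    capture=lambda n: n,
    recognize=lambda n: "person/forehead" if n >= 32 else None,
)
print(mtype)  # person/forehead
```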
  • the temperature measurement type includes a temperature measurement area type and a temperature measurement object type
  • the step of screening the temperature measurement algorithm corresponding to the temperature measurement type includes:
  • the temperature measurement algorithm includes temperature measurement rules and/or a temperature conversion formula. The temperature measurement system obtains the temperature measurement object type of the measured object through visual recognition and then judges from this type whether the measured object is a living thing. Because the temperature measurement data collected by the infrared temperature measurement sensor is the surface temperature of the measured object, if the measured object is not a living thing (for example ice cream or hot coffee), the temperature measurement data can be used directly as the temperature value of the measured object; the temperature measurement system accordingly sets the first calculation method for the current measurement as: the temperature measurement data is the temperature value of the measured object. If the measured object is a living thing, the matching temperature conversion formula is selected according to the temperature measurement area type and temperature measurement object type of the measured object.
  • specifically, because the measured object is a living thing while the temperature measurement data is only its surface temperature, the corresponding temperature compensation and distance compensation algorithms (that is, the temperature conversion formula) must be applied to the temperature measurement data to obtain the real temperature of the measured object.
  • for example, if the temperature measurement object type is a person and the temperature measurement area type is the forehead, substituting the temperature measurement data into the corresponding temperature conversion formula yields the body temperature, matching the real temperature of the measured object. Therefore, when the measured object is a living thing, the temperature measurement system calculates its temperature value from the temperature measurement data and the temperature conversion formula, forming the temperature measurement algorithm for the current measurement (a selection sketch is given below).
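  • A sketch of the algorithm selection step (S301-S303); the set of living object types and the linear conversion formulas are illustrative placeholders only, since the application does not disclose concrete conversion coefficients.

```python
from typing import Callable, Dict, Tuple

LIVING_TYPES = {"person", "cat", "dog"}

# (object type, area type) -> surface-to-body temperature conversion (assumed form)
CONVERSION_TABLE: Dict[Tuple[str, str], Callable[[float], float]] = {
    ("person", "forehead"): lambda t: 1.05 * t + 0.3,   # placeholder coefficients
    ("person", "arm"):      lambda t: 1.10 * t + 0.5,   # placeholder coefficients
}

def select_algorithm(object_type: str, area_type: str) -> Callable[[float], float]:
    if object_type not in LIVING_TYPES:
        # S303: non-living object -> report the surface reading directly
        return lambda t: t
    # S302: living object -> conversion formula matching object and area type
    return CONVERSION_TABLE[(object_type, area_type)]

print(select_algorithm("ice cream", "")(2.0))                   # 2.0
print(round(select_algorithm("person", "forehead")(35.0), 2))   # 37.05
```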
  • further, a distance measuring sensor is also deployed on the mobile terminal, arranged adjacent to the infrared temperature measurement sensor, and before the step of collecting the temperature measurement area image and temperature measurement data of the measured object the method also includes checking the temperature measurement distance.
  • when the user aims the mobile terminal at the measured object, the temperature measurement system obtains the temperature measurement distance between the infrared temperature measurement sensor and the measured object through the distance measuring sensor on the mobile terminal.
  • a preset distance range is set in the temperature measurement system; this preset distance range is a distance suitable for temperature measurement, preferably 28-32 cm, within which the infrared temperature measurement sensor measures temperature with high accuracy.
  • the temperature measurement system compares the current temperature measurement distance with the preset distance range to judge whether it falls within the range; if it does, the temperature measurement system adopts the temperature measurement data currently collected by the infrared temperature measurement sensor.
  • if the temperature measurement distance is not within the preset distance range, the temperature measurement system outputs prompt information reminding the user to change the temperature measurement distance so that it falls within the preset distance range.
  • for example, if the preset distance range is 28-32 cm and the current temperature measurement distance is 40 cm, the temperature measurement system prompts the user to move the mobile terminal closer to the measured object and states the distance, for example that it needs to move 8-12 cm closer (a check sketch is given below).
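  • A sketch of the distance check (steps S5-S7) using the 28-32 cm range from the example above; the prompt wording is illustrative.

```python
PRESET_RANGE_CM = (28.0, 32.0)

def check_measurement_distance(distance_cm: float) -> str:
    low, high = PRESET_RANGE_CM
    if low <= distance_cm <= high:
        return "ok"
    if distance_cm > high:
        return (f"Move the terminal {distance_cm - high:.0f}-{distance_cm - low:.0f} cm "
                f"closer to the measured object.")
    return (f"Move the terminal {low - distance_cm:.0f}-{high - distance_cm:.0f} cm "
            f"further from the measured object.")

print(check_measurement_distance(30))  # ok
print(check_measurement_distance(40))  # Move the terminal 8-12 cm closer ...
```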
  • further, after the step of calculating the temperature value according to the temperature measurement data and the temperature measurement algorithm and using the temperature value as the temperature measurement result of the measured object, the method includes annotating and displaying the result.
  • after calculating the temperature value of the measured object, the temperature measurement system marks the temperature value and the temperature measurement type on the captured temperature measurement area image and outputs the annotated image to the display interface, so that the user can intuitively read the real temperature of the measured object from the image together with the marked temperature measurement type and temperature value; this is simple and clear and helps improve the user experience (an annotation sketch is given below).
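  • A sketch of the annotation and display step (S8-S9), assuming OpenCV is available for drawing; the label position, font and color are illustrative choices, not details from the disclosure.

```python
import cv2
import numpy as np

def annotate_region_image(image: np.ndarray, temperature: float, mtype: str) -> np.ndarray:
    annotated = image.copy()
    label = f"{mtype}: {temperature:.1f} C"
    # draw the temperature measurement type and value onto the region image
    cv2.putText(annotated, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return annotated

frame = np.zeros((240, 320, 3), dtype=np.uint8)
out = annotate_region_image(frame, 36.8, "person/forehead")
# cv2.imshow("temperature measurement result", out)  # display step, if a GUI is present
```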
  • an embodiment of the present application also provides a temperature measuring device, including:
  • an acquisition module 1, used to collect the temperature measurement area image and temperature measurement data of the measured object;
  • An identification module 2 configured to identify the temperature measurement area image, and obtain the temperature measurement type corresponding to the temperature measurement area image;
  • a screening module 3 configured to screen a temperature measurement algorithm corresponding to the temperature measurement type
  • the calculation module 4 is used to calculate a temperature value according to the temperature measurement data and the temperature measurement algorithm, and use the temperature value as the temperature measurement result of the measured object.
  • the temperature measurement method is applied to a mobile terminal, and an infrared temperature measurement sensor and a camera are deployed on the mobile terminal, and the acquisition module 1 includes:
  • a collection unit, configured to use the infrared temperature measurement sensor to collect the temperature measurement data of the object to be measured and to use the camera to collect images;
  • an analysis unit, configured to determine the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera, where the first infrared measurement point P is the measurement point at which the infrared temperature measurement sensor is aimed on the measured object when collecting data;
  • an intercepting unit, configured to intercept the image of a preset area on the imaging surface to obtain the temperature measurement area image, where the preset area is the area on the imaging surface within a range of N*N pixels centered on the pixel coordinates, and N is not 0.
  • further, the analysis unit includes:
  • an acquisition subunit, configured to construct a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin and to obtain, respectively, the relative position information between the lens center O and the first infrared measurement point P and the image distance of the camera, where the X-axis of the coordinate system passes through the lens center O and points in the same direction as the optical axis of the camera lens, the plane formed by the Y-axis and Z-axis is perpendicular to the X-axis, and the image distance characterizes the first vertical distance from the second infrared measurement point P' to the plane formed by the Y-axis and Z-axis of the three-dimensional Cartesian coordinate system;
  • a first calculation subunit, configured to calculate, according to the relative position information and the image distance, the second vertical distance from the second infrared measurement point P' to the plane formed by the X-axis and Z-axis of the three-dimensional Cartesian coordinate system and the third vertical distance to the plane formed by the X-axis and Y-axis;
  • a second calculation subunit, configured to obtain the pixel size of the photosensitive chip of the camera, calculate from the pixel size and the second vertical distance the first number of pixels between the center O' of the imaging surface and the second infrared measurement point P' in the Y-axis direction of the three-dimensional Cartesian coordinate system, and calculate from the pixel size and the third vertical distance the second number of pixels between them in the Z-axis direction;
  • an analysis subunit, configured to obtain the pixel coordinates of the second infrared measurement point P' on the imaging surface according to the first number of pixels and the second number of pixels.
  • the temperature measurement type includes a temperature measurement area type and/or a temperature measurement object type
  • the identification module 2 includes:
  • the first judging unit is configured to perform visual recognition processing on the temperature measurement area image, and judge whether to recognize the temperature measurement area type and/or the temperature measurement object type corresponding to the temperature measurement area image;
  • the expansion unit is configured to expand the preset area according to a preset multiple and obtain a new temperature measurement area image by shooting if the temperature measurement area type and/or the temperature measurement object type corresponding to the temperature measurement area image cannot be recognized, And perform visual recognition processing on the new temperature measurement area image until the type of temperature measurement area and/or the type of temperature measurement object is obtained.
  • the temperature measurement type includes a temperature measurement area type and a temperature measurement object type
  • the screening module 3 includes:
  • a second judging unit configured to judge whether the measured object is a living thing according to the type of the temperature measuring object
  • the first setting unit is used to screen the temperature conversion formula matching the temperature measurement area type and the temperature measurement object type if the measured object is a living thing, and use the temperature conversion formula as the temperature measurement algorithm;
  • a second setting unit, configured to call a first calculation method as the temperature measurement algorithm if the measured object is not a living thing, the first calculation method being to use the temperature measurement data directly as the temperature value of the measured object.
  • a ranging sensor is also deployed on the mobile terminal, and the ranging sensor is adjacent to the infrared temperature measuring sensor.
  • the temperature measuring device also includes:
  • An acquisition module 5 configured to acquire the temperature measurement distance between the infrared temperature measurement sensor and the measured object through the distance measurement sensor;
  • a judging module 6 configured to judge whether the temperature measurement distance is within a preset distance range
  • the output module 7 is configured to output prompt information to remind the user to change the temperature measurement distance if the temperature measurement distance is not within the preset distance range.
  • the temperature measuring device also includes:
  • An annotation module 8 configured to annotate the temperature value and the temperature measurement type on the temperature measurement area image
  • the display module 9 is configured to output the annotated image of the temperature measurement area to a display interface.
  • each module and unit in the temperature measuring device is used to correspondingly execute each step in the above temperature measuring method, and its specific implementation process is not described in detail here.
  • the temperature measuring device provided in this embodiment first collects the temperature measurement area image and temperature measurement data of the object to be measured, and then identifies the temperature measurement area image to obtain the corresponding temperature measurement type.
  • the temperature measurement device selects the temperature measurement algorithm corresponding to the temperature measurement type and calculates the temperature value of the currently measured object according to the temperature measurement data and the temperature measurement algorithm.
  • in this way, the temperature measurement type corresponding to the current measured object is identified from the captured temperature measurement area image, and the corresponding temperature measurement algorithm is selected.
  • with the algorithm matching the temperature measurement type of the measured object, substituting the collected temperature measurement data yields the temperature value of the measured object, which gives excellent versatility and accuracy and is convenient for users to operate.
  • referring to Fig. 4, an embodiment of the present application also provides a computer device, which may be a server and whose internal structure may be as shown in Fig. 4.
  • the computer device includes a processor, a memory, a network interface and a database connected by a system bus, where the processor of the computer device is used to provide computing and control capabilities.
  • the memory of the computer device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system, computer programs and databases.
  • the internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium.
  • the database of the computer device is used to store data such as preset distance ranges.
  • the network interface of the computer device is used to communicate with an external terminal via a network connection.
  • the above-mentioned processor executes the steps of the above-mentioned temperature measuring method:
  • S1: collect the temperature measurement area image and temperature measurement data of the measured object;
  • S2: identify the temperature measurement area image to obtain the corresponding temperature measurement type;
  • S3: select the temperature measurement algorithm corresponding to the temperature measurement type;
  • S4: calculate a temperature value according to the temperature measurement data and the temperature measurement algorithm, and use the temperature value as the temperature measurement result of the measured object.
  • An embodiment of the present application also provides a computer-readable storage medium.
  • the storage medium may be a non-volatile storage medium or a volatile storage medium, on which a computer program is stored; when the computer program is executed by a processor, the temperature measurement method of any of the above embodiments is implemented, the method specifically being:
  • S1: collect the temperature measurement area image and temperature measurement data of the measured object;
  • S2: identify the temperature measurement area image to obtain the corresponding temperature measurement type;
  • S3: select the temperature measurement algorithm corresponding to the temperature measurement type;
  • S4: calculate a temperature value according to the temperature measurement data and the temperature measurement algorithm, and use the temperature value as the temperature measurement result of the measured object.
  • Nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Radiation Pyrometers (AREA)

Abstract

A temperature measurement method, a computer device and a computer-readable storage medium. By combining the captured temperature measurement area image, the temperature measurement type corresponding to the currently measured object is identified, and the corresponding temperature measurement algorithm is then selected. With the temperature measurement algorithm matching the temperature measurement type of the measured object, the collected temperature measurement data is substituted into it to calculate the temperature value of the measured object, giving excellent versatility and accuracy and making the method convenient for users to operate.

Description

Temperature measurement method, computer device and computer-readable storage medium
Technical Field
The present application relates to the technical field of temperature measurement, and in particular to a temperature measurement method, a computer device and a computer-readable storage medium.
Background Art
When existing temperature measurement equipment measures the temperature of an object, it can usually only perform effective temperature measurement on a preset temperature measurement area. For example, a forehead thermometer gun can accurately obtain body temperature when measuring the forehead area of the human body; but if it is used to measure other areas of the human body (such as the arm or the ear) or other objects (such as hot water), the temperature compensation algorithm and distance compensation algorithm built into the forehead thermometer gun are designed only for the forehead area, so the temperature value finally converted by the forehead thermometer gun is not the real temperature of those other areas or objects, and the accuracy and versatility are relatively low.
Technical Problem
The main purpose of this application is to provide a temperature measurement method, a computer device and a computer-readable storage medium, aiming to overcome the drawback that existing temperature measurement equipment can only measure a single area or object and therefore has low versatility.
Technical Solution
To achieve the above purpose, in a first aspect, the present application provides a temperature measurement method, including:
collecting a temperature measurement area image and temperature measurement data of a measured object;
identifying the temperature measurement area image to obtain the temperature measurement type corresponding to the temperature measurement area image;
selecting a temperature measurement algorithm corresponding to the temperature measurement type;
calculating a temperature value according to the temperature measurement data and the temperature measurement algorithm, and using the temperature value as the temperature measurement result of the measured object.
In a second aspect, the present application also provides a computer device, including a memory and a processor, the memory storing a computer program, where the processor implements a temperature measurement method when executing the computer program;
the temperature measurement method includes:
collecting a temperature measurement area image and temperature measurement data of a measured object;
identifying the temperature measurement area image to obtain the temperature measurement type corresponding to the temperature measurement area image;
selecting a temperature measurement algorithm corresponding to the temperature measurement type;
calculating a temperature value according to the temperature measurement data and the temperature measurement algorithm, and using the temperature value as the temperature measurement result of the measured object.
In a third aspect, the present application also provides a computer-readable storage medium on which a computer program is stored, where a temperature measurement method is implemented when the computer program is executed by a processor, the temperature measurement method including the following steps:
collecting a temperature measurement area image and temperature measurement data of a measured object;
identifying the temperature measurement area image to obtain the temperature measurement type corresponding to the temperature measurement area image;
selecting a temperature measurement algorithm corresponding to the temperature measurement type;
calculating a temperature value according to the temperature measurement data and the temperature measurement algorithm, and using the temperature value as the temperature measurement result of the measured object.
Beneficial Effects
In the temperature measurement method, computer device and computer-readable storage medium provided in this application, the temperature measurement system first collects the temperature measurement area image and temperature measurement data of the measured object, and then identifies the temperature measurement area image to obtain the temperature measurement type corresponding to it. The temperature measurement system selects the temperature measurement algorithm corresponding to the temperature measurement type and calculates the temperature value of the currently measured object according to the temperature measurement data and the temperature measurement algorithm. By combining the captured temperature measurement area image, this application identifies the temperature measurement type corresponding to the currently measured object and then selects the corresponding temperature measurement algorithm. With the algorithm matching the temperature measurement type of the measured object, the collected temperature measurement data is substituted into it to calculate the temperature value of the measured object, which gives excellent versatility and accuracy and is convenient for users to operate.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of the steps of the temperature measurement method in an embodiment of the present application;
Fig. 2 is an imaging layout diagram involved in the temperature measurement method in an embodiment of the present application;
Fig. 3 is a block diagram of the overall structure of the temperature measurement device in an embodiment of the present application;
Fig. 4 is a schematic structural block diagram of a computer device according to an embodiment of the present application.
The realization of the purposes, functional features and advantages of the present application will be further described with reference to the accompanying drawings in combination with the embodiments.
Best Mode for Carrying Out the Invention
In order to make the purposes, technical solutions and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application and are not intended to limit it.
Referring to Fig. 1 and Fig. 2, an embodiment of the present application provides a temperature measurement method, including:
S1: collecting a temperature measurement area image and temperature measurement data of a measured object;
S2: identifying the temperature measurement area image to obtain the temperature measurement type corresponding to the temperature measurement area image;
S3: selecting a temperature measurement algorithm corresponding to the temperature measurement type;
S4: calculating a temperature value according to the temperature measurement data and the temperature measurement algorithm, and using the temperature value as the temperature measurement result of the measured object.
In this embodiment, the temperature measurement method is described by taking its application to a mobile terminal as an example. An infrared temperature measurement sensor and a camera are deployed on the mobile terminal, facing the same direction. When measuring temperature, the user aims the mobile terminal at the measured object; the temperature measurement system collects the temperature measurement data of the measured object through the infrared temperature measurement sensor on the mobile terminal, and turns on the camera to collect an image of the temperature measurement area at which the infrared temperature measurement sensor is aimed. Within the field of view of the camera, the center of the field of view and the center O' of the camera's imaging surface both lie on the imaging optical axis (the point where the optical axis passes through the imaging surface is defined as the center O' of the imaging surface). The temperature measurement system defines the measurement point at which the infrared temperature measurement sensor is aimed on the measured object when collecting data as the first infrared measurement point P, and the point corresponding to P on the imaging surface of the camera as the second infrared measurement point P'. Using the similar-triangle principle of lens imaging, combined with the relative positional relationship between the first infrared measurement point P and the lens center O of the camera and with the image distance of the camera, the temperature measurement system solves for the pixel coordinates of the second infrared measurement point P' on the imaging surface. After determining the pixel coordinates of P', the temperature measurement system takes P' as the center point and, through the camera algorithm, intercepts the area within N*N pixels around P' as the camera's automatic recognition area, capturing the temperature measurement area image. The temperature measurement system then processes the temperature measurement area image with visual recognition technology and recognizes the temperature measurement type corresponding to the image. The temperature measurement type includes a temperature measurement area type and/or a temperature measurement object type; the temperature measurement area type includes the forehead, arm, auricle (ear) and so on, and the temperature measurement object type includes living things such as people, cats and dogs as well as inanimate objects such as ice cream and hot coffee. A mapping table between temperature measurement types and temperature measurement algorithms is stored in the temperature measurement system, and the system selects the corresponding temperature measurement algorithm from this table according to the current temperature measurement type; the temperature measurement algorithm includes temperature measurement rules and a temperature conversion formula. The temperature measurement system processes the temperature measurement data according to the temperature measurement algorithm to obtain the temperature value corresponding to the measured object, which is the current temperature measurement result of the measured object.
It should be noted that in this embodiment the optical axis refers to the main optical axis of the lens (the straight line passing through the optical center perpendicular to the lens, also called the principal optical axis, that is, the straight line on which the line connecting the two focal points of the lens lies).
In this embodiment, the temperature measurement system recognizes the temperature measurement type of the currently measured object from the captured temperature measurement area image and then selects the corresponding temperature measurement algorithm. With the algorithm matching the temperature measurement type of the measured object, the collected temperature measurement data is substituted into it to calculate the temperature value of the measured object. The application of the temperature measurement method is thus no longer limited to a specific measured object or measurement area (for example, only the forehead area, or only pets), and it offers excellent versatility, intelligence, flexibility and accuracy, and is convenient for users to operate.
Further, the temperature measurement method is applied to a mobile terminal on which an infrared temperature measurement sensor and a camera are deployed, and the step of collecting the temperature measurement area image and temperature measurement data of the measured object includes:
S101: using the infrared temperature measurement sensor to collect the temperature measurement data of the measured object, and using the camera to collect images;
S102: determining the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera, where the first infrared measurement point P is the measurement point at which the infrared temperature measurement sensor is aimed on the measured object when collecting data;
S103: intercepting the image of a preset area on the imaging surface to obtain the temperature measurement area image, where the preset area is the area on the imaging surface within a range of N*N pixels centered on the pixel coordinates, and N is not 0.
In this embodiment, the infrared temperature measurement sensor and the camera are arranged on the same face of the mobile terminal, facing the same direction, and the distance between them lies within a preset range; the smaller this distance, the better. The temperature measurement system collects the temperature measurement data of the measured object through the infrared temperature measurement sensor; this temperature measurement data is the surface temperature of the measured object. At the same time, the temperature measurement system turns on the camera and uses it to collect images. After the camera is turned on, the lens center O of the camera and the center O' of the imaging surface both lie on the imaging optical axis. The temperature measurement system constructs a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin; the imaging surface is located on one side of the camera lens (the image side) and the measured object is located on the other side of the lens (the object side). The temperature measurement system obtains the relative position information between the lens center O of the camera and the first infrared measurement point P (the relative position information represents the coordinates of P in the three-dimensional Cartesian coordinate system), as well as the image distance of the camera; the image distance of the camera is the distance between the imaging surface and the lens plane of the camera, that is, it characterizes the first vertical distance from the second infrared measurement point P' to the plane formed by the Y-axis and Z-axis of the three-dimensional Cartesian coordinate system, the imaging surface being parallel to the lens plane of the camera. According to the similar-triangle principle of lens imaging, combined with the coordinates of the first infrared measurement point P and the image distance of the camera, the second vertical distance from the second infrared measurement point P' to the plane formed by the X-axis and Z-axis of the three-dimensional Cartesian coordinate system and the third vertical distance to the plane formed by the X-axis and Y-axis are calculated. The temperature measurement system obtains the pixel size of the photosensitive chip of the camera, calculates from the pixel size and the second vertical distance the first number of pixels in the vertical direction between the center O' of the imaging surface and the second infrared measurement point P', and calculates from the pixel size and the third vertical distance the second number of pixels in the horizontal direction between them, completing the coordinate system conversion; combining the first and second pixel counts gives the pixel coordinates of the second infrared measurement point P' on the imaging surface. Taking the pixel coordinates of P' as the reference and P' as the center point, the temperature measurement system intercepts, through the camera algorithm, the area within N*N pixels around P' on the imaging surface as the camera's automatic recognition area and captures the temperature measurement area image.
In this embodiment, through the above processing and calculation, the temperature measurement system locates the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera, eliminating the error caused by the offset between the mounting positions of the infrared temperature measurement sensor and the camera and improving the accuracy of the temperature measurement area. Moreover, intercepting the temperature measurement area image centered on the pixel coordinates of P' allows the subsequent steps to identify, in a targeted way, the object in the infrared temperature measurement area and its temperature measurement type without having to recognize every object in the whole image; this narrows the area that needs to be recognized, strengthens targeting accuracy and improves recognition efficiency. Collecting the image at the pixel level can further improve the accuracy of temperature measurement area image acquisition, and thus the accuracy of the temperature value of the measured object obtained in subsequent measurement.
Further, the step of determining the pixel coordinates of the second infrared measurement point P' corresponding to the first infrared measurement point P on the imaging surface of the camera includes:
S1021: constructing a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin, and obtaining, respectively, the relative position information between the lens center O of the camera and the first infrared measurement point P and the image distance of the camera, where the X-axis of the three-dimensional Cartesian coordinate system passes through the lens center O and points in the same direction as the optical axis of the camera lens, the plane formed by the Y-axis and Z-axis of the three-dimensional Cartesian coordinate system is perpendicular to the X-axis, and the image distance characterizes the first vertical distance from the second infrared measurement point P' to the plane formed by the Y-axis and Z-axis of the three-dimensional Cartesian coordinate system;
S1022: calculating, according to the relative position information and the image distance, the second vertical distance from the second infrared measurement point P' to the plane formed by the X-axis and Z-axis of the three-dimensional Cartesian coordinate system and the third vertical distance to the plane formed by the X-axis and Y-axis of the three-dimensional Cartesian coordinate system;
S1023: obtaining the pixel size of the photosensitive chip of the camera, calculating from the pixel size and the second vertical distance the first number of pixels between the center O' of the imaging surface and the second infrared measurement point P' in the Y-axis direction of the three-dimensional Cartesian coordinate system, and calculating from the pixel size and the third vertical distance the second number of pixels between the center O' of the imaging surface and the second infrared measurement point P' in the Z-axis direction of the three-dimensional Cartesian coordinate system;
S1024: obtaining the pixel coordinates of the second infrared measurement point P' on the imaging surface according to the first number of pixels and the second number of pixels.
In this embodiment, as shown in Fig. 2, the temperature measurement system constructs a three-dimensional Cartesian coordinate system with the lens center O of the camera as the origin (defined as O), where the X-axis of the coordinate system passes through the lens center O and points in the same direction as the optical axis of the camera lens (the straight line on which the line connecting the two focal points of the lens lies); the plane formed by the Y-axis and Z-axis of the coordinate system is perpendicular to the X-axis, that is, the plane formed by the Y-axis and Z-axis is the plane in which the lens lies. The temperature measurement system obtains the relative position information between the lens center O of the camera and the first infrared measurement point P, as well as the image distance of the camera. The relative position information between the first infrared measurement point P and the center of the camera represents the coordinates of P in the three-dimensional Cartesian coordinate system, assumed to be P(x1, y1, z1). The value of x1 is the distance between the plane in which the infrared temperature measurement sensor lies and the surface of the temperature measurement area of the measured object, which can be obtained by a distance measuring sensor. Preferably, the distance measuring sensor is arranged on the mobile terminal in the same plane as the infrared temperature measurement sensor, so that the distance from the distance measuring sensor to the surface of the temperature measurement area of the measured object equals the distance from the infrared temperature measurement sensor to that surface; alternatively, the distance measuring sensor may be arranged inside the infrared temperature measurement sensor, or the infrared temperature measurement sensor may combine the functions of infrared temperature measurement and infrared distance measurement. The values of y1 and z1 are the offsets between the center of the infrared temperature measurement sensor and the lens center O of the camera, which are known design parameters once the infrared temperature measurement sensor and the camera have been deployed on the mobile terminal. Assuming the coordinates of the second infrared measurement point P' in the three-dimensional Cartesian coordinate system are (x2, y2, z2), the image distance of the camera is the vertical distance between the imaging surface and the plane of the lens, so the image distance characterizes the first vertical distance from P' to the plane formed by the Y-axis and Z-axis of the coordinate system, that is, the value of the image distance is the value of x2. According to the similar-triangle principle of lens imaging, in magnitude:
y2 / y1 = z2 / z1 = x2 / x1
Substituting the values of x1, y1, z1 and x2 gives the values of y2 and z2, and hence the coordinates (x2, y2, z2) of the second infrared measurement point P' in the three-dimensional Cartesian coordinate system. The camera carries a photosensitive chip, and the pixel size of the photosensitive chip can be entered into the internal database of the temperature measurement system by the developer in advance, or calculated from the image sensor size and pixel resolution of the camera. The photosensitive chip is preferably a CMOS photosensitive chip. Assuming the CMOS image sensor size is M1*N1 and the numbers of pixel rows and columns of the image are M2*N2, the pixel size relation is: pixel size = sensor size / number of pixel rows (or columns) of the image, that is
S = M1 / M2 = N1 / N2
where S is the pixel size; the pixels of a CMOS photosensitive chip are square in the X and Y directions, so the side lengths are equal. The temperature measurement system substitutes the pixel size and the second vertical distance into the first preset formula to calculate the first number of pixels between the second infrared measurement point P' and the center O' of the imaging surface in the Y-axis direction of the three-dimensional Cartesian coordinate system (that is, the Y'-axis direction of the two-dimensional Cartesian coordinate system), the first preset formula being
n1 = y2 / S
where n1 is the first number of pixels; and substitutes the pixel size and the third vertical distance into the second preset formula to calculate the second number of pixels between P' and the center O' of the imaging surface in the Z-axis direction of the three-dimensional Cartesian coordinate system (that is, the Z'-axis direction of the two-dimensional Cartesian coordinate system), the second preset formula being
n2 = z2 / S
where n2 is the second number of pixels. Since the center of the field of view of the camera and the center O' of the imaging surface both lie on the imaging optical axis, the center O' of the imaging surface lies on the X-axis of the three-dimensional Cartesian coordinate system. The calculated first and second pixel counts are therefore equivalent to converting the coordinates (x2, y2, z2) of the second infrared measurement point P', defined in the three-dimensional Cartesian coordinate system with the lens center O as origin, into pixel coordinate values in the two-dimensional Cartesian coordinate system established with the center O' of the imaging surface as origin (whose horizontal and vertical axes are Z' and Y' respectively); that is, the pixel coordinates of P' on the imaging surface are composed of the first and second pixel counts, the pixel coordinates being (n2, n1), with the pixel count as the unit.
In practical applications, the infrared temperature measurement sensor is preferably arranged on the Z-axis or the Y-axis of the three-dimensional Cartesian coordinate system whose origin is the lens center O of the camera, which reduces the amount of calculation for the pixel coordinates of the second infrared measurement point P'. When the infrared temperature measurement sensor is on the Z-axis, n1 is directly taken as 0 and the pixel coordinates are (n2, 0); when it is on the Y-axis, n2 is directly taken as 0 and the pixel coordinates are (0, n1). Placing the infrared temperature measurement sensor as in these two cases can improve the data processing speed of the temperature measurement and the aesthetics of the design.
In summary, from the relative position information of the first infrared measurement point P with respect to the center of the camera (which may be a three-dimensional coordinate position) and the image distance of the camera, and according to the principle of similar triangles, the pixel coordinates of the second infrared measurement point P' on the imaging surface can be obtained. The above "relative position information of the first infrared measurement point P" can be obtained directly from the mounting dimensions of the infrared temperature measurement sensor relative to the camera center and the measurable distance between the infrared temperature measurement sensor and the first infrared measurement point P. The embodiment above (including the specific direction conventions of the coordinate system, etc.) is only one concrete scheme described for ease of understanding.
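As a purely illustrative numeric check (all values assumed, not taken from the disclosure): with x1 = 300 mm, y1 = 10 mm, z1 = 0 and an image distance x2 = 4 mm, the similar-triangle relation gives y2 = y1 · x2 / x1 ≈ 0.133 mm and z2 = 0; with a pixel size S = 0.002 mm (2 µm), n1 = y2 / S ≈ 67 and n2 = 0, so the pixel coordinates of P' are (0, 67).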
Further, the temperature measurement type includes a temperature measurement area type and/or a temperature measurement object type, and the step of identifying the temperature measurement area image to obtain the temperature measurement type corresponding to the temperature measurement area image includes:
S201: performing visual recognition processing on the temperature measurement area image, and judging whether the temperature measurement area type and/or temperature measurement object type corresponding to the temperature measurement area image can be recognized;
S202: if the temperature measurement area type and/or temperature measurement object type corresponding to the temperature measurement area image cannot be recognized, expanding the preset area by a preset multiple, capturing a new temperature measurement area image, and performing visual recognition processing on the new temperature measurement area image.
In this embodiment, the temperature measurement type includes a temperature measurement area type and/or a temperature measurement object type. After acquiring the temperature measurement area image, the temperature measurement system uses visual recognition technology to identify it; the principle of the visual recognition technology is the same as that of existing object recognition technology and is not described in detail here. After the visual recognition processing, if the temperature measurement object type and/or temperature measurement area type of the measured object can be obtained directly from the temperature measurement area image, the temperature measurement type is generated from them and associated with the temperature measurement area image for the subsequent algorithm selection. If the visual recognition technology cannot obtain the temperature measurement object type and/or temperature measurement area type from the temperature measurement area image, the preset area on the imaging surface is expanded by a preset multiple (for example, if the pixel area of the initial preset area is 8*8 and the preset multiple is 2, the pixel area of the new preset area is 16*16) so that an image with a larger pixel area is collected; a new temperature measurement area image is obtained after interception and visual recognition is performed on it again. If the temperature measurement area type and/or temperature measurement object type corresponding to the new image still cannot be obtained, a temperature measurement area image with an even larger pixel area is captured again according to the preset multiple, and so on, until the visual recognition can identify the temperature measurement area type and/or temperature measurement object type corresponding to the image. This avoids the situation where the pixel area of the selected preset area is too small and the captured image is too partial to be recognized accurately, ensures the smooth realization of the temperature measurement method, provides a high degree of intelligence, and does not require the user to adjust the size of the preset area manually.
Further, the temperature measurement type includes a temperature measurement area type and a temperature measurement object type, and the step of selecting the temperature measurement algorithm corresponding to the temperature measurement type includes:
S301: judging whether the measured object is a living thing according to the temperature measurement object type;
S302: if the measured object is a living thing, selecting the temperature conversion formula matching the temperature measurement area type and the temperature measurement object type, and using the temperature conversion formula as the temperature measurement algorithm;
S303: if the measured object is not a living thing, calling a first calculation method as the temperature measurement algorithm, the first calculation method being to use the temperature measurement data directly as the temperature value of the measured object.
In this embodiment, the temperature measurement algorithm includes temperature measurement rules and/or a temperature conversion formula. The temperature measurement system obtains the temperature measurement object type of the measured object through visual recognition and then judges from this type whether the measured object is a living thing. Because the temperature measurement data collected by the temperature measurement sensor is the surface temperature of the measured object, if the measured object is not a living thing (for example ice cream or hot coffee), the temperature measurement data can be used directly as the temperature value of the measured object, and the temperature measurement system accordingly sets the first calculation method for the current measurement as: the temperature measurement data is the temperature value of the measured object. If the measured object is a living thing, the matching temperature conversion formula is selected according to the temperature measurement area type and temperature measurement object type of the measured object. Specifically, because the measured object is a living thing while the temperature measurement data is only its surface temperature, the corresponding temperature compensation algorithm and distance compensation algorithm (that is, the temperature conversion formula) must be applied to the temperature measurement data to obtain the real temperature of the measured object. For example, if the temperature measurement object type of the measured object is a person and the temperature measurement area type is the forehead, substituting the temperature measurement data into the temperature conversion formula yields the body temperature, matching the real temperature of the measured object. Therefore, when the measured object is a living thing, the temperature measurement system calculates the temperature value of the measured object from the temperature measurement data and the temperature conversion formula, forming the temperature measurement algorithm for the current measurement.
Further, a distance measuring sensor is also deployed on the mobile terminal, arranged adjacent to the infrared temperature measurement sensor, and before the step of collecting the temperature measurement area image and temperature measurement data of the measured object the method further includes:
S5: obtaining the temperature measurement distance between the infrared temperature measurement sensor and the measured object through the distance measuring sensor;
S6: judging whether the temperature measurement distance is within a preset distance range;
S7: if the temperature measurement distance is not within the preset distance range, outputting prompt information reminding the user to change the temperature measurement distance.
In this embodiment, when the user aims the mobile terminal at the measured object to measure temperature, the temperature measurement system obtains the temperature measurement distance between the infrared temperature measurement sensor and the measured object through the distance measuring sensor on the mobile terminal. A preset distance range is set in the temperature measurement system; this range is a distance suitable for temperature measurement, preferably 28-32 cm, within which the infrared temperature measurement sensor measures temperature with high accuracy. The temperature measurement system compares the current temperature measurement distance with the preset distance range to judge whether it is within the range. If the temperature measurement distance is within the preset distance range, the temperature measurement system adopts the temperature measurement data currently collected by the infrared temperature measurement sensor. If not, the temperature measurement system outputs prompt information reminding the user to change the temperature measurement distance so that it falls within the preset distance range. For example, if the preset distance range is 28-32 cm and the current temperature measurement distance is 40 cm, the temperature measurement system prompts the user to move the mobile terminal closer to the measured object and states the distance, for example that it needs to move 8-12 cm closer.
Further, after the step of calculating the temperature value according to the temperature measurement data and the temperature measurement algorithm and using the temperature value as the temperature measurement result of the measured object, the method includes:
S8: marking the temperature value and the temperature measurement type on the temperature measurement area image;
S9: outputting the annotated temperature measurement area image to a display interface.
In this embodiment, after calculating the temperature value of the measured object, the temperature measurement system marks the temperature value and the temperature measurement type on the captured temperature measurement area image and then outputs the annotated image to the display interface for display, so that the user can intuitively read the real temperature of the measured object from the temperature measurement area image together with the marked temperature measurement type and temperature value; this is simple and clear and helps improve the user experience.
参照图3,本申请一实施例中还提供了一种测温装置,包括:
采集模块1,用于采集被测物体的测温区域图像和测温数据;
识别模块2,用于识别所述测温区域图像,得到所述测温区域图像对应的测温类型;
筛选模块3,用于筛选与所述测温类型对应的测温算法;
计算模块4,用于根据所述测温数据和所述测温算法,计算得到温度值,并将所述温度 值作为所述被测物体的测温结果。
进一步的,测温方法应用于移动终端,所述移动终端上部署有红外测温传感器和摄像头,所述采集模块1,包括:
采集单元,用于使用红外测温传感器采集被测物体的所述测温数据,并使用所述摄像头采集图像;
解析单元,用于确定第一红外测量点P在所述摄像头的成像面对应的第二红外测量点P'的像素坐标;其中,所述第一红外测量点P为所述红外测温传感器对准所述被测物体采集数据时的测量点;
截取单元,用于截取所述成像面上预设区域的图像,得到所述测温区域图像,其中,所述预设区域为所述成像面上以所述像素坐标为中心的N*N个像素范围内的区域,N不为0。
Further, the resolving unit includes:
an obtaining subunit, configured to construct a three-dimensional Cartesian coordinate system with the lens center O of the camera as origin, and to obtain respectively the relative position information between the lens center O of the camera and the first infrared measurement point P, as well as the image distance of the camera, where the X axis of the coordinate system passes through the lens center O and points in the same direction as the optical axis of the camera lens, the plane formed by the Y axis and the Z axis of the coordinate system is perpendicular to its X axis, and the image distance represents the first perpendicular distance from the second infrared measurement point P' to the plane formed by the Y axis and the Z axis of the coordinate system;
a first calculation subunit, configured to calculate, from the relative position information and the image distance, the second perpendicular distance from the second infrared measurement point P' to the plane formed by the X axis and the Z axis of the coordinate system, and the third perpendicular distance to the plane formed by the X axis and the Y axis of the coordinate system;
a second calculation subunit, configured to obtain the pixel size of the camera's photosensitive chip, calculate from the pixel size and the second perpendicular distance the first pixel count between the center O' of the imaging plane and the second infrared measurement point P' in the Y-axis direction of the coordinate system, and calculate from the pixel size and the third perpendicular distance the second pixel count between the center O' of the imaging plane and the second infrared measurement point P' in the Z-axis direction of the coordinate system;
a resolving subunit, configured to obtain, from the first pixel count and the second pixel count, the pixel coordinates of the second infrared measurement point P' on the imaging plane.
Further, the temperature measurement type includes a temperature measurement region type and/or a temperature measurement object type, and the recognition module 2 includes:
a first judgement unit, configured to perform visual recognition processing on the temperature measurement region image and determine whether the temperature measurement region type and/or object type corresponding to the image can be recognized;
an enlargement unit, configured to, if the region type and/or object type corresponding to the image cannot be recognized, enlarge the preset region by a preset multiple, capture a new temperature measurement region image, and perform visual recognition processing on the new image, until a temperature measurement region type and/or object type is obtained.
Further, the temperature measurement type includes a temperature measurement region type and a temperature measurement object type, and the selection module 3 includes:
a second judgement unit, configured to determine, according to the temperature measurement object type, whether the measured object is a living being;
a first setting unit, configured to, if the measured object is a living being, select a temperature conversion formula matching the region type and object type and use the temperature conversion formula as the temperature measurement algorithm;
a second setting unit, configured to, if the measured object is not a living being, invoke a first computation mode as the temperature measurement algorithm, the first computation mode taking the temperature measurement data directly as the temperature value of the measured object.
Further, a distance measurement sensor is also deployed on the mobile terminal, arranged adjacent to the infrared temperature sensor, and the temperature measurement apparatus further includes:
an obtaining module 5, configured to obtain, through the distance measurement sensor, the measuring distance between the infrared temperature sensor and the measured object;
a judgement module 6, configured to determine whether the measuring distance is within a preset distance range;
an output module 7, configured to output prompt information reminding the user to change the measuring distance if the measuring distance is not within the preset range.
Further, the temperature measurement apparatus further includes:
an annotation module 8, configured to annotate the temperature value and the temperature measurement type on the temperature measurement region image;
a display module 9, configured to output the annotated temperature measurement region image to a display interface.
In this embodiment, the modules and units of the temperature measurement apparatus carry out the corresponding steps of the temperature measurement method described above, and their specific implementation is not detailed here.
In the temperature measurement apparatus provided by this embodiment, the apparatus first acquires the temperature measurement region image and temperature measurement data of the measured object, and then recognizes the image to obtain its temperature measurement type. The apparatus selects the temperature measurement algorithm corresponding to that type and computes the temperature value of the object for the current measurement from the measurement data and the algorithm. By drawing on the captured temperature measurement region image, the present application identifies the temperature measurement type of the object being measured and then selects the matching algorithm. With the algorithm matched to the object's temperature measurement type, the collected measurement data is substituted in to obtain the object's temperature value, which gives excellent versatility and accuracy and is easy for users to operate.
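Modules 1 to 4 chain into a simple pipeline. The sketch below only illustrates the data flow; every callable is a stand-in introduced here for the corresponding module:

```python
def measure(acquire, recognize_type, select_algorithm_fn, distance):
    """End-to-end flow of acquisition, recognition, selection and calculation."""
    image, data = acquire()                      # module 1: region image + sensor reading
    temp_type = recognize_type(image)            # module 2: temperature measurement type
    algorithm = select_algorithm_fn(temp_type)   # module 3: matching algorithm
    return algorithm(data, distance)             # module 4: temperature value (the result)
```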
Referring to Figure 4, an embodiment of the present application further provides a computer device, which may be a server and whose internal structure may be as shown in Figure 4. The computer device includes a processor, a memory, a network interface and a database connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store data such as the preset distance range. The network interface of the computer device is used to communicate with external terminals over a network connection. When executed by the processor, the computer program implements the functions of the temperature measurement method of any of the embodiments above.
The processor executes the steps of the temperature measurement method above:
S1: acquiring a temperature measurement region image and temperature measurement data of a measured object;
S2: recognizing the temperature measurement region image to obtain the temperature measurement type corresponding to the temperature measurement region image;
S3: selecting a temperature measurement algorithm corresponding to the temperature measurement type;
S4: calculating a temperature value from the temperature measurement data and the temperature measurement algorithm, and taking the temperature value as the temperature measurement result of the measured object.
An embodiment of the present application further provides a computer-readable storage medium, which may be a non-volatile storage medium or a volatile storage medium, on which a computer program is stored. When executed by a processor, the computer program implements the temperature measurement method of any of the embodiments above, specifically:
S1: acquiring a temperature measurement region image and temperature measurement data of a measured object;
S2: recognizing the temperature measurement region image to obtain the temperature measurement type corresponding to the temperature measurement region image;
S3: selecting a temperature measurement algorithm corresponding to the temperature measurement type;
S4: calculating a temperature value from the temperature measurement data and the temperature measurement algorithm, and taking the temperature value as the temperature measurement result of the measured object.
Those of ordinary skill in the art will understand that all or part of the processes in the methods of the above embodiments can be implemented by instructing the relevant hardware through a computer program, and that the computer program can be stored in a non-volatile computer-readable storage medium; when executed, the program may include the processes of the embodiments of the methods above. Any reference to memory, storage, a database or other media provided by the present application and used in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, apparatus, article or method that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, apparatus, article or method. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, apparatus, article or method that includes that element.
The above are only preferred embodiments of the present application and do not thereby limit its patent scope. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (20)

  1. A temperature measurement method, characterized by comprising:
    acquiring a temperature measurement region image and temperature measurement data of a measured object;
    recognizing the temperature measurement region image to obtain a temperature measurement type corresponding to the temperature measurement region image;
    selecting a temperature measurement algorithm corresponding to the temperature measurement type;
    calculating a temperature value from the temperature measurement data and the temperature measurement algorithm, and taking the temperature value as a temperature measurement result of the measured object.
  2. The temperature measurement method according to claim 1, characterized in that it is applied to a mobile terminal on which an infrared temperature sensor and a camera are deployed, and the step of acquiring the temperature measurement region image and temperature measurement data of the measured object comprises:
    collecting the temperature measurement data of the measured object with the infrared temperature sensor, and capturing an image with the camera;
    determining pixel coordinates of a second infrared measurement point, on an imaging plane of the camera, corresponding to a first infrared measurement point, wherein the first infrared measurement point is the measurement point at which the infrared temperature sensor, aimed at the measured object, collects data;
    cropping an image of a preset region on the imaging plane to obtain the temperature measurement region image, wherein the preset region is a region on the imaging plane within an N*N pixel range centered on the pixel coordinates, N being non-zero.
  3. The temperature measurement method according to claim 2, characterized in that the step of determining the pixel coordinates of the second infrared measurement point on the imaging plane of the camera corresponding to the first infrared measurement point comprises:
    constructing a three-dimensional Cartesian coordinate system with the lens center of the camera as origin, and obtaining respectively relative position information between the lens center of the camera and the first infrared measurement point, and an image distance of the camera, wherein the X axis of the three-dimensional Cartesian coordinate system passes through the lens center and its direction is the same as the optical-axis direction of the camera lens, the plane formed by the Y axis and the Z axis of the coordinate system is perpendicular to its X axis, and the image distance represents a first perpendicular distance from the second infrared measurement point to the plane formed by the Y axis and the Z axis of the coordinate system;
    calculating, from the relative position information and the image distance, a second perpendicular distance from the second infrared measurement point to the plane formed by the X axis and the Z axis of the coordinate system, and a third perpendicular distance to the plane formed by the X axis and the Y axis of the coordinate system;
    obtaining a pixel size of a photosensitive chip of the camera, calculating, from the pixel size and the second perpendicular distance, a first pixel count between the center of the imaging plane and the second infrared measurement point in the Y-axis direction of the coordinate system, and calculating, from the pixel size and the third perpendicular distance, a second pixel count between the center of the imaging plane and the second infrared measurement point in the Z-axis direction of the coordinate system;
    obtaining, from the first pixel count and the second pixel count, the pixel coordinates of the second infrared measurement point on the imaging plane.
  4. The temperature measurement method according to claim 2, characterized in that the temperature measurement type comprises a temperature measurement region type and/or a temperature measurement object type, and the step of recognizing the temperature measurement region image to obtain the temperature measurement type corresponding to the temperature measurement region image comprises:
    performing visual recognition processing on the temperature measurement region image, and determining whether the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image can be recognized;
    if the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image cannot be recognized, enlarging the preset region by a preset multiple, capturing a new temperature measurement region image, and performing visual recognition processing on the new temperature measurement region image.
  5. The temperature measurement method according to claim 4, characterized in that the temperature measurement type comprises a temperature measurement region type and a temperature measurement object type, and the step of selecting the temperature measurement algorithm corresponding to the temperature measurement type comprises:
    determining, according to the temperature measurement object type, whether the measured object is a living being;
    if the measured object is a living being, selecting a temperature conversion formula matching the temperature measurement region type and the temperature measurement object type, and using the temperature conversion formula as the temperature measurement algorithm;
    if the measured object is not a living being, invoking a first computation mode as the temperature measurement algorithm, the first computation mode taking the temperature measurement data directly as the temperature value of the measured object.
  6. The temperature measurement method according to claim 2, characterized in that a distance measurement sensor is further deployed on the mobile terminal, the distance measurement sensor being arranged adjacent to the infrared temperature sensor, and before the step of acquiring the temperature measurement region image and temperature measurement data of the measured object, the method further comprises:
    obtaining, through the distance measurement sensor, a measuring distance between the infrared temperature sensor and the measured object;
    determining whether the measuring distance is within a preset distance range;
    if the measuring distance is not within the preset distance range, outputting prompt information to remind a user to change the measuring distance.
  7. The temperature measurement method according to claim 1, characterized in that, after the step of calculating the temperature value from the temperature measurement data and the temperature measurement algorithm and taking the temperature value as the temperature measurement result of the measured object, the method comprises:
    annotating the temperature value and the temperature measurement type on the temperature measurement region image;
    outputting the annotated temperature measurement region image to a display interface.
  8. A computer device, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements a temperature measurement method;
    wherein the temperature measurement method comprises:
    acquiring a temperature measurement region image and temperature measurement data of a measured object;
    recognizing the temperature measurement region image to obtain a temperature measurement type corresponding to the temperature measurement region image;
    selecting a temperature measurement algorithm corresponding to the temperature measurement type;
    calculating a temperature value from the temperature measurement data and the temperature measurement algorithm, and taking the temperature value as a temperature measurement result of the measured object.
  9. The computer device according to claim 8, wherein an infrared temperature sensor and a camera are deployed, and the step of acquiring the temperature measurement region image and temperature measurement data of the measured object comprises:
    collecting the temperature measurement data of the measured object with the infrared temperature sensor, and capturing an image with the camera;
    determining pixel coordinates of a second infrared measurement point, on an imaging plane of the camera, corresponding to a first infrared measurement point, wherein the first infrared measurement point is the measurement point at which the infrared temperature sensor, aimed at the measured object, collects data;
    cropping an image of a preset region on the imaging plane to obtain the temperature measurement region image, wherein the preset region is a region on the imaging plane within an N*N pixel range centered on the pixel coordinates, N being non-zero.
  10. The computer device according to claim 9, wherein the step of determining the pixel coordinates of the second infrared measurement point on the imaging plane of the camera corresponding to the first infrared measurement point comprises:
    constructing a three-dimensional Cartesian coordinate system with the lens center of the camera as origin, and obtaining respectively relative position information between the lens center of the camera and the first infrared measurement point, and an image distance of the camera, wherein the X axis of the three-dimensional Cartesian coordinate system passes through the lens center and its direction is the same as the optical-axis direction of the camera lens, the plane formed by the Y axis and the Z axis of the coordinate system is perpendicular to its X axis, and the image distance represents a first perpendicular distance from the second infrared measurement point to the plane formed by the Y axis and the Z axis of the coordinate system;
    calculating, from the relative position information and the image distance, a second perpendicular distance from the second infrared measurement point to the plane formed by the X axis and the Z axis of the coordinate system, and a third perpendicular distance to the plane formed by the X axis and the Y axis of the coordinate system;
    obtaining a pixel size of a photosensitive chip of the camera, calculating, from the pixel size and the second perpendicular distance, a first pixel count between the center of the imaging plane and the second infrared measurement point in the Y-axis direction of the coordinate system, and calculating, from the pixel size and the third perpendicular distance, a second pixel count between the center of the imaging plane and the second infrared measurement point in the Z-axis direction of the coordinate system;
    obtaining, from the first pixel count and the second pixel count, the pixel coordinates of the second infrared measurement point on the imaging plane.
  11. The computer device according to claim 9, wherein the temperature measurement type comprises a temperature measurement region type and/or a temperature measurement object type, and the step of recognizing the temperature measurement region image to obtain the temperature measurement type corresponding to the temperature measurement region image comprises:
    performing visual recognition processing on the temperature measurement region image, and determining whether the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image can be recognized;
    if the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image cannot be recognized, enlarging the preset region by a preset multiple, capturing a new temperature measurement region image, and performing visual recognition processing on the new temperature measurement region image.
  12. The computer device according to claim 11, wherein the temperature measurement type comprises a temperature measurement region type and a temperature measurement object type, and the step of selecting the temperature measurement algorithm corresponding to the temperature measurement type comprises:
    determining, according to the temperature measurement object type, whether the measured object is a living being;
    if the measured object is a living being, selecting a temperature conversion formula matching the temperature measurement region type and the temperature measurement object type, and using the temperature conversion formula as the temperature measurement algorithm;
    if the measured object is not a living being, invoking a first computation mode as the temperature measurement algorithm, the first computation mode taking the temperature measurement data directly as the temperature value of the measured object.
  13. The computer device according to claim 9, wherein a distance measurement sensor is further deployed, the distance measurement sensor being arranged adjacent to the infrared temperature sensor, and before the step of acquiring the temperature measurement region image and temperature measurement data of the measured object, the method further comprises:
    obtaining, through the distance measurement sensor, a measuring distance between the infrared temperature sensor and the measured object;
    determining whether the measuring distance is within a preset distance range;
    if the measuring distance is not within the preset distance range, outputting prompt information to remind a user to change the measuring distance.
  14. The computer device according to claim 8, wherein, after the step of calculating the temperature value from the temperature measurement data and the temperature measurement algorithm and taking the temperature value as the temperature measurement result of the measured object, the method comprises:
    annotating the temperature value and the temperature measurement type on the temperature measurement region image;
    outputting the annotated temperature measurement region image to a display interface.
  15. A computer-readable storage medium on which a computer program is stored, characterized in that, when executed by a processor, the computer program implements a temperature measurement method, the temperature measurement method comprising the following steps:
    acquiring a temperature measurement region image and temperature measurement data of a measured object;
    recognizing the temperature measurement region image to obtain a temperature measurement type corresponding to the temperature measurement region image;
    selecting a temperature measurement algorithm corresponding to the temperature measurement type;
    calculating a temperature value from the temperature measurement data and the temperature measurement algorithm, and taking the temperature value as a temperature measurement result of the measured object.
  16. The computer-readable storage medium according to claim 15, characterized in that the method is applied to a mobile terminal on which an infrared temperature sensor and a camera are deployed, and the step of acquiring the temperature measurement region image and temperature measurement data of the measured object comprises:
    collecting the temperature measurement data of the measured object with the infrared temperature sensor, and capturing an image with the camera;
    determining pixel coordinates of a second infrared measurement point, on an imaging plane of the camera, corresponding to a first infrared measurement point, wherein the first infrared measurement point is the measurement point at which the infrared temperature sensor, aimed at the measured object, collects data;
    cropping an image of a preset region on the imaging plane to obtain the temperature measurement region image, wherein the preset region is a region on the imaging plane within an N*N pixel range centered on the pixel coordinates, N being non-zero.
  17. The computer-readable storage medium according to claim 16, characterized in that the step of determining the pixel coordinates of the second infrared measurement point on the imaging plane of the camera corresponding to the first infrared measurement point comprises:
    constructing a three-dimensional Cartesian coordinate system with the lens center of the camera as origin, and obtaining respectively relative position information between the lens center of the camera and the first infrared measurement point, and an image distance of the camera, wherein the X axis of the three-dimensional Cartesian coordinate system passes through the lens center and its direction is the same as the optical-axis direction of the camera lens, the plane formed by the Y axis and the Z axis of the coordinate system is perpendicular to its X axis, and the image distance represents a first perpendicular distance from the second infrared measurement point to the plane formed by the Y axis and the Z axis of the coordinate system;
    calculating, from the relative position information and the image distance, a second perpendicular distance from the second infrared measurement point to the plane formed by the X axis and the Z axis of the coordinate system, and a third perpendicular distance to the plane formed by the X axis and the Y axis of the coordinate system;
    obtaining a pixel size of a photosensitive chip of the camera, calculating, from the pixel size and the second perpendicular distance, a first pixel count between the center of the imaging plane and the second infrared measurement point in the Y-axis direction of the coordinate system, and calculating, from the pixel size and the third perpendicular distance, a second pixel count between the center of the imaging plane and the second infrared measurement point in the Z-axis direction of the coordinate system;
    obtaining, from the first pixel count and the second pixel count, the pixel coordinates of the second infrared measurement point on the imaging plane.
  18. The computer-readable storage medium according to claim 16, characterized in that the temperature measurement type comprises a temperature measurement region type and/or a temperature measurement object type, and the step of recognizing the temperature measurement region image to obtain the temperature measurement type corresponding to the temperature measurement region image comprises:
    performing visual recognition processing on the temperature measurement region image, and determining whether the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image can be recognized;
    if the temperature measurement region type and/or temperature measurement object type corresponding to the temperature measurement region image cannot be recognized, enlarging the preset region by a preset multiple, capturing a new temperature measurement region image, and performing visual recognition processing on the new temperature measurement region image.
  19. The computer-readable storage medium according to claim 18, characterized in that the temperature measurement type comprises a temperature measurement region type and a temperature measurement object type, and the step of selecting the temperature measurement algorithm corresponding to the temperature measurement type comprises:
    determining, according to the temperature measurement object type, whether the measured object is a living being;
    if the measured object is a living being, selecting a temperature conversion formula matching the temperature measurement region type and the temperature measurement object type, and using the temperature conversion formula as the temperature measurement algorithm;
    if the measured object is not a living being, invoking a first computation mode as the temperature measurement algorithm, the first computation mode taking the temperature measurement data directly as the temperature value of the measured object.
  20. The computer-readable storage medium according to claim 16, characterized in that a distance measurement sensor is further deployed on the mobile terminal, the distance measurement sensor being arranged adjacent to the infrared temperature sensor, and before the step of acquiring the temperature measurement region image and temperature measurement data of the measured object, the method further comprises:
    obtaining, through the distance measurement sensor, a measuring distance between the infrared temperature sensor and the measured object;
    determining whether the measuring distance is within a preset distance range;
    if the measuring distance is not within the preset distance range, outputting prompt information to remind a user to change the measuring distance.
PCT/CN2021/113612 2021-05-18 2021-08-19 Temperature measurement method, computer device and computer-readable storage medium WO2022241964A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110540999.9 2021-05-18
CN202110540999.9A CN113237556A (zh) 2021-05-18 2021-05-18 测温方法、装置和计算机设备

Publications (1)

Publication Number Publication Date
WO2022241964A1 true WO2022241964A1 (zh) 2022-11-24

Family

ID=77135094

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113612 WO2022241964A1 (zh) 2021-05-18 2021-08-19 测温方法、计算机设备和计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN113237556A (zh)
WO (1) WO2022241964A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113237556A (zh) * 2021-05-18 2021-08-10 深圳市沃特沃德信息有限公司 测温方法、装置和计算机设备
CN113899395B (zh) * 2021-09-03 2023-04-18 珠海格力电器股份有限公司 温度湿度测量方法、装置、计算机设备和存储介质

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7176440B2 (en) * 2001-01-19 2007-02-13 Honeywell International Inc. Method and apparatus for detecting objects using structured light patterns
CN111751001A (zh) * 2019-03-26 2020-10-09 奇酷互联网络科技(深圳)有限公司 温度检测方法、存储介质及移动终端
CN110110629B (zh) * 2019-04-25 2021-05-28 北京工业大学 面向室内环境控制的人员信息检测方法与系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106855436A (zh) * 2015-12-08 2017-06-16 深圳超多维光电子有限公司 一种终端设备及温度测量的方法
CN105488941A (zh) * 2016-01-15 2016-04-13 中林信达(北京)科技信息有限责任公司 基于红外-可见光图像的双光谱森林火情监测方法及装置
CN111412992A (zh) * 2020-04-03 2020-07-14 华为技术有限公司 一种测量温度的方法及电子设备
CN111968163A (zh) * 2020-08-14 2020-11-20 济南博观智能科技有限公司 一种热电堆阵列测温方法及装置
CN113237556A (zh) * 2021-05-18 2021-08-10 深圳市沃特沃德信息有限公司 测温方法、装置和计算机设备

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116520915A (zh) * 2023-06-28 2023-08-01 泰山学院 基于热红外图像的网络中心机房温度预警控制系统
CN116520915B (zh) * 2023-06-28 2023-09-05 泰山学院 基于热红外图像的网络中心机房温度预警控制系统
CN116704009A (zh) * 2023-08-10 2023-09-05 深圳普达核工业数字测控有限公司 基于预制组件的施工测量数据处理方法、装置及其设备
CN116704009B (zh) * 2023-08-10 2023-12-01 深圳普达核工业数字测控有限公司 基于预制组件的施工测量数据处理方法、装置及其设备

Also Published As

Publication number Publication date
CN113237556A (zh) 2021-08-10

Similar Documents

Publication Publication Date Title
WO2022241964A1 (zh) 测温方法、计算机设备和计算机可读存储介质
TWI555379B (zh) 一種全景魚眼相機影像校正、合成與景深重建方法與其系統
JP5158386B2 (ja) 単一カメラによる画像計測処理装置,画像計測処理方法および画像計測処理プログラム
US8897539B2 (en) Using images to create measurements of structures through the videogrammetric process
JP2013122434A (ja) レーザーを用いた単眼カメラによる3次元形状位置計測装置,3次元形状位置計測処理方法および3次元形状位置計測処理プログラム
CN109035309A (zh) 基于立体视觉的双目摄像头与激光雷达间的位姿配准方法
CN109099889B (zh) 近景摄影测量系统和方法
WO2018232900A1 (zh) 空间定位装置、定位处理方法及装置
CN110779491A (zh) 一种水平面上目标测距的方法、装置、设备及存储介质
CN113358231B (zh) 红外测温方法、装置及设备
WO2021259365A1 (zh) 一种目标测温方法、装置及测温系统
WO2021004416A1 (zh) 一种基于视觉信标建立信标地图的方法、装置
WO2022218161A1 (zh) 用于目标匹配的方法、装置、设备及存储介质
CN111738215A (zh) 人体温度测量方法和计算机设备
CN111563926B (zh) 测量图像中物体物理尺寸的方法、电子设备、介质及系统
WO2022257794A1 (zh) 可见光图像和红外图像的处理方法及装置
CN114359334A (zh) 目标跟踪方法、装置、计算机设备和存储介质
JP2019109200A (ja) 校正用データ生成装置、校正用データ生成方法、キャリブレーションシステム、及び制御プログラム
CN110991306A (zh) 自适应的宽视场高分辨率智能传感方法和系统
WO2022052189A1 (zh) 动物外部特征获取方法、装置及计算机设备
US10356394B2 (en) Apparatus and method for measuring position of stereo camera
CN112241984A (zh) 双目视觉传感器标定方法、装置、计算机设备和存储介质
CN116485902A (zh) 标志点匹配方法、装置、计算机设备和存储介质
KR102458065B1 (ko) 안면 인식 체온 측정 장치 및 방법
CN113420702A (zh) 基于人脸进行温度检测的方法、装置、系统以及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21940413

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE