CN114565517A - Image denoising method and device for infrared camera and computer equipment

Info

Publication number
CN114565517A
CN114565517A
Authority
CN
China
Prior art keywords
infrared
light spot
pixel
image
connected domain
Legal status
Granted
Application number
CN202111649543.2A
Other languages
Chinese (zh)
Other versions
CN114565517B (en)
Inventor
孟李艾俐
胡超
徐逸帆
董博文
Current Assignee
Bone Shengyuanhua Robot Shenzhen Co ltd
Original Assignee
Bone Shengyuanhua Robot Shenzhen Co ltd
Application filed by Bone Shengyuanhua Robot Shenzhen Co ltd
Priority to CN202111649543.2A
Publication of CN114565517A
Application granted
Publication of CN114565517B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image


Abstract

The embodiment of the application is applicable to the technical field of image processing, and provides an image denoising method and device for an infrared camera and computer equipment, wherein the method comprises the following steps: controlling an infrared camera to acquire an image of a round infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value; carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point; identifying a light spot connected domain in the infrared image after the gray level binarization processing, and detecting the shape of the light spot connected domain; and denoising the infrared image based on the shape of the light spot connected domain. By adopting the method, the interference of other factors on the detection of the infrared identification target can be eliminated, and the detection accuracy is improved.

Description

Image denoising method and device for infrared camera and computer equipment
Technical Field
The embodiment of the application belongs to the technical field of image processing, and particularly relates to an image denoising method and device for an infrared camera and computer equipment.
Background
With the rapid development of medical science and computer science, computer-assisted surgery has become a research and application hot spot of modern surgical operations, and has received extensive attention. The operation navigation system is an important application direction of computer-assisted surgery, and can help doctors select the optimal operation path, reduce operation injuries and improve the accuracy, convenience and success rate of the operation. Currently, the most commonly used surgical navigation system utilizes an infrared positioning tracker to perform real-time tracking and positioning on surgical instruments and infrared identification targets on focuses, calculates working positions, directions and motion paths of the surgical instruments, and completes surgery according to preoperative designed and planned routes and steps on the basis.
During the operation, medical personnel, surgical instruments, background objects and the like in the operating room can produce various forms of infrared reflection, and ambient light such as sunlight and illumination sources also contributes a certain amount of infrared light. Thus, when the infrared positioning tracker is used, the infrared camera image contains noise in addition to the light spot pattern of the infrared identification target. This image noise can interfere with the surgical navigation system's determination of the infrared identification target coordinates and affect the accuracy of the operation.
Disclosure of Invention
In view of this, the embodiment of the present application provides an image denoising method and apparatus for an infrared camera, and a computer device, which are used to denoise an image of the infrared camera, eliminate interference of other factors on detection of an infrared identification target, and improve detection accuracy.
A first aspect of an embodiment of the present application provides an image denoising method for an infrared camera, including:
controlling an infrared camera to acquire an image of a round infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
identifying a light spot connected domain in the infrared image after the gray level binarization processing, and detecting the shape of the light spot connected domain;
and denoising the infrared image based on the shape of the light spot connected domain.
A second aspect of the embodiments of the present application provides an image denoising device for an infrared camera, including:
the image acquisition module is used for controlling the infrared camera to acquire an image of a circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
the binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the light spot connected domain identification module is used for identifying the light spot connected domain in the infrared image after the gray level binarization processing and detecting the shape of the light spot connected domain;
and the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain.
A third aspect of embodiments of the present application provides a computer device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the image denoising method for an infrared camera according to the first aspect.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the image denoising method for an infrared camera according to the first aspect.
A fifth aspect of the embodiments of the present application provides a computer program product, which when run on a computer causes the computer to execute the method for denoising an image of an infrared camera according to the first aspect.
Compared with the prior art, the embodiment of the application has the following advantages:
according to the embodiment of the application, the computer equipment can acquire the images of the round infrared reflecting identification targets by controlling the infrared camera, so that the infrared images can be obtained. According to the pixel value of each pixel point in the infrared image, the computer equipment can carry out gray level binarization processing on the infrared image. By identifying the light spot connected domain in the infrared image after the gray level binarization processing and detecting the shape of the light spot connected domain, the computer equipment can denoise the infrared image based on the shape of the light spot connected domain. According to the embodiment of the application, the infrared image is subjected to gray level binarization processing and the shape of the light spot communication domain is detected, so that various noises can be accurately filtered, and an ideal infrared light-reflecting identification target light spot is obtained.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic diagram of an infrared positioning tracker according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image denoising method of an infrared camera according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an infrared image provided by an embodiment of the present application;
fig. 4 is a schematic diagram of an implementation manner of S201 in an image denoising method for an infrared camera according to an embodiment of the present application;
fig. 5 is a schematic diagram of an infrared image after a grayscale binarization process according to an embodiment of the application;
fig. 6 is a schematic diagram of an implementation manner of S203 in an image denoising method of an infrared camera according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a denoised IR image according to an embodiment of the present disclosure;
fig. 8 is a schematic algorithm flow diagram of an image denoising method of an infrared camera according to an embodiment of the present disclosure;
fig. 9 is a schematic diagram of an image denoising apparatus of an infrared camera according to an embodiment of the present application;
fig. 10 is a schematic diagram of a computer device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Fig. 1 is a schematic diagram illustrating an operation of an infrared positioning tracker according to an embodiment of the present application. The infrared positioning tracker in fig. 1 is composed of an infrared light source and an infrared camera. The infrared light source is composed of an infrared LED array arranged around the circumference of the infrared camera. In operation, the infrared light source emits light that irradiates a circular infrared reflective identification target (such as the identification target on the surgical instrument in fig. 1) on a surgical instrument or a focus, and the reflected light from the infrared reflective identification target forms a light spot pattern on the camera image. The infrared camera in the infrared positioning tracker is realized by adding an infrared filter matched with the infrared light source in front of the camera, and is used for shooting the infrared reflective identification targets. Because the infrared reflectivity of objects around the identification target is low, the image captured by the infrared camera contains infrared reflective identification target spot patterns with high brightness and contrast. The position and attitude parameters of the surgical instrument can be calculated by finding the central coordinates of the identification target light spots, so that positioning is realized.
However, other objects in the operation scene can cause infrared light reflection to a certain extent, and ambient light can also generate reflection on each object in the operation scene, so that some noise exists in an image obtained by the infrared camera in addition to the light spot pattern of the infrared reflective identification target. In order to ensure the accuracy of positioning, the noise needs to be removed.
There are many image filtering algorithms in the prior art, such as mean filtering, median filtering, Gaussian filtering, etc., and many edge detection algorithms, such as circle detection algorithms, are also well established. However, when the infrared reflective identification target is tilted at an angle, it appears as an ellipse in a varying orientation on the image plane. Moreover, due to the presence of noise, existing algorithms often have difficulty detecting the ellipse of the infrared reflective identification target and are prone to error. Therefore, a key objective of the positioning and navigation system is to remove the various interferences and provide an accurate, reliable and fast algorithm for recognizing the infrared reflective identification targets.
The image denoising method of the infrared camera provided by the embodiment of the application can detect circles and ellipses in the infrared image after binarization processing. The method obtains high-quality infrared images by designing a reasonable infrared reflecting identification target and an infrared camera; and then preprocessing the infrared image by using a rapid image filtering method, further detecting a white light spot connected domain by using a region growing method, extracting required circular and elliptical light spot connected domains by using methods such as the number characteristic, the shape characteristic, the central moment and the like of pixels of the connected domain, and finishing accurate and reliable identification and detection of the infrared reflective identification target.
The technical solution of the present application will be described below by way of specific examples.
Referring to fig. 2, a schematic diagram of an image denoising method of an infrared camera provided in an embodiment of the present application is shown, where the method specifically includes the following steps:
s201, controlling an infrared camera to acquire images of a circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value.
The method can be applied to computer equipment, namely, the execution subject of the embodiment of the application can be the computer equipment. The computer equipment can be electronic equipment which can control an infrared camera to shoot the infrared reflection identification target and carry out denoising processing on the shot infrared image. For example, the specific type of the electronic device is not limited in the embodiments of the present application, such as a tablet computer, a notebook computer, a desktop computer, or a cloud server.
In one example of an embodiment of the present application, a computer device may be a component of a positioning navigation system. Taking the surgical navigation system as an example, the computer device can form the surgical navigation system together with an infrared positioning tracker composed of an infrared light source and an infrared camera, a surgical instrument, an infrared reflective marking target on a focus and the like.
When the computer equipment is used for positioning and navigating, the infrared camera can be controlled by the computer equipment to shoot the infrared reflecting identification target, so that a corresponding infrared image is obtained. The infrared reflective marking target can be circular in shape, and the light spot formed on the infrared image by the circular infrared reflective marking target is generally circular or oval. Fig. 3 is a schematic diagram of an infrared image provided in an embodiment of the present application, and the infrared image shown in fig. 3 is an original, unprocessed infrared image captured by an infrared camera. As can be seen from fig. 3, the ir image includes not only the ir retro-reflective marker target spots as in areas 301a, 301b, 301c and 301d, but also some noise as in areas 302a and 302 b.
In a possible implementation manner of the embodiment of the present application, as shown in fig. 4, the controlling the infrared camera to perform image acquisition on the circular infrared reflective marking target in S201, and the obtaining of the infrared image may specifically include the following substeps S2011-S2013:
and S2011, controlling the infrared light source to irradiate the infrared reflective marking target.
In an embodiment of the present application, the computer device may first control the infrared light source to illuminate the infrared reflective marking target. Thus, infrared light reflected by the infrared reflective marker target will form a reflected spot on the image of the infrared camera.
In a specific implementation, before controlling the infrared light source to irradiate the infrared reflective marking target, the computer device needs to determine the wavelength range of the infrared light emitted by the infrared light source. In one example, the infrared light source used in the positioning tracker may be selected to have a wavelength in the near-infrared range, with as narrow a wavelength response as possible. For example, the wavelength range of the infrared light can be set to 780-3000 nm. Specifically, infrared light with a wavelength of 850 nm may be selected for the infrared light source.
In the embodiment of the application, the infrared camera can be realized by adding an infrared filter matched with an infrared light source in front of the camera. Therefore, after the infrared light wavelength range of the infrared light source is determined, the wavelength transmitted by the filter of the infrared camera can be determined according to the infrared light wavelength range. In general, the wavelength transmitted by the infrared camera filter should be consistent with the center wavelength of the infrared light source, and the wavelength range of the infrared light transmitted by the infrared camera filter should be as narrow as possible.
In addition, the intensity of the infrared light source should be strong enough to ensure that the infrared reflective marking target emits a light spot with sufficient brightness.
S2012, target shooting parameters of the infrared camera are determined, and the target shooting parameters comprise the aperture size and the exposure time of the infrared camera.
In general, the shooting parameters of a camera have a significant influence on imaging quality. Therefore, in order to obtain a high-quality infrared image, the target shooting parameters of the infrared camera need to be determined before the infrared camera is controlled to shoot the infrared reflective marking target. Since the brightness of an image taken by a camera is closely related to the aperture size and the exposure time, the target shooting parameters may include the aperture size and the exposure time of the infrared camera. Specifically, the aperture of the infrared camera should not be too large, so that imaging quality is ensured; the exposure time should also be appropriate, so that the image acquisition response is sufficiently fast.
Because the intensity of the light reflected by the infrared reflective marking target is generally higher than that reflected by background objects, the aperture size and the exposure time can be determined using a light spot brightness threshold condition of the infrared reflective marking target: that is, the aperture size and the exposure time are adjusted until the pixel values within the infrared reflective marking target spot are just above a set threshold (e.g., 230), and the aperture size and the exposure time are then fixed. In this way, the brightness of the infrared reflective marking target spot is ensured while the background noise pixel values are limited to a lower range.
In a specific implementation, the infrared camera can be controlled to shoot the infrared reflective identification target under a plurality of different shooting parameters, that is, under different aperture sizes and exposure times, obtaining an infrared image for each group of aperture size and exposure time. By analyzing the light spots in each group of infrared images, when the pixel value of the light spot formed by the infrared reflective identification target in the infrared image is greater than a first preset threshold and the difference obtained by subtracting the first preset threshold from the pixel value of the light spot is smaller than a second preset threshold, the current shooting parameters can be determined as the target shooting parameters. Illustratively, the first preset threshold may be 230, that is, the shooting parameters such as aperture size and exposure time are adjusted so that the pixel value of the light spot formed in the infrared image by the infrared reflective marking target is greater than 230. However, the pixel value of the light spot should not be too large either, so the second preset threshold may be set; for example, the second preset threshold may be any value greater than 0 and less than 5. In this way, the pixel value of the light spot is controlled to be just above the first preset threshold.
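As an illustration of this calibration rule only, the sketch below assumes a hypothetical capture_frame(aperture, exposure) helper that returns one grayscale frame as a NumPy array; the values 230 and 5 stand in for the first and second preset thresholds described above.

```python
FIRST_THRESHOLD = 230   # first preset threshold: spot pixels must exceed this value
SECOND_THRESHOLD = 5    # second preset threshold: spot pixels should exceed it by less than this margin

def find_target_parameters(capture_frame, apertures, exposures):
    """Try aperture/exposure combinations until the brightest pixel of the frame
    (assumed to lie inside the marking target spot) is just above FIRST_THRESHOLD."""
    for aperture in apertures:
        for exposure in exposures:
            frame = capture_frame(aperture, exposure)   # hypothetical camera helper
            peak = int(frame.max())                     # brightest pixel value in the frame
            if FIRST_THRESHOLD < peak < FIRST_THRESHOLD + SECOND_THRESHOLD:
                return aperture, exposure               # fix these as the target shooting parameters
    return None                                         # no combination satisfied the condition
```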
S2013, the infrared camera is controlled to shoot the infrared reflection identification target under the target shooting parameters, and the infrared image is obtained.
After the wavelength range of the infrared light source and the target shooting parameters of the infrared camera are determined, the computer equipment can control the infrared camera to adjust the target shooting parameters, and then the infrared reflection identification target is shot under the target shooting parameters to obtain an infrared image. The infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value.
S202, carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point.
In the embodiment of the application, the infrared image can be subjected to gray level binarization processing according to the pixel value of each pixel point in the infrared image, and the infrared image after the gray level binarization processing contains only two colors: the pixel regions covered by light spots share one color, and the pixel regions not covered by light spots share the other color.
In the embodiment of the application, when the grayscale binarization processing is performed on the infrared image, the pixel value of each pixel point of which the pixel value is less than or equal to the preset pixel threshold value can be assigned as a first numerical value; and assigning the pixel value of each pixel point with the pixel value larger than the preset pixel threshold value as a second numerical value, thereby realizing the gray level binarization processing of the infrared image.
In an example of the embodiment of the present application, the first value may be 0, the second value may be 255, and the preset pixel threshold may be 230. Namely, the pixel value of each pixel point with the pixel value higher than 230 in the infrared image is assigned to be 255, and the pixel values of other pixel points are assigned to be 0. After the processing, the pixel values of most background areas in the infrared image become 0, namely become black; and the white area is mainly the light spot range of the infrared reflective marking target.
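A minimal sketch of this gray-level binarization, assuming an 8-bit grayscale frame loaded as a NumPy array and the example threshold of 230 mentioned above (OpenCV is used here only for convenience):

```python
import cv2

def binarize(gray, threshold=230):
    """Assign 255 to every pixel whose value exceeds the threshold and 0 to all others."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return binary
```

An equivalent pure-NumPy form would be np.where(gray > threshold, 255, 0).astype(np.uint8).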
Fig. 5 is a schematic diagram of an infrared image after a grayscale binarization process according to an embodiment of the application. In fig. 5, most of the background regions are black, for example, the regions 501, and these black regions are pixel point regions in which the pixel values are assigned to 0; while a small part of the area is white, for example, the white areas in the areas 502a, 502b, 502c and 502d are pixel point areas in which the pixel value is assigned to 255, that is, spot areas formed by the infrared reflective mark target in the infrared image. Compared with the original infrared image in fig. 3, after the grayscale binarization processing, the noise in the regions 302a and 302b in fig. 3 is removed.
In general, the area of the light spot formed by the infrared reflective marking target in the infrared image is small, and the sum of the areas of all the light spots does not exceed 3% of the total area of the infrared image generally. Therefore, after the infrared image is subjected to the grayscale binarization processing, the computer device may calculate an area of an area covered by a pixel point of which the pixel value is the second value in the infrared image, that is, an area of a light spot area that appears white in the infrared image after the grayscale binarization processing, where the pixel value is 255. If the occupation ratio of the area in the infrared image is larger than a first preset area threshold (such as 3%), the currently shot infrared image can be considered as an abnormal image. For example, when a human body approaches the infrared light source or the infrared camera, or other objects block the infrared camera, a large high-brightness interference region is generated, so that the area of the light spot region exceeds 3% of the total area of the infrared image. At this point, the computer may discard the infrared image, abandon the execution of the following program, and re-acquire the infrared image.
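The abnormal-frame check can be expressed as the share of white pixels in the binarized frame; the 3% bound here plays the role of the first preset area threshold mentioned above (a sketch under that assumption, not the patent's exact code):

```python
import numpy as np

def is_abnormal(binary, max_ratio=0.03):
    """Return True when the white (255) area exceeds the allowed share of the frame,
    e.g. when a person or object close to the camera floods the image with bright pixels."""
    white_ratio = np.count_nonzero(binary == 255) / binary.size
    return white_ratio > max_ratio
```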
Generally, after the infrared image is subjected to gray level binarization processing, fine particles and fine line-shaped white noise may exist in the image besides white infrared reflection marking target light spots. Such as the line-shaped spot and the particle spot shown in region 502b in fig. 5. The particle spot may be a fine spot having an area smaller than a second predetermined area threshold. To this end, the computer device may employ an image erosion operation to remove such noise, trying to leave only the white spot of the ir-reflective target in the image.
In a specific implementation, the computer device can identify linear light spots and particle light spots in the infrared image after the grayscale binarization processing; then, image erosion is performed on the infrared image so that the linear light spots and the particle light spots are removed.
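A sketch of this erosion step using OpenCV; the 3x3 structuring element and single iteration are assumptions chosen only to illustrate how fine particles and thin lines are removed while the larger marking target spots survive:

```python
import cv2
import numpy as np

def remove_fine_noise(binary):
    """Erode the binarized image so that isolated particles and thin line-shaped
    white noise disappear, leaving mainly the circular/elliptical marker spots."""
    kernel = np.ones((3, 3), np.uint8)   # assumed structuring element
    return cv2.erode(binary, kernel, iterations=1)
```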
S203, identifying a light spot connected domain in the infrared image after the gray level binarization processing, and detecting the shape of the light spot connected domain.
In the embodiment of the present application, the processes of the grayscale binarization processing and the image erosion processing on the infrared image in the foregoing steps may be regarded as a preprocessing process of the infrared image. The computer equipment can further denoise the image by identifying the shape of the light spot in the infrared image obtained after preprocessing.
In the embodiment of the application, the computer device can identify the light spot connected domain in the infrared image after the gray level binarization processing.
In a possible implementation manner of the embodiment of the present application, as shown in fig. 6, the identifying the light spot connected domain in the infrared image after the grayscale binarization processing in S203 may specifically include the following sub-steps S2031 to S2032:
s2031, determining the number and the plane coordinates of pixel points covered by light spots in the infrared image after the gray level binarization processing.
S2032, identifying the light spot connected domain according to the number and the plane coordinates of the pixel points covered by the light spots.
In a specific implementation, a two-dimensional array corresponding to the infrared image may be generated first, the two-dimensional array may be equal to the size of the infrared image and include a plurality of flag bits, and each flag bit corresponds to a pixel point in the infrared image after the grayscale binarization processing. The initial value of each flag bit in the two-dimensional array may be 0.
Then, traversing each pixel point in the infrared image after the gray level binarization processing, and if the pixel value of the pixel point is not 0, modifying the flag bit corresponding to the pixel point in the two-dimensional array to 1. After traversing of each pixel point in the infrared image is completed, the number of the pixel points covered by the light spot can be obtained by counting the number of the flag bits of which the initial value is modified to 1 in the two-dimensional array. And according to the position of the mark bit with the initial value modified to 1 in the two-dimensional array, the plane coordinates of the pixel points covered by the light spots can be determined.
In this way, the computer device can calculate the number of pixel points in the white light spot connected domain with the pixel value of 255 and the plane coordinate of each pixel point after the gray level binarization processing.
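A minimal sketch of the flag-array region growing just described: a 2-D array of flag bits the same size as the image starts at 0, each non-zero pixel is marked 1 as it is visited, and growth over the 4-neighborhood collects the coordinates of every white connected domain (pure Python/NumPy, assuming the binarized image from the previous steps):

```python
from collections import deque

import numpy as np

def find_spot_regions(binary):
    """Return a list of light spot connected domains; each domain is a list of
    (row, col) coordinates of white (255) pixels, grown over the 4-neighborhood."""
    h, w = binary.shape
    flags = np.zeros((h, w), dtype=np.uint8)   # flag bits, initialized to 0
    regions = []
    for r in range(h):
        for c in range(w):
            if binary[r, c] == 0 or flags[r, c]:
                continue
            # grow a new connected domain starting from this seed pixel
            queue = deque([(r, c)])
            flags[r, c] = 1
            pixels = []
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w \
                            and binary[ny, nx] != 0 and not flags[ny, nx]:
                        flags[ny, nx] = 1
                        queue.append((ny, nx))
            regions.append(pixels)
    return regions
```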
After the processing of S2031-S2032, the computer device can obtain all the light spot connected domains in the whole infrared image. However, not all of these light spot connected domains are spots formed by the infrared reflective identification targets; some of them may be noise. The computer device needs to remove this noise.
In the embodiment of the application, the moving area range and the moving angle of the infrared reflective identification target can be determined, and the number range of pixel points covered by light spots formed by the infrared reflective identification target in an infrared image is determined according to the moving area range and the moving angle.
Specifically, when the distance and the angle of the infrared reflective identification target relative to the infrared camera change, the area of the infrared reflective identification target in the infrared image correspondingly changes, so that the quantity change range of all pixel points covered by the white light spots formed by the infrared reflective identification target in the infrared image can be determined by observing the characteristics of the light spots of the infrared reflective identification target and analyzing the possible movement area range and the angle of the infrared reflective identification target in the positioning and navigation process.
Then, for any light spot, if the number of the pixel points covered by the light spot is not within the above number range, the light spot can be considered as noise, and the computer device can remove the light spot.
It should be noted that the range of the number of the pixel points covered by the light spot obtained by analyzing the range of the possible moving area and the angle of the infrared reflective identification target may be a specific numerical range interval. When the number of the pixel points covered by each light spot connected domain calculated by the computer equipment is greater than the upper limit of the numerical range interval or less than the lower limit of the numerical range interval, the light spot connected domain can be considered not to meet the limiting condition of the number range, and the computer equipment can judge that the corresponding light spot is noise and remove the noise.
The computer device may then perform shape detection on the remaining light spot connected domains.
In a specific implementation, because the infrared light reflecting identification target light spot is formed by reflecting infrared light by the round infrared light reflecting identification target, a quasi-round or elliptical light spot is formed in an infrared image obtained by shooting by the infrared camera. Therefore, the computer equipment can realize the shape detection of the infrared reflective marking target aiming at the characteristics of the ellipse.
In a possible implementation manner of the embodiment of the application, for any light spot connected domain, a rectangular region can be generated according to the plane coordinates of each pixel point in the light spot connected domain. Illustratively, the spans of the horizontal axis coordinates and the vertical axis coordinates of the pixel points in the light spot connected domain can be used as the length and the width of a rectangle, so as to obtain the rectangular region.
Then, half of the length of the diagonal of the rectangular region can be taken as the radius r, and the ratio of the number of pixel points in the light spot connected domain to the radius r is used as a constraint. If the light spot connected domain satisfies the constraint Tail/2/r > 0.65r, the shape of the light spot connected domain can be judged to be circular or elliptical, where Tail is the number of pixel points in the light spot connected domain.
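Combining the pixel-count range check with the Tail/2/r > 0.65r constraint, a sketch for one connected domain follows; the count bounds are placeholder values, since in practice they would come from analyzing the target's possible movement range and angles:

```python
import math

def looks_like_spot(pixels, min_count=30, max_count=2000):
    """pixels: list of (row, col) coordinates of one light spot connected domain.
    Returns True when the domain passes the pixel-count range check and the
    Tail/2/r > 0.65*r roundness constraint. min_count/max_count are placeholder bounds."""
    tail = len(pixels)
    if not (min_count <= tail <= max_count):
        return False                           # outside the allowed spot size: treat as noise
    rows = [p[0] for p in pixels]
    cols = [p[1] for p in pixels]
    height = max(rows) - min(rows)             # vertical span of the bounding rectangle
    width = max(cols) - min(cols)              # horizontal span of the bounding rectangle
    r = math.hypot(height, width) / 2.0        # half of the rectangle's diagonal
    if r == 0:
        return False
    return tail / 2.0 / r > 0.65 * r
```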
In another possible implementation manner of the embodiment of the present application, the shape of the connected domain of the light spot may also be detected by using a central moment method. The central moment method is a quantitative calculation method.
In specific implementation, the central point of any light spot connected domain can be calculated according to the plane coordinates of each pixel point in the light spot connected domain.
It should be noted that the plane coordinates of each pixel point may include a horizontal axis coordinate and a vertical axis coordinate, and when the center point of the light spot connected domain is calculated according to the plane coordinates of each pixel point, the sum of the horizontal axis coordinates and the sum of the vertical axis coordinates of each pixel point in the light spot connected domain may be calculated respectively, and then a first ratio of the sum of the horizontal axis coordinates to the number of pixel points in the light spot connected domain and a second ratio of the sum of the vertical axis coordinates to the number of pixel points in the light spot connected domain are calculated respectively. The first ratio and the second ratio jointly form the horizontal axis coordinate and the vertical axis coordinate of the central point of the light spot connected domain.
Specifically, the following formula can be adopted to calculate the central point of the light spot connected domain:
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \bar{y} = \frac{1}{N}\sum_{i=1}^{N} y_i$$
where $x_i$ and $y_i$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the pixel point with serial number i in a certain light spot connected domain, $\bar{x}$ and $\bar{y}$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the central point of the light spot connected domain, and N is the number of pixel points in the light spot connected domain.
Then, according to the central point obtained by calculation, the quadratic mean square central moment of the light spot connected domain can be calculated.
Specifically, for any pixel point in the light spot connected domain, the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the central point is calculated to obtain a first square value, and the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the central point is calculated to obtain a second square value. The first square values and the second square values corresponding to all the pixel points in the light spot connected domain are added to obtain a sum of squares, and the ratio of this sum of squares to the number of pixel points in the light spot connected domain is used as the quadratic mean square central moment of the light spot connected domain.
In one example, the following formula can be used to calculate the quadratic mean square center moment of the spot connected component:
$$J = \frac{1}{N}\sum_{i=1}^{N}\left[(x_i - \bar{x})^2 + (y_i - \bar{y})^2\right]$$
where J is the quadratic mean square central moment of the light spot connected domain.
After the quadratic mean square central moment of the light spot connected domain is obtained through calculation, the computer device can determine the shape of the light spot connected domain according to the quadratic mean square central moment.
In a possible implementation manner of the embodiment of the application, when the computer device determines the shape of the light spot connected domain according to the calculated quadratic mean square central moment, the computer device may further reasonably limit the central moment threshold range of the light spot connected domain according to the position and the angle range where the infrared reflective marker target may appear.
In a specific implementation, the computer device can determine the central moment threshold range of the light spot connected domain formed by the infrared reflective identification target in the infrared image, that is, set the minimum value $J_{\min}$ and the maximum value $J_{\max}$ of the quadratic mean square central moment of the light spot connected domain according to the possible position and angle range of the infrared reflective identification target.
For any light spot connected domain, if the quadratic mean square central moment of the light spot connected domain is within the central moment threshold range, the computer equipment can judge that the shape of the light spot connected domain is circular or elliptical. Otherwise, for those that do not meet the central moment threshold range constraint, the computer device may consider them as noise and remove them.
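A sketch of the central-moment test: the centroid and the quadratic mean square central moment J follow the formulas above, and j_min/j_max stand for the threshold range derived from the target's possible positions and angles (their actual values are not given here):

```python
def central_moment_check(pixels, j_min, j_max):
    """pixels: list of (row, col) coordinates of one light spot connected domain.
    Computes the centroid and the quadratic mean square central moment J, and keeps
    the domain only when J lies inside the assumed threshold range [j_min, j_max]."""
    n = len(pixels)
    cx = sum(p[1] for p in pixels) / n   # horizontal axis coordinate of the center point
    cy = sum(p[0] for p in pixels) / n   # vertical axis coordinate of the center point
    j = sum((p[1] - cx) ** 2 + (p[0] - cy) ** 2 for p in pixels) / n
    return j_min <= j <= j_max, (cx, cy)
```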
S204, denoising the infrared image based on the shape of the light spot connected domain.
In this embodiment of the application, after the shape of each light spot connected component is obtained through the processing and detection of the foregoing steps, the computer device may remove light spot connected components whose shapes are not circular or elliptical.
Fig. 7 is a schematic diagram of a denoised infrared image according to an embodiment of the present disclosure. Referring to regions 702a, 702b, 702c, and 702d in fig. 7, and as can be seen in comparison with fig. 3 and 5, the noise in fig. 3 and 5 has been removed.
In the embodiment of the application, the computer equipment controls the infrared camera to acquire the image of the round infrared reflective identification target, so that the infrared image can be obtained. According to the pixel value of each pixel point in the infrared image, the computer equipment can carry out gray level binarization processing on the infrared image. By identifying the light spot connected domain in the infrared image after the gray level binarization processing and detecting the shape of the light spot connected domain, the computer equipment can denoise the infrared image based on the shape of the light spot connected domain. According to the embodiment of the application, the infrared image is subjected to gray level binarization processing and the shape of the light spot connected domain is detected, so that various noises can be accurately filtered, and an ideal infrared light-reflecting identification target light spot is obtained.
The image denoising method of the infrared camera provided by the embodiment of the application can be applied to various fields needing positioning and tracking, such as surgical navigation, surgical robot navigation control, scientific research teaching, motion analysis, virtual reality and the like. Taking the positioning navigation of the orthopaedic surgical robot as an example, the orthopaedic surgical robot is used to assist a doctor to perform an operation, and the navigation control is a key technology. Before operation, a doctor can complete operation planning and operation route design according to images and diagnosis results such as CT, MRI and the like. In the operation, the infrared reflecting marking targets are fixed on the bone (beside the focus) of a patient and the robot operating arm, and the positioning navigation system can acquire the position and the motion track of the robot relative to the focus position through the positioning tracker and complete the operation (such as cutting, polishing and the like) according to the preoperative planned route. In the process, the positioning navigation system can adopt the image denoising method of the infrared camera provided by the embodiment of the application to denoise the infrared image collected by the infrared camera, so that the positioning precision and the stability of the positioning tracker are improved, and support is provided for the success of the surgical operation.
It should be noted that, the sequence numbers of the steps in the foregoing embodiments do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
For ease of understanding, a complete example is provided below to describe an image denoising method for an infrared camera provided in the embodiments of the present application.
Fig. 8 is a schematic diagram of an algorithm flow of an image denoising method of an infrared camera according to an embodiment of the present disclosure, and the algorithm flow shown in fig. 8 may be applied to an image denoising process of a position tracker shown in fig. 1. The position tracker in fig. 1 includes an infrared light source and an infrared camera. The infrared light source consists of an annular array of infrared LEDs, surrounding an infrared camera, which emit light having a specific wavelength (e.g., 850 nm). After the infrared light source irradiates the infrared reflecting identification target, infrared light reflecting light spots (infrared reflecting identification target light spots) are formed. A transmission narrow-band filter with the same wavelength range as that of the infrared light source LED is arranged in front of the lens of the infrared camera, so that the infrared camera can only respond to the infrared light of the specific LED, and background light with other wavelengths is filtered. Thus, the position tracker obtains the reflection light spot of the infrared reflection mark target, and the background is in a relatively low brightness condition.
Specifically, according to the algorithm flow shown in fig. 8, the method for denoising the image of the infrared camera mainly includes the following steps:
s801, adjusting the aperture size and the exposure time of the camera: the infrared image collected by the infrared camera is a gray scale image, and the aperture size and the exposure time of the camera are adjusted properly. The aperture cannot be too large to ensure imaging quality. The exposure time is also reasonable to ensure a sufficiently fast image acquisition response. The aperture size and the exposure time are determined in this example using a luminance threshold condition that identifies the target spot: i.e., the aperture size and exposure time are adjusted so that the pixel gray scale value within the marker target spot is just above a set threshold (e.g., 230), the aperture size and exposure time are fixed. Thus, the brightness of the target light spot is ensured, and the background noise brightness (pixel value) is limited to a lower range.
S802, carrying out image gray level binarization processing and removing large and fine noises: pixel point values with pixel values (gray scale range 0-255) above a certain threshold (e.g., 230) are assigned 255 and others are assigned 0. After the processing, under the normal condition, most background pixel values can become 0, namely become black; the white areas are mainly the spot areas of the marker target and typically occupy a low (< 3%) proportion of the area.
Removing large interference: when a human body is close to the infrared light source or the infrared camera or an object blocks the infrared camera, a large high-brightness interference area can be generated in the infrared image. After the image binarization processing, it can be set that when the area of the white area is larger than 3% of the whole image area, the computer device can control to abandon the running of the following program and to re-collect the image until the condition is met.
Removing fine noise: after the gray level binarization processing, fine particles and fine linear white noise may exist in the image besides the white mark target light spot. To this end, an image erosion operation may be used in this example to remove this type of noise, trying to leave only the identified target spot in the image.
S803, acquiring the marked target light spot regions by a region growing algorithm: first, a two-dimensional array with the same size as the image is defined to store the flag bits, initialized to all 0. Each pixel in the image is traversed; if the pixel value is not 0, the pixel count of the current region is incremented and the pixel coordinates are recorded. The four-neighborhood of that pixel is then traversed, the count is incremented whenever a non-zero pixel is found, and then the four-neighborhoods of those pixels are traversed in turn, and so on. Finally, the number of pixel points of every white connected domain and the plane coordinates of each pixel point are calculated.
S804, judging the shape of the connected domain: the possible range of the number of pixels of the marked target light spot can be determined by analyzing the moving area of the infrared reflective marked target, which constrains the number of pixels occupied by each connected domain; if the count falls outside this range, the connected domain is treated as noise interference and eliminated. Then, all pixel coordinates of each connected domain are traversed, the differences between the maximum and minimum values of the x coordinate and of the y coordinate in the connected domain are taken, and the length of the diagonal of the rectangle formed by these two differences is calculated. Finally, the connected domain is constrained by the formula Tail/2/r > 0.65r (Tail represents the number of pixels in the connected domain, and r represents half of the length of the diagonal). When the connected domain meets the condition, it can be considered circular or elliptical, i.e., a marked target light spot; otherwise, it is removed.
S805, quantitatively judging the quasi-circle or the ellipse by using a method of calculating the central moment: first, the center coordinates of each connected domain are calculated:
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad \bar{y} = \frac{1}{N}\sum_{i=1}^{N} y_i$$
where $x_i$ and $y_i$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the pixel point with serial number i in a certain connected domain, $\bar{x}$ and $\bar{y}$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the central point of the connected domain, and N is the number of pixel points in the connected domain.
Calculating the quadratic mean square center moment J as:
$$J = \frac{1}{N}\sum_{i=1}^{N}\left[(x_i - \bar{x})^2 + (y_i - \bar{y})^2\right]$$
The light spot conditions of the identification target at its possible positions and angles are further analyzed, a reasonable central moment threshold range is set to constrain each connected domain, and any connected domain that does not meet the central moment threshold range is removed as noise.
After the steps are processed, the computer equipment can filter various noises to obtain ideal identification target light spots.
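Tying the steps of fig. 8 together, the sketch below runs the whole chain (binarize, reject abnormal frames, erode, label connected domains with 4-connectivity, then filter by pixel count and central moment) on one grayscale frame; all numeric bounds are illustrative placeholders rather than values prescribed by this application, and OpenCV's connected-component labelling is used in place of the hand-written region growing for brevity:

```python
import cv2
import numpy as np

def denoise_ir_frame(gray, threshold=230, max_white_ratio=0.03,
                     count_range=(30, 2000), j_range=(5.0, 5000.0)):
    """End-to-end sketch of the flow in fig. 8: binarize, reject abnormal frames,
    erode fine noise, label connected domains, then keep only the domains that pass
    the pixel-count and central-moment checks. The numeric bounds are placeholders."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    if np.count_nonzero(binary) / binary.size > max_white_ratio:
        return None                                       # abnormal frame: re-acquire the image
    binary = cv2.erode(binary, np.ones((3, 3), np.uint8))
    num, labels = cv2.connectedComponents(binary, connectivity=4)   # label 0 is the background
    clean = np.zeros_like(binary)
    for label in range(1, num):
        ys, xs = np.nonzero(labels == label)
        n = ys.size
        if not (count_range[0] <= n <= count_range[1]):
            continue                                      # wrong spot size: treat as noise
        cx, cy = xs.mean(), ys.mean()
        j = ((xs - cx) ** 2 + (ys - cy) ** 2).mean()      # quadratic mean square central moment
        if j_range[0] <= j <= j_range[1]:
            clean[ys, xs] = 255                           # keep this light spot connected domain
    return clean
```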
The embodiment of the application is aimed at image acquisition by the infrared camera and can detect the reflected light spot of the round identification target under irradiation of the infrared light source. Because the background contains reflections from human bodies, instruments and background objects, as well as interference noise from other unknown light sources, the various detection algorithms in the prior art have difficulty accurately identifying the identification target light spot, and detection failures and large deviations occur easily. In the embodiment of the application, an algorithm is designed to remove the various noises based on the characteristic that the reflected light spot of the identification target presents an elliptical shape, achieving more accurate and more complete detection of the identification target.
The method and the device adopt an image detection algorithm based on region growing: grayscale binarization and preprocessing are first performed on the collected original image, then white pixel points are connected by the region growing algorithm to form connected domains, and the number of pixel points in each connected domain is calculated. Finally, an ellipse filtering algorithm is applied to obtain the required feature points. The ellipse detection algorithm of the embodiment of the application can accurately identify most ellipses, accurately calculate their center coordinates and ensure the calculation accuracy; the method and the device can remove the interference caused by the human body and its motion in the scene on the detection of the identification target and filter out the interference of surrounding objects; meanwhile, the denoising method of the embodiment of the application has a high processing speed and ensures the real-time operation of the whole algorithm.
Referring to fig. 9, a schematic diagram of an image denoising device of an infrared camera provided in an embodiment of the present application is shown, and the device may specifically include an image acquisition module 901, a binarization processing module 902, a light spot connected domain identifying module 903, and a denoising module 904, where:
the image acquisition module 901 is configured to control an infrared camera to perform image acquisition on a circular infrared reflective identification target to obtain an infrared image, where the infrared image includes a plurality of pixel points, and each pixel point has a corresponding pixel value;
a binarization processing module 902, configured to perform grayscale binarization processing on the infrared image according to the pixel value of each pixel point;
a light spot connected domain identifying module 903, configured to identify a light spot connected domain in the infrared image after the grayscale binarization processing, and detect a shape of the light spot connected domain;
and the denoising module 904 is configured to denoise the infrared image and determine the identification target based on the shape of the light spot connected domain.
In this embodiment, the image acquisition module 901 may be specifically configured to: controlling an infrared light source to irradiate the infrared reflective marking target; determining target shooting parameters of the infrared camera, wherein the target shooting parameters comprise the aperture size and the exposure time of the infrared camera; and controlling the infrared camera to shoot the infrared light reflecting identification target under the target shooting parameters to obtain the infrared image.
In this embodiment of the present application, the image capturing module 901 may further be configured to: controlling the infrared camera to shoot the infrared reflective identification target under a plurality of different shooting parameters; and when the pixel value of a light spot formed by the infrared reflective identification target in the infrared image is larger than a first preset threshold value and the difference value obtained by subtracting the first preset threshold value from the pixel value of the light spot is smaller than a second preset threshold value, determining the current shooting parameters as the target shooting parameters.
In this embodiment of the present application, the image capturing module 901 may further be configured to: determining an infrared light wavelength range of the infrared light source; determining the wavelength transmitted by the optical filter of the infrared camera according to the infrared light wavelength range; the wavelength transmitted by the optical filter is consistent with the central wavelength of the infrared light source.
In the embodiment of the present application, the wavelength range of the infrared light is 780-3000 nm.
In this embodiment of the present application, the binarization processing module 902 may specifically be configured to: assigning the pixel value of each pixel point of which the pixel value is less than or equal to a preset pixel threshold value to be a first numerical value; assigning the pixel value of each pixel point with the pixel value larger than the preset pixel threshold value as a second numerical value; wherein the first value is 0 and the second value is 255.
In this embodiment of the present application, the binarization processing module 902 may further be configured to: calculating the area of an area covered by a pixel point of which the pixel value is the second numerical value in the infrared image after the gray level binarization processing; and if the occupation ratio of the area in the infrared image is larger than a first preset area threshold value, discarding the infrared image.
In this embodiment of the present application, the binarization processing module 902 may further be configured to: identifying linear light spots and particle light spots in the infrared image after the gray level binarization processing, wherein the area of a particle light spot is smaller than a second preset area threshold value; and carrying out image erosion on the infrared image after the gray level binarization processing so as to remove the linear light spots and the particle light spots.
In this embodiment of the application, the light spot connected component identifying module 903 may specifically be configured to: determining the number and the plane coordinates of pixel points covered by light spots in the infrared image after the gray level binarization processing; and identifying the light spot connected domain according to the number and the plane coordinates of the pixel points covered by the light spots.
In this embodiment of the application, the light spot connected domain identification module 903 may further be configured to: generate a two-dimensional array corresponding to the infrared image, wherein the two-dimensional array comprises a plurality of flag bits, each flag bit corresponds to one pixel point in the infrared image after the gray level binarization processing, and the initial value of each flag bit is 0; traverse each pixel point in the infrared image after the gray level binarization processing; for any pixel point whose pixel value is not 0, modify the flag bit corresponding to that pixel point in the two-dimensional array to 1; count the number of flag bits whose initial value has been modified to 1 to obtain the number of pixel points covered by light spots; and determine the plane coordinates of the pixel points covered by light spots according to the positions of the flag bits modified to 1 in the two-dimensional array.
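The flag-bit bookkeeping described above maps directly onto an array of the same shape as the image; the sketch below is one way to obtain the count and the plane coordinates of the spot pixels:

import numpy as np

def spot_pixel_stats(binary):
    # flags plays the role of the two-dimensional array of flag bits:
    # initially 0 everywhere, set to 1 wherever the binarized image is non-zero.
    flags = np.zeros(binary.shape, dtype=np.uint8)
    flags[binary != 0] = 1
    count = int(flags.sum())            # number of pixel points covered by light spots
    coords = np.argwhere(flags == 1)    # their plane coordinates as (row, column) pairs
    return count, coords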
In this embodiment of the application, the light spot connected domain identification module 903 may further be configured to: determine the moving area range and the moving angle of the infrared reflective identification target; determine, according to the moving area range and the moving angle, the range of the number of pixel points covered by a light spot formed by the infrared reflective identification target in the infrared image; and, for any light spot, if the number of pixel points covered by the light spot is not within that range, remove the light spot.
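A simple filter for this count-range test might look like the following, where domains is assumed to be a list of per-spot coordinate arrays such as those produced by the sketch above:

def filter_by_pixel_count(domains, count_range):
    # Spots whose pixel count falls outside the expected range are removed.
    low, high = count_range
    return [coords for coords in domains if low <= len(coords) <= high]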
In this embodiment of the application, the light spot connected domain identification module 903 may further be configured to: for any light spot connected domain, generate a rectangular region according to the plane coordinates of each pixel point in the light spot connected domain; and, if the condition Tail/(2r) > 0.65r is satisfied, determine that the shape of the light spot connected domain is circular or elliptical, where Tail is the number of pixel points in the light spot connected domain and r is half of the length of the diagonal of the rectangular region.
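Reading the condition as Tail/(2r) > 0.65r, one illustrative implementation of the bounding-rectangle test is:

import numpy as np

def is_round_by_bounding_box(coords, ratio=0.65):
    # coords: (N, 2) pixel coordinates of one light spot connected domain.
    tail = len(coords)                                   # Tail: number of pixels in the domain
    height = coords[:, 0].max() - coords[:, 0].min() + 1
    width = coords[:, 1].max() - coords[:, 1].min() + 1
    r = 0.5 * float(np.hypot(height, width))             # half the diagonal of the bounding rectangle
    return tail / (2.0 * r) > ratio * r                  # Tail/(2r) > 0.65r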
In this embodiment of the application, the light spot connected domain identification module 903 may further be configured to: for any light spot connected domain, calculate the central point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain; calculate a quadratic mean square central moment of the light spot connected domain according to the central point; and determine the shape of the light spot connected domain according to the quadratic mean square central moment.
In this embodiment of the application, the plane coordinates include a horizontal axis coordinate and a vertical axis coordinate, and the light spot connected domain identification module 903 may further be configured to: calculate the sum of the horizontal axis coordinates and the sum of the vertical axis coordinates of the pixel points in the light spot connected domain; and calculate a first ratio of the sum of the horizontal axis coordinates to the number of pixel points in the light spot connected domain and a second ratio of the sum of the vertical axis coordinates to the number of pixel points in the light spot connected domain, the first ratio and the second ratio being the horizontal axis coordinate and the vertical axis coordinate of the central point.
In this embodiment, the light spot connected domain identification module 903 may further be configured to: for any pixel point in the light spot connected domain, calculate the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the central point to obtain a first square value, and calculate the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the central point to obtain a second square value; add the first square values and the second square values of all the pixel points in the light spot connected domain to obtain a sum of squares; and take the ratio of the sum of squares to the number of pixel points in the light spot connected domain as the quadratic mean square central moment of the light spot connected domain.
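Putting the center-point and central-moment computations together, one illustrative implementation is:

import numpy as np

def quadratic_mean_square_central_moment(coords):
    # Central point: mean horizontal and vertical coordinates of the domain's pixels.
    center = coords.mean(axis=0)
    # Per pixel, the first and second square values are summed, then averaged over the domain.
    squared_distances = ((coords - center) ** 2).sum(axis=1)
    return float(squared_distances.mean()), center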
In this embodiment of the application, the light spot connected domain identification module 903 may further be configured to: determine the central moment threshold range of a light spot connected domain formed by the infrared reflective identification target in the infrared image; and, for any light spot connected domain, if the quadratic mean square central moment of the light spot connected domain is within the central moment threshold range, determine that the shape of the light spot connected domain is circular or elliptical.
In this embodiment, the denoising module 904 may be specifically configured to: remove the light spot connected domains whose shape is not circular or elliptical.
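Combining the central-moment threshold test with this removal step, a denoising sketch (reusing the quadratic_mean_square_central_moment helper above and a hypothetical moment_range) might be:

def denoise_by_moment(domains, moment_range):
    # Keep only connected domains whose moment lies in the range expected for the
    # circular reflective target; all other domains are treated as noise and removed.
    low, high = moment_range
    kept = []
    for coords in domains:
        moment, _ = quadratic_mean_square_central_moment(coords)
        if low <= moment <= high:
            kept.append(coords)
    return kept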
For the apparatus embodiment, since it is substantially similar to the method embodiment, it is described relatively simply, and reference may be made to the description of the method embodiment section for relevant points.
Referring to fig. 10, a schematic diagram of a computer device provided in the embodiment of the present application is shown. As shown in fig. 10, the computer device 1000 in the embodiment of the present application includes: a processor 1010, a memory 1020, and a computer program 1021 stored in the memory 1020 and executable on the processor 1010. When executing the computer program 1021, the processor 1010 implements the steps in the embodiments of the image denoising method of the infrared camera, such as steps S101 to S105 shown in fig. 1. Alternatively, when executing the computer program 1021, the processor 1010 implements the functions of the modules/units in the above-mentioned apparatus embodiments, such as the functions of the modules 901 to 904 shown in fig. 9.
Illustratively, the computer program 1021 may be partitioned into one or more modules/units that are stored in the memory 1020 and executed by the processor 1010 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution of the computer program 1021 in the computer device 1000. For example, the computer program 1021 may be divided into an image acquisition module, a binarization processing module, a light spot connected domain identification module, and a denoising module, where the specific functions of the modules are as follows:
the image acquisition module is used for controlling the infrared camera to perform image acquisition on a circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
the binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the light spot connected domain identification module is used for identifying the light spot connected domain in the infrared image after the gray level binarization processing and detecting the shape of the light spot connected domain;
and the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain, so that the identification target can be determined.
The computer device 1000 may be the computer device in the foregoing embodiments, and may be a desktop computer, a cloud server, or another computing device. The computer device 1000 may include, but is not limited to, the processor 1010 and the memory 1020. Those skilled in the art will appreciate that fig. 10 is only an example of the computer device 1000 and does not limit it; the computer device 1000 may include more or fewer components than those shown, some components may be combined, or different components may be used. For example, the computer device 1000 may also include input and output devices, network access devices, buses, and the like.
The processor 1010 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 1020 may be an internal storage unit of the computer device 1000, such as a hard disk or a memory of the computer device 1000. The memory 1020 may also be an external storage device of the computer device 1000, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the computer device 1000. Further, the memory 1020 may include both an internal storage unit and an external storage device of the computer device 1000. The memory 1020 is used for storing the computer program 1021 and other programs and data required by the computer device 1000, and may also be used to temporarily store data that has been output or is to be output.
The embodiment of the application also discloses a computer device, which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the computer program to realize the image denoising method of the infrared camera according to the foregoing embodiments.
The embodiment of the application also discloses a computer readable storage medium, which stores a computer program that, when executed by a processor, implements the image denoising method of the infrared camera according to the foregoing embodiments.
The embodiment of the present application further discloses a computer program product which, when run on a computer, causes the computer to execute the image denoising method of the infrared camera according to the foregoing embodiments.
The functions implemented by the computer device in the embodiments of the present application may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as independent products. Based on such understanding, all or part of the processes in the above method embodiments may be implemented by a computer program, which may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer readable medium may include any entity or device capable of carrying the computer program code to an apparatus/computer device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, or a software distribution medium, such as a USB flash disk, a removable hard disk, or a magnetic or optical disk. In certain jurisdictions, in accordance with legislation and patent practice, a computer-readable medium may not include an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the above-described apparatus/computer device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (20)

1. An image denoising method of an infrared camera, comprising:
controlling an infrared camera to carry out image acquisition on a circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
identifying a light spot connected domain in the infrared image after the gray level binarization processing, and detecting the shape of the light spot connected domain;
and denoising the infrared image based on the shape of the light spot connected domain.
2. The method according to claim 1, wherein the controlling the infrared camera to perform image acquisition on the circular infrared reflective identification target to obtain an infrared image comprises:
controlling an infrared light source to irradiate the infrared reflective identification target;
determining target shooting parameters of the infrared camera, wherein the target shooting parameters comprise the aperture size and the exposure time of the infrared camera;
and controlling the infrared camera to shoot the infrared reflective identification target under the target shooting parameters to obtain the infrared image.
3. The method of claim 2, wherein the determining target capture parameters of the infrared camera comprises:
controlling the infrared camera to shoot the infrared reflective identification target under a plurality of different shooting parameters;
and when the pixel value of a light spot formed by the infrared reflective identification target in the infrared image is larger than a first preset threshold value and the difference value obtained by subtracting the first preset threshold value from the pixel value of the light spot is smaller than a second preset threshold value, determining the current shooting parameters as the target shooting parameters.
4. The method according to claim 2 or 3, wherein before the controlling an infrared light source to irradiate the infrared reflective identification target, the method further comprises:
determining an infrared light wavelength range of the infrared light source;
determining the wavelength transmitted by the optical filter of the infrared camera according to the infrared light wavelength range; and the wavelength transmitted by the optical filter is consistent with the central wavelength of the infrared light source.
5. The method of claim 4, wherein the infrared light has a wavelength ranging from 780 nm to 3000 nm.
6. The method according to any one of claims 1 to 3 or 5, wherein the performing a grayscale binarization process on the infrared image according to the pixel value of each pixel point comprises:
assigning the pixel value of each pixel point with the pixel value less than or equal to a preset pixel threshold value as a first numerical value; assigning the pixel value of each pixel point with the pixel value larger than the preset pixel threshold value as a second numerical value; wherein the first value is 0 and the second value is 255.
7. The method according to claim 6, wherein after performing a grayscale binarization process on the infrared image according to the pixel value of each of the pixel points, the method further comprises:
calculating the area of an area covered by a pixel point of which the pixel value is the second numerical value in the infrared image after the gray level binarization processing;
and if the occupation ratio of the area in the infrared image is larger than a first preset area threshold value, discarding the infrared image.
8. The method according to claim 6, wherein after performing a grayscale binarization process on the infrared image according to the pixel value of each of the pixel points, the method further comprises:
identifying linear light spots and particle light spots in the infrared image after the gray level binarization processing, wherein the area of the particle light spots is smaller than a second preset area threshold value;
and carrying out image corrosion on the infrared image after the gray level binarization processing so as to delete the linear light spots and the particle light spots.
9. The method according to any one of claims 1-3, 5 or 7-8, wherein the identifying the light spot connected domain in the infrared image after the gray level binarization processing comprises:
determining the number and the plane coordinates of pixel points covered by light spots in the infrared image after the gray level binarization processing;
and identifying the light spot connected domain according to the number and the plane coordinates of the pixel points covered by the light spots.
10. The method according to claim 9, wherein the determining the number and the plane coordinates of the pixel points covered by the light spot in the infrared image after the grayscale binarization processing comprises:
generating a two-dimensional array corresponding to the infrared image, wherein the two-dimensional array comprises a plurality of zone bits, each zone bit corresponds to one pixel point in the infrared image after gray level binarization processing, and the initial value of each zone bit is 0;
traversing each pixel point in the infrared image after the gray level binarization processing;
for any one pixel point, if the pixel value of the pixel point is not 0, modifying the flag bit corresponding to the pixel point in the two-dimensional array into 1;
counting the number of the flag bits of which the initial value is modified to 1 in the two-dimensional array to obtain the number of the pixel points covered by the light spots;
and determining the plane coordinates of the pixel points covered by the light spots according to the position of the mark bit with the initial value modified to 1 in the two-dimensional array.
11. The method according to claim 10, wherein after the identifying the light spot connected domain according to the number and the plane coordinates of the pixel points covered by the light spots, the method further comprises:
determining the moving area range and the moving angle of the infrared reflective identification target;
determining the number range of pixel points covered by light spots formed by the infrared reflective identification target in the infrared image according to the moving area range and the moving angle;
and for any light spot, if the number of the pixel points covered by the light spot is not within the number range, removing the light spot.
12. The method according to claim 9, wherein the detecting the shape of the light spot connected domain comprises:
aiming at any light spot connected domain, generating a rectangular region according to the plane coordinates of each pixel point in the light spot connected domain;
if the condition Tail/(2r) > 0.65r is met, judging that the shape of the light spot connected domain is circular or elliptical; wherein Tail is the number of pixel points in the light spot connected domain, and r is half of the length of a diagonal line of the rectangular region.
13. The method according to claim 9, wherein the detecting the shape of the light spot connected domain comprises:
aiming at any one light spot connected domain, calculating the central point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain;
calculating a quadratic mean square central moment of the light spot connected domain according to the central point;
and determining the shape of the light spot connected domain according to the quadratic mean square central moment.
14. The method according to claim 13, wherein the plane coordinates comprise a horizontal axis coordinate and a vertical axis coordinate, and the calculating the central point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain comprises:
respectively calculating the sum of coordinates of a horizontal axis and the sum of coordinates of a vertical axis of each pixel point in the light spot connected domain;
and respectively calculating a first ratio of the sum of the coordinates of the transverse axis to the number of the pixel points in the light spot connected domain and a second ratio of the sum of the coordinates of the longitudinal axis to the number of the pixel points in the light spot connected domain, wherein the first ratio and the second ratio form the coordinates of the transverse axis and the coordinates of the longitudinal axis of the central point.
15. The method according to claim 13 or 14, wherein the calculating a quadratic mean square central moment of the light spot connected domain according to the central point comprises:
aiming at any pixel point in the light spot connected domain, calculating the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the central point to obtain a first square value, and calculating the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the central point to obtain a second square value;
adding the first square value and the second square value of all the pixel points in the light spot connected domain to obtain a square sum;
and taking the ratio of the sum of squares to the number of pixel points in the light spot connected domain as the quadratic mean square central moment of the light spot connected domain.
16. The method according to claim 15, wherein the determining the shape of the light spot connected domain according to the quadratic mean square central moment comprises:
determining the central moment threshold range of the light spot connected domain formed by the infrared reflective identification target in the infrared image;
and for any light spot connected domain, if the quadratic mean square central moment of the light spot connected domain is within the central moment threshold range, judging that the shape of the light spot connected domain is circular or elliptical.
17. The method according to any one of claims 12-14 or 16, wherein the denoising the infrared image based on the shape of the light spot connected domain comprises:
and removing the light spot connected domains which are not circular or elliptical in shape.
18. An image denoising apparatus of an infrared camera, comprising:
the image acquisition module is used for controlling an infrared camera to perform image acquisition on a circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
the binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the light spot connected domain identification module is used for identifying the light spot connected domain in the infrared image after the gray level binarization processing and detecting the shape of the light spot connected domain;
and the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain.
19. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor when executing the computer program implements the method for image denoising of an infrared camera as claimed in any one of claims 1-17.
20. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the image denoising method of the infrared camera according to any one of claims 1 to 17.
CN202111649543.2A 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment Active CN114565517B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111649543.2A CN114565517B (en) 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111649543.2A CN114565517B (en) 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment

Publications (2)

Publication Number Publication Date
CN114565517A (en) 2022-05-31
CN114565517B (en) 2023-09-29

Family

ID=81711536

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111649543.2A Active CN114565517B (en) 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment

Country Status (1)

Country Link
CN (1) CN114565517B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014075365A (en) * 2014-01-27 2014-04-24 Hitachi High-Technologies Corp Charged particle beam device, sample image acquisition method, and program recording medium
CN105719259A (en) * 2016-02-19 2016-06-29 上海理工大学 Pavement crack image detection method
CN108335308A (en) * 2017-01-20 2018-07-27 深圳市祈飞科技有限公司 A kind of orange automatic testing method, system and intelligent robot retail terminal
CN107133627A (en) * 2017-04-01 2017-09-05 深圳市欢创科技有限公司 Infrared light spot center point extracting method and device
CN109299634A (en) * 2017-07-25 2019-02-01 上海中科顶信医学影像科技有限公司 Spot detection method, system, equipment and storage medium
CN109729276A (en) * 2017-10-27 2019-05-07 比亚迪股份有限公司 Near-infrared image capture method, device, equipment and storage medium
WO2019205290A1 (en) * 2018-04-28 2019-10-31 平安科技(深圳)有限公司 Image detection method and apparatus, computer device, and storage medium
CN109325468A (en) * 2018-10-18 2019-02-12 广州智颜科技有限公司 A kind of image processing method, device, computer equipment and storage medium
CN109949252A (en) * 2019-04-15 2019-06-28 北京理工大学 A kind of infrared image hot spot minimizing technology based on penalty coefficient fitting
CN110720985A (en) * 2019-11-13 2020-01-24 安徽领航智睿科技有限公司 Multi-mode guided surgical navigation method and system
CN111080691A (en) * 2019-12-17 2020-04-28 晶科电力科技股份有限公司 Infrared hot spot detection method and device for photovoltaic module
CN111862195A (en) * 2020-08-26 2020-10-30 Oppo广东移动通信有限公司 Light spot detection method and device, terminal and storage medium
CN112911157A (en) * 2021-03-27 2021-06-04 山东创能机械科技有限公司潍坊分公司 Automatic device for searching, tracking, aiming and ranging initial fire source based on infrared image recognition

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023236209A1 (en) * 2022-06-10 2023-12-14 北京小米移动软件有限公司 Image processing method and apparatus, electronic device, and storage medium
CN116228589A (en) * 2023-03-22 2023-06-06 新创碳谷集团有限公司 Method, equipment and storage medium for eliminating noise points of visual inspection camera
CN116228589B (en) * 2023-03-22 2023-08-29 新创碳谷集团有限公司 Method, equipment and storage medium for eliminating noise points of visual inspection camera

Also Published As

Publication number Publication date
CN114565517B (en) 2023-09-29

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant