CN114565517B - Image denoising method and device of infrared camera and computer equipment - Google Patents


Info

Publication number
CN114565517B
Authority
CN
China
Prior art keywords
infrared
pixel
light spot
image
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111649543.2A
Other languages
Chinese (zh)
Other versions
CN114565517A (en)
Inventor
孟李艾俐
胡超
徐逸帆
董博文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bone Shengyuanhua Robot Shenzhen Co ltd
Original Assignee
Bone Shengyuanhua Robot Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bone Shengyuanhua Robot Shenzhen Co ltd
Priority to CN202111649543.2A
Publication of CN114565517A
Application granted
Publication of CN114565517B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10048 Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present application relate to the technical field of image processing and provide an image denoising method and apparatus for an infrared camera, as well as computer equipment. The method includes: controlling an infrared camera to capture an image of a circular infrared reflective identification target to obtain an infrared image, where the infrared image includes a plurality of pixel points and each pixel point has a corresponding pixel value; performing gray level binarization processing on the infrared image according to the pixel value of each pixel point; identifying light spot connected domains in the binarized infrared image and detecting the shapes of the light spot connected domains; and denoising the infrared image based on the shapes of the light spot connected domains. With this method, interference from other factors in the detection of the infrared identification target can be eliminated, and the detection accuracy can be improved.

Description

Image denoising method and device of infrared camera and computer equipment
Technical Field
The embodiment of the application belongs to the technical field of image processing, and particularly relates to an image denoising method and device of an infrared camera and computer equipment.
Background
With the rapid development of medicine and computer science, computer-assisted surgery has become a research and application hotspot of modern surgery and has received great attention. The surgical navigation system is an important application of computer-assisted surgery: it can help doctors select an optimal surgical path, reduce surgical damage, and improve the accuracy, convenience and success rate of surgery. At present, the most commonly used surgical navigation systems use an infrared positioning tracker to track and position, in real time, infrared identification targets on surgical instruments and on the lesion, calculate the working position, orientation and movement path of the surgical instruments, and on this basis complete the operation according to the route and steps designed and planned before the operation.
During the operation, medical staff, surgical instruments, background objects and the like in the operating room cause infrared light reflection in various forms, and irradiation by sunlight, illumination sources and other ambient light also causes infrared reflection to a certain degree. Thus, when the infrared positioning tracker is used, noise exists in the image of the infrared camera in addition to the spot pattern of the infrared identification target. This image noise interferes with the surgical navigation system's determination of the infrared identification target coordinates and affects the accuracy of the operation.
Disclosure of Invention
In view of this, the embodiments of the present application provide an image denoising method, apparatus and computer device for an infrared camera, which are used for denoising an image of the infrared camera, eliminating interference of other factors on detection of an infrared identification target, and improving detection accuracy.
A first aspect of an embodiment of the present application provides an image denoising method of an infrared camera, including:
controlling an infrared camera to acquire images of a round infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
according to the pixel value of each pixel point, carrying out gray level binarization processing on the infrared image;
recognizing a light spot connected domain in the infrared image after gray level binarization processing, and detecting the shape of the light spot connected domain;
and denoising the infrared image based on the shape of the light spot connected domain.
A second aspect of an embodiment of the present application provides an image denoising apparatus of an infrared camera, including:
the image acquisition module is used for controlling the infrared camera to acquire images of the circular infrared reflective identification target to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
the binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the light spot connected domain identification module is used for identifying the light spot connected domain in the infrared image after gray level binarization processing and detecting the shape of the light spot connected domain;
and the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain.
A third aspect of an embodiment of the present application provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the image denoising method of an infrared camera according to the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image denoising method of an infrared camera according to the first aspect.
A fifth aspect of embodiments of the present application provides a computer program product, which when run on a computer causes the computer to perform the image denoising method of an infrared camera according to the first aspect.
Compared with the prior art, the embodiment of the application has the following advantages:
According to the embodiments of the present application, the computer device can obtain an infrared image by controlling the infrared camera to capture an image of the circular infrared reflective identification target. According to the pixel value of each pixel point in the infrared image, the computer device can perform gray level binarization processing on the infrared image. By identifying the light spot connected domains in the binarized infrared image and detecting their shapes, the computer device can denoise the infrared image based on the shapes of the light spot connected domains. By performing gray level binarization processing on the infrared image and shape detection on the light spot connected domains, various kinds of noise can be accurately filtered out and ideal infrared reflective identification target light spots can be obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the following will briefly introduce the drawings that are required to be used in the embodiments or the description of the prior art. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is a schematic diagram of an infrared positioning tracker according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image denoising method of an infrared camera according to an embodiment of the present application;
FIG. 3 is a schematic illustration of an infrared image provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of an implementation of S201 in an image denoising method of an infrared camera according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an infrared image after gray level binarization processing according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an implementation of S203 in an image denoising method of an infrared camera according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a denoised infrared image according to an embodiment of the present application;
fig. 8 is an algorithm flow chart of an image denoising method of an infrared camera according to an embodiment of the present application;
fig. 9 is a schematic diagram of an image denoising apparatus of an infrared camera according to an embodiment of the present application;
fig. 10 is a schematic diagram of a computer device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
Fig. 1 is a schematic working diagram of an infrared positioning tracker according to an embodiment of the present application. The infrared positioning tracker in fig. 1 consists of an infrared light source and an infrared camera. The infrared light source consists of an infrared LED array arranged around the periphery of the infrared camera. In operation, the infrared light source emits light to illuminate a circular infrared reflective identification target (such as the surgical instrument identification target in fig. 1) on a surgical instrument or a lesion, and a reflective spot pattern of the infrared reflective identification target is formed in the camera image. The infrared camera in the infrared positioning tracker is realized by adding, in front of the camera, an infrared filter matched with the infrared light source, and is used for shooting the infrared reflective identification targets. Because the infrared reflectivity of the objects around the identification target is low, the image shot by the infrared camera contains an infrared reflective identification target spot pattern with high brightness and contrast. The position and posture parameters of the surgical instrument can be calculated by finding the center coordinates of the identification target spots, thereby realizing positioning.
However, since other objects in the surgical scene also cause reflection of infrared light to a certain extent, and ambient light is reflected by the various objects in the surgical scene, noise exists in the image captured by the infrared camera in addition to the light spot pattern of the infrared reflective identification target. To ensure the accuracy of positioning, this noise needs to be removed.
There are many existing algorithms for image filtering, such as mean filtering, median filtering and Gaussian filtering, and there are also many well-known edge detection algorithms, such as circle detection. However, when the infrared reflective identification target is tilted at an angle, it appears on the image plane as an ellipse in some direction. Due to the presence of noise, existing algorithms often find it difficult to detect the ellipse of the infrared reflective identification target and produce missed detections. Therefore, a key objective of the positioning navigation system is to remove the various interferences and provide an accurate, reliable and fast algorithm to complete the identification of the infrared reflective identification target.
The image denoising method of the infrared camera provided by the embodiments of the present application can detect circles and ellipses in the binarized infrared image. The method obtains a high-quality infrared image through a reasonably designed infrared reflective identification target and infrared camera; the infrared image is then preprocessed with a fast image filtering method, white light spot connected domains are detected by a region growing method, and the required circular and elliptical light spot connected domains are extracted using features such as the pixel count, shape and central moment of each connected domain, thereby completing accurate and reliable identification and detection of the infrared reflective identification target.
The technical scheme of the application is described below through specific examples.
Referring to fig. 2, a schematic diagram of an image denoising method for an infrared camera according to an embodiment of the present application is shown, where the method specifically includes the following steps:
s201, controlling an infrared camera to acquire images of round infrared reflective identification targets to obtain infrared images, wherein the infrared images comprise a plurality of pixel points, and each pixel point has a corresponding pixel value.
The method can be applied to computer equipment, that is, the execution subject of the embodiments of the present application can be a computer device. The computer device can be an electronic device capable of controlling the infrared camera to shoot the infrared reflective identification target and denoising the captured infrared image, for example a tablet computer, a notebook computer, a desktop computer or a cloud server; the specific type of the electronic device is not limited in the embodiments of the present application.
In one example of an embodiment of the application, the computer device may be a constituent unit of a positioning navigation system. Taking a surgical navigation system as an example, the computer device can form the surgical navigation system together with an infrared positioning tracker composed of an infrared light source and an infrared camera, the surgical instruments, the infrared reflective identification targets on the lesions, and the like.
During positioning and navigation, the computer device can control the infrared camera to shoot the infrared reflective identification target, thereby obtaining a corresponding infrared image. The infrared reflective identification target can be circular in shape, and the light spot it forms on the infrared image is usually circular or elliptical. Fig. 3 is a schematic diagram of an infrared image according to an embodiment of the present application; the infrared image shown in fig. 3 is an original, unprocessed infrared image captured by an infrared camera. As can be seen in fig. 3, the infrared image includes not only infrared reflective identification target spots, as in regions 301a, 301b, 301c and 301d, but also some noise, as in regions 302a and 302b.
In one possible implementation manner of the embodiment of the present application, as shown in fig. 4, controlling the infrared camera to perform image acquisition on the circular infrared reflective identification target to obtain the infrared image in S201 may specifically include the following substeps S2011-S2013:
s2011, controlling an infrared light source to irradiate the infrared reflective identification target.
In an embodiment of the application, the computer device may first control the infrared light source to illuminate the infrared reflective identification target. Thus, infrared light reflected by the infrared reflective marker target will form a reflected spot on the image of the infrared camera.
In a particular implementation, the computer device needs to determine the wavelength range of the infrared light emitted by the infrared light source before controlling the infrared light source to illuminate the infrared reflective identification target. In one example, the infrared light source of the positioning tracker may be chosen in the near infrared with as narrow a wavelength range as possible. For example, the infrared light wavelength range may be set between 780 and 3000 nanometers; specifically, infrared light with a wavelength of 850 nm may be selected for the infrared light source.
In the embodiment of the application, the infrared camera can be realized by adding an infrared filter matched with an infrared light source in front of the camera. Thus, after determining the infrared light wavelength range of the infrared light source, the wavelength at which the filter of the infrared camera transmits can be determined from the infrared light wavelength range. In general, the infrared camera filter should transmit wavelengths that are consistent with the center wavelength of the infrared light source, and the wavelength range of the transmitted infrared light should be as narrow as possible.
In addition, the light intensity of the infrared light source should be strong enough to ensure that the infrared reflective identification target reflects light spots of sufficient brightness.
S2012, determining target shooting parameters of the infrared camera, wherein the target shooting parameters comprise aperture size and exposure time of the infrared camera.
In general, the shooting parameters of a camera have an important influence on imaging quality. Therefore, in order to obtain a high-quality infrared image, the target shooting parameters of the infrared camera also need to be determined before controlling the infrared camera to shoot the infrared reflective identification target. The brightness of an image captured by a camera is closely related to the aperture size and the exposure time, so the target shooting parameters may include the aperture size and the exposure time of the infrared camera. In particular, the aperture of the infrared camera should not be too large, so that imaging quality can be ensured; the exposure time should also be kept appropriate to ensure a sufficiently fast image acquisition response.
Since the reflected light intensity of the infrared reflective identification target is generally higher than that of background objects, the aperture size and exposure time can be determined using a spot brightness threshold condition for the infrared reflective identification target: the aperture size and exposure time are adjusted until the pixel value within the infrared reflective identification target spot is just above a set threshold (e.g., 230), and the aperture size and exposure time are then fixed. In this way, the brightness of the infrared reflective identification target spot is ensured while the background noise pixel values are kept in a lower range.
In a specific implementation, the infrared camera can be controlled to shoot the infrared reflective identification target under a plurality of different shooting parameters, that is, under different aperture sizes and exposure times, obtaining an infrared image for each group of aperture size and exposure time. By analyzing the light spots in each group of infrared images, when the pixel value of the light spot formed by the infrared reflective identification target in the infrared image is greater than a first preset threshold and the difference obtained by subtracting the first preset threshold from that pixel value is smaller than a second preset threshold, the current shooting parameters can be determined as the target shooting parameters. For example, the first preset threshold may be 230; that is, the shooting parameters such as aperture size and exposure time are adjusted until the pixel value of the light spot formed by the infrared reflective identification target in the infrared image is greater than 230. At the same time, however, the pixel value of the light spot should not be too far above the threshold, for which the second preset threshold may be set, for example, to any value greater than 0 and less than 5. In this way, the pixel value of the spot can be controlled to be just above the set first preset threshold.
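The following is a minimal sketch of this parameter selection loop, assuming hypothetical helpers set_camera and capture for configuring and reading the infrared camera, and a boolean mask spot_mask marking the identification target spot region; the thresholds 230 and 5 follow the example values above.

```python
import numpy as np

def find_target_shooting_params(set_camera, capture, spot_mask, candidates,
                                first_threshold=230, second_threshold=5):
    """Try candidate (aperture, exposure) pairs and return the first pair for
    which the spot pixel value is just above the first preset threshold, i.e.
    above it by less than the second preset threshold."""
    for aperture, exposure in candidates:
        set_camera(aperture, exposure)            # hypothetical camera control
        image = capture()                         # grayscale uint8 frame
        spot_value = int(image[spot_mask].max())  # peak pixel value inside the spot
        if first_threshold < spot_value < first_threshold + second_threshold:
            return aperture, exposure
    return None  # no candidate satisfied the brightness condition
```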
S2013, controlling the infrared camera to shoot the infrared reflective identification target under the target shooting parameters to obtain the infrared image.
After the wavelength range of the infrared light source and the target shooting parameters of the infrared camera are determined, the computer equipment can control the infrared camera to adjust to the target shooting parameters, and then the infrared reflective identification target is shot under the target shooting parameters to obtain an infrared image. The infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value.
S202, gray level binarization processing is carried out on the infrared image according to the pixel value of each pixel point.
In the embodiment of the present application, gray level binarization processing can be performed on the infrared image according to the pixel value of each pixel point, and the infrared image after gray level binarization processing contains only two colors: all pixel point areas covered by light spots share one color, and all other pixel point areas not covered by light spots share the other color.
In the embodiment of the application, when gray level binarization processing is performed on an infrared image, the pixel value of each pixel point with the pixel value smaller than or equal to a preset pixel threshold value can be assigned as a first numerical value; and assigning the pixel value of each pixel point with the pixel value larger than the preset pixel threshold value to be a second value, thereby realizing the gray level binarization processing of the infrared image.
In an example of the embodiment of the present application, the first value may be 0, the second value may be 255, and the preset pixel threshold may be 230. That is, the pixel value of each pixel having a pixel value higher than 230 in the infrared image is assigned 255, and the pixel values of the other pixels are assigned 0. After such processing, the pixel values of most background areas in the infrared image will become 0, i.e. black; while the white area is mainly the light spot range of the infrared reflective identification target.
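As a concrete illustration of this thresholding step, a minimal sketch in Python with NumPy follows, assuming the infrared image is a single-channel 8-bit array; the function name and the default threshold of 230 follow the example above.

```python
import numpy as np

def binarize_infrared(image: np.ndarray, threshold: int = 230) -> np.ndarray:
    """Gray level binarization: pixels whose value exceeds the preset pixel
    threshold are assigned 255 (white), all other pixels are assigned 0 (black)."""
    binary = np.zeros_like(image, dtype=np.uint8)
    binary[image > threshold] = 255
    return binary
```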
Fig. 5 is a schematic diagram of an infrared image after gray level binarization processing according to an embodiment of the present application. In fig. 5, most of the background areas are black, for example, areas 501, which are pixel areas with pixel values assigned to 0; while a small portion of the areas are white, such as the white areas in areas 501a, 501b, 501c and 501d, which are pixel areas where the pixel value is assigned to 255, i.e., the spot areas formed by the infrared retroreflective signage target in the infrared image. Compared to the original infrared image in fig. 3, the noise in the areas 302a and 302b in fig. 3 is removed after the gray level binarization process.
Typically, the light spots formed by the infrared reflective identification target cover only a small area of the infrared image, and the sum of the areas of all the light spots generally does not exceed 3% of the total image area. Therefore, after gray level binarization processing, the computer device may calculate the area covered by pixel points whose value is the second value, that is, the area of the white spot regions (pixel value 255) in the binarized infrared image. If this area is larger than a first preset area threshold (e.g., 3% of the image), the currently captured infrared image can be considered an abnormal image. For example, when a human body approaches the near infrared light source or the infrared camera, or another object blocks the infrared camera, large areas of high-brightness interference are produced, so that the spot area exceeds 3% of the total image area. In that case, the computer device may discard the infrared image, skip the subsequent processing, and re-acquire an infrared image.
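A minimal sketch of this abnormal-image check follows, assuming the binarized image from the previous step; the 3% limit matches the first preset area threshold in the example.

```python
import numpy as np

def is_abnormal_image(binary: np.ndarray, max_white_fraction: float = 0.03) -> bool:
    """Return True when the white (value 255) area exceeds the first preset area
    threshold, e.g. when a person or object near the camera produces large
    high-brightness interference; such frames are discarded and re-acquired."""
    white_fraction = np.count_nonzero(binary == 255) / binary.size
    return white_fraction > max_white_fraction
```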
In general, after gray level binarization of the infrared image, fine particles and thin line-shaped white noise may remain in the image in addition to the white infrared reflective identification target spots, for example the linear spots and particle spots shown in region 501b in fig. 5. A particle spot may be a fine spot whose area is smaller than a second preset area threshold. For this purpose, the computer device may employ an image erosion operation to remove this type of noise, so that, as far as possible, only the infrared reflective identification target spots remain white in the image.
In a specific implementation, the computer device can identify linear light spots and particle light spots in the binarized infrared image, and then delete the linear light spots and particle light spots by performing image erosion on the binarized infrared image.
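A minimal sketch of this erosion step with OpenCV follows; the 3x3 kernel and single iteration are assumptions, since the text does not specify the structuring element.

```python
import cv2
import numpy as np

def remove_fine_noise(binary: np.ndarray) -> np.ndarray:
    """Remove fine particle and thin line-shaped white noise by image erosion;
    the larger spots formed by the identification target survive the operation."""
    kernel = np.ones((3, 3), np.uint8)  # assumed structuring element
    return cv2.erode(binary, kernel, iterations=1)
```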
S203, identifying a light spot connected domain in the infrared image after gray level binarization processing, and detecting the shape of the light spot connected domain.
In the embodiment of the application, the processes of gray level binarization processing, image corrosion processing and the like of the infrared image in the previous steps can be regarded as the preprocessing process of the infrared image. The computer device can further denoise the image by identifying the shape of the light spot in the infrared image obtained after the preprocessing.
In the embodiment of the application, the computer equipment can identify the light spot connected domain in the infrared image after gray level binarization processing.
In a possible implementation manner of the embodiment of the present application, as shown in fig. 6, identifying a spot connected domain in an infrared image after gray level binarization processing in S203 may specifically include the following substeps S2031-S2032:
s2031, determining the number of pixel points covered by light spots in the infrared image after gray level binarization processing and plane coordinates.
S2032, identifying the light spot connected domain according to the number of the pixel points covered by the light spot and the plane coordinates.
In a specific implementation, a two-dimensional array corresponding to the infrared image may first be generated. The two-dimensional array has the same size as the infrared image and contains a plurality of flag bits, each flag bit corresponding to a pixel point in the binarized infrared image. The initial value of each flag bit in the two-dimensional array may be 0.
Then, each pixel point in the binarized infrared image is traversed, and if the pixel value of a pixel point is not 0, the flag bit corresponding to that pixel point in the two-dimensional array is set to 1. After the traversal of all pixel points is completed, the number of pixel points covered by light spots can be obtained by counting the flag bits whose value has been changed to 1, and the plane coordinates of the pixel points covered by the light spots can be determined from the positions of those flag bits in the two-dimensional array.
In this way, after gray level binarization processing, the computer device can calculate the number of pixel points in each white light spot connected domain (pixels whose value is 255) and the plane coordinates of each such pixel point.
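The following is a minimal sketch of this connected domain extraction, combining the flag-array bookkeeping described above with the four-neighborhood region growing described later in the algorithm flow (S803); the breadth-first traversal is an implementation choice, not mandated by the text.

```python
from collections import deque
import numpy as np

def find_spot_connected_domains(binary: np.ndarray):
    """Region growing over the binarized image: a flag array of the same size
    marks visited white pixels; each unvisited white pixel seeds a growth over
    its four-neighborhood, yielding one connected domain (a list of (x, y)
    pixel coordinates) per white light spot."""
    h, w = binary.shape
    flags = np.zeros((h, w), dtype=np.uint8)   # 0 = unvisited, 1 = visited
    domains = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] != 0 and flags[y, x] == 0:
                queue = deque([(x, y)])
                flags[y, x] = 1
                pixels = []
                while queue:
                    cx, cy = queue.popleft()
                    pixels.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h and binary[ny, nx] != 0 and flags[ny, nx] == 0:
                            flags[ny, nx] = 1
                            queue.append((nx, ny))
                domains.append(pixels)
    return domains
```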
After the processing of S2031-S2032, the computer device obtains all the light spot connected domains in the entire infrared image. However, not all of these light spot connected domains are spots formed by the infrared reflective identification target; some may be noise, and the computer device needs to remove them.
In the embodiment of the application, the moving area range and the moving angle of the infrared reflective identification target can be determined, and the number range of the pixel points covered by the light spots formed by the infrared reflective identification target in the infrared image can be determined according to the moving area range and the moving angle.
Specifically, when the distance and angle of the infrared reflective identification target relative to the infrared camera are changed, the area size of the infrared reflective identification target in the infrared image is correspondingly changed, so that the quantity change range of all pixel points covered by white light spots formed by the infrared reflective identification target in the infrared image can be determined by observing the characteristics of the infrared reflective identification target light spots and analyzing according to the possible moving area range and angle of the infrared reflective identification target in the positioning navigation process.
Then, for any light spot, if the number of pixel points covered by the light spot is not in the number range, the light spot can be considered as noise, and the computer device can remove the light spot.
It should be noted that the range of the number of pixel points covered by a light spot, obtained by analyzing the possible moving area and angle of the infrared reflective identification target, can be a specific numerical interval. When the computer device calculates that the number of pixel points covered by a light spot connected domain is larger than the upper limit of this interval or smaller than its lower limit, the connected domain does not satisfy the number range constraint, and the computer device can judge the corresponding light spot as noise and remove it.
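A minimal sketch of this pixel-count filter follows, assuming the connected domains produced by the region growing step above; the bounds min_pixels and max_pixels stand in for the interval derived from the target's possible movement range and angle.

```python
def filter_by_pixel_count(domains, min_pixels, max_pixels):
    """Keep only connected domains whose pixel count lies inside the allowed
    interval; domains outside the interval are treated as noise and removed."""
    return [d for d in domains if min_pixels <= len(d) <= max_pixels]
```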
The computer device may then perform shape detection on the remaining light spot connected domains.
In a specific implementation, since the infrared reflective identification target spot is formed by infrared light reflected from the circular infrared reflective identification target, it appears as an approximately circular or elliptical spot in the infrared image captured by the infrared camera. The computer device can therefore perform shape detection of the infrared reflective identification target based on this elliptical feature.
In a possible implementation manner of the embodiment of the present application, for any light spot connected domain, a rectangular area may be generated from the plane coordinates of the pixel points in the connected domain. For example, the spans of the horizontal axis coordinates and of the vertical axis coordinates of the pixel points in the connected domain can be taken as the width and height of the rectangle.
Then, half of the diagonal length of this rectangular area can be taken as a radius r, and the connected domain is constrained by the ratio of its pixel count to the radius r. If the light spot connected domain satisfies the constraint Tail / (2r) > 0.65 * r, its shape can be judged to be circular or elliptical, where Tail is the number of pixel points in the light spot connected domain.
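A minimal sketch of this bounding-rectangle test follows, applied to one connected domain as produced above; it implements the constraint Tail / (2r) > 0.65 * r as stated, with r taken as half the rectangle diagonal.

```python
import math

def is_round_or_elliptical(domain) -> bool:
    """Bounding-rectangle shape test: r is half the diagonal of the rectangle
    spanned by the domain's pixel coordinates, Tail is the pixel count, and the
    domain is accepted as circular or elliptical when Tail / (2r) > 0.65 * r."""
    xs = [p[0] for p in domain]
    ys = [p[1] for p in domain]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    r = math.hypot(width, height) / 2.0
    tail = len(domain)
    return r > 0 and tail / (2.0 * r) > 0.65 * r
```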
In another possible implementation manner of the embodiment of the present application, a central moment method may also be used to detect the shape of the light spot connected domain. The central moment method is a quantitative calculation method.
In a specific implementation, for any light spot connected domain, a center point of the light spot connected domain may be calculated according to plane coordinates of each pixel point in the light spot connected domain.
It should be noted that the plane coordinates of each pixel point may include a horizontal axis coordinate and a vertical axis coordinate. When calculating the center point of the light spot connected domain from the plane coordinates of each pixel point, the sum of the horizontal axis coordinates and the sum of the vertical axis coordinates of all pixel points in the light spot connected domain may first be calculated. A first ratio of the sum of the horizontal axis coordinates to the number of pixel points in the light spot connected domain and a second ratio of the sum of the vertical axis coordinates to the number of pixel points in the light spot connected domain are then calculated; the first ratio and the second ratio form the horizontal axis coordinate and the vertical axis coordinate of the center point of the light spot connected domain, respectively.
Specifically, the center point of the light spot connected domain may be calculated using the following formula:

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$$

where $x_i$ and $y_i$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the pixel point with serial number i in a given light spot connected domain; $\bar{x}$ and $\bar{y}$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the center point of the light spot connected domain; and n is the number of pixel points in the light spot connected domain.
And then, according to the calculated central point, calculating the secondary mean square central moment of the light spot connected domain.
Specifically, for any pixel point in the light spot connected domain, the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the center point is calculated to obtain a first square value, and the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the center point is calculated to obtain a second square value. The first square values and the second square values of all pixel points in the light spot connected domain are then added to obtain a sum of squares, and the ratio of this sum of squares to the number of pixel points in the light spot connected domain is used as the secondary mean square central moment of the light spot connected domain.
In one example, the secondary mean square central moment of the light spot connected domain can be calculated using the following formula:

$$J = \frac{1}{n}\sum_{i=1}^{n}\left[(x_i - \bar{x})^2 + (y_i - \bar{y})^2\right]$$

where J is the secondary mean square central moment of the light spot connected domain.
After the secondary mean square central moment of the light spot connected domain is obtained, the computer device can determine the shape of the light spot connected domain according to this central moment.
In a possible implementation manner of the embodiment of the application, when the computer device determines the shape of the light spot connected domain from the calculated secondary mean square central moment, the central moment threshold range of the light spot connected domain can be reasonably limited according to the possible positions and angle range of the infrared reflective identification target.
In a specific implementation, the computer device can determine the central moment threshold range of the light spot connected domain formed by the infrared reflective identification target in the infrared image, that is, set the minimum value $J_{\min}$ and the maximum value $J_{\max}$ of the secondary mean square central moment of the light spot connected domain according to the possible positions and angle range of the infrared reflective identification target.
For any light spot connected domain, if its secondary mean square central moment lies within the central moment threshold range, the computer device can judge that the shape of the light spot connected domain is circular or elliptical. Otherwise, the connected domain does not satisfy the central moment threshold constraint, and the computer device may treat it as noise and remove it.
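A minimal sketch of this quantitative test follows, operating on one connected domain as produced above; the bounds j_min and j_max stand in for the central moment threshold range derived from the target's possible positions and angles, and are assumptions here.

```python
def passes_central_moment_test(domain, j_min, j_max) -> bool:
    """Compute the center point of the connected domain and its secondary mean
    square central moment J, and accept the domain as circular or elliptical
    when J lies inside the preset range [j_min, j_max]."""
    n = len(domain)
    cx = sum(p[0] for p in domain) / n   # center point, horizontal axis
    cy = sum(p[1] for p in domain) / n   # center point, vertical axis
    j = sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in domain) / n
    return j_min <= j <= j_max
```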
S204, denoising the infrared image based on the shape of the light spot connected domain.
In the embodiment of the application, after the processing and the detection of each step, the computer equipment can remove the light spot connected domain with a shape other than a circle or an ellipse.
Fig. 7 is a schematic diagram of a denoised infrared image according to an embodiment of the present application. Referring to regions 701a, 701b, 701c, and 701d in fig. 7, and comparing fig. 3 and 5, it can be seen that noise in fig. 3 and 5 has been removed.
In the embodiment of the present application, the computer device can obtain an infrared image by controlling the infrared camera to capture an image of the circular infrared reflective identification target. According to the pixel value of each pixel point in the infrared image, the computer device can perform gray level binarization processing on the infrared image. By identifying the light spot connected domains in the binarized infrared image and detecting their shapes, the computer device can denoise the infrared image based on the shapes of the light spot connected domains. By performing gray level binarization processing on the infrared image and shape detection on the light spot connected domains, various kinds of noise can be accurately filtered out and ideal infrared reflective identification target light spots can be obtained.
The image denoising method of the infrared camera provided by the embodiments of the present application can be applied to various fields requiring positioning and tracking, such as surgical navigation, surgical robot navigation control, scientific research and teaching, motion analysis, and virtual reality. Taking the positioning navigation of an orthopedic surgical robot as an example, the orthopedic surgical robot assists a doctor in performing an operation, and navigation control is a key technology. Before the operation, the doctor completes the operation planning and the design of the surgical route based on CT, MRI and other images and the diagnosis results. During the operation, infrared reflective identification targets are fixed on the patient's bone (beside the lesion) and on the robot's operating arm, so the positioning navigation system can obtain, through the positioning tracker, the position and movement track of the robot relative to the lesion and complete the operation (such as cutting and grinding) according to the pre-operatively planned route. In this process, the positioning navigation system can use the image denoising method of the infrared camera provided by the embodiments of the present application to denoise the infrared images acquired by the infrared camera, thereby improving the positioning accuracy and stability of the positioning tracker and supporting the success of the operation.
It should be noted that, the sequence number of each step in the above embodiment does not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not limit the implementation process of the embodiment of the present application in any way.
For easy understanding, a description will be given below of an image denoising method of an infrared camera according to an embodiment of the present application, with a complete example.
Fig. 8 is a schematic algorithm flow chart of an image denoising method for an infrared camera according to an embodiment of the present application, where the algorithm flow shown in fig. 8 may be applied to the image denoising process of the positioning tracker shown in fig. 1. The position tracker in fig. 1 includes an infrared light source and an infrared camera. The infrared light source comprises an annular array of infrared LEDs surrounding an infrared camera and emits light having a specific wavelength (e.g., 850 nm). After the infrared light source irradiates the infrared reflective identification target, an infrared light reflective spot (infrared reflective identification target spot) is formed. The front of the infrared camera lens is provided with a transmission narrow-band filter with the same optical wavelength range as the infrared light source LED, so that the infrared camera lens can only respond to the infrared light of a specific LED, and the background light with other wavelengths is filtered. Thus, the position tracker obtains a reflected light spot of the infrared reflective marker target, while the background is at a relatively low brightness.
Specifically, according to the algorithm flow shown in fig. 8, the implementation of denoising the image of the infrared camera mainly includes the following steps:
S801, camera aperture size and exposure time adjustment: the infrared image collected by the infrared camera is a grayscale image, and the aperture size and exposure time of the camera are adjusted appropriately. The aperture must not be too large, so as to ensure imaging quality; the exposure time must also be reasonable to ensure a sufficiently fast image acquisition response. In this example, a brightness threshold condition for the identification target spot is used to determine the aperture size and exposure time: the aperture size and exposure time are adjusted until the pixel gray value within the identification target spot is just above the set threshold (e.g., 230), and the aperture size and exposure time are then fixed. In this way, the brightness of the identification target spot is ensured while the background noise brightness (pixel value) is limited to a lower range.
S802, image gray level binarization processing and removal of large and small noise: pixels whose value (gray scale range 0-255) is above a certain threshold (e.g., 230) are assigned 255, and the others are assigned 0. After this processing, most background pixel values normally become 0, i.e. black; the white areas are mainly the spot regions of the target, and the area fraction they occupy is typically low (< 3%).
Removing large interference: when a human body approaches the near infrared light source or the infrared camera, or an object blocks the infrared camera, a large high-brightness interference area appears in the infrared image. After image binarization, it may be specified that when the white area is greater than 3% of the whole image area, the situation is abnormal, and the computer device discards the subsequent processing and re-acquires the image until the condition is satisfied.
Removing fine noise: after gray level binarization processing, fine particles and fine line-shaped white noise may exist in the image in addition to the white mark target light spot. For this reason, image erosion operations may be employed in this example to remove this type of noise, leaving only the identified target spot in the image as much as possible.
S803, acquiring the identification target spot regions by a region growing algorithm: first, a two-dimensional array of the same size as the image is defined to store the flag bits, and all flag bits are initialized to 0. Each pixel in the image is traversed; if its value is not 0, the pixel count of the current region is incremented by 1 under the region's serial number and the pixel's coordinates are recorded. The pixel's four-neighborhood is then traversed, incrementing the count by 1 each time a non-zero pixel is detected, then the four-neighborhoods of those pixels are traversed, and so on. Finally, the number of pixel points of every white connected domain and the plane coordinates of each pixel point are obtained.
S804, judging the shape of the connected domain: based on an analysis of the moving area of the infrared reflective identification target, the possible range of the total pixel count of an identification target spot is determined, and the pixel count of each connected domain is constrained to this range; a connected domain whose count is above or below the range is considered noise interference and removed. Next, all pixel coordinates of each connected domain are traversed, the differences between the maximum and minimum x and y coordinates within the connected domain are taken, and the diagonal length of the rectangle formed by these two differences is calculated. Finally, the connected domain is constrained by the formula Tail / (2r) > 0.65 * r, where Tail represents the number of pixels in the connected domain and r represents half of the diagonal length. When the connected domain satisfies this condition, it can be considered circular or elliptical, that is, an identification target spot; otherwise it is removed.
S805, quantitatively judging the circle-like or ellipse-like shape by using the central moment method: first, the center coordinates of each connected domain are calculated:

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i$$

where $x_i$ and $y_i$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the pixel point with serial number i in a given connected domain; $\bar{x}$ and $\bar{y}$ respectively represent the horizontal axis coordinate and the vertical axis coordinate of the center point of the connected domain; and n is the number of pixel points in the connected domain.

The secondary mean square central moment J is then calculated as:

$$J = \frac{1}{n}\sum_{i=1}^{n}\left[(x_i - \bar{x})^2 + (y_i - \bar{y})^2\right]$$
further, the spot condition of the identification target at the possible position and angle is calculated, a reasonable central moment range threshold is set, each connected domain is limited, and noise is considered as being unsatisfied with the set central moment threshold and is removed.
After the processing of the steps, the computer equipment can filter various noises to obtain ideal mark target spots.
The embodiments of the present application target image acquisition by the infrared camera and realize detection of the reflection spots of the circular identification target under illumination by the infrared light source. Because human bodies, instruments and background objects reflect infrared light, and interference noise from other unknown light sources may also exist, the various detection algorithms in the prior art have difficulty accurately identifying the identification target spots, and missed detections and large deviations easily occur. In the embodiments of the present application, an algorithm is designed around the characteristic that the reflection spot of the identification target presents an ellipse, various kinds of noise are removed, and the identification target is detected more accurately and completely.
The embodiments of the present application adopt an image detection algorithm based on region growing: the acquired original image is first binarized and preprocessed, the white pixel points are then connected by the region growing algorithm to form block connected domains, and the number of pixel points in each connected domain is calculated. Finally, an ellipse filtering algorithm is applied to obtain the required feature points. The ellipse detection algorithm of the embodiments of the present application can accurately identify most ellipses and accurately calculate their center coordinates, ensuring the accuracy of the computation. The embodiments of the present application can not only remove the interference caused by the human body and its motion to the detection of the identification target in the scene, but also filter out the interference of surrounding objects; meanwhile, the denoising method provided by the embodiments of the present application is fast and can guarantee the real-time operation of the overall algorithm.
Referring to fig. 9, a schematic diagram of an image denoising apparatus for an infrared camera according to an embodiment of the present application is shown, where the apparatus may specifically include an image acquisition module 901, a binarization processing module 902, a spot connected domain identification module 903, and a denoising module 904, where:
the image acquisition module 901 is used for controlling the infrared camera to acquire an image of a circular infrared reflection identification target, so as to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, and each pixel point has a corresponding pixel value;
A binarization processing module 902, configured to perform gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the spot connected domain identification module 903 is configured to identify a spot connected domain in the infrared image after gray level binarization processing, and detect a shape of the spot connected domain;
and the denoising module 904 is used for denoising the infrared image based on the shape of the light spot connected domain, thereby determining the identification target.
In the embodiment of the present application, the image acquisition module 901 may specifically be configured to: controlling an infrared light source to irradiate the infrared reflective identification target; determining target shooting parameters of the infrared camera, wherein the target shooting parameters comprise aperture size and exposure time of the infrared camera; and controlling the infrared camera to shoot the infrared reflective identification target under the target shooting parameters to obtain the infrared image.
In an embodiment of the present application, the image acquisition module 901 may further be configured to: controlling the infrared camera to shoot the infrared reflective identification target under a plurality of different shooting parameters; and when the pixel value of a light spot formed by the infrared reflective identification target in the infrared image is larger than a first preset threshold value and the difference value obtained by subtracting the first preset threshold value from the pixel value of the light spot is smaller than a second preset threshold value, determining the current shooting parameter as the target shooting parameter.
In an embodiment of the present application, the image acquisition module 901 may further be configured to: determining an infrared light wavelength range of the infrared light source; determining the wavelength transmitted by an optical filter of the infrared camera according to the infrared light wavelength range; wherein the wavelength of the light transmitted by the optical filter is consistent with the central wavelength of the infrared light source.
In the embodiment of the application, the infrared light wavelength range is 780-3000 nanometers.
In an embodiment of the present application, the binarization processing module 902 may specifically be configured to: assigning the pixel value of each pixel point with the pixel value smaller than or equal to a preset pixel threshold value to be a first numerical value; and assigning the pixel value of each pixel point with the pixel value larger than the preset pixel threshold value to a second numerical value; wherein the first value is 0 and the second value is 255.
In an embodiment of the present application, the binarization processing module 902 may be further configured to: calculating the area covered by the pixel point with the pixel value of the second numerical value in the infrared image after gray level binarization processing; and discarding the infrared image if the area ratio of the infrared image is larger than a first preset area threshold value.
In an embodiment of the present application, the binarization processing module 902 may be further configured to: recognizing linear light spots and particle light spots in the infrared image after gray level binarization processing, wherein the area of the particle light spots is smaller than a second preset area threshold; and performing image corrosion on the infrared image subjected to gray level binarization processing to delete the linear light spots and the particle light spots.
In the embodiment of the present application, the spot connected domain identification module 903 may specifically be configured to: determining the number of pixel points covered by light spots in the infrared image after gray level binarization processing and their plane coordinates; and identifying the light spot connected domain according to the number of pixel points covered by the light spots and the plane coordinates.
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: generating a two-dimensional array corresponding to the infrared image, wherein the two-dimensional array comprises a plurality of flag bits, each flag bit corresponds to a pixel point in the infrared image after gray level binarization processing, and the initial value of each flag bit is 0; traversing each pixel point in the infrared image after gray level binarization processing; for any pixel point, if the pixel value of the pixel point is not 0, modifying the flag bit corresponding to the pixel point in the two-dimensional array to 1; counting the number of flag bits whose initial value has been modified to 1 in the two-dimensional array to obtain the number of pixel points covered by light spots; and determining the plane coordinates of the pixel points covered by the light spots according to the positions of the flag bits whose initial value has been modified to 1 in the two-dimensional array.
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: determining the moving area range and the moving angle of the infrared reflective identification target; determining the number range of pixel points covered by a light spot formed by the infrared reflective identification target in the infrared image according to the moving area range and the moving angle; and for any light spot, if the number of pixel points covered by the light spot is not within the number range, removing the light spot.
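Expressed as a sketch (the number range would be derived beforehand from the marker's moving area and angle; here it is simply passed in as two assumed bounds):

    def filter_spots_by_pixel_count(spot_list, count_min, count_max):
        # spot_list: list of connected domains, each given as a list of (x, y) pixel coordinates
        return [spot for spot in spot_list if count_min <= len(spot) <= count_max]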
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: for any light spot connected domain, generating a rectangular region according to the plane coordinates of each pixel point in the light spot connected domain; if Tail/(2r) > 0.65r, judging that the shape of the light spot connected domain is circular or elliptical; wherein Tail is the number of pixel points in the light spot connected domain, and r is half of the diagonal length of the rectangular region.
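The bounding-rectangle test can be sketched as follows; coordinates are assumed to be (x, y) tuples, and 0.65 is the factor quoted above:

    import math

    def is_round_by_bounding_rect(spot_coords, factor=0.65):
        xs = [p[0] for p in spot_coords]
        ys = [p[1] for p in spot_coords]
        width = max(xs) - min(xs) + 1
        height = max(ys) - min(ys) + 1
        r = math.hypot(width, height) / 2.0   # half of the bounding rectangle's diagonal length
        tail = len(spot_coords)               # number of pixel points in the connected domain
        return tail / (2.0 * r) > factor * r  # circular or elliptical if the inequality holds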
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: for any light spot connected domain, calculating the center point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain; calculating the second-order mean-square central moment of the light spot connected domain according to the center point; and determining the shape of the light spot connected domain according to the second-order mean-square central moment.
In the embodiment of the present application, the plane coordinates include a horizontal axis coordinate and a vertical axis coordinate, and the spot connected domain identification module 903 may be further configured to: respectively calculating the sum of the horizontal axis coordinates and the sum of the vertical axis coordinates of the pixel points in the light spot connected domain; and respectively calculating a first ratio of the sum of the horizontal axis coordinates to the number of pixel points in the light spot connected domain and a second ratio of the sum of the vertical axis coordinates to the number of pixel points in the light spot connected domain, wherein the first ratio and the second ratio are the horizontal axis coordinate and the vertical axis coordinate of the center point, respectively.
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: for any pixel point in the light spot connected domain, calculating the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the center point to obtain a first square value, and calculating the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the center point to obtain a second square value; adding the first square values and the second square values of all the pixel points in the light spot connected domain to obtain a sum of squares; and taking the ratio of the sum of squares to the number of pixel points in the light spot connected domain as the second-order mean-square central moment of the light spot connected domain.
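Putting the last three steps together, a sketch of the center point and second-order mean-square central moment computation (again assuming (x, y) coordinate tuples, not part of the patented method):

    def mean_square_central_moment(spot_coords):
        n = len(spot_coords)
        cx = sum(p[0] for p in spot_coords) / n   # first ratio: horizontal axis coordinate of the center
        cy = sum(p[1] for p in spot_coords) / n   # second ratio: vertical axis coordinate of the center
        # sum of squared distances of every pixel from the center point
        square_sum = sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in spot_coords)
        return square_sum / n                     # second-order mean-square central moment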
In the embodiment of the present application, the spot connected domain identification module 903 may be further configured to: determining a central moment threshold range of the light spot connected domain formed by the infrared reflective identification target in the infrared image; and for any light spot connected domain, if the second-order mean-square central moment of the light spot connected domain is within the central moment threshold range, judging that the shape of the light spot connected domain is circular or elliptical.
In an embodiment of the present application, the denoising module 904 may specifically be configured to: removing each light spot connected domain whose shape is neither circular nor elliptical.
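Combining the two shape tests, the denoising step might be sketched as below; it reuses the is_round_by_bounding_rect and mean_square_central_moment helpers from the sketches above, and the moment threshold range is a placeholder that would in practice come from the calibration described earlier.

    def denoise_spots(spot_list, moment_range):
        # keep only connected domains judged circular or elliptical by either test
        low, high = moment_range
        kept = []
        for coords in spot_list:
            moment = mean_square_central_moment(coords)
            if is_round_by_bounding_rect(coords) or low <= moment <= high:
                kept.append(coords)
        return kept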
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively brief; for relevant details, reference may be made to the description of the method embodiments.
Referring to fig. 10, a schematic diagram of a computer device according to an embodiment of the present application is shown. As shown in fig. 10, a computer device 1000 in an embodiment of the present application includes: a processor 1010, a memory 1020 and a computer program 1021 stored in the memory 1020 and executable on the processor 1010. The processor 1010 performs steps of the image denoising method of the infrared camera described above, such as steps S101 to S105 shown in fig. 1, when executing the computer program 1021. Alternatively, the processor 1010 may perform the functions of the modules/units in the above-described device embodiments when executing the computer program 1021, for example, the functions of the modules 901 to 904 shown in fig. 9.
Illustratively, the computer program 1021 may be partitioned into one or more modules/units that are stored in the memory 1020 and executed by the processor 1010 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing particular functions to describe the execution of the computer program 1021 in the computer device 1000. For example, the computer program 1021 may be divided into an image acquisition module, a binarization processing module, a spot connected domain identification module, and a denoising module, where each module specifically functions as follows:
the image acquisition module is used for controlling the infrared camera to acquire images of the circular infrared reflective identification targets to obtain infrared images, wherein the infrared images comprise a plurality of pixel points, and each pixel point has a corresponding pixel value;
the binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the light spot connected domain identification module is used for identifying the light spot connected domain in the infrared image after gray level binarization processing and detecting the shape of the light spot connected domain;
And the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain, so as to determine the identification target.
The computer device 1000 may be the computer device in the foregoing embodiments, and the computer device 1000 may be a desktop computer, a cloud server, or the like. The computer device 1000 may include, but is not limited to, the processor 1010 and the memory 1020. It will be appreciated by those skilled in the art that fig. 10 is only one example of the computer device 1000 and is not intended to limit it; the computer device 1000 may include more or fewer components than shown, may combine certain components, or may have different components. For example, the computer device 1000 may also include input and output devices, network access devices, buses, and the like.
The processor 1010 may be a central processing unit (Central Processing Unit, CPU), but may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 1020 may be an internal storage unit of the computer device 1000, such as a hard disk or a memory of the computer device 1000. The memory 1020 may also be an external storage device of the computer device 1000, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the computer device 1000. Further, the memory 1020 may also include both internal and external storage units of the computer device 1000. The memory 1020 is used to store the computer program 1021 and other programs and data required by the computer device 1000. The memory 1020 may also be used to temporarily store data that has been output or is to be output.
The embodiment of the application also discloses a computer device which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor realizes the image denoising method of the infrared camera according to the previous embodiments when executing the computer program.
The embodiment of the application also discloses a computer readable storage medium, which stores a computer program, and the computer program realizes the image denoising method of the infrared camera according to the previous embodiments when being executed by a processor.
The embodiment of the application also discloses a computer program product which, when run on a computer, causes the computer to execute the image denoising method of the infrared camera in each embodiment.
Various functions implemented by the computer device in the embodiments of the present application may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as a stand-alone product. Based on such understanding, the present application may implement all or part of the flow of the above-described method embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program may implement the steps of the above-described method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the apparatus/computer device, a recording medium, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not include electrical carrier signals and telecommunications signals.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/computer device and method may be implemented in other manners. For example, the apparatus/computer device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solutions of the present application, and are not intended to limit them. Although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (14)

1. An image denoising method of an infrared camera, comprising:
in the positioning and navigation process of an orthopedic surgical robot, controlling an infrared camera to acquire an image of a circular infrared reflective identification target under a target shooting parameter to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, each pixel point has a corresponding pixel value, the target shooting parameter is the shooting parameter under which the pixel value of a light spot formed by the infrared reflective identification target in the infrared image is larger than a first preset threshold and the difference obtained by subtracting the first preset threshold from the pixel value of the light spot is smaller than a second preset threshold, the second preset threshold is larger than 0 and smaller than 5, and the infrared reflective identification target is fixed beside the lesion and on an operating arm of the orthopedic surgical robot;
According to the pixel value of each pixel point, carrying out gray level binarization processing on the infrared image;
determining the number and plane coordinates of the pixel points covered by light spots in the infrared image after gray level binarization processing;
identifying a light spot connected domain according to the number and plane coordinates of the pixel points covered by the light spots, and detecting the shape of the light spot connected domain;
denoising the infrared image based on the shape of the light spot connected domain;
wherein the performing gray level binarization processing on the infrared image according to the pixel value of each pixel point comprises:
assigning the pixel value of each pixel point with the pixel value smaller than or equal to a preset pixel threshold value to 0; and assigning a pixel value of each pixel point with a pixel value greater than the preset pixel threshold value to 255, wherein the preset pixel threshold value is the same as the first preset threshold value;
the determining the number and the plane coordinates of the pixel points covered by the light spots in the infrared image after the gray level binarization processing comprises the following steps:
generating a two-dimensional array corresponding to the infrared image, wherein the two-dimensional array comprises a plurality of flag bits, each flag bit corresponds to a pixel point in the infrared image after gray level binarization processing, and the initial value of each flag bit is 0;
Traversing each pixel point in the infrared image after gray level binarization processing;
for any pixel point, if the pixel value of the pixel point is not 0, modifying the flag bit corresponding to the pixel point in the two-dimensional array to be 1;
counting the number of flag bits whose initial value has been modified to 1 in the two-dimensional array to obtain the number of pixel points covered by the light spots;
determining the plane coordinates of the pixel points covered by the light spots according to the positions of the flag bits whose initial value has been modified to 1 in the two-dimensional array;
the detecting the shape of the light spot connected domain comprises:
for any light spot connected domain, generating a rectangular region according to the plane coordinates of each pixel point in the light spot connected domain; if Tail/(2r) > 0.65r, judging that the shape of the light spot connected domain is circular or elliptical; wherein Tail is the number of pixel points in the light spot connected domain, and r is half of the diagonal length of the rectangular region; or,
for any light spot connected domain, calculating the center point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain; for any pixel point in the light spot connected domain, calculating the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the center point to obtain a first square value, and calculating the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the center point to obtain a second square value; adding the first square values and the second square values of all the pixel points in the light spot connected domain to obtain a sum of squares; taking the ratio of the sum of squares to the number of pixel points in the light spot connected domain as the second-order mean-square central moment of the light spot connected domain; and determining the shape of the light spot connected domain according to a central moment threshold range of the second-order mean-square central moment, wherein the central moment threshold range is determined according to the range of positions and angles at which the infrared reflective identification target can appear.
2. The method according to claim 1, wherein the controlling the infrared camera to perform image acquisition on the circular infrared reflective identification target under the target shooting parameters to obtain an infrared image comprises:
controlling an infrared light source to irradiate the infrared reflective identification target;
determining target shooting parameters of the infrared camera, wherein the target shooting parameters comprise aperture size and exposure time of the infrared camera;
and controlling the infrared camera to shoot the infrared reflective identification target under the target shooting parameters to obtain the infrared image.
3. The method of claim 2, wherein the determining the target capture parameters of the infrared camera comprises:
controlling the infrared camera to shoot the infrared reflective identification target under a plurality of different shooting parameters;
and determining the target shooting parameter from the plurality of different shooting parameters.
4. The method according to claim 2 or 3, wherein before controlling the infrared light source to illuminate the infrared reflective identification target, the method further comprises:
determining an infrared light wavelength range of the infrared light source;
determining the wavelength transmitted by an optical filter of the infrared camera according to the infrared light wavelength range; wherein the wavelength of the light transmitted by the optical filter is consistent with the central wavelength of the infrared light source.
5. The method of claim 4, wherein the infrared light has a wavelength range of 780-3000 nanometers.
6. The method according to claim 1, wherein after performing gray level binarization processing on the infrared image according to the pixel value of each of the pixel points, the method further comprises:
calculating the area covered by the pixel points whose pixel value is 255 in the infrared image after gray level binarization processing;
and discarding the infrared image if the proportion of the infrared image covered by those pixel points is larger than a first preset area threshold.
7. The method according to claim 1, wherein after performing gray level binarization processing on the infrared image according to the pixel value of each of the pixel points, the method further comprises:
recognizing linear light spots and particle light spots in the infrared image after gray level binarization processing, wherein the area of the particle light spots is smaller than a second preset area threshold;
and performing image erosion on the infrared image after gray level binarization processing to delete the linear light spots and the particle light spots.
8. The method according to claim 1, wherein after identifying the spot connected domain according to the number of the pixel points covered by the spot and the plane coordinates, the method further comprises:
Determining the moving area range and the moving angle of the infrared reflective identification target;
determining the number range of pixel points covered by a light spot formed by the infrared reflective identification target in the infrared image according to the moving area range and the moving angle;
and aiming at any light spot, if the number of the pixel points covered by the light spot is not in the number range, removing the light spot.
9. The method according to claim 1, wherein the plane coordinates include a horizontal axis coordinate and a vertical axis coordinate, and the calculating the center point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain comprises:
respectively calculating the sum of the horizontal axis coordinates and the sum of the vertical axis coordinates of the pixel points in the light spot connected domain;
and respectively calculating a first ratio of the sum of the horizontal axis coordinates to the number of pixel points in the light spot connected domain and a second ratio of the sum of the vertical axis coordinates to the number of pixel points in the light spot connected domain, wherein the first ratio and the second ratio are the horizontal axis coordinate and the vertical axis coordinate of the center point, respectively.
10. The method according to claim 1, wherein the determining the shape of the light spot connected domain according to the central moment threshold range of the second-order mean-square central moment comprises:
determining a central moment threshold range of the light spot connected domain formed by the infrared reflective identification target in the infrared image;
and for any light spot connected domain, if the second-order mean-square central moment of the light spot connected domain is within the central moment threshold range, judging that the shape of the light spot connected domain is circular or elliptical.
11. The method according to any one of claims 1 and 9 to 10, wherein the denoising the infrared image based on the shape of the light spot connected domain comprises:
removing each light spot connected domain whose shape is neither circular nor elliptical.
12. An image denoising apparatus of an infrared camera, comprising:
the image acquisition module is used for controlling the infrared camera to acquire an image of a circular infrared reflective identification target under a target shooting parameter to obtain an infrared image, wherein the infrared image comprises a plurality of pixel points, each pixel point has a corresponding pixel value, the target shooting parameter is the shooting parameter under which the pixel value of a light spot formed by the infrared reflective identification target in the infrared image is larger than a first preset threshold and the difference obtained by subtracting the first preset threshold from the pixel value of the light spot is smaller than a second preset threshold, the second preset threshold is larger than 0 and smaller than 5, and the infrared reflective identification target is fixed beside the lesion and on an operating arm of an orthopedic surgical robot;
The binarization processing module is used for carrying out gray level binarization processing on the infrared image according to the pixel value of each pixel point;
the spot connected domain identification module is used for determining the number and the plane coordinates of pixel points covered by the light spots in the infrared image after gray level binarization processing, identifying the spot connected domain according to the number and the plane coordinates of the pixel points covered by the light spots, and detecting the shape of the spot connected domain;
the denoising module is used for denoising the infrared image based on the shape of the light spot connected domain;
the binarization processing module is specifically used for:
assigning the pixel value of each pixel point with the pixel value smaller than or equal to a preset pixel threshold value to 0; and assigning a pixel value of each pixel point with a pixel value greater than the preset pixel threshold value to 255, wherein the preset pixel threshold value is the same as the first preset threshold value;
the light spot connected domain identification module is specifically used for:
generating a two-dimensional array corresponding to the infrared image, wherein the two-dimensional array comprises a plurality of flag bits, each flag bit corresponds to a pixel point in the infrared image after gray level binarization processing, and the initial value of each flag bit is 0;
Traversing each pixel point in the infrared image after gray level binarization processing;
for any pixel point, if the pixel value of the pixel point is not 0, modifying the flag bit corresponding to the pixel point in the two-dimensional array to be 1;
counting the number of flag bits whose initial value has been modified to 1 in the two-dimensional array to obtain the number of pixel points covered by the light spots;
determining the plane coordinates of the pixel points covered by the light spots according to the positions of the flag bits whose initial value has been modified to 1 in the two-dimensional array;
the light spot connected domain identification module is specifically used for:
for any light spot connected domain, generating a rectangular region according to the plane coordinates of each pixel point in the light spot connected domain; if Tail/(2r) > 0.65r, judging that the shape of the light spot connected domain is circular or elliptical; wherein Tail is the number of pixel points in the light spot connected domain, and r is half of the diagonal length of the rectangular region; or,
for any light spot connected domain, calculating the center point of the light spot connected domain according to the plane coordinates of each pixel point in the light spot connected domain; for any pixel point in the light spot connected domain, calculating the square of the difference between the horizontal axis coordinate of the pixel point and the horizontal axis coordinate of the center point to obtain a first square value, and calculating the square of the difference between the vertical axis coordinate of the pixel point and the vertical axis coordinate of the center point to obtain a second square value; adding the first square values and the second square values of all the pixel points in the light spot connected domain to obtain a sum of squares; taking the ratio of the sum of squares to the number of pixel points in the light spot connected domain as the second-order mean-square central moment of the light spot connected domain; and determining the shape of the light spot connected domain according to a central moment threshold range of the second-order mean-square central moment, wherein the central moment threshold range is determined according to the range of positions and angles at which the infrared reflective identification target can appear.
13. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the image denoising method of an infrared camera as claimed in any one of claims 1-11 when executing the computer program.
14. A computer readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the image denoising method of an infrared camera according to any one of claims 1 to 11.
CN202111649543.2A 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment Active CN114565517B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111649543.2A CN114565517B (en) 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment

Publications (2)

Publication Number Publication Date
CN114565517A CN114565517A (en) 2022-05-31
CN114565517B true CN114565517B (en) 2023-09-29

Family

ID=81711536

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111649543.2A Active CN114565517B (en) 2021-12-29 2021-12-29 Image denoising method and device of infrared camera and computer equipment

Country Status (1)

Country Link
CN (1) CN114565517B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117616777A (en) * 2022-06-10 2024-02-27 北京小米移动软件有限公司 Image processing method, device, electronic equipment and storage medium
CN116228589B (en) * 2023-03-22 2023-08-29 新创碳谷集团有限公司 Method, equipment and storage medium for eliminating noise points of visual inspection camera
CN117414154B (en) * 2023-09-05 2024-09-20 骨圣元化机器人(深圳)有限公司 Three-dimensional ultrasonic reconstruction method, device and ultrasonic system

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014075365A (en) * 2014-01-27 2014-04-24 Hitachi High-Technologies Corp Charged particle beam device, sample image acquisition method, and program recording medium
CN105719259A (en) * 2016-02-19 2016-06-29 上海理工大学 Pavement crack image detection method
CN107133627A (en) * 2017-04-01 2017-09-05 深圳市欢创科技有限公司 Infrared light spot center point extracting method and device
CN108335308A (en) * 2017-01-20 2018-07-27 深圳市祈飞科技有限公司 A kind of orange automatic testing method, system and intelligent robot retail terminal
CN109299634A (en) * 2017-07-25 2019-02-01 上海中科顶信医学影像科技有限公司 Spot detection method, system, equipment and storage medium
CN109325468A (en) * 2018-10-18 2019-02-12 广州智颜科技有限公司 A kind of image processing method, device, computer equipment and storage medium
CN109729276A (en) * 2017-10-27 2019-05-07 比亚迪股份有限公司 Near-infrared image capture method, device, equipment and storage medium
CN109949252A (en) * 2019-04-15 2019-06-28 北京理工大学 A kind of infrared image hot spot minimizing technology based on penalty coefficient fitting
WO2019205290A1 (en) * 2018-04-28 2019-10-31 平安科技(深圳)有限公司 Image detection method and apparatus, computer device, and storage medium
CN110720985A (en) * 2019-11-13 2020-01-24 安徽领航智睿科技有限公司 Multi-mode guided surgical navigation method and system
CN111080691A (en) * 2019-12-17 2020-04-28 晶科电力科技股份有限公司 Infrared hot spot detection method and device for photovoltaic module
CN111862195A (en) * 2020-08-26 2020-10-30 Oppo广东移动通信有限公司 Light spot detection method and device, terminal and storage medium
CN112911157A (en) * 2021-03-27 2021-06-04 山东创能机械科技有限公司潍坊分公司 Automatic device for searching, tracking, aiming and ranging initial fire source based on infrared image recognition

Also Published As

Publication number Publication date
CN114565517A (en) 2022-05-31

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant