CN113781351A - Image processing method, apparatus and computer-readable storage medium

Image processing method, apparatus and computer-readable storage medium

Info

Publication number
CN113781351A
CN113781351A
Authority
CN
China
Prior art keywords
image
amplitude
foreground
local
point
Prior art date
Legal status
Granted
Application number
CN202111085943.5A
Other languages
Chinese (zh)
Other versions
CN113781351B (en)
Inventor
Han Chao (韩超)
Current Assignee
Guangzhou Anfang Biotechnology Co., Ltd.
Original Assignee
Guangzhou Anfang Biotechnology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangzhou Anfang Biotechnology Co., Ltd.
Priority to CN202111085943.5A
Publication of CN113781351A
Application granted
Publication of CN113781351B
Legal status: Active (current)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/20 Image enhancement or restoration using local operators
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses an image processing method, an image processing device and a computer-readable storage medium, wherein the method comprises the following steps: obtaining a foreground enhanced image according to the acquired image to be processed; determining first position information according to the foreground enhanced image, wherein the first position information is used for representing the positions of local extreme points in the foreground enhanced image; performing first processing on the foreground enhanced image to obtain an amplitude image; obtaining first amplitude information according to the first position information and the amplitude image, wherein the first amplitude information is used for representing the amplitude at each local extreme point; and determining the position of the target point according to the first position information and the first amplitude information. In this way, when performing image processing on a fluorescence in situ hybridization probe-point image, the image processing method provided by the embodiment of the invention can suppress background interference near the probe points and improve the accuracy of determining the probe-point positions.

Description

Image processing method, apparatus and computer-readable storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and a computer-readable storage medium.
Background
In research or production activities in the biological field, it is often necessary to identify gene loci on chromosomes. To identify a gene locus on a chromosome, a fluorescence in situ hybridization technique is usually used to bind a gene probe to the gene locus to be labeled and identified, and the image of the gene probe is then processed to obtain the position information of the probe point. However, when processing the probe-point image, image processing methods in the related art cannot accurately determine the probe-point position because of background interference near the probe point.
Disclosure of Invention
The following is a summary of the subject matter described in detail herein. This summary is not intended to limit the scope of the claims.
Embodiments of the present invention provide an image processing method, an image processing apparatus, and a computer-readable storage medium, which can suppress background interference near a probe point when performing image processing on a probe point image, and improve the accuracy of determining the position of the probe point.
In a first aspect, an embodiment of the present invention provides an image processing method, where the method includes:
obtaining a foreground enhanced image according to the obtained image to be processed;
determining first position information according to the foreground enhanced image, wherein the first position information is used for representing the position of a local extreme point in the foreground enhanced image;
performing first processing on the foreground enhanced image to obtain an amplitude image;
obtaining first amplitude information according to the first position information and the amplitude image, wherein the first amplitude information is used for representing the amplitude at the local extreme point;
and determining the position of the target point according to the first position information and the first amplitude information.
According to the image processing method of the embodiment of the first aspect of the invention, at least the following beneficial effects are achieved:
according to the embodiment of the invention, an image to be processed is obtained, and a foreground enhanced image is obtained according to the image to be processed; obtaining a foreground enhanced image according to the obtained image to be processed; determining first position information according to the foreground enhanced image, wherein the first position information is used for representing the position of a local extreme point in the foreground enhanced image; performing first processing on the foreground enhanced image to obtain an amplitude image; obtaining first amplitude information according to the first position information and the amplitude image, wherein the first amplitude information is used for representing the amplitude at the local extreme point; and determining the position of the target point according to the first position information and the first amplitude information. Therefore, the foreground enhanced image is obtained according to the image to be processed, the image contrast is enhanced, meanwhile, the interference in the background is reduced, the first position information used for representing the position of the local extreme point in the foreground enhanced image is determined according to the foreground enhanced image, the distribution interval of the probe points can be preliminarily determined, and the processing precision is further improved. And then obtaining an amplitude image through the foreground enhanced image, obtaining first amplitude information used for representing the amplitude at the local extreme point according to the first position information and the amplitude image, determining the position of the target point according to the first position information and the first amplitude information, and further removing the position of the target point with the amplitude not meeting the condition, so that background interference near the target point in the image to be processed can be removed, and the accuracy of determining the position of the target point is improved. In summary, the embodiment of the invention can suppress background interference near the probe point when the probe point image is subjected to image processing, and improve the accuracy of determining the position of the probe point.
It can be understood that the obtaining of the foreground enhanced image according to the obtained image to be processed includes: carrying out contrast enhancement processing on the acquired image to be processed to obtain a contrast enhanced image; performing first Gaussian filtering processing on the contrast enhanced image to obtain a background blurred image; obtaining a positive difference image according to the contrast enhanced image and the background blurred image; and performing second Gaussian filtering processing on the positive difference image to obtain a foreground enhanced image, wherein the value of the standard deviation parameter corresponding to the second Gaussian filtering processing is smaller than the value of the standard deviation parameter corresponding to the first Gaussian filtering processing.
It is understood that the deriving of the positive difference image from the contrast-enhanced image and the background blurred image comprises: subtracting the background blurred image from the contrast enhanced image to obtain a full-difference image; and extracting sub-pixels with pixel values larger than zero in the full difference image, and obtaining a positive difference image according to the sub-pixels.
It is to be understood that the determining the first location information from the foreground enhanced image includes: obtaining a non-maximum suppression local window and a non-maximum suppression pixel threshold according to a first preset parameter, wherein the first preset parameter is used for representing the area of the target point; obtaining a first point set according to the non-maximum suppression local window, the non-maximum suppression pixel threshold and the foreground enhancement image, wherein the first point set is a set of extreme points in the non-maximum suppression local window; screening the first point set to obtain a local extreme point; and determining the first position information according to the local extreme point.
It is to be understood that a plurality of the non-maximum suppression local windows are provided, each of the non-maximum suppression local windows has the corresponding first point set and the corresponding local extreme point, and the first position information is a set of position information of each of the local extreme points.
It is to be understood that the performing the first processing on the foreground enhanced image to obtain the magnitude image includes: performing local standard deviation calculation processing on the foreground enhanced image to obtain a local standard deviation image; and performing third Gaussian filtering processing on the local standard deviation image to obtain the amplitude image.
It is understood that the area of the window for which the foreground enhanced image is subjected to the local standard deviation calculation process coincides with the area of the non-maximum suppression local window.
It is understood that the determining the position of the target point according to the first position information and the first magnitude information includes: obtaining an amplitude threshold value according to the first amplitude information; and determining the position of the target point according to the amplitude threshold value, the first amplitude information and the first position information.
In a second aspect, an embodiment of the present invention provides an image processing apparatus, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the image processing method as described above when executing the computer program.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the image processing method as described above.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention without limiting it.
FIG. 1 is a flow chart of an image processing method provided by an embodiment of the invention;
FIG. 2 is a flow chart of a specific method of step S100 in FIG. 1;
FIG. 3 is a flowchart of a specific method of step S130 in FIG. 2;
FIG. 4 is a flowchart of a specific method of step S200 in FIG. 1;
FIG. 5 is a flowchart of a specific method of step S300 in FIG. 1;
FIG. 6 is a flowchart of a specific method of step S500 in FIG. 1;
FIG. 7 is a schematic diagram of an image to be processed according to an embodiment of the present invention;
fig. 8 is a schematic diagram of the image to be processed in fig. 7 after image processing is completed.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It should be noted that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different from that in the flowcharts. The terms first, second and the like in the description and in the claims, and the drawings described above, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The embodiment of the invention provides an image processing method, image processing equipment and a computer readable storage medium, wherein a foreground enhanced image is obtained according to an image to be processed, the contrast of the image is enhanced, meanwhile, the interference in a background is reduced, first position information used for representing the position of a local extreme point in the foreground enhanced image is determined according to the foreground enhanced image, the distribution interval of probe points can be preliminarily determined, and the processing precision is further improved. And then obtaining an amplitude image through the foreground enhanced image, obtaining first amplitude information used for representing the amplitude at the local extreme point according to the first position information and the amplitude image, determining the position of the target point according to the first position information and the first amplitude information, and further removing the position of the target point with the amplitude not meeting the condition, so that background interference near the target point in the image to be processed can be removed, and the accuracy of determining the position of the target point is improved. In summary, the embodiment of the invention can suppress background interference near the probe point when the probe point image is subjected to image processing, and improve the accuracy of determining the position of the probe point.
As shown in fig. 1, fig. 1 is a flowchart of an image processing method provided by an embodiment of the present invention, which includes, but is not limited to, step S100, step S200, step S300, step S400, and step S500.
Step S100, obtaining a foreground enhanced image according to the obtained image to be processed.
In this step, the image to be processed is an image containing the position information of the probe points. A foreground enhanced image can be obtained by performing image processing on the image to be processed; compared with the image to be processed, the foreground enhanced image has enhanced contrast and reduced interference from noise in the image background.
In one embodiment, the image to be processed is obtained, according to the fluorescence in situ hybridization (FISH) principle, by hybridizing a fluorescein-labeled nucleic acid probe with a nucleic acid sequence in a sample to be tested, washing, and then observing the result under a fluorescence microscope; the position information of the probe point indicates the position of the nucleic acid probe.
It should be noted that the image to be processed may be a grayscale image including position information of the target point, or may be a color image including position information of the target point, which is not specifically limited in the embodiment of the present invention.
Step S200, determining first position information according to the foreground enhanced image, wherein the first position information is used for representing the position of a local extreme point in the foreground enhanced image.
In this step, the first position information for representing the position of the local extremum point in the foreground enhanced image is obtained according to the foreground enhanced image, so that the distribution interval of the target point can be initially determined, and the processing precision is further improved.
In an embodiment, the first position information is used to represent the positions of local extreme points in the foreground enhanced image. A local region of the foreground enhanced image is selected through a local window; according to a threshold condition matched to the local window, the sub-pixels of the foreground enhanced image under the local window that exceed the threshold condition are selected to determine a first point set containing a plurality of extreme points; the extreme point with the largest pixel value in the first point set is selected as the local extreme point; and the first position information is finally obtained according to the local extreme points.
It should be noted that, one local window corresponds to only one local extreme point, but a plurality of local windows may be provided, and therefore the first position information includes position information of local extreme points of a plurality of different local windows.
Step S300, carrying out first processing on the foreground enhanced image to obtain an amplitude image.
In this step, the amplitude image is an image obtained by processing the foreground enhanced image and represents the gradient amplitude at each pixel point. Deriving the amplitude image from the foreground enhanced image reduces interference in the background and facilitates further localization of the extreme points.
Step S400, obtaining first amplitude information according to the first position information and the amplitude image, wherein the first amplitude information is used for representing the amplitude at the local extreme point.
In this step, the first position information includes the position information of the local extreme points of a plurality of different local windows, and the amplitude of each local extreme point can be obtained from the first position information and the amplitude image; the first amplitude information is the set of the amplitudes of the local extreme points.
Step S500, determining the position of the target point according to the first position information and the first amplitude information.
In the step, the foreground enhanced image is obtained according to the image to be processed, the image contrast is enhanced, meanwhile, the interference in the background is reduced, the first position information used for representing the position of the local extreme point in the foreground enhanced image is determined according to the foreground enhanced image, the distribution interval of the probe points can be preliminarily determined, and the processing precision is further improved. And then obtaining an amplitude image through the foreground enhanced image, obtaining first amplitude information used for representing the amplitude at the local extreme point according to the first position information and the amplitude image, determining the position of the target point according to the first position information and the first amplitude information, and further removing the position of the target point with the amplitude not meeting the condition, so that background interference near the target point in the image to be processed can be removed, and the accuracy of determining the position of the target point is improved. In summary, the embodiment of the invention can suppress background interference near the probe point when the probe point image is subjected to image processing, and improve the accuracy of determining the position of the probe point.
It should be noted that the target point in this step may be a probe point for identifying a gene locus, or may be other types of target points, which is not specifically limited in the present invention.
In addition, referring to fig. 2, fig. 2 is a flowchart of a specific method of step S100 in fig. 1, and in the example of fig. 2, step S100 includes, but is not limited to, step S110, step S120, step S130, and step S140:
and step S110, carrying out contrast enhancement processing on the acquired image to be processed to obtain a contrast enhanced image.
In this step, enhancing the contrast of the image improves the accuracy with which the probe points are identified in the subsequent steps.
In an embodiment, when the image to be processed is not a grayscale image, the image to be processed is processed to obtain a corresponding grayscale image, and the grayscale image is used for subsequent image processing.
In one embodiment, the non-linear enhancement is used to non-linearly stretch the luminance of each sub-pixel of the image to be processed.
Specifically, the non-linear stretching uses a weight distribution function pdf_w(l) derived from pdf(l), the probability density function of the image pixels. After the weight of each sub-pixel of the image to be processed is obtained through the weight distribution function, a cumulative distribution function is used to accumulate the weight distribution and obtain the contrast enhanced image.
Specifically, the cumulative distribution function employed in the embodiment of the present invention is
    cdf_w(l) = Σ_{k=0..l} pdf_w(k) / Σ_k pdf_w(k)
wherein pdf_w(k) is the weight assigned to grey level k and cdf_w(l) is the result after accumulation.
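As an illustration only, the following Python sketch shows one way such a weighting-distribution-based stretch could be implemented. The exponent alpha and the exact form of pdf_w (an AGCWD-style weighting is assumed here) are assumptions, since the embodiment above does not fix them:

    import numpy as np

    def contrast_enhance(gray, alpha=0.5):
        # gray: 8-bit grayscale image (uint8) as a NumPy array
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        pdf = hist / hist.sum()                                   # probability density pdf(l)
        # Assumed AGCWD-style weighting distribution; the patent's exact formula may differ.
        pdf_w = pdf.max() * ((pdf - pdf.min()) / (pdf.max() - pdf.min() + 1e-12)) ** alpha
        cdf_w = np.cumsum(pdf_w) / (pdf_w.sum() + 1e-12)          # weighted cumulative distribution cdf_w(l)
        lut = np.round(255.0 * cdf_w).astype(np.uint8)            # grey-level mapping table
        return lut[gray]                                          # non-linear stretch of each sub-pixel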
Step S120, performing first Gaussian filtering processing on the contrast enhanced image to obtain a background blurred image.
In this step, the first Gaussian filtering process is performed on the contrast enhanced image, so that noise in the contrast enhanced image can be suppressed.
In an embodiment, first, a window area and a standard deviation parameter for the first Gaussian filtering process are determined according to the size of the contrast enhanced image. Specifically, the kernel function used for the first Gaussian filtering process is
    G_l(x, y) = (1 / (2π·σ_l²)) · exp(−(x² + y²) / (2·σ_l²))
wherein σ_l is the standard deviation parameter used for the first Gaussian filtering process, and x and y are the positions of a pixel of the two-dimensional contrast enhanced image on the horizontal axis and the vertical axis of the image, respectively.
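For reference, a minimal sketch of constructing such a kernel explicitly is shown below; the odd window side length is an assumption, and in practice a library Gaussian filter can be applied directly:

    import numpy as np

    def gaussian_kernel(size, sigma):
        # size: odd window side length (assumed); sigma: standard deviation parameter sigma_l
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        kernel = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
        return kernel / kernel.sum()   # normalize so filtering preserves overall brightness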
Step S130, obtaining a positive difference image according to the contrast enhanced image and the background blurred image.
In one embodiment, the positive difference image is obtained by subtracting the background blurred image from the contrast enhanced image to obtain a full difference image, extracting the sub-pixels with pixel values greater than zero in the full difference image, and obtaining the positive difference image according to those sub-pixels.
Step S140, performing a second Gaussian filtering process on the positive difference image to obtain a foreground enhanced image, wherein the value of the standard deviation parameter corresponding to the second Gaussian filtering process is smaller than the value of the standard deviation parameter corresponding to the first Gaussian filtering process.
In this step, a foreground enhanced image is obtained, and the interference in the background can be suppressed while the image contrast is enhanced.
In an embodiment, the window area for performing the second Gaussian filtering process is the same as the window area for performing the first Gaussian filtering process, and the kernel function for the second Gaussian filtering process is also the same as the kernel function for the first Gaussian filtering process; the difference is that the value of the standard deviation parameter for the second Gaussian filtering process is one fifth of the value of the standard deviation parameter for the first Gaussian filtering process.
In one embodiment, steps S120 to S140 can be summarized by the following formula:
    I_f_en = G_s ⊗ max(I_org − G_l ⊗ I_org, 0)
wherein I_f_en represents the foreground enhanced image, I_org is the contrast enhanced image, G_l and G_s are respectively the kernel functions used for the first Gaussian filtering process and the second Gaussian filtering process, ⊗ represents convolution, and max(·, 0) is applied element-wise and corresponds to keeping the sub-pixels whose values are greater than zero.
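A minimal sketch of this formula, using SciPy's Gaussian filter; the absolute value of sigma_l is an assumption, while the 1/5 ratio between the two standard deviations follows the embodiment above:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def foreground_enhance(contrast_img, sigma_l=15.0):
        img = contrast_img.astype(np.float64)
        background = gaussian_filter(img, sigma=sigma_l)        # first Gaussian filtering (G_l)
        diff = img - background                                 # full-difference image
        positive = np.maximum(diff, 0.0)                        # keep sub-pixels greater than zero
        return gaussian_filter(positive, sigma=sigma_l / 5.0)   # second Gaussian filtering (G_s)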
In addition, referring to fig. 3, fig. 3 is a flowchart of a specific method of step S130 in fig. 2, and in the example of fig. 3, step S130 includes, but is not limited to, step S131 and step S132:
and S131, subtracting the background blurred image from the contrast enhanced image to obtain a full-difference image.
Step S132, extracting sub-pixels with pixel values greater than zero in the full difference image, and obtaining a positive difference image according to the sub-pixels.
In step S131 and step S132, the background blurred image is subtracted from the contrast enhanced image to obtain a full difference image, which further suppresses background interference and enhances the characteristics of the target point; then the sub-pixels whose pixel values are greater than zero are extracted from the full difference image, and the positive difference image is obtained from these sub-pixels.
Referring to fig. 4, fig. 4 is a flowchart of a specific method of step S200 in fig. 1, and in the example of fig. 4, step S200 includes, but is not limited to, step S210, step S220, step S230, and step S240:
step S210, a non-maximum suppression local window and a non-maximum suppression pixel threshold are obtained according to a first preset parameter, where the first preset parameter is used to represent the area of the target point.
In this step, the first preset parameter may be used to represent the area of the target point; specifically, the first preset parameter may represent an interval for the number of pixels of the target point, or an interval for the ratio of the number of pixels of the target point to the number of pixels of the image to be processed. The first preset parameter may also be used to determine the non-maximum suppression pixel threshold used in the non-maximum suppression processing; pixels whose values do not exceed this threshold are excluded from consideration during the non-maximum suppression processing.
Step S220, a first point set is obtained according to the non-maximum suppression local window, the non-maximum suppression pixel threshold, and the foreground enhanced image, where the first point set is a set of extreme points in the non-maximum suppression local window.
In an embodiment, specific pixel points in the foreground enhanced image are selected according to the non-maximum suppression pixel threshold; specifically, the pixel value of each pixel point of the foreground enhanced image is compared with the non-maximum suppression pixel threshold, and the pixel points whose pixel values are greater than the non-maximum suppression pixel threshold are taken as extreme points. Since there are a plurality of non-maximum suppression local windows, there are a plurality of extreme points, and the set of these extreme points is the first point set.
Step S230, a first point set is screened to obtain a local extreme point.
In an embodiment, the size of the non-maximum suppression local window is fixed, and its side length is an odd number of pixels so that the window has a well-defined central position. In the process of processing the foreground enhanced image, the non-maximum suppression local window is slid over the extreme points in the first point set so that the window traverses them. If the pixel value at the central position of the non-maximum suppression window is not less than the pixel values at the other positions in the window, the extreme point at the window center is retained; otherwise it is discarded. After the extreme points in the first point set have been traversed, the points that remain are taken as the local extreme points.
In step S240, the first position information is determined according to the local extreme point.
In this step, the first position information is used to characterize the position of the local extreme point. Specifically, the first position information includes a vertical axis coordinate and a horizontal axis coordinate of a pixel point of the local extreme point on the foreground enhanced image.
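A minimal sketch of steps S210 to S240 follows; the square window side and the pixel threshold are placeholders for the values that would be derived from the first preset parameter:

    import numpy as np
    from scipy.ndimage import maximum_filter

    def local_extreme_points(fg_img, window=7, pixel_thresh=20.0):
        # A pixel is kept if it exceeds the non-maximum suppression pixel threshold
        # and is the maximum within its local window (non-maximum suppression).
        is_window_max = fg_img == maximum_filter(fg_img, size=window)
        keep = is_window_max & (fg_img > pixel_thresh)
        return np.argwhere(keep)   # first position information: (row, col) of each local extreme point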
Referring to fig. 5, fig. 5 is a flowchart of a specific method of step S300 in fig. 1, and in the example of fig. 5, step S300 includes, but is not limited to, step S310 and step S320:
and step S310, carrying out local standard deviation calculation processing on the foreground enhanced image to obtain a local standard deviation image.
In this step, performing local standard deviation calculation processing on the foreground enhanced image counteracts the edge blurring introduced by the Gaussian filtering used when generating the foreground enhanced image.
Note that the area of the window for performing the local standard deviation calculation processing on the foreground enhanced image is equal to the area of the local window for non-maximum suppression.
Step S320, performing a third Gaussian filtering process on the local standard deviation image to obtain an amplitude image.
In this step, the amplitude image can represent the gray gradient amplitude of each pixel in the local standard deviation image, so as to detect the edge of the target point.
In one embodiment, the local standard deviation image is squared before the third Gaussian filtering process is performed on it, so as to enhance the target-point features in the local standard deviation image.
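A minimal sketch of steps S310 and S320; the window side length mirrors the non-maximum suppression window, the value of sigma for the third Gaussian filtering is an assumption, and the squaring step follows the embodiment above:

    import numpy as np
    from scipy.ndimage import uniform_filter, gaussian_filter

    def amplitude_image(fg_img, window=7, sigma=2.0):
        fg = fg_img.astype(np.float64)
        mean = uniform_filter(fg, size=window)
        mean_sq = uniform_filter(fg * fg, size=window)
        local_std = np.sqrt(np.clip(mean_sq - mean * mean, 0.0, None))   # local standard deviation image
        local_std = local_std ** 2                                        # square to strengthen target-point features
        return gaussian_filter(local_std, sigma=sigma)                    # third Gaussian filtering -> amplitude image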
In addition, step S400 in fig. 1 further includes, but is not limited to, the following steps: obtaining the first amplitude information according to the first position information and the amplitude image, that is, determining the position of each local extreme point in the amplitude image according to the first position information, and determining the amplitude of each local extreme point according to its position in the amplitude image; the first amplitude information is the set of the amplitudes of the local extreme points.
Referring to fig. 6, fig. 6 is a flowchart of a specific method of step S500 in fig. 1, and in the example of fig. 6, step S500 includes, but is not limited to, step S510 and step S520:
step S510, an amplitude threshold is obtained according to the first amplitude information.
In this step, the amplitude threshold is used to screen the local extreme point corresponding to the first position information, and the amplitude threshold is obtained according to the first amplitude information.
In an embodiment, the amplitude threshold is positively correlated with the average amplitude of the local extreme points represented by the first amplitude information. For example, when the average amplitude of the local extreme points represented by the first amplitude information is higher, the amplitude threshold is correspondingly higher, so that the proportion of local extreme points retained after screening with the amplitude threshold, relative to those before screening, remains essentially constant.
Step S520, determining the position of the target point according to the amplitude threshold, the first amplitude information and the first position information.
In an embodiment, the first amplitude information is screened according to the amplitude threshold: when the amplitude of a local extreme point represented by the first amplitude information is not within the interval specified by the amplitude threshold, the position corresponding to that local extreme point is deleted from the first position information. After the amplitudes of all local extreme points in the first amplitude information have been traversed and processed in this way, the positions of the remaining local extreme points in the first position information are the positions of the target points.
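A minimal sketch of steps S400, S510 and S520, assuming the amplitude threshold is a fixed fraction k of the mean amplitude (one simple way to realize the positive correlation described above; k is an assumption):

    import numpy as np

    def select_target_points(points, amp_img, k=0.5):
        # points: (N, 2) NumPy array of local extreme-point coordinates (first position information)
        amps = amp_img[points[:, 0], points[:, 1]]   # first amplitude information (step S400)
        amp_thresh = k * amps.mean()                 # amplitude threshold (step S510)
        return points[amps >= amp_thresh]            # remaining positions are the target points (step S520)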
Referring to fig. 7, fig. 7 is a schematic diagram of an image to be processed according to a specific example of the present invention, in the example of fig. 7, the image to be processed is an image obtained by hybridizing a nucleic acid probe labeled with fluorescein with a nucleic acid sequence in a sample to be tested according to the fluorescence in situ hybridization technique, and observing the image under a fluorescence microscope after washing;
Referring to FIG. 7, FIG. 7 includes two fluorescence regions, each containing four nucleic acid probes. The nucleic acid probes in the two fluorescence regions are of two types; the emission colors of the two types of probes differ from each other, and both differ from the emission color of the fluorescence regions.
The edges of the nucleic acid probes in FIG. 7 are blurred by the fluorescence of the fluorescence regions and by the self-luminescence of the probes themselves, so the accurate positions of the probe points cannot be determined directly.
Fig. 8 is a schematic diagram of the image to be processed in fig. 7 after image processing is completed. In the example of fig. 8, after the image to be processed in fig. 7 has been processed by the image processing method of the embodiment of the present invention, the resulting image no longer shows the fluorescence regions or the light emitted by the nucleic acid probes; in addition, the resulting image marks the edges of the fluorescence regions and of the nucleic acid probes with different line styles. The image processing method of the embodiment of the present invention can therefore suppress background interference near the probe points and improve the accuracy of determining the probe-point positions.
In addition, an embodiment of the present invention provides an image processing apparatus including: a memory, a processor, and a computer program stored on the memory and executable on the processor.
The processor and memory may be connected by a bus or other means.
The non-transitory software programs and instructions required to implement the image processing method of the above-described embodiment are stored in the memory, and when executed by the processor, perform the image processing method of the above-described embodiment, for example, performing the above-described method steps S100 to S500 in fig. 1, method steps S110 to S140 in fig. 2, method steps S131 to S132 in fig. 3, method steps S210 to S240 in fig. 4, method steps S310 to S320 in fig. 5, and method steps S510 to S520 in fig. 6.
The above-described embodiments of the apparatus are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may also be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Furthermore, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions, which are executed by a processor or a controller, for example, by a processor in the above-mentioned embodiment of the image processing apparatus, and can make the above-mentioned processor execute the image processing method in the above-mentioned embodiment, for example, execute the above-mentioned method steps S100 to S500 in fig. 1, method steps S110 to S140 in fig. 2, method steps S131 to S132 in fig. 3, method steps S210 to S240 in fig. 4, method steps S310 to S320 in fig. 5, and method steps S510 to S520 in fig. 6.
One of ordinary skill in the art will appreciate that all or some of the steps, systems, and methods disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embody computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media, as is known to those skilled in the art.
While the preferred embodiments of the present invention have been described, the present invention is not limited to the above embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present invention, and such equivalent modifications or substitutions are included in the scope of the present invention defined by the claims.

Claims (10)

1. A method of image processing, the method comprising:
obtaining a foreground enhanced image according to the obtained image to be processed;
determining first position information according to the foreground enhanced image, wherein the first position information is used for representing the position of a local extreme point in the foreground enhanced image;
performing first processing on the foreground enhanced image to obtain an amplitude image;
obtaining first amplitude information according to the first position information and the amplitude image, wherein the first amplitude information is used for representing the amplitude at the local extreme point;
and determining the position of the target point according to the first position information and the first amplitude information.
2. The image processing method according to claim 1, wherein the obtaining the foreground enhanced image according to the obtained image to be processed comprises:
carrying out contrast enhancement processing on the acquired image to be processed to obtain a contrast enhanced image;
performing first Gaussian filtering processing on the contrast enhanced image to obtain a background blurred image;
obtaining a positive difference image according to the contrast enhanced image and the background blurred image;
and performing second Gaussian filtering processing on the positive difference image to obtain a foreground enhanced image, wherein the value of the standard deviation parameter corresponding to the second Gaussian filtering processing is smaller than the value of the standard deviation parameter corresponding to the first Gaussian filtering processing.
3. The method of claim 2, wherein the deriving a positive difference image from the contrast-enhanced image and the background-blurred image comprises:
subtracting the background blurred image from the contrast enhanced image to obtain a full-difference image;
and extracting sub-pixels with pixel values larger than zero in the full difference image, and obtaining a positive difference image according to the sub-pixels.
4. The method of image processing according to claim 1, wherein said determining first location information from the foreground enhanced image comprises:
obtaining a non-maximum suppression local window and a non-maximum suppression pixel threshold according to a first preset parameter, wherein the first preset parameter is used for representing the area of the target point;
obtaining a first point set according to the non-maximum suppression local window, the non-maximum suppression pixel threshold and the foreground enhancement image, wherein the first point set is a set of extreme points in the non-maximum suppression local window;
screening the first point set to obtain a local extreme point;
and determining the first position information according to the local extreme point.
5. The image processing method according to claim 4, wherein a plurality of the non-maximum suppression local windows are provided, each of the non-maximum suppression local windows has the corresponding first point set and the local extreme point, and the first position information is a set of position information of each of the local extreme points.
6. The image processing method according to claim 4, wherein the performing the first processing on the foreground enhanced image to obtain the magnitude image comprises:
performing local standard deviation calculation processing on the foreground enhanced image to obtain a local standard deviation image;
and performing third Gaussian filtering processing on the local standard deviation image to obtain the amplitude image.
7. The image processing method according to claim 6, wherein a window area of the foreground enhanced image subjected to the local standard deviation calculation process coincides with an area of the non-maximum suppression local window.
8. The image processing method of claim 1, wherein determining the position of the target point based on the first position information and the first magnitude information comprises:
obtaining an amplitude threshold value according to the first amplitude information;
and determining the position of the target point according to the amplitude threshold value, the first amplitude information and the first position information.
9. An image processing apparatus characterized by comprising: memory, processor and computer program stored on the memory and executable on the processor, which when executed by the processor implements the image processing method according to any one of claims 1 to 8.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out an image processing method according to any one of claims 1 to 8.
CN202111085943.5A 2021-09-16 2021-09-16 Image processing method, apparatus and computer readable storage medium Active CN113781351B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111085943.5A CN113781351B (en) 2021-09-16 2021-09-16 Image processing method, apparatus and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN113781351A (en) 2021-12-10
CN113781351B CN113781351B (en) 2023-12-08

Family

ID=78844531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111085943.5A Active CN113781351B (en) 2021-09-16 2021-09-16 Image processing method, apparatus and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113781351B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062957A1 (en) * 2010-09-13 2012-03-15 Samsung Electronics Co., Ltd Printing control device, image forming apparatus, and image forming method
US20130169760A1 (en) * 2012-01-04 2013-07-04 Lloyd Watts Image Enhancement Methods And Systems
CN107918931A (en) * 2016-10-10 2018-04-17 深圳市瀚海基因生物科技有限公司 Image processing method and system
CN110490204A (en) * 2019-07-11 2019-11-22 深圳怡化电脑股份有限公司 Image processing method, image processing apparatus and terminal
WO2021092815A1 (en) * 2019-11-13 2021-05-20 深圳市大疆创新科技有限公司 Identification method, temperature measurement method, device and storage medium
CN113077459A (en) * 2021-04-28 2021-07-06 北京的卢深视科技有限公司 Image definition detection method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Miao Huisi; Liang Guangming; Liu Renren; Ding Jianwen: "Watershed blood cell segmentation combining distance transform and edge gradient", Journal of Image and Graphics (中国图象图形学报), no. 02, pages 192-198 *

Also Published As

Publication number Publication date
CN113781351B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
CN107507173B (en) No-reference definition evaluation method and system for full-slice image
CN111027546B (en) Character segmentation method, device and computer readable storage medium
CN110516584B (en) Cell automatic counting method based on dynamic learning for microscope
US20170178341A1 (en) Single Parameter Segmentation of Images
CN110706224B (en) Optical element weak scratch detection method, system and device based on dark field image
CN113744142B (en) Image restoration method, electronic device and storage medium
CN115457063A (en) Method, device and equipment for extracting edge of circular hole of PCB (printed Circuit Board) and storage medium
CN112200826B (en) Industrial weak defect segmentation method
CN117351011B (en) Screen defect detection method, apparatus, and readable storage medium
CN115689948B (en) Image enhancement method for detecting cracks of building water supply pipeline
JP2013238449A (en) Crack detection method
CN117094975A (en) Method and device for detecting surface defects of steel and electronic equipment
CN112634301A (en) Equipment area image extraction method and device
CN113780110A (en) Method and device for detecting weak and small targets in image sequence in real time
CN113888438A (en) Image processing method, device and storage medium
CN112053302A (en) Denoising method and device for hyperspectral image and storage medium
CN106815851B (en) A kind of grid circle oil level indicator automatic reading method of view-based access control model measurement
CN113781515A (en) Cell image segmentation method, device and computer readable storage medium
CN117576137A (en) Rock slice image edge detection method based on improved canny algorithm
CN113781351B (en) Image processing method, apparatus and computer readable storage medium
CN115937205A (en) Method, device and equipment for generating surface defect ceramic tile image and storage medium
CN113284158B (en) Image edge extraction method and system based on structural constraint clustering
CN110728686B (en) Voronoi-based vehicle-mounted lamp image segmentation method
CN114757867A (en) Cell tracking method, sperm optimization method, electronic device, and storage medium
CN108345893B (en) Straight line detection method and device, computer storage medium and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant