WO2020232910A1 - Method and apparatus for target object counting based on image processing, device, and storage medium
- Publication number: WO2020232910A1 (application PCT/CN2019/103845)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- target
- target object
- recognized
- gradient
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/155—Segmentation; Edge detection involving morphological operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/30—Noise filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H70/00—ICT specially adapted for the handling or processing of medical references
- G16H70/60—ICT specially adapted for the handling or processing of medical references relating to pathologies
Definitions
- This application relates to the field of image detection technology, and in particular to a method, device, equipment, and storage medium for object statistics based on image processing.
- This application provides a method, device, equipment, and storage medium for counting target objects based on image processing, so as to improve the efficiency and accuracy of counting target objects.
- this application provides a method for counting target objects based on image processing, including:
- acquiring an image to be recognized, where the image to be recognized includes a target object
- this application also provides a device for counting objects based on image processing, including:
- An image acquisition unit for acquiring an image to be recognized, the image to be recognized includes a target;
- a preprocessing unit configured to perform noise reduction and color-inversion processing on the image to be recognized to obtain a target image;
- a construction processing unit configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid according to the Gaussian pyramid, and extract the local extrema of the target object according to the differential pyramid;
- a gradient calculation unit configured to calculate the image gradient of the boundary of the target object according to the local extremum of the target object
- the result statistics unit is configured to perform differentiated statistics on the target object according to the image gradient, and display the statistics result.
- the present application also provides a computer device, which includes a memory and a processor; the memory is used to store a computer program, and the processor is used to execute the computer program and, in doing so, implement the above-mentioned method for counting target objects based on image processing.
- the present application also provides a computer-readable storage medium that stores a computer program; when the computer program is executed by a processor, the processor implements the above-mentioned method for counting target objects based on image processing.
- the present application discloses a method, device, equipment and storage medium for object statistics based on image processing.
- the method includes: acquiring an image to be recognized, the image to be recognized including a target object; performing noise-reduction and color-inversion processing on the image to be recognized to obtain a target image; performing a convolution operation on the target image according to the Gauss-Laplace function to construct a Gaussian pyramid, constructing a differential pyramid based on the Gaussian pyramid, and extracting the local extremum of the target object based on the differential pyramid; calculating the image gradient of the boundary of the target object according to the local extremum of the target object; and performing differentiated statistics on the target object according to the image gradient and displaying the statistical result.
- This method can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the statistics of the objects, and reducing the burden of manual statistics.
- FIG. 1 is a schematic flowchart of a method for object statistics based on image processing provided by an embodiment of the present application
- FIG. 2 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
- FIG. 3a is a schematic diagram of the effect of an image to be recognized provided by an embodiment of the present application.
- FIG. 3b is a schematic diagram of the effect of the target image provided by the embodiment of the present application.
- FIG. 4 is a schematic diagram of the construction process of a differential pyramid provided by an embodiment of the present application.
- FIG. 5 is a schematic diagram of local extreme points of target cells in scale space and two-dimensional image space provided by an embodiment of this application;
- FIG. 6 is a schematic diagram of the structure of a cascade filter provided by an embodiment of the application.
- FIG. 7 is a schematic diagram of a local extremum area of target cells provided by an embodiment of the application.
- FIG. 8 is a schematic flowchart of sub-steps of the target object statistics method in FIG. 1;
- Figure 9a is a schematic diagram of a target image provided by an embodiment of the application.
- FIG. 9b is a schematic diagram of a sub-image provided by an embodiment of the application.
- FIG. 10 is a schematic diagram of several sub-images after cropping of a target image provided by an embodiment of the application.
- FIG. 11 is a schematic diagram of several target cells with contour buffers after the contour line of the target cell is expanded according to an embodiment of the application;
- FIG. 12 is a schematic diagram showing the effect of target cell statistical results provided by an embodiment of the application.
- FIG. 13 is a schematic diagram of several endothelial cells counted in an embodiment of the application.
- FIG. 14 is a schematic diagram of several mesangial cells counted in an embodiment of the application.
- FIG. 15 is a schematic diagram of several podocytes counted according to an embodiment of the application.
- FIG. 16 is a schematic flowchart of another object statistics method based on image processing provided by an embodiment of the application.
- FIG. 17 is a schematic block diagram of a device for counting objects based on image processing according to an embodiment of the application.
- FIG. 18 is a schematic block diagram of a preprocessing unit provided by an embodiment of this application.
- FIG. 19 is a schematic block diagram of a gradient calculation unit provided by an embodiment of the application.
- FIG. 20 is a schematic block diagram of another device for counting objects based on image processing provided by an embodiment of the application.
- FIG. 21 is a schematic block diagram of the structure of a computer device according to an embodiment of the application.
- the embodiments of the present application provide a method, device, equipment, and storage medium for object statistics based on image processing, which can be used to count objects on pathological images, and of course, can also be used for statistics on objects on other images.
- the target on the pathological image can be lymphocytes, DNA, RNA, chromosomes, and glomerular endothelial cells, mesangial cells, and podocytes.
- the following examples will take the endothelial cells, mesangial cells and podocytes of the glomerulus as targets for detailed introduction.
- the method for counting objects based on image processing can be applied to a terminal or a server, or the server and the terminal can be used interactively to quickly and accurately count the objects.
- the server and the terminal are used interactively, for example, the server sends the statistical results to the terminal for application.
- the server can be an independent server or a server cluster.
- the terminal can be an electronic device such as a tablet computer, a notebook computer, a desktop computer, a smart phone, or a wearable device.
- FIG. 1 is a schematic flowchart of a method for object statistics based on image processing provided by an embodiment of the present application.
- the object statistics method includes steps S101 to S105.
- Step S101 Obtain an image to be recognized.
- the image to be recognized is an image including a target object.
- the image to be recognized can be obtained by means of an ordinary camera, a scanner, a video camera, an image capture card, a microscopic digital camera, and the like.
- the acquisition process is: collecting a sample of living kidney tissue; staining the tissue to make a kidney section; and collecting an image of the stained kidney section with an image acquisition device as the image to be recognized. Because the endothelial cells, mesangial cells, and podocytes in the glomerulus usually number in the hundreds and are dark, hard to distinguish, and similar in appearance, the sample is stained before the image to be recognized is acquired.
- the staining agent used is Periodic Acid-Schiff stain (PAS); after PAS staining, the endothelial cells, mesangial cells, and podocytes of the glomeruli on the image to be recognized are highlighted in purple. The numbers of these cells can therefore be counted more intuitively, improving the accuracy of counting target cells.
- Step S102 Perform noise reduction and anti-color processing on the image to be identified to obtain a target image.
- the preprocessing includes noise-reduction processing and color-inversion processing; the noise reduction may use wavelet denoising, Fourier-transform denoising, or the like. The specific noise-reduction method is not limited here, as long as the noise-removal effect is achieved.
- the image to be recognized needs to be inverted.
- color inversion can be performed by calling an inversion processing function or a corresponding image-processing tool.
- as shown in FIG. 2, the step of performing noise reduction and color-inversion processing on the image to be recognized to obtain a target image includes sub-step S1021 and sub-step S1022.
- Step S1021 Smooth the image to be recognized by bilateral filtering to remove pseudo-point noise.
- Noise is mainly caused during sensor imaging by factors such as the same object having different spectra and different objects having the same spectrum. Common types of noise include salt-and-pepper noise and striped noise.
- this embodiment adopts a bilateral filtering algorithm to perform noise reduction processing on the image to be recognized.
- Bilateral filtering considers not only the distance between pixels but also the similarity between their gray levels; that is, bilateral filtering combines filtering over the spatial range with filtering over the gray-scale range. The expression for filtering in the spatial range is:
- w_s(i,j) = exp(−((i−x)² + (j−y)²) / (2σ_s²)), where w_s(i,j) is the spatial-domain weight, I(i,j) is the image to be recognized, and Ω is the neighborhood range at the pixel (x,y).
- Filtering in the gray-scale range is similar to filtering in the spatial range. The expression for filtering in the gray-scale range is:
- w_r(i,j) = exp(−(I(i,j) − I(x,y))² / (2σ_r²)), where w_r(i,j) is the gray-domain weight.
- The two weights are combined as w(i,j) = w_s(i,j)·w_r(i,j), the product of the spatial-domain and gray-domain weights, and the denoised image is the weighted average Î(x,y) = Σ_{(i,j)∈Ω} w(i,j)·I(i,j) / Σ_{(i,j)∈Ω} w(i,j), where Î is the image to be recognized after denoising.
- in areas where the image to be recognized changes relatively smoothly, the gray values of pixels in the neighborhood differ little and bilateral filtering degenerates into Gaussian low-pass filtering; in edge areas, the filter replaces the original gray value with the average gray value of pixels with similar gray levels in the neighborhood of the edge point. Therefore, the bilateral filter both protects the edge information of the image and smooths out the noise of the image to be recognized.
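As an illustration of the bilateral filter just described, the following sketch combines the spatial-domain and gray-domain weights in a direct (unoptimized) way; the function name and parameter defaults are illustrative, not taken from the patent.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=30.0, radius=3):
    """Minimal bilateral filter sketch: weights combine spatial distance
    (sigma_s) and gray-level similarity (sigma_r) over a (2r+1)^2 window."""
    img = img.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    # Precompute the spatial-domain weight w_s(i, j) once; it only
    # depends on the offset from the center pixel.
    ax = np.arange(-radius, radius + 1)
    dx, dy = np.meshgrid(ax, ax)
    ws = np.exp(-(dx**2 + dy**2) / (2 * sigma_s**2))
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            # Gray-domain weight w_r(i, j): similarity to the center pixel.
            wr = np.exp(-((patch - img[y, x])**2) / (2 * sigma_r**2))
            wgt = ws * wr
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out
```

On a flat region the gray weights are all 1 and the filter reduces to a Gaussian low-pass; across a strong edge the gray weights suppress pixels from the other side, which is why the edge survives.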
- Step S1022 Perform color-inversion processing on the image to be recognized after bilateral filtering to obtain a target image.
- the target cell area usually presents dark, nearly circular plaques of different sizes, in order to improve the statistical efficiency and accuracy of the target cells, the image to be recognized needs to be inverted.
- Each pixel in the image has four values, namely alpha, red, green, and blue. They are the basic elements that make up the color, and the value range of each element is [0,255].
- the inversion process subtracts the R, G, and B values of each pixel of the image to be recognized from 255, as shown in FIG. 3a and FIG. 3b: FIG. 3a is the image to be recognized before inversion, and FIG. 3b is the inverted image, which is the target image.
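The 255-minus-RGB inversion described above is a one-line array operation; this minimal sketch (function name illustrative) inverts the R, G, and B channels, while any alpha channel would be handled separately.

```python
import numpy as np

def invert_colors(img_rgb):
    """Color inversion: subtract each R, G, and B value from 255."""
    return 255 - img_rgb

# One RGB pixel as a 1x1 image.
pixel = np.array([[[200, 50, 120]]], dtype=np.uint8)
inverted = invert_colors(pixel)
```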
- Step S103 Perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid and construct a differential pyramid according to the Gaussian pyramid, and extract the local extrema of the target according to the differential pyramid.
- L(x,y,σ) = G(x,y,σ) * I(x,y), and D(x,y,σ) = (G(x,y,kσ) − G(x,y,σ)) * I(x,y) = L(x,y,kσ) − L(x,y,σ), where L(x,y,kσ) and L(x,y,σ) are Gaussian pyramid layers, D(x,y,σ) is the difference pyramid, G(x,y,kσ) and G(x,y,σ) are Gaussian kernel functions at adjacent scales, and I(x,y) is the input two-dimensional image, that is, the target image.
- the differential pyramid is constructed by taking the difference between the convolution results of two adjacent images of different scales in each layer of the Gaussian pyramid.
- the differential pyramid can be thought of as a visual mechanism that simulates the nerves on the retina to extract information from the image and then provide it to the brain.
- the difference pyramid can effectively detect stable local extreme points in the scale space, and then calculate the local extreme value of the target through the difference pyramid.
- the construction process of the Gaussian pyramid and the difference pyramid is as follows: local extremum extraction uses a Gaussian-Laplace (Laplacian of Gaussian, LOG) function for convolution processing.
- the Laplacian smoothed by a Gaussian is a second-order differential operator that produces a steep zero crossing (that is, a crossing from positive to negative) at an edge; extreme-point detection is performed based on these zero crossings.
- the convolution operation between an image and a two-dimensional function is actually to find the similarity between the image and this function.
- the Gauss-Laplacian operator and the Gaussian kernel have the following relationship: σ∇²G = ∂G/∂σ ≈ (G(x,y,kσ) − G(x,y,σ)) / ((k−1)σ), so that G(x,y,kσ) − G(x,y,σ) ≈ (k−1)σ²∇²G.
- Gaussian convolution kernel is the only linear transformation kernel that realizes scale transformation
- an image can be expressed in scale space as the convolution of the image with a variable Gaussian kernel function, with the Gaussian pyramid operator used as shown in formula (5) above.
- Gauss-Laplacian operator can be approximately replaced by the Difference of Gaussian (DOG) operator.
- this embodiment further constructs a difference pyramid to count the target objects quickly and effectively. Moreover, by fusing multi-scale difference pyramids in the convolution operation, the extremum centers of targets of different sizes can be identified robustly and effectively, so that the center and size (radius) of each target can be obtained.
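To make the construction concrete, here is a small NumPy sketch of one octave of a Gaussian pyramid and its difference (DoG) layers, following the S+3 Gaussian / S+2 difference layout described below; the function names and the choice σ0 = 1.6 are illustrative assumptions, not values from the patent.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D Gaussian kernel; a 2-D blur is done as two 1-D passes (separable)."""
    if radius is None:
        radius = int(3 * sigma + 0.5)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with edge padding."""
    k = gaussian_kernel(sigma)
    pad = len(k) // 2
    tmp = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode="edge"), k, mode="valid"),
        1, img.astype(np.float64))
    return np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode="edge"), k, mode="valid"),
        0, tmp)

def dog_octave(img, sigma0=1.6, s=2):
    """One octave: s+3 Gaussian layers at scales sigma0 * k^i (k = 2^(1/s)),
    and s+2 DoG layers as differences of adjacent Gaussian layers."""
    k = 2.0 ** (1.0 / s)
    gauss = [gaussian_blur(img, sigma0 * k**i) for i in range(s + 3)]
    dog = [g2 - g1 for g1, g2 in zip(gauss, gauss[1:])]
    return gauss, dog
```

The next octave would take a layer of this octave down-sampled by a factor of 2 as its input image.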
- FIG. 5 is a schematic diagram of the local extreme points of the target cell in the scale space and the two-dimensional image space, where L1 is the neighboring point of the previous scale, L2 is the neighboring point of the same scale space, and L3 is the neighboring point of the next scale.
- the image scale space is formed by convolving the image with Gaussian filters with variable kernels to obtain the Gaussian pyramid of the image. For example, the Gaussian pyramid is divided into O groups, each with S+3 layers, where S is the number of layers between σ and 2σ; S is generally 2 or 3.
- the formation of each group of Gaussian pyramids is obtained by convolution of the input image and the cascade filter. In the following, taking S as 2 as an example, the formation process of the Gaussian pyramid will be described in detail.
- each group of Gaussian pyramids has 5 layers, which are obtained by four-stage cascade filters.
- the formation process of the Gaussian pyramid for the first group is shown in Figure 6.
- the image size of each layer of the second group of Gaussian pyramids is 1/4 of the image size of the first group of Gaussian pyramids;
- the input image of the second group is the S-th layer image of the first group, obtained through a down-sampling process with a sampling rate of 2. That is, the input image I0' of the second group of Gaussian pyramids is obtained by sampling the image I1 mentioned above.
- a total of 4 sets of Gaussian pyramids are generated.
- the difference pyramid, constructed indirectly via the Gauss-Laplace function or directly from the Gaussian pyramid, can be used to extract the local extrema of the target object.
- the extraction effect of the local extremum of the target cell in this embodiment can be seen in FIG. 7.
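The extremum search illustrated in FIG. 5 compares each point against its 8 neighbours at the same scale and the 9 neighbours in each adjacent scale, 26 comparisons in total. A direct, unoptimized implementation might look like this (function name and threshold are illustrative assumptions):

```python
import numpy as np

def local_extrema(dog, threshold=0.01):
    """Detect local extrema across scale and image space: a point in DoG
    layer s is kept only if it is the unique maximum (or minimum) of the
    3x3x3 cube formed by its own layer and the two adjacent layers."""
    stack = np.stack(dog)                      # shape (S, H, W)
    S, H, W = stack.shape
    points = []
    for s in range(1, S - 1):                  # need a layer above and below
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                v = stack[s, y, x]
                if abs(v) < threshold:         # reject weak responses
                    continue
                cube = stack[s - 1:s + 2, y - 1:y + 2, x - 1:x + 2]
                unique = (cube == v).sum() == 1
                if unique and (v == cube.max() or v == cube.min()):
                    points.append((s, y, x, v))
    return points
```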
- Step S104 Calculate the image gradient of the boundary of the target object according to the local extremum of the target object.
- the target boundary refers to the boundary area of the target, such as the contour buffer of the target.
- specifically, the image gradient of the target-object boundary can be calculated from the local extremum of the target object using an image gradient algorithm.
- step S104 includes sub-step S1041 to sub-step S1043.
- Step S1041 According to the local extremum of the target object, crop sub-images containing the target object from the target image.
- specifically, a rectangular area corresponding to the target object is generated from the local extremum of the target object and the circumscribed rectangle of the target object; the target object on the target image is cropped according to this rectangular area to obtain the sub-image of the target object.
- that is, the target image is cropped by taking the local extremum point of the target object as the center and expanding the circumscribed rectangle of the target object outward by a rectangular area of M*N pixels,
- M and N are both positive integers.
- the values of M and N can be equal.
- Fig. 9a is a target image.
- the circumscribed rectangle of the target is expanded by a rectangular area of 5*5 pixels to obtain a sub-image of the target.
- the circumscribed rectangle is the inner rectangle shown in Fig. 9b.
- the rectangular area is the area enclosed by the outer rectangle shown in FIG. 9b, that is, the circumscribed rectangle expanded outward by 5*5 pixels.
- each sub-image can then be uniformly resampled to H*D pixels, where H and D are both positive integers, thereby improving the efficiency and accuracy of counting the target objects.
- FIG. 10 shows several sub-images cropped from the target image and uniformly resampled to 80*80 pixels.
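A minimal sketch of the cropping and uniform-resampling steps above; the margin of 5 pixels mirrors the 5*5 expansion in the example, and the names and the nearest-neighbour resampling choice are assumptions of this sketch.

```python
import numpy as np

def crop_subimage(img, center, half_size, margin=5):
    """Crop a square region around a local-extremum point: the circumscribed
    rectangle (side 2*half_size) expanded outward by `margin` pixels on each
    side, clipped to the image bounds. `half_size` would come from the
    detected scale (radius) of the target."""
    y, x = center
    r = half_size + margin
    y0, y1 = max(0, y - r), min(img.shape[0], y + r + 1)
    x0, x1 = max(0, x - r), min(img.shape[1], x + r + 1)
    return img[y0:y1, x0:x1]

def resize_nearest(img, out_h=80, out_w=80):
    """Nearest-neighbour resampling to a uniform size (e.g. 80x80 pixels)."""
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[rows][:, cols]
```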
- Step S1042 Extract the contour line of the target object in the sub-image according to the watershed segmentation algorithm, and extract the contour buffer of the target object centered on the contour line according to the morphological dilation operation.
- specifically, the contour line of each target cell is extracted by taking the local extremum center of the target cell in the sub-image as the seed point; then, centered on that contour line, a morphological dilation operation is performed with L x L structural elements (3x3 structural elements can be used) to thicken the extracted contour both inside and outside the contour line, yielding the contour buffer of each target cell, as shown in FIG. 11.
- mathematical image morphology uses structural elements of a certain shape to measure and extract the corresponding target contours in the image.
- Mathematical morphology operations mainly include erosion, dilation, opening, and closing; based on these basic operations, various practical mathematical-morphology algorithms can be combined and derived to analyze and process image shape and structure.
- Dilation uses vector addition to merge two sets: the dilation of X by B is the set of all possible vector sums, with one operand taken from the set X and the other from the structural element B.
- the expression of the dilation operation is as follows: X ⊕ B = { x + b | x ∈ X, b ∈ B }.
- Dilation can fill small holes in the image (holes that are small relative to the size of the structural element) and fine depressions at the edges of the image, and it has an outward filtering (enlarging) effect on the image.
- Erosion uses vector subtraction on set elements to merge two sets. Erosion is the dual operation of dilation, but erosion and dilation are not mutually inverse operations.
- the expression of the erosion operation is as follows: X ⊖ B = { x | x + b ∈ X for every b ∈ B }.
- Erosion can eliminate the smaller parts of the image, has a filtering effect on the image, and shrinks the image.
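The set-based definitions of dilation and erosion translate directly into shifted-OR and shifted-AND operations on a binary image. A small sketch (names illustrative; a symmetric structural element is assumed, so reflection of B can be ignored):

```python
import numpy as np

def dilate(binary, struct):
    """Binary dilation (vector addition of set X and element B): output is 1
    wherever the structural element, centred on the pixel, overlaps at least
    one foreground pixel."""
    rb, cb = struct.shape[0] // 2, struct.shape[1] // 2
    pad = np.pad(binary, ((rb, rb), (cb, cb)))
    out = np.zeros_like(binary)
    for dy in range(struct.shape[0]):
        for dx in range(struct.shape[1]):
            if struct[dy, dx]:
                out |= pad[dy:dy + binary.shape[0], dx:dx + binary.shape[1]]
    return out

def erode(binary, struct):
    """Binary erosion, the dual of dilation: output is 1 only where every
    active position of the structural element lies on foreground."""
    rb, cb = struct.shape[0] // 2, struct.shape[1] // 2
    pad = np.pad(binary, ((rb, rb), (cb, cb)))
    out = np.ones_like(binary)
    for dy in range(struct.shape[0]):
        for dx in range(struct.shape[1]):
            if struct[dy, dx]:
                out &= pad[dy:dy + binary.shape[0], dx:dx + binary.shape[1]]
    return out
```

Dilating an isolated pixel with a 3x3 element grows it into a 3x3 block; eroding that block shrinks it back to the single pixel, illustrating that the two operations are duals but not exact inverses in general.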
- Step S1043 Calculate the image gradient of the contour buffer of the target object according to the image gradient algorithm.
- a classic image gradient algorithm considers the gray-level changes in a neighborhood of each pixel of the image and, using the first-order or second-order derivative behavior near an edge, defines a gradient operator that is convolved with a neighborhood of each pixel to compute the gradient.
- the morphological gradient combines dilation or erosion with image differencing to enhance the intensity variations in the neighborhood of the structural element.
- in this embodiment, the basic dilation and erosion operations are combined to calculate the image gradient of the contour buffer.
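Combining grayscale dilation (a local maximum) and grayscale erosion (a local minimum) as described gives the morphological gradient, which is large exactly where intensity changes sharply. A self-contained sketch with a flat structural element (names illustrative):

```python
import numpy as np

def morphological_gradient(img, size=3):
    """Morphological gradient sketch: grayscale dilation (local maximum)
    minus grayscale erosion (local minimum) over a size x size flat
    structural element; large values mark intensity edges."""
    r = size // 2
    img = img.astype(np.int32)
    pad = np.pad(img, r, mode="edge")
    # Collect all shifted views covered by the structural element.
    shifts = [pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
              for dy in range(size) for dx in range(size)]
    dilated = np.maximum.reduce(shifts)   # grayscale dilation
    eroded = np.minimum.reduce(shifts)    # grayscale erosion
    return dilated - eroded
```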
- Step S105 Perform distinguishing statistics on the target object according to the image gradient, and display the statistics result.
- specifically, the target objects are distinguished according to the image gradient to obtain the target type corresponding to each target object, and the number of target objects of each type is counted.
- displaying the statistical results can directly display the number of targets of each target type, or display in combination with graphics, as shown in FIG. 12.
- the performing differentiated statistics on the target object according to the image gradient includes:
- the target types are glomerular endothelial cells, mesangial cells, and podocytes.
- the prior knowledge about the three types of target cells can be encoded as follows:
- FIG. 13 shows the counted number of endothelial cells.
- the boundary of an endothelial cell lies between the bright background and the dark cell, and the corresponding preset image gradient range is: gradient value greater than or equal to 192.
- the preset image gradient range may also include other conditions; for example, if more than 20% of the pixels in the contour buffer of a target cell have a gradient value greater than or equal to 192, the cell can be judged to be an endothelial cell;
- FIG. 14 shows several mesangial cells counted.
- the preset image gradient range corresponding to mesangial cells, which lie in the purple-red mesangial area, is: gradient value not exceeding 64.
- the preset image gradient range may also require that more than 20% of the pixels in the contour buffer of a target cell have gradient values not exceeding 64, in which case the cell can be judged to be a mesangial cell;
- FIG. 15 shows the counted podocytes.
- podocytes lie in the lavender non-mesangial area, and the corresponding preset image gradient range is: 50 ≤ gradient value ≤ 128.
- the preset image gradient range may also require that more than 80% of the pixels in the contour buffer of a target cell have gradient values between 50 and 128, in which case the cell can be determined to be a podocyte.
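The three rules above can be encoded as simple threshold tests on the gradient values inside a cell's contour buffer. The thresholds (192, 64, 50-128) and percentages (20%, 80%) come from the text; the function name and the order in which the rules are tried are assumptions of this sketch.

```python
import numpy as np

def classify_cell(buffer_gradients):
    """Classify one cell from the gradient values in its contour buffer,
    encoding the prior knowledge described above. The tie-breaking order
    (endothelial, then mesangial, then podocyte) is an assumption."""
    g = np.asarray(buffer_gradients, dtype=np.float64)
    if (g >= 192).mean() > 0.20:           # bright/dark boundary pixels
        return "endothelial"
    if (g <= 64).mean() > 0.20:            # low-gradient mesangial area
        return "mesangial"
    if ((g >= 50) & (g <= 128)).mean() > 0.80:  # mid-range, non-mesangial
        return "podocyte"
    return "unknown"
```

Counting then reduces to applying this function to every detected cell and tallying the returned types.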
- the object statistics method based on image processing of the present application can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the object statistics, and reducing the burden of manual statistics.
- FIG. 16 is a schematic flowchart of another method for object statistics based on image processing provided by an embodiment of the application.
- the method for counting objects includes steps S201 to S206.
- Step S201 Obtain an image to be recognized, and the image to be recognized includes a target object.
- the image to be recognized is an image including a target object, and specifically, an image obtained by collecting a sample including the target object by an image acquisition device is used as the image to be recognized.
- Step S202 Perform noise reduction and color inversion processing on the image to be identified to obtain a target image.
- the preprocessing includes noise reduction processing and color inversion processing.
- Step S203 Perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid and construct a differential pyramid according to the Gaussian pyramid, and extract a local extremum of the target object according to the differential pyramid.
- the difference pyramid is constructed by taking the difference between the convolution results of two adjacent images of different scales in each layer of the Gaussian pyramid.
- the difference pyramid can effectively detect stable local extreme points in the scale space, and then calculate the local extreme value of the target through the difference pyramid.
- Step S204 based on the non-maximum value suppression algorithm, perform deduplication processing on the target object according to the local extremum of the target object.
- a non-maximum value suppression algorithm is used to achieve deduplication.
- the non-maximum suppression algorithm can be understood as a local-maximum search, where "local" refers to a neighborhood with two variable parameters: the dimension of the neighborhood and its size.
- first, the search range is set to 2n−1, where n is greater than or equal to 1;
- a point is a local maximum if its value is greater than the values of its adjacent points; once a point is determined to be a local maximum, the iteration parameter can be increased by 2 to look for the next local maximum;
- the iteration parameter starts from the left. If the value of the current point is not greater than the value of the point to its right, the iteration parameter is increased by 1 until the value of the current point is greater than the value on its right, which gives a local optimum; the previous step is then repeated until all values have been traversed.
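For the simplest case n = 1 (a three-point neighbourhood), the scan described above can be sketched as follows; the jump of 2 after each accepted maximum is what makes the search cheaper than testing every point (function name illustrative):

```python
def nms_1d(values):
    """1-D non-maximum suppression sketch for a three-point neighbourhood:
    keep indices whose value is strictly greater than both neighbours,
    advancing by 2 past each accepted maximum."""
    peaks = []
    i = 1
    while i < len(values) - 1:
        if values[i] <= values[i + 1]:
            i += 1            # not greater than the right neighbour: advance
            continue
        if values[i] > values[i - 1]:
            peaks.append(i)   # greater than both neighbours: local maximum
        i += 2                # the next point cannot itself be a maximum
    return peaks
```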
- Step S205 Extract the image gradient of the boundary of the target object after de-duplication processing according to the local extremum of the target object.
- specifically, the sub-image is cropped according to the local extremum of the target; the contour line of the target in the sub-image is extracted according to the watershed segmentation algorithm; the contour buffer of the target is extracted according to the morphological dilation operation; and then the image gradient of the contour buffer of the target is calculated according to the image gradient algorithm.
- Step S206 Perform differentiated statistics on the target object according to the image gradient, and display the statistical result.
- specifically, the target type corresponding to the target object and the preset image gradient range corresponding to that target type are acquired; differentiated statistics are then performed on the target objects according to the image gradient and the preset image gradient range, and the statistical results are displayed.
- the statistical result can be displayed on the target image or on the display interface through the terminal.
- the target object statistics based on image processing of the present application can quickly and accurately distinguish and count the objects in the target image, thereby improving the efficiency and accuracy of the target object statistics, and reducing the burden of manual statistics.
- FIG. 17 is a schematic block diagram of an image processing-based target statistics device provided by an embodiment of the present application.
- the target object statistics device 300 is configured to perform any of the aforementioned image processing-based target counting methods.
- the object statistics device 300 includes: an image acquisition unit 301, a preprocessing unit 302, a construction processing unit 303, a gradient calculation unit 304, and a result statistics unit 305.
- the image acquisition unit 301 is configured to acquire an image to be identified, and the image to be identified includes a target object.
- the preprocessing unit 302 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
- the preprocessing unit 302 includes a denoising unit 3021 and an inversion processing unit 3022.
- the denoising unit 3021 is used to smooth the image to be recognized by bilateral filtering to filter out false point noise; the color inversion processing unit 3022 is used to perform the color inversion processing on the image to be recognized after the bilateral filtering process to obtain the target image.
- the construction processing unit 303 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, to construct a differential pyramid according to the Gaussian pyramid, and to extract the local extrema of the target object according to the differential pyramid.
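The pyramid construction can be sketched as follows. For brevity this sketch keeps a single octave (a stack of increasingly blurred images and their pairwise differences) and detects extrema within one difference layer only, rather than across neighbouring scales as a full implementation would; the sigma schedule and the threshold are illustrative assumptions:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian convolution: filter rows, then columns."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def difference_pyramid(img, sigma0=1.0, k=1.4, levels=3):
    """Gaussian stack and its adjacent differences (difference-of-Gaussian layers)."""
    gauss = [gaussian_blur(img.astype(float), sigma0 * k ** i) for i in range(levels)]
    return [b - a for a, b in zip(gauss, gauss[1:])]

def local_extrema(layer, thresh=1.0):
    """Pixels that are a strict max or min of their 3x3 neighbourhood and
    exceed a response threshold — candidate target-object locations."""
    pts = []
    for i in range(1, layer.shape[0] - 1):
        for j in range(1, layer.shape[1] - 1):
            nb = layer[i - 1:i + 2, j - 1:j + 2]
            v = layer[i, j]
            if abs(v) > thresh and (v == nb.max() or v == nb.min()):
                pts.append((i, j))
    return pts
```

An isolated bright point produces a strong difference-of-Gaussian response at its own location, which is how blob-like targets surface as local extrema.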
- the gradient calculation unit 304 is configured to calculate the image gradient of the boundary of the target object according to the local extremum of the target object.
- the gradient calculation unit 304 includes a sub-image cropping unit 3041, a buffer extraction unit 3042, and a buffer calculation unit 3043.
- the sub-image cropping unit 3041 is used to crop a sub-image containing the target object from the target image according to the local extrema of the target object;
- the buffer extraction unit 3042 is used to extract the contour of the target object in the sub-image according to the watershed segmentation algorithm, and to extract the contour buffer of the target object, centered on the contour, according to a morphological dilation operation;
- the buffer calculation unit 3043 is used to calculate the image gradient of the contour buffer of the target object according to an image gradient algorithm.
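A sketch of the buffer-extraction and gradient-calculation steps. Watershed segmentation itself is involved, so here the object mask is assumed to be given (e.g. produced by the watershed step); the buffer is built with a plain 3x3 binary dilation, and the buffer width is an illustrative parameter:

```python
import numpy as np

def dilate(mask, iterations=1):
    """Morphological dilation with a 3x3 square structuring element."""
    m = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(m, 1)
        m = (p[:-2, :-2] | p[:-2, 1:-1] | p[:-2, 2:]
             | p[1:-1, :-2] | p[1:-1, 1:-1] | p[1:-1, 2:]
             | p[2:, :-2] | p[2:, 1:-1] | p[2:, 2:])
    return m

def contour_buffer(mask, width=2):
    """Ring of `width` pixels on each side of the object boundary."""
    m = mask.astype(bool)
    outer = dilate(m, width)
    inner = m & ~dilate(~m, width)  # erosion expressed via its dual
    return outer & ~inner

def mean_boundary_gradient(img, mask, width=2):
    """Mean gradient magnitude inside the contour buffer of one object."""
    gy, gx = np.gradient(img.astype(float))
    magnitude = np.hypot(gx, gy)
    return magnitude[contour_buffer(mask, width)].mean()
```

Restricting the gradient statistic to a thin band around the contour is what lets the method characterize boundary sharpness per object rather than averaging over the whole sub-image.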
- the result statistics unit 305 is configured to perform differentiated statistics on the target object according to the image gradient, and display the statistics result.
- the image processing-based target statistics device of the present application has a high degree of intelligent statistics, low data storage space requirements, and fast processing speed, and can accurately count targets and reduce the burden of manual statistics.
- FIG. 20 is a schematic block diagram of another object statistics device based on image processing provided by an embodiment of the application.
- the object statistics device 400 is configured to execute any one of the aforementioned methods for object statistics based on image processing.
- the object statistics device 400 includes: an image acquisition unit 401, a preprocessing unit 402, a construction processing unit 403, a deduplication unit 404, a gradient calculation unit 405, and a result statistics unit 406.
- the image acquisition unit 401 is configured to acquire an image to be identified, and the image to be identified includes a target object.
- the preprocessing unit 402 is configured to perform noise reduction and color inversion processing on the image to be recognized to obtain a target image.
- the construction processing unit 403 is configured to perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, to construct a differential pyramid according to the Gaussian pyramid, and to extract the local extrema of the target object according to the differential pyramid.
- the deduplication unit 404 is configured to perform deduplication processing on the target object according to the local extrema of the target object, based on a non-maximum suppression algorithm.
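The non-maximum-suppression deduplication can be sketched as follows for point detections: of any cluster of nearby extrema that respond to the same object, only the strongest is kept. The distance threshold is an illustrative parameter:

```python
import numpy as np

def nms_points(points, scores, min_dist=5.0):
    """Greedy non-maximum suppression: repeatedly keep the highest-scoring
    remaining point and suppress every point within min_dist of it."""
    pts = np.asarray(points, dtype=float)
    suppressed = np.zeros(len(points), dtype=bool)
    keep = []
    for idx in np.argsort(scores)[::-1]:  # strongest response first
        if suppressed[idx]:
            continue
        keep.append(points[idx])
        dist = np.hypot(*(pts - pts[idx]).T)  # distance to every point
        suppressed |= dist < min_dist
    return keep
```

For example, with points `[(0, 0), (1, 1), (10, 10)]`, scores `[1.0, 0.9, 0.8]`, and `min_dist=3.0`, the near-duplicate `(1, 1)` is suppressed and `[(0, 0), (10, 10)]` survives, so each target object is counted once.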
- the gradient calculation unit 405 is configured to calculate the image gradient of the boundary of the target object after deduplication processing according to the local extremum of the target object.
- the result statistics unit 406 is configured to perform differentiated statistics on the target object according to the image gradient, and display the statistics result.
- the image processing-based object statistics device of the present application provides a fast and effective tool for counting objects in images without model training, and achieves high processing efficiency and accuracy.
- the above-mentioned object statistics device may be implemented in the form of a computer program, and the computer program may run on the computer device as shown in FIG. 21.
- FIG. 21 is a schematic block diagram of a computer device according to an embodiment of the present application.
- the computer device may be a server or a terminal.
- the computer device includes a processor, a memory, and a network interface connected through a system bus, where the memory may include a non-volatile storage medium and an internal memory.
- the non-volatile storage medium can store an operating system and a computer program.
- the computer program includes program instructions, and when the program instructions are executed, the processor can execute any object statistical method based on image processing.
- the processor is used to provide computing and control capabilities and support the operation of the entire computer equipment.
- the internal memory provides an environment for the operation of the computer program in the non-volatile storage medium.
- when the computer program in the non-volatile storage medium is executed by the processor, the processor can execute any of the image processing-based object statistics methods.
- the network interface is used for network communication, such as sending assigned tasks.
- FIG. 21 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the computer device to which the solution of the present application is applied.
- a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
- the processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
- the general-purpose processor may be a microprocessor or the processor may also be any conventional processor.
- the processor is used to run a computer program stored in a memory to implement the following steps:
- acquire an image to be recognized, the image to be recognized including a target object; perform noise reduction and color inversion processing on the image to be recognized to obtain a target image; perform a convolution operation on the target image according to the Gauss-Laplacian function to construct a Gaussian pyramid, construct a differential pyramid according to the Gaussian pyramid, and extract the local extrema of the target object according to the differential pyramid; calculate the image gradient of the target object boundary according to the local extrema of the target object; perform distinguishing statistics on the target object according to the image gradient, and display the statistical results.
- when the processor implements the noise reduction and color inversion processing on the image to be recognized to obtain the target image, it is configured to implement:
- the image to be recognized is smoothed through bilateral filtering to filter out pseudo-point noise, and color inversion processing is performed on the bilaterally filtered image to obtain the target image.
- when the processor implements the calculation of the image gradient of the target object boundary according to the local extrema of the target object, it is configured to implement:
- a sub-image containing the target object is cropped from the target image according to the local extrema of the target object; the contour of the target object in the sub-image is extracted according to the watershed segmentation algorithm, and the contour buffer of the target object, centered on the contour, is extracted according to a morphological dilation operation; the image gradient of the contour buffer of the target object is calculated according to an image gradient algorithm.
- when the processor implements the cropping of the sub-image containing the target object from the target image according to the local extrema of the target object, it is configured to implement:
- before the processor implements the calculation of the image gradient of the target object boundary according to the local extrema of the target object, it is further configured to implement: performing deduplication processing on the target object according to the local extrema of the target object based on a non-maximum suppression algorithm.
- when the processor implements the calculation of the image gradient of the target object boundary according to the local extrema of the target object, it is configured to implement: calculating the image gradient of the boundary of the deduplicated target object according to the local extrema of the target object.
- when the processor implements the distinguishing statistics on the target object according to the image gradient, it is configured to implement: acquiring the target type corresponding to the target object and the preset image gradient range corresponding to the target type, performing distinguishing statistics on the target object according to the image gradient and the preset image gradient range, and displaying the statistical results.
- the embodiments of the present application also provide a computer-readable storage medium storing a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement any of the image processing-based object statistics methods provided in the embodiments of the present application.
- the computer-readable storage medium may be the internal storage unit of the computer device described in the foregoing embodiment, such as the hard disk or memory of the computer device.
- the computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a smart memory card (SMC), a Secure Digital (SD) card, a flash card, etc. equipped on the computer device.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The present application relates to a target counting method and apparatus based on image processing, a device, and a storage medium. The method comprises: obtaining an image to be identified, the image to be identified comprising targets; processing the image to be identified to obtain a target image; performing a convolution operation on the target image to construct a Gaussian pyramid and a difference-of-Gaussians pyramid, and extracting local extrema of the targets according to the difference-of-Gaussians pyramid; and calculating image gradients of target boundaries according to the local extrema, so as to distinguish and count the targets and display the counting result.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910420808.8A CN110298817A (zh) | 2019-05-20 | 2019-05-20 | 基于图像处理的目标物统计方法、装置、设备及存储介质 |
CN201910420808.8 | 2019-05-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020232910A1 true WO2020232910A1 (fr) | 2020-11-26 |
Family
ID=68026982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/103845 WO2020232910A1 (fr) | 2019-05-20 | 2019-08-30 | Procédé et appareil de comptage de cibles basés sur un traitement d'image, dispositif et support de stockage |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110298817A (fr) |
WO (1) | WO2020232910A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112712465A (zh) * | 2020-12-31 | 2021-04-27 | 四川长虹网络科技有限责任公司 | 一种优化拍照式抄表终端通信数据量的方法和系统 |
CN114137984A (zh) * | 2021-11-29 | 2022-03-04 | 江苏科技大学 | 一种模块化传输平台及其控制方法和路径规划方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112541507B (zh) * | 2020-12-17 | 2023-04-18 | 中国海洋大学 | 多尺度卷积神经网络特征提取方法、系统、介质及应用 |
CN113222853B (zh) * | 2021-05-26 | 2022-07-12 | 武汉博宇光电系统有限责任公司 | 一种基于噪声估计的渐进式红外图像降噪方法 |
CN115311228A (zh) * | 2022-08-05 | 2022-11-08 | 山东省产品质量检验研究院 | 基于matlab图像边缘检测的球压压痕测量方法及系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090041376A1 (en) * | 2007-08-03 | 2009-02-12 | Joan Elizabeth Carletta | Method for real-time implementable local tone mapping for high dynamic range images |
CN103218831A (zh) * | 2013-04-21 | 2013-07-24 | 北京航空航天大学 | 一种基于轮廓约束的视频运动目标分类识别方法 |
CN103295224A (zh) * | 2013-03-14 | 2013-09-11 | 北京工业大学 | 一种基于均值漂移和分水岭的乳腺超声图像自动分割方法 |
CN104658011A (zh) * | 2015-01-31 | 2015-05-27 | 北京理工大学 | 一种智能交通运动目标检测跟踪方法 |
CN107092871A (zh) * | 2017-04-06 | 2017-08-25 | 重庆市地理信息中心 | 基于多尺度多特征融合的遥感影像建筑物检测方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101837286B1 (ko) * | 2016-08-10 | 2018-03-09 | 한국과학기술원 | 라플라시안 패치 기반 이미지 합성 방법 및 장치 |
CN107665479A (zh) * | 2017-09-05 | 2018-02-06 | 平安科技(深圳)有限公司 | 一种特征提取方法、全景拼接方法及其装置、设备及计算机可读存储介质 |
CN109145929A (zh) * | 2017-10-09 | 2019-01-04 | 苏州高科中维软件科技有限公司 | 一种基于sift尺度空间特征信息提取方法 |
2019
- 2019-05-20 CN CN201910420808.8A patent/CN110298817A/zh active Pending
- 2019-08-30 WO PCT/CN2019/103845 patent/WO2020232910A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090041376A1 (en) * | 2007-08-03 | 2009-02-12 | Joan Elizabeth Carletta | Method for real-time implementable local tone mapping for high dynamic range images |
CN103295224A (zh) * | 2013-03-14 | 2013-09-11 | 北京工业大学 | 一种基于均值漂移和分水岭的乳腺超声图像自动分割方法 |
CN103218831A (zh) * | 2013-04-21 | 2013-07-24 | 北京航空航天大学 | 一种基于轮廓约束的视频运动目标分类识别方法 |
CN104658011A (zh) * | 2015-01-31 | 2015-05-27 | 北京理工大学 | 一种智能交通运动目标检测跟踪方法 |
CN107092871A (zh) * | 2017-04-06 | 2017-08-25 | 重庆市地理信息中心 | 基于多尺度多特征融合的遥感影像建筑物检测方法 |
Non-Patent Citations (1)
Title |
---|
ZHANG, ZHIQIANG ET AL.: "A modified bilateral filtering algorithm", JOURNAL OF IMAGE AND GRAPHICS, vol. 14, no. 3, 31 March 2009 (2009-03-31), DOI: 20200215233621A * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112712465A (zh) * | 2020-12-31 | 2021-04-27 | 四川长虹网络科技有限责任公司 | 一种优化拍照式抄表终端通信数据量的方法和系统 |
CN112712465B (zh) * | 2020-12-31 | 2023-08-04 | 四川长虹网络科技有限责任公司 | 一种优化拍照式抄表终端通信数据量的方法和系统 |
CN114137984A (zh) * | 2021-11-29 | 2022-03-04 | 江苏科技大学 | 一种模块化传输平台及其控制方法和路径规划方法 |
CN114137984B (zh) * | 2021-11-29 | 2024-02-27 | 江苏科技大学 | 一种模块化传输平台及其控制方法和路径规划方法 |
Also Published As
Publication number | Publication date |
---|---|
CN110298817A (zh) | 2019-10-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020232910A1 (fr) | Procédé et appareil de comptage de cibles basés sur un traitement d'image, dispositif et support de stockage | |
Lopez-Molina et al. | Multiscale edge detection based on Gaussian smoothing and edge tracking | |
CN108765343B (zh) | 图像处理的方法、装置、终端及计算机可读存储介质 | |
Sun et al. | Gradient profile prior and its applications in image super-resolution and enhancement | |
Ghosh et al. | Fast scale-adaptive bilateral texture smoothing | |
Yang et al. | Constant time median and bilateral filtering | |
CN111145209A (zh) | 一种医学图像分割方法、装置、设备及存储介质 | |
US8406518B2 (en) | Smoothed local histogram filters for computer graphics | |
CN111402170A (zh) | 图像增强方法、装置、终端及计算机可读存储介质 | |
Chen et al. | Fast defocus map estimation | |
CN112150371B (zh) | 图像降噪方法、装置、设备及存储介质 | |
Liu et al. | Automatic blur-kernel-size estimation for motion deblurring | |
Deshpande et al. | A novel modified cepstral based technique for blind estimation of motion blur | |
WO2022233185A1 (fr) | Procédé et appareil de filtrage d'image, terminal et support de stockage lisible par ordinateur | |
Hacini et al. | A 2D-fractional derivative mask for image feature edge detection | |
CN107123124A (zh) | 视网膜图像分析方法、装置和计算设备 | |
Gu et al. | A novel total generalized variation model for image dehazing | |
WO2024001538A1 (fr) | Procédé et appareil de détection de rayure, dispositif électronique et support de stockage lisible | |
Oprisescu et al. | Automatic pap smear nuclei detection using mean-shift and region growing | |
Zingman et al. | Detection of texture and isolated features using alternating morphological filters | |
CN111311610A (zh) | 图像分割的方法及终端设备 | |
CN115841632A (zh) | 输电线路提取方法、装置以及双目测距方法 | |
Gao et al. | Multiscale phase congruency analysis for image edge visual saliency detection | |
He et al. | Structure-preserving texture smoothing with scale-aware intensity aggregation structure measurement | |
CN112967321A (zh) | 运动目标的检测方法、装置、终端设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19929337 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19929337 Country of ref document: EP Kind code of ref document: A1 |