CN113902817A - Cell picture splicing method based on gray value - Google Patents
Cell picture splicing method based on gray value
- Publication number
- CN113902817A (application CN202111390285.0A)
- Authority
- CN
- China
- Prior art keywords
- cell
- picture
- cell picture
- area
- splicing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 43
- 238000001914 filtration Methods 0.000 claims abstract description 8
- 238000006243 chemical reaction Methods 0.000 claims abstract description 7
- 238000004364 calculation method Methods 0.000 claims description 8
- 238000003066 decision tree Methods 0.000 claims description 7
- 210000004027 cell Anatomy 0.000 description 63
- 239000000975 dye Substances 0.000 description 2
- 238000004043 dyeing Methods 0.000 description 2
- 238000007792 addition Methods 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 210000003743 erythrocyte Anatomy 0.000 description 1
- 239000000284 extract Substances 0.000 description 1
- 239000012535 impurity Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000013138 pruning Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000011410 subtraction method Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a cell picture splicing method based on gray values, characterized by comprising the steps of: (1) shooting and collecting a cell picture; (2) performing color conversion on the cell picture, converting it from the RGB (red, green, blue) channels to a gray channel; (3) removing noise from the cell picture by Gaussian or mean filtering; (4) performing dislocation subtraction on the cell pictures to eliminate brightness differences; (5) randomly selecting an area A on the cell picture, moving a sliding window within area A multiple times, calculating the mean square error, and finding the matching area A1 with the largest brightness change according to the mean square error; (6) calculating the matching error with a template matching method and finding the point P of maximum or minimum difference within the error range. The algorithm is simple and widely applicable, and can quickly and accurately find the optimal matching point to achieve fast splicing of cell pictures.
Description
Technical Field
The invention belongs to the field of medical image processing, and particularly relates to a cell picture splicing method based on gray values.
Background
Cell types are diverse: different cell types differ greatly in color after staining, different staining agents themselves cause color differences, and impurities, dye liquor and the like contaminate cells to varying degrees after staining. Cell picture splicing has therefore always been a challenging subject. In recent years, experts and those skilled in the art have proposed many effective solutions. In 1999, David Lowe et al. proposed the scale-invariant feature transform (SIFT) algorithm for matching pictures; it can handle pictures of different scales and rotations, but its computational cost and hardware requirements are high. In 2010, Tuing et al. proposed a method that matches using the edge color information of an image: the algorithm first extracts edges, then selects a suitable area centered on each edge point, computes a histogram, and performs feature matching by the histogram intersection method. However, this algorithm requires the detected edges of the two images to be matched to be consistent, and because cell edges, especially those of red blood cells, are highly similar, matching errors occur easily.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a cell picture splicing method based on gray values.
The technical scheme adopted by the invention for solving the problems is as follows: the cell picture splicing method based on the gray value is characterized by comprising the following steps of:
(1) shooting and collecting to obtain a cell picture;
(2) performing color conversion on the cell picture, and converting the cell picture from an RGB (red, green and blue) channel to a gray channel;
(3) removing noise points from the cell picture through Gaussian filtering or mean filtering;
(4) carrying out dislocation subtraction on the cell pictures to eliminate the brightness difference;
(5) randomly selecting an area A on the cell picture, moving a sliding window within area A multiple times, calculating the mean square error, and finding out a matching area A1 with the maximum brightness change according to the mean square error;
(6) calculating a matching error by using a template matching method, and finding out a point P with the maximum or minimum difference according to an error range;
(7) calculating the offset of the cell picture, and finding out the overlapping area of the cell picture;
(8) calculating and recording the brightness difference of the cell picture overlapping region;
(9) setting a threshold, thresholding the overlapping area, recording the binary-image overlap rate, and further evaluating the binary-image overlap rate for cell pictures whose brightness and saturation checks pass;
(10) intercepting the area picture where the sliding window is located on the cell picture, converting the intercepted picture from the RGB (red, green, blue) channels to the HSV (hue, saturation, value) channels, and recording the saturation difference of the overlapped area;
(11) respectively counting the brightness difference of the corresponding overlapping area, the overlapping rate of the binary image and the saturation difference when the cell image splicing is successful or failed;
(12) according to the statistical result, applying a decision tree algorithm to find out the corresponding rules of the brightness difference, the binary image overlap rate and the saturation difference value and making a selection condition for splicing the cell images;
(13) finding out a matching point according to the selection conditions and completing the splicing of the cell picture.
Preferably, the calculation formula for performing the dislocation subtraction on the cell picture in step (4) is as follows:
where (x, y) is the current pixel of the cell picture; src is the original cell picture as taken; C1 and C2 are specified constants; and g(x, y) is the gray value of the processed cell picture at that point.
Preferably, the template matching method applied in the step (6) is one of a squared difference matching method, a normalized squared difference matching method, a correlation matching method, a normalized correlation matching method, a correlation coefficient matching method, and a normalized correlation coefficient matching method.
Preferably, the specific calculation formula for performing the thresholding process on the overlap region in the step (9) is as follows:
where src(x, y) is the value of the current cell picture pixel, and t1 and t2 are set thresholds.
Preferably, the conversion formula for converting the intercepted picture from the RGB (red, green, blue) channels to the HSV (hue, saturation, value) channels in step (10) is as follows:
where the RGB values range over [0, 255].
Preferably, the decision tree algorithm in the step (12) adopts a loss function model and prunes the loss function model.
Preferably, the selection conditions established in the step (12) are as follows:
a. luminance difference < threshold T1;
b. saturation difference < threshold T2;
c. binary-map overlap rate > threshold T3;
A matching point is selected only when all three conditions above are satisfied simultaneously.
Compared with the prior art, the invention has the following advantages and effects: the algorithm is simple, effective, and widely applicable; the dislocation subtraction method effectively eliminates brightness differences between pictures; and combining the template matching algorithm with the decision tree algorithm locates the optimal matching point more accurately, enabling fast splicing of cell pictures.
Drawings
FIG. 1 is a first photograph of the cells with the luminance difference eliminated according to the present invention.
FIG. 2 is a second photograph of the cell with the luminance difference eliminated according to the present invention.
Fig. 3 shows the area A1 of fig. 1 with the largest luminance change according to the present invention.
Fig. 4 is a diagram of the result of template matching according to the present invention.
Fig. 5 is a graph of the stitching results of the present invention.
Detailed Description
The present invention will be described in further detail below by way of examples with reference to the accompanying drawings, which are illustrative of the present invention and are not to be construed as limiting the present invention.
Example 1.
With reference to the accompanying drawings, the embodiment of the invention provides a cell picture splicing method based on gray values, comprising the following steps:
(1) Shooting and collecting a cell picture.
(2) Performing color conversion on the cell picture, converting it from the RGB (red, green, blue) channels to a gray channel.
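The patent names the RGB-to-gray conversion of step (2) but does not spell out the weights. A minimal sketch, assuming the widely used BT.601 luminance coefficients (an assumption, not stated in the source):

```python
import numpy as np

def rgb_to_gray(img):
    """Convert an HxWx3 RGB image (uint8) to a single gray channel.

    The BT.601 weights below are an assumption; the patent only names
    the RGB-to-gray conversion without giving coefficients.
    """
    img = np.asarray(img, dtype=np.float64)
    gray = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    return np.clip(np.round(gray), 0, 255).astype(np.uint8)
```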
(3) Creating a suitable template M and removing noise from the cell picture by Gaussian or mean filtering, where the noise removal uses the following formula:
where (x, y) is the current pixel of the cell picture; M is the template; src is the original cell picture as taken; a is the width and b the height of the template; and g(x, y) is the gray value of the processed cell picture at that point.
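The template-based noise removal of step (3) can be sketched with a uniform (mean) template; since the formula appears only as an image in the source, the template size and the edge-replication border handling below are assumptions:

```python
import numpy as np

def mean_filter(src, a=3, b=3):
    """Denoise with an a x b mean (box) template M, as in step (3).

    A uniform template is assumed; the patent also allows Gaussian
    filtering. Border pixels are handled by edge replication.
    """
    pad_y, pad_x = b // 2, a // 2
    padded = np.pad(src.astype(np.float64),
                    ((pad_y, pad_y), (pad_x, pad_x)), mode="edge")
    out = np.zeros(src.shape, dtype=np.float64)
    # Sum the a*b shifted copies covered by the template, then average.
    for dy in range(b):
        for dx in range(a):
            out += padded[dy:dy + src.shape[0], dx:dx + src.shape[1]]
    return (out / (a * b)).round().astype(np.uint8)
```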
(4) Performing dislocation subtraction on the two cell pictures to be spliced to eliminate their brightness difference; the results after eliminating the brightness difference are shown in fig. 1 and fig. 2. The calculation formula for the dislocation subtraction is as follows:
where (x, y) is the current pixel of the cell picture; src is the original cell picture as taken; C1 and C2 are specified constants; and g(x, y) is the gray value of the processed cell picture at that point.
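The dislocation-subtraction formula itself is given only as an image in the source. The sketch below assumes one plausible form consistent with the variable list, g(x, y) = src(x, y) − src(x + C1, y + C2) + bias, where the bias term is an added assumption to keep values representable:

```python
import numpy as np

def dislocation_subtract(src, c1=1, c2=1, bias=128):
    """Dislocation (misalignment) subtraction, step (4) -- a sketch.

    Assumed form: g(x, y) = src(x, y) - src(x + C1, y + C2) + bias.
    Subtracting a shifted copy removes slowly varying brightness;
    the bias recenters the result in the displayable gray range.
    """
    shifted = np.roll(np.roll(src.astype(np.int32), -c2, axis=0), -c1, axis=1)
    g = src.astype(np.int32) - shifted + bias
    return np.clip(g, 0, 255).astype(np.uint8)
```

On a picture with uniform illumination drift the output is nearly flat, which is exactly the brightness-difference elimination the step aims at.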
(5) In fig. 1, an area A is randomly selected, a sliding window is moved within area A multiple times, and the mean square error is calculated; the specific formula is as follows:
where (x, y) is the current pixel of the cell picture; src is the original cell picture as taken; μ is the mean of area A; N is the total number of pixels in area A; and σ² is the mean square error of area A.
The matching area A1 with the largest luminance change is found via the mean square error, as shown in fig. 3.
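Step (5) — sliding a window, computing each block's mean square error, and keeping the block with the largest value — might be sketched as follows; the window size and stride are illustrative assumptions, as the patent fixes neither:

```python
import numpy as np

def most_varying_region(gray, win=32, step=16):
    """Return the top-left corner (x, y) and mean square error of the
    win x win block whose brightness varies most -- the area A1 of
    step (5). Window size and stride are assumptions."""
    best, best_xy = -1.0, (0, 0)
    h, w = gray.shape
    for y in range(0, h - win + 1, step):
        for x in range(0, w - win + 1, step):
            block = gray[y:y + win, x:x + win].astype(np.float64)
            mse = ((block - block.mean()) ** 2).mean()  # block variance
            if mse > best:
                best, best_xy = mse, (x, y)
    return best_xy, best
```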
(6) Calculating the matching error between area A1 and the corresponding area in cell picture II by a template matching method to obtain a matching result graph R, as shown in fig. 4; the specific formula for the matching error is as follows:
where (x, y) is the current pixel of the cell picture; T is the template image (i.e. A1); I is the cell image to be matched (i.e. fig. 2); a is the width and b the height of the template image; and R(x, y) is the value of the current pixel in the matching result graph R.
The point P with the maximum or minimum difference in the matching result graph R is then found according to the error range between A1 and the corresponding area of cell picture II.
The template matching method is one of the squared difference matching method, normalized squared difference matching method, correlation matching method, normalized correlation matching method, correlation coefficient matching method, and normalized correlation coefficient matching method.
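Of the listed methods, the squared-difference criterion is the simplest to sketch: R(x, y) sums the squared pixel differences between the template and the image window, and the best match is where R is smallest. The implementation below is an illustration, not the patent's exact formula:

```python
import numpy as np

def match_template_sqdiff(image, template):
    """Squared-difference template matching over a 2-D gray image.

    Returns the full result graph R and the location (x, y) of the
    point P with the minimum difference.
    """
    th, tw = template.shape
    ih, iw = image.shape
    t = template.astype(np.float64)
    r = np.empty((ih - th + 1, iw - tw + 1))
    for y in range(r.shape[0]):
        for x in range(r.shape[1]):
            diff = image[y:y + th, x:x + tw].astype(np.float64) - t
            r[y, x] = (diff * diff).sum()
    p = np.unravel_index(np.argmin(r), r.shape)  # point P (row, col)
    return r, (p[1], p[0])  # report as (x, y)
```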
(7) Calculating the offset from the coordinates of point P and area A1 on cell picture I, and finding the overlapping area of cell pictures I and II.
(8) Moving the sliding window within the overlapping area multiple times, then calculating and recording the brightness difference of the overlapping area of the two cell pictures.
(9) Setting a threshold, carrying out thresholding on the overlapping area, and recording the overlapping rate of the binary image, wherein the specific calculation formula of the thresholding is as follows:
where src(x, y) is the value of the current cell picture pixel, and t1 and t2 are set thresholds.
The binary-image overlap rate is then further evaluated for cell pictures whose brightness and saturation checks pass.
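Step (9) can be illustrated as below; the double-threshold rule (keep pixels with t1 ≤ value ≤ t2) and the definition of overlap rate as the fraction of agreeing binary pixels are both assumptions, since the source formula is an image and leaves these details unspecified:

```python
import numpy as np

def binary_overlap_rate(region1, region2, t1=80, t2=180):
    """Threshold both overlap regions into binary maps and report the
    fraction of pixels on which the two maps agree (step 9).

    The thresholding rule and the agreement-based overlap rate are
    assumptions made for illustration.
    """
    b1 = (region1 >= t1) & (region1 <= t2)
    b2 = (region2 >= t1) & (region2 <= t2)
    return (b1 == b2).mean()
```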
(10) Intercepting the area picture where the sliding window is located on the original cell picture, converting the intercepted picture from the RGB (red, green, blue) channels to the HSV (hue, saturation, value) channels, and recording the saturation difference of the overlapping area; the specific color conversion formula is as follows:
where the RGB values range over [0, 255].
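The RGB-to-HSV conversion of step (10) is available in the Python standard library. The per-pixel helper below is an illustration only — the patent records the saturation difference over the whole overlap region, not per pixel:

```python
import colorsys

def saturation_difference(rgb1, rgb2):
    """Convert two RGB pixels (components in [0, 255]) to HSV with the
    standard library and return their saturation difference."""
    s1 = colorsys.rgb_to_hsv(rgb1[0] / 255, rgb1[1] / 255, rgb1[2] / 255)[1]
    s2 = colorsys.rgb_to_hsv(rgb2[0] / 255, rgb2[1] / 255, rgb2[2] / 255)[1]
    return abs(s1 - s2)
```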
(11) Separately counting the brightness difference of the overlapping area, the binary-image overlap rate, and the saturation difference for each group of cell pictures whose splicing succeeded or failed.
(12) According to the recording result in the step (11), applying a decision tree algorithm to find out the corresponding rules of the brightness difference, the binary image overlap rate and the saturation difference value and making selection conditions for cell image splicing, wherein the selection conditions specifically comprise:
a. luminance difference < threshold T1;
b. saturation difference < threshold T2;
c. binary-map overlap rate > threshold T3;
A matching point is selected only when all three conditions above are satisfied simultaneously.
Wherein the threshold T1, the threshold T2 and the threshold T3 are all specific values calculated by the algorithm according to specific cell pictures.
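The three-way selection condition of step (12) reduces to a simple conjunction. The default thresholds below are placeholders only, since T1–T3 are computed per picture by the decision tree:

```python
def accept_match(lum_diff, sat_diff, overlap_rate,
                 t1=20.0, t2=0.15, t3=0.9):
    """Selection condition of step (12): a candidate matching point is
    accepted only when all three conditions hold simultaneously.
    Default thresholds are placeholders, not values from the patent."""
    return (lum_diff < t1) and (sat_diff < t2) and (overlap_rate > t3)
```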
(13) Finding a matching point according to the selection conditions set in step (12) and completing the splicing of the cell pictures.
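Once a matching point and offset are known, step (13) amounts to pasting the two pictures onto a shared canvas. The sketch below handles non-negative offsets only and lets picture I win in the overlap; the patent does not specify a blending strategy:

```python
import numpy as np

def stitch(img1, img2, dx, dy):
    """Splice two gray pictures given the offset (dx, dy) of img2
    relative to img1 (step 13). Non-negative offsets only; img1's
    pixels take priority in the overlapping area."""
    h = max(img1.shape[0], dy + img2.shape[0])
    w = max(img1.shape[1], dx + img2.shape[1])
    canvas = np.zeros((h, w), dtype=img1.dtype)
    canvas[dy:dy + img2.shape[0], dx:dx + img2.shape[1]] = img2
    canvas[:img1.shape[0], :img1.shape[1]] = img1
    return canvas
```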
In step (12) of this embodiment, a loss function model is used when the decision tree algorithm is applied, and pruning is performed on the loss function model to prevent overfitting.
In addition, it should be noted that the contents described in this specification are merely illustrative of the present invention. All equivalent or simple changes to the features and principles of the invention within its inventive concept fall within the scope of protection of this patent. Various modifications, additions and substitutions to the specific embodiments described may be made by those skilled in the art without departing from the scope of the invention as defined in the accompanying claims.
Claims (7)
1. A cell picture splicing method based on gray values is characterized by comprising the following steps:
(1) shooting and collecting to obtain a cell picture;
(2) performing color conversion on the cell picture, and converting the cell picture from an RGB (red, green and blue) channel to a gray channel;
(3) removing noise points from the cell picture through Gaussian filtering or mean filtering;
(4) carrying out dislocation subtraction on the cell pictures to eliminate the brightness difference;
(5) randomly selecting an area A on the cell picture, moving a sliding window within area A multiple times, calculating the mean square error, and finding out a matching area A1 with the maximum brightness change according to the mean square error;
(6) calculating a matching error by using a template matching method, and finding out a point P with the maximum or minimum difference according to an error range;
(7) calculating the offset of the cell picture, and finding out the overlapping area of the cell picture;
(8) calculating and recording the brightness difference of the cell picture overlapping region;
(9) setting a threshold, thresholding the overlapping area, recording the binary-image overlap rate, and further evaluating the binary-image overlap rate for cell pictures whose brightness and saturation checks pass;
(10) intercepting the area picture where the sliding window is located on the cell picture, converting the intercepted picture from the RGB (red, green, blue) channels to the HSV (hue, saturation, value) channels, and recording the saturation difference of the overlapped area;
(11) respectively counting the brightness difference of the corresponding overlapping area, the overlapping rate of the binary image and the saturation difference when the cell image splicing is successful or failed;
(12) according to the statistical result, applying a decision tree algorithm to find out the corresponding rules of the brightness difference, the binary image overlap rate and the saturation difference value and making a selection condition for splicing the cell images;
(13) finding out a matching point according to the selection conditions and completing the splicing of the cell picture.
2. The method for splicing cell pictures based on gray scale values according to claim 1, wherein the calculation formula for performing the subtraction of the misalignment of the cell pictures in the step (4) is as follows:
where (x, y) is the current pixel of the cell picture; src is the original cell picture; C1 and C2 are specified constants; and g(x, y) is the gray value of the processed cell picture at that point.
3. The method for splicing cell pictures based on gray scale values according to claim 1, wherein the template matching method applied in step (6) is one of a squared error matching method, a normalized squared error matching method, a correlation matching method, a normalized correlation matching method, a correlation coefficient matching method and a normalized correlation coefficient matching method.
4. The method for splicing cell pictures based on gray scale values according to claim 1, wherein the specific calculation formula for thresholding the overlapped region in the step (9) is as follows:
where src(x, y) is the value of the current cell picture pixel, and t1 and t2 are set thresholds.
5. The method for splicing cell pictures based on gray-scale values as claimed in claim 1, wherein the conversion formula for converting the intercepted picture from the RGB (red, green, blue) channels to the HSV (hue, saturation, value) channels in step (10) is as follows:
where the RGB values range over [0, 255].
6. The grey-value-based cell image stitching method according to claim 1, wherein the decision tree algorithm in the step (12) adopts a loss function model and prunes the loss function model.
7. The grey-scale value-based cell image stitching method according to claim 1, wherein the selection conditions established in the step (12) are:
a. luminance difference < threshold T1;
b. saturation difference < threshold T2;
c. binary-map overlap rate > threshold T3;
A matching point is selected only when all three conditions above are satisfied simultaneously.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111390285.0A CN113902817A (en) | 2021-11-23 | 2021-11-23 | Cell picture splicing method based on gray value |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111390285.0A CN113902817A (en) | 2021-11-23 | 2021-11-23 | Cell picture splicing method based on gray value |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113902817A true CN113902817A (en) | 2022-01-07 |
Family
ID=79194875
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111390285.0A Pending CN113902817A (en) | 2021-11-23 | 2021-11-23 | Cell picture splicing method based on gray value |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113902817A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017193372A1 (en) * | 2016-05-13 | 2017-11-16 | 深圳市赛亿科技开发有限公司 | Method and system for realizing panorama mosaicking |
CN108122200A (en) * | 2017-12-20 | 2018-06-05 | 宁波视睿迪光电有限公司 | Image split-joint method and device |
CN108805865A (en) * | 2018-05-22 | 2018-11-13 | 杭州智微信息科技有限公司 | A kind of myeloplast localization method based on saturation degree cluster |
CN109991205A (en) * | 2019-05-05 | 2019-07-09 | 中国科学院重庆绿色智能技术研究院 | A kind of counting algorithm of circulating tumor cell and application |
EP3508841A1 (en) * | 2018-01-08 | 2019-07-10 | 4D Lifetec AG | Single cell gel electrophoresis |
CN111798423A (en) * | 2020-07-01 | 2020-10-20 | 上海理工大学 | Concrete crack picture splicing and detecting method |
- 2021-11-23: Application CN202111390285.0A filed in China; publication CN113902817A, status Pending.
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017193372A1 (en) * | 2016-05-13 | 2017-11-16 | 深圳市赛亿科技开发有限公司 | Method and system for realizing panorama mosaicking |
CN108122200A (en) * | 2017-12-20 | 2018-06-05 | 宁波视睿迪光电有限公司 | Image split-joint method and device |
EP3508841A1 (en) * | 2018-01-08 | 2019-07-10 | 4D Lifetec AG | Single cell gel electrophoresis |
CN108805865A (en) * | 2018-05-22 | 2018-11-13 | 杭州智微信息科技有限公司 | A kind of myeloplast localization method based on saturation degree cluster |
WO2019223706A1 (en) * | 2018-05-22 | 2019-11-28 | 杭州智微信息科技有限公司 | Saturation clustering-based method for positioning bone marrow white blood cells |
CN109991205A (en) * | 2019-05-05 | 2019-07-09 | 中国科学院重庆绿色智能技术研究院 | A kind of counting algorithm of circulating tumor cell and application |
CN111798423A (en) * | 2020-07-01 | 2020-10-20 | 上海理工大学 | Concrete crack picture splicing and detecting method |
Non-Patent Citations (2)
Title |
---|
Hu Shejiao; Tu Guilin; Jiang Ping: "An improved algorithm for image stitching based on gray-level correlation", Journal of Hefei University of Technology (Natural Science), no. 06, 28 June 2008 (2008-06-28) *
Gao Fei; Tang Jing; Li Jinggang; Zhang Feng; Long Li; Cao Dehua; Wang Jian: "Verification and evaluation of the nucleated-cell classification capability of the LDBC-I automatic blood cell image analyzer for peripheral blood", Journal of Modern Laboratory Medicine, no. 02, 15 March 2018 (2018-03-15) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Chen et al. | A novel color edge detection algorithm in RGB color space | |
US7903870B1 (en) | Digital camera and method | |
CN109636784B (en) | Image saliency target detection method based on maximum neighborhood and super-pixel segmentation | |
CN102385753B (en) | Illumination-classification-based adaptive image segmentation method | |
US20020114512A1 (en) | Color clustering and segmentation using sigma filtering | |
CN110427966A (en) | One kind rejecting error hiding feature point methods based on characteristic point local feature | |
CN108615239B (en) | Tongue image segmentation method based on threshold technology and gray level projection | |
CN102306307B (en) | Positioning method of fixed point noise in color microscopic image sequence | |
Jiang et al. | Skin detection using color, texture and space information | |
CN111476744B (en) | Underwater image enhancement method based on classification and atmospheric imaging model | |
CN112419210A (en) | Underwater image enhancement method based on color correction and three-interval histogram stretching | |
Sari et al. | Tomato ripeness clustering using 6-means algorithm based on v-channel otsu segmentation | |
CN110298835B (en) | Leather surface damage detection method, system and related device | |
Indra et al. | Eggs Detection Using Otsu Thresholding Method | |
CN107256539B (en) | Image sharpening method based on local contrast | |
Do et al. | Skin color detection through estimation and conversion of illuminant color under various illuminations | |
CN114898412A (en) | Identification method for low-quality fingerprints and incomplete fingerprints | |
CN111046782A (en) | Fruit rapid identification method for apple picking robot | |
US9672447B2 (en) | Segmentation based image transform | |
CN111611940A (en) | Rapid video face recognition method based on big data processing | |
CN111667509B (en) | Automatic tracking method and system for moving target under condition that target and background colors are similar | |
WO2024016791A1 (en) | Method and apparatus for processing graphic symbol, and computer-readable storage medium | |
CN113902817A (en) | Cell picture splicing method based on gray value | |
CN111489371B (en) | Image segmentation method for scene histogram approximate unimodal distribution | |
Khan et al. | Shadow removal from digital images using multi-channel binarization and shadow matting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||