CN110490882B - Cell membrane staining image analysis method, device and system - Google Patents


Info

Publication number
CN110490882B
Authority
CN
China
Prior art keywords
cell membrane, image, cell, membrane staining, staining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910765920.5A
Other languages
Chinese (zh)
Other versions
CN110490882A (en)
Inventor
张军
颜克洲
姚建华
韩骁
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201910765920.5A
Publication of CN110490882A
Application granted
Publication of CN110490882B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/90 Determination of colour characteristics
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10056 Microscopic image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30024 Cell structures in vitro; Tissue sections in vitro

Abstract

The disclosure provides a cell membrane staining image analysis method, device and system, relating to the field of artificial intelligence. The method comprises the following steps: acquiring a cell membrane staining image, marking the cell nuclei of target cells in the image to obtain marked cell nuclei, and delineating the cell membranes of the target cells to obtain marked cell membranes; acquiring a fusion image containing the marked cell nuclei and the marked cell membranes according to the position information of the marked cell nuclei and the marking information of the marked cell membranes; and determining the cell membrane staining degree according to the cell membrane staining area in the fusion image, and determining the number of target cells in each cell membrane staining state according to the number of cell nuclei corresponding to each state in the fusion image. The method systematically analyzes the cell membrane staining image and counts the various staining conditions, realizing both qualitative display and quantitative statistical analysis of the image and improving the accuracy of the analysis result.

Description

Cell membrane staining image analysis method, device and system
Technical Field
The disclosure relates to the technical field of artificial intelligence, in particular to a cell membrane staining image analysis method, a cell membrane staining image analysis device and a cell membrane staining image analysis system.
Background
Immunohistochemistry (IHC) is a technique for detecting and locating particular chemical substances in tissues and cells by means of the specific binding reaction between antigen and antibody. Formed by combining immunology with traditional histochemistry, it relates morphological changes to functional and metabolic changes: the presence of specific proteins and polypeptide substances is located directly on tissue sections, cell smears or cultured cell coverslip preparations, and the detected substances are then analyzed with technologies such as computerized image analysis systems or laser scanning confocal microscopy.
Currently, stained tissue sections, cell smears or cell coverslip preparations are photographed under a microscope to obtain a cell membrane staining image, which is then analyzed. Such analysis usually performs either cell nucleus detection or cell membrane delineation alone, with no unified analysis of nucleus and membrane together; as a result, a pathologist obtains limited information from the cell membrane staining image, and the accuracy of the image analysis result is low.
In view of the above, there is a need in the art to develop a new cell membrane staining image analysis method.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The embodiments of the disclosure provide a cell membrane staining image analysis method, device and system that fuse, at least to some extent, the cell nucleus detection result with the cell membrane delineation result, realizing qualitative display and quantitative statistical analysis and improving the accuracy of the cell membrane staining image analysis result.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to an aspect of the embodiments of the present disclosure, there is provided a cell membrane staining image analysis method, including: acquiring a cell membrane staining image, marking the cell nucleus of a target cell in the cell membrane staining image to acquire a marked cell nucleus, and delineating the cell membrane of the target cell in the cell membrane staining image to acquire a marked cell membrane; acquiring a fusion image containing the marked cell nucleus and the marked cell membrane according to the position information of the marked cell nucleus and the marking information of the marked cell membrane; and determining the staining degree of the cell membrane according to the staining area of the cell membrane in the fusion image, and determining the number of the target cells with different cell membrane staining states according to the number of the cell nuclei corresponding to different cell membrane staining states in the fusion image.
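The three claimed steps can be outlined as a minimal driver sketch; the callable names here are ours, not the patent's, and each stands in for the corresponding module described in the text:

```python
def analyze(image, mark_nuclei, delineate_membrane, fuse, grade, count):
    """Sketch of the claimed three-step method; the concrete detector,
    delineator and statistics routines are pluggable stand-ins."""
    nuclei = mark_nuclei(image)            # step 1a: marked cell nuclei
    membrane = delineate_membrane(image)   # step 1b: marked cell membrane
    fused = fuse(nuclei, membrane)         # step 2: fusion image
    return grade(fused), count(fused)      # step 3: staining degree + per-state counts
```

Any concrete implementation of the five stages can be slotted in without changing the overall flow.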
According to an aspect of the embodiments of the present disclosure, there is provided a cell membrane staining image analysis apparatus including: a marking module configured to acquire a cell membrane staining image, mark the cell nuclei of target cells in the image to obtain marked cell nuclei, and delineate the cell membranes in the image to obtain marked cell membranes; a fusion module configured to acquire a fusion image containing the marked cell nuclei and the marked cell membranes according to the position information of the marked cell nuclei and the marking information of the marked cell membranes; and a statistics module configured to determine the cell membrane staining degree according to the cell membrane staining area in the fusion image, and to determine the number of target cells in each cell membrane staining state according to the number of cell nuclei corresponding to each state in the fusion image.
In some embodiments of the present disclosure, based on the foregoing, the marking module is configured to: carrying out color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to cell nucleus; processing the color channel images corresponding to the cell nuclei to mark all of the cell nuclei in the cell membrane stain image; and screening the marked cell nucleus according to the cell morphology corresponding to the target cell to obtain the marked cell nucleus.
In some embodiments of the present disclosure, based on the foregoing, the marking module is configured to: carrying out color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to the cell membrane; segmenting the color channel image corresponding to the cell membrane according to a first preset threshold value to obtain a segmented image; and extracting a central line of the segmentation image according to a skeleton extraction algorithm, and delineating the cell membrane according to the central line so as to obtain the marked cell membrane.
In some embodiments of the present disclosure, based on the foregoing solution, the fusion module includes: the area dividing unit is used for determining a closed area and a non-closed area according to the marking information of the marked cell membrane; a cell nucleus redefinition unit for determining the barycentric position of the closed region and using the barycentric position as the position information of the cell nucleus in the closed region; the segmentation graph forming unit is used for filling the inside of the closed area to obtain a segmentation graph corresponding to the closed area; and the fusion image generating unit is used for determining the fusion image according to the position information of the marked cell nucleus, the position information of the cell nucleus in the closed area and the segmentation map.
In some embodiments of the present disclosure, based on the foregoing scheme, the region dividing unit is configured to: determine a connected boundary region according to the centre line corresponding to the marked cell membrane, and judge whether a closed connected region exists in the connected boundary region; if a closed connected region exists in the connected boundary region, mark it as the closed region when it is an innermost region and its area is greater than or equal to a preset area threshold; and if a non-closed connected region exists in the connected boundary region, mark it as the non-closed region.
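The closed/non-closed split can be illustrated with a border flood fill: background pixels reachable from the image border form the non-closed area, while an enclosed background component at or above an area threshold becomes a closed region. This is a small pure-Python sketch, not the patent's implementation; `min_area` stands in for the preset area threshold:

```python
import numpy as np
from collections import deque

def closed_regions(membrane: np.ndarray, min_area: int = 2):
    """Split the area bounded by the membrane centre line (a boolean mask)."""
    h, w = membrane.shape
    label = np.zeros((h, w), dtype=int)  # 0 = unvisited, -1 = non-closed, 1.. = closed

    def flood(sr, sc, mark):
        q, pixels = deque([(sr, sc)]), [(sr, sc)]
        label[sr, sc] = mark
        while q:
            r, c = q.popleft()
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < h and 0 <= nc < w and not membrane[nr, nc] and label[nr, nc] == 0:
                    label[nr, nc] = mark
                    pixels.append((nr, nc))
                    q.append((nr, nc))
        return pixels

    # Everything connected to the image border belongs to the non-closed area.
    for r in range(h):
        for c in range(w):
            if (r in (0, h - 1) or c in (0, w - 1)) and not membrane[r, c] and label[r, c] == 0:
                flood(r, c, -1)

    regions, next_id = [], 1
    for r in range(h):  # remaining background components are enclosed by membrane
        for c in range(w):
            if not membrane[r, c] and label[r, c] == 0:
                pixels = flood(r, c, next_id)
                if len(pixels) >= min_area:
                    regions.append(pixels)
                    next_id += 1
                else:  # below the area threshold: fold back into the non-closed area
                    for pr, pc in pixels:
                        label[pr, pc] = -1
    return regions
```

The centroid of each returned region then serves as the nucleus position for that closed cell, as in the cell nucleus redefinition unit.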
In some embodiments of the present disclosure, based on the foregoing scheme, the fused image generating unit is configured to: acquiring a first complementary set region complementary to the segmentation map in the cell membrane staining image; acquiring an intersection between the position information of the labeled cell nucleus and the first complementary set region, and acquiring a union of the intersection and the position information of the cell nucleus in the closed region to obtain the fusion image.
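Treating the nucleus detections, the filled closed regions and their centroid markers as same-shaped boolean masks, the fusion described above reduces to elementwise set algebra (a sketch under that representation assumption):

```python
import numpy as np

def fuse_nuclei(nuclei: np.ndarray, closed_fill: np.ndarray,
                closed_centroids: np.ndarray) -> np.ndarray:
    """Fuse nucleus detections with the membrane delineation.

    Keep detected nuclei that fall outside the filled closed regions
    (the intersection with the first complementary set region), then
    add each closed region's centroid marker (the union step).
    """
    return (nuclei & ~closed_fill) | closed_centroids
```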
In some embodiments of the present disclosure, based on the foregoing scheme, the statistics module includes: an image decomposition unit configured to perform color channel decomposition on the cell membrane staining image in RGB format to obtain the color channel image corresponding to the cell membrane; an image segmentation unit configured to segment that color channel image according to a second preset threshold to obtain a cell membrane staining area template; and an image extraction unit configured to extract, through the template, a cell membrane staining area image from the color channel image and to determine the cell membrane staining degree from that image.
In some embodiments of the present disclosure, based on the foregoing scheme, the image extraction unit includes: a first cell membrane staining degree determination unit for determining that the cell membrane staining degree is zero when the area of the cell membrane staining region image is zero; a second cell membrane staining degree determination unit for determining the cell membrane staining degree according to the pixel value in the cell membrane staining area image when the area of the cell membrane staining area image is not zero.
In some embodiments of the present disclosure, based on the foregoing scheme, the second cell membrane staining degree determination unit is configured to: calculate the pixel average of all pixels in the cell membrane staining area image and compare it with a staining threshold; when the pixel average is smaller than the staining threshold, determine the cell membrane staining degree to be weak staining; when the pixel average is greater than or equal to the staining threshold, determine the cell membrane staining degree to be strong staining.
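Following the comparison just described (mean pixel value versus a staining threshold, with a zero-area image meaning no staining), a sketch with an illustrative threshold of 128; in practice the threshold and the channel polarity depend on how the stain channel was extracted:

```python
import numpy as np

def staining_degree(stain_pixels: np.ndarray, stain_threshold: float = 128) -> str:
    """Grade membrane staining from the stained-area image.

    Zero area means no staining; otherwise the mean pixel value is
    compared with the staining threshold exactly as in the text
    (128 is an assumed value, not one from the patent).
    """
    if stain_pixels.size == 0:
        return "none"
    return "weak" if stain_pixels.mean() < stain_threshold else "strong"
```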
In some embodiments of the present disclosure, based on the foregoing, the statistics module is configured to: and acquiring a first number of cell nuclei in all the closed areas in the fusion image, wherein the first number is the number of target cells with complete cell membrane staining states.
In some embodiments of the present disclosure, based on the foregoing, the statistics module is configured to: fill the holes in the cell membrane staining area template, and dilate the hole-filled staining area by a preset distance to obtain a dilated area; acquire the first complementary set region complementary to the segmentation map in the cell membrane staining image; acquire the union of the dilated area and the first complementary set region, and acquire the intersection between that union and the fused image to determine the incomplete cell membrane staining area; and acquire a second number of cell nuclei corresponding to the incomplete cell membrane staining area in the fusion image, the second number being the number of target cells with incomplete cell membrane staining.
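The dilation and set operations can be sketched with a simple 4-connected binary dilation standing in for the expansion by a preset distance; all mask names and parameter values here are our assumptions, and hole filling of the template is assumed to have been done already:

```python
import numpy as np

def dilate(mask: np.ndarray, iterations: int = 1) -> np.ndarray:
    """4-connected binary dilation of a boolean mask."""
    out = mask.copy()
    for _ in range(iterations):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]   # shift down
        grown[:-1, :] |= out[1:, :]   # shift up
        grown[:, 1:] |= out[:, :-1]   # shift right
        grown[:, :-1] |= out[:, 1:]   # shift left
        out = grown
    return out

def count_incomplete(fused: np.ndarray, stain_template: np.ndarray,
                     closed_fill: np.ndarray, distance: int = 1) -> int:
    """Count nuclei in the incomplete-staining area: the fused nucleus
    mask intersected with (dilated stain area, union, complement of the
    filled closed regions). All masks are boolean and same-shaped."""
    region = dilate(stain_template, distance) | ~closed_fill
    return int((fused & region).sum())
```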
In some embodiments of the present disclosure, based on the foregoing, the statistics module is configured to: acquire a second complementary set region complementary to the dilated region in the cell membrane staining image; acquire the intersection between the fused image and the first complementary set region, and then the intersection of that result with the second complementary set region, to obtain the cell-membrane-staining-free region; and acquire a third number of cell nuclei corresponding to the cell-membrane-staining-free region in the fusion image, the third number being the number of target cells with no cell membrane staining.
In some embodiments of the present disclosure, based on the foregoing scheme, the cell membrane staining image analyzing apparatus further comprises: and the data conversion module is used for comparing the number of the cell nucleuses corresponding to the different cell membrane staining states with the number of all the cell nucleuses in the fused image so as to obtain the number percentage of the target cells with the different cell membrane staining states.
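The data conversion module's comparison reduces to a percentage over the per-state nucleus counts; a one-line sketch in which the state names are illustrative:

```python
def stain_state_percentages(counts: dict) -> dict:
    """Turn per-state nucleus counts into percentages of all nuclei in
    the fused image (empty input yields an empty result)."""
    total = sum(counts.values())
    return {state: 100.0 * n / total for state, n in counts.items()} if total else {}
```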
According to an aspect of the embodiments of the present disclosure, there is provided a cell membrane staining image analysis system including: a microscope for observing the cell membrane staining sample; a shooting device connected with the microscope and configured to photograph the cell membrane staining image displayed in the microscope; an image processing device connected with the shooting device and configured to receive the photographed cell membrane staining image, the image processing device comprising one or more processors and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the cell membrane staining image analysis method of the above embodiments; and a display device connected with the image processing device and configured to receive the image analysis result output by the image processing device and present it on a display screen.
In some embodiments of the present disclosure, the cell nuclei and cell membranes of target cells in a cell membrane staining image are first marked to obtain marked cell nuclei and marked cell membranes; the cell nucleus detection result and the cell membrane delineation result are then fused according to the position information of the marked cell nuclei and the marking information of the marked cell membranes to form a fused image; finally, the cell membrane staining degree is determined according to the cell membrane staining area in the fused image, and the number of target cells in each cell membrane staining state is determined according to the number of cell nuclei corresponding to each state. The disclosed technical solution can perform systematic staining analysis on the cell membrane staining image and count the various staining conditions, realizing qualitative display and quantitative statistical analysis of the image and improving the accuracy of the analysis result.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It should be apparent that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived by those of ordinary skill in the art without inventive effort. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of the embodiments of the present disclosure may be applied;
FIG. 2 schematically shows a flow diagram of a cell membrane staining image analysis method according to one embodiment of the present disclosure;
FIG. 3 schematically shows a flow diagram for obtaining a labeled nucleus according to one embodiment of the present disclosure;
FIGS. 4A-4B schematically show an input image and an output image containing labeled nuclei according to one embodiment of the present disclosure;
FIG. 5 schematically shows a flow diagram for delineating a cell membrane according to one embodiment of the present disclosure;
FIGS. 6A-6B schematically show an input image and an output image containing labeled cell membranes according to one embodiment of the present disclosure;
FIG. 7 schematically shows a flow diagram for acquiring a fused image according to one embodiment of the present disclosure;
FIG. 8 schematically shows a flow diagram for determining closed regions and non-closed regions according to one embodiment of the present disclosure;
FIG. 9 schematically shows a flow diagram for determining the degree of staining of cell membranes according to one embodiment of the present disclosure;
FIG. 10 schematically shows a flow diagram for determining the degree of staining of cell membranes according to one embodiment of the present disclosure;
FIG. 11 schematically shows a flow diagram for determining the number of cancer cells with incomplete cell membrane staining according to one embodiment of the present disclosure;
FIG. 12 schematically shows a flow diagram for determining the number of cancer cells with no cell membrane staining according to one embodiment of the present disclosure;
FIG. 13 schematically shows a flow diagram for image analysis of a HER2-positive breast cancer cell membrane stain image according to one embodiment of the present disclosure;
FIGS. 14A-14C schematically show HER2 stain image analysis results according to one embodiment of the present disclosure;
FIG. 15 schematically shows a block diagram of a cell membrane stain image analysis apparatus according to one embodiment of the present disclosure;
FIG. 16 schematically shows a block diagram of a cell membrane staining image analysis system according to one embodiment of the present disclosure;
FIG. 17 shows a schematic configuration diagram of a computer system suitable for implementing the image processing device of the embodiments of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of the embodiments of the present disclosure may be applied.
As shown in fig. 1, system architecture 100 may include terminal device 101, network 102, and server 103. Network 102 is the medium used to provide communication links between terminal devices 101 and server 103. Network 102 may include various connection types such as wired communication links, wireless communication links, and the like.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired. For example, the server 103 may be a server cluster composed of a plurality of servers. The terminal device 101 may be an intelligent microscope for observing and photographing a stained tissue section or the like to obtain a cell membrane stain image, and the intelligent microscope is integrated with a real-time photographing device, and is capable of photographing an amplified cell membrane stain image in the microscope in real time and transmitting the obtained cell membrane stain image to the server 103 through the network 102, so that the server 103 analyzes the cell membrane stain image. In addition, the system architecture 100 may further include a terminal device 104, where the terminal device 104 may be one or more of a smart phone, a tablet computer, and a portable computer, and the terminal device 104 is connected to the terminal device 101 and configured to receive the cell membrane staining image captured by the terminal device 101 and send the received cell membrane staining image to the server 103 for image analysis.
In an embodiment of the present disclosure, the terminal device 101 sends the cell membrane staining image to the server 103 through the network 102, and after the server 103 obtains the cell membrane staining image, the cell nucleus of the target cell in the cell membrane staining image may be marked according to different staining colors and cell morphologies to obtain a marked cell nucleus, and the cell membrane may be sketched to obtain a marked cell membrane; fusing the cell nucleus detection result and the cell membrane delineation result according to the position information of the labeled cell nucleus and the label information of the labeled cell membrane to obtain a fused image; and finally, determining the dyeing degree of the cell membrane according to the cell membrane dyeing area in the fusion image, and determining the number of target cells with different cell membrane dyeing states according to the number of cell nuclei corresponding to different cell membrane dyeing states in the fusion image. The target cells can be cancer cells and the like, and qualitative display and quantitative statistical analysis are carried out on the target cells, so that doctors can be helped to know the illness state of patients more clearly, and guidance is provided for later treatment schemes. The technical scheme of the embodiment of the disclosure can carry out systematic cell membrane staining analysis on the cell membrane staining image, and count various staining conditions, thereby realizing qualitative display and quantitative statistical analysis of the cell membrane staining image, and improving the accuracy of the image analysis result.
It should be noted that the cell membrane staining image analysis method provided by the embodiment of the present disclosure is generally performed by a server, and accordingly, the cell membrane staining image analysis apparatus is generally disposed in the server. However, in other embodiments of the present disclosure, the cell membrane staining image analysis scheme provided by the embodiments of the present disclosure may also be performed by a terminal device.
The cell membrane staining image analysis method in the embodiments of the present disclosure is implemented based on an unsupervised learning algorithm, one branch of Artificial Intelligence (AI). AI is the theory, method, technique and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend and expand human intelligence, sense the environment, acquire knowledge and use that knowledge to obtain the best result. In other words, artificial intelligence is a comprehensive discipline of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the capabilities of perception, reasoning and decision-making.
The artificial intelligence technology is a comprehensive subject and relates to the field of extensive technology, namely the technology of a hardware level and the technology of a software level. The artificial intelligence base technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and the like.
Computer Vision (CV) is the science of how to make a machine "see": it uses cameras and computers in place of human eyes to identify, track and measure targets, and further processes the resulting images so that they are better suited for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies theories and techniques for building artificial intelligence systems that can capture information from images or multidimensional data. Computer vision technologies generally include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technologies, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric technologies such as face recognition and fingerprint recognition.
Machine Learning (ML) is a multi-disciplinary field involving probability theory, statistics, approximation theory, convex analysis, algorithmic complexity theory and more. It studies how a computer can simulate or implement human learning behavior to acquire new knowledge or skills and reorganize existing knowledge structures to continuously improve its own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent; it is applied in every field of artificial intelligence. Machine learning and deep learning generally include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning and learning from instruction.
With the research and progress of artificial intelligence technology, the artificial intelligence technology is developed and applied in a plurality of fields, such as common smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned driving, automatic driving, unmanned aerial vehicles, robots, smart medical care, smart customer service, and the like.
The scheme provided by the embodiments of the present disclosure relates to artificial intelligence image processing technology and is explained through the following embodiments.
The embodiments of the present disclosure first provide a cell membrane staining image analysis method; the implementation details of the technical scheme are explained below:
Fig. 2 schematically illustrates a flow chart of a cell membrane staining image analysis method according to one embodiment of the present disclosure. The method may be performed by a server, for example the server 103 shown in fig. 1. Referring to fig. 2, the cell membrane staining image analysis method includes at least steps S210 to S230, which are described in detail as follows:
in step S210, a cell membrane staining image is acquired, the cell nucleus of the target cell in the cell membrane staining image is marked to acquire a marked cell nucleus, and the cell membrane in the cell membrane staining image is outlined to acquire a marked cell membrane.
In one embodiment of the present disclosure, the cell membrane staining image may be an IHC membrane staining image. Membrane staining analysis accounts for a large proportion of IHC examinations, for example membrane staining of EGFR, HER2, CD117, CD3, CD5, CD20, etc., where grading or a negative/positive determination is made by quantitative statistics on the membrane staining effect; for example, HER2 stains the membrane of positive cancer cells brown and the cell nucleus blue. The technical scheme of the embodiments of the present disclosure is explained below taking an IHC membrane staining image as an example.
In one embodiment of the present disclosure, both normal cells and abnormal cells, such as cancer cells, are present in the IHC membrane staining image. The nuclei of the cancer cells in the IHC membrane staining image can be marked by a method combining color channel decomposition with a cell nucleus detection algorithm.
Fig. 3 shows a schematic flowchart of the process of acquiring the labeled cell nucleus, and as shown in fig. 3, the method of acquiring the labeled cell nucleus at least includes steps S301 to S303, specifically:
in step S301, the cell membrane stain image having the RGB image format is subjected to color channel decomposition to acquire a color channel image corresponding to the cell nucleus.
In an embodiment of the present disclosure, the IHC membrane staining image obtained by the camera connected to the terminal device 101 is generally in RGB format. Only by performing color channel decomposition on the three-channel RGB image to form immunohistochemical channel images corresponding to the cell nucleus and the cell membrane can the nucleus and membrane be labeled in a targeted manner. Generally, color channel decomposition of an RGB image yields three immunohistochemical channel images: an H (hematoxylin) channel image, an E (eosin) channel image, and a DAB (diaminobenzidine) channel image. The H channel image corresponds to the cell nuclei and the DAB channel image corresponds to the cell membranes, so the color channel image corresponding to the cell nuclei in the embodiments of the present disclosure is the H channel image.
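As a concrete illustration of this decomposition, the sketch below unmixes an RGB image into H, E, and DAB optical-density channels via the Beer-Lambert transform and a stain matrix. This is a minimal sketch: the Ruifrok-Johnston stain vectors, the toy image, and the function name `decompose_hed` are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

# Stain OD vectors for H-E-DAB color deconvolution (Ruifrok & Johnston).
# Rows: hematoxylin (nuclei), eosin, DAB (membrane); values are illustrative.
STAINS = np.array([[0.65, 0.70, 0.29],
                   [0.07, 0.99, 0.11],
                   [0.27, 0.57, 0.78]])
STAINS /= np.linalg.norm(STAINS, axis=1, keepdims=True)

def decompose_hed(rgb):
    """Split an RGB IHC image into H, E and DAB optical-density channels."""
    od = -np.log10(np.clip(rgb.astype(float), 1, 255) / 255.0)  # Beer-Lambert
    hed = od.reshape(-1, 3) @ np.linalg.inv(STAINS)             # unmix stains
    return hed.reshape(rgb.shape)

img = np.full((4, 4, 3), 200, dtype=np.uint8)  # toy stand-in for an IHC tile
hed = decompose_hed(img)
print(hed.shape)  # channel 0 ≈ H (nuclei), channel 2 ≈ DAB (membrane)
```

In practice, the H-channel slice `hed[..., 0]` would feed the nucleus marking of step S302, and `hed[..., 2]` the membrane segmentation of step S502.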
In step S302, the color channel images corresponding to the cell nuclei are processed to mark all the cell nuclei in the cell membrane stain image.
In one embodiment of the present disclosure, nuclei in the H-channel image may be labeled by a watershed algorithm. Of course, the H-channel image may also be processed by other algorithms to mark the cell nuclei therein, which is not specifically limited by the embodiment of the present disclosure.
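A minimal sketch of such watershed-based nucleus marking is shown below, assuming scikit-image is available; the toy H-channel blobs and the seeding rule (`distance > 2`) are illustrative assumptions rather than the patent's actual pipeline.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# Toy H-channel image: two bright blobs standing in for nuclei.
h = np.zeros((20, 20))
h[4:9, 4:9] = 1.0
h[12:17, 12:17] = 1.0

mask = h > 0.5                               # foreground (nuclei) mask
distance = ndi.distance_transform_edt(mask)  # peaks at nucleus centres
markers, _ = ndi.label(distance > 2)         # one seed per nucleus
labels = watershed(-distance, markers, mask=mask)
print(labels.max())  # number of nuclei marked
```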
In step S303, the labeled nuclei are screened according to the cell morphology corresponding to the target cells to obtain labeled nuclei.
In one embodiment of the present disclosure, since cancer cells and normal cells have different cell morphologies (normal cells are of similar size and regular shape, whereas cancer cells are usually much larger than normal cells and irregular in shape), the labeled cell nuclei can be screened according to cell morphology such as cell size and cell shape, retaining only the labels for cancer cells. The labeled cancer cell nuclei serve as the marked cell nuclei, and at the same time the position information of the marked cell nuclei is obtained to form the cancer cell coordinate set D_detect. FIGS. 4A-4B are schematic diagrams of an input image and an output image containing labeled cell nuclei. As shown in FIG. 4A, the image in the visual field area is a cell membrane staining image in RGB format; after the cell nuclei are labeled by the method shown in FIG. 3, multiple points can be displayed in the visual field area, and, as shown in FIG. 4B, these points are the labeled nuclei.
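The morphology-based screening of step S303 can be sketched roughly as follows; the area threshold `MIN_AREA` and the toy mask are illustrative assumptions (the patent screens on size and shape without fixing numeric values).

```python
import numpy as np
from scipy import ndimage as ndi

# Toy labelled-nuclei mask: one small "normal" nucleus and one large one.
mask = np.zeros((20, 20), bool)
mask[2:4, 2:4] = True      # 4-pixel nucleus (normal-sized)
mask[8:16, 8:16] = True    # 64-pixel nucleus (abnormally large)

labels, n = ndi.label(mask)
sizes = np.bincount(labels.ravel())[1:]     # area of each detected nucleus

MIN_AREA = 20              # illustrative morphology threshold
kept = [i + 1 for i, s in enumerate(sizes) if s >= MIN_AREA]
centroids = ndi.center_of_mass(mask, labels, kept)  # D_detect-style coordinates
print(len(centroids))      # only the large nucleus survives screening
```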
In one embodiment of the present disclosure, after labeling the nucleus of the target cell, the cell membrane of the cancer cell in the cell membrane stain image may be delineated to obtain a labeled cell membrane.
Fig. 5 shows a schematic flow chart of cell membrane delineation, and as shown in fig. 5, the method of cell membrane delineation at least comprises steps S501-S503, specifically:
in step S501, the cell membrane stain image having the RGB image format is subjected to color channel decomposition to acquire a color channel image corresponding to the cell membrane.
In one embodiment of the present disclosure, similar to step S301, in order to delineate the cell membrane, the input RGB image must be decomposed into color channels to obtain the immunohistochemical channel images, including an H channel image, an E channel image, and a DAB channel image. The DAB channel image is the brown staining channel image corresponding to the cell membrane in the IHC membrane staining image, so the color channel image corresponding to the cell membrane in the embodiments of the present disclosure is the DAB channel image.
In step S502, a color channel image corresponding to the cell membrane is segmented according to a first preset threshold to obtain a segmented image.
In an embodiment of the present disclosure, the color channel image corresponding to the cell membrane may be segmented by an algorithm such as fixed thresholding or the Otsu method. Specifically, the brown color channel image may be segmented according to a first preset threshold, and the segmented brown image is used as the segmented image. The first preset threshold can be set according to actual needs; in addition, because different staining agents produce different staining depths, the first preset threshold also differs for different staining agents.
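A minimal sketch of this first-threshold segmentation, with an assumed toy DAB channel and an illustrative threshold value:

```python
import numpy as np

# Toy DAB-channel (brown stain) image: membranes have high optical density.
dab = np.array([[0.05, 0.10, 0.60],
                [0.70, 0.08, 0.65],
                [0.55, 0.62, 0.04]])

FIRST_THRESHOLD = 0.3               # illustrative "first preset threshold"
segmented = dab > FIRST_THRESHOLD   # binary segmentation of stained membrane
print(int(segmented.sum()))         # number of membrane pixels retained
```

For a data-driven alternative to the fixed value, `skimage.filters.threshold_otsu(dab)` would supply the Otsu threshold mentioned above.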
In step S503, a centerline of the segmented image is extracted according to a skeleton extraction algorithm, and a cell membrane is delineated according to the centerline to obtain a labeled cell membrane.
In an embodiment of the present disclosure, since cells are densely distributed in a tissue sample, the cell membranes in the IHC membrane staining image may be delineated to make the image details richer and to facilitate observation and analysis. Specifically, a skeleton extraction algorithm may be used to extract the center line of the segmented image, and the cell membrane is delineated according to this center line to obtain the marked cell membrane. Figs. 6A-6B are schematic diagrams of an input image and an output image containing marked cell membranes. As shown in fig. 6A, the stained cell membranes in the input image may overlap and interweave, so that individual cells cannot be accurately distinguished; by marking the cell membranes, however, thinner center lines without overlapping intersections can be obtained, as shown in fig. 6B. From the output image, a user can accurately locate the cell membranes and accurately distinguish adjacent cells.
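The centerline extraction can be sketched with scikit-image's skeletonization, here applied to an assumed toy membrane stroke (a minimal stand-in for the segmented image):

```python
import numpy as np
from skimage.morphology import skeletonize

# A 3-pixel-thick horizontal "membrane" stroke from the segmented image.
seg = np.zeros((9, 9), bool)
seg[3:6, 1:8] = True

centerline = skeletonize(seg)           # 1-pixel-wide skeleton (center line)
print(centerline.sum() < seg.sum())     # far fewer pixels than the thick stroke
```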
In step S220, a fusion image including the labeled cell nucleus and the labeled cell membrane is acquired based on the position information of the labeled cell nucleus and the label information of the labeled cell membrane.
In one embodiment of the present disclosure, fusing the cell nucleus detection results and the cell membrane delineation results is necessary for systematic analysis of cell membranes and nuclei. In the embodiments of the disclosure, fusion is performed mainly according to the position information of the marked cell nuclei and of the marked cell membranes: the position information of the marked cell nuclei is the coordinate set D_detect of all marked cancer cell nuclei, and the marking information of the marked cell membranes is the center lines formed when the cell membranes are delineated.
Fig. 7 shows a schematic flowchart of the process of acquiring the fusion image, and as shown in fig. 7, the process at least includes steps S701 to S704, specifically:
in step S701, a closed region and a non-closed region are determined based on the marking information of the marked cell membrane.
In one embodiment of the present disclosure, as can be seen from fig. 6B, some cells are wrapped by an intact cell membrane while others are not. The connected boundary regions formed by the center lines can therefore be determined by tracing the center lines of the marked cell membranes, and the closed and non-closed regions can then be determined from these connected boundary regions. Fig. 8 is a schematic flow chart of determining the closed region and the non-closed region. As shown in fig. 8, in step S801, a connected boundary region is determined according to the center lines corresponding to the marked cell membranes, and it is judged whether a closed connected region exists in the connected boundary region. In step S802, if a closed connected region exists in the connected boundary region, then when the closed connected region is determined to be an innermost region and its area is greater than or equal to a preset area threshold, the closed connected region is marked as a closed region; the preset area threshold may be set according to actual needs, for example to 1000 pixels. In step S803, if a non-closed connected region exists in the connected boundary region, the non-closed connected region is marked as a non-closed region. To improve the visualization of the statistical analysis results, so that a user can clearly distinguish the closed regions from the non-closed regions, the two may be labeled in different colors: for example, closed regions may be identified in pink and non-closed regions in green, although other color pairs may of course be used, which the present disclosure does not specifically limit.
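Steps S801-S803 can be sketched as follows, assuming the centerline map is a binary image: hole filling exposes interiors enclosed by closed centerlines, connected-component labeling separates them, and an illustrative area threshold stands in for the preset 1000-pixel value.

```python
import numpy as np
from scipy import ndimage as ndi

# Centerline map: a closed square loop (left) and an open arc (right).
line = np.zeros((12, 24), bool)
line[2:10, 2] = line[2:10, 9] = True   # closed loop: left/right sides
line[2, 2:10] = line[9, 2:10] = True   # closed loop: top/bottom sides
line[2:10, 14] = True                  # open arc: encloses nothing
line[2, 14:22] = True

filled = ndi.binary_fill_holes(line)
interior = filled & ~line              # pixels enclosed by closed centerlines
regions, n = ndi.label(interior)
AREA_MIN = 10                          # stand-in for the preset area threshold
closed = [i for i in range(1, n + 1) if (regions == i).sum() >= AREA_MIN]
print(len(closed))  # one closed region: the loop interior
```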
In step S702, the barycentric position of the closed region is determined, and the barycentric position is taken as the position information of the cell nucleus in the closed region.
In one embodiment of the present disclosure, in order to prevent conflicts between the cell membrane delineation result and the cell nucleus detection result, after distinguishing the closed regions from the non-closed regions, the nucleus position within each closed region may be redefined. Specifically, the barycentric position of the closed region may be determined from the pixel coordinates of each pixel point in the closed region; this barycentric position is the redefined position of the nucleus in the closed region. By redefining the nucleus position in each closed region, the set D_center of position information of the nuclei in closed regions can be determined.
In step S703, the inside of the closed region is filled to acquire a segmentation map corresponding to the closed region.
In an embodiment of the present disclosure, binarization pixel filling may be performed inside the closed region, for example, all pixels inside the closed region are replaced by 1, and pixels outside the closed region are replaced by 0, so that the segmentation map corresponding to the closed region may be determined according to different pixel values.
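Steps S702-S703 together can be sketched as below, assuming the interior of one closed region has already been isolated as a binary mask; the toy region is an illustrative assumption.

```python
import numpy as np

# Binary interior of one closed region (separated from its centerline).
region = np.zeros((10, 10), bool)
region[2:7, 3:9] = True

# Segmentation map M_enclosed: pixels inside the closed region become 1,
# pixels outside become 0 (the binarization filling of step S703).
m_enclosed = region.astype(np.uint8)

# Step S702: redefine the nucleus position as the region's barycentre,
# giving one entry of the set D_center.
ys, xs = np.nonzero(region)
center = (ys.mean(), xs.mean())
print(center)
```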
In step S704, a fusion image is determined based on the position information of the marker cell nucleus, the position information of the cell nucleus in the closed region, and the segmentation map.
In one embodiment of the present disclosure, the fused image may be determined according to expression (1):
D_all = (D_detect ∩ ∁_I M_enclosed) ∪ D_center (1)
wherein D_all is the fused image combining the nucleus detection and cell membrane delineation results, D_detect is the position information of the marked nuclei, D_center is the position information of the redefined nuclei in the closed regions, M_enclosed is the segmentation map corresponding to the closed regions, and ∁_I M_enclosed is the complement of that segmentation map within the IHC membrane staining image.
That is, the method of determining the fused image is as follows: first, acquire the first complementary region, i.e. the complement of the segmentation map within the IHC membrane staining image; then, take the intersection of the position information of the marked nuclei with this first complementary region, i.e. the position information of the marked nuclei in the non-closed regions; finally, take the union of this intersection with the position information of the nuclei in the closed regions to obtain the fused image containing the marked cell nuclei and the marked cell membranes.
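With the nucleus sets and the segmentation map represented as binary masks over the field of view, expression (1) reduces to elementwise set operations; the toy masks below are illustrative assumptions.

```python
import numpy as np

# Toy masks on a 6x6 field of view.
m_enclosed = np.zeros((6, 6), bool)      # closed-region segmentation map
m_enclosed[1:3, 1:3] = True

d_detect = np.zeros((6, 6), bool)        # detected nucleus positions
d_detect[2, 2] = d_detect[4, 4] = True   # one inside, one outside the region

d_center = np.zeros((6, 6), bool)        # redefined nuclei of closed regions
d_center[1, 1] = True

# Expression (1): D_all = (D_detect ∩ ∁_I M_enclosed) ∪ D_center
d_all = (d_detect & ~m_enclosed) | d_center
print(int(d_all.sum()))  # the outside detection plus the redefined nucleus
```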
In step S230, the degree of cell membrane staining is determined according to the cell membrane staining area in the fused image, and the number of target cells having different cell membrane staining states is determined according to the number of nuclei corresponding to the different cell membrane staining states in the fused image.
In an embodiment of the present disclosure, after the fused image is obtained, quantitative statistics may be performed on it. The quantitative indicators are: the degree of cell membrane staining, the number of cells with a complete cell membrane staining state, the number of cells with an incomplete cell membrane staining state, and the number of cells with no cell membrane staining state.
In one embodiment of the present disclosure, the degree of cell membrane staining is determined primarily from the stained area of the cell membrane in the fused image. In the IHC membrane staining image, the stained cell membrane is usually brown, so when the staining degree of the cell membrane in the IHC membrane staining image is analyzed statistically, the brown area image can be extracted, and the staining degree can be judged according to the brown area image.
Fig. 9 shows a schematic diagram of a process for determining the degree of staining of a cell membrane, as shown in fig. 9, the process at least comprises steps S901-S903, specifically:
in step S901, the cell membrane stain image having the RGB image format is subjected to color channel decomposition to acquire a color channel image corresponding to the cell membrane.
In one embodiment of the present disclosure, the input image is in RGB image format, and the color channel image corresponding to the cell membrane is an immunohistochemical channel image, so that to obtain the color channel image corresponding to the cell membrane, the IHC membrane staining image in RGB image format can be color channel decomposed to obtain the immunohistochemical channel image comprising the H channel image, the E channel image and the DAB channel image, wherein the DAB channel image is the brown staining channel image corresponding to the cell membrane.
In step S902, the color channel image corresponding to the cell membrane is segmented according to a second preset threshold to obtain a cell membrane staining area template.
In an embodiment of the present disclosure, the DAB channel image may be segmented according to a second preset threshold. The second preset threshold may be the same as the first preset threshold, since both are used to segment the brown staining channel image; because different types of stains are used, the thresholds used for segmentation also differ, so the first and second preset thresholds may be set according to actual needs, which the embodiments of the present disclosure do not specifically limit. By segmenting the DAB channel image, a cell membrane staining area template M_DAB can be obtained. The template M_DAB corresponds to the brown staining region and can be represented by binarized pixels, where the pixel value at a cell membrane site is 1 and the pixel value at a non-cell-membrane site is 0.
In step S903, image extraction is performed on the color channel image corresponding to the cell membrane by the cell membrane staining area template to obtain a cell membrane staining area image, and the cell membrane staining degree is determined from the cell membrane staining area image.
In one embodiment of the present disclosure, in order to determine the degree of staining of the cell membrane, the cell membrane staining region must first be extracted, and the degree of staining is then determined from the pixel values in that region. In the embodiments of the disclosure, the DAB channel image may be extracted through the cell membrane staining area template to obtain the cell membrane staining region image; specifically, the pixel matrix corresponding to the cell membrane staining area template may be multiplied point by point with the pixel matrix corresponding to the DAB channel image to obtain the cell membrane staining region image corresponding to the template.
In one embodiment of the present disclosure, threshold segmentation of the DAB channel image does not necessarily yield a cell membrane staining area template. When no template is obtained, the area of the extracted cell membrane staining region image is zero, and it can thus be determined that the cell membrane is not stained; when the template is obtained, the area of the extracted cell membrane staining region image is non-zero, and the degree of cell membrane staining can be determined from the pixel values of that image. Fig. 10 is a schematic flow chart of determining the degree of cell membrane staining. As shown in fig. 10, in step S1001, the average value of all pixels in the cell membrane staining region image is calculated and compared with a staining threshold; in step S1002, when the pixel average is less than the staining threshold, the cell membrane staining degree is determined to be weak staining; in step S1003, when the pixel average is greater than or equal to the staining threshold, the cell membrane staining degree is determined to be strong staining.
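Steps S901-S1003 can be sketched end to end as below; the toy DAB values, both thresholds, and the choice to average only over stained pixels are illustrative assumptions.

```python
import numpy as np

# Toy DAB channel and its binary staining-area template M_DAB.
dab = np.array([[0.0, 0.8, 0.9],
                [0.0, 0.7, 0.0],
                [0.0, 0.0, 0.0]])
m_dab = (dab > 0.3).astype(float)       # second preset threshold (illustrative)

stained = dab * m_dab                   # point-by-point template extraction
if not m_dab.any():                     # empty template: membrane not stained
    degree = "no staining"
else:
    STAIN_THRESHOLD = 0.75              # illustrative staining threshold
    mean = stained[m_dab > 0].mean()    # average over the stained pixels
    degree = "strong" if mean >= STAIN_THRESHOLD else "weak"
print(degree)
```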
In one embodiment of the present disclosure, in a complete cell membrane staining image, the staining degree of the cell membrane is not completely the same, and there are complete staining, incomplete staining and no staining, so when counting the number of cancer cells, the number of cancer cells with different cell membrane staining states needs to be counted.
When counting the number of cancer cells with a complete cell membrane staining state, the first number of cell nuclei in all closed regions of the fused image can be obtained. This first number is the number of elements in the set D_center of position information of nuclei in closed regions; since a closed region is a cell wrapped by an intact cell membrane, the first number is the number of cancer cells with complete cell membrane staining, and can be written card(D_center).
When counting the number of cancer cells with an incomplete cell membrane staining state, the count can be obtained by processing the cell membrane staining area template. Fig. 11 is a schematic flow chart of determining the number of cancer cells with incomplete cell membrane staining. As shown in fig. 11, in step S1101, hole filling is performed on the cell membrane staining area template, and the hole-filled staining area is dilated by a preset distance to obtain a dilated region. Since there are stained nuclei near the closed regions, these nuclei may be excluded when the DAB channel image is threshold-segmented, leaving holes in the cell membrane staining area template; these holes are therefore filled first when the template is processed, for example by replacing the original pixel value 0 with 1. After hole filling, the area corresponding to the template is dilated by a preset distance d to obtain the dilated region M_expand. In step S1102, the first complementary region, i.e. the complement of the segmentation map within the cell membrane staining image, is acquired; this is the region ∁_I M_enclosed in expression (1). In step S1103, the union of the dilated region and the first complementary region is obtained, and the intersection of this union with the fused image determines the incomplete cell membrane staining area. In step S1104, the second number of cell nuclei in the fused image corresponding to the incomplete cell membrane staining area is obtained; this second number is the number of cancer cells with an incomplete cell membrane staining state. The incomplete cell membrane staining area can be determined by expression (2):
D_incomplete = D_all ∩ (M_expand ∪ ∁_I M_enclosed) (2)
wherein D_incomplete is the incomplete cell membrane staining area, M_expand is the dilated region, and ∁_I M_enclosed is the first complementary region. Accordingly, the number of nuclei in the incomplete cell membrane staining area is card(D_incomplete).
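Steps S1101-S1104 and expression (2) can be sketched with binary masks as below; the toy layout, the dilation distance (`iterations=2`), and the nucleus positions are illustrative assumptions (the toy template contains no holes, so the hole-filling step would be a no-op and is omitted).

```python
import numpy as np
from scipy import ndimage as ndi

field = (8, 8)
m_dab = np.zeros(field, bool)          # (hole-filled) staining-area template
m_dab[3:5, 3:5] = True

m_enclosed = np.zeros(field, bool)     # closed-region segmentation map
m_enclosed[0:2, 0:2] = True

d_all = np.zeros(field, bool)          # fused nucleus positions
d_all[1, 1] = d_all[5, 5] = d_all[7, 7] = True

m_expand = ndi.binary_dilation(m_dab, iterations=2)  # dilate by preset distance

# Expression (2): D_incomplete = D_all ∩ (M_expand ∪ ∁_I M_enclosed)
d_incomplete = d_all & (m_expand | ~m_enclosed)
print(int(d_incomplete.sum()))  # the nucleus in the closed region is excluded
```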
When counting the number of cancer cells with no cell membrane staining, the determination can be made based on the region outside the dilated region, the first complementary region, and the nuclei in the fused image. Fig. 12 is a schematic flow chart of determining the number of cancer cells with no cell membrane staining. As shown in fig. 12, in step S1201, the second complementary region, i.e. the complement ∁_I M_expand of the dilated region within the cell membrane staining image, is acquired. In step S1202, the intersection of the fused image and the first complementary region is acquired, and then the intersection of that result with the second complementary region, so as to obtain the no-cell-membrane-staining area. In step S1203, the third number of cell nuclei in the fused image corresponding to the no-cell-membrane-staining area is acquired; this third number is the number of cancer cells with no cell membrane staining. The no-cell-membrane-staining area can be determined according to expression (3):
D_none = D_all ∩ ∁_I M_enclosed ∩ ∁_I M_expand (3)
wherein D_none is the no-cell-membrane-staining area, D_all is the fused image, ∁_I M_enclosed is the first complementary region, and ∁_I M_expand is the second complementary region. Accordingly, the number of nuclei in the no-cell-membrane-staining area is card(D_none).
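Expression (3) likewise reduces to elementwise mask operations; the sketch below reuses the same illustrative toy layout assumed for expression (2).

```python
import numpy as np
from scipy import ndimage as ndi

m_enclosed = np.zeros((8, 8), bool)    # closed-region segmentation map
m_enclosed[0:2, 0:2] = True
m_dab = np.zeros((8, 8), bool)         # staining-area template
m_dab[3:5, 3:5] = True
m_expand = ndi.binary_dilation(m_dab, iterations=2)  # dilated region

d_all = np.zeros((8, 8), bool)         # fused nucleus positions
d_all[1, 1] = d_all[5, 5] = d_all[7, 7] = True

# Expression (3): D_none = D_all ∩ ∁_I M_enclosed ∩ ∁_I M_expand
d_none = d_all & ~m_enclosed & ~m_expand
print(int(d_none.sum()))  # only the nucleus far from any stain remains
```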
In one embodiment of the present disclosure, after the number of target cells with each cell membrane staining state is obtained, it may be converted into a cell-count ratio for presentation. Specifically, the number of nuclei corresponding to each cell membrane staining state may be divided by the number of all nuclei in the fused image to obtain the percentage of target cells whose cell membranes have that staining state. For example, the proportion of cancer cells with complete cell membrane staining can be written card(D_center)/card(D_all); the proportion with incomplete cell membrane staining, card(D_incomplete)/card(D_all); and the proportion with no cell membrane staining, card(D_none)/card(D_all).
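The conversion of counts into percentages can be sketched as below; the counts themselves are illustrative assumptions chosen only to reproduce ratios of the kind shown in fig. 14A.

```python
# Illustrative counts: card(D_all), card(D_center), card(D_incomplete), card(D_none).
n_all, n_center, n_incomplete, n_none = 21, 20, 1, 0

# card(D_x)/card(D_all) for each staining state, as rounded percentages.
ratios = {state: round(100.0 * count / n_all, 1)
          for state, count in [("complete", n_center),
                               ("incomplete", n_incomplete),
                               ("none", n_none)]}
print(ratios)  # e.g. fig. 14A-style proportions: 95.2 / 4.8 / 0.0
```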
Further, when qualitatively displaying the image analysis result, the cell nuclei of the target cells with different cell membrane staining states may be labeled with different colors, for example, the cell nuclei of the cancer cells with a complete cell membrane staining state may be labeled with a golden color, the cell nuclei of the cancer cells with an incomplete cell membrane staining state may be labeled with a red color, and the cell nuclei of the cancer cells with a no cell membrane staining state may be labeled with a blue color.
The cell membrane staining image analysis method in the embodiments of the present disclosure may be used for qualitative display and quantitative analysis of any membrane staining image. Taking image analysis of a HER2-positive breast cancer cell membrane staining image as an example, fig. 13 shows a flow diagram of image analysis of such an image. As shown in fig. 13, in step S1301, a HER2-positive breast cancer cell membrane staining image is obtained; in step S1302, the nuclei of the cancer cells in the image are detected and the cell membranes of the cancer cells are delineated; in step S1303, the nucleus detection result and the membrane delineation result are fused to obtain a fused image, using the same fusion method as shown in fig. 7, which is not repeated here; in step S1304, the degree of cell membrane staining is determined according to the cell membrane staining region in the fused image, and the number of breast cancer cells with different cell membrane staining states is determined according to the number of nuclei corresponding to the different staining states in the fused image, using the same methods as shown in figs. 9-10 and described in the above embodiments, which are likewise not repeated here; in step S1305, the nucleus detection result and the membrane delineation result are labeled in different colors in the HER2-positive breast cancer cell membrane staining image, and the cell membrane staining degree and the number of breast cancer cells with different cell membrane staining states are displayed in the image.
Figs. 14A-14C are schematic diagrams of HER2 staining image analysis results at 40x magnification. As shown in fig. 14A, the cell membranes are depicted by center lines, and center lines of different colors correspond to cell membranes in different states: for example, pink center lines outline complete cell membranes and green center lines outline incomplete cell membranes. At the same time, the cell nuclei are marked in different colors, for example gold for the redefined nuclei inside complete cell membranes and red for nuclei in incomplete cell membranes. The quantitative analysis results are shown above the image: staining degree: strong staining; complete delineation: 95.2%; incomplete delineation: 4.8%; no delineation: 0.0%. As shown in fig. 14B, each object is labeled in the same colors as in fig. 14A, but the quantitative analysis result of this HER2 membrane staining image differs: staining degree: weak staining; complete delineation: 84.6%; incomplete delineation: 15.4%; no delineation: 0.0%. As shown in fig. 14C, the HER2 membrane staining image shows no cell membrane staining, i.e. no center-line-marked cell membranes, only nuclei with no cell membrane staining, which are marked in blue; the corresponding quantitative analysis result is: staining degree: no staining; complete delineation: 0.0%; incomplete delineation: 0.0%; no delineation: 100.0%.
It should be noted that in the figures, complete delineation indicates the proportion of target cells with a complete cell membrane staining state, incomplete delineation indicates the proportion with an incomplete cell membrane staining state, and no delineation indicates the proportion with no cell membrane staining state.
Compared with the prior art, in which only cell nucleus detection or only cell membrane delineation is performed, the cell membrane staining image analysis method can provide accurate qualitative display images and accurate quantitative analysis data, providing data support for pathologists' diagnoses and further improving the efficiency and accuracy of disease diagnosis.
Embodiments of the disclosed device are described below, which can be used to perform the cell membrane staining image analysis method in the above embodiments of the present disclosure. For details not disclosed in the embodiments of the disclosed device, please refer to the embodiments of the cell membrane staining image analysis method disclosed above.
Fig. 15 schematically illustrates a block diagram of a cell membrane stain image analysis apparatus according to one embodiment of the present disclosure.
Referring to fig. 15, a cell membrane stain image analysis apparatus 1500 according to an embodiment of the present disclosure includes: a marking module 1501, a fusion module 1502, and a statistics module 1503.
The marking module 1501 is configured to obtain a cell membrane staining image, mark a cell nucleus of a target cell in the cell membrane staining image to obtain a marked cell nucleus, and delineate a cell membrane in the cell membrane staining image to obtain a marked cell membrane; a fusion module 1502, configured to obtain a fusion image including the labeled cell nucleus and the labeled cell membrane according to the position information of the labeled cell nucleus and the label information of the labeled cell membrane; the statistical module 1503 is configured to determine a degree of cell membrane staining according to the cell membrane staining region in the fusion image, and determine the number of target cells with different cell membrane staining states according to the number of nuclei corresponding to different cell membrane staining states in the fusion image.
In one embodiment of the present disclosure, the marking module 1501 is configured to: performing color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to cell nuclei; processing the color channel images corresponding to the cell nuclei to mark all of the cell nuclei in the cell membrane stain image; and screening the marked cell nucleus according to the cell morphology corresponding to the target cell to obtain the marked cell nucleus.
In one embodiment of the present disclosure, the marking module 1501 is configured to: carrying out color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to the cell membrane; segmenting the color channel image corresponding to the cell membrane according to a first preset threshold value to obtain a segmented image; and extracting a central line of the segmentation image according to a skeleton extraction algorithm, and delineating the cell membrane according to the central line to obtain the marked cell membrane.
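The embodiment names "a skeleton extraction algorithm" without selecting one; Zhang–Suen thinning is a standard choice for reducing the thresholded membrane mask to a one-pixel-wide center line. A minimal NumPy sketch (the function name and the 0/1 mask convention are assumptions):

```python
import numpy as np

def zhang_suen_thin(img):
    """Zhang-Suen thinning: reduce a binary membrane mask (1 = membrane
    pixel, i.e. the segmented image) to a one-pixel-wide center line."""
    img = img.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):          # the two Zhang-Suen sub-iterations
            p = np.pad(img, 1)
            # 8-neighbours P2..P9, clockwise starting from north
            P2, P3 = p[:-2, 1:-1], p[:-2, 2:]
            P4, P5 = p[1:-1, 2:], p[2:, 2:]
            P6, P7 = p[2:, 1:-1], p[2:, :-2]
            P8, P9 = p[1:-1, :-2], p[:-2, :-2]
            neigh = [P2, P3, P4, P5, P6, P7, P8, P9]
            B = sum(neigh)           # number of foreground neighbours
            # A = number of 0 -> 1 transitions around the neighbourhood
            A = sum(((neigh[i] == 0) & (neigh[(i + 1) % 8] == 1))
                    for i in range(8))
            if step == 0:
                cond = (P2 * P4 * P6 == 0) & (P4 * P6 * P8 == 0)
            else:
                cond = (P2 * P4 * P8 == 0) & (P2 * P6 * P8 == 0)
            mask = (img == 1) & (B >= 2) & (B <= 6) & (A == 1) & cond
            if mask.any():           # delete marked pixels simultaneously
                img[mask] = 0
                changed = True
    return img
```

Applied to the segmented membrane image, the surviving pixels form the center line used to delineate the marked cell membrane.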
In one embodiment of the present disclosure, the fusion module 1502 includes: the area dividing unit is used for determining a closed area and a non-closed area according to the marking information of the marked cell membrane; a cell nucleus redefinition unit for determining the barycentric position of the closed region and using the barycentric position as the position information of the cell nucleus in the closed region; a segmentation graph forming unit, configured to fill the inside of the closed region to obtain a segmentation graph corresponding to the closed region; and the fusion image generating unit is used for determining the fusion image according to the position information of the marked cell nucleus, the position information of the cell nucleus in the closed area and the segmentation map.
In one embodiment of the present disclosure, the region dividing unit is configured to: determine a connected boundary region according to the center line corresponding to the marked cell membrane, and judge whether a closed connected region exists in the connected boundary region; if a closed connected region exists in the connected boundary region, mark it as the closed region when it is determined to be an innermost region and its area is greater than or equal to a preset area threshold value; and if a non-closed connected region exists in the connected boundary region, mark it as the non-closed region.
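Under one reading of this unit, the connected boundary regions are the connected components of the image once the membrane center-line pixels are removed: a component touching the image border cannot be enclosed by a membrane and is non-closed, while an interior component above the area threshold is closed. A sketch with scipy.ndimage (the border heuristic and names are assumptions, and the patent's "innermost region" test is simplified away):

```python
import numpy as np
from scipy import ndimage as ndi

def classify_regions(centerline, min_area=20):
    """Label the connected components left when the membrane center line
    is removed. Components touching the image border count as non-closed;
    interior components with at least min_area pixels count as closed.
    (min_area stands in for the patent's preset area threshold.)"""
    labels, n = ndi.label(~centerline.astype(bool))
    border_labels = np.unique(np.concatenate(
        [labels[0], labels[-1], labels[:, 0], labels[:, -1]]))
    closed = np.zeros_like(labels, dtype=bool)
    non_closed = np.zeros_like(labels, dtype=bool)
    for i in range(1, n + 1):
        region = labels == i
        if i in border_labels:
            non_closed |= region        # leaks to the border: not enclosed
        elif region.sum() >= min_area:
            closed |= region            # fully enclosed and large enough
    return closed, non_closed
```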
In one embodiment of the present disclosure, the fused image generating unit is configured to: acquiring a first complementary set region complementary to the segmentation map in the cell membrane staining image; acquiring an intersection between the position information of the labeled cell nucleus and the first complementary set region, and acquiring a union of the intersection and the position information of the cell nucleus in the closed region to obtain the fusion image.
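The set operations of the fused image generating unit translate directly into Boolean mask algebra: intersect the marked nuclei with the complement of the filled segmentation map, then take the union with one barycenter nucleus per closed region. A sketch under the assumption that all inputs are Boolean masks of the same shape:

```python
import numpy as np
from scipy import ndimage as ndi

def fuse(nuclei_mask, closed_mask):
    """Build the fused image: keep marked nuclei that fall outside the
    filled closed-region segmentation map, and redefine each closed
    region's nucleus as a single point at its barycenter."""
    labels, n = ndi.label(closed_mask)
    centroids = np.zeros_like(closed_mask, dtype=bool)
    if n:
        for r, c in ndi.center_of_mass(closed_mask, labels, range(1, n + 1)):
            centroids[int(round(r)), int(round(c))] = True
    # intersection with the first complementary set region (~closed_mask),
    # then union with the closed-region barycenter nuclei
    return (nuclei_mask & ~closed_mask) | centroids
```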
In one embodiment of the present disclosure, the statistics module 1503 includes: the image decomposition unit is used for carrying out color channel decomposition on the cell membrane staining image with the RGB image format so as to obtain a color channel image corresponding to the cell membrane; the image segmentation unit is used for segmenting the color channel image corresponding to the cell membrane according to a second preset threshold value so as to obtain a cell membrane staining area template; and the image extraction unit is used for extracting the color channel image corresponding to the cell membrane through the cell membrane staining area template so as to obtain the cell membrane staining area image and determining the cell membrane staining degree according to the cell membrane staining area image.
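The threshold-and-extract step of the statistics module can be sketched in two lines; `threshold` stands in for the second preset threshold, whose value the patent leaves open:

```python
import numpy as np

def extract_staining_region(membrane_channel, threshold):
    """Segment the membrane color-channel image into a staining-area
    template, then extract the stained pixel values used by the
    staining-degree statistics that follow."""
    template = membrane_channel > threshold    # second preset threshold
    return template, membrane_channel[template]
```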
In one embodiment of the present disclosure, the image extraction unit includes: a first cell membrane staining degree determination unit for determining that the cell membrane staining degree is zero when the area of the cell membrane staining area image is zero; a second cell membrane staining degree determination unit for determining the cell membrane staining degree according to the pixel value in the cell membrane staining area image when the area of the cell membrane staining area image is not zero.
In one embodiment of the present disclosure, the second cell membrane staining degree determination unit is configured to: calculate the pixel average value of all pixels in the cell membrane staining area image, and compare the pixel average value with a staining threshold value; when the pixel average value is smaller than the staining threshold value, determine the cell membrane staining degree to be weak staining; and when the pixel average value is greater than or equal to the staining threshold value, determine the cell membrane staining degree to be strong staining.
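The two determination units amount to a three-way rule; a direct transcription, where the threshold value 120 is an arbitrary placeholder (the patent gives no number):

```python
import numpy as np

def staining_degree(stain_pixels, staining_threshold=120):
    """stain_pixels: values extracted from the cell membrane staining
    area image. Zero area -> degree zero; otherwise the pixel average
    decides weak vs. strong staining, mirroring the unit logic above."""
    if stain_pixels.size == 0:
        return "zero"
    return "weak" if stain_pixels.mean() < staining_threshold else "strong"
```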
In one embodiment of the present disclosure, the statistics module 1503 is configured to: acquire a first number of cell nuclei in all the closed regions in the fusion image, wherein the first number is the number of target cells with a complete cell membrane staining state.
In one embodiment of the present disclosure, the statistics module 1503 is configured to: fill holes in the cell membrane staining area template, and dilate the hole-filled cell membrane staining area by a preset distance to obtain a dilated region; acquire a first complementary set region complementary to the segmentation map in the cell membrane staining image; acquire a union of the dilated region and the first complementary set region, and acquire an intersection between the union and the fused image to determine an incomplete cell membrane staining region; and acquire a second number of cell nuclei corresponding to the incomplete cell membrane staining region in the fused image, wherein the second number is the number of target cells with an incomplete cell membrane staining state.
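The hole-filling, dilation, and set operations of this embodiment map onto scipy.ndimage primitives; the sketch assumes Boolean masks of equal shape, with `distance` playing the role of the preset dilation distance:

```python
import numpy as np
from scipy import ndimage as ndi

def incomplete_staining_region(stain_template, seg_map, fused, distance=2):
    """Fill holes in the staining-area template, dilate by the preset
    distance, union with the complement of the segmentation map, and
    intersect with the fused image."""
    filled = ndi.binary_fill_holes(stain_template)
    dilated = ndi.binary_dilation(filled, iterations=distance)
    first_complement = ~seg_map          # first complementary set region
    return (dilated | first_complement) & fused
```

Nuclei of the fused image falling inside the returned mask are then counted as target cells with an incomplete staining state.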
In one embodiment of the present disclosure, the statistics module 1503 is configured to: acquire a second complementary set region complementary to the dilated region in the cell membrane staining image; acquire an intersection between the fused image and the first complementary set region, and acquire an intersection between that intersection and the second complementary set region to obtain a cell membrane staining-free region; and acquire a third number of cell nuclei corresponding to the cell membrane staining-free region in the fused image, wherein the third number is the number of target cells with a cell membrane staining-free state.
In one embodiment of the present disclosure, the cell membrane staining image analysis apparatus 1500 further includes: a data conversion module, configured to compare the number of cell nuclei corresponding to the different cell membrane staining states with the number of all cell nuclei in the fused image, so as to obtain the number percentage of target cells with the different cell membrane staining states.
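The data conversion module's ratio is a one-liner per state; the state names below are illustrative, not fixed by the patent:

```python
def staining_state_percentages(nucleus_counts):
    """nucleus_counts: mapping of staining state -> nucleus count, e.g.
    {"complete": 30, "incomplete": 50, "none": 20}. Returns the number
    percentage of target cells per staining state."""
    total = sum(nucleus_counts.values())
    if total == 0:
        return {state: 0.0 for state in nucleus_counts}
    return {state: 100.0 * count / total
            for state, count in nucleus_counts.items()}
```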
The present disclosure also provides a cell membrane staining image analysis system, and fig. 16 shows a schematic structural diagram of the cell membrane staining image analysis system. As shown in fig. 16, a cell membrane staining image analysis system 1600 includes: a microscope 1601, an imaging device 1602, an image processing device 1603, and a display device 1604.
Specifically, the microscope 1601 is used for observing a cell membrane stained sample; the imaging device 1602 is connected to the microscope and is used for capturing the cell membrane staining image presented in the microscope; the image processing device 1603 is connected to the imaging device and is used for receiving the cell membrane staining image captured by the imaging device, where the image processing device includes a storage device and one or more processors, the storage device is used for storing one or more programs, and the one or more programs, when executed by the one or more processors, cause the one or more processors to process the cell membrane staining image so as to implement the cell membrane staining image analysis method in the above embodiments; and the display device 1604 is connected to the image processing device and is used for receiving the image analysis result output by the image processing device and displaying it on a display screen of the display device.
Fig. 17 shows a schematic structural diagram of a computer system suitable for implementing the image processing device 1603 of the present disclosure.
It should be noted that the computer system 1700 of the image processing device 1603 shown in fig. 17 is only an example, and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in fig. 17, the computer system 1700 includes a Central Processing Unit (CPU) 1701 which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1702 or a program loaded from a storage portion 1708 into a Random Access Memory (RAM) 1703, implementing the cell membrane staining image analysis method described in the above embodiments. In the RAM 1703, various programs and data necessary for system operation are also stored. The CPU 1701, ROM 1702, and RAM 1703 are connected to each other through a bus 1704. An Input/Output (I/O) interface 1705 is also connected to the bus 1704.
The following components are connected to the I/O interface 1705: an input section 1706 including a keyboard, a mouse, and the like; an output section 1707 including a display such as a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) and a speaker; a storage portion 1708 including a hard disk and the like; and a communication section 1709 including a network interface card such as a Local Area Network (LAN) card or a modem. The communication section 1709 performs communication processing via a network such as the Internet. A drive 1710 is also connected to the I/O interface 1705 as necessary. A removable medium 1711, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1710 as necessary, so that a computer program read out from it can be installed into the storage portion 1708 as needed.
In particular, the processes described above with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1709 and/or installed from the removable medium 1711. When the computer program is executed by the Central Processing Unit (CPU) 1701, various functions defined in the system of the present disclosure are executed.
It should be noted that the computer readable medium shown in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM), a flash Memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware, and the described units may also be disposed in a processor. The names of these units do not, under certain circumstances, constitute a limitation on the units themselves.
As another aspect, the present disclosure also provides a computer-readable medium, which may be included in the image processing device described in the above embodiments, or may exist separately without being assembled into that device. The computer-readable medium carries one or more programs which, when executed by the device, cause the device to implement the method described in the above embodiments.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (15)

1. A method for analyzing a cell membrane staining image, comprising:
acquiring a cell membrane staining image, marking the cell nucleus of a target cell in the cell membrane staining image to obtain a marked cell nucleus, and delineating the cell membrane of the target cell in the cell membrane staining image to obtain a marked cell membrane;
determining a closed area and a non-closed area according to the marking information of the marked cell membrane, and acquiring a fusion image containing the marked cell nucleus and the marked cell membrane according to the position information of the marked cell nucleus and the closed area;
and determining the staining degree of the cell membrane according to the cell membrane staining region in the fusion image, and determining the number of target cells with different cell membrane staining states according to the number of cell nuclei corresponding to different cell membrane staining states in the fusion image.
2. The method for analyzing cell membrane staining image according to claim 1, wherein the labeling the cell nucleus of the target cell in the cell membrane staining image to obtain a labeled cell nucleus comprises:
performing color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to cell nuclei;
processing the color channel images corresponding to the cell nuclei to mark all of the cell nuclei in the cell membrane stain image;
and screening the marked cell nucleus according to the cell morphology corresponding to the target cell to obtain the marked cell nucleus.
3. The method for analyzing cell membrane staining image according to claim 1, wherein the delineating the cell membrane of the target cell in the cell membrane staining image to obtain a labeled cell membrane comprises:
carrying out color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to the cell membrane;
segmenting the color channel image corresponding to the cell membrane according to a first preset threshold value to obtain a segmented image;
and extracting a central line of the segmentation image according to a skeleton extraction algorithm, and delineating the cell membrane according to the central line so as to obtain the marked cell membrane.
4. The method for analyzing the cell membrane stain image according to claim 1, wherein the acquiring a fusion image containing the labeled cell nuclei and the labeled cell membrane based on the position information of the labeled cell nuclei and the closed regions comprises:
determining the gravity center position of the closed area, and taking the gravity center position as the position information of the cell nucleus in the closed area;
filling the inside of the closed area to obtain a segmentation map corresponding to the closed area;
and determining the fusion image according to the position information of the marked cell nucleus, the position information of the cell nucleus in the closed area and the segmentation image.
5. The method for analyzing the cell membrane stain image according to claim 1, wherein the determining of the closed region and the non-closed region according to the labeling information of the labeled cell membrane comprises:
determining a connected boundary region according to the center line corresponding to the marked cell membrane, and judging whether a closed connected region exists in the connected boundary region;
if a closed connected region exists in the connected boundary region, marking the closed connected region as the closed region when the closed connected region is determined to be an innermost region and the area of the closed connected region is greater than or equal to a preset area threshold value;
if a non-closed connected region exists in the connected boundary region, marking the non-closed connected region as the non-closed region.
6. The cell membrane staining image analysis method according to claim 4, wherein the determining the fusion image according to the position information of the marked cell nucleus, the position information of the cell nucleus in the closed area, and the segmentation map comprises:
acquiring a first complementary set region complementary to the segmentation map in the cell membrane staining image;
acquiring an intersection between the position information of the labeled cell nucleus and the first complementary set region, and acquiring a union of the intersection and the position information of the cell nucleus in the closed region to obtain the fusion image.
7. The method for analyzing cell membrane staining image according to claim 4, wherein the determining the degree of cell membrane staining from the cell membrane staining area in the fused image comprises:
performing color channel decomposition on the cell membrane staining image with an RGB image format to obtain a color channel image corresponding to the cell membrane;
segmenting the color channel image corresponding to the cell membrane according to a second preset threshold value to obtain a cell membrane staining area template;
and performing image extraction on the color channel image corresponding to the cell membrane through the cell membrane staining area template to obtain a cell membrane staining area image, and determining the cell membrane staining degree according to the cell membrane staining area image.
8. The method for analyzing cell membrane staining image according to claim 7, wherein the determining the degree of cell membrane staining from the cell membrane staining area image comprises:
when the area of the cell membrane staining area image is zero, determining that the cell membrane staining degree is zero;
when the area of the cell membrane staining area image is not zero, determining the cell membrane staining degree according to the pixel value in the cell membrane staining area image.
9. The method for analyzing cell membrane staining image according to claim 8, wherein the determining the cell membrane staining degree according to the pixel values in the cell membrane staining region image comprises:
calculating the pixel average value of all pixels in the cell membrane staining area image, and comparing the pixel average value with a staining threshold value;
when the pixel average value is less than the staining threshold value, determining the cell membrane staining degree as weak staining;
when the pixel average is greater than or equal to the staining threshold, determining the cell membrane staining degree as strong staining.
10. The method for analyzing cell membrane staining image according to claim 1, wherein the determining the number of target cells having different cell membrane staining states according to the number of nuclei corresponding to the different cell membrane staining states in the fused image comprises:
and acquiring a first number of cell nuclei in all the closed areas in the fusion image, wherein the first number is the number of target cells with a complete cell membrane staining state.
11. The method for analyzing cell membrane staining image according to claim 7, wherein the determining the number of target cells with different cell membrane staining states according to the number of nuclei corresponding to different cell membrane staining states in the fused image comprises:
filling holes in the cell membrane staining area template, and dilating the hole-filled cell membrane staining area by a preset distance to obtain a dilated region;
acquiring a first complementary set region complementary to the segmentation map in the cell membrane staining image;
acquiring a union of the dilated region and the first complementary set region, and acquiring an intersection between the union and the fusion image to determine an incomplete cell membrane staining region;
and acquiring a second number of cell nuclei corresponding to the incomplete cell membrane staining area in the fusion image, wherein the second number is the number of target cells with incomplete cell membrane staining state.
12. The method for analyzing cell membrane staining images according to claim 11, wherein the determining the number of target cells with different cell membrane staining states according to the number of nuclei corresponding to different cell membrane staining states in the fusion image comprises:
acquiring a second complementary set region complementary to the dilated region in the cell membrane staining image;
acquiring an intersection between the fusion image and the first complementary set region, and acquiring an intersection between that intersection and the second complementary set region to obtain a cell membrane staining-free region;
and acquiring a third number of cell nuclei corresponding to the cell membrane staining-free area in the fusion image, wherein the third number is the number of target cells with a cell membrane staining-free state.
13. The method for analyzing a cell membrane staining image according to any one of claims 10 to 12, further comprising:
and comparing the number of cell nuclei corresponding to the different cell membrane staining states with the number of all cell nuclei in the fusion image to obtain the number percentage of target cells with the different cell membrane staining states.
14. An apparatus for analyzing a cell membrane staining image, comprising:
the marking module is used for acquiring a cell membrane staining image, marking the cell nucleus of the target cell in the cell membrane staining image to obtain a marked cell nucleus, and delineating the cell membrane of the target cell in the cell membrane staining image to obtain a marked cell membrane;
the fusion module is used for determining a closed area and a non-closed area according to the marking information of the marked cell membrane and acquiring a fusion image containing the marked cell nucleus and the marked cell membrane according to the position information of the marked cell nucleus and the closed area;
and the statistics module is used for determining the staining degree of the cell membrane according to the cell membrane staining region in the fusion image and determining the number of target cells with different cell membrane staining states according to the number of cell nuclei corresponding to different cell membrane staining states in the fusion image.
15. A system for analyzing a cell membrane staining image, comprising:
the microscope is used for observing the cell membrane staining sample;
the imaging device is connected with the microscope and is used for capturing the cell membrane staining image presented in the microscope;
an image processing device connected to the imaging device for receiving the cell membrane staining image captured by the imaging device, the image processing device comprising a storage device and one or more processors, wherein the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors are caused to execute the cell membrane staining image analysis method according to any one of claims 1 to 13 on the cell membrane staining image;
and the display device is connected with the image processing device and used for receiving the image analysis result output by the image processing device and displaying the image analysis result on a display screen of the display device.
CN201910765920.5A 2019-08-19 2019-08-19 Cell membrane staining image analysis method, device and system Active CN110490882B (en)

Publications (2)

Publication Number Publication Date
CN110490882A (en) 2019-11-22
CN110490882B (en) 2022-12-27





CN113869367A (en) Model capability detection method and device, electronic equipment and computer readable medium
CN113411550A (en) Video coloring method, device, equipment and storage medium
CN116664873B (en) Image information processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant