WO2021093451A1 - Method, apparatus, system and storage medium for processing pathological slice images - Google Patents

Method, apparatus, system and storage medium for processing pathological slice images

Info

Publication number
WO2021093451A1
WO2021093451A1 (PCT/CN2020/115842)
Authority
WO
WIPO (PCT)
Prior art keywords
stained
image
cell membrane
view
cells
Prior art date
Application number
PCT/CN2020/115842
Other languages
English (en)
French (fr)
Inventor
张军
颜克洲
姚建华
韩骁
Original Assignee
Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to EP20887251.5A priority Critical patent/EP3989160A4/en
Publication of WO2021093451A1 publication Critical patent/WO2021093451A1/zh
Priority to US17/515,170 priority patent/US11967069B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/34Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/695Preprocessing, e.g. image segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698Matching; Classification
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/40ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H70/00ICT specially adapted for the handling or processing of medical references
    • G16H70/60ICT specially adapted for the handling or processing of medical references relating to pathologies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • G06T2207/10061Microscopic image from scanning electron microscope
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20036Morphological image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30242Counting objects in image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Definitions

  • the embodiments of the present application relate to the fields of artificial intelligence and intelligent medical technology, and in particular to a method, apparatus, system, and storage medium for processing pathological slice images.
  • HER2: Human Epidermal Growth Factor Receptor 2
  • In the related art, the cell membrane of a pathological section is stained, and the stained section is then observed and analyzed under a microscope, which can assist doctors in HER2 detection.
  • However, the related art lacks systematic cell membrane staining analysis of pathological sections, resulting in inaccurate final detection results.
  • the embodiments of the present application provide a method, device, system, and storage medium for processing pathological slice images, which can be used to solve the technical problem of inaccurate analysis of pathological slices in related technologies.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for processing pathological slice images, which is applied to a computer device, and the method includes:
  • acquiring stained images of the pathological section after cell membrane staining under n fields of view of a microscope, where n is a positive integer;
  • for the stained image in the i-th field of view among the n fields of view, determining the position of the cell nucleus of each cancer cell in that stained image, where i is a positive integer less than or equal to n;
  • generating a cell membrane description result of the stained image in the i-th field of view, the cell membrane description result being used to indicate the integrity and intensity of the cell membrane staining;
  • determining the number of various types of cells in the stained image under the i-th field of view according to the cell nucleus position and the cell membrane description result; and
  • determining the analysis result of the pathological section according to the number of the various types of cells in the stained images under the n fields of view.
  • an embodiment of the present application provides a pathological slice image processing device, the device includes:
  • An image acquisition module for acquiring stained images of the pathological section after cell membrane staining under n fields of view of the microscope, where n is a positive integer;
  • the cell nucleus detection module is used to determine, for the stained image in the i-th field of view among the n fields of view, the position of the cell nucleus of each cancer cell in that stained image, where i is a positive integer less than or equal to n;
  • the cell membrane description module is used to generate the cell membrane description result of the staining image in the i-th field of view, and the cell membrane description result is used to indicate the integrity of the cell membrane staining and the staining intensity;
  • a quantity determining module configured to determine the quantity of various types of cells in the stained image under the i-th field of view according to the position of the cell nucleus and the description result of the cell membrane;
  • the result determination module is used to determine the analysis result of the pathological section according to the number of the various types of cells in the stained images under the n visual fields.
  • an embodiment of the present application provides an intelligent microscope system, the intelligent microscope system includes: a microscope, a camera, and computer equipment;
  • the microscope is used for observing pathological sections after cell membrane staining
  • the camera is used to obtain stained images of the pathological section under n fields of view of the microscope, where n is a positive integer;
  • the computer device is configured to: for the stained image in the i-th field of view among the n fields of view, determine the position of the cell nucleus of each cancer cell in that stained image, where i is a positive integer less than or equal to n; generate a cell membrane description result of the stained image in the i-th field of view, the cell membrane description result being used to indicate the integrity and intensity of the cell membrane staining; determine the number of various types of cells in the stained image under the i-th field of view according to the cell nucleus position and the cell membrane description result; and determine the analysis result of the pathological section according to the number of the various types of cells in the stained images under the n fields of view.
  • an embodiment of the present application provides a computer device, the computer device includes a processor and a memory, and at least one instruction, at least one program, a code set, or an instruction set is stored in the memory, which is loaded and executed by the processor to implement the above pathological slice image processing method.
  • an embodiment of the present application provides a computer-readable storage medium, which stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the above pathological slice image processing method.
  • an embodiment of the present application provides a computer program product, which is used to implement the above-mentioned pathological slice image processing method when the computer program product is executed by a processor.
  • Combining the cell nucleus positions and the cell membrane description results, the number of various types of cells is determined, and the analysis result of the pathological section is then determined, providing a technical solution for systematic cell membrane staining analysis of pathological sections, which helps to improve the accuracy of the detection results.
  • In the embodiments of the present application, a deep learning model is not used to analyze the stained images of the pathological section end to end and directly output the analysis result.
  • Such a black-box approach does not comply with the HER2 grading rules in the relevant diagnostic guidelines.
  • Instead, cancer cells are detected and classified according to the definitions in the relevant diagnostic guidelines, so that HER2 grading can be performed according to the judgment criteria of the guidelines, thereby improving the standardization and accuracy of the final HER2 grading results.
  • Fig. 1 is a schematic diagram of a smart microscope system provided by an embodiment of the present application.
  • FIG. 2 is a flowchart of a method for processing pathological slice images provided by an embodiment of the present application
  • Figure 3 is a schematic diagram of a stained image under multiple fields of view of a microscope shown in the present application.
  • FIG. 4 is a schematic diagram of the input and output of a cell nucleus detection model shown in the present application.
  • FIG. 5 is a schematic diagram of an original image of a stained image, a closed region segmentation image, a weakly stained segmented image, and a strongly stained segmented image shown in the present application;
  • Fig. 6 is a flowchart of a complete technical solution shown in this application.
  • Fig. 7 is a block diagram of a pathological slice image processing device provided by an embodiment of the present application.
  • Fig. 8 is a block diagram of a pathological slice image processing device provided by another embodiment of the present application.
  • Fig. 9 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • AI Artificial Intelligence
  • AI is a theory, method, technology and application system that uses digital computers or machines controlled by digital computers to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
  • artificial intelligence is a comprehensive technology of computer science, which attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can react in a similar way to human intelligence.
  • Artificial intelligence is to study the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning and decision-making.
  • Artificial intelligence technology is a comprehensive discipline, covering a wide range of fields, including both hardware-level technology and software-level technology.
  • Basic artificial intelligence technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, and mechatronics.
  • Artificial intelligence software technology mainly includes computer vision technology, speech processing technology, natural language processing technology, and machine learning/deep learning.
  • Computer Vision is a science that studies how to make machines "see". More specifically, it refers to using cameras and computers instead of human eyes to identify, track, and measure objects, and to further perform graphic processing so that the processed images are more suitable for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies related theories and technologies, trying to establish artificial intelligence systems that can obtain information from images or multi-dimensional data.
  • Computer vision technology usually includes image processing, image recognition, image semantic understanding, image retrieval, OCR (Optical Character Recognition), video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technology, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric recognition technologies such as facial recognition and fingerprint recognition.
  • ML Machine Learning
  • Machine Learning is the core of artificial intelligence, the fundamental way to make computers intelligent, and its applications cover all fields of artificial intelligence.
  • Machine learning and deep learning usually include artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, teaching learning and other technologies.
  • Artificial intelligence technology has been researched and applied in many fields, such as smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned driving, autonomous driving, drones, robots, intelligent medical care, and intelligent customer service; with the development of technology, artificial intelligence will be applied in more fields and deliver increasingly important value.
  • The solutions provided by the embodiments of this application relate to the field of intelligent medical technology, using computer vision technology to perform image processing and analysis on pathological slices after cell membrane staining and finally determine the analysis results corresponding to the pathological slices, thereby assisting doctors in the diagnosis and treatment of diseases.
  • FIG. 1 shows a schematic diagram of an intelligent microscope system provided by an embodiment of the present application.
  • the smart microscope system may include: a microscope 11, a camera 12, and a computer device 13.
  • the microscope 11 is used to observe the pathological section.
  • the microscope 11 is used to observe the pathological section after cell membrane staining.
  • the camera 12 is used to capture and acquire a pathological slice image of the pathological slice under the field of view of the microscope 11.
  • the above-mentioned pathological section is a HER2 stained section.
  • HER2 staining stains the membranes of positive cancer cells brown and the nuclei of cells blue.
  • HER2 grading generally includes four categories: 0, 1+, 2+, and 3+. Usually, the above pathological slice images need to be collected under the medium- and high-power fields of the microscope (such as 10×, 20×, or 40×).
  • the computer device 13 is configured to obtain the analysis result corresponding to the pathological slice image based on the above-mentioned pathological slice image captured by the camera 12 by executing the method procedure described in detail later.
  • the analysis result may be a HER2 grading result.
  • the computer device 13 may be any electronic device with computing and storage capabilities, such as a PC (Personal Computer, personal computer).
  • the microscope 11, the camera 12, and the computer device 13 may be configured to be in the same physical location, or even be configured to belong to the same physical device.
  • the microscope 11, the camera 12, and the computer device 13 may be configured to be in different positions and connected through a wired or wireless communication network, so as to transmit data or commands between each other.
  • FIG. 2 shows a flowchart of a method for processing a pathological slice image according to an embodiment of the present application.
  • Each step of the method may be executed by the computer device in the smart microscope system introduced above.
  • the method can include the following steps:
  • Step 201 Obtain stained images of the pathological section after cell membrane staining under n fields of the microscope, where n is a positive integer.
  • the aforementioned n fields are multiple microscope fields containing cancer cells obtained after observing the pathological section.
  • the n visual fields can be selected by the doctor.
  • the doctor can be responsible for collecting multiple microscope fields that typically contain areas of infiltrating cancer cells.
  • n is an integer greater than 1, for example, n is 10. Since HER2 grading in the relevant diagnostic guidelines (such as the breast cancer HER2 detection guidelines) is defined on the basis of whole-slide analysis of the pathological section, while image acquisition in the microscope application scenario comes from multiple fields of view, it is difficult to stitch together a complete pathological slice image. In this application, HER2 grading is performed by using multiple typical field-of-view images selected by the doctor to approximate the whole-slide diagnosis result, which avoids a whole-slide scan while still obtaining an accurate HER2 grading result.
  • the relevant diagnostic guidelines such as the breast cancer HER2 detection guidelines
  • FIG. 3 shows a schematic diagram of stained images of a pathological section after cell membrane staining under multiple fields of view of the microscope, where images 31, 32, and 33 respectively represent stained images of the same pathological section under three different fields of view. It should be noted that the stained image is actually an RGB image, with positive cancer cell membranes stained brown and cell nuclei stained blue.
  • Step 202 Determine the nucleus position of the cancer cell in the stained image in the i-th field of view for the stained image in the i-th field of view in the n fields, where i is a positive integer less than or equal to n.
  • a deep learning method may be used to perform cancer cell detection on the stained image to determine the position of the nucleus of the cancer cell in the stained image.
  • FCN: Fully Convolutional Networks
  • the input of the cell nucleus detection model can be a stained image (an RGB image), and the output is a Gaussian-like response heat map centered on the center point of each cell. By searching for the local maximum response positions in the output heat map, the positions of all cancer cells can be obtained.
  • the stained image under the i-th field of view is processed by the cell nucleus detection model to obtain the cell nucleus position of the cancer cell in the stained image under the i-th field of view.
  • the cell nucleus position may include a coordinate set of cancer cells in the stained image under the i-th field of view (denoted as D_detect), and the coordinate set D_detect includes the position coordinates of the cell nucleus of each cancer cell in the stained image under the i-th field of view.
  • Exemplarily, as shown in FIG. 4, the left side is the input image 41 of the cell nucleus detection model, that is, the stained image under a certain field of view, and the right side is the stained image 42 marking the positions of the nuclei of the detected cancer cells, where each nucleus position is marked with a small black dot.
  • The cancer cell detection methods introduced above are only exemplary and explanatory; other cancer cell detection methods may also be used, which is not limited in the embodiments of the present application.
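The local-maximum search on the model's response heat map can be sketched as follows. This is a rough illustration only, not the patent's actual implementation; the function name, window size, and response threshold are all illustrative, and a synthetic heat map stands in for the model output.

```python
import numpy as np
from scipy.ndimage import maximum_filter, gaussian_filter

def find_nucleus_centers(heatmap, min_response=0.5, window=7):
    """Find local maxima of a Gaussian-like response map.

    Each local maximum whose response is at least `min_response`
    is taken as the center of one detected cancer-cell nucleus.
    """
    local_max = maximum_filter(heatmap, size=window)
    peaks = (heatmap == local_max) & (heatmap >= min_response)
    ys, xs = np.nonzero(peaks)
    # Coordinate set of detected nuclei (the D_detect of the text)
    return list(zip(xs.tolist(), ys.tolist()))

# Synthetic heat map with two Gaussian blobs standing in for model output
h = np.zeros((64, 64))
h[20, 20] = 1.0
h[40, 50] = 1.0
h = gaussian_filter(h, sigma=3)
h /= h.max()
centers = find_nucleus_centers(h, min_response=0.5)
```

In practice the heat map would come from the FCN-style detection model rather than being synthesized as above.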
  • Step 203 Generate a cell membrane description result of the staining image in the i-th field of view, and the cell membrane description result is used to indicate the integrity of the cell membrane staining and the staining intensity.
  • the integrity of the cell membrane staining and the staining intensity determine the analysis results of the pathological section, it is necessary to obtain the cell membrane description results of the stained image.
  • the integrity of cell membrane staining refers to whether the stained cell membrane is a complete cell membrane
  • the staining intensity refers to the depth of staining of the stained cell membrane.
  • the stained image can be processed and analyzed to obtain the cell membrane description result of the stained image.
  • the related process please refer to the description in the following embodiments.
  • Step 204 Determine the number of various types of cells in the stained image under the i-th field of view according to the cell nucleus position and the cell membrane description result.
  • the cells are classified into multiple types according to the staining of the cell membrane.
  • the aforementioned types include: intact strongly stained cells, incomplete strongly stained cells, intact weakly stained cells, incomplete weakly stained cells, and non-stained cells.
  • intact strongly stained cells refer to cells with complete cell membrane staining and high staining intensity
  • incomplete strongly stained cells refer to cells with incomplete cell membrane staining and high staining intensity
  • intact weakly stained cells refer to cells with complete cell membrane staining and low staining intensity
  • incomplete weakly stained cells refer to cells with incomplete cell membrane staining and low staining intensity
  • non-stained cells refer to cells whose cell membranes have not been stained.
  • The number of various types of cells in the stained image can thus be determined, so as to realize quantitative analysis and judgment of the cells. For example, it may be determined that the stained image under a certain field of view contains 170 intact strongly stained cells, 230 incomplete strongly stained cells, 2 intact weakly stained cells, 104 incomplete weakly stained cells, and 47 non-stained cells.
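The five-way classification and counting described above can be sketched as a small tallying step. This is an illustrative sketch, not the patent's implementation: the per-cell attribute triples (stained, membrane complete, strongly stained) are hypothetical inputs that would in practice be derived from the nucleus positions and the cell membrane description result.

```python
from collections import Counter

def classify_cell(stained, complete, strong):
    """Map per-cell membrane staining attributes to one of the
    five cell types described in the text."""
    if not stained:
        return "non-stained"
    integrity = "intact" if complete else "incomplete"
    intensity = "strongly" if strong else "weakly"
    return f"{integrity} {intensity} stained"

# Hypothetical per-cell attributes for one field of view
cells = [
    (True, True, True),     # intact strongly stained
    (True, False, True),    # incomplete strongly stained
    (True, True, False),    # intact weakly stained
    (True, False, False),   # incomplete weakly stained
    (False, False, False),  # non-stained
]
counts = Counter(classify_cell(*c) for c in cells)
```

Summing such per-field counters over all n fields of view then yields the per-type totals used for grading.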
  • Step 205 Determine the analysis result of the pathological section according to the number of various types of cells in the stained image under n fields of view.
  • n is an integer greater than 1
  • The above steps 202-204 can be performed for each field of view to obtain the number of various types of cells in the stained image in that field of view. After that, the numbers of various types of cells in the stained images under the n fields of view are comprehensively counted: for each type of cell, the number of cells of that type in each stained image is summed to obtain the total number of cells of that type across all n stained images.
  • the HER2 classification is 0.
  • If the HER2 grade is 3+, it is judged to be HER2 positive.
  • If the HER2 grade is 2+, it is necessary to further apply in situ hybridization to detect the amplification status of the HER2 gene, or to select a different tissue block for re-testing.
  • If the HER2 grade is 1+ or 0, it is judged as HER2 negative.
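The decision rules just listed amount to a simple mapping from grade to clinical conclusion. The sketch below is illustrative only; the function name and the returned strings are not from the patent.

```python
def her2_decision(grade):
    """Clinical conclusion implied by a HER2 grade, per the rules above."""
    if grade == "3+":
        return "HER2 positive"
    if grade == "2+":
        # Equivocal case: the guidelines call for confirmation
        return ("further detect HER2 gene amplification by in situ "
                "hybridization, or re-test a different tissue block")
    if grade in ("1+", "0"):
        return "HER2 negative"
    raise ValueError(f"unknown HER2 grade: {grade!r}")
```

For example, `her2_decision("3+")` returns "HER2 positive", while grades "1+" and "0" both map to "HER2 negative".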
  • In summary, stained images of the pathological section after cell membrane staining are obtained under n fields of view of the microscope, and cell detection and cell membrane description are performed on each stained image to obtain the cancer cell nucleus positions and the cell membrane description result of each stained image.
  • Combining these two aspects of information, the number of various types of cells is determined, and the analysis result of the pathological section is then determined.
  • This provides a technical solution for systematic cell membrane staining analysis of pathological sections and helps to improve the accuracy of the detection results.
  • In the embodiments of the present application, a deep learning model is not used to analyze the stained images of the pathological section end to end and directly output the analysis result.
  • Such a black-box approach does not comply with the HER2 grading rules in the relevant diagnostic guidelines.
  • Instead, cancer cells are detected and classified according to the definitions in the relevant diagnostic guidelines, so that HER2 grading can be performed according to the judgment criteria of the guidelines, thereby improving the standardization and accuracy of the final HER2 grading results.
  • the foregoing step 203 may include the following sub-steps:
  • the stained image is an RGB three-channel image, which can be expressed as I_RGB.
  • From I_RGB, three immunohistochemical channel images are generated, namely I_H (hematoxylin), I_E (eosin), and I_DAB (diaminobenzidine), where the I_DAB channel image is the brown staining channel image.
  • the target staining channel image is the aforementioned brown staining channel image.
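The separation of an RGB stained image into hematoxylin, eosin, and DAB channels is commonly done by color deconvolution; one readily available implementation (not necessarily the one used in the patent) is scikit-image's `rgb2hed`, sketched below with a synthetic patch standing in for a real I_RGB tile.

```python
import numpy as np
from skimage.color import rgb2hed

# A tiny synthetic RGB patch standing in for a stained-image tile I_RGB
rgb = np.random.default_rng(0).random((8, 8, 3))

# Color deconvolution into H, E, and DAB stain channels
hed = rgb2hed(rgb)
i_h, i_e, i_dab = hed[..., 0], hed[..., 1], hed[..., 2]
# i_dab is the brown (DAB) staining channel used as the target channel
```

On a real slide tile, `i_dab` would be the target staining channel image that the subsequent thresholding steps operate on.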
  • The cell membrane description result includes a first region segmentation result, a second region segmentation result, and a third region segmentation result.
  • The first region segmentation result is used to indicate the positions of completely stained cell membranes.
  • The second region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than a first threshold value.
  • The third region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than a second threshold value.
  • The first threshold value is less than the second threshold value; that is, cells whose staining intensity is greater than the first threshold value and less than or equal to the second threshold value are weakly stained cells, cells whose staining intensity is greater than the second threshold value are strongly stained cells, and cells whose staining intensity is less than the first threshold value can be regarded as unstained cells.
  • this step may include the following sub-steps:
  • The target staining channel image is converted into a binary weakly stained segmented image by thresholding with the first threshold.
  • the first threshold may be preset in combination with actual conditions, which is not limited in the embodiment of the present application.
  • the second threshold may be preset in combination with actual conditions, which is not limited in the embodiment of the present application.
  • The weakly stained segmented image contains, besides weakly stained cells, strongly stained cells whose staining is stronger than weak, whereas the strongly stained segmented image contains only strongly stained cells.
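The two thresholding sub-steps can be sketched as follows; t_1 = -0.35 and a = 0.9 (so t_2 = a × t_1) are the example values given later in the text, and the tiny DAB array is made up for illustration:

```python
import numpy as np

def threshold_masks(i_dab, t1=-0.35, a=0.9):
    """Binarize the DAB channel twice. M_light keeps every pixel above the
    weak-stain threshold t1 (so it also contains strongly stained pixels);
    M_heavy keeps only pixels above the strong-stain threshold t2 = a * t1.
    Pixel values are negative here, matching the I_DAB convention in the text."""
    t2 = a * t1
    m_light = (i_dab > t1).astype(np.uint8)  # weakly stained segmented image
    m_heavy = (i_dab > t2).astype(np.uint8)  # strongly stained segmented image
    return m_light, m_heavy

i_dab = np.array([[-0.50, -0.34],
                  [-0.30, -0.10]])
m_light, m_heavy = threshold_masks(i_dab)   # m_heavy is a subset of m_light
```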
  • After the weakly stained and strongly stained segmented images are obtained, first morphological processing may be performed on the weakly stained segmented image to obtain the first region segmentation result; second morphological processing may be performed on the weakly stained segmented image to obtain the second region segmentation result; and the second morphological processing may be performed on the strongly stained segmented image to obtain the third region segmentation result.
  • Performing the first morphological processing on the weakly stained segmented image may include: performing skeleton extraction on the weakly stained segmented image to obtain the cytoskeleton in that image; searching for closed regions enclosed by the cytoskeleton; if a closed region is an innermost closed region, filling the innermost closed region to obtain a closed-region segmented image; and extracting the position information of the foreground pixels in the closed-region segmented image to obtain the first region segmentation result.
  • Centerline extraction is performed on the weakly stained segmented image M_light to complete the delineation of the cell membranes and obtain the cytoskeleton, and the centerlines are then statistically analyzed to count the closed regions formed by each connected boundary.
  • An innermost closed region is determined to be the delineated boundary of a complete cell membrane. All innermost closed regions are filled to obtain the closed-region segmented image M_enclosed, as shown in Figure 5, and the set of position coordinates of all foreground pixels in M_enclosed is defined as P_enclosed, which is the first region segmentation result.
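As an illustration of the closed-region step, the sketch below fills background regions that cannot be reached from the image border, which is how innermost enclosed areas behave in a binary membrane mask. It deliberately skips the skeleton/centerline extraction that the text performs first, so it is a simplification of the first morphological processing, not a full implementation:

```python
from collections import deque

import numpy as np

def fill_enclosed_regions(membrane_mask):
    """Return M_enclosed: membrane pixels plus every background region that is
    fully enclosed by membrane, i.e. unreachable from the border through
    4-connected background pixels."""
    h, w = membrane_mask.shape
    outside = np.zeros((h, w), dtype=bool)
    queue = deque()
    # Seed the flood fill with every background pixel on the image border.
    for r in range(h):
        for c in range(w):
            if (r in (0, h - 1) or c in (0, w - 1)) and membrane_mask[r, c] == 0:
                outside[r, c] = True
                queue.append((r, c))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not outside[nr, nc] \
                    and membrane_mask[nr, nc] == 0:
                outside[nr, nc] = True
                queue.append((nr, nc))
    # Enclosed = everything that is neither membrane nor reachable background.
    return (membrane_mask == 1) | ~outside

# A ring of membrane pixels with a hole in the middle: the hole gets filled.
ring = np.array([[0, 0, 0, 0, 0],
                 [0, 1, 1, 1, 0],
                 [0, 1, 0, 1, 0],
                 [0, 1, 1, 1, 0],
                 [0, 0, 0, 0, 0]], dtype=np.uint8)
m_enclosed = fill_enclosed_regions(ring)
```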
  • Performing the second morphological processing on the weakly stained segmented image may include: performing region dilation on the weakly stained segmented image to obtain a processed weakly stained segmented image; and extracting the position information of the foreground pixels in the processed image to obtain the second region segmentation result.
  • The processed weakly stained segmented image E_light is obtained, and the set of position coordinates of all foreground pixels in E_light is defined as P_light, which is the second region segmentation result.
  • Performing the second morphological processing on the strongly stained segmented image may include: performing region dilation on the strongly stained segmented image to obtain a processed strongly stained segmented image; and extracting the position information of the foreground pixels in the processed image to obtain the third region segmentation result.
  • The processed strongly stained segmented image E_heavy is obtained, and the set of position coordinates of all foreground pixels in E_heavy is defined as P_heavy, which is the third region segmentation result.
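The region expansion used for the second and third segmentation results is plain binary dilation. A toy sketch follows; the text uses a dilation distance of d = 45 on full-resolution images, while d = 1 and the 3x3 mask here are illustrative:

```python
import numpy as np

def dilate(mask, distance):
    """Expand foreground by `distance` steps of a 3x3 (8-connected)
    structuring element, approximating the region expansion that turns
    M_light/M_heavy into E_light/E_heavy."""
    out = mask.astype(bool)
    h, w = out.shape
    for _ in range(distance):
        padded = np.pad(out, 1, mode="constant", constant_values=False)
        grown = np.zeros_like(out)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                grown |= padded[1 + dr:1 + dr + h, 1 + dc:1 + dc + w]
        out = grown
    return out.astype(np.uint8)

m_light = np.array([[0, 0, 0],
                    [0, 1, 0],
                    [0, 0, 0]], dtype=np.uint8)
e_light = dilate(m_light, 1)
# P_light is then the coordinate set of all foreground pixels:
p_light = set(zip(*np.nonzero(e_light)))
```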
  • Determining the number of cells of each type in the stained image under the i-th field of view according to the nucleus positions and the cell membrane description result includes the following sub-steps:
  • The coordinate set of complete strongly stained cells is D_detect ∩ P_enclosed ∩ P_heavy.
  • The corresponding cell count is the number of set elements, namely card(D_detect ∩ P_enclosed ∩ P_heavy).
  • The coordinate set of incomplete strongly stained cells is D_detect ∩ C_U P_enclosed ∩ P_heavy, where C_U P_enclosed is the complement of P_enclosed with respect to the full image U (that is, the stained image under the i-th field of view).
  • The corresponding cell count is card(D_detect ∩ C_U P_enclosed ∩ P_heavy).
  • The coordinate set of complete weakly stained cells is D_detect ∩ P_enclosed ∩ C_U P_heavy ∩ P_light, where C_U P_heavy is the complement of P_heavy with respect to the full image U.
  • The corresponding cell count is card(D_detect ∩ P_enclosed ∩ C_U P_heavy ∩ P_light).
  • The coordinate set of incomplete weakly stained cells is D_detect ∩ C_U P_enclosed ∩ C_U P_heavy ∩ P_light.
  • The corresponding cell count is card(D_detect ∩ C_U P_enclosed ∩ C_U P_heavy ∩ P_light).
  • The coordinate set of unstained cells is D_detect ∩ C_U P_light, where C_U P_light is the complement of P_light with respect to the full image U.
  • The corresponding cell count is card(D_detect ∩ C_U P_light).
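The set arithmetic above maps directly onto Python sets. Everything below is toy data: coordinates are (row, col) tuples, D_detect holds detected nuclei, and the complement C_U is taken against the full pixel grid U:

```python
# Full image grid U and toy segmentation results.
U = {(r, c) for r in range(2) for c in range(4)}

d_detect   = {(0, 0), (0, 1), (0, 2), (0, 3), (1, 0)}    # detected nuclei
p_enclosed = {(0, 0), (0, 2)}                            # complete membranes
p_light    = {(0, 0), (0, 1), (0, 2), (0, 3)}            # stained (weak or strong)
p_heavy    = {(0, 0), (0, 1)}                            # strongly stained

c_enclosed = U - p_enclosed   # C_U P_enclosed
c_heavy    = U - p_heavy      # C_U P_heavy
c_light    = U - p_light      # C_U P_light

# card(...) is just len(...) of the intersection.
counts = {
    "complete_strong":   len(d_detect & p_enclosed & p_heavy),
    "incomplete_strong": len(d_detect & c_enclosed & p_heavy),
    "complete_weak":     len(d_detect & p_enclosed & c_heavy & p_light),
    "incomplete_weak":   len(d_detect & c_enclosed & c_heavy & p_light),
    "unstained":         len(d_detect & c_light),
}
```

Note that the five categories partition the detected nuclei, so the counts sum to card(D_detect).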
  • Fig. 6 shows a flowchart of the complete technical solution of the present application.
  • For the stained image under a single field of view, nucleus detection and cell membrane description are performed to obtain the nucleus positions of the cancer cells and the cell membrane description result for that image, and the two kinds of information are combined to obtain the cell classification result under that single field of view, that is, the number of cells of each type.
  • The cell classification results under the individual fields of view are combined to obtain the cell classification result under multiple fields of view, based on which the HER2 grade corresponding to the pathological section is determined.
  • For example, if the multi-field cell classification result is: complete strongly stained cells 20%, incomplete strongly stained cells 35%, complete weakly stained cells 5%, incomplete weakly stained cells 10%, and unstained cells 30%, then the HER2 grade is 3+.
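The grading cascade can be sketched as below. This mirrors only the simplified rules quoted in this text (the full ASCO/CAP HER2 guideline has more conditions), and the function and key names are illustrative:

```python
def her2_grade(pct):
    """pct maps cell types to their share of all cells across the n fields
    of view, as fractions in [0, 1]."""
    if pct["complete_strong"] > 0.10:
        return "3+"
    if pct["complete_weak"] > 0.10:
        return "2+"
    if pct["incomplete_weak"] > 0.10:
        return "1+"
    return "0"

# Worked example from the text: 20% complete strong, 35% incomplete strong,
# 5% complete weak, 10% incomplete weak, 30% unstained.
example = {"complete_strong": 0.20, "incomplete_strong": 0.35,
           "complete_weak": 0.05, "incomplete_weak": 0.10, "unstained": 0.30}
grade = her2_grade(example)
```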
  • FIG. 7 shows a block diagram of a pathological slice image processing device provided by an embodiment of the present application.
  • The device has the function of implementing the above method examples; the function may be implemented by hardware, or by hardware executing corresponding software.
  • The device may be the computer device described above, or may be provided in the computer device.
  • the device 700 may include: an image acquisition module 710, a cell nucleus detection module 720, a cell membrane description module 730, a quantity determination module 740, and a result determination module 750.
  • The image acquisition module 710 is configured to acquire stained images of the pathological section after cell membrane staining under n fields of view of the microscope, where n is a positive integer.
  • The cell nucleus detection module 720 is configured to determine, for the stained image under the i-th field of view among the n fields of view, the nucleus positions of the cancer cells in that stained image, where i is a positive integer less than or equal to n.
  • The cell membrane description module 730 is configured to generate the cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining.
  • The quantity determination module 740 is configured to determine the number of cells of each type in the stained image under the i-th field of view according to the nucleus positions and the cell membrane description result.
  • The result determination module 750 is configured to determine the analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
  • Stained images of the pathological section after cell membrane staining are acquired under n fields of view of the microscope, and nucleus detection and cell membrane description are performed on each stained image to obtain the nucleus positions of the cancer cells and the cell membrane description result for each stained image.
  • Combining these two kinds of information, the number of cells of each type is determined, and the analysis result of the pathological section is then determined, providing a systematic cell membrane staining analysis technique for pathological sections.
  • This scheme helps to improve the accuracy of the test results.
  • the cell membrane description module 730 includes: a decomposition and recombination sub-module 731 and a cell membrane description sub-module 732.
  • The decomposition and recombination sub-module 731 is configured to decompose and recombine the color channels of the stained image under the i-th field of view to obtain the target staining channel image.
  • the cell membrane description submodule 732 is configured to generate the cell membrane description result according to the target staining channel image, and the cell membrane description result includes a first region segmentation result, a second region segmentation result, and a third region segmentation result.
  • The first region segmentation result is used to indicate the positions of completely stained cell membranes.
  • The second region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than the first threshold value.
  • The third region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than the second threshold value.
  • the cell membrane description sub-module 732 includes: a first division unit 732a, a second division unit 732b, and a cell membrane description unit 732c.
  • The first segmentation unit 732a is configured to perform threshold segmentation on the target staining channel image with a first threshold to obtain a weakly stained segmented image.
  • The second segmentation unit 732b is configured to perform threshold segmentation on the target staining channel image with a second threshold to obtain a strongly stained segmented image.
  • the cell membrane description unit 732c is configured to generate the cell membrane description result according to the weakly stained segmented image and the strongly stained segmented image.
  • The cell membrane description unit 732c is configured to: perform the first morphological processing on the weakly stained segmented image to obtain the first region segmentation result; perform the second morphological processing on the weakly stained segmented image to obtain the second region segmentation result; and perform the second morphological processing on the strongly stained segmented image to obtain the third region segmentation result.
  • The cell membrane description unit 732c is configured to: perform skeleton extraction on the weakly stained segmented image to obtain the cytoskeleton in that image; search for closed regions enclosed by the cytoskeleton; if a closed region is an innermost closed region, fill the innermost closed region to obtain a closed-region segmented image; and extract the position information of the foreground pixels in the closed-region segmented image to obtain the first region segmentation result.
  • The cell membrane description unit 732c is further configured to perform the region dilation and foreground-pixel extraction described above to obtain the second region segmentation result and the third region segmentation result.
  • The types include: complete strongly stained cells, incomplete strongly stained cells, complete weakly stained cells, incomplete weakly stained cells, and unstained cells.
  • The quantity determination module 740 is configured to:
  • determine the number of elements in the intersection of the nucleus positions, the first region segmentation result, and the third region segmentation result as the number of complete strongly stained cells;
  • determine the number of elements in the intersection of the nucleus positions, the complement of the first region segmentation result with respect to the stained image under the i-th field of view, and the third region segmentation result as the number of incomplete strongly stained cells;
  • determine the number of elements in the intersection of the nucleus positions, the first region segmentation result, the complement of the third region segmentation result with respect to the stained image under the i-th field of view, and the second region segmentation result as the number of complete weakly stained cells;
  • determine the number of elements in the intersection of the nucleus positions, the complement of the first region segmentation result with respect to the stained image under the i-th field of view, the complement of the third region segmentation result with respect to the stained image under the i-th field of view, and the second region segmentation result as the number of incomplete weakly stained cells;
  • determine the number of elements in the intersection of the nucleus positions and the complement of the second region segmentation result with respect to the stained image under the i-th field of view as the number of unstained cells.
  • The cell membrane description result includes the first region segmentation result, the second region segmentation result, and the third region segmentation result.
  • The first region segmentation result is used to indicate the positions of completely stained cell membranes.
  • The second region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than the first threshold value.
  • The third region segmentation result is used to indicate cell membrane positions whose staining intensity is greater than the second threshold value.
  • the cell nucleus detection module 720 is configured to process the stained image in the i-th field of view using a cell nucleus detection model to obtain the position of the cell nucleus.
  • The result determination module 750 is configured to: determine the proportion of each cell type in the stained images under the n fields of view according to the numbers of cells of each type; and determine the analysis result of the pathological section according to the proportions of the cell types.
  • An exemplary embodiment of the present application also provides an intelligent microscope system, including: a microscope, a camera, and computer equipment.
  • the microscope is used for observing pathological sections after cell membrane staining.
  • the camera is used to obtain stained images of the pathological section under n fields of view of the microscope, and the n is a positive integer.
  • The computer device is configured to: for the stained image under the i-th field of view among the n fields of view, determine the nucleus positions of the cancer cells in that stained image, where i is a positive integer less than or equal to n; generate the cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining; determine, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view; and determine the analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
  • the computer device is also used to execute other steps introduced in the above method embodiment, which is not limited in the embodiment of the present application.
  • FIG. 9 shows a schematic structural diagram of a computer device provided by an embodiment of the present application. Specifically:
  • The computer device 900 includes a CPU (Central Processing Unit) 901, a system memory 904 including RAM (Random Access Memory) 902 and ROM (Read-Only Memory) 903, and a system bus 905 connecting the system memory 904 and the central processing unit 901.
  • The computer device 900 also includes a basic I/O (Input/Output) system 906 that helps transfer information between the various components in the computer, and a mass storage device 907 for storing an operating system 913, application programs 914, and other program modules 915.
  • the basic input/output system 906 includes a display 908 for displaying information and an input device 909 such as a mouse and a keyboard for the user to input information.
  • the display 908 and the input device 909 are both connected to the central processing unit 901 through the input and output controller 910 connected to the system bus 905.
  • the basic input/output system 906 may also include an input and output controller 910 for receiving and processing input from multiple other devices such as a keyboard, a mouse, or an electronic stylus.
  • the input and output controller 910 also provides output to a display screen, a printer, or other types of output devices.
  • the mass storage device 907 is connected to the central processing unit 901 through a mass storage controller (not shown) connected to the system bus 905.
  • the mass storage device 907 and its associated computer readable medium provide non-volatile storage for the computer device 900. That is, the mass storage device 907 may include a computer readable medium (not shown) such as a hard disk or a CD-ROM (Compact Disc Read-Only Memory) drive.
  • the computer-readable media may include computer storage media and communication media.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media include RAM, ROM, EPROM (Erasable Programmable Read-Only Memory), flash memory or other solid-state storage technologies, CD-ROM or other optical storage, tape cartridges, magnetic tape, disk storage, or other magnetic storage devices.
  • The computer device 900 may also run by being connected to a remote computer over a network such as the Internet. That is, the computer device 900 can be connected to the network 912 through the network interface unit 911 connected to the system bus 905; in other words, the network interface unit 911 can also be used to connect to other types of networks or remote computer systems (not shown).
  • The memory also stores at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the above pathological section image processing method.
  • A computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is executed by the processor of a terminal to implement the above pathological section image processing method.
  • the computer-readable storage medium may include: ROM, RAM, SSD (Solid State Drives, solid state drive), or optical disc, etc.
  • The random access memory may include ReRAM (Resistive Random Access Memory) and DRAM (Dynamic Random Access Memory).
  • A computer program product is also provided, which, when executed by the processor of a terminal, implements the above pathological section image processing method.
  • the "plurality” mentioned herein refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: A exists alone, both A and B exist, or B exists alone.
  • The character "/" generally indicates an "or" relationship between the associated objects before and after it.
  • The numbering of the steps described herein merely shows one possible order of execution exemplarily. In some other embodiments, the steps may be executed out of numerical order; for example, two differently numbered steps may be executed at the same time, or in an order opposite to that shown in the figure, which is not limited in the embodiments of the present application.


Abstract

The present application provides a pathological section image processing method, apparatus, and system, and a storage medium, relating to the field of intelligent medical technology within artificial intelligence. The method includes: acquiring stained images of a pathological section after cell membrane staining under n fields of view of a microscope; for the stained image under the i-th field of view among the n fields of view, determining the nucleus positions of the cancer cells in that stained image; generating a cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining; determining, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view; and determining an analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view. The present application provides a technical solution for systematic cell membrane staining analysis of pathological sections, which helps to improve the accuracy of test results.

Description

Pathological section image processing method, apparatus, system, and storage medium
This application claims priority to Chinese patent application No. 201911115369.6, filed on November 14, 2019 and entitled "Pathological section image processing method, apparatus, system, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to the field of intelligent medical technology within artificial intelligence, and in particular to a pathological section image processing method, apparatus, and system, and a storage medium.
Background
Correctly detecting and assessing the HER2 (Human Epidermal growth factor Receptor-2) protein expression and gene amplification status of breast cancer is crucial for its clinical treatment and prognosis.
In the related art, staining the cell membranes of a pathological section and then observing and analyzing the stained section under a microscope can assist doctors in HER2 detection. However, the related art lacks systematic cell membrane staining analysis of pathological sections, so the final test results are not sufficiently accurate.
Summary
The embodiments of the present application provide a pathological section image processing method, apparatus, and system, and a storage medium, which can be used to solve the technical problem in the related art that the analysis of pathological sections is not sufficiently accurate. The technical solutions are as follows:
In one aspect, an embodiment of the present application provides a pathological section image processing method applied to a computer device, the method including:
acquiring stained images of a pathological section after cell membrane staining under n fields of view of a microscope, n being a positive integer;
for the stained image under the i-th field of view among the n fields of view, determining the nucleus positions of the cancer cells in the stained image under the i-th field of view, i being a positive integer less than or equal to n;
generating a cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining;
determining, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view; and
determining an analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
In another aspect, an embodiment of the present application provides a pathological section image processing apparatus, the apparatus including:
an image acquisition module, configured to acquire stained images of a pathological section after cell membrane staining under n fields of view of a microscope, n being a positive integer;
a cell nucleus detection module, configured to determine, for the stained image under the i-th field of view among the n fields of view, the nucleus positions of the cancer cells in the stained image under the i-th field of view, i being a positive integer less than or equal to n;
a cell membrane description module, configured to generate a cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining;
a quantity determination module, configured to determine, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view; and
a result determination module, configured to determine an analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
In another aspect, an embodiment of the present application provides an intelligent microscope system, including: a microscope, a camera, and a computer device;
the microscope being used to observe a pathological section after cell membrane staining;
the camera being used to acquire stained images of the pathological section under n fields of view of the microscope, n being a positive integer; and
the computer device being configured to: for the stained image under the i-th field of view among the n fields of view, determine the nucleus positions of the cancer cells in the stained image under the i-th field of view, i being a positive integer less than or equal to n; generate a cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining; determine, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view; and determine an analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
In yet another aspect, an embodiment of the present application provides a computer device including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the above pathological section image processing method.
In yet another aspect, an embodiment of the present application provides a computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the above pathological section image processing method.
In still another aspect, an embodiment of the present application provides a computer program product which, when executed by a processor, implements the above pathological section image processing method.
The technical solutions provided by the embodiments of the present application may bring the following beneficial effects:
By acquiring stained images of a pathological section after cell membrane staining under n fields of view of a microscope and performing cell detection and cell membrane description on each stained image, the nucleus positions of the cancer cells and the cell membrane description result are obtained for each stained image; combining these two kinds of information, the number of cells of each type is determined, and the analysis result of the pathological section is then determined. This provides a technical solution for systematic cell membrane staining analysis of pathological sections and helps to improve the accuracy of test results.
In addition, in the technical solutions provided by the embodiments of the present application, a deep learning model is not used to directly analyze the stained images of the pathological section and directly output the analysis result; such a black-box approach does not comply with the provisions of the relevant diagnostic guidelines for HER2 grading. In the technical solutions provided by the embodiments of the present application, cancer cells are detected and classified according to the definitions in the relevant diagnostic guidelines, so that HER2 grading can be performed according to the guidelines' judgment criteria, thereby improving the standardization and accuracy of the final HER2 grading result.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the following briefly introduces the drawings required for describing the embodiments. Obviously, the drawings below show only some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an intelligent microscope system provided by an embodiment of the present application;
Fig. 2 is a flowchart of a pathological section image processing method provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of stained images under multiple fields of view of a microscope shown in the present application;
Fig. 4 is a schematic diagram of the input and output of a cell nucleus detection model shown in the present application;
Fig. 5 is a schematic diagram of an original stained image, a closed-region segmented image, a weakly stained segmented image, and a strongly stained segmented image shown in the present application;
Fig. 6 is a flowchart of a complete technical solution shown in the present application;
Fig. 7 is a block diagram of a pathological section image processing apparatus provided by an embodiment of the present application;
Fig. 8 is a block diagram of a pathological section image processing apparatus provided by another embodiment of the present application;
Fig. 9 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the implementations of the present application are described in further detail below with reference to the drawings.
AI (Artificial Intelligence) is a theory, method, technology, and application system that uses digital computers or machines controlled by digital computers to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and produce a new kind of intelligent machine that can respond in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision-making.
Artificial intelligence technology is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technologies. Basic AI technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics. AI software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
Computer Vision (CV) is the science of studying how to make machines "see"; more specifically, it refers to using cameras and computers instead of human eyes to recognize, track, and measure targets, and further performing graphics processing so that the computer produces images more suitable for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies related theories and technologies in an attempt to build artificial intelligence systems capable of obtaining information from images or multi-dimensional data. Computer vision technologies usually include image processing, image recognition, image semantic understanding, image retrieval, OCR (Optical Character Recognition), video processing, video semantic understanding, video content/behavior recognition, 3D object reconstruction, 3D technology, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric technologies such as face recognition and fingerprint recognition.
ML (Machine Learning) is a multi-disciplinary field involving probability theory, statistics, approximation theory, convex analysis, and algorithmic complexity theory. It studies how computers simulate or implement human learning behaviors to acquire new knowledge or skills and reorganize existing knowledge structures to continuously improve their own performance. Machine learning is the core of artificial intelligence and the fundamental way to make computers intelligent, and its applications cover all fields of artificial intelligence. Machine learning and deep learning usually include technologies such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from demonstration.
With the research and progress of artificial intelligence technology, it has been researched and applied in many fields, such as smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned driving, autonomous driving, drones, robots, smart healthcare, and smart customer service. It is believed that with the development of technology, artificial intelligence will be applied in more fields and play an increasingly important role.
The solutions provided by the embodiments of the present application relate to the field of intelligent medical technology: computer vision is used to process and analyze images of a pathological section after cell membrane staining, and the analysis result corresponding to the pathological section is finally determined, thereby assisting doctors in the diagnosis and treatment of diseases.
The technical solutions of the present application are described below through several embodiments.
Please refer to Fig. 1, which shows a schematic diagram of an intelligent microscope system provided by an embodiment of the present application. The intelligent microscope system may include: a microscope 11, a camera 12, and a computer device 13.
The microscope 11 is used to observe pathological sections. In the embodiments of the present application, the microscope 11 is used to observe a pathological section after cell membrane staining.
The camera 12 is used to capture pathological section images under the field of view of the microscope 11. Optionally, the pathological section is a HER2-stained section. HER2 staining stains the membranes of positive cancer cells brown and the nuclei blue. HER2 grading generally includes four classes: 0, 1+, 2+, and 3+. Usually, the pathological section images need to be captured under medium-to-high magnification fields of view of the microscope (such as 10x, 20x, or 40x).
The computer device 13 is used to obtain, based on the pathological section image captured by the camera 12, the corresponding analysis result by executing the method flow described in detail later. When the pathological section is a HER2-stained section, the analysis result may be a HER2 grading result. The computer device 13 may be any electronic device with computing and storage capabilities, such as a PC (Personal Computer).
The microscope 11, the camera 12, and the computer device 13 may be configured to be at the same physical location, or even to belong to the same physical device. Alternatively, they may be configured to be at different locations and connected through a wired or wireless communication network to transmit data or commands between one another.
Please refer to Fig. 2, which shows a flowchart of a pathological section image processing method provided by an embodiment of the present application. The steps of the method may be executed by the computer device in the intelligent microscope system introduced above. The method may include the following steps:
Step 201: acquire stained images of a pathological section after cell membrane staining under n fields of view of a microscope, n being a positive integer.
Optionally, the n fields of view are multiple microscope fields of view containing cancer cells obtained by observing the pathological section. The n fields of view may be selected by a doctor. For example, taking HER2 grading of a pathological section as an example, since HER2 grading is defined on the staining status of invasive carcinoma cells, the doctor may be responsible for capturing multiple microscope fields of view that typically contain invasive carcinoma regions.
In one example, n is an integer greater than 1, for example n is 10. Since the relevant diagnostic guidelines (such as breast cancer HER2 testing guidelines) define HER2 grading on the basis of whole-slide analysis of the pathological section, while image acquisition in the microscope scenario comes from multiple fields of view, it is difficult to stitch together a complete whole-slide image. The present application performs HER2 grading by using multiple typical field-of-view images selected by the doctor to approximate the whole-slide diagnosis result, which avoids whole-slide scanning while still obtaining an accurate HER2 grading result.
Exemplarily, Fig. 3 shows a schematic diagram of stained images of a pathological section after cell membrane staining under multiple fields of view of a microscope, in which images 31, 32, and 33 respectively represent stained images of the same pathological section under three fields of view. It should be noted that the stained images are actually RGB images, with the membranes of positive cancer cells stained brown and the nuclei stained blue.
Step 202: for the stained image under the i-th field of view among the n fields of view, determine the nucleus positions of the cancer cells in the stained image under the i-th field of view, i being a positive integer less than or equal to n.
In an exemplary embodiment, a deep learning method may be used to perform cancer cell detection on the stained image and determine the nucleus positions of the cancer cells. For example, an FCN (Fully Convolutional Network) may be used to build a cell nucleus detection model, which is trained by heatmap regression; the trained model can then be used for nucleus detection. The input of the model may be a stained image (an RGB image), and the output is a Gaussian-like response image centered on cell center points. All cancer cells can then be obtained by finding the local maximum response positions of the output heatmap.
Optionally, the stained image under the i-th field of view is processed by the cell nucleus detection model to obtain the nucleus positions of the cancer cells in that image. The nucleus positions may include a coordinate set of the cancer cells in the stained image under the i-th field of view (denoted D_detect), which includes the position coordinates of the nucleus of each cancer cell in that image. Exemplarily, as shown in Fig. 4, the left side is the input image 41 of the cell nucleus detection model, i.e., the stained image under a certain field of view, and the right side is the stained image 42 with the detected nucleus positions of the cancer cells marked by small black dots.
Of course, the cancer cell detection method introduced above is merely exemplary and explanatory; the embodiments of the present application do not exclude the use of other cancer cell detection methods.
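The peak-finding step (reading nucleus centers off the model's Gaussian-like response map as local maxima) can be sketched in NumPy; the 3x3 neighborhood and the 0.5 response threshold are illustrative assumptions:

```python
import numpy as np

def local_maxima(heatmap, threshold=0.5):
    """Return D_detect as (row, col) coordinates: pixels that exceed
    `threshold` and are the unique maximum of their 3x3 neighborhood."""
    padded = np.pad(heatmap, 1, mode="constant", constant_values=-np.inf)
    h, w = heatmap.shape
    peaks = []
    for r in range(h):
        for c in range(w):
            window = padded[r:r + 3, c:c + 3]
            if (heatmap[r, c] >= threshold
                    and heatmap[r, c] == window.max()
                    and int((window == window.max()).sum()) == 1):
                peaks.append((r, c))
    return peaks

# Two Gaussian-like bumps produce two detected nuclei.
hm = np.array([[0.1, 0.2, 0.1, 0.0],
               [0.2, 0.9, 0.2, 0.1],
               [0.1, 0.2, 0.1, 0.7],
               [0.0, 0.1, 0.2, 0.3]])
nuclei = local_maxima(hm)
```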
Step 203: generate a cell membrane description result of the stained image under the i-th field of view, the cell membrane description result indicating the completeness and intensity of the cell membrane staining.
Since the completeness and intensity of cell membrane staining determine the analysis result of the pathological section, the cell membrane description result of the stained image needs to be obtained. Here, the completeness of cell membrane staining refers to whether a stained cell membrane is a complete membrane, and the staining intensity refers to the depth of coloring presented by the stained membrane.
In the embodiments of the present application, the stained image may be processed and analyzed to obtain its cell membrane description result; the relevant procedure is introduced in the embodiments below.
Step 204: determine, according to the nucleus positions and the cell membrane description result, the number of cells of each type in the stained image under the i-th field of view.
In the embodiments of the present application, cells are divided into multiple types according to the cell membrane staining status. Optionally, the types include five kinds: complete strongly stained cells, incomplete strongly stained cells, complete weakly stained cells, incomplete weakly stained cells, and unstained cells. A complete strongly stained cell is a cell whose membrane is completely stained with high staining intensity; an incomplete strongly stained cell is a cell whose membrane is incompletely stained with high staining intensity; a complete weakly stained cell is a cell whose membrane is completely stained with low staining intensity; an incomplete weakly stained cell is a cell whose membrane is incompletely stained with low staining intensity; and an unstained cell is a cell that is not stained.
By combining the nucleus positions and the cell membrane description result of the stained image, the number of cells of each type in the stained image can be determined, realizing quantitative analysis and judgment of the cells. For example, the numbers of cells of each type determined in the stained image under a certain field of view may be: 170 complete strongly stained cells, 230 incomplete strongly stained cells, 2 complete weakly stained cells, 104 incomplete weakly stained cells, and 47 unstained cells.
Step 205: determine the analysis result of the pathological section according to the numbers of cells of each type in the stained images under the n fields of view.
When n is an integer greater than 1, the above steps 202-204 may be executed for the stained image under each of the multiple fields of view to obtain the number of cells of each type in each stained image. Then, the numbers of cells of each type in the stained images under the n fields of view are aggregated: for each cell type, the numbers of cells of that type in the individual stained images are summed to obtain the total number of cells of that type across all n stained images.
Optionally, the proportions of cells of each type in the stained images under the n fields of view are determined according to the numbers of cells of each type, and the analysis result of the pathological section is determined according to the proportions. Taking HER2 grading of a breast cancer pathological section as an example: if the proportion of complete strongly stained cells is greater than 10%, the HER2 grade is 3+; if the proportion of complete weakly stained cells is greater than 10% or the proportion of complete strongly stained cells is less than or equal to 10%, the HER2 grade is 2+; if the proportion of incomplete weakly stained cells is greater than 10%, the HER2 grade is 1+; if the proportion of unstained cells or incomplete weakly stained cells is less than or equal to 10%, the HER2 grade is 0. A HER2 grade of 3+ is judged HER2-positive; when the HER2 grade is 2+, in situ hybridization needs to be further applied to detect the HER2 gene amplification status, or a different tissue block may be selected for re-testing; a HER2 grade of 1+ or 0 is judged HER2-negative. Of course, the above description of determining the analysis result according to the proportions of the cell types is merely exemplary and explanatory; in practical applications, corresponding judgment criteria may be set in light of actual conditions, which is not limited in the embodiments of the present application.
In summary, in the technical solutions provided by the embodiments of the present application, stained images of a pathological section after cell membrane staining are acquired under n fields of view of a microscope, cell detection and cell membrane description are performed on each stained image to obtain the nucleus positions of the cancer cells and the cell membrane description result for each image, the two kinds of information are combined to determine the number of cells of each type, and the analysis result of the pathological section is then determined. This provides a technical solution for systematic cell membrane staining analysis of pathological sections and helps to improve the accuracy of test results.
In addition, in the technical solutions provided by the embodiments of the present application, a deep learning model is not used to directly analyze the stained images of the pathological section and directly output the analysis result; such a black-box approach does not comply with the provisions of the relevant diagnostic guidelines for HER2 grading. In the technical solutions provided by the embodiments of the present application, cancer cells are detected and classified according to the definitions in the relevant diagnostic guidelines, so that HER2 grading can be performed according to the guidelines' judgment criteria, thereby improving the standardization and accuracy of the final HER2 grading result.
In an exemplary embodiment, the above step 203 may include the following sub-steps:
1. Decompose and recombine the color channels of the stained image under the i-th field of view to obtain a target staining channel image.
The stained image is an RGB three-channel image, which can be denoted I_RGB. By decomposing and recombining the color channels of I_RGB, three immunohistochemical channel images are generated: I_H (hematoxylin), I_E (eosin), and I_DAB (diaminobenzidine), where the I_DAB channel image is the brown staining channel image. In the embodiments of the present application, the target staining channel image is this brown staining channel image; by analyzing it, the required cell membrane description result can be obtained.
2. Generate the cell membrane description result according to the target staining channel image.
In the embodiments of the present application, the cell membrane description result includes a first region segmentation result, a second region segmentation result, and a third region segmentation result. The first region segmentation result indicates the positions of completely stained cell membranes, the second indicates cell membrane positions whose staining intensity is greater than a first threshold value, and the third indicates cell membrane positions whose staining intensity is greater than a second threshold value. If a larger staining intensity value indicates stronger staining, the first threshold value is less than the second threshold value; that is, cells whose staining intensity is greater than the first threshold value and less than or equal to the second threshold value are weakly stained cells, cells whose staining intensity is greater than the second threshold value are strongly stained cells, and cells whose staining intensity is less than the first threshold value can be regarded as unstained cells.
Optionally, this step may include the following sub-steps:
2.1. Perform threshold segmentation on the target staining channel image with a first threshold to obtain a weakly stained segmented image.
For example, for each pixel in the target staining channel image, if the pixel value is greater than the first threshold, the pixel value is recorded as 1; if it is less than or equal to the first threshold, it is recorded as 0. The target staining channel image is thus converted into a binary weakly stained segmented image. The first threshold may be preset in light of actual conditions, which is not limited in the embodiments of the present application.
2.2. Perform threshold segmentation on the target staining channel image with a second threshold to obtain a strongly stained segmented image.
For example, for each pixel in the target staining channel image, if the pixel value is greater than the second threshold, the pixel value is recorded as 1; if it is less than or equal to the second threshold, it is recorded as 0. The target staining channel image is thus converted into a binary strongly stained segmented image. The second threshold may be preset in light of actual conditions, which is not limited in the embodiments of the present application.
2.3. Generate the cell membrane description result according to the weakly stained segmented image and the strongly stained segmented image.
Taking the target staining channel image as the above I_DAB channel image as an example, since the pixel values in the I_DAB channel image are negative, a first threshold t_1 may be used to threshold the image I_DAB to obtain the weakly stained segmented image M_light, and a second threshold t_2 may be used to threshold I_DAB to obtain the strongly stained segmented image M_heavy, where t_2 = a × t_1 and a is a coefficient less than 1. Exemplarily, t_1 = -0.35 and a = 0.9.
It should be noted that the weakly stained segmented image contains, besides weakly stained cells, strongly stained cells whose staining is stronger than weak, whereas the strongly stained segmented image contains only strongly stained cells.
Optionally, after the weakly stained and strongly stained segmented images are obtained, first morphological processing may be performed on the weakly stained segmented image to obtain the first region segmentation result; second morphological processing may be performed on the weakly stained segmented image to obtain the second region segmentation result; and the second morphological processing may be performed on the strongly stained segmented image to obtain the third region segmentation result.
Performing the first morphological processing on the weakly stained segmented image may include: performing skeleton extraction on the weakly stained segmented image to obtain the cytoskeleton in that image; searching for closed regions enclosed by the cytoskeleton; if a closed region is an innermost closed region, filling the innermost closed region to obtain a closed-region segmented image; and extracting the position information of the foreground pixels in the closed-region segmented image to obtain the first region segmentation result. Optionally, centerline extraction is performed on the weakly stained segmented image M_light to complete the delineation of the cell membranes and obtain the cytoskeleton, and the centerlines are then statistically analyzed to count the closed regions formed by each connected boundary. An innermost closed region is determined to be the delineated boundary of a complete cell membrane. All innermost closed regions are filled to obtain the closed-region segmented image M_enclosed, as shown in Fig. 5, and the set of position coordinates of all foreground pixels in M_enclosed is defined as P_enclosed, which is the first region segmentation result.
Performing the second morphological processing on the weakly stained segmented image may include: performing region dilation on the weakly stained segmented image to obtain a processed weakly stained segmented image; and extracting the position information of the foreground pixels in the processed image to obtain the second region segmentation result. The dilation distance d may be preset in light of actual conditions, exemplarily d = 45, which is not limited in the embodiments of the present application. As shown in Fig. 5, suppose region dilation is performed on the weakly stained segmented image M_light to obtain the processed weakly stained segmented image E_light; the set of position coordinates of all foreground pixels in E_light is defined as P_light, which is the second region segmentation result.
Performing the second morphological processing on the strongly stained segmented image may include: performing region dilation on the strongly stained segmented image to obtain a processed strongly stained segmented image; and extracting the position information of the foreground pixels in the processed image to obtain the third region segmentation result. The dilation distance d may be preset in light of actual conditions, exemplarily d = 45, which is not limited in the embodiments of the present application. As shown in Fig. 5, suppose region dilation is performed on the strongly stained segmented image M_heavy to obtain the processed strongly stained segmented image E_heavy; the set of position coordinates of all foreground pixels in E_heavy is defined as P_heavy, which is the third region segmentation result.
相应地,根据细胞核位置和细胞膜描述结果,确定第i个视野下的染色图像中各种类型的细胞数量,包括如下几个子步骤:
1、将细胞核位置、第一区域分割结果和第三区域分割结果的交集中元素的个数,确定为完整强染色细胞的数量;
也即,完整强染色细胞坐标集合为D detect∩P enclosed∩P heavy,对应的细胞数目为集合元素个数,即card(D detect∩P enclosed∩P heavy)。
2、将细胞核位置、第一区域分割结果相对于第i个视野下的染色图像的补集、以及第二区域分割结果的交集中元素的个数,确定为不完整强染色细胞的数量;
也即,不完整强染色细胞坐标集合为D detect∩C UP enclosed∩P heavy,这里C UP enclosed为P enclosed相对于全图U(也即第i个视野下的染色图像)的补集。对应的细胞数目为card(D detect∩C UP enclosed∩P heavy)。
3、将细胞核位置、第一区域分割结果、第三区域分割结果相对于第i个视野下的染色图像的补集、以及第二区域分割结果的交集中元素的个数,确定为完整弱染色细胞的数量;
也即,完整弱染色细胞坐标集合为D detect∩P enclosed∩C UP heavy∩P light,这里C UP heavy为P heavy相对于全图U(也即第i个视野下的染色图像)的补集。对应的细胞数目为card(D detect∩P enclosed∩C UP heavy∩P light)。
4、将细胞核位置、第一区域分割结果相对于第i个视野下的染色图像的补集、第三区域分割结果相对于第i个视野下的染色图像的补集、以及第二区域分割结果的交集中元素的个数,确定不完整弱染色细胞的数量;
也即,不完整弱染色细胞坐标集合为D detect∩C UP enclosed∩C UP heavy∩P light。对应的细胞数目为card(D detect∩C UP enclosed∩C UP heavy∩P light)。
5、将细胞核位置与第二区域分割结果相对于第i个视野下的染色图像的补 集的交集中元素的个数,确定为无染色细胞的数量。
也即，无染色细胞坐标集合为D_detect∩C_U(P_light)，这里C_U(P_light)为P_light相对于全图U（也即第i个视野下的染色图像）的补集。对应的细胞数目为card(D_detect∩C_U(P_light))。
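上述1~5的集合运算可用 Python 集合直接表达（非专利原文内容：函数名 classify_cells 与中文键名均为自拟，补集按 C_U(P)=U−P 计算，其中 U 为全图像素坐标集合）：

```python
def classify_cells(d_detect, p_enclosed, p_light, p_heavy, universe):
    """按细胞核位置与三种区域分割结果统计五类细胞的数量 card(…)。"""
    c_enclosed = universe - p_enclosed  # C_U(P_enclosed)
    c_heavy = universe - p_heavy        # C_U(P_heavy)
    c_light = universe - p_light        # C_U(P_light)
    return {
        "完整强染色": len(d_detect & p_enclosed & p_heavy),
        "不完整强染色": len(d_detect & c_enclosed & p_heavy),
        "完整弱染色": len(d_detect & p_enclosed & c_heavy & p_light),
        "不完整弱染色": len(d_detect & c_enclosed & c_heavy & p_light),
        "无染色": len(d_detect & c_light),
    }
```

当 P_heavy ⊆ P_light 且 D_detect ⊆ U 时，上述五类计数恰好对检出的细胞核构成一个划分，五类数量之和等于检出细胞核总数。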
在本申请实施例中,通过上述方式,实现了对病理切片的染色图像中各种细胞类型进行量化统计和分析,为HER2分级提供了可靠的数据支持。
结合参考图6,其示出了本申请完整技术方案的流程图。获取经细胞膜染色后的病理切片在显微镜的多个视野下的染色图像,对于单视野下的染色图像,进行细胞核检测和细胞膜描述,得到该单视野下的染色图像中癌症细胞的细胞核位置和细胞膜描述结果,结合上述两方面信息得到该单视野下的细胞分类结果,也即各种类型的细胞数量。综合各个单视野下的细胞分类结果,得到多视野下的细胞分类结果,然后据此确定出该病理切片对应的HER2分级。例如,多视野下的细胞分类结果为:完整强染色细胞的细胞占比为20%,不完整强染色细胞的细胞占比为35%,完整弱染色细胞的细胞占比为5%,不完整弱染色细胞的细胞占比为10%,无染色细胞的细胞占比为30%,则HER2分级为3+。
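与图6的流程相对应，多视野结果汇总与占比计算可示意如下（非专利原文内容：函数名、键名为自拟；其中"完整强染色细胞占比超过10%记为3+"只是与文中示例一致的假设性分级规则，完整的HER2判读标准并未在本段给出，其余分支以占位字符串表示）：

```python
def aggregate_and_grade(per_view_counts):
    """汇总各单视野的五类细胞计数，计算多视野占比并给出示意性 HER2 分级。"""
    types = ["完整强染色", "不完整强染色", "完整弱染色", "不完整弱染色", "无染色"]
    totals = {t: sum(c.get(t, 0) for c in per_view_counts) for t in types}
    n = sum(totals.values())
    ratios = {t: (totals[t] / n if n else 0.0) for t in types}
    # 假设性规则：完整强染色细胞占比 > 10% 记为 3+，其余情形留待进一步判读
    grade = "3+" if ratios["完整强染色"] > 0.10 else "需进一步判读"
    return ratios, grade
```

按此规则，文中示例（完整强染色占比20%）会被判为3+，与正文示例一致。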
下述为本申请装置实施例,可以用于执行本申请方法实施例。对于本申请装置实施例中未披露的细节,请参照本申请方法实施例。
请参考图7,其示出了本申请一个实施例提供的病理切片图像的处理装置的框图。该装置具有实现上述方法示例的功能,所述功能可以由硬件实现,也可以由硬件执行相应的软件实现。该装置可以是上文介绍的计算机设备,也可以设置在计算机设备中。该装置700可以包括:图像获取模块710、细胞核检测模块720、细胞膜描述模块730、数量确定模块740和结果确定模块750。
图像获取模块710,用于获取经细胞膜染色后的病理切片在显微镜的n个视野下的染色图像,所述n为正整数。
细胞核检测模块720,用于对于所述n个视野中第i个视野下的染色图像,确定所述第i个视野下的染色图像中癌症细胞的细胞核位置,所述i为小于或等于所述n的正整数。
细胞膜描述模块730,用于生成所述第i个视野下的染色图像的细胞膜描述结果,所述细胞膜描述结果用于指示细胞膜染色的完整性和染色强度。
数量确定模块740,用于根据所述细胞核位置和所述细胞膜描述结果,确定所述第i个视野下的染色图像中各种类型的细胞数量。
结果确定模块750,用于根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述病理切片的分析结果。
综上所述，本申请实施例提供的技术方案中，通过获取经细胞膜染色后的病理切片在显微镜的n个视野下的染色图像，对各个染色图像进行细胞核检测和细胞膜描述，得到各个染色图像中癌症细胞的细胞核位置和细胞膜描述结果，结合上述两方面信息确定出各种类型的细胞数量，进而确定该病理切片的分析结果，提供了一种对病理切片进行系统性的细胞膜染色分析的技术方案，有助于提升检测结果的准确性。
在示例性实施例中,如图8所示,所述细胞膜描述模块730包括:分解重组子模块731和细胞膜描述子模块732。
分解重组子模块731,用于对所述第i个视野下的染色图像进行色彩通道分解和重组,得到目标染色通道图像。
细胞膜描述子模块732,用于根据所述目标染色通道图像生成所述细胞膜描述结果,所述细胞膜描述结果包括第一区域分割结果、第二区域分割结果和第三区域分割结果。
其中,所述第一区域分割结果用于指示染色完整的细胞膜位置,所述第二区域分割结果用于指示染色强度大于第一门限值的细胞膜位置,所述第三区域分割结果用于指示染色强度大于第二门限值的细胞膜位置。
在示例性实施例中,如图8所示,所述细胞膜描述子模块732包括:第一分割单元732a、第二分割单元732b和细胞膜描述单元732c。
第一分割单元732a,用于对所述目标染色通道图像以第一阈值进行阈值分割处理,得到弱染色分割图像。
第二分割单元732b,用于对所述目标染色通道图像以第二阈值进行阈值分割处理,得到强染色分割图像。
细胞膜描述单元732c,用于根据所述弱染色分割图像和所述强染色分割图像,生成所述细胞膜描述结果。
在示例性实施例中,如图8所示,所述细胞膜描述单元732c,用于:
对所述弱染色分割图像进行第一形态学处理,得到所述第一区域分割结果;
对所述弱染色分割图像进行第二形态学处理,得到所述第二区域分割结果;
对所述强染色分割图像进行所述第二形态学处理,得到所述第三区域分割结果。
在示例性实施例中,所述细胞膜描述单元732c,用于:
对所述弱染色分割图像进行骨架提取处理,得到所述弱染色分割图像中的细胞骨架;
查找所述细胞骨架闭合的封闭区域;
若所述封闭区域是最内层封闭区域,则填充所述最内层封闭区域,得到封闭区域分割图像;
提取所述封闭区域分割图像中前景像素的位置信息,得到所述第一区域分割结果。
在示例性实施例中,所述细胞膜描述单元732c,用于:
对所述弱染色分割图像进行区域膨胀处理,得到处理后的弱染色分割图像;
提取所述处理后的弱染色分割图像中前景像素的位置信息,得到所述第二区域分割结果。
在示例性实施例中,所述细胞膜描述单元732c,用于:
对所述强染色分割图像进行区域膨胀处理,得到处理后的强染色分割图像;
提取所述处理后的强染色分割图像中前景像素的位置信息,得到所述第三区域分割结果。
在示例性实施例中,所述类型包括:完整强染色细胞、不完整强染色细胞、完整弱染色细胞、不完整弱染色细胞和无染色细胞。
在示例性实施例中,所述数量确定模块740,用于:
将所述细胞核位置、第一区域分割结果和第三区域分割结果的交集中元素的个数,确定为所述完整强染色细胞的数量;
将所述细胞核位置、所述第一区域分割结果相对于所述第i个视野下的染色图像的补集、以及第三区域分割结果的交集中元素的个数，确定为所述不完整强染色细胞的数量；
将所述细胞核位置、所述第一区域分割结果、所述第三区域分割结果相对于所述第i个视野下的染色图像的补集、以及所述第二区域分割结果的交集中元素的个数,确定为所述完整弱染色细胞的数量;
将所述细胞核位置、所述第一区域分割结果相对于所述第i个视野下的染色图像的补集、所述第三区域分割结果相对于所述第i个视野下的染色图像的补集、以及所述第二区域分割结果的交集中元素的个数，确定为所述不完整弱染色细胞的数量；
将所述细胞核位置与所述第二区域分割结果相对于所述第i个视野下的染色图像的补集的交集中元素的个数,确定为所述无染色细胞的数量;
其中,所述细胞膜描述结果包括所述第一区域分割结果、所述第二区域分割结果和所述第三区域分割结果,所述第一区域分割结果用于指示染色完整的细胞膜位置,所述第二区域分割结果用于指示染色强度大于第一门限值的细胞膜位置,所述第三区域分割结果用于指示染色强度大于第二门限值的细胞膜位置。
在示例性实施例中,所述细胞核检测模块720,用于通过细胞核检测模型对所述第i个视野下的染色图像进行处理,得到所述细胞核位置。
在示例性实施例中,所述结果确定模块750,用于:
根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述n个视野下的染色图像中所述各种类型的细胞占比;
根据所述各种类型的细胞占比,确定所述病理切片的分析结果。
需要说明的是,上述实施例提供的装置,在实现其功能时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的装置与方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
本申请一示例性实施例还提供了一种智能显微镜系统,包括:显微镜、相机和计算机设备。
所述显微镜,用于对经细胞膜染色后的病理切片进行观察。
所述相机,用于获取所述病理切片在所述显微镜的n个视野下的染色图像,所述n为正整数。
所述计算机设备，用于对于所述n个视野中第i个视野下的染色图像，确定所述第i个视野下的染色图像中癌症细胞的细胞核位置，所述i为小于或等于所述n的正整数；生成所述第i个视野下的染色图像的细胞膜描述结果，所述细胞膜描述结果用于指示细胞膜染色的完整性和染色强度；根据所述细胞核位置和所述细胞膜描述结果，确定所述第i个视野下的染色图像中各种类型的细胞数量；根据所述n个视野下的染色图像中所述各种类型的细胞数量，确定所述病理切片的分析结果。
可选地,所述计算机设备还用于执行上文方法实施例中介绍的其它步骤,本申请实施例对此不作限定。
请参考图9,其示出了本申请一个实施例提供的计算机设备的结构示意图。具体来讲:
所述计算机设备900包括CPU(Central Processing Unit,中央处理单元)901、包括RAM(Random Access Memory,随机存取存储器)902和ROM(Read Only Memory,只读存储器)903的系统存储器904,以及连接系统存储器904和中央处理单元901的系统总线905。所述计算机设备900还包括帮助计算机内的各个器件之间传输信息的基本I/O(Input/Output输入/输出)系统906,和用于存储操作系统913、应用程序914和其他程序模块915的大容量存储设备907。
所述基本输入/输出系统906包括有用于显示信息的显示器908和用于用户输入信息的诸如鼠标、键盘之类的输入设备909。其中所述显示器908和输入设备909都通过连接到系统总线905的输入输出控制器910连接到中央处理单元901。所述基本输入/输出系统906还可以包括输入输出控制器910以用于接收和处理来自键盘、鼠标、或电子触控笔等多个其他设备的输入。类似地,输入输出控制器910还提供输出到显示屏、打印机或其他类型的输出设备。
所述大容量存储设备907通过连接到系统总线905的大容量存储控制器(未示出)连接到中央处理单元901。所述大容量存储设备907及其相关联的计算机可读介质为计算机设备900提供非易失性存储。也就是说,所述大容量存储设备907可以包括诸如硬盘或者CD-ROM(Compact Disc Read-Only Memory,只读光盘)驱动器之类的计算机可读介质(未示出)。
不失一般性，所述计算机可读介质可以包括计算机存储介质和通信介质。计算机存储介质包括以用于存储诸如计算机可读指令、数据结构、程序模块或其他数据等信息的任何方法或技术实现的易失性和非易失性、可移动和不可移动介质。计算机存储介质包括RAM、ROM、EPROM（Erasable Programmable Read Only Memory，可擦除可编程只读存储器）、闪存或其他固态存储技术，CD-ROM或其他光学存储、磁带盒、磁带、磁盘存储或其他磁性存储设备。当然，本领域技术人员可知所述计算机存储介质不局限于上述几种。上述的系统存储器904和大容量存储设备907可以统称为存储器。
根据本申请的各种实施例,所述计算机设备900还可以通过诸如因特网等网络连接到网络上的远程计算机运行。也即计算机设备900可以通过连接在所述系统总线905上的网络接口单元911连接到网络912,或者说,也可以使用网络接口单元911来连接到其他类型的网络或远程计算机系统(未示出)。
所述存储器还包括至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、至少一段程序、代码集或指令集存储于存储器中,且经配置以由一个或者一个以上处理器执行,以实现上述病理切片图像的处理方法。
在示例性实施例中，还提供了一种计算机可读存储介质，所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集，所述至少一条指令、所述至少一段程序、所述代码集或所述指令集在被终端的处理器执行时实现上述病理切片图像的处理方法。
可选地,该计算机可读存储介质可以包括:ROM、RAM、SSD(Solid State Drives,固态硬盘)或光盘等。其中,随机存取记忆体可以包括ReRAM(Resistance Random Access Memory,电阻式随机存取记忆体)和DRAM(Dynamic Random Access Memory,动态随机存取存储器)。
在示例性实施例中,还提供一种计算机程序产品,所述计算机程序产品被终端的处理器执行时,用于实现上述病理切片图像的处理方法。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。另外,本文中描述的步骤编号,仅示例性示出了步骤间的一种可能的执行先后顺序,在一些其它实施例中,上述步骤也可以不按照编号顺序来执行,如两个不同编号的步骤同时执行,或者两个不同编号的步骤按照与图示相反的顺序执行,本申请实施例对此不作限定。
以上所述仅为本申请的示例性实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (15)

  1. 一种病理切片图像的处理方法,应用于计算机设备中,所述方法包括:
    获取经细胞膜染色后的病理切片在显微镜的n个视野下的染色图像,所述n为正整数;
    对于所述n个视野中第i个视野下的染色图像,确定所述第i个视野下的染色图像中癌症细胞的细胞核位置,所述i为小于或等于所述n的正整数;
    生成所述第i个视野下的染色图像的细胞膜描述结果,所述细胞膜描述结果用于指示细胞膜染色的完整性和染色强度;
    根据所述细胞核位置和所述细胞膜描述结果,确定所述第i个视野下的染色图像中各种类型的细胞数量;
    根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述病理切片的分析结果。
  2. 根据权利要求1所述的方法,其中,所述生成所述第i个视野下的染色图像的细胞膜描述结果,包括:
    对所述第i个视野下的染色图像进行色彩通道分解和重组,得到目标染色通道图像;
    根据所述目标染色通道图像生成所述细胞膜描述结果,所述细胞膜描述结果包括第一区域分割结果、第二区域分割结果和第三区域分割结果;
    其中,所述第一区域分割结果用于指示染色完整的细胞膜位置,所述第二区域分割结果用于指示染色强度大于第一门限值的细胞膜位置,所述第三区域分割结果用于指示染色强度大于第二门限值的细胞膜位置。
  3. 根据权利要求2所述的方法,其中,所述根据所述目标染色通道图像生成所述细胞膜描述结果,包括:
    对所述目标染色通道图像以第一阈值进行阈值分割处理,得到弱染色分割图像;
    对所述目标染色通道图像以第二阈值进行阈值分割处理,得到强染色分割图像;
    根据所述弱染色分割图像和所述强染色分割图像，生成所述细胞膜描述结果。
  4. 根据权利要求3所述的方法,其中,所述根据所述弱染色分割图像和所述强染色分割图像,生成所述细胞膜描述结果,包括:
    对所述弱染色分割图像进行第一形态学处理,得到所述第一区域分割结果;
    对所述弱染色分割图像进行第二形态学处理,得到所述第二区域分割结果;
    对所述强染色分割图像进行所述第二形态学处理,得到所述第三区域分割结果。
  5. 根据权利要求4所述的方法,其中,所述对所述弱染色分割图像进行第一形态学处理,得到所述第一区域分割结果,包括:
    对所述弱染色分割图像进行骨架提取处理,得到所述弱染色分割图像中的细胞骨架;
    查找所述细胞骨架闭合的封闭区域;
    若所述封闭区域是最内层封闭区域,则填充所述最内层封闭区域,得到封闭区域分割图像;
    提取所述封闭区域分割图像中前景像素的位置信息,得到所述第一区域分割结果。
  6. 根据权利要求4所述的方法,其中,所述对所述弱染色分割图像进行第二形态学处理,得到所述第二区域分割结果,包括:
    对所述弱染色分割图像进行区域膨胀处理,得到处理后的弱染色分割图像;
    提取所述处理后的弱染色分割图像中前景像素的位置信息,得到所述第二区域分割结果。
  7. 根据权利要求4所述的方法,其中,所述对所述强染色分割图像进行所述第二形态学处理,得到所述第三区域分割结果,包括:
    对所述强染色分割图像进行区域膨胀处理,得到处理后的强染色分割图像;
    提取所述处理后的强染色分割图像中前景像素的位置信息,得到所述第三区域分割结果。
  8. 根据权利要求1至7任一项所述的方法,其中,所述类型包括:完整强染色细胞、不完整强染色细胞、完整弱染色细胞、不完整弱染色细胞和无染色细胞。
  9. 根据权利要求8所述的方法,其中,所述根据所述细胞核位置和所述细胞膜描述结果,确定所述第i个视野下的染色图像中各种类型的细胞数量,包括:
    将所述细胞核位置、第一区域分割结果和第三区域分割结果的交集中元素的个数,确定为所述完整强染色细胞的数量;
    将所述细胞核位置、所述第一区域分割结果相对于所述第i个视野下的染色图像的补集、以及第三区域分割结果的交集中元素的个数，确定为所述不完整强染色细胞的数量；
    将所述细胞核位置、所述第一区域分割结果、所述第三区域分割结果相对于所述第i个视野下的染色图像的补集、以及所述第二区域分割结果的交集中元素的个数,确定为所述完整弱染色细胞的数量;
    将所述细胞核位置、所述第一区域分割结果相对于所述第i个视野下的染色图像的补集、所述第三区域分割结果相对于所述第i个视野下的染色图像的补集、以及所述第二区域分割结果的交集中元素的个数，确定为所述不完整弱染色细胞的数量；
    将所述细胞核位置与所述第二区域分割结果相对于所述第i个视野下的染色图像的补集的交集中元素的个数,确定为所述无染色细胞的数量;
    其中,所述细胞膜描述结果包括所述第一区域分割结果、所述第二区域分割结果和所述第三区域分割结果,所述第一区域分割结果用于指示染色完整的细胞膜位置,所述第二区域分割结果用于指示染色强度大于第一门限值的细胞膜位置,所述第三区域分割结果用于指示染色强度大于第二门限值的细胞膜位置。
  10. 根据权利要求1至7任一项所述的方法,其中,所述确定所述第i个视野下的染色图像中癌症细胞的细胞核位置,包括:
    通过细胞核检测模型对所述第i个视野下的染色图像进行处理，得到所述细胞核位置。
  11. 根据权利要求1至7任一项所述的方法,其中,所述根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述病理切片的分析结果,包括:
    根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述n个视野下的染色图像中所述各种类型的细胞占比;
    根据所述各种类型的细胞占比,确定所述病理切片的分析结果。
  12. 一种病理切片图像的处理装置,所述装置包括:
    图像获取模块,用于获取经细胞膜染色后的病理切片在显微镜的n个视野下的染色图像,所述n为正整数;
    细胞核检测模块,用于对于所述n个视野中第i个视野下的染色图像,确定所述第i个视野下的染色图像中癌症细胞的细胞核位置,所述i为小于或等于所述n的正整数;
    细胞膜描述模块,用于生成所述第i个视野下的染色图像的细胞膜描述结果,所述细胞膜描述结果用于指示细胞膜染色的完整性和染色强度;
    数量确定模块,用于根据所述细胞核位置和所述细胞膜描述结果,确定所述第i个视野下的染色图像中各种类型的细胞数量;
    结果确定模块,用于根据所述n个视野下的染色图像中所述各种类型的细胞数量,确定所述病理切片的分析结果。
  13. 一种智能显微镜系统,所述智能显微镜系统包括:显微镜、相机和计算机设备;
    所述显微镜,用于对经细胞膜染色后的病理切片进行观察;
    所述相机,用于获取所述病理切片在所述显微镜的n个视野下的染色图像,所述n为正整数;
    所述计算机设备，用于对于所述n个视野中第i个视野下的染色图像，确定所述第i个视野下的染色图像中癌症细胞的细胞核位置，所述i为小于或等于所述n的正整数；生成所述第i个视野下的染色图像的细胞膜描述结果，所述细胞膜描述结果用于指示细胞膜染色的完整性和染色强度；根据所述细胞核位置和所述细胞膜描述结果，确定所述第i个视野下的染色图像中各种类型的细胞数量；根据所述n个视野下的染色图像中所述各种类型的细胞数量，确定所述病理切片的分析结果。
  14. 一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如权利要求1至11任一项所述的方法。
  15. 一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由处理器加载并执行以实现如权利要求1至11任一项所述的方法。
PCT/CN2020/115842 2019-11-14 2020-09-17 病理切片图像的处理方法、装置、系统及存储介质 WO2021093451A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20887251.5A EP3989160A4 (en) 2019-11-14 2020-09-17 PATHOLOGICAL SECTION IMAGE PROCESSING METHOD, APPARATUS, SYSTEM AND INFORMATION HOLDER
US17/515,170 US11967069B2 (en) 2019-11-14 2021-10-29 Pathological section image processing method and apparatus, system, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911115369.6 2019-11-14
CN201911115369.6A CN110853022B (zh) 2019-11-14 2019-11-14 病理切片图像的处理方法、装置、系统及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/515,170 Continuation US11967069B2 (en) 2019-11-14 2021-10-29 Pathological section image processing method and apparatus, system, and storage medium

Publications (1)

Publication Number Publication Date
WO2021093451A1 true WO2021093451A1 (zh) 2021-05-20

Family

ID=69600907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/115842 WO2021093451A1 (zh) 2019-11-14 2020-09-17 病理切片图像的处理方法、装置、系统及存储介质

Country Status (4)

Country Link
US (1) US11967069B2 (zh)
EP (1) EP3989160A4 (zh)
CN (1) CN110853022B (zh)
WO (1) WO2021093451A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115222734A (zh) * 2022-09-20 2022-10-21 山东大学齐鲁医院 一种用于胃黏膜肠上皮化生的图像分析方法及系统

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110853022B (zh) * 2019-11-14 2020-11-06 腾讯科技(深圳)有限公司 病理切片图像的处理方法、装置、系统及存储介质
CN111430011B (zh) * 2020-03-31 2023-08-29 杭州依图医疗技术有限公司 细胞染色图像的显示方法、处理方法及存储介质
CN113705318B (zh) * 2021-04-22 2023-04-18 腾讯医疗健康(深圳)有限公司 基于图像的识别方法、装置、设备及可读存储介质
CN113077486B (zh) * 2021-04-30 2021-10-08 深圳世源工程技术有限公司 一种山区植被覆盖率监测方法及系统
CN114638782B (zh) * 2022-01-10 2023-02-07 武汉中纪生物科技有限公司 一种宫颈脱落细胞标本的检测方法
CN115130543B (zh) * 2022-04-29 2024-04-12 腾讯科技(深圳)有限公司 图像识别方法和装置、存储介质及电子设备
CN114943723B (zh) * 2022-06-08 2024-05-28 北京大学口腔医学院 一种对不规则细胞进行分割计数的方法及相关设备
CN116246019B (zh) * 2023-02-27 2024-01-05 上海迪派生物科技有限公司 一种病理切片的3d重建方法、装置、设备及介质
CN116823761A (zh) * 2023-06-25 2023-09-29 青岛百创智能制造技术有限公司 基于细胞分割的信息处理方法、装置、设备及存储介质
CN117218139B (zh) * 2023-09-12 2024-05-24 珠海横琴圣澳云智科技有限公司 样本细胞密度的确定方法和装置
CN117575977B (zh) * 2024-01-17 2024-04-02 锦恒科技(大连)有限公司 一种用于卵巢组织分析的卵泡区域增强方法

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1074365A (zh) * 1992-12-29 1993-07-21 北京市开隆仪器设备公司 肿瘤图像诊断方法及系统
US20070020697A1 (en) * 2005-07-25 2007-01-25 Hernani Cualing Virtual flow cytometry on immunostained tissue-tissue cytometer
CN101560544A (zh) * 2008-04-18 2009-10-21 麦克奥迪实业集团有限公司 一种细胞综合检测方法
US20150086103A1 (en) * 2012-03-30 2015-03-26 Konica Minolta, Inc. Medical image processor and storage medium
US20170103521A1 (en) * 2014-02-21 2017-04-13 Ventana Medical Systems, Inc. Medical image analysis for identifying biomarker-positive tumor cells
CN108181334A (zh) * 2018-01-25 2018-06-19 浙江海洋大学 大菱鲆精细胞变态阶段的细胞学划分方法
CN110363762A (zh) * 2019-07-23 2019-10-22 腾讯科技(深圳)有限公司 细胞检测方法、装置、智能显微镜系统和可读存储介质
CN110853022A (zh) * 2019-11-14 2020-02-28 腾讯科技(深圳)有限公司 病理切片图像的处理方法、装置、系统及存储介质

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100279341A1 (en) * 2000-10-24 2010-11-04 Tissuegnostics Gmbh Methods and system for analyzing cells
WO2012041333A1 (en) * 2010-09-30 2012-04-05 Visiopharm A/S Automated imaging, detection and grading of objects in cytological samples
WO2013038331A1 (en) * 2011-09-13 2013-03-21 Koninklijke Philips Electronics N.V. System and method for the detection of abnormalities in a biological sample
US9147104B2 (en) * 2012-11-05 2015-09-29 The United States Of America As Represented By The Secretary Of The Air Force Systems and methods for processing low contrast images
US20150310613A1 (en) * 2012-12-07 2015-10-29 Canon Kabushiki Kaisha Image generating apparatus and image generating method
JP6405319B2 (ja) * 2012-12-28 2018-10-17 ザ ユニバーシティー オブ メルボルン 乳癌予知のための画像分析
AU2015345199A1 (en) * 2014-11-10 2017-04-27 Ventana Medical Systems, Inc. Classifying nuclei in histology images
US10049770B2 (en) * 2015-12-30 2018-08-14 Case Western Reserve University Prediction of recurrence of non-small cell lung cancer
CN105741266B (zh) * 2016-01-22 2018-08-21 北京航空航天大学 一种病理图像细胞核快速定位方法
EP3559902A1 (en) * 2016-12-22 2019-10-30 Ventana Medical Systems, Inc. Computer scoring based on primary stain and immunohistochemistry images
CN106940889B (zh) * 2017-03-30 2020-09-04 福建师范大学 基于像素邻域特征聚类的淋巴结he染色病理图像分割方法
JP7026694B2 (ja) * 2017-09-27 2022-02-28 富士フイルム株式会社 画像解析装置、方法およびプログラム
EP3701045A4 (en) * 2017-10-23 2021-07-28 The University Of Western Australia IMPROVEMENTS TO OR RELATING TO CELLAL ANALYSIS
WO2019110567A1 (en) * 2017-12-05 2019-06-13 Ventana Medical Systems, Inc. Method of computing tumor spatial and inter-marker heterogeneity
EP3729369A2 (en) * 2017-12-24 2020-10-28 Ventana Medical Systems, Inc. Computational pathology approach for retrospective analysis of tissue-based companion diagnostic driven clinical trial studies
CN108346145B (zh) * 2018-01-31 2020-08-04 浙江大学 一种病理切片中非常规细胞的识别方法
JP2021531790A (ja) * 2018-07-27 2021-11-25 ベンタナ メディカル システムズ, インコーポレイテッド 自動化された原位置ハイブリッド形成分析のためのシステム
CN109215017B (zh) 2018-08-16 2020-06-02 腾讯科技(深圳)有限公司 图片处理方法、装置、用户终端、服务器及存储介质
CN109903284B (zh) * 2019-03-04 2022-07-05 武汉大学 一种her2免疫组化图像自动判别方法及系统
CN110021013A (zh) * 2019-03-27 2019-07-16 广州金域医学检验中心有限公司 病理切片细胞的类型识别方法、装置和计算机设备
CN110390676A (zh) * 2019-07-26 2019-10-29 腾讯科技(深圳)有限公司 显微镜下医学染色图像的细胞检测方法、智能显微镜系统

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1074365A (zh) * 1992-12-29 1993-07-21 北京市开隆仪器设备公司 肿瘤图像诊断方法及系统
US20070020697A1 (en) * 2005-07-25 2007-01-25 Hernani Cualing Virtual flow cytometry on immunostained tissue-tissue cytometer
CN101560544A (zh) * 2008-04-18 2009-10-21 麦克奥迪实业集团有限公司 一种细胞综合检测方法
US20150086103A1 (en) * 2012-03-30 2015-03-26 Konica Minolta, Inc. Medical image processor and storage medium
US20170103521A1 (en) * 2014-02-21 2017-04-13 Ventana Medical Systems, Inc. Medical image analysis for identifying biomarker-positive tumor cells
CN108181334A (zh) * 2018-01-25 2018-06-19 浙江海洋大学 大菱鲆精细胞变态阶段的细胞学划分方法
CN110363762A (zh) * 2019-07-23 2019-10-22 腾讯科技(深圳)有限公司 细胞检测方法、装置、智能显微镜系统和可读存储介质
CN110853022A (zh) * 2019-11-14 2020-02-28 腾讯科技(深圳)有限公司 病理切片图像的处理方法、装置、系统及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3989160A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115222734A (zh) * 2022-09-20 2022-10-21 山东大学齐鲁医院 一种用于胃黏膜肠上皮化生的图像分析方法及系统
CN115222734B (zh) * 2022-09-20 2023-01-17 山东大学齐鲁医院 一种用于胃黏膜肠上皮化生的图像分析方法及系统

Also Published As

Publication number Publication date
US11967069B2 (en) 2024-04-23
CN110853022B (zh) 2020-11-06
EP3989160A1 (en) 2022-04-27
US20220051404A1 (en) 2022-02-17
EP3989160A4 (en) 2022-08-24
CN110853022A (zh) 2020-02-28

Similar Documents

Publication Publication Date Title
WO2021093451A1 (zh) 病理切片图像的处理方法、装置、系统及存储介质
EP3961484B1 (en) Medical image segmentation method and device, electronic device and storage medium
US11935644B2 (en) Deep learning automated dermatopathology
EP4006831A1 (en) Image processing method and apparatus, server, medical image processing device and storage medium
CN111260677B (zh) 基于显微图像的细胞分析方法、装置、设备及存储介质
CN111524137B (zh) 基于图像识别的细胞识别计数方法、装置和计算机设备
CN109389129A (zh) 一种图像处理方法、电子设备及存储介质
US11176412B2 (en) Systems and methods for encoding image features of high-resolution digital images of biological specimens
Nguyen et al. Spatial statistics for segmenting histological structures in H&E stained tissue images
CN110490882B (zh) 细胞膜染色图像分析方法、装置及系统
CN112330690B (zh) 基于显微图像的细胞分割方法、装置、设备及存储介质
CN113920309A (zh) 图像检测方法、装置、医学图像处理设备及存储介质
CN114550169A (zh) 细胞分类模型的训练方法、装置、设备及介质
CN113096080A (zh) 图像分析方法及系统
CN114283406A (zh) 细胞图像识别方法、装置、设备、介质及计算机程序产品
CN114332854A (zh) 图像处理方法、装置、设备及存储介质
WO2021164320A1 (zh) 基于计算机视觉的导管特征获取方法、装置和智能显微镜
CN113822846A (zh) 医学图像中确定感兴趣区域的方法、装置、设备及介质
CN110363762A (zh) 细胞检测方法、装置、智能显微镜系统和可读存储介质
CN114764948A (zh) 活体检测方法、装置、设备及存储介质
CN111582404A (zh) 内容分类方法、装置及可读存储介质
CN113706449B (zh) 基于病理图像的细胞分析方法、装置、设备及存储介质
Kassim et al. A cell augmentation tool for blood smear analysis
CN115359325B (zh) 图像识别模型的训练方法、装置、设备和介质
Kassim et al. VASCilia (Vision Analysis StereoCilia): A Napari Plugin for Deep Learning-Based 3D Analysis of Cochlear Hair Cell Stereocilia Bundles

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20887251

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020887251

Country of ref document: EP

Effective date: 20220124

NENP Non-entry into the national phase

Ref country code: DE