WO2014196097A1 - Image processing system, image processing device, program, information storage medium, and image processing method - Google Patents

Image processing system, image processing device, program, information storage medium, and image processing method

Info

Publication number
WO2014196097A1
WO2014196097A1 (PCT/JP2013/081361)
Authority
WO
WIPO (PCT)
Prior art keywords
image
cell
region
captured
image processing
Prior art date
Application number
PCT/JP2013/081361
Other languages
English (en)
Japanese (ja)
Inventor
英人 織田
尾崎 良太
加藤 典司
熊澤 幸夫
Original Assignee
富士ゼロックス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士ゼロックス株式会社 (Fuji Xerox Co., Ltd.)
Priority to CN201380075661.4A (published as CN105122037A)
Publication of WO2014196097A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/778Active pattern-learning, e.g. online learning of image or video features
    • G06V10/7784Active pattern-learning, e.g. online learning of image or video features based on feedback from supervisors
    • G06V10/7788Active pattern-learning, e.g. online learning of image or video features based on feedback from supervisors the supervisor being a human, e.g. interactive learning with a human teacher
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30024Cell structures in vitro; Tissue sections in vitro

Definitions

  • the present invention relates to an image processing system, an image processing apparatus, a program, a storage medium, and an image processing method.
  • Fetal nucleated red blood cells (NRBCs) are hereinafter referred to as target cells.
  • An object of the present invention is to enable a user to determine whether or not an object in a cell candidate region is a target cell by only referring to an image of the cell candidate region in a captured image.
  • The invention according to claim 1, made to solve the above-described problem, is an image processing system including: an imaging unit that images a sample including a target cell at each of a plurality of focal lengths; an acquisition unit that acquires, from each captured image captured by the imaging unit, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display unit that displays the plurality of images acquired by the acquisition unit; and a means for receiving an input of the user's determination result as to whether or not the target cell appears in the cell candidate region.
  • The image processing system according to claim 2 is the image processing system according to claim 1, characterized in that the display unit displays the plurality of images as a list, arranged in an order according to the focal length at which each captured image was captured.
  • The image processing system according to claim 3 is the image processing system according to claim 1, characterized in that the display unit displays the plurality of images one after another, in a display order according to the focal length at which each captured image was captured.
  • The image processing system according to claim 4 is the image processing system according to any one of claims 1 to 3, characterized in that the display unit displays the plurality of images together with an image of a peripheral region, in one of the captured images, that includes the cell candidate region.
  • The image processing system according to claim 5 is the image processing system according to any one of claims 1 to 4, characterized in that the display unit displays the plurality of images together with information indicating the focal length at which each captured image was captured.
  • The invention according to claim 6 is an image processing apparatus including: an acquisition means that acquires, from each of the captured images obtained by imaging a sample including a target cell at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display means that displays the plurality of acquired images; and a means for receiving an input of the user's determination result.
  • The invention according to claim 7, made to solve the above-described problem, is a program for causing a computer to function as: an acquisition means that acquires, from each of the captured images obtained by imaging a sample including a target cell at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display means that displays the plurality of acquired images; and an input receiving means that receives the user's determination result as to whether or not the target cell appears in the cell candidate region.
  • The invention according to claim 8 is a computer-readable storage medium storing a program for causing a computer to function as: an acquisition means that acquires, from each of the captured images obtained by imaging a sample including a target cell at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display means that displays the plurality of acquired images; and an input receiving means that receives the user's determination result as to whether or not the target cell appears in the cell candidate region.
  • The invention according to claim 9, made to solve the above-described problem, is an image processing method comprising: imaging a sample including a target cell at each of a plurality of focal lengths; acquiring, from each of the captured images, an image of a cell candidate region that is a candidate for a region in which the target cell appears; displaying the plurality of acquired images; and receiving an input of the user's determination result as to whether or not the target cell appears in the cell candidate region.
  • According to the invention, the user can determine whether or not the object in a cell candidate region is the target cell by referring only to the images of that cell candidate region in the captured images.
  • According to the inventions of claims 2 to 5, it is easier for the user to determine whether the object in the cell candidate region is a target cell than when the respective configuration is not provided.
  • FIG. 1 is a system configuration diagram of an image processing system 1 according to the present embodiment.
  • the image processing system 1 includes an optical microscope 2, an image processing device 4, and a display device 6.
  • the image processing device 4 is connected to each of the optical microscope 2 and the display device 6 so that data communication is possible.
  • the optical microscope 2 images a sample on a slide glass placed on a sample stage with a CCD camera via an optical system such as an objective lens.
  • a sample obtained by applying maternal blood on a slide glass and performing May-Giemsa staining is used as the sample.
  • fetal nucleated red blood cells in maternal blood are stained blue-violet.
  • nucleated red blood cells are referred to as target cells.
  • the image processing device 4 is, for example, a personal computer, and acquires a captured image captured by the optical microscope 2.
  • the captured image includes images of various cells contained in the maternal blood.
  • The nuclei of target cells are stained slightly darker than the nuclei of other cells by May-Giemsa staining.
  • the image processing device 4 identifies a cell candidate region that is a candidate for a region in which the target cell is shown, and displays a list of images of each identified cell candidate region on the display device 6.
  • the display device 6 displays a list of images of each cell candidate region specified by the image processing device 4.
  • FIG. 2A is a diagram illustrating an example of a screen displayed on the display device 6. As shown in the figure, a list of images of the specified cell candidate regions is displayed on the screen.
  • A visual result input button 11 is provided at the upper right of each cell candidate region's image; the user clicks the visual result input button 11 for a cell candidate region and enters the result of his or her determination as to whether or not the target cell is shown there.
  • Each focal length is expressed as the distance of the focal position, in the downward direction, from the autofocus position.
  • The symbol “+” indicates that the focal position is below the autofocus position, and the symbol “−” indicates that the focal position is above the autofocus position.
  • the size of the nucleus image of the nucleated red blood cell that is the target cell changes according to the focal length.
  • In this image processing system 1, that labor is omitted: the optical microscope 2 is provided with a focusing mechanism, and it images the sample at each of a plurality of focal lengths.
  • buttons 7a and 7b for selecting an image display form are provided on the screen shown in FIG. 2A.
  • When the button 7b is selected, as shown in FIG. 2B, a list of images at each focal length is displayed for each cell candidate region.
  • The numerical value below each image indicates the focal length. The user can therefore determine whether or not the object in the cell candidate region is a target cell by referring to the images of the cell candidate region in each captured image, without performing the focusing operation described above.
  • FIG. 4A is a diagram for explaining the display mode (1).
  • In the display mode (1), as shown in FIG. 4A, a list of images at each focal length is displayed for the cell candidate region selected by the user. Again, the numerical value below each image indicates the focal length. Therefore, by selecting the button 10a, the user can determine whether or not the object in the cell candidate region is a target cell by referring to the image of the cell candidate region in each captured image.
  • FIG. 4B is a diagram for explaining the display mode (2).
  • In the display mode (2), as shown in FIG. 4B, an image of a peripheral region including the cell candidate region selected by the user is displayed.
  • FIG. 4C is a diagram for explaining the display mode (3).
  • the display mode (3) as shown in FIG. 4C, an image at each focal length and an image of the peripheral region are displayed for the cell candidate region selected by the user.
  • the user can switch the display mode of the detailed display among the display modes (1) to (3) by switching the button to be selected from the buttons 10a, 10b, and 10c.
  • FIG. 5 shows how the display mode is switched.
  • FIG. 6 is a functional block diagram showing a functional group realized by the image processing system 1.
  • An imaging unit 12, a sample image acquisition unit 14, a sample image storage unit 16, a cell candidate region determination unit 18, a region candidate database 20 (hereinafter, region candidate DB 20), a candidate region image acquisition unit 22, a visual confirmation information generation unit 24, a visual confirmation information storage unit 26, and a visual confirmation information display unit 28 are realized.
  • the imaging unit 12 is realized by the optical microscope 2.
  • Functions other than the imaging unit 12 are realized by the image processing device 4.
  • Functions other than the imaging unit 12 are realized by the image processing device 4, a computer that includes a control unit such as a microprocessor, a storage unit such as a memory, and an input/output unit (including an operation reception unit that receives user operations) for exchanging data with external devices, reading and executing a program stored in a computer-readable information storage medium (for example, an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory).
  • the program may be supplied to the image processing apparatus 4 as a computer via a data communication network such as the Internet.
  • The imaging unit 12 images the sample on the slide glass surface at each of a plurality of predetermined focal lengths, and outputs the plurality of captured images (hereinafter, sample images) to the sample image acquisition unit 14.
  • the specimen image is image data, and the focal distance when it is imaged is included as header information.
  • the sample image acquisition unit 14 acquires data of each of the plurality of sample images from the imaging unit 12 and stores each acquired sample image in the sample image storage unit 16.
  • The focal length is expressed as the downward distance of the focal position from the autofocus position.
  • The cell candidate region determination unit 18 determines cell candidate regions 34, which are candidates for regions in which the target cell appears, based on the sample image captured at a predetermined focal length (hereinafter, the reference sample image).
  • a significant pixel is a pixel whose pixel value (RGB value) is within a predetermined range.
  • The cell candidate region determination unit 18 determines a cell candidate region 34 based on each nucleus candidate region 32, which is formed from connected pixel blocks 30 of significant pixels.
  • FIG. 7 shows an example of the pixel block 30, the nucleus candidate region 32, and the cell candidate region 34. In FIG. 7, a cell candidate region 34 including the upper left nucleus candidate region 32 is shown.
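As an illustration only (the patent gives no code), the significant-pixel test, the pixel blocks 30, and the expansion of a nucleus candidate into a cell candidate region 34 described above could be sketched in Python as follows. The RGB bounds and the margin are invented placeholders for the "predetermined" values mentioned in the text:

```python
import numpy as np

# NOTE: the RGB bounds and the margin are invented placeholders;
# the patent only says these values are "predetermined".
LO, HI = (0, 0, 60), (110, 110, 200)

def significant_mask(rgb, lo=LO, hi=HI):
    """Mark pixels whose pixel value (RGB) lies within a predetermined range."""
    img = np.asarray(rgb)
    return np.all((img >= lo) & (img <= hi), axis=-1)

def pixel_blocks(mask):
    """Group adjacent significant pixels into pixel blocks (4-neighbour search)."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    blocks = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                stack, block = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    block.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                blocks.append(block)
    return blocks

def cell_candidate_region(block, margin=3):
    """Expand a nucleus candidate's bounding box by a margin, giving a
    cell candidate region as (x, y, width, height)."""
    ys = [p[0] for p in block]
    xs = [p[1] for p in block]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) - min(xs) + 1 + 2 * margin,
            max(ys) - min(ys) + 1 + 2 * margin)
```

Running `pixel_blocks(significant_mask(image))` on a stained sample image would yield one block per nucleus candidate region 32, each of which can then be expanded into a cell candidate region 34.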
  • the cell candidate region determination unit 18 registers the determined cell candidate region 34 in the region candidate DB 20. That is, the cell candidate region determination unit 18 stores the determined record of the cell candidate region 34 in the region candidate DB 20.
  • the record includes the area ID of the cell candidate area 34.
  • the record includes coordinate data indicating the cell candidate region 34.
  • the record also includes a numerical value (hereinafter referred to as a score) indicating the probability that the target cell is shown in the cell candidate area 34.
  • the record includes a value of a visual flag (here, “0”) indicating a user's determination result regarding whether or not the target cell is shown in the cell candidate area 34.
  • The cell candidate region determination unit 18 cuts out the image of the cell candidate region 34 from the reference sample image, and obtains, as the score of the cell candidate region 34, the output value produced when the image feature amount of the cut-out image is input to a discriminator trained in advance.
  • As the image feature amount, for example, a HOG (Histograms of Oriented Gradients) feature amount can be used. The greater the score, the higher the probability that the target cell is shown in the cell candidate region 34.
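To make the scoring step concrete, here is a hedged sketch: a much-simplified global orientation histogram stands in for the HOG feature amount (a real HOG uses per-cell histograms with block normalization), and the trained discriminator is reduced to a linear scorer whose weights are assumed to come from prior training, which is not shown:

```python
import numpy as np

def orientation_histogram(gray, bins=9):
    """Simplified stand-in for a HOG descriptor: one global histogram of
    gradient orientations, weighted by gradient magnitude, L2-normalised."""
    g = np.asarray(gray, dtype=float)
    gy, gx = np.gradient(g)                      # gradients along rows, columns
    mag = np.hypot(gx, gy)
    ang = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)  # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, 180.0), weights=mag)
    n = np.linalg.norm(hist)
    return hist / n if n > 0 else hist

def candidate_score(feature, weights, bias=0.0):
    """Linear-discriminator score: larger means the region is more likely
    to show a target cell. The weights would come from prior training."""
    return float(np.dot(weights, feature) + bias)
```

A grayscale crop of the cell candidate region 34 would be passed through `orientation_histogram`, and `candidate_score` would give the value stored in the score field.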
  • FIG. 8 is a diagram showing the stored contents of the area candidate DB 20.
  • the region candidate DB 20 includes a region ID field, a coordinate data field, a score field, and a visual result field.
  • the region ID field stores the region ID of the cell candidate region 34.
  • In the coordinate data field, coordinate data indicating the cell candidate region 34 is stored in association with the region ID of the cell candidate region 34.
  • the coordinate data includes the position coordinates of the representative point (for example, the center or the upper left vertex) of the cell candidate region 34 and the size (width W and height H) of the cell candidate region 34.
  • the coordinate data may be data expressed in pixels or data expressed in millimeters.
  • In the score field, the score of the cell candidate region 34 is stored in association with the region ID of the cell candidate region 34.
  • In the visual result field, the value of the visual flag is stored in association with the region ID of the cell candidate region 34.
  • the value “2” indicates that the user has determined that the target cell is shown in the cell candidate region 34
  • the value “1” indicates that it is unclear whether the target cell is shown in the cell candidate region 34.
  • The value “0” indicates that the user has not yet viewed the cell candidate region 34, or has determined that the target cell is not shown in it.
  • a user comment regarding the cell candidate region 34 may be stored in the visual result field.
  • The cell candidate region determination unit 18 may register all of the determined cell candidate regions 34 in the region candidate DB 20, or may register only those cell candidate regions 34 that the above-described classifier identifies as showing the target cell.
  • A cell candidate region 34 identified as showing the target cell is, for example, one whose score is equal to or higher than a threshold, or one whose score ranks within the top N (for example, N = 200).
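The record layout of the region candidate DB 20 (region ID, coordinate data, score, visual flag, optional comment) and the two registration policies just described (score threshold, or top-N by score) can be modeled as follows. The class and function names are illustrative, not from the patent:

```python
from dataclasses import dataclass

# Visual flag semantics from the text:
# 0 = not yet viewed / judged not a target cell, 1 = unclear, 2 = confirmed.
NOT_VIEWED, UNCLEAR, CONFIRMED = 0, 1, 2

@dataclass
class CandidateRecord:
    region_id: int
    x: int          # coordinate data: representative point ...
    y: int
    width: int      # ... and size of the cell candidate region 34
    height: int
    score: float    # classifier output
    visual_flag: int = NOT_VIEWED
    comment: str = ""   # optional user comment in the visual result field

def regions_to_register(records, threshold=None, top_n=None):
    """Select which cell candidate regions to register in the region
    candidate DB 20: all of them, those at or above a score threshold,
    or the top-N by score (the text gives N = 200 as an example)."""
    if threshold is not None:
        return [r for r in records if r.score >= threshold]
    if top_n is not None:
        return sorted(records, key=lambda r: r.score, reverse=True)[:top_n]
    return list(records)
```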
  • each cell candidate region 34 registered in the region candidate DB 20 will be referred to as a cell candidate region X, and the description will be continued.
  • the candidate area image acquisition unit 22 acquires an image of the cell candidate area X in each of the plurality of sample images captured by the imaging unit 12 as a candidate area image. Specifically, the candidate area image acquisition unit 22 cuts out an image of the cell candidate area X as a candidate area image from each sample image stored in the sample image storage unit 16.
  • the candidate area image is image data, and the focal distance when the specimen image including the candidate area image is captured is included as header information.
  • the visual confirmation information generation unit 24 generates visual confirmation information based on each of the plurality of candidate area images acquired by the candidate area image acquisition unit 22. Specifically, the visual confirmation information generation unit 24 generates information including a plurality of candidate region images acquired by the candidate region image acquisition unit 22 as visual confirmation information, and uses the generated visual confirmation information as a cell candidate region. The information is stored in the visual confirmation information storage unit 26 in association with the region ID X.
  • the visual confirmation information display unit 28 displays an image of the cell candidate region X in the reference sample image on the display device 6. Specifically, the visual confirmation information display unit 28 displays an image of the cell candidate region X in the reference sample image on the display device 6 when the button 7a is selected (see FIG. 2A).
  • the visual confirmation information display unit 28 displays a plurality of candidate region images acquired by the candidate region image acquisition unit 22 on the display device 6 based on the visual confirmation information. For example, when the button 7b is selected, the visual confirmation information display unit 28 displays a plurality of candidate region images included in the visual confirmation information on the display device 6 (see FIG. 2B). Further, for example, the visual confirmation information display unit 28 displays a plurality of candidate region images included in the visual confirmation information on the display device 6 when the button 10a is selected and the cell candidate region X is selected by the user. (See FIG. 4A).
  • the visual confirmation information display unit 28 displays a plurality of candidate region images included in the visual confirmation information in the reference specimen image. The image is displayed on the display device 6 together with the peripheral region image including the cell candidate region X (see FIG. 4C).
  • The visual confirmation information display unit 28 displays each candidate area image on the display device 6.
  • the visual confirmation information display unit 28 displays each candidate area image in association with the focal length included in the candidate area image (see FIGS. 2B, 4A, and 4C).
  • The visual confirmation information display unit 28 displays the candidate region images as a list, arranged (for example, in ascending or descending order of focal length) according to the focal length at which each specimen image was captured (FIG. 2B).
  • Alternatively, the visual confirmation information display unit 28 may display the candidate area images one after another, in a display order (for example, ascending or descending order of focal length) according to the focal length included in each candidate area image. FIG. 9 shows the candidate area images being displayed one at a time; the arrows indicate transitions between the displayed images.
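A minimal sketch of this ordering behavior: each candidate area image carries its focal length as header information (modeled here as a dict key), and both the list display and the one-at-a-time display simply sort on it. The "+"/"−" labels follow the convention described for FIG. 2A; the function names are illustrative:

```python
def order_by_focal_length(candidate_images, descending=False):
    """Arrange candidate area images by the focal length recorded in each
    image's header information. The focal length is a signed offset from
    the autofocus position: '+' means below it, '-' means above it."""
    return sorted(candidate_images,
                  key=lambda im: im["focal_length"],
                  reverse=descending)

def focal_label(offset):
    """Render the signed numerical label shown under each image, e.g. '+4'."""
    return f"{offset:+d}" if offset else "0"
```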
  • the visual confirmation information display unit 28 receives an input of a user's judgment result regarding whether or not the target cell is shown in the cell candidate region X.
  • When the visual result input button 11 (see FIGS. 2A and 2B) for the cell candidate region X is clicked, the visual confirmation information display unit 28 accepts the input of the user's judgment result as to whether or not the target cell is shown in the cell candidate region X.
  • the visual confirmation information display unit 28 stores information indicating the received determination result of the user in the visual result field in association with the region ID of the cell candidate region X.
  • For example, when an input of a determination result indicating that the target cell is shown in the cell candidate region X is received, the visual confirmation information display unit 28 updates the value of the visual flag, stored in the visual result field in association with the region ID of the cell candidate region X, to “2”; when an input indicating that it is unclear whether the target cell is shown is received, it updates the visual flag to “1”.
  • As described above, the visual confirmation information display unit 28 displays, on the display device 6, images of the cell candidate region 34 captured at each focal length. The user can therefore determine whether or not the object in the cell candidate region 34 is a target cell by referring to each image. In addition, since the images are displayed in an order according to the focal length, the change in the size of the nucleus is easy to grasp at a glance, which facilitates the determination.
  • FIG. 10 is a flowchart illustrating an example of processing executed by the candidate area image acquisition unit 22, the visual confirmation information generation unit 24, and the visual confirmation information display unit 28 in the image processing device 4.
  • the image processing apparatus 4 reads a plurality of sample images stored in the sample image storage unit 16.
  • the image processing apparatus 4 reads the coordinate data of each of the plurality of cell candidate regions registered in the region candidate DB 20.
  • the image processing device 4 initializes the visually confirmed region list stored in the storage unit (S102). Then, the image processing apparatus 4 executes steps S103 to S106 with each cell candidate region registered in the region candidate DB 20 as a cell candidate region (i).
  • the image processing apparatus 4 acquires the image of the cell candidate region (i) in each specimen image as the candidate region image by the candidate region image acquisition unit 22.
  • The image processing apparatus 4 generates, by the visual confirmation information generation unit 24, visual confirmation information including the plurality of acquired candidate area images, and stores it in the visual confirmation information storage unit 26.
  • The image processing device 4 reads, by the visual confirmation information display unit 28, the visual confirmation information stored in the visual confirmation information storage unit 26, and displays each of the plurality of candidate area images included in the visual confirmation information on the display device 6 (S104).
  • the image processing apparatus 4 displays a list of each candidate area image in an arrangement order corresponding to the focal length included in each candidate area image (see FIG. 4A).
  • the image processing apparatus 4 displays each candidate area image in order in a display order corresponding to the focal length included in each candidate area image (see FIG. 9).
  • the image processing apparatus 4 displays each candidate area image together with an image of a peripheral area including the cell candidate area (i) in the reference specimen image (see FIG. 4C).
  • the image processing apparatus 4 receives an input of a determination result of the user regarding whether or not the target cell is shown in the cell candidate region (i) by the visual confirmation information display unit 28. Further, based on the received input, it is determined whether or not the user has determined that the target cell is shown in the cell candidate region (i) (S105).
  • If so, the value of the visual flag stored in the visual result field in association with the region ID of the cell candidate region (i) is updated to “2”. Further, the cell candidate region (i) is registered in the visually confirmed region list (S106).
  • The image processing apparatus 4 then sets another cell candidate region as the cell candidate region (i) and executes the steps from S103 onward.
  • the image processing apparatus 4 outputs a visually confirmed region list.
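The flow of steps S102 through S106 above can be paraphrased as a simple loop. Here `images_for`, `display`, `ask_user`, and the `db` mapping are placeholders standing in for the candidate-area image acquisition, list display, input reception, and region candidate DB; they are not APIs defined in the patent:

```python
def review_candidates(candidate_regions, images_for, display, ask_user, db):
    """Sketch of the S102-S106 loop: for each registered cell candidate
    region, show its images at every focal length, record the user's
    judgment, and collect confirmed regions in a visually-confirmed list."""
    confirmed = []                       # S102: initialise the confirmed list
    for region in candidate_regions:     # loop over cell candidate region (i)
        imgs = images_for(region)        # S103: images at each focal length
        display(imgs)                    # S104: show them to the user
        if ask_user(region):             # S105: did the user see a target cell?
            db[region] = 2               # S106: visual flag -> "confirmed"
            confirmed.append(region)
    return confirmed                     # output the visually confirmed list
```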
  • In the embodiment described above, nucleated red blood cells are the target cells, but cells other than nucleated red blood cells may be the target cells; that is, the present invention can also be applied when cells other than nucleated red blood cells are the target cells.
  • FIG. 11 is a diagram for explaining the display mode (4). As shown in the figure, in the display mode (4), each image of the peripheral area at each focal length is displayed in order.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The object of the present invention is to enable a user to determine whether an object in a cell candidate region is a target cell solely by referring to an image of the cell candidate region in a captured image. An imaging unit (12) images a sample containing a target cell at a plurality of focal lengths. A candidate region image acquisition unit (22) acquires images of cell candidate regions, which are candidates for regions where the target cell may be shown, in each of the captured images. A visual confirmation information display unit (28) displays the plurality of images acquired by the candidate region image acquisition unit (22). In addition, the visual confirmation information display unit (28) receives the input of the user's determination result as to whether or not a target cell is shown in a cell candidate region.
PCT/JP2013/081361 2013-06-07 2013-11-21 Image processing system, image processing device, program, information storage medium, and image processing method WO2014196097A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201380075661.4A CN105122037A (zh) 2013-06-07 2013-11-21 Image processing system, image processing device, program, storage medium, and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-121331 2013-06-07
JP2013121331A JP2014238344A (ja) 2013-06-07 2013-06-07 Image processing system, image processing device, and program

Publications (1)

Publication Number Publication Date
WO2014196097A1 (fr) 2014-12-11

Family

ID=52007764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/081361 WO2014196097A1 (fr) 2013-06-07 2013-11-21 Image processing system, image processing device, program, information storage medium, and image processing method

Country Status (3)

Country Link
JP (1) JP2014238344A (fr)
CN (1) CN105122037A (fr)
WO (1) WO2014196097A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021039441A1 (fr) * 2019-08-23 2021-03-04 ライトタッチテクノロジー株式会社 Method, device, and program for identifying biological tissue
WO2021135393A1 (fr) * 2019-12-31 2021-07-08 深圳迈瑞生物医疗电子股份有限公司 Image analysis apparatus and imaging method therefor

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3470510A4 (fr) * 2016-06-13 2019-06-26 Sony Corporation Device, information processing device, program, and information processing method
CN116088158A (zh) * 2018-07-13 2023-05-09 深圳迈瑞生物医疗电子股份有限公司 Cell image processing system and method, automatic slide-reading device, and storage medium
CN111698423A (zh) * 2020-06-18 2020-09-22 福建捷联电子有限公司 Display method for photographs having multiple focal lengths

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004507743A (ja) * 2000-08-25 2004-03-11 アムニス コーポレイション Alternative detector configurations and operating modes for a time delay integration particle analyzer
JP2004150895A (ja) * 2002-10-29 2004-05-27 Natl Inst Of Radiological Sciences Specimen image data processing method and specimen inspection system
JP4346923B2 (ja) 2003-02-21 2009-10-21 晴夫 高林 Automatic target cell search system
JP2012073994A (ja) * 2010-09-03 2012-04-12 Sony Corp Image processing apparatus and method

Also Published As

Publication number Publication date
CN105122037A (zh) 2015-12-02
JP2014238344A (ja) 2014-12-18

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13886197

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2013886197

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE