CN105122037A - Image processing system, image processing device, program, storage medium, and image processing method - Google Patents

Image processing system, image processing device, program, storage medium, and image processing method

Info

Publication number
CN105122037A
CN105122037A (application CN201380075661.4A)
Authority
CN
China
Prior art keywords
image
cell
candidate region
target cell
focal length
Prior art date
Legal status
Pending
Application number
CN201380075661.4A
Other languages
Chinese (zh)
Inventor
织田英人
尾崎良太
加藤典司
熊泽幸夫
Current Assignee
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Priority date
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd filed Critical Fuji Xerox Co Ltd
Publication of CN105122037A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/778 Active pattern-learning, e.g. online learning of image or video features
    • G06V 10/7784 Active pattern-learning, e.g. online learning of image or video features based on feedback from supervisors
    • G06V 10/7788 Active pattern-learning, e.g. online learning of image or video features based on feedback from supervisors, the supervisor being a human, e.g. interactive learning with a human teacher
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/60 Type of objects
    • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/698 Matching; Classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30024 Cell structures in vitro; Tissue sections in vitro

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image processing system, an image processing device, a program, a storage medium, and an image processing method. The purpose of the present invention is to enable a user to determine whether an object in a cell candidate region is a target cell merely by referring to images of the cell candidate region in captured images. An imaging unit (12) captures images of a sample containing a target cell at a plurality of focal lengths. A candidate region image acquisition unit (22) acquires, from each of the captured images, images of cell candidate regions, which are candidates for regions in which the target cell may appear. A unit for displaying information for visual confirmation (28) displays the plurality of images acquired by the candidate region image acquisition unit (22), and also receives input of the user's determination result as to whether the target cell appears in a cell candidate region.

Description

Image processing system, image processing device, program, storage medium, and image processing method
Technical Field
The present invention relates to an image processing system, an image processing device, a program, a storage medium, and an image processing method.
Background Art
In prenatal diagnosis, erythroblasts (NRBCs, hereinafter referred to as target cells), which are derived from the fetus and present in extremely small numbers in blood obtained from the mother, are detected and used. Because the number of NRBCs present in the blood obtained from the mother is very small, detecting NRBCs by visual observation is laborious. Therefore, as described in PTL 1, a technique has been proposed that mechanically detects NRBCs by searching captured images for cells satisfying conditions such as the color, shape, positional relationship, and area ratio of NRBCs.
Citation List
Patent Literature
PTL 1: Japanese Patent No. 4346923
Summary of Invention
Technical Problem
An object of the present invention is to enable a user to determine whether an object in a cell candidate region is a target cell merely by referring to images of the cell candidate region in captured images.
Solution to Problem
To solve the above problem, the invention according to Claim 1 is an image processing system including: an imaging device that captures images of a sample containing a target cell at each of a plurality of focal lengths; an acquisition device that acquires, from each of the images captured by the imaging device, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display device that displays the plurality of images acquired by the acquisition device; and a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
The image processing system according to Claim 2 is the image processing system according to Claim 1, wherein the display device displays the plurality of images in a list format in an order based on the focal lengths used to capture the respective captured images.
The image processing system according to Claim 3 is the image processing system according to Claim 1, wherein the display device displays the plurality of images alternately one at a time, in a display order according to the focal lengths used to capture the respective captured images.
The image processing system according to Claim 4 is the image processing system according to any one of Claims 1 to 3, wherein the display device displays the plurality of images together with an image of a peripheral region including the cell candidate region in any one of the captured images.
The image processing system according to Claim 5 is the image processing system according to any one of Claims 1 to 4, wherein the display device displays the plurality of images together with information representing the focal length used to capture each of the captured images.
To solve the above problem, the invention according to Claim 6 is an image processing device including: an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display device that displays the plurality of images acquired by the acquisition device; and a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
To solve the above problem, the invention according to Claim 7 is a program causing a computer to function as: an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display device that displays the plurality of images acquired by the acquisition device; and a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
To solve the above problem, the invention according to Claim 8 is a computer-readable storage medium storing a program causing a computer to function as: an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears; a display device that displays the plurality of images acquired by the acquisition device; and a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
To solve the above problem, the invention according to Claim 9 is an image processing method including: capturing images of a sample containing a target cell at each of a plurality of focal lengths; acquiring, from each of the captured images, an image of a cell candidate region that is a candidate for a region in which the target cell appears; displaying the plurality of acquired images; and receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
Advantageous Effects of Invention
According to the inventions of Claims 1, 6, 7, 8, and 9, a user can determine whether an object in a cell candidate region is a target cell merely by referring to images of the cell candidate region in captured images.
According to the inventions of Claims 2 to 5, determining whether an object in a cell candidate region is a target cell becomes easier than when these configurations are not used.
Brief Description of Drawings
Fig. 1 is a diagram illustrating the system configuration of an image processing system according to the present exemplary embodiment.
Fig. 2A is a diagram illustrating a screen displayed on the display device.
Fig. 2B is a diagram illustrating a screen displayed on the display device.
Fig. 3 is a diagram illustrating examples of images of various types of cells at each focal length.
Fig. 4A is a diagram for explaining display mode (1).
Fig. 4B is a diagram for explaining display mode (2).
Fig. 4C is a diagram for explaining display mode (3).
Fig. 5 is a diagram illustrating how the display mode is switched to another mode.
Fig. 6 is a functional block diagram illustrating a group of functions realized by the image processing system.
Fig. 7 is a diagram illustrating examples of a pixel block, a nucleus candidate region, and a cell candidate region.
Fig. 8 is a diagram illustrating an example of the contents stored in the region candidate database.
Fig. 9 is a diagram illustrating how candidate region images are displayed alternately one at a time.
Fig. 10 is a flowchart illustrating an example of processing executed by the image processing device.
Fig. 11 is a diagram for explaining display mode (4) used for detailed display.
Description of Embodiments
Hereinafter, an example of an exemplary embodiment of the present invention will be described in detail with reference to the drawings.
Fig. 1 shows the system configuration of an image processing system 1 according to the present exemplary embodiment. As shown in Fig. 1, the image processing system 1 includes an optical microscope 2, an image processing device 4, and a display device 6. The image processing device 4 is connected to each of the optical microscope 2 and the display device 6 so that data communication is possible.
The optical microscope 2 uses a CCD camera to capture, via an optical system such as an objective lens, an image of a sample on a slide glass placed on a sample stage. In the present exemplary embodiment, blood obtained from the mother is subjected to May-Giemsa staining and applied to the slide glass, and the result is used as the sample. As a result, fetus-derived erythroblasts in the blood obtained from the mother are stained bluish violet. Hereinafter, the erythroblasts are referred to as target cells.
The image processing device 4 is, for example, a personal computer, and acquires captured images, which are images captured by the optical microscope 2. A captured image contains images of the various types of cells included in the blood obtained from the mother. With May-Giemsa staining, the nucleus of a target cell (erythroblast) is stained slightly darker than the nuclei of other cells.
The image processing device 4 also determines cell candidate regions, which are candidates for regions in which a target cell appears, and displays a list of images of the determined cell candidate regions on the display device 6.
The display device 6 displays the list of images of the cell candidate regions determined by the image processing device 4. Fig. 2A is a diagram illustrating an example of a screen displayed on the display device 6. As shown in Fig. 2A, the list of images of the determined cell candidate regions is displayed on the screen.
The user refers to each image displayed on the screen to carry out prenatal diagnosis. For each image of a cell candidate region, a visual observation result input button 11 is arranged at the upper right of the image. The user clicks the visual observation result input button 11 for a cell candidate region for which a determination as to whether the target cell appears has been made, and inputs the determination result.
Incidentally, there may be cases where a determination as to whether the target cell appears cannot be made merely by referring to the image. In such cases, it is conceivable to perform an operation of observing, with the optical microscope 2, how the size of the nucleus portion of the cell appearing in a particular image changes while the focal length is changed. This is because the nuclei of cells other than the target cell (lymphocytes, large white blood cells, red blood cells with attached dirt, and the like) are flat, so the images of their nucleus portions have an almost constant size regardless of the focal length, whereas the nucleus of an erythroblast, which is the target cell, is not flat, so the size of the image of its nucleus portion changes with the focal length. Fig. 3 shows examples of images of various types of cells at each focal length. The numerical value written below each image represents the focal length. Here, the focal length is expressed as the length from the autofocus point to the focus in the downward direction. The sign "+" indicates that the focus is below the autofocus point, and the sign "-" indicates that the focus is above the autofocus point. As shown in Fig. 3, the size of the image of the erythroblast, which is the target cell, changes according to the focal length.
However, performing the operation of observing the change in the size of the image of the nucleus portion with the optical microscope 2 is laborious.
In view of this point, the image processing system 1 reduces this burden. That is, in the image processing system 1, the optical microscope 2 is equipped with a focusing mechanism, and the optical microscope 2 captures images of the sample at each of a plurality of focal lengths.
In addition, buttons 7a and 7b for selecting the image display mode are arranged on the screen shown in Fig. 2A. When button 7b is selected, as shown in Fig. 2B, the images captured at each focal length are displayed in a list format for each cell candidate region. The numerical value written below each image represents the focal length. Thus, by referring to the images of a cell candidate region in the captured images, the user can determine whether the object in the cell candidate region is the target cell without performing the above-described operation.
In addition, a detailed display is performed for a cell candidate region that the user clicks and selects on the screen shown in Fig. 2A. Buttons 10a, 10b, and 10c are arranged for selecting the display mode of the detailed display. When button 10a is selected, the detailed display is performed in display mode (1). Fig. 4A is a diagram for explaining display mode (1). In display mode (1), as shown in Fig. 4A, the images captured at each focal length are displayed in a list format for the cell candidate region selected by the user. Here too, the numerical value written below each image represents the focal length. Thus, also when button 10a is selected, the user can determine whether the object in the cell candidate region is the target cell by referring to the images of the cell candidate region in the captured images.
Note that when button 10b is selected, the detailed display is performed in display mode (2). Fig. 4B is a diagram for explaining display mode (2). In display mode (2), as shown in Fig. 4B, an image of a peripheral region including the cell candidate region selected by the user is displayed. When button 10c is selected, the detailed display is performed in display mode (3). Fig. 4C is a diagram for explaining display mode (3). In display mode (3), as shown in Fig. 4C, the images captured at each focal length and the image of the peripheral region are displayed for the cell candidate region selected by the user. The user can switch the display mode of the detailed display among display modes (1) to (3) by changing the button selected among buttons 10a, 10b, and 10c. Fig. 5 shows how the display mode is switched to another mode.
Hereinafter, the technique that enables the user to determine whether the object in a cell candidate region is the target cell by referring to the images of the cell candidate region in the captured images will be described.
Fig. 6 is a functional block diagram illustrating a group of functions realized by the image processing system 1. In the image processing system 1, an imaging unit 12, a sample image acquisition unit 14, a sample image memory 16, a cell candidate region determination unit 18, a region candidate database 20 (hereinafter referred to as region candidate DB 20), a candidate region image acquisition unit 22, a visual-check information generation unit 24, a visual-check information memory 26, and a visual-check information display unit 28 are realized. The imaging unit 12 is realized by the optical microscope 2.
The functions other than the imaging unit 12 are realized by the image processing device 4. The image processing device 4 is a computer that includes a control device such as a microprocessor, a memory device such as a memory, and an input/output device that sends data to and receives data from external devices such as an operation receiving device that receives user operations; these functions are realized by the image processing device 4 reading and executing a program stored in a computer-readable information storage medium (for example, an optical disc, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory). Note that the program may also be supplied to the image processing device 4, which is a computer, via a data communication network such as the Internet.
Hereinafter, each of these functions will be described. The imaging unit 12 captures images of the sample on the slide glass at each of a predetermined plurality of focal lengths, and outputs the several captured images (hereinafter referred to as sample images) to the sample image acquisition unit 14. A sample image is image data, and includes, as header information, the focal length used to capture the sample image.
The sample image acquisition unit 14 acquires the data of each of the several sample images from the imaging unit 12, and stores each acquired sample image in the sample image memory 16. The focal length is expressed as the length from the autofocus point to the focus in the downward direction.
The cell candidate region determination unit 18 determines the above-described cell candidate regions. Specifically, the cell candidate region determination unit 18 reads one of the several sample images from the sample image memory 16 as a master sample image, and extracts, from the master sample image, nucleus candidate regions corresponding to the nuclei of target cells. More specifically, the cell candidate region determination unit 18 extracts pixel blocks of significant pixels from the master sample image and, for each pixel block, extracts the circumscribed rectangular region of the pixel block as a nucleus candidate region. Here, a significant pixel is a pixel having a pixel value (RGB value) within a preset range. The cell candidate region determination unit 18 then determines a cell candidate region for each nucleus candidate region on the basis of the nucleus candidate region. For example, for each nucleus candidate region, the cell candidate region determination unit 18 determines a rectangular region of a specified size including the nucleus candidate region as a cell candidate region. Fig. 7 shows examples of a pixel block 30, a nucleus candidate region 32, and a cell candidate region 34. In Fig. 7, the cell candidate region 34 including the nucleus candidate region 32 is shown at the upper left of the drawing.
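To make this extraction step concrete, here is a minimal Python sketch (using NumPy and scikit-image, which the patent does not prescribe). The RGB range for a "significant" pixel, the fixed cell-region size, and all function names are illustrative assumptions rather than values from the patent.

```python
import numpy as np
from skimage import measure

def extract_candidate_regions(master_image, lower=(60, 10, 110), upper=(170, 90, 200),
                              cell_size=64):
    """Extract nucleus candidate regions (circumscribed rectangles of blocks of
    significant pixels) and derive a fixed-size cell candidate region for each.

    master_image: H x W x 3 uint8 RGB array (the master sample image).
    lower/upper:  assumed RGB range defining a significant (darkly stained) pixel.
    cell_size:    assumed side length, in pixels, of a cell candidate region.
    """
    # A significant pixel is a pixel whose RGB value lies within the preset range.
    mask = np.all((master_image >= lower) & (master_image <= upper), axis=2)

    # Group significant pixels into connected pixel blocks; each block's
    # circumscribed rectangle is a nucleus candidate region.
    labels = measure.label(mask, connectivity=2)
    candidates = []
    for block in measure.regionprops(labels):
        min_r, min_c, max_r, max_c = block.bbox          # nucleus candidate region
        cy, cx = (min_r + max_r) // 2, (min_c + max_c) // 2
        # Cell candidate region: a rectangle of specified size containing the nucleus.
        top = max(cy - cell_size // 2, 0)
        left = max(cx - cell_size // 2, 0)
        candidates.append({"nucleus_bbox": block.bbox,
                           "cell_region": (left, top, cell_size, cell_size)})
    return candidates
```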
The cell candidate region determination unit 18 also registers the determined cell candidate regions 34 in the region candidate DB 20. That is, the cell candidate region determination unit 18 stores a record for each determined cell candidate region 34 in the region candidate DB 20. The record includes the region ID of the cell candidate region 34, coordinate data representing the cell candidate region 34, a numerical value (hereinafter referred to as a score) representing the probability that the target cell appears in the cell candidate region 34, and the value of a visual observation flag (here, "0"), which represents the user's determination result as to whether the target cell appears in the cell candidate region 34. Here, the cell candidate region determination unit 18 cuts the image of the cell candidate region 34 out of the master sample image, and obtains, as the score of the cell candidate region 34, the output value obtained by inputting an image feature value of the cut-out image into a discriminator that has been trained in advance. An example of the image feature value is a HOG (histogram of oriented gradients) feature value. The larger the value of the score, the higher the probability that the target cell appears in the cell candidate region 34.
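The scoring step can be pictured as follows. This is a hedged sketch that assumes scikit-image's HOG implementation and a previously trained classifier exposing decision_function (for example scikit-learn's LinearSVC); the patent only states that a trained discriminator maps an image feature value such as HOG to a score.

```python
from skimage.color import rgb2gray
from skimage.feature import hog
from skimage.transform import resize

def score_candidate(master_image, cell_region, discriminator, patch_size=(64, 64)):
    """Cut the cell candidate region out of the master sample image, compute a HOG
    feature value, and return the trained discriminator's output as the score."""
    left, top, w, h = cell_region
    patch = master_image[top:top + h, left:left + w]
    patch = resize(rgb2gray(patch), patch_size)   # normalize size before computing HOG
    feature = hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    # A larger score means a higher probability that the target cell appears here.
    return float(discriminator.decision_function([feature])[0])
```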
Fig. 8 shows the contents stored in the region candidate DB 20. As shown in Fig. 8, the region candidate DB 20 includes a region ID field, a coordinate data field, a score field, and a visual observation result field. The region ID of each cell candidate region 34 is stored in the region ID field. Coordinate data representing the cell candidate region 34 is stored in the coordinate data field in association with the region ID of the cell candidate region 34. The coordinate data includes the position coordinates of a representative point (for example, the center or the upper-left vertex) of the cell candidate region 34 and the size (width W and height H) of the cell candidate region 34. The coordinate data may be expressed in pixels or in millimeters.
The score of the cell candidate region 34 is stored in the score field in association with the region ID of the cell candidate region 34. The value of the visual observation flag is stored in the visual observation result field in association with the region ID of the cell candidate region 34. The value "2" indicates that the user has determined that the target cell appears in the cell candidate region 34. The value "1" indicates that the user has determined that it is not clear whether the target cell appears in the cell candidate region 34. The value "0" indicates that the user has not yet viewed the cell candidate region 34, or that the user has determined that the target cell does not appear in the cell candidate region 34. In addition to the visual observation flag, the user's comments on the cell candidate region 34 may also be stored in the visual observation result field.
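One way to picture a record in the region candidate DB 20 is the small data structure below; the field names are illustrative and not taken from the patent, but the flag values mirror the ones described above.

```python
from dataclasses import dataclass

@dataclass
class RegionCandidateRecord:
    region_id: int       # region ID field
    x: int               # representative point (here the upper-left vertex), in pixels
    y: int
    width: int           # size W of the cell candidate region
    height: int          # size H of the cell candidate region
    score: float         # discriminator output; larger means more likely a target cell
    visual_flag: int = 0 # 2 = target cell seen, 1 = unclear, 0 = not seen or not yet viewed
    comment: str = ""    # optional user annotation on the region
```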
Note that the cell candidate region determination unit 18 may register all of the determined cell candidate regions 34 in the region candidate DB 20. Alternatively, the cell candidate region determination unit 18 may register, in the region candidate DB 20, only those of the determined cell candidate regions 34 that the discriminator has identified as cell candidate regions 34 in which the target cell appears. Here, a cell candidate region 34 identified as one in which the target cell appears is, for example, a cell candidate region 34 whose score is greater than or equal to a threshold value, or a cell candidate region 34 whose score is among the N largest (for example, N is 200).
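The two registration policies mentioned here (score at least a threshold, or the N highest scores) might look like the following sketch; the default N of 200 comes from the example in the text, while the threshold is left to the caller.

```python
def select_for_registration(records, threshold=None, top_n=200):
    """Keep the candidates the discriminator flags: either those scoring at least
    `threshold`, or the `top_n` highest-scoring candidates (for example N = 200)."""
    if threshold is not None:
        return [r for r in records if r.score >= threshold]
    return sorted(records, key=lambda r: r.score, reverse=True)[:top_n]
```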
Hereinafter, the description is continued with each cell candidate region 34 registered in the region candidate DB 20 referred to as cell candidate region X.
The candidate region image acquisition unit 22 acquires, as candidate region images, the images of cell candidate region X in each of the several sample images captured by the imaging unit 12. Specifically, the candidate region image acquisition unit 22 cuts the image of cell candidate region X out of each sample image stored in the sample image memory 16 and uses it as a candidate region image. A candidate region image is image data, and includes, as header information, the focal length used to capture the sample image containing the candidate region image.
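A minimal sketch of this cropping step, assuming each sample image is paired with the focal length recorded in its header (the container types and names are assumptions):

```python
def acquire_candidate_region_images(sample_images, record):
    """For one registered cell candidate region X, cut its image out of every
    sample image and keep the focal length of each crop as header information.

    sample_images: list of (focal_length, ndarray) pairs, one per focal length.
    record:        a RegionCandidateRecord giving the region's coordinates.
    """
    crops = []
    for focal_length, image in sample_images:
        patch = image[record.y:record.y + record.height,
                      record.x:record.x + record.width]
        crops.append({"focal_length": focal_length, "image": patch})
    return crops
```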
The visual-check information generation unit 24 generates visual-check information from the several candidate region images acquired by the candidate region image acquisition unit 22. Specifically, the visual-check information generation unit 24 generates, as the visual-check information, information including the plurality of candidate region images acquired by the candidate region image acquisition unit 22, and stores the generated visual-check information in the visual-check information memory 26 in association with the region ID of cell candidate region X.
The visual-check information display unit 28 displays, on the display device 6, the image of cell candidate region X in the master sample image. Specifically, when button 7a is selected, the visual-check information display unit 28 displays the image of cell candidate region X in the master sample image on the display device 6 (see Fig. 2A).
The visual-check information display unit 28 also displays, on the display device 6, the several candidate region images acquired by the candidate region image acquisition unit 22, in accordance with the visual-check information. For example, when button 7b is selected, the visual-check information display unit 28 displays, on the display device 6, the several candidate region images included in the visual-check information (see Fig. 2B). Also, for example, when button 10a is selected and the user has selected a certain cell candidate region X, the visual-check information display unit 28 displays, on the display device 6, the several candidate region images included in the visual-check information (see Fig. 4A). Also, for example, when button 10c is selected and the user has selected a certain cell candidate region X, the visual-check information display unit 28 displays, on the display device 6, the several candidate region images included in the visual-check information together with an image of a peripheral region including cell candidate region X in the master sample image (see Fig. 4C).
Here, the visual-check information display unit 28 displays the candidate region images together with information representing the focal length used to capture each sample image. Specifically, the visual-check information display unit 28 displays each candidate region image in association with the focal length included in that candidate region image (see Fig. 2B, Fig. 4A, and Fig. 4C). In addition, the visual-check information display unit 28 displays the candidate region images in a list format in an order based on the arrangement of the focal lengths used to capture the respective sample images (for example, in ascending or descending order of focal length) (see Fig. 2B, Fig. 4A, and Fig. 4C). Note that the visual-check information display unit 28 may also display the candidate region images alternately one at a time, in a display order according to the focal lengths included in the candidate region images (for example, in ascending or descending order of focal length). Fig. 9 shows how the candidate region images are displayed alternately one at a time. The arrows indicate the transition to the candidate region image to be displayed next.
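Both presentation orders described here reduce to a sort on the focal length stored with each crop; a hedged sketch, reusing the crop dictionaries from the previous sketch:

```python
from itertools import cycle

def order_for_list_display(crops, descending=False):
    """List display: arrange candidate region images in ascending or descending
    order of focal length, each shown together with its focal length."""
    return sorted(crops, key=lambda c: c["focal_length"], reverse=descending)

def iterate_for_alternating_display(crops):
    """One-at-a-time display: cycle through the images in focal-length order,
    as in Fig. 9 (the caller decides how long each image stays on screen)."""
    return cycle(order_for_list_display(crops))
```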
The visual-check information display unit 28 also receives input of the user's determination result as to whether the target cell appears in a certain cell candidate region X. Here, when the visual observation result input button 11 is clicked for a certain cell candidate region X (see Fig. 2A and Fig. 2B), the visual-check information display unit 28 receives input of the user's determination result as to whether the target cell appears in cell candidate region X.
The visual-check information display unit 28 then stores the received information representing the user's determination result in the visual observation result field in association with the region ID of cell candidate region X. Here, when input of a determination result indicating that the target cell appears in cell candidate region X is received, the visual-check information display unit 28 updates the value of the visual observation flag stored in the visual observation result field in association with the region ID of cell candidate region X to "2". When input of a determination result indicating that it is not clear whether the target cell appears in cell candidate region X is received, the visual-check information display unit 28 updates the value of the visual observation flag to "1".
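The flag update on user input is a single mapping from the possible determinations to the stored values described above; the function and argument names below are illustrative.

```python
def record_visual_result(record, determination):
    """Update the visual observation flag for cell candidate region X:
    'seen' -> 2, 'unclear' -> 1, anything else -> 0."""
    record.visual_flag = {"seen": 2, "unclear": 1}.get(determination, 0)
```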
Target cell extraction is then performed in accordance with the contents stored in the region candidate DB 20.
In this way, in the image processing device 4, for each of the plurality of focal lengths, the visual-check information display unit 28 displays, on the display device 6, the image of the cell candidate region 34 obtained when the sample image was captured at that focal length. Thus, by referring to the images, the user can determine whether the object in a certain cell candidate region 34 is the target cell. In addition, because the images are displayed in an order based on the focal length, the change in the size of the nucleus portion can be grasped at a glance, which makes such a determination easier.
Fig. 10 is a flowchart illustrating an example of the processing executed in the image processing device 4 by the candidate region image acquisition unit 22, the visual-check information generation unit 24, and the visual-check information display unit 28. In S101, the image processing device 4 reads the several sample images stored in the sample image memory 16. Also in S101, the image processing device 4 reads the coordinate data of the respective cell candidate regions registered in the region candidate DB 20.
The image processing device 4 then initializes a visual check region list stored in the memory device (S102). The image processing device 4 then takes the cell candidate regions registered in the region candidate DB 20 one at a time as cell candidate region (i), and executes steps S103 to S106.
That is, in S103, the image processing device 4 uses the candidate region image acquisition unit 22 to acquire, as candidate region images, the images of cell candidate region (i) in the respective sample images. Also in S103, the image processing device 4 uses the visual-check information generation unit 24 to generate visual-check information including the several acquired candidate region images, and saves the visual-check information in the visual-check information memory 26.
Then, for example when the user selects cell candidate region (i), the image processing device 4 reads the visual-check information stored in the visual-check information memory 26, and uses the visual-check information display unit 28 to display, on the display device 6, each of the several candidate region images included in the visual-check information (S104). For example, in S104, the image processing device 4 displays the candidate region images in a list format in an order based on the focal lengths included in the respective candidate region images (see Fig. 4A). Alternatively, for example, in S104, the image processing device 4 displays the candidate region images alternately one at a time, in a display order according to the focal lengths included in the candidate region images (see Fig. 9). Alternatively, for example, in S104, the image processing device 4 displays the candidate region images together with an image of a peripheral region including cell candidate region (i) in the master sample image (see Fig. 4C).
The image processing device 4 then uses the visual-check information display unit 28 to receive input of the user's determination result as to whether the target cell appears in cell candidate region (i), and determines, in accordance with the received input, whether the user has determined that the target cell appears in cell candidate region (i) (S105).
When the user has determined that the target cell appears in cell candidate region (i) (YES in S105), the value of the visual observation flag stored in the visual observation result field in association with the region ID of cell candidate region (i) is updated to "2", and cell candidate region (i) is registered in the visual check region list (S106).
In contrast, when the user has determined that the target cell does not appear in cell candidate region (i) (NO in S105), the image processing device 4 sets another cell candidate region as cell candidate region (i) and executes S103 and the subsequent steps.
Finally, in S107, the image processing device 4 outputs the visual check region list.
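Putting the steps of Fig. 10 together, the loop over candidate regions might be sketched as below. This is driver code only: `show_images` and `ask_user` stand in for the interactive display and input steps, the helpers come from the earlier sketches, and every name is an assumption rather than the patent's implementation.

```python
def run_visual_check(sample_images, records, show_images, ask_user):
    """S101 to S107: build visual-check information for each registered cell
    candidate region, show it, collect the user's determination, and return
    the visual check region list.

    show_images(crops): displays the candidate region images (S104).
    ask_user():         returns 'seen', 'unclear', or 'not_seen' (S105).
    """
    visual_check_list = []                                  # S102: initialize the list
    for record in records:                                  # one candidate region per pass
        crops = acquire_candidate_region_images(sample_images, record)   # S103
        show_images(order_for_list_display(crops))          # S104
        determination = ask_user()                          # S105
        record_visual_result(record, determination)
        if determination == "seen":                         # S106
            visual_check_list.append(record.region_id)
    return visual_check_list                                # S107
```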
Note that exemplary embodiments of the present invention are not limited to the exemplary embodiment described above.
For example, the case where the target cell is an erythroblast has been described above; however, a cell other than an erythroblast may also be the target cell. That is, the present invention is also applicable when a cell other than an erythroblast is the target cell.
Also, for example, when button 10b is selected, the image processing device 4 may perform the detailed display not in display mode (2) (see Fig. 4B) but in display mode (4). Fig. 11 is a diagram for explaining display mode (4). As shown in Fig. 11, in display mode (4), the images of the peripheral region at each focal length are displayed alternately one at a time.
Reference Signs List
1 image processing system, 2 optical microscope, 4 image processing device, 6 display device, 7a, 7b, 10a, 10b, 10c buttons, 11 visual observation result input button, 12 imaging unit, 14 sample image acquisition unit, 16 sample image memory, 18 cell candidate region determination unit, 20 region candidate DB, 22 candidate region image acquisition unit, 24 visual-check information generation unit, 26 visual-check information memory, 28 visual-check information display unit, 30 pixel block, 32 nucleus candidate region, 34 cell candidate region.

Claims (9)

1. An image processing system comprising:
an imaging device that captures images of a sample containing a target cell at each of a plurality of focal lengths;
an acquisition device that acquires, from each of the images captured by the imaging device, an image of a cell candidate region that is a candidate for a region in which the target cell appears;
a display device that displays the plurality of images acquired by the acquisition device; and
a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
2. The image processing system according to claim 1, wherein
the display device displays the plurality of images in a list format in an order based on the focal lengths used to capture the respective captured images.
3. The image processing system according to claim 1, wherein
the display device displays the plurality of images alternately one at a time, in a display order according to the focal lengths used to capture the respective captured images.
4. The image processing system according to any one of claims 1 to 3, wherein
the display device displays the plurality of images together with an image of a peripheral region including the cell candidate region in any one of the captured images.
5. The image processing system according to any one of claims 1 to 4, wherein
the display device displays the plurality of images together with information representing the focal length used to capture each of the captured images.
6. An image processing device comprising:
an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears;
a display device that displays the plurality of images acquired by the acquisition device; and
a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
7. A program causing a computer to function as:
an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears;
a display device that displays the plurality of images acquired by the acquisition device; and
a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
8. A computer-readable storage medium storing a program causing a computer to function as:
an acquisition device that acquires, from each of captured images of a sample containing a target cell, the captured images being images captured by an imaging device at each of a plurality of focal lengths, an image of a cell candidate region that is a candidate for a region in which the target cell appears;
a display device that displays the plurality of images acquired by the acquisition device; and
a device for receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
9. An image processing method comprising:
capturing images of a sample containing a target cell at each of a plurality of focal lengths;
acquiring, from each of the captured images, an image of a cell candidate region that is a candidate for a region in which the target cell appears;
displaying the plurality of acquired images; and
receiving input of a user's determination result as to whether the target cell appears in the cell candidate region.
CN201380075661.4A 2013-06-07 2013-11-21 Image processing system, image processing device, program, storage medium, and image processing method Pending CN105122037A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-121331 2013-06-07
JP2013121331A JP2014238344A (en) 2013-06-07 2013-06-07 Image processing system, image processor and program
PCT/JP2013/081361 WO2014196097A1 (en) 2013-06-07 2013-11-21 Image processing system, image processing device, program, storage medium, and image processing method

Publications (1)

Publication Number Publication Date
CN105122037A 2015-12-02

Family

ID=52007764

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380075661.4A Pending CN105122037A (en) 2013-06-07 2013-11-21 Image processing system, image processing device, program, storage medium, and image processing method

Country Status (3)

Country Link
JP (1) JP2014238344A (en)
CN (1) CN105122037A (en)
WO (1) WO2014196097A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109312286A (en) * 2016-06-13 2019-02-05 索尼公司 Device, information processing unit, program and information processing method
WO2020010634A1 (en) * 2018-07-13 2020-01-16 深圳迈瑞生物医疗电子股份有限公司 Cell image processing system and method, automatic smear reading device, and storage medium
CN111698423A (en) * 2020-06-18 2020-09-22 福建捷联电子有限公司 Method for displaying photos with various focal lengths

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7311142B2 (en) * 2019-08-23 2023-07-19 ライトタッチテクノロジー株式会社 Biological tissue identification device and biological tissue identification program
WO2021135393A1 (en) * 2019-12-31 2021-07-08 深圳迈瑞生物医疗电子股份有限公司 Image analysis apparatus and imaging method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6583865B2 (en) * 2000-08-25 2003-06-24 Amnis Corporation Alternative detector configuration and mode of operation of a time delay integration particle analyzer
JP4046161B2 (en) * 2002-10-29 2008-02-13 独立行政法人放射線医学総合研究所 Sample image data processing method and sample inspection system
JP4346923B2 (en) 2003-02-21 2009-10-21 晴夫 高林 Target cell automatic search system
JP5703781B2 (en) * 2010-09-03 2015-04-22 ソニー株式会社 Image processing apparatus and method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109312286A (en) * 2016-06-13 2019-02-05 索尼公司 Device, information processing unit, program and information processing method
WO2020010634A1 (en) * 2018-07-13 2020-01-16 深圳迈瑞生物医疗电子股份有限公司 Cell image processing system and method, automatic smear reading device, and storage medium
CN111656247A (en) * 2018-07-13 2020-09-11 深圳迈瑞生物医疗电子股份有限公司 Cell image processing system, cell image processing method, automatic film reading device and storage medium
CN111656247B (en) * 2018-07-13 2023-03-10 深圳迈瑞生物医疗电子股份有限公司 Cell image processing system, cell image processing method, automatic film reading device and storage medium
CN111698423A (en) * 2020-06-18 2020-09-22 福建捷联电子有限公司 Method for displaying photos with various focal lengths

Also Published As

Publication number Publication date
JP2014238344A (en) 2014-12-18
WO2014196097A1 (en) 2014-12-11

Similar Documents

Publication Publication Date Title
AU2020200835B2 (en) System and method for reviewing and analyzing cytological specimens
JP4558047B2 (en) Microscope system, image generation method, and program
US7720272B2 (en) Automated microscopic sperm identification
CN105122037A (en) Image processing system, image processing device, program, storage medium, and image processing method
JP2006049964A (en) Image display system, image providing apparatus, image display apparatus and computer program
CN105637343A (en) Detection control device, program, detection system, storage medium and detection control method
CN111932542B (en) Image identification method and device based on multiple focal lengths and storage medium
US20040254738A1 (en) Method and system for organizing multiple objects of interest in field of interest
JP4864709B2 (en) A system for determining the staining quality of slides using a scatter plot distribution
US20140029813A1 (en) Method and system to digitize pathology specimens in a stepwise fashion for review
JP4897488B2 (en) A system for classifying slides using a scatter plot distribution
JP5530126B2 (en) Three-dimensional cell image analysis system and three-dimensional cell image analyzer used therefor
JP4578135B2 (en) Specimen image display method and specimen image display program
CN111462005A (en) Method, apparatus, computer device and storage medium for processing microscopic image
JP2014157158A (en) Cell observation method, three-dimensional cell image analysis system, and three-dimensional cell image analyzer used therefor
JP4785347B2 (en) Specimen image display method and specimen image display program
JP5048038B2 (en) Blood cell classification result display method and blood cell classification result display program
CN105637344A (en) Image processing device, program, storage medium, and image processing method
CN116012693A (en) Intelligent diagnosis method and device, infrared thermal imaging equipment and medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication