WO2019044416A1 - Imaging processing device, control method for imaging processing device, and imaging processing program - Google Patents

Imaging processing device, control method for imaging processing device, and imaging processing program

Info

Publication number
WO2019044416A1
WO2019044416A1 (application PCT/JP2018/029509, JP2018029509W)
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
captured image
observation
area
Prior art date
Application number
PCT/JP2018/029509
Other languages
English (en)
Japanese (ja)
Inventor
涌井 隆史 (Takashi Wakui)
Original Assignee
富士フイルム株式会社 (FUJIFILM Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 (FUJIFILM Corporation)
Publication of WO2019044416A1


Classifications

    • C: CHEMISTRY; METALLURGY
    • C12: BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M: APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00: Apparatus for enzymology or microbiology
    • C12M1/34: Measuring or testing with condition measuring or sensing means, e.g. colony counters
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing

Definitions

  • The present invention relates to an imaging processing apparatus that observes an image of an entire observation target by moving, relative to each other, a stage on which a container containing the observation target is installed and an imaging optical system that forms an image of the observation target, and also relates to a control method for the imaging processing apparatus and to an imaging processing program.
  • Pluripotent stem cells such as ES (embryonic stem) cells and iPS (induced pluripotent stem) cells can differentiate into cells of various tissues, and they have attracted attention for potential applications in regenerative medicine, drug development, and the elucidation of diseases.
  • Pluripotent stem cells such as ES cells and iPS cells, or cells induced to differentiate from them, are imaged with a microscope or the like, and features of the images are captured to evaluate the differentiation state of the cells.
  • A method has been proposed in which each observation area in a well is scanned by moving the stage, on which a well plate or the like is installed, relative to the imaging optical system; each observation area is photographed, and the captured images of the observation areas are then joined to generate a composite image (so-called tiling imaging).
  • However, the focus position may not be optimal in every observation area, an error may occur in the autofocus control, and the captured images of some observation areas may be blurred.
  • In addition, the amount of illumination light may change due to fluctuations in the voltage applied to the light source in the microscope apparatus, so that a captured image becomes dark.
  • Images of individual cells cannot be extracted with high accuracy from low-quality captured images such as blurred or dark images. For this reason, if evaluation is performed using, for example, a feature value indicating the state of each cell, the accuracy of the evaluation result may be low, and its reliability may also be low. That is, accurate evaluation results may not be obtained when low-quality captured images are evaluated in the same manner as captured images that are not of low quality, that is, images with no blurring and no brightness problems.
  • For tiling imaging, a method has been proposed in which, when the quality of the captured image of a certain observation area is low, that observation area is rephotographed (see Patent Document 1). By rephotographing as in the method described in Patent Document 1, a captured image that is not of low quality can be obtained, and thus an accurate evaluation result can be obtained.
  • In the method described in Patent Document 1, whether a captured image is of low quality is determined as follows. An evaluation value for focus and an evaluation value for brightness are calculated for the target captured image and for the captured image adjacent to it, and the continuity of the evaluation values between the two images is examined. Specifically, when the difference between the evaluation value of the target captured image and that of the adjacent captured image is larger than a threshold value, the target captured image is determined to be of low quality.
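  • The following is a minimal sketch of such a continuity test. Patent Document 1 does not specify the evaluation values, so the Laplacian-variance focus measure, the mean-brightness measure, and both thresholds here are illustrative assumptions.

```python
import numpy as np

def focus_score(img: np.ndarray) -> float:
    # Variance of a discrete Laplacian as a focus measure (an assumed
    # evaluation value; a blurred image scores lower).
    f = img.astype(np.float64)
    lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
           np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)
    return float(lap.var())

def is_discontinuous(target: np.ndarray, adjacent: np.ndarray,
                     focus_th: float = 50.0, bright_th: float = 20.0) -> bool:
    # The target is flagged as low quality when either evaluation value
    # differs from the adjacent image's value by more than a threshold.
    if abs(focus_score(target) - focus_score(adjacent)) > focus_th:
        return True
    return abs(float(target.mean()) - float(adjacent.mean())) > bright_th
```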
  • In a container such as the well described above, cells are cultured in a culture medium, which is a culture solution; regions where cells are present and regions of culture medium therefore coexist in the container. Since there are no cells in a culture medium region, it has a uniform density as an image. For this reason, when culture medium is included in an observation area, the evaluation values can differ between adjacent captured images even though neither image is of low quality, and a determination based on their continuity may be inaccurate.
  • The container may also contain cells that are not observation targets, such as floating cells. When floating cells and the like are included, the focal position may be set on the floating cells rather than on the target cells.
  • The present invention has been made in view of the above circumstances, and its object is to enable accurate and highly reliable evaluation even when the quality of the captured images of the observation areas in a container is low.
  • An imaging processing apparatus according to the present invention comprises: an observation device that includes an imaging unit for imaging an observation target, contained in a container, for each observation region smaller than the container, and that acquires a plurality of captured images by photographing the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation region;
  • an area detection unit that detects the region of the observation target from each captured image;
  • a captured image determination unit that determines whether a target captured image among the plurality of captured images is of low quality, based on the region of the observation target detected in the target captured image and the region of the observation target detected in at least one other captured image close to the target captured image; and
  • a processing control unit that controls at least one of photographing for the target captured image and image processing on the target captured image when the target captured image is determined to be of low quality.
  • Here, the “at least one other captured image in proximity” means a captured image within a predetermined range based on the target captured image.
  • For example, one or more captured images adjacent to the target captured image can be set as the “at least one other captured image in proximity”.
  • In addition, as long as they fall within the predetermined range, one or more captured images further adjacent to those adjacent captured images can also be included in the “at least one other captured image in proximity”.
  • “Low quality” means that the quality is low compared with other captured images; specifically, the target captured image is blurred or dark compared with the other captured images, so that it cannot be evaluated accurately.
  • In the imaging processing apparatus according to the present invention, the processing control unit may rephotograph the observation region corresponding to the target captured image when the target captured image is determined to be of low quality.
  • The processing control unit may also issue a notification that the target captured image is of low quality when it is so determined.
  • In this case, the notification is included in the control of photographing.
  • Further, the processing control unit may perform at least one of brightness correction processing and sharpness enhancement processing on the target captured image when the target captured image is determined to be of low quality.
  • In the imaging processing apparatus according to the present invention, the captured image determination unit may determine whether the target captured image is of low quality based on the similarity between the detected regions of the observation target.
  • The captured image determination unit may also make this determination based on partial areas within the regions of the observation target detected in the target captured image and in the other captured images.
  • In the imaging processing apparatus according to the present invention, the observation device may acquire the plurality of captured images with parts of the observation areas overlapping one another.
  • In this case, the captured image determination unit may determine whether the target captured image is of low quality based on the overlapping areas of the target captured image and the other captured image.
  • Further, when the target captured image is determined to be of low quality, the processing control unit may replace the overlapping region in the target captured image with the overlapping region in the other captured image.
  • A control method of an imaging processing apparatus according to the present invention is a method for controlling an apparatus provided with an observation device that includes a container containing an observation target and an imaging unit for imaging the observation target for each observation region smaller than the container, and that acquires a plurality of captured images by photographing the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation region. The method comprises: detecting the region of the observation target from each captured image; determining whether a target captured image among the plurality of captured images is of low quality based on the region of the observation target detected in the target captured image and the region of the observation target detected in at least one other captured image close to the target captured image; and, when the target captured image is determined to be of low quality, controlling at least one of photographing for the target captured image and image processing on the target captured image.
  • An imaging processing program according to the present invention causes a computer to execute the control method of an imaging processing apparatus provided with the observation device described above. The program causes the computer to execute: a procedure for detecting the region of the observation target from each captured image; a procedure for determining whether a target captured image among the plurality of captured images is of low quality based on the region of the observation target detected in the target captured image and the region of the observation target detected in at least one other captured image close to the target captured image; and a procedure for controlling, when the target captured image is determined to be of low quality, at least one of photographing for the target captured image and image processing on the target captured image.
  • According to the present invention, the region of the observation target is detected from the plurality of captured images, and whether the target captured image is of low quality is determined based on the region of the observation target detected in the target captured image and the region of the observation target detected in at least one other captured image close to it. It can therefore be determined accurately whether the target captured image is of low quality, without the determination being affected by areas other than the observation target in the captured images. Further, when the target captured image is determined to be of low quality, at least one of photographing for the target captured image and image processing on the target captured image is controlled; a highly reliable evaluation of the observation target can thus be performed by generating the composite image using the target captured image for which photographing or image processing was controlled.
  • A block diagram showing the schematic configuration of a microscope observation system using an embodiment of the imaging processing apparatus of the present invention
  • A diagram showing the scanning locus of the observation areas in the well plate
  • A diagram showing detection results of the cell region
  • Diagrams showing examples of captured images of the observation areas in a well
  • Diagrams showing examples of adjacent target captured images and other captured images
  • A flowchart showing the processing performed in the present embodiment
  • A diagram for explaining the overlapping of parts of the observation areas
  • A diagram for explaining the setting of overlapping regions as the small regions of the target captured image and another captured image
  • A diagram for explaining the replacement of a small region in the target captured image with a small region in another captured image
  • A diagram for explaining other captured images in proximity to the target captured image
  • A flowchart showing the processing performed in another embodiment of the present invention
  • FIG. 1 is a block diagram showing a schematic configuration of a microscope observation system of the present embodiment.
  • The microscope observation system of the present embodiment includes a microscope apparatus 10, an imaging processing apparatus 20, a display device 30, and an input device 40.
  • The microscope apparatus 10 corresponds to the observation device of the present invention.
  • The microscope apparatus 10 photographs the cells contained in the culture vessel and outputs the captured images.
  • As the microscope apparatus 10, a phase contrast microscope apparatus including an imaging element such as a CCD (charge-coupled device) image sensor or a CMOS (complementary metal-oxide semiconductor) image sensor is used.
  • As the imaging element, one provided with RGB (red, green, blue) color filters may be used, or a monochrome imaging element may be used.
  • The phase difference image of the cells contained in the culture vessel is formed on the imaging element, and the imaging element outputs the phase difference image as a captured image.
  • As the microscope apparatus 10, not only a phase contrast microscope apparatus but also another microscope apparatus, such as a differential interference contrast microscope apparatus or a bright-field microscope apparatus, may be used.
  • The imaging target may be a cell colony in which a plurality of cells are aggregated, or a plurality of cells distributed in a dispersed state.
  • Cells to be imaged include, for example, pluripotent stem cells such as iPS cells and ES cells; nerve, skin, myocardial, and liver cells differentiated and induced from stem cells; cells of organs taken from the human body; and cancer cells.
  • In the present embodiment, a well plate having a plurality of wells is used as the culture vessel.
  • Each well corresponds to the container of the present invention.
  • The microscope apparatus 10 is equipped with a stage on which the well plate is installed. The stage moves in the X and Y directions, which are orthogonal to each other in a horizontal plane. Through this movement of the stage, each observation area in each well of the well plate is scanned, and a captured image is acquired for each observation area. The captured image of each observation area is output to the imaging processing apparatus 20.
  • FIG. 2 shows, by a solid line Sc, the scanning locus of the observation areas when a well plate 50 having six wells 51 is used. As shown in FIG. 2, the observation areas in the well plate 50 are scanned along the solid line Sc, from the scanning start point S to the scanning end point E, by movement of the stage in the X and Y directions.
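  • A minimal sketch of generating such a serpentine visiting order over a grid of observation areas; the grid dimensions are illustrative, as the patent only shows the locus of FIG. 2:

```python
def serpentine_scan_order(rows: int, cols: int):
    """Yield (row, col) indices of observation areas along a serpentine
    path, reversing direction on every other row like the solid line Sc."""
    for r in range(rows):
        cols_in_row = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cols_in_row:
            yield (r, c)

# Example: a 3 x 4 grid is visited left-to-right, then right-to-left, and so on.
print(list(serpentine_scan_order(3, 4)))
```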
  • In each observation area, autofocus control is performed by moving, in the vertical direction, the stage or the imaging optical system that forms the phase difference image of the cells on the imaging element.
  • In the present embodiment, the captured image of each observation area in the well is acquired by moving the stage, but the invention is not limited to this; the observation areas may instead be imaged by moving the imaging optical system with respect to the stage.
  • In the present embodiment, a well plate is used, but the container for containing the cells is not limited to this; other containers, such as petri dishes or dishes, may also be used.
  • The imaging processing apparatus 20 includes an area detection unit 21, a captured image determination unit 22, a processing control unit 23, and a display control unit 24.
  • The imaging processing apparatus 20 is constituted by a computer provided with a central processing unit, a semiconductor memory, a hard disk, and the like, and an embodiment of the imaging processing program of the present invention is installed on the hard disk. When the central processing unit executes this imaging processing program, the area detection unit 21, the captured image determination unit 22, the processing control unit 23, and the display control unit 24 shown in FIG. 1 function.
  • In the present embodiment, the functions of the respective units are executed by the imaging processing program, but the invention is not limited to this. For example, the functions of the respective units may be executed by appropriately combining a plurality of integrated circuits (ICs), processors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), memories, and the like.
  • The area detection unit 21 detects the region of the observation target, that is, the cell region, from each captured image acquired by the microscope apparatus 10.
  • The area detection unit 21 includes, for example, a discriminator that determines whether each pixel of a captured image represents a cell or culture medium, and detects, as the cell region, the set of pixels that the discriminator determines to belong to cells.
  • The discriminator is trained by machine learning, using captured images of cell regions and captured images of culture medium regions as teacher data, to output a determination of whether an input image shows a cell region or a culture medium region.
  • A well-known machine learning method can be used; for example, a support vector machine (SVM), a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), or a denoising stacked autoencoder (DSA) can be used, as listed below; a sketch of the detection step follows the list.
  • SVM: support vector machine
  • DNN: deep neural network
  • CNN: convolutional neural network
  • RNN: recurrent neural network
  • DSA: denoising stacked autoencoder
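  • A minimal sketch of the detection step, assuming a pre-trained scikit-learn SVM that labels small patches as cell (1) or medium (0); the patch size, the hand-crafted features, and the use of scikit-learn are illustrative assumptions, not specified by the patent:

```python
import numpy as np
from sklearn.svm import SVC

PATCH = 16  # illustrative patch size in pixels

def patch_features(patch: np.ndarray) -> np.ndarray:
    # Simple assumed features: mean, standard deviation, and mean gradient
    # magnitude (the medium is nearly uniform, so all three stay low there).
    gy, gx = np.gradient(patch.astype(np.float64))
    return np.array([patch.mean(), patch.std(), np.hypot(gx, gy).mean()])

def detect_cell_region(image: np.ndarray, clf: SVC) -> np.ndarray:
    """Return a boolean mask marking the patches that the trained
    discriminator labels as cell rather than culture medium."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h - PATCH + 1, PATCH):
        for x in range(0, w - PATCH + 1, PATCH):
            feat = patch_features(image[y:y + PATCH, x:x + PATCH])
            if clf.predict(feat.reshape(1, -1))[0] == 1:
                mask[y:y + PATCH, x:x + PATCH] = True
    return mask
```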
  • FIG. 3 is a diagram showing the detection result of the cell region.
  • In FIG. 3, each region divided by the rectangles corresponds to one observation area.
  • The cell regions detected in the captured image of each observation area are hatched.
  • In the present embodiment, the cell region in each captured image is detected using the area detection unit 21 provided with the machine-learned discriminator as described above.
  • However, the detection method is not limited to this.
  • For example, edges may be detected from the captured image and the cell region determined based on the amount of edges, or the determination may be made by analyzing the range between the minimum and maximum pixel values.
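  • A minimal sketch of the edge-amount alternative: a tile is treated as cell region when it contains enough strong-gradient pixels, since the culture medium is nearly uniform in density. The tile size and both thresholds are illustrative assumptions.

```python
import numpy as np

def edge_amount_mask(image: np.ndarray, tile: int = 32,
                     edge_th: float = 10.0, frac_th: float = 0.05) -> np.ndarray:
    """Mark tiles whose fraction of strong-gradient pixels exceeds frac_th."""
    img = image.astype(np.float64)
    gy, gx = np.gradient(img)
    edges = np.hypot(gx, gy) > edge_th  # per-pixel "is an edge" test
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            if edges[y:y + tile, x:x + tile].mean() > frac_th:
                mask[y:y + tile, x:x + tile] = True
    return mask
```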
  • The captured image determination unit 22 determines whether the target captured image, that is, the captured image under determination among the plurality of captured images, is of low quality. Specifically, it determines this based on the cell region detected in the target captured image and the cell region detected in at least one other captured image adjacent to the target captured image.
  • In the present embodiment, each observation area in the well is scanned, and autofocus control is performed in each observation area.
  • However, the focus position may not be optimal in every observation area, an error may occur in the autofocus control, and the captured images of some observation areas may be blurred and of low quality. If such a blurred captured image is evaluated in the same manner as other, unblurred captured images, an accurate evaluation result may not be obtained.
  • FIG. 4 shows an example of a composite image obtained by combining the captured images of the observation areas in a well. Each region divided by the rectangles in FIG. 4 corresponds to one observation area. In the example illustrated in FIG. 4, the captured image of the observation area indicated by the dotted square is blurred.
  • An origin is set at the upper-left corner of the composite image, with the rightward direction in the drawing as the X direction and the downward direction as the Y direction, so each captured image can be represented in a two-dimensional coordinate system. The coordinates of the captured image of the observation area indicated by the dotted square in FIG. 4 can thus be expressed as (4, 3).
  • An autofocus control error is not the only cause of a low-quality captured image.
  • The amount of illumination light may change due to fluctuations in the voltage applied to the light source in the microscope apparatus 10, or due to vibration of the stage and the optical system, and the captured image may become dark. If such a dark captured image is evaluated in the same manner as other captured images acquired with a normal amount of light, an accurate evaluation result may not be obtained.
  • FIG. 5 shows another example of a composite image obtained by combining the captured images of the observation areas in a well; each region divided by the rectangles corresponds to one observation area. In the example shown in FIG. 5, the captured image of the observation area indicated by the dotted square is of low quality due to a fluctuation in the amount of illumination light.
  • The coordinate system is set as in FIG. 4, so the coordinates of the captured image of the observation area indicated by the dotted square in FIG. 5 can be expressed as (4, 4).
  • The captured image determination unit 22 determines whether the captured image of the target observation area is of low quality. For this purpose, the captured image determination unit 22 sets at least one other captured image adjacent to the target captured image. In the present embodiment, the one captured image on the downstream side of the target captured image in the scanning direction shown in FIG. 2 is set as the other captured image adjacent to the target captured image. For example, when the coordinates of the target captured image are (4, 3), the coordinates of the other captured image are (5, 3); when the coordinates of the target captured image are (5, 3), the coordinates of the other captured image are (5, 4).
  • The captured image determination unit 22 determines whether the target captured image is of low quality based on adjacent partial cell regions in the target captured image and the other captured image.
  • FIG. 6 shows an example of a target captured image and an adjacent other captured image. In FIG. 6, it is assumed that the entire areas of the target captured image G0 and the other captured image G1 are cell regions. As shown in FIG. 6, the captured image determination unit 22 sets small regions A0 and A1 adjacent to each other in the target captured image G0 and the other captured image G1. In the composite image shown in FIG. 4, the coordinates of the target captured image G0 are (3, 2), and the coordinates of the other captured image G1 are (4, 2).
  • The entire area of the target captured image G0 or the other captured image G1 may not be a cell region.
  • In that case, the captured image determination unit 22 sets the small regions A0 and A1 so that they include only the cell region in the target captured image G0 and the other captured image G1.
  • The captured image determination unit 22 then determines the similarity between the small regions A0 and A1.
  • For this purpose, the captured image determination unit 22 calculates an index value for determining the similarity.
  • As the index value, the mean square error of the pixel values at corresponding pixel positions when the small regions A0 and A1 are overlaid is used.
  • The index value is not limited to the mean square error; the difference between the average luminance of the small region A0 and that of the small region A1, or the difference between the range from the minimum to the maximum pixel value of the small region A0 and that of the small region A1, can also be used.
  • The captured image determination unit 22 determines whether the index value is larger than a predetermined threshold value Th1; when the index value is determined to be larger than Th1, the target captured image G0 is determined to be of low quality.
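  • A minimal sketch of this determination, assuming the small regions A0 and A1 have already been cropped to the same shape; the threshold value is illustrative:

```python
import numpy as np

def mean_square_error(a0: np.ndarray, a1: np.ndarray) -> float:
    """Mean square error of the pixel values at corresponding positions
    when the small regions A0 and A1 (same shape) are overlaid."""
    d = a0.astype(np.float64) - a1.astype(np.float64)
    return float((d * d).mean())

def is_target_low_quality(a0: np.ndarray, a1: np.ndarray,
                          th1: float = 100.0) -> bool:
    # The target captured image G0 is judged to be of low quality when
    # the index value exceeds the threshold Th1.
    return mean_square_error(a0, a1) > th1
```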
  • When the target captured image G0 is determined to be of low quality, the captured image determination unit 22 calculates the maximum value of the first derivative in the small region A0 of the target captured image G0 and the maximum value of the first derivative in the small region A1 of the other captured image G1.
  • The captured image determination unit 22 then calculates the absolute value of the difference between these two maximum first-derivative values.
  • When the difference value is larger than a predetermined threshold value Th2, the target captured image G0 is determined to be a blurred captured image.
  • When the difference value is equal to or smaller than the threshold value Th2, the target captured image G0 is determined to be a dark captured image.
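  • A sketch of distinguishing blur from darkness along these lines; the gradient operator and the value of Th2 are illustrative assumptions. A large gap between the maximum first-derivative values indicates lost sharpness, and otherwise the low quality is attributed to brightness.

```python
import numpy as np

def max_first_derivative(region: np.ndarray) -> float:
    # Maximum gradient magnitude over the small region.
    gy, gx = np.gradient(region.astype(np.float64))
    return float(np.hypot(gx, gy).max())

def classify_defect(a0: np.ndarray, a1: np.ndarray, th2: float = 5.0) -> str:
    """Given that G0 was already judged low quality, return 'blurred' when
    the sharpness gap between A0 and A1 exceeds Th2, otherwise 'dark'."""
    gap = abs(max_first_derivative(a0) - max_first_derivative(a1))
    return "blurred" if gap > th2 else "dark"
```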
  • When the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of photographing and image processing for the target captured image G0.
  • The control of photographing and the control of image processing will now be described.
  • As control of photographing, the processing control unit 23 rephotographs the observation region of the target captured image G0 determined to be of low quality. For example, when the coordinates of the target captured image G0 determined to be of low quality are (4, 3) in FIG. 4, the processing control unit 23 instructs the microscope apparatus 10 to rephotograph the observation region corresponding to the coordinates (4, 3). At this time, if the target captured image G0 was determined to be a dark captured image, the microscope apparatus 10 is instructed to photograph with appropriate brightness; if the target captured image G0 was determined to be a blurred captured image, the microscope apparatus 10 is instructed to focus appropriately on the observation region from which the target captured image G0 was acquired.
  • As another form of control of photographing, the processing control unit 23 may issue a notification that the target captured image G0 is of low quality when it is so determined.
  • In that case, the display control unit 24, which displays on the display device 30 a composite image obtained by joining and combining the plurality of captured images, may display the captured image determined to be of low quality in an emphasized manner within the composite image.
  • For example, in the composite image illustrated in FIG. 4, a frame in a color such as red may be added to the area surrounded by the dotted line, the frame may blink, or the area itself may blink. Notification may also be made by text or voice.
  • As control of image processing, the processing control unit 23 performs brightness correction processing on a target captured image G0 determined to be a dark captured image. Specifically, the pixel values of the target captured image G0 are corrected to increase its brightness so that the average pixel value of its small region A0 matches the average pixel value of the small region A1 of the other captured image G1.
  • The processing control unit 23 also performs sharpness enhancement processing on a target captured image G0 determined to be a blurred captured image. Specifically, the degree of sharpness enhancement is changed according to the magnitude of the absolute difference between the maximum first-derivative value of the small region A0 of the target captured image G0 and that of the small region A1 of the other captured image G1.
  • For this purpose, a table associating difference values with degrees of sharpness enhancement is stored on the hard disk of the imaging processing apparatus 20. The processing control unit 23 refers to this table to acquire the degree of sharpness enhancement corresponding to the difference value, and performs the sharpness enhancement processing on the target captured image G0.
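  • A minimal sketch of both corrections. The table contents, the unsharp-masking formulation, and the blur kernel are illustrative assumptions; the patent specifies only that the enhancement degree is looked up from a stored table.

```python
import numpy as np

# Illustrative stand-in for the stored table: upper bounds on the
# first-derivative difference mapped to unsharp-mask gains.
SHARPNESS_TABLE = [(2.0, 0.5), (5.0, 1.0), (10.0, 1.5), (float("inf"), 2.0)]

def correct_brightness(g0: np.ndarray, a0: np.ndarray, a1: np.ndarray) -> np.ndarray:
    # Shift G0 so that the mean pixel value of its small region A0
    # matches the mean of the small region A1 of the other image G1.
    return g0.astype(np.float64) + (a1.mean() - a0.mean())

def enhance_sharpness(g0: np.ndarray, derivative_gap: float) -> np.ndarray:
    # Look up the enhancement degree for the observed gap, then apply a
    # crude unsharp mask: add back the difference from a local average.
    gain = next(g for bound, g in SHARPNESS_TABLE if derivative_gap <= bound)
    img = g0.astype(np.float64)
    blurred = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5.0
    return img + gain * (img - blurred)
```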
  • The display control unit 24 generates a composite image by joining the plurality of captured images, and causes the display device 30 to display the generated composite image. When there is a captured image determined to be of low quality, a notification to that effect, the position of the low-quality captured image in the composite image, and the like are also displayed on the display device 30.
  • The display device 30 displays the composite image and the notifications described above, and includes, for example, a liquid crystal display.
  • The display device 30 may be configured as a touch panel and also used as the input device 40.
  • The input device 40 includes a mouse, a keyboard, and the like, and receives various setting inputs from the user.
  • Next, the processing performed in the present embodiment will be described. First, a well plate containing cells and culture medium is placed on the stage of the microscope apparatus 10 (step ST10). Then, by moving the stage in the X and Y directions, the observation areas in each well of the well plate are scanned, and captured images of all the observation areas are acquired (step ST12). The captured images acquired by the microscope apparatus 10 are sequentially input to the area detection unit 21 of the imaging processing apparatus 20, and the area detection unit 21 detects the cell region from each captured image (step ST14).
  • Next, the captured image determination unit 22 determines whether the target captured image G0 among the plurality of captured images is of low quality (step ST16). If the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of photographing for the target captured image G0 and image processing on the target captured image G0 (processing control: step ST18). Specifically, it performs rephotographing of the observation area corresponding to the target captured image G0, notification that the target captured image G0 is a low-quality captured image, brightness correction processing on the target captured image G0, or sharpness enhancement processing on the target captured image G0. If the target captured image G0 is not of low quality, the process proceeds to step ST20.
  • Steps ST16 and ST18 are repeated until the determination has been completed for all observation areas (step ST20: NO).
  • When the determination has been completed for all observation areas (step ST20: YES), the display control unit 24 joins and combines all the captured images to generate a composite image, displays the generated composite image on the display device 30 (step ST22), and the process ends.
  • As described above, in the present embodiment, the cell region to be observed is detected from the plurality of captured images, and whether the target captured image G0 is of low quality is determined based on the cell region detected in the target captured image G0 and the cell region detected in at least one other captured image G1 adjacent to it. It can therefore be determined accurately whether the target captured image G0 is of low quality, without the determination being affected by the non-cell areas of the captured images. Further, when the target captured image G0 is determined to be of low quality, at least one of photographing for the target captured image G0 and image processing on it is controlled; a highly reliable evaluation of the observation target can thus be performed by generating the composite image using the target captured image G0 for which photographing or image processing was controlled.
  • When the target captured image G0 is compared with only one other captured image G1 through the adjacent small regions A0 and A1, the index value becomes large both when the target captured image G0 is of low quality and when the other captured image G1 is of low quality.
  • It may therefore not be possible to determine accurately which of the target captured image G0 and the other captured image G1 is of low quality, so a plurality of other captured images G1 may be used.
  • For example, the two captured images to the left and right of the target captured image G0, the four captured images above, below, left, and right of it, the four captured images at its upper left, upper right, lower right, and lower left, or all eight captured images surrounding it may be used. Using a plurality of other captured images G1 in this way makes it possible to determine more accurately whether the target captured image G0 is of low quality.
  • The observation device may also acquire the captured images so that parts of the observation areas overlap. FIG. 9 is a diagram for explaining this overlapping: in each captured image, a partial area overlaps with the adjacent captured image.
  • The overlapping range is preferably, for example, up to about 10% of the length of each side of the observation area. Note that, in FIG. 9, the overlapping regions are drawn wide for the purpose of explanation.
  • In this case, the captured image determination unit 22 may set the overlapping areas of the target captured image G0 and the other captured image G1 as the small regions A0 and A1.
  • the processing control unit 23 replaces the overlapping region in the target captured image G0 with the overlapping region in the other captured image G1. Good.
  • the processing control unit 23 when controlling the image processing on the target captured image G0, the processing control unit 23 performs at least one of the brightness correction processing and the sharpness enhancement processing only on the region other than the small region A0 in the target captured image G0. It will be good. Therefore, the processing time required for image processing can be shortened.
  • the other captured images G1 to G4 when four captured images adjacent to the target captured image G0 are used as the other captured images G1 to G4, four small areas in the upper, lower, left, and right of the target captured image G0 are target captured images in the other captured images G1 to G4. Substituting with the small area A1 to A4 overlapping G0, at least one of the brightness correction process and the sharpness enhancing process may be performed only on the area other than the small area A1 to A4 in the target captured image G0.
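  • A minimal sketch of the replacement, assuming the other captured image G1 lies immediately to the right of G0 and the tiles overlap by a fixed number of pixel columns; the geometry and the 512-pixel example below are illustrative:

```python
import numpy as np

def replace_right_overlap(g0: np.ndarray, g1: np.ndarray,
                          overlap_px: int) -> np.ndarray:
    """Replace the rightmost overlap_px columns of the low-quality target
    G0 with the leftmost overlap_px columns of its right-hand neighbor G1."""
    out = g0.copy()
    out[:, -overlap_px:] = g1[:, :overlap_px]
    return out

# Example: about 10% overlap on a 512-pixel-wide observation area.
g0 = np.zeros((512, 512))
g1 = np.ones((512, 512))
g0_fixed = replace_right_overlap(g0, g1, overlap_px=51)
```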
  • In the embodiment described above, the target captured image G0 is compared with one other captured image G1 adjacent to it to determine whether the target captured image G0 is of low quality. Specifically, an index value representing the similarity between the small regions A0 and A1 in the target captured image G0 and the other captured image G1 is calculated, and the target captured image G0 is judged to be of low quality if the index value is larger than the threshold Th1. However, the index value becomes small not only when neither the target captured image G0 nor the other captured image G1 is of low quality, but also when both of them are of low quality. Therefore, when the index value is smaller than the threshold value Th1, it may not be possible to determine accurately whether the target captured image G0 is of low quality.
  • For this reason, not only the captured image adjacent to the target captured image G0 but also at least one captured image close to the target captured image G0 may be used as another captured image G1 to determine whether the target captured image G0 is of low quality.
  • This will be described below as another embodiment.
  • The configuration of the imaging processing apparatus in the other embodiment is the same as the configuration of the imaging processing apparatus 20 shown in FIG. 1, so a detailed description of the apparatus is omitted here.
  • In the other embodiment, a captured image within a predetermined range based on the target captured image G0 can be used as another captured image G1 in proximity to the target captured image G0.
  • Besides the adjacent captured images, one or more captured images further adjacent to those adjacent captured images can be used.
  • For example, the captured images within a 5 x 5 range centered on the target captured image G0 may be used as the other captured images G1 close to the target captured image G0.
  • A captured image that is not adjacent to the target captured image G0 but is within the predetermined range based on it may also be used as another captured image G1 close to the target captured image G0.
  • Alternatively, the captured images contained wholly or partly within a circle of predetermined radius centered on the target captured image G0 may be used as the other captured images G1 close to the target captured image G0.
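  • A sketch of enumerating "close" images under the two definitions above, in tile coordinates; the radius value is illustrative, and clipping to the grid bounds is omitted for brevity:

```python
def square_neighbors(cx: int, cy: int, half: int = 2):
    """Tiles in a (2*half+1) x (2*half+1) window around (cx, cy);
    half=2 gives the 5 x 5 range centered on the target image."""
    return [(x, y)
            for y in range(cy - half, cy + half + 1)
            for x in range(cx - half, cx + half + 1)
            if (x, y) != (cx, cy)]

def radius_neighbors(cx: int, cy: int, radius: float = 2.5):
    """Tiles whose centers lie within `radius` (in tile units) of (cx, cy)."""
    r = int(radius) + 1
    return [(x, y)
            for y in range(cy - r, cy + r + 1)
            for x in range(cx - r, cx + r + 1)
            if (x, y) != (cx, cy)
            and (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
```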
  • FIG. 13 is a flowchart showing the processing performed in the other embodiment of the present invention.
  • The processing performed in the other embodiment differs from that of the above embodiment only in the processing of step ST16 of the flowchart shown in FIG. 8, so only the processing of step ST16 is described here.
  • First, the captured image determination unit 22 sets one other captured image G1 to be compared with the target captured image G0 (step ST30).
  • The other captured images G1 may be set in order of proximity, starting from the captured image closest to the target captured image G0.
  • Next, the captured image determination unit 22 calculates the index value representing the similarity between the small regions A0 and A1 in the target captured image G0 and the other captured image G1, and determines whether the index value is larger than the threshold Th1 (step ST32). If step ST32 is affirmative, the captured image determination unit 22 determines that the target captured image G0 is of low quality (step ST34), and the process proceeds to step ST18 of FIG. 8.
  • If step ST32 is negative, the captured image determination unit 22 determines whether the comparison has been completed for all the other captured images G1 (step ST36). If step ST36 is negative, the next captured image is set as the other captured image G1 to be compared with the target captured image G0 (step ST38), and the process returns to step ST32. If step ST36 is affirmative, the captured image determination unit 22 determines that the target captured image G0 is not of low quality (step ST40), and the process proceeds to step ST20 of FIG. 8.
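  • A minimal sketch of this loop, with hypothetical helper names; index_value can be, for example, the mean_square_error function sketched earlier:

```python
from typing import Callable, Sequence
import numpy as np

def judge_target(a0: np.ndarray,
                 neighbor_regions: Sequence[np.ndarray],
                 index_value: Callable[[np.ndarray, np.ndarray], float],
                 th1: float) -> bool:
    """Return True (low quality) as soon as the index value against any
    nearby image exceeds Th1; neighbor_regions should be ordered from
    nearest to farthest (steps ST30/ST38)."""
    for a1 in neighbor_regions:
        if index_value(a0, a1) > th1:   # step ST32
            return True                 # step ST34: low quality
    return False                        # step ST40: not low quality
```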
  • In each of the above embodiments, whether the target captured image is of low quality is determined after the captured images of all the observation areas have been acquired.
  • However, the determination of whether a captured image is of low quality may instead be made each time a captured image is acquired.
  • In that case, the captured images already acquired may be used as the other captured images G1.
  • In each of the above embodiments, the index value is calculated using the adjacent small regions of the target captured image G0 and the other captured images to determine whether the target captured image G0 is of low quality.
  • However, the index value may instead be calculated using the entire areas of the target captured image G0 and the other captured images, in particular the entire cell regions.

Landscapes

  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Wood Science & Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Organic Chemistry (AREA)
  • Zoology (AREA)
  • Physics & Mathematics (AREA)
  • Biotechnology (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Microbiology (AREA)
  • Sustainable Development (AREA)
  • Medicinal Chemistry (AREA)
  • Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Genetics & Genomics (AREA)
  • Immunology (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Image Processing (AREA)
  • Microscopes, Condensers (AREA)

Abstract

The invention relates to an imaging processing device, a control method for an imaging processing device, and an imaging processing program that enable accurate and highly reliable evaluations even when the quality of the images captured of the observation areas within a container is low. According to the present invention, an area detection unit (21) detects the observation target area from a plurality of captured images. A captured image determination unit (22) determines whether a target captured image among the plurality of captured images is of low quality, based on the observation target area detected in the target captured image and the observation target area detected in at least one other captured image close to the target captured image. When the target captured image has been determined to be of low quality, a processing control unit (23) controls the imaging of the target captured image and/or the image processing of the target captured image.
PCT/JP2018/029509 2017-08-31 2018-08-07 Imaging processing device, control method for imaging processing device, and imaging processing program WO2019044416A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017166908A JP2020202748A (ja) 2017-08-31 2017-08-31 Imaging processing device, control method for imaging processing device, and imaging processing program
JP2017-166908 2017-08-31

Publications (1)

Publication Number Publication Date
WO2019044416A1 true WO2019044416A1 (fr) 2019-03-07

Family

ID=65525333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029509 WO2019044416A1 (fr) 2017-08-31 2018-08-07 Imaging processing device, control method for imaging processing device, and imaging processing program

Country Status (2)

Country Link
JP (1) JP2020202748A (fr)
WO (1) WO2019044416A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008137667A1 (fr) * 2007-05-04 2008-11-13 Aperio Technologies, Inc. System and method for quality assurance in pathology
WO2010111656A2 (fr) * 2009-03-27 2010-09-30 Life Technologies Corporation Systems and methods for assessing images
WO2011127361A2 (fr) * 2010-04-08 2011-10-13 Omnyx LLC Image quality assessment including comparison of overlapped margins
WO2013183562A1 (fr) * 2012-06-04 2013-12-12 大日本印刷株式会社 (Dai Nippon Printing Co., Ltd.) Culture medium information registration system, communication terminal, program, health management system, and film-type culture medium
JP2014178357A (ja) * 2013-03-13 2014-09-25 Sony Corp Digital microscope apparatus, imaging method therefor, and program
JP2016125913A (ja) * 2015-01-05 2016-07-11 キヤノン株式会社 (Canon Inc.) Image acquisition device and control method for image acquisition device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020188813A1 (fr) * 2019-03-20 2020-09-24 株式会社島津製作所 (Shimadzu Corporation) Cell analysis systems
JPWO2020188813A1 (ja) * 2019-03-20 2021-12-16 株式会社島津製作所 (Shimadzu Corporation)
JP7006832B2 (ja) 2019-03-20 2022-01-24 株式会社島津製作所 (Shimadzu Corporation) Cell analysis device

Also Published As

Publication number Publication date
JP2020202748A (ja) 2020-12-24

Similar Documents

Publication Publication Date Title
US9088729B2 (en) Imaging apparatus and method of controlling same
US8830313B2 (en) Information processing apparatus, stage-undulation correcting method, program therefor
US20140098213A1 (en) Imaging system and control method for same
US10613313B2 (en) Microscopy system, microscopy method, and computer-readable recording medium
WO2019181053A1 (fr) Defocus amount measuring device, method, and program, and discriminator
JP2012003214A (ja) Information processing apparatus, information processing method, program, imaging apparatus, and imaging apparatus equipped with an optical microscope
JP2016125913A (ja) Image acquisition device and control method for image acquisition device
WO2018003181A1 (fr) Photographing device and method, and photographing control program
US11209637B2 (en) Observation device, observation control method, and observation control program that control acceleration of a moveable stage having an installed subject vessel
CN109001902A (zh) 基于图像融合的显微镜聚焦方法
JP2017055916A (ja) Image generation apparatus, image generation method, and program
WO2019044416A1 (fr) Imaging processing device, control method for imaging processing device, and imaging processing program
JP2013246052A (ja) Distance measuring device
US20200192059A1 (en) Imaging control apparatus, method, and program
WO2018061429A1 (fr) Captured image evaluation device, method, and program
US20190370967A1 (en) Cell image evaluation device, method, and program
JP2019144294A (ja) Image processing apparatus, microscope system, image processing method, and image processing program
JP6549061B2 (ja) Imaging apparatus and method, and imaging apparatus control program
JP6499506B2 (ja) Imaging apparatus and method, and imaging control program
US20130016192A1 (en) Image processing device and image display system
JP6848086B2 (ja) Observation apparatus and method, and observation apparatus control program
JP2023033982A (ja) Image processing apparatus, image processing system, image sharpening method, and program
JP6534294B2 (ja) Imaging apparatus and method, and imaging control program
JP2016206228A (ja) Focus position detection device, focus position detection method, imaging device, and imaging system
US11127180B2 (en) Image processing apparatus, method, and program

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18849840; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18849840; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)