WO2019044416A1 - Imaging processing device, control method for imaging processing device, and imaging processing program - Google Patents


Info

Publication number
WO2019044416A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
image
captured image
observation
area
Prior art date
Application number
PCT/JP2018/029509
Other languages
French (fr)
Japanese (ja)
Inventor
Takashi Wakui
Original Assignee
FUJIFILM Corporation
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2019044416A1 publication Critical patent/WO2019044416A1/en

Classifications

    • C: CHEMISTRY; METALLURGY
    • C12: BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M: APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00: Apparatus for enzymology or microbiology
    • C12M1/34: Measuring or testing with condition measuring or sensing means, e.g. colony counters
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing

Definitions

  • The present invention relates to an imaging processing device that observes an image of an entire observation target by relatively moving a stage, on which a container containing the observation target is installed, and an imaging optical system that forms an image of the observation target, and also relates to a control method for the imaging processing device and an imaging processing program.
  • Pluripotent stem cells such as ES (embryonic stem) cells and iPS (induced pluripotent stem) cells have the ability to differentiate into cells of various tissues, and it has been noted that they can be applied in regenerative medicine, drug development, elucidation of diseases, and the like.
  • For this reason, pluripotent stem cells such as ES cells and iPS cells, or cells induced to differentiate from them, are imaged with a microscope or the like, and features of the images are captured to evaluate the differentiation state of the cells and so on.
  • In such observation, each observation area in a well is scanned by moving the stage, on which a well plate or the like is installed, relative to the imaging optical system, and each observation area is photographed; a method (so-called tiling imaging) has been proposed for joining the captured images of the observation areas to generate a composite image.
  • However, the focus position may not be optimal in all observation areas, autofocus control may fail, and the captured images of some observation areas may be blurred.
  • In addition, the amount of illumination light may change due to fluctuations in the voltage applied to the light source in the microscope apparatus, and a captured image may be dark.
  • Images of individual cells cannot be extracted with high accuracy from low-quality captured images such as blurred or dark images. For this reason, when evaluation is performed using, for example, a feature value indicating the state of each cell, the accuracy and reliability of the evaluation result may be low. That is, an accurate evaluation result may not be obtained if low-quality captured images are evaluated in the same manner as captured images that are neither blurred nor problematic in brightness.
  • For the case of tiling imaging, a method has been proposed for re-photographing an observation area when the quality of its captured image is low (see Patent Document 1). By re-photographing as in the method described in Patent Document 1, a captured image that is not of low quality can be obtained, so an accurate evaluation result can be obtained.
  • In the method described in Patent Document 1, whether a captured image is of low quality is determined as follows. An evaluation value for focus and an evaluation value for brightness are calculated for the target captured image and for a captured image adjacent to it, and the continuity of the evaluation values of the two captured images is determined. Specifically, when the difference between the evaluation value of the target captured image and that of the adjacent captured image is larger than a threshold value, the target captured image is determined to be a low-quality captured image.
  • In a container such as the well described above, cells are cultured in a culture medium, which is a culture solution. Therefore, regions where cells are present and regions of culture medium coexist in the container. Since no cells are present in a culture-medium region, it appears as an image of uniform density. For this reason, when an observation area contains culture medium, evaluation values calculated over the entire captured image are affected by the uniform medium region, and the continuity-based determination described above may not work correctly.
  • The container may also contain cells that are not observation targets, such as floating cells. When floating cells and the like are included, the focal position may be set on the floating cells rather than on the target cells.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to enable accurate and reliable evaluation even when the quality of a captured image of an observation area in a container is low.
  • An imaging processing device according to the present invention comprises an observation device that includes an imaging unit for imaging an observation target contained in a container, for each observation region smaller than the container, and that acquires a plurality of captured images by photographing the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation region;
  • an area detection unit that detects the area of the observation target from each captured image;
  • a captured image determination unit that determines whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image in proximity to the target captured image; and
  • a processing control unit that controls at least one of photographing for the target captured image and image processing of the target captured image when the target captured image is determined to be of low quality.
  • Here, the "at least one other captured image in proximity" means a captured image within a predetermined range based on the target captured image.
  • For example, one or more captured images adjacent to the target captured image can be set as the "at least one other captured image in proximity"; one or more captured images further adjacent to those adjacent captured images can also be included, as long as they fall within the predetermined range.
  • "Low quality" means that the quality is low compared with the other captured images; specifically, the target captured image is blurred or dark compared with the other captured images, so that it cannot be evaluated accurately.
  • In the imaging processing device according to the present invention, the processing control unit may re-photograph the observation region corresponding to the target captured image when the target captured image is determined to be of low quality.
  • The processing control unit may also issue a notification that the target captured image is of low quality when it is so determined; such notification is included in the control of photographing.
  • The processing control unit may perform at least one of brightness correction processing and sharpness enhancement processing on the target captured image when the target captured image is determined to be of low quality.
  • The captured image determination unit may determine whether the target captured image is of low quality based on the similarity of the detected observation target areas.
  • The captured image determination unit may determine whether the target captured image is of low quality based on partial areas within the observation target areas detected in the target captured image and the other captured image.
  • The imaging processing device may acquire the plurality of captured images with parts of adjacent observation areas overlapping, and the captured image determination unit may determine whether the target captured image is of low quality based on the overlapping areas in the target captured image and the other captured image.
  • When the target captured image is determined to be of low quality, the processing control unit may replace the overlapping area in the target captured image with the overlapping area in the other captured image.
  • A control method according to the present invention is a control method for an imaging processing device provided with an observation device that includes an imaging unit for imaging an observation target contained in a container, for each observation region smaller than the container, and that acquires a plurality of captured images by photographing the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation region. The method detects the area of the observation target from each captured image; determines whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image in proximity to the target captured image; and, when the target captured image is determined to be of low quality, controls at least one of photographing for the target captured image and image processing of the target captured image.
  • An imaging processing program according to the present invention causes a computer to execute the control method of the imaging processing device described above, namely: a procedure for detecting the area of the observation target from each captured image; a procedure for determining whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image in proximity to the target captured image; and a procedure for controlling, when the target captured image is determined to be of low quality, at least one of photographing for the target captured image and image processing of the target captured image.
  • According to the present invention, the area of the observation target is detected from the plurality of captured images, and whether the target captured image is of low quality is determined based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image in proximity to it. It is therefore possible to accurately determine whether the target captured image is of low quality without being affected by areas other than the observation target in the captured image. Furthermore, when the target captured image is determined to be of low quality, at least one of photographing for the target captured image and image processing of the target captured image is controlled. A composite image generated using a target captured image for which photographing or image processing has been controlled therefore permits highly reliable evaluation of the observation target.
  • A block diagram showing the schematic configuration of a microscope observation system using an embodiment of the imaging processing device of the present invention
  • A diagram showing the scanning locus of each observation area in the well plate
  • A diagram showing detection results of cell areas
  • A diagram showing an example of captured images of the observation areas in a well
  • A diagram showing an example of captured images of the observation areas in a well
  • A diagram showing an example of an adjacent target captured image and other captured image
  • A diagram showing an example of an adjacent target captured image and other captured image
  • A flowchart showing the processing performed in the present embodiment
  • A diagram for explaining overlapping of parts of the observation areas
  • A diagram for explaining setting of overlapping regions in small regions of the target captured image and the other captured image
  • A diagram for explaining replacement of a small area in the target captured image with a small area in the other captured image
  • A diagram for explaining another captured image in proximity to the target captured image
  • A flowchart showing processing performed in another embodiment of the present invention
  • FIG. 1 is a block diagram showing a schematic configuration of a microscope observation system of the present embodiment.
  • the microscope observation system of the present embodiment includes a microscope apparatus 10, an imaging processing apparatus 20, a display apparatus 30, and an input apparatus 40.
  • the microscope apparatus 10 corresponds to the observation apparatus of the present invention.
  • the microscope device 10 photographs the cells contained in the culture vessel, and outputs a photographed image.
  • As the microscope apparatus 10, a phase contrast microscope apparatus including an imaging element such as a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor is used.
  • As the imaging element, an element provided with RGB (red, green, blue) color filters may be used, or a monochrome imaging element may be used.
  • A phase difference image of the cells accommodated in the culture vessel is formed on the imaging element, and the imaging element outputs the phase difference image as a captured image.
  • As the microscope apparatus 10, not only a phase contrast microscope apparatus but also other microscope apparatuses, such as a differential interference contrast microscope apparatus or a bright-field microscope apparatus, may be used.
  • the imaging target may be a cell colony in which a plurality of cells are aggregated, or a plurality of dispersed and distributed cells.
  • Cells to be imaged include, for example, pluripotent stem cells such as iPS cells and ES cells; nerve, skin, myocardial, and liver cells differentiated from stem cells; and cells of organs taken from the human body, cancer cells, and the like.
  • a well plate having a plurality of wells is used as a culture vessel.
  • each well corresponds to the container of the present invention.
  • The microscope apparatus 10 is equipped with a stage on which the well plate is installed. The stage moves in the X and Y directions, which are orthogonal to each other in the horizontal plane. By the movement of the stage, each observation area in each well of the well plate is scanned, and a captured image is taken for each observation area. The captured image of each observation area is output to the imaging processing device 20.
  • FIG. 2 shows, by a solid line Sc, the scanning locus of the observation areas in the case of a well plate 50 having six wells 51. As shown in FIG. 2, the observation areas in the well plate 50 are scanned along the solid line Sc from the scanning start point S to the scanning end point E by movement of the stage in the X and Y directions.
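Such a raster scan from start point S to end point E can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the `serpentine_scan` helper and its row/column parameters are hypothetical, and the exact path in FIG. 2 depends on the well layout.

```python
def serpentine_scan(rows, cols):
    """Return (x, y) coordinates of observation areas in serpentine order,
    mimicking a stage scan from start point S to end point E: even rows
    are traversed left to right, odd rows right to left."""
    coords = []
    for y in range(rows):
        xs = range(cols) if y % 2 == 0 else range(cols - 1, -1, -1)
        for x in xs:
            coords.append((x, y))
    return coords
```

Each coordinate pair corresponds to one observation area; imaging the container a plurality of times then amounts to visiting these positions in order.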
  • During the scan, autofocus control is performed by moving the stage, or the imaging optical system that forms the phase difference image of the cells on the imaging element, in the vertical direction.
  • In the present embodiment, the captured image of each observation area in the well is taken by moving the stage, but the present invention is not limited to this; the observation areas may also be imaged by moving the imaging optical system with respect to the stage, or by moving both.
  • In the present embodiment a well plate is used, but the container for containing cells is not limited to this; other containers such as petri dishes or dishes may be used, for example.
  • The imaging processing device 20 includes an area detection unit 21, a captured image determination unit 22, a processing control unit 23, and a display control unit 24.
  • The imaging processing device 20 is constituted by a computer provided with a central processing unit, a semiconductor memory, a hard disk, and the like, and an embodiment of the imaging processing program of the present invention is installed on the hard disk. When the central processing unit executes this imaging processing program, the area detection unit 21, the captured image determination unit 22, the processing control unit 23, and the display control unit 24 shown in FIG. 1 function.
  • In the present embodiment, the functions of the respective units are executed by the imaging processing program, but the present invention is not limited to this. For example, a plurality of integrated circuits (ICs), processors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), memories, and the like may be combined as appropriate to execute the functions of the respective units.
  • The area detection unit 21 detects the observation target area, that is, the cell area, from each captured image acquired by the microscope apparatus 10.
  • The area detection unit 21 includes, for example, a discriminator that determines whether each pixel of the captured image represents a cell or culture medium, and detects, as the cell area, the set of pixels that the discriminator determines to belong to cells.
  • The discriminator is machine-learned using captured images of cell areas and captured images of culture-medium areas as teacher data, and outputs a determination of whether an input image represents a cell area or a culture-medium area.
  • A well-known machine learning method can be used; for example, a support vector machine (SVM), a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), or a denoising stacked auto-encoder (DSA) can be used.
  • FIG. 3 is a diagram showing the detection result of the cell area.
  • In FIG. 3, each area delimited by a rectangle corresponds to one observation area, and the cell areas detected in the captured image of each observation area are hatched.
  • In the present embodiment, the cell area is detected in the captured image using the area detection unit 21 provided with the machine-learned discriminator as described above, but the detection method is not limited to this. For example, edges may be detected from the captured image and the determination made based on the edge amount, or the determination may be made from the set of the maximum and minimum pixel values, or by otherwise analyzing the pixel values.
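The edge-amount alternative can be sketched as below. This is a hypothetical illustration (the function names and the threshold value are not from the patent): a culture-medium region has nearly uniform density and therefore little edge content, while a cell region is textured.

```python
def edge_amount(patch):
    """Sum of absolute differences between adjacent pixels, horizontally
    and vertically: a crude measure of texture in a patch (list of rows)."""
    total = 0
    rows, cols = len(patch), len(patch[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                total += abs(patch[r][c] - patch[r][c + 1])
            if r + 1 < rows:
                total += abs(patch[r][c] - patch[r + 1][c])
    return total

def is_cell_region(patch, threshold=50):
    """Classify a patch as a cell region when its edge amount exceeds a
    (hypothetical) threshold; uniform medium patches fall below it."""
    return edge_amount(patch) > threshold
```

A machine-learned discriminator would replace `is_cell_region` with a per-pixel classifier, but the decision interface is the same.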
  • The captured image determination unit 22 determines whether the target captured image, that is, the image to be determined among the plurality of captured images, is of low quality. Specifically, it determines whether the target captured image is of low quality based on the cell area detected in the target captured image and the cell area detected in at least one other captured image adjacent to the target captured image.
  • As described above, each observation area in the well is scanned, and autofocus control is performed in each observation area. However, the focus position may not be optimal in all observation areas; autofocus control may fail, and the captured image of some observation area may be blurred and of low quality. If such a blurred captured image is evaluated in the same manner as other, unblurred captured images, an accurate evaluation result may not be obtained.
  • FIG. 4 is a view showing an example of a composite image obtained by combining the captured images of the observation areas in a well. Each area delimited by a rectangle in FIG. 4 corresponds to one observation area. In the example illustrated in FIG. 4, the captured image of the observation area indicated by the dotted square is blurred.
  • In FIG. 4, the origin is set at the upper left corner of the composite image, and a coordinate system is set in which the rightward direction in the drawing is the X direction and the downward direction is the Y direction; each captured image can therefore be represented by two-dimensional coordinates. The coordinates of the captured image of the observation area indicated by the dotted square in FIG. 4 can be expressed as (4, 3).
  • The cause of a low-quality captured image is not limited to autofocus control errors.
  • The amount of illumination light may change due to fluctuations in the voltage applied to the light source in the microscope apparatus 10 or due to vibration of the stage and the optical system, and the captured image may be dark. If such a dark captured image is evaluated in the same manner as other captured images taken with a normal light amount, an accurate evaluation result may not be obtained.
  • FIG. 5 is a view showing an example of a composite image obtained by combining the captured images of the observation areas in a well. Each area delimited by a rectangle in FIG. 5 corresponds to one observation area. In the example shown in FIG. 5, the captured image of the observation area indicated by the dotted square is of low quality due to fluctuation of the illumination light. The same coordinate system as in FIG. 4 is used, so the coordinates of that captured image can be expressed as (4, 4).
  • The captured image determination unit 22 determines whether the captured image of the target observation area is of low quality. For this purpose, it sets at least one other captured image adjacent to the target captured image. In the present embodiment, the captured image one position downstream of the target captured image in the scanning direction shown in FIG. 2 is set as the other captured image. For example, when the coordinates of the target captured image are (4, 3), the coordinates of the other captured image are (5, 3); when the coordinates of the target captured image are (5, 3), the coordinates of the other captured image are (5, 4).
  • the captured image determination unit 22 determines whether the quality of the target captured image is low based on the adjacent partial cell regions in each of the target captured image and the other captured image.
  • FIG. 6 is a view showing an example of an adjacent target captured image and other captured image. In FIG. 6, the entire areas of the target captured image G0 and the other captured image G1 are assumed to be cell areas. As shown in FIG. 6, the captured image determination unit 22 sets small regions A0 and A1 adjacent to each other in the target captured image G0 and the other captured image G1. In the composite image shown in FIG. 4, the coordinates of the target captured image G0 are (3, 2) and the coordinates of the other captured image G1 are (4, 2).
  • The entire areas of the target captured image G0 and the other captured image G1 may not be cell areas. In that case, the captured image determination unit 22 sets the small regions A0 and A1 so as to include only cell regions in the target captured image G0 and the other captured image G1.
  • The captured image determination unit 22 determines the similarity between the small regions A0 and A1 by calculating an index value of the similarity. As the index value, the mean square error of the pixel values at corresponding pixel positions when the small regions A0 and A1 are superimposed is used. The index value is not limited to the mean square error; the difference between the average luminance of the small region A0 and that of the small region A1, or the difference between the range from minimum to maximum pixel value of the small region A0 and that of the small region A1, can also be used.
  • The captured image determination unit 22 determines whether the index value is larger than a predetermined threshold value Th1; when the index value is larger than Th1, it determines that the target captured image G0 is of low quality.
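The mean-square-error check over the small regions A0 and A1 could look like this sketch (pure Python over lists of pixel rows; the helper names and the threshold value are illustrative, not taken from the patent):

```python
def mean_square_error(a0, a1):
    """Mean square error of pixel values at corresponding positions when
    small regions A0 and A1 are superimposed."""
    total, n = 0.0, 0
    for row0, row1 in zip(a0, a1):
        for p0, p1 in zip(row0, row1):
            total += (p0 - p1) ** 2
            n += 1
    return total / n

def is_low_quality(a0, a1, th1=100.0):
    """Judge the target captured image low quality when the index value
    (here, the MSE) exceeds the predetermined threshold Th1."""
    return mean_square_error(a0, a1) > th1
```

Since both regions are set inside cell areas, a large MSE signals blur or darkness in G0 rather than a difference between cells and medium.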
  • To distinguish a blurred captured image from a dark one, the captured image determination unit 22 calculates the maximum value of the first derivative within the small region A0 of the target captured image G0 and the maximum value of the first derivative within the small region A1 of the other captured image G1, and then calculates the absolute value of the difference between the two maxima. When this difference is larger than a predetermined threshold value Th2, the target captured image G0 is determined to be a blurred captured image; when it is equal to or smaller than Th2, the target captured image G0 is determined to be a dark captured image.
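The blur-versus-dark distinction based on maxima of the first derivative might be sketched as follows. This is an illustrative reading of the determination above, with a horizontal-difference derivative and an arbitrary example value for Th2:

```python
def max_first_derivative(region):
    """Maximum absolute difference between horizontally adjacent pixels;
    a blurred region has weak edges, hence a small maximum derivative."""
    best = 0
    for row in region:
        for p, q in zip(row, row[1:]):
            best = max(best, abs(p - q))
    return best

def degradation_type(a0, a1, th2=50):
    """Return 'blurred' when the derivative maxima of A0 and A1 differ by
    more than Th2, otherwise 'dark', per the determination described above."""
    diff = abs(max_first_derivative(a0) - max_first_derivative(a1))
    return "blurred" if diff > th2 else "dark"
```

The intuition: blur suppresses edge strength in A0 relative to A1, while a uniform darkening shifts pixel values without removing edges as strongly.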
  • When the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of photographing and image processing for the target captured image G0. The control of photographing and of image processing will now be described.
  • As control of photographing, the processing control unit 23 re-photographs the observation region of the target captured image G0 determined to be of low quality. For example, when the coordinates of the target captured image G0 determined to be of low quality are (4, 3) in FIG. 4, the processing control unit 23 instructs the microscope apparatus 10 to re-photograph the observation area corresponding to the coordinates (4, 3). At this time, when the target captured image G0 has been determined to be a dark captured image, the microscope apparatus 10 is instructed to photograph with appropriate brightness; when it has been determined to be a blurred captured image, the microscope apparatus 10 is instructed to focus appropriately on the observation region from which the target captured image G0 was acquired.
  • The processing control unit 23 may also notify the user that the target captured image G0 is of low quality when it is so determined. The display control unit 24 displays on the display device 30 a composite image obtained by connecting and combining the plurality of captured images, and the captured image determined to be of low quality may be displayed in an emphasized manner within that composite image. For example, in the composite image illustrated in FIG. 4, a frame in a color such as red may be added to the area surrounded by the dotted line, the frame may blink, or the area itself may blink. Notification may also be made by text or voice.
  • As image processing, the processing control unit 23 performs brightness correction processing on the target captured image G0 determined to be a dark captured image. Specifically, it corrects the pixel values so that the average of the pixel values of the small region A0 of the target captured image G0 coincides with the average of the pixel values of the small region A1 of the other captured image G1, thereby increasing the brightness.
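A minimal sketch of this brightness correction, matching the mean of A0 to that of A1 by adding a constant offset to the target image (the helper names are hypothetical, and a real implementation would also clamp results to the valid pixel range):

```python
def region_mean(region):
    """Average pixel value of a region given as a list of rows."""
    vals = [p for row in region for p in row]
    return sum(vals) / len(vals)

def brightness_correct(image, a0, a1):
    """Shift every pixel of the target captured image by the offset that
    makes the mean of its small region A0 equal the mean of A1."""
    offset = region_mean(a1) - region_mean(a0)
    return [[p + offset for p in row] for row in image]
```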
  • The processing control unit 23 performs sharpness enhancement processing on the target captured image G0 determined to be a blurred captured image. Specifically, the degree of sharpness emphasis is changed according to the magnitude of the absolute value of the difference between the maximum first derivative of the small region A0 of the target captured image G0 and the maximum first derivative of the small region A1 of the other captured image G1.
  • For this purpose, a table associating difference values with degrees of sharpness emphasis is stored on the hard disk of the imaging processing device 20. The processing control unit 23 refers to this table to obtain the degree of sharpness emphasis corresponding to the difference value, and performs the sharpness enhancement processing on the target captured image G0.
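The table lookup and enhancement might be sketched as follows; the table entries and the simple one-dimensional unsharp mask are illustrative stand-ins for the stored table and the actual enhancement filter, which the patent does not specify.

```python
# Hypothetical table: (difference-value lower bound, degree of emphasis).
SHARPNESS_TABLE = [(0, 0.0), (20, 0.5), (50, 1.0), (100, 2.0)]

def sharpness_degree(diff):
    """Look up the degree of sharpness emphasis for a given absolute
    difference between the derivative maxima of A0 and A1."""
    degree = 0.0
    for bound, deg in SHARPNESS_TABLE:
        if diff >= bound:
            degree = deg
    return degree

def unsharp_row(row, amount):
    """1-D unsharp masking: add 'amount' times the difference between each
    pixel and its local 3-tap average back onto the pixel."""
    out = []
    for i, p in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        blurred = (left + p + right) / 3.0
        out.append(p + amount * (p - blurred))
    return out
```

A larger derivative difference (stronger blur) selects a larger degree of emphasis, which `unsharp_row` then applies row by row.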
  • The display control unit 24 generates a composite image by connecting the plurality of captured images and causes the display device 30 to display it. When there is a captured image determined to be of low quality, a notification to that effect and the position of the low-quality captured image in the composite image are also displayed on the display device 30.
  • The display device 30 displays the composite image and the notifications described above, and includes, for example, a liquid crystal display.
  • the display device 30 may be configured by a touch panel and used as the input device 40.
  • the input device 40 includes a mouse, a keyboard, and the like, and receives various setting inputs from the user.
  • First, a well plate containing cells and culture medium is placed on the stage of the microscope apparatus 10 (step ST10). Then, by moving the stage in the X and Y directions, the observation areas in each well of the well plate are scanned, and captured images of all observation areas are taken (step ST12). The captured images taken by the microscope apparatus 10 are sequentially input to the area detection unit 21 of the imaging processing device 20, which detects the cell area from each captured image (step ST14).
  • The captured image determination unit 22 determines whether a target captured image G0 among the plurality of captured images is of low quality (step ST16). If the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of photographing of the target captured image G0 and image processing of the target captured image G0 (processing control: step ST18). Specifically, it performs rephotographing of the observation area corresponding to the target captured image G0, notification that the target captured image G0 is a low-quality captured image, brightness correction processing on the target captured image G0, or sharpness enhancement processing on the target captured image G0. If the target captured image G0 is not of low quality, the process proceeds to step ST20.
  • Steps ST16 and ST18 are repeated until the determination has been completed for all the observation areas (ST20, NO).
  • The display control unit 24 joins all the captured images to generate a composite image, displays the generated composite image on the display device 30 (step ST22), and the process ends.
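The overall flow of steps ST10 through ST22 above can be sketched as a small orchestration function. This is a hedged outline only: every callable parameter (`capture`, `detect_cells`, and so on) is an assumed stand-in for the microscope apparatus 10 and the units 21–24 described in the text, not an API the patent defines.

```python
def process_scan(observation_areas, capture, detect_cells, is_low_quality,
                 handle_low_quality, compose):
    """Capture every observation area, detect its cell region, judge quality,
    apply processing control, then compose the tiled image (steps ST10-ST22)."""
    records = []
    for area in observation_areas:
        image = capture(area)            # ST12: photograph the observation area
        cells = detect_cells(image)      # ST14: area detection unit 21
        records.append((area, image, cells))
    for i, (area, image, cells) in enumerate(records):
        if is_low_quality(i, records):   # ST16: captured image determination unit 22
            # ST18: processing control (rephotograph, correct, or enhance)
            records[i] = (area, handle_low_quality(area, image), cells)
    # ST22: display control unit 24 joins the images into a composite
    return compose([image for _, image, _ in records])
```

The second loop mirrors the ST16/ST18/ST20 cycle of the flowchart: every tile is judged, and only low-quality tiles receive processing control before composition.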
  • The cell area to be observed is detected from the plurality of captured images, and whether the target captured image G0 among the plurality of captured images is of low quality is determined based on the cell area detected in the target captured image G0 and the cell area detected in at least one other captured image G1 adjacent to it. Therefore, it is possible to accurately determine whether the target captured image G0 is of low quality without being affected by areas other than the cells in the captured image. Further, in the present embodiment, when the target captured image G0 is determined to be of low quality, at least one of photographing of the target captured image G0 and image processing of the target captured image is controlled. For this reason, by generating the composite image using the target captured image G0 for which at least one of photographing and image processing has been controlled, a highly reliable evaluation of the observation target can be performed.
  • When the target captured image G0 is compared with the adjacent small areas A0 and A1 of a single other captured image G1, the index value becomes large both when the target captured image G0 is of low quality and when the other captured image G1 is of low quality. Therefore, it may not be possible to accurately determine which of the target captured image G0 and the other captured image G1 is of low quality. For this reason, a plurality of other captured images G1 may be used.
  • The other captured images G1 may be, for example, the two captured images to the left and right of the target captured image G0; the four captured images above, below, to the left, and to the right; the four captured images at the upper left, upper right, lower right, and lower left; or the eight captured images surrounding the target captured image G0. As described above, by using a plurality of other captured images G1, it is possible to determine more accurately whether the target captured image G0 is of low quality.
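One way the plural-neighbor comparison above could resolve the ambiguity is a voting rule: if the target disagrees with most of its neighbors, the target itself is the likely low-quality image. The majority rule and the `index_value` callable are assumptions for illustration; the text only states that using several neighbors improves the determination.

```python
def is_target_low_quality(target_region, neighbor_regions, index_value, th1):
    """Judge the target low quality when its index value exceeds the
    threshold Th1 against more than half of the neighboring cell regions.
    A single dissimilar neighbor then no longer flags the target, since
    that neighbor may be the low-quality image instead."""
    votes = sum(1 for n in neighbor_regions
                if index_value(target_region, n) > th1)
    return votes > len(neighbor_regions) // 2
```

With, say, four neighbors, a blurred target exceeds Th1 against all of them, while one blurred neighbor produces only a single vote against a sharp target.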
  • FIG. 9 is a diagram for explaining overlapping of parts of the observation areas. As shown in FIG. 9, in each captured image, a partial area overlaps with that of the adjacent captured image.
  • The overlapping range is preferably, for example, up to about 10% of the length of each side of the observation area. Note that in FIG. 9, the overlapping areas are shown wider than actual for the purpose of explanation.
  • The captured image determination unit 22 may set the overlapping areas in the target captured image G0 and the other captured image G1 as the small areas A0 and A1.
  • The processing control unit 23 may replace the overlapping area in the target captured image G0 with the overlapping area in the other captured image G1.
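The overlap replacement above is a simple slice copy once the overlap geometry is known. The sketch below assumes the neighbor lies to the target's right and that the overlap is a fixed number of pixel columns; other layouts (left, above, below) would mirror the slicing. These assumptions are illustrative, not from the patent.

```python
import numpy as np

def replace_overlap(target, other, overlap_px, side="right"):
    """Replace the overlapping strip of a low-quality target image with the
    corresponding strip of the adjacent image. Only the right-neighbor case
    is implemented in this sketch."""
    out = target.copy()
    if side == "right":
        # Rightmost columns of the target coincide with the leftmost
        # columns of the neighbor when tiles overlap horizontally.
        out[:, -overlap_px:] = other[:, :overlap_px]
    return out
```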
  • When controlling the image processing on the target captured image G0, the processing control unit 23 may perform at least one of the brightness correction processing and the sharpness enhancement processing only on the area other than the small area A0 in the target captured image G0. Accordingly, the processing time required for the image processing can be shortened.
  • When the four captured images adjacent to the target captured image G0 are used as the other captured images G1 to G4, the four small areas at the top, bottom, left, and right of the target captured image G0 may be replaced with the small areas A1 to A4 of the other captured images G1 to G4 that overlap the target captured image G0, and at least one of the brightness correction processing and the sharpness enhancement processing may be performed only on the area other than the small areas A1 to A4 in the target captured image G0.
  • The target captured image G0 is compared with another captured image G1 adjacent to it to determine whether the target captured image G0 is of low quality. Specifically, an index value representing the similarity between the small areas A0 and A1 in the target captured image G0 and the other captured image G1 is calculated, and if the index value is larger than the threshold Th1, the target captured image G0 is judged to be of low quality. However, the index value decreases not only when neither the target captured image G0 nor the other captured image G1 is of low quality, but also when both the target captured image G0 and the other captured image G1 are of low quality. Therefore, when the index value is smaller than the threshold Th1, it may not be possible to accurately determine whether the target captured image G0 is of low quality.
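The patent does not fix a concrete formula for the index value; it only requires a quantity that grows as the small areas A0 and A1 diverge. As one hedged possibility, a mean absolute pixel difference could serve, with the threshold test applied exactly as described above:

```python
import numpy as np

def index_value(a0, a1):
    """One possible index value for the small areas A0 and A1: the mean
    absolute difference of pixel values. Large when the regions diverge,
    small when they are similar. The specific formula is an assumption."""
    return float(np.mean(np.abs(a0.astype(float) - a1.astype(float))))

def target_is_low_quality(a0, a1, th1):
    """The target image G0 is judged low quality when the index value
    between the adjacent small areas exceeds the threshold Th1."""
    return index_value(a0, a1) > th1
```

Note the failure mode described in the text: if both G0 and G1 are equally blurred, the two areas remain similar, the index value stays small, and the low quality goes undetected; this motivates the proximity-based comparison of the other embodiment.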
  • Therefore, not only the captured image adjacent to the target captured image G0 but also at least one captured image close to the target captured image G0 may be used as another captured image G1 to determine whether the target captured image G0 is of low quality.
  • this will be described as another embodiment.
  • the configuration of the imaging processing apparatus in the other embodiment is the same as the configuration of the imaging processing apparatus 20 shown in FIG. 1, and thus the detailed description of the apparatus will be omitted here.
  • a captured image within a predetermined range based on the target captured image G0 can be used as another captured image G1 in proximity to the target captured image G0.
  • In addition to the captured images adjacent to the target captured image G0, one or more captured images further adjacent to these adjacent captured images can be used.
  • captured images within a 5 ⁇ 5 range centered on the target captured image G0 may be used as another captured image G1 close to the target captured image G0.
  • a captured image which is not adjacent to the target captured image G0 but is within a predetermined range based on the target captured image G0 may be another captured image G1 which is close to the target captured image G0.
  • A captured image that is wholly or partly contained within a circle of a predetermined radius centered on the target captured image G0 may also be used as another captured image G1 close to the target captured image G0.
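The radius-based proximity criterion above can be illustrated by selecting tiles whose centers fall within a circle around the target. Representing tiles by (row, column) grid coordinates and measuring Euclidean distance in tile units are assumptions made for this sketch; the patent leaves the exact geometry open.

```python
def neighbors_in_radius(target_rc, grid_rc_list, radius):
    """Return the tiles (other than the target) whose centers lie within
    the given radius of the target tile, in tile-coordinate units."""
    tr, tc = target_rc
    return [rc for rc in grid_rc_list
            if rc != target_rc
            and ((rc[0] - tr) ** 2 + (rc[1] - tc) ** 2) ** 0.5 <= radius]
```

On a 5 x 5 grid centered on the target, a radius of 1 selects the four edge-adjacent tiles, while a radius of 1.5 also admits the four diagonal tiles.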
  • FIG. 13 is a flow chart showing processing performed in another embodiment of the present invention.
  • the processing performed in the other embodiment is different from the above embodiment only in the processing of step ST16 in the flowchart shown in FIG. Therefore, only the process of step ST16 in the flowchart shown in FIG. 8 will be described here.
  • the photographed image determination unit 22 sets one other photographed image G1 to be compared with the target photographed image G0 (step ST30).
  • The other captured images G1 may be set in order, starting from the captured image closest to the target captured image G0.
  • The captured image determination unit 22 calculates an index value representing the similarity between the small areas A0 and A1 in the target captured image G0 and the other captured image G1, and determines whether the index value is larger than the threshold Th1 (step ST32). If step ST32 is affirmed, the captured image determination unit 22 determines that the target captured image G0 is of low quality (step ST34), and the process proceeds to step ST18 of FIG. 8.
  • If step ST32 is negative, the captured image determination unit 22 determines whether the comparison has been completed for all the other captured images G1 (step ST36). If step ST36 is negative, the next captured image is set as the other captured image G1 to be compared with the target captured image G0 (step ST38), and the process returns to step ST32. If step ST36 is affirmed, the captured image determination unit 22 determines that the target captured image G0 is not of low quality (step ST40), and the process proceeds to step ST20 of FIG. 8.
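The loop of steps ST30 through ST40 can be expressed compactly: the target is compared against the other captured images one by one, nearest first, and is judged low quality as soon as any single comparison exceeds Th1. The `index_value` callable is an assumed stand-in for whatever similarity measure the device uses.

```python
def judge_against_neighbors(target, others, index_value, th1):
    """Sketch of steps ST30-ST40: iterate over the other captured images
    (assumed pre-sorted nearest first). Exceeding Th1 against any one of
    them judges the target low quality (ST34); exhausting the list without
    exceeding the threshold judges it not low quality (ST40)."""
    for other in others:                      # ST30 / ST38: set next image
        if index_value(target, other) > th1:  # ST32: threshold comparison
            return True                       # ST34: low quality
    return False                              # ST40: not low quality
```

Early return on the first positive comparison matches the flowchart, which jumps straight to ST18 without examining the remaining images.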
  • In the above embodiment, whether the target captured image is of low quality is determined after the captured images of all the observation areas have been acquired. However, each time a captured image is acquired, it may instead be determined whether that captured image is of low quality. In this case, captured images that have already been acquired may be used as the other captured images G1.
  • In the above embodiments, the index value is calculated using the adjacent small areas of the target captured image G0 and the other captured images to determine whether the target captured image G0 is of low quality. However, the index value may be calculated using the entire areas of the target captured image G0 and the other captured images, in particular the entire cell areas.

Abstract

An imaging processing device, a control method for an imaging processing device, and an imaging processing program that make it possible to perform more accurate and highly reliable evaluations, even when the quality of captured images of observed areas inside a container is low. According to the present invention, an area detection unit 21 detects an observation target area from a plurality of captured images. A captured image determination unit 22 determines whether a target captured image from among the plurality of captured images is low quality on the basis of the observation target area as detected in the target captured image and the observation target area as detected in at least one other captured image that is close to the target captured image. When it has been determined that the target captured image is low quality, a processing control unit 23 controls imaging of the target captured image and/or image processing of the target captured image.

Description

Imaging processing device, control method for imaging processing device, and imaging processing program
 The present invention relates to an imaging processing device that observes an image of an entire observation target by relatively moving a stage, on which a container containing the observation target is placed, and an imaging optical system that forms an image of the observation target, and also relates to a control method for the imaging processing device and an imaging processing program.
 Pluripotent stem cells such as ES (Embryonic Stem) cells and iPS (Induced Pluripotent Stem) cells have the ability to differentiate into cells of various tissues, and have attracted attention as being applicable to regenerative medicine, drug development, the elucidation of diseases, and the like.
 A method has been proposed in which pluripotent stem cells such as ES cells and iPS cells, or cells induced to differentiate, are imaged with a microscope or the like, and the differentiation state of the cells is evaluated by capturing the features of the image.
 On the other hand, when imaging cells with a microscope as described above, performing so-called tiling imaging has been proposed in order to acquire a high-magnification, wide-field image. Specifically, a method has been proposed in which each observation area in a well is scanned and photographed by moving a stage, on which a well plate or the like is placed, relative to the imaging optical system, and the captured images of the respective observation areas are then joined together to generate a composite image.
 Here, when each observation area in the well is scanned and photographed as described above, autofocus control is performed in each observation area. However, the focus position is not always optimal in every observation area; an autofocus error may occur, and the captured images of some observation areas may be blurred. Further, the amount of illumination light may fluctuate due to, for example, fluctuations in the voltage applied to the light source of the microscope apparatus, resulting in a dark captured image.
 For low-quality captured images such as blurred or dark images, the images of individual cells cannot be extracted with high accuracy. For this reason, if evaluation is performed using, for example, feature values indicating the state of individual cells, the accuracy of the evaluation result may be low and the result may be unreliable. That is, accurate evaluation results may not be obtained if a low-quality captured image is evaluated in the same manner as a captured image that is free of blur and has no brightness problem.
 For this reason, in tiling imaging, a method has been proposed in which, when the quality of the captured image of a certain observation area is low, that observation area is rephotographed (see Patent Document 1). By rephotographing as in the method described in Patent Document 1, a captured image that is not of low quality can be obtained, making it possible to obtain an accurate evaluation result. In the method described in Patent Document 1, whether a captured image is of low quality is determined as follows. An evaluation value for focus, an evaluation value for brightness, and the like are calculated for the target captured image and for a captured image adjacent to it. The continuity of the evaluation values of the two captured images is then judged. Specifically, when the difference between the evaluation value of the target captured image and that of the adjacent captured image is larger than a threshold value, the target captured image is determined to be a low-quality captured image.
JP 2016-125913 A
 In a container such as the well described above, cells are cultured in a culture medium. Therefore, regions where cells are present and regions of culture medium coexist in the container. Since no cells are present in the medium region, it has a uniform density as an image. For this reason, when an observation area contains culture medium, it is difficult to evaluate whether the captured image of that observation area is blurred. The container also contains cells that are not observation targets, such as floating cells, in addition to the cells to be observed. When floating cells or the like are included, the focal position may be set on the floating cells rather than on the target cells. In such a case as well, it is difficult to evaluate whether the captured image of the target observation area is blurred. Therefore, simply using evaluation values between adjacent captured images, as in the method described in Patent Document 1, does not make it possible to accurately determine whether a captured image is of low quality.
 The present invention has been made in view of the above circumstances, and an object of the present invention is to enable more accurate and more reliable evaluation even when the quality of the captured images of the observation areas in a container is low.
 An imaging processing device according to the present invention comprises: an observation device that includes a container containing an observation target and an imaging unit that photographs the observation target for each observation area smaller than the container, and that photographs the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area, thereby acquiring a plurality of captured images;
 an area detection unit that detects an area of the observation target from the captured images;
 a captured image determination unit that determines whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and
 a processing control unit that controls at least one of photographing of the target captured image and image processing of the target captured image when the target captured image is determined to be of low quality.
 The "at least one other captured image in proximity" means a captured image within a predetermined range based on the target captured image. For example, one or more captured images adjacent to the target captured image can be regarded as the "at least one other captured image in proximity." In addition to, or instead of, the one or more captured images adjacent to the target captured image, one or more captured images that are further adjacent to those adjacent captured images, within the predetermined range, can be included in the "at least one other captured image in proximity."
 "Low quality" means that the quality is low compared with other captured images. Specifically, it means that the target captured image is blurred or dark compared with other captured images, resulting in a quality at which accurate evaluation cannot be performed.
 In the imaging processing device according to the present invention, the processing control unit may rephotograph the observation area corresponding to the target captured image when the target captured image is determined to be of low quality.
 In the imaging processing device according to the present invention, the processing control unit may issue a notification that the target captured image is of low quality when the target captured image is determined to be of low quality.
 When the notification is issued, the operator performs processing for rephotographing the target captured image. For this reason, in the present invention, "notification" is included in the control of photographing.
 In the imaging processing device according to the present invention, the processing control unit may perform at least one of brightness correction processing and sharpness enhancement processing on the target captured image when the target captured image is determined to be of low quality.
 In the imaging processing device according to the present invention, the captured image determination unit may determine whether the target captured image is of low quality based on the similarity of the detected areas of the observation target.
 In the imaging processing device according to the present invention, the captured image determination unit may determine whether the target captured image is of low quality based on partial areas within the areas of the observation target detected in the target captured image and the other captured images.
 In the imaging processing device according to the present invention, the imaging processing device may acquire the plurality of captured images with parts of the observation areas overlapping, and the captured image determination unit may determine whether the target captured image is of low quality based on the overlapping areas in each of the target captured image and the other captured images.
 In the imaging processing device according to the present invention, when the target captured image is determined to be of low quality, the processing control unit may replace the overlapping area in the target captured image with the overlapping area in the other captured image.
 A control method according to the present invention is a control method for an imaging processing device provided with an observation device that includes a container containing an observation target and an imaging unit that photographs the observation target for each observation area smaller than the container, and that photographs the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area, thereby acquiring a plurality of captured images, the method comprising:
 detecting an area of the observation target from the captured images;
 determining whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and
 controlling at least one of photographing of the target captured image and image processing of the target captured image when the target captured image is determined to be of low quality.
 An imaging processing program according to the present invention causes a computer to execute a control method for an imaging processing device provided with an observation device that includes a container containing an observation target and an imaging unit that photographs the observation target for each observation area smaller than the container, and that photographs the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area, thereby acquiring a plurality of captured images, the program causing the computer to execute:
 a procedure of detecting an area of the observation target from the captured images;
 a procedure of determining whether a target captured image among the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and
 a procedure of controlling at least one of photographing of the target captured image and image processing of the target captured image when the target captured image is determined to be of low quality.
 According to the present invention, the area of the observation target is detected from a plurality of captured images, and whether a target captured image among the plurality of captured images is of low quality is determined based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image. Therefore, it is possible to accurately determine whether the target captured image is of low quality without being affected by areas other than the observation target in the captured image. Further, in the present invention, when the target captured image is determined to be of low quality, at least one of photographing of the target captured image and image processing of the target captured image is controlled. Therefore, by generating a composite image using the target captured image for which at least one of photographing and image processing has been controlled, a highly reliable evaluation of the observation target can be performed.
A block diagram showing a schematic configuration of a microscope observation system using an embodiment of the imaging processing device of the present invention
A diagram showing the scanning locus of each observation area in a well plate
A diagram showing detection results of a cell area
A diagram showing an example of a captured image of each observation area in a well
A diagram showing an example of a captured image of each observation area in a well
A diagram showing an example of adjacent target captured images and other captured images
A diagram showing an example of adjacent target captured images and other captured images
A flowchart showing processing performed in the present embodiment
A diagram for explaining overlapping of parts of the observation areas
A diagram for explaining setting of the overlapping areas in the target captured image and another captured image as small areas
A diagram for explaining replacement of a small area in the target captured image with a small area in another captured image
A diagram for explaining other captured images in proximity to the target captured image
A flowchart showing processing performed in another embodiment of the present invention
 Hereinafter, a microscope observation system using an embodiment of the imaging processing device, the control method for the imaging processing device, and the imaging processing program of the present invention will be described in detail with reference to the drawings. FIG. 1 is a block diagram showing a schematic configuration of the microscope observation system of the present embodiment.
 As shown in FIG. 1, the microscope observation system of the present embodiment includes a microscope apparatus 10, an imaging processing device 20, a display device 30, and an input device 40. The microscope apparatus 10 corresponds to the observation device of the present invention.
 The microscope apparatus 10 photographs the cells contained in the culture vessel and outputs a captured image. In the present embodiment, specifically, a phase-contrast microscope apparatus including an imaging element such as a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor is used. As the imaging element, one provided with RGB (Red Green Blue) color filters may be used, or a monochrome imaging element may be used. A phase-contrast image of the cells contained in the culture vessel is formed on the imaging element, and the imaging element outputs the phase-contrast image as a captured image. The microscope apparatus 10 is not limited to a phase-contrast microscope apparatus; other microscope apparatuses such as a differential interference contrast microscope apparatus or a bright-field microscope apparatus may be used.
The imaging target may be a cell colony in which a plurality of cells are aggregated, or a plurality of dispersedly distributed cells. Cells to be imaged include, for example, pluripotent stem cells such as iPS cells and ES cells; nerve, skin, myocardial, and liver cells differentiated from stem cells; and cells of organs extracted from the human body, as well as cancer cells.
In the present embodiment, a well plate having a plurality of wells is used as the culture vessel. When a well plate is used, each well corresponds to the container of the present invention. The microscope device 10 includes a stage on which the well plate is placed. The stage moves in the X and Y directions, which are orthogonal to each other in a horizontal plane. By this movement of the stage, each observation region within each well of the well plate is scanned, and a captured image is taken for each observation region. The captured image of each observation region is output to the imaging processing device 20.
FIG. 2 shows, by a solid line Sc, the scanning trajectory of the observation regions when a well plate 50 having six wells 51 is used. As shown in FIG. 2, each observation region in the well plate 50 is scanned along the solid line Sc from a scanning start point S to a scanning end point E by the movement of the stage in the X and Y directions.
Further, in the present embodiment, autofocus control is performed in each observation region in the well by moving, in the vertical direction, the stage or an imaging optical system that forms the phase-contrast image of the cells on the imaging element.
In the present embodiment, the captured image of each observation region in the well is taken by moving the stage; however, the present invention is not limited to this, and the captured image of each observation region may be taken by moving the imaging optical system relative to the stage. Alternatively, both the stage and the imaging optical system may be moved.
Further, although a well plate is used in the present embodiment, the container for containing the cells is not limited to this; other containers such as a petri dish may be used.
As shown in FIG. 1, the imaging processing device 20 includes a region detection unit 21, a captured image determination unit 22, a processing control unit 23, and a display control unit 24. The imaging processing device 20 is constituted by a computer including a central processing unit, a semiconductor memory, a hard disk, and the like, and an embodiment of the imaging processing program of the present invention is installed on the hard disk. When this imaging processing program is executed by the central processing unit, the region detection unit 21, the captured image determination unit 22, the processing control unit 23, and the display control unit 24 shown in FIG. 1 function. In the present embodiment, the functions of the respective units are executed by the imaging processing program; however, the present invention is not limited to this, and the functions of the respective units may be executed by appropriately combining, for example, a plurality of ICs (Integrated Circuits), processors, ASICs (Application Specific Integrated Circuits), FPGAs (Field-Programmable Gate Arrays), memories, and the like.
The region detection unit 21 detects an observation target region, that is, a cell region, from a captured image acquired by the microscope device 10. The region detection unit 21 has, for example, a discriminator that determines whether each pixel of the captured image represents a cell or the culture medium, and detects, as the cell region, the region consisting of the pixels determined by the discriminator to belong to cells. The discriminator is generated by machine learning using, as teacher data, captured images of cell regions and captured images of medium regions, with the output being the determination result of whether an input is a captured image of a cell region or a captured image of a medium region. A known machine learning method can be used; for example, a support vector machine (SVM), a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), or a denoising stacked autoencoder (DSA) can be used.
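By way of illustration only (not part of the disclosed embodiment), the per-pixel cell/medium discrimination of the region detection unit 21 can be sketched as follows. The feature (local standard deviation) and the nearest-centroid classifier are simplistic stand-ins for a machine-learned discriminator such as an SVM or CNN, and all names and values are hypothetical:

```python
import numpy as np

def local_std(img, win=3):
    """Local standard deviation as a single per-pixel texture feature:
    cell regions are textured while the medium is nearly flat. A crude
    stand-in for the features a real discriminator would learn."""
    img = np.asarray(img, dtype=float)
    pad = win // 2
    p = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + win, j:j + win].std()
    return out

class NearestCentroidDiscriminator:
    """Toy replacement for the machine-learned discriminator of the region
    detection unit 21: labels each pixel cell (1) or medium (0) by distance
    to per-class mean feature values fitted from teacher data."""
    def fit(self, feats, labels):
        feats, labels = np.asarray(feats), np.asarray(labels)
        self.c0 = feats[labels == 0].mean()
        self.c1 = feats[labels == 1].mean()
        return self

    def predict(self, feats):
        feats = np.asarray(feats)
        return (np.abs(feats - self.c1) < np.abs(feats - self.c0)).astype(int)

def detect_cell_region(img, disc, win=3):
    """Boolean mask of the detected cell region in one captured image."""
    return disc.predict(local_std(img, win)).astype(bool)

# Teacher data: a noisy 'cell' texture and a nearly flat 'medium' patch.
rng = np.random.default_rng(0)
cell_patch = rng.normal(0.5, 0.2, (16, 16))
medium_patch = 0.5 + rng.normal(0.0, 0.005, (16, 16))
feats = np.concatenate([local_std(cell_patch).ravel(),
                        local_std(medium_patch).ravel()])
labels = np.concatenate([np.ones(256), np.zeros(256)])
disc = NearestCentroidDiscriminator().fit(feats, labels)
```

With this toy discriminator, `detect_cell_region(cell_patch, disc)` marks nearly the whole textured patch as cell region while the flat medium patch yields an empty mask.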
Here, in addition to the cells to be observed, cells that are not observation targets, such as floating cells suspended in the culture medium, are present in the wells. However, by training the discriminator with the cells to be observed as teacher data, only the cell region consisting of the cells to be observed can be detected, without detecting floating cells and the like.
FIG. 3 shows the detection result of cell regions. In FIG. 3, each of the rectangular divided areas corresponds to one observation region, and the cell regions detected in the captured image of each observation region are hatched.
For a captured image in which no cell region is detected by the region detection unit 21, the processing performed by the captured image determination unit 22 and the processing control unit 23 described below is omitted.
In the present embodiment, the cell region in the captured image is detected using the region detection unit 21 provided with the machine-learned discriminator as described above; however, the method of determining the cell region is not limited to this. For example, edges may be detected from the captured image and the determination made based on the amount of edges, the determination may be made from the pair of the maximum and minimum pixel values, or the determination may be made by analyzing the spatial frequency components of the captured image.
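As one possible reading of the edge-amount alternative (not specified further in the publication), a block could be classified as cell region when the fraction of high-gradient pixels in it is large. All thresholds below are illustrative:

```python
import numpy as np

def detect_cell_region_by_edges(img, edge_th=0.05, amount_th=0.2, win=8):
    """Edge-amount-based cell region detection sketch: tile the image into
    win x win blocks and mark a block as cell region when the fraction of
    pixels with large gradient magnitude exceeds amount_th. Threshold
    values are illustrative, not taken from the publication."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)                  # per-axis first derivatives
    edges = np.hypot(gx, gy) > edge_th         # 'edge' pixels
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    for i in range(0, h, win):
        for j in range(0, w, win):
            block = edges[i:i + win, j:j + win]
            mask[i:i + win, j:j + win] = block.mean() > amount_th
    return mask
```

Applied to an image whose left half is textured (cell-like) and whose right half is flat (medium-like), the mask covers the left half only.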
The captured image determination unit 22 determines whether a target captured image, which is the determination target among the plurality of captured images, is of low quality. Specifically, it determines whether the target captured image is of low quality based on the cell region detected in the target captured image and the cell region detected in at least one other captured image adjacent to the target captured image.
In the present embodiment, as described above, each observation region in the well is scanned and autofocus control is performed in each observation region. When autofocus control is performed, the optimal focus position is not always obtained in every observation region; an autofocus error may occur, and the captured images of some observation regions may be blurred and of low quality. If such a blurred captured image is evaluated in the same way as the other, unblurred captured images, an accurate evaluation result may not be obtained. FIG. 4 shows an example of a composite image obtained by combining the captured images of the observation regions in a well. Each of the rectangular divided areas in FIG. 4 corresponds to one observation region, and in the example shown in FIG. 4, the captured image of the observation region indicated by the dotted square is blurred. In FIG. 4, a coordinate system is set with its origin at the upper left corner of the composite image, the leftward direction of the page as the X direction, and the downward direction of the page as the Y direction. Each captured image can therefore be represented in a two-dimensional coordinate system; for example, the coordinates of the captured image of the observation region indicated by the dotted square in FIG. 4 can be expressed as (4, 3).
The cause of a low-quality captured image is not limited to autofocus errors. For example, the amount of illumination light may fluctuate due to fluctuations in the voltage applied to the light source of the microscope device 10 or due to vibration of the stage and the optical system, and the captured image may become dark. If such a dark captured image is evaluated in the same way as the other captured images taken with a normal light amount, an accurate evaluation result may not be obtained. FIG. 5 shows an example of a composite image obtained by combining the observation regions in a well. Each of the rectangular divided areas in FIG. 5 corresponds to one observation region, and in the example shown in FIG. 5, the captured image of the observation region indicated by the dotted square has become low quality due to fluctuation of the illumination light amount. In FIG. 5, the coordinate system is set in the same way as in FIG. 4; the coordinates of the captured image of the observation region indicated by the dotted square in FIG. 5 can therefore be expressed as (4, 4).
The captured image determination unit 22 determines whether the captured image of the target observation region is of low quality. For this purpose, the captured image determination unit 22 sets at least one other captured image adjacent to the target captured image. In the present embodiment, the one captured image on the downstream side of the target captured image in the scanning direction shown in FIG. 2 is set as the other captured image adjacent to the target captured image. For example, when the coordinates of the target captured image are (4, 3), the coordinates of the other captured image are (5, 3); when the coordinates of the target captured image are (5, 3), the coordinates of the other captured image are (5, 4). For the captured image at coordinates (5, 5), which is captured last, either no determination is made, or the determination is performed using a captured image other than the captured image at coordinates (4, 5) captured before it, for example the captured image at coordinates (5, 4).
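The downstream-neighbor selection can be made concrete under an assumption about the scan pattern. The sketch below assumes a serpentine row scan in which odd-numbered rows are traversed left to right; this direction is inferred from the coordinate examples above and is not stated in the publication, whose actual trajectory is fixed by FIG. 2:

```python
def downstream_neighbor(x, y, nx, ny):
    """Coordinates of the captured image taken immediately after (x, y) in
    a serpentine row scan over an nx-by-ny grid, or None for the last
    image. The row direction (odd rows left-to-right) is an assumption."""
    step = 1 if y % 2 == 1 else -1
    if 0 <= x + step < nx:
        return (x + step, y)      # continue along the current row
    if y + 1 < ny:
        return (x, y + 1)         # turn into the next row
    return None                   # last image of the scan
```

Under this assumption, the neighbor of (4, 3) is (5, 3), the neighbor of (5, 3) is (5, 4), and the last image (5, 5) of a 6-by-6 grid has no downstream neighbor, matching the examples above.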
Then, the captured image determination unit 22 determines whether the target captured image is of low quality based on the adjacent partial cell regions in each of the target captured image and the other captured image. FIG. 6 shows an example of a target captured image and an adjacent other captured image. In FIG. 6, it is assumed that the entire areas of the target captured image G0 and the other captured image G1 are cell regions. As shown in FIG. 6, the captured image determination unit 22 sets mutually adjacent small regions A0 and A1 in the target captured image G0 and the other captured image G1, respectively. In the composite image shown in FIG. 4, there are cases, such as when the coordinates of the target captured image G0 are (3, 2) and those of the other captured image G1 are (4, 2), in which the entire areas of the target captured image G0 and the other captured image G1 are not cell regions. In such a case, as shown in FIG. 7, the captured image determination unit 22 sets the small regions A0 and A1 so as to include only the cell regions in the target captured image G0 and the other captured image G1.
Then, the captured image determination unit 22 determines the similarity between the small regions A0 and A1. For this purpose, the captured image determination unit 22 calculates an index value for determining the similarity. As the index value, the mean squared error of the pixel values at corresponding pixel positions when the small regions A0 and A1 are superimposed is used. The index value is not limited to the mean squared error; for example, the difference between the average luminance of the small region A0 and that of the small region A1, or the difference between the range from the minimum to the maximum pixel value of the small region A0 and that of the small region A1, can also be used.
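The index values described above can be sketched directly in a few lines; the regions are assumed here to be 2-D arrays of equal shape:

```python
import numpy as np

def similarity_index(a0, a1):
    """Index value of the embodiment: mean squared error between the pixel
    values of the superimposed small regions A0 and A1."""
    a0, a1 = np.asarray(a0, dtype=float), np.asarray(a1, dtype=float)
    return float(np.mean((a0 - a1) ** 2))

def mean_luminance_difference(a0, a1):
    """One alternative index: difference of the average luminances."""
    return float(abs(np.mean(a0) - np.mean(a1)))

def range_difference(a0, a1):
    """Another alternative index: difference of the min-to-max pixel ranges."""
    a0, a1 = np.asarray(a0, dtype=float), np.asarray(a1, dtype=float)
    return float(abs((a0.max() - a0.min()) - (a1.max() - a1.min())))
```

Identical regions yield an index value of zero; the more dissimilar the regions, the larger the value.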
Here, since the small region A0 of the target captured image G0 and the small region A1 of the other captured image G1 are adjacent, they should inherently be similar. However, if the target captured image G0 is blurred or dark as described above, the small regions A0 and A1 become dissimilar, and the index value calculated as described above becomes large. Therefore, the captured image determination unit 22 determines whether the index value is larger than a predetermined threshold value Th1, and when the index value is determined to be larger than the threshold value Th1, determines that the target captured image G0 is of low quality.
When the target captured image G0 is determined to be of low quality, the captured image determination unit 22 calculates the maximum first-derivative value of the small region A0 of the target captured image G0 and the maximum first-derivative value of the small region A1 of the other captured image G1. Here, when the target captured image G0 is dark, the first-derivative values do not become particularly small, but when the target captured image G0 is blurred, the first-derivative values become small. Therefore, when the target captured image G0 is determined to be of low quality, the captured image determination unit 22 calculates the absolute value of the difference between the maximum first-derivative value of the small region A0 of the target captured image G0 and that of the small region A1 of the other captured image G1. When this difference value is larger than a predetermined threshold value Th2, the target captured image G0 is determined to be a blurred captured image; when the difference value is equal to or smaller than the threshold value Th2, the target captured image G0 is determined to be a dark captured image.
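The blur-versus-darkness discrimination can be sketched as follows. Reading "first derivative" as simple pixel differences is one interpretation; the publication does not fix the operator:

```python
import numpy as np

def max_first_derivative(region):
    """Maximum absolute first derivative over horizontal and vertical
    pixel differences (one simple reading of 'first-derivative value')."""
    r = np.asarray(region, dtype=float)
    dx = np.abs(np.diff(r, axis=1)).max()
    dy = np.abs(np.diff(r, axis=0)).max()
    return max(dx, dy)

def classify_low_quality(a0, a1, th2):
    """Distinguish a blurred from a dark low-quality target image: a large
    gap between the maximum first derivatives of A0 and A1 indicates blur,
    since darkening preserves edges while blurring suppresses them."""
    diff = abs(max_first_derivative(a0) - max_first_derivative(a1))
    return "blurred" if diff > th2 else "dark"
```

For example, against a sharp neighbor region, a smoothed region yields a large derivative gap ("blurred"), whereas a darkened but still sharp region yields a small gap ("dark").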
When the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of the imaging of the target captured image G0 and the image processing applied to it. Control of imaging and of image processing is described below.
First, regarding control of imaging, the processing control unit 23 causes the observation region of the target captured image G0 determined to be of low quality to be re-imaged. For example, when the coordinates of the target captured image G0 determined to be of low quality are (4, 3) as shown in FIG. 4, the processing control unit 23 instructs the microscope device 10 to re-image the imaging region corresponding to the coordinates (4, 3). At this time, when the target captured image G0 has been determined to be a dark captured image, the microscope device 10 is instructed to perform imaging with an appropriate brightness; on the other hand, when the target captured image G0 has been determined to be a blurred captured image, the microscope device 10 is instructed so that the observation region from which the target captured image G0 was acquired is appropriately brought into focus.
When the target captured image G0 is determined to be of low quality, the processing control unit 23 may issue a notification to that effect. For example, in the present embodiment, as described later, the display control unit 24 displays on the display device 30 a composite image obtained by joining the plurality of captured images; in the composite image displayed on the display device 30, a captured image determined to be of low quality may be displayed with emphasis. For example, in the composite image shown in FIG. 4, a red or similar frame may be added to the region surrounded by the dotted line, the frame may be made to blink, or the region itself may be made to blink. Notification may also be made by text or sound.
Next, control of image processing will be described. The processing control unit 23 performs brightness correction processing on a target captured image G0 determined to be a dark captured image. Specifically, the pixel values are corrected so as to increase the brightness of all pixel values of the target captured image G0 such that the average pixel value of the small region A0 of the target captured image G0 matches the average pixel value of the small region A1 of the other captured image G1.
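The brightness correction can be sketched as an additive offset; the publication does not fix whether the matching is done by an offset or a gain, so the additive form below is one assumed realization:

```python
import numpy as np

def brightness_correct(g0, a0, a1):
    """Shift every pixel of the dark target image G0 so that the mean of
    its small region A0 matches the mean of the adjacent small region A1.
    The additive offset is an assumption; a multiplicative gain would be
    an equally valid reading of the embodiment."""
    offset = np.mean(np.asarray(a1, dtype=float)) - np.mean(np.asarray(a0, dtype=float))
    return np.asarray(g0, dtype=float) + offset
```

For example, if region A0 averages 1.5 and region A1 averages 3.5, every pixel of G0 is raised by 2.0.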
On the other hand, the processing control unit 23 performs sharpness enhancement processing on a target captured image G0 determined to be a blurred captured image. Specifically, the sharpness enhancement processing is performed with the degree of sharpness enhancement changed according to the magnitude of the absolute value of the difference between the maximum first-derivative value of the small region A0 of the target captured image G0 and that of the small region A1 of the other captured image G1. In the present embodiment, a table associating difference values with degrees of sharpness enhancement is stored on the hard disk of the imaging processing device 20. The processing control unit 23 refers to this table to obtain the degree of sharpness enhancement corresponding to the difference value, and performs the sharpness enhancement processing on the target captured image G0.
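The table lookup and the enhancement itself can be sketched as follows. The table values are purely illustrative (the actual stored values are not given), and unsharp masking is one common realization of sharpness enhancement, not a filter named in the publication:

```python
import numpy as np

# Illustrative (difference upper bound -> enhancement strength) pairs; the
# actual table stored on the hard disk is not disclosed.
STRENGTH_TABLE = [(0.2, 0.5), (0.5, 1.0), (1.0, 2.0)]

def lookup_strength(diff, table=STRENGTH_TABLE):
    """Degree of sharpness enhancement for a derivative-difference value."""
    for upper, strength in table:
        if diff <= upper:
            return strength
    return table[-1][1]

def sharpen(img, strength):
    """Unsharp masking with a 3x3 box blur: add back a scaled copy of the
    high-frequency residual (image minus its blur)."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")
    blur = sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return img + strength * (img - blur)
```

A larger derivative difference (stronger blur) maps to a larger strength; on a perfectly flat image the residual is zero and the output is unchanged.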
The display control unit 24 generates a composite image by joining the plurality of captured images, and causes the display device 30 to display the generated composite image. Further, when a captured image determined to be of low quality exists, the display control unit 24 causes the display device 30 to display a notification to that effect, the position of the low-quality captured image in the composite image, or the like.
The display device 30 displays the composite image, notifications, and the like as described above, and includes, for example, a liquid crystal display. The display device 30 may also be configured as a touch panel and double as the input device 40.
The input device 40 includes a mouse, a keyboard, and the like, and receives various setting inputs from the user.
Next, the operation of the microscope observation system of the present embodiment will be described with reference to the flowchart shown in FIG. 8.
First, a well plate containing cells and culture medium is placed on the stage of the microscope device 10 (step ST10). Then, by moving the stage in the X and Y directions, the observation regions in each well of the well plate are scanned, and captured images of all the observation regions are taken (step ST12). The captured images taken by the microscope device 10 are sequentially input to the region detection unit 21 of the imaging processing device 20, and the region detection unit 21 detects cell regions from the captured images (step ST14).
Next, the captured image determination unit 22 determines whether the target captured image G0 among the plurality of captured images is of low quality (step ST16). When the target captured image G0 is determined to be of low quality, the processing control unit 23 controls at least one of the imaging of the target captured image G0 and the image processing applied to it (processing control: step ST18). Specifically, it performs re-imaging of the observation region corresponding to the target captured image G0, notification that the target captured image G0 is a low-quality captured image, brightness correction processing on the target captured image G0, or sharpness enhancement processing on the target captured image G0. When the target captured image G0 is not of low quality, the process proceeds to step ST20.
The processing of steps ST16 and ST18 is repeated until the determination of all the observation regions is completed (ST20, NO). When the determination of all the captured images is completed (ST20, YES), the display control unit 24 joins all the captured images to generate a composite image, displays the generated composite image on the display device 30 (step ST22), and the processing ends.
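The loop of steps ST16 and ST18 can be sketched end to end under simplifying assumptions: whole images stand in for the small regions A0/A1, the "other" image is taken at (x+1, y), and the thresholds th1/th2 are illustrative:

```python
import numpy as np

def process_well(images, th1, th2):
    """Quality judgment for every captured image of one well.
    images: dict mapping (x, y) grid coordinates to 2-D float arrays.
    Returns per-image labels: 'ok', 'blurred', 'dark', or 'not judged'
    when no downstream neighbor exists (simplified neighbor choice)."""
    labels = {}
    for (x, y), g0 in images.items():
        g1 = images.get((x + 1, y))
        if g1 is None:
            labels[(x, y)] = "not judged"
            continue
        if np.mean((g0 - g1) ** 2) <= th1:    # similarity index vs. Th1
            labels[(x, y)] = "ok"
            continue
        d0 = np.abs(np.diff(g0)).max()        # max first derivative of G0
        d1 = np.abs(np.diff(g1)).max()        # max first derivative of G1
        labels[(x, y)] = "blurred" if abs(d0 - d1) > th2 else "dark"
    return labels
```

A blurred (flat) image next to a sharp neighbor is flagged "blurred", while identical neighbors are labelled "ok".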
As described above, in the present embodiment, the cell region to be observed is detected from the plurality of captured images, and whether the target captured image G0 is of low quality is determined based on the cell region detected in the target captured image G0 among the plurality of captured images and the cell region detected in at least one other captured image G1 adjacent to the target captured image G0. Therefore, whether the target captured image G0 is of low quality can be determined accurately without being affected by regions other than cells in the captured images. Further, in the present embodiment, when the target captured image G0 is determined to be of low quality, at least one of the imaging of the target captured image G0 and the image processing applied to it is controlled. Therefore, by generating the composite image using a target captured image G0 for which at least one of imaging and image processing has been controlled, a highly reliable evaluation of the observation target can be performed.
When the adjacent small regions A0 and A1 of the target captured image G0 and a single other captured image G1 are compared as in the above embodiment, the index value becomes large both when the target captured image G0 is of low quality and the other captured image G1 is not, and when the target captured image G0 is not of low quality and the other captured image G1 is. Therefore, it may not be possible to determine accurately which of the target captured image G0 and the other captured image G1 is of low quality. Accordingly, a plurality of other captured images G1 may be used. For example, the two captured images to the left and right of the target captured image G0, the four captured images above, below, to the left, and to the right of it, the four captured images at its upper left, upper right, lower right, and lower left, or the eight captured images surrounding the target captured image G0 may be used as the other captured images. By using a plurality of other captured images G1 in this way, whether the target captured image G0 is of low quality can be determined more accurately.
In the above embodiment, the captured images may also be acquired by imaging the observation regions with partial overlap. FIG. 9 is a diagram for explaining the partial overlap of observation regions. As shown in FIG. 9, each captured image partially overlaps the adjacent captured images. The overlap range is preferably, for example, up to about 10% of the length of each side of the observation region. In FIG. 9, the overlapping regions are shown wider than this for the purpose of explanation.
When the captured images are acquired with the observation regions partially overlapping in this way, the captured image determination unit 22 may set the overlapping regions of the target captured image G0 and the other captured image G1 as the small regions A0 and A1, as shown in FIG. 10.
When the captured images are acquired with the observation regions partially overlapping and the target captured image G0 is determined to be of low quality, the other captured image G1 is not of low quality. Therefore, when the target captured image G0 is determined to be of low quality, the processing control unit 23 may replace the overlapping region of the target captured image G0 with the overlapping region of the other captured image G1.
That is, when the target captured image G0 shown in FIG. 10 is determined to be of low quality, the other captured image G1 is not of low quality; therefore, as shown in FIG. 11, the small region A0 of the target captured image G0 may be replaced with the small region A1 of the other captured image G1. In this case, when controlling the image processing applied to the target captured image G0, the processing control unit 23 only needs to perform at least one of the brightness correction processing and the sharpness enhancement processing on the region of the target captured image G0 other than the small region A0. The processing time required for the image processing can therefore be shortened.
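The replacement of the overlapping region can be sketched as follows, assuming for illustration that the overlap lies along the right edge of G0 and the left edge of the adjacent G1:

```python
import numpy as np

def replace_overlap(g0, g1, overlap):
    """Replace the overlapping strip of the low-quality target image G0
    (assumed here to lie along its right edge, `overlap` pixels wide)
    with the corresponding strip along the left edge of the adjacent
    image G1. Returns a corrected copy of G0."""
    out = np.array(g0, dtype=float)            # copy; G0 itself is untouched
    out[:, -overlap:] = np.asarray(g1, dtype=float)[:, :overlap]
    return out
```

Any subsequent brightness correction or sharpness enhancement then only needs to cover the columns outside the replaced strip.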
 Further, when the four captured images adjacent to the target captured image G0 are used as the other captured images G1 to G4, the four small regions on the upper, lower, left, and right sides of the target captured image G0 may be replaced with the small regions A1 to A4 of the other captured images G1 to G4 that overlap the target captured image G0, and at least one of the brightness correction process and the sharpness enhancement process may be performed only on the region of the target captured image G0 other than the small regions A1 to A4.
 In the above embodiment, the target captured image G0 is compared with another captured image G1 adjacent to it to determine whether the target captured image G0 is of low quality. Specifically, an index value representing the similarity between the small regions A0 and A1 of the target captured image G0 and the other captured image G1 is calculated, and the target captured image G0 is determined to be of low quality when the index value is larger than a threshold Th1. However, the index value is small not only when neither the target captured image G0 nor the other captured image G1 is of low quality, but also when both of them are of low quality. Therefore, when the index value is smaller than the threshold Th1, it may not be possible to determine accurately whether the target captured image G0 is of low quality.
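The index value and threshold test can be sketched as follows. The embodiment does not fix a concrete metric, so the mean absolute pixel difference used here, and the value of Th1, are illustrative assumptions; any measure that grows as the small regions A0 and A1 diverge would fit the same role.

```python
import numpy as np

def dissimilarity_index(region_a0, region_a1):
    """Index value for small regions A0/A1: large when the regions differ,
    small when they are similar (here: mean absolute pixel difference)."""
    a0 = np.asarray(region_a0, dtype=np.float64)
    a1 = np.asarray(region_a1, dtype=np.float64)
    return float(np.mean(np.abs(a0 - a1)))

def is_low_quality(region_a0, region_a1, th1):
    """G0 is judged low quality when the index value exceeds Th1.
    Note the limitation discussed above: a small index cannot tell
    'both images good' apart from 'both images bad'."""
    return dissimilarity_index(region_a0, region_a1) > th1
```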
 For this reason, not only the captured image adjacent to the target captured image G0 but also at least one captured image close to the target captured image G0 may be used as the other captured image G1 to determine whether the target captured image G0 is of low quality. This is described below as another embodiment. The configuration of the imaging processing device in this other embodiment is the same as that of the imaging processing device 20 shown in FIG. 1, so a detailed description of the device is omitted here.
 Here, as the other captured image G1 close to the target captured image G0, a captured image within a predetermined range based on the target captured image G0 can be used. Specifically, in addition to the plurality of captured images adjacent to the target captured image G0, one or more captured images further adjacent to those adjacent captured images can be used. For example, as shown in FIG. 12, the captured images within the 5 × 5 range centered on the target captured image G0 may be used as the other captured images G1 close to the target captured image G0. A captured image that is not adjacent to the target captured image G0 but lies within a predetermined range based on the target captured image G0 may also be used as another captured image G1 close to the target captured image G0; for example, the hatched captured images shown in FIG. 12 may be used. Alternatively, a captured image wholly or partly contained within a circle of a predetermined radius centered on the target captured image G0 may be used as another captured image G1 close to the target captured image G0.
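Selecting the nearby captured images can be sketched as enumerating grid tiles within a square window around the target, which for a radius of 2 gives the 5 × 5 range of FIG. 12. The grid-coordinate representation and the Chebyshev-distance ordering are assumptions of this sketch; a circular-radius criterion would simply swap the distance function.

```python
def nearby_tiles(target, grid_w, grid_h, radius=2):
    """Tiles within a (2*radius+1) x (2*radius+1) block centered on the
    target tile (radius=2 corresponds to the 5x5 range of FIG. 12),
    clipped to the grid; the target tile itself is excluded."""
    tx, ty = target
    tiles = []
    for y in range(max(0, ty - radius), min(grid_h, ty + radius + 1)):
        for x in range(max(0, tx - radius), min(grid_w, tx + radius + 1)):
            if (x, y) != (tx, ty):
                tiles.append((x, y))
    # sort by Chebyshev distance so that nearer tiles are compared first,
    # matching the ordering used in step ST30 of the other embodiment
    tiles.sort(key=lambda p: max(abs(p[0] - tx), abs(p[1] - ty)))
    return tiles
```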
 FIG. 13 is a flowchart showing the processing performed in the other embodiment of the present invention. The processing performed in the other embodiment differs from the above embodiment only in the processing of step ST16 in the flowchart shown in FIG. 8. Therefore, only the processing of step ST16 in the flowchart shown in FIG. 8 is described here.
 Following step ST14 shown in FIG. 8, the captured image determination unit 22 sets one other captured image G1 to be compared with the target captured image G0 (step ST30). The other captured images G1 may be set in order, starting from the captured image closest to the target captured image G0. Then, as in the above embodiment, the captured image determination unit 22 calculates the index value representing the similarity between the small regions A0 and A1 of the target captured image G0 and the other captured image G1, and determines whether the index value is larger than the threshold Th1 (step ST32). If step ST32 is affirmative, the captured image determination unit 22 determines that the target captured image G0 is of low quality (step ST34), and the processing proceeds to step ST18 in FIG. 8.
 If step ST32 is negative, the captured image determination unit 22 determines whether the comparison has been completed for all the other captured images G1 (step ST36). If step ST36 is negative, the next captured image is set as the other captured image G1 to be compared with the target captured image G0 (step ST38), and the processing returns to step ST32. If step ST36 is affirmative, the captured image determination unit 22 determines that the target captured image G0 is not of low quality (step ST40), and the processing proceeds to step ST20 in FIG. 8.
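The loop of steps ST30 to ST40 can be sketched as follows: the target's small region is compared against each nearby image's small region in order of proximity, and the first comparison whose index value exceeds Th1 ends the loop with a low-quality determination. The index function is passed in as a parameter here, which is an assumption of this sketch rather than part of the flowchart.

```python
def judge_target_low_quality(target_region, neighbor_regions, th1, index_fn):
    """Steps ST30-ST40 of FIG. 13: compare the target against each nearby
    captured image; if any index value exceeds Th1 the target is judged
    low quality, otherwise (all comparisons done) it is not."""
    for region in neighbor_regions:                  # ST30 / ST38: next G1
        if index_fn(target_region, region) > th1:    # ST32
            return True                              # ST34: low quality
    return False                                     # ST40: not low quality
```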
 In the above embodiment, whether the target captured image is of low quality is determined after the captured images of all the observation areas have been acquired; however, the determination may instead be made while the captured images of the individual observation areas are being acquired. In this case, when a plurality of other captured images G1 are used, the captured images that have already been acquired may be used as the other captured images G1.
 In the above embodiment, the index value is calculated using the adjacent small regions of the target captured image G0 and the other captured image to determine whether the target captured image G0 is of low quality; however, the index value may instead be calculated using the entire regions of the target captured image G0 and the other captured image, in particular the entire cell region.
DESCRIPTION OF SYMBOLS
  10  microscope apparatus
  20  imaging processing device
  21  area detection unit
  22  captured image determination unit
  23  processing control unit
  24  display control unit
  30  display device
  40  input device
  50  well plate
  51  well
  A0, A1  small regions
  E   scanning end point
  G0  target captured image
  G1  other captured image
  S   scanning start point
  Sc  solid line indicating the scanning locus

Claims (10)

  1.  An imaging processing device comprising: an observation device that includes a container containing an observation target and an imaging unit that images the observation target for each observation area smaller than the container, and that acquires a plurality of captured images by imaging the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area; an area detection unit that detects an area of the observation target from each captured image; a captured image determination unit that determines whether a target captured image of the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and a processing control unit that, when the target captured image is determined to be of low quality, controls at least one of imaging for the target captured image and image processing on the target captured image.
  2.  The imaging processing device according to claim 1, wherein, when the target captured image is determined to be of low quality, the processing control unit re-images the observation area corresponding to the target captured image.
  3.  The imaging processing device according to claim 1, wherein, when the target captured image is determined to be of low quality, the processing control unit issues a notification that the target captured image is of low quality.
  4.  The imaging processing device according to claim 1, wherein, when the target captured image is determined to be of low quality, the processing control unit performs at least one of a brightness correction process and a sharpness enhancement process on the target captured image.
  5.  The imaging processing device according to any one of claims 1 to 4, wherein the captured image determination unit determines whether the target captured image is of low quality based on the similarity of the detected areas of the observation target.
  6.  The imaging processing device according to any one of claims 1 to 5, wherein the captured image determination unit determines whether the target captured image is of low quality based on partial regions of the areas of the observation target detected in the target captured image and the other captured image, respectively.
  7.  The imaging processing device according to any one of claims 1 to 5, wherein the imaging processing device acquires the plurality of captured images with the observation areas partially overlapping, and the captured image determination unit determines whether the target captured image is of low quality based on the overlapping regions of the target captured image and the other captured image.
  8.  The imaging processing device according to claim 7, wherein, when the target captured image is determined to be of low quality, the processing control unit replaces the overlapping region of the target captured image with the overlapping region of the other captured image.
  9.  A control method for an imaging processing device including an observation device that includes a container containing an observation target and an imaging unit that images the observation target for each observation area smaller than the container, and that acquires a plurality of captured images by imaging the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area, the method comprising: detecting an area of the observation target from each captured image; determining whether a target captured image of the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and, when the target captured image is determined to be of low quality, controlling at least one of imaging for the target captured image and image processing on the target captured image.
  10.  An imaging processing program for causing a computer to execute a control method for an imaging processing device including an observation device that includes a container containing an observation target and an imaging unit that images the observation target for each observation area smaller than the container, and that acquires a plurality of captured images by imaging the container a plurality of times while moving at least one of the container and the imaging unit relative to the other to change the position of the observation area, the program causing the computer to execute: a procedure of detecting an area of the observation target from each captured image; a procedure of determining whether a target captured image of the plurality of captured images is of low quality, based on the area of the observation target detected in the target captured image and the area of the observation target detected in at least one other captured image close to the target captured image; and a procedure of, when the target captured image is determined to be of low quality, controlling at least one of imaging for the target captured image and image processing on the target captured image.
PCT/JP2018/029509 2017-08-31 2018-08-07 Imaging processing device, control method for imaging processing device, and imaging processing program WO2019044416A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-166908 2017-08-31
JP2017166908A JP2020202748A (en) 2017-08-31 2017-08-31 Photographing processing apparatus, control method of photographing processing apparatus, and photographing processing program

Publications (1)

Publication Number Publication Date
WO2019044416A1 true WO2019044416A1 (en) 2019-03-07

Family

ID=65525333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/029509 WO2019044416A1 (en) 2017-08-31 2018-08-07 Imaging processing device, control method for imaging processing device, and imaging processing program

Country Status (2)

Country Link
JP (1) JP2020202748A (en)
WO (1) WO2019044416A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020188813A1 (en) * 2019-03-20 2020-09-24 株式会社島津製作所 Cell analysis device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008137667A1 (en) * 2007-05-04 2008-11-13 Aperio Technologies, Inc. System and method for quality assurance in pathology
WO2010111656A2 (en) * 2009-03-27 2010-09-30 Life Technologies Corporation Systems and methods for assessing images
WO2011127361A2 (en) * 2010-04-08 2011-10-13 Omnyx LLC Image quality assessment including comparison of overlapped margins
WO2013183562A1 (en) * 2012-06-04 2013-12-12 大日本印刷株式会社 Culture-medium information registration system, communication terminal, program, health management system, and film-type culture medium
JP2014178357A (en) * 2013-03-13 2014-09-25 Sony Corp Digital microscope device, imaging method of the same and program
JP2016125913A (en) * 2015-01-05 2016-07-11 キヤノン株式会社 Image acquisition device and control method of image acquisition device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020188813A1 (en) * 2019-03-20 2020-09-24 株式会社島津製作所 Cell analysis device
JPWO2020188813A1 (en) * 2019-03-20 2021-12-16 株式会社島津製作所 Cell analyzer
JP7006832B2 (en) 2019-03-20 2022-01-24 株式会社島津製作所 Cell analyzer

Also Published As

Publication number Publication date
JP2020202748A (en) 2020-12-24

Similar Documents

Publication Publication Date Title
US9088729B2 (en) Imaging apparatus and method of controlling same
US8830313B2 (en) Information processing apparatus, stage-undulation correcting method, program therefor
US20140098213A1 (en) Imaging system and control method for same
US10613313B2 (en) Microscopy system, microscopy method, and computer-readable recording medium
JP2012003214A (en) Information processor, information processing method, program, imaging device and imaging device having light microscope
JP2016125913A (en) Image acquisition device and control method of image acquisition device
WO2019181053A1 (en) Device, method, and program for measuring defocus amount, and discriminator
WO2018003181A1 (en) Imaging device and method and imaging control program
JP2013132027A (en) Image processing apparatus, imaging apparatus, microscope system, image processing method, and image processing program
US11209637B2 (en) Observation device, observation control method, and observation control program that control acceleration of a moveable stage having an installed subject vessel
CN109001902A (en) Microscope focus method based on image co-registration
US10659694B2 (en) Imaging device, imaging method and imaging device control program
JP2015156011A (en) Image acquisition device and method for controlling the same
JP2017055916A (en) Image generation apparatus, image generation method, and program
WO2019044416A1 (en) Imaging processing device, control method for imaging processing device, and imaging processing program
US11756190B2 (en) Cell image evaluation device, method, and program
JP2013246052A (en) Distance measuring apparatus
US20200192059A1 (en) Imaging control apparatus, method, and program
JP6499506B2 (en) Imaging apparatus and method, and imaging control program
US20130016192A1 (en) Image processing device and image display system
JP2023033982A (en) Image processing device, image processing system, sharpening method of image, and program
JP6534294B2 (en) Imaging apparatus and method, and imaging control program
WO2019098018A1 (en) Observation device and method, and observation device control program
JP2016206228A (en) Focused position detection device, focused position detection method, imaging device and imaging system
JP2014149381A (en) Image acquisition apparatus and image acquisition method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18849840

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18849840

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP