WO2019176012A1 - Image processing method, image processing device, user interface device, image processing system and server - Google Patents


Info

Publication number
WO2019176012A1
Authority
WO
WIPO (PCT)
Prior art keywords
image processing
division
image
subject
feature amount
Prior art date
Application number
PCT/JP2018/009966
Other languages
French (fr)
Japanese (ja)
Inventor
Yu Hirosawa
Akira Matsushita
Isao Sakane
Ryosuke Murata
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to PCT/JP2018/009966 priority Critical patent/WO2019176012A1/en
Priority to JP2020506023A priority patent/JP6931418B2/en
Publication of WO2019176012A1 publication Critical patent/WO2019176012A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N 33/48 - Biological material, e.g. blood, urine; Haemocytometers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis

Definitions

  • The present invention relates to an image processing method, an image processing device, a user interface device, an image processing system, and a server.
  • Conventionally, an image processing technique is known that recognizes a subject based on the appearance characteristics of the subject in an image (see, for example, Patent Document 1).
  • In Patent Document 1, a single image is divided into a plurality of small regions, a feature amount representing an appearance feature is calculated for each small region, and the category of each small region is determined based on the feature amount. With such a method, when a plurality of subjects having different appearance features appear in one image, the subjects can be distinguished and recognized.
  • The management parameter is, for example, the proliferation rate, survival rate, or cell count of a cell group.
  • The feature amount calculated from a small region may differ depending on the size of the region. It may also be unclear at what spatial scale the subject's appearance features correlate with the management parameters. For example, when a certain number of cells form a characteristic structure, the size of the small region must be set so that each region contains that number of cells in order to obtain a feature amount that accurately represents the structure. If the small region is too small or too large relative to that number of cells, a feature amount that accurately represents the characteristic structure cannot be calculated from it. Furthermore, it may not even be known how many cells form the characteristic structure.
  • The present invention has been made in view of the above circumstances, and its object is to provide an image processing method, an image processing device, a user interface device, an image processing system, and a server that can divide a subject image into divided images of a size suitable for calculating a feature amount representing an appearance feature of the subject.
  • To achieve the above object, the present invention provides the following means.
  • One aspect of the present invention is an image processing method including: a step of acquiring a subject image obtained by imaging a subject; a step of dividing the subject image into a plurality of divided images; a step of calculating a feature amount for at least some of the plurality of divided images, the feature amount being an amount representing an appearance feature of the subject; and a step of creating division auxiliary information based on the feature amount, the division auxiliary information being information used for determining how to divide the subject image.
  • According to this aspect, the subject image is divided into a plurality of divided images, and the feature amount of each divided image, which represents the appearance feature of the subject within it, is calculated. Division auxiliary information is then created based on the feature amounts. Based on such division auxiliary information, a division method for the subject image can be determined that allows a feature amount more accurately representing the appearance features of the subject to be calculated.
  • In the above aspect, the division auxiliary information may include a distribution map of the feature amounts, in which a plot representing each individual feature amount is displayed on an axis representing the magnitude of the feature amount.
  • Whether the size of the divided images is suitable for calculating the feature amount can be determined from the distribution of the plots on the axis. For example, suppose the subject image contains a plurality of subjects with different appearance features. When the size of the divided images is appropriate, feature amounts reflecting the individual appearance features are calculated, so the variation between feature amounts increases and the plots spread over a wide range on the axis. Conversely, when the size of the divided images is inappropriate, the variation between feature amounts is smaller and the plots are distributed over a narrower range.
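As a rough numerical illustration of this idea (the function name and the use of the plain range as the spread statistic are illustrative choices, not specified by the patent), the spread of the plots along the axis can be quantified as follows:

```python
import numpy as np

def plot_spread(feature_amounts) -> float:
    """Range of feature amounts along the distribution-map axis.

    A wide spread suggests the divided-image size captures the differing
    appearance features of the subjects; a narrow spread suggests the
    size is inappropriate and the features are averaged out.
    """
    feats = np.asarray(feature_amounts, dtype=float)
    return float(feats.max() - feats.min())

# Feature amounts from an appropriate division spread widely...
wide = plot_spread([0.1, 0.45, 0.8])
# ...while an inappropriate division averages them together.
narrow = plot_spread([0.42, 0.45, 0.48])
```

Comparing `wide` and `narrow` for the same original image under different division sizes mirrors the visual comparison of plot distributions described above.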
  • In the above aspect, the division auxiliary information may include a statistical value based on a histogram of the feature amounts.
  • The statistical value of the histogram is preferably one representing the width or variance of the histogram. Based on this statistical value, it can be determined whether the size of the divided images is suitable for calculating the feature amount. For example, when the subject image contains a plurality of subjects with different appearance features and the size of the divided images is appropriate, feature amounts reflecting the individual appearance features are calculated, so the variation between feature amounts increases and the statistical value of the histogram becomes large. Conversely, when the size of the divided images is inappropriate, the variation between feature amounts is smaller and the statistical value of the histogram becomes small.
  • In the above aspect, the division auxiliary information may include the divided image associated with each plot. With this configuration, division auxiliary information that makes it easy to compare each divided image with its feature amount can be provided.
  • The above aspect may include a step of acquiring a management parameter relating to an internal property of the subject and a step of classifying the management parameter into one of a plurality of classes according to its magnitude, and each plot may be displayed in a manner corresponding to the class of the management parameter.
  • In the above aspect, the subject may be a group of cells being cultured, and the management parameter may be the growth rate of the cell group, the survival rate of the cell group at a predetermined time after the end of the culture, or the number of cells in the cell group at a predetermined time after the end of the culture.
  • The above aspect may include a step of changing the subject image division method to a division method determined based on the division auxiliary information, and a step of dividing the subject image into a plurality of divided images by the changed division method.
  • In the above aspect, the size of the divided images may be changed in the step of changing the division method.
  • The above aspect may include a step of transmitting the division auxiliary information to a user interface device.
  • With this configuration, the user can check the division auxiliary information on the user interface device.
  • The above aspect may include a step of receiving a request to change the division method from the user interface device.
  • Another aspect of the present invention is an image processing device including a processor, the processor executing: a step of acquiring a subject image obtained by imaging a subject; a step of dividing the subject image into a plurality of divided images; a step of calculating a feature amount for at least some of the plurality of divided images; and a step of creating division auxiliary information based on the feature amount, the division auxiliary information being information used for determining how to divide the subject image.
  • Another aspect of the present invention is a user interface device including a display that is connected to the image processing device via a communication network and that displays the division auxiliary information received from the image processing device.
  • Another aspect of the present invention is an image processing system including the image processing device and the user interface device.
  • Another aspect of the present invention is a server that is connected to the image processing device via a communication network and that receives and stores the division auxiliary information.
  • According to the present invention, the subject image can be divided into divided images of a size suitable for calculating a feature amount representing the appearance features of the subject.
  • FIG. 1 is an overall configuration diagram of an image processing system according to an embodiment of the present invention. FIG. 2 is an internal block diagram of the monitoring device, image processing device, and UI device of FIG. 1. FIG. 3 is a flowchart illustrating an image processing method according to an embodiment of the present invention. FIG. 4 shows an example of an original image and divided images. FIG. 5 shows an example of an original image containing a plurality of cell groups whose appearance features differ from one another. FIG. 6 shows an example of the division auxiliary information.
  • The image processing system 100 includes a monitoring device 1 that generates images of a cell group (subject) cultured in a culture vessel 4, an image processing device 2 that receives the images of the cell group from the monitoring device 1, and a user interface (UI) device 3 used by a user.
  • The monitoring device 1 includes an imaging device 5, a processor 6, and a communication device 7.
  • The imaging device 5, the processor 6, and the communication device 7 are housed in a sealed box-shaped housing 8 (see FIG. 1).
  • The transparent top plate 8a of the housing 8 is used as a stage on which the culture vessel 4 is placed.
  • The monitoring device 1 is placed in an incubator (not shown) together with the culture vessel 4 during the culture period.
  • The imaging device 5 has an imaging element (not shown) such as a CMOS image sensor or a CCD image sensor.
  • The imaging device 5 images the inside of the culture vessel 4 on the stage 8a with the imaging element, and generates an image of the cell group (subject image).
  • The processor 6 causes the imaging device 5 to perform imaging according to a preset schedule or according to an instruction from the image processing device 2.
  • The communication device 7 is connected to a communication device 12 (described later) of the image processing device 2 through a communication network 21, and transmits and receives data, information, and signals to and from the image processing device 2.
  • The communication network 21 is, for example, the Internet, an intranet, a LAN (Local Area Network), a WAN (Wide Area Network), or a combination thereof.
  • The communication network 21 may be either wireless or wired.
  • The communication device 7 receives images from the imaging device 5 and transmits them to the communication device 12 of the image processing device 2.
  • The image processing device 2 includes a memory 10, a processor 11, and a communication device 12.
  • The image processing device 2 is disposed outside the incubator, and receives images (original images) of the cell group from the monitoring device 1 in the incubator through communication between the communication devices 12 and 7.
  • The memory 10 stores an image processing program for causing the processor 11 to execute predetermined processing.
  • The processor 11 reads the image processing program from the memory 10 and executes processing according to it, thereby creating the division auxiliary information.
  • The image processing executed by the processor 11 and the division auxiliary information are described in detail later.
  • The communication device 12 is connected to a communication device 16 (described later) of the UI device 3 via a communication network 22 and transmits and receives data, information, and signals to and from the UI device 3.
  • The communication network 22 is, for example, the Internet, an intranet, a LAN, a WAN, or a combination thereof, and may be either wireless or wired.
  • The communication device 12 transmits the division auxiliary information created by the processor 11 to the communication device 16 of the UI device 3.
  • The UI device 3 is a device used by the user to transmit and receive data, information, and signals to and from the image processing device 2, for example a general-purpose tablet computer.
  • The UI device 3 includes a display 14, a processor 15, and a communication device 16.
  • The UI device 3 is installed with dedicated application software for displaying the division auxiliary information received from the image processing device 2. As described later, the user can use this application software to display the division auxiliary information on the display 14 and to send a request for changing the division method to the image processing device 2.
  • The image processing system 100 may include a plurality of sets of monitoring devices 1, image processing devices 2, and UI devices 3. Furthermore, the system 100 may include a server 30 connected to the plurality of image processing devices 2 via a communication network 23.
  • The communication network 23 is, for example, the Internet, an intranet, a LAN, a WAN, or a combination thereof, and may be either wireless or wired.
  • The server 30 is, for example, a cloud server on the Internet or a computer installed at an arbitrary location.
  • The server 30 receives and stores the division auxiliary information from the plurality of image processing devices 2.
  • The server 30 may transmit division auxiliary information received from one image processing device 2 to another image processing device 2.
  • Thereby, the user can receive and display division auxiliary information created by a plurality of image processing devices 2 using a single UI device 3.
  • The image processing method includes a step S1 of acquiring an original image, a step S2 of acquiring the management parameters of the original image, a step S3 of classifying the management parameters, a step S4 of dividing the original image into a plurality of divided images, a step S5 of calculating the feature amounts of the divided images, a step S6 of creating division auxiliary information based on the feature amounts, a step S7 of transmitting the division auxiliary information to the UI device 3, and a step S8 of receiving a change of the division method from the UI device 3.
  • In step S1, the original image is input from the monitoring device 1 to the image processing device 2.
  • The image processing device 2 may acquire a single original image, or may acquire a plurality of original images A, B, and C at a time as shown in FIG. 4.
  • The original images A, B, and C are images of cell groups in different cultures.
  • In step S2, the management parameters of the original image are input to the image processing device 2.
  • A management parameter is a parameter relating to an internal property of the cell group, such as the quality or activity of the cell group in the original image.
  • For example, the management parameter is the proliferation rate of the cell group, the survival rate of the cell group when a predetermined time has elapsed after the end of the culture, or the number of cells when a predetermined time has elapsed after the end of the culture.
  • Management parameters are obtained by measuring the cell group during or after culturing. For example, when shipping cells produced by culture, the shipping survival rate is used as a management parameter.
  • The shipping survival rate is the survival rate of the cell group in the culture vessel 4 when a predetermined time has elapsed after the end of the culture.
  • The management parameters are input to the UI device 3 by the user, for example, and transmitted from the UI device 3 to the image processing device 2.
  • In step S3, the management parameter is classified into one of a plurality of classes according to its magnitude. For example, as shown in FIG. 4, the shipping survival rate of original image A is classified into the highest class I, that of original image B into the middle class II, and that of original image C into the lowest class III.
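As a minimal sketch of this classification step (the threshold values and function name are hypothetical; the patent only specifies that classification is done according to the parameter's magnitude):

```python
def classify_management_parameter(value, thresholds=(0.9, 0.7)):
    """Classify a management parameter (e.g. a shipping survival rate
    in the range 0-1) into class I, II, or III by magnitude.

    The thresholds 0.9 and 0.7 are illustrative assumptions.
    """
    if value >= thresholds[0]:
        return "I"    # highest class
    if value >= thresholds[1]:
        return "II"   # middle class
    return "III"      # lowest class

# Original images A, B, C with decreasing shipping survival rates
classes = [classify_management_parameter(v) for v in (0.95, 0.80, 0.50)]
```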
  • The type and class of the management parameter are stored in the memory 10 in association with the original image.
  • The management parameters acquired and classified by the image processing device 2 in steps S2 and S3 may be of only one type or of two or more types. One management parameter per type is given to each original image.
  • In step S4, the original image is divided into a plurality of divided images by the initially set division method (step S43). Specifically, as shown in FIG. 4, the original image is equally divided according to a preset initial number of divisions, generating that number of divided images from one original image. In the example of FIG. 4, the initial number of divisions is 4, and the three original images A, B, and C are divided into four divided images A1 to A4, B1 to B4, and C1 to C4, respectively.
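The equal division of step S4 can be sketched as follows (a minimal implementation assuming the division number is a perfect square and the image is a NumPy array; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def divide_image(original, num_divisions):
    """Equally divide an image into num_divisions tiles arranged in a grid.

    Assumes num_divisions is a perfect square (e.g. 4 -> a 2x2 grid, as
    in the initial division of FIG. 4); pixels that do not fit evenly
    into the grid are discarded at the right and bottom edges.
    """
    per_side = int(round(num_divisions ** 0.5))
    h, w = original.shape[:2]
    th, tw = h // per_side, w // per_side  # tile height and width
    return [original[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            for r in range(per_side) for c in range(per_side)]

# A 100x100 original image with the initial division number 4
tiles = divide_image(np.zeros((100, 100)), 4)
# -> four 50x50 divided images
```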
  • In step S4, the type of the cell group and the angle of view of the original image may first be set (step S41), and the initial number of divisions may be determined based on them (step S42).
  • Information on the type of cell group and the angle of view of the original image is set in the monitoring device 1 by the user at the start of culture, for example, and transmitted from the monitoring device 1 to the image processing device 2 together with the original image.
  • In step S5, a feature amount is calculated for each divided image. The feature amount is an amount representing an appearance feature of the cell group in the divided image, for example, variation in shape, color, size, or orientation.
  • The feature amount may also be one automatically generated by artificial intelligence such as a neural network, which need not be interpretable by humans.
  • The feature amount may be calculated based on information contained in the divided image, for example, the luminance, color, or edge direction of each pixel, and may be calculated using, for example, HOG (Histograms of Oriented Gradients) features.
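As one concrete (hypothetical) example of such a feature amount, a gradient-orientation histogram in the spirit of HOG can be computed from pixel luminance with NumPy alone. This simplified descriptor reflects the orientation of structures in a divided image; a full HOG implementation would additionally use cells and block normalization:

```python
import numpy as np

def orientation_histogram(tile, bins=9):
    """Simplified HOG-like feature: a magnitude-weighted histogram of
    gradient orientations computed over the whole divided image.
    """
    gy, gx = np.gradient(tile.astype(float))   # luminance gradients
    magnitude = np.hypot(gx, gy)
    angle = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    hist, _ = np.histogram(angle, bins=bins, range=(0.0, np.pi),
                           weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist  # normalized descriptor

rng = np.random.default_rng(0)
feature = orientation_histogram(rng.random((50, 50)))
```

A scalar "variation in orientation" could then be derived from the shape of this histogram, e.g. how evenly the mass is spread across the orientation bins.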
  • In the example of FIG. 5, the cell group in region R1 is small and substantially circular, and forms a single-layer structure.
  • The cell group in region R2 is elliptical, layered, and irregularly oriented.
  • The cell group in region R3 is elongated, forms a single-layer structure, and is oriented in the same direction.
  • Therefore, the feature amounts of the plurality of divided images A1, A2, A3, and A4 vary.
  • In step S5, the feature amounts of only some of the divided images, rather than all of them, may be calculated. For example, a predetermined number of divided images may be selected from all the divided images, and feature amounts calculated only for the selected ones.
  • The division auxiliary information 40 includes a one-dimensional distribution map of the feature amounts.
  • In the one-dimensional distribution map, an axis 41 representing the magnitude of the feature amount is defined, and a balloon-shaped plot 42 representing each feature amount is displayed on the axis 41.
  • The plot 42 may be any figure, such as an arrow, circle, or polygon.
  • Each plot 42 is displayed in a manner corresponding to the class of the management parameter.
  • The display manner is, for example, the color of the plot 42.
  • In the figures, the difference in hatching direction of the plots 42 represents the difference in display manner.
  • That is, the feature amount plots 42 of the divided images A1 to A4, those of the divided images B1 to B4, and those of the divided images C1 to C4 are displayed in mutually different colors.
  • The divided images A1 to A4, B1 to B4, and C1 to C4 themselves may be used as the plots of the one-dimensional distribution map instead of figures.
  • The division auxiliary information 40 may also include the divided images A1 to A4, B1 to B4, and C1 to C4 displayed in visual association with the plots 42.
  • In step S7, the division auxiliary information 40 is transmitted from the image processing device 2 to the UI device 3 by communication between the communication devices 12 and 16.
  • In step S8, a request for changing the division method of the original image is received from the UI device 3.
  • The user uses the application software installed in the UI device 3 to display the division auxiliary information 40 received from the image processing device 2 on the display 14.
  • FIGS. 6 and 7 show examples of a screen displayed on the display 14 of the UI device 3.
  • On the screen, an area 51 displaying the division auxiliary information 40, an area 52 displaying the type of management parameter, and a slider 53 for changing the number of divisions of the original image are displayed.
  • The user can tell from the division auxiliary information 40 displayed in area 51 whether the feature amount correlates with the management parameter. For example, in FIGS. 6 and 7, the class I plots 42 are concentrated on the smaller-feature-amount side of the axis and the class III plots 42 on the larger side. In such a case, it can be judged that the feature amount correlates with the shipping survival rate. Conversely, if there is no difference in the distribution of the plots 42 between classes I, II, and III, it can be judged that the feature amount does not correlate with the shipping survival rate.
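This visual judgment can also be approximated numerically. The sketch below (the function name and the specific statistic are illustrative assumptions, not from the patent) compares how far apart the per-class feature distributions sit on the axis:

```python
import numpy as np

def class_separation(features_by_class):
    """Spread of per-class mean feature amounts relative to the overall
    spread of all feature amounts. Values near 0 mean the class
    distributions overlap (no apparent correlation); values closer to 1
    mean each class occupies its own part of the axis.
    """
    means = [float(np.mean(v)) for v in features_by_class.values()]
    all_values = np.concatenate([np.asarray(v, float)
                                 for v in features_by_class.values()])
    overall = float(all_values.max() - all_values.min())
    return (max(means) - min(means)) / overall if overall else 0.0

# Class I plots sit low on the axis, class III plots sit high:
separated = class_separation({"I": [0.1, 0.2], "III": [0.8, 0.9]})
# Overlapping class distributions give a much smaller value:
overlapping = class_separation({"I": [0.4, 0.6], "III": [0.45, 0.55]})
```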
  • The user can change the division method of the original image, specifically the number of divisions, by operating the slider 53.
  • Changing the number of divisions changes the size of the divided images.
  • The user determines a number of divisions different from the initial number by operating the slider 53, and a request to change to the determined number of divisions is transmitted from the UI device 3 to the image processing device 2.
  • In step S8, the image processing device 2 accepts changes to the division method for a certain period after transmitting the division auxiliary information 40 to the UI device 3. If no request to change the division method has arrived from the UI device 3 when this period elapses (NO in step S8), the image processing device 2 ends the series of processes. On the other hand, if a request to change the number of divisions is received from the UI device 3 within the period (YES in step S8), the image processing device 2 next executes step S9.
  • In step S9, the original image is equally divided by the changed number of divisions determined by the user, generating that number of divided images.
  • Then, the feature amounts of the divided images generated in step S9 are calculated (step S5), division auxiliary information is newly created (step S6), and the newly created division auxiliary information is transmitted from the image processing device 2 to the UI device 3 (step S7).
  • When the size of the divided images changes, the portion of the cell group included in each divided image changes, the feature amounts of the divided images change, and the distribution of the plots 42 in the one-dimensional distribution map changes accordingly.
  • For example, suppose the feature amount represents variation in the orientation of the cell group.
  • When each divided image contains a single cell group, a feature amount that most clearly represents the variation in orientation is calculated. Variation therefore arises between the feature amounts of the plurality of divided images, and the plots 42 are distributed over a wide range in the one-dimensional distribution map.
  • When each divided image contains a plurality of cell groups having mutually different orientation characteristics, a feature amount in which the orientation variations of the plurality of cell groups are averaged out is calculated. The variation between the feature amounts of the divided images is therefore reduced, and the plots 42 are distributed within a narrow range in the one-dimensional distribution map.
  • Each time the number of divisions is changed, the one-dimensional distribution map displayed on the screen of the UI device 3 is updated.
  • The user can thus check the variation of the feature amounts at each number of divisions while changing the number of divisions, and can identify the optimum number of divisions for calculating a feature amount that most clearly represents the appearance features of the cell group.
  • As a result, the presence or absence of a correlation between the management parameter and the feature amount, and the degree of that correlation, can be accurately known.
  • Furthermore, the management parameter can be estimated from an image of a cell group whose management parameter is unknown. That is, an original image of a cell group with an unknown management parameter can be divided by the optimum number of divisions, the feature amounts of the generated divided images calculated, and the management parameter estimated based on the calculated feature amounts.
  • The type of management parameter associated with the plots 42 may be changeable. For example, a desired type of management parameter may be selectable from a pull-down menu in area 52.
  • When the type is changed, the display manner, for example the color, of the plots 42 in the one-dimensional distribution map displayed on the UI device 3 changes.
  • The maximum number of divisions, or the minimum size of the divided images, selectable with the slider 53 may be determined according to the number of cell divisions in the culture vessel 4. If cells divide N times during the culture period, one cell increases to 2^N cells. The maximum number of divisions of the original image, or the minimum size of a divided image, may be determined so that at least about 2^N cells are included in one divided image. The number of cell divisions N can be estimated from the culture time.
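The 2^N relationship above can be sketched as a small helper (the function and parameter names are hypothetical; the patent only states the 2^N relationship and that N can be estimated from the culture time):

```python
def max_division_number(total_cell_count, n_cell_divisions):
    """Upper bound on the number of divided images such that each one can
    still contain about 2**n_cell_divisions cells, i.e. the progeny of a
    single founder cell after n_cell_divisions rounds of division.
    """
    cells_per_divided_image = 2 ** n_cell_divisions
    return max(1, total_cell_count // cells_per_divided_image)

# ~4096 cells in the vessel after N = 5 divisions: each divided image
# should hold about 2**5 = 32 cells, so at most 128 divided images.
limit = max_division_number(4096, 5)
```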
  • In the above embodiment, the image processing device 2 creates a one-dimensional distribution map of the feature amounts as the division auxiliary information 40. Instead, a two-dimensional distribution map may be created as shown in FIG. In this case, a first feature amount and a second feature amount representing different appearance features are calculated in step S5. In the two-dimensional distribution map, an axis 41a representing the magnitude of the first feature amount and an axis 41b representing the magnitude of the second feature amount are defined.
  • In the above embodiment, the user decides whether to change the number of divisions, and determines the new number, based on the division auxiliary information 40 displayed on the UI device 3.
  • Instead, the image processing device 2 may itself decide whether to change the number of divisions and determine the new number. That is, as shown in FIG. 10, the image processing device 2 may execute steps S10 to S12 instead of steps S6 to S8.
  • In this case, a histogram of the feature amounts is created (step S10), and a statistical value of the histogram is calculated as the division auxiliary information (step S11).
  • The statistical value is a value representing the width or variance of the histogram, preferably the half width (full width at half maximum) of the histogram.
  • The half width is, for example, the full width at half maximum of a Gaussian curve approximating the histogram. Like the distribution width of the plots 42 in the one-dimensional distribution map, the half width of the histogram is an index of the degree to which the feature amounts are averaged, and it changes according to the size of the divided images.
  • If the half width is larger than a predetermined threshold (YES in step S11), the number of divisions is not changed and the process ends. If the half width is equal to or smaller than the threshold (NO in step S11), the number of divisions is changed (step S12), and step S9 is then executed.
  • In step S12, the number of divisions is preferably changed in a direction that increases the half width of the histogram. Alternatively, a table associating the number of divisions with the half width of the histogram may be built up while repeating steps S5, S10, S11, S12, and S9, and the direction of change and the new number of divisions may be determined based on this table.
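A self-contained sketch of this loop over steps S5, S10 to S12, and S9 follows: it tabulates the half width for candidate division numbers and keeps the one that maximizes it. All names, the Gaussian FWHM estimate (FWHM = 2*sqrt(2 ln 2)*sigma), and the toy strip-wise divider are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def half_width(values):
    """FWHM of a Gaussian approximating the histogram of the feature
    amounts, estimated from their standard deviation
    (FWHM = 2 * sqrt(2 * ln 2) * sigma, about 2.355 * sigma)."""
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * np.std(values)

def choose_division_number(original, candidates, divide, feature_of):
    """Build a table of {division number: histogram half width} and
    return the division number maximizing the half width, together with
    the table (mirroring the table-based variant of step S12)."""
    table = {n: half_width([feature_of(t) for t in divide(original, n)])
             for n in candidates}
    return max(table, key=table.get), table

# Toy example: divide an 8x8 "image" into horizontal strips and use the
# mean luminance of each strip as its feature amount.
image = np.arange(64.0).reshape(8, 8)

def divide_into_strips(im, n):
    step = im.shape[0] // n
    return [im[i * step:(i + 1) * step, :] for i in range(n)]

best, table = choose_division_number(image, [2, 4], divide_into_strips,
                                     lambda t: float(t.mean()))
# Four strips spread the feature amounts more widely than two, so best == 4.
```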
  • The histogram and half width may be displayed on the UI device 3 each time the number of divisions is changed, or only the final result may be displayed on the UI device 3.
  • In the above embodiment, a plurality of divided images is generated by equally dividing the original image.
  • However, the method of dividing the original image is not limited to this, and other methods may be used. For example, individual cells may be extracted from the entire original image, the original image divided into minimum units each corresponding to one cell, and a divided image generated by combining a predetermined number of mutually adjacent minimum units. The size of the divided image may then be changed by changing the number of combined minimum units.
  • When a preferable initial number of divisions can be predicted from the cell type, the culture conditions, or the type of management parameter, the initial number of divisions may be determined in step S4 based on at least one of these. For example, in adhesion culture of iPS cells, it has been confirmed that cell viability correlates with a stratified structure of several tens of cells. Therefore, when investigating the correlation between the survival rate of adherently cultured iPS cells and the feature amount, the initial number of divisions of the original image may be set to a value such that each divided image contains several tens of cells. Likewise, in step S12, the number of divisions may be changed in a direction such that several tens of cells are included in each divided image.
  • In the above embodiment, the subject is a cell group cultured in the culture vessel 4, but other subjects are possible.
  • For example, a feature amount of the appearance of a mucosal lesion observed with an endoscope, which correlates strongly with the malignancy of the lesion, may be calculated.
  • In the above embodiment, the image processing device 2 receives the original image from the monitoring device 1 in step S1, but the image processing device 2 itself may generate the original image by imaging the cell group in the culture vessel 4. That is, the image processing device 2 may include the imaging device 5 and the housing 8, and execute both generation of the original image and image processing inside the incubator.
  • The image processing described above may be executed by the server 30. That is, the image processing device according to the present embodiment may be realized as a server. In this case, the server 30 includes a memory and a processor configured similarly to the memory 10 and the processor 11, and is connected to the monitoring device 1 and the UI device 3 via the communication network 23. The original image is transmitted from the monitoring device 1 to the server 30, the division auxiliary information is created in the server 30 by the image processing described above, and the division auxiliary information is transmitted from the server 30 to the UI device 3. Since the image processing requires high processing capability, causing the server 30, particularly a cloud server, to perform the image processing allows the image processing system 100 to be simplified and reduced in cost.

Abstract

An image processing method including: a step (S1) for acquiring a subject image in which a subject is imaged; a step (S4) for dividing the subject image into a plurality of divided images; a step (S5) for calculating the feature quantity of at least some of the plurality of divided images, the feature quantity being a quantity that represents the external characteristics of the subject; and a step (S6) for creating auxiliary division information on the basis of the feature quantity. The auxiliary division information is used for determining a method for dividing the subject image.

Description

Image processing method, image processing device, user interface device, image processing system, and server
The present invention relates to an image processing method, an image processing device, a user interface device, an image processing system, and a server.
Conventionally, an image processing technique for recognizing a subject based on appearance features of the subject in an image is known (see, for example, Patent Document 1). In Patent Document 1, a single image is divided into a plurality of small regions, a feature amount representing an appearance feature is calculated for each small region, and the category of each small region is determined based on the feature amount. According to such a method, when a plurality of subjects having different appearance features are contained in one image, the plurality of subjects can be distinguished and recognized.
Meanwhile, in recent years, in research using cultured cells, techniques for estimating a parameter (management parameter) indicating the quality or activity of cells from the appearance features of a cell or cell group have attracted attention. The management parameter is, for example, the proliferation rate, survival rate, or cell count of a cell group.
Patent Document 1: Japanese Unexamined Patent Publication No. 2017-5389
However, the feature amount calculated from a small region may differ depending on the size of the small region. In addition, it may be unknown what size of subject exhibits appearance features that correlate with the management parameter.
For example, when a certain number of cells form a characteristic structure, the size of the small regions must be set so that each small region contains that number of cells in order to obtain a feature amount that accurately represents the characteristic structure. If the size of the small region is too small or too large relative to the overall size of that group of cells, a feature amount that accurately represents the characteristic structure cannot be calculated from the small region. Furthermore, there may be no prior knowledge of how many cells form the characteristic structure.
The present invention has been made in view of the above-described circumstances, and an object thereof is to provide an image processing method, an image processing device, a user interface device, an image processing system, and a server capable of dividing a subject image into divided images of a size suitable for calculating a feature amount representing an appearance feature of the subject.
In order to achieve the above object, the present invention provides the following means.
One aspect of the present invention is an image processing method including: a step of acquiring a subject image in which a subject is imaged; a step of dividing the subject image into a plurality of divided images; a step of calculating a feature amount of at least some of the plurality of divided images, the feature amount being an amount representing an appearance feature of the subject; and a step of creating division auxiliary information based on the feature amount, the division auxiliary information being information used for determining a method of dividing the subject image.
According to this aspect, the subject image is divided into a plurality of divided images, and the feature amount of each divided image is calculated. The feature amount is an amount representing the appearance feature of the subject in each divided image. Division auxiliary information is then created based on the feature amounts. Based on such division auxiliary information, it is possible to determine a method of dividing the subject image that allows calculation of a feature amount that more accurately represents the appearance features of the subject.
In the above aspect, the division auxiliary information may include a distribution chart of the feature amounts, in which plots indicating the individual feature amounts are displayed on an axis representing the magnitude of the feature amount.
With this configuration, it is possible to judge, based on the distribution of the plots on the axis, whether the size of the divided images is suitable for calculating the feature amount. For example, when a plurality of subjects having different appearance features are contained in the subject image and the size of the divided images is appropriate, feature amounts reflecting the individual appearance features are calculated, so the variation among the feature amounts is large and the distribution range of the plots on the axis is wide. On the other hand, when the size of the divided images is inappropriate, the variation among the feature amounts is small and the distribution range of the plots on the axis is narrow compared with when the size is appropriate.
In the above aspect, the division auxiliary information may include a statistical value based on a histogram of the feature amounts.
The statistical value of the histogram of the feature amounts is preferably one representing the width or variance of the histogram. Based on this statistical value, it is possible to judge whether the size of the divided images is suitable for calculating the feature amount. For example, when a plurality of subjects having different appearance features are contained in the subject image and the size of the divided images is appropriate, feature amounts reflecting the individual appearance features are calculated, so the variation among the feature amounts is large and the statistical value of the histogram is large. On the other hand, when the size of the divided images is inappropriate, the variation among the feature amounts is small and the statistical value of the histogram is small compared with when the size is appropriate.
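Two such statistics are the variance of the feature amounts and the full width at half maximum (half width) of their histogram. A minimal sketch, assuming a fixed bin count of 16 (the embodiment does not specify one):

```python
import numpy as np

def histogram_width_stats(features, bins=16):
    """Return (variance, half_width) of a 1-D array of feature amounts.

    half_width is the span of histogram bins whose count reaches at least
    half of the peak count (a simple full-width-at-half-maximum estimate).
    """
    features = np.asarray(features, dtype=float)
    counts, edges = np.histogram(features, bins=bins)
    half_max = counts.max() / 2.0
    above = np.nonzero(counts >= half_max)[0]            # bins at or above half the peak
    half_width = edges[above[-1] + 1] - edges[above[0]]  # span from first to last such bin
    return features.var(), half_width

# A widely spread distribution yields larger values of both statistics
# than a tightly clustered one, mirroring the appropriate/inappropriate
# division sizes discussed above.
wide = histogram_width_stats([0.1, 0.3, 0.5, 0.7, 0.9])
narrow = histogram_width_stats([0.49, 0.50, 0.50, 0.51, 0.51])
print(wide[0] > narrow[0], wide[1] > narrow[1])  # True True
```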
In the above aspect, the division auxiliary information may include the divided image associated with each plot.
With this configuration, division auxiliary information in which the divided images and the feature amounts are easy to compare can be provided.
The above aspect may include a step of acquiring a management parameter relating to an internal property of the subject, and a step of classifying the management parameter into one of a plurality of classes according to its magnitude, and the plots may be displayed in a manner corresponding to the class of the management parameter of the subject image.
With this configuration, the presence or absence and the degree of correlation between the feature amount and the management parameter can be recognized from the distribution chart.
In the above aspect, the subject may be a cell group being cultured, and the management parameter may be the proliferation rate of the cell group, the survival rate of the cell group at a predetermined time after the end of the culture, or the cell count of the cell group at a predetermined time after the end of the culture.
With this configuration, the proliferation rate, survival rate, or cell count, which are internal properties of the cell group, can be estimated based on the appearance features of the cell group in the subject image.
The above aspect may include a step of changing the method of dividing the subject image to a division method determined based on the division auxiliary information, and a step of dividing the subject image into a plurality of divided images by the changed division method.
By dividing the subject image again into a plurality of divided images by the changed division method, divided images from which a feature amount that more accurately represents the appearance features of the subject can be calculated are obtained.
In the above aspect, the size of the divided images may be changed in the step of changing the division method.
With this configuration, when the appearance feature is related to the size of the subject, the size of divided image from which a feature amount that more accurately represents the appearance feature is calculated can be identified by comparing feature amounts calculated from divided images of one size with feature amounts calculated from divided images of another size.
The above aspect may include a step of transmitting the division auxiliary information to a user interface device.
With this configuration, the user can check the division auxiliary information using the user interface device.
The above aspect may include a step of receiving a request for a change of the division method from the user interface device.
With this configuration, the method of dividing the subject image can be changed in accordance with the change request transmitted by the user from the user interface device.
Another aspect of the present invention is an image processing device including a processor, the processor executing: a step of acquiring a subject image in which a subject is imaged; a step of dividing the subject image into a plurality of divided images; a step of calculating a feature amount of at least some of the plurality of divided images, the feature amount being an amount representing an appearance feature of the subject; and a step of creating division auxiliary information based on the feature amount, the division auxiliary information being information used for determining a method of dividing the subject image.
Another aspect of the present invention is a user interface device that is connected to the above image processing device via a communication network and includes a display that displays the division auxiliary information received from the image processing device.
Another aspect of the present invention is an image processing system including the above image processing device and the above user interface device.
Another aspect of the present invention is a server that is connected to the above image processing device via a communication network and receives and stores the division auxiliary information.
According to the present invention, there is an effect that a subject image can be divided into divided images of a size suitable for calculating a feature amount representing an appearance feature of the subject.
FIG. 1 is an overall configuration diagram of an image processing system according to an embodiment of the present invention. FIG. 2 is an internal configuration diagram of the monitoring device, the image processing device, and the UI device of FIG. 1. FIG. 3 is a flowchart showing an image processing method according to an embodiment of the present invention. FIG. 4 shows an example of an original image and divided images. FIG. 5 shows an example of an original image containing a plurality of cell groups whose appearance features differ from one another. FIG. 6 shows an example of division auxiliary information created by the image processing device and a screen of the UI device. FIG. 7 shows another example of division auxiliary information created by the image processing device and a screen of the UI device. FIG. 8 is a diagram explaining a change in the number of divisions of the original image by means of a slider. FIG. 9 shows another example of division auxiliary information created by the image processing device and a screen of the UI device. FIG. 10 is a flowchart showing a modification of the image processing method of FIG. 3. FIG. 11 is an overall configuration diagram of a modification of the image processing system of FIG. 1.
Hereinafter, an image processing device, a user interface device, an image processing system, and a server according to an embodiment of the present invention will be described with reference to the drawings.
As shown in FIG. 1, the image processing system 100 according to the present embodiment includes a monitoring device 1 that generates an image of a cell group (subject) cultured in a culture vessel 4, an image processing device 2 that acquires the image of the cell group from the monitoring device 1 and processes it, and a user interface (UI) device 3 used by a user.
As shown in FIG. 2, the monitoring device 1 includes an imaging device 5, a processor 6, and a communication device 7. The imaging device 5, the processor 6, and the communication device 7 are housed in a sealed box-shaped housing 8 (see FIG. 1). The transparent top plate 8a of the housing 8 is used as a stage on which the culture vessel 4 is placed. The monitoring device 1 is placed in an incubator (not shown) together with the culture vessel 4 during the culture period.
The imaging device 5 has an image sensor (not shown) such as a CMOS image sensor or a CCD image sensor. The imaging device 5 images the inside of the culture vessel 4 on the stage 8a with the image sensor and generates an image of the cell group (subject image).
The processor 6 causes the imaging device 5 to perform imaging according to a preset schedule or according to an instruction from the image processing device 2.
The communication device 7 is connected to a communication device 12 (described later) of the image processing device 2 via a communication network 21, and transmits and receives data, information, and signals to and from the image processing device 2. The communication network 21 is, for example, the Internet, an intranet, a LAN (Local Area Network), a WAN (Wide Area Network), or a combination thereof, and may be either wireless or wired. The communication device 7 receives images from the imaging device 5 and transmits them to the communication device 12 of the image processing device 2.
The image processing device 2 includes a memory 10, a processor 11, and a communication device 12. The image processing device 2 is disposed outside the incubator and receives the image of the cell group (original image) from the monitoring device 1 inside the incubator through communication between the communication devices 12 and 7.
The memory 10 stores an image processing program for causing the processor 11 to execute predetermined processing. The processor 11 reads the image processing program from the memory 10 and executes processing according to it, thereby creating the division auxiliary information. The image processing executed by the processor 11 and the division auxiliary information will be described in detail later.
The communication device 12 is connected to a communication device 16 (described later) of the UI device 3 via a communication network 22, and transmits and receives data, information, and signals to and from the UI device 3. The communication network 22 is, for example, the Internet, an intranet, a LAN, a WAN, or a combination thereof, and may be either wireless or wired. The communication device 12 transmits the division auxiliary information created by the processor 11 to the communication device 16 of the UI device 3.
The UI device 3 is a device used by the user to transmit and receive data, information, and signals to and from the image processing device 2, for example, a general-purpose tablet computer. The UI device 3 includes a display 14, a processor 15, and a communication device 16.
Dedicated application software for displaying the division auxiliary information received from the image processing device 2 is installed in the UI device 3. As will be described later, the user can use this application software to display the division auxiliary information on the display 14 and to transmit a request for a change of the division auxiliary information to the image processing device 2.
As shown in FIG. 1, the image processing system 100 may include a plurality of sets of the monitoring device 1, the image processing device 2, and the UI device 3.
Furthermore, the system 100 may include a server 30 connected to the plurality of image processing devices 2 via a communication network 23. The communication network 23 is, for example, the Internet, an intranet, a LAN, a WAN, or a combination thereof, and may be either wireless or wired. The server 30 is, for example, a cloud server on the Internet or a computer installed at an arbitrary location.
The server 30 receives and stores the division auxiliary information from the plurality of image processing devices 2. The server 30 may transmit division auxiliary information received from one image processing device 2 to another image processing device 2. Using a single UI device 3, the user can receive and display division auxiliary information created by a plurality of image processing devices 2.
Next, the image processing method executed by the image processing device 2 will be described.
As shown in FIG. 3, the image processing method according to the present embodiment includes: step S1 of acquiring an original image; step S2 of acquiring a management parameter of the original image; step S3 of classifying the management parameter; step S4 of dividing the original image into a plurality of divided images; step S5 of calculating feature amounts of the divided images; step S6 of creating division auxiliary information based on the feature amounts; step S7 of transmitting the division auxiliary information to the UI device 3; step S8 of receiving a change of the method of dividing the original image; and step S9 of dividing the original image again into a plurality of divided images by the changed division method.
In step S1, an original image is input from the monitoring device 1 to the image processing device 2. The image processing device 2 may acquire a single original image or, as shown in FIG. 4, may acquire a plurality of original images A, B, and C at a time. In the example of FIG. 4, the original images A, B, and C are images of cell groups in different cultures.
Next, in step S2, a management parameter of the original image is input to the image processing device 2. The management parameter is a parameter relating to an internal property of the cell group in the original image, such as its quality or activity. For example, the management parameter is the proliferation rate of the cell group, the survival rate of the cell group at a predetermined time after the end of the culture, or the cell count at a predetermined time after the end of the culture.
The management parameter is obtained by measuring the cell group during or after the culture. For example, when cells produced by culture are shipped, the shipping survival rate is used as the management parameter. The shipping survival rate is the survival rate of the cell group in the culture vessel 4 at a predetermined time after the end of the culture. The management parameter is, for example, entered by the user into the UI device 3 and transmitted from the UI device 3 to the image processing device 2.
Next, in step S3, the management parameter is classified into one of a plurality of classes according to its magnitude. For example, as shown in FIG. 4, the shipping survival rate of the original image A is classified into the highest class I, that of the original image B into the middle class II, and that of the original image C into the lowest class III. The type and class of the management parameter are stored in the memory 10 in association with the original image.
The management parameters acquired and classified by the image processing device 2 in steps S2 and S3 may be of only one type, or of two or more types. One management parameter per type is given to each original image.
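The classification in step S3 amounts to binning a scalar against thresholds. A minimal sketch; the 0.9 / 0.7 class boundaries and the example survival rates are illustrative assumptions, since the embodiment does not specify concrete values:

```python
def classify_survival_rate(rate: float) -> str:
    """Bin a shipping survival rate (0.0-1.0) into class I, II, or III.

    The 0.9 and 0.7 boundaries are assumed for illustration only.
    """
    if rate >= 0.9:
        return "I"    # highest class
    if rate >= 0.7:
        return "II"   # middle class
    return "III"      # lowest class

# Original images A, B, C with survival rates in descending order
print([classify_survival_rate(r) for r in (0.95, 0.80, 0.55)])  # ['I', 'II', 'III']
```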
Next, in step S4, the original image is divided into a plurality of divided images by an initially set division method (step S43). Specifically, as shown in FIG. 4, the original image is divided equally into a preset initial division number, so that the initial division number of divided images of the same size is generated from one original image. In the example of FIG. 4, the initial division number is 4, and the three original images A, B, and C are divided into four divided images A1 to A4, B1 to B4, and C1 to C4, respectively.
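Equal division into same-size tiles can be sketched with NumPy. The 2x2 grid below mirrors the division number of 4 in the example; cropping the image to a multiple of the grid size is an implementation choice of this sketch, not part of the embodiment:

```python
import numpy as np

def divide_equally(image: np.ndarray, n: int) -> list:
    """Split a 2-D image into an n x n grid of equally sized tiles (division number n*n)."""
    h, w = image.shape[:2]
    h, w = h - h % n, w - w % n          # crop so the image divides evenly
    tile_h, tile_w = h // n, w // n
    return [image[r * tile_h:(r + 1) * tile_h, c * tile_w:(c + 1) * tile_w]
            for r in range(n) for c in range(n)]

original = np.arange(64 * 64).reshape(64, 64)   # stand-in for original image A
tiles = divide_equally(original, 2)              # division number 4: A1..A4
print(len(tiles), tiles[0].shape)                # 4 (32, 32)
```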
In step S4, the type of the cell group and the angle of view of the original image may be set (step S41), and the initial division number may be determined based on the type of the cell group and the angle of view of the original image (step S42). The information on the type of the cell group and the angle of view of the original image is, for example, set in the monitoring device 1 by the user at the start of the culture and transmitted from the monitoring device 1 to the image processing device 2 together with the original image.
Next, in step S5, the feature amounts of all the divided images generated in step S4 are calculated. The feature amount is an amount indicating an appearance feature of the cell group in each divided image, for example, variation in shape, color, size, or orientation. The feature amount may be a feature amount automatically generated by artificial intelligence such as a neural network, one not recognizable by humans. The feature amount is calculated based on information contained in the divided image, for example, the luminance, color, or edge direction of each pixel, and may be calculated using HOG (Histograms of Oriented Gradients) features.
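As one concrete example of an orientation-variation feature, a histogram of gradient orientations (the core idea behind HOG) can be summarized by its normalized entropy. This is a simplified stand-in written for illustration, not the feature amount prescribed by the embodiment; the bin count and test images are assumptions:

```python
import numpy as np

def orientation_spread(tile: np.ndarray, bins: int = 9) -> float:
    """Crude orientation-variation feature for one divided image.

    Builds a magnitude-weighted histogram of gradient orientations and
    returns its normalized entropy: near 0 when edges share one direction,
    near 1 when orientations are spread uniformly.
    """
    gy, gx = np.gradient(tile.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)              # orientations in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    p = hist / hist.sum() if hist.sum() > 0 else np.full(bins, 1.0 / bins)
    p = np.clip(p, 1e-12, None)
    return float(-(p * np.log(p)).sum() / np.log(bins))  # normalized entropy

# Vertical stripes (one dominant edge direction, like region R3's aligned
# cells) vs. random texture (no dominant direction, like region R2).
stripes = np.tile([0.0, 1.0], (32, 16))
noise = np.random.default_rng(0).random((32, 32))
print(orientation_spread(stripes) < orientation_spread(noise))  # True
```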
As shown in FIG. 5, even cell groups of the same type cultured in the same culture vessel 4 may differ in appearance features. For example, the cell group in region R1 consists of small, nearly circular cells forming a monolayer structure. The cell group in region R2 consists of elliptical cells that are stratified and irregularly oriented. The cell group in region R3 consists of elongated cells forming a monolayer structure and oriented in the same direction. When one original image A thus contains a plurality of cell groups having mutually different appearance features, the feature amounts of the divided images A1, A2, A3, and A4 vary.
In step S5, the feature amounts of some of the divided images, rather than all of them, may be calculated. For example, a predetermined number of divided images may be selected from all the divided images, and the feature amounts of only the selected divided images may be calculated.
 Next, in step S6, division auxiliary information used to determine how to divide the original image is created. As shown in FIG. 6, the division auxiliary information 40 includes a one-dimensional distribution map of the feature amounts. In the one-dimensional distribution map, an axis 41 representing the magnitude of the feature amount is defined, and balloon-shaped plots 42 indicating the individual feature amounts are displayed on the axis 41. The plots 42 may be any figure, such as arrows, circles, or polygons.
 Each plot 42 is displayed in a mode corresponding to the class of the management parameter; the mode is, for example, the color of the plot 42. In the example of FIG. 6, the differing hatching directions of the plots 42 represent different display modes. For example, when the shipment-time survival-rate classes I, II, and III of the original images A, B, and C differ from one another, the plots 42 of the feature amounts of the divided images A1 to A4, the plots 42 of the feature amounts of the divided images B1 to B4, and the plots 42 of the feature amounts of the divided images C1 to C4 are displayed in mutually different colors.
 The divided images A1 to A4, B1 to B4, and C1 to C4 may be used as the plots of the one-dimensional distribution map in place of figures. Alternatively, as shown in FIG. 7, the division auxiliary information 40 may include the divided images A1 to A4, B1 to B4, and C1 to C4 displayed in visual association with the plots 42.
 Next, in step S7, the division auxiliary information 40 is transmitted from the image processing apparatus 2 to the UI apparatus 3 by communication between the communication apparatuses 12 and 16.
 Next, in step S8, a request from the UI apparatus 3 to change the method of dividing the original image is accepted.
 Using application software installed on the UI apparatus 3, the user displays on the display 14 the division auxiliary information 40 received from the image processing apparatus 2. FIGS. 6 and 7 show examples of the screen displayed on the display 14 of the UI apparatus 3. The screen shows an area 51 displaying the division auxiliary information 40, an area 52 displaying the type of the management parameter, and a slider 53 for changing the number of divisions of the original image.
 From the division auxiliary information 40 displayed in the area 51, the user can tell whether the feature amount correlates with the management parameter.
 For example, in FIGS. 6 and 7, the class I plots 42 are concentrated on the small-feature-amount side and the class III plots 42 on the large-feature-amount side. In such a case, it can be judged that the feature amount correlates with the shipment-time survival rate. Conversely, if no difference is observed in the distribution of the plots 42 among classes I, II, and III, it can be judged that the feature amount does not correlate with the shipment-time survival rate.
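The judgment described above — class distributions separated along the feature axis imply correlation — can be sketched numerically. The separation score, sample values, and class labels below are illustrative assumptions by the editor, not a method specified in the application.

```python
import numpy as np

def class_separation(features, labels):
    """Rough separation score: spread of class means over the mean in-class std.

    A large ratio suggests the feature correlates with the class (e.g. the
    shipment-time survival rate); a ratio near zero suggests it does not.
    """
    classes = sorted(set(labels))
    means = np.array([np.mean([f for f, l in zip(features, labels) if l == c])
                      for c in classes])
    stds = np.array([np.std([f for f, l in zip(features, labels) if l == c])
                     for c in classes])
    return (means.max() - means.min()) / (stds.mean() + 1e-12)

# Class I clustered low and class III high, as in FIGS. 6 and 7 -> high separation.
feats = [0.1, 0.15, 0.2, 0.5, 0.55, 0.8, 0.85, 0.9]
labs  = ["I", "I", "I", "II", "II", "III", "III", "III"]
print(class_separation(feats, labs) > 1.0)
```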
 As shown in FIG. 8, the user can change the method of dividing the original image, specifically the number of divisions, by operating the slider 53. Changing the number of divisions changes the size of the divided images. To change the number of divisions, the user operates the slider 53 to decide on a number of divisions different from the initial one and transmits a request for the change from the UI apparatus 3 to the image processing apparatus 2.
 In step S8, after transmitting the data of the division auxiliary information 40 to the UI apparatus 3, the image processing apparatus 2 accepts changes to the division method for a fixed period. If no request to change the division method has arrived from the UI apparatus 3 by the time the fixed period elapses (NO in step S8), the image processing apparatus 2 ends the series of processing. If a request to change the number of divisions arrives from the UI apparatus 3 within the period (YES in step S8), the image processing apparatus 2 next executes step S9.
 In step S9, the original image is divided equally into the changed number of divisions decided by the user, generating that number of divided images.
 Next, the feature amounts of the divided images generated in step S9 are calculated (step S5), division auxiliary information is newly created (step S6), and the newly created division auxiliary information is transmitted from the image processing apparatus 2 to the UI apparatus 3 (step S7).
 When the number of divisions is changed, the size of the divided images changes, the size of the cell groups contained in each divided image changes, the feature amounts of the divided images change, and the distribution of the plots 42 in the one-dimensional distribution map changes.
 Consider, for example, the case where the feature amount represents variation in the orientation of a cell group.
 When a divided image contains a single cell group, a feature amount that most clearly expresses the variation in orientation is calculated. The feature amounts of the divided images therefore vary among themselves, and the plots 42 spread over a wide range in the one-dimensional distribution map.
 When a divided image instead contains a plurality of cell groups with mutually different orientation characteristics, a feature amount in which the orientation variations of those groups are averaged is calculated. The variation among the feature amounts of the divided images therefore shrinks, and the plots 42 fall within a narrow range in the one-dimensional distribution map.
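The averaging effect described above can be demonstrated with a toy orientation feature. The circular-spread measure and the sample angles below are the editor's illustrative assumptions; they are not the feature amount used by the application.

```python
import numpy as np

def orientation_spread(angles_deg):
    """Circular spread of cell orientations (axial data with a 180-degree period).

    Returns 0 for perfectly aligned orientations and approaches 1 for
    orientations spread evenly in all directions.
    """
    doubled = np.deg2rad(np.asarray(angles_deg) * 2.0)   # fold 180-degree symmetry
    r = np.hypot(np.mean(np.cos(doubled)), np.mean(np.sin(doubled)))
    return 1.0 - r

aligned_a = [10, 12, 9, 11]        # tile covering one aligned cell group
aligned_b = [100, 102, 99, 101]    # tile covering a differently aligned group
mixed = aligned_a + aligned_b      # larger tile covering both groups at once

print(orientation_spread(aligned_a) < 0.01)   # each single group: clear feature
print(orientation_spread(mixed) > 0.5)        # mixed tile: averaged-out feature
```

As in the text, the single-group tiles yield sharply different, informative values, while the tile mixing both groups yields one averaged value that reflects neither group's alignment.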
 According to the present embodiment, the one-dimensional distribution map displayed on the screen of the UI apparatus 3 is updated each time the user transmits a request to change the number of divisions of the original image to the image processing apparatus 2. The user can thus observe the variation of the feature amounts at each number of divisions while changing that number, and identify the number of divisions best suited to calculating the feature amount that most clearly expresses the appearance features of the cell groups. Furthermore, from the feature amounts of the divided images generated with the optimum number of divisions, the presence or absence and the degree of correlation between the management parameter and the feature amount can be known accurately.
 Moreover, once a feature amount correlating with a management parameter has been identified, the management parameter of a cell group can be estimated from an image of that cell group even when the parameter is unknown. That is, the original image of a cell group whose management parameter is unknown can be divided by the optimum number of divisions, the feature amounts of the generated divided images calculated, and the management parameter estimated from the calculated feature amounts.
 When multiple types of management parameters are associated with each original image, the type of management parameter associated with the plots 42 may be changeable; for example, the desired type may be selectable from a pull-down menu in the area 52. When the type of management parameter is changed, the display mode of the plots 42 in the one-dimensional distribution map shown on the UI apparatus 3, for example their color, changes.
 In the present embodiment, the maximum number of divisions, or the minimum size of the divided images, selectable with the slider 53 may be determined according to the number of times the cells in the culture vessel 4 have divided.
 If the cells divide N times during the culture period, one cell grows into 2^N cells. The maximum number of divisions of the original image, or the minimum size of the divided images, may be determined so that each divided image contains at least about 2^N cells. The number of cell divisions N can be estimated from the culture time.
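The sizing rule above (at least roughly 2^N cells per divided image, with N estimated from the culture time) can be turned into a bound on the selectable grid count. The doubling-time estimate of N, the uniform-density assumption, and the square grid below are all editor's assumptions for illustration.

```python
import math

def max_divisions(total_cells_start: int, culture_hours: float,
                  doubling_hours: float) -> int:
    """Largest per-axis grid count n such that each of the n*n divided images
    can still hold at least 2**N cells, N being the estimated division count."""
    n_divisions = int(culture_hours // doubling_hours)     # estimated N
    cells_now = total_cells_start * 2 ** n_divisions       # current cell count
    min_cells_per_tile = 2 ** n_divisions                  # required per tile
    # n*n tiles, each needing min_cells_per_tile cells under uniform density:
    return int(math.isqrt(cells_now // min_cells_per_tile))

# 1000 seeded cells, 72 h of culture, 24 h doubling time -> N = 3.
print(max_divisions(1000, 72, 24))   # -> 31
```

Note that the 2^N factors cancel: under these assumptions the tile budget reduces to the seeded cell count, so the bound tracks how many cells were plated rather than how long they grew.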
 In the present embodiment, the image processing apparatus 2 creates a one-dimensional distribution map of the feature amounts as the division auxiliary information 40; instead, as shown in FIG. 9, it may create a two-dimensional distribution map of the feature amounts.
 In that case, a first feature amount and a second feature amount representing mutually different appearance features are calculated in step S5. In the two-dimensional distribution map, an axis 41a representing the magnitude of the first feature amount and an axis 41b representing the magnitude of the second feature amount are defined.
 In the present embodiment, the user judges whether to change the number of divisions and decides the new number on the basis of the division auxiliary information 40 displayed on the UI apparatus 3; instead, the image processing apparatus 2 may make this judgment and decide the number of divisions itself. That is, as shown in FIG. 10, the image processing apparatus 2 may execute steps S10 to S12 in place of steps S6 to S8.
 After step S5, a histogram of the feature amounts is created (step S10), and a statistic of the histogram is calculated as the division auxiliary information (step S11). The statistic is a value representing the width or variance of the histogram, preferably its half width, for example the half width of a Gaussian curve approximating the histogram. Like the distribution width of the plots 42 in the one-dimensional distribution map, the half width of the histogram is an index of the degree to which the feature amounts are averaged, and it changes with the size of the divided images.
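Steps S10 and S11 can be sketched as follows: approximate the feature histogram with a Gaussian and report its full width at half maximum, FWHM = 2*sqrt(2*ln 2)*sigma. The function name and the synthetic feature samples are the editor's illustrative assumptions.

```python
import math
import random

def histogram_half_width(values):
    """Half width (FWHM) of the Gaussian approximating the value histogram.

    A wider value means the per-tile feature amounts are less averaged-out.
    """
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * math.sqrt(var)

random.seed(0)
spread_out = [random.gauss(0.5, 0.2) for _ in range(1000)]   # small, single-group tiles
averaged   = [random.gauss(0.5, 0.02) for _ in range(1000)]  # large, mixed tiles
print(histogram_half_width(spread_out) > histogram_half_width(averaged))
```

Comparing the half width against a threshold, as in step S11, then decides whether the current number of divisions preserves enough feature variation or should be changed.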
 If the half width exceeds a predetermined threshold (YES in step S11), the number of divisions is not changed and the processing ends. If the half width is at or below the threshold (NO in step S11), the number of divisions is changed (step S12) and step S9 is then executed.
 In step S12, the number of divisions is preferably changed in the direction that widens the half width of the histogram. Alternatively, a table associating numbers of divisions with histogram half widths may be built while repeating steps S5, S10, S11, S12, and S9, and the direction of change and the new number of divisions may be decided from that table.
 The histogram and its half width may be displayed on the UI apparatus 3 each time the number of divisions is changed, or only the final result may be displayed.
 In the present embodiment, the plurality of divided images is generated by dividing the original image equally, but the division method is not limited to this; the original image may be divided by other methods.
 For example, individual cells may be extracted from the entire original image, the original image divided into minimum units each corresponding to one cell, and divided images generated by joining a predetermined number of mutually adjacent minimum units. The size of the divided images may then be changed by changing the number of joined minimum units.
 In the present embodiment, when a preferable initial number of divisions can be predicted from the cell type, the culture conditions, or the type of management parameter, the initial number of divisions may be determined in step S4 on the basis of at least one of these.
 For example, in the culture of iPS cells in an adhesion culture system, the cell survival rate has been confirmed to correlate with a stratified structure of several tens of cells. When investigating the correlation between the survival rate of adhesion-cultured iPS cells and the feature amount, the initial number of divisions of the original image may therefore be set so that each divided image contains several tens of cells. Likewise, in step S12, the number of divisions may be changed in the direction that keeps several tens of cells in each divided image.
 In the present embodiment, the subject is a cell group cultured in the culture vessel 4, but it may be another subject. For example, the subject may be a lesion observed through an endoscope, and a feature amount of an appearance feature of the mucous membrane that correlates strongly with the malignancy of the lesion may be calculated.
 In the present embodiment, the image processing apparatus 2 receives the original image from the monitoring apparatus 1 in step S1; instead, the image processing apparatus 2 may itself image the cell group in the culture vessel 4 and generate the original image. That is, the image processing apparatus 2 may include the imaging device 5 and the housing 8 and perform both generation of the original image and the image processing inside the incubator.
 In the present embodiment, as shown in FIG. 11, the image processing described above may be executed by a server 30; that is, the image processing apparatus according to the present embodiment may be realized as a server.
 In that case, the server 30 includes a memory and a processor configured similarly to the memory 10 and the processor 11, and is connected to the monitoring apparatus 1 and the UI apparatus 3 by a communication network 23. The original image is transmitted from the monitoring apparatus 1 to the server 30, the server 30 creates the division auxiliary information by the image processing described above, and the division auxiliary information is transmitted from the server 30 to the UI apparatus 3.
 Realizing image processing that involves heavy computation requires high processing capability. Entrusting the image processing to the server 30, in particular a cloud server, allows the image processing system 100 to be simplified and its cost to be reduced.
DESCRIPTION OF SYMBOLS
1 Monitoring apparatus
2 Image processing apparatus
3 User interface apparatus
4 Culture vessel
5 Imaging device
6, 11, 15 Processor
7, 12, 16 Communication apparatus
8 Housing
8a Top plate
10 Memory
14 Display
21, 22, 23 Communication network
30 Server
40 Division auxiliary information
41, 41a, 41b Axis
42 Plot
100 Image processing system
A, B, C Original images
A1 to A4, B1 to B4, C1 to C4 Divided images

Claims (14)

  1.  An image processing method comprising:
      acquiring a subject image obtained by imaging a subject;
      dividing the subject image into a plurality of divided images;
      calculating a feature amount of at least some of the plurality of divided images, the feature amount being a quantity representing an appearance feature of the subject; and
      creating division auxiliary information based on the feature amount,
      wherein the division auxiliary information is information used to determine a method of dividing the subject image.
  2.  The image processing method according to claim 1, wherein the division auxiliary information includes a distribution map of the feature amounts, and in the distribution map, plots indicating the individual feature amounts are displayed on an axis representing the magnitude of the feature amount.
  3.  The image processing method according to claim 1, wherein the division auxiliary information includes a statistic based on a histogram of the feature amounts.
  4.  The image processing method according to claim 2, wherein the division auxiliary information includes the divided images associated with the respective plots.
  5.  The image processing method according to claim 2 or claim 4, further comprising:
      acquiring a management parameter relating to an internal property of the subject; and
      classifying the management parameter into one of a plurality of classes according to its magnitude,
      wherein the plots are displayed in a mode corresponding to the class of the management parameter of the subject image.
  6.  The image processing method according to claim 5, wherein the subject is a cell group being cultured, and the management parameter is a growth rate of the cell group, a survival rate of the cell group when a predetermined time has elapsed after the end of culture, or a cell count of the cell group when a predetermined time has elapsed after the end of culture.
  7.  The image processing method according to any one of claims 1 to 6, further comprising:
      changing the method of dividing the subject image to a division method determined based on the division auxiliary information; and
      dividing the subject image into a plurality of divided images by the changed division method.
  8.  The image processing method according to claim 7, wherein the size of the divided images is changed in the step of changing the division method.
  9.  The image processing method according to claim 7 or claim 8, further comprising transmitting the division auxiliary information to a user interface device.
  10.  The image processing method according to claim 9, further comprising accepting a request from the user interface device to change the division method.
  11.  An image processing apparatus comprising a processor, wherein the processor executes:
      acquiring a subject image obtained by imaging a subject;
      dividing the subject image into a plurality of divided images;
      calculating a feature amount of at least some of the plurality of divided images, the feature amount being a quantity representing an appearance feature of the subject; and
      creating division auxiliary information based on the feature amount,
      and wherein the division auxiliary information is information used to determine a method of dividing the subject image.
  12.  A user interface device connected by a communication network to the image processing apparatus according to claim 11, the user interface device comprising a display that displays the division auxiliary information received from the image processing apparatus.
  13.  An image processing system comprising:
      the image processing apparatus according to claim 11; and
      the user interface device according to claim 12.
  14.  A server connected by a communication network to the image processing apparatus according to claim 11, the server receiving and storing the division auxiliary information.
PCT/JP2018/009966 2018-03-14 2018-03-14 Image processing method, image processing device, user interface device, image processing system and server WO2019176012A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/009966 WO2019176012A1 (en) 2018-03-14 2018-03-14 Image processing method, image processing device, user interface device, image processing system and server
JP2020506023A JP6931418B2 (en) 2018-03-14 2018-03-14 Image processing methods, image processing devices, user interface devices, image processing systems, servers, and image processing programs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/009966 WO2019176012A1 (en) 2018-03-14 2018-03-14 Image processing method, image processing device, user interface device, image processing system and server

Publications (1)

Publication Number Publication Date
WO2019176012A1 true WO2019176012A1 (en) 2019-09-19

Family

ID=67907519

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/009966 WO2019176012A1 (en) 2018-03-14 2018-03-14 Image processing method, image processing device, user interface device, image processing system and server

Country Status (2)

Country Link
JP (1) JP6931418B2 (en)
WO (1) WO2019176012A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022220300A1 (en) * 2021-04-16 2022-10-20 国立大学法人九州大学 Treatment effect prediction device, treatment effect prediction method, treatment effect prediction program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08329241A (en) * 1995-06-05 1996-12-13 Xerox Corp Improvement method of contrast
JP2014232485A (en) * 2013-05-30 2014-12-11 三星電子株式会社Samsung Electronics Co.,Ltd. Texture detection device, texture detection method, texture detection program, and image processing system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2951909B2 (en) * 1997-03-17 1999-09-20 松下電器産業株式会社 Gradation correction device and gradation correction method for imaging device
JP2015200695A (en) * 2014-04-04 2015-11-12 キヤノン株式会社 Image processor and control method thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08329241A (en) * 1995-06-05 1996-12-13 Xerox Corp Improvement method of contrast
JP2014232485A (en) * 2013-05-30 2014-12-11 三星電子株式会社Samsung Electronics Co.,Ltd. Texture detection device, texture detection method, texture detection program, and image processing system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
UENO ET AL., IECE TECHNICAL REPORT, vol. 107, no. 115, 1 June 2007 (2007-06-01), pages 63 - 68 *
YUKA ET AL., 6 March 2013 (2013-03-06), pages 623 - 624 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022220300A1 (en) * 2021-04-16 2022-10-20 国立大学法人九州大学 Treatment effect prediction device, treatment effect prediction method, treatment effect prediction program

Also Published As

Publication number Publication date
JPWO2019176012A1 (en) 2021-01-07
JP6931418B2 (en) 2021-09-01

Similar Documents

Publication Publication Date Title
US20220058525A1 (en) Model integration apparatus, model integration method, computer-readable storage medium storing a model integration program, inference system, inspection system, and control system
CN107423551B (en) Imaging method and imaging system for performing medical examinations
WO2018101004A1 (en) Cell image evaluation system and program for controlling cell image evaluation
JP5871325B2 (en) Information processing apparatus, information processing system, information processing method, program, and recording medium
CN102843511B (en) Image is processed back-up system, messaging device and image and is processed support method
CN104680524B (en) A kind of leafy vegetable disease screening method
CN107295310B (en) planting monitoring method and planting monitoring device
US10007835B2 (en) Cell region display control device, method, and program
EP3326109A1 (en) System and method for providing a recipe
JP2020024672A (en) Information processor, information processing method and program
CN110263748A (en) Method and apparatus for sending information
WO2019176012A1 (en) Image processing method, image processing device, user interface device, image processing system and server
CN109191386A (en) A kind of quick Gamma bearing calibration and device based on BPNN
JP2020507153A (en) Plant monitoring
US20210019656A1 (en) Information processing device, information processing method, and computer program
JP2009303011A (en) Image data processing apparatus, program thereof, and method thereof
JP2015171334A (en) Colony counting method, occurrence detection method of combination among bacteria colonies, counting method of bacteria colonies, colony counting program, and colony counter
WO2015045012A1 (en) Colony inspection program, colony inspection device, and colony inspection method
Linhares et al. How good are RGB cameras retrieving colors of natural scenes and paintings?—A study based on hyperspectral imaging
CN111064864A (en) Method and device for setting distortion correction parameters and endoscope system
CN108805062A (en) A kind of intelligent monitor system
KR20160064265A (en) Environment control system per cultivation block
CN110515084B (en) Method for estimating number of field bird targets based on acoustic imaging technology
JP2015231343A (en) Colony classification amending method, classification amending program, and classification amending apparatus
JP5449444B2 (en) Colony counting device, colony counting method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18909346

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020506023

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18909346

Country of ref document: EP

Kind code of ref document: A1