WO2011122200A1 - Data processing device and data processing method, image processing device and method, and program - Google Patents
- Publication number
- WO2011122200A1 (application PCT/JP2011/054557)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- evaluation
- motion
- unit
- index data
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/223—Analysis of motion using block-matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- The present disclosure relates to a data processing device and data processing method, an image processing device and method, and a program, and in particular to a data processing device and method, an image processing device and method, and a program that generate data serving as an index for evaluating an object that moves periodically.
- A technique is known in which a measurement point is set in an imaging screen obtained by photographing cardiomyocytes, the luminance at the measurement point is measured automatically, and the deformation period of the cardiomyocytes is determined from the measured values (see, for example, Patent Document 1).
- In this technique, the periodic change in luminance is the measurement target, so the measurement is limited to the time interval of the pulsation cycle.
- Although the technique is non-invasive, it shares a problem with potential measurement: the information that can be quantified remains limited to the pulsation cycle, so it is still difficult to obtain an accurate evaluation result.
- The present disclosure has been made in view of this situation, and its purpose is to enable the motion of an object that moves periodically, typified by cultured cardiomyocytes, to be evaluated with greater precision and accuracy than before.
- A first aspect of the present disclosure includes a motion detection unit that divides each of a plurality of frame image data forming moving image data, whose image content is an object that moves periodically, into blocks of a predetermined number of pixels and detects time-series motion data for each block, and a feature amount calculation unit that calculates at least one type of feature amount for each block based on the detected time-series motion data.
- The feature amount calculation unit may calculate a plurality of types of the feature amounts for each block, and the classification unit may generate the classification data based on the calculated types of feature amounts.
- The feature amount calculation unit may calculate, as one type of the feature amount, an average motion direction, which is the average of the motion directions per unit time over a fixed time.
- The feature amount calculation unit may calculate, as one type of the feature amount, an average motion amount, which is the average of the motion amounts per unit time over a fixed time.
- The feature amount calculation unit may calculate, as one type of the feature amount, an average amplitude, which is the average of the amplitudes of the motion amount equal to or greater than a certain value obtained within a predetermined time.
- The feature amount calculation unit may calculate, as one type of the feature amount, an average acceleration, which is the average of the motion accelerations per unit time over a fixed time.
- The feature amount calculation unit may calculate, as one type of the feature amount, an average motion interval, which is the average of the time intervals at which amplitudes of the motion amount equal to or greater than a certain value are obtained within a fixed time.
- The feature amount calculation unit may calculate, as one type of the feature amount, a motion start time, which is the time from a predetermined timing until an amplitude of the motion amount equal to or greater than a certain value is first obtained.
- The classification unit may perform, for each block, a process of calculating the distance between the block and each of a plurality of templates having different combinations of feature amounts corresponding to a plurality of classification categories, and classifying the block as belonging to one of the classification categories based on the calculated distances.
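For illustration, the template-based classification just described can be sketched as follows, assuming Euclidean distance over per-block feature vectors; the template values, feature choices, and category names here are hypothetical, not taken from the embodiment:

```python
import math

def classify_block(features, templates):
    """Assign a block's feature vector to the category of the nearest template.

    features:  tuple of feature values for one block
    templates: dict mapping category name -> reference feature tuple
    Returns the category whose template has the smallest Euclidean distance.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda cat: dist(features, templates[cat]))

# Hypothetical templates over (average motion amount, average amplitude).
templates = {
    "strong_beat": (5.0, 8.0),
    "weak_beat": (1.0, 2.0),
    "static": (0.1, 0.2),
}
category = classify_block((4.2, 7.5), templates)  # nearest to "strong_beat"
```

Any distance measure over the feature space could be substituted; the key point is that each block is labeled with the category of its closest template.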
- The classification unit may classify each block as belonging to one of a predetermined number of classification categories by performing clustering with the k-means method based on the feature amounts calculated for each block.
- An image processing apparatus includes a motion detection unit that detects the motion of an evaluation target using an image of the evaluation target, an index data generation unit that uses motion vectors indicating the detected motion to represent the characteristics of that motion and generates index data used as an index for evaluating the evaluation target, and an evaluation value calculation unit that evaluates the generated index data and calculates an evaluation value.
- The index data generation unit may generate index data concerning the magnitude of the amplitude of the motion of the evaluation target and index data concerning the frequency per unit time of the peaks of that motion. The evaluation value calculation unit may calculate an evaluation value for the magnitude of the motion using the former, and an evaluation value for the peak frequency per unit time using the latter.
- The index data concerning the magnitude of the motion amplitude of the evaluation target may be the average, over the entire image of the evaluation target, of the product of the normalized amplitude and the normalized variance of the amplitude.
- Alternatively, it may be the ratio, relative to the entire image of the evaluation target, of the region where the product of the normalized amplitude and the normalized variance of the amplitude is equal to or greater than a predetermined threshold.
- The index data concerning the frequency per unit time of the motion peaks of the evaluation target may be the average, over the entire screen, of the product of the normalized number of peaks per unit time and the normalized variance of that number.
- Alternatively, it may be the ratio, relative to the entire image, of the region where the product of the normalized number of peaks per unit time and the normalized variance of that number is equal to or greater than a predetermined threshold.
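As a sketch of the two amplitude-based indices above, assuming per-block amplitude means and variances have already been computed, and assuming min–max scaling as the normalization (the patent does not specify the normalization method):

```python
def normalize(values):
    """Scale per-block values into [0, 1] (min-max scaling, an assumption)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

def amplitude_indices(amplitudes, variances, threshold):
    """Index 1: mean over all blocks of normalized amplitude x normalized variance.
    Index 2: fraction of blocks where that product is >= threshold."""
    products = [a * v for a, v in zip(normalize(amplitudes), normalize(variances))]
    mean_index = sum(products) / len(products)
    ratio_index = sum(1 for p in products if p >= threshold) / len(products)
    return mean_index, ratio_index

mean_idx, ratio_idx = amplitude_indices([2.0, 8.0, 5.0, 1.0],
                                        [0.5, 3.0, 1.5, 0.2], 0.25)
```

The peak-frequency indices follow the same pattern, with the number of peaks per unit time and its variance in place of the amplitude statistics.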
- The index data generation unit may further generate index data relating to a classification result obtained by classifying each partial region of the evaluation target image based on feature amounts of the motion of the evaluation target, and the evaluation value calculation unit may further calculate an evaluation value for that classification result using the generated index data.
- The index data generation unit may calculate the amount of motion of the evaluation target detected by the motion detection unit, and the evaluation value calculation unit may display the temporal change of the calculated amount of motion.
- The index data generation unit may generate index data indicating how the peak of the waveform representing relaxation of the cardiomyocytes under evaluation changes, in the temporal variation of the calculated amount of motion, due to administration of a drug to the cardiomyocytes, and the evaluation value calculation unit may evaluate that index data and calculate an evaluation value.
- The image processing apparatus may further include an imaging unit that captures the evaluation target to obtain the evaluation target image, and the motion detection unit may detect the motion of the evaluation target using the image obtained by the imaging unit.
- The motion detection unit can detect the motion of the evaluation target between frame images within an evaluation period of predetermined length of the evaluation target image, which is a moving image.
- The motion detection unit can repeat the detection of the motion of the evaluation target during the evaluation period a predetermined number of times.
- The evaluation value calculation unit may evaluate each of a plurality of types of index data generated by the index data generation unit, calculate an evaluation value for each, and integrate the calculated evaluation values into an overall evaluation value for the evaluation target.
- the evaluation object can be a cell that moves spontaneously.
- the evaluation object may be a cultured cell generated by culturing a cell collected from a living body.
- An image processing method in which the motion detection unit of the image processing apparatus detects the motion of the evaluation target using the evaluation target image, the index data generation unit uses motion vectors indicating the detected motion to represent the characteristics of that motion and generates index data used as an index for evaluating the evaluation target, and the evaluation value calculation unit evaluates the generated index data and calculates an evaluation value.
- A program that causes a computer to function as a motion detection unit that detects the motion of an evaluation target using an image of the evaluation target, an index data generation unit that uses motion vectors indicating the detected motion to represent the characteristics of that motion and generates index data used as an index for evaluating the evaluation target, and an evaluation value calculation unit that evaluates the generated index data and calculates an evaluation value.
- According to the present disclosure, the motion of an evaluation target is detected using an image of the evaluation target, motion vectors indicating the detected motion are used to represent the characteristics of that motion, index data used as an index for evaluating the evaluation target is generated, the generated index data is evaluated, and an evaluation value is calculated.
- Classification data can thus be obtained that serves as an index enabling precise and accurate evaluation of an object that moves periodically.
- FIG. 1 is a diagram illustrating a configuration example of a cultured cardiomyocyte evaluation system 100.
- FIG. 2 is a block diagram illustrating a configuration example of an evaluation index data generation device 300.
- FIG. 3 is a diagram schematically illustrating a structure example of evaluation target image data 600.
- FIG. 4 is a block diagram illustrating a configuration example of a motion detection unit 310.
- FIG. 5 is a diagram schematically illustrating the process of dividing frame image data into blocks.
- FIG. 6 is a diagram schematically illustrating a structure example of motion detection data 700.
- FIG. 7 is a diagram illustrating examples of the calculated feature amounts.
- Further figures include a block diagram illustrating a main configuration example of a drug evaluation apparatus, flowcharts illustrating example flows of the index data generation process and the drug evaluation process, diagrams explaining how the pulsation rhythm and the variability of pulsation behavior change with drug administration, and a block diagram illustrating a main configuration example of a personal computer.
- First embodiment (evaluation index data generation process: an example in which the classification process is executed using templates)
- Second embodiment (evaluation index data generation process: an example in which the classification process is executed by the k-means method)
- Third embodiment (cultured cardiomyocyte evaluation apparatus)
- Fourth embodiment (drug evaluation apparatus)
- Fifth embodiment (personal computer)
- FIG. 1 shows a configuration example of a cultured cardiomyocyte evaluation system 100.
- the cultured cardiomyocyte evaluation system 100 shown in this figure is for evaluating the quality of the cultured cardiomyocytes 500.
- The cultured cardiomyocytes 500 are cells cultured for use in the treatment of heart disease.
- Technology is being developed to mass-produce such cultured cells so that a sufficient amount can be supplied to medical sites at low cost.
- When cultured cells are mass-produced in this way, it must be possible to evaluate them efficiently and accurately.
- The cultured cardiomyocytes 500 spontaneously move in accordance with their pulsation.
- The quality of the cultured cardiomyocytes 500 can therefore be judged by evaluating whether this pulsation movement is good.
- The cultured cardiomyocyte evaluation system 100 records moving image data obtained by imaging the cultured cardiomyocytes 500 and performs an evaluation based on motion detection applied to the recorded moving image data. Although non-invasive, this yields a more detailed and accurate evaluation result than visual inspection.
- the cultured cardiomyocyte evaluation system 100 includes, for example, an imaging device 110, an evaluation target image data generation / recording device 200, an evaluation index data generation device 300, and an evaluation device 400, as shown in the figure.
- the imaging device 110 is for imaging the cultured cardiomyocytes 500 to be evaluated.
- In the figure, the cultured cardiomyocytes 500 appear to be photographed directly by the imaging device 110, but in practice, for example, a microscope image of the cultured cardiomyocytes 500 is captured.
- the imaging position of the imaging device 110 with respect to the cultured cardiomyocytes 500 is fixed.
- The evaluation target image data generation/recording device 200 is a device that generates evaluation target image data based on the image signal input from the imaging device 110 and records the generated data on, for example, an internal recording medium.
- the evaluation target image data generated here is, for example, moving image data generated from an image signal obtained by imaging the cultured cardiomyocytes 500.
- The evaluation index data generation device 300 takes as input, for example, the moving image data stored as evaluation target image data in the evaluation target image data generation/recording device 200, and generates evaluation index data used as an index for evaluating the cultured cardiomyocytes 500.
- the evaluation device 400 is a device that obtains an evaluation result by processing the evaluation index data generated by the evaluation index data generation device 300.
- FIG. 2 shows a configuration example of the evaluation index data generation device 300.
- the evaluation index data generation device 300 shown in this figure includes a motion detection unit 310, a motion detection data storage unit 320, a feature amount calculation unit 330, and a classification processing unit 340.
- The evaluation target image data 600 shown in this figure is obtained by reproducing the data recorded in the evaluation target image data generation/recording device 200 and, as described above, is moving image data composed of frame image data.
- the motion detection unit 310 is a part that inputs the evaluation target image data 600 and executes a motion detection process. Note that a specific example of motion detection processing by the motion detection unit 310 in this case and a structure example of motion detection data will be described later.
- the motion detection data storage unit 320 is a part that stores motion detection data obtained as a result of the motion detection processing of the motion detection unit 310.
- the feature amount calculation unit 330 is a part that calculates and acquires a predetermined feature amount using the motion detection data stored in the motion detection data storage unit 320. An example of the feature amount calculated here will be described later.
- the classification processing unit 340 is a part for obtaining the evaluation index data 800 by executing the classification process based on the feature amount information obtained by the feature amount calculation unit 330. A specific example of this classification process will be described later.
- the evaluation index data 800 obtained by the classification processing unit 340 is an example of classification data described in the claims.
- FIG. 3 shows a structural example of the evaluation target image data 600 input by the evaluation index data generation device 300.
- The evaluation target image data 600 is composed of the first to (T+1)th frame image data 610-1 to 610-(T+1), corresponding to a fixed time.
- The moving image data stored in the evaluation target image data generation/recording device 200 as the evaluation target image data 600 may consist exactly of the frame image data 610-1 to 610-(T+1) shown in FIG. 3, or may be moving image data that includes the frame image data 610-1 to 610-(T+1) as a partial section. In the latter case, for example, the image section judged optimal for the evaluation is extracted from the stored moving image data, and the evaluation index data generation device 300 takes the moving image data of that section as the evaluation target image data 600 of FIG. 3.
- FIG. 4 shows a configuration example of the motion detection unit 310.
- the motion detection unit 310 shown in this figure includes a frame memory 311 and a motion vector calculation unit 312.
- the frame memory 311 is a part that holds frame image data 610 that is sequentially input as the evaluation target image data 600 every frame period.
- the motion vector calculation unit 312 is a part that calculates a motion vector.
- the motion vector calculation unit 312 inputs the frame image data input as the evaluation target image data 600 at the current time and the frame image data of the next time held in the frame memory 311. Then, a motion vector is calculated using these two frame image data. The calculated motion vector is held in the motion detection data storage unit 320 as motion detection data 700.
- the motion vector calculation unit 312 receives the frame image data 610 at the current time and the frame image data 610 at the next time.
- First, the motion vector calculation unit 312 divides the input frame image data 610 into blocks. That is, as shown in FIG. 5, the two-dimensional pixel region formed by the frame image data 610 is divided into M blocks in the horizontal direction and N blocks in the vertical direction, so that the frame image data 610 is divided into (M × N) blocks 611.
- Each block 611 consists of, for example, (16 × 16) pixels.
- The motion vector calculation unit 312 calculates a motion vector for each block 611 as the unit of processing in the motion detection process. The motion detection process is executed by sequentially using the first to (T+1)th frame image data 610.
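The per-block motion detection described above can be sketched as an exhaustive block-matching search minimizing the sum of absolute differences between a block of one frame and candidate windows of the next; the frame size, block size, and search range below are illustrative, not taken from the embodiment:

```python
def block_motion(prev, curr, bs=4, search=2):
    """Estimate one motion vector per bs x bs block by finding the offset
    (within +/-search pixels) into `curr` that minimizes the sum of absolute
    differences (SAD) against the block of `prev`. Frames are 2-D lists."""
    h, w = len(prev), len(prev[0])
    vectors = {}
    for by in range(0, h - bs + 1, bs):
        for bx in range(0, w - bs + 1, bs):
            best = None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    # Skip candidate windows that fall outside the frame.
                    if not (0 <= by + dy and by + dy + bs <= h and
                            0 <= bx + dx and bx + dx + bs <= w):
                        continue
                    sad = sum(abs(prev[by + y][bx + x] -
                                  curr[by + dy + y][bx + dx + x])
                              for y in range(bs) for x in range(bs))
                    if best is None or sad < best[0]:
                        best = (sad, dx, dy)
            vectors[(bx // bs, by // bs)] = (best[1], best[2])
    return vectors

# A bright 2x2 patch shifts right by 1 pixel between the two frames.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in (2, 3):
    prev[y][2] = prev[y][3] = 255
    curr[y][3] = curr[y][4] = 255
```

Repeating this for every pair of adjacent frames yields the time series of per-block motion vectors stored as the motion detection data 700.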
- the motion detection data 700 obtained at the stage when the last motion detection process using the Tth and (T + 1) th frame image data 610 is completed is as shown in FIG.
- The motion detection data 700 shown in this figure is composed of T frame unit motion detection data 710-1 to 710-T.
- Each of the frame unit motion detection data 710-1 to 710-T is obtained by performing the motion detection process on the current frame image data 610 and the next frame image data 610 obtained for each frame period.
- For example, the third frame unit motion detection data 710-3 is obtained by inputting the third frame image data 610-3 and the fourth frame image data 610-4 as the frame image data of the current time and the next time, respectively, and performing motion detection.
- Each of the frame unit motion detection data 710-1 to 710-T is formed of (M × N) block unit motion detection data 711.
- Each block-unit motion detection data 711 corresponds to one block 611 and is data indicating a motion vector detected for the corresponding block 611.
- As described above, the motion detection data 700 has a structure in which each frame unit motion detection data 710 contains (M × N) block unit motion detection data 711. This means that time-series data of motion vectors is obtained for each block 611 forming the frame image data 610.
- the feature amount calculation unit 330 calculates a plurality of feature amounts using the motion detection data 700 stored in the motion detection data storage unit 320. First, an example of a feature amount calculated and acquired by the feature amount calculation unit 330 will be described with reference to FIG.
- FIG. 7 shows, in time series, the motion vectors indicated by the block unit motion detection data 711 corresponding to one block 611. Since there are T frame unit motion detection data 710, as described with reference to FIG. 6, there are likewise T block unit motion detection data 711 for each block 611.
- FIG. 7 plots, in time series, samples of the motion amounts indicated by these T block unit motion detection data 711.
- Although the block unit motion detection data 711 contains the motion amounts of the horizontal and vertical components as motion vector information, FIG. 7 assumes that a single motion amount, of either the horizontal or the vertical component, is shown.
- the vertical axis indicates the amount of motion
- the horizontal axis indicates the frame, that is, time.
- Here, six feature amounts are assumed: an average motion amount Vav, an average motion direction θav, an average amplitude Aav, an average acceleration Bav, an average pulsation interval Dav, and a pulsation start time S. As will be understood from the following description, all of these feature amounts are obtained from the detected motion vectors, and each is obtained per block 611.
- The T block unit motion detection data 711 indicate the change of the motion amount over time. This change reflects the periodic alternation between the moving state and the stationary state of the cultured cardiomyocytes 500 caused by their pulsation. In this respect, the change of the motion amount over time can be said to have characteristics corresponding to the pulsation. Therefore, in the present disclosure, the average of the T motion amounts obtained in the evaluation section is treated as a feature amount.
- The average motion amount Vav can be obtained by the following equations, where Vx and Vy are the horizontal and vertical motion amounts obtained as the motion vector in each block unit motion detection data 711 and n is a variable corresponding to the frame order: V(n) = √(Vx(n)² + Vy(n)²), Vav = (1/T)·Σₙ V(n). That is, a combined motion amount V is first obtained for each of the T block unit motion detection data 711 by combining the horizontal and vertical motion amounts Vx and Vy, and the average of these T combined motion amounts V gives Vav.
- Alternatively, the average of the horizontal motion amounts Vx (horizontal average motion amount Vavx) and the average of the vertical motion amounts Vy (vertical average motion amount Vavy) in the evaluation period may be calculated, and the average motion amount Vav obtained by combining Vavx and Vavy.
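As a sketch, the two routes to the average motion amount described above might look like this in pure Python; note that the two generally give different values, since the second averages the components before combining them:

```python
import math

def average_motion_amount(vx, vy):
    """Average of the per-frame combined motion amounts
    V(n) = sqrt(Vx(n)^2 + Vy(n)^2) over the T frames of the section."""
    combined = [math.hypot(x, y) for x, y in zip(vx, vy)]
    return sum(combined) / len(combined)

def average_motion_amount_from_components(vx, vy):
    """Alternative route: average each component first, then combine."""
    t = len(vx)
    return math.hypot(sum(vx) / t, sum(vy) / t)
```

For example, with vx = [3, -3] and vy = [4, 4], the first route gives 5.0 while the second gives 4.0, because opposing horizontal motion cancels when the components are averaged first.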
- The average motion direction θav is the average of the T motion directions θ obtained in the evaluation section.
- This average motion direction θav can be obtained, for example, as θ(n) = tan⁻¹(Vy(n)/Vx(n)), θav = (1/T)·Σₙ θ(n).
- From the distribution of the average motion direction θav over all blocks 611 in the frame image, it is possible to evaluate how uniformly the cultured cardiomyocytes 500 move due to pulsation, and to evaluate the spatial distribution of the portions where the motion direction is not uniform.
- The average amplitude Aav is the average of the amplitudes, equal to or greater than a certain value, obtained in the evaluation section.
- The average amplitude Aav can be obtained, for example, as follows.
- Peak detection is performed on the T combined motion amounts V in the evaluation section, and the detected peak values are averaged to obtain the average amplitude Aav.
- Alternatively, peak detection is performed on the horizontal component of the motion amount and the detected peak values are averaged to obtain the horizontal average amplitude Aavx; the vertical average amplitude Aavy is obtained in the same way; and the average amplitude Aav is calculated by combining Aavx and Aavy.
- For example, the larger the average amplitude Aav, the larger the pulsation movement can be evaluated to be in the portion of the cultured cardiomyocytes 500 corresponding to the block 611.
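A minimal sketch of the first route above: local maxima of the combined motion amount at or above a threshold are taken as pulsation peaks, and their mean gives Aav (the specific peak criterion and threshold value are assumptions):

```python
def detect_peaks(series, threshold):
    """Indices of local maxima in `series` whose value is >= threshold."""
    return [i for i in range(1, len(series) - 1)
            if series[i] >= threshold
            and series[i] > series[i - 1] and series[i] >= series[i + 1]]

def average_amplitude(series, threshold):
    """Mean of the detected peak values (the average amplitude Aav)."""
    peaks = detect_peaks(series, threshold)
    return sum(series[i] for i in peaks) / len(peaks)

# Combined motion amounts with two pulsation peaks of values 8 and 6.
v = [0, 1, 8, 1, 0, 1, 6, 1, 0]
```

The same peak list can be reused for the interval and start-time features discussed below, since those depend only on where the peaks occur in time.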
- The average acceleration Bav is the average of the accelerations obtained in the evaluation section, and can be obtained, for example, as follows.
- The combined motion amounts V indicated by a predetermined number of block unit motion detection data 711 adjacent in time-series order are successively differentiated.
- In this way, the acceleration B per predetermined time in the evaluation section is calculated.
- The average acceleration Bav is then obtained as the average of these accelerations B.
- Alternatively, the horizontal motion amounts Vx indicated by a predetermined number of block unit motion detection data 711 adjacent in time-series order are successively differentiated to calculate the horizontal acceleration Bx per predetermined time, and the horizontal average acceleration Bavx is obtained as the average of these values.
- The vertical average acceleration Bavy is calculated in the same way.
- The average acceleration Bav can then be obtained by combining the horizontal average acceleration Bavx and the vertical average acceleration Bavy.
- The average acceleration Bav is an index of the agility with which the cultured cardiomyocytes 500 change from the stationary state to the moving state in accordance with the pulsation. If the average acceleration Bav is high, it can be evaluated that the pulsation movement of the corresponding portion of the cultured cardiomyocytes 500 is agile.
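The successive differentiation described above can be sketched as a finite difference of adjacent motion amounts, averaged over the evaluation section; taking the absolute change and a frame interval of 1 are simplifying assumptions:

```python
def average_acceleration(v, dt=1.0):
    """Mean absolute change of the motion amount between successive frames,
    used as a sketch of the average acceleration Bav (dt = frame interval;
    using the absolute difference is an assumption of this illustration)."""
    diffs = [abs(v[i + 1] - v[i]) / dt for i in range(len(v) - 1)]
    return sum(diffs) / len(diffs)

# A single pulse: the motion amount rises to 6 and falls back to 0.
v = [0.0, 2.0, 6.0, 2.0, 0.0]
```

A sharper pulse (larger frame-to-frame changes) yields a larger value, matching the interpretation of Bav as an index of agility.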
- The average pulsation interval Dav is the average of the pulsation intervals obtained in the evaluation section, and can be obtained, for example, as follows.
- The time intervals D between successive amplitude peaks of the motion amount are measured, and the average of these pulsation intervals D gives Dav. From the distribution of Dav over all blocks 611 in the frame, it is possible to evaluate how uniform the pulsation time interval is across the entire cultured cardiomyocytes 500; focusing on the non-uniform portions of the distribution makes it possible to evaluate where the pulsation interval deviates.
- the average pulsation interval Dav is an example of the average motion interval described in the claims.
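A minimal sketch of how the average pulsation interval Dav could be computed from a block's motion-amount series. The peak detector (strict local maxima above a threshold) is an assumption for illustration; the document does not specify one:

```python
import numpy as np

def average_pulsation_interval(v, dt=1.0, threshold=0.0):
    """Average pulsation interval Dav from a motion-amount series.

    Peaks of the motion amount are taken as pulsation events; the
    intervals D between consecutive peaks are averaged.
    """
    v = np.asarray(v, dtype=float)
    # Indices of strict local maxima above a (hypothetical) noise threshold.
    peaks = [i for i in range(1, len(v) - 1)
             if v[i] > v[i - 1] and v[i] > v[i + 1] and v[i] > threshold]
    if len(peaks) < 2:
        return None  # fewer than two beats in the evaluation section
    d = np.diff(peaks) * dt
    return float(np.mean(d))
```

Running this per block over the frame yields the per-block Dav distribution discussed above.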
- the pulsation start time S is obtained by measuring the time from when the evaluation interval starts until the peak of the amplitude of the motion amount according to the motion of the first pulsation is obtained.
- for the pulsation start time S as well, the distribution of the pulsation start time S for each block 611 over the entire frame makes it possible to evaluate how uniform the pulsation start timing is across the entire cultured cardiomyocyte 500. Further, focusing on a non-uniform distribution makes it possible to evaluate the distribution of shifts in the pulsation start timing.
- each of the above feature amounts is calculated based on the detected motion vector (motion amount). That is, in the present disclosure, various items can be quantified from the time-series data of the motion vector.
- the feature amount calculation unit 330 can be configured to calculate any of the six feature amounts: the average motion amount Vav, the average motion direction θav, the average amplitude Aav, the average acceleration Bav, the average pulsation interval Dav, and the pulsation start time S. In practice, for example, only the feature amounts required for the classification process executed by the classification processing unit 340 are calculated from among these.
- the classification processing unit 340 executes a classification process using a plurality of types of feature amounts calculated by the feature amount calculation unit 330 as described above, and obtains the classification processing result as evaluation index data 800.
- [Classification Method] There are several possible classification methods.
- a method called clustering is adopted. That is, a plurality of classification sections called clusters are set, and each block 611 forming the frame image data 610 shown in FIG. 5 is classified into one of the plurality of clusters according to its feature amounts.
- the template method is adopted in the first embodiment.
- the feature amounts adopted here are the average motion amount Vav and the average motion direction θav from among the feature amounts described above.
- the feature amount calculation unit 330 calculates the average motion amount Vav and the average motion direction θav.
- the classification processing unit 340 prepares, for example, the following first to fifth templates as combinations of the average motion amount Vav and the average motion direction θav. Note that FIG. 8 is referred to when describing these templates; FIG. 8 shows specific examples of the average motion direction θav defined in the following first to fifth templates.
- (Vav, θav) represents a combination of the average motion amount Vav and the average motion direction θav.
- the average motion amount Vav is “0”, which indicates that the motion amount is 0 over the evaluation section. That is, the first template is a template in which the image portion corresponding to the position of the block 611 remains stationary in the evaluation section.
- "a" in the above expression indicates that the average motion amount Vav can take any value other than 0. That is, the second template uses a combination in which the image portion corresponding to the position of the block 611 has a motion and the average motion direction θav is the direction of "0" shown in FIG. 8.
- the third template uses a combination in which the image portion corresponding to the position of the block 611 has a motion and the average motion direction θav is the direction of "π/4 (45°)" shown in FIG. 8.
- the fourth template likewise uses a combination in which the image portion corresponding to the position of the block 611 has a motion and the average motion direction θav is the direction of "π/2 (90°)" shown in FIG. 8.
- the fifth template uses a combination in which the image portion corresponding to the position of the block 611 has a motion and the average motion direction θav is the direction of "3π/4 (135°)" shown in FIG. 8.
- each cluster corresponding to each of the first to fifth templates is referred to as a first to fifth cluster.
- the classification processing unit 340 calculates each distance between the combination (Vav, ⁇ av) of the feature amounts obtained for one block 611 and the first to fifth templates. Then, the block 611 is classified as belonging to the cluster corresponding to the template having the closest calculated distance. For example, if the calculated distance is closest to the third template, the block 611 is classified into the third cluster. Such a classification process is performed for each block 611. As a result, data of contents obtained by classifying each of the blocks 611 forming the frame image data 610 into one of the first to fifth clusters is obtained. This is the evaluation index data 800 obtained by the classification process as the first embodiment.
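The nearest-template classification described above could be sketched as follows. The template values, the Euclidean distance metric, and the fixed stand-in value for the nonzero motion amount "a" are all assumptions for illustration; the fourth template's direction (π/2) is inferred from the pattern of the series:

```python
import math

# Hypothetical templates as (Vav, theta_av) pairs; "a" stands for any
# nonzero average motion amount, approximated here by a fixed value.
A = 1.0
TEMPLATES = {
    1: (0.0, 0.0),            # stationary over the evaluation section
    2: (A, 0.0),              # moving, direction 0
    3: (A, math.pi / 4),      # moving, direction pi/4
    4: (A, math.pi / 2),      # moving, direction pi/2
    5: (A, 3 * math.pi / 4),  # moving, direction 3*pi/4
}

def classify_block(vav, theta_av):
    """Assign a block's (Vav, theta_av) to the cluster of the nearest template."""
    def dist(t):
        tv, tth = t
        return math.hypot(vav - tv, theta_av - tth)
    return min(TEMPLATES, key=lambda k: dist(TEMPLATES[k]))
```

Applying `classify_block` to each of the (M × N) blocks yields the cluster labels that make up the evaluation index data 800.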
- FIG. 9 schematically shows the evaluation index data 800 obtained by the classification process as the first embodiment.
- the evaluation index data 800 includes a group of (M ⁇ N) individual classification result data 801.
- Each of the individual classification result data 801 has a one-to-one correspondence with a block 611 forming the frame image data 610, and is information indicating into which of the first to fifth clusters the corresponding block 611 has been classified.
- the numbers 1 to 5 shown for each individual classification result data 801 in FIG. 9 indicate into which of the first to fifth clusters the block 611 corresponding to that individual classification result data 801 has been classified.
- FIG. 9 shows a structure in which individual classification result data 801 is arranged as (M ⁇ N) matrices as evaluation index data 800.
- Each of the individual classification result data 801 arranged in this way corresponds to the block 611 arranged at the same position in the frame image data 610.
- the distribution of the first to fifth clusters in one frame image is shown in the evaluation index data 800 having the structure shown in FIG.
- This presents information about which parts of the whole imaged cultured cardiomyocyte 500 are moving and which are not, and in which direction the moving parts are moving. More specifically, the movement related to the pulsation of the cultured cardiomyocytes 500 can be grasped as follows.
- the cultured cardiomyocytes 500 must, first of all, move; a sample with many motionless portions is evaluated as being of poor quality.
- this point can be evaluated based on the number of individual classification result data 801 classified into the first cluster.
- the quality is higher when the directions of the movements are aligned as much as possible.
- the proportions of the movement directions due to pulsation in the entire cultured cardiomyocytes 500 can be evaluated, for example, by turning the occupancy rate of each cluster into a histogram or the like. Furthermore, from the cluster distribution shown in FIG. 9 it is possible to accurately know not only the ratios of the movement directions but also the spatial distribution of the portions whose movement directions differ.
- the individual classification result data 801 classified into the third cluster is distributed in a wide area at the center in the array of the individual classification result data 801 corresponding to the frame image data 610.
- the evaluation apparatus 400, for example, generates and displays an image that is color-coded according to the cluster into which the block 611 corresponding to each individual classification result data 801 is classified. By confirming this image, the evaluator can accurately grasp the state of the movement directions of the cultured cardiomyocytes 500.
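Such a color-coded display could be sketched as below; the palette and the block size in pixels are hypothetical, since the text does not specify them:

```python
import numpy as np

# Hypothetical cluster colors (RGB), one per cluster.
CLUSTER_COLORS = {
    1: (80, 80, 80),   # stationary
    2: (255, 0, 0),    # direction 0
    3: (255, 160, 0),  # direction pi/4
    4: (0, 200, 0),    # direction pi/2
    5: (0, 80, 255),   # direction 3*pi/4
}

def render_classification(grid, block=8):
    """Paint an (M x N) grid of cluster labels as a color-coded RGB image,
    one block x block patch per individual classification result."""
    M, N = len(grid), len(grid[0])
    img = np.zeros((M * block, N * block, 3), dtype=np.uint8)
    for m in range(M):
        for n in range(N):
            img[m * block:(m + 1) * block,
                n * block:(n + 1) * block] = CLUSTER_COLORS[grid[m][n]]
    return img
```

The resulting image reproduces the spatial layout of the individual classification result data 801, so regions of uniform or divergent motion direction are visible at a glance.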
- the present technology need not be limited to the combination of the average motion amount Vav and the average motion direction ⁇ av in the above description. That is, it is possible to select a combination in which one or more arbitrary feature amounts are selected from the six examples mentioned above. When a plurality of feature amounts are selected, various evaluation items are obtained according to the selected combination.
- the evaluation apparatus 400 in this case inputs the evaluation index data 800 shown in FIG. 9 and, by performing processing according to a predetermined algorithm, recognizes the state of each part as described above and outputs the recognition result in a format that the evaluator can grasp. For example, if the cluster classification result shown in FIG. 9 is rendered and displayed as an image, the evaluator can visually grasp the above matters.
- FIG. 10 illustrates an example of a processing procedure executed by the evaluation index data generation device 300 according to the first embodiment. Note that the processing of each step in this figure is appropriately executed by any of the motion detection unit 310, the feature amount calculation unit 330, and the classification processing unit 340 shown in FIG. Moreover, at least a part of the processing as each step shown in FIG. 10 can be configured to be realized by a CPU (Central Processing Unit) in a computer device executing a program.
- the processing from step S901 to S907 in FIG. 10 is a motion detection process executed by the motion detection unit 310.
- the motion detection unit 310 substitutes 2 for a variable n corresponding to a number assigned to the frame image data 610 forming the evaluation target image data 600 (step S901).
- the motion vector calculation unit 312 in the motion detection unit 310 inputs the (n ⁇ 1) th frame image data and the nth frame image data (step S902). That is, the previous frame image data held in the frame memory 311 and the current frame image data are input.
- the motion vector calculation unit 312 executes a process of dividing each of the input frame image data into blocks each having a predetermined number of pixels (step S903).
- a motion detection process is executed by a technique such as block matching (step S904).
- by the motion detection process in step S904, one frame-unit motion detection data 710-(n-1) in the motion detection data 700 shown in FIG. 6 is obtained. The motion detection unit 310 therefore stores the frame-unit motion detection data 710-(n-1) in the motion detection data storage unit 320 (step S905).
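The motion detection in step S904 by block matching could be realized, for example, by an exhaustive sum-of-absolute-differences search over a small displacement window. The block size and search range below are illustrative, not taken from the text:

```python
import numpy as np

def block_matching(prev, curr, block=16, search=4):
    """Per-block motion vectors between two grayscale frames.

    Minimal exhaustive block matching (sum of absolute differences),
    one possible realization of the motion detection in step S904.
    Returns an (M, N, 2) array of (dy, dx) vectors.
    """
    H, W = prev.shape
    M, N = H // block, W // block
    mv = np.zeros((M, N, 2), dtype=int)
    for m in range(M):
        for n in range(N):
            y0, x0 = m * block, n * block
            ref = prev[y0:y0 + block, x0:x0 + block].astype(int)
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = y0 + dy, x0 + dx
                    if y < 0 or x < 0 or y + block > H or x + block > W:
                        continue  # candidate block would leave the frame
                    sad = np.abs(curr[y:y + block, x:x + block].astype(int) - ref).sum()
                    if best is None or sad < best:
                        best, best_v = sad, (dy, dx)
            mv[m, n] = best_v
    return mv
```

Calling this once per consecutive frame pair produces the frame-unit motion detection data stored in step S905.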
- the motion detection unit 310 increments the variable n (step S906), and determines whether or not the variable n is larger than the maximum value (T + 1) (step S907).
- the maximum value (T + 1) corresponds to the number of frame image data forming the evaluation target image data 600.
- the processing from step S902 is repeatedly executed.
- the first frame unit motion detection data 710-1 to the T frame unit motion detection data 710 -T are sequentially stored in the motion detection data storage unit 320.
- when it is determined that the variable n has become larger than the maximum value (T + 1) (step S907), the procedure proceeds to step S908 and subsequent steps.
- when it is determined in step S907 that the variable n is greater than (T + 1), the feature amount calculation unit 330 executes a process of calculating the feature amounts using the motion detection data 700 (step S908).
- the feature amounts calculated here are, for example, the average motion amount Vav and the average motion direction θav as described above.
- the classification processing by the template method described above is executed by the classification processing unit 340.
- the classification processing unit 340 first substitutes 1 for the variable i indicating the number assigned to the (M ⁇ N) blocks 611 forming the frame image data 610 (step S909).
- the classification processing unit 340 calculates the distance between the feature amounts (Vav, θav) calculated for the i-th block 611 and the feature amounts (Vav, θav) of each of a plurality of templates prepared in advance (step S910).
- the feature amounts (Vav, θav) indicate the combination of the above-described average motion amount Vav and average motion direction θav.
- the classification processing unit 340 classifies the i-th block 611 as belonging to the cluster corresponding to the template having the shortest distance. Then, individual classification result data 801 indicating the classification result is generated (step S911).
- the individual classification result data 801 associates the identifier of the i-th block 611 with the identifier of the cluster into which it has been classified.
- five clusters of the first to fifth clusters are prepared in response to the preparation of the first to fifth templates.
- the block 611 to be classified is classified into any one of the first to fifth clusters in step S911.
- the classification processing unit 340 increments the variable i (step S912), and determines whether or not the variable i is larger than the maximum value (M ⁇ N) (step S913).
- the process of cluster classification is sequentially repeated for each block by returning to the process of step S910.
- the classification processing unit 340 generates and outputs the evaluation index data 800 from the individual classification result data 801 obtained by the processes of steps S910 to S912 so far (step S914).
- the evaluation index data 800 may be generated by a combination in which one or more arbitrary feature amounts are selected from the six examples given above. Further, based on the above-described calculation method, the average motion amount Vav can also be obtained by being decomposed into a horizontal average motion amount Vavx and a vertical average motion amount Vavy.
- the average amplitude Aav is obtained by decomposing into a horizontal average amplitude Aavx and a vertical average amplitude Aavy.
- the average acceleration Bav is obtained by decomposing into a horizontal average acceleration Bavx and a vertical average acceleration Bavy. Therefore, for example, it is conceivable that the feature amounts of the horizontal direction component and the vertical direction component are treated as independent ones and used to generate the evaluation index data 800.
- the number of clusters is five, but other numbers may be set.
- <Second Embodiment> [Configuration of Evaluation Index Data Generation Device] Although the classification process in the first embodiment uses templates, other classification processing methods are conceivable. Therefore, as a second embodiment, a configuration that employs another classification processing method will be described.
- the configuration of the evaluation index data generation device 300 corresponding to the second embodiment is the same as that shown in FIG. However, the number of types of feature amounts calculated by the feature amount calculation unit 330 and the classification processing procedure executed by the classification processing unit 340 are different as described below.
- FIG. 11 shows an example of a processing procedure executed by the evaluation index data generation device 300 corresponding to the second embodiment.
- the processing from step S901 to S907 is the same as that in FIG. 10 corresponding to the first embodiment.
- if it is determined in step S907 that the variable n has become larger than the maximum value (T + 1), the feature amount calculation unit 330 calculates feature amounts for each block 611 using the motion detection data 700 stored in the motion detection data storage unit 320 (step S908A).
- in step S908A, for example, nine feature amounts are calculated for each block 611: the horizontal average motion amount Vavx, the vertical average motion amount Vavy, the average motion direction θav, the horizontal average amplitude Aavx, the vertical average amplitude Aavy, the horizontal average acceleration Bavx, the vertical average acceleration Bavy, the average pulsation interval Dav, and the pulsation start time S.
- the classification processing unit 340 performs clustering based on the k-means method (k average method) using the feature amount calculated as described above. That is, the classification processing unit 340 calculates a 9-dimensional vector x obtained by combining the nine feature amounts for each block 611 (step S921).
- step S921 (M ⁇ N) vectors x corresponding to each block 611 are obtained.
- the classification processing unit 340 in this case first executes an initial cluster classification for the (M × N) vectors xi (1 ≤ i ≤ (M × N)) according to the k-means method. That is, K samples corresponding to the preset number of clusters K are extracted from the vectors xi, and the distance between these samples and each vector xi other than the samples is calculated. Each vector xi other than the samples is then classified as belonging to the same cluster as the sample at the closest calculated distance.
- the classification processing unit 340 calculates the center of gravity Gj (1 ≤ j ≤ K) of each of the first to K-th clusters corresponding to the latest classification result so far (step S923). These centroids Gj change as the classification result changes.
- the classification processing unit 340 calculates the distance from the center of gravity Gj to each cluster for each vector xi (step S924). Then, for each vector xi, reclassification is performed as belonging to the cluster with the shortest calculated distance (step S925). The processing in steps S923 to S925 is repeated until it is determined in step S926 that the classification result has no change and is the same as the previous time.
- the classification processing unit 340 generates and outputs the evaluation index data 800 from the finally obtained classification result (step S914A). That is, in the final classification result, each vector xi is classified into one of the clusters. The classification processing unit 340 therefore generates the individual classification result data 801 by, for example, associating the identifier of the block 611 corresponding to the vector xi with the identifier of the classified cluster. A group of such individual classification result data 801 for the first to (M × N)-th blocks 611 is then generated as the evaluation index data 800.
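The k-means procedure of steps S921 to S926 can be sketched as follows: initial centers are K sampled feature vectors, and reclassification against the centroids Gj repeats until the assignment stops changing. The random sampling and Euclidean distance are assumptions consistent with the usual k-means formulation:

```python
import numpy as np

def kmeans(x, k, seed=0, max_iter=100):
    """k-means clustering of per-block feature vectors.

    x: (M*N, d) array of feature vectors, one per block
       (d = 9 in the second embodiment).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    # Initial classification: K sample vectors serve as the first centers.
    centers = x[rng.choice(len(x), size=k, replace=False)]
    labels = None
    for _ in range(max_iter):
        # Distance from every vector xi to every centroid Gj.
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        if labels is not None and np.array_equal(new_labels, labels):
            break  # classification unchanged: converged (step S926)
        labels = new_labels
        # Recompute the centroid Gj of each non-empty cluster (step S923).
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers
```

The returned labels correspond to the per-block cluster identifiers from which the individual classification result data 801 are assembled.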
- the evaluation index data 800 obtained in the second embodiment also has, for example, the structure shown in FIG.
- Each cluster represents a combination of numerical ranges of a plurality of different feature amounts. For this reason, each cluster has a different meaning in relation to the periodic motion as a pulsation. Therefore, an accurate and detailed evaluation result can be obtained also by using the evaluation index data 800 obtained by the second example.
- the present disclosure shows examples for embodying the present technology; as clearly indicated in the present disclosure, the matters in the present disclosure and the invention-specifying matters in the claims have a corresponding relationship. Similarly, the invention-specifying matters in the claims and the matters in the present disclosure having the same names have a corresponding relationship.
- the present technology is not limited to the embodiment, and can be embodied by making various modifications to the embodiment without departing from the gist of the present technology.
- combinations other than those specifically described in the above-described embodiments may be adopted as feature amounts used for generating the evaluation index data 800.
- feature quantities other than those specifically described in the above embodiments may be obtained and combined.
- in the above-described embodiments, the motion amount, the motion direction, the amplitude, the acceleration, the pulsation interval, and the like are all average values of the values obtained for each frame period.
- these feature quantities have changes in time series. Therefore, for example, it is also conceivable to calculate a change in time series as a feature amount and use it for generating the evaluation index data 800.
- other algorithms and methods may be adopted as the classification processing executed by the classification processing unit 340.
- although the evaluation object in the above embodiments is the cultured cardiomyocytes 500, the configuration of the present disclosure can also be applied to other objects, for example, as long as their movement has periodicity.
- the processing procedure described in this disclosure may be regarded as a method having a series of these procedures, and may be regarded as a program for causing a computer to execute the series of procedures or a recording medium storing the program.
- a recording medium for example, a CD (Compact Disc), an MD (MiniDisc), a DVD (Digital Versatile Disc), a memory card, a Blu-ray Disc (Blu-ray Disc (registered trademark)), or the like can be used.
- the cell evaluation method may be other than those described above.
- an evaluation value may be obtained for an index calculated from a motion vector obtained for each block of cultured cells.
- that is, a motion vector 1002 of each block (partial region) is obtained at predetermined time intervals (for example, every frame); a time-dependent change in the amount of motion is obtained for each block as in the graph 1003 shown in FIG. 12C; and data indicating the amplitude of cell movement and the change over time in the number of beats, as in the graph 1004, may be generated. Evaluation values for evaluating these indices may then be obtained, and cell movement may be evaluated based on the evaluation values.
- in this way, the evaluation object (for example, cell movement) can be evaluated more quantitatively. Moreover, since the indices are generated from motion vectors, more various indices can be obtained more easily and non-invasively. That is, the evaluation object (for example, cell movement) can be evaluated more correctly.
- FIG. 13 is a block diagram illustrating a main configuration example of the cultured cardiomyocyte evaluation apparatus.
- the cultured cardiomyocyte evaluation apparatus 1100 shown in FIG. 13 is an apparatus that evaluates the movement of the cultured cardiomyocyte 500, as in the cultured cardiomyocyte evaluation system 100 of FIG. That is, the cultured cardiomyocyte evaluation apparatus 1100 implements the cultured cardiomyocyte evaluation system 100 as one apparatus.
- the configuration of the cultured cardiomyocyte evaluation system 100 is arbitrary as long as the function of the entire system does not change.
- the plurality of devices shown in FIG. 1 may be configured as one device, or one device may be configured as a plurality of devices.
- the entire cultured cardiomyocyte evaluation system 100 may be configured as one apparatus.
- conversely, the cultured cardiomyocyte evaluation apparatus 1100 may be configured as a plurality of apparatuses, like the cultured cardiomyocyte evaluation system 100 of the first and second embodiments. The following description, however, uses the cultured cardiomyocyte evaluation apparatus 1100.
- the cultured cardiomyocytes 500 can also be evaluated by methods other than the above-described evaluation method. That is, as described above, the cultured cardiomyocyte evaluation apparatus 1100 of the present embodiment obtains an evaluation value for evaluating the movement of the evaluation target.
- the cultured cardiomyocyte 500 shown in FIG. 13 is a biological tissue (cell group) for heart disease treatment, generated by culturing, outside the body, cardiomyocytes collected from a living body. Cardiomyocytes constantly beat, repeatedly contracting and relaxing. When such cardiomyocytes are cultured and grown as the cultured cardiomyocytes 500, the operations of the cells ideally become related to each other, and the entire cultured cardiomyocyte 500 comes to pulsate as one living tissue.
- the cultured cardiomyocyte evaluation apparatus 1100 uses, for example, the cultured cardiomyocytes 500 cultured as described above, and evaluates the movement of the cultured cardiomyocytes 500 in order to evaluate the quality of the cultured cardiomyocytes 500.
- the evaluation target of the cultured cardiomyocyte evaluation apparatus 1100 may be other than the cultured cardiomyocyte 500.
- cultured cells other than cardiomyocytes may be evaluated.
- the evaluation target may be other than cells.
- the evaluation object is one that can move and that can be evaluated by evaluating the movement. This movement may be autonomous (spontaneous) like a cardiomyocyte, or may be due to an electric signal supplied from the outside.
- the cultured cardiomyocyte evaluation apparatus 1100 includes an imaging unit 1101, an evaluation target image data generation recording unit 1102, an evaluation index data generation unit 1103, and an evaluation unit 1104.
- the imaging unit 1101 corresponds to the imaging device 110 of FIG. 1. That is, the imaging unit 1101 images the cultured cardiomyocytes 500 that are the evaluation target. Note that the imaging unit 1101 may image the cultured cardiomyocytes 500 directly (without passing through other members) or may image them through other members such as a microscope. Moreover, the cultured cardiomyocytes 500 may or may not be fixed with respect to the imaging unit 1101; in general, since the cultured cardiomyocyte evaluation apparatus 1100 detects movement (temporal change in position), it is desirable that the cultured cardiomyocytes 500 be fixed with respect to the imaging unit 1101.
- the imaging unit 1101 supplies an image signal of the image of the cultured cardiomyocyte 500 obtained by imaging to the evaluation target image data generation / recording unit 1102.
- the evaluation target image data generation / recording unit 1102 corresponds to the evaluation target image data generation / recording apparatus 200 of FIG. 1. That is, the evaluation target image data generation / recording unit 1102 generates evaluation target image data based on the image signal supplied from the imaging unit 1101, and records and stores the generated evaluation target image data in, for example, an internal recording medium.
- the evaluation target image data generated here is, for example, moving image data generated from an image signal obtained by imaging the cultured cardiomyocytes 500.
- the evaluation target image data generation / recording unit 1102 may, for example, extract only some frame images from the plurality of frame images supplied from the imaging unit 1101 and use them as the evaluation target image data. Further, for example, it may extract a partial area of each frame image supplied from the imaging unit 1101 as a small frame image and use a moving image composed of the small frame images as the evaluation target image data. Further, for example, it may perform arbitrary image processing on each frame image supplied from the imaging unit 1101 and use the result as the evaluation target image data. As the image processing, for example, image enlargement, reduction, rotation, deformation, brightness and chromaticity correction, sharpening, noise removal, intermediate frame image generation, and the like are conceivable. Of course, any other image processing may be used.
- the evaluation target image data generation / recording unit 1102 supplies the stored evaluation target image data to the evaluation index data generation unit 1103 at a predetermined timing.
- the evaluation index data generation unit 1103 corresponds to the evaluation index data generation device 300 of FIG. 1. That is, the evaluation index data generation unit 1103 detects, between the frame images of the supplied evaluation target image data, the motion of the evaluation target (cultured cardiomyocytes 500) for each block, which is a partial region obtained by dividing the entire region of the image of the evaluation target into a plurality of parts. The evaluation index data generation unit 1103 represents the detected motion of each block as a motion vector, and obtains various feature amounts (motion feature amount data) related to the motion of the evaluation target (cultured cardiomyocytes 500) from the motion vectors. Further, as described in the first and second embodiments, the evaluation index data generation unit 1103 classifies each block based on the motion feature amount data.
- the evaluation index data generation unit 1103 supplies the motion feature amount data and the classification result generated as described above to the evaluation unit 1104 as evaluation index data.
- the evaluation unit 1104 corresponds to the evaluation device 400 of FIG. 1. That is, the evaluation unit 1104 calculates an evaluation value for each supplied evaluation index data, and integrates the calculated evaluation values to obtain an evaluation value of the evaluation target (cultured cardiomyocytes 500).
- FIG. 14 is a block diagram illustrating a main configuration example of the evaluation index data generation unit 1103.
- the evaluation index data generation unit 1103 includes a motion detection unit 310 and a motion detection data storage unit 320 as in the evaluation index data generation device 300 of FIG. 2. Further, the evaluation index data generation unit 1103 has a feature amount calculation unit 1123 instead of the feature amount calculation unit 330 of the evaluation index data generation device 300 of FIG. 2, and the classification processing unit of the evaluation index data generation device 300 of FIG. A classification processing unit 1124 is provided instead of 340. Further, the evaluation index data generation unit 1103 includes a motion feature amount data history storage memory 1125.
- the motion detection unit 310 receives the evaluation target image data 600 to perform motion detection, and supplies the detection result (motion vector) as motion detection data to the motion detection data storage unit 320 for storage.
- more specifically, the motion detection unit 310 includes the frame memory 311 and the motion vector calculation unit 312, divides the entire region of each frame image of the evaluation target image data 600 into M × N (M and N are arbitrary natural numbers) blocks, and performs motion detection on each block by a technique such as block matching between frame images to generate motion vectors.
- the motion detection unit 310 performs motion detection in an evaluation section having a predetermined length (for example, T + 1 frame (T is an arbitrary natural number)). For example, as illustrated in FIG. 6, the motion detection unit 310 generates (M ⁇ N ⁇ T) motion detection data (motion vectors) using (T + 1) frame images, and detects motion detection data. The data is stored in the storage unit 320.
- the feature amount calculation unit 1123 acquires the motion detection data and calculates feature amounts related to the motion of the cultured cardiomyocytes 500 from the motion detection data.
- the feature amount calculation unit 1123 calculates a feature amount related to the motion (beat) of the cultured cardiomyocytes 500 for each block using (M ⁇ N ⁇ T) motion detection data (motion vectors).
- for example, the feature amount calculation unit 1123 calculates the average value (average amplitude Aav) of the motion amplitude of the cultured cardiomyocytes 500 within one evaluation section as one of the feature amounts related to the movement of the cultured cardiomyocytes 500.
- Here, the amplitude A is the magnitude of the swing in the amount of motion as it changes over time.
- the average amplitude Aav is an average value of amplitude in the evaluation section. This amplitude A and average amplitude Aav are calculated in each block.
- each amplitude A is calculated as in the following equation (1).
- the average amplitude Aav in the evaluation interval is calculated as shown in the following expression (2) using each amplitude An calculated as shown in the expression (1).
- the feature amount calculation unit 1123 calculates such an average amplitude Aav for each block.
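The amplitude computation of equations (1) and (2) might be sketched as follows for one block. It is assumed here, for illustration, that each amplitude A_n is taken as the height of a local peak of the per-frame motion amount; the exact definition of A_n is given by equation (1) in the original drawings:

```python
import numpy as np

def average_amplitude(motion, thresh=0.0):
    """Average amplitude Aav over one evaluation section (eq. (2):
    the mean of the amplitudes A_n), where each A_n is assumed to be
    the height of a local peak of the per-frame motion amount."""
    peaks = [motion[t] for t in range(1, len(motion) - 1)
             if motion[t] > motion[t - 1] and motion[t] >= motion[t + 1]
             and motion[t] > thresh]
    return float(np.mean(peaks)) if peaks else 0.0
```

The feature amount calculation unit would apply this per block, over the (T + 1)-frame motion-amount sequence of the evaluation section.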
- The feature amount calculation unit 1123 also calculates the average value (average pulsation interval Dav) of the pulsation intervals (or of the number of pulsations per unit time) of the cultured cardiomyocytes 500 within one evaluation section illustrated in FIG. as one of the feature amounts related to the movement of the cultured cardiomyocytes 500.
- the pulsation interval D is the peak interval of the amount of movement as shown in FIG.
- the average pulsation interval Dav is an average value of the pulsation intervals D in the evaluation section.
- the feature amount calculation unit 1123 calculates such an average pulsation interval Dav for each block.
- the pulsation interval Dn is calculated as in the following equation (3).
- the average pulsation interval Dav in the evaluation section is calculated as in the following formula (4).
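Equations (3) and (4) can be sketched as follows, under the assumption (consistent with the definition of D as the peak interval of the motion amount) that each pulsation interval D_n is the frame distance between consecutive peaks:

```python
import numpy as np

def average_pulsation_interval(motion, thresh=0.0):
    """Average pulsation interval Dav over one evaluation section
    (eq. (4)), where each D_n (eq. (3)) is taken as the frame distance
    between consecutive peaks of the motion amount."""
    peak_times = [t for t in range(1, len(motion) - 1)
                  if motion[t] > motion[t - 1] and motion[t] >= motion[t + 1]
                  and motion[t] > thresh]
    intervals = np.diff(peak_times)
    return float(intervals.mean()) if len(intervals) else 0.0
```

As with the amplitude, this is computed independently for each of the M × N blocks.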
- the type and number of feature amounts calculated by the feature amount calculation unit 1123 are arbitrary.
- For example, the average motion amount Vav, the average motion direction θav, the average acceleration Bav, and the pulsation start time S may also be calculated as feature amounts.
- The method of calculating each feature amount is also arbitrary. For example, for a feature amount having a plurality of components, such as a motion vector, the feature amount calculation unit 1123 may calculate the average of each component and then combine the averages.
- the feature amount calculation unit 1123 supplies the calculated feature amount as motion feature amount data to the motion feature amount data history storage memory 1125 to be stored.
- the feature quantity calculation unit 1123 may sequentially supply the obtained feature quantities as motion feature quantity data to the motion feature quantity data history storage memory 1125 for storage. Further, the feature quantity calculation unit 1123 may store a part of the obtained feature quantity in the motion feature quantity data history storage memory 1125 as motion feature quantity data.
- the feature quantity calculation unit 1123 also supplies the calculated feature quantity to the classification processing unit 1124.
- The classification processing unit 1124 executes classification processing using the plurality of types of feature amounts calculated by the feature amount calculation unit 1123, and supplies the classification processing result as motion feature amount data to the motion feature amount data history storage memory 1125 for storage.
- The evaluation index data generation unit 1103 repeats the generation of evaluation index data as described above S times. That is, the imaging unit 1101 continues imaging and generates frame images for at least (evaluation section (T + 1 frames) × S times), and the evaluation target image data generation recording unit 1102 generates at least (evaluation section × S times) of evaluation target image data. In the evaluation target image data, the evaluation sections do not have to be continuous in time.
- the evaluation index data generation unit 1103 generates a feature amount for each block as described above for each evaluation section.
- M × N × S feature amounts are stored in the motion feature amount data history storage memory 1125 as shown in FIG.
- When a plurality of types of feature amounts are calculated, the motion feature amount data history storage memory 1125 stores correspondingly more feature amounts (M × N × S × number of types).
- The motion feature amount data history storage memory 1125 supplies the stored feature amounts as evaluation index data 800 to the evaluation unit 1104 at a predetermined timing.
- FIG. 16 is a block diagram illustrating a main configuration example of the evaluation unit 1104.
- the evaluation unit 1104 includes an evaluation unit (evaluation unit for each index) for each of the supplied evaluation index data 800.
- the evaluation unit 1104 includes an amplitude evaluation unit 1141, a pulsation number evaluation unit 1142, and a classification result evaluation unit 1143 as evaluation units for each index.
- the amplitude evaluation unit 1141 evaluates the average amplitude Aav supplied as evaluation index data.
- the pulsation number evaluation unit 1142 evaluates the average pulsation interval Dav supplied as evaluation index data.
- the classification result evaluation unit 1143 evaluates the classification processing result supplied as evaluation index data.
- This set of per-index evaluation units indicates the types of index data that the evaluation unit 1104 can evaluate.
- the evaluation unit 1104 is set so that all of the supplied evaluation index data can be evaluated. Therefore, for example, when other evaluation index data 800 is supplied to the evaluation unit 1104, an evaluation unit corresponding to the evaluation index data 800 is prepared in the evaluation unit 1104.
- the types and number of evaluation units for each index included in the evaluation unit 1104 depend on the types and number of evaluation index data supplied.
- The amplitude evaluation unit 1141 first normalizes each amplitude A (average amplitude Aav), which is evaluation index data, for each block of the frame image, as in the following equation (5), using a function fa such as the curve 1161 of the graph shown in A of FIG. (the amplitude A′ normalized by the function fa is obtained).
- That is, the amplitude evaluation unit 1141 normalizes each of the M × N × S average amplitudes Aav using the function fa.
- The function fa may be any function as long as it returns a larger value for a larger amplitude A and a smaller value for a smaller amplitude A. That is, the normalized amplitude A′ takes a larger value for a larger amplitude and a smaller value for a smaller amplitude.
- Next, the amplitude evaluation unit 1141 calculates the variance Va of the past N amplitudes for each block, as in the following equation (6).
- In equation (6), A with an overline is the average value of the amplitudes A (average amplitudes Aav).
- In equation (6), N ≤ S. That is, for example, assuming that the number of blocks of the entire frame image is M × N and the calculation of the average amplitude Aav is repeated S times, the amplitude evaluation unit 1141 calculates M × N variances Va from the M × N × S average amplitudes Aav.
- Next, the amplitude evaluation unit 1141 normalizes the variance Va of each amplitude, as in the following equation (7), using a function ga such as the curve 1162 of the graph shown in FIG. (the amplitude variance Va′ normalized by the function ga is obtained).
- That is, the amplitude evaluation unit 1141 normalizes each of the M × N variances Va using the function ga.
- The function ga may be any function as long as it returns a smaller value for a larger variance Va and a larger value for a smaller variance Va. That is, the normalized amplitude variance Va′ takes a larger value when the variation is smaller and a smaller value when the variation is larger.
- The amplitude evaluation unit 1141 then calculates, as in the following equation (8), the average over the entire screen (M × N values) of the product of the normalized amplitude A′ and the normalized amplitude variance Va′ as the evaluation value Ea.
- The evaluation value Ea of equation (8) increases as the normalized amplitudes A′ and the normalized amplitude variances Va′ over the entire frame image increase. That is, the evaluation is higher when the amplitude of each block is larger and more stable (larger amplitude with less variation in the time direction).
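The computation of equations (5) through (8) might be sketched as follows. The specific shapes of fa and ga (simple rational functions here) and the use of the latest measurement for A′ are illustrative assumptions; the disclosure only constrains fa to be increasing and ga to be decreasing:

```python
import numpy as np

def eval_amplitude(Aav, a0=1.0, v0=1.0):
    """Evaluation value Ea from per-block average amplitudes.

    Aav -- array of shape (S, M, N): S repeated measurements per block.
    fa(a) = a / (a + a0) is an illustrative increasing normalizer (eq. (5)),
    ga(v) = v0 / (v + v0) an illustrative decreasing one (eq. (7));
    Ea is the screen-wide mean of the product A' * Va' (eq. (8))."""
    Aav = np.asarray(Aav, float)
    A_norm = Aav[-1] / (Aav[-1] + a0)        # fa applied to the latest amplitudes
    Va = Aav.var(axis=0)                     # per-block variance over past measurements (eq. (6))
    Va_norm = v0 / (Va + v0)                 # ga: small variance -> value near 1
    return float((A_norm * Va_norm).mean())  # eq. (8)
```

With this shape, a block with larger amplitude and less temporal variation contributes a product closer to 1, raising Ea.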
- Alternatively, the amplitude evaluation unit 1141 may calculate, as in the following equation (9), the ratio to the entire frame image of the number of blocks Na1 in which the product of the normalized amplitude A′ and the normalized amplitude variance Va′ is equal to or greater than a predetermined threshold Ta1 as the evaluation value Ea.
- Threshold value Ta1 is an arbitrary value set in advance. The larger this value is set, the higher the evaluation standard (stricter evaluation conditions) and the smaller the evaluation value Ea. In this case, the evaluation value Ea increases as the number of blocks in which the product of amplitude and variance is larger than a predetermined reference and stable in the entire frame image.
- In this case, the variation between blocks is reflected more faithfully than when the evaluation value Ea is calculated from the average value as described above. With the average value, the evaluation may be high even if the variation between blocks is large, because extremely large values in some blocks can raise the average. With the threshold, by contrast, even if the values of some blocks are extremely large, the evaluation does not increase unless the number of blocks Na1 is large.
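The threshold-based variant of equation (9) reduces to a counting step. In this sketch, the normalized inputs are taken as given, and the threshold Ta1 and the example values are arbitrary:

```python
import numpy as np

def eval_amplitude_ratio(A_norm, Va_norm, Ta1=0.5):
    """Threshold variant of eq. (9): Ea is the fraction of the M * N
    blocks whose product A' * Va' is at least the threshold Ta1."""
    prod = np.asarray(A_norm, float) * np.asarray(Va_norm, float)
    return float((prod >= Ta1).sum()) / prod.size
```

Because each block contributes only 0 or 1, a few blocks with extreme products cannot raise Ea on their own.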
- the amplitude evaluation method is not limited to the above-described example.
- the pulsation of the cultured cardiomyocytes may be compared with that in an ideal normal culture, and the comparison result may be evaluated.
- an ideal pulsation transition pattern (ideal transition pattern) during normal culture is predetermined.
- the amplitude evaluation unit 1141 compares the pulsation transition pattern (measurement transition pattern) of the cultured cardiomyocytes with this ideal transition pattern, and evaluates the similarity.
- a solid line 1171 indicates an ideal amplitude transition pattern
- a dotted line 1172 indicates an amplitude measurement transition pattern. The smaller the difference between the two, the larger the evaluation value.
- the amplitude evaluation unit 1141 calculates the sum Da of the distances between the two transition patterns at each elapsed time for each block as shown in the following formula (10).
- A(k) is the amplitude A (average amplitude Aav) in the measurement transition pattern
- A_I(k) is the amplitude A (average amplitude Aav) in the ideal transition pattern.
- k indicates the number of measurement values (elapsed time) (when S measurements are repeated, 0 ⁇ k ⁇ S ⁇ 1).
- W_a(k) is a weighting coefficient, and its value is arbitrary. For example, if differences between the two transition patterns are unimportant immediately after the start of measurement but the patterns are required to approximate each other more closely as the elapsed time becomes longer, the weighting coefficient W_a(k) is set to be larger as k becomes larger.
- In this case, the amplitude evaluation unit 1141 next normalizes the sum of distances Da, as in the following equation (11), using a function ha such as the solid line 1173 of the graph shown in FIG. 18B (the normalized distance sum Da′ is calculated).
- This function ha may be any function as long as it returns a smaller value for a larger distance sum Da and a larger value for a smaller distance sum Da. That is, the normalized distance sum Da′ takes a larger value when the difference between the ideal transition pattern and the measured transition pattern is smaller, and a smaller value when that difference is larger.
- The amplitude evaluation unit 1141 then calculates the average over the entire screen (M × N values) of the normalized distance sum Da′ as the evaluation value Ea, as in the following expression (12).
- the evaluation value Ea increases as the difference between the measurement transition and the ideal transition in the entire frame image decreases.
- Alternatively, the amplitude evaluation unit 1141 may calculate, as in the following expression (13), the ratio in the entire frame image of the number of blocks Na2 in which the normalized distance sum Da′ is equal to or greater than a predetermined threshold Ta2 as the evaluation value Ea.
- the threshold value Ta2 is an arbitrary value set in advance. The larger this value is set, the higher the evaluation standard (stricter evaluation conditions) and the smaller the evaluation value Ea. In this case, the evaluation value Ea increases as the number of blocks in which the difference between the measurement transition and the ideal transition is smaller than a predetermined reference and is stable in the entire frame image increases.
- the amplitude evaluation unit 1141 calculates the evaluation value Ea that evaluates the amplitude based on the index data regarding the amplitude of the heartbeat pulsation. That is, the amplitude evaluation unit 1141 can quantitatively evaluate the pulsation amplitude of the cardiomyocytes.
- The pulsation number evaluation unit 1142 first calculates the number of pulsations R per unit time (for example, one minute) from the pulsation interval D (average pulsation interval Dav), as in the following formula (14).
- the number R of beats per unit time is an average value of the number of beats per unit time in the evaluation period (for example, (T + 1) frame).
- The pulsation number evaluation unit 1142 calculates the number of pulsations R per unit time for each block, and does so for each evaluation period. That is, if the number of blocks of one frame image is M × N and the evaluation period is repeated S times, the pulsation number evaluation unit 1142 calculates (M × N × S) values of the number of beats R per unit time.
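Equation (14) is essentially a unit conversion from interval to rate. A minimal sketch, assuming the interval Dav is expressed in frames and the capture rate `fps` is known (both assumptions for illustration):

```python
def beats_per_minute(Dav, fps=30.0):
    """Eq. (14) sketch: convert an average pulsation interval Dav
    (in frames) into the number of beats R per unit time (one minute),
    assuming a capture rate of `fps` frames per second."""
    return 60.0 * fps / Dav
```

For instance, an average interval of 30 frames at 30 frames per second corresponds to one beat per second.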
- Next, the pulsation number evaluation unit 1142 normalizes the number of pulsations R per unit time, as in the following equation (15), using a function fr such as the curve 1181 of the graph shown in FIG. 19A (the number of beats R′ per unit time normalized by the function fr is obtained).
- That is, the pulsation number evaluation unit 1142 normalizes each of the M × N × S numbers of beats R per unit time using the function fr.
- The function fr may be any function as long as it returns a larger value as the number of beats R per unit time is closer to an appropriate value and a smaller value as it is farther from the appropriate value. That is, the normalized number of beats per unit time R′ takes a larger value as R approaches the predetermined appropriate number of beats per unit time, and a smaller value as R moves away from it.
- Next, the pulsation number evaluation unit 1142 obtains, for each block, the variance Vr of the number of beats per unit time over the past N measurements, as in the following equation (16).
- R with an overline is the average value of the number of beats R per unit time.
- In equation (16), N ≤ S. That is, for example, assuming that the number of blocks of the entire frame image is M × N and the calculation of the number of beats R per unit time is repeated S times, the pulsation number evaluation unit 1142 calculates M × N variances Vr from the M × N × S numbers of beats R per unit time.
- Next, the pulsation number evaluation unit 1142 normalizes each variance Vr, as in the following equation (17), using a function gr such as the curve 1182 of the graph shown in FIG. 19B (the variance Vr′ of the number of beats per unit time normalized by the function gr is obtained).
- the pulsation number evaluation unit 1142 normalizes each of the M ⁇ N variances Vr using the function gr.
- The function gr may be any function as long as it returns a smaller value for a larger variance Vr and a larger value for a smaller variance Vr. That is, the normalized variance Vr′ of the number of beats per unit time takes a larger value when the variation is smaller and a smaller value when the variation is larger.
- The pulsation number evaluation unit 1142 then calculates, as in the following equation (18), the average over the entire screen (M × N values) of the product of the normalized number of beats R′ per unit time and the normalized variance Vr′ of the number of beats per unit time as the evaluation value Er.
- The evaluation value Er of equation (18) increases as the normalized numbers of beats per unit time and their normalized variances over the entire frame image increase. That is, each block is evaluated more highly when its number of beats per unit time is closer to the appropriate value and more stable (less variation in the time direction).
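The computation of equations (15) through (18) parallels the amplitude case. In this sketch, fr is modeled as an illustrative Gaussian peaked at an assumed appropriate rate R_opt, and gr as a decreasing rational function; those shapes and constants are not fixed by the disclosure:

```python
import numpy as np

def eval_beat_rate(R, R_opt=60.0, r0=10.0, v0=1.0):
    """Evaluation value Er from per-block beat rates.

    R -- array of shape (S, M, N): beats per unit time, S measurements
    per block.  fr peaks at the assumed appropriate rate R_opt (eq. (15)),
    gr decreases with the variance Vr (eqs. (16)-(17)), and Er is the
    screen-wide mean of the product R' * Vr' (eq. (18))."""
    R = np.asarray(R, float)
    R_norm = np.exp(-((R[-1] - R_opt) / r0) ** 2)  # fr: 1 at R_opt, falls off away from it
    Vr = R.var(axis=0)                             # eq. (16): variance over past measurements
    Vr_norm = v0 / (Vr + v0)                       # gr: small variance -> value near 1
    return float((R_norm * Vr_norm).mean())        # eq. (18)
```

Blocks beating steadily at the appropriate rate contribute products near 1; off-rate or unstable blocks pull Er down.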
- Alternatively, the pulsation number evaluation unit 1142 may calculate, as in the following equation (19), the ratio to the entire frame image of the number of blocks Nr1 in which the product of the normalized number of beats R′ per unit time and the normalized variance Vr′ is equal to or greater than a predetermined threshold Tr1 as the evaluation value Er.
- the threshold value Tr1 is an arbitrary value set in advance. The larger this value is set, the higher the evaluation standard (stricter evaluation conditions) and the smaller the evaluation value Er. In this case, the evaluation value Er increases as the number of pulsations per unit time in the entire frame image is closer to an appropriate value than the predetermined reference and the number of blocks is stable in the time direction. .
- In this case, the variation between blocks is reflected more faithfully than when the evaluation value Er is calculated from the average value as described above. With the average value, the evaluation may be high even if the variation between blocks is large. With the threshold, by contrast, even if the values of some blocks are extremely large, the evaluation does not increase unless the number of blocks Nr1 is large.
- the method for evaluating the number of beats per unit time is not limited to the example described above.
- the pulsation of the cultured cardiomyocytes may be compared with that in an ideal normal culture, and the comparison result may be evaluated.
- an ideal pulsation transition pattern (ideal transition pattern) during normal culture is predetermined.
- That is, the pulsation number evaluation unit 1142 compares the pulsation transition pattern (measurement transition pattern) of the cultured cardiomyocytes with this ideal transition pattern and evaluates the similarity, as shown in the graph of FIG. 20A.
- a solid line 1191 indicates an ideal transition pattern of the number of beats per unit time
- a dotted line 1192 indicates a measurement transition pattern of the number of beats per unit time. The smaller the difference between the two, the larger the evaluation value.
- the pulsation number evaluation unit 1142 calculates the sum Dr of the distances between the two transition patterns in each elapsed time for each block as in the following equation (20).
- R(k) is the number of beats R per unit time in the measurement transition pattern
- R_I(k) is the number of beats per unit time in the ideal transition pattern.
- k indicates the number of measurement values (elapsed time) (when S measurements are repeated, 0 ⁇ k ⁇ S ⁇ 1).
- W_r(k) is a weighting coefficient, and its value is arbitrary. For example, if differences between the two transition patterns are unimportant immediately after the start of measurement but the patterns are required to approximate each other more closely as the elapsed time becomes longer, the weighting coefficient W_r(k) is set to be larger as k becomes larger.
- In this case, the pulsation number evaluation unit 1142 then normalizes the sum Dr of the distances, as in the following equation (21), using a function hr such as the solid line 1193 in the graph shown in FIG. 20B (the normalized distance sum Dr′ is calculated).
- This function hr may be any function as long as it returns a smaller value for a larger distance sum Dr and a larger value for a smaller distance sum Dr. That is, the normalized distance sum Dr′ takes a larger value when the difference between the ideal transition pattern and the measurement transition pattern is smaller, and a smaller value when that difference is larger.
- The pulsation number evaluation unit 1142 then calculates the average over the entire screen (M × N values) of the normalized distance sum Dr′ as the evaluation value Er, as in the following expression (22).
- the evaluation value Er increases as the difference between the measurement transition and the ideal transition in the entire frame image decreases.
- Alternatively, the pulsation number evaluation unit 1142 may calculate, as in the following expression (23), the ratio in the entire frame image of the number of blocks Nr2 in which the normalized distance sum Dr′ is equal to or greater than a predetermined threshold Tr2 as the evaluation value Er.
- the threshold value Tr2 is an arbitrary value set in advance. The larger this value is set, the higher the evaluation standard (stricter evaluation conditions) and the smaller the evaluation value Er. In this case, the evaluation value Er increases as the difference between the measurement transition and the ideal transition in the entire frame image is smaller than a predetermined reference and stable.
- As described above, the pulsation number evaluation unit 1142 calculates the evaluation value Er, which evaluates the number of pulsations per unit time, based on the index data regarding the number of pulsations per unit time of the pulsation of the cardiomyocytes. That is, the pulsation number evaluation unit 1142 can quantitatively evaluate the number of pulsations per unit time of the pulsation of the cardiomyocytes.
- The classification result evaluation unit 1143 calculates the evaluation value so that it increases as the ratio of blocks classified into a predetermined cluster with desirable feature amounts (the desired cluster) increases.
- For example, the classification result evaluation unit 1143 counts, for each block, the number of times the block was classified into the desired cluster C in the classification performed n times in the past, compares that count with a predetermined threshold Tc1, and obtains the number Nc of blocks satisfying the following equation (24).
- The classification result evaluation unit 1143 uses the number of blocks Nc thus obtained to calculate an evaluation value Ec that evaluates the classification result, as in the following expression (25) (the number of blocks of one frame image is M × N).
- the classification result evaluation unit 1143 calculates the evaluation value Ec obtained by evaluating the classification result of the feature amount of pulsation of cardiomyocytes. That is, the classification result evaluation unit 1143 can quantitatively evaluate the feature value classification result of the pulsation of the cardiomyocytes.
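Equations (24) and (25) might be sketched as follows; the only inputs are the per-block history of desired-cluster assignments and the threshold Tc1:

```python
import numpy as np

def eval_classification(history, Tc1):
    """Evaluation value Ec from classification results.

    history -- boolean array of shape (n, M, N): whether each block was
    assigned to the desired cluster C in each of the past n classifications.
    A block counts toward Nc when its number of desired-cluster assignments
    is at least Tc1 (eq. (24)); Ec is Nc over the M * N blocks (eq. (25))."""
    counts = np.asarray(history, bool).sum(axis=0)  # per-block membership count
    Nc = int((counts >= Tc1).sum())
    return Nc / counts.size
```

Ec therefore approaches 1 when most blocks are consistently classified into the desired cluster.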
- the evaluation unit 1104 further includes an evaluation integration unit 1144.
- Each per-index evaluation unit of the evaluation unit 1104 supplies the evaluation value it has calculated to the evaluation integration unit 1144.
- the evaluation integration unit 1144 integrates the evaluation values supplied from the evaluation units for each index by a predetermined calculation, and generates an evaluation value E of the evaluation target (cultured cardiomyocytes 500). For example, the evaluation integration unit 1144 calculates the sum of the evaluation values for each index as the evaluation value E as shown in the following formula (26).
- the evaluation value Ea is an evaluation value of the average amplitude Aav supplied from the amplitude evaluation unit 1141
- The evaluation value Er is the evaluation value of the average pulsation interval Dav supplied from the pulsation number evaluation unit 1142.
- the evaluation value Ec is an evaluation value of the classification processing result supplied from the classification result evaluation unit 1143.
- the weighting factors Wa, Wr, and Wc are factors that weight the evaluation values Ea, Er, and Ec, respectively.
- the evaluation integration unit 1144 can integrate the evaluation values for each index by arbitrarily weighting them, so that the evaluation target can be quantitatively evaluated based on more various criteria.
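Equation (26), the weighted sum of the per-index evaluation values, is direct to write down (the weight values themselves are arbitrary, as stated above):

```python
def integrate_evaluations(Ea, Er, Ec, Wa=1.0, Wr=1.0, Wc=1.0):
    """Eq. (26): the overall evaluation value E is the weighted sum of
    the per-index evaluation values Ea, Er, and Ec, with arbitrary
    weighting coefficients Wa, Wr, and Wc."""
    return Wa * Ea + Wr * Er + Wc * Ec
```

Setting a weight to zero excludes that index; raising a weight emphasizes the corresponding criterion in the final evaluation value E.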
- the evaluation integration unit 1144 outputs the evaluation value E calculated as described above to the outside of the evaluation unit 1104 as the evaluation value 1150 to be evaluated.
- The evaluation value 1150 output from the evaluation unit 1104 is, for example, displayed on a monitor as character information or image information and presented to a user or the like, or supplied to another device (not shown) that performs arbitrary processing using the evaluation value 1150.
- the evaluation value 1150 may be recorded on a recording medium (not shown).
- the evaluation unit 1104 can quantitatively evaluate more various indexes by more various methods. As a result, the evaluation unit 1104 can more accurately evaluate the evaluation target (cardiomyocytes).
- the imaging unit 1101 of the cultured cardiomyocyte evaluation apparatus 1100 images the evaluation target in step S1001.
- In step S1002, the evaluation target image data generation recording unit 1102 generates evaluation target image data from the image signal obtained by the imaging in step S1001.
- In step S1003, the evaluation index data generation unit 1103 generates evaluation index data, which is data of various indexes for evaluating the movement of the evaluation target, from the evaluation target image data generated in step S1002.
- In step S1004, the evaluation unit 1104 evaluates the movement of the evaluation target using the evaluation index data generated in step S1003, and calculates an evaluation value.
- In step S1005, the evaluation unit 1104 outputs the evaluation value calculated in step S1004, and ends the evaluation process.
- the motion detection unit 310 of the evaluation index data generation unit 1103 detects the motion to be evaluated for each block and generates a motion vector in step S1021.
- In step S1022, the motion detection data storage unit 320 stores the motion vector of each block generated in step S1021.
- In step S1023, the motion detection unit 310 determines whether motion detection has been performed over the predetermined evaluation period. If it is determined that there are frame images in the predetermined evaluation period for which motion detection has not yet been performed, the motion detection unit 310 returns the process to step S1021 and repeats motion detection for a new processing target frame image.
- If it is determined in step S1023 that motion detection has been performed on all frame images to be processed in the predetermined evaluation period, the motion detection unit 310 advances the process to step S1024.
- In step S1024, the feature amount calculation unit 1123 calculates feature amounts related to the motion of the evaluation target, such as the average amplitude Aav and the average pulsation interval Dav, from the motion vectors stored in step S1022.
- In step S1025, the motion feature amount data history storage memory 1125 stores the feature amounts calculated in step S1024 as motion feature amount data.
- In step S1026, the classification processing unit 1124 classifies each block based on the feature amounts calculated in step S1024.
- In step S1027, the motion feature amount data history storage memory 1125 stores the classification result of step S1026 as motion feature amount data.
- In step S1028, the feature amount calculation unit 1123 determines whether or not the calculation of the feature amounts has been repeated a predetermined number of times (for example, S times). If it is determined that the predetermined number of times has not been reached, the process returns to step S1021 and the subsequent processing is repeated. If it is determined in step S1028 that the calculation of the feature amounts has been repeated the predetermined number of times, the feature amount calculation unit 1123 advances the process to step S1029.
- In step S1029, the motion feature amount data history storage memory 1125 outputs the stored motion feature amount data to the evaluation unit 1104 as evaluation index data.
- The motion feature amount data history storage memory 1125 then ends the evaluation index data generation process, returns the process to step S1003 in FIG. 21, and causes the processes from step S1004 onward to be executed.
- In step S1041, the amplitude evaluation unit 1141 of the evaluation unit 1104 evaluates the amplitude of the motion of the evaluation target based on the evaluation index data regarding the amplitude, and calculates the evaluation value Ea.
- In step S1042, the pulsation number evaluation unit 1142 evaluates the number of pulsations per unit time of the motion of the evaluation target based on the evaluation index data regarding the number of pulsations per unit time, and calculates the evaluation value Er.
- In step S1043, the classification result evaluation unit 1143 evaluates the classification result of each block based on the movement of the evaluation target, using the evaluation index data regarding the classification result, and calculates the evaluation value Ec.
- In step S1044, the evaluation integration unit 1144 integrates the evaluation values of the respective indexes and calculates the evaluation value E of the evaluation target.
- The evaluation integration unit 1144 then ends the motion evaluation process, returns the process to step S1004 in FIG. 21, and causes the processes from step S1005 onward to be executed.
- the cultured cardiomyocyte evaluation apparatus 1100 can more quantitatively evaluate an evaluation target (for example, cell movement).
- Since motion vectors are used to generate the indexes, more diverse indexes can be obtained more easily and non-invasively. That is, the evaluation target (for example, cell movement) can be evaluated more correctly.
- The pulsations in various regions, determined by analysis of phase-contrast observation movies of cultured cardiomyocytes, show cooperative pulsation depending on the number of days of culture, but fluctuate with the administration of various drugs. By detecting such fluctuations by some method, it becomes possible to evaluate in advance the toxicity, effects, and the like of a drug at the time of drug discovery.
- Therefore, the above-described method of detecting and evaluating cell movement is used to evaluate the prolongation of contraction and relaxation in the cell pulsation caused by drug administration, and the evaluation result is used to evaluate the toxicity and the like of the drug.
- The pulsation of cardiomyocytes consists of contraction and relaxation. For example, if the movement of ions through the potassium channels of the cell is inhibited, the relaxation time is prolonged (it becomes difficult to return from the contracted state).
- FIG. 24 is a block diagram illustrating a main configuration example of the medicine evaluation apparatus.
- a drug evaluation apparatus 1300 shown in FIG. 24 is an apparatus that evaluates the influence (efficacy, side effects, etc.) of a drug by the movement of cultured cardiomyocytes 500 to which the drug is administered.
- the drug evaluation device 1300 includes an imaging unit 1101 and an evaluation target image data generation / recording unit 1102 similar to the cultured cardiomyocyte evaluation device 1100 of FIG.
- the imaging unit 1101 images the cultured cardiomyocytes 500 before and after drug administration.
- the evaluation target image data generation recording unit 1102 generates evaluation target image data based on the image signal supplied from the imaging unit 1101, and records and stores the generated evaluation target image data in, for example, an internal recording medium. That is, evaluation target image data is generated for each moving image of cultured cardiomyocytes 500 before and after drug administration.
- the drug evaluation device 1300 includes an evaluation index data generation unit 1303 instead of the evaluation index data generation unit 1103 of the cultured cardiomyocyte evaluation device 1100, and further includes an evaluation unit 1304 instead of the evaluation unit 1104.
- the evaluation index data generation unit 1303 acquires evaluation target image data from the evaluation target image data generation recording unit 1102.
- the evaluation index data generation unit 1303 generates evaluation index data using the acquired evaluation target image data and supplies it to the evaluation unit 1304.
- the evaluation index data generation unit 1303 performs motion detection (generation of motion vectors) between the frame images of the evaluation target image data, which is a moving image of the cultured cardiomyocytes 500, for each block, that is, each partial region obtained by dividing the entire frame image into a plurality of portions.
- the evaluation index data generation unit 1303 performs such motion detection of each block for a predetermined period (a predetermined number of frames). This period may be the time of the moving image captured by the imaging unit 1101 or may be shorter.
- the evaluation index data generation unit 1303 further obtains the motion amount (the length of the motion vector) of each generated motion vector. That is, the evaluation index data generation unit 1303 generates a motion amount for each block in each frame over a predetermined period.
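The block-wise motion detection and motion-amount calculation described above can be sketched as follows. This is an illustrative exhaustive block-matching (SAD) implementation, not the patent's actual implementation; the function name, block size, and search range are assumptions.

```python
import numpy as np

def block_motion_amounts(frames, block=16, search=4):
    """Estimate one motion vector per block between consecutive frames by
    exhaustive SAD block matching, and return its length (pixels/frame).
    `frames` is a sequence of equally sized 2D grayscale arrays."""
    h, w = frames[0].shape
    by, bx = h // block, w // block
    amounts = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        mag = np.zeros((by, bx))
        for i in range(by):
            for j in range(bx):
                y, x = i * block, j * block
                ref = prev[y:y + block, x:x + block].astype(np.int64)
                # baseline: zero displacement, replaced only by strictly better SADs
                best = np.abs(ref - curr[y:y + block, x:x + block].astype(np.int64)).sum()
                best_v = (0, 0)
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        yy, xx = y + dy, x + dx
                        if (dy == 0 and dx == 0) or yy < 0 or xx < 0 \
                                or yy + block > h or xx + block > w:
                            continue
                        sad = np.abs(ref - curr[yy:yy + block, xx:xx + block].astype(np.int64)).sum()
                        if sad < best:
                            best, best_v = sad, (dy, dx)
                mag[i, j] = np.hypot(*best_v)
        amounts.append(mag)
    return np.array(amounts)  # shape: (n_frames - 1, blocks_y, blocks_x)
```

The per-block magnitudes collected over a period form exactly the motion-amount time series plotted in FIG. 25.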
- FIG. 25 is a diagram illustrating an example of a temporal change in the amount of movement of a certain block obtained by the evaluation index data generation unit 1303, that is, an example of a state of pulsation of cells.
- the horizontal axis indicates the elapsed time (number of frames), and the vertical axis indicates the amount of movement (pixels / frame).
- a curve 1311 (before) shows the pulsation of cultured cardiomyocytes 500 before drug administration
- a curve 1312 (after) shows the pulsation of cultured cardiomyocytes 500 after drug administration.
- the waveform for one beat (one contraction and one relaxation) is shown.
- the pulsation of the cardiomyocytes is composed of contraction and relaxation; the peak formed on the left side of the curves 1311 and 1312 is due to the "contraction" motion, and the peak formed on the right side is due to the "relaxation" motion.
- the point P1-1 indicates the contraction peak of the curve 1311 (before), and the point P1-2 indicates the relaxation peak of the curve 1311 (before).
- the point P2-1 indicates the contraction peak of the curve 1312 (after), and the point P2-2 indicates the relaxation peak of the curve 1312 (after).
- relaxation of the myocardium corresponds to the T wave of an electrocardiogram, that is, to repolarization of the myocardial cell membrane.
- prolongation of this T wave is generally called QT prolongation, an extension of the interval between the Q wave and the T wave. When this symptom appears, the possibility of arrhythmia is pointed out.
- DL-sotalol is known to inhibit potassium channels. That is, when DL-sotalol is administered to the cultured cardiomyocytes 500, the relaxation process changes because the function of the potassium channels acting in the relaxation process changes.
- the relaxation peak is shifted before and after drug administration. More specifically, the time of the point P2-2 is delayed (shifted) by the time d relative to the time of the point P1-2. That is, the occurrence of QT prolongation due to drug administration (for example, a change in potassium channel function due to DL-sotalol administration) can be confirmed.
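The relaxation-peak shift d can be measured, for example, by picking the second peak of one beat's motion-amount waveform before and after administration. A minimal sketch, assuming uniformly sampled motion-amount traces covering one beat each; the function names and the half-maximum peak threshold are illustrative, not from the patent:

```python
import numpy as np

def peak_times(signal, fps, min_height=None):
    """Times (seconds) of strict local maxima at or above min_height."""
    s = np.asarray(signal, dtype=float)
    if min_height is None:
        min_height = 0.5 * s.max()  # assumed default threshold
    idx = [i for i in range(1, len(s) - 1)
           if s[i] > s[i - 1] and s[i] >= s[i + 1] and s[i] >= min_height]
    return np.array(idx) / fps

def relaxation_delay(before, after, fps):
    """Shift d of the relaxation peak (second peak of the beat):
    t(P2-2) - t(P1-2)."""
    tb, ta = peak_times(before, fps), peak_times(after, fps)
    return ta[1] - tb[1]
```

A threshold judgment on the returned d (as the evaluation unit 1304 does) then indicates whether QT prolongation has occurred.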
- this QT prolongation can be observed by conventional potential measurement, but that requires a dedicated culture dish having electrodes. In pulsation imaging using calcium, basically only the left (contraction) peak is observed and it is difficult to observe the right (relaxation) peak, so that method is not suitable for evaluating this prolongation.
- changes in cell pulsation behavior can be captured without adding any reagent such as a fluorescent dye to the cells and without using a special culture dish. That is, the evaluation can be performed easily, non-invasively and inexpensively.
- the evaluation index data generation unit 1303 further arranges the motion amounts of the motion vectors (motion vector lengths) in time series, calculates feature quantities for waveform comparison such as the coordinates of the points P2-1 and P2-2 or the time d, and supplies the feature quantities to the evaluation unit 1304 as evaluation index data.
- the evaluation unit 1304 converts the supplied evaluation index data into an image or evaluates it quantitatively, calculates an evaluation value for the movement of the cultured cardiomyocytes 500, and outputs it.
- the evaluation unit 1304, for example, displays a graph image indicating the pulsation pattern as shown in FIG. 25, or determines whether QT prolongation has occurred by comparing the time d, which indicates the degree of QT prolongation, with a threshold value.
- the graph shown in FIG. 25 is one example of such imaging; other than this, the pulsation pattern of the cells may be expressed by an arbitrary image such as a bar graph, a distribution diagram, or a schematic diagram. Moreover, the drug to be evaluated is arbitrary.
- FIG. 26 is a block diagram illustrating a main configuration example of the evaluation index data generation unit 1303.
- the evaluation index data generation unit 1303 includes a motion detection unit 310, which performs motion detection between the frame images of the evaluation target image data 600 (a moving image) and generates a motion vector for each block.
- the evaluation index data generation unit 1303 includes a motion amount absolute value calculation unit 1321, a motion amount absolute value storage unit 1322, a feature amount calculation unit 1323, and a feature amount storage unit 1324.
- the motion amount absolute value calculation unit 1321 calculates, for each motion vector detected by the motion detection unit 310, the motion amount (the absolute value of the length of the motion vector; hereinafter also referred to as the motion amount absolute value).
- the motion amount absolute value calculation unit 1321 stores the calculated motion amount absolute value in the motion amount absolute value storage unit 1322.
- the motion amount absolute value storage unit 1322 stores the motion amount absolute value for each block between all frames of the evaluation target image data 600. For example, when there are a plurality of evaluation target image data before and after drug administration, the motion amount absolute value storage unit 1322 stores the motion amount absolute value for each evaluation target image data.
- the feature amount calculation unit 1323 calculates a predetermined feature amount used for evaluation using the motion amount absolute value stored in the motion amount absolute value storage unit 1322.
- the feature amount storage unit 1324 stores the feature amount calculated by the feature amount calculation unit 1323. This feature amount is supplied to the evaluation unit 1304 as evaluation index data 800 at a predetermined timing or in response to a request from the evaluation unit 1304 or the like.
- FIG. 27 is a block diagram illustrating a main configuration example of the evaluation unit 1304.
- the evaluation unit 1304 includes a feature amount acquisition unit 1341, a feature comparison unit 1342, a display unit 1343, and an output unit 1344.
- the feature quantity acquisition unit 1341 acquires, as the evaluation index data 800, a desired feature quantity (for example, the feature quantity of the evaluation target (cultured cardiomyocytes 500) designated by the user) from the evaluation index data generation unit 1303 (feature amount storage unit 1324).
- the feature amount acquisition unit 1341 supplies the acquired feature amount to the display unit 1343 for display, or supplies the acquired feature amount to the output unit 1344 for supply to another device.
- the feature amount acquisition unit 1341 supplies the acquired feature amount to the feature comparison unit 1342.
- the feature comparison unit 1342 quantitatively evaluates the supplied feature amounts. For example, it quantitatively compares the feature amounts of a plurality of cultured cardiomyocytes 500 with each other, compares the feature amounts before and after drug administration, or compares a feature amount with a predetermined threshold value.
- the feature comparison unit 1342 supplies the comparison result to the display unit 1343 for display, or supplies the comparison result to the output unit 1344 for supply to another device.
- the display unit 1343 includes a display device such as a monitor, for example, converts the data supplied from the feature amount acquisition unit 1341 or the feature comparison unit 1342 into an image, and displays the image on the display device. For example, the display unit 1343 generates and displays a graph as shown in FIG. 25, for example, using the amount of motion acquired by the feature amount acquisition unit 1341. Further, for example, the display unit 1343 images and displays the evaluation result supplied from the feature comparison unit 1342.
- the output unit 1344 has an interface such as an external terminal, for example, and outputs data supplied from the feature amount acquisition unit 1341 or the feature comparison unit 1342 to an external device, a network, or the like.
- the drug evaluation device 1300 can easily and non-invasively evaluate the influence of the drug administration on the pulsation of the cardiomyocytes.
- the evaluation unit has been described as evaluating the occurrence of QT prolongation, but parameters other than this may be evaluated. That is, the feature amount to be calculated is also arbitrary.
- a difference in motion amount between the points P1-2 and P2-2 may be used as the feature amount.
- the differences in time and in motion amount between the points P1-1 and P2-1 may be used as feature amounts.
- the width of the contraction peak or the width of the relaxation peak may be used as the feature amount.
- other parameters may be used as the feature amount.
- the evaluation unit 1304 may perform such evaluation for all blocks in the observation region or for some blocks. Furthermore, the evaluation unit 1304 may perform such evaluation for all the pulsations in the observation period, or may be performed for some pulsations.
- the imaging unit 1101 of the drug evaluation device 1300 images the evaluation target in step S1301.
- in step S1302, the evaluation target image data generation recording unit 1102 generates evaluation target image data from the image signal obtained by the imaging in step S1301.
- in step S1303, the evaluation index data generation unit 1303 generates evaluation index data using the evaluation target image data generated in step S1302.
- in step S1304, the evaluation unit 1304 evaluates the influence of the drug by observing the pulsation pattern (e.g., QT prolongation) of the cultured cardiomyocytes 500 before and after drug administration, using the evaluation index data generated in step S1303.
- in step S1305, the output unit 1344 of the evaluation unit 1304 outputs the evaluation value calculated in step S1304 to the outside of the drug evaluation device 1300, and the evaluation process ends.
- the display unit 1343 may convert the evaluation value into an image and display it on the display device, as described above. Further, the display unit 1343 may convert the various feature amounts calculated in the process of step S1303 into images and display them on the display device, or the output unit 1344 may output those feature amounts to the outside of the drug evaluation device 1300.
- the motion detection unit 310 of the evaluation index data generation unit 1303 detects the motion of the evaluation target for each block and generates a motion vector in step S1321.
- in step S1322, the motion amount absolute value calculation unit 1321 calculates the motion amount absolute value of each motion vector generated in step S1321.
- in step S1323, the motion amount absolute value storage unit 1322 stores the motion amount absolute value calculated in step S1322.
- in step S1324, the motion detection unit 310 determines whether motion detection has been performed for a predetermined period (evaluation section). When it determines that some frame images in the predetermined evaluation section have not yet undergone motion detection, the motion detection unit 310 returns the process to step S1321 and repeats motion detection for a new processing-target frame image.
- if it is determined in step S1324 that motion detection has been performed on all the frame images to be processed in the predetermined evaluation section, the motion detection unit 310 advances the process to step S1325.
- in step S1325, the feature amount calculation unit 1323 calculates a feature amount using the motion amount absolute values stored in step S1323.
- the feature amount storage unit 1324 stores the feature amount calculated in step S1325.
- in step S1327, the feature amount calculation unit 1323 determines whether the calculation of the feature amount has been repeated a predetermined number of times (for example, S times). If it determines that the predetermined number of times has not been reached, it returns the process to step S1321 and repeats the subsequent processing. If it determines in step S1327 that the feature quantity calculation has been repeated the predetermined number of times, the feature quantity calculation unit 1323 ends the evaluation index data generation process, returns the process to FIG. 28, and causes the processes in and after step S1304 to be executed.
- in step S1341, the feature amount acquisition unit 1341 of the evaluation unit 1304 acquires a desired feature amount from the feature amount storage unit 1324.
- in step S1342, the feature comparison unit 1342 compares the feature amounts acquired in step S1341 between the evaluation targets.
- the feature comparison unit 1342 then ends the influence evaluation process, returns the process to FIG. 28, and causes the process of step S1305 to be executed.
- the drug evaluation apparatus 1300 can easily evaluate the influence of drug administration on the pulsation of cardiomyocytes by obtaining feature values regarding the temporal change of the motion amount of the observation target whose motion has been detected, and evaluating them with the evaluation unit 1304. Since this method does not use a special culture dish or a fluorescent reagent, evaluation can be performed simply, non-invasively, and inexpensively, and the method is also suitable for automation. Moreover, the observation area may be a relatively narrow range, for example, about 0.6 mm square, so the test can be performed with a small number of cells and a small amount of reagent.
- FIG. 31 is a diagram showing an example of pulsation before and after drug administration.
- Each of the eight graphs shown in FIG. 31 is an observation result of the state of pulsation (a temporal change in the absolute value of the amount of movement) of a predetermined portion in the observation region of the cultured cardiomyocytes 500.
- the horizontal axis indicates time (sec), and the vertical axis indicates the motion amount absolute value between frames (pixels/frame). That is, the amplitudes shown in each graph represent the pulsation of the cultured cardiomyocytes 500.
- the top graph shows the state of pulsation before and after administration of an organic solvent used as a control (for example, dimethyl sulfoxide).
- the second graph from the top shows the state of pulsation before and after administration of aspirin (acetylsalicylic acid).
- the third graph from the top shows the state of pulsation before and after administration of DL-sotalol.
- the bottom graph shows the state of pulsation before and after administration of 18-β-glycyrrhetinic acid.
- when DL-sotalol is administered, the function of the potassium channels is inhibited; thereby, as shown in the third graph from the top on the right side of FIG. 31, not only does the relaxation waveform (beat width) change (become unstable), but the beat timing also becomes unstable (the peak interval varies). The absolute value of the amount of movement at the peak also becomes unstable (variations occur).
- 18-β-glycyrrhetinic acid is known to inhibit gap junctions. When 18-β-glycyrrhetinic acid is administered, the pulsation timing and the absolute value of the amount of movement at the peak likewise become unstable (variations occur), as shown in the fourth graph from the top on the right side of FIG. 31.
- FIG. 32 is a diagram showing an example of pulsation variation before and after drug administration.
- Each of the eight graphs shown in FIG. 32 is obtained by superimposing pulsation waveforms (a plurality of pulsations) repeated in a predetermined portion within the observation region of cultured cardiomyocytes 500.
- the horizontal axis of each graph represents time (sec), and the vertical axis represents the absolute value of the motion amount between frames (pixels/frame).
- the left graph shows the pulsation before drug administration
- the right graph shows the pulsation after drug administration (after a predetermined time has elapsed since administration).
- the administered drugs are, in order from the top, an organic solvent (control), aspirin (acetylsalicylic acid), DL-sotalol, and 18-β-glycyrrhetinic acid.
- in the case of the organic solvent and aspirin, each pulsation of the cardiomyocytes varies no more than it did before administration.
- in the case of 18-β-glycyrrhetinic acid, a large variation in peak height occurs in the contraction waveform, as shown in the fourth graph from the top on the right side of FIG. 32, due to the inhibition of gap-junction function.
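The instability (variation) of beat timing and peak height discussed for FIGS. 31 and 32 can be quantified, for instance, as the coefficient of variation of peak heights and peak-to-peak intervals. This is a hedged sketch; the peak picker, its threshold, and the choice of coefficient of variation as the statistic are assumptions, not from the patent:

```python
import numpy as np

def beat_variability(signal, fps, min_height=None):
    """Coefficient of variation (std/mean) of beat peak heights and of
    peak-to-peak intervals; larger values mean less stable pulsation."""
    s = np.asarray(signal, dtype=float)
    if min_height is None:
        min_height = 0.5 * s.max()  # assumed default threshold
    peaks = [i for i in range(1, len(s) - 1)
             if s[i] > s[i - 1] and s[i] >= s[i + 1] and s[i] >= min_height]
    heights = s[peaks]
    intervals = np.diff(peaks) / fps
    cv = lambda x: float(np.std(x) / np.mean(x))
    return cv(heights), cv(intervals)
```

For a stable control trace both values stay near zero, while traces like those after DL-sotalol or 18-β-glycyrrhetinic acid administration would yield clearly larger values.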
- observing the changes in pulsation caused by drug administration in specific cells (specific partial regions) within the observation region of the cultured cardiomyocytes 500 yields information that cannot be obtained simply by observing the correlation of pulsation between cells. Therefore, drug evaluation can be performed using an index different from that used when observing the correlation of pulsation between cells.
- a CPU (Central Processing Unit) 1501 of the personal computer 1500 executes various processes according to a program stored in a ROM (Read Only Memory) 1502 or a program loaded from a storage unit 1513 into a RAM (Random Access Memory) 1503.
- the RAM 1503 also appropriately stores data necessary for the CPU 1501 to execute various processes.
- the CPU 1501, the ROM 1502, and the RAM 1503 are connected to each other via a bus 1504.
- An input / output interface 1510 is also connected to the bus 1504.
- an input unit 1511 including a keyboard and a mouse, an output unit 1512 including a display such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display) and a speaker, a storage unit 1513 including a hard disk, and a communication unit 1514 including a modem are connected to the input/output interface 1510. The communication unit 1514 performs communication processing via a network including the Internet.
- a drive 1515 is connected to the input/output interface 1510 as necessary; a removable medium 1521 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory is mounted as appropriate, and a computer program read from it is installed in the storage unit 1513 as necessary.
- a program constituting the software is installed from a network or a recording medium.
- this recording medium is constituted not only by the removable medium 1521 on which the program is recorded and which is distributed to deliver the program to users separately from the apparatus main body, such as a magnetic disk (including a flexible disk), an optical disk (including a CD-ROM (Compact Disc - Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disk (including an MD (Mini Disc)), or a semiconductor memory, but also by the ROM 1502 on which the program is recorded and the hard disk included in the storage unit 1513, which are distributed to users in a state of being incorporated in the apparatus main body in advance.
- the program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- the steps describing the program recorded on the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually.
- in this specification, a system refers to the entire apparatus constituted by a plurality of devices.
- the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
- the configurations described above as a plurality of devices (or processing units) may be combined into a single device (or processing unit).
- a configuration other than that described above may be added to the configuration of each device (or each processing unit).
- a part of the configuration of a certain device (or processing unit) may be included in the configuration of another device (or another processing unit). That is, the present technology is not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology.
- (1) A data processing apparatus comprising: a motion detection unit that divides each of a plurality of frame image data forming moving image data, whose image content is an object performing periodic motion, into blocks each having an array of a predetermined number of pixels, and detects time-series data of motion for each block; a feature amount calculation unit that calculates at least one type of feature amount for each block based on the detected time-series data of each block; and a classification processing unit that generates, based on the calculated feature amounts, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to any one of a predetermined number of classification categories.
- (2) The data processing apparatus according to (1), wherein the feature quantity calculation unit calculates a plurality of types of feature quantities for each block, and the classification processing unit generates the classification data based on the calculated plurality of types of feature quantities.
- (3) The data processing apparatus according to (1) or (2), wherein the feature amount calculation unit calculates, as one type of the feature amount, an average motion direction that is the average value of the motion direction per unit time over a fixed time.
- (4) The data processing apparatus according to any one of (1) to (3), wherein the feature amount calculation unit calculates, as one type of the feature amount, an average motion amount that is the average value of the motion amount per unit time over a fixed time.
- (5) The data processing apparatus according to any one of (1) to (4), wherein the feature amount calculation unit calculates, as one type of the feature amount, an average amplitude that is the average value of amplitudes of the motion amount of a certain magnitude or more obtained within a fixed time.
- (6) The data processing apparatus according to any one of (1) to (5), wherein the feature amount calculation unit calculates, as one type of the feature amount, an average acceleration that is the average value of the motion acceleration per unit time over a fixed time.
- (7) The data processing apparatus according to any one of (1) to (6), wherein the feature amount calculation unit calculates, as one type of the feature amount, an average motion interval that is the average value of the time intervals at which amplitudes of the motion amount of a certain magnitude or more are obtained within a fixed time.
- (8) The data processing apparatus according to any one of (1) to (7), wherein the feature amount calculation unit calculates, as one type of the feature amount, a motion start time that is the time from a predetermined timing until an amplitude of the motion amount of a certain magnitude or more is obtained.
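The per-block feature amounts enumerated here (average motion direction, average motion amount, average amplitude, average acceleration, average motion interval, and motion start time) could be computed from one block's time series of motion vectors roughly as follows. The function name, dictionary layout, amplitude threshold, and the specific estimators (e.g., acceleration as the mean absolute difference of successive magnitudes) are illustrative assumptions:

```python
import numpy as np

def block_features(vectors, fps, amp_thresh):
    """Feature amounts for one block from its per-frame motion vectors,
    given as an array of (vx, vy) pairs sampled at `fps` frames/second."""
    v = np.asarray(vectors, dtype=float)
    mag = np.hypot(v[:, 0], v[:, 1])  # motion amount per frame
    feats = {
        "avg_direction": float(np.arctan2(v[:, 1], v[:, 0]).mean()),
        "avg_amount": float(mag.mean()),
        "avg_acceleration": float(np.abs(np.diff(mag)).mean() * fps),
    }
    # peaks of the motion amount at or above the threshold
    peaks = [i for i in range(1, len(mag) - 1)
             if mag[i] > mag[i - 1] and mag[i] >= mag[i + 1] and mag[i] >= amp_thresh]
    feats["avg_amplitude"] = float(np.mean(mag[peaks])) if peaks else 0.0
    feats["avg_interval"] = float(np.mean(np.diff(peaks)) / fps) if len(peaks) > 1 else 0.0
    feats["start_time"] = peaks[0] / fps if peaks else None  # from start of window
    return feats
```

The resulting feature vectors are what the classification processing unit would consume, whether by template matching or by k-means clustering.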
- (9) The data processing apparatus according to any one of (1) to (8), wherein the classification processing unit calculates, for each block, the distance between the block and each of a plurality of templates having different combinations of feature amounts corresponding to the plurality of classification categories, and classifies the block as belonging to any one of the plurality of classification categories based on the calculated distances.
- (10) The data processing apparatus according to any one of (1) to (9), wherein the classification processing unit performs clustering by the k-means method based on the feature amounts calculated for each block, thereby classifying each of the blocks as belonging to any one of the predetermined number of classification categories.
- (11) A data processing method comprising: a motion detection procedure of dividing each of a plurality of frame image data forming moving image data, whose image content is an object performing periodic motion, into blocks each having an array of a predetermined number of pixels, and detecting time-series data of motion for each block; a feature amount calculation procedure of calculating at least one type of feature amount for each block based on the detected time-series data of each block; and a classification processing procedure of generating, based on the calculated feature amounts, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to any one of a predetermined number of classification categories.
- (12) An image processing apparatus comprising: a motion detection unit that detects the motion of an evaluation target using an image of the evaluation target; an index data generation unit that, using a motion vector indicating the motion of the evaluation target detected by the motion detection unit, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and an evaluation value calculation unit that evaluates the index data generated by the index data generation unit and calculates an evaluation value.
- (13) The image processing apparatus according to (12), wherein the index data generation unit generates index data relating to the amplitude of the motion of the evaluation target and index data relating to the frequency per unit time of the peaks of the motion of the evaluation target, and the evaluation value calculation unit calculates an evaluation value for evaluating the magnitude of the motion of the evaluation target using the index data relating to the amplitude of the motion of the evaluation target generated by the index data generation unit, and further calculates an evaluation value for evaluating the frequency per unit time of the peaks of the motion of the evaluation target using the index data relating to the frequency per unit time of the peaks of the motion of the evaluation target generated by the index data generation unit.
- (14) The image processing apparatus according to (13), wherein the index data relating to the magnitude of the motion amplitude of the evaluation target is the average value, over the entire image of the evaluation target, of the product of the normalized amplitude and the normalized variance of the amplitude.
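An index of the kind defined in (14), the average over the image of the product of the normalized amplitude and the normalized variance of the amplitude, might be computed as follows. Normalization by the per-image maximum is one possible reading of "normalized" and is an assumption, as are the function name and input layout:

```python
import numpy as np

def amplitude_index(amplitudes):
    """Average over all blocks of (normalized amplitude) * (normalized
    variance of amplitude). `amplitudes` has shape (beats, blocks):
    the peak amplitude of each block for each observed beat."""
    a = np.asarray(amplitudes, dtype=float)
    mean_amp = a.mean(axis=0)  # per-block mean amplitude
    var_amp = a.var(axis=0)    # per-block variance over beats
    norm = lambda x: x / x.max() if x.max() > 0 else x
    return float((norm(mean_amp) * norm(var_amp)).mean())
```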
- (15) The image processing apparatus according to (13) or (14), wherein the index data relating to the magnitude of the motion amplitude of the evaluation target is obtained from a region, within the image of the evaluation target, in which the value of the product of the normalized amplitude and the normalized variance of the amplitude is greater than or equal to a predetermined threshold value.
- (16) The image processing apparatus according to any one of (13) to (15), wherein the index data relating to the frequency per unit time of the motion peaks of the evaluation target is the product of the normalized number of peaks per unit time and the normalized variance of the number of peaks per unit time.
- The image processing apparatus according to any one of (13) to (17), wherein the index data generation unit further generates index data relating to a classification result obtained by classifying each partial region of the image of the evaluation target based on the feature quantities of the motion of the evaluation target, and the evaluation value calculation unit further calculates an evaluation value for evaluating the classification result of the feature quantities of the motion of the evaluation target, using the index data relating to the classification result generated by the index data generation unit.
- The image processing apparatus according to any one of (12) to (18), wherein the index data generation unit calculates the motion amount of the evaluation target detected by the motion detection unit, and the evaluation value calculation unit converts the temporal change in the motion amount calculated by the index data generation unit into an image and displays it.
- the index data generation unit generates index data indicating a change, caused by administration of a drug to the cardiomyocytes, in the peak of the waveform indicating relaxation of the cardiomyocytes to be evaluated within the temporal change of the calculated motion amount.
- The image processing apparatus according to any one of the above, wherein the motion detection unit detects the motion of the evaluation target between frame images within an evaluation period of a predetermined length of the evaluation target image, which is a moving image.
- the image processing device according to (22), wherein the motion detection unit repeats the detection of the motion of the evaluation target in the evaluation period a predetermined number of times.
- The image processing apparatus according to any one of (12) to (23), wherein the evaluation value calculation unit evaluates each of the plurality of types of index data generated by the index data generation unit, calculates evaluation values, and calculates an evaluation value for evaluating the evaluation target by integrating the calculated evaluation values.
- the image processing device according to any one of (12) to (24), wherein the evaluation target is a cell that moves spontaneously.
- the evaluation target is a cultured cell generated by culturing a cell collected from a living body.
- An image processing method, wherein: the motion detection unit of the image processing apparatus detects the motion of the evaluation target using an image of the evaluation target; the index data generation unit of the image processing apparatus, using a detected motion vector indicating the motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and the evaluation value calculation unit of the image processing apparatus evaluates the generated index data and calculates an evaluation value.
- A program for causing a computer to function as: a motion detection unit that detects the motion of an evaluation target using an image of the evaluation target; an index data generation unit that, using a motion vector indicating the detected motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and an evaluation value calculation unit that evaluates the generated index data and calculates an evaluation value.
Abstract
Description
The evaluation value calculation unit can further calculate an evaluation value that evaluates the classification result of the feature quantities of the motion of the evaluation target, using the index data relating to the classification result generated by the index data generation unit.
1. First embodiment (evaluation index data generation processing: an example of executing classification processing using templates)
2. Second embodiment (evaluation index data generation processing: an example of executing classification processing by the k-means method)
3. Third embodiment (cultured cardiomyocyte evaluation apparatus)
4. Fourth embodiment (drug evaluation apparatus)
5. Fifth embodiment (personal computer)
[Configuration example of the cultured cardiomyocyte evaluation system]
Fig. 1 shows a configuration example of a cultured cardiomyocyte evaluation system 100. The cultured cardiomyocyte evaluation system 100 shown in this figure is for evaluating the quality of cultured cardiomyocytes 500.
Fig. 2 shows a configuration example of the evaluation index data generation apparatus 300. The evaluation index data generation apparatus 300 shown in this figure comprises a motion detection unit 310, a motion detection data storage unit 320, a feature quantity calculation unit 330, and a classification processing unit 340. The evaluation target image data 600 shown in this figure is obtained by reproducing data recorded by the evaluation target image data generation and recording apparatus 200 and, as described above, is moving image data composed of frame image data.
Fig. 3 shows an example of the structure of the evaluation target image data 600 input to the evaluation index data generation apparatus 300. As shown in this figure, the evaluation target image data 600 consists of the first through (T+1)-th frame image data 610-1 through 610-(T+1), corresponding to a fixed length of time.
Fig. 4 shows a configuration example of the motion detection unit 310. The motion detection unit 310 shown in this figure comprises a frame memory 311 and a motion vector calculation unit 312. The frame memory 311 holds the frame image data 610 that is sequentially input, one frame period at a time, as the evaluation target image data 600.
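As a concrete illustration of the block-wise motion detection performed with the frame memory 311 and the motion vector calculation unit 312, the following sketch estimates one motion vector per block by exhaustive block matching between two consecutive frames. The block size, search range, and sum-of-absolute-differences (SAD) matching criterion are illustrative assumptions; the patent only specifies that each frame is divided into blocks of a predetermined number of pixels and that motion is detected per block.

```python
import numpy as np

def block_motion_vectors(prev, curr, block=16, search=4):
    """Estimate one (dy, dx) motion vector per block of 'curr' by
    exhaustive block matching (minimum SAD) against 'prev'.

    A minimal sketch of the kind of matching the motion vector
    calculation unit (312) could perform; parameters are illustrative.
    """
    h, w = curr.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = curr[by:by + block, bx:bx + block].astype(np.int64)
            best, best_sad = (0, 0), None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate window falls outside the frame
                    ref = prev[y:y + block, x:x + block].astype(np.int64)
                    sad = int(np.abs(target - ref).sum())
                    if best_sad is None or sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[(by // block, bx // block)] = best
    return vectors
```

Calling this on each successive pair of frame image data 610 yields the per-block time-series motion data referred to in the text.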
The feature quantity calculation unit 330 calculates a plurality of feature quantities using the motion detection data 700 stored in the motion detection data storage unit 320. First, examples of the feature quantities calculated and acquired by the feature quantity calculation unit 330 are described with reference to Fig. 7.
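Two of these feature quantities, the average motion amount and the average motion direction over a fixed period, can be sketched as follows. The (dy, dx) layout of the per-unit-time vectors and the circular-mean treatment of the direction are illustrative assumptions, not taken from the patent text.

```python
import numpy as np

def motion_features(vectors_over_time):
    """Compute a block's average motion amount Vav and average motion
    direction theta_av from its per-unit-time motion vectors.

    'vectors_over_time' is assumed to be shaped (T, 2), one (dy, dx)
    vector per unit time for a single block (an illustrative layout).
    """
    v = np.asarray(vectors_over_time, dtype=float)
    magnitudes = np.hypot(v[:, 0], v[:, 1])  # |v| per unit time
    Vav = magnitudes.mean()                  # average motion amount
    # Average the unit direction vectors before taking the angle, so
    # directions near +pi and -pi do not cancel incorrectly.
    angles = np.arctan2(v[:, 0], v[:, 1])
    theta_av = float(np.arctan2(np.sin(angles).mean(),
                                np.cos(angles).mean()))
    return Vav, theta_av
```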
The classification processing unit 340 executes classification processing using the plural types of feature quantities calculated by the feature quantity calculation unit 330 as described above, and obtains the result of that classification processing as the evaluation index data 800. Several classification techniques are conceivable; here, a technique known as clustering is adopted. That is, a plurality of classification categories called clusters are set, and each of the blocks 611 forming the frame image data 610 shown in Fig. 5 is classified into one of the clusters according to its feature quantities.
(Vav,θav)=(0,0)
(Vav,θav)=(a,0)
(Vav,θav)=(a,π/4)
(Vav,θav)=(a,π/2)
(Vav,θav)=(a,3π/4)
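A minimal sketch of the template-based classification: each template pairs an average motion amount with an average motion direction, as in the (Vav, θav) examples above, and a block is assigned to the nearest template. The value of `a`, the template names, and the Euclidean distance criterion are illustrative assumptions; the patent only states that each template holds a different combination of feature quantities and that the block is classified by distance.

```python
import math

# Templates pairing an average motion amount Vav with an average motion
# direction theta_av, following the (Vav, theta_av) examples above.
a = 1.0  # illustrative magnitude, not specified by the source
TEMPLATES = {
    "still":    (0.0, 0.0),
    "right":    (a, 0.0),
    "up-right": (a, math.pi / 4),
    "up":       (a, math.pi / 2),
    "up-left":  (a, 3 * math.pi / 4),
}

def classify_block(features):
    """Assign a block's (Vav, theta_av) pair to the nearest template."""
    v, th = features
    def dist(name):
        tv, tth = TEMPLATES[name]
        return math.hypot(v - tv, th - tth)
    return min(TEMPLATES, key=dist)
```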
The flowchart of Fig. 10 shows an example of the processing procedure executed by the evaluation index data generation apparatus 300 in the first embodiment. The processing of each step in this figure is executed, as appropriate, by one of the motion detection unit 310, the feature quantity calculation unit 330, and the classification processing unit 340 shown in Fig. 2. At least part of the processing of the steps shown in Fig. 10 can be configured to be realized by a CPU (Central Processing Unit) of a computer executing a program.
[Configuration of the evaluation index data generation apparatus]
The classification processing in the first embodiment described above uses templates, but other approaches to classification are also conceivable. Therefore, as a second embodiment, a configuration adopting another classification technique is described.
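The alternative classification of the second embodiment, clustering by the k-means method, can be sketched as follows. The initialisation from the first k feature vectors and the fixed iteration count are illustrative choices made to keep the example deterministic; the patent does not prescribe these details.

```python
import numpy as np

def kmeans(features, k, iters=20):
    """Plain k-means clustering of per-block feature vectors, as an
    alternative to the template matching of the first embodiment.
    """
    X = np.asarray(features, dtype=float)
    centers = X[:k].copy()  # deterministic init from the first k samples
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each block to the nearest cluster centre
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centre to the mean of its member blocks
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```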
The flowchart of Fig. 11 shows an example of the processing procedure executed by the evaluation index data generation apparatus 300 in the second embodiment. In this figure, the processing of steps S901 through S907 is the same as in Fig. 10, which corresponds to the first embodiment.
[Overview of another example of the evaluation method]
The cell evaluation method is not limited to those described above. For example, evaluation values may be obtained for indices calculated from the motion vectors obtained for each block of the cultured cells.
Fig. 13 is a block diagram showing a main configuration example of the cultured cardiomyocyte evaluation apparatus.
Fig. 14 is a block diagram showing a main configuration example of the evaluation index data generation unit 1103. As shown in Fig. 14, the evaluation index data generation unit 1103, like the evaluation index data generation apparatus 300 of Fig. 2, has a motion detection unit 310 and a motion detection data storage unit 320. The evaluation index data generation unit 1103 also has a feature quantity calculation unit 1123 in place of the feature quantity calculation unit 330 of the evaluation index data generation apparatus 300 of Fig. 2, and a classification processing unit 1124 in place of the classification processing unit 340 of that apparatus. In addition, the evaluation index data generation unit 1103 has a motion feature quantity data history storage memory 1125.
Fig. 16 is a block diagram showing a main configuration example of the evaluation unit 1104. As shown in Fig. 16, the evaluation unit 1104 has an evaluation unit (a per-index evaluation unit) for each of the supplied evaluation index data 800. In the example of Fig. 16, the evaluation unit 1104 has, as its per-index evaluation units, an amplitude evaluation unit 1141, a beat rate evaluation unit 1142, and a classification result evaluation unit 1143.
Next, a specific example of amplitude evaluation by the amplitude evaluation unit 1141 is described. In general, it is desirable for the beating of cardiomyocytes to have a large, stable amplitude. The amplitude evaluation unit 1141 therefore calculates the evaluation value so that it becomes larger when the amplitude is larger and more stable.
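One way to realise this behaviour is sketched below: the score grows with the mean amplitude and shrinks with its variability. The exact formula, mean divided by (1 + standard deviation), is an illustrative assumption; the patent specifies only the qualitative requirement that larger, more stable amplitudes score higher.

```python
import numpy as np

def amplitude_score(amplitudes):
    """Score a sequence of beat amplitudes so that larger and more
    stable (less variable) amplitudes give a larger evaluation value,
    as described for the amplitude evaluation unit (1141).
    """
    a = np.asarray(amplitudes, dtype=float)
    # reward magnitude, penalise spread
    return a.mean() / (1.0 + a.std())
```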
The amplitude evaluation method is not limited to the example described above. For example, the beating of the cultured cardiomyocytes may be compared with that under ideal normal culture, and the comparison result evaluated. In this case, a beating transition pattern under ideal normal culture (an ideal transition pattern) is determined in advance.
Next, a specific example of the evaluation of the number of beats per unit time by the beat rate evaluation unit 1142 is described. In general, it is desirable for the beating of cardiomyocytes that the number of beats per unit time (the rate) be stable at an appropriate value. The beat rate evaluation unit 1142 therefore calculates the evaluation value so that it becomes larger when the number of beats per unit time is stable at a more appropriate value.
The method of evaluating the number of beats per unit time is not limited to the example described above. For example, the beating of the cultured cardiomyocytes may be compared with that under ideal normal culture, and the comparison result evaluated. In this case, a beating transition pattern under ideal normal culture (an ideal transition pattern) is determined in advance.
Next, a specific example of the evaluation of the cluster classification result by the classification result evaluation unit 1143 is described. In general, for the beating of cardiomyocytes, the larger the proportion of blocks classified into a desirable cluster, the better. The classification result evaluation unit 1143 therefore calculates the evaluation value so that it becomes larger when the proportion of blocks classified into a predetermined cluster in which the feature quantities are in a desirable state (a desirable cluster) is larger.
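The classification-result score can be sketched directly from that description. Treating the raw ratio of blocks in the desirable cluster(s) as the evaluation value is an illustrative simplification; the patent requires only that the value grow with that proportion.

```python
def desirable_cluster_ratio(labels, desirable):
    """Fraction of blocks whose cluster label is in the set of
    desirable clusters, used as the classification-result evaluation
    value of unit 1143: more blocks in a desirable cluster -> larger
    score.
    """
    labels = list(labels)
    if not labels:
        return 0.0
    hits = sum(1 for lab in labels if lab in desirable)
    return hits / len(labels)
```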
Returning to Fig. 16, the evaluation unit 1104 further has an evaluation integration unit 1144. Each per-index evaluation unit of the evaluation unit 1104 supplies the per-index evaluation value it has calculated to the evaluation integration unit 1144.
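The integration step can be sketched as a weighted average of the per-index evaluation values. A weighted average is one plausible integration rule; the patent leaves the exact integration method open, so the weights and the averaging form are illustrative assumptions.

```python
def integrate_scores(scores, weights=None):
    """Combine the per-index evaluation values (e.g. amplitude, beat
    rate, classification result) into one overall evaluation value, as
    the evaluation integration unit (1144) does.

    'scores' maps index names to evaluation values; 'weights' maps the
    same names to relative importances (uniform if omitted).
    """
    if weights is None:
        weights = {k: 1.0 for k in scores}
    total_w = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_w
```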
Next, an example of the flow of the evaluation processing executed by the cultured cardiomyocyte evaluation apparatus 1100 is described with reference to the flowchart of Fig. 21.
Next, an example of the flow of the evaluation index data generation processing executed in step S1003 of Fig. 21 is described with reference to the flowchart of Fig. 22.
Next, an example of the flow of the motion evaluation processing executed in step S1004 of Fig. 21 is described with reference to the flowchart of Fig. 23.
[Application to other evaluations]
By evaluating the cooperativity of the motion of the evaluation target, it is also possible to evaluate other substances that affect the motion of the evaluation target (for example, an administered gas, liquid, or solid) or arbitrary environmental conditions (for example, temperature, humidity, atmospheric pressure, brightness, vibration, magnetic field, and so on).
Fig. 24 is a block diagram showing a main configuration example of the drug evaluation apparatus. The drug evaluation apparatus 1300 shown in Fig. 24 is an apparatus that evaluates the effects of a drug (efficacy, side effects, and so on) based on the motion of cultured cardiomyocytes 500 to which the drug has been administered.
Fig. 26 is a block diagram showing a main configuration example of the evaluation index data generation unit 1303. As shown in Fig. 26, the evaluation index data generation unit 1303 has a motion detection unit 310, performs motion detection between the frame images of the evaluation target image data 600 (a moving image), and generates a motion vector for each block.
Fig. 27 is a block diagram showing a main configuration example of the evaluation unit 1304. As shown in Fig. 27, the evaluation unit 1304 has a feature quantity acquisition unit 1341, a feature comparison unit 1342, a display unit 1343, and an output unit 1344.
Next, an example of the flow of the evaluation processing executed by the drug evaluation apparatus 1300 is described with reference to the flowchart of Fig. 28.
Next, an example of the flow of the evaluation index data generation processing executed in step S1303 of Fig. 28 is described with reference to the flowchart of Fig. 29.
Next, an example of the flow of the influence evaluation processing executed in step S1304 of Fig. 28 is described with reference to the flowchart of Fig. 30.
Fig. 31 shows examples of beating before and after drug administration. Each of the eight graphs shown in Fig. 31 is an observation result of the beating (the temporal change of the absolute motion amount) of a predetermined portion within the observation region of the cultured cardiomyocytes 500. The horizontal axis indicates time (sec), and the vertical axis indicates the absolute motion amount between frames (pixels/frame). That is, each amplitude shown in each graph represents the beating of the cultured cardiomyocytes 500.
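A per-frame absolute-motion-amount series like the ones plotted in Fig. 31 can be reduced to beat events with a simple peak detector. The local-maximum-above-threshold criterion below is an illustrative assumption about how such beat peaks might be located; the patent does not specify a peak-detection rule.

```python
import numpy as np

def beat_peaks(motion, threshold):
    """Indices of beat peaks in a per-frame absolute motion amount
    series: samples at or above 'threshold' that are local maxima
    (strictly greater than the previous sample, at least as large as
    the next, so flat-topped peaks are counted once).
    """
    m = np.asarray(motion, dtype=float)
    return [i for i in range(1, len(m) - 1)
            if m[i] >= threshold and m[i] > m[i - 1] and m[i] >= m[i + 1]]
```

Counting the returned peaks over the observation time gives the beats-per-unit-time rate used by the evaluations above.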
[Personal computer]
The series of processing described above can be executed by hardware or by software. In that case, the apparatus may be configured, for example, as a personal computer such as that shown in Fig. 33.
(1) A data processing apparatus comprising:
a motion detection unit that divides a plurality of frame image data, which form moving image data containing an image of an object performing periodic motion, into blocks of a predetermined number of pixels and detects time-series motion data for each block;
a feature quantity calculation unit that calculates at least one type of feature quantity for each block based on the detected time-series motion data for each block; and
a classification processing unit that generates, based on the calculated feature quantities, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to one of a predetermined number of classification categories.
(2) The data processing apparatus according to (1), wherein the feature quantity calculation unit calculates a plurality of types of the feature quantities for each block, and
the classification unit generates the classification data based on the calculated plurality of types of feature quantities.
(3) The data processing apparatus according to (1) or (2), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion direction, which is the average of the motion direction per unit time over a fixed period.
(4) The data processing apparatus according to any one of (1) to (3), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion amount, which is the average of the motion amount per unit time over a fixed period.
(5) The data processing apparatus according to any one of (1) to (4), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average amplitude, which is the average of the amplitudes of motion amounts at or above a certain level obtained over a fixed period.
(6) The data processing apparatus according to any one of (1) to (5), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average acceleration, which is the average of the acceleration of the motion per unit time over a fixed period.
(7) The data processing apparatus according to any one of (1) to (6), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion interval, which is the average of the time intervals at which amplitudes of motion amounts at or above a certain level are obtained within a fixed period.
(8) The data processing apparatus according to any one of (1) to (7), wherein the feature quantity calculation unit calculates, as one type of the feature quantities, a motion start time, which is the time from a predetermined timing until the timing at which an amplitude of a motion amount at or above a certain level is obtained.
(9) The data processing apparatus according to any one of (1) to (8), wherein the classification unit calculates, for each block, the distance between the block and each of a plurality of templates having different combinations of feature quantities corresponding to the plurality of classification categories, and classifies the block as belonging to one of the plurality of classification categories based on the calculated distances.
(10) The data processing apparatus according to any one of (1) to (9), wherein the classification unit classifies each of the blocks as belonging to one of a predetermined number of classification categories by performing k-means clustering based on the feature quantities calculated for each block.
(11) A data processing method comprising:
a motion detection step of dividing a plurality of frame image data, which form moving image data containing an image of an object performing periodic motion, into blocks of a predetermined number of pixels and detecting time-series motion data for each block;
a feature quantity calculation step of calculating at least one type of feature quantity for each block based on the detected time-series motion data for each block; and
a classification processing step of generating, based on the calculated feature quantities, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to one of a predetermined number of classification categories.
(12) An image processing apparatus comprising:
a motion detection unit that detects motion of an evaluation target using an image of the evaluation target;
an index data generation unit that, using a motion vector indicating the motion of the evaluation target detected by the motion detection unit, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and
an evaluation value calculation unit that evaluates the index data generated by the index data generation unit and calculates an evaluation value.
(13) The image processing apparatus according to (12), wherein the index data generation unit generates index data relating to the magnitude of the amplitude of the motion of the evaluation target and index data relating to the frequency per unit time of peaks of the motion of the evaluation target, and
the evaluation value calculation unit calculates an evaluation value evaluating the magnitude of the amplitude of the motion of the evaluation target using the index data relating to the magnitude of the amplitude generated by the index data generation unit, and further calculates an evaluation value evaluating the frequency per unit time of the peaks of the motion of the evaluation target using the index data relating to that frequency generated by the index data generation unit.
(14) The image processing apparatus according to (13), wherein the index data relating to the magnitude of the amplitude of the motion of the evaluation target is the average, over the entire image of the evaluation target, of the product of the normalized amplitude and the variance of the normalized amplitude.
(15) The image processing apparatus according to (13) or (14), wherein the index data relating to the magnitude of the amplitude of the motion of the evaluation target is the proportion of the entire image of the evaluation target occupied by the region in which the product of the normalized amplitude and the variance of the normalized amplitude is at or above a predetermined threshold.
(16) The image processing apparatus according to any one of (13) to (15), wherein the index data relating to the frequency per unit time of the peaks of the motion of the evaluation target is the average, over the entire screen, of the product of the normalized number of peaks per unit time and the variance of the normalized number of peaks per unit time.
(17) The image processing apparatus according to any one of (13) to (16), wherein the index data relating to the frequency per unit time of the peaks of the motion of the evaluation target is the proportion of the entire image of the evaluation target occupied by the region in which the product of the normalized number of peaks per unit time and the variance of the normalized number of peaks per unit time is at or above a predetermined threshold.
(18) The image processing apparatus according to any one of (13) to (17), wherein the index data generation unit further generates index data relating to a classification result obtained by classifying each partial region of the image of the evaluation target based on feature quantities of the motion of the evaluation target, and
the evaluation value calculation unit further calculates, using the index data relating to the classification result generated by the index data generation unit, an evaluation value evaluating the classification result of the feature quantities of the motion of the evaluation target.
(19) The image processing apparatus according to any one of (12) to (18), wherein the index data generation unit calculates the motion amount of the evaluation target detected by the motion detection unit, and
the evaluation value calculation unit images and displays the temporal change of the motion amount calculated by the index data generation unit.
(20) The image processing apparatus according to (19), wherein the index data generation unit generates index data indicating the change, caused by drug administration to the cardiomyocytes, of the peak of the waveform that indicates relaxation of the cardiomyocytes serving as the evaluation target in the calculated temporal change of the motion amount, and
the evaluation value calculation unit evaluates the index data calculated by the index data generation unit and calculates an evaluation value.
(21) The image processing apparatus according to any one of (12) to (20), further comprising an imaging unit that images the evaluation target and obtains the image of the evaluation target,
wherein the motion detection unit detects the motion of the evaluation target using the image of the evaluation target obtained by the imaging unit.
(22) The image processing apparatus according to any one of (12) to (21), wherein the motion detection unit detects the motion of the evaluation target between frame images of the image of the evaluation target, which is a moving image, over an evaluation period of a predetermined length.
(23) The image processing apparatus according to (22), wherein the motion detection unit repeats the detection of the motion of the evaluation target in the evaluation period a predetermined number of times.
(24) The image processing apparatus according to any one of (12) to (23), wherein the evaluation value calculation unit evaluates each of the plurality of types of index data generated by the index data generation unit to calculate evaluation values, and calculates an evaluation value evaluating the evaluation target by integrating the calculated evaluation values.
(25) The image processing apparatus according to any one of (12) to (24), wherein the evaluation target is a cell that moves spontaneously.
(26) The image processing apparatus according to any one of (12) to (25), wherein the evaluation target is a cultured cell generated by culturing cells collected from a living body.
(27) An image processing method, wherein
a motion detection unit of an image processing apparatus detects motion of an evaluation target using an image of the evaluation target,
an index data generation unit of the image processing apparatus, using a motion vector indicating the detected motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target, and
an evaluation value calculation unit of the image processing apparatus evaluates the generated index data and calculates an evaluation value.
(28) A program for causing a computer to function as:
a motion detection unit that detects motion of an evaluation target using an image of the evaluation target,
an index data generation unit that, using a motion vector indicating the detected motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target, and
an evaluation value calculation unit that evaluates the generated index data and calculates an evaluation value.
Claims (28)
- A data processing apparatus comprising: a motion detection unit that divides a plurality of frame image data, which form moving image data containing an image of an object performing periodic motion, into blocks of a predetermined number of pixels and detects time-series motion data for each block; a feature quantity calculation unit that calculates at least one type of feature quantity for each block based on the detected time-series motion data for each block; and a classification processing unit that generates, based on the calculated feature quantities, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to one of a predetermined number of classification categories.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates a plurality of types of the feature quantities for each block, and the classification unit generates the classification data based on the calculated plurality of types of feature quantities.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion direction, which is the average of the motion direction per unit time over a fixed period.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion amount, which is the average of the motion amount per unit time over a fixed period.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average amplitude, which is the average of the amplitudes of motion amounts at or above a certain level obtained over a fixed period.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average acceleration, which is the average of the acceleration of the motion per unit time over a fixed period.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, an average motion interval, which is the average of the time intervals at which amplitudes of motion amounts at or above a certain level are obtained within a fixed period.
- The data processing apparatus according to claim 1, wherein the feature quantity calculation unit calculates, as one type of the feature quantities, a motion start time, which is the time from a predetermined timing until the timing at which an amplitude of a motion amount at or above a certain level is obtained.
- The data processing apparatus according to claim 1, wherein the classification unit calculates, for each block, the distance between the block and each of a plurality of templates having different combinations of feature quantities corresponding to the plurality of classification categories, and classifies the block as belonging to one of the plurality of classification categories based on the calculated distances.
- The data processing apparatus according to claim 1, wherein the classification unit classifies each of the blocks as belonging to one of a predetermined number of classification categories by performing k-means clustering based on the feature quantities calculated for each block.
- A data processing method comprising: a motion detection step of dividing a plurality of frame image data, which form moving image data containing an image of an object performing periodic motion, into blocks of a predetermined number of pixels and detecting time-series motion data for each block; a feature quantity calculation step of calculating at least one type of feature quantity for each block based on the detected time-series motion data for each block; and a classification processing step of generating, based on the calculated feature quantities, classification data indicating the result of classifying each of the blocks forming any one of the plurality of frame image data as belonging to one of a predetermined number of classification categories.
- An image processing apparatus comprising: a motion detection unit that detects motion of an evaluation target using an image of the evaluation target; an index data generation unit that, using a motion vector indicating the motion of the evaluation target detected by the motion detection unit, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and an evaluation value calculation unit that evaluates the index data generated by the index data generation unit and calculates an evaluation value.
- The image processing apparatus according to claim 12, wherein the index data generation unit generates index data relating to the magnitude of the amplitude of the motion of the evaluation target and index data relating to the frequency per unit time of peaks of the motion of the evaluation target, and the evaluation value calculation unit calculates an evaluation value evaluating the magnitude of the amplitude of the motion of the evaluation target using the index data relating to the magnitude of the amplitude generated by the index data generation unit, and further calculates an evaluation value evaluating the frequency per unit time of the peaks of the motion of the evaluation target using the index data relating to that frequency generated by the index data generation unit.
- The image processing apparatus according to claim 13, wherein the index data relating to the magnitude of the amplitude of the motion of the evaluation target is the average, over the entire image of the evaluation target, of the product of the normalized amplitude and the variance of the normalized amplitude.
- The image processing apparatus according to claim 13, wherein the index data relating to the magnitude of the amplitude of the motion of the evaluation target is the proportion of the entire image of the evaluation target occupied by the region in which the product of the normalized amplitude and the variance of the normalized amplitude is at or above a predetermined threshold.
- The image processing apparatus according to claim 13, wherein the index data relating to the frequency per unit time of the peaks of the motion of the evaluation target is the average, over the entire screen, of the product of the normalized number of peaks per unit time and the variance of the normalized number of peaks per unit time.
- The image processing apparatus according to claim 13, wherein the index data relating to the frequency per unit time of the peaks of the motion of the evaluation target is the proportion of the entire image of the evaluation target occupied by the region in which the product of the normalized number of peaks per unit time and the variance of the normalized number of peaks per unit time is at or above a predetermined threshold.
- The image processing apparatus according to claim 13, wherein the index data generation unit further generates index data relating to a classification result obtained by classifying each partial region of the image of the evaluation target based on feature quantities of the motion of the evaluation target, and the evaluation value calculation unit further calculates, using the index data relating to the classification result generated by the index data generation unit, an evaluation value evaluating the classification result of the feature quantities of the motion of the evaluation target.
- The image processing apparatus according to claim 12, wherein the index data generation unit calculates the motion amount of the evaluation target detected by the motion detection unit, and the evaluation value calculation unit images and displays the temporal change of the motion amount calculated by the index data generation unit.
- The image processing apparatus according to claim 19, wherein the index data generation unit generates index data indicating the change, caused by drug administration to the cardiomyocytes, of the peak of the waveform that indicates relaxation of the cardiomyocytes serving as the evaluation target in the calculated temporal change of the motion amount, and the evaluation value calculation unit evaluates the index data calculated by the index data generation unit and calculates an evaluation value.
- The image processing apparatus according to claim 12, further comprising an imaging unit that images the evaluation target and obtains the image of the evaluation target, wherein the motion detection unit detects the motion of the evaluation target using the image of the evaluation target obtained by the imaging unit.
- The image processing apparatus according to claim 12, wherein the motion detection unit detects the motion of the evaluation target between frame images of the image of the evaluation target, which is a moving image, over an evaluation period of a predetermined length.
- The image processing apparatus according to claim 22, wherein the motion detection unit repeats the detection of the motion of the evaluation target in the evaluation period a predetermined number of times.
- The image processing apparatus according to claim 12, wherein the evaluation value calculation unit evaluates each of the plurality of types of index data generated by the index data generation unit to calculate evaluation values, and calculates an evaluation value evaluating the evaluation target by integrating the calculated evaluation values.
- The image processing apparatus according to claim 12, wherein the evaluation target is a cell that moves spontaneously.
- The image processing apparatus according to claim 12, wherein the evaluation target is a cultured cell generated by culturing cells collected from a living body.
- An image processing method, wherein a motion detection unit of an image processing apparatus detects motion of an evaluation target using an image of the evaluation target; an index data generation unit of the image processing apparatus, using a motion vector indicating the detected motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and an evaluation value calculation unit of the image processing apparatus evaluates the generated index data and calculates an evaluation value.
- A program for causing a computer to function as: a motion detection unit that detects motion of an evaluation target using an image of the evaluation target; an index data generation unit that, using a motion vector indicating the detected motion of the evaluation target, generates index data that indicates characteristics of the motion of the evaluation target and is used as an index for evaluating the evaluation target; and an evaluation value calculation unit that evaluates the generated index data and calculates an evaluation value.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180025497.7A CN102906789B (zh) | 2010-03-29 | 2011-02-28 | 数据处理装置、数据处理方法、图像处理装置和方法以及程序 |
US13/637,233 US9786052B2 (en) | 2010-03-29 | 2011-02-28 | Image processing apparatus and method for evaluating objects in an image |
JP2012508154A JP5772817B2 (ja) | 2010-03-29 | 2011-02-28 | 画像処理装置および方法、並びに、プログラム |
US15/669,169 US10311582B2 (en) | 2010-03-29 | 2017-08-04 | Image processing apparatus and method for evaluating objects in an image |
US16/401,494 US10692223B2 (en) | 2010-03-29 | 2019-05-02 | Image processing apparatus and method for evaluating objects in an image |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010074306 | 2010-03-29 | ||
JP2010-074306 | 2010-03-29 | ||
JP2010234504 | 2010-10-19 | ||
JP2010-234504 | 2010-10-19 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/637,233 A-371-Of-International US9786052B2 (en) | 2010-03-29 | 2011-02-28 | Image processing apparatus and method for evaluating objects in an image |
US15/669,169 Division US10311582B2 (en) | 2010-03-29 | 2017-08-04 | Image processing apparatus and method for evaluating objects in an image |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011122200A1 true WO2011122200A1 (ja) | 2011-10-06 |
Family
ID=44711929
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/054557 WO2011122200A1 (ja) | 2010-03-29 | 2011-02-28 | データ処理装置およびデータ処理方法、画像処理装置および方法、並びに、プログラム |
Country Status (4)
Country | Link |
---|---|
US (3) | US9786052B2 (ja) |
JP (3) | JP5772817B2 (ja) |
CN (1) | CN102906789B (ja) |
WO (1) | WO2011122200A1 (ja) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012118049A1 (ja) * | 2011-02-28 | 2012-09-07 | ソニー株式会社 | 表示制御装置および方法、画像処理装置および方法、並びに、プログラム |
WO2013165301A1 (en) * | 2012-05-03 | 2013-11-07 | General Electric Company | Automatic segmentation and characterization of cellular motion |
JP2014075999A (ja) * | 2012-10-10 | 2014-05-01 | Nikon Corp | 心筋細胞の運動検出方法、画像処理プログラム及び画像処理装置、心筋細胞の培養方法、心筋細胞の薬剤評価方法及び薬剤製造方法 |
JP2014179061A (ja) * | 2013-02-14 | 2014-09-25 | Sony Corp | 分析システム、分析プログラム及び分析方法 |
WO2015008682A1 (ja) * | 2013-07-19 | 2015-01-22 | ソニー株式会社 | 細胞評価装置および方法、並びにプログラム |
JP2015019620A (ja) * | 2013-07-19 | 2015-02-02 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
WO2015041177A1 (ja) * | 2013-09-18 | 2015-03-26 | 株式会社ニコン | 画像解析装置、画像解析方法、画像解析プログラム、細胞の製造方法、細胞の培養方法、および細胞の製造装置 |
US9582894B2 (en) | 2012-12-27 | 2017-02-28 | Tampereen Ylipoisto | Visual cardiomyocyte analysis |
JPWO2015068329A1 (ja) * | 2013-11-08 | 2017-03-09 | ソニー株式会社 | 細胞分析システム、細胞分析プログラム及び細胞分析方法 |
JP2017099405A (ja) * | 2017-02-28 | 2017-06-08 | 株式会社ニコン | 心筋細胞の運動検出方法、心筋細胞の培養方法、薬剤評価方法、画像処理プログラム及び画像処理装置 |
WO2017154318A1 (ja) * | 2016-03-09 | 2017-09-14 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
JP2018061511A (ja) * | 2013-02-14 | 2018-04-19 | ソニー株式会社 | 神経細胞評価方法、神経細胞評価プログラム及び神経細胞評価方法 |
US10209236B2 (en) | 2012-12-27 | 2019-02-19 | Sony Corporation | Cell analyzer system, cell analyzer program, and cell analyzing method |
JP2019053791A (ja) * | 2019-01-15 | 2019-04-04 | ソニー株式会社 | 画像処理装置、画像処理プログラム及び画像処理方法 |
US10330583B2 (en) | 2014-09-30 | 2019-06-25 | Fujifilm Corporation | Cell imaging apparatus and method for generating a composite image |
WO2019131806A1 (ja) * | 2017-12-26 | 2019-07-04 | 株式会社マイオリッジ | 心筋細胞の薬剤応答性試験方法 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011122200A1 (ja) | 2010-03-29 | 2011-10-06 | ソニー株式会社 | データ処理装置およびデータ処理方法、画像処理装置および方法、並びに、プログラム |
JP6147172B2 (ja) * | 2013-11-20 | 2017-06-14 | キヤノン株式会社 | 撮像装置、画像処理装置、画像処理方法、及びプログラム |
US10535137B2 (en) * | 2014-01-07 | 2020-01-14 | Sony Corporation | Analysis system and analysis method |
JP2017126254A (ja) * | 2016-01-15 | 2017-07-20 | キヤノン株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2017146696A (ja) | 2016-02-16 | 2017-08-24 | ソニー株式会社 | 画像処理装置、画像処理方法及び画像処理システム |
KR102035860B1 (ko) * | 2018-02-06 | 2019-10-23 | 한림대학교 산학협력단 | 연속 제스처 인식 시스템 |
JP7086630B2 (ja) * | 2018-02-09 | 2022-06-20 | キヤノン株式会社 | 情報処理装置、情報処理方法、及びプログラム |
FR3086435B1 (fr) * | 2018-09-26 | 2021-06-11 | Univ Claude Bernard Lyon | Procede d’analyse automatisee des contractions cellulaires d’un ensemble de cellules biologiques. |
US20220198615A1 (en) * | 2019-05-31 | 2022-06-23 | Nippon Telegraph And Telephone Corporation | Image processing apparatus, image processing method, and program |
CN110780780B (zh) * | 2019-09-04 | 2022-03-22 | 西安万像电子科技有限公司 | 图像处理方法及装置 |
CN110866906B (zh) * | 2019-11-12 | 2022-07-08 | 安徽师范大学 | 基于图像边缘提取的三维培养人体心肌细胞搏动检测方法 |
CN113076894B (zh) * | 2021-04-09 | 2022-05-17 | 中山大学 | 一种连续帧目标检测去重方法及装置 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003169319A (ja) * | 2001-11-30 | 2003-06-13 | Mitsubishi Electric Corp | 映像監視装置 |
JP2006079594A (ja) * | 2004-08-13 | 2006-03-23 | Sony Corp | 移動物体検出装置及び方法 |
WO2008149055A1 (en) * | 2007-06-05 | 2008-12-11 | General Electric Company | Automatic characterization of cellular motion |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5081531A (en) * | 1989-01-11 | 1992-01-14 | U.S. Philips Corporation | Method and apparatus for processing a high definition television signal using motion vectors representing more than one motion velocity range |
US6095976A (en) * | 1997-06-19 | 2000-08-01 | Medinol Ltd. | Method for enhancing an image derived from reflected ultrasound signals produced by an ultrasound transmitter and detector inserted in a bodily lumen |
JP2000092454A (ja) * | 1998-09-14 | 2000-03-31 | Sony Corp | 画像情報変換装置および画像情報変換方法 |
GB2366112B (en) * | 1998-12-29 | 2003-05-28 | Kent Ridge Digital Labs | Method and apparatus for embedding digital information in digital multimedia data |
US6597738B1 (en) * | 1999-02-01 | 2003-07-22 | Hyundai Curitel, Inc. | Motion descriptor generating apparatus by using accumulated motion histogram and a method therefor |
JP4564634B2 (ja) * | 2000-08-14 | 2010-10-20 | キヤノン株式会社 | 画像処理方法及び装置並びに記憶媒体 |
JP3539632B2 (ja) | 2001-02-14 | 2004-07-07 | 東京工業大学長 | 画像変化抽出方法、およびその画像処理プログラム |
CN100349549C (zh) * | 2003-10-29 | 2007-11-21 | 福州大学 | 全方向m型心动图的速度场和加速度场的检测方法及其装置 |
DE10353785B4 (de) * | 2003-11-18 | 2006-05-18 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Verfahren und Vorrichtung zur Erfassung von verschiedenen Zelltypen von Zellen in einer biologischen Probe |
US20070014452A1 (en) * | 2003-12-01 | 2007-01-18 | Mitta Suresh | Method and system for image processing and assessment of a state of a heart |
US8900149B2 (en) | 2004-04-02 | 2014-12-02 | Teratech Corporation | Wall motion analyzer |
US7440008B2 (en) * | 2004-06-15 | 2008-10-21 | Corel Tw Corp. | Video stabilization method |
US7916173B2 (en) * | 2004-06-22 | 2011-03-29 | Canon Kabushiki Kaisha | Method for detecting and selecting good quality image frames from video |
ATE431397T1 (de) * | 2005-06-01 | 2009-05-15 | Fraunhofer Ges Forschung | Verfahren zur optischen bestimmung des dynamischen verhaltens von kontrahierenden zellen |
EP1949297B1 (en) * | 2005-10-14 | 2015-07-29 | Unisense Fertilitech A/S | Determination of a change in a cell population |
JP4914659B2 (ja) | 2006-07-04 | 2012-04-11 | パイオニア株式会社 | 映像処理装置、その方法、そのプログラム、および、そのプログラムを記録した記録媒体 |
JP2008076088A (ja) * | 2006-09-19 | 2008-04-03 | Foundation For Biomedical Research & Innovation | 細胞のモニター方法およびモニター装置 |
JP4823179B2 (ja) * | 2006-10-24 | 2011-11-24 | 三洋電機株式会社 | 撮像装置及び撮影制御方法 |
US8202720B2 (en) * | 2007-06-08 | 2012-06-19 | Mitsubishi Chemical Medience Corporation | Model cell chip, apparatus for evaluating drug effect using the model cell chip and method of evaluating drug effect |
CN101224110A (zh) * | 2007-12-24 | 2008-07-23 | 南京理工大学 | 三维心肌形变应变计算方法 |
JP4962304B2 (ja) | 2007-12-26 | 2012-06-27 | 株式会社豊田中央研究所 | 歩行者検出装置 |
JP4958806B2 (ja) | 2008-01-22 | 2012-06-20 | 三洋電機株式会社 | ぶれ検出装置、ぶれ補正装置及び撮像装置 |
CN101297763A (zh) * | 2008-04-18 | 2008-11-05 | 福州大学 | 解剖式m型心动图的瞬时速度与加速度的检测方法 |
CN102056547B (zh) * | 2008-06-03 | 2014-05-14 | 株式会社日立医疗器械 | 医用图像处理装置及医用图像处理方法 |
JP2010004261A (ja) | 2008-06-19 | 2010-01-07 | Sony Corp | 画像処理装置、及び画像処理方法 |
US8831101B2 (en) * | 2008-08-02 | 2014-09-09 | Ecole De Technologie Superieure | Method and system for determining a metric for comparing image blocks in motion compensated video coding |
US8290255B2 (en) * | 2009-02-06 | 2012-10-16 | Canon Kabushiki Kaisha | Image processing method, image processing apparatus, and program |
JP5374220B2 (ja) * | 2009-04-23 | 2013-12-25 | キヤノン株式会社 | 動きベクトル検出装置およびその制御方法、ならびに撮像装置 |
WO2011122200A1 (ja) | 2010-03-29 | 2011-10-06 | ソニー株式会社 | データ処理装置およびデータ処理方法、画像処理装置および方法、並びに、プログラム |
JP6078943B2 (ja) | 2011-02-28 | 2017-02-15 | ソニー株式会社 | 画像処理装置および方法、並びに、プログラム |
US9070004B2 (en) * | 2012-05-03 | 2015-06-30 | General Electric Company | Automatic segmentation and characterization of cellular motion |
-
2011
- 2011-02-28 WO PCT/JP2011/054557 patent/WO2011122200A1/ja active Application Filing
- 2011-02-28 CN CN201180025497.7A patent/CN102906789B/zh active Active
- 2011-02-28 JP JP2012508154A patent/JP5772817B2/ja not_active Expired - Fee Related
- 2011-02-28 US US13/637,233 patent/US9786052B2/en active Active
-
2015
- 2015-07-01 JP JP2015132417A patent/JP6222529B2/ja active Active
-
2017
- 2017-08-04 US US15/669,169 patent/US10311582B2/en active Active
- 2017-10-10 JP JP2017196539A patent/JP6504417B2/ja active Active
-
2019
- 2019-05-02 US US16/401,494 patent/US10692223B2/en active Active
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103380209A (zh) * | 2011-02-28 | 2013-10-30 | 索尼公司 | 显示控制设备和方法、图像处理设备和方法以及程序 |
US10650534B2 (en) | 2011-02-28 | 2020-05-12 | Sony Corporation | Display control and image processing of a cell image |
US10311581B2 (en) | 2011-02-28 | 2019-06-04 | Sony Corporation | Display control and image processing of a cell image |
WO2012118049A1 (ja) * | 2011-02-28 | 2012-09-07 | ソニー株式会社 | 表示制御装置および方法、画像処理装置および方法、並びに、プログラム |
WO2013165301A1 (en) * | 2012-05-03 | 2013-11-07 | General Electric Company | Automatic segmentation and characterization of cellular motion |
US9070004B2 (en) | 2012-05-03 | 2015-06-30 | General Electric Company | Automatic segmentation and characterization of cellular motion |
JP2015518378A (ja) * | 2012-05-03 | 2015-07-02 | ゼネラル・エレクトリック・カンパニイ | 細胞運動の自動セグメンテーション及び特徴付け |
EP2845146A4 (en) * | 2012-05-03 | 2016-09-14 | Gen Electric | AUTOMATIC SEGMENTATION AND CHARACTERIZATION OF CELL MOVEMENT |
JP2014075999A (ja) * | 2012-10-10 | 2014-05-01 | Nikon Corp | 心筋細胞の運動検出方法、画像処理プログラム及び画像処理装置、心筋細胞の培養方法、心筋細胞の薬剤評価方法及び薬剤製造方法 |
US9582894B2 (en) | 2012-12-27 | 2017-02-28 | Tampereen Ylipoisto | Visual cardiomyocyte analysis |
US10209236B2 (en) | 2012-12-27 | 2019-02-19 | Sony Corporation | Cell analyzer system, cell analyzer program, and cell analyzing method |
US10049452B2 (en) | 2013-02-14 | 2018-08-14 | Sony Corporation | Information processing apparatus, information processing method, and cell analysis system |
US10891734B2 (en) | 2013-02-14 | 2021-01-12 | Sony Corporation | Information processing apparatus, information processing method, and cell analysis system |
JP2014179061A (ja) * | 2013-02-14 | 2014-09-25 | Sony Corp | 分析システム、分析プログラム及び分析方法 |
JP2018061511A (ja) * | 2013-02-14 | 2018-04-19 | ソニー株式会社 | 神経細胞評価方法、神経細胞評価プログラム及び神経細胞評価方法 |
JP2019195337A (ja) * | 2013-02-14 | 2019-11-14 | ソニー株式会社 | 神経細胞評価方法、神経細胞評価プログラム及び神経細胞評価方法 |
JP2015019620A (ja) * | 2013-07-19 | 2015-02-02 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
US11087472B2 (en) | 2013-07-19 | 2021-08-10 | Sony Corporation | Image processing device, method, and medium for calculating propagation speed and direction of object pulsations |
US9972097B2 (en) | 2013-07-19 | 2018-05-15 | Sony Corporation | Image processing device, method, and medium for calculating propagation speed and direction of object pulsations |
US10157459B2 (en) | 2013-07-19 | 2018-12-18 | Sony Corporation | Cell evaluation apparatus and method to search for an area to be observed in cardiomyocytes |
WO2015008682A1 (ja) * | 2013-07-19 | 2015-01-22 | ソニー株式会社 | 細胞評価装置および方法、並びにプログラム |
WO2015041177A1 (ja) * | 2013-09-18 | 2015-03-26 | 株式会社ニコン | 画像解析装置、画像解析方法、画像解析プログラム、細胞の製造方法、細胞の培養方法、および細胞の製造装置 |
US10163218B2 (en) | 2013-09-18 | 2018-12-25 | Nikon Corporation | Image analysis device, image analysis method, image analysis program, cell manufacturing method, cell culturing method, and cell manufacturing device |
JPWO2015041177A1 (ja) * | 2013-09-18 | 2017-03-02 | 株式会社ニコン | 画像解析装置、画像解析方法、画像解析プログラム、細胞の製造方法、細胞の培養方法、および細胞の製造装置 |
US10482598B2 (en) | 2013-11-08 | 2019-11-19 | Sony Corporation | Cell analysis system, cell analysis program and cell analysis method |
US10163203B2 (en) | 2013-11-08 | 2018-12-25 | Sony Corporation | Cell analysis system, cell analysis program and cell analysis method |
JPWO2015068329A1 (ja) * | 2013-11-08 | 2017-03-09 | ソニー株式会社 | 細胞分析システム、細胞分析プログラム及び細胞分析方法 |
US10861154B2 (en) | 2013-11-08 | 2020-12-08 | Sony Corporation | Cell analysis system, cell analysis program and cell analysis method |
US10330583B2 (en) | 2014-09-30 | 2019-06-25 | Fujifilm Corporation | Cell imaging apparatus and method for generating a composite image |
WO2017154318A1 (ja) * | 2016-03-09 | 2017-09-14 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
JPWO2017154318A1 (ja) * | 2016-03-09 | 2019-01-10 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
US10845186B2 (en) | 2016-03-09 | 2020-11-24 | Sony Corporation | Information processing device, information processing method, and information processing system |
JP2017099405A (ja) * | 2017-02-28 | 2017-06-08 | 株式会社ニコン | 心筋細胞の運動検出方法、心筋細胞の培養方法、薬剤評価方法、画像処理プログラム及び画像処理装置 |
WO2019131806A1 (ja) * | 2017-12-26 | 2019-07-04 | 株式会社マイオリッジ | 心筋細胞の薬剤応答性試験方法 |
JPWO2019131806A1 (ja) * | 2017-12-26 | 2020-12-17 | 株式会社マイオリッジ | 心筋細胞の薬剤応答性試験方法 |
US11726083B2 (en) | 2017-12-26 | 2023-08-15 | Myoridge Co. Ltd. | Method for testing drug response of cardiomyocytes |
JP7341478B2 (ja) | 2017-12-26 | 2023-09-11 | 株式会社マイオリッジ | 心筋細胞の薬剤応答性試験方法 |
JP2019053791A (ja) * | 2019-01-15 | 2019-04-04 | ソニー株式会社 | 画像処理装置、画像処理プログラム及び画像処理方法 |
Also Published As
Publication number | Publication date |
---|---|
CN102906789B (zh) | 2017-05-17 |
JPWO2011122200A1 (ja) | 2013-07-08 |
US10311582B2 (en) | 2019-06-04 |
JP2015207308A (ja) | 2015-11-19 |
JP6504417B2 (ja) | 2019-04-24 |
JP6222529B2 (ja) | 2017-11-01 |
US10692223B2 (en) | 2020-06-23 |
CN102906789A (zh) | 2013-01-30 |
US20190259166A1 (en) | 2019-08-22 |
JP5772817B2 (ja) | 2015-09-02 |
US20170337697A1 (en) | 2017-11-23 |
US20130070971A1 (en) | 2013-03-21 |
US9786052B2 (en) | 2017-10-10 |
JP2018038411A (ja) | 2018-03-15 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180025497.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11762440 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012508154 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13637233 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11762440 Country of ref document: EP Kind code of ref document: A1 |