WO2017159011A1 - Information processing apparatus, information processing method, program, and information processing system - Google Patents
Information processing apparatus, information processing method, program, and information processing system
- Publication number
- WO2017159011A1 (PCT/JP2017/000690; JP2017000690W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature amount
- information processing
- region
- evaluation value
- calculation unit
- Prior art date
Classifications
-
- C—CHEMISTRY; METALLURGY
- C12—BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
- C12M—APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
- C12M1/00—Apparatus for enzymology or microbiology
- C12M1/34—Measuring or testing with condition measuring or sensing means, e.g. colony counters
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/48—Biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
- G01N33/5005—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells
- G01N33/5008—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics
- G01N33/502—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing non-proliferative effects
- G01N33/5026—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing non-proliferative effects on cell morphology
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
- G01N33/48—Biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
- G01N33/5005—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells
- G01N33/5008—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics
- G01N33/502—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing non-proliferative effects
- G01N33/5029—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving human or animal cells for testing or evaluating the effect of chemical or biological compounds, e.g. drugs, cosmetics for testing non-proliferative effects on cell motility
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, a program, and an information processing system.
- Non-Patent Document 1 discloses a technique for evaluating the phagocytic function of phagocytic cells using flow cytometry and fluorescence imaging.
- the present disclosure proposes a new and improved information processing apparatus, information processing method, program, and information processing system capable of evaluating in more detail the functions related to absorption or release of biological samples.
- According to the present disclosure, there is provided an information processing apparatus including: a detection unit that detects at least one attention area from at least one captured image of a plurality of captured images of a biological sample having different imaging times; a feature amount calculation unit that calculates a feature amount related to changes of the at least one attention area across the plurality of captured images; and an evaluation value calculation unit that calculates an evaluation value for a function related to absorption or release of the biological sample based on the feature amount.
- According to the present disclosure, there is also provided an information processing method including: detecting, by a processor, at least one region of interest from at least one captured image of a plurality of captured images of a biological sample having different imaging times; calculating a feature amount related to changes of the at least one region of interest across the plurality of captured images; and calculating an evaluation value for a function related to absorption or release of the biological sample based on the feature amount.
- According to the present disclosure, there is also provided a program for causing a computer to function as: a detection unit that detects at least one region of interest from at least one captured image of a plurality of captured images of a biological sample having different imaging times; a feature amount calculation unit that calculates a feature amount related to changes of the at least one region of interest across the plurality of captured images; and an evaluation value calculation unit that calculates an evaluation value for a function related to absorption or release of the biological sample based on the feature amount.
- According to the present disclosure, there is also provided an information processing system including: an imaging device including an imaging unit that generates a plurality of captured images of a biological sample having different imaging times; and an information processing apparatus including a detection unit that detects at least one region of interest from at least one captured image of the plurality of captured images, a feature amount calculation unit that calculates a feature amount related to changes of the at least one region of interest across the plurality of captured images, and an evaluation value calculation unit that calculates an evaluation value for a function related to absorption or release of the biological sample based on the feature amount.
- FIG. 2 is a functional block diagram illustrating a functional configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- Diagrams for explaining: the attention area detection process by the detection unit according to the embodiment; the attention area identification process by the detection unit according to the same embodiment; a change in the position of the contour of an attention area; a change in the shape of the contour of an attention area; motion inside an attention area; and pixel information inside an attention area.
- FIG. 3 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating an outline of a configuration of an information processing system 1 according to an embodiment of the present disclosure.
- the information processing system 1 includes an imaging device 10 and an information processing device 20.
- the imaging device 10 and the information processing device 20 are connected by various wired or wireless networks.
- the imaging device 10 is a device that generates a captured image (moving image).
- the imaging device 10 according to the present embodiment is realized by a digital camera, for example.
- the imaging device 10 may be realized by any device having an imaging function, such as a smartphone, a tablet, a game machine, or a wearable device.
- The imaging device 10 includes various members such as an image sensor, for example a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and a lens for controlling the formation of a subject image on the image sensor, and images real space using them.
- The function of the imaging unit of the imaging device 10 is realized by the image sensor and these various members.
- the imaging device 10 includes a communication device for transmitting and receiving moving images and the like with the information processing device 20.
- The imaging device 10 is provided above the imaging stage S for imaging the culture medium M in which the biological sample to be observed is cultured, and generates moving image data by imaging the culture medium M.
- the imaging device 10 may image the culture medium M directly (without passing through other members), or may image the culture medium M through other members such as a microscope.
- the frame rate is not particularly limited, but is preferably set according to the degree of change of the observation target. Note that the imaging device 10 images a certain imaging region including the culture medium M in order to correctly track changes in the observation target.
- the moving image generated by the imaging device 10 is transmitted to the information processing device 20.
- the captured image generated by the imaging device 10 is not limited to a moving image.
- the captured images generated by the imaging device 10 may be a plurality of captured images generated at different imaging times.
- the plurality of captured images may be still images captured continuously.
- the imaging device 10 is a camera installed in an optical microscope or the like, but the present technology is not limited to such an example.
- The imaging device 10 may be an imaging device included in an electron microscope using an electron beam, such as an SEM (Scanning Electron Microscope) or a TEM (Transmission Electron Microscope), or an imaging device included in an SPM (Scanning Probe Microscope) using a probe, such as an AFM (Atomic Force Microscope) or an STM (Scanning Tunneling Microscope).
- the moving image generated by the imaging device 10 is a moving image obtained by irradiating an observation target with an electron beam in the case of an electron microscope, for example.
- When the imaging device 10 is an SPM, the moving image generated by the imaging device 10 is, for example, a moving image obtained by tracing an observation target using a probe.
- the information processing apparatus 20 is an apparatus having an image analysis function.
- the information processing apparatus 20 is realized by any apparatus having an image analysis function, such as a PC (Personal Computer), a tablet, and a smartphone.
- the information processing device 20 includes a processing circuit and a communication device.
- The communication device acquires a plurality of captured images (for example, a moving image or continuously captured still images) from the imaging device 10, and the processing circuit detects at least one region of interest from at least one of the acquired captured images. The processing circuit then calculates a feature amount based on changes of the detected region of interest across the plurality of captured images.
- Further, the processing circuit calculates an evaluation value for a function related to absorption or release of the biological sample (for example, a phagocytic function) based on the calculated feature amount.
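The processing flow of this circuit (detect a region of interest, compute a feature amount from its changes across frames, then derive an evaluation value) can be sketched as follows. The function names, the toy thresholding detector, and the pixel-change feature are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np

def detect_region_of_interest(frame, threshold=0.5):
    """Detect a region of interest as a boolean mask (a toy stand-in:
    simple intensity thresholding replaces real detection)."""
    return frame > threshold

def feature_amount(mask_t0, mask_t1):
    """Feature amount related to change between two frames: here, the
    fraction of pixels whose membership in the region changed."""
    return np.mean(mask_t0 != mask_t1)

def evaluation_value(features):
    """Evaluation value for the absorption/release function: here,
    simply the mean feature amount over all frame pairs."""
    return float(np.mean(features))

# Two synthetic frames in which the bright region (the "sample") grows
frame_a = np.zeros((8, 8)); frame_a[2:4, 2:4] = 1.0
frame_b = np.zeros((8, 8)); frame_b[2:5, 2:5] = 1.0

masks = [detect_region_of_interest(f) for f in (frame_a, frame_b)]
features = [feature_amount(masks[0], masks[1])]
score = evaluation_value(features)
```

Growth of the mask between the two frames yields a nonzero score; in the patent's setting, the detector, feature amount, and evaluation value would each be considerably richer.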
- The result of each process performed by the processing circuit of the information processing device 20 is output to a storage device or a display device provided inside or outside the information processing device 20.
- the information processing apparatus 20 may be realized by one or a plurality of information processing apparatuses on a network. A functional configuration for realizing each function of the information processing apparatus 20 will be described later.
- Although the information processing system 1 is composed of the imaging device 10 and the information processing device 20, the present technology is not limited to this example.
- the imaging device 10 may perform processing related to the information processing device 20 (for example, detection processing and analysis processing).
- the information processing system 1 is realized by an imaging device having a detection function, an analysis function, and the like.
- the observation object according to the present embodiment is mainly a biological sample.
- A biological sample means an object having a biological function, for example, various cells, organelles, or biological tissues observable using an optical microscope, living organisms such as microorganisms or plankton, or viruses.
- the biological sample according to the present embodiment means a living body that can move in the culture medium M on the imaging stage S of the imaging device 10.
- Hereinafter, such a biological sample to be observed is referred to as an observation target.
- the observation target according to the present embodiment is an observation target that develops a function related to absorption or release.
- the function related to absorption may be a function of absorbing a substance from the outside of the biological sample such as a phagocytic function by phagocytic cells.
- the phagocytic cells may be migratory immune cells such as macrophages, dendritic cells or neutrophils. These immune cells have a function of phagocytosing pathogens such as cancer cells, particles such as microbeads, bacteria or viruses. That is, these pathogens, particles, bacteria or viruses correspond to substances on which the function of the biological sample acts.
- the function related to release may be a function of releasing a substance from the inside of a biological sample, for example, cell secretion. More specifically, the function related to release is a function related to the release of ATP (Adenosine Triphosphate), compounds such as histamine, proteins such as enzymes, fine particles, or calcium ions by cells. Also good. That is, these compounds, proteins, microparticles, calcium ions, and the like correspond to substances on which the function of the biological sample acts.
- observation object according to the present embodiment is not limited to a biological sample.
- a non-biological sample such as a device such as a MEMS (Micro Electro Mechanical Systems) having a function of absorbing or releasing a substance can also be an object of observation according to the present embodiment.
- Non-Patent Document 1: D. H. Munn et al., "Phagocytosis of Tumor Cells by Human Monocytes Cultured in Recombinant Macrophage Colony-Stimulating Factor", J. Exp. Med., Vol. 172 (1990), pp. 231-237.
- As a technique for evaluating the phagocytic function, there is a technique using flow cytometry or fluorescence imaging. According to this technique, the presence or absence of expression of the phagocytic function can be evaluated by performing flow cytometry on the phagocytic cells to be observed one by one. Moreover, the state of the phagocytic function of the phagocytes can be qualitatively observed by fluorescence imaging.
- the technique disclosed in the above non-patent literature does not sufficiently evaluate the function related to the absorption or release of the observation target.
- In the evaluation of the phagocytic function by flow cytometry, only the phagocytic function of phagocytic cells at a specific timing (for example, the timing at which phagocytic cells cultured in a medium for a predetermined period are taken out of the medium) is evaluated. Therefore, the evaluation using flow cytometry shows only whether or not the phagocytic cells cultured for the predetermined period have expressed the phagocytic function.
- It is also difficult to evaluate the timing at which phagocytic cells cultured in a medium express the phagocytic function, or the number or type of substances phagocytosed by the phagocytic cells (engulfed substances: an example of substances on which the function of the biological sample acts).
- Furthermore, since phagocytic cells obtained from the medium are introduced into the flow cytometer, information on the culture environment in the phagocyte culture medium is lost. It is therefore difficult to evaluate the culture environment related to the expression of the phagocytic function of the phagocytic cells.
- Accordingly, the information processing system 1 detects an area (attention area) corresponding to an observation target or the like from at least one captured image among a plurality of captured images, calculates a feature amount related to changes of the attention area across the plurality of captured images, and calculates, based on the calculated feature amount, an evaluation value for a function related to absorption or release of the observation target, such as a phagocytic function.
- the overview of the information processing system 1 according to an embodiment of the present disclosure has been described above.
- the information processing apparatus 20 included in the information processing system 1 according to an embodiment of the present disclosure is realized in the following embodiment. Hereinafter, a specific configuration example and processing example of the information processing apparatus 20 will be described.
- the information processing apparatus 20 according to an embodiment of the present disclosure will be described with reference to FIGS.
- the function to be evaluated is not limited to the phagocytic function
- the evaluation target is not particularly limited as long as it is a function related to absorption or release of the observation target.
- FIG. 2 is a functional block diagram illustrating a functional configuration example of the information processing apparatus 20 according to an embodiment of the present disclosure.
- the information processing apparatus 20 includes a control unit 200, a communication unit 210, and a storage unit 220.
- the function of the control unit 200 is realized by a processing circuit such as a CPU (Central Processing Unit) included in the information processing apparatus 20.
- the function of the communication unit 210 is realized by a communication device provided in the information processing apparatus 20.
- the function of the storage unit 220 is realized by a storage device such as a storage provided in the information processing device 20.
- each functional unit will be described.
- The control unit 200 controls the overall operation of the information processing apparatus 20. Further, as shown in FIG. 2, the control unit 200 includes the functions of a detection unit 201, a feature amount calculation unit 202, an evaluation value calculation unit 203, and a display control unit 204, and controls the operations of the information processing apparatus 20 according to the present embodiment. The functions of each functional unit included in the control unit 200 will be described later.
- the communication unit 210 is a communication unit included in the information processing apparatus 20 and performs various types of communication with an external apparatus wirelessly or by wire via a network (or directly).
- the communication unit 210 communicates with the imaging device 10. More specifically, the communication unit 210 acquires a plurality of captured images generated by the imaging device 10. In the present embodiment, the communication unit 210 will be described as acquiring a moving image generated by the imaging device 10.
- the communication unit 210 may communicate with other devices other than the imaging device 10.
- For example, the communication unit 210 may transmit information related to the evaluation value obtained from the evaluation value calculation unit 203 (described later), or information related to the display of the evaluation value obtained from the display control unit 204, to an external information processing device or display device.
- the storage unit 220 is a storage unit included in the information processing apparatus 20 and stores information acquired by the communication unit 210, information obtained by each function unit of the control unit 200, and the like. In addition, the storage unit 220 appropriately outputs stored information in response to a request from each functional unit of the control unit 200 or the communication unit 210.
- the detection unit 201 detects at least one region of interest from one captured image constituting the moving image acquired by the communication unit 210 from the imaging device 10.
- the attention area means an area for estimating the motion of the observation target.
- the detection unit 201 may detect a region of interest for an object image included in one captured image.
- This attention area may be an area corresponding to an observation target (for example, a biological sample such as a phagocytic cell) included in the moving image (hereinafter referred to as an observation target area), or may be an area corresponding to another object.
- the detection unit 201 may detect not only the observation target region but also the attention region for an object other than the observation target.
- the object other than the observation target may be, for example, a substance on which the phagocytosis function by the observation target is acted (consumed substance).
- The detection unit 201 may detect, for example, a region surrounded by a closed curve forming the contour of the observation target or of the engulfed substance as a region of interest.
- FIG. 3 is a diagram for explaining attention area detection processing by the detection unit 201 according to the present embodiment.
- the captured image F1 includes a plurality of object regions 1000.
- the detection unit 201 may detect the object region 1000 by, for example, image recognition and set the object region 1000 as the attention region 1100 (see the schematic diagram F32 in FIG. 3).
- the outline of the attention area 1100 may be the outline of the object area 1000 (that is, the boundary line between the object area 1000 and the non-object area).
- The detection unit 201 may detect a region formed by a closed curve corresponding to the contour of the object, or a region corresponding to a tissue existing inside the object. More specifically, the attention area detected by the detection unit 201 may be an area corresponding to a part of a tissue or the like included in the observation target. For example, when the phagocytic function is considered to be expressed in a part of the tissue included in the observation target, the detection unit 201 may detect an area corresponding to that part of the tissue as the attention area. Thereby, the phagocytic function expressed by a structure inside the observation target can be evaluated.
- the detection unit 201 may detect a plurality of regions for one captured image. For example, when a plurality of objects are included in one captured image, the detection unit 201 may detect a region of interest for each object. As a result, the feature amount is calculated for each change of the object (for example, the observation target), and the phagocytic function of each observation target can be evaluated. In addition, the detection unit 201 may detect a region of interest for each of a plurality of captured images constituting a moving image. In this case, the feature amount calculation unit 202 described later can calculate the feature amount based on the change between the captured images of each detected region of interest without estimating the change of the region of interest.
- The attention area may be detected through an operation of a user who uses the information processing apparatus 20, or may be detected automatically by the detection unit 201 from one captured image constituting the moving image by a technique such as image analysis. In the latter case, the detection unit 201 may set an area corresponding to an object detected by image analysis as the attention area. For example, the detection unit 201 may detect a region of interest using a feature amount related to the luminance of one captured image (for example, a dynamic range).
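For the luminance-based automatic detection mentioned above, one conceivable sketch is to keep only image tiles whose local dynamic range (maximum minus minimum luminance) exceeds a threshold; the tile size and threshold here are arbitrary illustrative values, not parameters given in the patent:

```python
import numpy as np

def detect_by_dynamic_range(image, tile=4, dr_threshold=0.3):
    """Mark tile-sized blocks whose luminance dynamic range (max - min)
    exceeds a threshold; flat background tiles are rejected."""
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            block = image[y:y + tile, x:x + tile]
            if block.max() - block.min() > dr_threshold:
                mask[y:y + tile, x:x + tile] = True
    return mask

# Synthetic image: flat background plus one high-contrast object tile
img = np.full((8, 8), 0.1)
img[0:4, 0:4] = np.linspace(0.1, 0.9, 16).reshape(4, 4)
mask = detect_by_dynamic_range(img)
```

Only the contrasty tile survives; a real detector would likely follow this with connected-component extraction to obtain closed-curve attention areas.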
- the one captured image employed for detecting the attention area is not particularly limited as long as it is a captured image constituting a moving image.
- the one captured image may be a captured image corresponding to the first frame among the moving images acquired by the communication unit 210.
- By detecting the attention area in the captured image of the first frame, for example, the position of the attention area in the first frame can be used as a reference when calculating a feature amount related to deformation of the attention area in the moving image.
- the one captured image may be a captured image in a frame corresponding to a point in time when processing related to one evaluation is started.
- The process relating to one evaluation may be, for example, a chemical process such as drug injection for the observation target. Thereby, the evaluation can be performed with reference to the state immediately before the process affects the observation target.
- the detection unit 201 may arrange a plurality of tracking points with respect to a region of interest detected in one captured image.
- a tracking point is a point arranged corresponding to a region of interest detected for one captured image.
- the tracking points are arranged at a predetermined interval on the contour line that defines the region of interest.
- a feature amount calculation unit 202 described later detects the position of the tracking point in another captured image captured at a time different from the one captured image used when the region of interest is detected.
- the feature amount calculation unit 202 can detect the movement of the attention area based on the movement position of the tracking point.
- the number and arrangement interval of tracking points may be determined according to the type of observation target or the shape of the region of interest. For example, when the shape of the region of interest changes greatly, it is preferable to increase the number of tracking points arranged and reduce the arrangement interval. Thereby, even if the form of the observation target changes greatly, the change in the form of the observation target can be tracked with high accuracy. In order to reduce the calculation load, it is preferable to reduce the number of tracking points and increase the arrangement interval.
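Placing tracking points at a predetermined interval along the contour of a region of interest could be sketched as follows, assuming the contour is available as an ordered array of vertices; the resampling count is an illustrative parameter:

```python
import numpy as np

def place_tracking_points(contour, n_points):
    """Resample an ordered, closed contour (N x 2 array of vertices)
    into n_points tracking points at equal arc-length intervals."""
    closed = np.vstack([contour, contour[:1]])            # close the loop
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)  # segment lengths
    arc = np.concatenate([[0.0], np.cumsum(seg)])          # cumulative length
    targets = np.linspace(0.0, arc[-1], n_points, endpoint=False)
    xs = np.interp(targets, arc, closed[:, 0])
    ys = np.interp(targets, arc, closed[:, 1])
    return np.column_stack([xs, ys])

# Contour of a unit square, traversed counter-clockwise
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
points = place_tracking_points(square, 8)
```

Increasing `n_points` corresponds to the denser arrangement the text recommends for strongly deforming regions; decreasing it reduces the calculation load.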
- the detection unit 201 may identify the detected attention area as a first area and a second area.
- FIG. 4 is a diagram for explaining the attention area identification processing by the detection unit 201 according to the present embodiment.
- the detection unit 201 identifies the first region 1101 and the second region 1111, respectively. That is, the detection unit 201 detects the attention areas 1101 and 1111 for the observation target area 1001 and the engulfed substance area 1011 (see the schematic diagram F42 in FIG. 4).
- The detection unit 201 identifies whether each region of interest is the region of interest related to the observation target or the region of interest related to the engulfed substance, so that the feature amount calculation unit 202 described later can calculate only the feature amount of the observation target.
- When the detection unit 201 can detect only one of the observation target and the engulfed substance, the above-described identification process may be omitted.
- the feature value may be calculated by the feature value calculation unit 202 for one of the observation target and the engulfed substance, and the evaluation value may be calculated by the evaluation value calculation unit 203 based on the calculated feature value.
- the attention area identification processing by the detection unit 201 may be performed based on the image information of the detected attention area.
- the image information means information about the shape of the detected attention area, pixel information inside the attention area, or the like.
- the information on the shape of the attention area may be information on the area of the attention area, the length of the contour line, or the length in the XY direction on the captured image, for example.
- The pixel information inside the attention area may be color information inside the attention area (for example, information on a specific fluorescent color indicated by a fluorescence image) or texture information (for example, pixel information obtained from a phase-contrast image or a bright-field image of the captured image).
- data learned in advance for these pieces of image information may be stored in advance in the storage unit 220 in association with the observation target, the engulfed substance, and the like.
- the detection unit 201 collates the image information of the detected attention area with the learning data acquired from the storage unit 220, thereby identifying whether the detected attention area is an attention area related to the observation target or an attention area related to the engulfed substance.
- Information about the attention area detected by the detection unit 201 is output to the feature amount calculation unit 202.
- information about the attention area may be output to the display control unit 204 for presentation to the user.
- an area detected for an observation target is described as an attention area unless it is particularly necessary to distinguish.
- the feature amount calculation unit 202 calculates a feature amount related to a change on the moving image of the attention area detected by the detection unit 201 (change on a plurality of captured images).
- the feature amount is used by an evaluation value calculation unit 203 described later to calculate an evaluation value.
- the types of feature amounts according to the present embodiment are as described below.
- the feature amount calculation unit 202 may calculate a feature amount based on, for example, the movement of the outline of the region of interest on a moving image.
- the movement of the outline means (a) a change in the position of the outline, or (b) a change in the shape of the outline.
- FIG. 5 is a diagram for explaining a change in the position of the contour line of the region of interest.
- the attention area 1101 related to the observation target area 1001 has moved to the position indicated by the arrow 2001 in another captured image.
- the observation object often moves around in order to search for the engulfed substance.
- the contour line corresponding to the observation object also moves on the moving image as the observation object moves. Therefore, when the region of interest is moving, it is considered that the observation target is not phagocytosing the engulfed substance.
- on the other hand, when the observation target is phagocytosing the engulfed substance, it tends to stay on the spot, so the contour line can also be stationary. Therefore, by calculating the feature amount based on the change in the position of the contour line and using the feature amount for calculating the evaluation value, it is possible to grasp the presence or timing of the phagocytic function of the observation target.
- the feature amount based on the change in the position of the contour line may be calculated based on the moving distance of the center position of the contour line, for example.
- the center position may be specified by a weighted average of the coordinates of the contour line or the like.
- a feature amount based on a change in the position of the contour line may be calculated using a known technique for calculating the movement distance of one region on the moving image.
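- as an illustrative sketch (not part of the disclosed embodiment), the movement distance of the contour line's center position might be computed as follows; the function names are hypothetical, and a plain average of the contour coordinates is used where the text also permits a weighted average:

```python
# Illustrative sketch: feature amount based on the change in the position of
# a contour line, as the movement distance of the contour's center position.
import math

def contour_center(contour):
    """Center position of a contour given as a list of (x, y) points."""
    n = len(contour)
    cx = sum(p[0] for p in contour) / n
    cy = sum(p[1] for p in contour) / n
    return cx, cy

def position_change_feature(contour_t0, contour_t1):
    """Movement distance of the contour center between two captured images."""
    x0, y0 = contour_center(contour_t0)
    x1, y1 = contour_center(contour_t1)
    return math.hypot(x1 - x0, y1 - y0)

# Example: a square contour translated by (3, 4) moves its center by 5 pixels.
square = [(0, 0), (2, 0), (2, 2), (0, 2)]
moved = [(x + 3, y + 4) for x, y in square]
print(position_change_feature(square, moved))  # 5.0
```

A weighted average (for example, weighting by local contour density) could be substituted for `contour_center` without changing the rest of the computation.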
- FIG. 6 is a diagram for explaining a change in the shape of the outline of the attention area. As illustrated in FIG. 6, it is assumed that the shape of a part of the outline of the attention area 1101 related to the observation target area 1001 is deformed in the direction indicated by the arrow 2002 in another captured image. At this time, as shown in FIG. 6, the observation object may cause a part of the tissue to protrude in order to engulf the engulfed substance existing in the vicinity of the observation object. Thereby, the observation target captures the engulfed substance using the protruding portion.
- the contour of the attention area corresponding to the observation target also changes in shape at the portion corresponding to the deformed part of the observation target. Therefore, by calculating the feature amount based on the change in the shape of the contour line and using the feature amount for calculating the evaluation value, it is possible to grasp the presence or timing of the phagocytic function of the observation target.
- the feature amount based on the change in the shape of the contour line may be calculated based on, for example, the amount of change in the area of the region of interest or the length of the contour line.
- the feature amount may be calculated using a known technique for detecting a local change in the shape of the contour line of the region of interest.
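- a minimal sketch of the shape-change feature described above, based on the change in the area of the region of interest and the length of its contour line (both names and the polygon representation are illustrative, not from the source):

```python
# Illustrative sketch: shape-change feature amounts from a closed contour
# represented as a list of (x, y) vertices.
import math

def polygon_area(contour):
    """Area enclosed by a closed contour (shoelace formula)."""
    n = len(contour)
    s = 0.0
    for i in range(n):
        x0, y0 = contour[i]
        x1, y1 = contour[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

def perimeter(contour):
    """Length of the contour line."""
    n = len(contour)
    return sum(math.dist(contour[i], contour[(i + 1) % n]) for i in range(n))

def shape_change_feature(contour_t0, contour_t1):
    """Absolute change in area and in contour length between two frames."""
    return (abs(polygon_area(contour_t1) - polygon_area(contour_t0)),
            abs(perimeter(contour_t1) - perimeter(contour_t0)))

# Example: one corner of a square protrudes, as when capturing a substance.
square = [(0, 0), (2, 0), (2, 2), (0, 2)]
stretched = [(0, 0), (2, 0), (2, 3), (0, 2)]
area_change, length_change = shape_change_feature(square, stretched)
print(area_change)  # 1.0
```

A local protrusion raises both values even when the contour center barely moves, which is why this feature complements the position-change feature.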
- the feature amount calculation unit 202 may detect a change in the contour line of the attention area using various methods. For example, when a plurality of tracking points are arranged in the attention area by the detection unit 201, the feature amount calculation unit 202 may detect the movement of the attention area by estimating the movement of the tracking points arranged for the attention area.
- a process of detecting the movement of the attention area by the feature amount calculation unit 202 when a plurality of tracking points are arranged in the attention area will be described.
- the feature amount calculation unit 202 first estimates the position of the tracking point arranged in one captured image in another captured image that is different from the one captured image.
- the other captured image may be any captured image within several frames before or after the frame of the one captured image.
- the feature amount calculation unit 202 detects the movement of the tracking point on the moving image by performing processing related to the estimation of the position of the tracking point in the other captured image for each captured image constituting the moving image. Note that the motion detected by the feature amount calculation unit 202 may be motion in all or part of the moving image.
- the feature amount calculation unit 202 may estimate the position of the tracking point based on, for example, a motion vector calculated by comparing one captured image with another captured image. This motion vector may be a motion vector calculated for each tracking point. The motion vector may be calculated by a known method such as block matching or a gradient method. The feature amount calculation unit 202 according to the present embodiment will be described assuming that the motion vector is estimated by block matching.
- the feature amount calculation unit 202 may estimate the position of the tracking point in the other captured image by detecting, within a predetermined block size (search range) of the other captured image, the area whose pixel information most closely matches that of the tracking area in the one captured image.
- the size of the tracking area and the block size may be determined according to the imaging condition (for example, imaging magnification) of the imaging device 10, the type of observation target, the type of analysis performed on the observation target, and the like. For example, when the movement of the observation target is large, the tracking area or the block size may be increased. Thereby, the estimation accuracy of the position of the tracking point by the feature amount calculation unit 202 can be improved. Further, when there are a large number of tracking points with respect to the attention area, the tracking area or the block size may be adjusted to be small in order to reduce the calculation load.
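- the block matching described above might be sketched as follows; this is an illustrative toy implementation (sum-of-absolute-differences matching over 2D lists of luminance values), not the embodiment's implementation, and all names are hypothetical:

```python
# Illustrative sketch: estimating the moved position of a tracking point by
# block matching between two grayscale frames (2D lists of pixel values).
def sad(frame_a, frame_b, ax, ay, bx, by, half):
    """Sum of absolute differences between two (2*half+1)^2 pixel windows."""
    total = 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            total += abs(frame_a[ay + dy][ax + dx] - frame_b[by + dy][bx + dx])
    return total

def estimate_tracking_point(frame0, frame1, x, y, half=1, search=2):
    """Find where the tracking point (x, y) of frame0 moved to in frame1 by
    searching the best-matching tracking area inside the search range."""
    best, best_cost = None, float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cost = sad(frame0, frame1, x, y, x + dx, y + dy, half)
            if cost < best_cost:
                best_cost, best = cost, (x + dx, y + dy)
    return best  # (best[0] - x, best[1] - y) is the estimated motion vector

# Example: a bright pixel at (3, 3) moves to (4, 4) between frames.
frame0 = [[0] * 8 for _ in range(8)]
frame1 = [[0] * 8 for _ in range(8)]
frame0[3][3] = 9
frame1[4][4] = 9
print(estimate_tracking_point(frame0, frame1, 3, 3))  # (4, 4)
```

Enlarging `half` (the tracking area) or `search` (the block size) corresponds to the parameter adjustments discussed above: a larger search range tolerates larger movements at a higher computational cost.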
- the feature amount calculation unit 202 may estimate the position of the tracking point in another captured image generated at an imaging time determined based on information about the observation target. For example, when tracking a change in the form of an observation target whose form changes slowly, the difference between captured images of a plurality of consecutive frames generated by the imaging apparatus 10 is small. Therefore, in such a case, the feature amount calculation unit 202 may perform the detection processing on another captured image that is several frames away from the frame of the one captured image. More specifically, the feature amount calculation unit 202 may perform the detection processing using, as the other captured image, a captured image several frames after the frame of the one captured image.
- the frame interval can be appropriately set according to the type or state of the observation target.
- the feature amount calculation unit 202 may calculate a feature amount based on a change in the position of the contour line or a change in the shape of the contour line from the detected movement positions of the tracking points. For example, the feature amount calculation unit 202 may calculate a statistical value such as the average or median of the movement distances of a plurality of tracking points as a feature amount based on the change in the position of the contour line. In addition, the feature amount calculation unit 202 may calculate a feature amount based on the change in the shape of the contour line from the movement distance of a tracking point whose movement distance is significantly larger than that of the other tracking points. As a result, the feature amount can be calculated using only information obtained from the tracking points, so the calculation cost can be suppressed.
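- the summary of tracking-point movement distances described above could look like the following sketch (names and the outlier criterion are illustrative assumptions, not from the source):

```python
# Illustrative sketch: feature amounts from per-tracking-point movement
# distances. The median stands in for the position-change feature; a point
# that moved much more than the others stands in for the shape-change feature.
import math
import statistics

def displacement_features(points_t0, points_t1, outlier_factor=3.0):
    """Return (position_feature, shape_feature) from paired tracking points."""
    dists = [math.dist(p0, p1) for p0, p1 in zip(points_t0, points_t1)]
    position_feature = statistics.median(dists)
    largest = max(dists)
    # A distance far above the typical one suggests a local protrusion.
    shape_feature = largest if largest > outlier_factor * position_feature else 0.0
    return position_feature, shape_feature

# Example: three tracking points stay put while one protrudes by 5 pixels.
pos, shape = displacement_features(
    [(0, 0), (1, 0), (2, 0), (3, 0)],
    [(0, 0), (1, 0), (2, 0), (8, 0)])
print(pos, shape)  # 0.0 5.0
```

The `outlier_factor` threshold is a hypothetical tuning parameter; any criterion that separates "significantly larger" movement distances from typical ones would serve the same role.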
- the feature amount calculation unit 202 may rearrange the tracking points for the attention area after motion detection. Thereby, the estimation accuracy of the motion of the attention area can be increased.
- the above describes the detection of the movement of the contour line of the attention area and the feature amount calculation processing using tracking points, for the case where tracking points are arranged for the attention area.
- the present technology is not limited to this example, and the detection processing of the motion of the contour line of the attention area may be performed using a known algorithm related to object tracking such as optical flow or pattern matching. Further, the feature amount calculation unit 202 may calculate a feature amount based on the movement of the outline of the attention area detected using such a known algorithm.
- the feature amount calculation unit 202 may calculate a feature amount based on, for example, a motion on a moving image inside the region of interest.
- the movement inside the attention area refers to the movement inside the attention area on the moving image caused by the movement of the internal structure of the observation target corresponding to the attention area.
- FIG. 7 is a diagram for explaining the internal movement of the region of interest.
- the engulfed substance region 1011 exists inside the observation target region 1001.
- when the observation target phagocytoses the engulfed substance, the movement inside the observation target becomes large.
- a large movement can be detected in the region 2003 near the engulfed substance region 1011.
- the tissue inside the observation target may be moved greatly in order to digest the engulfed substance.
- the movement inside the attention area corresponding to the observation target also increases. Therefore, by calculating the feature amount based on the internal movement of the region of interest and using the feature amount for calculating the evaluation value, it is possible to capture the presence or timing of the phagocytic function of the observation target.
- the feature amount calculation unit 202 may detect a motion vector inside the attention area as the movement inside the attention area.
- this motion vector may be a motion vector calculated for each mesh by dividing the inside of the region of interest into meshes (mesh processing).
- the feature amount calculated by the feature amount calculation unit 202 may be a statistical value such as the average, median, maximum, minimum, or standard deviation of the magnitudes of the motion vectors calculated for each mesh inside the region of interest. Alternatively, the magnitude of a calculated motion vector itself may be used as the feature amount.
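- a minimal sketch of these per-mesh statistics (function and key names are illustrative; the per-mesh motion vectors are assumed to have been computed already, for example by block matching):

```python
# Illustrative sketch: statistical feature amounts of per-mesh motion
# vectors (dx, dy) computed inside the region of interest.
import math
import statistics

def mesh_motion_features(motion_vectors):
    """Summarise per-mesh motion vector magnitudes into statistical values."""
    mags = [math.hypot(dx, dy) for dx, dy in motion_vectors]
    return {
        "mean": statistics.mean(mags),
        "median": statistics.median(mags),
        "max": max(mags),
        "min": min(mags),
        "stdev": statistics.stdev(mags) if len(mags) > 1 else 0.0,
    }

# Example: one mesh near the engulfed substance moves; the rest are still.
features = mesh_motion_features([(3, 4), (0, 0), (0, 0), (0, 0)])
print(features["max"], features["median"])  # 5.0 0.0
```

A high maximum together with a low median, as in this example, matches the localized internal movement near the engulfed substance region described for FIG. 7.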
- as described above, the position and shape of the region of interest change on the moving image; that is, the position and shape of the contour line of the attention area differ for each captured image. Therefore, when calculating a feature amount based on the internal movement of the attention area, the feature amount calculation unit 202 may specify the position and shape of the contour line of the attention area for each captured image, detect the motion inside the contour line, and calculate a feature amount based on that motion.
- the feature amount may also be based on the movement in a partial area inside the attention area. For example, when a path through which the engulfed substance passes, such as the digestive tract, can be grasped in advance within the observation target, a feature amount based on the movement of the region corresponding to that path may be calculated. Such a region may be specified by a known image recognition technique or the like. Thereby, a feature amount specialized for movement relevant to the phagocytic function can be calculated, so the evaluation value can be calculated more accurately.
- the feature amount calculation unit 202 may calculate a feature amount based on pixel information inside the region of interest, for example.
- the pixel information inside the attention area includes, for example, luminance information inside the attention area or a pattern inside the attention area. Such pixel information can change due to the movement of the internal structure of the observation target corresponding to the region of interest.
- FIG. 8 is a diagram for explaining pixel information inside the region of interest.
- the inside of the observation target area 1001, where the attention area 1101 is detected, and the engulfed substance area 1011 often have different pixel information. Therefore, for example, when a pixel whose luminance or pattern differs from the luminance or pattern corresponding to the observation target area 1001 exists inside the attention area 1101, there is a possibility that a foreign object has been captured in the observation target.
- when the different luminance or pattern corresponds to the engulfed substance region 1011, there is a high possibility that the engulfed substance has been phagocytosed by the observation target. Therefore, by calculating the feature amount based on the pixel information inside the region of interest and using the feature amount for calculating the evaluation value, the presence or timing of the phagocytic function of the observation target can be detected.
- the feature amount based on the pixel information is, for example, a value related to the luminance of each pixel included in the attention area (a statistical value such as the average, minimum, maximum, median, or range, or the gradient of luminance values).
- the feature amount based on the pixel information may be a feature amount based on the pattern similarity related to the texture of the observation target region, for example. More specifically, when the phagocytic substance is phagocytosed by the observation target, the similarity of the pattern inside the region of interest with respect to the pattern of the observation target region is considered to decrease relatively. Therefore, the similarity may be used as a feature amount based on pixel information.
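- the luminance statistics and the pattern similarity described above might be sketched as follows (an illustrative assumption: patches are flat lists of luminance values, and similarity is a normalized correlation; neither the names nor the formula are specified by the source):

```python
# Illustrative sketch: feature amounts from pixel information inside the
# region of interest - luminance statistics and a texture similarity score.
import statistics

def luminance_features(pixels):
    """Statistical values of the luminance inside the region of interest."""
    return {
        "mean": statistics.mean(pixels),
        "median": statistics.median(pixels),
        "range": max(pixels) - min(pixels),
    }

def pattern_similarity(patch_a, patch_b):
    """Normalized correlation between two equally sized luminance patches.
    Expected to drop when an engulfed substance with a different texture
    appears inside the attention area."""
    ma, mb = statistics.mean(patch_a), statistics.mean(patch_b)
    num = sum((a - ma) * (b - mb) for a, b in zip(patch_a, patch_b))
    da = sum((a - ma) ** 2 for a in patch_a) ** 0.5
    db = sum((b - mb) ** 2 for b in patch_b) ** 0.5
    return 0.0 if da == 0 or db == 0 else num / (da * db)

# Linearly related textures correlate perfectly (similarity 1.0).
print(round(pattern_similarity([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```

Tracking this similarity against a reference patch of the observation target region over time would yield the decreasing temporal change the text associates with phagocytosis.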
- the feature amount calculated by the feature amount calculation unit 202 has been described above.
- the feature amount calculation unit 202 calculates the feature amount related to the change of the attention area on the moving image for all or a part of the moving image.
- the calculation result of the feature amount is output to the evaluation value calculation unit 203. Further, the calculation result of the feature amount may be output to the display control unit 204 for presentation to the user.
- the feature amount calculation unit 202 may calculate a feature amount related to a change in the first region. Further, as shown in the description of the modification, the feature amount calculation unit 202 may calculate a feature amount related to the change in the second region. Specifically, the feature amount calculation unit 202 may calculate a feature amount related to a change in the position of the contour line of the second region. The calculated feature amount related to the change in the second region may be used for evaluation value calculation by the evaluation value calculation unit 203 described later. This makes it possible to evaluate the phagocytosis function of the observation target based on the movement of the engulfed substance.
- alternatively, the feature amount calculation unit 202 may calculate the feature amount based on the change of each detected attention area between captured images. In this case, the feature amount calculation unit 202 does not have to calculate a change on the moving image of the attention area detected from one captured image constituting the moving image.
- the evaluation value calculation unit 203 calculates an evaluation value for a function (for example, a phagocytic function) related to absorption or release of the observation target, based on one or more feature amounts calculated by the feature amount calculation unit 202.
- the evaluation value calculated by the evaluation value calculation unit 203 is, for example, (1) the number of observation targets expressing the phagocytic function, (2) the expression frequency of the phagocytic function by the observation target, or (3) the expression timing of the phagocytic function by the observation target. From the number in (1), the number of observation targets that have expressed the phagocytic function can be grasped.
- from the expression frequency in (2), the number of engulfed substances phagocytosed by one observation target that has expressed the phagocytic function can be grasped. Thereby, regarding the phagocytic function of the observation target, not only the mere presence or absence of function expression but also the specific expression frequency can be evaluated quantitatively.
- from the timing in (3), the time at which the phagocytic function was expressed can be grasped. Thereby, a temporal evaluation of the phagocytic function of the observation target becomes possible.
- these evaluation values related to the expression of the phagocytic function can be calculated based on the temporal change of the calculated feature amount.
- FIG. 9 is a graph showing an example of the temporal change data of the feature amount calculated for the observation target.
- a movement amount curve 3001, a deformation amount curve 3002, an internal movement amount curve 3003, and a luminance curve 3004 are drawn.
- the movement amount curve 3001 is a curve showing the temporal change data of the feature amount related to the change in the position of the contour line of the attention area.
- a deformation amount curve 3002 is a curve showing temporal change data of a feature amount related to a change in the shape of a region of interest.
- the internal motion amount curve 3003 is a curve showing the temporal change data of the feature amount related to the internal motion of the attention area.
- the luminance curve 3004 is a curve showing the temporal change data of the feature amount related to the luminance information inside the attention area.
- Each feature amount indicated by these curves is a feature amount calculated by the feature amount calculation unit 202.
- the movement amount curve 3001 shows a high value in the non-phagocytic section of the observation target (the period when the phagocytic function is not expressed) and a low value in the phagocytic section (the period when the phagocytic function is expressed). This is because the observation target moves around relatively when it does not express the phagocytic function, and stays stationary on the spot when it does. Therefore, the presence or absence of the phagocytic function can be determined from the magnitude of the feature amount indicated by the movement amount curve 3001.
- the deformation amount curve 3002 shows two peaks immediately before the phagocytosis section and in the phagocytosis section. This peak is due to the fact that the observation object locally changes its shape when it captures the engulfed substance. Further, as shown in FIG. 9, since there are two such peaks, it is presumed that the engulfed substance was captured twice by the observation target. Therefore, it is possible to determine the presence or absence of the phagocytic function from the peak indicated by the deformation amount curve 3002 and to calculate the frequency of the phagocytic function.
- the feature values of the deformation amount curve 3002 in the phagocytic section, other than at the peaks, are larger than those in the non-phagocytic section. This is because the observation target is enlarged by phagocytosing the engulfed substance. Using this fact, it is also possible to determine whether or not the phagocytic function is expressed.
- the internal movement amount curve 3003 shows a plurality of peaks in the phagocytic section. These peaks are caused by the movement, inside the observation target, of the engulfed substance taken in by the observation target. This movement includes, for example, movement caused by digestion of the engulfed substance by the observation target. Therefore, from the peaks indicated by the internal movement amount curve 3003, it is possible to determine whether or not the phagocytic function is expressed.
- these peaks are divided into two group sections. This indicates that the observation target phagocytoses one engulfed substance in each group section. Therefore, the expression frequency of the phagocytic function can be calculated from the number of groups. Furthermore, the end time of each group section (for example, times t1 and t2 shown in FIG. 9) corresponds to the time when the phagocytosis of one engulfed substance by the observation target is completed. Therefore, the evaluation value calculation unit 203 may calculate such a time as an evaluation value related to the expression timing of the phagocytic function.
- the time is not limited to the end time of each group section; it may instead be the start time of each group section (the time when the phagocytosis of one engulfed substance by the observation target is started).
- it is possible to determine the presence or absence of the phagocytic function from the peak indicated by the internal motion amount curve 3003, and to calculate the phagocytic function expression frequency and expression timing.
- the luminance curve 3004 shows a plurality of peaks in the phagocytic section. These peaks are attributed to the luminance of the engulfed substance taken in by the observation target. For example, when the engulfed substance is labeled with a fluorescent substance, the luminance information inside the observation target changes due to fluorescence from the fluorescent substance when the engulfed substance is taken in. Therefore, the presence or absence of the phagocytic function can be determined from the peaks indicated by the luminance curve 3004.
- the temporal change data of the feature amount reflects the expression of the phagocytic function depending on the observation target. Therefore, the phagocytic function can be evaluated by analyzing the temporal change data of the feature amount.
- the evaluation value calculation unit 203 can calculate, as an evaluation value, the number of observation targets in which the phagocytic function is expressed, by determining for each attention area whether the corresponding observation target has expressed the phagocytic function based on the feature amount of that attention area. The presence or absence of the phagocytic function can be determined by analyzing the temporal change data of the feature amount.
- the evaluation value calculation unit 203 can also calculate, as an evaluation value, the expression frequency of the phagocytic function of the observation target or the expression timing of the phagocytic function. This makes it possible to evaluate changes (such as decline or malfunction) in the phagocytic function of the observation target due to drug injection, the response of the phagocytic function to the drug injection timing, and the like. That is, the evaluation of the phagocytic function becomes more detailed and diversified.
- for the analysis of the temporal change data, various known data analysis methods such as peak detection or time-series clustering may be used.
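- as a rough sketch of such an analysis (all names, thresholds, and the grouping rule are illustrative assumptions, not the disclosed method), peaks of a temporal change curve can be detected and clustered into group sections, from which the expression frequency and completion timing discussed for FIG. 9 follow:

```python
# Illustrative sketch: deriving evaluation values from the temporal change
# data of a feature amount by simple peak detection and peak grouping.
def detect_peaks(series, threshold):
    """Indices of local maxima above a threshold in temporal change data."""
    return [i for i in range(1, len(series) - 1)
            if series[i] > threshold and series[i - 1] < series[i] >= series[i + 1]]

def group_sections(peak_indices, max_gap):
    """Cluster nearby peaks into group sections. Each group is taken to
    correspond to one engulfed substance, so the number of groups gives the
    expression frequency and each group's last peak approximates the time at
    which phagocytosis of that substance was completed."""
    groups = []
    for i in peak_indices:
        if groups and i - groups[-1][-1] <= max_gap:
            groups[-1].append(i)
        else:
            groups.append([i])
    return groups

# Example: an internal movement amount curve with two bursts of peaks.
curve = [0, 1, 0, 2, 0, 0, 0, 3, 0, 1, 0]
peaks = detect_peaks(curve, 0.5)
groups = group_sections(peaks, max_gap=2)
print(len(groups), [g[-1] for g in groups])  # 2 [3, 9]
```

Here the two group sections yield an expression frequency of 2, and the last peak of each group plays the role of the completion times t1 and t2 in FIG. 9.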
- the temporal change data of the feature amount used for the evaluation value calculation by the evaluation value calculation unit 203 may be a single type or a plurality of types.
- for example, the evaluation value calculation unit 203 may calculate the evaluation value using the feature amount related to the change in the position of the attention area, the feature amount related to the movement inside the attention area, and the feature amount related to the luminance information inside the attention area.
- the evaluation value calculation unit 203 may perform a gating process on the calculated feature value or the temporal change data of the feature value.
- gating is a process of plotting data related to one or more feature amounts of an observation target in dimensions corresponding to the types of the feature amounts, and sorting the plots into groups using a predetermined threshold or the like.
- thereby, observation targets can be grouped according to, for example, whether the phagocytic function is expressed, and the number or ratio of observation targets that have expressed the phagocytic function can easily be calculated as an evaluation value.
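- the gating process described above can be sketched as a threshold-based selection over per-target plots; this is an illustrative sketch (the rectangular region, parameter values, and names are assumptions, not the disclosed implementation):

```python
# Illustrative sketch: gating - selecting plots inside a rectangular region
# of interest defined by per-axis (low, high) thresholds.
def gate(plots, region):
    """Return the plots falling inside the region of interest.
    `region` maps an axis index to an inclusive (low, high) range."""
    return [plot for plot in plots
            if all(lo <= plot[axis] <= hi for axis, (lo, hi) in region.items())]

# Example plots: (parameter 1, parameter 2, parameter 3) per observation
# target, e.g. internal-motion peak count, total movement, luminance peaks.
plots = [
    (5, 1.0, 4),  # many internal peaks, little movement -> likely phagocytic
    (6, 0.8, 5),  # likewise
    (0, 9.0, 0),  # large movement, no internal peaks -> non-phagocytic
]
region_of_interest = {0: (3, 10), 1: (0.0, 2.0), 2: (3, 10)}
phagocytic = gate(plots, region_of_interest)
print(len(phagocytic))  # 2
```

The count of selected plots is the evaluation value of interest (the number of observation targets expressing the phagocytic function); dividing by `len(plots)` would give the ratio instead.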
- FIG. 10 is a diagram illustrating an example of the gating process performed by the evaluation value calculation unit 203 according to the present embodiment.
- in the example shown in FIG. 10, the evaluation value related to the change (motion) inside the attention area is used as parameter 1, the evaluation value related to the contour change of the attention area is used as parameter 2, and the evaluation value related to the luminance information of the attention area may be used as parameter 3.
- the parameter 1 may be, for example, the number of peaks (or the number of group sections) included in the temporal change data of the feature amount related to the movement inside the attention area.
- the parameter 2 may be, for example, the total amount of movement of the attention area.
- the parameter 3 may be, for example, the number of peaks included in the temporal change data of the feature amount related to the luminance information inside the attention area.
- each feature amount calculated for the change in the region of interest is plotted.
- the evaluation value calculation unit 203 performs gating on the plotted three-dimensional graph and sets the regions of interest 4001 and 4002.
- the plot included in the region of interest 4001 shows a high value for the feature quantity related to parameter 2 and shows a low value for the feature quantity related to parameters 1 and 3.
- the plot included in the region of interest 4002 shows a low value for the feature quantity related to the parameter 2 and shows a high value for the feature quantity related to the parameters 1 and 3.
- the number of plots included in the region of interest 4002 is the number of observation targets that have expressed the phagocytic function, and this number is calculated by the evaluation value calculation unit 203 as an evaluation value (see table T102 in FIG. 10: of the nine observation targets, five express the phagocytic function).
- the regions of interest 4001 and 4002 may be set by a user operation, or may be automatically set to include a plot that satisfies a predetermined condition.
- the predetermined condition may be appropriately adjusted according to the phagocytic function, exercise ability, culture environment of the observation target, or the like. Further, the size and shape of the region of interest are not particularly limited.
- the number of types of feature values used in the gating process or a combination thereof is not particularly limited.
- the evaluation value calculated by the gating process by the evaluation value calculation unit 203 is not limited to the number or the ratio of the observation objects that have developed the phagocytic function as described above.
- the evaluation value calculation unit 203 may calculate the number, ratio, or information on the group of observation objects having similar expression frequency or expression timing of the phagocytic function of the observation object by gating processing. More specifically, the evaluation value calculation unit 203 may perform grouping on observation objects that have the same expression frequency by gating processing. Thereby, it is possible to evaluate a trend regarding the expression of a phagocytic function, and it is also possible to compare observation objects having different trends.
- Information related to the evaluation value calculated by the evaluation value calculation unit 203 is output to the display control unit 204. Further, the evaluation value calculation unit 203 may output the result of the gating process to the display control unit 204 together with the evaluation value. Display control using the result of the gating process will be described later.
- the display control unit 204 displays information related to the processing result by each functional unit on a display device (not shown) or the like. For example, as illustrated in FIG. 3 or FIG. 4, the display control unit 204 according to the present embodiment may superimpose the attention area detected by the detection unit 201 on the moving image. Further, as shown in FIG. 9, the display control unit 204 may display the feature amount calculated by the feature amount calculation unit 202 using a graph related to a change over time.
- the display control unit 204 may display information related to the evaluation value calculated by the evaluation value calculation unit 203. For example, the display control unit 204 may display information, calculated as the evaluation value, about the number of observation targets in which the phagocytic function is expressed, the expression frequency of the phagocytic function of the observation target, or the expression timing of the phagocytic function of the observation target.
- the display control unit 204 may control the display mode for the attention area based on the result of the gating process by the evaluation value calculation unit 203. Specific examples thereof will be described below.
- FIG. 11 is a diagram for explaining an example of display mode control based on the result of the gating process by the display control unit 204 according to the present embodiment. As shown in the schematic diagram F111 of FIG. 11, it is assumed that the captured image F2 includes attention areas 5001 to 5003 (all of which correspond to observation target areas).
- In the gating process by the evaluation value calculation unit 203, the two feature amounts, corresponding to parameter 1 and parameter 2, of the observation targets corresponding to the attention areas 5001 to 5003 are each plotted on a two-dimensional graph.
- Suppose that the plot 6001 corresponding to the attention area 5001, the plot 6002 corresponding to the attention area 5002, and the plot 6003 corresponding to the attention area 5003 are plotted at the positions shown in the graph G112.
- The plots 6002 and 6003 both show high values for parameter 1 and parameter 2.
- The plot 6001 shows low values for both parameter 1 and parameter 2. If high values of parameter 1 and parameter 2 indicate that an observation target has expressed the phagocytic function, the observation target corresponding to the plot 6001 is judged not to have expressed the phagocytic function, whereas the observation targets corresponding to the plots 6002 and 6003 are judged to have expressed it.
- In this case, the display control unit 204 may control the display so that the display mode of the attention areas 5002 and 5003, which correspond to the observation targets found by the gating process to have expressed the phagocytic function, differs from that of the attention area 5001. This makes it possible to visualize the observation targets that have expressed the phagocytic function.
- the display control unit 204 may display a graph showing the result of gating as shown in the graph G112.
- In the graph G112, not only the plots 6001 to 6003 but also a region of interest 6010 containing the plots 6002 and 6003 may be displayed.
- In this case, the display control unit 204 may control the display of the captured image F2 so that the display mode of the attention areas corresponding to the plots included in the region of interest 6010 differs from that of the other attention areas.
- The display control unit 204 may also display, as a graph as illustrated in FIG. 9, the feature amounts related to each plot on the graph G112, or to one or more plots included in the region of interest 6010. The feature amounts may be displayed for each plot individually, or statistical values such as the average, median, minimum, or maximum of the feature amounts of the selected plots may be displayed. This makes it possible to confirm what events have occurred with respect to the expression of the phagocytic function by the observation targets.
- the size and shape of the region of interest 6010 are not particularly limited.
- the display control unit 204 can visualize the evaluation result of the expression of the phagocytic function by controlling the display mode of the attention area using the result of the gating process.
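The gating step described above can be sketched as a simple two-parameter threshold test (a minimal illustration; the plot values, thresholds, and the rectangular gate shape are assumptions for this sketch, not values from the embodiment):

```python
# Hypothetical sketch of the gating step: each attention area is summarized by two
# feature amounts (parameter 1, parameter 2), and a rectangular region of interest on
# the 2-D feature plot selects the plots judged to have expressed the phagocytic
# function. The thresholds and plot values below are illustrative assumptions.

def gate(plots, p1_min, p2_min):
    """Return the ids of plots whose two parameters both exceed the gate thresholds."""
    return [pid for pid, (p1, p2) in plots.items() if p1 >= p1_min and p2 >= p2_min]

# Plots 6001-6003 correspond to attention areas 5001-5003 in the example of FIG. 11.
plots = {6001: (0.2, 0.3), 6002: (0.8, 0.9), 6003: (0.7, 0.8)}
selected = gate(plots, p1_min=0.5, p2_min=0.5)
# The display control unit would then highlight the attention areas corresponding
# to the selected plots differently from the others.
```

With these illustrative values, `selected` contains the ids 6002 and 6003, matching the example in which only those plots fall inside the region of interest 6010.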
- Further, the display control unit 204 may use the result of the gating process to change the display mode of the attention areas for each trend regarding the phagocytic function. For example, the display control unit 204 may change the display mode of an attention area according to the frequency with which the corresponding observation target expresses the phagocytic function. The display control unit 204 may appropriately control the display mode of the attention areas according to the evaluation values calculated by the evaluation value calculation unit 203. This makes it possible to visualize more detailed information.
- the display control process by the display control unit 204 has been described above. Display control by the display control unit 204 is appropriately executed by a user operation or the like.
- FIG. 12 is a flowchart illustrating an example of processing performed by the information processing apparatus 20 according to an embodiment of the present disclosure.
- the control unit 200 acquires moving image data from the imaging device 10 via the communication unit 210 (S101).
- the detection unit 201 extracts one captured image from the acquired moving image data, and detects at least one attention area from the one captured image (S103).
- the detection unit 201 identifies the detected attention area as a first area corresponding to the observation target and a second area corresponding to the engulfed substance (S105).
- the feature amount calculation unit 202 calculates a feature amount related to a change in the attention area (first area) (S107).
- the evaluation value calculation unit 203 calculates an evaluation value for the phagocytic function based on the feature amount calculated by the feature amount calculation unit 202 (S109).
- the display control unit 204 controls the display of the results processed by each function unit (S111).
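The flow of steps S101 to S111 can be sketched as the following minimal pipeline (each function is a toy stand-in for the corresponding functional unit; the thresholding detector, the single-region identification, and the luminance feature are illustrative assumptions, not the embodiment's exact processing):

```python
import numpy as np

# Minimal sketch of the S101-S111 flow in FIG. 12. Each step is a toy stand-in for
# the corresponding functional unit; the thresholds and array shapes are illustrative.

def detect_attention_areas(frame):          # S103: detection unit 201
    return frame > frame.mean()             # boolean mask of bright pixels

def identify_regions(mask):                 # S105: first/second region identification
    # The embodiment identifies regions from image information; for brevity this
    # sketch treats everything detected as a single first region.
    return {"first": mask, "second": np.zeros_like(mask)}

def feature_amount(frames, mask):           # S107: feature amount calculation unit 202
    # Toy feature: mean luminance inside the region, per frame.
    return [float(f[mask].mean()) for f in frames]

def evaluation_value(series):               # S109: evaluation value calculation unit 203
    # Toy evaluation: peak-to-baseline difference of the feature over time.
    return max(series) - series[0]

frames = [np.full((4, 4), v, dtype=float) for v in (1.0, 1.0, 3.0, 1.5)]  # S101
frames[0][1, 1] = 2.0                       # one pixel stands out in frame 0
mask = detect_attention_areas(frames[0])
regions = identify_regions(mask)
series = feature_amount(frames, regions["first"])
score = evaluation_value(series)            # S111 would display these results
```

The point of the sketch is the data flow: a mask from one captured image, a per-frame feature series from the whole sequence, and a scalar evaluation derived from that series.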
- The information processing apparatus 20 according to the present embodiment calculates a feature amount related to a change of the detected attention area, and calculates, based on the calculated feature amount, an evaluation value for a function related to absorption or release by the observation target, such as the phagocytic function.
- Specifically, the information processing apparatus 20 calculates, as the feature amounts, values based on changes in the contour line of the attention area and on the internal movement and luminance information of the attention area. This allows the phagocytic function to be evaluated with high accuracy.
- Each of the calculated feature amounts changes according to a different characteristic of the attention area on which it focuses. That is, by calculating a plurality of feature amounts, the movement of the observation target related to the phagocytic function can be grasped from various aspects. Therefore, using these plural feature amounts to calculate the evaluation value can further increase the accuracy of the evaluation of the phagocytic function.
- immune cells and the like suitable for use in dendritic cell therapy and the like can be acquired based on the above evaluation results.
- the change in the phagocytic function of the observation target with respect to the input of the drug can be evaluated in more detail.
- the information processing apparatus 20 calculates an evaluation value based on the temporal change of the feature amount. This makes it possible to evaluate the phagocytic function of the observation target over time, for example, the frequency of occurrence of the phagocytic function by the observation target and the timing thereof. Therefore, it is possible to evaluate not only the presence / absence of phagocytic function, but also factors over time related to the development of phagocytic function or events that may occur in the development of phagocytic function.
- In this way, the phagocytic function (an example of a function related to absorption or release) of the observation target included in the moving image can be evaluated quantitatively and over time. This makes it possible to evaluate the phagocytic function with higher accuracy and in more detail.
- In the above embodiment, the feature amount calculation unit 202 calculates a feature amount related to a change of the attention area (first area) corresponding to the observation target, and the evaluation value calculation unit 203 calculates the evaluation value for the phagocytic function based on that feature amount; however, the present technology is not limited to such an example.
- For example, the information processing apparatus 20 may cause the feature amount calculation unit 202 to calculate a feature amount related to a change of the attention area (second area) corresponding to the engulfed substance, and cause the evaluation value calculation unit 203 to calculate the above evaluation value using that feature amount.
- FIG. 13 is an example of a graph showing temporal changes in feature amounts of the first region and the second region according to the modification of the present embodiment.
- A graph G131 in FIG. 13 is a graph showing the temporal change of the feature amounts related to the internal movement and luminance information of the first region, and a graph G132 in FIG. 13 is a graph showing the change in the position of the contour line of the second region.
- the internal movement amount curve 3003 and the luminance curve 3004 shown in the graph G131 are the same as the internal movement amount curve 3003 and the luminance curve 3004 shown in FIG.
- a curve 3011 shown in the graph G132 is a movement amount curve 3011 indicating the movement amount of the second region.
- The internal motion amount curve 3003 and the luminance curve 3004 exhibit peaks while phagocytosis is being performed by the observation target (the phagocytosis section).
- In this section, the feature amount indicated by the movement amount curve 3011 is smaller than before the phagocytosis. This is because the engulfed substance, which had been moving freely around the culture medium, is taken into the observation target, so that its movement becomes limited.
- the feature amount indicated by the movement amount curve 3011 becomes 0 at time t1. This is due to the fact that the engulfed substance is digested by the observation target and the engulfed substance disappears. Therefore, it is considered that the phagocytosis of the engulfed substance by the observation target is completed at time t1. This result coincides with the end of the fluctuation of the internal motion amount curve 3003, for example.
- The feature amount calculation unit 202 may calculate only the feature amount related to the change of the second region, and the evaluation value calculation unit 203 may calculate the evaluation value based only on that feature amount. For example, suppose that there is only one observation target in the culture medium and one or more engulfed substances are present in its vicinity. In this case, even without tracking the change of the attention area (first area) corresponding to the observation target, the evaluation value for the phagocytic function of the observation target can be calculated by tracking the changes of the attention areas (second areas) corresponding to these engulfed substances.
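The modification described above, which reads the phagocytosis timeline from the second region alone, can be sketched as follows (the movement-amount series and the free-movement level are illustrative assumptions mimicking the movement amount curve 3011):

```python
# Sketch of the modification: track only the second region (the engulfed substance)
# and read off the phagocytosis timeline from its per-frame movement amount. The
# series and the free-movement level below are illustrative assumptions.

def phagocytosis_times(movement, free_level):
    """Return (uptake_index, completion_index) from a per-frame movement amount.

    uptake_index:     first frame where movement drops below the free-moving level,
                      i.e. the substance has been taken into the observation target.
    completion_index: first frame where movement reaches 0, i.e. the substance has
                      been digested and disappeared (time t1 in the description).
    """
    uptake = next(i for i, m in enumerate(movement) if m < free_level)
    completion = next(i for i, m in enumerate(movement) if m == 0)
    return uptake, completion

movement = [5.0, 5.2, 4.8, 1.2, 0.9, 0.3, 0.0, 0.0]   # illustrative curve 3011
uptake, completion = phagocytosis_times(movement, free_level=4.0)
```

Here the drop at index 3 corresponds to uptake and the zero at index 6 to digestion being complete, which is the event the description matches against the end of the fluctuation of the internal motion amount curve.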
- FIG. 14 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the illustrated information processing apparatus 900 can realize, for example, the information processing apparatus 20 in the above-described embodiment.
- the information processing apparatus 900 includes a CPU 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 925, and a communication device 929.
- the information processing apparatus 900 may include a processing circuit called DSP (Digital Signal Processor) or ASIC (Application Specific Integrated Circuit) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls all or a part of the operation in the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage apparatus 919, or the removable recording medium 923.
- the CPU 901 controls the overall operation of each functional unit included in the information processing apparatus 20 in the above embodiment.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 927 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is a device that can notify the user of the acquired information visually or audibly.
- the output device 917 can be, for example, a display device such as an LCD, PDP, and OELD, an acoustic output device such as a speaker and headphones, and a printer device.
- the output device 917 outputs the result obtained by the processing of the information processing device 900 as a video such as text or an image, or outputs it as a sound such as sound.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. Note that the storage device 919 can realize the function of the storage unit 220 according to the embodiment.
- the drive 921 is a reader / writer for a removable recording medium 923 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 923 and outputs the information to the RAM 905.
- the drive 921 writes a record in the mounted removable recording medium 923.
- the connection port 925 is a port for directly connecting a device to the information processing apparatus 900.
- the connection port 925 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like. Further, the connection port 925 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 929 is a communication interface configured with a communication device for connecting to the communication network NW, for example.
- the communication device 929 may be, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 929 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 929 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network NW connected to the communication device 929 is a network connected by wire or wireless, and is, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like. Note that at least one of the connection port 925 and the communication device 929 can realize the function of the communication unit 210 according to the embodiment.
- the information processing system 1 is configured to include the imaging device 10 and the information processing device 20, but the present technology is not limited to such an example.
- the imaging apparatus 10 may include functions (detection function, feature amount calculation function, and evaluation value calculation function) that the information processing apparatus 20 has.
- In this case, the information processing system 1 is realized by the imaging device 10.
- the information processing apparatus 20 may include a function (imaging function) of the imaging apparatus 10.
- In this case, the information processing system 1 is realized by the information processing apparatus 20.
- the imaging apparatus 10 may have a part of the functions of the information processing apparatus 20, and the information processing apparatus 20 may have a part of the functions of the imaging apparatus 10.
- Although the information processing system 1 has been described as a technology for evaluating the phagocytic function of an observation target that is a phagocytic cell, the present technology is not limited to such an example.
- the information processing system according to the present technology is not limited to a function related to absorption such as engulfment, and can also evaluate a function related to release by an observation target. More specifically, it is possible to evaluate a function related to absorption and release of calcium ions by cells.
- In this case, a feature amount based on the movement in the region near the contour of the cell may be calculated by the feature amount calculation unit, and an evaluation value for the function related to the absorption and release of calcium ions may be calculated based on that feature amount.
- absorption and release of calcium ions occur in the vicinity of the cell membrane, so that the function can be evaluated by capturing the movement of the region near the cell membrane.
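A minimal sketch of measuring motion only in the region near the cell membrane, as described above (the one-pixel band extraction and the frame-difference motion measure are illustrative assumptions, not the embodiment's exact computation):

```python
import numpy as np

# Sketch of evaluating a near-membrane function (e.g. calcium ion absorption and
# release): restrict the motion measurement to a thin band just inside the cell
# contour. Extracting the band as "mask minus its one-pixel erosion" and using the
# mean absolute frame difference as the motion measure are illustrative choices.

def erode(mask):
    """One-pixel binary erosion using axis shifts (4-neighbourhood)."""
    m = mask.copy()
    m[1:, :] &= mask[:-1, :]; m[:-1, :] &= mask[1:, :]
    m[:, 1:] &= mask[:, :-1]; m[:, :-1] &= mask[:, 1:]
    return m

def membrane_band(mask):
    """Pixels of the region that touch its boundary (the near-membrane band)."""
    return mask & ~erode(mask)

def band_motion(frame_a, frame_b, mask):
    band = membrane_band(mask)
    # Toy motion measure: mean absolute inter-frame difference inside the band.
    return float(np.abs(frame_b - frame_a)[band].mean())

mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True                       # toy 3x3 cell region
motion = band_motion(np.zeros((5, 5)), np.ones((5, 5)), mask)
```

For the toy 3x3 region, the band is the 8-pixel ring around the single interior pixel, so only movement adjacent to the membrane contributes to the measure.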
- the information processing system 1 performs the process related to the evaluation of the function of the observation target included in the moving image generated by the imaging device 10, the present technology is not limited to such an example.
- The information processing system 1 according to the present technology may perform processing related to the evaluation of the function of an observation target included in a plurality of captured images with different imaging times. More specifically, the information processing system 1 according to the present technology may detect attention areas for an observation target included in a plurality of still images generated continuously by the imaging device 10, calculate feature amounts based on the changes of those attention areas, and calculate an evaluation value for the function of the observation target based on the feature amounts. Any plurality of captured images of a biological sample whose imaging times differ (that is, images generated one after another) can be a target of the processing by the information processing system 1 according to the present technology.
- each step in the processing of the information processing apparatus of the present specification does not necessarily have to be processed in time series in the order described as a flowchart.
- each step in the processing of the information processing apparatus may be processed in an order different from the order described in the flowchart, or may be processed in parallel.
- (1) An information processing apparatus including: a detection unit that detects at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times; a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
- (2) The information processing apparatus according to (1), wherein the feature amount calculation unit calculates the feature amount based on movement of a contour line of the at least one region of interest on the plurality of captured images.
- (3) The information processing apparatus according to (2), wherein the feature amount includes a feature amount related to a change in the position of the contour line.
- (4) The information processing apparatus according to (2) or (3), wherein the feature amount includes a feature amount related to a change in the shape of the contour line.
- (5) The information processing apparatus according to any one of (1) to (4), wherein the feature amount calculation unit calculates the feature amount based on movement inside the at least one region of interest on the plurality of captured images.
- (6) The information processing apparatus according to any one of (1) to (5), wherein the feature amount calculation unit calculates the feature amount based on pixel information inside the at least one region of interest in the plurality of captured images.
- (7) The information processing apparatus according to (6), wherein the pixel information includes luminance information.
- (8) The information processing apparatus according to any one of (1) to (7), wherein the evaluation value calculation unit calculates, as the evaluation value, the number of biological samples that have expressed the function.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the evaluation value calculation unit calculates, as the evaluation value, the frequency of expression of the function by the biological sample.
- (10) The information processing apparatus according to any one of (1) to (9), wherein the evaluation value calculation unit calculates the evaluation value based on the temporal change of at least one of the feature amounts.
- (11) The information processing apparatus according to (10), wherein the evaluation value calculation unit calculates, as the evaluation value, the timing at which the function was expressed by the biological sample.
- (12) The information processing apparatus according to any one of (1) to (11), wherein the evaluation value calculation unit performs gating on the feature amount and calculates the evaluation value based on a result of the gating.
- (13) The information processing apparatus according to (12), further including a display control unit that controls a display mode of the region of interest based on the result of the gating.
- (14) The information processing apparatus according to any one of (1) to (13), wherein the detection unit discriminates the detected region of interest into a first region corresponding to the biological sample and a second region corresponding to a substance on which the function of the biological sample acts, the feature amount calculation unit calculates a feature amount related to a change of at least one first region across the plurality of captured images, and the evaluation value calculation unit calculates the evaluation value based on the feature amount related to the first region.
- (15) The information processing apparatus according to (14), wherein the feature amount calculation unit calculates a feature amount related to a change of at least one second region across the plurality of captured images, and the evaluation value calculation unit calculates the evaluation value by further using the feature amount related to the second region.
- (16) The information processing apparatus according to (14) or (15), wherein the detection unit discriminates the region of interest into the first region and the second region based on image information of the region of interest in the one captured image.
- (17) The information processing apparatus according to any one of (1) to (16), wherein the biological sample is a cell having a phagocytic function.
- (18) An information processing method including, by a processor: detecting at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times; calculating a feature amount related to a change of the at least one region of interest across the plurality of captured images; and calculating, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
- (19) A program for causing a computer to function as: a detection unit that detects at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times; a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
- (20) An information processing system including: an imaging device including an imaging unit that generates a plurality of captured images of a biological sample at different imaging times; and an information processing apparatus including: a detection unit that detects at least one region of interest from at least one captured image among the plurality of captured images; a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- Chemical & Material Sciences (AREA)
- Immunology (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Urology & Nephrology (AREA)
- Cell Biology (AREA)
- Hematology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- General Health & Medical Sciences (AREA)
- Biotechnology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Analytical Chemistry (AREA)
- Medicinal Chemistry (AREA)
- Microbiology (AREA)
- Biochemistry (AREA)
- Toxicology (AREA)
- Tropical Medicine & Parasitology (AREA)
- Pathology (AREA)
- Food Science & Technology (AREA)
- Multimedia (AREA)
- Zoology (AREA)
- Wood Science & Technology (AREA)
- Physiology (AREA)
- Organic Chemistry (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Sustainable Development (AREA)
- General Engineering & Computer Science (AREA)
- Genetics & Genomics (AREA)
- Image Analysis (AREA)
- Apparatus Associated With Microorganisms And Enzymes (AREA)
Abstract
Description
1. Overview of the information processing system
2. Information processing apparatus
2.1. Configuration example
2.2. Processing example
2.3. Effects
2.4. Modification
3. Hardware configuration example
4. Summary
FIG. 1 is a diagram showing an overview of the configuration of an information processing system 1 according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system 1 includes an imaging device 10 and an information processing apparatus 20. The imaging device 10 and the information processing apparatus 20 are connected by various wired or wireless networks.
The imaging device 10 is a device that generates captured images (a moving image). The imaging device 10 according to the present embodiment is realized by, for example, a digital camera. Alternatively, the imaging device 10 may be realized by any device having an imaging function, such as a smartphone, a tablet, a game console, or a wearable device. Such an imaging device 10 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and various members such as a lens for controlling the formation of a subject image on the image sensor, and images real space using them. The image sensor and these members realize the function of the imaging device 10 as an imaging unit. The imaging device 10 also includes a communication device for transmitting and receiving moving images and the like to and from the information processing apparatus 20. In the present embodiment, the imaging device 10 is provided above an imaging stage S for imaging a culture medium M in which a biological sample to be observed is cultured. The imaging device 10 generates a moving image by imaging the culture medium M at a predetermined frame rate. Note that the imaging device 10 may image the culture medium M directly (without any other member in between) or via another member such as a microscope. The frame rate is not particularly limited, but is preferably set according to the degree of change of the observation target. To correctly track changes of the observation target, the imaging device 10 images a fixed imaging region containing the culture medium M. The moving image generated by the imaging device 10 is transmitted to the information processing apparatus 20. The captured images generated by the imaging device 10 are not limited to a moving image; they may be a plurality of captured images generated at different imaging times, for example still images captured one after another.
The information processing apparatus 20 is a device having an image analysis function. It is realized by any device having an image analysis function, such as a PC (Personal Computer), a tablet, or a smartphone. The information processing apparatus 20 includes a processing circuit and a communication device. For example, in the information processing apparatus 20 according to the present embodiment, the communication device acquires from the imaging device 10 a plurality of captured images with different imaging times (for example, a moving image or still images captured one after another), and the processing circuit detects at least one attention area from at least one of the acquired captured images. The processing circuit then calculates feature amounts based on changes of the detected attention area across the plurality of captured images, and, based on the calculated feature amounts, calculates an evaluation value for a function related to absorption or release by a phagocytic cell or the like (for example, the phagocytic function). The results of the processes performed by the processing circuit of the information processing apparatus 20 are output to a storage device, display device, or the like provided inside or outside the information processing apparatus 20. The information processing apparatus 20 may be realized by one or more information processing devices on a network. The functional configuration realizing each function of the information processing apparatus 20 will be described later.
Hereinafter, the information processing apparatus 20 according to an embodiment of the present disclosure will be described with reference to FIGS. 2 to 13. In the following, the information processing apparatus 20 according to the present embodiment is described as a technology for evaluating the phagocytic function of an observation target such as a phagocytic cell; however, the function to be evaluated is not limited to the phagocytic function, and any function related to absorption or release by the observation target may be evaluated.
FIG. 2 is a functional block diagram showing an example of the functional configuration of the information processing apparatus 20 according to an embodiment of the present disclosure. As shown in FIG. 2, the information processing apparatus 20 according to the present embodiment includes a control unit 200, a communication unit 210, and a storage unit 220. The function of the control unit 200 is realized by a processing circuit such as a CPU (Central Processing Unit) provided in the information processing apparatus 20. The function of the communication unit 210 is realized by a communication device provided in the information processing apparatus 20, and the function of the storage unit 220 is realized by a storage device such as storage provided in the information processing apparatus 20. Each functional unit is described below.
The control unit 200 controls the overall operation of the information processing apparatus 20. As shown in FIG. 2, the control unit 200 includes the functions of a detection unit 201, a feature amount calculation unit 202, an evaluation value calculation unit 203, and a display control unit 204, and leads the operation of the information processing apparatus 20 according to the present embodiment. The functions of the functional units included in the control unit 200 will be described later.
The communication unit 210 is a communication means provided in the information processing apparatus 20 and performs various communications with external devices, wirelessly or by wire, via a network (or directly). For example, the communication unit 210 communicates with the imaging device 10. More specifically, the communication unit 210 acquires the plurality of captured images generated by the imaging device 10. In the present embodiment, the communication unit 210 is described as acquiring a moving image generated by the imaging device 10. The communication unit 210 may also communicate with devices other than the imaging device 10. For example, the communication unit 210 may transmit information about the evaluation values obtained by the evaluation value calculation unit 203 described later, or information about the display of the evaluation values obtained by the display control unit 204, to an external information processing device, display device, or the like.
The storage unit 220 is a storage means provided in the information processing apparatus 20 and stores information acquired by the communication unit 210, information obtained by the functional units of the control unit 200, and the like. The storage unit 220 also outputs stored information as appropriate in response to requests from the functional units of the control unit 200 or from the communication unit 210.
The detection unit 201 detects at least one attention area from one captured image constituting the moving image that the communication unit 210 has acquired from the imaging device 10. In this specification, an attention area means an area used for estimating the movement of an observation target.
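The attention-area detection performed by the detection unit 201 can be sketched as thresholding followed by connected-component labelling (a minimal pure-Python illustration; the fixed threshold and 4-connectivity are assumptions for this sketch, not the embodiment's detection method):

```python
# Minimal sketch of the detection unit 201: threshold one captured image and label
# the connected bright components; each labelled component is one attention area.
# The fixed threshold and 4-connectivity below are illustrative assumptions.

def detect_regions(image, threshold):
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] > threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []   # flood-fill one component
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and image[ny][nx] > threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(pixels)
    return regions

image = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 0, 8],
]
regions = detect_regions(image, threshold=5)   # two attention areas
```

Each returned pixel list stands in for one detected attention area; a real detector would then fit a contour or bounding region to each component.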
The feature amount calculation unit 202 calculates feature amounts related to the change of the attention area detected by the detection unit 201 on the moving image (the change across the plurality of captured images). These feature amounts are used by the evaluation value calculation unit 203, described later, to calculate the evaluation value. One or more kinds of feature amounts may be used for calculating the evaluation value, and the number and combination of the selected kinds are not particularly limited. The kinds of feature amounts according to the present embodiment are as shown in the following list.
(1) Feature amounts based on the movement of the contour line of the attention area
(2) Feature amounts based on the movement inside the attention area
(3) Feature amounts based on pixel information (luminance information) inside the attention area
The feature amount calculation unit 202 may calculate a feature amount based on, for example, the movement of the contour line of the attention area on the moving image. The movement of the contour line means (a) a change in the position of the contour line or (b) a change in the shape of the contour line.
The feature amount calculation unit 202 may also calculate a feature amount based on, for example, the movement inside the attention area on the moving image. The movement inside the attention area is movement on the moving image caused by the movement of the internal structure of the observation target corresponding to the attention area.
The feature amount calculation unit 202 may also calculate a feature amount based on, for example, pixel information inside the attention area. The pixel information inside the attention area includes, for example, luminance information inside the attention area or a pattern inside the attention area. Such pixel information can change due to the movement of the internal structure of the observation target corresponding to the attention area.
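The three kinds of feature amounts above can be sketched as follows (a minimal illustration; the centroid-displacement, frame-difference, and mean-luminance approximations are assumptions for this sketch, not the embodiment's exact computations):

```python
import numpy as np

# Sketch of the three kinds of feature amounts of the feature amount calculation
# unit 202: (1) contour movement, approximated here by the displacement of the
# region centroid, (2) internal movement, approximated by the mean absolute
# inter-frame difference inside the region, and (3) pixel (luminance) information,
# the mean brightness inside the region. All three approximations are illustrative.

def centroid(mask):
    ys, xs = np.nonzero(mask)
    return float(ys.mean()), float(xs.mean())

def contour_shift(mask_prev, mask_curr):           # (1) position change of the region
    (y0, x0), (y1, x1) = centroid(mask_prev), centroid(mask_curr)
    return float(np.hypot(y1 - y0, x1 - x0))

def internal_motion(frame_prev, frame_curr, mask): # (2) movement inside the region
    return float(np.abs(frame_curr - frame_prev)[mask].mean())

def mean_luminance(frame, mask):                   # (3) pixel information
    return float(frame[mask].mean())

mask_prev = np.zeros((4, 4), dtype=bool); mask_prev[0:2, 0:2] = True
mask_curr = np.zeros((4, 4), dtype=bool); mask_curr[1:3, 1:3] = True
shift = contour_shift(mask_prev, mask_curr)        # one-pixel diagonal displacement
frame_prev = np.zeros((4, 4))
frame_curr = np.where(mask_curr, 3.0, 0.0)
motion = internal_motion(frame_prev, frame_curr, mask_curr)
lum = mean_luminance(frame_curr, mask_curr)
```

Evaluated per frame pair over the whole sequence, each of these three scalars yields one of the time-series curves that the evaluation value calculation unit then analyses.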
The evaluation value calculation unit 203 calculates, based on the one or more feature amounts calculated by the feature amount calculation unit 202, an evaluation value for a function related to absorption or release by the observation target (for example, the phagocytic function). In the present embodiment, the evaluation values calculated by the evaluation value calculation unit 203 are, for example, (1) the number of observation targets in which expression of the phagocytic function was observed, (2) the frequency of expression of the phagocytic function by an observation target, and (3) the timing of expression of the phagocytic function by an observation target. Regarding the number in (1), the number of observation targets that expressed the phagocytic function can be grasped. This makes it possible, for example, to evaluate changes in the phagocytic function caused by administering one or more drugs. Regarding the expression frequency in (2), the number of engulfed substances phagocytosed by one observation target that expressed the phagocytic function can be grasped. This makes it possible to quantitatively evaluate not merely the presence or absence of expression of the phagocytic function of an observation target, but its specific expression frequency. Regarding the expression timing in (3), the timing at which the phagocytic function was expressed can be grasped. This enables evaluation of the phagocytic function of an observation target over time. In the present embodiment, these evaluation values related to the expression of the phagocytic function can be calculated based on the temporal change of the calculated feature amounts. First, a method of determining the expression of the phagocytic function will be described.
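The three evaluation values above, derived from the temporal change of a feature amount, can be sketched as follows (treating expression as an upward threshold crossing of the feature series is an illustrative assumption, as are the series and threshold values):

```python
# Sketch of the three evaluation values of the evaluation value calculation unit 203,
# derived from the temporal change of a feature amount per observation target:
# (1) how many targets expressed the function, (2) how often each target expressed
# it, and (3) when it was first expressed. Expression is approximated here as an
# upward crossing of an illustrative threshold by the feature series.

def expression_onsets(series, threshold):
    """Frame indices where the feature amount rises above the threshold."""
    return [i for i in range(1, len(series))
            if series[i - 1] <= threshold < series[i]]

def evaluate(targets, threshold):
    onsets = {name: expression_onsets(s, threshold) for name, s in targets.items()}
    return {
        "expressed_count": sum(1 for o in onsets.values() if o),  # evaluation (1)
        "frequency": {n: len(o) for n, o in onsets.items()},      # evaluation (2)
        "first_onset": {n: (o[0] if o else None)                  # evaluation (3)
                        for n, o in onsets.items()},
    }

targets = {
    "cell_a": [0.1, 0.2, 0.9, 0.3, 0.8, 0.2],   # two expression events
    "cell_b": [0.1, 0.1, 0.2, 0.2, 0.1, 0.1],   # no expression
}
result = evaluate(targets, threshold=0.5)
```

With the illustrative series, one of the two targets expresses the function, doing so twice with its first onset at frame 2, which is exactly the triple of evaluation values (count, frequency, timing) described above.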
The display control unit 204 causes a display device (not shown) or the like to display information related to the processing results of the functional units. For example, as shown in FIG. 3 or FIG. 4, the display control unit 204 according to the present embodiment may superimpose the attention area detected by the detection unit 201 on the moving image. Further, as shown in FIG. 9, the display control unit 204 may display the feature amounts calculated by the feature amount calculation unit 202 as a graph of their change over time.
The configuration and functions of the information processing apparatus 20 according to an embodiment of the present disclosure have been described above. Next, an example of processing by the information processing apparatus 20 according to an embodiment of the present disclosure will be described with reference to FIG. 12.
The configuration example and processing example of the information processing apparatus 20 according to an embodiment of the present disclosure have been described above. The information processing apparatus 20 according to the present embodiment calculates feature amounts related to changes of the detected attention area, and calculates, based on the calculated feature amounts, an evaluation value for a function related to absorption or release by the observation target, such as the phagocytic function. With this configuration, changes in the form of the observation target or its internal movement can be tracked, and the presence or absence of expression of the phagocytic function and the like can be determined from the characteristics of those changes and movements. This makes it possible to evaluate the phagocytic function of the observation target quantitatively and over time. It is also possible to evaluate the influence of the culture environment on the above functions of the observation target.
In the above embodiment, the information processing apparatus 20 calculates, by the feature amount calculation unit 202, a feature amount related to the change of the attention area (first area) corresponding to the observation target, and calculates, by the evaluation value calculation unit 203, the evaluation value for the phagocytic function based on that feature amount; however, the present technology is not limited to such an example. For example, the information processing apparatus 20 may cause the feature amount calculation unit 202 to calculate a feature amount related to the change of the attention area (second area) corresponding to the engulfed substance, and cause the evaluation value calculation unit 203 to calculate the above evaluation value using that feature amount.
Next, the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIG. 14. FIG. 14 is a block diagram showing an example of the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure. The illustrated information processing apparatus 900 can realize, for example, the information processing apparatus 20 in the above embodiment.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including:
a detection unit that detects at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times;
a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and
an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
(2)
The information processing apparatus according to (1), wherein the feature amount calculation unit calculates the feature amount based on movement of a contour line of the at least one region of interest on the plurality of captured images.
(3)
The information processing apparatus according to (2), wherein the feature amount includes a feature amount related to a change in the position of the contour line.
(4)
The information processing apparatus according to (2) or (3), wherein the feature amount includes a feature amount related to a change in the shape of the contour line.
(5)
The information processing apparatus according to any one of (1) to (4), wherein the feature amount calculation unit calculates the feature amount based on movement inside the at least one region of interest on the plurality of captured images.
(6)
The information processing apparatus according to any one of (1) to (5), wherein the feature amount calculation unit calculates the feature amount based on pixel information inside the at least one region of interest in the plurality of captured images.
(7)
The information processing apparatus according to (6), wherein the pixel information includes luminance information.
(8)
The information processing apparatus according to any one of (1) to (7), wherein the evaluation value calculation unit calculates, as the evaluation value, the number of biological samples that have expressed the function.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the evaluation value calculation unit calculates, as the evaluation value, the frequency of expression of the function by the biological sample.
(10)
The information processing apparatus according to any one of (1) to (9), wherein the evaluation value calculation unit calculates the evaluation value based on the temporal change of at least one of the feature amounts.
(11)
The information processing apparatus according to (10), wherein the evaluation value calculation unit calculates, as the evaluation value, the timing at which the function was expressed by the biological sample.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the evaluation value calculation unit performs gating on the feature amount and calculates the evaluation value based on a result of the gating.
(13)
The information processing apparatus according to (12), further including a display control unit that controls a display mode of the region of interest based on the result of the gating.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the detection unit discriminates the detected region of interest into a first region corresponding to the biological sample and a second region corresponding to a substance on which the function of the biological sample acts,
the feature amount calculation unit calculates a feature amount related to a change of at least one first region across the plurality of captured images, and
the evaluation value calculation unit calculates the evaluation value based on the feature amount related to the first region.
(15)
The information processing apparatus according to (14), wherein the feature amount calculation unit calculates a feature amount related to a change of at least one second region across the plurality of captured images, and
the evaluation value calculation unit calculates the evaluation value by further using the feature amount related to the second region.
(16)
The information processing apparatus according to (14) or (15), wherein the detection unit discriminates the region of interest into the first region and the second region based on image information of the region of interest in the one captured image.
(17)
The information processing apparatus according to any one of (1) to (16), wherein the biological sample is a cell having a phagocytic function.
(18)
An information processing method including, by a processor:
detecting at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times;
calculating a feature amount related to a change of the at least one region of interest across the plurality of captured images; and
calculating, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
(19)
A program for causing a computer to function as:
a detection unit that detects at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different imaging times;
a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and
an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
(20)
An information processing system including:
an imaging device including an imaging unit that generates a plurality of captured images of a biological sample at different imaging times; and
an information processing apparatus including:
a detection unit that detects at least one region of interest from at least one captured image among the plurality of captured images;
a feature amount calculation unit that calculates a feature amount related to a change of the at least one region of interest across the plurality of captured images; and
an evaluation value calculation unit that calculates, based on the feature amount, an evaluation value for a function related to absorption or release by the biological sample.
10 Imaging device
20 Information processing apparatus
200 Control unit
201 Detection unit
202 Feature amount calculation unit
203 Evaluation value calculation unit
204 Display control unit
210 Communication unit
220 Storage unit
Claims (20)
- 生物学的試料についての、撮像時間が異なる複数の撮像画像のうちの少なくとも一の撮像画像から、少なくとも一の注目領域を検出する検出部と、
前記少なくとも一の注目領域の前記複数の撮像画像上における変化に係る特徴量を算出する特徴量算出部と、
前記特徴量に基づいて、前記生物学的試料の吸収または放出に係る機能についての評価値を算出する評価値算出部と、
を備える情報処理装置。 - 前記特徴量算出部は、前記少なくとも一の注目領域の輪郭線の前記複数の撮像画像上における動きに基づいて前記特徴量を算出する、請求項1に記載の情報処理装置。
- The information processing device according to claim 2, wherein the feature amount includes a feature amount related to a change in position of the contour.
- The information processing device according to claim 2, wherein the feature amount includes a feature amount related to a change in shape of the contour.
- The information processing device according to claim 1, wherein the feature amount calculation unit calculates the feature amount on the basis of a motion of an interior of the at least one region of interest across the plurality of captured images.
- The information processing device according to claim 1, wherein the feature amount calculation unit calculates the feature amount on the basis of pixel information of the interior of the at least one region of interest in the plurality of captured images.
- The information processing device according to claim 6, wherein the pixel information includes luminance information.
- The information processing device according to claim 1, wherein the evaluation value calculation unit calculates, as the evaluation value, the number of biological samples that have exhibited the function.
- The information processing device according to claim 1, wherein the evaluation value calculation unit calculates, as the evaluation value, a frequency at which the function is exhibited by the biological sample.
- The information processing device according to claim 1, wherein the evaluation value calculation unit calculates the evaluation value on the basis of a temporal change in at least one of the feature amounts.
- The information processing device according to claim 10, wherein the evaluation value calculation unit calculates, as the evaluation value, a timing at which the function is exhibited by the biological sample.
- The information processing device according to claim 1, wherein the evaluation value calculation unit performs gating on the feature amount and calculates the evaluation value on the basis of a result of the gating.
- The information processing device according to claim 12, further comprising a display control unit that controls a display mode of the region of interest on the basis of the result of the gating.
- The information processing device according to claim 1, wherein the detection unit discriminates the detected region of interest into a first region corresponding to the biological sample and a second region corresponding to a substance on which the function of the biological sample acts,
the feature amount calculation unit calculates a feature amount related to a change in at least one of the first regions across the plurality of captured images, and
the evaluation value calculation unit calculates the evaluation value on the basis of the feature amount related to the first region.
- The information processing device according to claim 14, wherein the feature amount calculation unit calculates a feature amount related to a change in at least one of the second regions across the plurality of captured images, and
the evaluation value calculation unit calculates the evaluation value by further using the feature amount related to the second region.
- The information processing device according to claim 14, wherein the detection unit discriminates the region of interest into the first region and the second region on the basis of image information of the region of interest in the one captured image.
- The information processing device according to claim 1, wherein the biological sample is a cell having a phagocytic function.
- An information processing method including:
detecting, by a processor, at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different times;
calculating a feature amount related to a change in the at least one region of interest across the plurality of captured images; and
calculating, on the basis of the feature amount, an evaluation value of a function of the biological sample related to absorption or release.
- A program for causing a computer to function as:
a detection unit that detects at least one region of interest from at least one captured image among a plurality of captured images of a biological sample captured at different times;
a feature amount calculation unit that calculates a feature amount related to a change in the at least one region of interest across the plurality of captured images; and
an evaluation value calculation unit that calculates, on the basis of the feature amount, an evaluation value of a function of the biological sample related to absorption or release.
- An information processing system having:
an imaging device including an imaging unit that generates a plurality of captured images of a biological sample captured at different times; and
an information processing device including:
a detection unit that detects at least one region of interest from at least one captured image among the plurality of captured images;
a feature amount calculation unit that calculates a feature amount related to a change in the at least one region of interest across the plurality of captured images; and
an evaluation value calculation unit that calculates, on the basis of the feature amount, an evaluation value of a function of the biological sample related to absorption or release.
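Claims 12 and 13 describe performing gating on the feature amount and controlling the display mode of regions of interest from the gating result. A minimal sketch of that idea, analogous to flow-cytometry gating, follows; the one-dimensional interval gate and the highlight/dim display convention are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def gate(features, low, high):
    """Select the subpopulation of regions whose feature amount falls
    inside the gate [low, high] (assumed here to be a 1-D interval)."""
    f = np.asarray(features, dtype=float)
    return (f >= low) & (f <= high)

def evaluation_value(in_gate):
    """Evaluation value sketch: count of regions inside the gate, e.g.
    the number of samples that exhibited the function."""
    return int(np.count_nonzero(in_gate))

def display_modes(in_gate):
    """Display control sketch: per-region display mode chosen from the
    gating result (hypothetical highlight/dim convention)."""
    return ["highlight" if g else "dim" for g in in_gate]
```

A real implementation could gate in two or more feature dimensions (e.g. contour motion versus interior luminance change) and recolor the corresponding regions in the rendered time-lapse view.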
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/081,774 US20210217172A1 (en) | 2016-03-15 | 2017-01-11 | Information processing device, information processing method, program, and information processing system |
JP2018505279A JPWO2017159011A1 (ja) | 2016-03-15 | 2017-01-11 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
EP17766019.8A EP3432269B1 (en) | 2016-03-15 | 2017-01-11 | Information processing device, information processing method, program, and information processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016051054 | 2016-03-15 | ||
JP2016-051054 | 2016-03-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017159011A1 true WO2017159011A1 (ja) | 2017-09-21 |
Family
ID=59851181
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/000690 WO2017159011A1 (ja) | 2016-03-15 | 2017-01-11 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20210217172A1 (ja) |
EP (1) | EP3432269B1 (ja) |
JP (1) | JPWO2017159011A1 (ja) |
WO (1) | WO2017159011A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6708143B2 (ja) * | 2017-02-07 | 2020-06-10 | 株式会社島津製作所 | 時間強度曲線測定装置 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011185734A (ja) * | 2010-02-14 | 2011-09-22 | Microdent:Kk | Spatio-temporal device |
JP2015523577A (ja) * | 2012-07-25 | 2015-08-13 | Theranos, Inc. | Image analysis and measurement of biological samples |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5950619B2 (ja) * | 2011-04-06 | 2016-07-13 | Canon Inc. | Information processing device |
US10863098B2 (en) * | 2013-06-20 | 2020-12-08 | Microsoft Technology Licensing, LLC | Multimodal image sensing for region of interest capture |
2017
- 2017-01-11 WO PCT/JP2017/000690 patent/WO2017159011A1/ja active Application Filing
- 2017-01-11 EP EP17766019.8A patent/EP3432269B1/en active Active
- 2017-01-11 US US16/081,774 patent/US20210217172A1/en not_active Abandoned
- 2017-01-11 JP JP2018505279A patent/JPWO2017159011A1/ja active Pending
Non-Patent Citations (1)
- See also references of EP3432269A4 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019193815A1 (ja) * | 2018-04-04 | 2019-10-10 | Hitachi High-Technologies Corporation | Bacteria testing device and bacteria testing method |
JP2019180264A (ja) * | 2018-04-04 | 2019-10-24 | Hitachi High-Technologies Corporation | Bacteria testing device and bacteria testing method |
JP7032216B2 (ja) | 2018-04-04 | 2022-03-08 | Hitachi High-Tech Corporation | Bacteria testing device and bacteria testing method |
JP2022066269A (ja) | 2018-04-04 | 2022-04-28 | Hitachi High-Tech Corporation | Bacteria testing device and bacteria testing method |
JP7319407B2 (ja) | 2018-04-04 | 2023-08-01 | Hitachi High-Tech Corporation | Bacteria testing device and bacteria testing method |
WO2022130884A1 (ja) * | 2020-12-18 | 2022-06-23 | Sony Group Corporation | Measuring device, measuring method, and measuring system |
Also Published As
Publication number | Publication date |
---|---|
EP3432269A1 (en) | 2019-01-23 |
EP3432269B1 (en) | 2020-10-21 |
US20210217172A1 (en) | 2021-07-15 |
JPWO2017159011A1 (ja) | 2019-01-24 |
EP3432269A4 (en) | 2019-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017159011A1 (ja) | Information processing device, information processing method, program, and information processing system | |
JP7095722B2 (ja) | Information processing device and program | |
WO2017154318A1 (ja) | Information processing device, information processing method, program, and information processing system | |
JP6777086B2 (ja) | Information processing device, information processing method, and information processing system | |
JP6720756B2 (ja) | Information processing device, information processing method, and information processing system | |
JP6696152B2 (ja) | Information processing device, information processing method, program, and information processing system | |
US10002422B2 (en) | Ultrasound image processing apparatus and medium | |
JP6402717B2 (ja) | Image analysis device, image analysis method, image analysis program, cell production method, cell production device, cell culture method, and cell culture device | |
JP5576775B2 (ja) | Image processing device, image processing method, and image processing program | |
US20070195165A1 (en) | Image display apparatus | |
WO2018083984A1 (ja) | Information processing device, information processing method, and information processing system | |
JP5442542B2 (ja) | Pathological diagnosis support device, pathological diagnosis support method, control program for pathological diagnosis support, and recording medium recording the control program | |
WO2017199408A1 (ja) | Image processing device, method for operating image processing device, and program for operating image processing device | |
JP2016535332A (ja) | Image-based ROI tracking for MDX | |
JP2011211278A (ja) | Recording device and recording method | |
Xu et al. | Beyond two-stream: Skeleton-based three-stream networks for action recognition in videos | |
Osman et al. | Segmentation of tuberculosis bacilli in ziehl-neelsen tissue slide images using hibrid multilayered perceptron network | |
EP3975868A1 (en) | Systems and methods for detecting tissue contact by an ultrasound probe | |
Yao et al. | Scheme and dataset for evaluating computer-aided polyp detection system in colonoscopy | |
JP4552024B2 (ja) | Image analysis device, image analysis program, and image analysis method | |
Peng et al. | Image-based object state modeling of a transfer task in simulated surgical training | |
US20240013394A1 (en) | Skin cancer diagnosis method based on image analysis using artificial intelligence | |
KR20230064898A (ko) | Image information retrieval device and method | |
Suzuki et al. | Esophageal ultrasound video-processing based bolus inflow detection method for swallowing evaluation | |
JP2010176569A (ja) | Scene change detection device, scene change detection program, and scene change detection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | WIPO information: entry into national phase | Ref document number: 2018505279; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | WIPO information: entry into national phase | Ref document number: 2017766019; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2017766019; Country of ref document: EP; Effective date: 20181015 |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17766019; Country of ref document: EP; Kind code of ref document: A1 |