WO2017169397A1 - Image processing device, image processing method and image processing system - Google Patents

Image processing device, image processing method and image processing system

Info

Publication number
WO2017169397A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation value
image processing
point
motion
motion vector
Prior art date
Application number
PCT/JP2017/007193
Other languages
English (en)
Japanese (ja)
Inventor
和博 中川
威 國弘
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2017169397A1

Classifications

    • C CHEMISTRY; METALLURGY
    • C12 BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00 Apparatus for enzymology or microbiology
    • C12M1/34 Measuring or testing with condition measuring or sensing means, e.g. colony counters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Definitions

  • The present disclosure relates to an image processing device, an image processing method, and an image processing system.
  • Patent Document 1 below describes a technique in which the periodic pulsating motion of cardiomyocytes is detected based on time-series changes in an image feature amount, and the center of the pulsating motion of the cardiomyocytes is detected. Patent Document 2 below describes that the propagation direction of cell pulsation is displayed as a histogram.
  • However, although the technique of Patent Document 1 detects the center of the pulsating motion of cardiomyocytes, it does not determine the strength of the pulsation. For this reason, a cell with only small motion may be detected as the pulsation center due to error or the like, and it is difficult to evaluate the strength of the pulsation quantitatively. Likewise, while the technique described in Patent Document 2 analyzes the propagation direction of the pulsation, it neither obtains the convergence point at which the cell motion converges nor quantitatively evaluates the strength with which the motion converges.
  • According to the present disclosure, an image processing device is provided, including:
  • a motion vector acquisition unit that acquires motion vectors at a plurality of positions in an image;
  • an evaluation value acquisition unit that acquires an evaluation value representing the strength with which the plurality of motion vectors converge to an arbitrary point of interest; and
  • a convergence point extraction unit that extracts, based on the evaluation value, a convergence point at which the motion vectors converge from among the points of interest.
  • According to the present disclosure, an image processing method is also provided, including: acquiring motion vectors at a plurality of positions in an image; acquiring an evaluation value indicating the strength with which the plurality of motion vectors converge to an arbitrary point of interest; and extracting, based on the evaluation value, a convergence point at which the motion vectors converge from among the points of interest.
  • Further, according to the present disclosure, an image processing system is provided, including:
  • an imaging device that captures an image including biological cells;
  • an image processing device having a motion vector acquisition unit that acquires motion vectors at a plurality of positions in the image captured by the imaging device, an evaluation value acquisition unit that acquires an evaluation value representing the strength with which the plurality of motion vectors converge to an arbitrary point of interest, and a convergence point extraction unit that extracts, based on the evaluation value, a convergence point at which the motion vectors converge from among the points of interest; and
  • a display device that displays the convergence point extracted by the image processing device.
  • FIG. 13 is a schematic diagram visualizing the spread of motion vector directions by plotting, in polar coordinates, the motion vector angle θ in the circumferential direction and the frequency f in the radial direction, based on the histogram of FIG. 12. FIG. 14 is a schematic diagram in which the histogram shown in FIG. 13 is computed and displayed in time series along the time axis t.
  • FIG. 15 is a schematic diagram in which the histogram shown in FIG. 13 is displayed in 3D in the space (θ, f, t). FIG. 16 is a schematic diagram showing cell division. FIG. 17 is a schematic diagram showing the position of the convergence point P and the motion direction histogram in the case of cell division. FIG. 18 is a schematic diagram showing cell migration. FIG. 19 is a diagram showing, in time series, how cells migrate toward an attracting source. FIG. 20 is a schematic diagram showing the orientation of cells.
  • Cultured cells are cell tissues produced by culturing cells collected from a living body.
  • Cultured cardiomyocytes, which are cultured cells obtained by culturing cardiomyocytes, may be used, for example, in the treatment of the heart. They are also used for evaluating cardiac toxicity in drug discovery.
  • Cells and tissues have polarity, and this polarity is important for their functions.
  • As methods for evaluating the polarity of a cell or tissue, there are methods based simply on morphology (shape) and methods in which the molecules or cells that form the polarity are stained and the polarity is evaluated from their distribution.
  • However, staining methods are invasive, and morphology-based methods are relatively difficult to evaluate quantitatively.
  • The living body has a hierarchical structure: organs are composed of tissues, tissues of cells, cells of intracellular organelles, and intracellular organelles of molecules. Polarity evaluation of biological tissue by staining or by morphology can be regarded as evaluating the orientation of the structures below the target structure from their morphology, based on differences in staining or absorption spectra.
  • The structures below the target should be dynamically arranged according to the polarity and orientation.
  • Accordingly, the present disclosure quantitatively evaluates the polarity of a living body by evaluating and displaying the directionality of the movement of structures included in the object.
  • FIG. 1 is a block diagram illustrating a configuration example of an evaluation system 1000 according to an embodiment of the present disclosure.
  • The evaluation system 1000 is an apparatus that evaluates a cultured cell 300 by observing the movement of the cultured cell (observation target) 300.
  • The evaluation system 1000 includes an imaging device 100, an image processing device 200, a display device 400, and an operation input device 500.
  • The display device 400 includes a liquid crystal display (LCD) or the like.
  • The operation input device 500 includes a mouse, a keyboard, a touch sensor, and the like.
  • The imaging device 100 includes an imaging element such as a CMOS sensor, and images the cultured cell 300 to be observed.
  • The imaging device 100 may image the cultured cell 300 directly (without passing through other members), or may image the cultured cell 300 through another member such as a microscope or an optical system.
  • The cultured cell 300 may or may not be fixed with respect to the imaging device 100.
  • Since the evaluation system 1000 detects movement (temporal change in position), it is generally desirable that the cultured cell 300 be fixed with respect to the imaging device 100.
  • The imaging device 100 images the cultured cells 300 for a predetermined period; that is, the imaging device 100 obtains a moving image with the cultured cell 300 as the subject.
  • The imaging device 100 supplies the image signal (moving image) obtained by imaging the cultured cell 300 to the image processing device 200.
  • Instead of a moving image, the imaging device 100 may obtain one or more still images of the cultured cell 300 captured at different times.
  • The image processing device 200 analyzes the movement convergence point and the motion direction histogram of the cultured cell 300 based on the image signal supplied from the imaging device 100, and saves the generated image data on, for example, an internal recording medium. The image processing device 200 also generates display information for displaying the analyzed convergence point of movement of the cultured cell 300 and the motion direction histogram, and sends the display information to the display device 400 for display.
  • The image processing device 200 includes a motion vector acquisition unit 202, an evaluation value acquisition unit 204, a convergence point extraction unit 206, a motion direction histogram generation unit 208, a tracking unit 209, a display processing unit 210, and an operation information acquisition unit 212.
  • The motion vector acquisition unit 202 detects feature points in each frame of the moving image and compares the motion of the feature points from frame to frame, thereby acquiring a motion vector at each pixel in the frame by a known method.
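  • As a concrete illustration of this step (not part of the original publication), the following Python sketch obtains a per-pixel motion vector field between two consecutive frames using dense Farneback optical flow from OpenCV. This is only one possible "known method" for obtaining motion vectors, and the function name acquire_motion_vectors is illustrative.

        import cv2

        def acquire_motion_vectors(prev_frame, next_frame):
            """Return a per-pixel motion vector field of shape (H, W, 2)."""
            prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
            next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
            # Dense Farneback optical flow: one well-known way to estimate a
            # motion vector at every pixel between two consecutive frames.
            flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            return flow  # flow[y, x] = (vx, vy)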
  • The evaluation value acquisition unit 204 acquires evaluation values E1 and E2 by the first or second calculation method described later.
  • The convergence point extraction unit 206 extracts convergence points based on the evaluation values E1 and E2.
  • The motion direction histogram generation unit 208 generates a motion direction histogram based on the direction of the motion vector of each pixel.
  • The tracking unit 209 tracks the contour of an object such as a cell.
  • The display processing unit 210 generates display information for displaying, on the display device 400, the image based on the image signal, the convergence points extracted by the convergence point extraction unit 206, and the motion direction histogram generated by the motion direction histogram generation unit 208.
  • The operation information acquisition unit 212 acquires operation information input from the operation input device 500, such as a mouse, a keyboard, or a touch sensor.
  • Each component of the image processing device 200 can be configured by a central processing unit such as a CPU and a program (software) that causes it to function.
  • In the first calculation method, the evaluation value acquisition unit 204 takes each point (x, y) corresponding to each pixel in the cell image as a point of interest, and calculates an evaluation value E1 for each point of interest from the following equation (1).
  • In equation (1), v′(i, j) is the component of the motion vector v(i, j) in the direction of the point of interest.
  • Here, the point-of-interest direction is the direction from the point of interest (x, y) toward the start point s of the motion vector v(i, j).
  • w(d) is a weighting coefficient obtained from the distance d between the point of interest and the motion vector, and is calculated according to the characteristic shown in FIG. 3. As shown in FIG. 3, the value of the weighting coefficient w(d) decreases as the distance d increases, and w(d) becomes 0 when the distance d is greater than or equal to a predetermined value d1. The correlation with a motion vector is therefore higher the closer it is to the point of interest, and a motion vector v whose start point s is far from the point of interest is not taken into account in the calculation of the evaluation value E1.
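  • Equation (1) appears only as an image in the publication, so the following Python sketch is a hedged reconstruction of E1 from the surrounding description: each motion vector contributes its component toward the point of interest, weighted by w(d), which falls to zero beyond d1. The triangular weight shape, the default cutoff d1, and the sign convention are assumptions.

        import numpy as np

        def weight_w(d, d1=30.0):
            # Weight that decreases with distance and reaches 0 for d >= d1 (cf. FIG. 3).
            # The linear shape and the default cutoff d1 are illustrative assumptions.
            return np.clip(1.0 - d / d1, 0.0, None)

        def evaluation_value_e1(flow, point, d1=30.0):
            """Plausible form of the evaluation value E1 for one point of interest (x, y)."""
            h, w = flow.shape[:2]
            ys, xs = np.mgrid[0:h, 0:w]           # start point s of each motion vector
            dx = point[0] - xs                    # offset from s toward the point of interest
            dy = point[1] - ys
            d = np.hypot(dx, dy)
            d_safe = np.where(d == 0, 1.0, d)
            # Component of v(i, j) along the point-of-interest direction
            # (positive when the vector points toward the point of interest).
            v_toward = (flow[..., 0] * dx + flow[..., 1] * dy) / d_safe
            return float(np.sum(weight_w(d, d1) * v_toward))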
  • The convergence point extraction unit 206 searches for points at which the evaluation value E1 is maximized, and sets as convergence points P the N points that satisfy the following conditions, based on the threshold value Emin and the threshold value Dmin.
  • Here, D is the distance to the nearest convergence point.
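  • The selection rule itself is given only as a figure; the sketch below is an assumed but straightforward reading of the stated conditions: candidates are visited in decreasing order of evaluation value and kept while their value is at least Emin and their distance D to every already selected convergence point is at least Dmin. The helper name and the cap n_max are illustrative.

        import numpy as np

        def extract_convergence_points(e_map, e_min, d_min, n_max=10):
            """Greedy extraction of convergence points P from an evaluation-value map."""
            h, w = e_map.shape
            order = np.argsort(e_map, axis=None)[::-1]   # candidate pixels, strongest first
            points = []
            for idx in order:
                y, x = divmod(int(idx), w)
                if e_map[y, x] < e_min:
                    break                                # all remaining candidates are weaker
                if all(np.hypot(x - px, y - py) >= d_min for px, py in points):
                    points.append((x, y))
                    if len(points) >= n_max:
                        break
            return points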
  • FIGS. 5 to 8 are schematic diagrams for explaining the second calculation method.
  • In the first calculation method, the component of the motion vector v in the direction of the point of interest is used, whereas in the second calculation method the motion vector v itself is used.
  • In the second calculation method, each point (x, y) corresponding to each pixel in the cell image is taken as a point of interest, and an evaluation value E2 is calculated for each point of interest from the following equation (2).
  • The evaluation value E2 is calculated from the magnitude of the motion vector, the distance l from the point of interest (x, y) to the start point s of the motion vector, a weighting coefficient wd(d) corresponding to the distance d, and a weighting coefficient wl(l) corresponding to the distance l.
  • As shown in FIG. 5, the distance l is the component, in the direction orthogonal to the point-of-interest direction, of the distance between the point of interest (x, y) and the start point s, and the distance d is the component of that distance in the point-of-interest direction.
  • FIG. 6 is a schematic diagram showing the characteristic used to calculate the weighting coefficient wd(d), and FIG. 7 is a schematic diagram showing the characteristic used to calculate the weighting coefficient wl(l).
  • The characteristics of the weighting coefficients shown in FIGS. 3, 6, and 7 may be changed according to the observation target. For example, when evaluating cells with large movement, the value of the distance d at which the weighting coefficient becomes 0 is increased. When observing the inside of a single cell, such as when observing cell division, the value of the distance d at which the weighting coefficient becomes 0 may be made smaller than the size of the cell. As the value of the distance d at which the weighting coefficient becomes 0 is decreased, the amount of processing required to calculate the evaluation values E1 and E2 can be reduced.
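  • Equation (2) is likewise shown only as an image. In the sketch below, the offset from the start point s to the point of interest is decomposed into a component d along the direction of the motion vector and a component l orthogonal to it, and E2 is taken as the sum of |v|·wd(d)·wl(l); this decomposition, the combination rule, and the cutoffs d1 and l1 are interpretive assumptions rather than the publication's exact formula.

        import numpy as np

        def evaluation_value_e2(flow, point, d1=30.0, l1=10.0):
            """Plausible form of the evaluation value E2 for one point of interest (x, y)."""
            h, w = flow.shape[:2]
            ys, xs = np.mgrid[0:h, 0:w]
            ox, oy = point[0] - xs, point[1] - ys        # offset: start point s -> point of interest
            vx, vy = flow[..., 0], flow[..., 1]
            vmag = np.hypot(vx, vy)
            vmag_safe = np.where(vmag == 0, 1.0, vmag)
            ux, uy = vx / vmag_safe, vy / vmag_safe      # unit direction of each motion vector
            d = np.abs(ox * ux + oy * uy)                # offset component along the vector direction
            l = np.abs(-ox * uy + oy * ux)               # offset component orthogonal to it
            wd = np.clip(1.0 - d / d1, 0.0, None)        # weight falling to 0 beyond d1 (cf. FIG. 6)
            wl = np.clip(1.0 - l / l1, 0.0, None)        # weight falling to 0 beyond l1 (cf. FIG. 7)
            return float(np.sum(wd * wl * vmag))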
  • FIG. 9 shows a state in which the convergence points P based on the evaluation value E2 obtained by the above method are visualized and displayed on the display device 400.
  • The larger the evaluation value E2 of a convergence point, the larger the radius of the circle C drawn at the position of the convergence point P.
  • The magnitude of the evaluation value E2 may also be visualized by the density (darkness) of the circle C.
  • In that case, the larger the evaluation value E2 of a convergence point, the darker the circle C is drawn.
  • The magnitude of the evaluation value E2 may also be visualized by color, for example by drawing the circle C in a brighter color for a larger evaluation value E2.
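  • As an illustration of this kind of display (the drawing routine and the scaling constants below are assumptions, not taken from the publication), each convergence point can be overlaid as a circle whose radius and darkness scale with its evaluation value:

        import cv2

        def draw_convergence_points(image, points, values, r_min=5, r_max=40):
            """Overlay convergence points P as circles C scaled by evaluation value."""
            out = image.copy()
            v_max = max(values) if len(values) > 0 else 1.0
            for (x, y), v in zip(points, values):
                scale = v / v_max if v_max > 0 else 0.0
                radius = int(r_min + scale * (r_max - r_min))   # larger circle for stronger convergence
                shade = int(255 * (1.0 - scale))                # darker circle for stronger convergence
                cv2.circle(out, (int(x), int(y)), radius, (shade, shade, 255), 2)
            return out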
  • The threshold values Emin and Dmin can be adjusted as appropriate so that N convergence points P exist in the image.
  • As shown in FIG. 9, an indicator 410 indicating the values of the threshold values Emin and Dmin is displayed; when the user operates the operation input device 500, the indicator 410 changes and the values of the threshold values Emin and Dmin can be adjusted.
  • When the value of the threshold Emin is changed, the number of convergence points P displayed in FIG. 9 increases or decreases; the number of convergence points decreases as the value of Emin increases.
  • Likewise, when the value of the threshold Dmin is changed, the number of convergence points P displayed in FIG. 9 increases or decreases; the number of convergence points decreases as the value of Dmin increases.
  • The user can thus adjust the number of convergence points appropriately by changing the values of the threshold values Emin and Dmin indicated by the indicator 410. For example, in the process in which one cell divides into two, there are two convergence points P if division proceeds normally. The user can therefore adjust the number of convergence points P to two by changing the values of Emin and Dmin via the indicator 410, so that weak convergence points caused by factors such as errors are not displayed. Conversely, if three convergence points P remain while observing cell division even after the indicator 410 is adjusted, it can be recognized that abnormal cell division has occurred.
  • FIG. 10 is a schematic diagram showing the motion vector of each pixel in the image. As shown in FIG. 10, the angle θ indicating the direction of a motion vector is measured with respect to the horizontal direction of the image.
  • FIG. 11 is a schematic diagram illustrating an example in which a plurality of motion vectors are combined.
  • A histogram may be generated from each individual motion vector, or from a combined vector as shown in FIG. 11.
  • FIG. 12 is a schematic diagram showing a motion direction histogram; it shows the frequency f for each angle θ over the motion vectors of the pixels in the image.
  • The range from which the motion vectors are extracted may be the entire image or a specific range of the image (for example, the range of one cell).
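  • A minimal sketch of such a motion direction histogram is shown below; the bin count, the small-magnitude threshold, and the optional mask used to restrict extraction to a region such as a single cell are illustrative choices.

        import numpy as np

        def motion_direction_histogram(flow, n_bins=36, mask=None):
            """Frequency f of motion-vector directions theta, measured from the image horizontal."""
            vx, vy = flow[..., 0], flow[..., 1]
            if mask is not None:                      # e.g. restrict to the range of one cell (frame A)
                vx, vy = vx[mask], vy[mask]
            mag = np.hypot(vx, vy)
            moving = mag > 0.1                        # ignore near-zero vectors
            theta = np.degrees(np.arctan2(vy[moving], vx[moving])) % 360.0
            f, edges = np.histogram(theta, bins=n_bins, range=(0.0, 360.0))
            return f, edges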
  • FIG. 13 visualizes the spread of motion vector directions by plotting, in polar coordinates, the motion vector angle θ in the circumferential direction and the frequency f in the radial direction, based on the histogram of FIG. 12.
  • FIG. 14 shows the histogram of FIG. 13 calculated and displayed in time series along the time axis t.
  • FIG. 15 is a 3D display of the histogram of FIG. 13 in the space (θ, f, t). According to the display examples shown in FIGS. 14 and 15, it is possible to visually recognize how the direction of the motion vectors changes with time.
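  • The following matplotlib sketch reproduces the spirit of these displays: a polar histogram for one frame and a (θ, t) map of how the frequency f evolves over time; the axis choices and plotting details are illustrative.

        import numpy as np
        import matplotlib.pyplot as plt

        def plot_direction_histograms(histograms, edges):
            """Polar histogram for the latest frame and a time-series map of frequency f."""
            centers = np.radians((edges[:-1] + edges[1:]) / 2.0)
            widths = np.diff(np.radians(edges))
            fig = plt.figure(figsize=(9, 4))
            ax1 = fig.add_subplot(1, 2, 1, projection='polar')   # cf. FIG. 13
            ax1.bar(centers, histograms[-1], width=widths)
            ax1.set_title('motion direction histogram (polar)')
            ax2 = fig.add_subplot(1, 2, 2)                        # cf. FIG. 14: f over (theta, t)
            ax2.imshow(np.array(histograms), aspect='auto', origin='lower',
                       extent=[0.0, 360.0, 0, len(histograms)])
            ax2.set_xlabel('theta [deg]')
            ax2.set_ylabel('time step t')
            plt.tight_layout()
            plt.show()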
  • From the motion direction histogram, it is possible to identify the direction of cell division, the direction in which a cell moves, the orientation of the cell tissue, and the like. Furthermore, by evaluating the motion direction histogram together with the position of the convergence point P, the direction of cell division, the direction of cell movement, the orientation of the cell tissue, and the like can be identified with high accuracy.
  • FIG. 16 is a schematic diagram showing the course of cell division.
  • During division, the chromosomes are pulled in the direction of the spindle plate by microtubules, one of the cytoskeletal components, and are thereby separated to the left and right. The spindle plate therefore corresponds to the convergence point P.
  • Transport of intracellular organelles, such as intracellular vesicle transport, is performed along microtubules not only during division.
  • Accordingly, the polarity of division can be evaluated by evaluating the direction of movement of the chromosomes and other intracellular organelles.
  • The direction of movement of intracellular organelles is constrained by the orientation of the cytoskeleton (microtubules).
  • FIG. 17 is a schematic diagram showing the result of analyzing the position of the convergence point P during cell division and the motion direction histogram.
  • FIG. 17 shows both a normal histogram as shown in FIG. 12 and a polar coordinate display histogram as shown in FIG. 13 as the motion direction histogram.
  • FIG. 17 shows how these histograms and images change with time (0 min ⁇ 5 min ⁇ 10 min ⁇ 15 min).
  • In this example, one cell at 0 minutes (0 min) has divided into two cells in the horizontal direction (0 to 180 degrees) after 15 minutes.
  • The size of the circle C indicating the convergence point P is set according to the strength of the convergence point P, as shown in FIG. 9.
  • The rectangular frame A shown in the original image indicates the range from which the motion vectors are extracted.
  • The size and shape of the cell are tracked by the tracking unit 209, and the frame A is set to a range slightly smaller than the size of the cell.
  • The shape of a cell can be obtained using machine learning or image processing.
  • Alternatively, the frame A may be set by the user operating the operation input device 500.
  • Until 15 minutes have elapsed, the peak of the motion vector frequency is maintained in the 0 to 180 degree direction in which the cell divides, and the frequency in the 90 to 270 degree direction is at a minimum. It can therefore be seen from the motion direction histogram that division occurs in the horizontal direction (0 to 180 degrees). Furthermore, when 20 minutes have elapsed after the division, the motion direction histogram has changed to a different profile, so it can be seen from the motion direction histogram that the division was completed before 20 minutes had elapsed. It can also be seen that the circle C grows with time, and thus the convergence point P becomes stronger, from 0 minutes at the start of division until 10 minutes after cell division. In addition, when 20 minutes have elapsed after the division, a convergence point P (circle C) different from that of the previous division appears, and it can be seen that movement toward a new division has started.
  • In this way, the polarity of division and the position of the spindle plate can be estimated. Abnormal division, such as multipolar division, can then be evaluated from the position of the spindle plate and its spatio-temporal appearance frequency. In addition, abnormal distribution of the chromosomes can be evaluated from the movement of the chromosomes. Furthermore, by performing time-series three-dimensional evaluation as shown in FIG. 17, the duration and interval of division can be evaluated.
  • FIG. 18 is a schematic diagram showing cell migration.
  • FIG. 18 shows an example in which a plurality of cells 200 are migrating toward an attracting source 210.
  • Examples of the attracting source 210 include a nutrient source.
  • When the cell 200 is a sperm, an egg is an example of the attracting source 210.
  • The attracting source 210 is the convergence point P at which the cells 200 gather.
  • FIG. 18 shows the state in which the cells 200 migrate toward the attracting source 210.
  • In the case of invasion as well, motion vectors are generated according to the direction of invasion, so it can be evaluated in the same manner.
  • FIG. 19 is a diagram showing, in time series, how the cells 200 migrate toward the attracting source 210, and shows the states at elapsed time t1 and elapsed time t2. FIG. 19 also shows the motion direction histogram in polar coordinate display at elapsed time t1 and at elapsed time t2.
  • The circle C indicating the strength of the convergence point P grows as the cells 200 approach the attracting source 210 from time t1 to time t2.
  • In the motion direction histogram, the frequency f of motion vectors in the 0 degree direction also increases.
  • In some cases, the amount of an attracting substance, such as a nutrient, supplied from the attracting source 210 changes with time. Even in such a case, by evaluating the time-series change in the strength of the convergence point and the time-series change in the motion direction histogram, it can be analyzed, for example, whether the supply of the attracting substance changes periodically.
  • In this way, the direction of migration and invasion of the cells 200 can be evaluated.
  • In addition, the position of the release source of the attractant for migration and invasion can be estimated.
  • Furthermore, the temporal release profile of the attracting source 210 can be evaluated by performing time-series three-dimensional evaluation.
  • FIG. 20 is a schematic diagram showing cell orientation.
  • FIG. 20 shows, in time series, the transition from an adult tissue with poor orientation (time t11) to an adult tissue with good orientation (time t12) with respect to the convergence points 230 at both ends.
  • The motion direction histogram is shown in polar coordinate display at each of the times t11 and t12.
  • Each cell 220 contracts and expands in the direction of the arrow.
  • As the transition from the poorly oriented adult tissue to the well-oriented adult tissue proceeds, the directions of contraction and expansion become aligned in a certain direction, so the circle C indicating the strength of the convergence point P grows.
  • Likewise, the frequency f of motion vectors in the direction coinciding with the direction of contraction and expansion increases in the motion direction histogram as the transition from the poorly oriented adult tissue to the well-oriented adult tissue proceeds. It is therefore possible to evaluate the orientation of the adult tissue with high accuracy based on the convergence point P and the motion direction histogram.
  • In this way, the orientation of the tissue can be evaluated from the direction of cell movement.
  • Furthermore, the orientation of a cell can be evaluated from the direction of movement of intracellular structures.
  • In addition, the maturation and destruction processes of cells and tissues can be evaluated by performing time-series three-dimensional evaluation.
  • As described above, according to the present embodiment, the convergence point P can be extracted based on the motion vectors, and the convergence strength at the convergence point P can be evaluated from the evaluation values E1 and E2.
  • (1) An image processing device including: a motion vector acquisition unit that acquires motion vectors at a plurality of positions in an image; an evaluation value acquisition unit that acquires an evaluation value representing the strength with which the plurality of motion vectors converge to an arbitrary point of interest; and a convergence point extraction unit that extracts, based on the evaluation value, a convergence point at which the motion vectors converge from among the points of interest.
  • (2) The image processing device according to (1), wherein the evaluation value acquisition unit acquires the evaluation value by integrating the plurality of motion vectors for each point of interest according to the distance between each position of the plurality of motion vectors and the arbitrary point of interest in the image.
  • (3) The image processing device according to (1) or (2), wherein the convergence point extraction unit extracts, as the convergence point, a point of interest whose evaluation value is equal to or greater than a predetermined value.
  • (4) The image processing device according to any one of (1) to (3), further including a display processing unit that performs processing for displaying the strength of the convergence motion related to the convergence point according to the magnitude of the evaluation value.
  • (5) The image processing device according to (4), wherein the display processing unit displays the strength of the convergence motion related to the convergence point by changing the size of the convergence point.
  • The image processing device according to (9) or (10), further including a display processing unit that performs processing for displaying the strength of the convergence motion related to the convergence point according to the magnitude of the evaluation value, and processing for displaying the motion direction histogram.
  • (14) An image processing system including: an imaging device that captures an image including biological cells; an image processing device having a motion vector acquisition unit that acquires motion vectors at a plurality of positions in the image captured by the imaging device, an evaluation value acquisition unit that acquires an evaluation value representing the strength with which the plurality of motion vectors converge to an arbitrary point of interest, and a convergence point extraction unit that extracts, based on the evaluation value, a convergence point at which the motion vectors converge from among the points of interest; and a display device that displays the convergence point extracted by the image processing device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Analytical Chemistry (AREA)
  • Wood Science & Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Organic Chemistry (AREA)
  • Zoology (AREA)
  • Theoretical Computer Science (AREA)
  • Biotechnology (AREA)
  • Immunology (AREA)
  • Biomedical Technology (AREA)
  • Microbiology (AREA)
  • Sustainable Development (AREA)
  • Medicinal Chemistry (AREA)
  • Pathology (AREA)
  • General Engineering & Computer Science (AREA)
  • Genetics & Genomics (AREA)
  • Multimedia (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The problem addressed by the present invention is to determine a convergence point of cell movement, and also to determine the strength of the converging movement. The solution according to the invention is an image processing device including: a motion vector acquisition unit that acquires motion vectors at a plurality of positions in an image; an evaluation value acquisition unit that acquires an evaluation value representing the strength with which the plurality of motion vectors converge to a given point of interest; and a convergence point extraction unit that extracts, from among the points of interest and based on the evaluation value, a convergence point toward which the motion vectors converge. This configuration makes it possible to determine a convergence point of cell movement and to determine the strength of the converging movement.
PCT/JP2017/007193 2016-03-29 2017-02-24 Dispositif de traitement d'image, procédé de traitement d'image et système de traitement d'image WO2017169397A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-064880 2016-03-29
JP2016064880A JP2017175965A (ja) 2016-03-29 2016-03-29 画像処理装置、画像処理方法、及び画像処理システム

Publications (1)

Publication Number Publication Date
WO2017169397A1 true WO2017169397A1 (fr) 2017-10-05

Family

ID=59963136

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/007193 WO2017169397A1 (fr) 2016-03-29 2017-02-24 Dispositif de traitement d'image, procédé de traitement d'image et système de traitement d'image

Country Status (2)

Country Link
JP (1) JP2017175965A (fr)
WO (1) WO2017169397A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018083984A1 (fr) * 2016-11-02 2018-05-11 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations
WO2024010020A1 (fr) * 2022-07-08 2024-01-11 王子ホールディングス株式会社 Feuillet cellulaire cultivé et son procédé de fabrication; procédé d'évaluation d'un composé ou d'un médicament; et procédé d'évaluation de la qualité d'un feuillet cellulaire cultivé

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019135268A1 (fr) * 2018-01-04 2019-07-11 オリンパス株式会社 Dispositif de traitement d'image de cellules, procédé de traitement d'image de cellules et programme de traitement d'image de cellules


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005521126A (ja) * 2002-02-13 2005-07-14 レイファイ コーポレイション 空間時間信号の取得、圧縮、及び特徴づけのための方法及び装置
JP2009261798A (ja) * 2008-04-28 2009-11-12 Olympus Corp 画像処理装置、画像処理プログラムおよび画像処理方法
JP2014075999A (ja) * 2012-10-10 2014-05-01 Nikon Corp 心筋細胞の運動検出方法、画像処理プログラム及び画像処理装置、心筋細胞の培養方法、心筋細胞の薬剤評価方法及び薬剤製造方法
WO2014102449A1 (fr) * 2012-12-27 2014-07-03 Tampereen Yliopisto Analyse visuelle des cardiomyocytes
WO2014185169A1 (fr) * 2013-05-16 2014-11-20 ソニー株式会社 Dispositif de traitement d'image, méthode de traitement d'image, et programme
JP2015019620A (ja) * 2013-07-19 2015-02-02 ソニー株式会社 画像処理装置および方法、並びにプログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAYAKAWA, T. ET AL.: "Image-based evaluation of contraction-relaxation kinetics of human-induced pluripotent stem cell-derived cardiomyocytes: Correlation and complementarity with extracellular electrophysiology", J. MOL. CELL. CARDIOL., vol. 77, 2014, pages 178-191, XP055425154, ISSN: 0022-2828 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018083984A1 (fr) * 2016-11-02 2018-05-11 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations
JPWO2018083984A1 (ja) * 2016-11-02 2019-09-19 ソニー株式会社 情報処理装置、情報処理方法及び情報処理システム
JP7001060B2 (ja) 2016-11-02 2022-01-19 ソニーグループ株式会社 情報処理装置、情報処理方法及び情報処理システム
US11282201B2 (en) 2016-11-02 2022-03-22 Sony Corporation Information processing device, information processing method and information processing system
WO2024010020A1 (fr) * 2022-07-08 2024-01-11 王子ホールディングス株式会社 Feuillet cellulaire cultivé et son procédé de fabrication; procédé d'évaluation d'un composé ou d'un médicament; et procédé d'évaluation de la qualité d'un feuillet cellulaire cultivé

Also Published As

Publication number Publication date
JP2017175965A (ja) 2017-10-05

Similar Documents

Publication Publication Date Title
JP6078943B2 (ja) 画像処理装置および方法、並びに、プログラム
Mamiya et al. Neural coding of leg proprioception in Drosophila
JP5828210B2 (ja) 画像処理装置および方法、並びに、プログラム
JP2018038411A (ja) 心筋細胞評価装置および方法、並びに、プログラム
JP7067588B2 (ja) 情報処理装置、プログラム、情報処理方法及び観察システム
US10494598B2 (en) Observation image capturing and evaluation device, method, and program
JP2012504277A (ja) 毛包単位の追跡
WO2017169397A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et système de traitement d'image
WO2017061155A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, système de traitement d'informations
US10929985B2 (en) System and methods for tracking motion of biological cells
JPWO2017154318A1 (ja) 情報処理装置、情報処理方法、プログラム及び情報処理システム
Elias et al. Measuring and quantifying dynamic visual signals in jumping spiders
JP6217968B2 (ja) 画像処理装置および方法、並びにプログラム
JP6746945B2 (ja) 情報処理装置、情報処理システム及び情報処理方法
Flotho et al. Lagrangian motion magnification revisited: Continuous, magnitude driven motion scaling for psychophysiological experiments
JP6436142B2 (ja) 画像処理装置および方法、並びに、プログラム
JP6191888B2 (ja) 画像処理装置および方法、並びに、プログラム
Tang et al. Measurement of displacement in isolated heart muscle cells using markerless subpixel image registration
Ye et al. Analysis of the Dynamical Biological Objects of Optical Microscopy
US20200020099A1 (en) Information processing device, information processing system, and information processing method
Luo et al. A comparison of modified evolutionary computation algorithms with applications to three-dimensional endoscopic camera motion tracking
McIlroy et al. In vivo classification of inflammation in blood vessels with convolutional neural networks
KR102225721B1 (ko) 세포간 동기화 분석 방법 및 장치
Loftus Fine spatiotemporal calcium signals and kinematic properties revealed by motion-corrected calcium images of contracting myometrium
WO2017061080A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17773942

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17773942

Country of ref document: EP

Kind code of ref document: A1