WO2017061080A1 - Information processing device, information processing method, and information processing system

Info

Publication number: WO2017061080A1
Authority: WIPO (PCT)
Prior art keywords: image, region, information processing, tracking points, processing device
Application number: PCT/JP2016/004345
Other languages: English (en)
Inventor: Shinji Watanabe
Original Assignee: Sony Corporation
Priority claimed from: JP2016150604A (published as JP6720756B2)
Application filed by: Sony Corporation
Priority to: US15/765,103 (published as US10929985B2)
Priority to: CN201680058029.2A (published as CN108140241A)
Priority to: EP16779193.8A (published as EP3360113A1)
Publication: WO2017061080A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20112: Image segmentation details
    • G06T 2207/20116: Active contour; Active surface; Snakes
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30004: Biomedical image processing
    • G06T 2207/30024: Cell structures in vitro; Tissue sections in vitro

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and an information processing system.
  • PTL 1 discloses a technology in which a biological tissue such as an organ is set as an observation target, and then tracking points are disposed for a partial region of the observation target in an image obtained using ultrasonic waves or computed tomography (CT) to track movements of the tracking points.
  • When an observation target is a cell rather than a biological tissue such as an organ, growth, motions, and the like of the cell cause its shape to change significantly in a short period of time. For this reason, if motions of the tracking points disposed once are merely tracked as in the technology disclosed in PTL 1 mentioned above, it is difficult to track changes in the shape of portions in which no tracking points are disposed.
  • the present disclosure proposes a novel and improved information processing device, information processing method, and information processing system which enable tracking of changes in the shape of an observation target with high accuracy.
  • an information processing device includes circuitry configured to dispose a plurality of tracking points within a first region of a first image and set a second region of a second image based on estimated positions of the plurality of tracking points in the second image. The estimated positions are determined by comparing the first image and the second image and the second image is captured at a different time point than the first image.
  • the circuitry is further configured to re-dispose the plurality of tracking points within the second region of the second image.
  • an information processing method includes disposing a plurality of tracking points within a first region of a first image and setting a second region of a second image based on estimated positions of the plurality of tracking points in the second image.
  • the estimated positions are determined by comparing the first image and the second image and the second image is captured at a different time point than the first image.
  • the information processing method further includes re-disposing the plurality of tracking points within the second region of the second image.
  • an information processing system includes an imaging device configured to generate a plurality of images including a first image and a second image.
  • the information processing system further includes circuitry configured to dispose a plurality of tracking points within a first region of a first image and set a second region of a second image based on estimated positions of the plurality of tracking points in the second image. The estimated positions are determined by comparing the first image and the second image and the second image is captured at a different time point than the first image.
  • the circuitry is further configured to re-dispose the plurality of tracking points within the second region of the second image.
  • FIG. 1 is a diagram showing an overview of a configuration of an information processing system according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing a functional configuration example of the information processing device according to the embodiment.
  • FIG. 3 is a flowchart showing an example of a process performed by the information processing device according to the embodiment.
  • FIG. 4 is a diagram showing an example of display of a captured image in an initial setting process according to the embodiment.
  • FIG. 5 is a diagram showing an example of display of the captured image in the initial setting process and an initial region to be noted according to the embodiment.
  • FIG. 6 is a diagram showing an example of disposition of tracking points in the initial setting process according to the embodiment.
  • FIG. 7 is a diagram showing an example of setting of a tracking region and a search range in a tracking process according to the embodiment.
  • FIG. 8 is a diagram showing an example of calculation of motion vectors and an estimation example of tracking points in the tracking process according to the embodiment.
  • FIG. 9 is a diagram showing an example of setting of a region to be noted in the tracking process according to the embodiment.
  • FIG. 10 is a diagram showing an example of re-disposition of tracking points in the tracking process according to the embodiment.
  • FIG. 11 is a diagram for describing a first application example of a disposition process by a disposition unit according to the embodiment.
  • FIG. 12 is a diagram for describing a second application example of a disposition process by a disposition unit according to the embodiment.
  • FIG. 13 is a diagram for describing a third application example of a disposition process by a disposition unit according to the embodiment.
  • FIG. 14 is a diagram for describing an application example of an estimation process by an estimation unit according to the embodiment.
  • FIG. 15 is a block diagram showing a functional configuration example of an information processing device according to a modified example of the embodiment.
  • FIG. 16 is a flowchart showing an example of a process by the information processing device according to the modified example.
  • FIG. 17 is a diagram for describing a first application example of the information processing device (to a nerve cell) according to the embodiment.
  • FIG. 18 is a diagram showing an example of an initial setting process of an axon by the information processing device according to the embodiment.
  • FIG. 19 is a diagram showing an example of a tracking process of an axon by the information processing device according to the embodiment.
  • FIG. 20 is a diagram for describing a second application example of the information processing device (to a zebrafish) according to the embodiment.
  • FIG. 21 is a diagram for describing a third application example of the information processing device (to a colony) according to the embodiment.
  • FIG. 22 is a diagram for describing a fourth application example of the information processing device (to a macrophage and a foreign body) according to the embodiment.
  • FIG. 23 is a block diagram showing a hardware configuration example of an information processing device according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram showing an overview of a configuration of an information processing system 1 according to an embodiment of the present disclosure.
  • the information processing system 1 is provided with an imaging device 10 and an information processing device 20.
  • the imaging device 10 and the information processing device 20 are connected to each other via various types of wired or wireless networks.
  • the imaging device 10 is a device which generates captured images (dynamic images).
  • the imaging device 10 is realized by, for example, a digital camera.
  • the imaging device 10 may be realized by any type of device having an imaging function, for example, a smartphone, a tablet, a game device, or a wearable device.
  • the imaging device 10 images real spaces using various members, for example, an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), a lens for controlling formation of a subject image in the image sensor, and the like.
  • the imaging device 10 includes a communication device for transmitting and receiving captured images and the like to and from the information processing device 20.
  • the imaging device 10 is provided above an imaging stage S to image a culture medium M in which a cell that is an observation target is cultured.
  • the imaging device 10 generates dynamic image data by imaging the culture medium M at a specific frame rate.
  • the imaging device 10 may directly image the culture medium M (without involving another member), or may image the culture medium M via another member such as a microscope.
  • Although the frame rate is not particularly limited, it is desirable to set the frame rate according to the degree of change of the observation target.
  • the imaging device 10 images a given imaging region including the culture medium M in order to accurately track a change of the observation target. Dynamic image data generated by the imaging device 10 is transmitted to the information processing device 20.
  • Although the imaging device 10 is assumed to be a camera installed in an optical microscope or the like in the present embodiment, the present technology is not limited thereto.
  • For example, the imaging device 10 may be an imaging device included in an electron microscope using electron beams, such as a scanning electron microscope (SEM) or a transmission electron microscope (TEM), or an imaging device included in a scanning probe microscope (SPM) using a probe, such as an atomic force microscope (AFM) or a scanning tunneling microscope (STM).
  • A captured image generated by the imaging device 10 is, for example, an image obtained by irradiating the observation target with electron beams in the case of an electron microscope, or an image obtained by tracing the observation target with a probe in the case of an SPM.
  • These captured images can also be analyzed by the information processing device 20 according to the present embodiment.
  • the information processing device 20 is a device having an image analyzing function.
  • the information processing device 20 is realized by any type of device having an image analyzing function such as a personal computer (PC), a tablet, or a smartphone.
  • the information processing device 20 may be realized by one or a plurality of information processing devices on a network.
  • the information processing device 20 acquires a captured image from the imaging device 10 and executes tracking of a region of the observation target in the acquired captured image.
  • the result of analysis of the tracking process performed by the information processing device 20 is output to a storage device or a display device provided inside or outside the information processing device 20. Note that a functional configuration that realizes each function of the information processing device 20 will be described below.
  • Although the information processing system 1 is constituted by the imaging device 10 and the information processing device 20 in the present embodiment, the present technology is not limited thereto.
  • For example, the imaging device 10 may perform a process related to the information processing device 20 (for example, a tracking process).
  • In that case, the information processing system 1 is realized by an imaging device having the function of tracking an observation target.
  • a cell set as an observation target undergoes various phenomena such as growth, division, combination, deformation, or necrosis in a short period of time, unlike a normal subject such as a human, an animal, a plant, or a non-living structure.
  • the shape of the cell can significantly change in a short period of time. For this reason, even if the shape of a cell that is an observation target in a captured image is tracked using the technology disclosed in JP 5508035B, for example, when the shape of a part in which no tracking points are disposed changes, it is not possible to track the change in the shape of that part. Thus, it is difficult to track the change in the shape of the cell with high accuracy.
  • The same difficulty arises when the observation target is an animal, a plant, or a non-living structure, if the observation target shows a remarkable change in its structure or shape in a short period of time, for example, growth of a thin film or a nano-cluster crystal.
  • a plurality of tracking points are disposed for a region to be noted set in a captured image, positions of tracking points in another captured image that is captured at a different time point are estimated, the region to be noted is re-set based on the tracking points at the estimated positions, and further tracking points are re-disposed for the re-set region to be noted.
  • this technology makes it possible to re-dispose tracking points for tracking a change of the region at proper positions in each captured frame. Accordingly, positions of the tracking points can be properly adjusted following the change in the shape of the cell, and thus it is possible to track the change in the shape of the cell with high accuracy, regardless of the degree of the change in the shape of the cell.
  • the overview of the information processing system 1 according to an embodiment of the present disclosure has been described above.
  • the information processing device 20 included in the information processing system 1 according to an embodiment of the present disclosure is realized in the following embodiment. A specific configuration example and an operation process of the information processing device 20 will be described below.
  • FIG. 2 is a block diagram showing a functional configuration example of the information processing device 20 according to an embodiment of the present disclosure.
  • the information processing device 20 includes an image acquisition unit 210, an initial setting unit 215, a disposition unit 220, an estimation unit 230, a setting unit 240, and an output control unit 250.
  • the image acquisition unit 210 acquires captured image data generated by the imaging device 10 via a communication device that is not illustrated. For example, the image acquisition unit 210 acquires dynamic image data generated by the imaging device 10 in a time series manner.
  • images that the image acquisition unit 210 acquires include an RGB image, a grayscale image, or the like.
  • When an acquired image is an RGB image, the image acquisition unit 210 converts the captured image that is the RGB image into a grayscale image.
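  • As an illustration, with OpenCV (one possible library; the disclosure does not name an implementation) this conversion could look like the following sketch, where "frame.png" is a hypothetical file name:

```python
import cv2

# Load one frame of the dynamic image data and, if it is a color (BGR) image,
# convert it to grayscale before the tracking process.
frame = cv2.imread("frame.png", cv2.IMREAD_COLOR)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
```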
  • the image acquisition unit 210 outputs the acquired captured image data to the initial setting unit 215 or the estimation unit 230.
  • the image acquisition unit 210 outputs one piece of image data of the acquired captured image data to the initial setting unit 215 for an initial setting of a region to be noted by the initial setting unit 215 to be described below.
  • In addition, for estimation by the estimation unit 230, the image acquisition unit 210 outputs captured image data that is different from the captured image data used in the tracking process executed one time before.
  • Here, the tracking process means a series of processes performed by the disposition unit 220, the estimation unit 230, and the setting unit 240 as shown in FIG. 2. This tracking process can be repetitively executed on captured image data included in dynamic image data acquired by the image acquisition unit 210.
  • the initial setting unit 215 has a function of performing an initial setting of a region to be noted for the captured image acquired from the image acquisition unit 210 when a tracking process by the information processing device 20 has not started.
  • a region to be noted mentioned in the present specification means a region that is subject to a tracking process within a captured image.
  • the region to be noted is a region corresponding to an observation target of a captured cell or the like. That is, in a series of tracking processes, a region to be noted set by the setting unit 240 to be described below is a tracking result of a region that corresponds to an observation target.
  • a region to be noted set by the initial setting unit 215 may not completely coincide with a region that corresponds to an observation target within a captured image. However, in order to track a region that corresponds to an observation target with high accuracy, it is desirable to properly set a region to be noted.
  • the region to be noted in the present specification may be a region expressed using, for example, an open curve (including a straight line), or may be a region surrounded by a closed curve (a curve whose starting point and ending point match).
  • For example, a plurality of closed regions or a figure-eight-shaped region may be set through an operation of a user.
  • For example, a region to be noted may be set by the initial setting unit 215 through an operation of a user.
  • a region to be noted may be set by a user operating an input device such as a mouse, a touch pen, or a touch panel that is not illustrated to trace the outer circumference of an observation target (for example, the contour of a cell) within a captured image displayed on a display device (such as a display) that is not illustrated. Accordingly, a region that the user desires can be set as a region to be noted.
  • the initial setting unit 215 may set a region to be noted as a region surrounded by a closed curve through an interpolation process or the like.
  • a region to be noted may be automatically set by the initial setting unit 215 through an image analysis process.
  • the initial setting unit 215 may set a region to be noted using an image analysis technique such as a binary image transform, a Hough transform, or machine learning. Accordingly, a burden of a user caused by an initial setting of a region to be noted can be reduced.
  • the disposition unit 220 has a function of disposing a plurality of tracking points for the region to be noted.
  • a tracking point mentioned in the present specification is a point disposed to correspond to a region to be noted set in a given captured image.
  • For example, tracking points are disposed at predetermined intervals on a line or a contour defining a region to be noted.
  • the estimation unit 230 to be described below estimates positions of the tracking points in another captured image captured at a different time point from the captured image used when the region to be noted is set. By estimating the positions of these tracking points in a time series manner, a change in the shape of a cell can be tracked.
  • When the region to be noted is defined by an open curve, the disposition unit 220 disposes tracking points at the respective terminal points of the open curve.
  • When the region to be noted is defined by a closed curve, the disposition unit 220 need not dispose a tracking point at any specific position on the closed curve.
  • For example, the disposition unit 220 may dispose a tracking point at the starting point (or the ending point) of the closed curve. The tracking points can thereby be disposed at positions that the user desires.
  • the number of tracking points disposed and disposition intervals between them may be decided according to the type of observation target, or the shape of the observation target. For example, when the shape of a cell that is an observation target significantly changes, it is desirable to increase the number of tracking points disposed and decrease the disposition intervals. Accordingly, even when the shape of the cell significantly changes, the change in the shape can be tracked with high accuracy. In addition, in order to reduce a burden of calculation, it is desirable to reduce the number of tracking points disposed and increase the disposition intervals.
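  • As a concrete illustration of disposing tracking points at equal intervals, the following sketch (the function name dispose_equally_spaced is hypothetical, not from the disclosure) places n_points at equal arc-length spacing along a closed contour given as an (M, 2) NumPy array:

```python
import numpy as np

def dispose_equally_spaced(contour, n_points):
    """Dispose n_points tracking points at equal arc-length intervals
    along a closed contour given as an (M, 2) array of (x, y) vertices."""
    closed = np.vstack([contour, contour[:1]])       # close the polyline
    seg = np.diff(closed, axis=0)
    arclen = np.concatenate([[0.0], np.cumsum(np.hypot(seg[:, 0], seg[:, 1]))])
    targets = np.linspace(0.0, arclen[-1], n_points, endpoint=False)
    x = np.interp(targets, arclen, closed[:, 0])
    y = np.interp(targets, arclen, closed[:, 1])
    return np.stack([x, y], axis=1)
```

The same routine can be reused for the re-disposition described later, since re-disposition also places points along the newly set closed curve.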
  • the disposition unit 220 disposes tracking points for the region to be noted set by the initial setting unit 215.
  • In addition, when the tracking process is executed, the disposition unit 220 re-disposes the tracking points with regard to the region to be noted set by the setting unit 240. Accordingly, tracking points can be properly disposed for each tracking process. For example, when the tracking points are not re-disposed, it is not possible to track a change in the shape of a part of the cell that is an observation target in which no tracking points are disposed. According to the present embodiment, the tracking points are re-disposed in each tracking process by the disposition unit 220 with proper intervals for the region to be noted set by the setting unit 240 one time before. Thus, even if the shape of the cell significantly changes, it is possible to confine the discrepancy between the region to be noted and the region corresponding to the cell to a minimum level. Therefore, the region corresponding to the cell can be tracked with high accuracy.
  • the disposition unit 220 may re-dispose at least one tracking point among the tracking points previously disposed at the same position as that estimated by the estimation unit 230 to be described below. This is because, when the observation target included in a peripheral region of the tracking points disposed by the disposition unit 220 does not show a significant change in its features, for example, re-disposition of a tracking point at the same position improves accuracy in the tracking process.
  • Information with regard to the tracking point disposed (re-disposed) by the disposition unit 220 is output to the estimation unit 230 along with information of the captured image used in the setting of the region to be noted.
  • the estimation unit 230 has a function of estimating, based on comparison of the captured image used for setting the region to be noted (hereinafter referred to as a first captured image) to another captured image acquired from the image acquisition unit 210 (hereinafter referred to as a second captured image), positions of the tracking points of the region to be noted disposed in a first captured image in a second captured image.
  • the second captured image refers to a captured image of which a capturing time point is different from that of the first captured image.
  • the second captured image may be, for example, a captured image of any frame among a few frames before and after the frame of the first captured image. To be more specific, the second captured image may be a captured image generated one frame after the first captured image.
  • the capturing time point of the second captured image (the number of frames between the first captured image and the second captured image) that is an estimation target may be designated according to a state, a change, or the like of the observation target.
  • the estimation unit 230 may estimate positions of the tracking points based on, for example, a motion vector calculated by comparing the first captured image to the second captured image.
  • This motion vector may be a motion vector calculated for each tracking point.
  • the motion vector may be calculated using a technique such as block matching, or a gradient method. In the present specification, the estimation unit 230 is described as estimating the motion vector using block matching.
  • More specifically, the estimation unit 230 may estimate the positions of the tracking points in the second captured image by searching a predetermined search range of the second captured image for a region whose pixel information matches the pixel information included in the tracking region of the first captured image.
  • The sizes of the tracking region and the search range may be decided according to an imaging condition (for example, an imaging magnification) of the imaging device 10, the type of the observation target, or the type of analysis performed on the observation target.
  • For example, the tracking region or the search range may be set to be larger; accordingly, the accuracy of estimation of the tracking points by the estimation unit 230 can be enhanced. Conversely, the tracking region or the search range may be adjusted to be small in order to reduce the calculation load.
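  • A minimal block matching sketch (assumptions: grayscale frames as NumPy arrays and a sum-of-absolute-differences cost, which is one common choice; the disclosure does not fix a cost function):

```python
import numpy as np

def estimate_shift(img1, img2, point, block=15, search=10):
    """Estimate the motion vector (dx, dy) of one tracking point by block
    matching: the tracking region around the point in img1 is compared
    against candidate regions inside the search range of img2."""
    x, y = int(point[0]), int(point[1])
    r = block // 2
    tmpl = img1[y - r:y + r + 1, x - r:x + r + 1].astype(np.float32)
    best_cost, best = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy - r < 0 or xx - r < 0:             # candidate outside image
                continue
            cand = img2[yy - r:yy + r + 1, xx - r:xx + r + 1].astype(np.float32)
            if cand.shape != tmpl.shape:
                continue
            cost = np.abs(cand - tmpl).sum()         # sum of absolute differences
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best
```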
  • the estimation unit 230 may estimate a position of a tracking point in the second captured image generated at an imaging time point decided based on information of the observation target.
  • When a change in the shape of a cell whose shape changes slowly is tracked, for example, the difference between captured images of a plurality of consecutive frames generated by the imaging device 10 is small. For this reason, in such a case, the estimation unit 230 may perform the estimation process with a captured image a number of frames before or after the frame of the first captured image as the second captured image.
  • For example, the estimation unit 230 may perform the estimation process with a captured image a number of frames after the first captured image as the second captured image.
  • Widening the frame interval between the first captured image and the second captured image enables the amount of captured image data subjected to the tracking process to be reduced. Accordingly, it is possible to reduce the calculation load and to track a change in the shape of the cell over a long period of time.
  • the frame interval can be appropriately set according to the type, a state, or the like of the cell.
  • the estimation unit 230 outputs information related to the estimated positions of the tracking points to the setting unit 240.
  • the setting unit 240 has a function of setting a region to be noted in the second captured image based on the positions of the tracking points in the second captured image acquired from the estimation unit 230.
  • the setting unit 240 may set a region to be noted by, for example, performing interpolation on a closed curve (or an open curve when the original region to be noted is depicted using an open curve) that passes through the position of each tracking point estimated by the estimation unit 230.
  • As a method of interpolating the closed curve, a known interpolation method, for example, Bezier curve interpolation or spline curve interpolation, is used.
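  • For instance, with SciPy a closed curve through the estimated tracking points can be drawn by periodic spline interpolation as follows (a sketch of the spline option named above; interpolate_region is a hypothetical name):

```python
import numpy as np
from scipy.interpolate import splprep, splev

def interpolate_region(points, n_samples=200):
    """Set a region to be noted by drawing a closed (periodic) spline
    through the tracking points and returning it as a dense polyline."""
    pts = np.vstack([points, points[:1]])            # close the loop explicitly
    tck, _ = splprep([pts[:, 0], pts[:, 1]], s=0, per=True)
    u = np.linspace(0.0, 1.0, n_samples)
    x, y = splev(u, tck)
    return np.stack([x, y], axis=1)
```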
  • the setting unit 240 may set a region to be noted using an image analysis result of a captured image, in addition to the estimated positions of respective tracking points. For example, when the number of regions to be noted increases or decreases because the cell that is an observation target undergoes division or combination, it is difficult to set a region to be noted with high accuracy using only the positions of respective tracking points.
  • the setting unit 240 can adjust an interpolation curve set based on each tracking point by performing image analysis on the captured image, for example, edge analysis or density analysis. More specifically, when the cell being observed divides, the setting unit 240 may detect the division of the cell through edge analysis or the like and set a region to be noted using the result of the detection. Accordingly, the number of regions to be noted and their positions can be appropriately set according to the division of the cell.
  • the setting unit 240 outputs information with regard to the set region to be noted to the disposition unit 220 and the output control unit 250. Note that, when a series of tracking processes ends, the setting unit 240 may not output the information to the disposition unit 220.
  • the output control unit 250 has a function of outputting various kinds of information obtained in the series of tracking processes, such as information of the region to be noted acquired from the setting unit 240.
  • the output control unit 250 may output, for example, a result of disposition of the tracking points by the disposition unit 220, a result of estimation of the positions of the tracking points by the estimation unit 230, a result of setting of the region to be noted by the setting unit 240, or the like.
  • An output aspect of the output control unit 250 is not particularly limited.
  • For example, the output control unit 250 may display a captured image on a display device that is not illustrated, and superimpose various kinds of information about the tracking points or the region to be noted on the displayed captured image.
  • the output control unit 250 may store the various kinds of information in a storage device that is not illustrated, or transmit such information to an external device using a communication device that is not illustrated.
  • FIG. 3 is a flowchart showing the example of the process performed by the information processing device 20 according to an embodiment of the present disclosure.
  • The process according to the present embodiment consists of an initial setting process (Steps S101 to S105 of FIG. 3) and a tracking process (Steps S107 to S121 of FIG. 3).
  • the image acquisition unit 210 acquires dynamic image data from the imaging device 10 and outputs one captured image out of the dynamic image data to the initial setting unit 215 (S101).
  • FIG. 4 is a diagram showing an example of display of a captured image in the initial setting process according to the present embodiment. As shown in FIG. 4, the captured image P1 is displayed on the display unit D, and a cell image C1 is included in the captured image P1.
  • the initial setting unit 215 sets an initial region to be noted for the cell image C1 displayed in the captured image P1 (S103 of FIG. 3).
  • FIG. 5 is a diagram showing an example of display of the captured image in the initial setting process and the initial region to be noted according to the present embodiment.
  • the initial setting unit 215 draws a closed curve 1010 around the cell image C1, and sets the region surrounded by the closed curve 1010 as an initial region to be noted 1001.
  • Although the initial region to be noted 1001 is set by drawing the closed curve 1010 along the contour of the cell image C1 in the example shown in FIG. 5, the initial region to be noted 1001 may also be set in a region inside or outside the cell image C1 or crossing the contour of the cell image C1.
  • the closed curve 1010 may be drawn through an operation of a user via an input device that is not illustrated, or may be drawn based on an image analysis process for the captured image P1.
  • the disposition unit 220 disposes tracking points for the initial region to be noted 1001 (S105 of FIG. 3).
  • FIG. 6 is a diagram showing an example of disposition of tracking points in the initial setting process according to the present embodiment.
  • the disposition unit 220 disposes a plurality of tracking points 1011 (1011a, 1011b, 1011c, ---) on the closed curve 1010 defining the initial region to be noted 1001.
  • Although the tracking points 1011 are disposed on the closed curve 1010 at substantially equal intervals in the example shown in FIG. 6, the disposition intervals between the tracking points are not particularly limited. A method of disposing tracking points will be described below.
  • positions at which the tracking points 1011 are disposed are not particularly limited as long as they are positions associated with the initial region to be noted 1001.
  • Because the region to be noted is set using a curve interpolating the tracking points in the setting process of the region to be noted by the setting unit 240, it is desirable to dispose the tracking points on the closed curve defining the initial region to be noted.
  • (Tracking process) The initial setting process performed by the information processing device 20 has been described above. Next, the tracking process performed by the information processing device 20 will be described.
  • the estimation unit 230 sets a tracking region around one tracking point 1011 and a search range of the tracking region (S107 of FIG. 3).
  • FIG. 7 is a diagram showing an example of setting of a tracking region and a search range thereof in a tracking process according to the present embodiment.
  • the estimation unit 230 sets a rectangular region around one tracking point 1011 as a tracking region 2001.
  • a size of the tracking region 2001 may be decided according to an imaging condition of the imaging device 10, a type of observation target, or the like.
  • the estimation unit 230 sets the search range 2011 in which the tracking region 2001 is searched.
  • a size of the search range 2011, a center position of the search range 2011, and a shape of the search range 2011 are not particularly limited.
  • a size of the search range 2011, for example, may be decided according to an imaging condition (for example, an imaging magnification) of the imaging device 10, a type of analysis performed on the observation target, or the like.
  • the estimation unit 230 next acquires another captured image that is different from the captured image used for setting the region to be noted from the image acquisition unit 210 (S109 of FIG. 3). Then, the estimation unit 230 calculates a motion vector of the tracking region corresponding to each tracking point (S111), and estimates the positions of the tracking points in the other captured image based on the calculated motion vector (S113).
  • FIG. 8 is a diagram showing an example of calculation of a motion vector and an example of estimation of positions of the tracking points in a tracking process according to the present embodiment.
  • the estimation unit 230 first acquires a captured image P2 that is the next frame of the captured image P1 in the present embodiment.
  • The captured image P2 includes a cell image C2 showing the same cell as that shown in the captured image P1, and deformation of the cell can be ascertained therefrom.
  • Note that, although FIG. 8 shows the initial region to be noted 1001 and the closed curve 1010 defining the initial region to be noted 1001, the display unit D does not display them in practice.
  • the estimation unit 230 calculates a motion vector of the tracking region corresponding to each tracking point.
  • the estimation unit 230 searches the captured image P2 for a region that includes pixel information that most closely matches pixel information included in the tracking region 2001 corresponding to the tracking point 1011a of the captured image P1.
  • Although the search range 2011 of the tracking region 2001 is set here as a rectangular region around the tracking region 2001 (the tracking point 1011a), the search range may instead be the entire captured image P2.
  • Upon specifying the matching region 2221 within the search range, the estimation unit 230 calculates a motion vector 2022 based on the specified region 2221.
  • the motion vector 2022 is computed using, for example, the block matching method. Then, the estimation unit 230 estimates the position of the respective tracking points 1011 in the captured image P2 based on the calculated motion vector 2022, and moves the tracking points 1011 to the estimated positions.
  • the setting unit 240 next sets a region to be noted based on the positions of the tracking points after the movement (S115).
  • FIG. 9 is a diagram showing an example of setting a region to be noted in a tracking process according to the present embodiment.
  • The setting unit 240 draws a closed curve 1020 that interpolates the tracking points 1021 after the movement, and sets the region surrounded by the closed curve 1020 as a region to be noted 1002 of the captured image P2. Note that, even if the initial region to be noted 1001 set in the captured image P1 coincides with the cell image C1 included in the captured image P1, the contour of the cell image C2 included in the captured image P2 does not necessarily coincide with the closed curve 1020 defining the region to be noted 1002 set by the setting unit 240.
  • the region to be noted 1002 can be set to coincide with the cell image C2 with higher accuracy by, for example, properly adjusting the number of tracking points, the disposition intervals, the tracking region, or the search range.
  • the region to be noted 1002 set in Step S115 may be modified through an operation of the user via the input device that is not illustrated.
  • the shape of the closed curve 1020 or the positions of the tracking points 1021 after the movement may be modified through an operation of the user. Accordingly, even when there is an error in the setting of the region to be noted in the tracking process, the region to be noted can be properly modified.
  • the output control unit 250 next outputs the tracking points after the movement, the region to be noted set by the setting unit 240, and the like to the display unit D (S117). Note that the processes of Steps S107 to S115 described above may be performed without being displayed on the display unit D, or the processes of the respective steps may be sequentially displayed on the display unit D.
  • the information processing device 20 determines whether or not the tracking process of Steps S107 to S117 is terminated (S119).
  • Conditions for the termination of the tracking process include, for example, completion of the tracking processes on all frames of the dynamic image data, termination of the use of the information processing device 20 by the user, and the like.
  • When the tracking process is not terminated, the disposition unit 220 re-disposes the tracking points for the region to be noted set by the setting unit 240 (S121).
  • FIG. 10 is a diagram showing an example of re-disposition of a tracking point in a tracking process according to the present embodiment.
  • When the tracking points 1021 (1021a, 1021b, 1021c, ...) on the closed curve 1020 defining the region to be noted 1002 set by the setting unit 240 move to the positions estimated by the estimation unit 230, unevenness arises in the disposition intervals of the tracking points.
  • If the positions of the tracking points 1021 in a captured image of yet another frame are estimated using the moved tracking points 1021 without change, it is difficult to track the shape of the cell in, for example, a portion in which the disposition intervals between the tracking points have widened.
  • Conversely, in a portion in which the disposition intervals have narrowed, tracking regions overlap, which causes the trackable region to be reduced as a whole, and thus the range in which the shape of the cell can be tracked is limited.
  • the disposition unit 220 deletes the tracking points 1021, and disposes tracking points 1022 (1022a, 1022b, 1022c, --- ) on the closed curve 1020 again.
  • Here, the tracking points 1022 are disposed on the closed curve 1020 at fixed intervals. Accordingly, uneven disposition of the tracking points can be prevented, and the accuracy of the tracking process can be maintained.
  • As in Step S105, the disposition intervals between the tracking points re-disposed by the disposition unit 220 are not particularly limited.
  • a method of disposing tracking points will be described below.
  • positions at which the tracking points 1022 are re-disposed are not particularly limited as long as the positions are associated with the region to be noted 1002.
  • Because a region to be noted is set using a curve interpolating the tracking points in the setting process of the region to be noted by the setting unit 240, it is desirable to re-dispose the tracking points on the closed curve defining the region to be noted.
  • the information processing device 20 repetitively executes the processes of Steps S107 to S121 described above.
  • By repeating Steps S107 to S121, that is, setting a region to be noted corresponding to the region of the sequentially changing cell, re-disposing tracking points for the set region to be noted, and estimating the positions of the re-disposed tracking points in the captured image of another frame, it is possible to track a change in the shape of the cell.
  • the initial setting unit 215 sets the region to be noted corresponding to the region of the cell for the first captured image
  • the disposition unit 220 disposes tracking points for the set region to be noted
  • the estimation unit 230 estimates positions of the disposed tracking points in the second captured image.
  • the setting unit 240 sets a region to be noted in the second captured image based on the estimated positions of the tracking points, and the disposition unit 220 re-disposes the tracking points for the set region to be noted.
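  • Putting these steps together, the repeated tracking process can be sketched as follows (a non-authoritative sketch reusing the hypothetical dispose_equally_spaced, estimate_shift, and interpolate_region helpers from the earlier sketches; frames is assumed to be a list of grayscale NumPy arrays):

```python
import numpy as np

def track_region(frames, initial_contour, n_points=32):
    """Track a region to be noted across frames (cf. Steps S107 to S121)."""
    region = interpolate_region(initial_contour)       # initial region to be noted
    points = dispose_equally_spaced(region, n_points)  # initial disposition (S105)
    regions = [region]
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Estimate each tracking point's position in the next frame (S111-S113).
        moved = np.array([p + np.asarray(estimate_shift(prev, curr, p))
                          for p in points])
        region = interpolate_region(moved)             # set region to be noted (S115)
        regions.append(region)
        points = dispose_equally_spaced(region, n_points)  # re-dispose (S121)
    return regions
```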
  • For example, the disposition unit 220 may decide the disposition intervals between tracking points (as well as the number of points disposed) according to the shape of at least a part of the line defining a region to be noted. That is, the disposition unit 220 may increase or decrease the disposition intervals between the tracking points according to that shape.
  • FIG. 11 is a diagram for describing the first application example of the disposition process by the disposition unit 220 according to the present embodiment.
  • a region to be noted 1003 defined by a closed curve 1030 is set for a cell (not illustrated) that is an observation target on a captured image P3.
  • The portion of the closed curve 1030 within the region surrounded by a two-dot chain line 1032 has a complicated shape in comparison to the curve outside the region. More specifically, the curve included in the region surrounded by the two-dot chain line 1032 is in a shape having high curvature in comparison to the curve outside the region.
  • the disposition unit 220 may decide the disposition intervals between tracking points 1031 according to the degree of curvature of the shape of the curve. More specifically, as shown in FIG. 11, the disposition unit 220 may decide the disposition intervals between the tracking points to be small for the portion of the shape of the curve having the high curvature (the curve included in the region surrounded by the two-dot chain line 1032) so that more tracking points are disposed. Accordingly, minute changes in the shape of the cell can be tracked.
  • This disposition process may be applied to, for example, observation targets (cells) whose shape is likely to change minutely.
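  • One way to sketch this idea is to weight each contour vertex by a curvature proxy (the turning angle) and to resample the contour so that point density follows the weights; the helper names below are hypothetical and resample_by_density is reused by the later sketches:

```python
import numpy as np

def turning_angle(contour):
    """Curvature proxy: absolute turning angle between consecutive edges
    of a closed polyline given as an (M, 2) array."""
    edges = np.diff(np.vstack([contour, contour[:1]]), axis=0)
    ang = np.arctan2(edges[:, 1], edges[:, 0])
    turn = np.diff(np.concatenate([ang, ang[:1]]))
    return np.abs((turn + np.pi) % (2 * np.pi) - np.pi)   # wrap to [0, pi]

def resample_by_density(contour, weights, n_points):
    """Pick n_points contour vertices so that more points fall where the
    non-negative per-vertex weights are large."""
    w = np.maximum(np.asarray(weights, dtype=float), 1e-6)
    cdf = np.concatenate([[0.0], np.cumsum(w)])
    cdf /= cdf[-1]                                         # normalized cumulative weight
    targets = np.linspace(0.0, 1.0, n_points, endpoint=False)
    idx = np.searchsorted(cdf, targets, side="right") - 1
    return contour[np.clip(idx, 0, len(contour) - 1)]

# Dispose more tracking points on high-curvature portions of the contour:
# points = resample_by_density(contour, turning_angle(contour), 32)
```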
  • the disposition unit 220 may decide disposition intervals between tracking points (as well as the number disposed) based on information regarding pixels included in a region to be noted.
  • the information regarding pixels included in a region to be noted may be, for example, luminance information of each pixel.
  • the information regarding pixels is not limited to luminance information, and may be distribution of concentration of pixels, intensity of edges, or the like.
  • FIG. 12 is a diagram for describing the second application example of the disposition process by the disposition unit 220 according to the present embodiment.
  • a region to be noted 1004 (1004A and 1004B) defined by a closed curve 1040 is set on a captured image P4 for a cell (not illustrated) that is an observation target.
  • luminance of the region to be noted 1004A is low, and luminance of the region to be noted 1004B is high.
  • When a region exhibits high luminance, the cell there is considered to be dead. Thus, the disposition unit 220 may increase the disposition intervals between the tracking points and reduce the number of tracking points disposed in the region having high luminance. Conversely, when a region exhibits low luminance, the cell is considered to be alive. Thus, the disposition unit 220 may decide to reduce the disposition intervals between the tracking points and increase the number of tracking points disposed in the region having low luminance. By deciding the disposition intervals between the tracking points (as well as the number disposed) according to luminance as described above, it is possible to efficiently track the region whose shape can change. This disposition process may be applied to, for example, an observation target (a cell) in which cell death can occur.
  • the disposition intervals between the tracking points may be decided according to, for example, a variance of luminance. More specifically, a region having a high variance of luminance is considered to correspond to a region in which a cell that is an observation target is active. Thus, the disposition unit 220 may decide the disposition intervals between tracking points to be small and the number of tracking points disposed to be large for the region having a high variance of luminance. On the other hand, a region having a low variance of luminance is considered to correspond to a region in which a cell is inactive. Thus, the disposition unit 220 may decide the disposition intervals between tracking points to be large and the number of tracking points disposed to be small for the region having a low variance of luminance. Accordingly, the region in which the cell is active can be tracked more closely.
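  • Under the same hypothetical resample_by_density helper from the previous sketch, the weights can instead come from local luminance statistics, for example the variance of luminance in a small window around each contour vertex:

```python
import numpy as np

def luminance_variance_weights(gray, contour, win=7):
    """Variance of luminance in a (2*win+1)-sized window around each vertex
    of the contour; a high variance suggests an active region."""
    weights = np.zeros(len(contour))
    for i, (x, y) in enumerate(contour.astype(int)):
        patch = gray[max(y - win, 0):y + win + 1, max(x - win, 0):x + win + 1]
        weights[i] = patch.var() if patch.size else 0.0
    return weights

# Concentrate tracking points where the luminance variance is high:
# points = resample_by_density(contour, luminance_variance_weights(gray, contour), 32)
```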
  • the disposition unit 220 may decide disposition intervals between tracking points (as well as the number disposed) based on the magnitudes of motion vectors of respective tracking points estimated by the estimation unit 230 in a previous tracking process.
  • the disposition unit 220 may acquire, for example, distribution of the magnitudes of the motion vectors of the tracking points and decide disposition intervals between the tracking points according to the distribution.
  • FIG. 13 is a diagram for describing the third application example of the disposition process by the disposition unit 220 according to the present embodiment.
  • a region to be noted 1006 defined by a closed curve 1060 is set on a captured image P5 for a cell (not illustrated) that is an observation target.
  • a closed curve 1050 indicated by the dotted line is a curve defining a region to be noted set in the captured image of one frame before the captured image P5.
  • the closed curve 1050 has tracking points 1051 (1051a, 1051b, ---) disposed thereon for the region to be noted set in the previous captured image.
  • the tracking points 1051 move to the positions calculated based on the motion vectors calculated by the estimation unit 230 (the tracking points after the movement are indicated as tracking points 1061).
  • the tracking point 1051a moves to the position at which the point turns into a tracking point 1061a (indicated by a dashed line) based on a motion vector M1a.
  • the tracking point 1051b moves to the position at which the point turns into a tracking point 1061b based on a motion vector M1b.
  • the motion vector M1a is greater than the motion vector M1b as shown in FIG. 13. That is, the change in the shape in the periphery of the tracking point 1051a is considered to be greater than the change in the shape in the periphery of the tracking point 1051b.
  • Thus, the disposition unit 220 may re-dispose more tracking points in the periphery of the tracking points corresponding to large motion vectors. For example, as shown in FIG. 13, tracking points 1062 are densely disposed in the periphery of the tracking point 1061a corresponding to the motion vector M1a, whereas the disposition intervals between the tracking points 1062 are set to be large in the periphery of the tracking point 1061b corresponding to the motion vector M1b.
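  • Likewise, the magnitudes of the motion vectors from the previous tracking process can serve as the density weights (again assuming the hypothetical resample_by_density helper from the curvature sketch):

```python
import numpy as np

def motion_weights(contour, moved_points, motion_vectors):
    """Assign each contour vertex the motion vector magnitude of its
    nearest moved tracking point, for density-weighted re-disposition."""
    mags = np.hypot(motion_vectors[:, 0], motion_vectors[:, 1])
    d = np.linalg.norm(contour[:, None, :] - moved_points[None, :, :], axis=2)
    return mags[np.argmin(d, axis=1)]

# Re-dispose densely where the previous motion was large:
# points = resample_by_density(contour, motion_weights(contour, moved, vecs), 32)
```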
  • the estimation unit 230 estimates positions of each of tracking points in another captured image, but at that time, the estimation unit 230 may further calculate a movement amount of an observation target using an estimation result. More specifically, the estimation unit 230 may calculate a movement amount of an observation target using a motion vector calculated for each of tracking points.
  • FIG. 14 is a diagram for describing an application example of the estimation process performed by the estimation unit 230 according to the present embodiment.
  • tracking points 1081 (1081a, 1081b, ---) estimated by the estimation unit 230 and a region to be noted 1008 defined by a closed curve 1080 drawn by the setting unit 240 based on the tracking points 1081 are set on a captured image P6 for a cell (not illustrated) that is an observation target.
  • tracking points 1071 (1071a, 1071b, ---) used to draw a closed curve 1070 indicated by the dotted line (which corresponds to a region to be noted 1007) are tracking points disposed in the captured image (not illustrated) that is a frame before the captured image P6.
  • the tracking points 1071 move to the positions calculated based on motion vectors calculated by the estimation unit 230 (tracking points after the movement are indicated as the tracking points 1081).
  • the tracking point 1081a is a point that has moved from the original tracking point 1071a in the magnitude and direction of a motion vector M2a calculated by the estimation unit 230.
  • In the example shown in FIG. 14, the shape of each region to be noted set by the setting unit 240 does not significantly change. In this case, the motion vectors used in estimating the positions of the tracking points are considered to reflect the state of movement of the cell.
  • the estimation unit 230 may calculate the magnitude of the movement of the observation target from the motion vectors calculated for estimating the positions of the tracking points. For example, the estimation unit 230 may calculate a vector M3 indicating the movement of the observation target based on the motion vectors M2a, M2b, --- that are calculated for the tracking points 1071a, 1071b, --- as shown in FIG. 14.
  • the vector M3 may be calculated based on, for example, a least square method for distribution of the motion vectors M2. This vector M3 may be output as a value of a motion of the observation target.
  • the estimation unit 230 may calculate not only a motion of the observation target in a translational direction but also rotation of the observation target based on the motion vectors M2 of the tracking points. For example, the estimation unit 230 may estimate a rotation center of the observation target based on the magnitude and direction of the motion vectors M2, and calculate the motion of the rotation based on the rotation center.
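  • As a sketch of such a calculation (assuming the motion vectors are given as an (N, 2) array together with the (N, 2) tracking point positions; a least squares rigid fit is one way to realize the estimation described above, not necessarily the disclosure's method):

```python
import numpy as np

def rigid_motion(points, vectors):
    """Estimate the translation and rotation of the observation target from
    per-tracking-point motion vectors by a least squares (Kabsch-style) fit.
    Returns (translation vector, rotation angle in radians)."""
    src, dst = points, points + vectors
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    h = (src - sc).T @ (dst - dc)                 # 2x2 cross-covariance
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T                                # best-fit rotation matrix
    if np.linalg.det(r) < 0:                      # guard against a reflection
        vt[-1] *= -1
        r = vt.T @ u.T
    angle = np.arctan2(r[1, 0], r[0, 0])
    return dc - r @ sc, angle
```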
  • As described above, various motions of the observation target can be quantitatively analyzed based on the motion vectors calculated by the estimation unit 230. Accordingly, tracking of the observation target can be evaluated in more detail.
  • FIG. 15 is a block diagram showing a functional configuration example of an information processing device 20A according to the modified example of the embodiment of the present disclosure.
  • The information processing device 20A according to the present modified example includes the image acquisition unit 210, the disposition unit 220, the estimation unit 230, the setting unit 240, and the output control unit 250 included in the information processing device 20 according to the present embodiment, and includes an initial processing unit 255 and an analysis unit 260 in place of the initial setting unit 215. More specifically, the information processing device 20A according to the present modified example can calculate, using the analysis unit 260, a motion characteristic amount of the inside of a region to be noted specified in a tracking process, based on the motion vectors calculated by the initial processing unit 255.
  • the initial processing unit 255 has a function of analyzing motions and calculating motion vectors for a plurality of pieces of captured image data acquired by the image acquisition unit 210.
  • The motion vectors calculated here do not mean the motion vectors of the tracking points disposed for the region to be noted described above, but mean motion vectors computed inside the frames of the captured image data.
  • Such motion vectors are used in calculation of a motion characteristic amount of the analysis unit 260 in the later stage.
  • Captured image data for which motion vectors are subject to calculation may be frames of all acquired pieces of captured image (dynamic image) data, or frames of a section selected automatically or according to user's selection. Calculation of motion vectors is appropriately performed using a known algorithm such as block matching.
  • the initial processing unit 255 can also have the function of the initial setting unit 215 described above. That is, the initial processing unit 255 can have a function of setting a region to be noted for an acquired captured image. Thus, the initial processing unit 255 can perform an initial setting process and the above-described motion vector calculation process for a region to be noted. An order of these processes is not particularly limited.
  • the motion vector calculation process by the initial processing unit 255 is preferably performed before a tracking process of a later stage. This is in order to reduce a load imposed on the computation. Information with regard to the motion vectors obtained by the initial processing unit 255 can be output to the analysis unit 260 of the later stage.
  • the analysis unit 260 has a function of calculating a motion characteristic amount for a region to be noted set by the setting unit 240 in the tracking process. Specifically, the analysis unit 260 specifies a motion vector of the inside of the region to be noted set by the setting unit 240 among motion vectors calculated by the initial processing unit 255 in advance, and calculates a motion characteristic amount for the region to be noted based on the specified motion vector.
  • the motion characteristic amount is, for example, at least one of a motion amount, a motion region, an average motion amount, a standard deviation of motion amounts, acceleration, a motion direction, and a motion frequency. These motion characteristic amounts are appropriately calculated using a known algorithm or the like.
  • the calculated motion characteristic amount is output to the output control unit 250.
  • An output form of the motion characteristic amount, such as a time-series graph, two-dimensional mapping, a radar chart, or a histogram, is appropriately selected according to the property of the motion to be analyzed.
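  • As an illustration, a subset of these characteristic amounts could be computed from the motion vectors falling inside the region to be noted as follows (a sketch only; the vectors are assumed to be in pixels per frame):

```python
import numpy as np

def motion_characteristics(vectors, frame_interval_s=1.0):
    """Simple motion characteristic amounts from the (N, 2) motion vectors
    inside a region to be noted."""
    mags = np.hypot(vectors[:, 0], vectors[:, 1])
    return {
        "motion_amount": mags.sum(),
        "average_motion_amount": mags.mean(),
        "motion_amount_std": mags.std(),
        "mean_direction_rad": np.arctan2(vectors[:, 1].mean(),
                                         vectors[:, 0].mean()),
        "mean_speed_px_per_s": mags.mean() / frame_interval_s,
    }
```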
  • FIG. 16 is a flowchart showing an example of a process by the information processing device 20A according to the present modified example.
  • The image acquisition unit 210 first acquires dynamic image data (captured image data) (S201).
  • The initial processing unit 255 calculates motion vectors for the dynamic image data (S203).
  • The initial processing unit 255 sets a region to be noted for one captured image (S205), and the disposition unit 220 disposes tracking points in the region to be noted (S207). Then, the estimation unit 230 sets tracking regions around the tracking points and search ranges for the tracking regions (S209).
  • The image acquisition unit 210 acquires the image of the frame following the one captured image for which the region to be noted has been set (S211), and the estimation unit 230 estimates the positions of the tracking points in that image (S213).
  • The setting unit 240 sets a region to be noted based on the positions of the tracking points that have undergone the tracking process (S215).
  • The analysis unit 260 calculates a motion characteristic amount of the set region to be noted based on the motion vectors inside the region, and outputs the calculated amount in a predetermined form (S217). A schematic rendering of this loop follows below.
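The following Python sketch schematically renders steps S201 to S217. Each callable passed in (calculate_motion_vectors, set_initial_region, and so on) is a hypothetical stand-in for the corresponding unit described above, not an actual API of the embodiment.

```python
def track(frames, calculate_motion_vectors, set_initial_region,
          dispose_tracking_points, estimate_positions,
          set_region_from_points, analyze, report):
    """Schematic tracking loop over a sequence of captured images (S201-S217)."""
    vectors = calculate_motion_vectors(frames)              # S203: initial processing unit 255
    region = set_initial_region(frames[0])                  # S205: region to be noted
    points = dispose_tracking_points(region)                # S207: disposition unit 220
    for prev_frame, next_frame in zip(frames, frames[1:]):  # S211: acquire next frame
        points = estimate_positions(prev_frame, next_frame, points)  # S209/S213
        region = set_region_from_points(points)             # S215: setting unit 240
        report(analyze(region, vectors))                    # S217: analysis unit 260
        points = dispose_tracking_points(region)            # re-disposition for next frame
```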
  • FIG. 17 is a diagram for describing the first application example (a nerve cell) of the information processing device 20 according to the present embodiment.
  • The nerve cell C10 that is the observation target in the present application example is composed of a cyton C10A and an axon C10B (note that, for convenience of description, other elements composing the nerve cell C10, such as dendrites and the nucleus, are omitted in the present application example).
  • The nerve cell C10 elongates the axon C10B toward another cell in order to form a neural circuit; that is, the axon C10B of the nerve cell C10 grows in a short period of time.
  • Using the information processing device 20, the growth of the axon C10B can thus be tracked.
  • A region to be noted 3000 can be set in accordance with the shape and elongation direction of the axon C10B by, for example, disposing tracking points 3001 so as to overlap the axon C10B.
  • The region to be noted 3000 is defined by a straight line or a curve.
  • Alternatively, this region to be noted 3000 may be defined as a planar region according to the shape of the axon C10B.
  • FIG. 18 is a diagram showing an example of the initial setting process for an axon C11B by the information processing device 20 according to the present embodiment.
  • A region to be noted 3010 is set so as to overlap the axon C11B of a nerve cell C11.
  • The tracking point 3011a is disposed on the boundary between a cyton C11A and the axon C11B, and the tracking point 3011c is disposed at the tip of the axon C11B. A sketch of disposing tracking points along such a line-shaped region follows below.
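As a minimal sketch, assume the line-shaped region to be noted is given as a polyline of (x, y) vertices running from the cyton boundary to the axon tip. Re-sampling at equal arc-length intervals keeps the first point on the cyton boundary and the last point at the tip, and the number of points naturally grows as the axon elongates, consistent with the re-disposition described for FIG. 19 below. The function name and interval parameter are assumptions for this illustration.

```python
import numpy as np

def dispose_along_polyline(polyline: np.ndarray, interval: float) -> np.ndarray:
    """polyline: (K, 2) vertices from the cyton boundary to the axon tip;
    returns (N, 2) tracking points at roughly equal arc-length intervals."""
    seg_lengths = np.linalg.norm(np.diff(polyline, axis=0), axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_lengths)])  # cumulative arc length
    total_length = arc[-1]
    # at least the two endpoints; more points as the axon elongates
    n_points = max(2, int(total_length // interval) + 1)
    targets = np.linspace(0.0, total_length, n_points)
    xs = np.interp(targets, arc, polyline[:, 0])
    ys = np.interp(targets, arc, polyline[:, 1])
    return np.stack([xs, ys], axis=1)
```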
  • FIG. 19 is a diagram showing an example of the tracking process for the axon C11B by the information processing device 20 according to the present embodiment.
  • The axon C11B is in a state in which it has elongated in its length direction from the state shown in FIG. 18.
  • The tracking points 3011 move to the positions estimated by the estimation unit 230.
  • The tracking point 3011c moves to the position corresponding to the tip of the axon C11B.
  • The tracking point 3011a moves to the position corresponding to the boundary between the cyton C11A and the axon C11B (it may also remain in place).
  • A region to be noted 3020 is set by the setting unit 240 based on the tracking points 3011 after the movement.
  • New tracking points 3021 are disposed by the disposition unit 220 for the region to be noted 3020.
  • The tracking points 3021a and 3021f are re-disposed at the same positions as the foregoing tracking points 3011a and 3011c, that is, on the boundary between the cyton C11A and the axon C11B and at the tip of the axon C11B, respectively.
  • The number of tracking points re-disposed by the disposition unit 220 may increase according to the elongation of the axon C11B.
  • In this way, the elongation of the axon C11B can be tracked.
  • An application target of the information processing device 20 according to the present embodiment is not limited to cells as described above.
  • The application target may be, for example, an animal, a plant, or a non-living structure. When such an application target significantly changes its shape or structure in a short period of time, the tracking process by the information processing device 20 according to the present embodiment is effective.
  • A second application example of the information processing device 20 according to the present embodiment will be described below.
  • FIG. 20 is a diagram for describing the second application example (a zebrafish) of the information processing device 20 according to the present embodiment.
  • Zebrafish, such as the zebrafish C20 that is the observation target in the present application example, are not only kept as pets but are also frequently used in biological research as a model vertebrate. That is, observation and evaluation of changes in the form of zebrafish such as the zebrafish C20 are frequently performed.
  • When the information processing device 20 according to the present embodiment is used, growth and changes in the form of the zebrafish C20 can be tracked objectively.
  • The zebrafish C20 has an eye C20A and a backbone C20B.
  • A region to be noted 4000A surrounding the eye C20A and a region to be noted 4000B formed along the shape of the backbone C20B are set by disposing tracking points 4001 on the contour of the eye C20A and on the backbone C20B.
  • The region to be noted 4000B is set to track motions of the backbone C20B.
  • In this way, dynamic states of the zebrafish C20 can be tracked.
  • The region to be noted 4000A is set on the contour of the eye C20A.
  • The shape of the eye C20A shows no particularly significant change.
  • By treating the region to be noted 4000A surrounding the eye C20A as a fixed reference region, relative dynamic states or changes in the form of the region to be noted 4000B can be tracked.
  • For example, motions of a tracking point 4001a relative to this reference region may be calculated, as in the sketch below.
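A minimal sketch of such a relative-motion calculation, assuming the reference region (e.g., the eye region 4000A) and the tracked points are available as coordinate arrays in two frames: the displacement of the reference-region centroid is subtracted from each tracking point's displacement to remove whole-body motion. The function name is an assumption for this example.

```python
import numpy as np

def relative_displacements(points_prev: np.ndarray, points_cur: np.ndarray,
                           ref_prev: np.ndarray, ref_cur: np.ndarray) -> np.ndarray:
    """points_*: (N, 2) tracking points (e.g., on the backbone) in two frames;
    ref_*: (M, 2) points of the fixed reference region (e.g., the eye contour)."""
    # translation of the reference region between the two frames
    global_shift = ref_cur.mean(axis=0) - ref_prev.mean(axis=0)
    # per-point displacement with the whole-body translation removed
    return (points_cur - points_prev) - global_shift
```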
  • An application target (observation target) of the information processing device 20 according to the present embodiment may be not only a single structure as described above but also a group constituted by a plurality of structures. When such an application target significantly changes its shape or structure as a group in a short period of time, the tracking process by the information processing device 20 according to the present embodiment is useful.
  • A third application example of the information processing device 20 according to the present embodiment will now be described.
  • FIG. 21 is a diagram for describing the third application example (application to a colony) of the information processing device 20 according to the present embodiment.
  • The colony C30 that is the observation target in the present application example is a group derived from a single species and formed by bacteria, cells, or the like in a cultivation process. By observing changes in the shape of the colony C30, characteristics of the individual cells, bacteria, or the like, or the effect of a treatment applied to the observation target, can be evaluated. For example, the colony-forming ability of cells differentiated from ES cells can be evaluated, or drug efficacy can be evaluated based on the proliferation ability of a colony C30 of cancer cells to which a drug has been administered.
  • The outermost contour of the colony C30 is preferably set as a region to be noted 5000, with tracking points 5001 disposed along the outermost side of the colony C30 as shown in FIG. 21.
  • This is because the shape of the colony C30 changes in a complicated manner, through overall expansion caused by cell division in the colony interior C31 and through local projection of the contour caused by changes in cells near the outer periphery C32.
  • The information processing device 20 according to the present embodiment may set not only a single kind of structure as an application target in one piece of captured image data but also a plurality of different kinds of structures as application targets. When these application targets interact and thus change their shapes and structures significantly in a short period of time, the tracking process by the information processing device 20 according to the present embodiment is useful.
  • A fourth application example of the information processing device 20 according to the present embodiment will be described below.
  • FIG. 22 is a diagram for describing the fourth application example (application to a macrophage and a foreign body) of the information processing device 20 according to the present embodiment.
  • The macrophage C40 that is the observation target in the present application example is a white blood cell that migrates inside a living body and performs phagocytosis, in which it engulfs and digests dead cells, denatured materials inside the body, and foreign bodies such as bacteria.
  • By capturing the migration of the macrophage C40 and changes in its shape, and analyzing its motions during phagocytosis, the phagocytic ability of the macrophage C40 can be evaluated.
  • Tracking points 6001 and 7001 are disposed respectively on the contours of the macrophage C40 and a foreign body C50 such as a cancer cell, and the outermost contours of the macrophage C40 and the foreign body C50 are set as regions to be noted 6000 and 7000. Since the macrophage C40 significantly changes its shape in the phagocytic process, it is desirable to increase the number of tracking points 6001 disposed and reduce the disposition intervals 6002.
  • In contrast, since the foreign body C50 hardly changes its shape in the phagocytic process, it is preferable to reduce the number of tracking points 7001 disposed and increase the disposition intervals 7002. In addition, the tracking points 7001 do not have to be re-disposed for the foreign body C50 in the tracking process. Accordingly, the computational load can be reduced, and tracking processes appropriate for the respective cells can be performed.
  • In this way, the disposition unit 220 of the information processing device 20 may appropriately decide the disposition intervals between tracking points based on information regarding the observation target for which a region to be noted is set (for example, the type of observation target, the number of observation targets, or a state such as active or inactive), as in the sketch below. Accordingly, tracking processes appropriate for the respective observation targets can be performed while the computational load is reduced.
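As an illustrative sketch of such a decision, the following function maps information about the observation target to a disposition interval. The category names and scaling factors are assumptions chosen to match the macrophage/foreign-body example above, not values fixed by the embodiment.

```python
def decide_disposition_interval(target_type: str, is_active: bool,
                                base_interval: float = 10.0) -> float:
    """Return a disposition interval (in pixels) for tracking points."""
    if target_type == "macrophage" or is_active:
        # dense points for targets whose shape changes quickly
        return base_interval * 0.5
    if target_type == "foreign_body" and not is_active:
        # sparse points for nearly rigid targets, reducing computation
        return base_interval * 2.0
    return base_interval
```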
  • FIG. 23 is a block diagram showing a hardware configuration example of the information processing device according to the embodiment of the present disclosure.
  • The illustrated information processing device 900 can realize the information processing device 20 of the above-described embodiment.
  • The information processing device 900 includes a central processing unit (CPU) 901, read only memory (ROM) 903, and random access memory (RAM) 905.
  • The information processing device 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 925, and a communication device 929.
  • The information processing device 900 may include a processing circuit such as a digital signal processor (DSP) or an application-specific integrated circuit (ASIC) instead of, or in addition to, the CPU 901.
  • The CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operation of the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 923.
  • The CPU 901 controls the overall operations of the respective function units included in the information processing device 20 of the above-described embodiment.
  • The ROM 903 stores programs, operation parameters, and the like used by the CPU 901.
  • The RAM 905 transiently stores programs used in execution by the CPU 901, and parameters that change as appropriate during that execution.
  • The CPU 901, the ROM 903, and the RAM 905 are connected to each other via the host bus 907, which is configured from an internal bus such as a CPU bus.
  • The host bus 907 is connected to the external bus 911, such as a Peripheral Component Interconnect/Interface (PCI) bus, via the bridge 909.
  • The input device 915 is a device operated by a user, such as a mouse, a keyboard, a touchscreen, a button, a switch, or a lever.
  • The input device 915 may be a remote control device that uses, for example, infrared radiation or other radio waves.
  • The input device 915 may also be an external connection device 927, such as a mobile phone, that supports operation of the information processing device 900.
  • The input device 915 includes an input control circuit that generates input signals on the basis of information input by a user and outputs the generated input signals to the CPU 901. By operating the input device 915, the user inputs various types of data to the information processing device 900 and instructs it to perform processing operations.
  • The output device 917 includes a device that can visually or audibly report acquired information to a user.
  • The output device 917 may be, for example, a display device such as an LCD, a PDP, or an OELD, an audio output device such as a speaker or headphones, or a printer.
  • The output device 917 outputs results obtained through processes performed by the information processing device 900 in the form of text, video such as images, or sounds such as audio.
  • The storage device 919 is a device for data storage and is an example of a storage unit of the information processing device 900.
  • The storage device 919 includes, for example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • The storage device 919 stores the programs executed by the CPU 901, various data, and various data acquired from the outside.
  • The drive 921 is a reader/writer for the removable recording medium 923, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and is built into or externally attached to the information processing device 900.
  • The drive 921 reads out information recorded on the mounted removable recording medium 923 and outputs the information to the RAM 905.
  • The drive 921 also writes records onto the mounted removable recording medium 923.
  • The connection port 925 is a port used to directly connect devices to the information processing device 900.
  • The connection port 925 may be, for example, a Universal Serial Bus (USB) port, an IEEE 1394 port, or a Small Computer System Interface (SCSI) port.
  • The connection port 925 may also be an RS-232C port, an optical audio terminal, a High-Definition Multimedia Interface (HDMI (registered trademark)) port, or the like.
  • The communication device 929 is a communication interface including, for example, a communication device for connection to a communication network NW.
  • The communication device 929 may be, for example, a communication card for a wired or wireless local area network (LAN), Bluetooth (registered trademark), or wireless USB (WUSB).
  • The communication device 929 may also be, for example, a router for optical communication, a router for an asymmetric digital subscriber line (ADSL), or a modem for various types of communication.
  • The communication device 929 transmits and receives signals over the Internet, or to and from other communication devices, using a predetermined protocol such as TCP/IP.
  • The communication network NW to which the communication device 929 connects is a network established through a wired or wireless connection.
  • The communication network NW is, for example, the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
  • The imaging device 10 may have the function of the information processing device 20 (the tracking function); in this case, the information processing system 1 is realized by the imaging device 10.
  • Conversely, the information processing device 20 may have the function of the imaging device 10 (the imaging function); in this case, the information processing system 1 is realized by the information processing device 20.
  • Alternatively, the imaging device 10 may have a part of the function of the information processing device 20, and the information processing device 20 may have a part of the function of the imaging device 10.
  • Although a cell is exemplified as an observation target for analysis by the information processing system 1 in the embodiments, the present technology is not limited thereto.
  • The observation target may be, for example, a cell organelle, a biological tissue, an organ, a human, an animal, a plant, or a non-living structure; when its structure or shape changes in a short period of time, such changes can be tracked using the information processing system 1.
  • The steps in the processes performed by the information processing device in the present specification need not necessarily be processed chronologically in the orders described in the flowcharts.
  • The steps may be processed in orders different from those described in the flowcharts, or in parallel.
  • A computer program causing hardware such as the CPU, the ROM, and the RAM included in the information processing device to carry out functions equivalent to those of the above-described configuration of the information processing device can be generated.
  • A storage medium having the computer program stored therein can also be provided.
  • An information processing device including: circuitry configured to: dispose a plurality of tracking points within a first region of a first image; set a second region of a second image based on estimated positions of the plurality of tracking points in the second image, wherein the estimated positions are determined by comparing the first image and the second image, and the second image is captured at a different time point than the first image; and re-dispose the plurality of tracking points within the second region of the second image.
  • disposing the plurality of tracking points comprises disposing the plurality of tracking points on a line defining the first region of the first image.
  • disposing the plurality of tracking points comprises determining disposition intervals between the plurality of tracking points based on a shape of at least part of the line.
  • determining disposition intervals between the plurality of tracking points comprises determining the disposition intervals based on a curvature of the shape.
  • the line defining the first region is a closed curve surrounding the first region.
  • disposing the plurality of tracking points comprises determining disposition intervals between the plurality of tracking points based on pixel information for pixels included within the first region.
  • the pixel information includes luminance information.
  • re-disposing the plurality of tracking points within the second region comprises disposing at least one of the plurality of tracking points at the same position as one of the estimated positions.
  • the positions of the plurality of tracking points are estimated in the second image by comparing the first image and the second image to obtain a motion vector and estimating positions of the plurality of tracking points based on the motion vector.
  • obtaining the motion vector comprises setting a tracking region that includes at least one tracking point of the plurality of tracking points in the first image and searching a predetermined search range for a position of the tracking region in the second image.
  • a size of the tracking region is a size determined based on information of an observation target in the first region.
  • a size of the search range is determined based on information of an observation target in the first region.
  • estimating positions of the plurality of tracking points further comprises calculating a movement amount of an observation target between the first image and the second image based on the motion vector.
  • re-disposing the plurality of tracking points further comprises determining disposition intervals between the plurality of tracking points based on the motion vector.
  • the positions of the plurality of tracking points are estimated in the second image based on information of an observation target in the first region of the first image.
  • setting the second region of the second image further comprises using pixel information of the first image.
  • disposing the plurality of tracking points within the first region of the first image comprises disposing the plurality of tracking points relative to an observation target within the first region, and setting the second region of the second image further comprises setting the second region to include at least a portion of the observation target.
  • the observation target includes at least a portion of a biological cell.
  • re-disposing the plurality of tracking points further comprises re-disposing the plurality of tracking points in a manner to adjust for a change in shape of the biological cell.
  • estimating the positions of the plurality of tracking points in the second image further comprises adjusting the positions of the plurality of tracking points in the first image based on movement of the observation target between the first image and the second image.
  • re-disposing the plurality of tracking points within the second region of the second image comprises using at least one of the estimated positions of the plurality of tracking points in the second image.
  • An information processing method including: disposing a plurality of tracking points within a first region of a first image; setting a second region of a second image based on estimated positions of the plurality of tracking points in the second image, wherein the estimated positions are determined by comparing the first image and the second image, and the second image is captured at a different time point than the first image; and re-disposing the plurality of tracking points within the second region of the second image.
  • An information processing system including: an imaging device configured to generate a plurality of images including a first image and a second image; and circuitry configured to: dispose a plurality of tracking points within a first region of a first image; set a second region of a second image based on estimated positions of the plurality of tracking points in the second image, wherein the estimated positions are determined by comparing the first image and the second image, and the second image is captured at a different time point than the first image; and re-dispose the plurality of tracking points within the second region of the second image.
  • An information processing device including: a disposition unit that disposes a plurality of tracking points for a region to be noted in a captured image; an estimation unit that estimates, based on comparison of the captured image and another captured image of which a capturing time point is different from a capturing time point of the captured image, positions of the tracking points in the other captured image; and a setting unit that sets a region to be noted in the other captured image based on the positions of the tracking points estimated by the estimation unit, wherein, when the region to be noted is set in the captured image by the setting unit, the disposition unit re-disposes the tracking points for the set region to be noted, and the estimation unit estimates positions of the re-disposed tracking points in the other captured image.
  • the disposition unit re-disposes at least one tracking point among the previously disposed tracking points at the same position as the position estimated by the estimation unit.
  • the estimation unit estimates positions of the tracking points in the other captured image based on a motion vector obtained by comparing the captured image and the other captured image.
  • the estimation unit calculates the motion vector by searching a predetermined search range for a position of the tracking region in the other captured image.
  • a size of the tracking region is a size decided based on information with regard to an observation target corresponding to the region to be noted.
  • a size of the search range is a size decided based on information with regard to an observation target corresponding to the region to be noted.
  • the estimation unit calculates a movement amount of an observation target corresponding to the region to be noted based on the motion vector.
  • the disposition unit decides disposition intervals between the plurality of tracking points based on the magnitude of the motion vector.
  • the information processing device according to any one of (24) to (38), further including: an analysis unit configured to calculate a motion characteristic amount of the inside of the region to be noted using a motion vector obtained through analysis of each of the captured images.
  • the estimation unit estimates positions of the tracking points in the other captured image captured at a capturing time point decided based on information with regard to an observation target corresponding to the region to be noted.
  • the setting unit sets the region to be noted using information with regard to pixels included in the captured image.
  • An information processing method performed by a processor including: disposing a plurality of tracking points for a region to be noted in a captured image; estimating, based on comparison of the captured image and another captured image of which a capturing time point is different from a capturing time point of the captured image, positions of the tracking points in the other captured image; and setting a region to be noted in the other captured image based on the estimated positions of the tracking points, wherein, when the processor sets the region to be noted in the captured image, the processor re-disposes the tracking points for the set region to be noted, and estimates positions of the re-disposed tracking points in the other captured image.
  • An information processing system including: an imaging device that is provided with an imaging unit that generates a plurality of captured images; and an information processing device that is provided with a disposition unit that disposes a plurality of tracking points for a region to be noted in one captured image acquired from the imaging unit, an estimation unit that estimates, based on comparison of the one captured image and another captured image of which a capturing time point of the imaging unit is different from a capturing time point of the one captured image, positions of the tracking points in the other captured image, and a setting unit that sets a region to be noted in the other captured image based on the estimated positions of the tracking points, wherein, when the region to be noted is set in the one captured image by the setting unit, the disposition unit re-disposes the tracking points for the set region to be noted, and the estimation unit estimates positions of the re-disposed tracking points in the other captured image.
  • Reference signs: 10 imaging device; 20 information processing device; 210 image acquisition unit; 215 initial setting unit; 220 disposition unit; 230 estimation unit; 240 setting unit; 250 output control unit; 255 initial processing unit; 260 analysis unit


Abstract

According to some aspects, an information processing device is provided. The information processing device includes circuitry configured to dispose a plurality of tracking points within a first region of a first image and to set a second region of a second image based on estimated positions of the plurality of tracking points in the second image. The estimated positions are determined by comparing the first image and the second image. The circuitry is further configured to re-dispose the plurality of tracking points within the second region of the second image.