WO2019102750A1 - Image processing device, image processing method, and image processing program - Google Patents

Image processing device, image processing method, and image processing program

Info

Publication number
WO2019102750A1
Authority
WO
WIPO (PCT)
Prior art keywords
flow
image processing
image
flow channel
amount
Prior art date
Application number
PCT/JP2018/038603
Other languages
French (fr)
Japanese (ja)
Inventor
志織 笹田
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社
Publication of WO2019102750A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026 Measuring blood flow
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/215 Motion-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • The present technology relates to an image processing device, an image processing method, and a program for image processing. More specifically, it relates to an image processing apparatus, an image processing method, and an image processing program for acquiring flow channel internal information from a plurality of images obtained by continuous imaging of a flow channel region.
  • Conventionally, evaluation methods using means such as light or ultrasonic waves are used for flow channel evaluation. Examples include methods of measuring blood flow using, for example, a laser beam and methods of acquiring information in a blood vessel using ultrasonic waves. In addition, means such as a blood vessel contrast agent may be used to visualize the flow path.
  • Patent Document 1 below describes "a data processing apparatus having a control unit for controlling display of an image obtained by classifying each of a plurality of blocks constituting at least a part of image data, based on a feature amount of motion specified from the image data obtained by imaging an object performing periodic motion" (claim 1).
  • Patent Document 2 below describes "an image processing apparatus comprising: a motion detection unit that detects a motion of an evaluation target using an image to be evaluated; a correlation calculation unit that calculates a correlation of temporal changes in motion amount at a plurality of evaluation targets using motion vectors indicating the motion detected by the motion detection unit; and an evaluation value calculation unit that calculates an evaluation value for evaluating the cooperativity of the movement of the evaluation targets using the correlation calculated by the correlation calculation unit" (claim 1). Patent Document 3 below describes "an image processing apparatus comprising: a motion detection unit that detects the motion of an observation target for each partial region of an observation region; a motion amount calculation unit that calculates the motion amount of each motion detected by the motion detection unit; and a map creation unit that creates a map representing the position and the size of each motion amount calculated by the motion amount calculation unit".
  • Fluorescently labeled particles may be used in the assessment of biological channels. For example, fluorescent beads are allowed to flow into the blood stream to observe the flow path, and substances transported by axons are fluorescently labeled to observe axons. In such observation, although the fluorescently labeled substance is observed, the flow channel through which the fluorescently labeled substance flows may not be observable. If it is possible to evaluate the flow path through which the substance flows in the analysis using the fluorescently labeled substance, it is considered to be useful for various analysis or research.
  • In such observation, the conventionally used channel evaluation methods cannot always be applied. In particular, where a new flow channel is formed and generates a new flow, as in angiogenesis or nerve axon generation, the conventionally used flow channel evaluation methods may not be applicable, and flow path visualization means such as a blood vessel contrast agent cannot be applied. The image processing techniques described above are useful for visualizing the motion of an object. If the flow path through which an object flows could be evaluated by image processing, this would be useful for various analyses and research.
  • the present technology aims to provide a new flow path evaluation technology.
  • the present inventors have found that an image processing apparatus having a specific configuration enables channel evaluation.
  • Accordingly, the present technology provides an image processing apparatus including: a motion amount calculation unit that calculates a motion amount between at least two of a plurality of images obtained by imaging a flow path region continuously in time; a feature amount acquisition unit that acquires a feature amount related to the movement amount for each pixel block that constitutes at least a part of the image of the flow path region; and a flow path internal information acquisition unit that acquires flow path internal information of the flow path region based on the feature amount.
  • the flow path internal information acquisition unit may include a color information addition unit that adds color information to each pixel block according to the feature amount.
  • the flow passage internal information acquisition unit may further include a flow passage region image creation unit that creates an image of the flow passage region based on the color information.
  • the flow channel internal information acquisition unit may further include a flow velocity acquisition unit that acquires the flow velocity inside the flow channel based on the feature quantity.
  • the flow passage internal information acquisition unit may further include an abnormal area identification unit that identifies an abnormal area of the flow based on the flow velocity.
  • the flow passage internal information acquisition unit may further include an area calculation unit that calculates an area of a pixel block in which the feature value is within a predetermined range.
  • the feature may be obtained based on five or more motion amounts.
  • the feature value may be an integrated value or an average value of the motion amounts, or a maximum value of the motion amounts.
  • the plurality of images may constitute a moving image of the flow channel area.
  • the at least two images may be continuous images among the images constituting the moving image
  • The amount of movement may be the amount of movement of particles flowing in the flow path.
  • the particles may be bioparticles.
  • the flow passage region may include a flow passage in a living body in which particles are flowing and / or an artificial flow passage in which particles are flowing.
  • The present technology also provides an image processing method including: a motion amount calculating step of calculating a motion amount between at least two images among a plurality of images obtained by imaging the flow channel region continuously in time; a feature amount obtaining step of obtaining a feature amount related to the movement amount for each pixel block constituting at least a part of the image of the flow passage region; and a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount. The present technology further provides an image processing program for causing a computer to execute these steps.
  • the present technology provides a new image processing technology that enables channel evaluation.
  • the technology allows, for example, visualization or quantification of angiogenesis, visualization of axonal generation, or quantification of axonal transport rate.
  • the effects exerted by the present technology are not necessarily limited to the effects described herein, and may be any of the effects described in the present specification.
  • The drawings are as follows. FIG. 1 shows an example of an image processing system including an image processing apparatus according to the present technology. FIGS. 2 and 6 are examples of block diagrams of image processing apparatuses according to the present technology. FIGS. 3 and 7 are diagrams showing examples of moving image data. FIG. 4 is a diagram showing an image divided into pixel blocks. FIGS. 5, 8, and 11 are diagrams showing examples of flow channel region images created according to the present technology. FIG. 9 is a diagram showing an example of the flow of the image processing method according to the present technology. FIG. 10 is a diagram showing an example of a schematic hardware configuration of an information processing apparatus for realizing an image processing apparatus according to the present technology. A further figure shows an example of acquisition of feature amounts from moving image data.
  • 1. First embodiment (image processing apparatus): (1) Description of the first embodiment, (2) First example of the first embodiment (image processing apparatus), (3) Second example of the first embodiment (image processing apparatus)
  • 2. Second embodiment (image processing method): (1) Description of the second embodiment, (2) Example of the second embodiment (image processing method)
  • 3. Third embodiment (program for image processing): (1) Description of the third embodiment
  • Hardware configuration example
  • As described above, the present technology provides an image processing apparatus including: a motion amount calculation unit configured to calculate a motion amount between at least two images of a plurality of images obtained by imaging the flow passage region continuously in time; a feature amount acquisition unit that acquires a feature amount related to the movement amount for each pixel block that constitutes at least a part of the image of the flow passage region; and a flow path internal information acquisition unit that acquires flow path internal information of the flow path region based on the feature amount.
  • According to the present technology, it is possible to obtain flow channel internal information from an image obtained by imaging the flow channel region. For example, the shape or structure of the flow channel, the flow velocity in the flow channel, and/or the flow velocity distribution in the flow channel can be acquired.
  • the present technology can be applied to, for example, the following situations.
  • For example, when fluorescent beads are allowed to flow in the bloodstream, the fluorescence emitted from the beads can be observed, but the blood vessel through which the beads flow may not be observable. According to the present technology, the shape or structure of the flow channel and/or the flow velocity in the flow channel can be evaluated, so that the state of the blood vessel through which the beads flow can be grasped.
  • the place where the flow velocity of blood is decreasing, the place where blood flow is stagnant, or the place where blood leaks can be identified by the present technology. In this way, it is possible to identify where the vascular abnormality is occurring.
  • the flow path internal information obtained according to the present technology is considered to contribute to, for example, prevention of thrombus formation in the artificial blood vessel and / or improvement of the structure of the artificial blood vessel.
  • the technology enables visualization and / or quantification of angiogenesis.
  • The present technology can, for example, assess the shape, structure, or flow velocity of the flow path, so that the structure of an axon can be understood, the axonal transport velocity can be quantified, and the shape or extension direction of a newly generated axon can be identified. The present technology also makes it possible to identify problem points in a flow path, for example a place where blockage tends to occur or a place where stagnation or flow velocity reduction occurs; as a result, quality evaluation of devices containing such flow paths becomes possible.
  • the movement amount calculation unit calculates the movement amount between at least two images among a plurality of images obtained by imaging the flow passage region continuously in time.
  • the plurality of images may be, for example, images constituting a moving image.
  • the at least two images may be, for example, at least two of a plurality of images constituting the moving image.
  • the moving image may be a set of images captured sequentially in time in a region including the flow path.
  • the frame rate of the moving image may be appropriately selected by those skilled in the art in consideration of factors such as the magnification of the imaging target and the microscope, for example.
  • the velocity of the blood flow is, for example, 0.05 to 50 cm / sec, so the frame rate can be selected in consideration of the velocity of the blood flow.
  • the frame rate (fps) can be set to (moving distance (pixel) per second of an object (for example, particle) moving in the image) / (4 to 16) or more.
  • a frame rate of 100/4 to 100/16 fps or more may be employed, for example, a frame rate of 5 fps or more or a frame rate of 25 fps or more.
  • The frame rate of the moving image may be, for example, 5 to 100 fps, preferably 10 to 80 fps, more preferably 15 to 60 fps.
  • The frame rate can be set by selecting the type or settings of the imaging device used for imaging the flow channel region. For example, when imaging the blood flow of a transparent animal such as zebrafish, a higher frame rate, for example 100 fps or more, or even 1000 fps or more, may be adopted. Examples of moving image data that can be used in the present technology are shown as 301-1 to 301-t in FIG. 3. As illustrated, the moving image data may consist of the first through the t-th images captured within a predetermined period.
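As a concrete illustration of the frame-rate rule stated above, the following sketch computes a minimum frame rate from an expected particle displacement in the image. The divisor range of 4 to 16 follows the rule given in the text; the example displacement value is hypothetical.

```python
def minimum_frame_rate(pixels_per_second: float, divisor: float = 8.0) -> float:
    """Minimum frame rate (fps) for an object moving `pixels_per_second` in the
    image, using the rule: fps >= moving distance (pixel/s) / (4 to 16)."""
    if not 4.0 <= divisor <= 16.0:
        raise ValueError("divisor should be in the range 4 to 16")
    return pixels_per_second / divisor

# Example: a particle that traverses about 100 pixels per second needs
# roughly 100/16 = 6.25 fps to 100/4 = 25 fps.
print(minimum_frame_rate(100.0, divisor=16.0))  # 6.25
print(minimum_frame_rate(100.0, divisor=4.0))   # 25.0
```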
  • The motion amount may be calculated over the entire duration of the moving image obtained by imaging, or only for a certain section of the moving image, that is, a moving image covering a certain portion of the whole duration.
  • the section of the moving image used to calculate the motion amount can be appropriately selected by those skilled in the art.
  • the number of images used for calculating the amount of movement may be at least 2, preferably 5 or more, more preferably 30 or more, more preferably 300 or more, and still more preferably 3000 or more.
  • A moving image of preferably 1 second or more, more preferably 10 seconds or more, and still more preferably 100 seconds or more can be used for calculation of the amount of movement. The larger the number of images used to calculate the amount of movement, the better the flow channel internal information that can be obtained.
  • the calculation of the motion amount can be performed by comparing the two images.
  • The two images to be compared may be, for example, two temporally consecutive images among the plurality of images constituting the moving image, or two images that are not temporally consecutive.
  • Two temporally consecutive images are an image obtained by imaging a predetermined area at a certain point in time and the image obtained by imaging that area at the point in time closest to it; for example, an image constituting the moving image at a certain point in time and the image immediately following it.
  • a one-second moving image with a frame rate of 30 fps is composed of 30 images.
  • For example, the amount of movement between the first and second of the 30 images, the amount of movement between the second and third images, ..., and the amount of movement between the 29th and 30th images may be calculated by the motion amount calculation unit.
  • Two images that are not temporally consecutive are an image obtained by imaging a predetermined area at a certain point in time and an image other than the one obtained by imaging that area at the closest point in time; for example, an image constituting the moving image at a certain point in time and an image later than the one immediately following it.
  • For example, the amount of movement between the first and third of the 30 images constituting the moving image, between the third and fifth images, and so on may be calculated; that is, the amount of movement between two images skipping one image can be calculated.
  • The number of images skipped when calculating the amount of movement is not limited to one.
  • For example, the amount of movement between two images skipping 1 to 20 images, particularly 1 to 10 images, more particularly 2, 3, 4, 5, or 10 images, may be calculated. By reducing the number of images used for motion amount calculation, it is possible to reduce the amount of data generated by the motion amount calculation unit.
  • the motion amount calculation unit may calculate a motion amount between two images at predetermined two points in time among the plurality of images constituting the moving image.
  • Two images at two predetermined time points may be, for example, an image obtained by imaging a predetermined area at a certain time point and an image obtained by imaging the area after a predetermined time has elapsed from that time point. For example, images every 0.01 to 1 second, preferably every 0.1 to 0.5 seconds, can be used for the movement amount calculation. More specifically, when the present technology is applied to a 10-second moving image, the amount of movement between the first image constituting the moving image and the second image captured 0.5 seconds after it, the amount of movement between that second image and the third image captured 0.5 seconds after it, and so on, can be calculated as the amounts of movement between two images at predetermined time points.
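The pairing strategies described above (temporally consecutive frames, frames separated by a fixed number of skipped images, and frames separated by a fixed time interval) can be expressed as frame index pairs. A minimal sketch, with all parameter values hypothetical:

```python
def frame_pairs(num_frames: int, skip: int = 0) -> list[tuple[int, int]]:
    """Index pairs (i, j) for motion amount calculation.
    skip=0 pairs consecutive frames; skip=1 pairs every other frame, etc."""
    step = skip + 1
    return [(i, i + step) for i in range(0, num_frames - step, step)]

def frame_pairs_by_interval(num_frames: int, fps: float, interval_s: float) -> list[tuple[int, int]]:
    """Index pairs separated by a fixed time interval, e.g. every 0.5 s."""
    step = max(1, round(fps * interval_s))
    return [(i, i + step) for i in range(0, num_frames - step, step)]

# 30 frames: consecutive pairs (0,1), (1,2), ..., (28,29)
print(frame_pairs(30))
# pairs 0.5 s apart at 30 fps: only (0, 15) fits within 30 frames
print(frame_pairs_by_interval(30, fps=30.0, interval_s=0.5))
```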
  • The flow channel region may be a region having a flow channel in at least one portion, or a region that possibly has a flow channel in at least one portion. In the latter case, by imaging the area assumed to have a flow path and applying the present technology to the image obtained by imaging, information such as the presence or absence of a flow path in the area or the shape of the flow path can be obtained.
  • the flow passage region may be, for example, a flow passage in a living body in which particles are flowing and / or an artificial flow passage in which particles are flowing.
  • the flow path may be a structure through which a substance, such as particles, can flow.
  • Examples of the flow channel include, but are not limited to, flow channels in a living body such as blood vessels, lymphatic vessels, and axons; artificial biological flow channels such as artificial blood vessels and artificial lymphatic vessels; and non-biological flow channels such as microchannels.
  • the particles flowing in the flow channel facilitate calculation of the amount of movement between the images. That is, in the present technology, the amount of movement may be the amount of movement of particles flowing in the channel.
  • the particles may include living particles and non-living particles.
  • the biological particles include various biological molecules, biological particles such as cells, microorganisms, solid components of biological origin, and liposomes.
  • the non-living particles include, but are not limited to, non-living molecules, synthetic particles such as latex particles, gel particles, and industrial particles.
  • the biomolecules include, but are not limited to, molecules transported by axon and molecules flowing in the blood stream.
  • the cells can include animal cells and plant cells.
  • Animal cells can include, for example, tumor cells and blood cells, such as red blood cells.
  • the microorganisms may include bacteria such as E. coli, fungi such as yeast, and the like.
  • Examples of the living body-derived solid component include solid crystals produced in the living body.
  • the non-biomolecules can include reagent molecules that can be flowed into the flow channel such as blood vessels or axons.
  • the synthetic particles may be particles made of, for example, an organic or inorganic polymer material or a metal.
  • Organic polymeric materials may include polystyrene, styrene divinyl benzene, and polymethyl methacrylate.
  • Inorganic polymeric materials can include glass, silica, magnetic materials, and the like.
  • Metals may include gold colloids and aluminum.
  • the synthetic particles can be, for example, beads, in particular fluorescent beads, intended to be flowed in the bloodstream or in the microchannel.
  • the particles may be a combination of a plurality of particles, such as two or three.
  • the particles may be labeled for enabling imaging by the imaging device. Labels can include, for example, fluorescent labels.
  • the lower limit of the maximum dimension of the cross section of the flow path to be analyzed in the present technology may be, for example, 1 ⁇ m, preferably 5 ⁇ m, more preferably 10 ⁇ m.
  • the upper limit of the largest dimension of the cross section may be, for example, 10 mm, preferably 5 mm, more preferably 1 mm.
  • the largest dimension of the cross-section may be the diameter if the cross-section is circular, the major axis if the cross-section is elliptical, and the diagonal length if the cross-section is rectangular.
  • the motion amount is, for example, a motion vector.
  • Motion vectors may be calculated, for example, by techniques used in motion compensated prediction coding. Examples of such techniques include, but are not limited to, gradient methods and block matching methods.
  • The gradient method is a technique that utilizes the fact that the brightness gradient is approximately constant within a small region of the image. For example, by detecting, in the image to be compared (for example, the image at the point in time immediately after a certain point in time), a portion having a brightness change similar to the brightness change at a certain portion of the image at that certain point in time, the motion vector of that portion is calculated.
  • the field of motion vectors of the entire image is also called optical flow.
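The patent does not tie the gradient method to any particular implementation; as one illustration, a dense motion-vector field (optical flow) between two consecutive grayscale frames can be computed with OpenCV's Farneback algorithm, which belongs to the gradient-based family. The frame file names below are hypothetical.

```python
import cv2

# Two temporally consecutive grayscale frames (hypothetical file names).
prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# Dense optical flow: flow[y, x] = (dx, dy) motion vector at each pixel.
flow = cv2.calcOpticalFlowFarneback(
    prev, curr, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Magnitude and direction of each motion vector.
magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print(magnitude.shape, magnitude.mean())
```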
  • In the block matching method, an image is divided into rectangular blocks, for example square blocks, and motion vectors are calculated on a per-block basis.
  • An image at a certain point in time is divided into rectangular blocks, and for each block, the best-matching block is detected in the image to be compared (for example, the image at the point in time immediately before that certain point in time), and the motion vector of each block is calculated from the positions of the two blocks.
  • a pixel block is a set of pixels that constitute an image.
  • the pixel block can be appropriately selected according to, for example, a method for calculating a motion vector.
  • The pixel block may be, for example, a rectangular pixel block having one side of 1 to 50 pixels, particularly 2 to 30 pixels, more particularly 3 to 20 pixels, and the other side of 1 to 50 pixels, particularly 2 to 30 pixels, more particularly 3 to 20 pixels.
  • the size of the pixel block may be appropriately set by those skilled in the art according to, for example, the method of calculating the amount of movement, the size of the image, and the size of particles flowing in the flow path.
  • The pixel block can be, for example, 16 × 16, 8 × 16, 16 × 8, 8 × 8, 4 × 8, 8 × 4, or 4 × 4 pixels. Also, if block matching is used in the calculation of the amount of motion, the pixel block may preferably be square.
  • the feature amount may be an amount representing the feature of the motion amount in the pixel block.
  • The feature amount can express, for example, the velocity and/or direction of flow in the pixel block, the moving velocity of particles present in the pixel block, a tendency or average of that velocity, the direction in which particles present in the pixel block move, and/or a tendency of that direction.
  • the feature amount may be, for example, an integrated value or an average value of motion amounts, or a maximum value of the motion amounts. These values are suitable for characterizing the amount of motion in the pixel block.
  • the integrated value, the average value, and the maximum value may be an integrated value, an average value, and a maximum value of motion vectors, respectively.
  • the integrated value, average value, and maximum value of motion vectors can be calculated by a method known to those skilled in the art.
  • In the calculation, data processing such as exclusion of outliers may be performed, for example.
  • the feature amount may be, for example, a maximum value of a number representing a motion amount for each pixel block (for example, a size of a motion vector).
  • For example, the magnitude of the motion vector between images is quantified for each pixel block, the quantified magnitudes for each pixel block are collected over a predetermined time, and the maximum of the collected values may be used as the feature amount of that pixel block.
  • The number of motion amount data used to obtain the feature amount is preferably 5 or more, more preferably 10 or more, more preferably 20 or more, still more preferably 100 or more, 200 or more, 500 or more, or 1000 or more. The larger the number of motion amount data used, the more appropriate the feature amount that can be obtained.
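Given a stack of per-pixel motion magnitudes over time (for example from the optical-flow sketch above), the integrated value, average value, and maximum value per pixel block described above can be computed by block-wise aggregation. A minimal sketch, assuming the image dimensions are multiples of the block size:

```python
import numpy as np

def block_feature(magnitudes: np.ndarray, block: int = 16, mode: str = "sum") -> np.ndarray:
    """Aggregate per-pixel motion magnitudes into a per-block feature amount.

    magnitudes: array of shape (T, H, W), one magnitude map per frame pair.
    block: side length of the square pixel block.
    mode: "sum" (integrated value), "mean" (average value), or "max" (maximum value).
    Returns an (H // block, W // block) feature map.
    """
    t, h, w = magnitudes.shape
    assert h % block == 0 and w % block == 0, "H and W must be multiples of block"
    # Reshape so that each pixel block and each frame pair can be reduced together.
    blocks = magnitudes.reshape(t, h // block, block, w // block, block)
    if mode == "sum":
        return blocks.sum(axis=(0, 2, 4))
    if mode == "mean":
        return blocks.mean(axis=(0, 2, 4))
    if mode == "max":
        return blocks.max(axis=(0, 2, 4))
    raise ValueError(mode)

# Example with random data: 30 frame pairs of a 256 x 320 image, 16 x 16 blocks.
rng = np.random.default_rng(0)
feature = block_feature(rng.random((30, 256, 320)), block=16, mode="mean")
print(feature.shape)  # (16, 20)
```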
  • the flow path internal information may be, for example, information on the structure of the flow path and information on the flow in the flow path.
  • Examples of the information on the structure of the flow path include the shape of the flow path and the area of the flow path.
  • Information on the flow in the flow path can include, for example, the flow velocity in the flow path, the flow velocity distribution in the flow path, and the direction of the flow.
  • the flow channel internal information acquisition unit of the image processing apparatus may include a color information addition unit.
  • the color information adding unit adds color information to each pixel block according to the feature amount.
  • the color information may be, for example, color information based on an XYZ color system or an RGB color system.
  • The rules for applying color information may be set as appropriate by those skilled in the art. For example, the following rule may be set: when the feature amount of a pixel block is equal to or greater than a predetermined value, the color information adding unit adds predetermined color information to all the pixels constituting that pixel block, and when the feature amount is less than the predetermined value, the color information adding unit adds color information different from that color information to all the pixels constituting the pixel block.
  • the inside of the flow path and the other portion can be distinguished.
  • three or more different color information may be provided depending on the feature amount.
  • the number of types of color information to be provided may be, for example, 3 to 50, 4 to 40, 5 to 30, or 10 to 20.
  • the inside of the flow path can be distinguished from the other portions, and the inside of the flow path can be divided into a slow flow portion and a fast flow portion. Thereby, it is possible to identify the stagnation portion and / or the flow velocity reduction portion in the flow channel. Further, the color information to be provided may be gradually changed according to the feature amount.
  • For example, color information may be provided such that a gradation appears in the flow path. This makes it easy to grasp the flow velocity distribution in the flow channel.
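A minimal sketch of the color-assignment rules described above, covering a single threshold (two-level image), several thresholds (multi-level coloring), and a continuous gray ramp (gradation); the specific colors and threshold values are hypothetical.

```python
import numpy as np

def colorize_blocks(feature: np.ndarray, thresholds, colors) -> np.ndarray:
    """Assign an RGB color to each pixel block according to its feature amount.

    thresholds: ascending list of boundaries, e.g. [t1] or [t1, t2].
    colors: list of RGB triples, one more than the number of thresholds;
            colors[0] is used below thresholds[0], colors[-1] above the last.
    Returns an (M, N, 3) uint8 array of block colors.
    """
    idx = np.digitize(feature, thresholds)        # 0 .. len(thresholds)
    palette = np.asarray(colors, dtype=np.uint8)
    return palette[idx]

# Two-level rule: white outside the flow path, gray inside (threshold hypothetical).
feature = np.array([[0.1, 2.5], [3.0, 0.2]])
two_level = colorize_blocks(feature, [1.0], [(255, 255, 255), (128, 128, 128)])

# Gradation: map the feature amount linearly onto a gray ramp instead.
norm = (feature - feature.min()) / (np.ptp(feature) + 1e-12)
gradation = (255 * (1.0 - norm)).astype(np.uint8)  # darker = larger feature
print(two_level.shape, gradation)
```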
  • the flow channel internal information acquisition unit of the image processing apparatus may include a flow channel area image generation unit.
  • the flow passage internal information acquisition unit may include a color information addition unit and a flow passage area image generation unit.
  • the flow channel area image creation unit creates an image based on the color information of each pixel block provided by the color information application unit. By performing color reproduction based on the color information given to each pixel, an image of the flow passage area can be created.
  • the flow channel internal information acquisition unit of the image processing apparatus may include a flow velocity acquisition unit.
  • the flow passage internal information acquisition unit may include a flow velocity acquisition unit, a color information addition unit, and a flow passage area image generation unit.
  • the flow velocity acquisition unit acquires the flow velocity based on the feature amount.
  • the color information adding unit may add color information to each pixel block according to the acquired flow velocity.
  • the application of color information may be performed, for example, in the same manner as the rule described above, that is, color information may be applied according to a rule in which “feature amount” is replaced with “flow velocity” in the rule described above.
  • The color information adding unit may further add other color information to the pixels forming the boundary between the slow flow rate part and the fast flow rate part.
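The patent does not give an explicit formula for converting the feature amount into a flow velocity. One plausible conversion, assuming the feature amount is an average motion-vector magnitude in pixels per frame pair and that the imaging scale (micrometres per pixel) and frame rate are known, is sketched below; all numeric values are hypothetical.

```python
def flow_velocity_um_per_s(avg_pixels_per_frame: float,
                           um_per_pixel: float,
                           fps: float) -> float:
    """Convert an average per-frame displacement (pixels/frame) into a
    physical flow velocity (micrometres per second)."""
    return avg_pixels_per_frame * um_per_pixel * fps

# Hypothetical example: 2.4 pixels of displacement per frame pair,
# 0.5 um per pixel, imaged at 30 fps -> 36 um/s.
print(flow_velocity_um_per_s(2.4, 0.5, 30.0))  # 36.0
```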
  • the flow channel internal information acquisition unit of the image processing apparatus may include an abnormal area identification unit.
  • the flow channel internal information acquisition unit may include a flow velocity acquisition unit, an abnormal area identification unit, a color information application unit, and a flow channel area image creation unit.
  • the abnormal area identification unit can identify an abnormal area of the flow based on, for example, the flow velocity.
  • The abnormal area identification unit may generate information for indicating to the user a pixel block identified as an abnormal area. Examples of such information include color information for indicating the boundary between the abnormal area and other areas, or information for causing the abnormal area to blink when displayed.
  • the abnormal region identification unit can identify a portion where the flow velocity is lower than a predetermined value as a portion where thrombus is likely to occur or a portion where thrombus is occurring.
  • the portion identified as the abnormal area may be given color information so as to give a predetermined color or pattern, for example, by the color information giving unit.
  • the flow passage area image generation unit may generate a flow passage area image, or an output unit may perform an output.
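A minimal sketch of the abnormal-area identification described above: blocks whose flow velocity falls below a threshold are flagged, and boundary information is produced so the region can be indicated to the user. The threshold value and the neighbour-based boundary test are assumptions, not details from the text.

```python
import numpy as np

def abnormal_blocks(velocity: np.ndarray, min_velocity: float) -> np.ndarray:
    """Boolean (M, N) mask of pixel blocks whose flow velocity is below
    `min_velocity`, restricted to blocks that carry any flow at all."""
    return (velocity > 0) & (velocity < min_velocity)

def boundary_mask(mask: np.ndarray) -> np.ndarray:
    """Blocks of `mask` that touch at least one block outside the mask; these
    can be highlighted (or blinked) to indicate the abnormal region."""
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior   # interior blocks have all four neighbours set

velocity = np.array([[0.0, 5.0, 5.0],
                     [0.0, 1.0, 5.0],
                     [0.0, 1.0, 4.0]])
slow = abnormal_blocks(velocity, min_velocity=2.0)
print(slow)
print(boundary_mask(slow))
```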
  • the flow channel internal information acquisition unit of the image processing apparatus may include an area calculation unit.
  • the flow passage internal information acquisition unit may include an area calculation unit, a color information addition unit, and a flow passage area image creation unit.
  • the area calculation unit can calculate the area of a pixel block in which the feature value is within a predetermined range.
  • the area of the flow path can be calculated by calculating the area of a pixel block having a feature value equal to or greater than a predetermined value.
  • the area calculation unit may calculate the area of a pixel having predetermined color information. That is, after the color information is given by the color information giving unit, the area can be calculated based on the color information.
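A minimal sketch of the area calculation described above: pixel blocks whose feature amount is at or above a threshold are counted and the count is converted into an area in pixels or, if the imaging scale is known, into physical units. The scale value is hypothetical.

```python
import numpy as np

def channel_area(feature: np.ndarray, threshold: float,
                 block: int = 16, um_per_pixel: float | None = None) -> float:
    """Area of the flow path, taken as all blocks with feature >= threshold.
    Returns pixels^2, or um^2 if `um_per_pixel` is given."""
    n_blocks = int(np.count_nonzero(feature >= threshold))
    area_px = n_blocks * block * block
    if um_per_pixel is None:
        return float(area_px)
    return area_px * um_per_pixel ** 2

feature = np.array([[0.2, 3.1, 2.8],
                    [0.1, 2.9, 0.3]])
print(channel_area(feature, threshold=1.0))                    # 3 blocks * 256 px^2
print(channel_area(feature, threshold=1.0, um_per_pixel=0.5))  # in um^2
```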
  • the flow channel internal information acquisition unit of the image processing apparatus may include a flow direction identification unit.
  • the flow passage internal information acquisition unit may include a flow direction identification unit, a color information addition unit, and a flow passage area image creation unit.
  • the flow direction identification unit identifies the flow direction in the pixel block based on the feature amount. By identifying the flow direction, it is possible to identify, for example, a region where swirling or turbulent flow is occurring.
  • the flow direction identification unit can, for example, identify only the flow direction of a pixel block having a feature amount equal to or greater than a predetermined value. As a result, the flow direction is identified only in the inside of the flow path, and the flow direction is not identified in parts other than the flow path.
  • the identified flow direction may be displayed, for example, as a line or an arrow in the flow path. As a result, the user of the image processing apparatus according to the present technology can more easily grasp the flow direction in the flow path.
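A minimal sketch of the flow-direction identification described above: the average flow direction is computed per pixel block from a dense flow field and drawn as an arrow only on blocks whose feature amount exceeds a threshold, so that no direction is drawn outside the flow path. OpenCV's arrowedLine is used for the overlay; the block size, threshold, and arrow scaling are hypothetical.

```python
import cv2
import numpy as np

def draw_block_directions(image_bgr: np.ndarray, flow: np.ndarray,
                          feature: np.ndarray, block: int = 16,
                          threshold: float = 1.0) -> np.ndarray:
    """Draw one arrow per pixel block showing the mean flow direction.
    flow: (H, W, 2) motion-vector field; feature: (H//block, W//block) map."""
    out = image_bgr.copy()
    m, n = feature.shape
    for by in range(m):
        for bx in range(n):
            if feature[by, bx] < threshold:
                continue  # outside the flow path: no direction is identified
            patch = flow[by * block:(by + 1) * block, bx * block:(bx + 1) * block]
            dx, dy = patch[..., 0].mean(), patch[..., 1].mean()
            cx, cy = bx * block + block // 2, by * block + block // 2
            tip = (int(cx + 4 * dx), int(cy + 4 * dy))  # scaled for visibility
            cv2.arrowedLine(out, (cx, cy), tip, color=(0, 0, 255), thickness=1)
    return out
```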
  • The flow channel internal information acquisition unit may include one, two, three, four, or five components selected from a color information addition unit, a flow channel region image generation unit, a flow velocity acquisition unit, an abnormal region identification unit, an area calculation unit, and a flow direction identification unit. Of course, in the present technology, all six of these components may be included in the flow channel internal information acquisition unit.
  • FIG. 1 is a diagram showing an example of an image processing system including an image processing apparatus according to the present technology.
  • FIG. 2 is an example of a block diagram of an image processing apparatus according to the present technology.
  • the image processing system 100 includes an imaging device 101 and an image processing device 200.
  • the imaging device 101 can capture an image of a flow passage region to be imaged, and can be appropriately selected or configured by those skilled in the art.
  • the imaging device 101 may comprise, for example, a camera, in particular a digital video camera.
  • the imaging device 101 of FIG. 1 includes a microscope 102 and a digital video camera 103.
  • the microscope 102 is configured to be able to observe the sample 104 including the flow passage area to be imaged.
  • the image processing apparatus 200 is connected to the imaging apparatus 101 by wire or wireless so that the image data captured by the imaging apparatus 101 can be acquired.
  • the image processing apparatus 200 includes an image recording unit 201, a motion amount calculation unit 202, a feature amount acquisition unit 203, a flow path internal information acquisition unit 204, and an output unit 207.
  • the flow channel internal information acquisition unit 204 includes a color information addition unit 205, a flow channel area image generation unit 206, an area calculation unit 208, and a flow direction identification unit 209.
  • the imaging apparatus 101 captures an image of the flow passage area of the sample 104 with the digital video camera 103 via the microscope 102.
  • the captured moving image data is stored in the image recording unit 201 in the image processing apparatus 200.
  • An example of the moving image data is shown in FIG. 3.
  • The moving image data is composed of a total of t image data, from the first image data 301-1 to the t-th image data 301-t. The first image data 301-1 is the image recorded first in the imaging period of the moving image data, and the t-th image data 301-t is the image recorded last in the imaging period.
  • the motion amount calculation unit 202 calculates the motion amount between the images from the plurality of images constituting the moving image data.
  • the motion amount calculation unit 202 calculates, for example, a motion vector as the motion amount.
  • The motion amount calculation unit 202 calculates the amount of motion between the image data 301-1 and the image data 301-2, the amount of motion between the image data 301-2 and the image data 301-3, ..., and the amount of motion between the image data 301-(t-1) and the image data 301-t.
  • the motion amount calculation unit 202 can compare the image data 301-1 and the image data 301-2 by, for example, the gradient method.
  • the motion amount calculation unit 202 detects, in the image data 301-2, a portion having a luminance change similar to the luminance change at a certain portion in the image data 301-1.
  • The motion amount calculation unit 202 then calculates a motion vector from the positions of the two portions. Similarly, it calculates a motion vector between the image data 301-2 and the image data 301-3, ..., and a motion vector between the image data 301-(t-1) and the image data 301-t.
  • When the calculated motion vectors are represented by arrows, they can be represented as, for example, 302-1, 302-2, ..., and 302-(t-1) in FIG. 3.
  • In the present technology, the motion amounts do not have to be represented by arrows as in 302-1, 302-2, ..., and 302-(t-1) of FIG. 3; they may simply be recorded as data.
  • the feature amount acquisition unit 203 acquires, for each of the pixel blocks constituting the image, a feature amount related to the motion amount calculated by the motion amount calculation unit 202.
  • the imaged flow path region is divided into M pieces in the horizontal direction and N pieces in the vertical direction, for example, as shown in FIG. That is, the flow passage area is divided into pixel blocks divided into M ⁇ N grids.
  • the feature amount acquisition unit 203 integrates or averages the motion amount calculated by the motion amount calculation unit 202 for each pixel block, or acquires the maximum value of the motion amount for each pixel block.
  • the internal flow channel information acquisition unit 204 acquires internal flow channel information of the flow channel region based on the feature amount acquired by the feature amount acquisition unit 203.
  • An example in which the feature amount is an integrated value and the flow path internal information is an image of a flow path region will be described below.
  • the color information application unit 205 applies color information to each pixel block in accordance with the integrated value acquired by the feature amount acquisition unit 203. For example, when the integrated value of a pixel block is higher than a predetermined threshold, gray color information is added to the pixel block, and when the integrated value of a pixel block is lower than the threshold, the pixel The block is given white color information.
  • the flow path area image creation unit 206 creates a flow path area image from the color information provided to each pixel block by the color information addition unit 205. For example, a flow path area image as shown on the right side of FIG. 5 is created. The image shown on the left side of FIG. 5 is a diagram showing each pixel block by a grid. The shape of the flow path can be grasped by the flow path region image.
  • the area calculation unit 208 calculates, for example, the area of a pixel block to which gray color information is added by the color information addition unit 205. Thereby, the area of the flow path in the created image can be calculated.
  • the flow direction identification unit 209 identifies the flow direction in the pixel block based on, for example, the feature amount. In the identification, for example, the flow direction in the pixel block designated by the user can be identified.
  • the output unit 207 outputs the flow channel area image generated by the flow channel area image generation unit 206.
  • the output unit 207 may be, for example, a display or a printer. Further, the output unit 207 may output the area calculated by the area calculation unit 208 simultaneously with the output of the flow path region image. In addition, the output unit 207 may display the flow direction identified by the flow direction identification unit 209, for example, by an arrow or a line on the flow passage area image.
  • By performing the image processing according to the present technology at, for example, two different times, it is also possible to observe the generation and disappearance of a flow path.
  • For example, by performing the image processing according to the present technology based on a moving image obtained by imaging a certain flow channel area at a certain time, the image shown on the left of FIG. 11 is obtained. By performing the image processing according to the present technology based on a moving image obtained by imaging the same flow channel area at another time, the image shown on the right of FIG. 11 is obtained.
  • By comparing the two images, the generation of the flow channel can be observed. This makes it possible, for example, to assess angiogenesis and axon generation.
  • FIG. 1 is as described above.
  • FIG. 6 is an example of a block diagram of an image processing apparatus according to the present technology.
  • the image processing apparatus 200 includes an image recording unit 601, a motion amount calculation unit 602, a feature amount acquisition unit 603, a flow channel internal information acquisition unit 604, and an output unit 607.
  • the flow channel internal information acquisition unit 604 includes a color information addition unit 605, a flow channel area image generation unit 606, a flow velocity acquisition unit 608, and an abnormal area identification unit 609.
  • the imaging apparatus 101 captures an image of the flow passage area of the sample 104 with the digital video camera 103 via the microscope 102.
  • the captured moving image data is stored in the image recording unit 601 in the image processing apparatus 200.
  • An example of the moving image data is shown in FIG. 7.
  • The moving image data is composed of a total of t image data, from the first image data 701-1 to the t-th image data 701-t. The first image data 701-1 is the image recorded first in the imaging period of the moving image data, and the t-th image data 701-t is the image recorded last in the imaging period.
  • the motion amount calculation unit 602 calculates a motion amount between images from a plurality of images forming the moving image data.
  • the motion amount calculation unit 602 calculates a motion vector as the motion amount.
  • The motion amount calculation unit 602 calculates the amount of motion between the image data 701-1 and the image data 701-2, the amount of motion between the image data 701-2 and the image data 701-3, ..., and the amount of motion between the image data 701-(t-1) and the image data 701-t.
  • the calculation of the movement amount will be described below.
  • the motion amount calculation unit 602 can compare the image data 701-1 and the image data 701-2 by the block matching method.
  • The motion amount calculation unit 602 divides, for example, each of the image data 701-1 and the image data 701-2 into the M × N pixel blocks shown in FIG. 4.
  • the motion amount calculation unit 602 divides both image data into M pieces in the horizontal direction and into N pieces in the vertical direction.
  • the pixel block may be, for example, a square block of 16 ⁇ 16 pixels.
  • the motion amount calculation unit 602 detects a block closest to a certain block in the image data 701-1 in the image data 701-2.
  • the motion amount calculation unit 602 calculates a motion vector from the positions of the two blocks.
  • Similarly, the motion amount calculation unit 602 calculates a motion vector between the image data 701-2 and the image data 701-3, ..., and a motion vector between the image data 701-(t-1) and the image data 701-t.
  • When the calculated motion vectors are represented by arrows, they can be represented as, for example, 702-1, 702-2, ..., and 702-(t-1) in FIG. 7. Some of the arrows indicated by 702-1, 702-2, ..., and 702-(t-1) in FIG. 7 are smaller than the other arrows; these small arrows schematically indicate that the corresponding motion vectors are smaller than the others. In the present technology, the motion amounts do not have to be represented by arrows as in 702-1, 702-2, ..., and 702-(t-1) of FIG. 7; they may simply be recorded as data.
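A minimal sketch of the block-matching step used in this example: for one block of the earlier frame, the best-matching block in the later frame is found within a search window by minimising the sum of absolute differences (SAD), and the displacement of the best match is the motion vector. The search radius is hypothetical.

```python
import numpy as np

def match_block(prev: np.ndarray, curr: np.ndarray,
                top: int, left: int, block: int = 16, radius: int = 8):
    """Motion vector (dx, dy) of the block of `prev` at (top, left),
    found by exhaustive SAD search in `curr` within +/- radius pixels."""
    ref = prev[top:top + block, left:left + block].astype(np.int32)
    h, w = curr.shape
    best, best_vec = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > h or x + block > w:
                continue
            cand = curr[y:y + block, x:x + block].astype(np.int32)
            sad = np.abs(ref - cand).sum()
            if best is None or sad < best:
                best, best_vec = sad, (dx, dy)
    return best_vec

# Hypothetical use: motion vector of the block at row 32, column 48
# between two consecutive grayscale frames prev_frame and curr_frame:
# vec = match_block(prev_frame, curr_frame, top=32, left=48)
```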
  • The feature amount acquisition unit 603 acquires, for each of the pixel blocks constituting the image, a feature amount related to the motion amount calculated by the motion amount calculation unit 602. For example, the feature amount acquisition unit 603 integrates or averages the motion amounts calculated by the motion amount calculation unit 602 for each of the M × N pixel blocks shown in FIG. 4 described above, or acquires the maximum value of the motion amount for each pixel block.
  • the flow channel internal information acquisition unit 604 acquires flow channel internal information of the flow channel region based on the feature amount acquired by the feature amount acquisition unit 603.
  • An example in which the feature amount is an average value and the flow channel internal information is a flow velocity in the flow channel will be described below.
  • the flow velocity acquisition unit 608 calculates the flow velocity of each pixel block from the average value acquired by the feature amount acquisition unit 603. Based on the calculated flow velocity, for example, the color information giving unit 605 described above gives color information to the pixel block.
  • the color information addition unit 605 for example, applies the first color information to the pixel block whose flow rate is equal to or higher than the first threshold.
  • the color information addition unit 605 applies second color information to pixel blocks that are equal to or greater than a second threshold that is less than the first threshold and that is lower than the first threshold.
  • the color information adding unit 605 adds third color information to the pixel block smaller than the second threshold.
  • Thereby, it can be shown that a pixel block provided with the first color information has a high flow velocity, a pixel block provided with the second color information has a low flow velocity, and a pixel block provided with the third color information has no flow (i.e., is outside the flow path).
  • FIGS. 7 and 8 show an example in which the color information adding unit 605 adds dark gray as the first color information, light gray as the second color information, and white as the third color information.
  • In the portion indicated by the small arrows, the average value is smaller than in the other flow passage regions; as a result, the flow velocity there is less than the first threshold and equal to or greater than the second threshold, so the second color information (light gray) is given to those pixel blocks.
  • In the other flow passage regions, the flow velocity is equal to or higher than the first threshold.
  • Therefore, dark gray color information is given to those pixel blocks.
  • a channel area image created as a result of the channel area image creation unit 606 performing color reproduction based on the color information provided by the color information application unit 605 is shown on the right of FIG.
  • the portions shown in dark gray and light gray are the flow channel regions.
  • a portion shown in light gray indicates a portion where the flow velocity is slower than that of the other flow channel areas.
  • White portions are outside the flow path. From the image on the right of FIG. 8, the shape of the flow channel and the distribution of flow velocity within it can be grasped.
  • the image shown on the left side of FIG. 8 is a diagram showing each pixel block by a grid. Also, in addition to the calculation of the flow velocity, the flow direction in each pixel block may be determined. The determination of the flow direction may be performed by, for example, the flow direction identification unit described above.
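A minimal sketch of the color reproduction performed by the flow channel area image creation unit: each block's color is expanded over its block of pixels to form the full-resolution flow channel region image. The three-level dark-gray/light-gray/white scheme follows the example above; the threshold values and gray levels are hypothetical.

```python
import numpy as np

def block_colors_to_image(velocity: np.ndarray, t1: float, t2: float,
                          block: int = 16) -> np.ndarray:
    """Expand per-block colors to a full-resolution RGB image.
    velocity >= t1 -> dark gray, t2 <= velocity < t1 -> light gray, else white."""
    labels = np.digitize(velocity, [t2, t1])        # 0: white, 1: light gray, 2: dark gray
    palette = np.array([[255, 255, 255],
                        [192, 192, 192],
                        [96, 96, 96]], dtype=np.uint8)
    block_img = palette[labels]                     # (M, N, 3) block colors
    # Repeat each block color over a block x block pixel area.
    return np.kron(block_img, np.ones((block, block, 1), dtype=np.uint8))

velocity = np.array([[0.0, 3.0, 3.5],
                     [0.0, 1.2, 3.0]])
image = block_colors_to_image(velocity, t1=2.0, t2=1.0, block=16)
print(image.shape)  # (32, 48, 3)
```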
  • As described above, the motion amount calculation unit 602 calculates motion vectors between the image data 701-1 and 701-2, between the image data 701-2 and 701-3, ..., and between the image data 701-(t-1) and 701-t. The magnitudes of the calculated motion vectors are digitized for each pixel block, as shown in the figure illustrating acquisition of feature amounts from moving image data. For example, the portion surrounded by the square in the image of 702-1 is represented as 703-1; as shown in 703-1, the magnitude of the motion vector is quantified for each pixel block.
  • Likewise, in each of 703-2 to 703-(t-1), the magnitude of the motion vector for each pixel block is digitized. From the grid information of each of 703-1 to 703-(t-1), the maximum value for each pixel block is obtained, as indicated by 704. As shown in 704, for example, a portion represented by 5 has a high flow rate, while a portion represented by 0 or 1 has no or little flow; in the part represented by 0 or 1, a factor impeding the flow is therefore considered to exist. Also, as indicated by 703-1 to 703-(t-1) and 704, color information may be given to each pixel block according to the maximum value; the application of the color information may be performed as described above.
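The step from the per-pair grids 703-1 to 703-(t-1) to the maximum-value grid 704 is an elementwise maximum over time. A minimal sketch with small hypothetical grids:

```python
import numpy as np

# Hypothetical per-block motion magnitudes for three frame pairs (703-1 .. 703-3).
grids = np.array([[[0, 4, 5],
                   [0, 1, 4]],
                  [[0, 5, 4],
                   [0, 0, 5]],
                  [[0, 3, 5],
                   [0, 1, 3]]])

# Elementwise maximum over the time axis gives a 704-style grid:
# blocks reaching 4-5 carry strong flow; blocks stuck at 0-1 suggest an impeded flow.
maximum = grids.max(axis=0)
print(maximum)   # [[0 5 5]
                 #  [0 1 5]]
```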
  • the abnormal area identification unit 609 identifies, for example, a pixel block to which the second color information is added as an abnormal area.
  • the abnormal area identification unit 609 can generate information for indicating the pixel block as an abnormal area to the user. Examples of the information include color information for indicating the boundary between the abnormal area and the other area, information for blinking the abnormal area for display, or information for displaying an arrow pointing to the abnormal area. .
  • the flow channel area image generation unit 606 can generate a flow channel area image, or an output unit can perform output.
  • the output unit 607 outputs the flow channel area image created as described above.
  • the output unit 607 may be, for example, a display or a printer.
  • The present technology also provides an image processing method including: a motion amount calculating step of calculating a motion amount between at least two images of a plurality of images obtained by imaging the flow passage region continuously in time; a feature amount obtaining step of obtaining a feature amount related to the movement amount for each pixel block constituting at least a part of the image of the flow passage region; and a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
  • the flow path internal information can be acquired by the image processing method according to the present technology.
  • the acquired flow channel internal information is as described in the above “1. First embodiment (image processing apparatus)”.
  • FIG. 9 is a diagram illustrating an example of the flow of an image processing method according to the present technology.
  • In step S101, the image processing apparatus 200 starts image processing according to the present technology.
  • In step S102, the motion amount calculation unit 202 acquires image data, for example, moving image data.
  • the image data may be recorded in the image recording unit 201, or may be recorded in the imaging device 101 (in particular, the digital video camera 103).
  • In step S103, the motion amount calculation unit 202 calculates the amount of motion between at least two of the plurality of images included in the acquired image data.
  • the contents described in regard to the motion amount calculation unit in the above “1. First embodiment (image processing apparatus)” also apply to the motion amount calculation step of step S103.
  • For example, the motion amount calculation unit 202 can calculate the amount of motion between the image data 301-1 and the image data 301-2 in FIG. 3, the amount of motion between the image data 301-2 and the image data 301-3, ..., and the amount of motion between the image data 301-(t-1) and the image data 301-t.
  • the motion amount may be calculated by, for example, the gradient method or the block matching method.
  • In step S104, the feature amount acquisition unit 203 acquires a feature amount related to the movement amount for each pixel block of the image including the flow channel area.
  • the contents described in relation to the feature amount acquisition unit in “1. First embodiment (image processing apparatus)” also apply to the feature amount acquisition step of step S104.
  • For example, the feature amount acquisition unit 203 can integrate or average the motion amounts calculated by the motion amount calculation unit 202 for each pixel block, or can acquire the maximum value of the motion amount for each pixel block.
  • In step S105, the flow channel internal information acquisition unit 204 acquires flow channel internal information of the flow channel region based on the feature amount.
  • the contents described regarding the flow path internal information acquisition unit in the above “1. First embodiment (image processing apparatus)” also apply to the flow path internal information acquisition step of step S105.
  • the color information addition unit 205 of the flow path internal information acquisition unit 204 can add color information to each pixel block in accordance with the integrated value acquired by the feature amount acquisition unit 203.
  • the flow channel area image generation unit 206 of the flow channel internal information acquisition unit 204 can generate a flow channel area image from the color information provided to each pixel block by the color information addition unit 205.
  • In step S106, the output unit 207 outputs the acquired flow channel internal information.
  • the flow path area image created by the flow path internal information acquisition unit 204 may be displayed by a display or printed by a printer.
  • In step S107, the image processing apparatus 200 ends the image processing according to the present technology.
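Putting steps S102 to S106 together, a compact end-to-end sketch of the method is shown below: motion amounts are calculated between consecutive frames with dense optical flow, a per-block feature amount is aggregated, and a two-level flow channel region image is output. The library choice, thresholds, and file names are assumptions, not part of the described method.

```python
import cv2
import numpy as np

def process_flow_channel_video(path: str, block: int = 16, threshold: float = 1.0):
    """Sketch of steps S102-S106: read frames, calculate motion amounts,
    aggregate a per-block feature amount, and build a flow channel region image."""
    cap = cv2.VideoCapture(path)                     # S102: acquire moving image data
    ok, prev = cap.read()
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    magnitudes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        magnitudes.append(np.linalg.norm(flow, axis=2))   # S103: motion amounts
        prev = curr
    cap.release()
    mags = np.stack(magnitudes)                      # (T-1, H, W)
    t, h, w = mags.shape
    h, w = h - h % block, w - w % block              # crop to a multiple of the block size
    blocks = mags[:, :h, :w].reshape(t, h // block, block, w // block, block)
    feature = blocks.mean(axis=(0, 2, 4))            # S104: per-block feature amount
    color = np.where(feature >= threshold, 128, 255).astype(np.uint8)   # S105
    image = np.kron(color, np.ones((block, block), dtype=np.uint8))
    cv2.imwrite("flow_channel_region.png", image)    # S106: output
    return feature, image
```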
  • The present technology also provides an image processing program for causing a computer to execute: a motion amount calculating step of calculating a motion amount between at least two images of a plurality of images obtained by imaging the flow passage region continuously in time; a feature amount obtaining step of obtaining a feature amount related to the movement amount for each pixel block constituting at least a part of the image of the flow passage region; and a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
  • The image processing program according to the present technology is a program for causing a computer to execute the image processing method according to the present technology. Since each process performed by the program is as described in "2. Second embodiment (image processing method)" above, description thereof is omitted here.
  • FIG. 10 is a diagram showing an example of a schematic hardware configuration of an information processing apparatus for realizing the image processing apparatus according to the present technology.
  • An information processing apparatus 1001 illustrated in FIG. 10 includes a CPU (central processing unit) 1002 and a RAM 1003.
  • The CPU 1002 and the RAM 1003 are connected to each other via a bus 1005, and are also connected to the other components of the information processing apparatus 1001 via the bus 1005.
  • The CPU 1002 performs control and calculation for the information processing apparatus 1001.
  • Any processor can be used as the CPU 1002; examples include processors of the Xeon (registered trademark) series, the Core (trademark) series, or the Atom (trademark) series.
  • Each component of the image processing apparatus 200 described with reference to FIG. 2 can be realized by the CPU 1002, for example.
  • The RAM 1003 includes, for example, a cache memory and a main memory, and can temporarily store programs used by the CPU 1002 and the like.
  • The information processing apparatus 1001 may include a disk 1004, a communication device 1006, an output device 1007, an input device 1008, and a drive 1009. Any of these components can be connected to the bus 1005.
  • The disk 1004 can store an operating system (for example, WINDOWS (registered trademark), UNIX (registered trademark), or LINUX (registered trademark)), the image processing program according to the present technology and various other programs, and various data (for example, image data).
  • The communication device 1006 connects the information processing apparatus 1001 to the network 1010 by wire or wirelessly.
  • The communication device 1006 can enable the information processing apparatus 1001 to communicate with the imaging device via the network 1010.
  • The type of the communication device 1006 may be appropriately selected by those skilled in the art.
  • The output device 1007 can output the processing results of the information processing apparatus 1001.
  • Examples of the output device 1007 include, but are not limited to, a display device such as a display, an audio output device such as a speaker, and a printer.
  • The input device 1008 is a device for the user to operate the information processing apparatus 1001. Examples of the input device 1008 include, but are not limited to, a mouse and a keyboard.
  • The drive 1009 can read out information recorded on a recording medium and output the information to the RAM 1003, and/or can write various data to the recording medium.
  • The recording medium is, for example, a DVD medium, a flash memory, or an SD memory card, but is not limited thereto.
  • The present technology can also be configured as follows.
  • [1] An image processing apparatus including: a motion amount calculation unit that calculates a motion amount between at least two images among a plurality of images obtained by imaging the flow channel region continuously in time;
  • a feature amount acquisition unit that acquires a feature amount related to the motion amount for each pixel block that constitutes at least a part of the image of the flow channel region; and
  • a flow channel internal information acquisition unit that acquires flow channel internal information of the flow channel region based on the feature amount.
  • The image processing apparatus according to [1], wherein the flow channel internal information acquisition unit includes a color information addition unit that adds color information to each pixel block according to the feature amount.
  • The flow channel internal information acquisition unit may further include a flow channel area image generation unit that generates an image of the flow channel region based on the color information.
  • The flow channel internal information acquisition unit may further include a flow velocity acquisition unit that acquires the flow velocity inside the flow channel based on the feature amount.
  • The flow channel internal information acquisition unit may further include an abnormal area identification unit that identifies an abnormal area of the flow based on the flow velocity.
  • The particles may be biological particles.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Multimedia (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Hematology (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Image Analysis (AREA)

Abstract

The purpose of the present invention is to provide a new flow path assessment technique. This technique provides an image processing device that includes: a movement amount calculation unit that calculates the amount of movement between at least two images among a plurality of images obtained by imaging a flow path region in chronological succession; a feature amount acquisition unit that acquires a feature amount relating to the amount of movement for every pixel block that constitutes at least a portion of the images of the flow path region; and a flow path internal information acquisition unit that acquires flow path internal information of the flow path region on the basis of the feature amount.

Description

Image processing apparatus, image processing method, and image processing program
 The present technology relates to an image processing apparatus, an image processing method, and an image processing program. More specifically, the present technology relates to an image processing apparatus, an image processing method, and an image processing program for acquiring flow channel internal information from a plurality of images obtained by continuous imaging of a flow channel region.
 In order to evaluate a living body flow channel, evaluation methods using means such as light or ultrasonic waves are used. Examples of such evaluation methods include a method of measuring blood flow using laser light and a method of acquiring information inside a blood vessel using ultrasonic waves. Means such as a blood vessel contrast agent may also be used to visualize the flow channel.
 In addition, various image processing techniques have been developed to evaluate the movement of a living body.
 For example, Patent Document 1 below describes "a data processing apparatus having a control unit that performs control so as to display an image in which each of a plurality of blocks constituting at least a part of image data is classified based on a feature amount of motion specified from the image data obtained by imaging an object performing periodic motion" (claim 1).
 Patent Document 2 below describes "an image processing apparatus including: a motion detection unit that detects a motion of an evaluation target using an image of the evaluation target; a correlation calculation unit that calculates, using motion vectors indicating the motion of the evaluation target detected by the motion detection unit, a correlation of temporal changes in the amount of motion at a plurality of locations of the evaluation target; and an evaluation value calculation unit that calculates, using the correlation calculated by the correlation calculation unit, an evaluation value for evaluating the cooperativity of the motion of the evaluation target" (claim 1).
 Patent Document 3 below describes "an image processing apparatus including: a motion detection unit that detects the motion of an observation target for each partial region of an observation region; a motion amount calculation unit that calculates the motion amount of each motion detected by the motion detection unit; and a map creation unit that creates a map representing the position and magnitude of each motion amount calculated by the motion amount calculation unit" (claim 1).
Patent Document 1: JP 2015-207308 A
Patent Document 2: JP 2012-105631 A
Patent Document 3: JP 2012-194168 A
 Fluorescently labeled particles are sometimes used in the evaluation of biological flow channels. For example, fluorescent beads are flowed in the blood stream to observe the flow channel, and substances transported along axons are fluorescently labeled to observe the axons. In such observation, the fluorescently labeled substance can be observed, but the flow channel through which the fluorescently labeled substance flows may not be observable. If the flow channel through which a fluorescently labeled substance flows could be evaluated in an analysis using that substance, this would be useful for various analyses and research.
 There are also situations in which conventionally used flow channel evaluation methods cannot be applied. For example, when a new flow channel is formed and a new flow arises, as in angiogenesis and nerve axon generation, the conventionally used flow channel evaluation methods may not be applicable. There are also situations in which flow channel visualization means such as a blood vessel contrast agent cannot be applied.
 The image processing techniques described above are useful for visualizing the motion of an object. If the flow channel through which an object flows could be evaluated by image processing, this would be useful for various analyses and research.
 The present technology aims to provide a new flow channel evaluation technology.
 The present inventors have found that an image processing apparatus having a specific configuration enables flow channel evaluation.
 That is, the present technology provides an image processing apparatus including:
 a motion amount calculation unit that calculates a motion amount between at least two images among a plurality of images obtained by imaging a flow channel region continuously in time;
 a feature amount acquisition unit that acquires a feature amount related to the motion amount for each pixel block that constitutes at least a part of an image of the flow channel region; and
 a flow channel internal information acquisition unit that acquires flow channel internal information of the flow channel region based on the feature amount.
 According to one embodiment of the present technology, the flow channel internal information acquisition unit may include a color information addition unit that adds color information to each pixel block according to the feature amount.
 According to one embodiment of the present technology, the flow channel internal information acquisition unit may further include a flow channel area image creation unit that creates an image of the flow channel region based on the color information.
 According to one embodiment of the present technology, the flow channel internal information acquisition unit may further include a flow velocity acquisition unit that acquires the flow velocity inside the flow channel based on the feature amount.
 According to one embodiment of the present technology, the flow channel internal information acquisition unit may further include an abnormal area identification unit that identifies an abnormal area of the flow based on the flow velocity.
 According to one embodiment of the present technology, the flow channel internal information acquisition unit may further include an area calculation unit that calculates the area of pixel blocks whose feature amount is within a predetermined range.
 According to one embodiment of the present technology, the feature amount may be acquired based on five or more motion amounts.
 According to one embodiment of the present technology, the feature amount may be an integrated value or an average value of the motion amounts, or a maximum value of the motion amounts.
 According to one embodiment of the present technology, the plurality of images may constitute a moving image of the flow channel region.
 According to one embodiment of the present technology, the at least two images may be consecutive images among the images constituting the moving image.
 According to one embodiment of the present technology, the motion amount may be the amount of movement of particles flowing in the flow channel.
 According to one embodiment of the present technology, the particles may be biological particles.
 According to one embodiment of the present technology, the flow channel region may include a flow channel in a living body in which particles are flowing and/or an artificial flow channel in which particles are flowing.
 The present technology also provides an image processing method including:
 a motion amount calculation step of calculating a motion amount between at least two images among a plurality of images obtained by imaging a flow channel region continuously in time;
 a feature amount acquisition step of acquiring a feature amount related to the motion amount for each pixel block that constitutes at least a part of an image of the flow channel region; and
 a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
 The present technology also provides an image processing program for causing a computer to execute:
 a motion amount calculation step of calculating a motion amount between at least two images among a plurality of images obtained by imaging a flow channel region continuously in time;
 a feature amount acquisition step of acquiring a feature amount related to the motion amount for each pixel block that constitutes at least a part of an image of the flow channel region; and
 a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
 The present technology provides a new image processing technology that enables flow channel evaluation. The present technology makes possible, for example, the visualization or quantification of angiogenesis, the visualization of nerve axon generation, or the quantification of the axonal transport rate. Note that the effects achieved by the present technology are not necessarily limited to the effects described here, and may be any of the effects described in the present specification.
FIG. 1 is a diagram showing an example of an image processing system including an image processing apparatus according to the present technology.
FIG. 2 is an example of a block diagram of an image processing apparatus according to the present technology.
FIG. 3 is a diagram showing an example of moving image data.
FIG. 4 is a diagram showing an image divided into M×N pixel blocks.
FIG. 5 is a diagram showing an example of a flow channel area image created according to the present technology.
FIG. 6 is an example of a block diagram of an image processing apparatus according to the present technology.
FIG. 7 is a diagram showing an example of moving image data.
FIG. 8 is a diagram showing an example of a flow channel area image created according to the present technology.
FIG. 9 is a diagram showing an example of the flow of an image processing method according to the present technology.
FIG. 10 is a diagram showing an example of a schematic hardware configuration of an information processing apparatus that realizes an image processing apparatus according to the present technology.
FIG. 11 is a diagram showing an example of a flow channel area image created according to the present technology.
FIG. 12 is a diagram showing an example of acquisition of feature amounts from moving image data.
 Hereinafter, preferred embodiments for carrying out the present technology will be described. The embodiments described below are representative embodiments of the present technology, and the scope of the present technology is not limited to these embodiments. The description is given in the following order.
1. First embodiment (image processing apparatus)
(1) Description of the first embodiment
(2) First example of the first embodiment (image processing apparatus)
(3) Second example of the first embodiment (image processing apparatus)
2. Second embodiment (image processing method)
(1) Description of the second embodiment
(2) Example of the second embodiment (image processing method)
3. Third embodiment (image processing program)
(1) Description of the third embodiment
4. Hardware configuration example
1. First embodiment (image processing apparatus)
(1) Description of the first embodiment
 The present technology provides an image processing apparatus including: a motion amount calculation unit that calculates a motion amount between at least two images among a plurality of images obtained by imaging a flow channel region continuously in time; a feature amount acquisition unit that acquires a feature amount related to the motion amount for each pixel block that constitutes at least a part of an image of the flow channel region; and a flow channel internal information acquisition unit that acquires flow channel internal information of the flow channel region based on the feature amount.
 The present technology makes it possible to acquire flow channel internal information from images obtained by imaging the flow channel region. For example, the shape or structure of the flow channel, the flow velocity in the flow channel, and/or the flow velocity distribution in the flow channel can be acquired as the flow channel internal information.
 In addition, the state of the flow channel can be evaluated based on the flow channel internal information acquired according to the present technology. For example, based on this information, it is possible to identify a place in the flow channel where stagnation or a decrease in flow velocity is occurring. The present technology can be applied, for example, to the following situations.
 For example, when a blood vessel (an artificial blood vessel or a blood vessel in a living body) is observed with fluorescent beads flowing through it, the fluorescence emitted from the beads can be observed, but the blood vessel through which the beads flow may not be observable. According to the present technology, for example, the shape or structure of the flow channel and/or the flow velocity in the flow channel can be evaluated, so that the state of the blood vessel through which the beads flow can be grasped. For example, a place where the blood flow velocity is decreasing, a place where the blood flow is stagnant, or a place where blood is leaking can be identified by the present technology. In this way, it is possible to identify the place where a vascular abnormality is occurring. The flow channel internal information obtained according to the present technology is considered to contribute, for example, to the prevention of thrombus formation in an artificial blood vessel and/or to the improvement of the structure of the artificial blood vessel.
 For example, the present technology enables the visualization and/or quantification of angiogenesis. For example, the present technology makes it possible to grasp the structure or shape of a flow channel generated by angiogenesis, or to grasp the area of a flow channel generated by angiogenesis.
 For example, when observing a situation in which a fluorescently labeled substance is transported in an axon, the fluorescence can be observed but the structure of the axon may not be observable. Since the present technology can evaluate, for example, the shape or structure of the flow channel or the flow velocity, it makes possible, for example, grasping the structure of the axon, quantifying the axonal transport rate, and identifying the shape or extension direction of a generated axon.
 For example, by applying the present technology when particles such as cells and beads are flowed in a microfluidic device such as a microcapillary or a microchannel chip, the flow velocity distribution in the channel can be obtained. The present technology thus makes it possible to identify problem points in the channel, for example, a place where clogging tends to occur and a place where stagnation or a decrease in flow velocity is occurring. As a result, the quality of these devices can be evaluated.
 A more detailed description of the present technology is provided below.
 In the present technology, the motion amount calculation unit calculates the motion amount between at least two images among a plurality of images obtained by imaging the flow channel region continuously in time. The plurality of images may be, for example, images constituting a moving image. The at least two images may be, for example, at least two of the plurality of images constituting the moving image.
 The moving image may be a set of images obtained by imaging a region including the flow channel continuously in time. The frame rate of the moving image may be appropriately selected by those skilled in the art in consideration of factors such as the imaging target and the magnification of the microscope. When the imaging target is a blood flow, the blood flow velocity is, for example, 0.05 to 50 cm/s, so the frame rate can be selected in consideration of this velocity.
 In addition, capturing a motion of, for example, 4 to 16 pixels between frames is considered to be efficient for image processing. The frame rate (fps) can therefore be set to at least (the distance, in pixels, that an object such as a particle moves in the image per second) / (4 to 16). For example, when the image contains an object that advances 100 pixels per second, a frame rate of 100/16 to 100/4 fps or more may be employed, for example a frame rate of 5 fps or more or a frame rate of 25 fps or more. The frame rate of the moving image may be, for example, 5 to 100 fps, preferably 10 to 80 fps, and more preferably 15 to 60 fps. The frame rate can be set by selecting the type or settings of the imaging device used for imaging the flow channel region.
 When imaging the blood flow of a transparent animal such as zebrafish, a higher frame rate, for example 100 fps or more, or even 1000 fps or more, may be adopted.
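As a small worked example of the frame-rate guideline above (frame rate of at least the per-second displacement in pixels divided by 4 to 16), the helper below is an assumption-laden sketch; the function name and the target displacement values are illustrative, not part of the original text.

```python
def minimum_frame_rate(pixels_per_second: float, target_disp_px: float = 8.0) -> float:
    """Lowest frame rate at which the inter-frame motion stays near target_disp_px pixels."""
    return pixels_per_second / target_disp_px

# An object moving 100 px/s: roughly 100/16 to 100/4 fps, i.e. about 6 to 25 fps.
print(minimum_frame_rate(100.0, 16.0))  # ~6.25 fps (lower end of the guideline)
print(minimum_frame_rate(100.0, 4.0))   # 25.0 fps (upper end of the guideline)
```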
 An example of moving image data that can be used in the present technology is shown as 301-1 to 301-t in FIG. 3. As shown there, the moving image data may consist of the first through t-th images within a predetermined period.
 In the present technology, the motion amount may be calculated over the entire duration of the moving image obtained by imaging, or the motion amount may be calculated only for a certain section of the moving image, that is, for a certain part of that duration. The section of the moving image used to calculate the motion amount can be appropriately selected by those skilled in the art.
 The number of images used to calculate the motion amount is at least two, and may preferably be 5 or more, more preferably 30 or more, more preferably 300 or more, and still more preferably 3000 or more. For example, when the images used to calculate the motion amount constitute a 30 fps moving image, a moving image of preferably 1 second or more, more preferably 10 seconds or more, and still more preferably 100 seconds or more is used for the calculation. The larger the number of images used to calculate the motion amount, the better the flow channel internal information that can be obtained.
 The motion amount can be calculated by comparing two images. The two images to be compared may be, for example, two temporally consecutive images among the plurality of images constituting the moving image, or two images that are not temporally consecutive.
 Two temporally consecutive images are an image obtained by imaging a predetermined region at a certain point in time and the image obtained by imaging that region at the point in time closest to it, for example an image at a certain point in the moving image and the image immediately after it.
 For example, a one-second moving image with a frame rate of 30 fps is composed of 30 images. The motion amount calculation unit can calculate the amount of motion between the first and second of these 30 images, the amount of motion between the second and third images, ..., and the amount of motion between the 29th and 30th images.
 Two images that are not temporally consecutive are an image obtained by imaging a predetermined region at a certain point in time and an image other than the image obtained by imaging that region at the point in time closest to it, for example an image at a certain point in the moving image and the image after the next one.
 For example, in the case of a one-second moving image with a frame rate of 30 fps, the amount of motion between the first and third of the 30 images constituting the moving image, the amount of motion between the third and fifth images, ..., and the amount of motion between the 27th and 29th images may be calculated. That is, the amount of motion between two images that skip one image may be calculated. The number of images skipped when calculating the motion amount is not limited to one; for example, the motion amount between two images skipping 1 to 20 images, particularly 1 to 10 images, and more particularly 2, 3, 4, 5, or 10 images may be calculated. By reducing the number of images used for the motion amount calculation, the amount of data generated by the motion amount calculation unit can be reduced.
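To illustrate pairing frames with a skip (consecutive pairs, every-other-frame pairs, and so on) as described above, here is a minimal sketch; the `frames` list and the skip value are assumptions for the example.

```python
def frame_pairs(frames, skip=0):
    """Yield (earlier, later) frame pairs; skip=0 pairs consecutive frames, skip=1 skips one frame, etc."""
    step = skip + 1
    for i in range(0, len(frames) - step, step):
        yield frames[i], frames[i + step]

# skip=0: (f1, f2), (f2, f3), ...    skip=1: (f1, f3), (f3, f5), ..., (f27, f29)
```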
 代替的には、本技術において、前記動き量算出部によって、動画を構成する複数の画像のうちの所定の2つの時点における2つの画像間の動き量が算出されてもよい。本技術において、所定の2つの時点における2つの画像とは、例えば或る時点において所定領域を撮像した画像と、前記或る時点から所定時間経過後に当該領域を撮像した画像との2つの画像でありうる。例えば0.01秒おき~1秒おき、好ましくは0.1秒おき~0.5秒おきの画像が動き量算出に用いられうる。より具体的には、例えば10秒間の動画に対して本技術が適用される場合、例えば当該10秒間の動画を構成する画像のうちの最初の画像と当該最初の画像から0.5秒経過した時点での第2の画像との間の動き量、当該第2の画像と当該第2の画像からさらに0.5秒経過した時点での第3の画像との間の動き量などのように、所定の2つの時点における2つの画像間の動き量が算出されうる。 Alternatively, in the present technology, the motion amount calculation unit may calculate a motion amount between two images at predetermined two points in time among the plurality of images constituting the moving image. In the present technology, two images at predetermined two time points are, for example, two images of an image obtained by imaging a predetermined area at a certain time point and an image obtained by imaging the area after a predetermined time has elapsed from the certain time point. It is possible. For example, images every 0.01 seconds to one second, preferably every 0.1 seconds to every 0.5 seconds can be used for the movement amount calculation. More specifically, when the present technology is applied to, for example, a 10-second moving image, for example, 0.5 seconds have passed since the first image of the images constituting the 10-second moving image and the first image. The amount of movement between the second image at a point in time, the amount of movement between the second image, and the third image after 0.5 seconds from the second image, and so on An amount of movement between two images at predetermined two time points can be calculated.
 In the present technology, the flow channel region may be a region having a flow channel in at least a part thereof, or a region that may have a flow channel in at least a part thereof. In the latter case, by imaging a region presumed to have a flow channel and applying the present technology to the obtained images, information such as the presence or absence of a flow channel in the region or the shape of the flow channel can be acquired. The flow channel region may be, for example, a flow channel in a living body in which particles are flowing and/or an artificial flow channel in which particles are flowing. A flow channel may be a structure through which a substance, for example particles, can flow. Examples of the flow channel include, but are not limited to, flow channels in a living body such as blood vessels, lymphatic vessels, and axons; artificial biological flow channels such as artificial blood vessels and artificial lymphatic vessels; and non-biological flow channels such as microchannels.
 In the present technology, it is desirable that particles are flowing in the flow channel to be imaged. Particles flowing in the flow channel make it easier to calculate the amount of motion between images. That is, in the present technology, the motion amount may be the amount of movement of particles flowing in the flow channel.
 The particles may include biological particles and non-biological particles. Examples of the biological particles include various biomolecules and biological particles such as cells, microorganisms, solid components of biological origin, and liposomes. Examples of the non-biological particles include, but are not limited to, non-biological molecules and synthetic particles such as latex particles, gel particles, and industrial particles.
 Examples of the biomolecules include, but are not limited to, molecules transported along axons and molecules flowing in the blood stream. The cells may include animal cells and plant cells. Examples of animal cells include tumor cells and blood cells such as red blood cells. The microorganisms may include bacteria such as E. coli and fungi such as yeast. Examples of the solid components of biological origin include solid crystals produced in the living body.
 The non-biological molecules may include reagent molecules that can be flowed into a flow channel such as a blood vessel or an axon. The synthetic particles may be particles made of, for example, an organic or inorganic polymer material or a metal. Organic polymer materials may include polystyrene, styrene-divinylbenzene, and polymethyl methacrylate. Inorganic polymer materials may include glass, silica, and magnetic materials. Metals may include colloidal gold and aluminum. The synthetic particles may be, for example, beads intended to be flowed in the blood stream or in a microchannel, in particular fluorescent beads. In the present technology, a particle may also be a combination of a plurality of particles, for example two or three particles.
 The particles may be labeled so that they can be imaged by the imaging device. Examples of the label include a fluorescent label.
 The lower limit of the maximum dimension of the cross section of the flow channel to be analyzed in the present technology may be, for example, 1 μm, preferably 5 μm, and more preferably 10 μm. The upper limit of the maximum dimension of the cross section may be, for example, 10 mm, preferably 5 mm, and more preferably 1 mm.
 The maximum dimension of the cross section may be the diameter if the cross section is circular, the major axis if the cross section is elliptical, and the diagonal length if the cross section is rectangular.
 In the present technology, the motion amount is, for example, a motion vector. The motion vector may be calculated, for example, by techniques used in motion-compensated predictive coding. Examples of such techniques include, but are not limited to, the gradient method and the block matching method.
 The gradient method is a technique that exploits the fact that the brightness gradient is constant in a small section of the image. For example, by detecting, in the image to be compared with an image at a certain point in time (for example, the image immediately after that point in time), a part having a brightness change similar to the brightness change at a certain part of the image at that point in time, the motion vector of that part is calculated. The field of motion vectors over the entire image is also called optical flow.
 In the block matching method, an image is divided into rectangular blocks, for example square blocks, and a motion vector can be calculated for each rectangular block. For example, an image at a certain point in time is divided into rectangular blocks, and for each block the most similar block is detected in the image to be compared with it (for example, the image immediately before that point in time), whereby the motion vector of each block is calculated.
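The passage above names block matching only in general terms; the following is a minimal, unoptimized sketch of SAD-based block matching between two grayscale frames. The block size, search range, and function name are arbitrary choices made for illustration.

```python
import numpy as np

def block_match(prev_f, next_f, block=8, search=7):
    """Estimate one motion vector per (block x block) block by minimizing the sum of absolute differences."""
    h, w = prev_f.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=np.int32)
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            ref = prev_f[y:y + block, x:x + block].astype(np.int32)
            best, best_dv = None, (0, 0)
            # Exhaustive search in a (2*search+1)^2 neighborhood.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if yy < 0 or xx < 0 or yy + block > h or xx + block > w:
                        continue
                    cand = next_f[yy:yy + block, xx:xx + block].astype(np.int32)
                    sad = np.abs(ref - cand).sum()
                    if best is None or sad < best:
                        best, best_dv = sad, (dx, dy)
            vectors[by, bx] = best_dv
    return vectors  # vectors[by, bx] = (dx, dy) for that block
```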
 In the present technology, a pixel block is a set of pixels constituting an image. The pixel block can be selected appropriately according to, for example, the method used to calculate the motion vector. For example, the pixel block may be a rectangular block whose one side is, for example, 1 to 50 pixels, particularly 2 to 30 pixels, more particularly 3 to 20 pixels, and whose other side is 1 to 50 pixels, particularly 2 to 30 pixels, more particularly 3 to 20 pixels. The size of the pixel block may be set appropriately by those skilled in the art according to, for example, the method of calculating the motion amount, the size of the image, and the size of the particles flowing in the flow channel. For example, when the block matching method is used to calculate the motion amount, the pixel block may be, for example, 16×16, 8×16, 16×8, 8×8, 4×8, 8×4, or 4×4 pixels. When the block matching method is used, the pixel block is preferably square.
 In the present technology, the feature amount may be an amount representing a characteristic of the motion amount in a pixel block. The feature amount may express, for example, the velocity and/or direction of the flow in the pixel block, the moving velocity of particles present in the pixel block, the tendency or average of that velocity, the direction in which particles present in the pixel block move, and/or the tendency of that direction.
 The feature amount may be, for example, an integrated value or an average value of the motion amounts, or the maximum value of the motion amounts. These values are suitable for representing the characteristics of the motion amount in a pixel block. For example, the integrated value, the average value, and the maximum value may be the integrated value, average value, and maximum value of motion vectors, respectively. The integrated value, average value, and maximum value of motion vectors can be calculated by methods known to those skilled in the art. When calculating these values, data processing such as the exclusion of outliers may also be performed.
 For example, the feature amount may be the maximum of the numerical values representing the motion amount (for example, the magnitude of the motion vector) for each pixel block. The magnitude of the motion vector between images is quantified for each pixel block, and the quantified motion vector magnitudes for each pixel block are collected over a predetermined time. The maximum of the collected quantified values may then be used as the feature amount of each pixel block.
 The number of motion amount data points used to acquire the feature amount is preferably 5 or more, more preferably 10 or more, more preferably 20 or more, and still more preferably 100 or more, 200 or more, 500 or more, or 1000 or more. A more appropriate feature amount can be obtained when a large number of motion amount data points are used.
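As one possible realization of "maximum of the quantified motion-vector magnitudes collected over a predetermined time" per pixel block, the sketch below is an assumption-based illustration; the optional percentile guard stands in for the outlier exclusion that the text only mentions as a possibility.

```python
import numpy as np

def max_feature_per_block(block_magnitudes: np.ndarray, trim_outliers: bool = False) -> np.ndarray:
    """block_magnitudes: array of shape (T, GH, GW), one motion magnitude per block per frame pair."""
    if trim_outliers:
        # Optional: use the 99th percentile over time instead of the raw maximum.
        return np.percentile(block_magnitudes, 99, axis=0)
    return block_magnitudes.max(axis=0)
```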
 In the present technology, the flow channel internal information may be, for example, information on the structure of the flow channel and information on the flow in the flow channel. Examples of the information on the structure of the flow channel include the shape of the flow channel and the area of the flow channel. Examples of the information on the flow in the flow channel include the flow velocity in the flow channel, the flow velocity distribution in the flow channel, and the direction of the flow.
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include a color information addition unit. The color information addition unit adds color information to each pixel block according to the feature amount. The color information may be, for example, color information based on the XYZ color system or the RGB color system. The rules for adding color information may be set appropriately by those skilled in the art; for example, the following rules may be set.
 For example, when the feature amount of a pixel block is equal to or greater than a predetermined value, the color information addition unit may add predetermined color information to all the pixels constituting that pixel block, and when the feature amount of a pixel block is less than the predetermined value, it may add color information different from that color information to all the pixels constituting that pixel block. By adding two different pieces of color information in this way, for example, the inside of the flow channel can be distinguished from the rest of the image.
 Three or more different pieces of color information may also be added according to the feature amount. The number of types of color information to be added may be, for example, 3 to 50, 4 to 40, 5 to 30, or 10 to 20. For example, by adding three different pieces of color information, the inside of the flow channel can be distinguished from the rest of the image, and the inside of the flow channel can be divided into a slow-flow part and a fast-flow part. This makes it possible to identify stagnant parts and/or parts of reduced flow velocity in the flow channel.
 The color information to be added may also be changed gradually according to the feature amount. For example, by gradually changing the X, Y, and Z values in the XYZ color system or the R, G, and B values in the RGB color system according to the magnitude of the feature amount, color information may be added so that a gradation appears in the flow channel. This makes it easy to grasp the flow velocity distribution in the flow channel.
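The color rules above (two colors, several bands, or a gradual gradation) could be sketched as follows; the thresholds, the specific colors, and the function names are illustrative assumptions rather than values taken from the original description.

```python
import numpy as np

def colorize_blocks(features: np.ndarray, thresholds=(1.0, 5.0)) -> np.ndarray:
    """Map per-block features (GH x GW) to RGB colors (GH x GW x 3, uint8) using three bands."""
    lo, hi = thresholds
    colors = np.zeros(features.shape + (3,), dtype=np.uint8)
    colors[features < lo] = (0, 0, 0)                            # outside the flow channel: black
    colors[(features >= lo) & (features < hi)] = (0, 0, 255)     # slow flow: blue
    colors[features >= hi] = (255, 0, 0)                         # fast flow: red
    return colors

def colorize_gradient(features: np.ndarray) -> np.ndarray:
    """Gradation variant: scale features to 0-255 and spread them across the red channel."""
    span = max(float(np.ptp(features)), 1e-9)
    norm = (255.0 * (features - features.min()) / span).astype(np.uint8)
    rgb = np.zeros(features.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = norm
    return rgb
```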
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include a flow channel area image creation unit. According to one embodiment of the present technology, the flow channel internal information acquisition unit may include a color information addition unit and a flow channel area image creation unit. The flow channel area image creation unit creates an image based on the color information of each pixel block added by the color information addition unit. By performing color reproduction based on the color information given to each pixel, an image of the flow channel region can be created.
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include a flow velocity acquisition unit. According to one embodiment of the present technology, the flow channel internal information acquisition unit may include a flow velocity acquisition unit, a color information addition unit, and a flow channel area image creation unit. The flow velocity acquisition unit acquires the flow velocity based on the feature amount. The color information addition unit may add color information to each pixel block according to the acquired flow velocity.
 The color information may be added, for example, in the same manner as the rules described above, that is, according to rules in which "feature amount" in the rules described above is read as "flow velocity". The color information addition unit may also add further color information to the pixels constituting the boundary between a slow-flow part and a fast-flow part. This makes it possible, for example, to present stagnant parts and/or parts of reduced flow velocity in the flow channel more clearly to the user of the image processing apparatus of the present technology.
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include an abnormal area identification unit. According to one embodiment of the present technology, the flow channel internal information acquisition unit may include a flow velocity acquisition unit, an abnormal area identification unit, a color information addition unit, and a flow channel area image creation unit. The abnormal area identification unit can identify an abnormal area of the flow based on, for example, the flow velocity. For example, the abnormal area identification unit may generate information for indicating to the user that a pixel block identified as an abnormal area is abnormal. Examples of such information include color information for indicating the boundary between the abnormal area and other areas, or information for displaying the abnormal area in a blinking manner. For example, the abnormal area identification unit can identify a part where the flow velocity is lower than a predetermined value as a part where a thrombus is likely to form or where a thrombus has formed. The part identified as an abnormal area may be given color information by the color information addition unit so as to display a predetermined color or pattern. Based on this information, the flow channel area image creation unit may generate a flow channel area image, or the output unit may produce an output.
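A minimal sketch of converting a per-block feature (here assumed to be the mean displacement in pixels per frame pair) into a physical flow velocity and flagging abnormally slow blocks; the frame rate, pixel scale, and threshold are all assumptions made for illustration.

```python
import numpy as np

def flow_velocity(mean_px_per_pair: np.ndarray, fps: float = 30.0, um_per_px: float = 2.0) -> np.ndarray:
    """Convert per-block mean displacement (pixels per frame pair) to micrometres per second."""
    return mean_px_per_pair * fps * um_per_px

def abnormal_blocks(velocity_um_s: np.ndarray, min_velocity: float = 50.0) -> np.ndarray:
    """Boolean mask of blocks whose flow velocity falls below a chosen threshold."""
    return velocity_um_s < min_velocity
```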
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include an area calculation unit. According to one embodiment of the present technology, the flow channel internal information acquisition unit may include an area calculation unit, a color information addition unit, and a flow channel area image creation unit.
 In one embodiment, the area calculation unit can calculate the area of pixel blocks whose feature amount is within a predetermined range. For example, the area of the flow channel can be calculated by calculating the area of pixel blocks having a feature amount equal to or greater than a predetermined value. By calculating the area of pixel blocks having a feature amount within a predetermined range, the area of the normally flowing part of the flow channel and/or the area of an abnormal area (for example, an area where the flow velocity has decreased) can be calculated.
 In another embodiment, the area calculation unit may calculate the area of pixels having predetermined color information. That is, after the color information has been added by the color information addition unit, the area can be calculated based on that color information.
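The area calculation described above could be sketched as counting the blocks whose feature falls in a given range and multiplying by the physical area of one block; the block size and pixel scale below are assumptions.

```python
import numpy as np

def area_in_range(features: np.ndarray, low: float, high: float,
                  block_px: int = 16, um_per_px: float = 2.0) -> float:
    """Area (in square micrometres) covered by blocks whose feature lies in [low, high)."""
    n_blocks = int(np.count_nonzero((features >= low) & (features < high)))
    return n_blocks * (block_px * um_per_px) ** 2
```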
 The flow channel internal information acquisition unit of the image processing apparatus according to the present technology may include a flow direction identification unit. According to one embodiment of the present technology, the flow channel internal information acquisition unit may include a flow direction identification unit, a color information addition unit, and a flow channel area image creation unit.
 The flow direction identification unit identifies the flow direction in each pixel block based on the feature amount. By identifying the flow direction, it is possible to identify, for example, a region where a swirling flow or turbulent flow is occurring.
 The flow direction identification unit may, for example, identify the flow direction only for pixel blocks having a feature amount equal to or greater than a predetermined value. In this way, the flow direction is identified only inside the flow channel, and no flow direction is identified for parts other than the flow channel.
 The identified flow direction may be displayed in the flow channel, for example, as a line or an arrow. As a result, the user of the image processing apparatus according to the present technology can more easily grasp the flow direction in the flow channel.
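As one way to identify and display the flow direction only inside the flow channel, the sketch below draws one arrow per block whose feature exceeds a threshold, using OpenCV's arrowedLine for drawing; the threshold, arrow scale, and input layouts are assumptions carried over from the earlier sketches.

```python
import cv2
import numpy as np

def draw_flow_directions(image_bgr, block_vectors, features, block=16, min_feature=1.0, scale=4.0):
    """Overlay an arrow per block showing its motion direction, only where the feature is large enough."""
    out = image_bgr.copy()
    gh, gw = features.shape
    for by in range(gh):
        for bx in range(gw):
            if features[by, bx] < min_feature:
                continue  # outside the flow channel: no direction shown
            dx, dy = block_vectors[by, bx]
            cx, cy = bx * block + block // 2, by * block + block // 2
            tip = (int(cx + scale * dx), int(cy + scale * dy))
            cv2.arrowedLine(out, (cx, cy), tip, (0, 255, 0), 1, tipLength=0.3)
    return out
```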
 In the present technology, the flow channel internal information acquisition unit may include one, two, three, four, or five components selected from the color information addition unit, the flow channel area image creation unit, the flow velocity acquisition unit, the abnormal area identification unit, the area calculation unit, and the flow direction identification unit. Of course, in the present technology, all six of these components may be included in the flow channel internal information acquisition unit.
(2) First example of the first embodiment (image processing apparatus)
 An example of an image processing apparatus according to the present technology and an example of image processing using that apparatus are described below with reference to FIGS. 1 and 2. FIG. 1 is a diagram showing an example of an image processing system including an image processing apparatus according to the present technology. FIG. 2 is an example of a block diagram of an image processing apparatus according to the present technology.
 図1に示されるとおり、画像処理システム100は、撮像装置101及び画像処理装置200を含む。撮像装置101は、撮像対象となる流路領域を撮像することができるものであり、当業者により適宜選択又は構成されうる。撮像装置101は、例えばカメラ、特にはデジタルビデオカメラを含みうる。図1の撮像装置101は、顕微鏡102及びデジタルビデオカメラ103を含む。顕微鏡102は、撮像されるべき流路領域を含む試料104を観察可能に構成される。画像処理装置200は、撮像装置101により撮像された画像データを取得できるように、撮像装置101に有線又は無線で接続されている。 As shown in FIG. 1, the image processing system 100 includes an imaging device 101 and an image processing device 200. The imaging device 101 can capture an image of a flow passage region to be imaged, and can be appropriately selected or configured by those skilled in the art. The imaging device 101 may comprise, for example, a camera, in particular a digital video camera. The imaging device 101 of FIG. 1 includes a microscope 102 and a digital video camera 103. The microscope 102 is configured to be able to observe the sample 104 including the flow passage area to be imaged. The image processing apparatus 200 is connected to the imaging apparatus 101 by wire or wireless so that the image data captured by the imaging apparatus 101 can be acquired.
 図2に示されるとおり、画像処理装置200は、画像記録部201、動き量算出部202、特徴量取得部203、流路内部情報取得部204、及び出力部207を備えている。流路内部情報取得部204は、色情報付与部205、流路領域画像作成部206、面積算出部208、及び流れ方向同定部209を含む。 As illustrated in FIG. 2, the image processing apparatus 200 includes an image recording unit 201, a motion amount calculation unit 202, a feature amount acquisition unit 203, a flow path internal information acquisition unit 204, and an output unit 207. The flow channel internal information acquisition unit 204 includes a color information addition unit 205, a flow channel area image generation unit 206, an area calculation unit 208, and a flow direction identification unit 209.
 撮像装置101は、顕微鏡102を介してデジタルビデオカメラ103により、試料104の流路領域を撮像する。撮像された動画データは、画像処理装置200内の画像記録部201に格納される。動画データの例を図3に示す。図3に示されるとおり、動画データは、1番目の画像データ301-1からt番目の画像データ301-tまでの、合計数tの画像データから構成される。1番目の画像データ301-1は、当該動画データの撮像期間内の最初に記録された画像であり、t番目の画像データ301-tは、当該動画データの撮像期間内の最後に記録された画像である。 The imaging apparatus 101 captures an image of the flow passage area of the sample 104 with the digital video camera 103 via the microscope 102. The captured moving image data is stored in the image recording unit 201 in the image processing apparatus 200. An example of video data is shown in FIG. As shown in FIG. 3, the moving image data is composed of a total number t of image data from the first image data 301-1 to the t-th image data 301-t. The first image data 301-1 is an image recorded first in the imaging period of the moving image data, and the t-th image data 301-t is recorded last in the imaging period of the moving image data It is an image.
The motion amount calculation unit 202 calculates the amount of motion between images from the plurality of images constituting the moving image data. As the motion amount, the motion amount calculation unit 202 calculates, for example, a motion vector. The motion amount calculation unit 202 calculates the amount of motion between image data 301-1 and image data 301-2, between image data 301-2 and image data 301-3, ..., and between image data 301-(t-1) and image data 301-t.
The motion amount calculation unit 202 can compare image data 301-1 with image data 301-2 by, for example, the gradient method. The motion amount calculation unit 202 detects, in image data 301-2, a portion whose luminance change is similar to that of a given portion in image data 301-1, and calculates a motion vector from the positions of the two portions. In the same manner, the motion amount calculation unit 202 calculates the motion vectors between image data 301-2 and image data 301-3, ..., and between image data 301-(t-1) and image data 301-t. The calculated motion vectors may be represented by arrows, for example, as 302-1, 302-2, ..., and 302-(t-1) in FIG. 3. In the present technology, however, the motion amounts need not be represented by arrows as in 302-1, 302-2, ..., and 302-(t-1) of FIG. 3; they may simply be recorded as data.
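The publication names the gradient method but does not fix a specific algorithm. As an illustrative sketch only, the following fragment uses OpenCV's Farneback dense optical flow as a stand-in for the luminance-change comparison described above; the parameter values are assumptions.

```python
import cv2

def dense_motion_vectors(prev_frame, next_frame):
    """Estimate per-pixel motion vectors between two consecutive frames.

    prev_frame, next_frame: 8-bit grayscale images of the flow channel region.
    Returns an (H, W, 2) array of (dx, dy) motion vectors.
    """
    # Dense optical flow (Farneback) as a stand-in for the gradient method
    # mentioned in the text; parameters are typical default-style values.
    flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return flow

# Usage sketch: compute the motion fields 302-1 ... 302-(t-1) for a frame list.
# frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in frame_paths]
# flows = [dense_motion_vectors(a, b) for a, b in zip(frames[:-1], frames[1:])]
```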
The feature amount acquisition unit 203 acquires, for each of the pixel blocks constituting the image, a feature amount related to the motion amounts calculated by the motion amount calculation unit 202.
As shown in FIG. 4, the imaged flow channel region is divided into M blocks in the horizontal direction and N blocks in the vertical direction; that is, the flow channel region is divided into M × N grid-shaped pixel blocks. For each pixel block, the feature amount acquisition unit 203 integrates or averages the motion amounts calculated by the motion amount calculation unit 202, or acquires the maximum value of the motion amounts.
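A sketch of this per-block feature acquisition (integration, averaging, or maximum of the motion amounts) is given below. It assumes the motion amounts are available as dense (H, W, 2) fields, such as those produced by the previous sketch, and the 16 × 16 block size is an assumed value.

```python
import numpy as np

def block_features(flows, block=16, mode="sum"):
    """Aggregate motion magnitudes into one feature amount per M x N pixel block.

    flows : list of (H, W, 2) motion fields between consecutive frames.
    block : pixel-block edge length (assumed to be 16 x 16 pixels).
    mode  : "sum" (integrated value), "mean" (average value) or "max" (maximum).
    Returns an (N, M) array of per-block feature amounts.
    """
    mags = np.stack([np.linalg.norm(f, axis=2) for f in flows])   # (T-1, H, W)
    t, h, w = mags.shape
    nby, nbx = h // block, w // block
    m = mags[:, :nby*block, :nbx*block]
    # Mean magnitude inside each block, for every pair of consecutive frames.
    per_frame = m.reshape(t, nby, block, nbx, block).mean(axis=(2, 4))  # (T-1, N, M)
    if mode == "sum":      # integrated value over the imaging period
        return per_frame.sum(axis=0)
    if mode == "mean":     # average value
        return per_frame.mean(axis=0)
    return per_frame.max(axis=0)   # maximum value
```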
The flow channel internal information acquisition unit 204 acquires flow channel internal information of the flow channel region based on the feature amounts acquired by the feature amount acquisition unit 203. An example in which the feature amount is the integrated value and the flow channel internal information is an image of the flow channel region is described below.
The color information addition unit 205 assigns color information to each pixel block in accordance with the integrated value acquired by the feature amount acquisition unit 203. For example, when the integrated value of a pixel block is higher than a predetermined threshold, gray color information is assigned to that pixel block; when the integrated value is equal to or lower than the threshold, white color information is assigned.
The flow channel region image creation unit 206 creates a flow channel region image from the color information assigned to each pixel block by the color information addition unit 205. For example, a flow channel region image such as the one shown on the right of FIG. 5 is created. The image shown on the left of FIG. 5 represents each pixel block as a grid cell. The shape of the flow channel can be grasped from the flow channel region image.
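A short sketch of this thresholding and image creation step is shown below. The gray levels and the block size are assumptions and not values taken from the publication.

```python
import numpy as np

def channel_region_image(features, threshold, block=16):
    """Create a flow-channel region image by thresholding per-block features.

    Blocks whose integrated feature exceeds `threshold` are painted gray
    (inside the channel); the remaining blocks are painted white.
    """
    gray, white = 128, 255
    colors = np.where(features > threshold, gray, white).astype(np.uint8)
    # Expand each block value to a block x block patch to form a full image.
    return np.kron(colors, np.ones((block, block), dtype=np.uint8))
```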
The area calculation unit 208 calculates, for example, the area of the pixel blocks to which gray color information has been assigned by the color information addition unit 205. In this way, the area of the flow channel in the created image can be calculated.
The flow direction identification unit 209 identifies the flow direction in a pixel block, for example, based on the feature amount. In this identification, the flow direction may be identified, for example, for a pixel block designated by the user.
The output unit 207 outputs the flow channel region image created by the flow channel region image creation unit 206. The output unit 207 may be, for example, a display or a printer.
The output unit 207 may also output the area calculated by the area calculation unit 208 together with the flow channel region image.
In addition, the output unit 207 may display the flow direction identified by the flow direction identification unit 209 on the flow channel region image, for example, as an arrow or a line.
Furthermore, by applying the image processing according to the present technology at, for example, two different times, the generation and disappearance of flow channels can also be observed. For example, performing the image processing according to the present technology on a moving image obtained by imaging a certain flow channel region at a certain time yields the image shown on the left of FIG. 11. Performing the same image processing on a moving image obtained by imaging the same flow channel region after a predetermined time has elapsed, for example 24 hours later, yields the image shown on the right of FIG. 11. By comparing the left and right images of FIG. 11, the generation of a flow channel can be observed. This makes it possible to evaluate, for example, angiogenesis and axon formation.
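The comparison of two flow channel masks acquired at different times can be sketched as follows. The variable names, thresholds, and the area conversion are hypothetical.

```python
import numpy as np

def channel_change(mask_before, mask_after):
    """Compare flow-channel masks obtained at two different times.

    mask_before, mask_after: boolean arrays where True marks blocks inside
    the flow channel (e.g. integrated feature above the threshold).
    Returns the blocks where a channel appeared and where one disappeared,
    so that e.g. angiogenesis can be quantified as the newly generated area.
    """
    generated = mask_after & ~mask_before
    disappeared = mask_before & ~mask_after
    return generated, disappeared

# Usage sketch with hypothetical masks from the 0 h and 24 h acquisitions:
# gen, gone = channel_change(features_0h > thr, features_24h > thr)
# new_area = gen.sum() * block_area_um2
```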
(3) Second Example of the First Embodiment (Image Processing Apparatus)
Hereinafter, an example of an image processing apparatus according to the present technology and an example of image processing using the apparatus will be described with reference to FIGS. 1 and 6. FIG. 1 is as described above. FIG. 6 is an example of a block diagram of an image processing apparatus according to the present technology.
As illustrated in FIG. 6, the image processing apparatus 200 includes an image recording unit 601, a motion amount calculation unit 602, a feature amount acquisition unit 603, a flow channel internal information acquisition unit 604, and an output unit 607. The flow channel internal information acquisition unit 604 includes a color information addition unit 605, a flow channel region image creation unit 606, a flow velocity acquisition unit 608, and an abnormal region identification unit 609.
The imaging device 101 captures images of the flow channel region of the sample 104 with the digital video camera 103 through the microscope 102. The captured moving image data are stored in the image recording unit 601 of the image processing apparatus 200. An example of moving image data is shown in FIG. 7. As shown in FIG. 7, the moving image data consist of a total of t images, from the first image data 701-1 to the t-th image data 701-t. The first image data 701-1 is the image recorded first within the imaging period of the moving image data, and the t-th image data 701-t is the image recorded last within that period.
The motion amount calculation unit 602 calculates the amount of motion between images from the plurality of images constituting the moving image data. As the motion amount, the motion amount calculation unit 602 calculates a motion vector. The motion amount calculation unit 602 calculates the amount of motion between image data 701-1 and image data 701-2, between image data 701-2 and image data 701-3, ..., and between image data 701-(t-1) and image data 701-t. The calculation of the motion amount is described below.
For example, the motion amount calculation unit 602 can compare image data 701-1 with image data 701-2 by the block matching method. The motion amount calculation unit 602 divides, for example, each of image data 701-1 and image data 701-2 into the M × N pixel blocks shown in FIG. 4; that is, both images are divided into M blocks in the horizontal direction and N blocks in the vertical direction. A pixel block may be, for example, a square block of 16 × 16 pixels. The motion amount calculation unit 602 detects, in image data 701-2, the block closest to a given block of image data 701-1, and calculates a motion vector from the positions of the two blocks.
In the same manner, the motion amount calculation unit 602 calculates the motion vectors between image data 701-2 and image data 701-3, ..., and between image data 701-(t-1) and image data 701-t. The calculated motion vectors may be represented by arrows, for example, as 702-1, 702-2, ..., and 702-(t-1) in FIG. 7.
Some of the arrows shown in 702-1, 702-2, ..., and 702-(t-1) of FIG. 7 are smaller than the others; these small arrows schematically indicate motion vectors that are smaller than the rest. In the present technology, the motion amounts need not be represented by arrows as in 702-1, 702-2, ..., and 702-(t-1) of FIG. 7; they may simply be recorded as data.
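The block matching step described above can be sketched as an exhaustive search with a sum-of-absolute-differences criterion. The 16 × 16 block size and the ±8-pixel search range are assumptions; practical implementations would normally use a faster search strategy.

```python
import numpy as np

def block_match(prev_frame, next_frame, block=16, search=8):
    """Estimate one motion vector per block by exhaustive block matching.

    For every block of `prev_frame`, the most similar block of `next_frame`
    is searched within +/- `search` pixels (SAD criterion), and the
    displacement is returned as the motion vector for that block.
    """
    h, w = prev_frame.shape
    nby, nbx = h // block, w // block
    vectors = np.zeros((nby, nbx, 2), dtype=int)
    prev = prev_frame.astype(np.int32)
    nxt = next_frame.astype(np.int32)
    for by in range(nby):
        for bx in range(nbx):
            y0, x0 = by * block, bx * block
            ref = prev[y0:y0+block, x0:x0+block]
            best, best_sad = (0, 0), None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if 0 <= y1 and y1 + block <= h and 0 <= x1 and x1 + block <= w:
                        cand = nxt[y1:y1+block, x1:x1+block]
                        sad = np.abs(ref - cand).sum()
                        if best_sad is None or sad < best_sad:
                            best, best_sad = (dx, dy), sad
            vectors[by, bx] = best
    return vectors
```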
The feature amount acquisition unit 603 acquires, for each of the pixel blocks constituting the image, a feature amount related to the motion amounts calculated by the motion amount calculation unit 602. For example, for each of the M × N pixel blocks shown in FIG. 4 and described above, the feature amount acquisition unit 603 integrates or averages the motion amounts calculated by the motion amount calculation unit 602, or acquires the maximum value of the motion amounts.
The flow channel internal information acquisition unit 604 acquires flow channel internal information of the flow channel region based on the feature amounts acquired by the feature amount acquisition unit 603. An example in which the feature amount is the average value and the flow channel internal information is the flow velocity in the flow channel is described below.
The flow velocity acquisition unit 608 calculates the flow velocity of each pixel block from the average value acquired by the feature amount acquisition unit 603. Based on the calculated flow velocity, the color information addition unit 605 described above, for example, assigns color information to each pixel block.
For example, the color information addition unit 605 assigns the first color information to pixel blocks whose flow velocity is equal to or higher than a first threshold, the second color information to pixel blocks whose flow velocity is below the first threshold but equal to or higher than a second, lower threshold, and the third color information to pixel blocks whose flow velocity is below the second threshold. In this way it can be indicated that, for example, pixel blocks given the first color information have a high flow velocity, pixel blocks given the second color information have a low flow velocity, and pixel blocks given the third color information have no flow (that is, they lie outside the flow channel).
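The three-level color assignment based on per-block flow velocity can be sketched as follows; the gray levels used for "fast", "slow", and "outside the channel" are assumed values.

```python
import numpy as np

def velocity_color_map(velocity, thr_high, thr_low):
    """Assign three-level color information to per-block flow velocities.

    velocity          : (N, M) array of per-block flow velocities.
    thr_high, thr_low : first and second thresholds (thr_high > thr_low).
    Returns an (N, M) uint8 map of gray levels.
    """
    dark_gray, light_gray, white = 64, 192, 255
    colors = np.full(velocity.shape, white, dtype=np.uint8)  # below 2nd threshold
    colors[velocity >= thr_low] = light_gray                 # slow flow
    colors[velocity >= thr_high] = dark_gray                 # fast flow
    return colors
```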
An example in which the color information addition unit 605 assigns dark gray as the first color information, light gray as the second color information, and white as the third color information is described with reference to FIGS. 7 and 8.
In FIG. 7, for the pixel blocks whose motion vectors are indicated by the small arrows, the average value is smaller than in the other flow channel regions; as a result, the flow velocity is below the first threshold and equal to or above the second threshold, and light gray color information is assigned to these pixel blocks.
In FIG. 7, for the pixel blocks whose motion vectors are indicated by the other arrows, the flow velocity is equal to or above the first threshold, and dark gray color information is assigned to these pixel blocks.
In FIG. 7, for the pixel blocks in which no arrow is shown, the motion vectors are extremely small; as a result, the flow velocity is below the second threshold, and white color information is assigned to these pixel blocks.
The flow channel region image created by the flow channel region image creation unit 606 by reproducing the colors assigned by the color information addition unit 605 is shown on the right of FIG. 8. As shown on the right of FIG. 8, the portions drawn in dark gray and light gray form the flow channel region. Within the flow channel region, the portions drawn in light gray are those where the flow velocity is lower than in the rest of the flow channel. The white portions lie outside the flow channel. From the image on the right of FIG. 8, the user can not only grasp the shape of the flow channel but also distinguish the fast-flowing portions from the slow-flowing portions.
The image shown on the left of FIG. 8 represents each pixel block as a grid cell.
In addition to the calculation of the flow velocity, the flow direction in each pixel block may also be determined. The flow direction may be determined, for example, by the flow direction identification unit described above.
An example in which the feature amount acquired by the feature amount acquisition unit 603 is the maximum value is described below.
As described above with reference to FIG. 7, the motion amount calculation unit 602 calculates the motion vectors between image data 701-1 and image data 701-2, between image data 701-2 and image data 701-3, ..., and between image data 701-(t-1) and image data 701-t. As shown in FIG. 12, the magnitudes of the calculated motion vectors are converted into numerical values for each pixel block. For example, the portion enclosed by the square in the image of 702-1 is extracted and represented as 703-1; as shown in 703-1, the magnitude of the motion vector is given as a numerical value for each pixel block. Similarly, as shown in 703-2 to 703-(t-1), the magnitude of the motion vector is given numerically for each pixel block. From the grid information of each of 703-1 to 703-(t-1), the maximum value for each pixel block is obtained, as shown in 704. As shown in 704, the portions represented by, for example, 5 have a high flow velocity, whereas the portions represented by 0 or 1 have little or no flow; the portions represented by 0 or 1 are therefore considered to contain some factor that impedes the flow.
As shown in 703-1 to 703-(t-1) and 704, color information may also be assigned to each pixel block according to the maximum value. The color information may be assigned as described above.
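Taking the per-block maximum over the sequence of grids 703-1 to 703-(t-1) reduces, in a sketch, to an element-wise maximum over the time axis; the array names are hypothetical.

```python
import numpy as np

def per_block_maximum(block_magnitudes):
    """Take the per-block maximum of motion-vector magnitudes over time.

    block_magnitudes: array of shape (T-1, N, M) holding, for each pair of
    consecutive frames, one motion magnitude per pixel block (the grids
    703-1 ... 703-(t-1)). Returns the grid 704, i.e. the element-wise
    maximum over the time axis.
    """
    return np.asarray(block_magnitudes).max(axis=0)

# Blocks whose maximum stays at 0 or 1 never move fast at any time and may
# therefore contain something that impedes the flow:
# suspect = per_block_maximum(grids) <= 1
```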
The abnormal region identification unit 609 identifies, for example, the pixel blocks to which the second color information has been assigned as an abnormal region. For example, the abnormal region identification unit 609 can generate information for presenting these pixel blocks to the user as an abnormal region. Examples of such information include color information for indicating the boundary between the abnormal region and the other regions, information for displaying the abnormal region in a blinking manner, and information for displaying an arrow pointing to the abnormal region. Based on this information, a flow channel region image can be created by the flow channel region image creation unit 606, or output can be performed by the output unit.
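One way the abnormal region and its boundary could be derived from per-block velocities is sketched below. This is not the claimed implementation; the 4-neighbour boundary rule is an assumption introduced only for illustration.

```python
import numpy as np

def abnormal_region_info(velocity, thr_high, thr_low):
    """Identify slow-flow blocks as an abnormal region and mark its boundary.

    Blocks whose velocity lies between the second and first thresholds
    (the "second color information" blocks) are treated as abnormal; a
    boundary mask is derived so the region can be outlined in the output.
    """
    abnormal = (velocity >= thr_low) & (velocity < thr_high)
    # A boundary block is an abnormal block with at least one normal neighbour.
    padded = np.pad(abnormal, 1, mode="constant", constant_values=False)
    all_neighbours_abnormal = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                               & padded[1:-1, :-2] & padded[1:-1, 2:])
    boundary = abnormal & ~all_neighbours_abnormal
    return abnormal, boundary
```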
The output unit 607 outputs the flow channel region image created as described above. The output unit 607 may be, for example, a display or a printer.
2. Second Embodiment (Image Processing Method)
(1) Description of the Second Embodiment
The present technology provides an image processing method including: a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region; a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
Flow channel internal information can be acquired by the image processing method according to the present technology. The flow channel internal information to be acquired is as described in "1. First Embodiment (Image Processing Apparatus)" above.
(2) Example of the Second Embodiment (Image Processing Method)
An example of the image processing method according to the present technology is described below with reference to FIGS. 1, 2, and 9. FIGS. 1 and 2 are as described in "1. First Embodiment (Image Processing Apparatus)" above. FIG. 9 is a diagram showing an example of the flow of the image processing method according to the present technology.
In step S101, the image processing apparatus 200 starts the image processing according to the present technology.
In step S102, the motion amount calculation unit 202 acquires image data, for example moving image data. The image data may be recorded in the image recording unit 201 or in the imaging device 101 (in particular, the digital video camera 103).
In step S103, the motion amount calculation unit 202 calculates the amount of motion between at least two of the plurality of images included in the acquired image data. What was described for the motion amount calculation unit in "1. First Embodiment (Image Processing Apparatus)" also applies to the motion amount calculation step of step S103. For example, in step S103, the motion amount calculation unit 202 may calculate the amount of motion between image data 301-1 and image data 301-2 in FIG. 3, between image data 301-2 and image data 301-3, ..., and between image data 301-(t-1) and image data 301-t. The motion amount may be calculated, for example, by the gradient method or the block matching method.
In step S104, the feature amount acquisition unit 203 acquires, for each pixel block of the image containing the flow channel region, a feature amount related to the motion amount. What was described for the feature amount acquisition unit in "1. First Embodiment (Image Processing Apparatus)" also applies to the feature amount acquisition step of step S104. For example, in step S104, the feature amount acquisition unit 203 may integrate or average, for each pixel block, the motion amounts calculated by the motion amount calculation unit 202, or may acquire the maximum value of the motion amounts for each pixel block.
In step S105, the flow channel internal information acquisition unit 204 acquires flow channel internal information of the flow channel region based on the feature amount. What was described for the flow channel internal information acquisition unit in "1. First Embodiment (Image Processing Apparatus)" also applies to the flow channel internal information acquisition step of step S105. For example, in step S105, the color information addition unit 205 of the flow channel internal information acquisition unit 204 may assign color information to each pixel block according to the integrated value acquired by the feature amount acquisition unit 203, and the flow channel region image creation unit 206 may create a flow channel region image from the color information assigned to each pixel block.
In step S106, the output unit 207 outputs the acquired flow channel internal information. For example, the flow channel region image created by the flow channel internal information acquisition unit 204 may be displayed on a display or printed by a printer.
In step S107, the image processing apparatus 200 ends the image processing according to the present technology.
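Steps S102 to S106 can be strung together in a short end-to-end sketch, assuming the hypothetical helper functions introduced in the earlier sketches (dense_motion_vectors, block_features, channel_region_image) are available; the file names, block size, and threshold are likewise assumptions.

```python
import cv2

def process_flow_channel_video(frame_paths, block=16, threshold=10.0):
    """Minimal end-to-end sketch of steps S102 to S106."""
    # S102: acquire the image data (here: read grayscale frames from disk).
    frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in frame_paths]
    # S103: motion amounts between consecutive images.
    flows = [dense_motion_vectors(a, b) for a, b in zip(frames[:-1], frames[1:])]
    # S104: per-block feature amounts (integrated values).
    features = block_features(flows, block=block, mode="sum")
    # S105: flow channel internal information (a flow channel region image).
    region_image = channel_region_image(features, threshold, block=block)
    # S106: output the result.
    cv2.imwrite("flow_channel_region.png", region_image)
    return region_image
```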
3. Third Embodiment (Image Processing Program)
(1) Description of the Third Embodiment
The present technology also provides an image processing program for causing a computer to execute: a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region; a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
That is, the image processing program according to the present technology is a program for causing a computer to execute the image processing method according to the present technology. Since the steps executed by the program are as described in "2. Second Embodiment (Image Processing Method)" above, their description is omitted here.
4. Hardware Configuration Example
An example of the hardware configuration of an information processing apparatus that implements the image processing apparatus according to the present technology is described below with reference to FIG. 10. FIG. 10 is a diagram showing an example of a schematic hardware configuration of an information processing apparatus that implements the image processing apparatus according to the present technology.
The information processing apparatus 1001 shown in FIG. 10 includes a CPU (central processing unit) 1002 and a RAM 1003. The CPU 1002 and the RAM 1003 are connected to each other via a bus 1005 and are also connected via the bus 1005 to the other components of the information processing apparatus 1001.
The CPU 1002 performs the control and computation of the information processing apparatus 1001. Any processor can be used as the CPU 1002; examples include processors of the Xeon (registered trademark), Core (trademark), and Atom (trademark) series. Each component of the image processing apparatus 200 described with reference to FIG. 2 can be implemented, for example, by the CPU 1002.
The RAM 1003 includes, for example, a cache memory and a main memory, and can temporarily store programs and other data used by the CPU 1002.
The information processing apparatus 1001 may also include a disk 1004, a communication device 1006, an output device 1007, an input device 1008, and a drive 1009, all of which can be connected to the bus 1005.
The disk 1004 can store an operating system (for example, WINDOWS (registered trademark), UNIX (registered trademark), or LINUX (registered trademark)), the image processing program according to the present technology and various other programs, and various data (for example, image data).
The communication device 1006 connects the information processing apparatus 1001 to the network 1010 by wire or wirelessly. The communication device 1006 can enable the information processing apparatus 1001 to communicate with the imaging device via the network 1010. The type of the communication device 1006 may be appropriately selected by those skilled in the art.
The output device 1007 can output the processing results of the information processing apparatus 1001. Examples of the output device 1007 include, but are not limited to, a display device such as a display, an audio output device such as a speaker, and a printer.
The input device 1008 is a device with which the user operates the information processing apparatus 1001. Examples of the input device 1008 include, but are not limited to, a mouse and a keyboard.
The drive 1009 can read information recorded on a recording medium and output it to the RAM 1003, and/or write various data to a recording medium. The recording medium is, for example, a DVD medium, a flash memory, or an SD memory card, but is not limited thereto.
The present technology can also be configured as follows.
[1] An image processing apparatus including:
a motion amount calculation unit that calculates the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
a feature amount acquisition unit that acquires, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
a flow channel internal information acquisition unit that acquires flow channel internal information of the flow channel region based on the feature amount.
[2] The image processing apparatus according to [1], wherein the flow channel internal information acquisition unit includes a color information addition unit that adds color information to each pixel block according to the feature amount.
[3] The image processing apparatus according to [2], wherein the flow channel internal information acquisition unit further includes a flow channel region image creation unit that creates an image of the flow channel region based on the color information.
[4] The image processing apparatus according to any one of [1] to [3], wherein the flow channel internal information acquisition unit further includes a flow velocity acquisition unit that acquires the flow velocity inside the flow channel based on the feature amount.
[5] The image processing apparatus according to [4], wherein the flow channel internal information acquisition unit further includes an abnormal region identification unit that identifies an abnormal region of the flow based on the flow velocity.
[6] The image processing apparatus according to any one of [1] to [5], wherein the flow channel internal information acquisition unit further includes an area calculation unit that calculates the area of a pixel block in which the feature amount is within a predetermined range.
[7] The image processing apparatus according to any one of [1] to [6], wherein the feature amount is acquired based on five or more motion amounts.
[8] The image processing apparatus according to any one of [1] to [7], wherein the feature amount is the integrated value or average value of the motion amounts, or the maximum value among the motion amounts.
[9] The image processing apparatus according to any one of [1] to [8], wherein the plurality of images constitute a moving image of the flow channel region.
[10] The image processing apparatus according to [9], wherein the at least two images are consecutive images among the images constituting the moving image.
[11] The image processing apparatus according to any one of [1] to [10], wherein the motion amount is the amount of motion of particles flowing in the flow channel.
[12] The image processing apparatus according to [11], wherein the particles are biological particles.
[13] The image processing apparatus according to any one of [1] to [12], wherein the flow channel region includes a flow channel in a living body in which particles are flowing and/or an artificial flow channel in which particles are flowing.
[14] An image processing method including:
a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
[15] An image processing program for causing a computer to execute:
a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
DESCRIPTION OF SYMBOLS
100 image processing system
101 imaging device
102 microscope
103 digital video camera
104 sample
200 image processing apparatus

Claims (15)

  1.  An image processing apparatus comprising:
     a motion amount calculation unit that calculates the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
     a feature amount acquisition unit that acquires, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
     a flow channel internal information acquisition unit that acquires flow channel internal information of the flow channel region based on the feature amount.
  2.  The image processing apparatus according to claim 1, wherein the flow channel internal information acquisition unit includes a color information addition unit that adds color information to each pixel block according to the feature amount.
  3.  The image processing apparatus according to claim 2, wherein the flow channel internal information acquisition unit further includes a flow channel region image creation unit that creates an image of the flow channel region based on the color information.
  4.  The image processing apparatus according to claim 1, wherein the flow channel internal information acquisition unit further includes a flow velocity acquisition unit that acquires the flow velocity inside the flow channel based on the feature amount.
  5.  The image processing apparatus according to claim 4, wherein the flow channel internal information acquisition unit further includes an abnormal region identification unit that identifies an abnormal region of the flow based on the flow velocity.
  6.  The image processing apparatus according to claim 1, wherein the flow channel internal information acquisition unit further includes an area calculation unit that calculates the area of a pixel block in which the feature amount is within a predetermined range.
  7.  The image processing apparatus according to claim 1, wherein the feature amount is acquired based on five or more motion amounts.
  8.  The image processing apparatus according to claim 1, wherein the feature amount is the integrated value or average value of the motion amounts, or the maximum value among the motion amounts.
  9.  The image processing apparatus according to claim 1, wherein the plurality of images constitute a moving image of the flow channel region.
  10.  The image processing apparatus according to claim 9, wherein the at least two images are consecutive images among the images constituting the moving image.
  11.  The image processing apparatus according to claim 1, wherein the motion amount is the amount of motion of particles flowing in the flow channel.
  12.  The image processing apparatus according to claim 11, wherein the particles are biological particles.
  13.  The image processing apparatus according to claim 1, wherein the flow channel region includes a flow channel in a living body in which particles are flowing and/or an artificial flow channel in which particles are flowing.
  14.  An image processing method comprising:
     a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
     a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
     a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
  15.  An image processing program for causing a computer to execute:
     a motion amount calculation step of calculating the amount of motion between at least two of a plurality of images obtained by temporally continuous imaging of a flow channel region;
     a feature amount acquisition step of acquiring, for each pixel block constituting at least part of the image of the flow channel region, a feature amount related to the motion amount; and
     a flow channel internal information acquisition step of acquiring flow channel internal information of the flow channel region based on the feature amount.
PCT/JP2018/038603 2017-11-24 2018-10-17 Image processing device, image processing method, and image processing program WO2019102750A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017225404 2017-11-24
JP2017-225404 2017-11-24

Publications (1)

Publication Number Publication Date
WO2019102750A1 true WO2019102750A1 (en) 2019-05-31

Family

ID=66631460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/038603 WO2019102750A1 (en) 2017-11-24 2018-10-17 Image processing device, image processing method, and image processing program

Country Status (1)

Country Link
WO (1) WO2019102750A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10123163A (en) * 1996-10-22 1998-05-15 Toyota Central Res & Dev Lab Inc Flow rate distribution measuring method
WO2010137471A1 (en) * 2009-05-29 2010-12-02 コニカミノルタオプト株式会社 Device for measuring amount of agglutinated blood cells and method for measuring amount of agglutinated blood cells
JP2012050617A (en) * 2010-08-31 2012-03-15 Fujifilm Corp Image acquisition method and device
JP2014166270A (en) * 2013-02-28 2014-09-11 Canon Inc Image processing apparatus and image processing method
JP2016042264A (en) * 2014-08-15 2016-03-31 ソニー株式会社 Image processing apparatus, image processing program, and image processing method
WO2016194718A1 (en) * 2015-06-05 2016-12-08 ソニー株式会社 Image processing device, image processing method, and surgical system
JP2017124063A (en) * 2016-01-14 2017-07-20 国立大学法人 名古屋工業大学 Artery blood vessel detection device and artery blood vessel evaluation device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18880094

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18880094

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP