WO2012169088A1 - Image processing apparatus, image processing method, and image processing system - Google Patents
Image processing apparatus, image processing method, and image processing system
- Publication number: WO2012169088A1 (application PCT/JP2011/075626)
- Authority: WIPO (PCT)
- Prior art keywords: image, image processing, target cell, candidate, pixels
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- The present invention relates to an image processing apparatus, an image processing method, and an image processing system.
- In the present specification, fetal-derived nucleated red blood cells (NRBCs) are referred to as target cells.
- Prior art: Japanese Patent No. 4346923.
- An object of the present invention is to provide an image processing apparatus, an image processing method, and an image processing system that, when searching for a target cell having a nucleus by image processing in a captured image, are less susceptible to sample-to-sample differences in the target cell, differences in imaging conditions, and the like.
- The invention according to claim 1 is an image processing apparatus comprising: acquisition means for acquiring a captured image obtained by imaging a sample containing a target cell having a nucleus; first extraction means for extracting, from the pixels included in the captured image, pixels that are candidates for the nucleus, based on a first condition predetermined for at least one of the color and luminance that the candidate pixels should have; second extraction means for extracting, from connected pixel groups each formed by connecting adjacent ones of the extracted pixels, a connected pixel group that is a candidate for the target cell, based on a second condition predetermined for the size and shape that the candidate connected pixel group should have; setting means for setting, in the captured image, a rectangular region of a given size centered on a pixel of the extracted connected pixel group; and determination means for determining whether the target cell is included in the rectangular region based on whether an image feature amount obtained from the rectangular region satisfies a condition of the image feature amount.
- The invention according to claim 2 is the image processing apparatus according to claim 1, wherein the condition of the image feature amount is machine-learned based on sample images of positive and negative examples of the target cell.
- The invention according to claim 3 is the image processing apparatus according to claim 1 or 2, further comprising computing means for dividing the rectangular region set by the setting means into predetermined partial regions, calculating the direction of the luminance gradient at each pixel in each partial region, computing a histogram of the calculated luminance gradient directions for each partial region, and concatenating the histograms of the partial regions to compute the image feature amount obtained from the rectangular region.
- The invention according to claim 4 is the image processing apparatus according to any one of claims 1 to 3, wherein the image feature amount is an HOG feature amount.
- The invention according to claim 5 is the image processing apparatus according to any one of claims 1 to 4, further comprising image feature amount acquisition means for rotating the captured image included in the rectangular region set by the setting means so that a vector connecting the center of the rectangular region and the center of gravity of the connected pixel group included in the rectangular region points in a predetermined direction, and then acquiring an image feature amount from the rotated captured image, wherein the determination means determines whether the target cell is included in the rectangular region based on whether the image feature amount acquired by the image feature amount acquisition means satisfies the condition of the image feature amount.
- The invention according to claim 6 is the image processing apparatus according to any one of claims 1 to 5, further comprising interpolation means for, when an end of the captured image is included in the rectangular region set by the setting means, interpolating by combining, onto the end side, the part of the image in the rectangular region that lies outside, with respect to a center line set within the connected pixel group included in the rectangular region, the region symmetric to the end about that center line.
- The invention according to claim 7 is the image processing apparatus according to any one of claims 1 to 6, wherein a coordinate region of the captured image corresponding to a rectangular region determined by the determination means to contain the target cell is displayed on a display device.
- The invention according to claim 8 is an image processing method comprising steps corresponding to the means of claim 1, including a second extraction step of extracting, from connected pixel groups each formed by connecting adjacent extracted pixels, a connected pixel group that is a candidate for the target cell, based on a second condition predetermined for the size and shape that the candidate connected pixel group should have.
- The invention according to claim 9 is the image processing method according to claim 8, wherein the condition of the image feature amount is machine-learned based on sample images of positive and negative examples of the target cell.
- The invention according to claim 10 is the image processing method according to claim 8 or 9, further comprising a computation step of dividing the rectangular region set in the setting step into predetermined partial regions, calculating the direction of the luminance gradient at each pixel in each partial region, computing a histogram of the calculated luminance gradient directions for each partial region, and concatenating the histograms to compute the image feature amount.
- The invention according to claim 11 is the image processing method according to any one of claims 8 to 10, wherein the image feature amount is an HOG feature amount.
- The invention according to claim 12 is an image processing system including an image processing apparatus, an optical microscope connected to the image processing apparatus, and a display device connected to the image processing apparatus, the image processing apparatus including: acquisition means for acquiring a captured image obtained by imaging a sample containing a target cell having a nucleus; first extraction means for extracting, from the pixels included in the captured image, pixels that are candidates for the nucleus, based on a first condition predetermined for at least one of the color and luminance that the candidate pixels should have; second extraction means for extracting, from connected pixel groups each formed by connecting adjacent pixels extracted by the first extraction means, a connected pixel group that is a candidate for the target cell, based on a second condition predetermined for the size and shape that the candidate connected pixel group should have; setting means for setting, in the captured image, a rectangular region of a given size centered on a pixel included in the connected pixel group extracted by the second extraction means; and determination means for determining whether the target cell is included in the rectangular region.
- The invention according to claim 13 is the image processing system according to claim 12, wherein the condition of the image feature amount is machine-learned based on sample images of positive and negative examples of the target cell.
- The invention according to claim 14 is the image processing system according to claim 12 or 13, further comprising computing means for dividing the rectangular region set by the setting means into predetermined partial regions, calculating the direction of the luminance gradient at each pixel in each partial region, computing a histogram of the calculated luminance gradient directions for each partial region, and concatenating the histograms to compute the image feature amount.
- The invention according to claim 15 is the image processing system according to any one of claims 12 to 14, wherein the image feature amount is an HOG feature amount.
- According to this invention, when searching for a target cell having a nucleus by image processing in a captured image, the search can be made less susceptible to sample-to-sample differences in the target cell, differences in imaging conditions, and the like, than without this configuration.
- According to this configuration, whether the captured image contains the target cell can be determined more accurately than when the condition of the image feature amount is not machine-learned based on sample images of positive and negative examples of the target cell.
- According to this configuration, the influence of differences among target cell samples and of differences in imaging conditions on the image feature amount obtained from the rectangular region is reduced.
- According to this configuration, whether the image in the rectangular region is the target cell can be determined more accurately than without this configuration.
- According to the sixth aspect of the present invention, whether the image in the rectangular region is the target cell can be determined more accurately, even for a rectangular region set at an end of the captured image, than without this configuration.
- According to the seventh aspect of the present invention, an image region containing the target cell can be displayed within a captured image obtained by imaging the target cell having a nucleus.
- FIG. 1 shows a system configuration diagram of an image processing system 1 according to the present embodiment.
- The image processing system 1 includes an optical microscope 2, an image processing device 10, and a display device 6; the image processing device 10 is connected for data communication with each of the optical microscope 2 and the display device 6.
- The optical microscope 2 captures an image of a sample on a slide glass 3 placed on a sample stage with a CCD camera 5 via an optical system such as an objective lens 4.
- In the present embodiment, maternal blood is applied to the slide glass 3 and subjected to May-Giemsa staining.
- Fetal-derived nucleated red blood cells (NRBCs) in maternal blood are thereby stained blue-purple.
- The image processing apparatus 10 acquires a captured image taken by the optical microscope 2 and searches for the target cell in the acquired captured image. Details of the target cell search processing performed in the image processing apparatus 10 will be described later.
- The display device 6 displays a screen based on the result of processing by the image processing apparatus 10. For example, the display device 6 displays the captured image taken by the optical microscope 2, the target cell search result produced by the image processing apparatus 10, and the like.
- The image processing apparatus 10 includes a captured image acquisition unit 12, a preprocessing unit 14, a nucleus candidate region extraction unit 16, a target cell candidate region extraction unit 18, a determination target region setting unit 20, a normalization unit 22, an image interpolation unit 24, a feature amount calculation unit 26, a learning data acquisition unit 28, a learning unit 30, a determination unit 32, a target cell region storage unit 34, and a result output unit 36.
- The functions of the above units of the image processing apparatus 10 may be realized by a computer, which includes a control unit such as a CPU, a storage unit such as a memory, and an input/output unit that exchanges data with external devices, reading and executing a program stored on a computer-readable information storage medium.
- The program may be supplied to the image processing apparatus 10 by an information storage medium such as an optical disk, a magnetic disk, a magnetic tape, a magneto-optical disk, or a flash memory, or it may be supplied to the image processing apparatus 10 by other means such as a data communication network.
- The captured image acquisition unit 12 acquires, from the optical microscope 2, a captured image of the sample taken by the CCD camera 5 provided in the optical microscope 2.
- FIG. 3 shows an example of a captured image of the sample (maternal blood), taken with the optical microscope 2 and acquired by the captured image acquisition unit 12.
- The target cells have the following four features (see "Automatic extraction of nucleated red blood cells from a large number of microscopic images", Journal of the Institute of Image Electronics Engineers of Japan, Vol. 37, No. 5, September 2008).
- The first feature of NRBCs is that each NRBC has a single nucleus whose shape is close to a true circle and whose density is high.
- The second feature is that the nuclei of NRBCs are stained slightly darker by May-Giemsa staining than those of other cells.
- The third feature is that the area of an NRBC, the area of its nucleus, and the ratio between the two each fall within a specific range.
- The fourth feature is that NRBCs have a slightly larger density difference between nucleus and cytoplasm than other cells.
- The preprocessing unit 14 applies image processing such as histogram normalization, color matching by principal component analysis, a mean filter, and a median filter to the captured image acquired by the captured image acquisition unit 12, normalizing its color and removing noise.
- The nucleus candidate region extraction unit 16 extracts, from the captured image denoised by the preprocessing unit 14, pixels whose color or density falls within a predetermined range as nucleus candidate regions.
- For example, the nucleus candidate region extraction unit 16 may binarize the pixels of the captured image with a predetermined color (or density) threshold; specifically, it may extract pixels whose color (or density) is darker than the threshold (or equal to or darker than it) as black pixels.
- FIG. 4 shows an example of the nucleus candidate pixels extracted by the nucleus candidate region extraction unit 16 from the captured image shown in FIG. 3. As shown, regions (pixels) that are candidates for nuclei are extracted from the captured image.
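As a concrete illustration of this thresholding step, the sketch below binarizes an RGB image with a per-channel color threshold. The threshold values are hypothetical placeholders: the patent only requires that the color or density lie in "a predetermined range".

```python
import numpy as np

def extract_nucleus_candidates(rgb, threshold=(90, 60, 120)):
    """Binarize a captured image: pixels at least as dark (densely
    stained) as a per-channel RGB cut-off become 1 (black pixels,
    i.e. nucleus candidates); all others become 0 (white pixels).
    The cut-off is an illustrative stand-in for the patent's
    'predetermined color (or density) threshold'."""
    rgb = np.asarray(rgb, dtype=np.int32)
    mask = np.all(rgb <= np.array(threshold), axis=-1)
    return mask.astype(np.uint8)

# Tiny 2x2 example: two dark blue-purple pixels, two light pixels.
img = [[[80, 50, 110], [200, 200, 200]],
       [[250, 240, 245], [85, 55, 100]]]
print(extract_nucleus_candidates(img))
```

In a real pipeline the threshold would be tuned to the May-Giemsa staining of the actual slides.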
- The target cell candidate region extraction unit 18 forms connected pixel groups by connecting adjacent pixels among the nucleus candidate pixels extracted by the nucleus candidate region extraction unit 16, and extracts those connected pixel groups that satisfy predetermined size and shape conditions as pixel groups that are candidates for the target cell (target cell candidate regions).
- For example, the target cell candidate region extraction unit 18 extracts as target cell candidates those connected pixel groups for which the height and width of the circumscribed rectangle, the ratio of the two, and the black-pixel density within the circumscribed rectangle each fall within a predetermined range.
- FIG. 5 shows an example of a pixel group extracted as a target cell candidate from the nucleus candidate pixels of FIG. 4. As shown in FIG. 5, the processing by the target cell candidate region extraction unit 18 further narrows the nucleus candidates to image regions that may be nuclei of target cells.
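The labeling and size/shape screening described above can be sketched as follows. The numeric ranges (bounding-box size, aspect ratio, black-pixel density) are illustrative assumptions; the patent leaves the predetermined ranges unspecified.

```python
import numpy as np
from collections import deque

def label_and_filter(mask, min_size=2, max_size=100,
                     max_aspect=3.0, min_density=0.5):
    """Group adjacent black pixels (4-connectivity) into connected
    pixel groups, then keep only groups whose circumscribed-rectangle
    height, width, aspect ratio, and black-pixel density fall within
    the (illustrative) predetermined ranges."""
    mask = np.asarray(mask, dtype=np.uint8)
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    kept = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # flood-fill one connected pixel group
                q, group = deque([(sy, sx)]), []
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    group.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys, xs = zip(*group)
                bh, bw = max(ys) - min(ys) + 1, max(xs) - min(xs) + 1
                aspect = max(bh, bw) / min(bh, bw)
                density = len(group) / (bh * bw)  # black pixels / box area
                if (min_size <= bh <= max_size and min_size <= bw <= max_size
                        and aspect <= max_aspect and density >= min_density):
                    kept.append(group)
    return kept

mask = [[1, 1, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
candidates = label_and_filter(mask)
print(len(candidates))  # the 2x2 blob survives; the lone pixel is rejected
```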
- The determination target region setting unit 20 sets, in the captured image, rectangular regions (determination target rectangular regions) of a given size (for example, N × M pixels) centered on pixels of the rectangular region (candidate rectangular region) set for each connected pixel group extracted by the target cell candidate region extraction unit 18.
- Specifically, the determination target region setting unit 20 selects one pixel in the candidate rectangular region, identifies the corresponding pixel in the captured image based on the position coordinates of the selected pixel, and sets a determination target region centered on the identified pixel.
- The determination target region setting unit 20 may sequentially select the pixels of the candidate rectangular region one at a time and set a determination target region for each selected pixel.
- FIG. 6 illustrates an example of a determination target region set in the captured image by the determination target region setting unit 20. As shown in FIG. 6, the determination target region is set around one pixel of the candidate rectangular region.
- The normalization unit 22 rotates the image in the determination target region set by the determination target region setting unit 20 so that its orientation matches a predetermined direction. For example, the normalization unit 22 obtains the barycentric position of the binarized image in the determination target region, calculates the rotation angle required to turn the orientation vector, which connects the center position of the determination target region to the calculated barycentric position, to a predetermined direction (for example, upward), and then rotates the image in the determination target region (a partial image of the captured image) by the calculated rotation angle.
- Note that the processing by the normalization unit 22 need not necessarily be performed.
- FIG. 7 is a diagram for explaining the flow of processing by the normalization unit 22.
- FIG. 7A shows a determination target region set in a captured image, and FIG. 7B shows the binarized image in the determination target region. FIG. 7C shows the image obtained by rotating the captured image of FIG. 7A by the rotation angle θ required to make the orientation vector of FIG. 7B point upward, and cutting it out in the determination target region.
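A minimal sketch of the angle computation just described: the vector from the region's center to the centroid of the binarized pixels is measured against the "up" direction. The sign convention for applying the rotation is an assumption, and rotating the actual pixels would be delegated to an image library (e.g. scipy.ndimage.rotate).

```python
import math
import numpy as np

def normalization_angle(binary_patch):
    """Angle (degrees) between 'up' and the orientation vector running
    from the patch centre to the centroid of its black pixels; rotating
    the patch by this angle (library convention permitting) makes the
    vector point upward, as in the normalization step."""
    b = np.asarray(binary_patch, dtype=float)
    h, w = b.shape
    ys, xs = np.nonzero(b)
    cy, cx = ys.mean(), xs.mean()            # centroid of black pixels
    # vector from patch centre to centroid (image y-axis points down)
    vy, vx = cy - (h - 1) / 2.0, cx - (w - 1) / 2.0
    # angle measured from 'up' (negative y direction)
    return math.degrees(math.atan2(vx, -vy))
```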
- The image interpolation unit 24 interpolates the image in the determination target region when the region includes an end of the captured image. For example, after expanding the determination target region to a predetermined size (2M × 2M), the image interpolation unit 24 sets as the center line the longest of the line segments, within the binarized image in the expanded region, that are parallel to the end of the captured image.
- The image interpolation unit 24 then obtains the distance L from the set center line to the end of the captured image and, in the expanded region, moves a partial region, (M − L) pixels perpendicular to the center line by 2M pixels parallel to it, taken from the side opposite the captured image's end with respect to the center line, to the line-symmetric position about the center line, thereby interpolating the image in the determination target region.
- FIG. 8 is a diagram for explaining the flow of processing by the image interpolation unit 24.
- FIG. 8A shows an example of the determination target area set by the determination target area setting unit 20. As shown in FIG. 8A, the end of the captured image is included in the determination target area.
- As shown in FIG. 8B, the image interpolation unit 24 expands the determination target region to 2M × 2M and sets as the center line the position of the longest line segment, in the binarized image within the region, parallel to the end of the captured image. Next, as shown in FIG. 8C, the image interpolation unit 24 obtains the length L between the center line and the end of the captured image.
- FIG. 8D shows the result of moving a partial region, (M − L) pixels perpendicular to the center line by 2M pixels parallel to it, to the line-symmetric position with respect to the center line.
- Note that the partial region to be combined may be inverted (mirrored) about the center line before being combined.
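The mirroring step can be sketched in one dimension as below, assuming the image edge lies along the right-hand columns of the expanded region; `edge_col` and `center_col` are hypothetical parameters standing in for the edge position and the detected center line.

```python
import numpy as np

def mirror_interpolate(region, edge_col, center_col):
    """Fill the columns of `region` beyond the captured-image edge
    (columns >= edge_col) by reflecting the image about the centre
    line at `center_col` -- a simplified 1-D sketch of the
    interpolation step (the patent sets the centre line along the
    longest segment parallel to the image edge)."""
    out = np.array(region, copy=True)
    h, w = out.shape[:2]
    for col in range(edge_col, w):
        src = 2 * center_col - col    # column mirrored about the centre line
        if 0 <= src < w:
            out[:, col] = out[:, src]
    return out

region = [[1, 2, 3, 0],
          [4, 5, 6, 0]]               # rightmost column lies past the edge
out = mirror_interpolate(region, edge_col=3, center_col=2)
print(out.tolist())
```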
- The feature amount calculation unit 26 calculates an image feature amount for the image within the determination target region set by the determination target region setting unit 20 (preferably the image after normalization by the normalization unit 22, though an image without normalization processing may also be used).
- For example, an HOG feature amount may be used as the image feature amount.
- FIG. 9 is a diagram for explaining the HOG feature quantity calculated by the feature quantity calculation unit 26.
- Specifically, the determination target region is divided into a predetermined number of partial regions (for example, 4 × 4), the direction of the luminance gradient is calculated at each pixel in each partial region, a histogram of the calculated luminance gradient directions is computed for each partial region, and the HOG feature amount is obtained by concatenating the histograms of the partial regions.
- The luminance gradient directions may be, for example, eight directions (up, upper right, right, lower right, down, lower left, left, upper left).
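The computation just described (partial-region division, gradient-direction histograms, concatenation) can be sketched as follows. This is a minimal, unnormalized version with magnitude-weighted bins, not a full Dalal-Triggs HOG implementation; the 16 × 16 input and 4 × 4 cell grid are illustrative.

```python
import numpy as np

def hog_feature(gray, cells=(4, 4), bins=8):
    """HOG-style feature: split the region into cells[0] x cells[1]
    partial regions, histogram each pixel's luminance gradient
    direction into `bins` orientations, and concatenate the
    per-cell histograms (magnitude-weighted, no block normalization)."""
    g = np.asarray(gray, dtype=float)
    gy, gx = np.gradient(g)                       # luminance gradients
    ang = np.arctan2(gy, gx)                      # direction in (-pi, pi]
    bin_idx = np.floor((ang + np.pi) / (2 * np.pi) * bins).astype(int) % bins
    mag = np.hypot(gx, gy)
    h, w = g.shape
    ch, cw = h // cells[0], w // cells[1]
    feats = []
    for i in range(cells[0]):
        for j in range(cells[1]):
            cb = bin_idx[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
            cm = mag[i * ch:(i + 1) * ch, j * cw:(j + 1) * cw]
            feats.append(np.bincount(cb.ravel(), weights=cm.ravel(),
                                     minlength=bins))
    return np.concatenate(feats)

feat = hog_feature(np.tile(np.arange(16.0), (16, 1)))  # horizontal ramp
print(feat.shape)  # → (128,)  i.e. 4 x 4 cells x 8 bins
```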
- The learning data acquisition unit 28 acquires sample images of positive and negative examples of the target cell and acquires an image feature amount for each acquired sample image. For example, the learning data acquisition unit 28 may have the feature amount calculation unit 26 calculate the HOG feature amount of each sample image and obtain the result.
- The learning unit 30 learns the condition (criterion) of the image feature amount that discriminates the target cell from other objects, based on the image feature amounts of the positive and negative examples acquired by the learning data acquisition unit 28.
- The learning may be performed using a learning algorithm such as Support Vector Machine or AdaBoost.
- The learned condition of the image feature amount is represented, for example, by a hyperplane separating image feature amounts that match the target cell from those that do not.
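The separating hyperplane can be sketched as below. The patent names Support Vector Machine or AdaBoost; a plain perceptron is substituted here purely as a minimal stand-in that also yields a hyperplane (w, b), together with a decision step in the spirit of the determination unit.

```python
import numpy as np

def learn_hyperplane(X_pos, X_neg, epochs=100, lr=0.1):
    """Learn a separating hyperplane (w, b) from the image feature
    amounts of positive and negative sample images. Perceptron
    updates are a stand-in for the SVM/AdaBoost training the
    patent names."""
    X = np.vstack([X_pos, X_neg])
    y = np.array([1] * len(X_pos) + [-1] * len(X_neg))
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:    # misclassified: nudge hyperplane
                w += lr * yi * xi
                b += lr * yi
    return w, b

def is_target_cell(feature, w, b):
    """A region is judged to contain the target cell when its feature
    amount lies on the positive side of the learned hyperplane."""
    return feature @ w + b > 0

# Toy features: positives up-right, negatives down-left (illustrative only).
w, b = learn_hyperplane([[2.0, 2.0], [3.0, 1.0]], [[-2.0, -1.0], [-1.0, -3.0]])
print(is_target_cell(np.array([2.5, 1.5]), w, b))  # → True
```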
- The determination unit 32 determines whether the image in the determination target region represents the target cell, based on whether the image feature amount calculated by the feature amount calculation unit 26 for the image in the determination target region set by the determination target region setting unit 20 satisfies the condition, learned by the learning unit 30, of the image feature amount that discriminates the target cell from other objects.
- The target cell region storage unit 34 stores the coordinate range in the captured image corresponding to each determination target region determined by the determination unit 32 to contain the target cell.
- Note that the target cell region storage unit 34 may store, as the region where the target cell exists, the portion where a plurality of determination target regions determined to contain the target cell overlap.
- The result output unit 36 outputs a result based on the coordinate ranges of the captured image stored in the target cell region storage unit 34.
- For example, the result output unit 36 may cause the display device 6 to display the coordinate ranges of the captured image stored in the target cell region storage unit 34, or may perform processing to move the imaging position of the optical microscope 2 to such a coordinate range.
- Next, an example of the flow of processing performed in the image processing apparatus 10 will be described with reference to the flowcharts shown in FIG. 10, FIG. 11A, FIG. 11B, and FIG. 12.
- FIG. 10 shows a flowchart of the learning process of the image feature performed based on the positive and negative examples of the target cell.
- the image processing apparatus 10 acquires a positive example image of a target cell (S101), calculates an image feature quantity (HOG feature quantity) from the acquired positive example image, and generates positive example learning data (S102).
- Next, the image processing apparatus 10 acquires a negative example image of the target cell (S103), calculates an image feature amount (HOG feature amount) from the acquired negative example image, and generates negative example learning data (S104).
- The image processing apparatus 10 then learns the state (model parameters) of the classifier that identifies the image feature amount of the target cell, based on the positive and negative example learning data (S105), stores the model parameters (S106), and ends the learning process.
- The image processing apparatus 10 acquires a captured image of maternal blood taken with the optical microscope 2 (S201) and applies preprocessing such as a median filter to the acquired image (S202). The image processing apparatus 10 then generates a binarized image from the preprocessed captured image, in which pixels whose color (for example, RGB value) lies in a predetermined range are set to 1 (black pixels) and all other pixels to 0 (white pixels) (S203). Here, the black pixels of the binarized image indicate nucleus candidate regions.
- The image processing apparatus 10 connects adjacent black pixels in the binarized image to generate connected pixel groups and labels each connected pixel group (S204).
- If an unselected connected pixel group remains, the image processing apparatus 10 increments i (S210) and returns to S206; if no unselected connected pixel group remains (S209: N), it proceeds to S211.
- The image processing apparatus 10 calculates the image feature amount of the image in the set determination target region (S216) and determines whether the target cell is included in the determination target region, based on the calculated image feature amount and the previously learned model parameters of the classifier that identifies the image feature amount of the target cell (S217). If it determines that the target cell is included (S217: Y), it stores the coordinate range of the captured image corresponding to the determination target region (S218).
- When the determination target region includes an end of the captured image, the image processing apparatus 10 expands the region to 2M × 2M pixels (S302) and sets as the center line the longest of the line segments, in the binarized image within the expanded region, that are parallel to the end of the captured image (S303).
- The image processing apparatus 10 obtains the distance L from the set center line to the end of the captured image (S304) and, in the expanded region, moves a partial region, (M − L) pixels perpendicular to the center line by 2M pixels parallel to it, taken from the side opposite the end of the captured image, to the line-symmetric position with respect to the center line, thereby interpolating the image in the determination target region (S305).
- In the image processing system 1 according to the present embodiment described above, candidates for the NRBCs (target cells) contained in maternal blood are first narrowed down by color or density; the candidates obtained by this first narrowing are then narrowed down a second time by the size and shape of NRBCs; determination target regions are set based on the result; and the image feature amounts obtained from the determination target regions are evaluated against the criterion learned from positive and negative examples of NRBCs.
- the present invention is not limited to the above embodiment.
- In the present embodiment, an example of the image processing system 1 in which captured images of the sample are sequentially input from the optical microscope 2 has been described; alternatively, the image processing apparatus may receive a request from an information processing apparatus to search a captured image for the target cell and return the search result to the information processing apparatus.
Claims (15)
- 核を有する標的細胞を含む試料を撮像した撮像画像を取得する取得手段と、
前記撮像画像に含まれる画素の中から、前記核の候補となる画素を、当該候補となる画素が有すべき色又は輝度の少なくとも一方について予め定められた第1の条件に基づいて抽出する第1抽出手段と、
前記第1抽出手段により抽出された画素について隣り合う画素をそれぞれ連結した連結画素群の中から、前記標的細胞の候補となる連結画素群を、当該候補となる連結画素群が有すべき大きさ及び形状について予め定められた第2の条件に基づいて抽出する第2抽出手段と、
a setting means for setting, in the captured image, a rectangular region of a given size centered on a pixel included in a connected pixel group extracted by the second extraction means; and
a determination means for determining whether the target cell is included in the rectangular region, based on whether an image feature obtained from the rectangular region set by the setting means satisfies an image-feature condition.
- The image processing apparatus according to claim 1, wherein the image-feature condition has been machine-learned from positive and negative sample images of the target cell.
- The image processing apparatus according to claim 1 or 2, further comprising a computation means for dividing the rectangular region set by the setting means into predetermined sub-regions, calculating the direction of the luminance gradient at each pixel in each sub-region, computing a histogram of the calculated gradient directions for each sub-region, and concatenating the histograms of the sub-regions to compute the image feature obtained from the rectangular region.
- The image processing apparatus according to any one of claims 1 to 3, wherein the image feature is an HOG (Histograms of Oriented Gradients) feature.
- The image processing apparatus according to any one of claims 1 to 4, further comprising an image-feature acquisition means for rotating the captured image included in the rectangular region so that the vector connecting the center of the rectangular region set by the setting means and the centroid of the connected pixel group included in that region points in a predetermined direction, and for then acquiring an image feature from the rotated captured image, wherein the determination means determines whether the target cell is included in the rectangular region based on whether the image feature acquired by the image-feature acquisition means satisfies the image-feature condition.
- The image processing apparatus according to any one of claims 1 to 5, further comprising an interpolation means for interpolating, when the rectangular region set by the setting means contains an edge of the captured image, by joining to the edge side that part of the image within the rectangular region which lies outside, with respect to a center line set within the connected pixel group included in the rectangular region, the area symmetric to the edge about that center line.
- The image processing apparatus according to any one of claims 1 to 6, which causes a display device to display the coordinate region of the captured image corresponding to a rectangular region determined by the determination means to contain the target cell.
- An image processing method comprising: an acquisition step of acquiring a captured image of a sample containing a target cell having a nucleus; a first extraction step of extracting, from the pixels included in the captured image, pixels that are candidates for the nucleus, based on a first condition predetermined for at least one of the color and the luminance that the candidate pixels should have; a second extraction step of extracting, from connected pixel groups each formed by connecting adjacent pixels among the pixels extracted in the first extraction step, connected pixel groups that are candidates for the target cell, based on a second condition predetermined for the size and shape that the candidate connected pixel groups should have; a setting step of setting, in the captured image, a rectangular region of a given size centered on a pixel included in a connected pixel group extracted in the second extraction step; and a determination step of determining whether the target cell is included in the rectangular region, based on whether an image feature obtained from the rectangular region set in the setting step satisfies an image-feature condition.
- The image processing method according to claim 8, wherein the image-feature condition has been machine-learned from positive and negative sample images of the target cell.
- The image processing method according to claim 8 or 9, further comprising a computation step of dividing the rectangular region set in the setting step into predetermined sub-regions, calculating the direction of the luminance gradient at each pixel in each sub-region, computing a histogram of the calculated gradient directions for each sub-region, and concatenating the histograms of the sub-regions to compute the image feature obtained from the rectangular region.
- The image processing method according to any one of claims 8 to 10, wherein the image feature is an HOG feature.
- An image processing system comprising an image processing apparatus, an optical microscope connected to the image processing apparatus, and a display device connected to the image processing apparatus, wherein the image processing apparatus comprises: an acquisition means for acquiring a captured image of a sample containing a target cell having a nucleus; a first extraction means for extracting, from the pixels included in the captured image, pixels that are candidates for the nucleus, based on a first condition predetermined for at least one of the color and the luminance that the candidate pixels should have; a second extraction means for extracting, from connected pixel groups each formed by connecting adjacent pixels among the pixels extracted by the first extraction means, connected pixel groups that are candidates for the target cell, based on a second condition predetermined for the size and shape that the candidate connected pixel groups should have; a setting means for setting, in the captured image, a rectangular region of a given size centered on a pixel included in a connected pixel group extracted by the second extraction means; and a determination means for determining whether the target cell is included in the rectangular region, based on whether an image feature obtained from the rectangular region set by the setting means satisfies an image-feature condition.
- The image processing apparatus according to claim 12, wherein the image-feature condition has been machine-learned from positive and negative sample images of the target cell.
- The image processing apparatus according to claim 12 or 13, further comprising a computation means for dividing the rectangular region set by the setting means into predetermined sub-regions, calculating the direction of the luminance gradient at each pixel in each sub-region, computing a histogram of the calculated gradient directions for each sub-region, and concatenating the histograms of the sub-regions to compute the image feature obtained from the rectangular region.
- The image processing apparatus according to any one of claims 12 to 14, wherein the image feature is an HOG feature.
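Claims 1, 3 and 8 together describe a concrete detection flow: threshold pixels into nucleus candidates, group adjacent candidates into connected pixel groups, keep groups of plausible size, then compute a gradient-orientation (HOG-style) feature over a window around each group. The sketch below follows that flow in NumPy; the function names, the luminance threshold, the size bounds and the 2×2-cell/8-bin feature layout are illustrative assumptions, and the machine-learned image-feature condition of claim 2 is not reproduced here.

```python
import numpy as np

def nucleus_candidates(img, thresh=0.5):
    """First extraction: pixels satisfying a fixed luminance condition
    (here simply 'darker than thresh'; the claims allow color and/or luminance)."""
    return img < thresh

def connected_components(mask):
    """Second extraction, part 1: label 4-connected groups of candidate pixels."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        n += 1
        stack = [seed]
        labels[seed] = n
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = n
                    stack.append((ny, nx))
    return labels, n

def size_filter(labels, n, lo=4, hi=400):
    """Second extraction, part 2: keep groups whose pixel count is plausible for
    a nucleus (a stand-in for the claimed size-and-shape condition)."""
    return [k for k in range(1, n + 1) if lo <= (labels == k).sum() <= hi]

def hog_feature(patch, cells=2, bins=8):
    """Claim 3: split the window into cells x cells sub-regions, histogram the
    luminance-gradient direction in each (magnitude-weighted), and concatenate."""
    gy, gx = np.gradient(patch.astype(float))
    ang = np.arctan2(gy, gx) % np.pi          # unsigned orientation in [0, pi)
    mag = np.hypot(gx, gy)
    h, w = patch.shape
    feats = []
    for i in range(cells):
        for j in range(cells):
            sl = (slice(i * h // cells, (i + 1) * h // cells),
                  slice(j * w // cells, (j + 1) * w // cells))
            hist, _ = np.histogram(ang[sl], bins=bins, range=(0, np.pi),
                                   weights=mag[sl])
            feats.append(hist)
    return np.concatenate(feats)
```

In the claims the feature vector is then judged against a condition learned from positive and negative sample images; here it is simply returned, for such a classifier to consume.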
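Claim 5 normalizes orientation before feature extraction: the window is rotated so that the vector from the window center to the centroid of the connected pixel group points in a predetermined direction. A dependency-free sketch with nearest-neighbour resampling follows; the choice of 'up' as the reference direction and all names are assumptions, not taken from the patent.

```python
import numpy as np

def rotate_nn(img, alpha, cy, cx):
    """Rotate img by angle alpha about (cy, cx), nearest-neighbour sampling."""
    h, w = img.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    ca, sa = np.cos(-alpha), np.sin(-alpha)   # inverse map: output -> source pixel
    sy = np.rint(ca * (yy - cy) - sa * (xx - cx) + cy).astype(int)
    sx = np.rint(sa * (yy - cy) + ca * (xx - cx) + cx).astype(int)
    ok = (sy >= 0) & (sy < h) & (sx >= 0) & (sx < w)
    out = np.zeros_like(img)
    out[ok] = img[sy[ok], sx[ok]]
    return out

def normalize_orientation(patch, mask):
    """Rotate patch (and mask) so the center-to-centroid vector points 'up'."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    ys, xs = np.nonzero(mask)
    dy, dx = ys.mean() - cy, xs.mean() - cx   # center -> centroid of candidates
    alpha = np.pi - np.arctan2(dx, dy)        # rotation mapping it onto (-y, 0)
    return (rotate_nn(patch, alpha, cy, cx),
            rotate_nn(mask.astype(float), alpha, cy, cx) > 0.5)
```

The point of the normalization is that the learned image-feature condition then only has to cover one canonical orientation of each candidate.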
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11867361.5A EP2719754B1 (en) | 2011-06-09 | 2011-11-07 | Image processing apparatus, image processing method and image processing system |
RU2013153508/08A RU2595495C2 (ru) | 2011-06-09 | 2011-11-07 | Image processing device, image processing method, and image processing system |
CN201180071407.8A CN103582697B (zh) | 2011-06-09 | 2011-11-07 | Image processing device, image processing method, and image processing system |
BR112013030787A BR112013030787A8 (pt) | 2011-06-09 | 2011-11-07 | Image processing device, image processing method, and image processing system |
US14/097,500 US9363486B2 (en) | 2011-06-09 | 2013-12-05 | Image processing device, image processing method, and image processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011129350A JP5413408B2 (ja) | 2011-06-09 | 2011-06-09 | Image processing device, program, and image processing system |
JP2011-129350 | 2011-06-09 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/097,500 Continuation US9363486B2 (en) | 2011-06-09 | 2013-12-05 | Image processing device, image processing method, and image processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012169088A1 true WO2012169088A1 (ja) | 2012-12-13 |
Family
ID=47295683
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/075626 WO2012169088A1 (ja) | Image processing apparatus, image processing method, and image processing system |
Country Status (7)
Country | Link |
---|---|
US (1) | US9363486B2 (ja) |
EP (1) | EP2719754B1 (ja) |
JP (1) | JP5413408B2 (ja) |
CN (1) | CN103582697B (ja) |
BR (1) | BR112013030787A8 (ja) |
RU (1) | RU2595495C2 (ja) |
WO (1) | WO2012169088A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014030380A1 (ja) * | 2012-08-23 | 2014-02-27 | Fuji Xerox Co., Ltd. | Image processing device, program, image processing method, computer-readable medium, and image processing system |
WO2015098199A1 (ja) * | 2013-12-27 | 2015-07-02 | Fuji Xerox Co., Ltd. | Image processing apparatus, program, storage medium, and image processing method |
WO2015107722A1 (ja) * | 2014-01-20 | 2015-07-23 | Fuji Xerox Co., Ltd. | Detection control device, program, detection system, storage medium, and detection control method |
EP3006550A1 (en) * | 2013-05-31 | 2016-04-13 | Fuji Xerox Co., Ltd. | Image processing device, program, storage medium, and image processing method |
EP3006551A4 (en) * | 2013-05-31 | 2017-02-15 | Fuji Xerox Co., Ltd. | Image processing device, image processing method, program, and storage medium |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3009501A4 (en) * | 2013-07-19 | 2017-01-25 | Sony Corporation | Cell evaluation device, method, and program |
JP6277718B2 (ja) * | 2013-12-27 | 2018-02-14 | Dai Nippon Printing Co., Ltd. | Culture medium information registration system, colony detection device, program, and hygiene management system |
JP6194791B2 (ja) | 2013-12-27 | 2017-09-13 | Fuji Xerox Co., Ltd. | Image processing device and program |
WO2016021310A1 (ja) * | 2014-08-05 | 2016-02-11 | Fujifilm Corporation | Method for testing fetal chromosomes |
WO2018068511A1 (zh) * | 2016-10-10 | 2018-04-19 | 深圳市瀚海基因生物科技有限公司 | Image processing method and system for gene sequencing |
US10467749B2 (en) | 2016-10-10 | 2019-11-05 | Genemind Biosciences Company Limited | Method and system for processing an image comprising spots in nucleic acid sequencing |
CN110869485A (zh) * | 2017-06-26 | 2020-03-06 | Olympus Corporation | Cell observation system |
CN110799636A (zh) * | 2017-06-26 | 2020-02-14 | Olympus Corporation | Cell observation system |
CN107543788A (zh) * | 2017-08-15 | 2018-01-05 | 焦作市人民医院 | Method and system for detecting the deformity rate of urinary red blood cells |
JP6627069B2 (ja) * | 2018-06-01 | 2020-01-08 | 株式会社フロンティアファーマ | Image processing method, drug sensitivity test method, and image processing device |
WO2020026349A1 (ja) * | 2018-07-31 | 2020-02-06 | Olympus Corporation | Image diagnosis support system and image diagnosis support device |
WO2020037570A1 (zh) | 2018-08-22 | 2020-02-27 | 深圳市真迈生物科技有限公司 | Image registration method, device, and computer program product |
WO2020037573A1 (zh) | 2018-08-22 | 2020-02-27 | 深圳市真迈生物科技有限公司 | Method, device, and computer program product for detecting bright spots in an image |
EP3843033B1 (en) | 2018-08-22 | 2024-05-22 | GeneMind Biosciences Company Limited | Method for constructing sequencing template based on image, and base recognition method and device |
JP7434008B2 (ja) | 2019-04-01 | 2024-02-20 | Canon Medical Systems Corporation | Medical image processing apparatus and program |
US11481957B2 (en) * | 2019-04-01 | 2022-10-25 | Canon Medical Systems Corporation | Medical image processing apparatus and storage medium |
JP7416074B2 (ja) * | 2019-09-10 | 2024-01-17 | Nikon Corporation | Image processing device, image processing method, and program |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004248619A (ja) * | 2003-02-21 | 2004-09-09 | Haruo Takabayashi | Automatic target cell search system |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2903137B2 (ja) * | 1992-09-10 | 1999-06-07 | Sumitomo Metal Industries, Ltd. | Nucleus extraction method |
JPH10185911A (ja) | 1996-10-23 | 1998-07-14 | K O Denshi Kogyo Kk | Cell analysis apparatus and method |
US6169816B1 (en) * | 1997-05-14 | 2001-01-02 | Applied Imaging, Inc. | Identification of objects of interest using multiple illumination schemes and finding overlap of features in corresponding multiple images |
JP2002541438A (ja) * | 1999-02-18 | 2002-12-03 | Bio-View Ltd. | System and method for identification and analysis of rare cell types in a mixed cell population |
JP3915033B2 (ja) | 2003-05-15 | 2007-05-16 | 株式会社テクノホロン | Measurement method and measurement apparatus using a stereo optical system |
JP5245424B2 (ja) | 2008-01-25 | 2013-07-24 | NEC Corporation | Pathological tissue imaging system, pathological tissue imaging method, and pathological tissue imaging program |
JP5380026B2 (ja) * | 2008-09-24 | 2014-01-08 | Sysmex Corporation | Sample imaging apparatus |
RU2385494C1 (ru) * | 2008-10-22 | 2010-03-27 | State Educational Institution of Higher Professional Education Moscow Engineering Physics Institute (State University) | Method for recognizing cell texture images |
EP3065105B1 (en) | 2009-06-12 | 2020-04-29 | Nikon Corporation | Technique for determining the state of a cell aggregation, image processing program and image processing device using the technique, and method for producing a cell aggregation |
JPWO2010146802A1 (ja) | 2009-06-19 | 2012-11-29 | Nikon Corporation | Method for determining the state of a cell mass, image processing program and image processing device using the method, and method for producing a cell mass |
EP2463379A1 (en) | 2009-07-31 | 2012-06-13 | Nikon Corporation | Technique for determining maturity of cell mass, and image processing program and image processing device which use the technique, and method for producing cell mass |
WO2011016189A1 (ja) | 2009-08-07 | 2011-02-10 | Nikon Corporation | Cell classification method, image processing program and image processing device using the method, and method for producing a cell mass |
2011
- 2011-06-09 JP JP2011129350A patent/JP5413408B2/ja active Active
- 2011-11-07 WO PCT/JP2011/075626 patent/WO2012169088A1/ja unknown
- 2011-11-07 CN CN201180071407.8A patent/CN103582697B/zh active Active
- 2011-11-07 RU RU2013153508/08A patent/RU2595495C2/ru active
- 2011-11-07 EP EP11867361.5A patent/EP2719754B1/en active Active
- 2011-11-07 BR BR112013030787A patent/BR112013030787A8/pt not_active IP Right Cessation
2013
- 2013-12-05 US US14/097,500 patent/US9363486B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004248619A (ja) * | 2003-02-21 | 2004-09-09 | Haruo Takabayashi | Automatic target cell search system |
JP4346923B2 (ja) | 2003-02-21 | 2009-10-21 | Haruo Takabayashi | Automatic target cell search system |
Non-Patent Citations (4)
Title |
---|
"Automated Extraction of Nucleated Red Blood Cells from Considerable Microscope Images", THE JOURNAL OF THE INSTITUTE OF IMAGE ELECTRONICS ENGINEERS OF JAPAN, vol. 37, no. 5, September 2008 (2008-09-01) |
DI CATALDO, S. ET AL.: "Automated segmentation of tissue images for computerized IHC analysis.", COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, vol. 100, no. 1, 2010, pages 1 - 15, XP055138788 * |
ETSUKO ICHIDA ET AL.: "Auto-extraction of Nucleated Red Blood Cells from Massive Microscopy Images", IEICE TECHNICAL REPORT, vol. 107, no. 461, 2008, pages 291 - 296, XP008172017 * |
PLISSITI, MARINA E. ET AL.: "Combining shape, texture and intensity features for cell nuclei extraction in pap smear images.", PATTERN RECOGNITION LETTERS, vol. 32, no. 6, April 2011 (2011-04-01), pages 838 - 853, XP028182464 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014030380A1 (ja) * | 2012-08-23 | 2014-02-27 | Fuji Xerox Co., Ltd. | Image processing device, program, image processing method, computer-readable medium, and image processing system |
US9934571B2 (en) | 2012-08-23 | 2018-04-03 | Fuji Xerox Co., Ltd. | Image processing device, program, image processing method, computer-readable medium, and image processing system |
EP3006551A4 (en) * | 2013-05-31 | 2017-02-15 | Fuji Xerox Co., Ltd. | Image processing device, image processing method, program, and storage medium |
EP3006550A1 (en) * | 2013-05-31 | 2016-04-13 | Fuji Xerox Co., Ltd. | Image processing device, program, storage medium, and image processing method |
EP3006550A4 (en) * | 2013-05-31 | 2017-03-29 | Fuji Xerox Co., Ltd. | Image processing device, program, storage medium, and image processing method |
US9858662B2 (en) | 2013-05-31 | 2018-01-02 | Fuji Xerox Co., Ltd. | Image processing device, computer storage medium, and method for detecting and displaying nucleated target cells |
US10395091B2 (en) | 2013-05-31 | 2019-08-27 | Fujifilm Corporation | Image processing apparatus, image processing method, and storage medium identifying cell candidate area |
WO2015098199A1 (ja) * | 2013-12-27 | 2015-07-02 | Fuji Xerox Co., Ltd. | Image processing apparatus, program, storage medium, and image processing method |
US10146042B2 (en) | 2013-12-27 | 2018-12-04 | Fujifilm Corporation | Image processing apparatus, storage medium, and image processing method |
JP2015137857A (ja) * | 2014-01-20 | 2015-07-30 | Fuji Xerox Co., Ltd. | Detection control device, program, and detection system |
CN105637343A (zh) * | 2014-01-20 | 2016-06-01 | Fuji Xerox Co., Ltd. | Detection control device, program, detection system, storage medium, and detection control method |
WO2015107722A1 (ja) * | 2014-01-20 | 2015-07-23 | Fuji Xerox Co., Ltd. | Detection control device, program, detection system, storage medium, and detection control method |
US10007834B2 (en) | 2014-01-20 | 2018-06-26 | Fujifilm Corporation | Detection control device, detection system, non-transitory storage medium, and detection control method |
Also Published As
Publication number | Publication date |
---|---|
US20140092228A1 (en) | 2014-04-03 |
BR112013030787A2 (pt) | 2016-09-06 |
EP2719754A1 (en) | 2014-04-16 |
EP2719754B1 (en) | 2021-06-23 |
JP2012254042A (ja) | 2012-12-27 |
CN103582697B (zh) | 2017-04-05 |
RU2013153508A (ru) | 2015-07-20 |
BR112013030787A8 (pt) | 2017-12-19 |
CN103582697A (zh) | 2014-02-12 |
EP2719754A4 (en) | 2015-01-14 |
RU2595495C2 (ru) | 2016-08-27 |
JP5413408B2 (ja) | 2014-02-12 |
US9363486B2 (en) | 2016-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012169088A1 (ja) | Image processing apparatus, image processing method, and image processing system | |
US9704017B2 (en) | Image processing device, program, image processing method, computer-readable medium, and image processing system | |
JP5333570B2 (ja) | Image processing device, program, and image processing system | |
US9684958B2 (en) | Image processing device, program, image processing method, computer-readable medium, and image processing system | |
US9934571B2 (en) | Image processing device, program, image processing method, computer-readable medium, and image processing system | |
JP5413501B1 (ja) | Image processing device, image processing system, and program | |
EP3006551B1 (en) | Image processing device, image processing method, program, and storage medium | |
CN109182081A (zh) | Single-cell sorting system based on an image processing model | |
WO2021000948A1 (zh) | Counterweight weight detection method and system, acquisition method and system, and crane | |
GB2485209A (en) | Curve fitting to histogram data for identification of chromatin types in nuclei | |
JP5861678B2 (ja) | Image processing device, program, and image processing system | |
JP5907125B2 (ja) | Image processing device, program, and image processing system | |
US10146042B2 (en) | Image processing apparatus, storage medium, and image processing method | |
CN117710338A (zh) | Defect detection method for mirror-like surfaces | |
CN112926676A (zh) | False target recognition method, device, and computer equipment | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11867361 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2013153508 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112013030787 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112013030787 Country of ref document: BR Kind code of ref document: A2 Effective date: 20131129 |