WO2011080808A1 - Radiation image processing apparatus and radiation image processing program
- Publication number: WO2011080808A1 (application PCT/JP2009/007368)
- Authority: WIPO (PCT)
Classifications
- G06T7/12 — Edge-based segmentation
- G06T7/168 — Segmentation; edge detection involving transform domain methods
- A61B6/4233 — Radiation detection for radiation diagnosis using matrix detectors
- A61B6/5258 — Detection or reduction of artifacts or noise in radiation diagnosis
- G06T2207/10116 — X-ray image
- G06T2207/20056 — Discrete and fast Fourier transform [DFT, FFT]
- G06T2207/20061 — Hough transform
- G06T2207/20064 — Wavelet transform [DWT]
- G06T2207/30004 — Biomedical image processing
Description
- The present invention relates to a radiographic image processing apparatus and a radiographic image processing program for processing a radiographic image obtained by radiography.
- In radiography, a lead plate or the like that blocks radiation is used to mask the portion where radiation would strike the detector directly, and imaging is performed with a narrowed radiation irradiation field; image processing is then applied to the captured image.
- In Patent Document 1 of the prior art, feature points are extracted from the entire image data by filtering or differentiation. Therefore, when the image contains noise, many unnecessary feature points are detected in addition to the features corresponding to the boundary of the required irradiation field region, and the detection accuracy deteriorates.
- In another prior-art approach, the extracted edge points are classified into a plurality of groups and boundary candidates are selected from them. However, the number of candidate-point combinations can become enormous, which complicates the calculation. Moreover, since collimators usually have a rectangular shape, the irradiation field region is also rectangular and its boundary forms a linear pattern; there is the further problem that extraneous linear patterns (for example, linear data along the side of the subject) are also detected.
- The present invention has been made in view of such circumstances, and an object thereof is to provide a radiographic image processing apparatus and a radiographic image processing program that achieve highly accurate feature extraction and region extraction by reducing the influence of noise and the like, while also reducing the amount of calculation.
- The radiographic image processing apparatus of the present invention is a radiographic image processing apparatus that processes a radiographic image obtained by radiography, and comprises: low-frequency image generating means for generating a low-frequency image, which is an image of lower frequency than the radiographic image, by reducing the spatial resolution of the radiographic image; feature extraction means for performing feature extraction by obtaining, for an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and for generating a feature image by associating the feature amount with each pixel; low-frequency feature generating means for generating a low-frequency feature, which is an image of lower frequency than the feature image, by reducing the spatial resolution of the feature image; and region extracting means for selecting, based on the low-frequency feature, a feature serving as the boundary of the radiation irradiation field region and extracting the irradiation field region from the radiographic image.
- The low-frequency image generating means generates a low-frequency image, which is an image of lower frequency than the radiographic image, by reducing the spatial resolution of the radiographic image.
- The feature extraction means performs feature extraction by obtaining, for an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and generates a feature image by associating the feature amount with each pixel of the low-frequency image.
- The low-frequency feature generating means generates a low-frequency feature, which is an image of lower frequency than the feature image, by reducing the spatial resolution of the feature image.
- The region extracting means selects, based on the low-frequency feature, a feature serving as the boundary of the radiation irradiation field region, and extracts the irradiation field region from the radiographic image.
- In this way, the spatial resolution is lowered a total of two times, by the low-frequency image generating means and by the low-frequency feature generating means, so that the influence of noise and the like on the low-frequency feature generated by the low-frequency feature generating means, and on the irradiation field region extracted by the subsequent region extracting means, is reduced, and the amount of calculation is reduced as well.
- One example of the low-frequency image generating means generates the low-frequency image by reducing (shrinking) the radiographic image, thereby lowering its spatial resolution.
- Another example converts the radiographic image into the spatial frequency domain and generates the low-frequency image by converting the low-frequency region of the transformed spatial frequency domain back into real space, which likewise lowers the spatial resolution of the radiographic image.
- Alternatively, the low-frequency image may be generated by smoothing the radiographic image by filtering.
- Likewise, one example of the low-frequency feature generating means generates the low-frequency feature by reducing the feature image, thereby lowering its spatial resolution.
- Another example converts the feature image into the spatial frequency domain and generates the low-frequency feature by converting the low-frequency region of the transformed spatial frequency domain back into real space.
- Alternatively, the low-frequency feature may be generated by smoothing the feature image by filtering.
- One example of the feature extraction means performs the feature extraction by obtaining a gradient strength based on the signal level difference between the pixel and its peripheral pixels.
- Preferably, the region extracting means binarizes the low-frequency feature using a preset threshold value to create binary data indicating the presence or absence of a feature, and selects the feature serving as the boundary of the irradiation field region based on this binary data. Creating binary data in this way further removes extraneous patterns.
- Preferably, the region extracting means uses a Hough transform, which maps coordinates on a two-dimensional plane into a space defined by the normal distance from a reference origin to a straight line on the plane and the angle formed by that normal with a reference axis. The low-frequency features are projected onto this space of distance and angle to obtain a plurality of sinusoids, and straight lines that are candidates for the boundary of the irradiation field region are detected based on the locations where the sinusoids intersect and the number of intersections. Detecting straight lines with the Hough transform in this way makes it easy to find candidate boundary lines of the irradiation field region.
- The radiographic image processing program of the present invention is a radiographic image processing program for causing a computer to process a radiographic image obtained by radiography, and causes the computer to execute: a low-frequency image generation step of generating a low-frequency image, which is an image of lower frequency than the radiographic image, by reducing the spatial resolution of the radiographic image; a feature extraction step of performing feature extraction by obtaining, for an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and generating a feature image by associating the feature amount with each pixel of the low-frequency image; a low-frequency feature generation step of generating a low-frequency feature, which is an image of lower frequency than the feature image, by reducing the spatial resolution of the feature image; and a region extraction step of selecting, based on the low-frequency feature, a feature serving as the boundary of the radiation irradiation field region and extracting the irradiation field region from the radiographic image.
- According to this program as well, the spatial resolution is lowered a total of two times, in the low-frequency image generation step and in the low-frequency feature generation step, thereby reducing the influence of noise and the like and reducing the amount of calculation.
- FIG. 1 is a block diagram of the radiographic image processing apparatus according to the embodiment.
- FIG. 2 is a schematic diagram of the detection surface of a flat panel radiation detector (FPD).
- FIG. 3(a) is a schematic diagram of an image before reduction, and FIG. 3(b) is a schematic diagram of the image after reduction.
- FIG. 4 is a flowchart showing the flow of a series of region extraction steps.
- FIGS. 5(a) to 5(d) and FIGS. 6(a) and 6(b) are schematic diagrams used to explain the Hough transform.
- FIG. 7 is a schematic diagram of an image used to explain the conditions for the left-side boundary line of the irradiation field region.
- As shown in FIG. 1, the radiographic image processing apparatus includes: a top plate 1 on which a subject M is placed; a radiation source 2 (for example, an X-ray tube) that irradiates the subject M with radiation (for example, X-rays); a flat panel radiation detector (hereinafter abbreviated as "FPD") 3 that detects the radiation emitted from the radiation source 2 and transmitted through the subject M; an image processing unit 4 that performs image processing on the radiographic image detected by the FPD 3; and a display unit 5 that displays the radiographic images subjected to the various types of image processing by the image processing unit 4.
- The display unit 5 includes display means such as a monitor or a television. Further, a collimator 21 for adjusting the irradiation field of the radiation is disposed on the irradiation side of the radiation source 2.
- The image processing unit 4 includes a central processing unit (CPU). Programs for performing the various types of image processing are written and stored in a storage medium, typically a ROM (Read-Only Memory), and the CPU of the image processing unit 4 reads each program from the storage medium and executes it, whereby the image processing corresponding to that program is performed.
- In particular, the low-frequency image generation unit 41, the feature extraction unit 42, the low-frequency feature generation unit 43, and the region extraction unit 44 of the image processing unit 4, described later, perform generation of a low-frequency image, feature extraction, generation of a low-frequency feature, and region extraction, respectively.
- The program relating to generation of the low-frequency image, feature extraction, generation of the low-frequency feature, and region extraction corresponds to the radiographic image processing program in the present invention.
- The image processing unit 4 includes: a low-frequency image generation unit 41 that generates a low-frequency image, which is an image of lower frequency than the radiographic image captured by the FPD 3, by reducing the spatial resolution of the radiographic image; a feature extraction unit 42 that performs feature extraction by obtaining, for an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference (for example, a pixel value difference or a luminance difference) between that pixel and its peripheral pixels, and generates a feature image by associating the feature amount with each pixel of the low-frequency image; a low-frequency feature generation unit 43 that generates a low-frequency feature, which is an image of lower frequency than the feature image, by reducing the spatial resolution of the feature image; and a region extraction unit 44 that selects, based on the low-frequency feature, a feature serving as the boundary of the radiation irradiation field region and extracts the irradiation field region from the radiographic image.
- The low-frequency image generation unit 41 corresponds to the low-frequency image generating means in the present invention, the feature extraction unit 42 corresponds to the feature extraction means, the low-frequency feature generation unit 43 corresponds to the low-frequency feature generating means, and the region extraction unit 44 corresponds to the region extracting means.
- As shown in FIG. 2, the FPD 3 is configured by arranging a plurality of detection elements d sensitive to radiation in a two-dimensional matrix on its detection surface. Each detection element d detects radiation by converting the radiation transmitted through the subject M into an electrical signal, temporarily storing it, and then reading out the stored electrical signal. The electrical signal detected by each detection element d is converted into a pixel value, and a radiographic image is output by assigning that pixel value to the pixel corresponding to the position of the detection element d. The radiographic image is sent to the low-frequency image generation unit 41 and the region extraction unit 44 (see FIG. 1) of the image processing unit 4.
- When an imaging button (not shown) is pressed, radiation is generated from the radiation source 2 and irradiated toward the subject M, and imaging starts in conjunction with this. The generated radiation passes through the collimator 21 and the subject M and enters the FPD 3; the FPD 3 detects the radiation and outputs a radiographic image, completing the imaging. Since the collimator 21 is disposed on the irradiation side of the radiation source 2, the radiation irradiation field region is determined in accordance with the shape of the collimator 21. The radiographic image corresponding to the determined irradiation field region is sent to the low-frequency image generation unit 41 of the image processing unit 4, and also to the region extraction unit 44.
- The low-frequency image generation unit 41 reduces the spatial resolution of the radiographic image and generates a low-frequency image, which is an image of lower frequency than the radiographic image. In this embodiment, the low-frequency image generation unit 41 generates the low-frequency image by reducing the radiographic image.
- The reduction ratio of the image, i.e. the size of the low-frequency image, is arbitrary. However, if the reduction ratio or the size of the low-frequency image is too small, the loss of information becomes too large; therefore, the low-frequency image is preferably about 1/8 the size of the radiographic image. When generating the low-frequency image by reducing to 1/8, the reduction process is executed by taking the average pixel value of each 8 × 8 block of pixels in the radiographic image shown in FIG. 3(a) as one pixel of the low-frequency image shown in FIG. 3(b). In FIG. 3(a), each region surrounded by a thick frame is 8 × 8 pixels, and the portions given the same hatching in FIG. 3(a) correspond to one pixel after reduction in FIG. 3(b).
- The method of generating the low-frequency image by reducing the radiographic image is not limited to taking the average value as described above; any commonly used reduction method, such as thinning out the pixels of the radiographic image, may be used.
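As a concrete illustration, the 8 × 8 block-averaging reduction described above can be sketched as follows. This is a minimal NumPy sketch; the function name and the crop-to-a-multiple-of-k behavior are illustrative assumptions, not part of the patent.

```python
import numpy as np

def reduce_by_averaging(image, k=8):
    """Reduce spatial resolution: each k x k block of the radiographic
    image becomes one pixel of the low-frequency image, whose value is
    the average pixel value of the block (FIG. 3(a) -> FIG. 3(b))."""
    h, w = image.shape
    image = image[:h - h % k, :w - w % k]    # crop so both sides divide by k
    return image.reshape(h // k, k, w // k, k).mean(axis=(1, 3))
```

Thinning out pixels, the other reduction method the text mentions, would instead be the simple slicing `image[::k, ::k]`.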
- The method of generating the low-frequency image is not limited to the reduction processing described above. Any commonly used method of generating a low-frequency image may be used, such as smoothing by filter processing, converting the radiographic image into the spatial frequency domain and converting the low-frequency region of the transformed domain back into real space, or a combination of these methods.
- As a method of generating the low-frequency image via the spatial frequency domain, the radiographic image is converted into the spatial frequency domain by a Fourier transform, the high-frequency components are removed from the transformed domain (or only the low-frequency components are allowed to pass) to obtain the low-frequency region, and this region is converted back into real space by an inverse Fourier transform. With this Fourier transform / inverse Fourier transform method, the low-frequency image can be generated without changing the size of the image. The method is not limited to the Fourier transform; processing such as a wavelet transform or a Gabor filter may also be used.
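The Fourier-domain variant just described might look like the following sketch. The rectangular mask shape and the `keep` fraction are illustrative assumptions; note that the output size equals the input size, as the text states.

```python
import numpy as np

def lowpass_fourier(image, keep=0.125):
    """Fourier transform the image, pass only the low-frequency region,
    and inverse transform back to real space. Unlike reduction, the
    image size is unchanged."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    ch, cw = h // 2, w // 2                       # DC sits here after fftshift
    rh = max(1, int(h * keep / 2))
    rw = max(1, int(w * keep / 2))
    mask = np.zeros((h, w))
    mask[ch - rh:ch + rh, cw - rw:cw + rw] = 1.0  # low-frequency region only
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
```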
- the generation of the low frequency image by the low frequency image generation unit 41 corresponds to the low frequency image generation step in the present invention.
- the low frequency image generated by the low frequency image generation unit 41 is sent to the feature extraction unit 42.
- The feature extraction unit 42 performs feature extraction by obtaining, for an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference (for example, a pixel value difference or a luminance difference) between that pixel and its peripheral pixels, and generates a feature image by associating the feature amount with each pixel.
- In this embodiment, feature extraction is performed by the following method to generate the feature image. For each pixel, a pixel value gradient (a luminance gradient in the case of the monitor of the display unit 5) is obtained. To describe the calculation of the gradient specifically: with the pixel value at coordinates (x, y) denoted I(x, y), the feature amount in the x-axis direction denoted Ex(x, y), and the feature amount in the y-axis direction denoted Ey(x, y), expressions (1) and (2) are used.
- The calculation of the feature amount is not limited to the gradient calculation using equation (3) above. Any technique that extracts the signal level difference of the image as a feature amount may be used, such as edge calculation by one- or two-dimensional differential filter processing (for example Sobel, Prewitt, Laplacian, or Canny) or high-frequency component extraction by frequency processing. Furthermore, any other feature amount may be substituted, as long as it takes large values at the boundary positions of the irradiation field region.
- Since the boundary lines of the region appear in approximately four directions (up, down, left, and right), corresponding to the four sides of the rectangle around the subject at the center of the image, four feature amounts corresponding to these directions may be obtained.
- The feature amount is obtained in the same way for every pixel, and the feature image is generated by associating a feature amount with each pixel of the low-frequency image. In this feature image, large values appear at portions where the pixel value change (luminance change) is strong.
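Since expressions (1) to (3) are not reproduced in this text, the following sketch uses ordinary central differences for Ex and Ey; it illustrates the idea of a feature image whose values are large where the signal level changes strongly, not the patent's exact formulas.

```python
import numpy as np

def feature_image(low_freq_image):
    """Feature amount per pixel: gradient strength obtained from the
    signal level difference with peripheral pixels. Ex and Ey here are
    central differences; sqrt(Ex^2 + Ey^2) stands in for the strength."""
    Ey, Ex = np.gradient(low_freq_image)   # y- and x-direction differences
    return np.hypot(Ex, Ey)                # large where pixel values change strongly
```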
- the feature extraction by the feature extraction unit 42 corresponds to the feature extraction step in the present invention.
- the feature amount and feature image extracted by the feature extraction unit 42 are sent to the low-frequency feature generation unit 43.
- The low-frequency feature generation unit 43 generates a low-frequency feature, which is an image of lower frequency than the feature image, by reducing the spatial resolution of the feature image. In this embodiment, the low-frequency feature is generated by reducing the feature image.
- As with the method of generating the low-frequency image by reducing the radiographic image (the reduction processing by the low-frequency image generation unit 41), the reduction processing by the low-frequency feature generation unit 43 is not limited to taking the average value; any commonly used reduction method, such as thinning out the pixels of the feature image, may be used.
- The method of generating the low-frequency feature is likewise not limited to reduction processing. Any commonly used method may be used, such as smoothing by filter processing, converting the feature image into the spatial frequency domain and converting the low-frequency region of the transformed domain back into real space, or a combination of these methods. As with the reduction processing by the low-frequency image generation unit 41, a Fourier transform and inverse Fourier transform may be used; the method is not limited to the Fourier transform, and processing such as a wavelet transform or a Gabor filter may also be performed.
- the generation of the low frequency feature by the low frequency feature generation unit 43 corresponds to the low frequency feature generation step in the present invention.
- the low frequency feature generated by the low frequency feature generation unit 43 is sent to the region extraction unit 44.
- the region extraction unit 44 selects a feature serving as a boundary of the irradiation field region of the radiation based on the low frequency feature, and extracts the irradiation field region from the radiation image.
- Region extraction is performed by the region extraction unit 44 according to the flow of FIG. 4.
- Step S1: Binarization
- The low-frequency feature is binarized with a preset threshold value to create binary data indicating the presence or absence of a feature. For example, if the data value (here, the gradient strength) at an arbitrary point (a coordinate on the image) of the feature image data is larger than the threshold value, the feature is deemed present and "1" is substituted; if it is smaller, the feature is deemed absent and "0" is substituted.
- The threshold value is obtained in advance from an empirical rule, or from the average value of the feature image data values (here, the gradient strength). The values to be assigned are not limited to "0" and "1"; other values or symbols may be used as long as the presence or absence of the feature can be determined. The threshold value is also not limited to a fixed value, and may be a value set based on the pixel values of the radiographic image.
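Step S1 can be sketched directly as follows. The fallback to the average value as threshold is one of the options the text mentions; the function interface itself is an assumption.

```python
import numpy as np

def binarize(low_freq_feature, threshold=None):
    """Create binary data: 1 where the gradient strength exceeds the
    threshold (feature present), 0 otherwise (feature absent)."""
    if threshold is None:
        # one option from the text: use the average of the data values
        threshold = low_freq_feature.mean()
    return (low_freq_feature > threshold).astype(np.uint8)
```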
- Step S2: Straight Line Detection
- From the binary data indicating the presence or absence of features created in step S1, groups of pixels in which pixels determined to have a feature are linearly continuous are selected, and a linear equation is obtained for each; a plurality of straight lines may be detected. In this embodiment, a Hough transform is used as the method for obtaining the linear equation.
- The Hough transform represents a straight line on a two-dimensional plane (here, on the image) by the normal distance r from the origin to the straight line and the angle θ formed by the normal with the x-axis; straight lines are detected by projecting the feature points (here, the coordinates with value "1") into the Hough space and obtaining the distance r and the angle θ.
- When a straight line passing through coordinates (x, y) and perpendicular to the normal of length r0 from the origin is drawn as shown in FIG. 5(a), that straight line is projected onto Hough space at the point (θ0, r0), drawn as a black circle on the sinusoid expressed by equation (4), as shown in FIG. 5(b).
- When a straight line passing through coordinates (x, y) and parallel to the y-axis is drawn as shown in FIG. 5(c), the normal distance r of that straight line is the distance r1 from the origin, and the straight line orthogonal to this normal of length r1 is parallel to the y-axis.
- When a straight line passing through coordinates (x, y) and parallel to the x-axis is drawn as shown in FIG. 5(d), the normal distance r of that straight line is the distance r2 from the origin, and the straight line orthogonal to this normal of length r2 is parallel to the x-axis.
- When the various straight lines passing through the coordinates (x, y), including the straight line orthogonal to the normal of length r0, the straight line parallel to the y-axis, and the straight line parallel to the x-axis, are each projected onto Hough space, the projected points (θ, r) trace out the sinusoid expressed by equation (4).
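Equation (4) itself is not reproduced in this extract; the standard normal parameterization of a line used in the Hough transform, consistent with the distance r and angle θ described above, is:

```latex
r = x \cos\theta + y \sin\theta
```

For a fixed feature point (x, y), sweeping θ traces exactly the sinusoid onto which that point is projected; feature points lying on a common straight line produce sinusoids that all intersect at that line's (θ, r).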
- one coordinate is set to (x 1 , y 1 ), and the other coordinate is set to (x 2 , y 2).
- various straight lines (not shown) passing through the coordinates (x 1 , y 1 ) are respectively projected onto the Hough space
- a sine curve shown in FIG. 6B is drawn, and various straight lines passing through the coordinates (x 2 , y 2 ) are drawn.
- a sine curve shown in FIG. 6B is drawn.
- the coordinates at which the two sinusoids intersect in the Hough space are a straight line on the image (here, the two coordinates (x 1 , y 1 ) and (x 2 , y 2 ) shown in FIG. 6A). It corresponds to a straight line passing through both.
- when this Hough transform is applied to the feature points of the present embodiment (here, the coordinates whose value is "1"), the coordinates of the feature points are extracted from the binary data indicating the presence or absence of a feature binarized in step S1, the various straight lines passing through each coordinate are projected onto the Hough space, and the locations (θL, rL) where the sine curves intersect and the number of intersections are detected.
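The projection just described can be sketched as a small accumulator-voting routine. This is a minimal illustration, assuming a NumPy binary feature array; the grid resolutions (180 angle steps, 1-pixel distance bins) and the vote threshold are illustrative choices, not values from the text:

```python
import numpy as np

def hough_lines(binary, n_theta=180, min_votes=2):
    """Project each feature point (a "1" pixel) onto (theta, r) space.

    Each point (x, y) traces the sine curve r = x*cos(theta) + y*sin(theta)
    of equation (4); cells where many curves intersect mark candidate lines.
    """
    ys, xs = np.nonzero(binary)
    thetas = np.deg2rad(np.arange(n_theta))        # 0..179 degrees
    r_max = np.hypot(*binary.shape)                # largest possible |r|
    n_r = int(np.ceil(2 * r_max)) + 1
    acc = np.zeros((n_theta, n_r), dtype=int)      # vote accumulator
    for x, y in zip(xs, ys):
        r = x * np.cos(thetas) + y * np.sin(thetas)
        r_idx = np.round(r + r_max).astype(int)    # shift so indices >= 0
        acc[np.arange(n_theta), r_idx] += 1        # one vote per theta
    peaks = np.argwhere(acc >= min_votes)          # (theta_L, r_L) locations
    return acc, [(float(thetas[t]), float(i - r_max)) for t, i in peaks]
```

For two feature points on the vertical line x = 3, the accumulator peaks at θ = 0, r ≈ 3, i.e. the line through both points.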
- the method of detecting a straight line from the feature image data is not limited to the Hough transform; a least-squares method that fits a straight line to the feature points may be used,
- or the correlation with straight-line data may be compared and matching data detected as a straight line.
- when the side data of the subject is known to form an unnecessary pattern,
- the side data of the subject may be set as a template and the template compared with the feature points in the image,
- and matching data may be excluded from straight-line detection as an extraneous pattern.
- Step S3 (Straight Line Selection): of the straight lines detected in step S2, the straight line whose position and inclination match the conditions and that shows the strongest feature (for example, the straight line in FIG. 6(a) where the number of intersecting sine curves is largest) is selected.
- when a rectangular irradiation field region is extracted, the above conditions are as follows.
- as shown in FIG. 7, the straight line is located on the left side with respect to the center O of the subject in the image (the feature image in this embodiment).
- as shown in FIG. 7, the angle formed with the left image frame FL of the image (the feature image in this embodiment) is within the range of −45° to 45°.
- similar conditions corresponding to each direction apply to the right, lower, and upper sides of the irradiation field region (in FIG. 7, reference sign FR denotes the right image frame, FB the lower image frame, and FU the upper image frame).
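A sketch of the left-side condition check under the Hough parameterization r = x·cosθ + y·sinθ. The function name, and using the line's x-position at the subject's center height to decide "left of O", are illustrative assumptions, not details from the text:

```python
import numpy as np

def is_left_boundary_candidate(theta, r, center):
    """Check the two left-boundary conditions: the angle with the vertical
    left frame F_L is within -45..45 degrees, and the line lies to the left
    of the subject center O = (cx, cy)."""
    cx, cy = center
    if abs(np.rad2deg(theta)) > 45:                     # too far off F_L
        return False
    x_at_cy = (r - cy * np.sin(theta)) / np.cos(theta)  # line's x at y = cy
    return bool(x_at_cy < cx)                           # left of center O
```

The analogous checks for FR, FB, and FU would compare against the right of O or test near-horizontal angles instead.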
- Step S4 Region Division
- the straight line selected in step S3 is determined to be a boundary line of the irradiation field region, the size is restored to that of the radiographic image, and the selected straight line is drawn on the radiographic image. The region of the radiographic image is then divided at the position of the selected straight line.
- Step S5 (All Detected?): it is determined whether all boundary lines (straight lines) of the irradiation field region have been detected. If not, the process returns to the straight-line detection of step S2 and the same processing is performed for the remaining boundary lines (straight lines). When all boundary lines (straight lines) have been detected, the region extraction processing by the region extraction unit 44 ends.
- the region extraction (steps S1 to S5) by the region extraction unit 44 corresponds to the region extraction step in this invention.
- the irradiation field region extracted by the region extraction unit 44 is sent, together with the radiographic image, to the display unit 5, and the display unit 5 displays and outputs the irradiation field region together with the radiographic image.
- the irradiation field region together with the radiographic image may also be written to and stored in a storage medium represented by a RAM (Random-Access Memory) and read out as necessary, or printed out by printing means represented by a printer or the like.
- as forms of display output, a line representing the boundary of the irradiation field is drawn on the radiographic image, or the pixel values outside the irradiation field (luminance values, when displayed on the monitor) are set to "0" for output display.
- the low-frequency image generation unit 41 generates a low-frequency image that is a lower-frequency image than the radiographic image by reducing the spatial resolution of the radiographic image.
- the feature extraction unit 42 performs feature extraction by obtaining, at an arbitrary pixel of the above-described low-frequency image, a feature amount based on the signal level difference (pixel value difference or luminance difference in the present embodiment) between that pixel and its peripheral pixels.
- a feature image is generated by associating the above-described feature amount with each pixel of the low-frequency image.
- the low-frequency feature generation unit 43 generates a low-frequency feature that is an image having a frequency lower than that of the feature image by reducing the spatial resolution of the feature image.
- the region extraction unit 44 selects a feature that forms the boundary of the irradiation field region of the radiation based on the above-described low-frequency feature, and extracts the irradiation field region from the radiographic image. In this way, the spatial resolution is lowered a total of two times, by the low-frequency image generation unit 41 and the low-frequency feature generation unit 43, which reduces the influence of noise and the like and also reduces the amount of calculation.
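The two-stage flow above can be sketched end to end. This is a minimal illustration assuming NumPy arrays, block averaging for the resolution reduction, and reduction factors (4 and 2) that are illustrative choices; the x-direction difference Ex is assumed to mirror equation (2) for Ey, since equation (1) is not reproduced in this text:

```python
import numpy as np

def block_reduce(img, k):
    """Lower spatial resolution by k x k block averaging (reduction)."""
    h, w = (img.shape[0] // k) * k, (img.shape[1] // k) * k
    return img[:h, :w].reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def low_frequency_feature(radiograph, k1=4, k2=2):
    low = block_reduce(radiograph, k1)          # unit 41: low-frequency image
    gx = np.zeros_like(low)
    gy = np.zeros_like(low)
    gx[:, 1:-1] = low[:, 2:] - low[:, :-2]      # assumed Ex, mirror of eq. (2)
    gy[1:-1, :] = low[2:, :] - low[:-2, :]      # Ey, eq. (2)
    feature = np.sqrt(gx**2 + gy**2)            # unit 42: P(x, y), eq. (3)
    return block_reduce(feature, k2)            # unit 43: low-frequency feature
```

A sharp vertical field edge in a 16×16 image survives both reductions as a band of non-zero gradient strength, which is what the region extraction unit 44 then thresholds and feeds to the Hough transform.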
- the low frequency image generation unit 41 generates a low frequency image by reducing the radiation image as described above.
- by reducing the radiographic image in this way, its spatial resolution is lowered and a low-frequency image is generated.
- alternatively, the low-frequency image generation unit 41 may convert the radiographic image into the spatial frequency domain and convert the low-frequency region of the converted spatial frequency domain back into real space, thereby lowering the spatial resolution of the radiographic image and generating the low-frequency image.
- the low-frequency feature generation unit 43 generates the low-frequency feature by reducing the feature image as described above; this lowers the spatial resolution of the feature image. Alternatively, the low-frequency feature generation unit 43 may convert the feature image into the spatial frequency domain and convert the low-frequency region of the converted spatial frequency domain back into real space, thereby lowering the spatial resolution of the feature image and generating the low-frequency feature.
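The frequency-domain variant (transform, keep only the low-frequency region, convert back to real space) can be sketched with a 2-D FFT; the rectangular mask and the kept fraction are illustrative assumptions, not details from the text:

```python
import numpy as np

def fft_lowpass(img, keep=0.25):
    """Convert the image to the spatial frequency domain, keep only the
    low-frequency region, and convert it back to real space."""
    F = np.fft.fftshift(np.fft.fft2(img))       # low frequencies at center
    h, w = img.shape
    cy, cx = h // 2, w // 2
    ry, rx = max(1, int(h * keep / 2)), max(1, int(w * keep / 2))
    mask = np.zeros((h, w))
    mask[cy - ry:cy + ry + 1, cx - rx:cx + rx + 1] = 1.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```

Unlike block reduction, this keeps the original array size while discarding high-frequency content, so it can serve either unit 41 or unit 43.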
- the feature extraction unit 42 performs the above feature extraction by obtaining the gradient strength P(x, y) based on the signal level difference (pixel value difference or luminance difference in the present embodiment) between the pixel and its peripheral pixels.
- the region extraction unit 44 binarizes the low-frequency feature using a preset threshold value, creates binary data indicating the presence or absence of a feature, and selects the feature that forms the boundary of the irradiation field based on the binary data. Creating binary data in this way makes it possible to further remove extraneous patterns (here, the coordinates whose value is "0").
- the region extraction unit 44 uses the Hough transform, which converts coordinates on a two-dimensional plane into a space consisting of the distance r of the normal from a reference origin to a straight line on the plane and the angle θ formed by that normal and a reference axis (here, the x-axis), to project the low-frequency feature onto the space consisting of the distance r and the angle θ and obtain a plurality of sinusoids, and detects straight lines that are candidates for the boundary of the irradiation field region based on the locations where the sinusoids intersect each other and the number of intersections.
- the present invention is not limited to the above embodiment, and can be modified as follows.
- although X-rays were taken as an example of the radiation, the invention may also be applied to radiation other than X-rays (for example, γ-rays).
- the radiographic image processing apparatus of the embodiment has a structure in which imaging is performed by placing the subject on the top plate 1 as shown in FIG., but the invention is not limited to this.
- it may have a structure in which the object (here, the object to be examined serves as the subject) is transported on a belt and photographed, as in a non-destructive inspection apparatus used for industrial purposes,
- or a structure such as an X-ray CT apparatus.
- the apparatus that performs imaging may be configured as a separate external apparatus, and the radiographic image processing apparatus may simply comprise the low-frequency image generation unit, feature extraction unit, low-frequency feature generation unit, and region extraction unit.
- the flat panel radiation detector (FPD) is used as the image sensor, but the image sensor is not limited to the FPD.
- the present invention can be applied to any commonly used image sensor, as exemplified by an image intensifier, and is particularly useful when a digital image sensor such as an FPD is used.
- in the embodiment described above, the signal level is the pixel value or the luminance, as exemplified by the value of the electrical signal output from the FPD 3; these may also be used in combination.
- in the embodiment, the irradiation field region to be extracted is a rectangle, but it is not limited to rectangles typified by squares and oblongs; it may be an irradiation field region of a shape such as a rhombus, a parallelogram, or a trapezoid, and an irradiation field region of such a shape can also be extracted with high accuracy. It may further be a polygon such as an octagon, or an irradiation field region such as a circle or an ellipse; the shape of the irradiation field region is not particularly limited as long as it is a closed shape.
- in the case of a circular or elliptical irradiation field region, the boundary line may be detected using the Hough transform for a circle, a method of matching with a circular template, a method of comparing the correlation with circular data, least squares using an equation representing a circle, or the like.
- the low-frequency feature is binarized to create binary data.
- however, the present invention is not limited to binarization. For example, using two thresholds, if the value is higher than the higher threshold, the point is regarded as having a strong feature and "2" is substituted; if it is lower than the higher threshold but higher than the lower threshold, it is regarded as having a weak feature and "1" is substituted; and if it is lower than the lower threshold, it is regarded as having no feature and "0" is substituted, creating ternary data.
- data sorted into three or more values may be created in this way.
- when a straight line is detected using the intersections of the sinusoids, the ternary data may be used so that, for example, a weight of "2" is applied where the feature is strongest (higher than the higher threshold) and a weight of "1" where the feature is weak (lower than the higher threshold but higher than the lower threshold), so that straight lines are detected more accurately.
- alternatively, a straight line may be detected by setting, in the Hough space, a value proportional to the feature value; in this case, the binarization in step S1 of FIG. 4 is unnecessary.
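The weighted voting described here (votes of "2" for strong features, "1" for weak ones) is a small change to the Hough accumulation; a sketch assuming a NumPy ternary array with values 0, 1, and 2:

```python
import numpy as np

def weighted_hough(levels, n_theta=180):
    """Hough accumulation where each feature point votes with its level:
    strong features ("2") count twice as much as weak ones ("1")."""
    ys, xs = np.nonzero(levels)
    weights = levels[ys, xs]                     # 1 or 2 from the ternary data
    thetas = np.deg2rad(np.arange(n_theta))
    r_max = np.hypot(*levels.shape)
    acc = np.zeros((n_theta, int(np.ceil(2 * r_max)) + 1))
    for x, y, w in zip(xs, ys, weights):
        r = x * np.cos(thetas) + y * np.sin(thetas)
        acc[np.arange(n_theta), np.round(r + r_max).astype(int)] += w
    return acc
```

Setting the vote to a value proportional to the raw feature amount, as the text also suggests, only changes what `weights` holds; the accumulation itself is unchanged.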
Abstract
Description
That is, the radiographic image processing apparatus of this invention is a radiographic image processing apparatus for processing a radiographic image obtained by radiography, comprising: low-frequency image generation means for lowering the spatial resolution of the radiographic image to generate a low-frequency image, which is an image of lower frequency than the radiographic image; feature extraction means for performing feature extraction by obtaining, at an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and generating a feature image by associating the feature amount with each pixel of the low-frequency image; low-frequency feature generation means for lowering the spatial resolution of the feature image to generate a low-frequency feature, which is an image of lower frequency than the feature image; and region extraction means for selecting, based on the low-frequency feature, a feature that forms a boundary of the irradiation field region of the radiation, and extracting the irradiation field region from the radiographic image.
41 … low-frequency image generation unit
42 … feature extraction unit
43 … low-frequency feature generation unit
44 … region extraction unit
P(x, y) … gradient strength
r … distance
θ … angle
Ey(x, y) = I(x, y+1) − I(x, y−1) …(2)
Let the gradient strength be P(x, y). Since the signs of the feature amounts Ex(x, y) and Ey(x, y) can be mutually opposite and cancel each other when the gradient strength P(x, y) is obtained, the gradient strength P(x, y) is calculated by taking the sum of squares as in equation (3) below.
Note that the method of obtaining the pixel gradient of the image (the luminance gradient in the case of the monitor of the display unit 5) is not limited to the method using equation (3) above; other known methods may also be used.
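Equations (2) and (3) translate directly into code. A minimal sketch assuming the image is a NumPy array indexed as I[x, y]; the x-direction difference Ex is assumed to mirror equation (2), since equation (1) is not reproduced in this text:

```python
import numpy as np

def gradient_strength(I):
    """P(x, y) per equation (3): the square root of the sum of squares, so
    opposite-signed Ex and Ey cannot cancel (a plain sum Ex + Ey could)."""
    I = I.astype(float)
    Ex = np.zeros_like(I)
    Ey = np.zeros_like(I)
    Ex[1:-1, :] = I[2:, :] - I[:-2, :]     # assumed eq. (1): I(x+1,y)-I(x-1,y)
    Ey[:, 1:-1] = I[:, 2:] - I[:, :-2]     # eq. (2): I(x,y+1)-I(x,y-1)
    return np.sqrt(Ex**2 + Ey**2)
```

On a linear ramp along y, Ey is a constant 2 in the interior while Ex is 0, so P(x, y) = 2 there regardless of sign conventions.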
The low-frequency feature is binarized using a preset threshold value to create binary data indicating the presence or absence of a feature. For example, when the data value (here, the gradient strength) at an arbitrary point (coordinates on a pixel) of the feature image data is larger than the threshold, the point is regarded as having a feature and "1" is substituted; when it is smaller than the threshold, the point is regarded as having no feature and "0" is substituted. The threshold is determined in advance empirically or from the average of the feature image data values (here, the gradient strength).
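The thresholding just described can be sketched in a few lines; the fallback to the mean follows the text's suggestion, while the function name is an assumption:

```python
import numpy as np

def binarize_features(P, thresh=None):
    """Binary data indicating feature presence: "1" where the data value
    (here the gradient strength) exceeds the threshold, "0" otherwise.
    With no preset threshold, the average data value is used."""
    if thresh is None:
        thresh = P.mean()
    return (P > thresh).astype(int)
```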
From the binary data indicating the presence or absence of features binarized in step S1, groups of pixels judged to have features that are linearly continuous are selected, and a straight-line equation is obtained. A plurality of straight lines may be detected.
As is clear from equation (4) above, the distance r is expressed on the Hough space as a sine curve of the angle θ.
From among the straight lines detected in step S2, the straight line whose position and inclination match the conditions and that shows the strongest feature (for example, the straight line in FIG. 6(a) where the number of intersecting sine curves described above is largest) is selected. When a rectangular irradiation field region is extracted, the above conditions are as follows.
- As shown in FIG. 7, the straight line is located on the left side with respect to the center O of the subject in the image (the feature image in this embodiment).
- As shown in FIG. 7, the angle formed with the left image frame FL of the image (the feature image in this embodiment) is within the range of −45° to 45°.
Similar conditions corresponding to each direction also apply to the right, lower, and upper sides of the irradiation field region (in FIG. 7, reference sign FR denotes the right image frame, FB the lower image frame, and FU the upper image frame).
The straight line selected in step S3 is determined to be a boundary line of the irradiation field region, the size is restored to that of the radiographic image, and the selected straight line is drawn on the radiographic image. The region of the radiographic image is then divided at the position of the selected straight line.
It is determined whether all boundary lines (straight lines) of the irradiation field region have been detected. If not, the process returns to the straight-line detection of step S2 and the same processing is performed for the remaining boundary lines (straight lines). When all boundary lines (straight lines) have been detected, the region extraction processing by the region extraction unit 44 ends. The region extraction by the region extraction unit 44 (steps S1 to S5) corresponds to the region extraction step of this invention.
Claims (11)
- 1. A radiographic image processing apparatus for processing a radiographic image obtained by radiography, comprising:
low-frequency image generation means for lowering the spatial resolution of the radiographic image to generate a low-frequency image, which is an image of lower frequency than the radiographic image;
feature extraction means for performing feature extraction by obtaining, at an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and generating a feature image by associating the feature amount with each pixel of the low-frequency image;
low-frequency feature generation means for lowering the spatial resolution of the feature image to generate a low-frequency feature, which is an image of lower frequency than the feature image; and
region extraction means for selecting, based on the low-frequency feature, a feature that forms a boundary of the irradiation field region of the radiation, and extracting the irradiation field region from the radiographic image.
- 2. The radiographic image processing apparatus according to claim 1, wherein the low-frequency image generation means generates the low-frequency image by reducing the radiographic image.
- 3. The radiographic image processing apparatus according to claim 1, wherein the low-frequency image generation means generates the low-frequency image by converting the radiographic image into the spatial frequency domain and converting the low-frequency domain of the converted spatial frequency domain into real space.
- 4. The radiographic image processing apparatus according to claim 1, wherein the low-frequency image generation means generates the low-frequency image by smoothing the radiographic image by filtering.
- 5. The radiographic image processing apparatus according to any one of claims 1 to 4, wherein the low-frequency feature generation means generates the low-frequency feature by reducing the feature image.
- 6. The radiographic image processing apparatus according to any one of claims 1 to 4, wherein the low-frequency feature generation means generates the low-frequency feature by converting the feature image into the spatial frequency domain and converting the low-frequency domain of the converted spatial frequency domain into real space.
- 7. The radiographic image processing apparatus according to any one of claims 1 to 4, wherein the low-frequency feature generation means generates the low-frequency feature by smoothing the feature image by filtering.
- 8. The radiographic image processing apparatus according to any one of claims 1 to 7, wherein the feature extraction means performs the feature extraction by obtaining a gradient strength based on the signal level difference between the pixel and its peripheral pixels.
- 9. The radiographic image processing apparatus according to any one of claims 1 to 8, wherein the region extraction means binarizes the low-frequency feature using a preset threshold value to create binary data indicating the presence or absence of a feature, and selects the feature that forms the boundary of the irradiation field region based on the binary data.
- 10. The radiographic image processing apparatus according to any one of claims 1 to 9, wherein the region extraction means uses a Hough transform, which converts coordinates on a two-dimensional plane into a space consisting of the distance of the normal from a reference origin to a straight line on the two-dimensional plane and the angle formed by that normal and a reference axis, to project the low-frequency feature onto the space consisting of the distance and the angle and obtain a plurality of sine curves, and detects straight lines that are candidates for the boundary of the irradiation field region based on the locations where the sine curves intersect each other and the number of intersections.
- 11. A radiographic image processing program for causing a computer to execute processing of a radiographic image obtained by radiography, the program comprising: a low-frequency image generation step of lowering the spatial resolution of the radiographic image to generate a low-frequency image, which is an image of lower frequency than the radiographic image; a feature extraction step of performing feature extraction by obtaining, at an arbitrary pixel of the low-frequency image, a feature amount based on the signal level difference between that pixel and its peripheral pixels, and generating a feature image by associating the feature amount with each pixel of the low-frequency image; a low-frequency feature generation step of lowering the spatial resolution of the feature image to generate a low-frequency feature, which is an image of lower frequency than the feature image; and a region extraction step of selecting, based on the low-frequency feature, a feature that forms a boundary of the irradiation field region of the radiation, and extracting the irradiation field region from the radiographic image, the processing in these steps being executed by a computer.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/007368 WO2011080808A1 (ja) | 2009-12-29 | 2009-12-29 | 放射線画像処理装置および放射線画像処理プログラム |
JP2011547117A JP5333607B2 (ja) | 2009-12-29 | 2009-12-29 | 放射線画像処理装置および放射線画像処理プログラム |
US13/519,773 US8831325B2 (en) | 2009-12-29 | 2009-12-29 | Radiographic image processing apparatus and radiographic image processing program |
CN200980163236.4A CN102711611B (zh) | 2009-12-29 | 2009-12-29 | 放射线图像处理装置以及放射线图像处理方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2009/007368 WO2011080808A1 (ja) | 2009-12-29 | 2009-12-29 | 放射線画像処理装置および放射線画像処理プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011080808A1 true WO2011080808A1 (ja) | 2011-07-07 |
Family
ID=44226243
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/007368 WO2011080808A1 (ja) | 2009-12-29 | 2009-12-29 | 放射線画像処理装置および放射線画像処理プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US8831325B2 (ja) |
JP (1) | JP5333607B2 (ja) |
CN (1) | CN102711611B (ja) |
WO (1) | WO2011080808A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6326812B2 (ja) * | 2013-12-26 | 2018-05-23 | コニカミノルタ株式会社 | 画像処理装置及び照射野認識方法 |
JP7134017B2 (ja) * | 2018-08-14 | 2022-09-09 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
WO2020129384A1 (ja) * | 2018-12-21 | 2020-06-25 | 株式会社島津製作所 | 放射線画像処理装置、放射線画像処理方法及び放射線画像処理プログラム |
US10955241B2 (en) * | 2019-06-26 | 2021-03-23 | Aurora Flight Sciences Corporation | Aircraft imaging system using projected patterns on featureless surfaces |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10286249A (ja) * | 1997-04-14 | 1998-10-27 | Fuji Photo Film Co Ltd | 放射線画像の照射野認識装置 |
JP2005218581A (ja) * | 2004-02-04 | 2005-08-18 | Canon Inc | 画像処理装置及びその制御方法、プログラム |
JP2006181362A (ja) * | 2004-12-24 | 2006-07-13 | General Electric Co <Ge> | ディジタル画像放射線撮像においてコリメーション・エッジを検出するシステム、方法及び装置 |
WO2007013300A1 (ja) * | 2005-07-27 | 2007-02-01 | Konica Minolta Medical & Graphic, Inc. | 異常陰影候補検出方法及び異常陰影候補検出装置 |
JP2007041664A (ja) * | 2005-08-01 | 2007-02-15 | Olympus Corp | 領域抽出装置および領域抽出プログラム |
JP2007202811A (ja) * | 2006-02-02 | 2007-08-16 | Fujifilm Corp | 照射野認識装置、照射野認識方法およびそのプログラム |
WO2009044452A1 (ja) * | 2007-10-02 | 2009-04-09 | Shimadzu Corporation | 放射線画像処理装置および放射線画像処理プログラム |
JP4280729B2 (ja) * | 2005-05-31 | 2009-06-17 | キヤノン株式会社 | 照射野領域抽出方法及び放射線撮影装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3225969B2 (ja) | 1991-03-11 | 2001-11-05 | 株式会社安川電機 | リクレーマの制御装置 |
JP3617751B2 (ja) * | 1997-03-12 | 2005-02-09 | 富士写真フイルム株式会社 | 放射線画像の照射野認識方法および装置 |
US6901158B2 (en) * | 1997-09-22 | 2005-05-31 | Canon Kabushiki Kaisha | Image discrimination apparatus and image discrimination method |
US7050648B1 (en) * | 1998-09-18 | 2006-05-23 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and recording medium |
JP3631095B2 (ja) * | 2000-04-17 | 2005-03-23 | キヤノン株式会社 | 照射野領域抽出装置、放射線撮影装置、放射線画像用システム、照射野領域抽出方法、及びコンピュータ可読記憶媒体 |
US7359541B2 (en) | 2000-04-28 | 2008-04-15 | Konica Corporation | Radiation image processing apparatus |
JP2001331800A (ja) | 2000-05-19 | 2001-11-30 | Konica Corp | 特徴抽出方法および被写体認識方法ならびに画像処理装置 |
JP3619158B2 (ja) * | 2001-02-13 | 2005-02-09 | キヤノン株式会社 | 画像処理装置、画像処理システム、画像処理方法、画像処理方法プログラム及び記録媒体 |
JP2004030596A (ja) * | 2002-05-10 | 2004-01-29 | Canon Inc | 画像階調変換方法、画像階調変換装置、システム、プログラム及び記憶媒体 |
JP4538260B2 (ja) * | 2004-04-15 | 2010-09-08 | 株式会社日立メディコ | 特定領域抽出方法及び装置 |
US7801344B2 (en) * | 2006-12-01 | 2010-09-21 | Carestream Health, Inc. | Edge boundary definition for radiographic detector |
JP4586052B2 (ja) * | 2007-08-08 | 2010-11-24 | キヤノン株式会社 | 画像処理装置及びその制御方法 |
JP5804340B2 (ja) * | 2010-06-10 | 2015-11-04 | 株式会社島津製作所 | 放射線画像領域抽出装置、放射線画像領域抽出プログラム、放射線撮影装置および放射線画像領域抽出方法 |
-
2009
- 2009-12-29 CN CN200980163236.4A patent/CN102711611B/zh active Active
- 2009-12-29 US US13/519,773 patent/US8831325B2/en active Active
- 2009-12-29 JP JP2011547117A patent/JP5333607B2/ja active Active
- 2009-12-29 WO PCT/JP2009/007368 patent/WO2011080808A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US8831325B2 (en) | 2014-09-09 |
US20120288179A1 (en) | 2012-11-15 |
CN102711611A (zh) | 2012-10-03 |
JP5333607B2 (ja) | 2013-11-06 |
CN102711611B (zh) | 2015-07-15 |
JPWO2011080808A1 (ja) | 2013-05-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4154374B2 (ja) | パターンマッチング装置及びそれを用いた走査型電子顕微鏡 | |
JP5804340B2 (ja) | 放射線画像領域抽出装置、放射線画像領域抽出プログラム、放射線撮影装置および放射線画像領域抽出方法 | |
JP2000287956A (ja) | X線画像での輪郭検出方法 | |
US9619893B2 (en) | Body motion detection device and method | |
WO2016174926A1 (ja) | 画像処理装置及び画像処理方法及びプログラム | |
JP6055228B2 (ja) | 形状計測装置 | |
US7256392B2 (en) | Inspection method of radiation imaging system and medical image processing apparatus using the same, and phantom for use of inspection of radiation imaging system | |
JP5333607B2 (ja) | 放射線画像処理装置および放射線画像処理プログラム | |
TWI512284B (zh) | 玻璃氣泡瑕疵檢測系統 | |
JP5526775B2 (ja) | 放射線撮像装置 | |
US10282826B2 (en) | Despeckling method for radiographic images | |
CN109870730B (zh) | 一种用于x光机图像解析度测试体定检的方法及系统 | |
JP4353479B2 (ja) | ムラ検査装置、ムラ検査方法、および、濃淡ムラをコンピュータに検査させるプログラム | |
JP4801697B2 (ja) | 画像形成方法,画像形成装置、及びコンピュータプログラム | |
JP2008249413A (ja) | 欠陥検出方法および装置 | |
JP4981433B2 (ja) | 検査装置、検査方法、検査プログラムおよび検査システム | |
JP2007271434A (ja) | 検査装置、検査方法、検査プログラムおよび検査システム | |
JP2016217989A (ja) | 欠陥検査装置および欠陥検査方法 | |
JP2019120644A (ja) | 表面検査装置、及び表面検査方法 | |
JP2006293522A (ja) | 直線検出装置、直線検出方法およびそのプログラム | |
JP5231779B2 (ja) | 外観検査装置 | |
He et al. | Computer Assisted Image Analysis for Objective Determination of Scanning Resolution for Photographic Collections–An Automated Approach | |
CN102735169B (zh) | 一种计算剂量曲线带宽度的系统和方法 | |
JP5851269B2 (ja) | 欠陥検査システム及び欠陥検査方法 | |
JP2013160710A (ja) | 放射線撮影装置および投影画像処理方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980163236.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09852786 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011547117 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13519773 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09852786 Country of ref document: EP Kind code of ref document: A1 |