WO2010024331A1 - Image processing apparatus and image processing method - Google Patents
- Publication number: WO2010024331A1 (PCT/JP2009/064958)
- Authority: WO (WIPO/PCT)
- Prior art keywords: parameter, lesion candidate, image, image processing, pixel
Classifications
- A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B 5/1075: Measuring physical dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
- A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
- A61B 6/032: Transmission computed tomography [CT]
- A61B 6/566: Details of data transmission or power supply involving communication between diagnostic systems
- A61B 5/055: Measuring for diagnosis involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
- G06T 15/08: Volume rendering
- G06T 3/067: Reshaping or unfolding 3D tree structures onto 2D planes
- G06T 7/0012: Biomedical image inspection
- G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
- G16H 30/40: ICT specially adapted for processing medical images, e.g. editing
- G06T 2200/24: Indexing scheme involving graphical user interfaces [GUIs]
- G06T 2207/10081: Computed x-ray tomography [CT]
- G06T 2207/20068: Projection on vertical or horizontal image axis
- G06T 2207/30028: Colon; small intestine
- G06T 2207/30032: Colon polyp
- G06T 2207/30096: Tumor; lesion
- G06T 2210/41: Medical
- G06T 2215/06: Curved planar reformation of 3D line structures
Definitions
- The present invention relates to an image processing apparatus that processes images used for medical diagnosis.
- In computer-aided detection (CAD), a curvature value typified by the Shape Index is calculated as a feature amount representing shape characteristics, and abnormal-shadow candidate regions are narrowed down from the shape of the curved surface representing the density distribution of an image.
- In Patent Document 2, as a CAD user interface, features indicating abnormalities in a scanned image are highlighted and displayed so that they can be compared with the original scanned image, improving operator convenience.
- In Patent Document 3, as an image display method for efficiently diagnosing the inside of a luminal organ such as the large intestine, a method has been developed that generates an image in which the inside of the organ is displayed unfolded around the core line of the luminal organ (hereinafter referred to as a developed image).
- The developed image has the advantage that a doctor or the like can easily find lesion candidates, because the entire inner surface of the hollow organ can be viewed at once.
- In Patent Document 4, a technique has been developed for generating a virtual endoscopic image from volume image data obtained by stacking a plurality of tomographic images acquired by the above-described X-ray CT apparatus or the like.
- A virtual endoscopic image is a display method in which a virtual ray is cast from a virtual viewpoint provided inside the luminal organ, the voxels on the line of sight having values equal to or higher than a predetermined threshold are extracted and projected onto a projection plane, so that the inside of the organ can be observed as in an image obtained by an endoscope (Patent Document 4).
- However, when a lesion candidate region is extracted based on curvature values as in conventional CAD, structures that are not lesions, such as folds on the organ surface, may be detected, or the lesion may be too small (in polyp diameter, etc.) for the diagnostic purpose.
- Moreover, the size of the lesion (polyp diameter, etc.) that CAD should detect differs depending on the diagnostic purpose, such as early detection of lesions or detection of advanced lesions.
- Various algorithms for extracting lesion candidates have been developed according to the characteristics of the lesion tissue, polyps, and so on, but because each is specialized for its own target, they lack versatility.
- The present invention has been made in view of the above problems, and its object is to provide an image processing apparatus and the like that allows an operator to easily change the detection target according to the diagnostic purpose and detect lesion regions.
- The first invention is an image processing apparatus for detecting a lesion candidate region from a medical image, comprising: parameter setting means for setting a parameter used for detection of the lesion candidate region; and lesion candidate region detecting means that evaluates the medical image using the parameter set by the parameter setting means and detects a lesion candidate region based on the evaluation result.
- Desirably, a data table in which the value of the parameter is predetermined for each mode is provided, and the parameter setting means includes first input means for reading the parameter value corresponding to the selected mode from the data table and inputting it.
- Desirably, the apparatus further comprises second input means for inputting the parameter value as a numerical value, and the parameter setting means sets the numerical value input by the second input means as the parameter value.
- Desirably, the parameter setting means includes parameter input means for inputting a first parameter, and second parameter calculation means for calculating a second parameter from the first parameter input by the parameter input means.
- Desirably, the lesion candidate detection means includes: lesion candidate region extracting means that calculates, for the medical image, a feature amount representing the shape of the organ surface using the second parameter calculated by the second parameter calculation means, and extracts a lesion candidate region based on the calculated feature amount; and false positive deletion means that, for the lesion candidate regions extracted by the lesion candidate region extracting means, evaluates a predetermined feature amount to determine false positive regions and deletes the lesion candidate regions determined to be false positives.
- Desirably, the second parameter is the distance between differential reference points used when calculating a curvature value as a feature amount representing the shape of the organ surface.
- Desirably, the parameter setting means includes parameter input means for inputting a first parameter, and third parameter calculation means for calculating a third parameter from the first parameter input by the parameter input means.
- Desirably, the lesion candidate detection means includes: lesion candidate region extracting means that calculates, for the medical image, a feature amount representing the shape of the organ surface and extracts a lesion candidate region based on the calculated feature amount; and false positive deletion means that, for the lesion candidate regions extracted by the lesion candidate region extracting means, evaluates a predetermined feature amount using the third parameter calculated by the third parameter calculation means to determine false positive regions, and deletes the lesion candidate regions determined to be false positives.
- Desirably, the third parameter includes at least one of a value indicating the size of the lesion candidate region and a value indicating the shape of the lesion candidate region.
- Desirably, the image processing apparatus further includes parameter correction means that corrects the parameter set by the parameter setting means according to the distortion of the medical image, and the lesion candidate region detecting means evaluates the medical image using the parameter corrected by the parameter correction means and detects a lesion candidate region based on the evaluation result.
- Desirably, the medical image is a developed image in which the inner surface of the organ is displayed unfolded around the core line of the luminal organ.
- Desirably, the medical image is a virtual endoscopic image obtained by projecting the inside of the organ onto a predetermined projection plane from a virtual viewpoint provided inside the luminal organ.
- The second invention is an image processing method for detecting a lesion candidate region from a medical image, comprising: a parameter setting step of setting a parameter used for detection of the lesion candidate region; and a lesion candidate region detecting step of evaluating the medical image using the parameter set in the parameter setting step and detecting a lesion candidate region based on the evaluation result.
- According to the present invention, an image processing apparatus and the like can be provided that allow an operator to easily change the detection target according to the diagnostic purpose and detect lesion regions.
- Hardware configuration diagram showing the overall configuration of the image processing system 1
- Flowchart showing the overall flow of image processing in the image processing system 1
- Flowchart explaining the flow of lesion candidate detection processing executed by the medical image processing apparatus 100
- Data configuration diagram of the main memory 102 (first embodiment)
- Diagram explaining the Shape Index
- Diagram explaining the distance between differential reference points
- Diagram explaining a developed image and the orientation of pixels on the developed image
- FIG. 1 is a hardware configuration diagram showing the overall configuration of the image processing system 1.
- As shown in FIG. 1, the image processing system 1 includes a medical image processing apparatus 100 having a display device 107 and an input device 109, and an image database 111 connected to the medical image processing apparatus 100 via a network 110.
- The medical image processing apparatus 100 is a computer for image diagnosis installed in a hospital or the like, and functions as a computer-aided detection (CAD) apparatus that analyzes medical images, detects lesion candidates from their shadows, and presents them to the doctor.
- The medical image processing apparatus 100 includes a CPU (Central Processing Unit) 101, a main memory 102, a storage device 103, a communication interface (communication I/F) 104, a display memory 105, and an interface (I/F) 106 to external devices such as a mouse 108, and these units are connected via a bus.
- The CPU 101 loads a program stored in the main memory 102 or the storage device 103 into the work memory area in the RAM of the main memory 102 and executes it, driving and controlling each unit connected via the bus to realize the various processes performed by the medical image processing apparatus 100.
- the CPU 101 executes processing to be described later regarding lesion candidate detection (see FIGS. 2 and 3).
- The main memory 102 is composed of ROM (Read Only Memory), RAM (Random Access Memory), and the like.
- the ROM permanently holds a computer boot program, a BIOS program, data, and the like.
- the RAM temporarily holds programs, data, and the like loaded from the ROM, the storage device 103, and the like, and includes a work area that the CPU 101 uses for performing various processes.
- The storage device 103 is a device that reads and writes data to and from an HDD (hard disk drive) or other recording medium, and stores programs executed by the CPU 101, data necessary for program execution, an OS (operating system), and the like.
- The stored programs include a control program corresponding to the OS and application programs. Each program code is read by the CPU 101 as necessary, transferred to the RAM of the main memory 102, and executed to realize various means.
- the communication I / F 104 includes a communication control device, a communication port, and the like, and mediates communication between the medical image processing apparatus 100 and the network 110.
- the communication I / F 104 controls communication with the image database 111, another computer, an X-ray CT apparatus, an MRI apparatus, or the like via the network 110.
- the I / F 106 is a port for connecting a peripheral device, and transmits / receives data to / from the peripheral device.
- an input device such as a mouse 108 may be connected via the I / F 106.
- the display memory 105 is a buffer that temporarily stores display data input from the CPU 101.
- the accumulated display data is output to the display device 107 at a predetermined timing.
- the display device 107 includes a display device such as a liquid crystal panel and a CRT monitor, and a logic circuit for executing display processing in cooperation with the display device, and is connected to the CPU 101 via the display memory 105.
- the display device 107 displays the display data stored in the display memory 105 under the control of the CPU 101 on the display device.
- the input device 109 is an input device such as a keyboard, for example, and outputs various instructions and information input by the operator to the CPU 101.
- the operator interactively operates the medical image processing apparatus 100 using external devices such as the display device 107, the input device 109, and the mouse 108.
- The network 110 includes various communication networks such as a LAN (Local Area Network), a WAN (Wide Area Network), an intranet, and the Internet, and mediates connections between the medical image processing apparatus 100 and the image database 111, servers, and other information devices.
- The image database 111 stores medical images taken by apparatuses that capture images used for medical diagnosis, such as X-ray CT apparatuses and MRI apparatuses, and is provided in a server at a hospital, medical center, or the like.
- In the present embodiment, the image database 111 is connected to the medical image processing apparatus 100 via the network 110, but it may instead be provided, for example, in the storage device 103 within the medical image processing apparatus 100.
- The medical images handled by the image processing system 1 of the present invention include tomographic images of a subject, developed images of luminal organs, and virtual endoscopic images.
- The developed image displays the inside of the organ unfolded around the core line (path line) of the luminal organ (see FIG. 6), and the virtual endoscopic image displays the inside of the luminal organ from a virtual viewpoint provided inside the organ by a display method based on the central projection method (see FIG. 20(b)).
- FIG. 2 is a flowchart showing the overall flow of image processing in the image processing system 1.
- FIG. 3 is a flowchart for explaining a flow of processing relating to lesion candidate detection executed by the medical image processing apparatus 100.
- FIG. 4 is a diagram showing data held in the RAM of the main memory 102 when image processing and lesion candidate detection processing are executed.
- FIG. 5 is a diagram illustrating an example of the data table 2 in which the value of the parameter P1 is set in advance for each mode according to the present embodiment.
- FIG. 6 is a display example of the developed image 71 and the parameter setting window 72.
- FIG. 7 is a diagram for explaining the Shape Index.
- FIG. 8 is a diagram for explaining the distance between the differential reference points.
- the CPU 101 of the medical image processing apparatus 100 reads a program and data related to image processing and lesion candidate detection processing from the main memory 102, and executes image processing and lesion candidate detection processing based on the program and data.
- It is assumed that, before the following processing starts, the image data has been fetched from the image database 111 or the like via the network 110 and the communication I/F 104 and stored in the storage device 103 of the medical image processing apparatus 100.
- the CPU 101 of the medical image processing apparatus 100 performs a process of reading image data.
- The CPU 101 displays on the display device 107 an image selection window showing a plurality of selectable images as a list or as thumbnails, and accepts the selection of an image from the operator.
- the CPU 101 reads the selected image data from the storage device 103 and stores it in the main memory 102 (step S101, 102a in FIG. 4).
- Here, image data including a lumen region is selected.
- the image data 102a read at this stage is volume image data obtained by stacking a plurality of tomographic images.
- the CPU 101 creates a display image from the image data 102a read in step S101.
- a developed image is created as a display image.
- Next, the CPU 101 acquires lumen wall coordinate data 102b from the image data 102a.
- The lumen wall coordinate data 102b consists of the coordinates (x, y) of each point (each pixel) on the lumen wall displayed in the developed image, together with the distance f(x, y) in three-dimensional coordinates from the lumen surface at those coordinates to the line passing near the center of the lumen (hereinafter referred to as the path line).
- The distance f(x, y) is called depth data, and is created by the CPU 101 when generating the developed image.
- the CPU 101 holds the acquired lumen wall coordinate data 102b in the main memory 102 (step S102, 102b in FIG. 4).
- Patent Document 3 Japanese Patent No. 3627066
- Next, the CPU 101 detects lesion candidates based on the lumen wall coordinate data 102b acquired in step S102 (step S103; the lesion candidate detection process of FIG. 3).
- the CPU 101 sets parameters used for the lesion candidate detection process (step S201).
- The parameter set in step S201 is referred to as parameter P1.
- As parameter P1, a value indicating the length (polyp diameter), area, volume, or the like of the lesion can be considered; in the present embodiment, as an example, P1 is the length (polyp diameter) of the lesion (polyp, etc.) to be detected.
- Parameter P1 is used to calculate parameter P2 (the distance between differential reference points) used in the curvature calculation of step S202, as well as parameters P3 (region diameter threshold) and P4 (circularity threshold) used in the false positive deletion processing of step S204.
- The parameter P2 represents the distance between the differential reference points, and is given from P1 by equation (1) below.
- The parameter P3 represents a threshold on the diameter (region diameter) of an extracted lesion candidate region, and is given by equation (2) below.
- The parameter P4 represents a threshold on the circularity of an extracted lesion candidate region, and is given by equation (3) below.
- In equations (1) to (3), A, B, and C are constants.
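- Equations (1) to (3) themselves are figures in the original publication and are not reproduced here. The following is a minimal sketch of the parameter derivation, under assumed functional forms consistent with the text; the constants a, b, c stand in for the patent's A, B, C:

```python
def derive_parameters(p1, a=0.5, b=0.5, c=0.8):
    """Derive the secondary detection parameters from the polyp diameter P1.

    The functional forms and the constants a, b, c are illustrative
    assumptions: the patent states only that P2, P3 and P4 are computed
    from P1 via equations (1)-(3) using constants A, B and C.
    """
    p2 = a * p1   # distance between differential reference points (eq. (1))
    p3 = b * p1   # region diameter threshold for false positive deletion (eq. (2))
    p4 = c        # circularity threshold (eq. (3)); taken here as the constant alone
    return p2, p3, p4
```

For example, derive_parameters(10) would yield P2 = 5, P3 = 5, and P4 = 0.8 under these assumed forms.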
- To set parameter P1, the CPU 101 may read a value predetermined for each mode from the data table 2 of FIG. 5 and set it, or the operator may input an arbitrary numerical value from the input device 109.
- Alternatively, an object (for example, a polyp image) may be displayed, and the magnitude of parameter P1 may be input by adjusting the size or shape of the object through an input operation from a pointing device such as the mouse 108 or from the input device 109.
- In that case, the CPU 101 sets a value corresponding to the size (diameter) or shape indicated by the object as parameter P1, and holds parameter P1 in the main memory 102 (102c in FIG. 4).
- In the data table 2, a default value of parameter P1, such as "6", "10", or "8", is defined for each mode, such as the "early detection" mode, the "normal" mode, and the "manual" mode.
- "Display ON/OFF" in the data table 2 indicates the state of the mode switch: "1" indicates the "ON" state and "0" indicates the "OFF" state.
- FIG. 6 shows a state in which the developed image 71 is displayed at the top of the display screen of the display device 107 and the parameter setting window 72 is displayed at the bottom.
- The developed image 71 actually expresses the shading of the organ surface in gray scale (shading information), but for clarity it is represented by solid lines in FIG. 6: the region sandwiched between the upper and lower lines 711 corresponds to the inner surface of the luminal organ, and the plurality of vertical lines 712 drawn within the region represent folds on the organ surface.
- In the parameter setting window 72, a selectable mode list 721 is displayed together with check buttons, and a numerical input frame 722 for parameter P1 is displayed.
- When the "display" button 723 is pressed with the mouse 108 or the like after parameter P1 has been set, the CPU 101 executes the lesion candidate detection processing (steps S202 to S204) and identifies and displays the lesion candidate regions on the developed image 71.
- In the lesion candidate detection processing, the parameters P2, P3, and P4 are calculated from parameter P1 by the above equations (1), (2), and (3), and held in the main memory 102 (102d, 102e, 102f in FIG. 4).
- Next, using the depth data f(x, y) of the developed image 71 (102b in FIG. 4), the CPU 101 calculates a first feature amount for each pixel p of the developed image 71.
- The first feature amount is, for example, a curvature value, represented here by the Shape Index (step S202).
- the CPU 101 holds the calculated curvature value in the main memory 102 (102g in FIG. 4).
- The Shape Index takes a value that changes continuously from 0 to 1, and each value corresponds to a different curved-surface state.
- A concave hemisphere corresponds to a Shape Index value of "0"; as the value increases from "0", the surface passes through a concave half-cylinder, a saddle-shaped surface or plane, and a convex half-cylinder; a convex hemisphere corresponds to a Shape Index value of "1".
- the Shape Index is calculated by the following equation (4).
- κmax and κmin are the maximum and minimum values of the principal curvature at each point on the curved surface, and are calculated by equation (5) below.
- fxx, fyy, and fxy are the second partial derivatives of f(x, y) at the target pixel p, and are calculated from the coordinates (x, y) of the target pixel p and the depth data f(x, y) at the pixel p by equation (6) below.
- P2 is the distance between the differential reference points calculated by the above equation (1).
- the distance between the differential reference points represents the distance between the pixel referred to when calculating the second partial derivative of Equation (6) and the pixel of interest p.
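- Equations (4) to (6) are figures in the original publication. For reference, the standard definitions that match the surrounding description (the 0-to-1 rescaling of the Koenderink shape index, the principal curvatures of a depth map under the small-slope approximation, and central second differences spaced by P2) are, as an assumed reconstruction:

$$\mathrm{SI} = \frac{1}{2} + \frac{1}{\pi}\arctan\frac{\kappa_{\max} + \kappa_{\min}}{\kappa_{\max} - \kappa_{\min}}, \qquad \kappa_{\max,\min} = \frac{f_{xx}+f_{yy}}{2} \pm \sqrt{\left(\frac{f_{xx}-f_{yy}}{2}\right)^2 + f_{xy}^2}$$

$$f_{xx} \approx \frac{f(x+P2,\,y) - 2f(x,\,y) + f(x-P2,\,y)}{P2^{\,2}}, \quad \text{and analogously for } f_{yy},\ f_{xy}$$

Depending on the sign convention of the depth data, the arctangent term may enter with the opposite sign; the form above maps a concave hemisphere to 0 and a convex hemisphere to 1 when convex surfaces have positive principal curvature.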
- FIG. 8 is a diagram for explaining the distance between the differential reference points.
- As shown in FIG. 8, the calculated curvature value depends on the distance between the differential reference points.
- The curvature value takes its maximum when the distance between the differential reference points is about the same as the width of the curved surface (the unevenness).
- If the distance between the differential reference points is small relative to the width of the surface, the curvature of a substantially flat surface is obtained, and the Shape Index takes a value in the vicinity of 0.5; if the distance is comparable to the width, the slope of the convex surface is captured when calculating the second partial derivatives, and the Shape Index indicates a shape close to a convex hemisphere.
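- The following is a minimal Python sketch of the curvature calculation of step S202 under the assumptions stated above (finite differences spaced by P2 on the depth map, small-slope principal curvatures, 0-to-1 shape index); it is an illustration, not the patent's exact equations:

```python
import numpy as np

def shape_index(f, p2):
    """Shape Index of a depth map f at every pixel, using central second
    differences whose reference points lie p2 pixels from the pixel of
    interest. A sketch of the curvature calculation of step S202; the
    patent's exact equations (4)-(6) are figures and are assumed here."""
    f = np.asarray(f, dtype=float)
    # Second partial derivatives with differential-reference distance p2.
    # np.roll wraps at the borders; a real implementation would pad instead.
    fxx = (np.roll(f, -p2, axis=1) - 2.0 * f + np.roll(f, p2, axis=1)) / p2**2
    fyy = (np.roll(f, -p2, axis=0) - 2.0 * f + np.roll(f, p2, axis=0)) / p2**2
    fxy = (np.roll(np.roll(f, -p2, axis=0), -p2, axis=1)
           - np.roll(np.roll(f, -p2, axis=0), p2, axis=1)
           - np.roll(np.roll(f, p2, axis=0), -p2, axis=1)
           + np.roll(np.roll(f, p2, axis=0), p2, axis=1)) / (4.0 * p2**2)
    # Principal curvatures of the graph surface (small-slope approximation).
    mean = (fxx + fyy) / 2.0
    diff = np.sqrt(((fxx - fyy) / 2.0) ** 2 + fxy ** 2)
    k_max, k_min = mean + diff, mean - diff
    # Koenderink shape index rescaled to [0, 1]; ~0.5 is flat or saddle-like.
    denom = np.where(k_max - k_min == 0.0, 1e-12, k_max - k_min)
    return 0.5 + np.arctan((k_max + k_min) / denom) / np.pi
```

Thresholding the returned array at 0.5 would then mirror the threshold processing of step S203 described next.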
- the CPU 101 performs threshold processing based on the calculated Shape Index (curvature value 102g) for each pixel p, and extracts a lesion candidate region (step S203).
- the CPU 101 holds the extracted lesion candidate area in the main memory (lesion candidate area 102h in FIG. 4).
- For the threshold processing, a lower limit of the Shape Index is set, and the CPU 101 takes pixels whose curvature value is equal to or higher than the lower limit as the lesion candidate region.
- The lower limit is, for example, 0.5.
- Pixels whose calculated curvature value is below the lower limit are thus excluded from the lesion candidates and are not extracted.
- Next, the CPU 101 obtains second and third feature amounts for each extracted lesion candidate region (region of interest), and holds them in the main memory 102 (feature amount (region diameter) 102i and feature amount (circularity) 102j in FIG. 4).
- The second feature amount is, for example, the region diameter d of the lesion candidate region, and the third feature amount is, for example, the circularity k of the lesion candidate region.
- The CPU 101 evaluates the second and third feature amounts of each region of interest as follows, and when a region is determined to be a false positive, deletes it from the lesion candidate regions listed in step S203 (step S204).
- When evaluating the region diameter d, which is the second feature amount, the CPU 101 refers to the coordinate data in three-dimensional real space of each point on the lumen surface and calculates the region diameter d of each lesion candidate region.
- The CPU 101 compares the calculated region diameter d (102i in FIG. 4) with the parameter P3 set in step S201 (equation (2); 102e in FIG. 4); if d < P3, the region is determined to be a false positive and is deleted from the lesion candidate regions 102h.
- When evaluating the circularity k, which is the third feature amount, the CPU 101 likewise refers to the coordinate data in three-dimensional real space of each point on the lumen surface and calculates the circularity k of each lesion candidate region.
- The CPU 101 compares the calculated circularity k (102j in FIG. 4) with the parameter P4 set in step S201 (equation (3); 102f in FIG. 4); if k < P4, the region is determined to be a false positive (false positive region 102k in FIG. 4) and is deleted from the lesion candidate regions 102h.
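- As a sketch, the false positive deletion of step S204 reduces to simple threshold comparisons; the data structure below and the comparison directions d < P3 and k < P4 follow the reading above and are assumptions:

```python
def delete_false_positives(candidates, p3, p4):
    """Sketch of the false positive deletion of step S204.

    `candidates` is assumed to be a list of dicts carrying the region
    diameter 'd' and the circularity 'k' of each lesion candidate region.
    """
    kept = []
    for region in candidates:
        if region['d'] < p3:   # smaller than the region diameter threshold
            continue           # -> false positive, e.g. a noise-sized bump
        if region['k'] < p4:   # less circular than the circularity threshold
            continue           # -> false positive, e.g. an elongated fold
        kept.append(region)
    return kept
```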
- In the present embodiment, the region diameter d and the circularity k are evaluated as feature amounts, but the present invention is not limited to this; false positive regions may be determined using other feature amounts.
- For example, Curvedness may be used as a feature amount in the false positive deletion processing of step S204.
- Curvedness indicates the scale of a curved surface: among equally convex surfaces, a large Curvedness value indicates a small convex surface and a small Curvedness value indicates a large convex surface. Curvedness can therefore be used as a measure of the polyp diameter to be evaluated, and is given by equation (7) below.
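- Equation (7) is a figure in the original publication; the standard definition of curvedness (Koenderink and van Doorn), which matches the description above, is:

$$CV = \sqrt{\frac{\kappa_{\max}^2 + \kappa_{\min}^2}{2}}$$

Because curvedness has units of inverse length, a large value indeed corresponds to a small, tightly curved surface, consistent with comparing its average against a value proportional to the reciprocal of P1.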
- In that case, the average Curvedness over the entire lesion candidate region of interest is compared with a predetermined value (a value proportional to the reciprocal of parameter P1) to determine whether the region is a false positive.
- When the false positive deletion processing of step S204 is completed and the false positives have been deleted, the process proceeds to step S104 of FIG. 2.
- The CPU 101 identifies and displays the lesion candidate regions on the developed image 71 using marks 713 or the like (step S104; see FIG. 6), and ends the image processing.
- As described above, the medical image processing apparatus 100 executes processing for detecting lesion candidate regions from an image of the organ surface.
- In the lesion candidate detection processing, the parameters used for detecting lesion candidate regions are set per mode, or the operator can input numerical values or set them through a GUI.
- Four types of parameters are handled: P1 is the polyp diameter, P2 is the distance between differential reference points, P3 is the region diameter threshold, and P4 is the circularity threshold.
- Using the set parameter (P2), the CPU 101 calculates a shape feature amount (curvature value) at each point on the organ surface, and takes the points corresponding to a predetermined shape as lesion candidate regions.
- For the detected lesion candidate regions, the CPU 101 calculates feature amounts such as the region diameter and the circularity, determines with the parameters (P3, P4) whether these feature amounts correspond to a lesion candidate, and deletes those that are false positives. The CPU 101 then identifies and displays on the image the lesion candidate regions excluding the false positive regions.
- Since the parameters related to the detection of lesion candidate regions are set per mode or by the operator, the detection target can easily be changed according to the diagnostic purpose, which improves usability.
- From the single parameter (P1) initially set by the operator, the parameter used for the curvature calculation (P2, the distance between differential reference points) and the parameters used for feature evaluation in the false positive deletion processing (P3, the region diameter; P4, the circularity) are calculated; the one parameter P1 is thus used secondarily.
- Since the parameters (P2, P3, P4) used to evaluate the other feature amounts are calculated from the single set parameter (P1), there is no need to input many parameters, which reduces the complexity of parameter setting. Moreover, if the parameter the operator must set is one that visibly indicates the size or shape of the lesion, such as the polyp diameter, the operator can operate the CAD intuitively. Furthermore, since a GUI is used to input the parameters, operability is improved.
- In the present embodiment, the parameter P1 is the length of the lesion and the parameter P2 is the distance between the differential reference points, and they are set in association with each other.
- The medical image illustrated in the first embodiment is a developed image of a luminal organ, but the invention is not limited to this; various medical images, such as tomographic images and three-dimensional volume images of a subject, may be used. Even in such cases, the medical image processing apparatus 100 can set the parameters related to lesion candidate detection and detects lesion candidates using the set parameters.
- In the present embodiment, P1 is input as the parameter, but the invention is not limited to this; the other parameters (P2, P3, P4) may also be set to values predetermined for each mode or to values input by the operator.
- In the second embodiment, the parameters used to detect lesion candidates are corrected based on the distortion of the developed image when they are set.
- In the developed image, the circumference of the lumen cross section is assigned to the pixels in the direction perpendicular to the longitudinal direction of the luminal organ (the y direction; hereinafter referred to as the "short direction") in increments of a predetermined angle. Since the circumference of the cross section differs from position to position, the pixel size in the short direction (dy in FIG. 9) varies, and this becomes distortion.
- Likewise, where the lumen curves, the distance to the adjacent x position in the longitudinal direction (x direction) of the developed image (dx in FIG. 9) differs between the inside and the outside of the curve, resulting in image distortion.
- Therefore, in the pixel distortion calculation processing shown in FIG. 10, the pixel distortion is calculated, and distortion adjustment parameters (P2_x, P2_y) are obtained by correcting the parameter P2 based on the calculated pixel distortion.
- FIG. 9 is a diagram for explaining the developed image 71 of the luminal organ and the orientation of the pixels on the developed image 71.
- In FIG. 9, the direction of the path line along the longitudinal direction of the luminal organ is the x direction, and the direction perpendicular to the path line (the short direction) is the y direction.
- The actual length on the organ surface corresponding to one side of a pixel 715 is called the pixel size; the pixel size in the x direction is denoted dx, and the pixel size in the y direction is denoted dy.
- The pixel distortion is obtained as the ratio (dx/dy) of the pixel sizes of the target pixel in the x and y directions.
- FIG. 10 is a flowchart for explaining the flow of pixel distortion calculation processing.
- FIG. 11 is a diagram illustrating data held in the RAM of the main memory 102 when the pixel distortion calculation process is executed.
- FIG. 12 is a diagram illustrating the path radius R.
- FIG. 13 is a diagram explaining the distance between the cross section of interest (lumen surface Sn) and the adjacent cross section (lumen surface Sn+1).
- FIG. 14 is a diagram illustrating the relationship between the position of the lumen surface and the pixel size in the longitudinal direction.
- the CPU 101 of the medical image processing apparatus 100 reads a program and data related to the pixel distortion calculation process illustrated in FIG. 10 from the main memory 102, and executes the pixel distortion calculation process based on the program and data. It is assumed that the image data is taken in from the image database 111 or the like via the network 110 and stored in the storage device 103 of the medical image processing apparatus 100 at the start of execution of the following processing.
- First, the CPU 101 of the medical image processing apparatus 100 reads from the storage device 103 the developed image data 102l, the three-dimensional real-space coordinate data 102m storing the three-dimensional real-space coordinates of the points corresponding to the developed image, and the coordinate data 102n of points on the path line (hereinafter referred to as path points), and holds them in the main memory 102 (step S301; 102l, 102m, 102n in FIG. 11).
- The coordinate data 102n of the path points is the three-dimensional real-space coordinate data of the points at which the path line perpendicularly intersects the lumen cross sections corresponding to the pixel rows arranged in the short direction of the developed image 71 (each such cross section is hereinafter referred to as the target lumen surface Sn).
- The path point on the target lumen surface Sn is called the target path point Qn (see FIG. 12).
- the CPU 101 sequentially scans each pixel on the developed image 71 to calculate pixel distortion (dx / dy) at each point (pixel).
- The flowchart of FIG. 10 shows an example in which scanning proceeds first in the short direction on the developed image 71 and then in the longitudinal direction.
- Since the pixel distortion generated in the developed image 71 differs depending on the curve of the path line, that is, on how sharply the lumen curves, the CPU 101 first determines the size of the curve.
- For a curved region of the lumen, the CPU 101 obtains the path radius R using the coordinate data 102n of the path points, and holds it in the main memory 102 (step S302; 102o in FIG. 11).
- The path radius R will be described with reference to FIG. 12, which shows a curved region of the luminal organ 8.
- The line passing near the center of the luminal organ 8 is the path line 82.
- The path radius R is the radius of the circle passing through the three points consisting of the target path point Qn and the path points Qn-N and Qn+N separated from it by a predetermined number N of points.
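- A minimal sketch of this path-radius computation (the circumradius of the triangle formed by the three path points; the function and argument names are illustrative):

```python
import numpy as np

def path_radius(q_minus, q_n, q_plus):
    """Radius of the circle through three path points (step S302).

    The three arguments correspond to Qn-N, Qn and Qn+N as 3D numpy
    arrays; the circumradius formula R = abc / (4 * area) is standard,
    while the function and argument names are illustrative.
    """
    a = np.linalg.norm(q_n - q_minus)
    b = np.linalg.norm(q_plus - q_n)
    c = np.linalg.norm(q_plus - q_minus)
    area = np.linalg.norm(np.cross(q_n - q_minus, q_plus - q_minus)) / 2.0
    if area == 0.0:              # collinear points: a straight segment
        return float('inf')
    return a * b * c / (4.0 * area)
```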
- Next, the CPU 101 determines the degree of bending of the luminal organ (step S303). For example, when the path radius R is greater than or equal to a predetermined threshold Rt, the target path point Qn is determined to be on a gentle curve, and when the path radius R is less than the threshold Rt, the target path point Qn is determined to be on a sharp curve.
- If the curve is gentle (step S303; gentle curve), the distortion due to the curve is small and is not considered, and the CPU 101 considers only the pixel distortion in the short direction.
- In step S304, the CPU 101 obtains the pixel size dx in the x direction at the target pixel p and holds it in the main memory 102 (102p in FIG. 11). The pixel size dx is expressed by equation (8) below.
- In step S305, the CPU 101 obtains the pixel size dy in the short direction at the target pixel p and holds it in the main memory 102 (102q in FIG. 11). The pixel size dy is expressed by equation (9) below.
- The CPU 101 calculates the pixel distortion dx/dy based on the pixel sizes dx and dy calculated in steps S304 and S305 (step S306), and holds it in the array (102r in FIG. 11).
- When the target path point Qn is determined to be on a sharp curve (step S303; sharp curve), pixel distortion arises in the path-line direction (the longitudinal direction) on the developed image 71 due to the influence of the curve. The degree of this pixel distortion is determined by whether the pixel lies on the inside or the outside of the curve of the path line.
- FIG. 14 shows the lumen surface Sn with the path-line direction perpendicular to the page, and shows each point Bn on the edge of the lumen region of the lumen surface Sn (lumen surface 83) being projected onto the projection plane t0.
- The point Qn shown in FIG. 14 is the target path point, the point Bn is a point on the lumen surface 83, and the point O corresponds to the point O in FIG. 12 (the center of the circle fitted to the path line 82).
- The distance between the projected coordinates at corresponding angles on the target lumen surface Sn and the adjacent lumen surface Sn+1 corresponds to the pixel size dx of the developed image at the point Bn.
- To obtain this distance, the CPU 101 first calculates the distance l0 between the target path point Qn and the adjacent path point Qn+1, and holds it in the main memory 102 (102s in FIG. 11).
- The distance l0 is obtained from the three-dimensional real-space coordinate data 102n of the path points read in step S301 (step S307).
- Next, the CPU 101 calculates the average radius r of the cross section (lumen surface Sn) orthogonal to the path line at the target path point Qn. That is, the CPU 101 refers to the three-dimensional real-space coordinates of each pixel on the developed image having the same longitudinal (x direction) coordinate as the target pixel, calculates its distance from the target path point Qn, and takes the average of the calculated distances as the average radius r of the lumen (step S308). The CPU 101 holds the calculated average radius r in the main memory 102 (102t in FIG. 11). This distance (average radius r) can also be obtained from the depth data.
- In step S309, the CPU 101 obtains the projected coordinate q of the target pixel p.
- The projected coordinate is calculated by equation (10) below.
- The value of the angle θ is obtained from the center coordinates of the path radius R (the coordinates of the point O), the coordinates of the path line (the coordinates of the point Qn), and the coordinates of the lumen surface (the point Bn), and is held in the main memory 102 (102u in FIG. 11).
- Next, the CPU 101 calculates the pixel size dy in the y direction at the target pixel p (step S310; 102q in FIG. 11).
- The pixel size dy is calculated by equation (12) below, as in step S305.
- The CPU 101 then calculates the pixel distortion dx/dy based on the pixel sizes dx and dy calculated in steps S309 and S310 (step S311), and holds it in the array (step S312; 102r in FIG. 11).
- the CPU 101 repeats the processing from step S302 to step S312 for each pixel of the developed image, and calculates the pixel distortion dx / dy for all the pixels, and ends the pixel distortion calculation process.
- the pixel distortion dx / dy of each pixel calculated by the pixel distortion calculation process is referred to when performing curvature calculation in step S202 of the lesion candidate area detection process of FIG.
- Using the calculated pixel distortion, the CPU 101 calculates the distortion adjustment parameters P2_x and P2_y by equation (13) below.
- P2_x is the distance between differential reference points in the longitudinal direction, and P2_y is the distance between differential reference points in the direction perpendicular to the longitudinal direction (the short direction).
- Equation (14) below is calculated using the distortion adjustment parameters P2_x and P2_y, and the Shape Index is then obtained using the above equations (4), (5), and (6).
- The distortion adjustment parameters P2_x and P2_y can also be calculated by equation (15) below.
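- Equations (13) to (15) are figures in the original publication. One plausible reading, sketched below, is that the differential-reference distance is kept constant in real space by converting it to pixel counts separately per axis using the local pixel sizes; this per-axis scaling is an assumption, not the patent's confirmed formula:

```python
def distortion_adjusted_p2(p2_real, dx, dy):
    """Convert the real-space differential-reference distance P2 into
    per-axis pixel distances using the local pixel sizes dx and dy.
    A sketch of one reading of equations (13)/(15)."""
    p2_x = p2_real / dx  # reference distance in pixels, longitudinal axis
    p2_y = p2_real / dy  # reference distance in pixels, short axis
    return p2_x, p2_y
```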
- FIG. 15 illustrates a display example of the lesion candidate regions obtained when the lesion candidate detection processing is performed using the distortion adjustment parameters P2_x and P2_y.
- In FIG. 15, lesion candidate regions 713a, 713b, and 713c are identified and displayed: the mark 713a indicates a lesion candidate region in an area with little distortion, the mark 713b a lesion candidate region distorted in the horizontal direction, and the mark 713c a lesion candidate region distorted in the vertical direction.
- A region such as that marked 713b, which is distorted laterally by the influence of a curve of the luminal organ, corresponds to a circular polyp in real space, but it would not be detected as a lesion candidate if detection were performed without the distortion adjustment parameters.
- By performing the curvature calculation using the distortion adjustment parameters P2_x and P2_y as in the present embodiment, the shape in real space is correctly evaluated.
- The shape in real space is likewise correctly evaluated for regions distorted in the vertical direction, such as that marked 713c.
- As described above, in the second embodiment, the CPU 101 executes the pixel distortion calculation processing to calculate the amount of distortion (dx/dy) between the longitudinal and short directions of each pixel, corrects the parameter using this distortion amount, and obtains the distortion adjustment parameters (P2_x, P2_y).
- The CPU 101 then performs the curvature calculation using the distortion adjustment parameters and detects the lesion candidate regions.
- As a result, the organ surface shape in real space can be correctly evaluated, and the detection accuracy of lesion candidate regions is improved.
- In the second embodiment, processing is performed taking the distortion of the developed image into consideration when setting the parameters, and the image distortion due to curves is adjusted.
- In the third embodiment, processing for the distortion of the developed image in still sharper curve regions is described.
- FIG. 16 is a diagram for explaining cross-sectional correction in a sharp curve region.
- In a sharp curve region, as shown in FIG. 16, an arbitrary cross-section concentration point O′ is placed at a position outside the lumen region 81 and on the inside of the curve of the path line 82, and the target lumen surface Sn is selected as the cross section passing through the line segment O′Qn connecting the concentration point O′ and the target path point Qn.
- In this case, the target lumen surface Sn may not be orthogonal to the tangent of the path line at the target path point Qn and may therefore be greatly inclined. As a result, the distance between the projected coordinates at corresponding angles on the target lumen surface Sn and the adjacent lumen surface Sn+1 may not admit the linear approximation used in the technique of the second embodiment.
- Therefore, in the third embodiment, attention is paid to the target pixel p on the developed image and the pixel adjacent to it in the longitudinal direction (hereinafter referred to as the adjacent pixel pnext).
- FIG. 17 explains the positional relationship of each point in the original luminal organ 8 of the developed image that is the lesion detection target in the third embodiment.
- The points in three-dimensional real space corresponding to the target pixel p on the lumen surface Sn and to the pixel pnext corresponding to it on the adjacent lumen surface Sn+1 are referred to as the corresponding points p′ and pnext′, respectively.
- In the pixel distortion calculation processing shown in FIG. 18, the pixel distortion (dx/dy) is calculated, and the distortion adjustment parameters (P2_x, P2_y) are obtained by correcting the parameter P2 based on the calculated pixel distortion (dx/dy).
- FIG. 18 is a flowchart for explaining the flow of pixel distortion calculation processing in the third embodiment.
- FIG. 19 is a diagram illustrating data held in the RAM of the main memory 102 when the pixel distortion calculation process is executed.
- the CPU 101 of the medical image processing apparatus 100 reads a program and data related to the pixel distortion calculation process shown in FIG. 18 from the main memory 102, and executes the pixel distortion calculation process based on the program and data. It is assumed that the image data is taken in from the image database 111 or the like via the network 110 and stored in the storage device 103 of the medical image processing apparatus 100 at the start of execution of the following processing.
- In step S401 of the pixel distortion calculation processing shown in FIG. 18, the CPU 101 of the medical image processing apparatus 100, as in step S301 of the pixel distortion calculation processing shown in FIG. 10, reads the developed image data, the three-dimensional real-space coordinate data storing the three-dimensional real-space coordinates of the corresponding points, and the coordinate data of the path points, and holds them in the main memory 102 (step S401; 102l, 102m, 102n in FIG. 19).
- the CPU 101 sequentially scans each pixel on the developed image 71 to calculate pixel distortion (dx / dy) at each point (pixel).
- The flowchart of FIG. 18 shows an example in which scanning proceeds first in the short direction on the developed image and then in the longitudinal direction.
- The pixel distortion generated in the developed image 71 differs depending on the curve of the path line 82, that is, on the size of the curve of the luminal organ 8, so the CPU 101 first determines the size of the curve.
- In the third embodiment, the CPU 101 calculates the distance between the cross-section concentration point O′ and the target path point Qn as the path radius R, and holds it in the main memory 102 (102o′ in FIG. 19). The CPU 101 then judges the size of the curve from the path radius R (step S402): when the path radius R is large, the region is a gentle curve, and when it is small, the region is a sharp curve.
- The CPU 101 determines the degree of bending based on the calculated path radius R (step S403). For example, when the path radius R is greater than or equal to a predetermined threshold Rt, the target path point Qn is determined to be on a gentle curve, and when it is less than the threshold Rt, the target path point Qn is determined to be on a sharp curve.
- If the target path point is determined to be on a gentle curve (step S403; gentle curve), the CPU 101 performs the same processing as in steps S304, S305, and S306 of FIG. 10: it obtains the pixel sizes dx and dy (steps S404 and S405; 102p and 102q in FIG. 19), calculates the pixel distortion dx/dy (step S406), and holds it in the array (step S410; 102r in FIG. 19).
- If the target path point is on a sharp curve, in step S407 the CPU 101 obtains the arc length (pixel size dx) between the two corresponding points p′ and pnext′ shown in FIG. 17.
- First, the CPU 101 obtains the distance p′O′ based on the three-dimensional real-space coordinate data of the corresponding point p′ and the cross-section concentration point O′, and takes this as the radius R′.
- The CPU 101 holds the calculated radius R′ in the main memory 102 (102x in FIG. 19).
- Next, the CPU 101 obtains the angle φ formed by the three points consisting of the corresponding point p′, the cross-section concentration point O′, and the adjacent corresponding point pnext′, and holds it in the main memory 102 (102y in FIG. 19).
- The CPU 101 then obtains the arc length from equation (17) below and holds it in the main memory 102 (step S407; 102p in FIG. 19).
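- Equation (17) is a figure in the original publication; from the radius R′ and the angle φ defined above, the arc length presumably follows the standard relation (with φ in radians):

$$dx = R'\,\varphi$$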
- Similarly, the CPU 101 calculates the pixel size dy in the short direction at the target pixel p from equation (18) below, and holds it in the main memory 102 (step S408; 102q in FIG. 19).
- the CPU 101 calculates pixel distortion dx / dy based on the pixel sizes dx and dy calculated in steps S407 and S408 (step S409) and stores them in the array (step S410; 102r in FIG. 19).
- The CPU 101 repeats the processing from step S402 to step S410 for each pixel of the developed image, and after calculating the pixel distortion dx/dy for all the pixels, ends the pixel distortion calculation processing.
- The pixel distortion dx/dy of each pixel calculated by the pixel distortion calculation processing is referred to when performing the curvature calculation in step S202 of the lesion candidate detection processing of FIG. 3. Since the calculation of the distortion adjustment parameters (P2_x, P2_y) is the same as in the second embodiment, its description is omitted.
- As a result, the lesion candidate regions can be accurately identified and displayed regardless of the distortion of the developed image, as in the display example shown in FIG. 15 (lesion candidate regions 713a, 713b, and 713c in FIG. 15).
- As described above, in the third embodiment, the pixel size dx in the longitudinal direction of the luminal organ is calculated as the arc length between adjacent corresponding points. Therefore, even in a developed image that has undergone cross-section correction for a sharp curve, the shape in real space can be correctly evaluated, and the detection accuracy of lesion candidate regions is improved.
- The method of obtaining the pixel size dx as an arc length may also be applied to a developed image of a relatively gentle curve that has not undergone cross-section correction, as in the developed image of the second embodiment.
- In the above, an example of correcting the parameter P2, which represents the distance between the differential reference points, was shown, but distortion correction may similarly be performed for the parameters P3 and P4.
- FIG. 20 is a diagram for explaining a virtual endoscopic image.
- FIG. 20(a) is a diagram showing the luminal organ with its longitudinal direction oriented vertically.
- FIG. 20 (b) is an example of a virtual endoscopic image for the luminal organ of FIG. 20 (a).
- A virtual endoscopic image is a planar image 75 obtained by projecting, onto the projection plane s0, the view within a given viewing angle (θview) from an arbitrary viewpoint p0 set inside the lumen region v shown in FIG. 20(a) (FIG. 20(b)).
- The pixel value of each point of the virtual endoscopic image 75 (hereinafter referred to as the "target pixel p") is a shading value given based on the distance between the viewpoint p0 and the target pixel corresponding point p′.
- The target pixel corresponding point p′ is, for example, the voxel that a ray reaches when a virtual ray is cast from the viewpoint p0 toward the target pixel p in three-dimensional real-space coordinates.
- the voxel corresponding to the target pixel corresponding point p ′ has a pixel value within a predetermined threshold range.
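To make this ray-voxel correspondence concrete, here is a minimal Python sketch assuming an axis-aligned voxel volume sampled at a fixed step along the ray; the function name, sampling strategy, and threshold handling are assumptions, not the patent's implementation.

```python
import numpy as np

def find_corresponding_voxel(volume, viewpoint, pixel_pos, lo, hi,
                             step=0.5, max_t=500.0):
    """March a ray from the viewpoint p0 through the target pixel p and
    return the position of the first voxel whose value lies in [lo, hi]
    (the target pixel corresponding point p'), or None otherwise."""
    viewpoint = np.asarray(viewpoint, dtype=float)
    direction = np.asarray(pixel_pos, dtype=float) - viewpoint
    direction /= np.linalg.norm(direction)

    t = 0.0
    while t < max_t:
        pos = viewpoint + t * direction
        idx = tuple(np.round(pos).astype(int))
        if any(i < 0 or i >= n for i, n in zip(idx, volume.shape)):
            return None           # the ray has left the volume
        if lo <= volume[idx] <= hi:
            return pos            # voxel within the threshold range: p'
        t += step
    return None
```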
- FIG. 21 is a diagram for explaining the distortion caused by the distance of the projection object from the viewpoint, and
- FIG. 22 is a diagram for explaining the distortion generated at the end of the virtual endoscopic image.
- the size of the projected image varies depending on the distance from the viewpoint p 0 .
- Consider two objects T1 and T2 of the same size, placed in the same direction from the viewpoint p0 with respect to the projection plane s0 but at different distances.
- Let L1 be the distance from the viewpoint p0 to the object T1, and L2 the distance from the viewpoint p0 to the object T2.
- The sizes of the images projected on the projection plane s0 are denoted δ1 and δ2, respectively.
- The object T1, closer to the viewpoint p0, forms the larger image on the projection plane s0 (δ1 > δ2). This is a distortion of the image. Therefore, when calculating the curvature, the distance between the differential reference points (parameter P2) must be set at the target pixel p so that the spacing of the differential reference points becomes equal at the target pixel corresponding point p′.
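Under central projection this scaling follows from similar triangles. Writing L0 for the distance from the viewpoint p0 to the projection plane (consistent with the notation of FIG. 23 below), an object of real size Δ at distance L projects to an image of size roughly

$$\delta = \Delta \cdot \frac{L_0}{L}, \qquad \text{so} \quad \frac{\delta_1}{\delta_2} = \frac{L_2}{L_1} > 1 \quad \text{when } L_1 < L_2,$$

which is one plausible reading of the relation; the patent itself does not state this formula.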
- In FIG. 22, the angle that the line of sight to each object forms with the normal to the projection plane is denoted φ.
- The sizes of the images projected on the projection plane s0 are denoted δ3 and δ4, respectively. Comparing δ3 and δ4, the larger the angle φ, the larger the image on the projection plane s0 (δ4 > δ3). This too is a distortion of the image. Therefore, at the edge of the virtual endoscopic image 75, the distance between the differential reference points (parameter P2) must be corrected to a value larger than at the center of the image.
- FIG. 23 is a diagram for explaining the central projection method.
- FIG. 24 is a flowchart showing a flow of differential reference point distance calculation processing in the virtual endoscopic image 75.
- FIG. 25 is a diagram illustrating data held in the RAM of the main memory 102 when the differential reference point distance calculation processing is executed.
- p0 is the viewpoint
- s0 is the projection plane
- Δ is the length of one side (pixel size) of the pixel located at the center of the projection plane s0 (hereinafter referred to as the "center pixel")
- L0 is the distance between the viewpoint p0 and the center pixel
- θ0 is the angle formed by both ends of the center pixel and the viewpoint p0, centered on the viewpoint p0
- p is the target pixel
- Δ′ is the length of the projection object T1 at the target pixel corresponding point p′
- L′ is the distance between the viewpoint p0 and the target pixel corresponding point p′
- θ is the angle formed by both ends of the target pixel p and the viewpoint p0, centered on the viewpoint p0
- the CPU 101 sets the coordinates of the viewpoint p 0 and the position and orientation of the projection plane s 0 and holds them in the main memory 102 (step S501; 102A and 102B in FIG. 25).
- The projection plane s0 can be set from the distance L0 from the viewpoint p0 and the vector connecting the viewpoint p0 and the center pixel.
- Next, the CPU 101 calculates the length (pixel size Δ) of one side of the center pixel of the projection plane s0 and stores it in the main memory 102 (step S502; 102C in FIG. 25).
- The pixel size Δ is obtained from the following equation (19).
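Equation (19) itself is not reproduced in this text. From the quantities defined above (the distance L0 and the angle θ0 subtended at the viewpoint p0 by the center pixel), one reconstruction consistent with the geometry of FIG. 23 would be

$$\Delta = 2\,L_0 \tan\frac{\theta_0}{2},$$

though this exact form is an assumption.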
- Next, for each point on the projection plane s0 (target pixel p), the CPU 101 repeats the processing from step S503 to step S506 below.
- The CPU 101 acquires the coordinates of the target pixel corresponding point p′ projected onto the target pixel p (step S503). That is, the CPU 101 casts a ray from the viewpoint p0 toward the target pixel p and acquires, as the coordinates of the target pixel corresponding point p′, the coordinates of the voxel reached by the ray whose luminance value is within the threshold range.
- Next, the CPU 101 obtains the length Δ′ at the position of the target pixel corresponding point p′ for the target pixel p and stores it in the main memory 102 (step S504; 102E in FIG. 25).
- Δ′ can be calculated from the distance L′ between the target pixel corresponding point p′ and the viewpoint p0, and from the angle θ formed by both ends of the target pixel p and the viewpoint p0 when the target pixel p is viewed from the viewpoint p0. That is, Δ′ is expressed by the following equation (20).
- The angle θ can be obtained from the coordinates of the target pixel p, the distance L between the target pixel p and the viewpoint p0, the distance L0 between the center pixel and the viewpoint p0, and the length Δ of one side of the center pixel.
- the CPU 101 calculates the differential reference point distance P2 at the target pixel p (step S505).
- The distance P2 between the differential reference points at the target pixel p can be expressed by the following equation (21).
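Equations (20) and (21) are likewise not reproduced here. Putting steps S503 to S505 together, a per-pixel sketch might look as follows; it assumes equation (20) has the form Δ′ = 2L′·tan(θ/2), that equation (21) converts a fixed real-space spacing into image pixels, and that the obliquity factor used for θ is a reasonable reading of the text. All of these are assumptions.

```python
import numpy as np

def diff_ref_distance(pixel_pos, viewpoint, l0, delta, l_prime, p2_real):
    """Sketch of steps S503-S505 for one target pixel p.

    l0, delta: distance to, and side length of, the center pixel.
    l_prime:   distance L' to p', obtained by ray casting (step S503).
    p2_real:   desired spacing of the differential reference points in
               real-space units (e.g. half the polyp diameter P1).
    """
    # Distance L from the viewpoint p0 to the target pixel p on the plane.
    l = np.linalg.norm(np.asarray(pixel_pos, dtype=float)
                       - np.asarray(viewpoint, dtype=float))

    # Angle theta subtended by the target pixel at p0: the pixel has side
    # `delta` on the plane but is viewed obliquely, so its apparent size
    # is foreshortened by l0 / l (an assumed reading of how theta follows
    # from the coordinates of p, L, L0 and delta).
    theta = 2.0 * np.arctan(delta * (l0 / l) / (2.0 * l))

    # Assumed form of equation (20): real-space length of one pixel at p'.
    delta_prime = 2.0 * l_prime * np.tan(theta / 2.0)

    # Assumed form of equation (21): spacing in image pixels such that the
    # differential reference points are p2_real apart at p'.
    return p2_real / delta_prime
```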
- the CPU 101 stores the differential reference point distance P2 for the target pixel p calculated in step S505 in the array (step S506; 102F in FIG. 25).
- The CPU 101 repeats the processing from step S503 to step S506 for each pixel of the virtual endoscopic image; when the distance P2 between the differential reference points has been calculated for all pixels p, the differential reference point distance calculation process ends.
- Thereafter, in the lesion candidate area detection process shown in FIG. 3, the CPU 101 calculates the curvature value (Shape Index) using the differential reference point distance P2 of each pixel calculated by the above process, and detects lesion candidates.
- In this manner, the distance P2 between the differential reference points is corrected in consideration of the image distortion, and the lesion candidates are detected using the corrected distance P2.
- The above has described the correction of the distance between the differential reference points for a virtual endoscopic image generated by the central projection method.
- However, some virtual endoscopic images have already been subjected to a process that corrects the distortion occurring at the edge of the image (the distortion that depends on the angle from the viewpoint toward the projection plane); see, for example, JP-A-7-296184.
- For such images, the distance between the differential reference points may be corrected depending only on the distance from the viewpoint. In this case, the distance P2 between the differential reference points is expressed by the following equation (22).
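Equation (22) is not reproduced here either. If each pixel of such a pre-corrected image subtends the same angle θ0 as the center pixel, the real size of one pixel at p′ would be Δ′ = 2L′·tan(θ0/2), so a form proportional to the viewpoint distance alone,

$$P2 \;\propto\; \frac{1}{L'},$$

would be consistent; this is an inference, not the patent's stated formula.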
- 1 image processing system, 100 medical image processing apparatus, 101 CPU, 102 main memory, 103 storage device, 104 communication I/F, 105 display memory, 106 I/F, 107 display device, 108 mouse (external device), 109 input device, 110 network, 111 image database, 71 developed image, 713 lesion candidate area, 72 parameter setting window, 721 mode list, 722 numeric input frame, 8 luminal organ, 81 lumen region, 82 path line, 83 lumen surface, Qn path point, Sn lumen surface, p target pixel, dx longitudinal pixel size, dy short-side pixel size, O center of the circle fitted to the lumen at the path point, O′ cross-section concentration point
Description
First, the configuration of an image processing system 1 to which the image processing apparatus of the present invention is applied will be described.
FIG. 1 is a hardware configuration diagram showing the overall configuration of the image processing system 1.
As shown in FIG. 1, the image processing system 1 comprises a medical image processing apparatus 100 provided with a display device 107 and an input device 109, and an image database 111 connected to the medical image processing apparatus 100 via a network 110.
The I/F 106 is a port for connecting peripheral devices and exchanges data with them. For example, an input device such as the mouse 108 may be connected via the I/F 106.
FIG. 2 is a flowchart showing the overall flow of image processing in the image processing system 1.
FIG. 3 is a flowchart explaining the flow of the lesion candidate detection processing executed by the medical image processing apparatus 100.
FIG. 4 is a diagram showing the data held in the RAM of the main memory 102 while the image processing and the lesion candidate detection processing are executed.
FIG. 5 is a diagram showing an example of the data table 2 in which the value of the parameter P1 is preset for each mode of the present embodiment.
FIG. 6 is a display example of the developed image 71 and the parameter setting window 72.
FIG. 7 is a diagram explaining the Shape Index.
FIG. 8 is a diagram explaining the distance between differential reference points.
In the present embodiment, image data of a lumen region is assumed to be selected. The image data 102a read at this stage is assumed to be volume image data in which a plurality of tomographic images are stacked.
The parameter P1 is also used to calculate the parameter P2 (distance between differential reference points) used in the curvature calculation of step S202, and the parameters P3 (threshold of the region diameter) and P4 (threshold of the circularity) used in the false-positive deletion processing of step S204.
The above parameter P3 represents a threshold for the diameter (region diameter) of an extracted lesion candidate region and is expressed by the following equation (2).
The above parameter P4 represents a threshold for the circularity of an extracted lesion candidate region and is expressed by the following equation (3).
Here, A, B, and C are constants.
As an example, the distance P2 between differential reference points is set to 1/2 of the parameter P1 (polyp diameter); that is, A = 1/2 in the above equation (1).
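As a concrete illustration of this parameter derivation: equations (1) to (3) are referenced but not reproduced in this text, so the sketch below assumes simple linear forms in P1. Only A = 1/2 is stated above; B and C are hypothetical placeholders.

```python
def derive_parameters(p1, a=0.5, b=1.0, c=0.8):
    """Derive the detection parameters from the input polyp diameter P1.

    Assumed forms: P2 = A * P1 (equation (1), with A = 1/2 as stated),
    P3 = B * P1 (equation (2)) and P4 = C (equation (3)); B and C are
    placeholder constants, not values from the patent.
    """
    p2 = a * p1  # distance between differential reference points
    p3 = b * p1  # threshold on the lesion candidate region diameter
    p4 = c       # circularity threshold used in false-positive deletion
    return p2, p3, p4
```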
Next, the image processing system 1 of the second embodiment will be described. Since the hardware configuration of the image processing system 1 of the second embodiment is the same as that of the first embodiment shown in FIG. 1, its description is omitted; identical parts are denoted by the same reference numerals.
In the second embodiment, when the parameters used for lesion candidate detection are set, the parameters are corrected based on the distortion of the developed image.
In the developed image 71 of FIG. 9, the path line in the longitudinal direction of the luminal organ is taken as the x direction, and the direction perpendicular to the path line (short-side direction) as the y direction. The length on the actual organ surface corresponding to one side of a pixel 715 is called the pixel size; the pixel size in the x direction is written dx and that in the y direction dy. The pixel distortion of a target pixel is obtained as the ratio of its pixel sizes in the x and y directions (dx/dy).
FIG. 11 is a diagram showing the data held in the RAM of the main memory 102 while the pixel distortion calculation processing is executed.
FIG. 12 is a diagram explaining the route radius R.
FIG. 13 is a diagram explaining the distance between the cross-section of interest (lumen surface Sn) and the adjacent cross-section (lumen surface Sn+1).
FIG. 14 is a diagram explaining the relationship between the position on the lumen surface and the pixel size in the longitudinal direction.
Before the following processing starts, the image data is assumed to have been fetched from the image database 111 or the like via the network 110 and stored in the storage device 103 of the medical image processing apparatus 100.
In step S309, the CPU 101 first obtains the projection coordinate q of the target pixel p. The projection coordinate is calculated by the following equation (10).
As shown in FIG. 15, the lesion candidate regions 713a, 713b, and 713c are identified and displayed on the developed image 71.
In FIG. 15, the mark 713a indicates a lesion candidate region in an area with little distortion, the mark 713b a lesion candidate region distorted in the horizontal direction, and the mark 713c one distorted in the vertical direction. A region distorted in the horizontal direction by a curve of the luminal organ in real space would not be detected as a lesion candidate, even if it is a circular polyp in real space, if the detection were performed without the distortion adjustment parameters. By performing the curvature calculation with the distortion adjustment parameters P2_x and P2_y as in the present embodiment, however, the shape in real space is evaluated correctly. The same holds for a region distorted in the vertical direction, such as the mark 713c.
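A minimal sketch of how such distortion adjustment parameters could be formed per pixel, assuming that P2_x and P2_y simply rescale a common real-space spacing by the local pixel sizes so that both directions span the same distance on the organ surface (this specific form is an assumption):

```python
def strain_adjusted_spacing(p2_real, dx, dy):
    """Per-pixel differential reference spacings, in image pixels, chosen
    so that P2_x * dx == P2_y * dy == p2_real on the organ surface."""
    return p2_real / dx, p2_real / dy  # (P2_x, P2_y)
```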
Next, the image processing system 1 of the third embodiment will be described. Since the hardware configuration of the image processing system 1 of the third embodiment is the same as that of the first embodiment shown in FIG. 1, its description is omitted; identical parts are denoted by the same reference numerals.
FIG. 16 is a diagram explaining the cross-section correction in a sharp curve region.
As shown in FIG. 16, in a sharp curve region of the luminal organ 8, an arbitrary cross-section concentration point O′ is placed outside the lumen region 81 and inside the curve of the path line 82, and the lumen surface of interest Sn is selected so as to pass through the line segment O′Qn connecting the cross-section concentration point O′ and the path point of interest Qn.
In a sharp curve region of the luminal organ 8 as shown in FIG. 17, the points in three-dimensional real space corresponding to the target pixel p on the lumen surface Sn and to the pixel pnext corresponding to the target pixel p on the adjacent lumen surface Sn+1 are called the target pixel corresponding points p′ and pnext′.
Then, in order to bring the distance dx between the target pixel corresponding points p′ and pnext′ in three-dimensional real-space coordinates close to the length of the curve along the path line 82, the length of the arc connecting the two points p′ and pnext′ is obtained and taken as the distance dx.
FIG. 19 is a diagram showing the data held in the RAM of the main memory 102 while the pixel distortion calculation processing is executed.
Before the following processing starts, the image data is assumed to have been fetched from the image database 111 or the like via the network 110 and stored in the storage device 103 of the medical image processing apparatus 100.
When the route radius R is large, the region is a gentle curve; when the route radius R is small, the region is a sharp curve.
The third embodiment has likewise shown an example of correcting the parameter P2 representing the distance between the differential reference points, but the distortion correction may similarly be performed for the parameters P3 and P4.
The first to third embodiments have shown examples of lesion candidate detection on a developed image of a luminal organ, but the image processing apparatus of the present invention may also be applied to other image display methods. The fourth embodiment describes an example in which the present invention is applied to a virtual endoscopic image.
The pixel value of each point of the virtual endoscopic image 75 (hereinafter referred to as the "target pixel p") is a shading value given based on the distance between the viewpoint p0 and the target pixel corresponding point p′. The target pixel corresponding point p′ is, for example, the voxel that a ray reaches when a virtual ray is cast from the viewpoint p0 toward the target pixel p in three-dimensional real-space coordinates. The voxel at the target pixel corresponding point p′ has a pixel value within a predetermined threshold range.
However, the following two types of distortion appear in the virtual endoscopic image 75 generated by the commonly used central projection method.
Therefore, when the lesion candidate region detection processing is performed on the virtual endoscopic image 75, the distance between the differential reference points (parameter P2) used in the curvature calculation based on the pixel values obtained by the above method must be corrected.
FIG. 23 is a diagram explaining the central projection method.
FIG. 24 is a flowchart showing the flow of the differential reference point distance calculation processing for the virtual endoscopic image 75.
FIG. 25 is a diagram showing the data held in the RAM of the main memory 102 while the differential reference point distance calculation processing is executed.
Further, p is the target pixel, Δ′ is the length of the projection object T1 at the target pixel corresponding point p′, L′ is the distance between the viewpoint p0 and the target pixel corresponding point p′, and θ is the angle formed by both ends of the target pixel p and the viewpoint p0, centered on the viewpoint p0.
Thereafter, in the lesion candidate region detection processing shown in FIG. 3, the CPU 101 calculates the curvature value (Shape Index) using the distance P2 between the differential reference points of each pixel calculated by the above processing, and detects lesion candidates.
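The text does not spell out the Shape Index formula itself; the sketch below uses the standard Koenderink-style definition mapped to the range [0, 1], as is common in CT colonography CAD, purely as an illustration of the curvature feature being thresholded.

```python
import numpy as np

def shape_index(k1, k2):
    """Koenderink-style Shape Index mapped to [0, 1], as commonly used in
    CT colonography CAD: with the sign convention in which a polyp-like
    cap has both principal curvatures negative, SI -> 1 for caps,
    0.5 for saddles, and SI -> 0 for cups."""
    k1, k2 = np.maximum(k1, k2), np.minimum(k1, k2)
    # A flat patch (k1 == k2) has an undefined shape index; guard it.
    denom = np.where(np.isclose(k1, k2), np.finfo(float).eps, k1 - k2)
    return 0.5 - (1.0 / np.pi) * np.arctan((k1 + k2) / denom)
```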
As a result, even when the lesion candidate detection processing is performed on a virtual endoscopic image, the shape of the organ surface can be evaluated correctly, and the detection accuracy of the lesion candidate region improves.
Claims (12)
- 1. An image processing apparatus for detecting a lesion candidate region from a medical image, comprising: parameter setting means for setting a parameter used for detection of the lesion candidate region; and lesion candidate region detection means for evaluating the medical image using the parameter set by the parameter setting means and detecting a lesion candidate region based on the evaluation result.
- 2. The image processing apparatus according to claim 1, wherein the parameter setting means comprises: parameter input means for inputting a first parameter; and second parameter calculation means for calculating a second parameter from the first parameter input by the parameter input means, and the lesion candidate detection means comprises: lesion candidate region extraction means for calculating, for the medical image, a feature quantity representing the shape of the organ surface using the second parameter calculated by the second parameter calculation means, and extracting a lesion candidate region based on the calculated feature quantity; and false-positive deletion means for determining a false-positive region by evaluating a predetermined feature quantity of the lesion candidate region extracted by the lesion candidate region extraction means, and deleting a lesion candidate region determined to be a false-positive region.
- 3. The image processing apparatus according to claim 2, wherein the second parameter is a distance between differential reference points used when a curvature value is calculated as the feature quantity representing the shape of the organ surface.
- 4. The image processing apparatus according to claim 1, wherein the parameter setting means comprises: parameter input means for inputting a first parameter; and third parameter calculation means for calculating a third parameter from the first parameter input by the parameter input means, and the lesion candidate detection means comprises: lesion candidate region extraction means for calculating, for the medical image, a feature quantity representing the shape of the organ surface and extracting a lesion candidate region based on the calculated feature quantity; and false-positive deletion means for determining a false-positive region by evaluating a predetermined feature quantity of the extracted lesion candidate region using the third parameter calculated by the third parameter calculation means, and deleting a lesion candidate region determined to be a false-positive region.
- 5. The image processing apparatus according to claim 4, wherein the third parameter includes at least one of a value indicating the size of the lesion candidate region and a value indicating the shape of the lesion candidate region.
- 6. The image processing apparatus according to claim 1, further comprising parameter correction means for correcting the parameter set by the parameter setting means according to the distortion of the medical image, wherein the lesion candidate region detection means evaluates the medical image using the parameter corrected by the parameter correction means and detects a lesion candidate region based on the evaluation result.
- 7. The image processing apparatus according to claim 6, wherein the medical image is a developed image in which the inner surface of the organ is displayed so as to be unfolded about the core line of a luminal organ.
- 8. The image processing apparatus according to claim 6, wherein the medical image is a virtual endoscopic image in which the interior of a luminal organ is projected onto a predetermined projection plane from a virtual viewpoint set inside the organ.
- 9. The image processing apparatus according to claim 1, further comprising a data table in which the value of the parameter is predetermined for each mode, wherein the parameter setting means comprises first input means for reading the parameter value corresponding to a selected mode from the data table and inputting it.
- 10. The image processing apparatus according to claim 1, further comprising second input means for inputting the value of the parameter numerically, wherein the parameter setting means sets the numerical value input by the second input means as the value of the parameter.
- 11. The image processing apparatus according to claim 1, further comprising third input means for displaying, on the display screen on which the medical image is displayed, an object whose size or shape changes according to the magnitude of the parameter value, and for inputting the value of the parameter through an operation on the object, wherein the parameter setting means sets a value corresponding to the size or shape of the object input by the third input means as the value of the parameter.
- 12. An image processing method for detecting a lesion candidate region from a medical image, comprising: a parameter setting step of setting a parameter used for detection of the lesion candidate region; and a lesion candidate region detection step of evaluating the medical image using the parameter set in the parameter setting step and detecting a lesion candidate region based on the evaluation result.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/060,506 US8538113B2 (en) | 2008-09-01 | 2009-08-27 | Image processing device and method for processing image to detect lesion candidate region |
JP2010526757A JP5346938B2 (ja) | 2008-09-01 | 2009-08-27 | 画像処理装置、及び画像処理装置の作動方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008222978 | 2008-09-01 | ||
JP2008-222978 | 2008-09-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010024331A1 (ja) | 2010-03-04 |
Family
ID=41721497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/064958 WO2010024331A1 (ja) | 2008-09-01 | 2009-08-27 | 画像処理装置、及び画像処理方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8538113B2 (ja) |
JP (1) | JP5346938B2 (ja) |
WO (1) | WO2010024331A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011122037A1 (ja) * | 2010-03-31 | 2011-10-06 | 富士フイルム株式会社 | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム |
US20110261072A1 (en) * | 2008-12-05 | 2011-10-27 | Takayuki Kadomura | Medical image display device and method of medical image display |
JP2016521165A (ja) * | 2013-04-18 | 2016-07-21 | セント・ジュード・メディカル・エイトリアル・フィブリレーション・ディヴィジョン・インコーポレーテッド | 2d平面投影及び部分的展開表面マッピングプロセスを利用して不整脈を視覚化し分析するためのシステム及び方法 |
JP2016158916A (ja) * | 2015-03-03 | 2016-09-05 | キヤノンマーケティングジャパン株式会社 | 医用画像処理装置、医用画像処理装置に搭載可能なプログラム、及び医用画像処理方法 |
JP2019118447A (ja) * | 2017-12-28 | 2019-07-22 | 株式会社Aze | 医用画像処理装置、その制御方法、及びプログラム |
CN111986137A (zh) * | 2019-05-21 | 2020-11-24 | 梁红霞 | 生物器官病变检测方法、装置、设备及可读存储介质 |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4676021B2 (ja) * | 2009-04-16 | 2011-04-27 | 富士フイルム株式会社 | 診断支援装置、診断支援プログラムおよび診断支援方法 |
US8175617B2 (en) * | 2009-10-28 | 2012-05-08 | Digimarc Corporation | Sensor-based mobile search, related methods and systems |
US8121618B2 (en) | 2009-10-28 | 2012-02-21 | Digimarc Corporation | Intuitive computing methods and systems |
WO2013051045A1 (ja) * | 2011-10-03 | 2013-04-11 | 株式会社日立製作所 | 画像処理装置および画像処理方法 |
US20140003655A1 (en) * | 2012-06-29 | 2014-01-02 | Praveen Gopalakrishnan | Method, apparatus and system for providing image data to represent inventory |
JP6045417B2 (ja) * | 2012-12-20 | 2016-12-14 | オリンパス株式会社 | 画像処理装置、電子機器、内視鏡装置、プログラム及び画像処理装置の作動方法 |
JP6150555B2 (ja) * | 2013-02-26 | 2017-06-21 | オリンパス株式会社 | 内視鏡装置、内視鏡装置の作動方法及び画像処理プログラム |
US9311640B2 (en) | 2014-02-11 | 2016-04-12 | Digimarc Corporation | Methods and arrangements for smartphone payments and transactions |
JP6150554B2 (ja) * | 2013-02-26 | 2017-06-21 | オリンパス株式会社 | 画像処理装置、内視鏡装置、画像処理装置の作動方法及び画像処理プログラム |
US10117563B2 (en) * | 2014-01-09 | 2018-11-06 | Gyrus Acmi, Inc. | Polyp detection from an image |
JP2015156937A (ja) * | 2014-02-24 | 2015-09-03 | ソニー株式会社 | 画像処理装置、画像処理方法、並びにプログラム |
DE102014007908A1 (de) * | 2014-05-27 | 2015-12-03 | Carl Zeiss Meditec Ag | Chirurgie-System |
US20180064342A1 (en) * | 2015-03-18 | 2018-03-08 | Imricor Medical Systems, Inc. | System and method for enhanced magnetic resonance imaging of tissue |
US9536054B1 (en) | 2016-01-07 | 2017-01-03 | ClearView Diagnostics Inc. | Method and means of CAD system personalization to provide a confidence level indicator for CAD system recommendations |
US10339650B2 (en) | 2016-01-07 | 2019-07-02 | Koios Medical, Inc. | Method and means of CAD system personalization to reduce intraoperator and interoperator variation |
US10346982B2 (en) | 2016-08-22 | 2019-07-09 | Koios Medical, Inc. | Method and system of computer-aided detection using multiple images from different views of a region of interest to improve detection accuracy |
WO2018235246A1 (ja) * | 2017-06-22 | 2018-12-27 | オリンパス株式会社 | 画像処理装置、画像処理プログラム及び画像処理方法 |
WO2019023909A1 (zh) * | 2017-07-31 | 2019-02-07 | 深圳联影医疗科技有限公司 | 一种肝段分割方法与设备 |
US12029385B2 (en) * | 2018-09-27 | 2024-07-09 | Hoya Corporation | Electronic endoscope system |
KR20210083725A (ko) * | 2019-12-27 | 2021-07-07 | 주식회사 이우소프트 | 치근관의 만곡도를 3차원적으로 시각화하는 방법 및 장치 |
JP7376674B2 (ja) * | 2020-02-10 | 2023-11-08 | 富士フイルム株式会社 | 文書作成支援装置、文書作成支援方法及びプログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004351056A (ja) * | 2003-05-30 | 2004-12-16 | Konica Minolta Medical & Graphic Inc | 医用画像処理装置 |
WO2006056798A1 (en) * | 2004-11-29 | 2006-06-01 | Medicsight Plc | Digital medical image analysis |
JP2006230910A (ja) * | 2005-02-28 | 2006-09-07 | Konica Minolta Medical & Graphic Inc | 画像処理装置及び画像処理方法 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050152588A1 (en) * | 2003-10-28 | 2005-07-14 | University Of Chicago | Method for virtual endoscopic visualization of the colon by shape-scale signatures, centerlining, and computerized detection of masses |
US7609910B2 (en) * | 2004-04-09 | 2009-10-27 | Siemens Medical Solutions Usa, Inc. | System and method for creating a panoramic view of a volumetric image |
- 2009-08-27 JP JP2010526757A patent/JP5346938B2/ja active Active
- 2009-08-27 US US13/060,506 patent/US8538113B2/en active Active
- 2009-08-27 WO PCT/JP2009/064958 patent/WO2010024331A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004351056A (ja) * | 2003-05-30 | 2004-12-16 | Konica Minolta Medical & Graphic Inc | 医用画像処理装置 |
WO2006056798A1 (en) * | 2004-11-29 | 2006-06-01 | Medicsight Plc | Digital medical image analysis |
JP2006230910A (ja) * | 2005-02-28 | 2006-09-07 | Konica Minolta Medical & Graphic Inc | 画像処理装置及び画像処理方法 |
Non-Patent Citations (1)
Title |
---|
TETSUO NAKAZAWA ET AL.: "CT Colonoscopy no Kaihatsu", INNERVISION, vol. 23, no. 4, 25 March 2008 (2008-03-25), pages 14 - 15 * |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110261072A1 (en) * | 2008-12-05 | 2011-10-27 | Takayuki Kadomura | Medical image display device and method of medical image display |
US8791957B2 (en) * | 2008-12-05 | 2014-07-29 | Hitachi Medical Corporation | Medical image display device and method of medical image display |
WO2011122037A1 (ja) * | 2010-03-31 | 2011-10-06 | 富士フイルム株式会社 | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム |
JP2011212245A (ja) * | 2010-03-31 | 2011-10-27 | Fujifilm Corp | 内視鏡観察を支援するシステムおよび方法、並びに、装置およびプログラム |
CN102821670A (zh) * | 2010-03-31 | 2012-12-12 | 富士胶片株式会社 | 内窥镜观察辅助系统、方法、装置和程序 |
US9220468B2 (en) | 2010-03-31 | 2015-12-29 | Fujifilm Corporation | Endoscope observation assistance system, method, apparatus and program |
EP2554104A4 (en) * | 2010-03-31 | 2016-01-06 | Fujifilm Corp | SYSTEM AND METHOD FOR SUPPORTING AN ENDOSCOPE OBSERVATION AND DEVICE AND PROGRAM THEREFOR |
JP2016521165A (ja) * | 2013-04-18 | 2016-07-21 | セント・ジュード・メディカル・エイトリアル・フィブリレーション・ディヴィジョン・インコーポレーテッド | 2d平面投影及び部分的展開表面マッピングプロセスを利用して不整脈を視覚化し分析するためのシステム及び方法 |
JP2016158916A (ja) * | 2015-03-03 | 2016-09-05 | キヤノンマーケティングジャパン株式会社 | 医用画像処理装置、医用画像処理装置に搭載可能なプログラム、及び医用画像処理方法 |
JP2019118447A (ja) * | 2017-12-28 | 2019-07-22 | 株式会社Aze | 医用画像処理装置、その制御方法、及びプログラム |
JP7114252B2 (ja) | 2017-12-28 | 2022-08-08 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、その制御方法、及びプログラム |
CN111986137A (zh) * | 2019-05-21 | 2020-11-24 | 梁红霞 | 生物器官病变检测方法、装置、设备及可读存储介质 |
Also Published As
Publication number | Publication date |
---|---|
US20110164064A1 (en) | 2011-07-07 |
US8538113B2 (en) | 2013-09-17 |
JP5346938B2 (ja) | 2013-11-20 |
JPWO2010024331A1 (ja) | 2012-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5346938B2 (ja) | 画像処理装置、及び画像処理装置の作動方法 | |
US8423124B2 (en) | Method and system for spine visualization in 3D medical images | |
US20190355174A1 (en) | Information processing apparatus, information processing system, information processing method, and computer-readable recording medium | |
EP2420188B1 (en) | Diagnosis support apparatus, diagnosis support method, and storage medium storing diagnosis support program | |
US8542896B2 (en) | Medical image processing device and medical image processing method | |
CN107405126B (zh) | 检索成对的医学图像的对应结构 | |
US7304644B2 (en) | System and method for performing a virtual endoscopy | |
US8290231B2 (en) | Method and apparatus for providing measurement data of an anomaly in a medical image | |
US7620229B2 (en) | Method and apparatus for aiding image interpretation and computer-readable recording medium storing program therefor | |
JP6353827B2 (ja) | 画像処理装置 | |
JP5492024B2 (ja) | 領域分割結果修正装置、方法、及びプログラム | |
US9019272B2 (en) | Curved planar reformation | |
JPWO2007013300A1 (ja) | 異常陰影候補検出方法及び異常陰影候補検出装置 | |
JPWO2010064687A1 (ja) | 医用画像表示装置及び医用画像表示方法 | |
WO2016104082A1 (ja) | 画像処理装置及び画像処理方法 | |
JP4497951B2 (ja) | 医用画像診断装置 | |
JPWO2005009242A1 (ja) | 医用画像処理装置及び方法 | |
JP2007267979A (ja) | 生物の臓器形態解析方法と生物の臓器形態解析システム | |
JP2011172692A (ja) | 医用画像処理装置、及び医用画像処理プログラム | |
JP5701208B2 (ja) | 医用画像表示装置及び医用画像表示方法 | |
US20220172367A1 (en) | Visualization of sub-pleural regions | |
US8165375B2 (en) | Method and system for registering CT data sets | |
JP2006055402A (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
JP2010075334A (ja) | 医用画像処理装置及びプログラム | |
JP2021122677A (ja) | 画像処理装置、画像処理方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09809981; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2010526757; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 13060506; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 09809981; Country of ref document: EP; Kind code of ref document: A1 |