WO2012063653A1 - Medical image display device and medical image display method - Google Patents
Medical image display device and medical image display method
- Publication number
- WO2012063653A1 (PCT/JP2011/074891)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
Definitions
- the present invention relates to a medical image display apparatus and a medical image display method for displaying a medical image obtained from a medical image diagnostic apparatus such as an X-ray CT apparatus, an MRI apparatus, an ultrasonic apparatus, or a nuclear medicine diagnostic apparatus, and more particularly to a technology for displaying medical images as three-dimensional images.
- methods for creating 3D images include surface rendering, volume rendering, maximum intensity projection (MIP), minimum intensity projection (MinIP), ray summation (RaySum), and multi-planar reconstruction (MPR).
- Patent Document 1 discloses speeding up the creation of a three-dimensional image by limiting the projection direction to the voxel arrangement direction on the cross-sectional image.
- in Patent Document 1, however, the projection direction is limited to the voxel arrangement direction on the cross-sectional image, and no consideration is given to cases where projection in an arbitrary direction is desired.
- an object of the present invention is to provide a medical image display device and a medical image display method capable of displaying a three-dimensional image in an arbitrary direction at high speed.
- the present invention rearranges the voxels constituting a three-dimensional image in memory according to the angle of the projection plane and the projection method, and creates a projection image using the rearranged voxel data.
- as a result, access to the data in memory is faster, so the projection image can be displayed at high speed.
- the medical image display device of the present invention is a medical image display device comprising: a display unit that displays a three-dimensional image created based on cross-sectional images of a subject; a voxel slide unit that slides each voxel constituting the three-dimensional image in one direction according to the angle of the projection plane set for the three-dimensional image and the projection method; and a projection image creating unit that creates a projection image using the slid voxel data and displays it on the display unit.
- the medical image display method of the present invention is a medical image display method for displaying a three-dimensional image created based on cross-sectional images of a subject, in which each voxel constituting the three-dimensional image is slid in one direction according to the angle of the projection plane set for the three-dimensional image and the projection method, and a projection image is created from the slid voxel data and displayed.
- according to the present invention, it is possible to provide a medical image display device and a medical image display method capable of displaying a three-dimensional image at high speed.
- Hardware configuration of the medical image display apparatus of the present invention
- Processing flow of the first embodiment of the present invention
- Example of a GUI for setting 3D image display parameters
- Example of a GUI for setting parameters of the calculation image
- Example of the processing flow in step 204
- Diagram explaining the positional relationship between the 3D image and the projection plane
- Diagram explaining shear images
- FIG. 1 is a diagram showing a hardware configuration of the medical image display apparatus 1.
- the medical image display device 1 includes a CPU (Central Processing Unit) 2, a main memory 3, a storage device 4, a display memory 5, a display device 6, a controller 7 connected to a mouse 8, a keyboard 9, and a network adapter 10, which are connected via a system bus 11 so as to be capable of transmitting and receiving signals.
- the medical image display device 1 is connected to a medical image photographing device 13 and a medical image database 14 via a network 12 so as to be able to send and receive signals.
- “to enable signal transmission / reception” indicates a state in which signals can be transmitted and received mutually or from one side to the other, regardless of whether the connection is electrical or optical, wired or wireless.
- the CPU 2 is a device that controls the operation of each component.
- the CPU 2 loads a program stored in the storage device 4 and data necessary for program execution into the main memory 3 and executes it.
- the storage device 4 is a device that stores medical image information captured by the medical image capturing device 13, and is specifically a hard disk or the like.
- the storage device 4 may be a device that exchanges data with a portable recording medium such as a flexible disk, an optical (magnetic) disk, a ZIP memory, or a USB memory.
- the medical image information is acquired from the medical image capturing device 13 and the medical image database 14 via a network 12 such as a LAN (Local Area Network).
- the storage device 4 stores a program executed by the CPU 2 and data necessary for program execution.
- the main memory 3 stores programs executed by the CPU 2 and the progress of arithmetic processing.
- the display memory 5 temporarily stores display data to be displayed on the display device 6 such as a liquid crystal display or a CRT (Cathode Ray Tube).
- the mouse 8 and the keyboard 9 are operation devices for an operator to give an operation instruction to the medical image display device 1.
- the mouse 8 may be another pointing device such as a trackpad or a trackball.
- the controller 7 detects the state of the mouse 8, acquires the position of the mouse pointer on the display device 6, and outputs the acquired position information and the like to the CPU 2.
- the network adapter 10 is for connecting the medical image display apparatus 1 to a network 12 such as a LAN, a telephone line, or the Internet.
- the medical image photographing device 13 is a device that acquires medical image information such as a cross-sectional image of a subject.
- the medical imaging apparatus 13 is, for example, an MRI apparatus, an X-ray CT apparatus, an ultrasonic diagnostic apparatus, a scintillation camera apparatus, a PET apparatus, a SPECT apparatus, or the like.
- the medical image database 14 is a database system that stores medical image information captured by the medical image capturing device 13.
- FIG. 2 is an example of a processing flow according to the first embodiment of the present invention. Each step in FIG. 2 will be described below.
- Step 201 The CPU 2 acquires a medical image selected by the operator by operating the mouse 8 or the keyboard 9 as a three-dimensional image from the medical image capturing device 13 or the medical image database 14 via the network 12.
- the three-dimensional image 102 is created by stacking cross-sectional images 101 photographed using a medical image photographing device.
- the medical image acquired in this step may be the entire 3D image 102 as shown in FIG. 3 or a specific region in the 3D image 102.
- the specific area in the three-dimensional image 102 may be an area extracted by threshold processing executed by the CPU 2 using a predetermined threshold, or an area specified by the operator operating the mouse 8 or the keyboard 9.
- Step 202 The CPU 2 acquires information on the viewpoint and projection plane set for the three-dimensional image acquired in step 201 by operating the mouse 8 and the keyboard 9 by the operator.
- An example of a GUI (Graphical User Interface) used when the operator sets the viewpoint and the projection plane will be described later in detail with reference to FIG.
- Step 203 The CPU 2 acquires conditions necessary for creating the calculation image.
- the calculation image is an image such as a surface rendering image, a volume rendering image, a MIP image, a MinIP image, a ray-sum image, or an MPR image.
- An example of the GUI used when the operator sets the calculation image creation conditions will be described in detail later with reference to FIG.
- Step 204 CPU 2 creates a shear image based on the parameters set in step 202.
- the shear image is an image created so that the projection lines and the voxels are arranged in parallel. Note that this step may be executed before step 203. An example of the detailed flow of the shear image creation processing is described below with reference to the corresponding figure.
- Step 601 The CPU 2 acquires the projection condition from the information set in step 202.
- the acquired projection conditions are the positional relationship between the three-dimensional image 102 and the projection plane 411, and whether or not the projection method is parallel projection.
- an XYZ coordinate system is set to represent the coordinates of the voxels constituting the three-dimensional image 102.
- the Z axis is set in the body axis direction of the subject, and the XY plane is a cross-sectional image.
- A is an affine transformation matrix for transforming the XYZ coordinate system into the UVW coordinate system, and includes rotation, translation, and scaling; in homogeneous coordinates, Equation 1 is (U, V, W, 1)ᵀ = A (X, Y, Z, 1)ᵀ.
- by multiplying both sides of Equation 1 by the inverse matrix A⁻¹ of A and swapping the sides, the following equation (Equation 2) is obtained, (X, Y, Z, 1)ᵀ = A⁻¹ (U, V, W, 1)ᵀ, which converts the UVW coordinate system into the XYZ coordinate system.
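As an illustrative sketch of Equation 1 and Equation 2 (the actual matrix A depends on the viewpoint settings and is not given explicitly in the text; the function name and parameter choices here are assumptions), the transform and its inverse can be written with homogeneous 4×4 matrices in numpy:

```python
import numpy as np

def affine_matrix(angle_z=0.0, translate=(0.0, 0.0, 0.0), scale=1.0):
    """Hypothetical Equation 1 matrix A (XYZ -> UVW): rotation about the
    Z axis, translation, and uniform scaling, in homogeneous coordinates."""
    c, s = np.cos(angle_z), np.sin(angle_z)
    A = np.array([[c, -s, 0.0, translate[0]],
                  [s,  c, 0.0, translate[1]],
                  [0.0, 0.0, 1.0, translate[2]],
                  [0.0, 0.0, 0.0, 1.0]])
    A[:3, :3] *= scale
    return A

# XYZ -> UVW (Equation 1), then back with the inverse A^-1 (Equation 2)
A = affine_matrix(angle_z=np.pi / 6, translate=(5.0, -2.0, 1.0))
xyz = np.array([1.0, 2.0, 3.0, 1.0])   # homogeneous voxel coordinate
uvw = A @ xyz
assert np.allclose(np.linalg.inv(A) @ uvw, xyz)
```

The round trip through A and A⁻¹ recovers the original voxel coordinate, which is the property step 604 relies on.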
- Whether or not the projection method is parallel projection is based on the projection method selected by the projection method selection unit 420.
- Step 602 The CPU 2 acquires the calculation target area from the information set in step 203.
- in step 203, the position of the knob 521 in the calculation area specifying unit 52 is specified and its length is changed, whereby the calculation target area is set as the distance from the projection plane 411, that is, as the value of W.
- Step 603 The CPU 2 calculates the area on the projection plane 411 corresponding to the calculation target area 700 acquired in step 602. Specifically, the CPU 2 extends a projection line from each voxel in the calculation target region 700 onto the projection plane 411 and calculates the intersection coordinates (u, v) between the projection line and the projection plane 411. For example, when the voxel coordinates are (X₀, Y₀, Z₀), the values of U and V obtained by substituting (X₀, Y₀, Z₀) into Equation 1 give the intersection coordinates (u, v). The calculated intersection coordinates (u, v) do not necessarily coincide with the center coordinates of the pixels on the projection plane 411. The CPU 2 then calculates an area that includes all of the intersection coordinates (u, v) corresponding to each voxel as the area on the projection plane corresponding to the calculation target area 700.
- this step is not essential, the execution of this step limits the area to be handled on the projection plane, so that the amount of subsequent calculations can be reduced and the calculation can be speeded up.
- Step 604 The CPU 2 calculates the coordinates (x, y, z) in the three-dimensional image 102 corresponding to each pixel on the projection plane 411. Specifically, the CPU 2 extends a projection line from each pixel on the projection plane 411 toward the three-dimensional image 102 and calculates the intersection coordinates (x, y, z) between the projection line and the cross-sectional images constituting the three-dimensional image 102. For example, when the pixel coordinates are (U₁, V₁) and the z-coordinate of the cross-sectional image is Z₁, the value of W is first obtained by substituting (U₁, V₁) and Z₁ into Equation 2.
- the intersection coordinates (x, y, z) are then calculated from (U₁, V₁, W) using Equation 2. That is, once the pixel coordinates on the projection plane and the z-coordinate of the cross-sectional image are determined, the intersection coordinates (x, y, z) can be calculated. Although the intersection coordinates (x, y, z) lie on the cross-sectional image, they do not necessarily coincide with the center coordinates of the pixels on the cross-sectional image.
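A sketch of step 604, assuming Equation 2 is represented as a homogeneous 4×4 matrix A⁻¹ mapping UVW to XYZ (the function and matrix layout are illustrative, not from the patent): given a pixel (U₁, V₁) and slice z-coordinate Z₁, the z-row of A⁻¹ is solved for W, after which (x, y, z) follows directly.

```python
import numpy as np

def intersection_xyz(A_inv, U, V, Z1):
    """Solve the z-row of Equation 2 (UVW -> XYZ) for W, given the slice
    z-coordinate Z1, then evaluate the full intersection (x, y, z)."""
    # z = A_inv[2,0]*U + A_inv[2,1]*V + A_inv[2,2]*W + A_inv[2,3] == Z1
    W = (Z1 - A_inv[2, 0] * U - A_inv[2, 1] * V - A_inv[2, 3]) / A_inv[2, 2]
    x, y, z, _ = A_inv @ np.array([U, V, W, 1.0])
    return x, y, z
```

This requires A_inv[2, 2] ≠ 0, i.e. the projection plane must not be parallel to the z axis.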
- Step 605 The CPU 2 slides each voxel based on the intersection coordinates (x, y, z) calculated in step 604 to create a shear image.
- the shear image is an image created so that the intersection between the projection line and each cross-sectional image is arranged in parallel with any of the x, y, and z axes. For example, when the intersection of the projection line and each cross-sectional image is arranged parallel to the z axis, the (x, y) coordinates on the projection line are the same.
- FIG. 8 depicts a three-dimensional image 102 of 8³ voxels.
- FIG. 8A is a perspective view of the three-dimensional image 102 before being slid
- FIG. 8B is a perspective view of the shear image 104 after being slid.
- FIG. 8 (c) shows the shear image 104 as viewed from the z-axis direction.
- each cross-sectional image slides in the same direction in the XY plane, that is, in the direction of the arrow 800 in FIG. 8 (c).
- the slide amount of each cross-sectional image in FIG. 8B differs for each cross-sectional image, but the slide amount difference between adjacent cross-sectional images is equal.
- the slide direction and the slide amount are determined by the positional relationship between the projection plane and the three-dimensional image.
- FIG. 9 shows that a three-dimensional image 902 created by stacking cross-sectional images 902a to 902g in the z-axis direction is projected onto the projection plane 901.
- FIG. 9A shows a state before the voxel of the three-dimensional image 902 is slid
- FIG. 9B shows a state after the voxel is slid and the shear image 904 is created.
- the slice interval between the cross-sectional images 902a to 902g is D
- the angle formed between the three-dimensional image 902 and the projection plane 901 is θ.
- the cross-sectional images 902a to 902g may be slid by a predetermined amount in a direction parallel to the cross-sectional image.
- Cross-sectional images 904a to 904g are obtained by sliding the cross-sectional images 902a to 902g, and shear images 904 are obtained by stacking the cross-sectional images 904a to 904g.
- the projection lines 903a to 903d become projection lines 905a to 905d, and the projection lines 905a to 905d are parallel to the z axis.
- the slide amount s when the voxels of the three-dimensional image 902 are slid in the direction parallel to the cross-sectional images is expressed by the following equation (Equation 3): s = n × D × tanθ
- θ is the angle formed by the three-dimensional image and the projection plane
- D is the slice interval
- n is the number of slices from the reference cross-sectional image
- that is, the slide amount of each voxel is obtained from the angle between the three-dimensional image and the projection plane and the distance from the reference cross-sectional image.
- the slide amount s is a constant value within the same cross-sectional image.
- the slide amount s is not necessarily an integer multiple of the size of the voxel, in order to calculate the voxel value on the projection line, interpolation calculation in the cross-sectional image, that is, in the xy plane in FIG. 9 is required.
- the voxel is slid in the direction parallel to the cross-sectional image, interpolation calculation of the voxel value in the projection direction is unnecessary.
- each voxel may be slid in a direction parallel to the cross-sectional images so that the projection direction is a direction in which the cross-sectional images are stacked.
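The parallel-projection slide described above, where each slice is shifted in-plane by s = n × D × tanθ with a constant difference between adjacent slices, can be sketched in numpy. The axis convention (z, y, x), the slide along the x axis only, the zero padding outside the volume, and the function name are our own assumptions; the patent does not fix these choices.

```python
import numpy as np

def shear_parallel(volume, theta, D=1.0, voxel=1.0):
    """Parallel-projection shear (Equation 3: s = n * D * tan(theta)).

    Slides slice n of a (z, y, x) volume by s along the x axis with
    linear in-plane interpolation, so that the projection lines become
    parallel to the z axis."""
    sheared = np.zeros_like(volume, dtype=float)
    nz, ny, nx = volume.shape
    x = np.arange(nx)
    for n in range(nz):
        s = n * D * np.tan(theta) / voxel  # slide amount in voxel units
        for j in range(ny):
            # new[x] = old[x - s]; a non-integer s needs interpolation,
            # but only within the slice, never along the projection axis
            sheared[n, j] = np.interp(x - s, x, volume[n, j],
                                      left=0.0, right=0.0)
    return sheared
```

After this shear, an oblique structure that follows the projection direction lines up along a z-column, which is what makes the subsequent calculation image cheap.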
- FIG. 10 is a diagram for explaining a slide amount in a plane including a center line 1007 passing through the viewpoint 1006 and orthogonal to the projection plane 1001.
- FIG. 10 shows that a three-dimensional image 1002 created by stacking the cross-sectional images 1002a to 1002g is projected from the viewpoint 1006 onto the projection plane 1001.
- 10A shows a state before the voxel of the three-dimensional image 1002 is slid
- FIG. 10B shows a state after the voxel is slid and the shear image 1004 is created.
- the slice interval of the cross-sectional images 1002a to 1002g is D
- the angle formed between the three-dimensional image 1002 and the projection plane 1001 is θ.
- since the projection lines 1003a to 1003d are emitted radially from the viewpoint 1006, the inclination of the projection line with respect to the projection plane 1001 differs for each projection line. Therefore, in FIG. 10, the inclination of the projection line with respect to the center line 1007 is represented by φ. That is, φ of the projection line 1003a is larger than φ of the projection line 1003b.
- Section images 1004a to 1004g are obtained by sliding the section images 1002a to 1002g, and shear images 1004 are obtained by stacking the section images 1004a to 1004g.
- the projection lines 1003a to 1003d and the center line 1007 become the projection lines 1005a to 1005d and the center line 1008, and the projection lines 1005a to 1005d and the center line 1008 are parallel to the z axis.
- the slide amount s when the voxels of the three-dimensional image 1002 are slid in the direction parallel to the cross-sectional images, within the plane including the center line 1007, is expressed by the following equation (Equation 4): s = n × D × tan(θ ± φ)
- θ is the angle formed by the three-dimensional image and the projection plane
- φ is the angle formed by the center line 1007 and each projection line
- D is the slice interval
- n is the number of slices from the reference cross-sectional image.
- when the reference cross-sectional image is the cross-sectional image 1002a, n = 1 for the cross-sectional image 1002b and n = 2 for the cross-sectional image 1002c.
- the sign before φ is determined by the direction of each projection line: if the direction of the projection line with respect to the cross-sectional images 1002a to 1002g is closer to parallel than the center line 1007, the sign is plus; if it is closer to perpendicular, the sign is minus.
- specifically, in FIG. 10, the slide amount s is n × D × tanθ for the voxels on the center line 1007, n × D × tan(θ + φ) on the projection lines 1003a and 1003b, and n × D × tan(θ − φ) on the projection lines 1003c and 1003d.
- this is because the directions of the projection lines 1003a and 1003b with respect to the cross-sectional images 1002a to 1002g are closer to parallel than the center line 1007, while the directions of the projection lines 1003c and 1003d are closer to perpendicular than the center line 1007.
- in FIG. 10, all the voxels are slid from left to right; however, when φ > θ, the value of n × D × tan(θ − φ) is negative, so those voxels slide in the opposite direction.
- the slide amount s of each voxel can be obtained from the angle between the projection plane and the projection line and the distance from the reference cross-sectional image. That is, in the case of perspective projection, the slide amount s varies depending on the inclination of the projection lines 1003a to 1003d with respect to the cross-sectional images 1002a to 1002g even within the same cross-sectional image.
- since the slide amount s is not necessarily an integer multiple of the size of a voxel, interpolation calculation within the cross-sectional image, that is, within the xy plane in FIG. 10, is required in order to calculate the voxel values on the projection lines. In addition, since the voxels are slid in the direction parallel to the cross-sectional images, interpolation calculation of the voxel values in the projection direction is unnecessary.
- each voxel may be slid in a direction parallel to the cross-sectional images so that the projection direction is a direction in which the cross-sectional images are stacked.
- in Equation 4, if the value of φ is zero, it becomes identical to Equation 3. This indicates that parallel projection corresponds to perspective projection whose viewpoint is set at the point at infinity.
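Equation 3 and Equation 4 can be written as one small function. The `sign` argument encoding which side of the center line the projection line lies on is our own packaging, not notation from the patent:

```python
import math

def slide_amount(n, D, theta, phi=0.0, sign=+1):
    """Equation 4: s = n * D * tan(theta ± phi).

    sign=+1 when the projection line is closer to parallel to the slices
    than the center line; sign=-1 when it is closer to perpendicular.
    With phi = 0 this reduces to the parallel-projection Equation 3."""
    return n * D * math.tan(theta + sign * phi)
```

For example, with θ = 45°, D = 1, and φ = 0, the slide amount for the slice at n = 2 is 2; with φ > θ and sign = −1 the result is negative, matching the reversed slide direction described above.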
- Step 205 The CPU 2 creates a calculation image using the shear image created in step 204.
- a known method can be used as a method for creating the calculation image.
- since the projection lines of the shear image are aligned with one axis, access to the voxel value data in memory can be speeded up, and as a result calculation images can be created at high speed.
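As a sketch of step 205 under the assumption that the shear image is a numpy array whose projection lines run along axis 0, the common calculation images reduce to single axis reductions with no per-ray interpolation (the function name and mode strings are illustrative):

```python
import numpy as np

def calculation_image(sheared, mode="mip"):
    """Create a calculation image from a sheared volume whose projection
    lines are parallel to the z axis (axis 0), so each projection ray is
    simply one z-column of the array."""
    if mode == "mip":      # maximum intensity projection
        return sheared.max(axis=0)
    if mode == "minip":    # minimum intensity projection
        return sheared.min(axis=0)
    if mode == "raysum":   # ray summation
        return sheared.sum(axis=0)
    raise ValueError(mode)
```

Surface rendering, volume rendering, and MPR need more machinery than an axis reduction; the sketch covers only the projection-style images named in the text.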
- the shear image may be divided into a plurality of regions as necessary, and a computed image may be created for each of the divided regions to form an in-volume image. Further, various operations may be executed between a plurality of in-volume images to create an inter-volume image.
- FIG. 11 shows that the three-dimensional image 902 created by stacking the cross-sectional images 902a to 902g is projected onto the projection plane 901, as in FIG.
- calculation target areas 1100a to 1100c are set.
- 11A shows a state before the voxel of the three-dimensional image 902 is slid
- FIG. 11B shows a state after the voxel is slid to create the shear image 904.
- the calculation target areas 1100a to 1100c in the three-dimensional image 902 become calculation target areas 1101a to 1101c in the shear image 904.
- since in-volume images are created for each of the calculation target areas 1100a to 1100c, three in-volume images are created in the example of FIG. 11.
- when creating the in-volume images, the projection lines and the voxels are arranged in parallel, so interpolation calculation of the voxel values in the projection direction is unnecessary and the calculation speed can be increased.
- Acceleration of computation by using the shear image shown in Fig. 11 (b) is also possible when creating an inter-volume image.
- in the state before the voxels are slid, the voxels are arranged obliquely with respect to the projection lines, so interpolation calculation of the voxel values in the projection direction is required.
- in the shear image, interpolation calculation of the voxel values in the projection direction is unnecessary, so the calculation speed can be increased even when creating the inter-volume image.
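A hedged sketch of the in-volume / inter-volume idea, assuming the calculation target areas are z-ranges of the sheared array and MIP is the per-region operation; the region boundaries and the choice of an elementwise maximum as the inter-volume operation are illustrative, since the text allows various operations here:

```python
import numpy as np

def in_volume_images(sheared, regions):
    """Create one in-volume MIP per calculation target region, where each
    region is a (z_start, z_end) range of the sheared volume."""
    return [sheared[z0:z1].max(axis=0) for z0, z1 in regions]

def inter_volume_image(images):
    """Example inter-volume operation: elementwise maximum across the
    in-volume images (other operations, e.g. blending, are possible)."""
    return np.maximum.reduce(images)
```

Because each region is an independent slab of the sheared array, the per-region images can be computed without touching each other's memory.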
- Step 206 The CPU 2 causes the display device 6 to display the calculation image created in step 205. When the operator decides to recreate the displayed calculation image and performs such an operation, the process returns to step 203 or step 202.
- in the above description, the voxels of the cross-sectional images 902a to 902g are slid in the direction parallel to the cross-sectional images so that the projection lines and the voxels are arranged in parallel; however, a shear image can also be created by sliding the voxels in a direction orthogonal to the cross-sectional images.
- FIG. 12 shows an example in which the voxels are slid in a direction orthogonal to the cross-sectional images 902a to 902g.
- when the voxels are slid as shown in FIG. 12, the directions of the projection lines 905a to 905d after the slide are parallel to the cross-sectional images 902a to 902g.
- by utilizing the memory-space independence of shear images, it is possible to divide the memory space into units processed per thread and to pipeline the processing for each thread. Therefore, by creating a shear image, the speed of creating a calculation image from a three-dimensional image can be increased.
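The memory-space independence mentioned here can be sketched with Python threads: after shearing, each projection ray is one z-column, so the volume splits into independent column strips that threads can reduce separately. The strip count and the MIP operation are illustrative choices; a real implementation would more likely use native threads or a GPU.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def mip_threaded(sheared, n_threads=4):
    """MIP over a sheared (z, y, x) volume, with each thread reducing an
    independent strip of x-columns; strips are then concatenated."""
    strips = np.array_split(sheared, n_threads, axis=2)
    with ThreadPoolExecutor(max_workers=n_threads) as ex:
        parts = list(ex.map(lambda s: s.max(axis=0), strips))
    return np.concatenate(parts, axis=1)
```

The result is identical to a single-threaded `sheared.max(axis=0)`; the split only changes which thread touches which memory region.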
- FIG. 4 shows an example of the GUI used in step 202, that is, the GUI used when the operator sets the viewpoint and the projection plane. The GUI shown in FIG. 4 includes an image display unit 41 and a display parameter setting unit 42.
- the image display unit 41 displays a three-dimensional image 102, a viewpoint, and a projection plane 411.
- the display forms of the three-dimensional image 102 and the projection plane 411 displayed on the image display unit 41 change according to the display parameters set by the display parameter setting unit 42.
- the display parameter setting unit 42 includes a projection method selection unit 420, a coordinate system selection unit 421, a rotation angle setting unit 422, a movement amount setting unit 423, and an enlargement ratio setting unit 424.
- the projection method selection unit 420 can select either parallel projection or perspective projection as the projection method.
- Parallel projection is a method in which projection lines are projected in the same direction from a viewpoint set at an infinite point, and all projection lines are parallel.
- Perspective projection is a method of projecting a projection line radially from a certain point of view and is also called central projection.
- the pixel value of the intersection point of each projection line on the projection plane 411 is determined using the voxel value of the intersection point between the projection line and the three-dimensional image 102 that is the projection target.
- in FIG. 4, radio buttons are used as the projection method selection unit 420, but the present invention is not limited to this.
- FIG. 4 since parallel projection is selected, the viewpoint is set to the infinity point and is not displayed on the image display unit 41.
- the coordinate system selection unit 421 can select either image coordinates or projection coordinates.
- the image coordinate is a coordinate system corresponding to the three-dimensional image 102
- the projected coordinate is a coordinate system corresponding to the viewpoint or the projection plane 411.
- Parameters set by the rotation angle setting unit 422 and the movement amount setting unit 423 are valid for the coordinate system selected by the coordinate system selection unit 421.
- a tab is used as the coordinate system selection unit 421, but the present invention is not limited to this.
- in FIG. 4, the image coordinates are selected.
- the rotation angle setting unit 422 can set the rotation angle around each axis of the coordinate system selected by the coordinate system selection unit 421.
- α, β, and γ represent rotation angles around the X, Y, and Z axes, respectively.
- each time one of the rotation angle values is updated, the coordinate system selected by the coordinate system selection unit 421 rotates, the image corresponding to the coordinate system rotates along with it, and the display on the image display unit 41 is updated.
- when the image coordinates are selected by the coordinate system selection unit 421, the viewpoint or the projection plane 411 may be rotated in conjunction with the three-dimensional image 102.
- the rotation angle setting unit 422 in FIG. 4 uses a combination of an edit field and a spin button, the present invention is not limited to this.
- the movement amount setting unit 423 can set the movement amount in each axis direction of the coordinate system selected by the coordinate system selection unit 421. Each time the value of X, Y, or Z is updated, the coordinate system selected by the coordinate system selection unit 421 moves, the image corresponding to the coordinate system moves along with it, and the display on the image display unit 41 is updated. When the image coordinates are selected by the coordinate system selection unit 421, the viewpoint or the projection plane 411 may be moved in conjunction with the three-dimensional image 102.
- the movement amount setting unit 423 in FIG. 4 uses a combination of an edit field and a spin button, but is not limited to this.
- the enlargement ratio setting unit 424 can set the enlargement ratio used when displaying the image corresponding to the coordinate system selected by the coordinate system selection unit 421. Since the image is displayed at a size multiplied by the set enlargement factor, setting the enlargement factor to 1 displays the image at full size.
- the enlargement ratio setting unit 424 in FIG. 4 uses an edit field, but the present invention is not limited to this.
- the 3D image 102 displayed on the image display unit 41, the viewpoint, and the projection plane 411 may be rotated, moved, or enlarged by the operator performing a drag operation with the mouse 8.
- FIG. 5 shows an example of the GUI used in step 203, that is, the GUI used when the operator sets the calculation image creation conditions.
- the GUI 50 shown in FIG. 5A includes a calculation image display unit 51, a calculation region designation unit 52, a volume number setting unit 53, and a calculation execution button 57.
- an in-volume image or an inter-volume image created as a calculation image is displayed.
- the in-volume image is an image created by performing an operation on volume data in an area designated as an operation target.
- An inter-volume image is an image created by performing various operations between a plurality of intra-volume images. The computation executed when creating the inter-volume image may be different from the computation executed when creating the in-volume image.
- the calculation area designating unit 52 is used for designating the position and area to be calculated.
- a scroll bar is used as the calculation area specifying unit 52, and the position of the calculation target is specified by moving the knob 521 on the scroll bar.
- the direction of the scroll bar corresponds to the direction perpendicular to the projection plane set in step 202.
- the length of the knob 521 is variable, and the area to be calculated can be changed by changing the length of the knob 521.
- a volume specifying unit 54 described later is displayed.
- the volume number setting unit 53 is used to set the number of volumes to be subjected to inter-volume calculation. As the numerical value set by the volume number setting unit 53 increases, the length of the knob 521 increases. If the numerical value set by the volume number setting unit 53 is 1, the calculated image displayed on the calculated image display unit 51 is an in-volume image. Note that the numerical value displayed in the volume number setting unit 53 may be changed in accordance with the change in the length of the knob 521.
- Fig. 5 (b) shows an example of the volume designation unit 54.
- the volume designation unit 54 includes a volume interval setting unit 541, a volume number display unit 542, and a volume width setting unit 545.
- the volume interval setting unit 541 is used for setting the volume interval.
- the volume width setting unit 545 is used for setting the volume width.
- the volume number display section 542 displays an axis 543 and a scale 544.
- the number of volumes is represented by the number of scales 544.
- the interval of the scale 544 changes according to the value of the volume interval.
- the length of the axis 543 changes according to the value of the volume width.
- an in-volume image creation condition setting unit 55 described later is displayed.
- when one of the scales 544 is clicked, a knob may be displayed on the clicked scale. Clicking between the scales 544 displays an inter-volume image creation condition setting unit 56 described later.
- FIG. 5 (c) shows an example of the in-volume image creation condition setting unit 55.
- the in-volume image creation condition setting unit 55 includes a slab thickness setting unit 551, a slice pitch setting unit 552, an operation parameter setting unit 553, and an operator selection unit 554.
- the slab thickness setting unit 551 is used to set the slab thickness of the region that is the target of the in-volume image.
- the slice pitch setting unit 552 is used to set a slice pitch in a region that is a target of an in-volume image.
- the operator selection unit 554 is used to select an operator used to create an in-volume image.
- the operator selection unit 554 can select the type of operator executed on the volume data. In the operator selection unit 554 in FIG. 5 (c), a pull-down menu is used, but the present invention is not limited to this.
- the types of operations include arithmetic operations, comparison operations, and in-volume operations. Hereinafter, the type of each calculation will be described.
- Arithmetic operations are operations based on the four basic arithmetic operations, for example weighted addition. Specifically, there are averaging, in which all the weighting factors are equal; weighted raysum, which performs subtraction by using negative values for some weighting factors; α blending, in which the sum of the weighting factors is 1; and the like.
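These arithmetic operations can be illustrated with a small sketch along one projection line; the helper name and the weight values are examples only, not the patent's formulas:

```python
import numpy as np

def weighted_raysum(values, weights):
    """Weighted sum of voxel values along one projection line."""
    return float(np.dot(weights, values))

line = np.array([10.0, 20.0, 30.0])          # voxel values on one projection line
avg = weighted_raysum(line, np.full(3, 1.0 / 3.0))        # averaging: all weights equal
diff = weighted_raysum(line, np.array([1.0, -1.0, 0.0]))  # subtraction: a negative weight
blend = weighted_raysum(line, np.array([0.5, 0.3, 0.2]))  # alpha blending: weights sum to 1
```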
- the comparison operation is an operation that determines the pixel value on the projection plane by comparing the voxel values on the projection line. Specifically, there are the MIP operation, which projects the maximum voxel value on the projection line onto the projection plane, and the MinIP operation, which projects the minimum voxel value on the projection line onto the projection plane.
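Assuming a volume stored as a (z, y, x) array with parallel projection along the z axis, so that each projection line is one column of voxels, MIP and MinIP reduce to per-column maxima and minima, as in this illustrative sketch:

```python
import numpy as np

def mip(volume):
    """Maximum intensity projection: maximum voxel value on each projection line."""
    return volume.max(axis=0)

def minip(volume):
    """Minimum intensity projection: minimum voxel value on each projection line."""
    return volume.min(axis=0)
```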
- In-volume computation is computation that does not depend on the pixel position on the projection plane. Specifically, there are rendering, which creates a projection image based on the opacity set according to the voxel value, and Crystal (count value image), which performs a weighted product-sum operation between cross-sectional images by setting a weighting factor for each voxel value.
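The rendering operation mentioned here is commonly implemented as front-to-back compositing; the following is a minimal sketch of that general technique, not the patent's specific method. The opacity mapping is a placeholder supplied by the caller, and early ray termination is an optional optimization:

```python
def composite(values, opacity_of):
    """Front-to-back compositing of voxel values along one projection line."""
    color, alpha = 0.0, 0.0
    for v in values:
        a = opacity_of(v)
        color += (1.0 - alpha) * a * v  # contribution weighted by remaining transparency
        alpha += (1.0 - alpha) * a
        if alpha >= 0.999:              # early ray termination: ray is effectively opaque
            break
    return color
```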
- in the calculation parameter setting unit 553, the parameters necessary for the setting are displayed according to the operator selected by the operator selection unit 554.
- the operator can change the parameters displayed on the calculation parameter setting unit 553 by operating a mouse or the like.
- in FIG. 5(c), weighted raysum is selected as the operator, and the weighting coefficients are displayed in the calculation parameter setting unit 553.
- FIG. 5 (d) shows an example of the inter-volume image creation condition setting unit 56.
- the inter-volume image creation condition setting unit 56 includes a calculation parameter setting unit 561 and an operator selection unit 562.
- the operator selection unit 562 is used to select the operator used to create an inter-volume image, and is the same as the operator selection unit 554 in FIG. 5(c).
- the calculation parameter setting unit 561 displays parameters necessary for setting according to the operator selected by the operator selection unit 562.
- the operator can change the parameter displayed on the calculation parameter setting unit 561 by operating a mouse or the like.
- in FIG. 5(d), MIP is selected as the operator. Since the MIP operation requires no parameters to be set, nothing is displayed in the calculation parameter setting unit 561.
- the GUI used to set the calculation image creation conditions is not limited to that shown in FIG. 5.
- the CPU 2 advances the processing to step 204.
- the processing flow of the second embodiment is substantially the same as FIG. 2. However, the GUI used in step 202 and the flow of processing executed in step 203 differ. Hereinafter, the differences from the first embodiment will be described.
- FIG. 13 is an example of a GUI used in the second embodiment. Differences from the GUI 40 used in the first embodiment shown in FIG. 4 will be described.
- the GUI 110 used in this embodiment includes a projection plane shape designation unit 1300.
- the projection surface shape designation unit 1300 can designate the shape of the projection surface.
- the storage device 4 stores various projection plane shapes and projection plane shape identification numbers, which are numbers for identifying the projection plane shapes, in association with each other. The operator selects a desired projection plane shape by inputting a projection plane shape identification number into the projection plane shape designation unit 1300.
- a GUI that can set a partial curvature of the projection plane may be used.
- the projection coordinate is selected by the coordinate system selection unit 421.
- the shape of the calculation target region acquired in step 602 of FIG. 6 is a shape that follows the projection surface, which is a curved surface; the other steps are the same as in FIG. 6. That is, even when the projection surface is a curved surface, creating a shear image makes it possible to speed up the creation of a calculation image from a three-dimensional image.
- 1 Medical image display device, 2 CPU, 3 Main memory, 4 Storage device, 5 Display memory, 6 Display device, 7 Controller, 8 Mouse, 9 Keyboard, 10 Network adapter, 11 System bus, 12 Network, 13 Medical imaging device, 14 Medical image database, 101 Cross-sectional image, 102 Stacked 3D image
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Computer Graphics (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Image Generation (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Abstract
Description
A first embodiment of the present invention will be described with reference to FIGS. 2 to 11. In this embodiment, the array of voxels constituting a three-dimensional image is rearranged in memory according to the angle of the projection plane and the projection method, and a projection image is created using the voxel data of the rearranged voxels. FIG. 2 is an example of the processing flow of the first embodiment of the present invention. Each step in FIG. 2 is described below.
The CPU 2 acquires, as a three-dimensional image, the medical image selected by the operator using the mouse 8 or keyboard 9, from the medical imaging apparatus 13 or the medical image database 14 via the network 12. As shown in FIG. 3, the three-dimensional image 102 is created by stacking cross-sectional images 101 captured with a medical imaging apparatus. The medical image acquired in this step may be the entire three-dimensional image 102 as shown in FIG. 3, or a specific region within the three-dimensional image 102. The specific region within the three-dimensional image 102 may be a region extracted by threshold processing executed by the CPU 2 using a predetermined threshold, or a region designated by the operator using the mouse 8 or keyboard 9.
The CPU 2 acquires information on the viewpoint and projection plane that the operator, using the mouse 8 or keyboard 9, has set for the three-dimensional image acquired in step 201. An example of the GUI (Graphical User Interface) used when the operator sets the viewpoint and projection plane will be described in detail later with reference to FIG. 4.
The CPU 2 acquires the conditions necessary for creating a calculation image. Here, a calculation image is an image such as a surface rendering image, a volume rendering image, a MIP image, a MinIP image, a raysum image, or an MPR image. An example of the GUI used when the operator sets the calculation image creation conditions will be described in detail later with reference to FIG. 5.
The CPU 2 creates a shear image based on the parameters set in step 202. A shear image is an image created so that the projection lines and the voxels are arranged in parallel. This step may be executed prior to step 203.
An example of the detailed flow of the shear image creation processing will be described with reference to FIG. 6.
The CPU 2 acquires the projection conditions from the information set in step 202. The acquired projection conditions are the positional relationship between the three-dimensional image 102 and the projection plane 411, and whether or not the projection method is parallel projection.
The CPU 2 acquires the calculation target region from the information set in step 203. In step 203, by specifying the position of the knob 521 in the calculation region designation unit 52 and changing its length, the calculation target region is set as a distance from the projection plane 411, that is, as a value of W. FIG. 7 shows an example in which the range from the plane W=W1 to the plane W=W2 is set as the calculation target region 700.
The CPU 2 calculates the region on the projection plane 411 corresponding to the calculation target region 700 acquired in step 602. Specifically, the CPU 2 extends a projection line from each voxel in the calculation target region 700 onto the projection plane 411 and calculates the intersection coordinates (u, v) of the projection line and the projection plane 411. For example, when the voxel coordinates are (X0, Y0, Z0), the values of U and V obtained by substituting (X0, Y0, Z0) into Equation 1 become the intersection coordinates (u, v). The calculated intersection coordinates (u, v) do not necessarily coincide with the center coordinates of a pixel on the projection plane 411. The CPU 2 calculates a region that can contain all of the intersection coordinates (u, v) corresponding to the voxels as the region on the projection plane corresponding to the calculation target region 700.
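Equations 1 and 2 themselves are not reproduced in this excerpt, so the mapping from a voxel to intersection coordinates (u, v) can only be sketched generically. Under parallel projection along the plane normal, one common formulation projects the voxel's offset from a plane origin onto the plane's orthonormal in-plane axes; all names below are assumptions, not the patent's notation:

```python
import numpy as np

def voxel_to_plane(p, origin, e_u, e_v):
    """Map voxel position p to (u, v) on the plane spanned by orthonormal axes e_u, e_v."""
    rel = np.asarray(p, float) - np.asarray(origin, float)
    return float(rel @ e_u), float(rel @ e_v)
```

As the text notes, the resulting (u, v) generally falls between pixel centers on the projection plane.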
The CPU 2 calculates the coordinates (x, y, z) in the three-dimensional image 102 corresponding to each pixel on the projection plane 411. Specifically, the CPU 2 extends a projection line from each pixel on the projection plane 411 into the three-dimensional image 102, and calculates the intersection coordinates (x, y, z) of the projection line with each cross-sectional image that constitutes the three-dimensional image 102 and is defined by its z coordinate. For example, when the pixel coordinates are (U1, V1) and the z coordinate of a cross-sectional image is Z1, the value of W is first obtained by substituting (U1, V1) and Z1 into Equation 2. Next, by substituting the obtained value of W and (U1, V1) into Equation 2, the values of X and Y are obtained, and as a result the intersection coordinates (x, y, z) are calculated. In other words, once the pixel coordinates on the projection plane and the z coordinate of the cross-sectional image are determined, the intersection coordinates (x, y, z) can be calculated. Although the intersection coordinates (x, y, z) lie on the cross-sectional image, they do not necessarily coincide with the center coordinates of a pixel on the cross-sectional image.
The CPU 2 creates a shear image by sliding each voxel based on the intersection coordinates (x, y, z) calculated in step 604. A shear image is an image created so that the intersections of the projection lines with the cross-sectional images are arranged parallel to one of the x, y, and z axes. For example, if the intersections of a projection line with the cross-sectional images are arranged parallel to the z axis, the (x, y) coordinates on that projection line become identical. Once such a shear image has been created, calculating the pixel value at an arbitrary pixel coordinate (U, V) on the projection plane only requires handling the voxel values of the voxels in the shear image whose (x, y) coordinates correspond to (U, V). As a result, access to the data in memory can be accelerated, enabling fast display of the projection image.
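For parallel projection, the voxel slide described above can be sketched as a per-slice shift. Assuming slices stacked along z with spacing D and projection lines tilted by angle θ from the z axis in the x–z plane (a simplification of the general case), slice n is shifted by s = n·D·tanθ, rounded to the nearest voxel, so that each projection line becomes a constant-(x, y) column:

```python
import numpy as np

def shear_parallel(volume, theta, D=1.0):
    """Shift slice n of a (z, y, x) volume along x by n*D*tan(theta), nearest-voxel rounding."""
    sheared = np.zeros_like(volume)
    for n in range(volume.shape[0]):
        s = int(round(n * D * np.tan(theta)))        # slide amount for slice n
        sheared[n] = np.roll(volume[n], s, axis=1)   # shift along the x axis
    return sheared
```

After this shear, a reduction such as MIP becomes a simple operation over constant-(x, y) columns, which is the memory-access speedup the text describes.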
In Equation 4, the sign preceding Δθ is determined by the direction of each projection line: it is positive if the direction of the projection line with respect to the cross-sectional images 1002a to 1002g is closer to parallel than the central line 1007, and negative if it is closer to perpendicular. Described concretely using FIG. 10(b), the slide amount s is n·D·tanθ for voxels on the central line 1007, n·D·tan(θ+Δθ) on the projection lines 1003a and 1003b, and n·D·tan(θ−Δθ) on the projection lines 1003c and 1003d. The directions of the projection lines 1003a and 1003b with respect to the cross-sectional images 1002a to 1002g are closer to parallel than the central line 1007, and the directions of the projection lines 1003c and 1003d are closer to perpendicular. In FIG. 10(b), all the voxels are slid from left to right; however, when Δθ>θ, the value of n·D·tan(θ−Δθ) becomes negative, so those voxels are slid in the opposite direction.
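The slide amounts quoted above can be captured in a one-line helper. This is a sketch of the stated formula only; the caller supplies the sign of Δθ according to the parallel/perpendicular rule described, and a negative result means a slide in the opposite direction:

```python
import math

def slide_amount(n, D, theta, d_theta=0.0):
    """Slide for slice n: n*D*tan(theta + d_theta); d_theta is signed by the caller."""
    return n * D * math.tan(theta + d_theta)
```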
The CPU 2 creates a calculation image using the shear image created in step 204. A known method can be used to create the calculation image. In the shear image, the projection lines and the voxels are arranged in parallel, so access to the voxel value data in memory can be accelerated. As a result, the calculation image can be created at high speed.
The CPU 2 displays the calculation image created in step 205 on the display device 6. If the operator decides to re-create the displayed calculation image and performs such an operation, the processing returns to step 203 or step 202.
The projection method selection unit 420 allows either parallel projection or perspective projection to be selected as the projection method. Parallel projection is a method of projecting by extending projection lines in the same direction from a viewpoint set at a point at infinity; all the projection lines are parallel. Perspective projection is a method of projecting by extending projection lines radially from a single viewpoint, and is also called central projection. In either projection method, the pixel value at the intersection of the projection plane 411 with each projection line is determined using the voxel values at the intersections of the projection line with the three-dimensional image 102, which is the projection target. Radio buttons are used in the projection method selection unit 420 in FIG. 4, but the present invention is not limited to this. In FIG. 4, parallel projection is selected, so the viewpoint is at a point at infinity and is not displayed on the image display unit 41.
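The distinction between the two projection methods can be sketched as follows: under parallel projection every pixel shares a single ray direction, while under perspective (central) projection each ray runs from the one viewpoint through its pixel, so the directions fan out. Function and parameter names here are illustrative:

```python
import numpy as np

def ray_direction(pixel_pos, viewpoint=None, parallel_dir=None):
    """Unit ray direction for a pixel: shared under parallel projection,
    fanned out from the single viewpoint under perspective projection."""
    if viewpoint is None:                  # parallel projection: one shared direction
        d = np.asarray(parallel_dir, float)
    else:                                  # perspective (central) projection
        d = np.asarray(pixel_pos, float) - np.asarray(viewpoint, float)
    return d / np.linalg.norm(d)
```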
A second embodiment of the present invention will be described with reference to the drawings. In the first embodiment, the case where the projection plane 411 is a flat plane was described. In this embodiment, a case where a curved surface can be selected as the projection plane will be described. When diagnosing a luminal organ such as a blood vessel or the large intestine, diagnosis becomes easier if a cross-sectional image parallel to the running direction of the luminal organ is created. Creating such a cross-sectional image requires treating a curved surface as the projection plane.
Claims (12)
- A medical image display device having a display unit that displays a three-dimensional image created based on cross-sectional images of a subject, the device comprising:
a voxel slide unit that slides each voxel constituting the three-dimensional image in one direction according to the angle of a projection plane set for the three-dimensional image and a projection method; and
a projection image creation unit that creates a projection image using the slid voxel data and causes the display unit to display the projection image. - The medical image display device according to claim 1, wherein
the voxel slide unit determines the slide amount of each voxel according to the inclination of each projection line with respect to the projection plane. - The medical image display device according to claim 2, wherein
the slide amount is constant within the same cross-sectional image when the projection method is parallel projection. - The medical image display device according to claim 2, wherein
the slide amount differs according to the inclination of each projection line with respect to the projection plane when the projection method is perspective projection. - The medical image display device according to claim 1, wherein
the voxel slide unit slides the voxels in a direction parallel to the cross-sectional images. - The medical image display device according to claim 1, further comprising
a projection condition reception unit that receives settings of the angle of the projection plane and the projection method. - A medical image display method for displaying a three-dimensional image created based on cross-sectional images of a subject, the method comprising:
a voxel slide step of sliding each voxel constituting the three-dimensional image in one direction according to the angle of a projection plane set for the three-dimensional image and a projection method; and
a projection image creation step of creating a projection image using the slid voxel data and displaying the projection image. - The medical image display method according to claim 7, wherein
in the voxel slide step, the slide amount of each voxel is determined according to the inclination of each projection line with respect to the projection plane. - The medical image display method according to claim 8, wherein
the slide amount is constant within the same cross-sectional image when the projection method is parallel projection. - The medical image display method according to claim 8, wherein
the slide amount differs according to the inclination of each projection line with respect to the projection plane when the projection method is perspective projection. - The medical image display method according to claim 7, wherein
in the voxel slide step, the voxels are slid in a direction parallel to the cross-sectional images. - The medical image display method according to claim 7, wherein
a projection condition reception step of receiving settings of the angle of the projection plane and the projection method is provided before the voxel slide step.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012542867A JPWO2012063653A1 (ja) | 2010-11-12 | 2011-10-28 | 医用画像表示装置及び医用画像表示方法 |
US13/882,384 US20130222383A1 (en) | 2010-11-12 | 2011-10-28 | Medical image display device and medical image display method |
CN201180053602.8A CN103188998B (zh) | 2010-11-12 | 2011-10-28 | 医用图像显示装置以及医用图像显示方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010253338 | 2010-11-12 | ||
JP2010-253338 | 2010-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012063653A1 true WO2012063653A1 (ja) | 2012-05-18 |
Family
ID=46050805
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/074891 WO2012063653A1 (ja) | 2010-11-12 | 2011-10-28 | 医用画像表示装置及び医用画像表示方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130222383A1 (ja) |
JP (1) | JPWO2012063653A1 (ja) |
CN (1) | CN103188998B (ja) |
WO (1) | WO2012063653A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014082015A1 (en) * | 2012-11-23 | 2014-05-30 | Icad, Inc. | System and method for improving workflow efficiencies in reading tomosynthesis medical image data |
CN104619258A (zh) * | 2012-09-13 | 2015-05-13 | 富士胶片株式会社 | 三维图像显示装置、方法及程序 |
US9456797B2 (en) | 2002-11-27 | 2016-10-04 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US9805507B2 (en) | 2012-02-13 | 2017-10-31 | Hologic, Inc | System and method for navigating a tomosynthesis stack using synthesized image data |
US10008184B2 (en) | 2005-11-10 | 2018-06-26 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
JP2018526708A (ja) * | 2015-08-13 | 2018-09-13 | ビューワークス カンパニー リミテッド | 時系列イメージ分析のためのグラフィックユーザーインタフェース提供方法 |
JP2019180866A (ja) * | 2018-04-10 | 2019-10-24 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、教師データ作成プログラム及び教師データ作成方法 |
US10573276B2 (en) | 2011-11-27 | 2020-02-25 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11403483B2 (en) | 2017-06-20 | 2022-08-02 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US11406332B2 (en) | 2011-03-08 | 2022-08-09 | Hologic, Inc. | System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy |
- US11419565B2 (en) | 2014-02-28 | 2022-08-23 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
US11445993B2 (en) | 2017-03-30 | 2022-09-20 | Hologic, Inc. | System and method for targeted object enhancement to generate synthetic breast tissue images |
US11455754B2 (en) | 2017-03-30 | 2022-09-27 | Hologic, Inc. | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement |
US11452486B2 (en) | 2006-02-15 | 2022-09-27 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
US11589944B2 (en) | 2013-03-15 | 2023-02-28 | Hologic, Inc. | Tomosynthesis-guided biopsy apparatus and method |
US11701199B2 (en) | 2009-10-08 | 2023-07-18 | Hologic, Inc. | Needle breast biopsy system and method of use |
US11775156B2 (en) | 2010-11-26 | 2023-10-03 | Hologic, Inc. | User interface for medical image review workstation |
US11957497B2 (en) | 2017-03-30 | 2024-04-16 | Hologic, Inc | System and method for hierarchical multi-level feature image synthesis and representation |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104337535A (zh) * | 2013-08-02 | 2015-02-11 | 上海联影医疗科技有限公司 | 计算机断层成像方法和装置 |
US20160306936A1 (en) * | 2015-04-15 | 2016-10-20 | Canon Kabushiki Kaisha | Diagnosis support system, information processing method, and program |
JP6667231B2 (ja) * | 2015-08-31 | 2020-03-18 | キヤノン株式会社 | 情報処理装置、画像処理装置、情報処理システム、情報処理方法、及びプログラム。 |
EP3509033B1 (en) | 2016-08-30 | 2024-02-28 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, image processing program, and image processing system |
WO2019065466A1 (ja) * | 2017-09-29 | 2019-04-04 | キヤノン株式会社 | 画像処理装置、画像処理方法、およびプログラム |
JP6921711B2 (ja) * | 2017-10-31 | 2021-08-18 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
CN110297332B (zh) * | 2019-06-28 | 2021-08-27 | 京东方科技集团股份有限公司 | 三维显示装置及其控制方法 |
CN112184629B (zh) * | 2020-09-07 | 2022-08-09 | 上海培云教育科技有限公司 | 一种pet色彩化肿瘤体旋转显示方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61278976A (ja) * | 1985-05-31 | 1986-12-09 | Shimadzu Corp | X線ctリフオ−マツテイング像の再構成方法 |
JPH10192271A (ja) * | 1997-01-10 | 1998-07-28 | Toshiba Corp | X線ct装置及び画像処理装置 |
JPH11508386A (ja) * | 1997-04-15 | 1999-07-21 | ザ リサーチ ファウンデーション オブ ステイト ユニヴァーシティ オブ ニューヨーク | 平行法及び遠近法によりボリュームを実時間で視覚化する装置及び方法 |
JP2001104291A (ja) * | 1999-10-06 | 2001-04-17 | Ge Yokogawa Medical Systems Ltd | X線ct装置 |
JP2001283249A (ja) * | 2000-04-03 | 2001-10-12 | Hitachi Medical Corp | 画像表示装置 |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4908573A (en) * | 1989-01-05 | 1990-03-13 | The Regents Of The University Of California | 3D image reconstruction method for placing 3D structure within common oblique or contoured slice-volume without loss of volume resolution |
US5544283A (en) * | 1993-07-26 | 1996-08-06 | The Research Foundation Of State University Of New York | Method and apparatus for real-time volume rendering from an arbitrary viewing direction |
WO1996007989A1 (en) * | 1994-09-06 | 1996-03-14 | The Research Foundation Of State University Of New York | Apparatus and method for real-time volume visualization |
US5787889A (en) * | 1996-12-18 | 1998-08-04 | University Of Washington | Ultrasound imaging with real time 3D image reconstruction and visualization |
US6313841B1 (en) * | 1998-04-13 | 2001-11-06 | Terarecon, Inc. | Parallel volume rendering system with a resampling module for parallel and perspective projections |
US6556199B1 (en) * | 1999-08-11 | 2003-04-29 | Advanced Research And Technology Institute | Method and apparatus for fast voxelization of volumetric models |
CA2286447C (en) * | 1999-10-15 | 2009-01-06 | Vittorio Accomazzi | Perspective with shear warp |
GB2361396B (en) * | 2000-04-10 | 2002-04-03 | Voxar Ltd | Imaging volume data |
WO2002061686A1 (de) * | 2001-01-29 | 2002-08-08 | Dkfz | Verfahren und vorrichtung zur bildrekonstruktion eines raumvolumens |
US6570952B2 (en) * | 2001-02-27 | 2003-05-27 | Siemens Corporate Research, Inc. | Memory efficient shear-warp voxel projection algorithm |
WO2002078545A1 (fr) * | 2001-03-28 | 2002-10-10 | Hitachi Medical Corporation | Dispositif d'affichage d'images a trois dimensions |
US7003175B2 (en) * | 2001-03-28 | 2006-02-21 | Siemens Corporate Research, Inc. | Object-order multi-planar reformatting |
USRE45759E1 (en) * | 2001-07-31 | 2015-10-20 | Koninklijke Philips N.V. | Transesophageal and transnasal, transesophageal ultrasound imaging systems |
EP1455307A1 (en) * | 2003-03-06 | 2004-09-08 | MeVis GmbH | Partial volume visualization |
US7656418B2 (en) * | 2003-06-11 | 2010-02-02 | Koninklijke Philips Electronics N.V. | User control of 3d volume plane crop |
US7250949B2 (en) * | 2003-12-23 | 2007-07-31 | General Electric Company | Method and system for visualizing three-dimensional data |
JP4130428B2 (ja) * | 2004-09-02 | 2008-08-06 | ザイオソフト株式会社 | 画像処理方法及び画像処理プログラム |
KR100669900B1 (ko) * | 2004-12-16 | 2007-01-17 | 한국전자통신연구원 | 이미지 기반의 볼륨 데이터 제거 방법 |
US7453983B2 (en) * | 2005-01-20 | 2008-11-18 | Carestream Health, Inc. | Radiation therapy method with target detection |
JP4213135B2 (ja) * | 2005-04-22 | 2009-01-21 | ザイオソフト株式会社 | 展開画像投影方法、展開画像投影プログラム、展開画像投影装置 |
US7307630B2 (en) * | 2005-08-26 | 2007-12-11 | Barco Nv | Volume rendering apparatus and method |
US20080292164A1 (en) * | 2006-08-29 | 2008-11-27 | Siemens Corporate Research, Inc. | System and method for coregistration and analysis of non-concurrent diffuse optical and magnetic resonance breast images |
US20080177163A1 (en) * | 2007-01-19 | 2008-07-24 | O2 Medtech, Inc. | Volumetric image formation from optical scans of biological tissue with multiple applications including deep brain oxygenation level monitoring |
US7856129B2 (en) * | 2007-03-09 | 2010-12-21 | Siemens Medical Solutions Usa, Inc. | Acceleration of Joseph's method for full 3D reconstruction of nuclear medical images from projection data |
JP2008259612A (ja) * | 2007-04-11 | 2008-10-30 | Fujifilm Corp | 投影画像生成装置およびそのプログラム |
JP4545169B2 (ja) * | 2007-04-12 | 2010-09-15 | 富士フイルム株式会社 | 画像表示方法、装置およびプログラム |
JP5523681B2 (ja) * | 2007-07-05 | 2014-06-18 | 株式会社東芝 | 医用画像処理装置 |
US9251585B2 (en) * | 2007-07-12 | 2016-02-02 | Siemens Aktiengesellschaft | Coregistration and analysis of multi-modal images obtained in different geometries |
US7978191B2 (en) * | 2007-09-24 | 2011-07-12 | Dolphin Imaging Systems, Llc | System and method for locating anatomies of interest in a 3D volume |
US9427173B2 (en) * | 2008-05-09 | 2016-08-30 | General Electric Company | Determining mechanical force on aneurysms from a fluid dynamic model driven by vessel blood flow information |
US8184890B2 (en) * | 2008-12-26 | 2012-05-22 | Three Palm Software | Computer-aided diagnosis and visualization of tomosynthesis mammography data |
AU2010292181B2 (en) * | 2009-09-09 | 2016-09-15 | Oregon Health & Science University | Automated detection of melanoma |
CN107403058B (zh) * | 2010-07-21 | 2021-04-16 | 阿敏·E·莫尔勒 | 图像报告方法 |
-
2011
- 2011-10-28 WO PCT/JP2011/074891 patent/WO2012063653A1/ja active Application Filing
- 2011-10-28 CN CN201180053602.8A patent/CN103188998B/zh not_active Expired - Fee Related
- 2011-10-28 US US13/882,384 patent/US20130222383A1/en not_active Abandoned
- 2011-10-28 JP JP2012542867A patent/JPWO2012063653A1/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61278976A (ja) * | 1985-05-31 | 1986-12-09 | Shimadzu Corp | X線ctリフオ−マツテイング像の再構成方法 |
JPH10192271A (ja) * | 1997-01-10 | 1998-07-28 | Toshiba Corp | X線ct装置及び画像処理装置 |
JPH11508386A (ja) * | 1997-04-15 | 1999-07-21 | ザ リサーチ ファウンデーション オブ ステイト ユニヴァーシティ オブ ニューヨーク | 平行法及び遠近法によりボリュームを実時間で視覚化する装置及び方法 |
JP2001104291A (ja) * | 1999-10-06 | 2001-04-17 | Ge Yokogawa Medical Systems Ltd | X線ct装置 |
JP2001283249A (ja) * | 2000-04-03 | 2001-10-12 | Hitachi Medical Corp | 画像表示装置 |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10413263B2 (en) | 2002-11-27 | 2019-09-17 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US9456797B2 (en) | 2002-11-27 | 2016-10-04 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US9808215B2 (en) | 2002-11-27 | 2017-11-07 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US10010302B2 (en) | 2002-11-27 | 2018-07-03 | Hologic, Inc. | System and method for generating a 2D image from a tomosynthesis data set |
US10008184B2 (en) | 2005-11-10 | 2018-06-26 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11452486B2 (en) | 2006-02-15 | 2022-09-27 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
US11918389B2 (en) | 2006-02-15 | 2024-03-05 | Hologic, Inc. | Breast biopsy and needle localization using tomosynthesis systems |
US11701199B2 (en) | 2009-10-08 | 2023-07-18 | Hologic, Inc. | Needle breast biopsy system and method of use |
US11775156B2 (en) | 2010-11-26 | 2023-10-03 | Hologic, Inc. | User interface for medical image review workstation |
US11406332B2 (en) | 2011-03-08 | 2022-08-09 | Hologic, Inc. | System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy |
US10573276B2 (en) | 2011-11-27 | 2020-02-25 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US10978026B2 (en) | 2011-11-27 | 2021-04-13 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11837197B2 (en) | 2011-11-27 | 2023-12-05 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US11508340B2 (en) | 2011-11-27 | 2022-11-22 | Hologic, Inc. | System and method for generating a 2D image using mammography and/or tomosynthesis image data |
US9805507B2 (en) | 2012-02-13 | 2017-10-31 | Hologic, Inc | System and method for navigating a tomosynthesis stack using synthesized image data |
US11663780B2 (en) | 2012-02-13 | 2023-05-30 | Hologic Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
US10977863B2 (en) | 2012-02-13 | 2021-04-13 | Hologic, Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
US10410417B2 (en) | 2012-02-13 | 2019-09-10 | Hologic, Inc. | System and method for navigating a tomosynthesis stack using synthesized image data |
CN104619258A (zh) * | 2012-09-13 | 2015-05-13 | 富士胶片株式会社 | 三维图像显示装置、方法及程序 |
WO2014082015A1 (en) * | 2012-11-23 | 2014-05-30 | Icad, Inc. | System and method for improving workflow efficiencies in reading tomosynthesis medical image data |
US8983156B2 (en) | 2012-11-23 | 2015-03-17 | Icad, Inc. | System and method for improving workflow efficiences in reading tomosynthesis medical image data |
US11589944B2 (en) | 2013-03-15 | 2023-02-28 | Hologic, Inc. | Tomosynthesis-guided biopsy apparatus and method |
- US11419565B2 (en) | 2014-02-28 | 2022-08-23 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
US11801025B2 (en) | 2014-02-28 | 2023-10-31 | Hologic, Inc. | System and method for generating and displaying tomosynthesis image slabs |
JP2018526708A (ja) * | 2015-08-13 | 2018-09-13 | ビューワークス カンパニー リミテッド | 時系列イメージ分析のためのグラフィックユーザーインタフェース提供方法 |
US11455754B2 (en) | 2017-03-30 | 2022-09-27 | Hologic, Inc. | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement |
US11445993B2 (en) | 2017-03-30 | 2022-09-20 | Hologic, Inc. | System and method for targeted object enhancement to generate synthetic breast tissue images |
US11957497B2 (en) | 2017-03-30 | 2024-04-16 | Hologic, Inc | System and method for hierarchical multi-level feature image synthesis and representation |
US11983799B2 (en) | 2017-03-30 | 2024-05-14 | Hologic, Inc. | System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement |
US11403483B2 (en) | 2017-06-20 | 2022-08-02 | Hologic, Inc. | Dynamic self-learning medical image method and system |
US11850021B2 (en) | 2017-06-20 | 2023-12-26 | Hologic, Inc. | Dynamic self-learning medical image method and system |
JP7066491B2 (ja) | 2018-04-10 | 2022-05-13 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、教師データ作成プログラム及び教師データ作成方法 |
JP2019180866A (ja) * | 2018-04-10 | 2019-10-24 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置、教師データ作成プログラム及び教師データ作成方法 |
Also Published As
Publication number | Publication date |
---|---|
CN103188998B (zh) | 2015-03-04 |
CN103188998A (zh) | 2013-07-03 |
US20130222383A1 (en) | 2013-08-29 |
JPWO2012063653A1 (ja) | 2014-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012063653A1 (ja) | 医用画像表示装置及び医用画像表示方法 | |
EP2191442B1 (en) | A caliper for measuring objects in an image | |
RU2497194C2 (ru) | Способ и устройство для объемной визуализации наборов данных | |
EP2486548B1 (en) | Interactive selection of a volume of interest in an image | |
EP2193500B1 (en) | A caliper for measuring objects in an image | |
EP2074499B1 (en) | 3d connected shadow mouse pointer | |
US7496222B2 (en) | Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously | |
US7773786B2 (en) | Method and apparatus for three-dimensional interactive tools for semi-automatic segmentation and editing of image objects | |
US9179893B2 (en) | Image processing apparatus, image processing method, image processing system, and program | |
JP4856181B2 (ja) | 画像データセットからのビューのレンダリング | |
CN101275993B (zh) | 基于纹理的快速dt-mri张量场可视化系统和方法 | |
RU2706231C2 (ru) | Визуализация объемного изображения анатомической структуры | |
US9142017B2 (en) | TNM classification using image overlays | |
EP2168492B1 (en) | Medical image displaying apparatus, medical image displaying method, and medical image displaying program | |
EP3314582B1 (en) | Interactive mesh editing | |
JP6114266B2 (ja) | 画像をズームするシステム及び方法 | |
US20130265302A1 (en) | Visualization of flow patterns | |
US20230237612A1 (en) | Determining volume of a selectable region using extended reality | |
GB2497832A (en) | Measuring a ratio of a variable in medical imaging data | |
US20130114785A1 (en) | Method for the medical imaging of a body part, in particular the hand |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11840421 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2012542867 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13882384 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11840421 Country of ref document: EP Kind code of ref document: A1 |