CN114005513A - MIP projection CPU algorithm - Google Patents


Info

Publication number
CN114005513A
CN114005513A
Authority
CN
China
Prior art keywords
image
pixel
projection
sphere
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111246354.0A
Other languages
Chinese (zh)
Inventor
杨扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202111246354.0A
Publication of CN114005513A
Pending legal status

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G16H30/20: ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to the technical field of medical image processing and provides a MIP (maximum intensity projection) CPU algorithm comprising the following steps: S1, extract the images from the DICOM file in sequence and stack them into a three-dimensional stack; S2, convert the slice interval from millimetres to pixels to form a pixel cube; S3, construct a sphere centred on the centre of the pixel cube and take any point on its surface as the observation point; S4, cast a ray from the observation point and project the maximum-density point that the initial ray meets while passing through the pixel cube onto the plane perpendicular to the ray; S5, cast rays parallel to the initial ray until every parallel ray passing through the pixel cube has been projected; S6, form an initial image; S7, rotate and scale the points of the initial image to obtain a positive-direction image; S8, rearrange and interpolate the pixels of the positive-direction image to obtain the projection image; S9, display the projection image on the interface. Imaging can thus be performed strictly according to the definition of MIP, and a more accurate display result is obtained using only CPU resources.

Description

MIP projection CPU algorithm
Technical Field
The invention relates to the technical field of medical image processing, in particular to a MIP projection CPU algorithm.
Background
Three-dimensional visualization of DICOM images studies how to construct three-dimensional geometric models of tissues or organs from the two-dimensional image sequences acquired by medical imaging devices and render them realistically on a computer screen. Two main categories of methods exist. The first describes the three-dimensional structure of an object by stitching and fitting geometric primitives to its surface; it is called surface rendering and is implemented with computer graphics techniques and hardware based on extraction of image edges or contour lines. Images produced this way have high resolution, are generated quickly, and can be rotated and relit flexibly, which makes the approach suitable for tissues and organs with clearly defined surfaces. The second category projects voxels directly onto the display plane and is called volume rendering. It requires no accurate segmentation; applying visual principles directly, it resamples the volume data, processes voxel grey values, and finally yields an image with a semi-transparent three-dimensional effect. Because volume rendering processes each voxel in the data field separately, it is well suited to three-dimensional display of tissues and organs with fuzzy shape features. However, its computational load is so large that even a high-performance computer cannot meet the demands of interactive operation in practice, so surface rendering remains the current mainstream algorithm.
In existing medical three-dimensional imaging, each slice in the DCM files generated under DICOM is projected through MIP to form a maximum-density model of each point, displayed as a plane image. Most solutions first build a three-dimensional model and then project it onto the plane; this requires pre-processing and is not fully consistent with the definition of MIP. Moreover, most solutions rely on the graphics card for simulation, calculation, and imaging, so many ordinary computers cannot meet the operating requirements. A method is therefore needed that lets ordinary computers run medical three-dimensional imaging, that is, one that realizes MIP using only CPU resources.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: traditional medical three-dimensional imaging makes poor use of CPU resources and depends on the graphics card. The invention therefore provides a MIP projection CPU algorithm to solve this problem.
The technical scheme adopted by the invention to solve this problem is as follows: a MIP projection CPU algorithm comprising the steps of: S1, extract all images from the DICOM file in sequence and stack them top to bottom at fixed intervals into a three-dimensional stack;
S2, convert the interval unit from millimetres to pixels in proportion, and interpolate and fill pixels to form a pixel cube;
S3, construct a sphere centred on the centre of the pixel cube of S2, take any point on its surface as the observation point, and treat the sphere surface as the observation-point surface;
S4, cast a ray from the observation point, and project the maximum-density point that the initial ray meets while passing through the pixel cube onto the plane perpendicular to the ray, at the position corresponding to the ray;
S5, cast rays parallel to the initial ray, projecting each parallel ray's maximum-density point within the pixel cube onto the perpendicular plane of S4, until every parallel ray passing through the pixel cube has been projected;
S6, form an initial image once the parallel rays have been projected;
S7, rotate and scale the points of the initial image to turn it into a positive-direction image;
S8, rearrange and interpolate the pixels of the positive-direction image to obtain the projection image;
S9, display the projection image of S8 on an interface.
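To make the flow concrete, here is a minimal sketch of steps S1, S2, and S4 to S6 for the special case where the rays are aligned with the stacking axis; the function names, the nearest-neighbour fill along the slice axis, and the toy slice values are illustrative assumptions, not details fixed by the patent:

```python
import numpy as np

def build_pixel_cube(slices, spacing_mm, pixel_mm):
    """S1-S2 (sketch): stack the 2D slices, then repeat each slice so that
    the inter-slice distance is expressed in pixel units (a crude
    nearest-neighbour stand-in for the interpolation-and-fill step)."""
    volume = np.stack(slices, axis=0)             # (z, y, x), top to bottom
    repeat = max(1, round(spacing_mm / pixel_mm))
    return np.repeat(volume, repeat, axis=0)

def mip_axis_aligned(volume, axis=0):
    """S4-S6 (sketch): for rays parallel to `axis`, the projection keeps
    the maximum-density voxel met along each ray."""
    return volume.max(axis=axis)

# Toy data: two 2x2 slices, 2 mm apart, 1 mm per pixel.
slices = [np.array([[0, 1], [2, 3]]), np.array([[5, 0], [0, 4]])]
cube = build_pixel_cube(slices, spacing_mm=2.0, pixel_mm=1.0)
print(cube.shape)              # (4, 2, 2): each slice filled in twice
print(mip_axis_aligned(cube))  # [[5 1] [2 4]]
```

For oblique viewing directions the same maximum-along-the-ray rule applies, but the rays must be marched through the cube explicitly, as in the detailed embodiment.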
Preferably, the method further comprises: S10, move the observation point along the observation-point surface and then repeat steps S4 to S9 to obtain projection images in different directions.
Preferably, in S6 the X-axis and Y-axis directions of the initial image plane are determined by the right-hand rule.
Preferably, in S7 the size of the positive-direction image is adjusted according to the scaling ratio of the initial image.
Preferably, in S8 image smoothing is performed after the pixels are rearranged and interpolated.
Preferably, in S3 the radius of the sphere is twice the edge length of the pixel cube.
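The preferred observation sphere of S3 can be sketched as follows; the spherical-angle parametrisation and all names are illustrative assumptions, while the radius of twice the cube edge follows the preferred embodiment:

```python
import math

def observation_point(edge, theta, phi):
    """Return a point on the observation-point surface: a sphere centred on
    the pixel-cube centre with radius twice the cube edge length."""
    c = edge / 2.0          # cube centre coordinate on each axis
    r = 2.0 * edge          # preferred radius: twice the edge length
    return (c + r * math.sin(theta) * math.cos(phi),
            c + r * math.sin(theta) * math.sin(phi),
            c + r * math.cos(theta))

# Sweeping theta and phi moves the observation point over the sphere (cf. S10).
p = observation_point(edge=100.0, theta=math.pi / 2, phi=0.0)
print(p)  # approximately (250.0, 50.0, 50.0)
```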
The invention has the advantages that no large-scale 3D lattice calculation is needed, the demand on graphics-card performance is reduced, and imaging can be performed strictly according to the definition of MIP using only CPU resources, giving a more accurate display result.
Drawings
The invention is further illustrated with reference to the following figures and examples.
FIG. 1 is an algorithm flow chart of a preferred embodiment of the MIP projection CPU algorithm of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", and the like, indicate orientations and positional relationships based on the orientations and positional relationships shown in the drawings, and are used merely for convenience of description and for simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore, should not be considered as limiting the present invention.
Furthermore, the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In the description of the present invention, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly: the connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediary. The specific meanings of these terms in the present invention can be understood by those skilled in the art on a case-by-case basis. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
As shown in fig. 1, the present invention provides an embodiment of a MIP projection CPU algorithm comprising the following steps: S1, extract all images from the DICOM file in sequence and stack them top to bottom at fixed intervals into a three-dimensional stack;
S2, convert the interval unit from millimetres to pixels in proportion, and interpolate and fill pixels to form a pixel cube;
S3, construct a sphere centred on the centre of the pixel cube of S2 with a radius of twice the cube's edge length, take any point on its surface as the observation point, and treat the sphere surface as the observation-point surface;
S4, cast a ray from the observation point and project the maximum-density point that the initial ray meets while passing through the pixel cube onto the plane perpendicular to the ray; the maximum-density point is the point of greatest tissue density anatomically, or equivalently the point of greatest grey level in the image;
S5, cast rays parallel to the initial ray, projecting each parallel ray's maximum-density point within the pixel cube onto the perpendicular plane of S4, until every parallel ray passing through the pixel cube has been projected;
S6, form an initial image once the parallel rays have been projected, the X-axis and Y-axis directions of the image plane being determined by the right-hand rule: with the back of the right hand toward the screen, the thumb points along the positive X-axis, the index finger along the positive Y-axis, and the middle finger along the positive Z-axis;
S7, rotate and scale the points of the initial image so that it is adjusted, according to the scaling ratio, into a positive-direction image of the required size;
S8, rearrange and interpolate the pixels of the positive-direction image and apply image smoothing to obtain the projection image, so that every pixel of the positive-direction image is properly processed;
S9, display the projection image of S8 on an interface, for example, but not limited to, the screen of a mobile phone, tablet, or computer;
S10, move the observation point along the observation-point surface and then repeat steps S4 to S9 to obtain projection images in different directions.
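The ray casting of S4 to S6 for an arbitrary observation direction can be sketched as a CPU-only routine as follows; the sampling step, the nearest-neighbour voxel lookup, and all names are our own illustrative choices rather than details prescribed by the patent:

```python
import numpy as np

def orthonormal_basis(d):
    """Build two axes u, v spanning the ray-perpendicular (projection) plane."""
    d = np.asarray(d, dtype=float)
    d /= np.linalg.norm(d)
    helper = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(d, helper)
    u /= np.linalg.norm(u)
    v = np.cross(d, u)
    return u, v, d

def mip_raycast(volume, direction, out_size, step=0.5):
    """March parallel rays through the pixel cube; each output pixel keeps
    the maximum density met along its ray (S4-S5), forming the initial
    image (S6)."""
    n = volume.shape[0]                   # cube edge length in pixels
    centre = (n - 1) / 2.0
    u, v, d = orthonormal_basis(direction)
    image = np.zeros((out_size, out_size))
    half = out_size / 2.0
    ts = np.arange(-n, n, step)           # march far enough to cross the cube
    for i in range(out_size):
        for j in range(out_size):
            # Ray through projection-plane pixel (i, j), parallel to d.
            origin = centre + (i - half) * u + (j - half) * v
            pts = origin + ts[:, None] * d
            idx = np.rint(pts).astype(int)          # nearest-neighbour sampling
            ok = np.all((idx >= 0) & (idx < n), axis=1)
            if ok.any():
                image[i, j] = volume[idx[ok, 0], idx[ok, 1], idx[ok, 2]].max()
    return image

cube = np.zeros((8, 8, 8))
cube[4, 4, 4] = 9.0                       # one dense voxel in an empty cube
img = mip_raycast(cube, direction=(0, 0, 1), out_size=8)
print(img.max())  # 9.0: the dense voxel survives the projection
```

Rotating, scaling, rearranging, and smoothing the resulting initial image (S7 and S8) can then be done with ordinary 2D image operations before display (S9).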
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, a schematic representation of the term does not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In light of the foregoing description of the preferred embodiment of the present invention, many modifications and variations will be apparent to those skilled in the art without departing from the spirit and scope of the invention. The technical scope of the present invention is not limited to the content of the specification, and must be determined according to the scope of the claims.

Claims (6)

1. A MIP projection CPU algorithm, characterized by comprising the steps of: S1, extract all images from the DICOM file in sequence and stack them top to bottom at fixed intervals into a three-dimensional stack;
S2, convert the interval unit from millimetres to pixels in proportion, and interpolate and fill pixels to form a pixel cube;
S3, construct a sphere centred on the centre of the pixel cube of S2, take any point on its surface as the observation point, and treat the sphere surface as the observation-point surface;
S4, cast a ray from the observation point and project the maximum-density point that the initial ray meets while passing through the pixel cube onto the plane perpendicular to the ray;
S5, cast rays parallel to the initial ray, projecting each parallel ray's maximum-density point within the pixel cube onto the perpendicular plane of S4, until every parallel ray passing through the pixel cube has been projected;
S6, form an initial image once the parallel rays have been projected;
S7, rotate and scale the points of the initial image to turn it into a positive-direction image;
S8, rearrange and interpolate the pixels of the positive-direction image to obtain a projection image;
S9, display the projection image of S8 on an interface.
2. The MIP projection CPU algorithm of claim 1, further comprising: S10, move the observation point along the observation-point surface and then repeat steps S4 to S9 to obtain projection images in different directions.
3. The MIP projection CPU algorithm according to claim 2, wherein in S6 the X-axis and Y-axis directions of the initial image plane are determined by the right-hand rule.
4. The MIP projection CPU algorithm according to claim 3, wherein in S7 the size of the positive-direction image is adjusted according to the scaling ratio of the initial image.
5. The MIP projection CPU algorithm according to claim 4, wherein in S8 image smoothing is performed after the pixels are rearranged and interpolated.
6. The MIP projection CPU algorithm according to claim 5, wherein in S3 the radius of the sphere is twice the edge length of the pixel cube.
CN202111246354.0A 2021-10-26 2021-10-26 MIP projection CPU algorithm Pending CN114005513A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111246354.0A CN114005513A (en) 2021-10-26 2021-10-26 MIP projection CPU algorithm

Publications (1)

Publication Number Publication Date
CN114005513A 2022-02-01

Family

ID=79924110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111246354.0A Pending CN114005513A (en) 2021-10-26 2021-10-26 MIP projection CPU algorithm

Country Status (1)

Country Link
CN (1) CN114005513A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination