CN111127461A - Chest image processing method and device, storage medium and medical equipment - Google Patents


Info

Publication number
CN111127461A
CN111127461A (application CN201911423166.3A)
Authority
CN
China
Prior art keywords
projection
image
determining
thoracic
dimensional
Prior art date
Legal status
Granted
Application number
CN201911423166.3A
Other languages
Chinese (zh)
Other versions
CN111127461B (en)
Inventor
张丛嵘
Current Assignee
Neusoft Medical Systems Co Ltd
Original Assignee
Neusoft Medical Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Medical Systems Co Ltd
Priority to CN201911423166.3A, granted as CN111127461B
Publication of CN111127461A
Application granted
Publication of CN111127461B
Legal status: Active
Anticipated expiration


Classifications

    • G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general
    • G06T 7/0012: Image analysis; inspection of images, e.g. flaw detection; biomedical image inspection
    • G06T 11/006: 2D [Two Dimensional] image generation; reconstruction from projections, e.g. tomography; inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/181: Image analysis; segmentation; edge detection involving edge growing or edge linking
    • G06T 2207/10072: Indexing scheme for image analysis or image enhancement; image acquisition modality; tomographic images
    • G06T 2207/30004: Indexing scheme for image analysis or image enhancement; subject of image; biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiology & Medical Imaging (AREA)
  • Mathematical Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Quality & Reliability (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The application provides a chest image processing method and device, a storage medium and a medical device, which are used to reduce missed diagnoses of bone diseases such as microfractures and bone cracks. The chest image processing method includes: acquiring a three-dimensional chest image obtained by scanning a subject, the three-dimensional chest image comprising a plurality of tomographic images, and identifying the rib regions where the ribs are located in the plurality of tomographic images; selecting a preset number of images from the plurality of tomographic images to form a sampled image set and, for any sampled image in the set, determining the thoracic curve corresponding to that sampled image according to its rib region; determining the correspondence between projection points on the thoracic curve and projection angles; determining a target projection angle; and projecting, according to the correspondence between projection points on the thoracic curve and projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image.

Description

Chest image processing method and device, storage medium and medical equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for processing a chest image, a storage medium, and a medical device.
Background
Computed tomography (CT) and magnetic resonance (MR) imaging are the main means of diagnosing rib fractures and other bone diseases of the chest. However, because of the special tissue structure and shape of the ribs, lesions such as partial fractures or bone cracks can be difficult to diagnose owing to viewing-angle problems such as occlusion.
At present, one method of diagnosing rib fractures and other bone diseases is two-dimensional rib projection, which projects the rib image onto a two-dimensional plane and unfolds it; physicians then diagnose bone diseases such as fractures and bone cracks by observing the gray-level continuity and brightness changes of the two-dimensional rib projection image. The two-dimensional rib projection image obtained in this way is usually a two-dimensional thoracic expansion image centered on the spine. Because of noise, however, the images on the two sides of the expansion image are blurred compared with the central region, and bone diseases such as microfractures and bone cracks are not easily observed there, so such diseases are easily missed.
Disclosure of Invention
In view of the above, the present application provides a chest image processing method and device, a storage medium and a medical device, which are used to reduce missed diagnoses of bone diseases such as microfractures and bone cracks.
In a first aspect, an embodiment of the present application provides a method for processing a chest image, where the method includes:
acquiring a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying rib regions where ribs are located in the plurality of tomographic images;
selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a thoracic curve corresponding to any sampling image in the sampling image set according to a rib region in the sampling image;
determining a correspondence between projection points on the thoracic curve and projection angles, wherein a projection angle is the angle between a coordinate axis and the perpendicular projection of the projection line in a transverse plane onto the coordinate plane containing the horizontal axis and the longitudinal axis of a space rectangular coordinate system, and the transverse plane is parallel to that coordinate plane;
determining a target projection angle;
and projecting, according to the correspondence between the projection points on the thoracic curve and the projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image, wherein the two-dimensional thoracic expansion image takes the pixel column corresponding to the projection points in the direction of the target projection angle as its center line.
In a possible implementation manner, the determining a thoracic curve corresponding to the sample image according to the rib region in the sample image includes:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
with the viewpoint as the center, emitting a projection line outwards in the first transverse plane according to a first set angle step length;
when the projection line intersects with the rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting according to all the projection points in the first transverse plane to obtain a thoracic curve corresponding to the sampling image.
In a possible implementation manner, the determining a viewpoint of a first transverse plane corresponding to the sampling image includes:
determining a center of gravity of the subject chest from rib regions in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, the direction of the rotation axis of the subject as the vertical axis, the horizontal direction in a plane parallel to the transverse plane as the horizontal axis, and the vertical direction in that plane as the longitudinal axis;
and determining the intersection point of a first transverse plane corresponding to the sampling image and the vertical axis of the space rectangular coordinate system as the viewpoint of the first transverse plane.
In a possible implementation manner, the determining an intersection point of the projection line and a rib region in the sampled image, and determining a projection point corresponding to the projection line according to the intersection point includes:
determining a first intersection point of the projection line and an inner edge line of a rib region in the sampled image, and determining a second intersection point of the projection line and an outer edge line of the rib region in the sampled image;
and taking the middle point of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
In a possible implementation manner, the determining the target projection angle includes:
receiving an image expansion instruction aiming at any one position to be diagnosed, wherein the image expansion instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
In a possible implementation manner, the projecting the three-dimensional chest image to a two-dimensional plane by using a cylindrical projection expansion algorithm according to a correspondence between a projection point on the thoracic curve and a projection angle to obtain a two-dimensional thoracic expansion image includes:
for a thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting the sampling points on the same thoracic curve onto the same row of pixels of the two-dimensional plane, projecting the sampling points on different thoracic curves that correspond to the same projection angle onto the same column of pixels of the two-dimensional plane, and obtaining the two-dimensional thoracic expansion image through interpolation.
In a second aspect, the present application further provides a chest image processing apparatus, including means for performing the chest image processing method in the first aspect or any possible implementation manner of the first aspect.
In a third aspect, the present application further provides a storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the chest image processing method in the first aspect or any possible implementation manner of the first aspect.
In a fourth aspect, embodiments of the present application further provide a medical apparatus, including a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method for processing a chest image in the first aspect or any possible implementation manner of the first aspect when executing the program.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
according to the technical scheme, the target projection angle is determined firstly, then the three-dimensional thoracic image is projected to a two-dimensional plane by adopting a cylindrical projection expansion algorithm according to the corresponding relation between the projection point on the thoracic curve and the projection angle, and the two-dimensional thoracic expansion image with the pixel corresponding to each projection point in the direction of the target projection angle as the central line is obtained, namely, when the projection angle corresponding to the position to be diagnosed is determined as the target projection angle in the application, the image corresponding to the position to be diagnosed can be projected to the central area of the two-dimensional thoracic expansion image, so that bone diseases such as microfracture, bone fracture and the like can be reduced, and the like are omitted.
Drawings
Fig. 1 is a schematic flowchart of a method for processing a chest image according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a three-dimensional chest image provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of the principle of determining a thoracic curve provided by the embodiment of the present application;
FIG. 4 is a schematic diagram of an image grid provided by an embodiment of the present application;
FIG. 5 is a schematic diagram of a two-dimensional thoracic expansion image obtained by applying the method provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of a chest image processing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a thoracic curve determining module in a thoracic image processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an image unfolding module in a chest image processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a medical device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of apparatus and methods consistent with some aspects of the present application, as recited in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
Referring to fig. 1, an embodiment of the present application provides a method for processing a thoracic image, which may be used in a CT imaging system or a magnetic resonance imaging system, and the method may include the following steps:
s101, acquiring a three-dimensional chest image obtained by scanning a detected object, wherein the three-dimensional chest image comprises a plurality of sectional images, and identifying a rib region where a rib is located in the plurality of sectional images;
the three-dimensional breast image is a three-dimensional medical image, and may be, for example, a CT image, an MR image, or the like.
In the embodiments of the present application, the rib regions where the ribs are located in the three-dimensional chest image (or in the plurality of tomographic images) can be identified with either a traditional method or a deep-learning method. A traditional method may use, for example, thresholding or region growing to identify the rib regions in the three-dimensional chest image; a deep-learning method may use a neural network to identify them, and the neural network can be designed, for example, with the common combination of VggBlock, down-sampling and up-sampling.
When a neural network is used to identify the rib regions in the three-dimensional chest image, the neural network model can be trained in advance. During training, a large number of annotated rib images are collected as a sample set, and the neural network model is obtained by continuously adjusting parameters and optimizing the training model on this sample set.
During identification, a tomographic image is input into the trained neural network model, and after processing by the model, the tomographic image with the rib region marked is output.
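To make the traditional route above concrete, the following is a minimal sketch of rib-candidate identification on a single axial CT slice by thresholding plus connected-component filtering. The HU threshold, the minimum region size and the function name are illustrative assumptions rather than values taken from this application; in practice the trained neural network described above would typically be preferred.

```python
import numpy as np
from scipy import ndimage

def identify_bone_regions(slice_hu, bone_threshold=200, min_region_pixels=50):
    """Rough bone/rib candidate mask for one axial CT slice given in Hounsfield units."""
    bone_mask = slice_hu >= bone_threshold                       # keep high-density pixels
    labels, n = ndimage.label(bone_mask)                         # connected components
    sizes = ndimage.sum(bone_mask, labels, index=range(1, n + 1))
    keep_ids = 1 + np.flatnonzero(np.asarray(sizes) >= min_region_pixels)
    # Note: this keeps all bone (spine, sternum, ribs); separating the ribs proper
    # needs further filtering or the learning-based segmentation mentioned above.
    return np.isin(labels, keep_ids)
```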
S102, selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a thoracic curve corresponding to any sampling image in the sampling image set according to a rib region in the sampling image;
In the embodiments of the present application, images may be selected from the plurality of tomographic images layer by layer, or at an interval of a preset number of layers (for example, 5 layers), along the vertical axis (z-axis) of the space rectangular coordinate system, and a preset number (for example, 30) of images may be selected to form the sampled image set.
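A minimal sketch of this sampling step is given below; taking every fifth slice and capping the set at 30 images simply reuse the example values from the paragraph above, and the function name is illustrative.

```python
def sample_slices(tomograms, layer_interval=5, preset_count=30):
    """Form the sampled image set: take every `layer_interval`-th tomographic image
    along the z-axis, keeping at most `preset_count` images."""
    return tomograms[::layer_interval][:preset_count]
```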
In some embodiments, the determining the thoracic curve corresponding to the sampled image according to the rib region in the sampled image in step S102 includes:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
with the viewpoint as the center, emitting a projection line outwards in the first transverse plane according to a first set angle step length;
when the projection line intersects with the rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting according to all the projection points in the first transverse plane to obtain a thoracic curve corresponding to the sampling image.
The first set angle step is an included angle between two adjacent projection lines in the first transverse plane.
In some embodiments, the determining the viewpoint of the first transverse plane corresponding to the sampling image comprises:
determining a center of gravity of the subject chest from rib regions in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, the direction of the rotation axis of the subject as the vertical axis, the horizontal direction in a plane parallel to the transverse plane as the horizontal axis, and the vertical direction in that plane as the longitudinal axis;
and determining the intersection point of a first transverse plane corresponding to the sampling image and the vertical axis of the space rectangular coordinate system as the viewpoint of the first transverse plane.
In this embodiment, the center of gravity of the chest of the subject may be determined from the positions of the rib points in the rib regions of the plurality of tomographic images; for example, it may be obtained by averaging the three-dimensional coordinates of the rib points in all the rib regions of the plurality of tomographic images.
For example, as shown in fig. 2, a space rectangular coordinate system is established by taking the center of gravity O of the chest of the subject as the coordinate origin, the direction of the rotation axis of the subject (i.e., the vertical direction) as the vertical axis (z-axis), the horizontal direction in a plane parallel to the transverse plane as the horizontal axis (x-axis), and the vertical direction in that plane as the longitudinal axis (y-axis). The intersection of the first transverse plane corresponding to each sampled image with the vertical axis (z-axis) of the space rectangular coordinate system is the viewpoint of that first transverse plane, shown as O1, O2, …, On in fig. 2.
Of course, the coordinate origin of the space rectangular coordinate system may also be determined in other ways, for example by taking the intersection of the rotation axis of the subject with the transverse plane corresponding to the bottommost tomographic image as the coordinate origin. The directions of the horizontal axis and the longitudinal axis may also be chosen differently, which is not limited in the embodiments of the present application.
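The center-of-gravity and viewpoint computation above can be sketched as follows, assuming the identified rib regions are available as a 3D boolean mask aligned with the CT volume; the voxel-spacing handling and the function names are assumptions made for illustration.

```python
import numpy as np

def chest_center_of_gravity(rib_mask, voxel_spacing=(1.0, 1.0, 1.0)):
    """Center of gravity O: mean 3D coordinate (z, y, x, in mm) of all rib points."""
    rib_points = np.argwhere(rib_mask)                      # (N, 3) indices of rib voxels
    return rib_points.mean(axis=0) * np.asarray(voxel_spacing)

def slice_viewpoints(center, slice_z_positions):
    """Viewpoints O1..On: intersections of each sampled slice's transverse plane with
    the vertical axis, i.e. the line through O parallel to the z-axis."""
    return [np.array([z, center[1], center[2]]) for z in slice_z_positions]
```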
In some embodiments, the determining an intersection point of the projection line and the rib region in the sampled image, and determining a projection point corresponding to the projection line according to the intersection point includes:
determining a first intersection point of the projection line and an inner edge line of a rib region in the sampled image, and determining a second intersection point of the projection line and an outer edge line of the rib region in the sampled image;
and taking the middle point of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
For example, as shown in fig. 3, projection lines may be emitted outward from the viewpoint O, starting from the direction parallel to the horizontal axis, at the first set angle step, where the first set angle step is the angle a between two adjacent projection lines (0° < a < 360°). The first intersection of an outward projection line R1 with the inner edge line of the rib region in the sampled image (the highlighted region in fig. 3) is P1, and the second intersection of R1 with the outer edge line of that rib region is P2; the midpoint P of the line segment connecting P1 and P2 may be taken as the projection point corresponding to the projection line R1.
After the projection points in the first transverse plane corresponding to the sampled image have been determined, the thoracic curve corresponding to the sampled image can be obtained by fitting all the projection points in the first transverse plane with a B-spline interpolation method; the thoracic curve C is shown by the dotted line in fig. 3.
For each sampled image, the corresponding thoracic curve is determined with the method described above, which yields a series of thoracic curves C1 to Cn, where n is an integer greater than 1.
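The ray-casting and fitting procedure above can be sketched for one sampled slice as follows: rays are cast from the viewpoint at the first set angle step, the first and last rib pixels hit along each ray stand in for the inner-edge and outer-edge intersections P1 and P2, their midpoint is kept as the projection point, a closed B-spline is fitted through the projection points, and the angle of each projection point is recorded, which is the projection-point/projection-angle correspondence used in step S103 below. The ray-marching details and parameter values are assumptions for illustration.

```python
import numpy as np
from scipy import interpolate

def fit_thoracic_curve(rib_mask_2d, viewpoint_yx, angle_step_deg=5.0, ray_step=0.5):
    """Return (tck, correspondence) for one sampled slice: a closed B-spline through the
    projection points and a dict mapping each projection point to its projection angle."""
    h, w = rib_mask_2d.shape
    viewpoint_yx = np.asarray(viewpoint_yx, dtype=float)
    points, correspondence = [], {}
    for angle in np.arange(0.0, 360.0, angle_step_deg):         # first set angle step
        direction = np.array([np.sin(np.radians(angle)), np.cos(np.radians(angle))])
        hits, r = [], 0.0
        while True:                                             # march along the projection line
            y, x = viewpoint_yx + r * direction
            if not (0 <= int(y) < h and 0 <= int(x) < w):
                break
            if rib_mask_2d[int(y), int(x)]:
                hits.append((y, x))
            r += ray_step
        if hits:                                                # the ray crossed the rib region
            p1, p2 = np.array(hits[0]), np.array(hits[-1])      # ~inner / outer edge intersections
            p = (p1 + p2) / 2.0                                 # midpoint -> projection point
            points.append(p)
            correspondence[tuple(p)] = angle                    # point <-> projection angle
    closed = np.array(points + [points[0]]).T                   # close the loop, shape (2, N+1)
    tck, _ = interpolate.splprep(closed, s=0, per=True)         # closed B-spline fit
    return tck, correspondence                                  # sample the curve with splev
```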
S103, determining a correspondence between projection points on the thoracic curve and projection angles, wherein a projection angle is the angle between a coordinate axis (the horizontal axis or the longitudinal axis) and the perpendicular projection of the projection line in a transverse plane onto the coordinate plane containing the horizontal axis and the longitudinal axis of the space rectangular coordinate system, and the transverse plane is parallel to that coordinate plane;
s104, determining a target projection angle;
in this embodiment, the default target projection angle may be set as the projection angle corresponding to the spine position, and in order to reduce omission of bone diseases such as microfracture and bone fracture, the user may project the image corresponding to the position to be diagnosed where the bone diseases such as microfracture and bone fracture may exist to the central region of the two-dimensional thorax expansion image, and at this time, the projection angle corresponding to the position to be diagnosed may be set as the target projection angle.
In some embodiments, the determining the target projection angle in step S104 includes:
receiving an image expansion instruction aiming at any one position to be diagnosed, wherein the image expansion instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
In this embodiment, the image expansion instruction may be generated from a projection angle entered by the user in an angle adjustment control. The angle adjustment control may be designed as an angle input box or an angle adjustment scroll bar and may be placed at any position on the image display page; alternatively, a user interface (UI) may pop up after a preset operation by the user on the position to be diagnosed, with the angle adjustment control displayed on that UI.
S105, projecting, according to the correspondence between the projection points on the thoracic curve and the projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image, wherein the two-dimensional thoracic expansion image takes the pixel column corresponding to the projection points in the direction of the target projection angle as its center line.
In some embodiments, in step S105, projecting the three-dimensional chest image to a two-dimensional plane by using a cylindrical projection expansion algorithm according to a correspondence between projection points on the thoracic curve and projection angles, to obtain a two-dimensional thoracic expansion image, including:
for a thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting the sampling points on the same thoracic curve onto the same row of pixels of the two-dimensional plane, projecting the sampling points on different thoracic curves that correspond to the same projection angle onto the same column of pixels of the two-dimensional plane, and obtaining the two-dimensional thoracic expansion image through interpolation.
For example, let the target projection angle be α (0° ≤ α < 360°) and let the second set angle step be β (0° < β < 360°), i.e. the angle between two adjacent projection lines in the first transverse plane corresponding to a sampled image; the smaller β is, the more projection points there are on the thoracic curve corresponding to the sampled image. According to the correspondence between projection points and projection angles, the first projection point on thoracic curve C1 corresponding to the first projection angle α + 180° (i.e. differing from α by 180 degrees) is Q1, the first projection point on thoracic curve C2 corresponding to α + 180° is Q2, …, and the first projection point on thoracic curve Cn corresponding to α + 180° is Qn. For each thoracic curve, starting from its first projection point, sampling points are selected along the curve at the second set angle step β. Then, starting from the first projection point, the sampling points on the same thoracic curve are projected onto the same row of pixels of the two-dimensional plane, the sampling points on different thoracic curves corresponding to the same projection angle are projected onto the same column of pixels, and the two-dimensional thoracic expansion image is obtained by interpolation over the resulting image grid, as illustrated in fig. 4.
When the target projection angle takes the default value, that is, the projection angle corresponding to the spine position, the resulting two-dimensional thoracic expansion image is as shown in fig. 5.
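A compact sketch of the expansion in S105 is given below, under simplifying assumptions: each sampled slice's thoracic curve contributes one row of the output, the columns sweep a full 360° starting 180° away from the target angle so that the target angle lands in the central column, gray values are read from the CT volume with trilinear interpolation, and the row-direction interpolation between sampled slices is omitted. The function names and the column count are illustrative.

```python
import numpy as np
from scipy import interpolate
from scipy.ndimage import map_coordinates

def expand_thorax(volume, curve_tcks, viewpoints, target_angle_deg, n_columns=720):
    """volume: CT array indexed (z, y, x); curve_tcks: one fitted spline per sampled slice;
    viewpoints: matching (z, y, x) viewpoints. Returns the 2D thoracic expansion image."""
    start = (target_angle_deg + 180.0) % 360.0                  # unroll from the opposite side
    col_angles = (start + np.linspace(0.0, 360.0, n_columns, endpoint=False)) % 360.0
    rows = []
    for tck, (vz, vy, vx) in zip(curve_tcks, viewpoints):
        u = np.linspace(0.0, 1.0, 2000)                         # dense samples on the curve
        cy, cx = (np.asarray(c) for c in interpolate.splev(u, tck))
        sample_angles = np.degrees(np.arctan2(cy - vy, cx - vx)) % 360.0
        # nearest curve sample for every column angle (wrap-around aware difference)
        diff = np.abs((sample_angles[None, :] - col_angles[:, None] + 180.0) % 360.0 - 180.0)
        idx = diff.argmin(axis=1)
        ys, xs = cy[idx], cx[idx]
        zs = np.full_like(ys, vz, dtype=float)
        rows.append(map_coordinates(volume, [zs, ys, xs], order=1))   # trilinear gray values
    return np.vstack(rows)                                      # rows: slices, columns: angles
```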
With the above technical solution, the three-dimensional chest image is projected onto a two-dimensional plane with a cylindrical projection expansion algorithm, and the resulting two-dimensional thoracic expansion image can simultaneously show related tissues such as the spine, ribs and fat. In addition, the thorax can be unfolded from any projection angle by adjusting the target projection angle. Furthermore, when the projection angle corresponding to a position to be diagnosed is chosen as the target projection angle, the image of that position is projected into the central region of the two-dimensional thoracic expansion image, so missed diagnoses of bone diseases such as microfractures and bone cracks can be reduced.
Based on the same inventive concept, referring to fig. 6, an embodiment of the present application further provides a chest image processing apparatus, including: an image acquisition and identification module 11, a thoracic curve determination module 12, a correspondence determination module 13, a projection angle determination module 14, and an image expansion module 15.
An image acquisition and identification module 11 configured to acquire a three-dimensional chest image obtained by scanning a subject, the three-dimensional chest image including a plurality of tomographic images, and identify a rib region in which a rib is located in the plurality of tomographic images;
a thoracic curve determining module 12 configured to select a preset number of images from the plurality of tomographic images to form a sampling image set, and for any sampling image in the sampling image set, determine a thoracic curve corresponding to the sampling image according to a rib region in the sampling image;
a correspondence determining module 13, configured to determine a correspondence between projection points on the thoracic curve and projection angles, where a projection angle is the angle between a coordinate axis and the perpendicular projection of the projection line in a transverse plane onto the coordinate plane containing the horizontal axis and the longitudinal axis of the space rectangular coordinate system, and the transverse plane is parallel to that coordinate plane;
a projection angle determination module 14 configured to determine a target projection angle;
and an image expansion module 15, configured to project, according to the correspondence between the projection points on the thoracic curve and the projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image, wherein the two-dimensional thoracic expansion image takes the pixel column corresponding to the projection points in the direction of the target projection angle as its center line.
In one possible implementation, as shown in fig. 7, the thoracic curve determination module 12 includes:
a viewpoint determining submodule 121 configured to determine a viewpoint of a first transverse plane corresponding to the sample image, where the viewpoint is an intersection point of the first transverse plane and a vertical axis of the spatial rectangular coordinate system;
a projection line emission submodule 122 configured to emit a projection line outward in a first set angle step within the first transverse plane with the viewpoint as a center;
a projection point determining submodule 123 configured to determine, when the projection line intersects a rib region in the sampled image, an intersection point of the projection line and the rib region in the sampled image, and determine a projection point corresponding to the projection line according to the intersection point;
a thoracic curve obtaining sub-module 124 configured to fit all the projection points in the first transverse plane to obtain a thoracic curve corresponding to the sampled image.
In one possible implementation, the viewpoint determining submodule 121 is configured to:
determining a center of gravity of the subject chest from rib regions in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, the direction of the rotation axis of the subject as the vertical axis, the horizontal direction in a plane parallel to the transverse plane as the horizontal axis, and the vertical direction in that plane as the longitudinal axis;
and determining the intersection point of a first transverse plane corresponding to the sampling image and the vertical axis of the space rectangular coordinate system as the viewpoint of the first transverse plane.
In one possible implementation, the proxel determination submodule 123 is configured to:
determining a first intersection point of the projection line and an inner edge line of a rib region in the sampled image, and determining a second intersection point of the projection line and an outer edge line of the rib region in the sampled image;
and taking the middle point of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
In one possible implementation, the projection angle determination module 14 is configured to:
receiving an image expansion instruction aiming at any one position to be diagnosed, wherein the image expansion instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
In one possible implementation, as shown in fig. 8, the image expansion module 15 includes:
the searching submodule 151 is configured to search, for a thoracic curve corresponding to each of the sampled images, a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to a corresponding relationship between a projection point on the thoracic curve and the projection angle;
a sampling submodule 152 configured to select sampling points along the thoracic curve at a second set angular step from the first projection point;
and the image expansion sub-module 153 is configured to project the sampling points on the same thoracic curve to the same row of pixels on the two-dimensional plane from the first projection point, project the sampling points on different thoracic curves corresponding to the same projection angle to the same column of pixels on the two-dimensional plane, and obtain the two-dimensional thoracic expansion image through interpolation.
The implementation process of the functions and actions of each unit in the above device is specifically described in the implementation process of the corresponding step in the above method, and is not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Based on the same inventive concept, the present application also provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the steps of the chest image processing method in any of the possible implementations described above.
Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Based on the same inventive concept, referring to fig. 9, an embodiment of the present application further provides a medical device, which includes a memory 61 (e.g., a non-volatile memory), a processor 62, and a computer program stored on the memory 61 and executable on the processor 62; when the processor 62 executes the program, the steps of the chest image processing method in any possible implementation described above are implemented. The medical device may be, for example, a PC belonging to a CT imaging system or a magnetic resonance imaging system.
As shown in fig. 9, the computer device may also generally include: a memory 63, a network interface 64, and an internal bus 65. In addition to these components, other hardware may be included, which is not described in detail.
It should be noted that the chest image processing device can be implemented by software, which is a logical device formed by the processor 62 of the computer device in which the chest image processing device is located reading computer program instructions stored in the nonvolatile memory into the memory 63 for operation.
Embodiments of the subject matter and the functional operations described in this specification can be implemented in: digital electronic circuitry, tangibly embodied computer software or firmware, computer hardware including the structures disclosed in this specification and their structural equivalents, or a combination of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or additionally, the program instructions may be encoded on an artificially generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode and transmit information to suitable receiver apparatus for execution by the data processing apparatus. The computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Computers suitable for executing computer programs include, for example, general and/or special purpose microprocessors, or any other type of central processing unit. Generally, a central processing unit will receive instructions and data from a read-only memory and/or a random access memory. The basic components of a computer include a central processing unit for implementing or executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer does not necessarily have such a device. Moreover, a computer may be embedded in another device, e.g., a mobile telephone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer-readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., an internal hard disk or a removable disk), magneto-optical disks, and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Further, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (14)

1. A chest image processing method, the method comprising:
acquiring a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and identifying rib regions where ribs are located in the plurality of tomographic images;
selecting a preset number of images from the plurality of tomographic images to form a sampling image set, and determining a thoracic curve corresponding to any sampling image in the sampling image set according to a rib region in the sampling image;
determining a corresponding relation between a projection point on the thoracic curve and a projection angle, wherein the projection angle is the angle between a coordinate axis and the perpendicular projection of the projection line in a transverse plane onto the coordinate plane containing the horizontal axis and the longitudinal axis of a space rectangular coordinate system, and the transverse plane is parallel to that coordinate plane;
determining a target projection angle;
and projecting, according to the corresponding relation between the projection points on the thoracic curve and the projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image, wherein the two-dimensional thoracic expansion image takes the pixel column corresponding to the projection points in the direction of the target projection angle as its center line.
2. The method of claim 1, wherein determining a thoracic curve corresponding to the sampled image from the rib region in the sampled image comprises:
determining a viewpoint of a first transverse plane corresponding to the sampling image, wherein the viewpoint is an intersection point of the first transverse plane and a vertical axis of the space rectangular coordinate system;
with the viewpoint as the center, emitting a projection line outwards in the first transverse plane according to a first set angle step length;
when the projection line intersects with the rib region in the sampling image, determining an intersection point of the projection line and the rib region in the sampling image, and determining a projection point corresponding to the projection line according to the intersection point;
and fitting according to all the projection points in the first transverse plane to obtain a thoracic curve corresponding to the sampling image.
3. The method of claim 2, wherein determining the viewpoint of the first transverse plane corresponding to the sampled image comprises:
determining a center of gravity of the subject chest from rib regions in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, the direction of the rotation axis of the subject as the vertical axis, the horizontal direction in a plane parallel to the transverse plane as the horizontal axis, and the vertical direction in that plane as the longitudinal axis;
and determining the intersection point of a first transverse plane corresponding to the sampling image and the vertical axis of the space rectangular coordinate system as the viewpoint of the first transverse plane.
4. The method of claim 2, wherein determining the intersection of the projection line and the rib region in the sampled image and determining the projection point corresponding to the projection line according to the intersection comprises:
determining a first intersection point of the projection line and an inner edge line of a rib region in the sampled image, and determining a second intersection point of the projection line and an outer edge line of the rib region in the sampled image;
and taking the middle point of the connecting line of the first intersection point and the second intersection point as a projection point corresponding to the projection line.
5. The method of claim 1, wherein determining the target projection angle comprises:
receiving an image expansion instruction aiming at any one position to be diagnosed, wherein the image expansion instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to the position to be diagnosed as a target projection angle.
6. The method of claim 1, wherein the projecting the three-dimensional thoracic image to a two-dimensional plane by a cylindrical projection expansion algorithm according to the corresponding relationship between the projection point and the projection angle on the thoracic curve to obtain a two-dimensional thoracic expansion image comprises:
for a thoracic curve corresponding to each sampling image, searching a first projection point corresponding to a first projection angle which is 180 degrees different from the target projection angle according to the corresponding relation between the projection point on the thoracic curve and the projection angle;
from the first projection point, selecting sampling points along the thoracic curve according to a second set angle step length;
and starting from the first projection point, projecting the sampling points on the same thoracic curve onto the same row of pixels of the two-dimensional plane, projecting the sampling points on different thoracic curves that correspond to the same projection angle onto the same column of pixels of the two-dimensional plane, and obtaining the two-dimensional thoracic expansion image through interpolation.
7. A chest image processing apparatus, characterized in that the apparatus comprises:
the image acquisition and identification module is configured to acquire a three-dimensional chest image obtained by scanning a subject, wherein the three-dimensional chest image comprises a plurality of tomographic images, and rib regions where ribs are located in the plurality of tomographic images are identified;
a thoracic curve determining module configured to select a preset number of images from the plurality of tomographic images to form a sampling image set, and determine a thoracic curve corresponding to any one of the sampling images according to a rib region in the sampling image;
the corresponding relation determining module is configured to determine a corresponding relation between a projection point on the thoracic curve and a projection angle, wherein the projection angle is the angle between a coordinate axis and the perpendicular projection of the projection line in a transverse plane onto the coordinate plane containing the horizontal axis and the longitudinal axis of a space rectangular coordinate system, and the transverse plane is parallel to that coordinate plane;
a projection angle determination module configured to determine a target projection angle;
and the image expansion module is configured to project, according to the corresponding relation between the projection points on the thoracic curve and the projection angles, the three-dimensional chest image onto a two-dimensional plane with a cylindrical projection expansion algorithm to obtain a two-dimensional thoracic expansion image, wherein the two-dimensional thoracic expansion image takes the pixel column corresponding to the projection points in the direction of the target projection angle as its center line.
8. The apparatus of claim 7, wherein the thoracic curve determination module comprises:
a viewpoint determining submodule configured to determine a viewpoint of a first transverse plane corresponding to the sampled image, the viewpoint being an intersection point of the first transverse plane and a vertical axis of the spatial rectangular coordinate system;
a projection line emission submodule configured to emit a projection line outward in a first set angle step within the first transverse plane with the viewpoint as a center;
the projection point determining submodule is configured to determine an intersection point of the projection line and the rib region in the sampling image when the projection line intersects the rib region in the sampling image, and determine a projection point corresponding to the projection line according to the intersection point;
and the thoracic curve acquisition sub-module is configured to fit all the projection points in the first transverse plane to obtain a thoracic curve corresponding to the sampling image.
9. The apparatus of claim 8, wherein the viewpoint determining sub-module is configured to:
determining a center of gravity of the subject chest from rib regions in the plurality of tomographic images;
establishing the space rectangular coordinate system by taking the center of gravity as the coordinate origin, the direction of the rotation axis of the subject as the vertical axis, the horizontal direction in a plane parallel to the transverse plane as the horizontal axis, and the vertical direction in that plane as the longitudinal axis;
and determining the intersection point of a first transverse plane corresponding to the sampling image and the vertical axis of the space rectangular coordinate system as the viewpoint of the first transverse plane.
10. The apparatus of claim 8, wherein the proxel determination sub-module is configured to:
determining a first intersection point of the projection line and an inner edge line of a rib region in the sampled image, and determining a second intersection point of the projection line and an outer edge line of the rib region in the sampled image;
and taking the midpoint of the line segment connecting the first intersection point and the second intersection point as the projection point corresponding to the projection line.
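Claim 10 could be sketched as follows over the same assumed binary rib mask, treating the first rib pixel met along the projection line as the intersection with the inner edge line and the last one as the intersection with the outer edge line; identifiers are hypothetical:

```python
# Hypothetical sketch: projection point = midpoint between the inner-edge and
# outer-edge intersections of one projection line with the rib region.
import numpy as np

def projection_point_on_line(rib_mask, viewpoint_rc, angle_deg):
    h, w = rib_mask.shape
    r0, c0 = viewpoint_rc
    d = np.radians(angle_deg)
    hits = []
    for t in np.arange(1.0, float(np.hypot(h, w))):
        r = int(round(r0 + t * np.sin(d)))
        c = int(round(c0 + t * np.cos(d)))
        if not (0 <= r < h and 0 <= c < w):
            break
        if rib_mask[r, c]:
            hits.append((r, c))
    if not hits:
        return None  # this projection line does not intersect the rib region
    first = np.array(hits[0], dtype=float)    # inner edge line intersection
    second = np.array(hits[-1], dtype=float)  # outer edge line intersection
    return (first + second) / 2.0             # midpoint = projection point
```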
11. The apparatus of claim 7, wherein the projection angle determination module is configured to:
receiving an image expansion instruction for any position to be diagnosed, wherein the image expansion instruction comprises a projection angle corresponding to the position to be diagnosed, and determining the projection angle corresponding to that position as the target projection angle.
12. The apparatus of claim 7, wherein the image expansion module comprises:
a searching sub-module configured to, for the thoracic curve corresponding to each sampled image, search for a first projection point corresponding to a first projection angle that differs from the target projection angle by 180 degrees, according to the correspondence between the projection points on the thoracic curve and the projection angles;
a sampling sub-module configured to select sampling points along the thoracic curve at a second set angular step, starting from the first projection point;
and an image expansion sub-module configured to project sampling points on the same thoracic curve to the same row of pixels on the two-dimensional plane, starting from the first projection point, to project sampling points on different thoracic curves that correspond to the same projection angle to the same column of pixels on the two-dimensional plane, and to obtain the two-dimensional thoracic expansion image through interpolation.
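For illustration only, a rough sketch of the unfolding of claims 7 and 12 under the same assumptions (one angle-to-point dictionary per slice of the volume, empty for slices that were not sampled, and nearest-neighbour lookup in place of a proper interpolation along the curve); all names are hypothetical. Sampling starts at the projection angle 180 degrees away from the target angle, each slice fills one row, each sampling angle fills one column, and the column for the target angle lands in the middle:

```python
# Hypothetical sketch of cylindrical projection unfolding.
import numpy as np

def unfold(volume, curves, target_angle_deg, angle_step_deg=1.0):
    """volume: (S, H, W) CT volume; curves[s]: dict {projection_angle_deg:
    (row, col)} for slice s.  Returns an (S, n_cols) unfolded image whose
    central column corresponds to the target projection angle."""
    n_cols = int(round(360.0 / angle_step_deg))
    start = (target_angle_deg + 180.0) % 360.0          # first projection angle
    out = np.full((len(curves), n_cols), np.nan)
    for s, corr in enumerate(curves):
        if not corr:
            continue                                    # slice was not sampled
        known = np.array(sorted(corr))                  # angles with known points
        pts = np.array([corr[a] for a in known])
        for j in range(n_cols):
            ang = (start + j * angle_step_deg) % 360.0  # angle of column j
            gap = np.abs(known - ang)
            k = int(np.argmin(np.minimum(gap, 360.0 - gap)))  # nearest known angle
            r, c = np.round(pts[k]).astype(int)
            out[s, j] = volume[s, r, c]                 # same curve -> same row
    # Columns share a projection angle across slices; rows left as NaN
    # (unsampled slices) would be filled by interpolation between rows.
    return out
```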
13. A storage medium having a computer program stored thereon, which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
14. A medical device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any one of claims 1-6 are performed when the program is executed by the processor.
CN201911423166.3A 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment Active CN111127461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911423166.3A CN111127461B (en) 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment

Publications (2)

Publication Number Publication Date
CN111127461A true CN111127461A (en) 2020-05-08
CN111127461B CN111127461B (en) 2023-09-26

Family

ID=70507891

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911423166.3A Active CN111127461B (en) 2019-12-31 2019-12-31 Chest image processing method, chest image processing device, storage medium and medical equipment

Country Status (1)

Country Link
CN (1) CN111127461B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170262978A1 (en) * 2016-03-14 2017-09-14 Toshiba Medical Systems Corporation Medical image data processing system and method
CN107194909A (en) * 2016-03-14 2017-09-22 东芝医疗系统株式会社 Medical image-processing apparatus and medical imaging processing routine
US20190057541A1 (en) * 2017-08-18 2019-02-21 Siemens Healthcare Gmbh Planar visualization of anatomical structures
CN109830289A (en) * 2019-01-18 2019-05-31 上海皓桦科技股份有限公司 Bone images display device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111710028A (en) * 2020-05-27 2020-09-25 北京东软医疗设备有限公司 Three-dimensional contrast image generation method and device, storage medium and electronic equipment
CN112767415A (en) * 2021-01-13 2021-05-07 深圳瀚维智能医疗科技有限公司 Chest scanning area automatic determination method, device, equipment and storage medium
CN113643176A (en) * 2021-07-28 2021-11-12 沈阳先进医疗设备技术孵化中心有限公司 Rib display method and device
CN113643176B (en) * 2021-07-28 2024-05-28 东软医疗系统股份有限公司 Rib display method and device
CN114240740A (en) * 2021-12-16 2022-03-25 数坤(北京)网络科技股份有限公司 Bone expansion image acquisition method and device, medical equipment and storage medium

Also Published As

Publication number Publication date
CN111127461B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN111127461A (en) Chest image processing method and device, storage medium and medical equipment
CN110010249B (en) Augmented reality operation navigation method and system based on video superposition and electronic equipment
US8831310B2 (en) Systems and methods for displaying guidance data based on updated deformable imaging data
JP6918528B2 (en) Medical image processing equipment and medical image processing program
CN106651895B (en) Method and device for segmenting three-dimensional image
CN107169919B (en) Method and system for accelerated reading of 3D medical volumes
US10716457B2 (en) Method and system for calculating resected tissue volume from 2D/2.5D intraoperative image data
US9035941B2 (en) Image processing apparatus and image processing method
CN104000655B (en) Surface reconstruction and registration for the combination of laparoscopically surgical operation
US9563978B2 (en) Image generation apparatus, method, and medium with image generation program recorded thereon
CN103402434B (en) Medical diagnostic imaging apparatus, medical image display apparatus and medical image-processing apparatus
US9767562B2 (en) Image processing apparatus, image processing method and storage medium
JP6400725B2 (en) Image processing apparatus and method for segmenting a region of interest
US20070053564A1 (en) Image processing method and computer readable medium for image processing
CN111009032A (en) Blood vessel three-dimensional reconstruction method based on improved epipolar line constraint matching
CN110634554A (en) Spine image registration method
Kumar et al. Stereoscopic visualization of laparoscope image using depth information from 3D model
US20150320377A1 (en) Medical image display control apparatus, method, and program
US9754411B2 (en) Path proximity rendering
US9558589B2 (en) Medical image display apparatus, method, and program
US8165375B2 (en) Method and system for registering CT data sets
JP2019010382A (en) Image positioning apparatus, method, and program
US20160019694A1 (en) Region extraction apparatus, method, and program
WO2011110867A1 (en) Apparatus and method for registering medical images containing a tubular organ
US20230342994A1 (en) Storage medium, image identification method, image identification device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant