CN113781593B - Four-dimensional CT image generation method, device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN113781593B
CN113781593B
Authority
CN
China
Prior art keywords
image
dimensional
phase
target user
puncture needle
Prior art date
Legal status
Active
Application number
CN202110938287.2A
Other languages
Chinese (zh)
Other versions
CN113781593A (en)
Inventor
Deng Jincheng (邓金城)
Current Assignee
Shenying Medical Technology Shenzhen Co ltd
Original Assignee
Shenying Medical Technology Shenzhen Co ltd
Priority date
Filing date
Publication date
Application filed by Shenying Medical Technology Shenzhen Co ltd filed Critical Shenying Medical Technology Shenzhen Co ltd
Priority to CN202110938287.2A priority Critical patent/CN113781593B/en
Publication of CN113781593A publication Critical patent/CN113781593A/en
Application granted granted Critical
Publication of CN113781593B publication Critical patent/CN113781593B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003Reconstruction from projections, e.g. tomography
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10076 4D tomography; Time-sequential 3D tomography

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The application belongs to the technical field of image processing and provides a four-dimensional CT image generation method, an apparatus, a terminal device and a storage medium. The generation method comprises the following steps: when a target user is in an operation and the breathing state of the target user is a reference breathing state, acquiring a reference three-dimensional CT image obtained by a CT device scanning the target user; determining, from a preoperative four-dimensional CT image, the first three-dimensional CT image of the reference phase corresponding to the reference breathing state; performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters; transforming the first three-dimensional CT images of the remaining phases according to the image registration parameters to obtain second three-dimensional CT images of the remaining phases; and generating an intraoperative four-dimensional CT image from the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases. By this method, an intraoperative four-dimensional CT image can be generated.

Description

Four-dimensional CT image generation method, device, terminal equipment and storage medium
Technical Field
The application belongs to the technical field of image processing, and particularly relates to a four-dimensional CT image generation method, an apparatus, a terminal device and a storage medium.
Background
In order to eliminate or reduce the influence of respiratory motion artifacts on computed tomography (CT) of the chest and abdominal viscera, and to reflect how the chest and abdominal viscera change over time for the purpose of accurate diagnosis and treatment, Ichikawa et al. proposed the concept of 4D-CT in 2000: a time factor is incorporated into the three-dimensional reconstruction of CT images, resulting in four-dimensional CT images, i.e., 4D-CT.
In the prior art, three-dimensional CT images and a respiratory signal are acquired synchronously before an operation; each acquired three-dimensional CT image is labeled with its phase in the respiratory cycle to obtain three-dimensional CT images of all phases; and the three-dimensional CT images of all phases form a three-dimensional image sequence that changes over time, which constitutes the preoperative four-dimensional CT image. However, because the posture and positioning of the patient during the operation differ from those when the preoperative four-dimensional CT image was formed, the four-dimensional CT image formed before the operation cannot be applied during the operation.
Disclosure of Invention
The application provides a four-dimensional CT image generation method, an apparatus, a terminal device and a storage medium, so as to generate an intraoperative four-dimensional CT image.
In a first aspect, an embodiment of the present application provides a method for generating a four-dimensional CT image, where the method includes:
When a target user is in operation and the breathing state of the target user is a reference breathing state, acquiring a reference three-dimensional CT image obtained by scanning the target user by a CT device;
determining a first three-dimensional CT image of a reference phase from preoperative four-dimensional CT images, wherein the preoperative four-dimensional CT images comprise first three-dimensional CT images of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state;
performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters;
transforming the first three-dimensional CT images of the remaining phases according to the image registration parameters to obtain second three-dimensional CT images of the remaining phases, wherein the remaining phases refer to the phases other than the reference phase among the N phases;
and generating an intraoperative four-dimensional CT image according to the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases, wherein the reference three-dimensional CT image serves as the second three-dimensional CT image of the reference phase in the intraoperative four-dimensional CT image.
In a second aspect, an embodiment of the present application provides an apparatus for generating a four-dimensional CT image, the apparatus comprising:
The first acquisition module is used for acquiring a reference three-dimensional CT image obtained by the CT equipment through scanning the target user when the target user is in operation and the breathing state of the target user is a reference breathing state;
the first determining module is used for determining a first three-dimensional CT image of a reference phase from four-dimensional CT images before operation, wherein the four-dimensional CT images before operation comprise first three-dimensional CT images of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state;
the image registration module is used for carrying out image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference time phase to obtain image registration parameters;
the image transformation module is used for transforming the first three-dimensional CT images of the remaining phases according to the image registration parameters to obtain the second three-dimensional CT images of the remaining phases, wherein the remaining phases refer to the phases other than the reference phase among the N phases;
and the image generation module is used for generating an intraoperative four-dimensional CT image according to the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases, wherein the reference three-dimensional CT image serves as the second three-dimensional CT image of the reference phase in the intraoperative four-dimensional CT image.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the generating method according to the first aspect described above when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method of generating as described in the first aspect above.
In a fifth aspect, embodiments of the present application provide a computer program product for, when run on a terminal device, causing the terminal device to perform the steps of the generation method as described in the first aspect above.
From the above, it can be seen that when the target user is in an operation and the respiratory state of the target user is the reference respiratory state, the scheme acquires the reference three-dimensional CT image obtained by the CT device scanning the target user, and determines the first three-dimensional CT image of the reference phase corresponding to the reference respiratory state from the preoperative four-dimensional CT image. Image registration parameters are obtained by performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase; the first three-dimensional CT images of the remaining phases are transformed according to the image registration parameters to obtain the second three-dimensional CT images of the remaining phases; and the intraoperative four-dimensional CT image is generated from the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic implementation flow chart of a four-dimensional CT image generating method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of a breathing cycle;
FIG. 3 is an exemplary view of a first three-dimensional CT image of 10 phases;
fig. 4 is a schematic implementation flow chart of a four-dimensional CT image generating method according to the second embodiment of the present application;
FIG. 5 is an exemplary diagram of a puncture path;
fig. 6 is a schematic structural diagram of a four-dimensional CT image generating apparatus according to a third embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Furthermore, in the description of the present specification and the appended claims, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
Before describing the present application, for the convenience of the reader, the terms involved in the present application will be explained.
Rigid transformation refers to a transformation in which the distance between any two points of an image remains unchanged in the transformed image. A rigid transformation can be decomposed into steps such as translation and rotation.
Rigid registration calculates the transformation parameters of a rigid transformation (rotation parameters and translation parameters along the X, Y and Z axes, etc.). Generally, an iterative method is used: the rotation and translation parameters are updated continuously so that the similarity between the two images (a reference image and a floating image) is maximized, finally yielding the optimal rotation and translation parameters, i.e., the parameters at which the similarity is maximal.
Deformation transformation refers to a transformation in which the distance between any two points of an image is not preserved in the transformed image; it assumes that local deformation occurs in the image. B-spline transformations are typically used.
Deformation registration calculates the transformation parameters of a deformation transformation (the displacements of the deformation control points). Generally, an iterative method is used: the deformation field is updated continuously so that the similarity between the two images (a reference image and a floating image) is maximized, finally yielding the optimal deformation field, i.e., the deformation field at which the similarity is maximal.
Furthermore, rigid registration is required before deformation registration so that the two images are initially aligned.
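The defining property of a rigid transformation described above can be checked numerically. The following sketch (illustrative only, not taken from the patent) applies a rotation about the Z axis followed by a translation to two 3D points and verifies that the distance between them is preserved:

```python
import math

def rigid_transform(p, angle_z, t):
    """Rotate point p about the Z axis by angle_z (radians), then translate by t."""
    x, y, z = p
    c, s = math.cos(angle_z), math.sin(angle_z)
    return (c * x - s * y + t[0], s * x + c * y + t[1], z + t[2])

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

p1, p2 = (10.0, 0.0, 5.0), (0.0, 20.0, 5.0)
q1 = rigid_transform(p1, math.pi / 6, (3.0, -2.0, 1.0))
q2 = rigid_transform(p2, math.pi / 6, (3.0, -2.0, 1.0))
# The distance between the two points is unchanged by the rigid transform:
print(abs(dist(p1, p2) - dist(q1, q2)) < 1e-6)  # True
```

A deformation transformation, by contrast, would move the two points by different local displacements, so this distance check would fail.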
In particular implementations, the terminal devices described in the embodiments of the present application include, but are not limited to, portable devices such as mobile phones, laptop computers, or tablet computers having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad). It should also be appreciated that, in some embodiments, the device may not be a portable communication device but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad).
In the following discussion, a terminal device including a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications that may be executed on the terminal device may use at least one common physical user interface device such as a touch sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal may be adjusted and/or changed between applications and/or within the corresponding applications. In this way, the common physical architecture (e.g., touch-sensitive surface) of the terminal may support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that, the sequence number of each step in this embodiment does not mean the execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not limit the implementation process of the embodiment of the present application in any way.
In order to illustrate the technical scheme of the application, the following description is made by specific examples.
Referring to fig. 1, a flowchart of an implementation of a four-dimensional CT image generating method according to an embodiment of the present application is shown, where the generating method is applied to a terminal device. As shown in fig. 1, the generating method may include the steps of:
step 101, when a target user is in operation and the respiratory state of the target user is a reference respiratory state, acquiring a reference three-dimensional CT image obtained by the CT equipment through scanning the target user.
The target user may be any user who needs to be scanned by the CT apparatus, for example, a patient with a lesion of the chest or abdominal organs.
The reference respiratory state may be a preset respiratory state; a respiratory state that is easily reproduced among all respiratory states may be used as the reference respiratory state.
Because the end of exhalation is a position that is easy to reproduce, the target user may hold his or her breath at the end of exhalation, and the CT device can scan the target user at this moment to obtain the reference three-dimensional CT image.
In order to acquire the reference three-dimensional CT image accurately, when the target user is scanned by the CT device, the respiration of the target user may be monitored by a respiration monitoring device to obtain a respiration signal of the target user, and the respiration signal is sent to the terminal device. The terminal device judges from the respiration signal whether the respiratory state of the target user is the end of exhalation; when it is, the CT device scans the target user to obtain the reference three-dimensional CT image. The respiration monitoring device may be any device capable of monitoring the respiration of a user, for example, a device that measures the respiration volume with a spirometer, measures the height variation of the body surface as it rises and falls with respiration using an infrared imaging device, or measures the pressure difference caused by respiration with a pressure sensor, and converts these measurements into a respiration signal.
The terminal device may extract the exhalation amount of the target user from the respiration signal; when the exhalation amount is zero or less than an exhalation amount threshold, it determines that the respiratory state of the target user is the end of exhalation; otherwise, it determines that the respiratory state is not the end of exhalation.
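The end-of-exhalation check described above can be sketched as follows. The threshold value and the signal representation are assumptions for illustration; the patent does not specify them:

```python
EXHALATION_THRESHOLD = 0.05  # assumed threshold value, arbitrary units

def is_end_of_exhalation(exhalation_amount, threshold=EXHALATION_THRESHOLD):
    """Return True when the exhalation amount extracted from the respiration
    signal is zero or below the threshold, i.e. the end of exhalation."""
    return exhalation_amount == 0 or exhalation_amount < threshold

print(is_end_of_exhalation(0.0))   # True  -> trigger the CT scan
print(is_end_of_exhalation(0.4))   # False -> keep waiting
```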
It should be noted that, after the CT apparatus obtains the reference three-dimensional CT image, the reference three-dimensional CT image may be sent to the terminal apparatus, so that the terminal apparatus obtains the reference three-dimensional CT image.
Step 102, determining a first three-dimensional CT image of a reference phase from the four-dimensional CT images before operation.
The preoperative four-dimensional CT image comprises a first three-dimensional CT image of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state.
In this embodiment, since one respiratory cycle is generally divided into N phases, the four-dimensional CT image may likewise be divided into N phases, each phase having a corresponding respiratory state. For example, a four-dimensional CT image may be divided into 10 phases: 0%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90%; by convention, phase 0% corresponds to the end of inhalation and phase 50% corresponds to the end of exhalation. When the reference respiratory state is the end of exhalation, the reference phase is the 50% phase. An exemplary diagram of one respiratory cycle is shown in fig. 2, where the cycle is divided into 10 phases. An exemplary view of the first three-dimensional CT images of the 10 phases is shown in fig. 3.
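The phase division and the phase convention stated above can be expressed as a small sketch (the state names are illustrative labels, not terms from the patent):

```python
# Divide one respiratory cycle into N = 10 phases: 0%, 10%, ..., 90%.
N = 10
phases = [100 * i // N for i in range(N)]

def reference_phase(reference_state):
    # Convention from the text: phase 0% = end of inhalation,
    # phase 50% = end of exhalation.
    return {"end_of_inhalation": 0, "end_of_exhalation": 50}[reference_state]

print(phases)                                  # [0, 10, 20, ..., 90]
print(reference_phase("end_of_exhalation"))    # 50
```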
The method for acquiring the preoperative four-dimensional CT image can be described in the related art, and will not be described herein.
And step 103, performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters.
Since the reference three-dimensional CT image is acquired while the target user is in the operation and the respiratory state is the reference respiratory state, the reference three-dimensional CT image can be understood as the second three-dimensional CT image of the reference phase corresponding to the reference respiratory state. Therefore, by performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase, the image registration parameters for transforming the preoperative four-dimensional CT image into the intraoperative four-dimensional CT image can be obtained.
As an alternative embodiment, performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase, and obtaining the image registration parameters includes:
rigid registration and deformation registration are performed sequentially on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase, to obtain the translation and rotation parameters corresponding to the rigid registration and the deformation field corresponding to the deformation registration.
In this embodiment, the reference three-dimensional CT image may be used as the reference image. First, the reference three-dimensional CT image is rigidly registered with the first three-dimensional CT image of the reference phase to obtain the translation and rotation parameters corresponding to the rigid registration; then the two images undergo deformation registration to obtain image registration parameters such as the deformation field corresponding to the deformation registration.
Performing rigid registration before the deformation registration initially aligns the reference three-dimensional CT image with the first three-dimensional CT image of the reference phase, which improves the accuracy of the deformation registration.
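The idea of iteratively updating transformation parameters to maximize a similarity measure, as described for rigid registration above, can be illustrated with a deliberately tiny 1D toy (not the patent's method; the similarity metric and the exhaustive search are simplifications chosen for clarity):

```python
def similarity(ref, flo, shift):
    # Negative sum of squared differences over the overlapping samples
    # after shifting the floating signal by `shift` (higher = more similar).
    pairs = [(ref[i], flo[i - shift]) for i in range(len(ref))
             if 0 <= i - shift < len(flo)]
    return -sum((a - b) ** 2 for a, b in pairs)

reference = [0, 0, 1, 4, 1, 0, 0, 0]   # "reference image"
floating  = [0, 0, 0, 0, 1, 4, 1, 0]   # "floating image": same peak, two samples later

# Try candidate translations and keep the one that maximizes similarity,
# mirroring the iterative parameter update described in the text.
best_shift = max(range(-3, 4), key=lambda s: similarity(reference, floating, s))
print(best_shift)  # -2 under this sign convention: a perfect overlap
```

Real rigid registration optimizes six parameters (three rotations, three translations) over 3D volumes with a gradient-based optimizer rather than an exhaustive search, but the objective, maximize similarity between reference and floating image, is the same.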
And 104, transforming the first three-dimensional CT image of the residual phase according to the image registration parameters to obtain a second three-dimensional CT image of the residual phase.
Wherein the remaining phases refer to phases other than the reference phase among the N phases.
According to the image registration parameters obtained in step 103, the first three-dimensional CT images of all phases in the preoperative four-dimensional CT image could be transformed into the second three-dimensional CT images of the corresponding phases in the intraoperative four-dimensional CT image. However, since the reference three-dimensional CT image already serves as the second three-dimensional CT image of the reference phase, only the first three-dimensional CT images of the remaining N-1 phases need to be transformed.
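A minimal sketch of this step, with the registration parameters and the "transform" reduced to placeholders (the real operation would resample each volume through the rigid parameters and the deformation field):

```python
def transform(image, params):
    # Placeholder warp: a real implementation would resample the 3D volume
    # using the rigid translation/rotation parameters and the deformation field.
    return [voxel + params["offset"] for voxel in image]

N = 10
reference_phase = 50
# Toy "volumes": one small voxel list per phase 0%, 10%, ..., 90%.
first_images = {100 * i // N: [i, i + 1, i + 2] for i in range(N)}
params = {"offset": 100}   # stand-in for the image registration parameters

# Transform only the remaining N-1 phases; the reference phase is already
# covered by the intraoperative reference three-dimensional CT image.
second_images = {phase: transform(img, params)
                 for phase, img in first_images.items()
                 if phase != reference_phase}

print(sorted(second_images))  # [0, 10, 20, 30, 40, 60, 70, 80, 90]
print(len(second_images))     # 9
```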
Step 105, generating an intraoperative four-dimensional CT image according to the reference three-dimensional CT image and the second three-dimensional CT image of the residual phase.
Wherein the reference three-dimensional CT image is a second three-dimensional CT image of a reference phase in the intra-operative four-dimensional CT image.
In one embodiment, the order of the second three-dimensional CT images in the intraoperative four-dimensional CT image may be determined from the order of the first three-dimensional CT images of the N phases in the preoperative four-dimensional CT image.
For example, if the first three-dimensional CT images of the 10 phases in the preoperative four-dimensional CT image are ordered as the first three-dimensional CT image of phase 0%, phase 10%, phase 20%, phase 30%, phase 40%, phase 50%, phase 60%, phase 70%, phase 80% and phase 90%, then the second three-dimensional CT images of the 10 phases in the intraoperative four-dimensional CT image follow the same order: the second three-dimensional CT image of phase 0%, phase 10%, phase 20%, phase 30%, phase 40%, phase 50%, phase 60%, phase 70%, phase 80% and phase 90%.
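The assembly of the intraoperative four-dimensional CT image in step 105 can be sketched as follows; the string labels stand in for actual 3D volumes:

```python
# Preoperative phase order, as in the 10-phase example above.
phase_order = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]
reference_phase = 50

# Second 3D CT images produced for the remaining phases (toy placeholders).
second_images = {p: f"second_{p}" for p in phase_order if p != reference_phase}
reference_image = "reference_ct"   # the intraoperative scan at the reference phase

# The intraoperative 4D CT keeps the preoperative phase order, with the
# reference 3D CT image occupying the reference-phase slot.
intraoperative_4d = [reference_image if p == reference_phase else second_images[p]
                     for p in phase_order]
print(intraoperative_4d[5])  # 'reference_ct' sits in the 50% slot
```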
According to the embodiments of the present application, when the target user is in an operation and the respiratory state of the target user is the reference respiratory state, the reference three-dimensional CT image obtained by the CT device scanning the target user can be acquired, and the first three-dimensional CT image of the reference phase corresponding to the reference respiratory state is determined from the preoperative four-dimensional CT image. Image registration parameters are obtained by performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase; the first three-dimensional CT images of the remaining phases are transformed according to these parameters to obtain the second three-dimensional CT images of the remaining phases; and the intraoperative four-dimensional CT image is generated from the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases.
Referring to fig. 4, a schematic implementation flow diagram of a four-dimensional CT image generating method according to a second embodiment of the present application is shown, where the generating method is applied to a terminal device. As shown in fig. 4, the generating method may include the steps of:
step 401, when the target user is in operation and the respiratory state is the reference respiratory state, acquiring a reference three-dimensional CT image obtained by the CT device through scanning the target user.
The step is the same as step 101, and specific reference may be made to the description related to step 101, which is not repeated here.
Step 402, determining a first three-dimensional CT image of a reference phase from the pre-operative four-dimensional CT images.
The step is the same as step 102, and the detailed description of step 102 is omitted here.
Step 403, performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters.
The step is the same as step 103, and specific reference may be made to the related description of step 103, which is not repeated here.
And step 404, transforming the first three-dimensional CT image of the residual phase according to the image registration parameters to obtain a second three-dimensional CT image of the residual phase.
The step is the same as step 104, and the detailed description of step 104 is omitted here.
In step 405, an intra-operative four-dimensional CT image is generated from the reference three-dimensional CT image and the second three-dimensional CT image of the remaining phases.
This step is the same as step 105, and specific reference may be made to the description related to step 105, which is not repeated here.
Step 406, the current respiration state of the target user is obtained.
In one embodiment, the respiration signal of the target user may be obtained by the respiration monitoring device and sent to the terminal device, and the terminal device may determine the current respiration state of the target user by analyzing the received respiration signal of the target user.
As an alternative embodiment, the image registration parameters include the deformation field, and when the current respiratory state of the target user is acquired, the method further includes:
acquiring an intraoperative three-dimensional CT image of a target user, wherein the intraoperative three-dimensional CT image comprises a puncture path of a puncture needle;
updating the puncture path according to the deformation field to obtain a target puncture path;
and displaying a target puncture path on a second three-dimensional CT image of the current time phase, wherein the target puncture path is used for guiding a puncture needle to puncture.
When the current respiratory state of the target user is acquired, an intraoperative three-dimensional CT image of the target user in the current respiratory state can be acquired synchronously, and puncture path planning is performed on the intraoperative three-dimensional CT image to obtain the puncture path of the puncture needle.
Because the interior of the intra-operative three-dimensional CT image may be deformed compared to the second three-dimensional CT image of the current phase, the puncture path needs to be updated according to the deformation field to improve the accuracy of the puncture path on the second three-dimensional CT image of the current phase.
According to the target puncture path displayed on the second three-dimensional CT image of the current time phase and the position of the puncture needle on the second three-dimensional CT image of the current time phase, a doctor can adjust the position of the puncture needle in an actual scene, so that the puncture needle can puncture along the target puncture path, the position of the puncture needle changes according to different respiratory motions, and the positioning and navigation accuracy in the chest and abdomen operation is improved.
As an alternative embodiment, the puncture path refers to a path from a puncture point to a target point, and updating the puncture path according to the deformation field includes:
and updating the position information of the target point in the puncture path according to the deformation field.
Because the intraoperative three-dimensional CT image and the second three-dimensional CT image of the current phase lie in the same space, and because, even though the interior of the intraoperative image may be deformed, the puncture entry point generally does not move while the target point generally does, only the position information of the target point needs to be updated according to the deformation field, which speeds up the update of the puncture path. The updated position information of the target point refers to the position of the target point in the second three-dimensional CT image of the current phase.
An exemplary view of the puncture path is shown in fig. 5. The entry point in fig. 5 is referred to as a puncture entry point.
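The target-point update described above can be sketched as follows, assuming (for illustration only) that the deformation field is stored as a dense grid of per-voxel displacement vectors of shape (Z, Y, X, 3) and that the target point is given in voxel coordinates; nearest-voxel sampling stands in for the interpolation a real system would use.

```python
import numpy as np

def update_target(target_vox, deformation_field):
    """Move a target point by the displacement stored at its voxel.

    target_vox:        (z, y, x) voxel coordinates of the target point.
    deformation_field: array of shape (Z, Y, X, 3) holding one
                       displacement vector per voxel.
    Returns the updated (z, y, x) position in the current-phase image.
    """
    z, y, x = (int(round(c)) for c in target_vox)
    disp = deformation_field[z, y, x]          # displacement at the target
    return np.asarray(target_vox, float) + disp
```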
Step 407, determining a second three-dimensional CT image of the current phase from the four-dimensional CT images in the operation.
The intraoperative four-dimensional CT image comprises second three-dimensional CT images of N phases, and each of the N phases corresponds to one respiratory state, so the second three-dimensional CT image of the current phase, corresponding to the current respiratory state, can be found in the intraoperative four-dimensional CT image. The current phase refers to the phase corresponding to the current respiratory state.
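Since each phase corresponds to exactly one respiratory state, the lookup reduces to indexing the intraoperative four-dimensional CT image by phase. A minimal sketch, modeling the four-dimensional image as a phase-indexed list (the data structures and names are assumptions for illustration):

```python
def second_ct_for_state(four_d_ct, state_to_phase, current_state):
    """Pick the second 3D CT image matching the current respiratory state.

    four_d_ct:      list of N second 3D CT images, one per phase.
    state_to_phase: mapping from a respiratory state label to its
                    phase index.
    """
    phase = state_to_phase[current_state]   # the current phase
    return four_d_ct[phase]
```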
At step 408, the needle is displayed on the second three-dimensional CT image of the current phase.
Wherein the puncture needle is located at a first position of the target user.
Displaying the puncture needle on the second three-dimensional CT image of the current phase maps the needle's position in the actual scene into the image space, so that a doctor can determine the needle's position in the actual scene by viewing its position in the second three-dimensional CT image of the current phase.
As an alternative embodiment, displaying the puncture needle on the second three-dimensional CT image of the current phase comprises:
acquiring position information of the puncture needle in a reference frame space, wherein the reference frame is positioned at a second position of a target user, and the second position is different from the first position;
and transforming the position information of the puncture needle from the reference frame space to the second three-dimensional image of the current phase to obtain the position information of the puncture needle in that image, and displaying the puncture needle at the corresponding position in the second three-dimensional image of the current phase.
A reference frame can be arranged at a second position on the target user; the reference frame moves synchronously with the target user and therefore reflects the user's posture and positioning.
The reference frame space may refer to a coordinate system established based on the reference frame, for example, a three-dimensional coordinate system established with the reference frame as an origin.
In one embodiment, the terminal device may determine the coordinate transformation relation from the binocular positioning system space to the reference frame space according to the coordinates of the reference frame in the binocular positioning system space; the position information of the puncture needle in the reference frame space can then be obtained from the coordinates of the puncture needle in the binocular positioning system space and this coordinate transformation relation.
Here, the reference frame and the puncture needle are both within the shooting range of the binocular positioning system, which obtains the coordinates of the reference frame and the puncture needle in its own space through two built-in cameras. The binocular positioning system space may refer to the coordinate system of either of the two cameras of the binocular positioning system and may also be called the camera coordinate system.
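The coordinate change from the binocular (camera) space to the reference frame space can be sketched with 4x4 homogeneous matrices. Here `T_ref_in_cam`, the pose of the reference frame in camera space as reported by the binocular system, is an assumed input; the camera-to-reference-frame transformation is its inverse.

```python
import numpy as np

def needle_in_ref_frame(p_needle_cam, T_ref_in_cam):
    """Express a needle point, measured in camera space, in reference frame space.

    p_needle_cam: (3,) needle coordinates in the binocular (camera) system.
    T_ref_in_cam: (4, 4) pose of the reference frame in camera space.
    """
    T_cam_to_ref = np.linalg.inv(T_ref_in_cam)            # camera -> reference frame
    p = np.append(np.asarray(p_needle_cam, float), 1.0)   # homogeneous coordinates
    return (T_cam_to_ref @ p)[:3]
```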
As an alternative embodiment, transforming the position information of the puncture needle from the reference frame space to the second three-dimensional image of the current phase comprises:
acquiring a coordinate transformation relation between a reference frame space and an image space, wherein the image space refers to a coordinate system of a three-dimensional CT image in an operation;
and transforming the position information of the puncture needle from the reference frame space to a second three-dimensional image of the current time phase according to the coordinate transformation relation between the reference frame space and the image space and the deformation field.
The terminal device can spatially register the reference frame space and the image space using a spatial registration algorithm such as a least-squares method or the iterative closest point (ICP) algorithm, obtaining the coordinate transformation relation between the reference frame space and the image space.
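The least-squares part of such a spatial registration can be illustrated with the classic closed-form rigid point-set fit (the Kabsch/Procrustes solution, which ICP also uses in its inner step). This is a generic sketch assuming corresponding point pairs are already available, not the patent's specific implementation:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t with dst ~ (R @ src_i) + t.

    src, dst: (N, 3) arrays of corresponding points in the two spaces.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In practice the point pairs would come from fiducials visible both on the reference frame and in the intraoperative CT image.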
The terminal device can transform the position information of the puncture needle from the reference frame space into the intraoperative three-dimensional CT image according to the coordinate transformation relation between the reference frame space and the image space, and then from the intraoperative three-dimensional CT image into the second three-dimensional image of the current phase according to the deformation field.
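Chaining the two mappings (reference frame space to image space via the rigid coordinate transformation, then image space to the current phase via the deformation field) can be sketched as follows; the 4x4 matrix input and the dense displacement-grid representation of the deformation field are assumptions for illustration.

```python
import numpy as np

def needle_in_current_phase(p_ref, T_ref_to_img, deformation_field):
    """Map a needle point from reference frame space into the
    second 3D image of the current phase.

    Stage 1: rigid transform into the intraoperative image space.
    Stage 2: add the displacement the deformation field stores at
             the resulting voxel (nearest-voxel sampling).
    """
    p = np.append(np.asarray(p_ref, float), 1.0)
    p_img = (T_ref_to_img @ p)[:3]                     # stage 1
    z, y, x = (int(round(c)) for c in p_img)
    return p_img + deformation_field[z, y, x]          # stage 2
```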
It should be noted that the terminal device, the binocular positioning system, the CT device, the respiration monitoring device, the reference frame, and the puncture needle in this embodiment may form an intraoperative navigation and positioning system. Through this system, the puncture path can be predicted and updated based on the intraoperative four-dimensional CT image, improving the accuracy of positioning and navigation in the chest and abdomen.
On the basis of the first embodiment, this embodiment acquires the current respiratory state of the target user, determines the second three-dimensional CT image of the current phase from the generated intraoperative four-dimensional CT image, and displays the puncture needle and the puncture path on that image, so that a doctor can adjust the position of the puncture needle in real time according to the puncture path, improving positioning and navigation accuracy in thoracoabdominal surgery.
Referring to fig. 6, a schematic structural diagram of a four-dimensional CT image generating apparatus according to a third embodiment of the present application is shown, and for convenience of explanation, only a portion related to the embodiment of the present application is shown.
The generating device includes:
a first obtaining module 61, configured to obtain a reference three-dimensional CT image obtained by scanning a target user by a CT apparatus when the target user is in an operation and a respiratory state thereof is a reference respiratory state;
a first determining module 62, configured to determine a first three-dimensional CT image of a reference phase from pre-operation four-dimensional CT images, where the pre-operation four-dimensional CT image includes a first three-dimensional CT image of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state;
an image registration module 63, configured to perform image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase, so as to obtain an image registration parameter;
an image transformation module 64, configured to transform the first three-dimensional CT image of a remaining phase according to the image registration parameter, to obtain a second three-dimensional CT image of the remaining phase, where the remaining phase is a phase other than the reference phase in the N phases;
an image generation module 65, configured to generate an intra-operative four-dimensional CT image according to the reference three-dimensional CT image and the second three-dimensional CT images of the remaining phases, where the reference three-dimensional CT image is the second three-dimensional CT image of the reference phase in the intra-operative four-dimensional CT image.
Optionally, the generating device further includes:
the state acquisition module is used for acquiring the current breathing state of the target user;
the second determining module is used for determining a second three-dimensional CT image of a current phase from the intraoperative four-dimensional CT images, wherein the current phase is the phase corresponding to the current respiratory state;
and the puncture needle display module is used for displaying a puncture needle on the second three-dimensional CT image of the current time phase, and the puncture needle is positioned at the first position of the target user.
Optionally, the generating device further includes:
the second acquisition module is used for acquiring an intraoperative three-dimensional CT image of the target user, wherein the intraoperative three-dimensional CT image comprises a puncture path of the puncture needle;
the path updating module is used for updating the puncture path according to the deformation field to obtain a target puncture path;
the path display module is used for displaying the target puncture path on the second three-dimensional CT image of the current time phase, and the target puncture path is used for guiding the puncture needle to puncture.
Optionally, the puncture path refers to a path from a puncture point to a target point, and the path update module is specifically configured to:
and updating the position information of the target point in the puncture path according to the deformation field.
Optionally, the path display module includes:
a position acquisition unit, configured to acquire position information of the puncture needle in a reference frame space, where the reference frame is located at a second position of the target user, where the second position is different from the first position;
and the position conversion unit is used for converting the position information of the puncture needle from the reference frame space to the second three-dimensional image of the current time phase, obtaining the position information of the puncture needle in the second three-dimensional image of the current time phase, and displaying the puncture needle at the corresponding position in the second three-dimensional image of the current time phase.
Optionally, the above-mentioned position conversion unit is specifically configured to:
acquiring a coordinate transformation relation between the reference frame space and an image space, wherein the image space refers to a coordinate system of the intraoperative three-dimensional CT image;
and transforming the position information of the puncture needle from the reference frame space to a second three-dimensional image of the current time phase according to the coordinate transformation relation between the reference frame space and the image space and the deformation field.
Optionally, the image registration module 63 is specifically configured to:
and carrying out rigid registration and deformation registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference time phase in sequence to obtain translation parameters and rotation parameters corresponding to the rigid registration and deformation fields corresponding to the deformation registration.
The generating device provided in this embodiment of the present application may be applied to the first and second method embodiments described above; for details, refer to the descriptions of those embodiments, which are not repeated here.
Fig. 7 is a schematic structural diagram of a terminal device according to a fourth embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: one or more processors 70 (only one shown), a memory 71, and a computer program 72 stored in the memory 71 and executable on the processor 70. The steps of the various generation method embodiments described above are implemented by the processor 70 when executing the computer program 72.
The terminal device 7 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 70 and the memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation of the terminal device 7, which may include more or fewer components than shown, combine certain components, or use different components; for example, the terminal device may further include input-output devices, network access devices, buses, and the like.
The processor 70 may be a central processing unit (Central Processing Unit, CPU), or may be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), an off-the-shelf programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing the computer program as well as other programs and data required by the terminal device. The memory 71 may also be used for temporarily storing data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Each of the foregoing embodiments is described with its own emphasis; for parts not detailed or illustrated in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be adjusted according to the requirements of legislation and patent practice in each jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer readable media do not include electrical carrier signals and telecommunication signals.
The present application may also be implemented as a computer program product which, when run on a terminal device, causes the terminal device to perform all or part of the steps of the method embodiments described above.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (7)

1. A method for generating a four-dimensional CT image, the method comprising:
when a target user is undergoing an operation and the respiratory state of the target user is a reference respiratory state, acquiring a reference three-dimensional CT image obtained by a CT device scanning the target user;
determining a first three-dimensional CT image of a reference phase from preoperative four-dimensional CT images, wherein the preoperative four-dimensional CT images comprise first three-dimensional CT images of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state;
performing image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters, wherein the image registration parameters comprise deformation fields;
transforming the first three-dimensional CT image of the residual phase according to the image registration parameters to obtain a second three-dimensional CT image of the residual phase, wherein the residual phase refers to a phase other than the reference phase in the N phases;
generating an intraoperative four-dimensional CT image according to the reference three-dimensional CT image and the second three-dimensional CT image of the residual phase, wherein the reference three-dimensional CT image is the second three-dimensional CT image of the reference phase in the intraoperative four-dimensional CT image;
determining a second three-dimensional CT image of a current phase according to the current respiratory state of the target user and the intraoperative four-dimensional CT image, wherein the current phase is a phase corresponding to the current respiratory state;
displaying a puncture needle on a second three-dimensional CT image of the current time phase, wherein the puncture needle is positioned at a first position of the target user;
acquiring position information of the puncture needle in a reference frame space, wherein the reference frame is positioned at a second position of the target user, and the second position is different from the first position;
acquiring a coordinate transformation relation between the reference frame space and an image space, wherein the image space refers to a coordinate system of an intraoperative three-dimensional CT image, and the intraoperative three-dimensional CT image acquired when the current respiratory state of the target user is acquired comprises a puncture path of the puncture needle;
according to the coordinate transformation relation between the reference frame space and the image space and the deformation field, the position information of the puncture needle is transformed from the reference frame space to the second three-dimensional image of the current time phase, the position information of the puncture needle in the second three-dimensional image of the current time phase is obtained, and the puncture needle is displayed at the corresponding position in the second three-dimensional image of the current time phase.
2. The generating method according to claim 1, wherein when acquiring the current respiration state of the target user, further comprising:
acquiring an intraoperative three-dimensional CT image of the target user, wherein the intraoperative three-dimensional CT image comprises a puncture path of the puncture needle;
updating the puncture path according to the deformation field to obtain a target puncture path;
and displaying the target puncture path on the second three-dimensional CT image of the current time phase, wherein the target puncture path is used for guiding the puncture needle to puncture.
3. The generating method of claim 2, wherein the puncture path refers to a path from a puncture point to a target point, and updating the puncture path according to the deformation field comprises:
and updating the position information of the target point in the puncture path according to the deformation field.
4. The generating method according to any one of claims 1 to 3, wherein said performing image registration on said reference three-dimensional CT image and said first three-dimensional CT image of said reference phase to obtain image registration parameters comprises:
and carrying out rigid registration and deformation registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference time phase in sequence to obtain translation parameters and rotation parameters corresponding to the rigid registration and deformation fields corresponding to the deformation registration.
5. A four-dimensional CT image generation apparatus, the generation apparatus comprising:
the first acquisition module is used for acquiring a reference three-dimensional CT image obtained by the CT equipment through scanning the target user when the target user is in operation and the breathing state of the target user is a reference breathing state;
the first determining module is used for determining a first three-dimensional CT image of a reference phase from four-dimensional CT images before operation, wherein the four-dimensional CT images before operation comprise first three-dimensional CT images of N phases, N is an integer greater than 1, and the reference phase is a phase corresponding to the reference respiratory state;
the image registration module is used for carrying out image registration on the reference three-dimensional CT image and the first three-dimensional CT image of the reference phase to obtain image registration parameters, wherein the image registration parameters comprise deformation fields;
the image transformation module is used for transforming the first three-dimensional CT image of the residual phase according to the image registration parameters to obtain a second three-dimensional CT image of the residual phase, and the residual phase refers to a phase other than the reference phase in the N phases;
an image generation module, configured to generate an intra-operative four-dimensional CT image according to the reference three-dimensional CT image and a second three-dimensional CT image of the remaining phase, where the reference three-dimensional CT image is the second three-dimensional CT image of the reference phase in the intra-operative four-dimensional CT image;
the second determining module is used for determining a second three-dimensional CT image of a current phase according to the current respiratory state of the target user and the intraoperative four-dimensional CT image, wherein the current phase is a phase corresponding to the current respiratory state;
the puncture needle display module is used for displaying a puncture needle on the second three-dimensional CT image of the current time phase, and the puncture needle is positioned at the first position of the target user;
the path display module comprises a position acquisition unit, configured to acquire position information of the puncture needle in a reference frame space, wherein the reference frame is located at a second position of the target user, the second position being different from the first position;
the path display module comprises a position conversion unit for:
acquiring a coordinate transformation relation between the reference frame space and an image space, wherein the image space refers to a coordinate system of an intraoperative three-dimensional CT image, and the intraoperative three-dimensional CT image acquired when the current respiratory state of a target user is acquired comprises a puncture path of a puncture needle;
according to the coordinate transformation relation between the reference frame space and the image space and the deformation field, the position information of the puncture needle is transformed from the reference frame space to the second three-dimensional image of the current time phase, the position information of the puncture needle in the second three-dimensional image of the current time phase is obtained, and the puncture needle is displayed at the corresponding position in the second three-dimensional image of the current time phase.
6. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the generating method according to any one of claims 1 to 4 when executing the computer program.
7. A computer-readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the generating method according to any one of claims 1 to 4.
CN202110938287.2A 2021-08-16 2021-08-16 Four-dimensional CT image generation method, device, terminal equipment and storage medium Active CN113781593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110938287.2A CN113781593B (en) 2021-08-16 2021-08-16 Four-dimensional CT image generation method, device, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113781593A CN113781593A (en) 2021-12-10
CN113781593B true CN113781593B (en) 2023-10-27

Family

ID=78837944


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105816196A (en) * 2016-05-13 2016-08-03 上海联影医疗科技有限公司 Marking tape for 4DCT imaging and 4DCT imaging method
CN112515763A (en) * 2020-11-27 2021-03-19 中国科学院深圳先进技术研究院 Target positioning display method, system and device and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130035588A1 (en) * 2011-08-03 2013-02-07 Siemens Corporation Magnetic resonance imaging for therapy planning
GB2536274B (en) * 2015-03-12 2019-10-16 Mirada Medical Ltd Method and apparatus for assessing image registration

Similar Documents

Publication Publication Date Title
US9384528B2 (en) Image annotation using a haptic plane
CN107133946B (en) Medical image processing method, device and equipment
US20110262015A1 (en) Image processing apparatus, image processing method, and storage medium
CN102667857B (en) Bone in X-ray photographs suppresses
Maier-Hein et al. Towards mobile augmented reality for on-patient visualization of medical images
US11468589B2 (en) Image processing apparatus, image processing method, and program
CN109152566B (en) Correcting for probe-induced deformations in ultrasound fusion imaging systems
CN102727204B (en) Information processing apparatus, information processing method, and imaging system
CN111080583B (en) Medical image detection method, computer device, and readable storage medium
US9675311B2 (en) Follow up image acquisition planning and/or post processing
US9020215B2 (en) Systems and methods for detecting and visualizing correspondence corridors on two-dimensional and volumetric medical images
WO2009093693A1 (en) Image generation device, image generation method, and program
CN108038904B (en) Three-dimensional reconstruction system for medical images
CN109350059B (en) Combined steering engine and landmark engine for elbow auto-alignment
US10810717B2 (en) Image processing apparatus, image processing method, and image processing system
US20110115785A1 (en) Image processing apparatus, method, and program
US11495346B2 (en) External device-enabled imaging support
CN113781593B (en) Four-dimensional CT image generation method, device, terminal equipment and storage medium
CN116439691A (en) Joint activity detection method based on artificial intelligence and related equipment
CN113262048B (en) Spatial registration method and device, terminal equipment and intraoperative navigation system
CN113888566B (en) Target contour curve determination method and device, electronic equipment and storage medium
CN108510432B (en) Method, device and system for displaying image and storage medium
US11138736B2 (en) Information processing apparatus and information processing method
CN112365492A (en) Image scanning method, image scanning device, electronic equipment and storage medium
JP6732593B2 (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant