CN114332281A - Image reconstruction method and system - Google Patents

Image reconstruction method and system

Info

Publication number
CN114332281A
Authority
CN
China
Prior art keywords
scanning
layer
data
scan
reconstructed
Prior art date
Legal status
Pending
Application number
CN202111674837.0A
Other languages
Chinese (zh)
Inventor
闫晶
周海华
冯娟
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202111674837.0A
Publication of CN114332281A
Legal status: Pending

Abstract

Embodiments of the present application disclose an image reconstruction method. The image reconstruction method comprises the following steps: scanning at least one scanning layer of a scan object with the scanning source and the detector at a deflection position to acquire scan data; determining segmentation points of the imaging regions on the detector respectively corresponding to the at least one scanning layer; and determining data to be reconstructed based on the scan data and the segmentation points respectively corresponding to the at least one scanning layer, the data to be reconstructed being used for reconstructing an image; wherein the deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection.

Description

Image reconstruction method and system
Technical Field
The present disclosure relates to the field of medical imaging, and more particularly, to a method and system for reconstructing a medical image.
Background
Medical imaging equipment has become indispensable in the modern medical field, and image reconstruction is a key technology in medical imaging. More specifically, a medical imaging device (e.g., a computed tomography (CT) device) scans a patient and reconstructs images from the resulting scan data. During scanning, mechanical deviation of a component (e.g., a mechanical arm) in the medical imaging device may introduce artifacts into the reconstructed image, degrading its quality.
Therefore, there is a need for an image reconstruction method for medical imaging equipment that reduces the influence of artifacts on the quality of reconstructed images, thereby improving the efficiency and accuracy of medical analysis and/or diagnosis.
Disclosure of Invention
One of the embodiments of the present specification provides an image reconstruction method. The image reconstruction method comprises the following steps: scanning at least one scanning layer of a scan object with the scanning source and the detector at a deflection position to acquire scan data; determining segmentation points of the imaging regions on the detector respectively corresponding to the at least one scanning layer; and determining data to be reconstructed based on the scan data and the segmentation points respectively corresponding to the at least one scanning layer, the data to be reconstructed being used for reconstructing an image; wherein the deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection.
One of the embodiments of the present specification provides an image reconstruction system, comprising: an acquisition module configured to scan at least one scanning layer of a scan object with the scanning source and the detector at a deflection position to acquire scan data; and a determination module configured to determine segmentation points of the imaging regions on the detector respectively corresponding to the at least one scanning layer, and to determine data to be reconstructed based on the scan data and the segmentation points respectively corresponding to the at least one scanning layer, the data to be reconstructed being used for reconstructing an image; wherein the deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection.
One of the embodiments of the present specification provides an image reconstruction apparatus, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements a method of image reconstruction when executing the computer program.
One of the embodiments of the present specification provides a computer-readable storage medium storing computer instructions, and when the computer instructions in the storage medium are read by a computer, the computer executes an image reconstruction method.
Drawings
The present description is further illustrated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in these embodiments, like numerals indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of an image reconstruction system according to some embodiments of the present description;
FIG. 2 is an exemplary block diagram of an image reconstruction system according to some embodiments of the present description;
FIG. 3 is an exemplary flow diagram of an image reconstruction method according to some embodiments of the present description;
FIG. 4 is an exemplary flow diagram illustrating the determination of data to be reconstructed according to some embodiments of the present description;
FIG. 5 is an exemplary schematic diagram illustrating scanning based on deflection position according to some embodiments herein;
FIG. 6 is an exemplary diagram illustrating determining a mapping matrix according to some embodiments of the present description;
FIG. 7 is an exemplary diagram illustrating fitting determination of segmentation points according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a", "an", and "the" may include plural referents unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed exactly in the order shown; rather, the steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic view of an application scenario of an image reconstruction system according to some embodiments of the present disclosure.
As shown in fig. 1, the image reconstruction system 100 may include a processing device 110, a network 120, a user terminal 130, a storage device 140, and a medical imaging device 150.
In some embodiments, the image reconstruction system 100 may enable the reconstruction of medical images by implementing the methods and/or processes disclosed herein.
Processing device 110 may process data and/or information obtained from user terminal 130, medical imaging device 150, and/or storage device 140. Processing device 110 may access information and/or data through network 120 or directly from user terminal 130, storage device 140, and/or medical imaging device 150. For example, the processing device 110 may obtain scan data of the medical imaging device 150 from the user terminal 130 and/or the medical imaging device 150. Processing device 110 may process the acquired data and/or information; for example, the processing device 110 may process the acquired scan data to determine the data to be reconstructed and further reconstruct the image. In some embodiments, the processing device 110 may be a single server or a group of servers. The processing device 110 may be disposed in the medical imaging device 150. The processing device 110 may be local or remote. The processing device 110 may be implemented on a cloud platform.
Network 120 may include any suitable network that provides information and/or data exchange capable of facilitating image reconstruction system 100. In some embodiments, information and/or data may be exchanged between one or more components of image reconstruction system 100 (e.g., processing device 110, user terminal 130, storage device 140, and medical imaging device 150) via network 120. Network 120 may include a Local Area Network (LAN), a Wide Area Network (WAN), a wired network, a wireless network, and the like, or any combination thereof.
User terminal 130 refers to one or more terminal devices or software used by a user. In some embodiments, the user terminal 130 may be a mobile device, a tablet computer, or the like, or any combination thereof. In some embodiments, user terminal 130 may interact with other components in image reconstruction system 100 via network 120. For example, the user terminal 130 may send one or more control instructions to the medical imaging device 150 to control the processing device 110 to process the scan data of the medical imaging device 150, so as to determine the data to be reconstructed. In some embodiments, the user terminal 130 may be part of the processing device 110. In some embodiments, the user terminal 130 may be integrated with the processing device 110 as an operation console of the medical imaging device 150.
Storage device 140 may be used to store data, instructions, and/or any other information. In some embodiments, storage device 140 may store data and/or information obtained from, for example, processing device 110, user terminal 130, medical imaging device 150, and/or the like. For example, the storage device 140 may store scan data, data to be reconstructed, and the like. The storage device 140 may be provided in the medical imaging device 150. In some embodiments, storage device 140 may include mass storage, removable storage, and the like, or any combination thereof.
The medical imaging device 150 may be used to acquire scan data of a scanned object. The scan object may include a biological object (e.g., a human body, an animal, etc.), a non-biological object (e.g., a phantom), and so forth. In some embodiments, the medical imaging device 150 may include a scanning source and a detector (not shown). The scan source may be configured to emit a radiation beam (e.g., X-rays) onto a scan subject. The detector may be configured to receive the radiation beam and form scan data. For more details on the scanning source and detector, see step 310. In some embodiments, the medical imaging device 150 may be a CT imaging device, a PET-CT imaging device, or the like. In some embodiments, the processing device 110 and the storage device 140 may be part of the medical imaging device 150.
FIG. 2 is a block diagram illustrating an image reconstruction system according to some embodiments of the present application. As shown in FIG. 2, a block diagram 200 of an image reconstruction system may include an acquisition module 210 and a determination module 220.
The acquisition module 210 may be configured to scan at least one scan layer of the scan object based on the deflection positions of the scan source and the detector to acquire scan data.
The determining module 220 may be configured to determine segmentation points of the imaging regions corresponding to the at least one scanning layer on the detector, and determine data to be reconstructed based on the scanning data and the segmentation points corresponding to the at least one scanning layer, where the data to be reconstructed is used for reconstructing an image.
In some embodiments, for each of the at least one scan layer, the determining module 220 may be configured to determine scan data between a segmentation point corresponding to the scan layer and a first endpoint of the imaging region corresponding to the scan layer as the data to be reconstructed, wherein the first endpoint is determined based on the deflection direction.
In some embodiments, for each of the at least one scan layer, the determining module 220 may be configured to determine a mapping point of the rotational center point of the scan layer at the imaging region as the segmentation point.
In some embodiments, the at least one scanning layer includes a first scanning layer and a second scanning layer, and the determining module 220 is further configured to fit the segmentation points corresponding to the at least two first scanning layers, respectively, and determine the segmentation points corresponding to the second scanning layer.
See fig. 3-6 and associated description for more on the acquisition module 210 and the determination module 220.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware.
It should be noted that the above description of the image reconstruction system and its modules is provided only for convenience of description and does not limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principle of the system, any combination of the modules, or connection of a constituent subsystem to other modules, may be made without departing from that principle. In some embodiments, the acquisition module 210 and the determination module 220 disclosed in fig. 2 may be different modules of one system, or a single module may implement the functions of two or more of the modules described above. For example, the modules may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present disclosure.
FIG. 3 is an exemplary flow diagram of an image reconstruction method according to some embodiments of the present description. As shown in fig. 3, the process 300 includes the following steps. In some embodiments, flow 300 may be performed by a processor (e.g., processing device 110).
At step 310, at least one scanning layer of the scanning object is scanned based on the deflection positions of the scanning source and the detector, and scanning data is acquired. In some embodiments, step 310 may be performed by acquisition module 210.
As illustrated in fig. 1, the scan object refers to an object being scanned, and may include a biological object (e.g., a human body, an animal, etc.), a non-biological object (e.g., a phantom), and the like.
The scanning source may refer to the component of a medical imaging apparatus that emits a radiation beam toward the scan object. For example, the scanning source may include an X-ray tube that emits a radiation beam (e.g., X-rays) irradiating the scan object. The position of the scanning source can be represented by its focal point, which may be the focal point of the X-ray tube.
The detector may refer to the component of a medical imaging apparatus that receives the radiation beam after it passes through the scan object. The region of the detector that receives the radiation beam can be regarded as the imaging region of the detector. While the scanning source scans the scan object, each scanning layer has a corresponding imaging region on the detector for receiving the signals that traverse that layer. That is, the imaging region corresponding to a scanning layer is the position on the detector where the signal emitted by the scanning source when scanning that layer is received.
The deflection position refers to a position to which the scanning source and the detector are deflected from an original position, the original position being where the scanning source and detector are located before deflection. In some embodiments, scanning with the source and detector at the original position over a particular angular range (e.g., 360°) yields one reconstructed field of view (subsequently referred to as the pre-deflection reconstructed field of view) and corresponding scan data; scanning at the deflection position over the same angular range yields another reconstructed field of view (subsequently referred to as the post-deflection reconstructed field of view) and corresponding scan data. The reconstructed field of view may be the largest circular region, centered on the region of interest of the scan object, that the scanning source can cover when the medical imaging device rotates through a specific angular range (e.g., 360°).
The deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection. Illustratively, as shown in fig. 5, the circular region 502 is the reconstructed field of view obtained by rotating the medical imaging device 360° before the scanning source and detector are deflected (i.e., at their original positions), and the circular region 503 is the reconstructed field of view obtained by rotating 360° after the scanning source and detector are deflected (i.e., at the deflection position). It can be seen that the circular region 503 is larger than the circular region 502. In some embodiments, at the original position the perpendicular from the focal point of the scanning source to the detector passes through the rotation center; at the deflection position, that perpendicular does not pass through the rotation center. In some embodiments, the scanning source and detector thus expand the reconstructed field of view by deflecting, i.e., the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection. An overlapping region may occur in the post-deflection reconstructed field of view, i.e., the scan data corresponding to the reconstructed field of view may include scan data of the overlapping region. In some embodiments, the data to be reconstructed may be determined by processing the scan data corresponding to the post-deflection reconstructed field of view (e.g., removing redundant data of the overlapping region); see step 320.
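The enlargement can be made concrete with a little fan-beam geometry. The sketch below is illustrative only: it assumes a flat detector and hypothetical distances, none of which are specified in this disclosure, and `det_offset` stands in for the effective displacement produced by the deflection.

```python
import math

def fov_radius(src_iso_dist, src_det_dist, det_length, det_offset=0.0):
    """Radius of the circular region covered by a full 360-degree rotation.

    With a flat detector, the detector edge farther from the central ray
    sets the largest fan half-angle; sweeping that half-fan through 360
    degrees covers a circle of radius src_iso_dist * sin(gamma_max).
    """
    far_edge = det_length / 2.0 + abs(det_offset)
    gamma_max = math.atan2(far_edge, src_det_dist)
    return src_iso_dist * math.sin(gamma_max)

# Hypothetical dimensions in mm, for illustration only.
r_before = fov_radius(600.0, 1100.0, 400.0, det_offset=0.0)    # original position
r_after = fov_radius(600.0, 1100.0, 400.0, det_offset=120.0)   # deflection position
print(f"before: {r_before:.1f} mm, after: {r_after:.1f} mm")   # after > before
```

The price of the larger circle is that part of it is sampled by both halves of the rotation, which is exactly the overlapping region discussed below.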
The rotation center may be the center about which the scanning source and detector rotate (e.g., through 360°) while scanning the scan object; for example, it may be the line about which the scanning source and detector rotate during the scan. The rotation center may be obtained from the medical imaging device. Saying that the perpendicular from the focal point of the scanning source to the detector does not pass through the rotation center means that this perpendicular does not intersect the rotation center.
In some embodiments, the deflection position may be produced by moving the scanning source and detector, where the movement may be a rotation or a translation. The rotation may be a rotation of the scanning source and detector about a preset point; for example, the detector and the scanning source rotate about a point or line on the detector (e.g., its center point or centerline). Note that this rotation, which produces the deflection position, is different from the rotation performed during scanning to generate scan data. The translation may be a movement of at least one of the scanning source and the detector along a direction. For example, the position of the scanning source may be fixed while the detector is translated by a distance not exceeding 1/2 of the detector length. As another example, the scanning source and the detector may be translated simultaneously, in the same or opposite directions; when the directions are the same, the translation distances differ so that a relative displacement still occurs.
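As a small illustration of the rotation variant, the positions of the focal point and the detector endpoints can be rotated about the detector midpoint D in the X-Z plane of fig. 5. The angle and coordinates below are hypothetical; the disclosure does not fix any of these values.

```python
import numpy as np

def deflect_by_rotation(points_xz, pivot_xz, angle_rad):
    """Rotate scanning-source / detector points about a preset pivot point.

    points_xz : (N, 2) coordinates in the X-Z plane of fig. 5
    pivot_xz  : the preset point, e.g. the detector midpoint D
    """
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[c, -s], [s, c]])
    return (np.asarray(points_xz) - pivot_xz) @ R.T + pivot_xz

# Hypothetical positions: focal point T and detector endpoints A, B (mm).
pts = np.array([[0.0, 500.0], [-200.0, -600.0], [200.0, -600.0]])
D = np.array([0.0, -600.0])                               # detector midpoint
deflected = deflect_by_rotation(pts, D, np.deg2rad(8.0))  # roughly T', Q, P
```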
In some embodiments, the scan source and detector may scan the scan object at a deflected position. For example, the scanning source and the detector are rotated 360 degrees to scan the scanning object at the deflection position, and scanning data is obtained.
A scanning layer refers to one of a plurality of layered regions into which the scan object is divided at a certain pitch along one direction; a scanning layer may have a certain thickness. In some embodiments, the scanning layers may be divided according to the actual situation, for example according to characteristic parameters of the scan object (such as age, weight, physical characteristics, symptoms, and the like).
The scan data may refer to raw data obtained when the medical imaging device scans a scan object. In some embodiments, the medical imaging device uses the scanning source to emit a radiation beam to the scanned object, and uses the detector to receive the radiation beam to form the scan data. In some embodiments, the acquisition module 210 may acquire scan data from a medical imaging device. In some embodiments, the acquisition module 210 may also acquire scan data from a memory (e.g., storage device 140).
And step 320, determining data to be reconstructed based on the scanning data, wherein the data to be reconstructed is used for reconstructing an image. In some embodiments, step 320 may be performed by determination module 220.
The data to be reconstructed refers to data obtained by processing the scan data and used for reconstructing an image.
In some embodiments, the determination module 220 may process noisy data (e.g., overlapping, missing, erroneous, or anomalous data) in the scan data to determine the data to be reconstructed.
In some embodiments, the determining module 220 may perform weighting processing on the scan data of the overlapping region, so that the weight of the scan data of the overlapping region is consistent with the weight of the scan data of the non-overlapping region, and determine the data to be reconstructed.
The non-overlapping region is the region of a scanning layer formed by the positions whose data are acquired only once during one 360° rotational scan; the overlapping region is the region formed by the positions whose data are acquired multiple times (e.g., twice). Accordingly, the scan data include data acquired multiple times (e.g., twice) for the same position, i.e., redundant scan data exist (e.g., one of two acquisitions of the same position can be regarded as redundant). If image reconstruction were performed directly on these multiply acquired data, the reconstructed image could exhibit artifacts, because mechanical deviation makes the overlapping region inconsistent between acquisitions. Each scanning layer has a corresponding imaging region; accordingly, the overlapping and non-overlapping regions of the scanning layer have corresponding regions on the imaging region.
In some embodiments, the scan data acquired multiple times in the overlapping region may be weighted so that the total weight of the scan data in the overlapping region matches the weight of the scan data in the non-overlapping region (e.g., both equal 1); the weighted scan data of the overlapping region together with the scan data of the non-overlapping region are then taken as the data to be reconstructed. For example, if the weight of the scan data of the non-overlapping region is set to 1 and the overlapping region is scanned twice, the two acquisitions are weighted so that their weights add up to 1.
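One common way to realize such weighting is a smooth "feathering" ramp across the overlap, so that the two acquisitions of each position receive complementary weights summing to 1. The sketch below is a generic illustration of that idea, not necessarily the exact weighting of this disclosure; channel indices and sizes are hypothetical.

```python
import numpy as np

def overlap_weights(n_channels, overlap_end):
    """Per-channel weights for one rotation, with channels [0, overlap_end)
    lying in the overlap-corresponding region of the imaging area.

    A position sampled twice is seen once at channel c (weight w[c]) and
    once by the conjugate ray half a rotation later (weight 1 - w[c]),
    so the total weight matches the non-overlap weight of 1.
    """
    w = np.ones(n_channels)
    w[:overlap_end] = np.linspace(0.0, 1.0, overlap_end)  # feathering ramp
    return w

w = overlap_weights(n_channels=512, overlap_end=96)
sino = np.random.rand(720, 512)      # hypothetical sinogram: 720 views
weighted = sino * w[np.newaxis, :]   # apply along the channel axis
```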
In some embodiments, for each of the at least one scan layer, the determination module 220 may determine a segmentation point of the imaging region corresponding to the scan layer on the detector and determine data to be reconstructed based on the scan data and the segmentation point corresponding to the scan layer. For example, the scan data between the segmentation point and the first endpoint in the imaging region corresponding to the scan layer is determined as the data to be reconstructed for the reconstruction of the image. See fig. 4 for further details regarding the determination of the data to be reconstructed based on the segmentation point and the first endpoint.
See fig. 5 and its associated description for further details regarding deflection locations, imaging regions, scan sources, overlap regions, and the like.
Scanning the scan object with the scanning source and the detector at the deflection position enlarges the scanned area, enabling full coverage of the scan object. Meanwhile, components of the medical imaging device (e.g., mechanical arms) have mechanical deviations that can cause artifacts in the reconstructed image through the overlapping region; by processing the scan data of the overlapping region, the data finally used for image reconstruction are free of redundant data, which reduces artifact generation and improves the quality of the reconstructed image.
FIG. 4 is an exemplary flow diagram illustrating the determination of data to be reconstructed according to some embodiments herein. In some embodiments, flow 400 may be performed by a processor (e.g., processing device 110).
For each of the at least one scan layer, a segmentation point of a corresponding imaging region of the scan layer on the detector is determined, step 410. In some embodiments, step 410 may be performed by determination module 220.
As previously mentioned, the imaging region is the region on the detector that receives the radiation beam through the scan layer. Each scanning layer may correspond to an imaging region on the detector.
The segmentation point is a point that divides the imaging region into two parts. In some embodiments, the segmentation point is located in the part of the imaging region corresponding to the overlapping region (briefly, the "overlap-corresponding region"); for example, it may be the midpoint of the overlap-corresponding region. Different positions of the overlap-corresponding region receive the scan data acquired at different times for the same position in the scanning layer. The segmentation point divides the overlap-corresponding region so that only a single copy of the scan data is used in the final image reconstruction.
In some embodiments, the determination module 220 may determine the segmentation point of the imaging region corresponding to a scanning layer according to the deflection position. For example, the determining module 220 may look up a preset value according to information about the deflection position (e.g., the deflection mode and deflection angle) and thereby determine the position of the segmentation point, the preset value being the distance between the segmentation point and a given endpoint of the imaging region. For example, different deflection modes and deflection parameters (e.g., deflection angles) may correspond to different preset values; from the deflection mode and parameters, the corresponding preset value, and hence the location of the segmentation point, can be determined.
In some embodiments, the determination module 220 may determine a mapping point of a rotation center point of the scan layer at the imaging region as a segmentation point, wherein the rotation center point may be determined based on the rotation center.
The center of rotation point may be a point about which the scanning source and detector rotate in the scanning layer, each scanning layer having a center of rotation point.
The rotation center point may be determined based on the rotation center. In some embodiments, the intersection of the scanning layer with the line of the rotation center is the rotation center point; i.e., when the rotation center is a line, the rotation center points of different scanning layers are different points on that line. In some embodiments, the rotation center may be obtained from the medical imaging device.
The mapping point is the point in the imaging region obtained from the rotation center point through a mapping transformation, e.g., a transformation from three dimensions to two dimensions. In some embodiments, the mapping point may be the projection of the rotation center point onto the imaging region. In some embodiments, the mapping point may be determined jointly by the focal point of the scanning source and the rotation center point; for example, it is the intersection of the line through the focal point of the scanning source and the rotation center point with the imaging region.
In some embodiments, the rotation center point may be processed based on the mapping matrix to determine the mapping point. For example, the position coordinates of the mapping point may be obtained by calculation according to formula (1):
$$w_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = P \begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} \qquad (1)$$

where $P$ is the mapping matrix, $(x_i, y_i, z_i)$ are the position coordinates of the rotation center point of the $i$-th scanning layer, $(u_i, v_i)$ are the position coordinates of the mapping point in the imaging region, and $w_i$ is a coefficient.
The mapping matrix may be obtained in a variety of ways. For example, the mapping matrix may be determined by scanning a calibration phantom. In some embodiments, the mapping matrix is a projection matrix, i.e. the mapping matrix is a matrix that transforms a rotational center point in three-dimensional coordinates into a projected point of the rotational center point in the detector imaging region in two-dimensional coordinates. For the acquisition of the mapping matrix, reference may be made to fig. 6 and its associated description.
In some embodiments, the determination module may determine the corresponding segmentation point by performing a mapping calculation on the rotation center point of each scan layer.
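In code, formula (1) is a single matrix-vector product followed by division by the coefficient w_i. The sketch below assumes P is a 3x4 matrix acting on homogeneous coordinates, which matches the symbols of formula (1) but is otherwise an assumption; the numeric values are placeholders.

```python
import numpy as np

def map_center_point(P, center_xyz):
    """Map a layer's rotation center point into the imaging region (formula (1)).

    P          : assumed 3x4 mapping matrix
    center_xyz : (x_i, y_i, z_i), rotation center point of the i-th scan layer
    Returns (u_i, v_i), used as the layer's segmentation point.
    """
    X = np.append(np.asarray(center_xyz, dtype=float), 1.0)  # homogeneous coords
    w_uv1 = P @ X                      # equals w_i * [u_i, v_i, 1]
    return w_uv1[:2] / w_uv1[2]        # divide out the coefficient w_i

P = np.array([[1.0, 0.0, 0.2, 5.0],   # placeholder mapping matrix
              [0.0, 1.0, 0.1, 3.0],
              [0.0, 0.0, 0.0, 1.0]])
u_i, v_i = map_center_point(P, (12.0, -3.5, 40.0))
```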
In some embodiments, the determination module 220 may determine the segmentation points of other scan layers according to the segmentation points of some of the scan layers.
In some embodiments, the at least one scanning layer includes a first scanning layer and a second scanning layer, and the determining module 220 may fit the segmentation points corresponding to the at least two first scanning layers, respectively, to determine the segmentation points corresponding to the second scanning layer.
The position coordinates of the segmentation points corresponding to the first scanning layers are known, while the second scanning layer is a scanning layer whose segmentation point is still to be determined. For example, the position coordinates of the segmentation point corresponding to a first scanning layer may be calculated using formula (1) above. In some embodiments, the at least two first scanning layers may be the first and last of all scanning layers, or two or more intermediate scanning layers; the second scanning layer may be any of the remaining scanning layers.
In some embodiments, fitting based on the position coordinates of the segmentation points may include straight-line fitting, curve fitting, and the like. The segmentation points corresponding to the at least two first scanning layers can be fitted with a straight-line or curve fitting algorithm to obtain a fitted line (a fitted straight line or a fitted curve).
Straight-line fitting algorithms include the least squares method, gradient descent, the Levenberg-Marquardt algorithm, and the like; curve fitting algorithms include polynomial fitting and the like. In some embodiments, the fitting may be applied to the segmentation points corresponding to the first and last of all scanning layers, or to those corresponding to two or more intermediate scanning layers.
In some embodiments, determination module 220 may determine the segmentation point for the second scan layer from the determined fit line. For example, the intersection of the fit line and the imaging region corresponding to the second scan layer may be taken as the division point of the second scan layer.
Illustratively, as shown in fig. 7, the square HKVJ is a schematic plan view of the detector. The determining module 220 may determine the segmentation points corresponding to the first and last of all scanning layers (M1 for the first layer, M2 for the last), and then fit these points, e.g., by directly connecting M1 and M2 to obtain the fitted line segment M1M2; the points on the segment M1M2 include the segmentation points corresponding to all scanning layers. The first endpoints of the imaging regions corresponding to all scanning layers lie on the segment VJ. The scan data corresponding to the polygonal region HKM1M2 are treated as redundant data, and image reconstruction is performed using the polygonal region M1M2VJ.
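A minimal sketch of the straight-line variant follows, assuming each scanning layer's imaging region is indexed by a detector row v and its segmentation point by a column u; this detector layout and all coordinates are assumptions for illustration.

```python
import numpy as np

# Segmentation points of two first scan layers, e.g. M1 (first layer)
# and M2 (last layer); coordinates are illustrative.
v_known = np.array([0.0, 511.0])    # detector rows of the two layers
u_known = np.array([93.2, 97.8])    # segmentation-point columns

# Least-squares straight-line fit u = a*v + b (np.polyfit, degree 1).
a, b = np.polyfit(v_known, u_known, deg=1)

# Segmentation point of a second scan layer at detector row v = 256.
u_second = a * 256.0 + b
```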
In some embodiments, the determination module 220 may also fit the segmentation points corresponding to the at least two first scanning layers using a machine learning model (which may be referred to as a "fitting model"). For example, the input of the fitting model is the coordinates of the segmentation points of at least two first scanning layers, and its output is the coordinates of the segmentation point of the second scanning layer.
Step 420, determining the scan data between the segmentation point corresponding to the scan layer and the first endpoint of the imaging region corresponding to the scan layer as the data to be reconstructed. In some embodiments, step 420 may be performed by determination module 220.
In some embodiments, for each of the at least one scanning layer, the imaging region corresponding to that layer has two endpoints, a first endpoint and a second endpoint. The first and second endpoints bound the range of signals generated when the detector receives the radiation beam while the layer is scanned, i.e., they determine the imaging range.
In some embodiments, the first endpoint is determined based on the deflection direction; in some embodiments, it is the endpoint near the deflection direction. For example, when the deflection is a rotation: if the deflection direction is counterclockwise, the first endpoint is the endpoint on the left side of the imaging region; if clockwise, the endpoint on the right side. As another example, when the deflection is a translation: if the deflection direction is a rightward translation, the first endpoint is the endpoint on the right side of the imaging region; if leftward, the endpoint on the left side. For example, as shown in fig. 5, the rotation direction 504 is counterclockwise, so the point Q is the first endpoint. It is understood that the second endpoint is the other endpoint of the imaging region.
In some embodiments, the distance from the first endpoint to the segmentation point is no less than the distance from the second endpoint to the segmentation point. In some embodiments, the first endpoint and the second endpoint may be determined based on a distance between the two endpoints of the imaging region and the segmentation point.
In some embodiments, the scan data between the segmentation point and the first endpoint of the imaging region are the data to be reconstructed, and the scan data between the segmentation point and the second endpoint are the overlapping (redundant) data; i.e., the data to be reconstructed do not include the scan data between the second endpoint and the segmentation point.
In some embodiments, the image is reconstructed based only on the scan data between the segmentation point and the first endpoint.
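On a per-layer sinogram this selection is a slice along the channel axis. The sketch below assumes channel 0 lies at one end of the imaging region and uses hypothetical sizes; which side holds the first endpoint follows from the deflection direction, as described above.

```python
import numpy as np

def data_to_reconstruct(sinogram, seg_channel, first_endpoint_on_left):
    """Keep the scan data between the segmentation point and first endpoint.

    sinogram               : (n_views, n_channels) data of one scan layer
    seg_channel            : channel index of the layer's segmentation point
    first_endpoint_on_left : True if the first endpoint is at channel 0
    """
    if first_endpoint_on_left:
        return sinogram[:, :seg_channel + 1]   # first endpoint ... M
    return sinogram[:, seg_channel:]           # M ... first endpoint

sino = np.random.rand(720, 512)                # hypothetical layer sinogram
kept = data_to_reconstruct(sino, seg_channel=96, first_endpoint_on_left=False)
```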
For more details on the first endpoint, the second endpoint, the rotation center point, and the segmentation point, reference may be made to fig. 5 and its associated description.
By determining the segmentation points, redundant data can be removed accurately, reducing artifacts in the reconstructed image. Moreover, a segmentation point can be determined for each reconstructed image individually, avoiding the problem that mechanical deviation during the rotational scan of the gantry (e.g., the scanning source and detector) makes the overlap length inconsistent across projections, so that a single predetermined overlapping region would be inaccurate and the reconstructed image quality low.
FIG. 5 is an exemplary diagram illustrating scanning based on deflection position according to some embodiments herein.
Fig. 5 shows a cross-section through the scanning source and the detector while a scanning layer of the scan object is scanned. The Y axis (not shown in fig. 5) is perpendicular to the plane defined by the focal point of the scanning source and the imaging region of the scanning layer on the detector; the X and Z axes lie in that plane. With the scanning source and detector at the original position, the X axis runs along the imaging region on the detector, and the Z axis is the perpendicular from the focal point of the scanning source to the imaging region.
O is the rotation center point of the scanning layer, T is the position of the focal point of the scanning source before deflection (i.e., the position of the focal point of the scanning source at the original position), D is the midpoint of the imaging region, and AB is the imaging region corresponding to the scanning layer before deflection (i.e., the position of the imaging region corresponding to the scanning layer at the original position).
The scanning source and detector can be rotated about the center of the detector (i.e., the imaging region AB rotates about its midpoint D): the focal point of the scanning source rotates from the original position T to the deflected position T', and the imaging region of the detector rotates from AB to QP.
T' is the position of the focal point of the scanning source after deflection (i.e., at the deflection position), and QP is the imaging region corresponding to the scanning layer after deflection (i.e., at the deflection position). The circular region 502 (the smaller solid circle in fig. 5) is the reconstructed field of view obtained by rotating the medical imaging device 360° before the scanning source and detector are deflected (i.e., at the original position); the circular region 503 (the larger solid circle in fig. 5) is the reconstructed field of view obtained by rotating 360° after the deflection (i.e., at the deflection position).
During one 360° rotational scan, the circular region 501 (the dashed circle in fig. 5) is acquired twice, while the other regions are acquired only once; i.e., the circular region 501 is the overlapping region. Illustratively, the position 501-1 is acquired twice and can be regarded as an overlapping position within the overlapping region. When the scanning source is above (in the positive direction of the Z axis), position 501-1 is acquired once and the data are received at a position within the region NM of the imaging region; when the scanning source is below (in the negative direction of the Z axis), position 501-1 is acquired again and the data are received at a position within the region MP of the imaging region. One of the two acquisitions can be regarded as redundant data. Other positions of the circular region 501 are similar and are not described in detail. Accordingly, redundant data exist among the data collected over MP and NM of the imaging region.
The point M is the mapping point of the rotation center point of the scanning layer on the imaging region, i.e., M is the segmentation point of the imaging region QP. The deflection direction 504 points toward Q; Q is the first endpoint and P is the second endpoint. It can be seen that QM is greater than PM. The data received over QM of the imaging region are taken as the data to be reconstructed, and the data received over MP are taken as redundant data.
FIG. 6 is an exemplary diagram illustrating determining a mapping matrix according to some embodiments of the present description.
As previously described, the determination module 220 may determine the mapping matrix by scanning a calibration phantom. In some embodiments, the processing device scans a calibration phantom to acquire calibration scan data, the calibration phantom containing preset points; performs image reconstruction based on the calibration scan data to obtain a calibration image; acquires the first coordinates of the preset points in the calibration phantom; acquires the second coordinates of the preset points in the calibration image; and determines the mapping matrix based on the first and second coordinates.
The calibration phantom may be a non-biological object, e.g., a phantom or a mechanical model of any of various shapes. In some embodiments, the calibration phantom contains one or more preset points. A preset point is a specific point (e.g., a center point) of a specific substance, where the specific substance can be distinguished from the other substances in the calibration phantom; for example, the specific substance may be a small ball or the like. In some embodiments, a three-dimensional coordinate system may be established and the first coordinates of the preset points determined, e.g., by measurement.
In some embodiments, the medical imaging device may scan the calibration phantom to obtain calibration scan data, e.g., by a rotational scan at the original position or the deflection position. The scan data are then preprocessed, or image reconstruction is performed directly on them, to obtain the calibration image (see fig. 3 and its associated description for preprocessing). In some embodiments, a two-dimensional coordinate system may be established and the second coordinates of the preset points in the calibration image determined, e.g., by measurement. It will be appreciated that, because the specific substance can be distinguished from the other substances in the calibration phantom, it can be detected in the reconstructed image and, further, the second coordinates of its specific point can be determined.
In some embodiments, the processing device may determine the transformation matrix, or mapping matrix, that maps the first coordinates to the second coordinates. For example, the processing device solves for the mapping matrix based on the first and second coordinates of a plurality of preset points; the resulting system of equations may be solved by Gaussian elimination, least squares, or the like.
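A standard generic way to recover a 3x4 mapping matrix from such point pairs is the direct linear transformation (DLT), sketched below under the assumption that formula (1) holds for every preset point. This is one possible solver, not necessarily the one used here; all values are synthetic.

```python
import numpy as np

def solve_mapping_matrix(pts3d, pts2d):
    """Estimate P with w*[u, v, 1]^T = P*[x, y, z, 1]^T from >= 6 point pairs.

    pts3d : (N, 3) first coordinates of preset points in the phantom
    pts2d : (N, 2) second coordinates of the same points in the image
    """
    rows = []
    for (x, y, z), (u, v) in zip(pts3d, pts2d):
        X = np.array([x, y, z, 1.0])
        rows.append(np.concatenate([X, np.zeros(4), -u * X]))
        rows.append(np.concatenate([np.zeros(4), X, -v * X]))
    A = np.vstack(rows)
    _, _, Vt = np.linalg.svd(A)       # null-space vector = flattened P
    return Vt[-1].reshape(3, 4)       # defined up to a scale factor

# Synthetic check: project random points with a known matrix, then recover it.
rng = np.random.default_rng(0)
P_true = rng.random((3, 4))
pts3d = rng.random((8, 3)) * 100.0
h = np.c_[pts3d, np.ones(8)] @ P_true.T
pts2d = h[:, :2] / h[:, 2:3]
P_est = solve_mapping_matrix(pts3d, pts2d)  # matches P_true up to scale
```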
As shown in fig. 6, the j-th scanning layer of the calibration phantom is scanned. The X, Y, and Z axes form a three-dimensional coordinate system constructed based on the calibration phantom; their definitions are similar to those in fig. 5 and are not repeated.
I is the calibration phantom and K is a preset point in it; the first coordinates of K are $(x_j, y_j, z_j)$. For example, the preset point may be the center point of a specific substance in the calibration phantom. The preset point is located in the j-th scanning layer.
The two-dimensional coordinate system constructed by the U axis and the V axis is a mapping coordinate system, and the coordinate system is constructed based on the calibration image F.
k is the position of the preset point in the calibration image; the second coordinates of this point are $(u_j, v_j)$.
T "is the focal point of the scanning source, and the position of T" may be the same as or different from T' and T.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the description. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in each claim. Indeed, embodiments may be characterized by fewer than all of the features of a single embodiment disclosed above.
Some embodiments use numbers to describe quantities of components and attributes; it should be understood that such numbers are, in some instances, qualified by the modifiers "about", "approximately", or "substantially". Unless otherwise indicated, "about", "approximately", or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general rounding approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in specific examples such numerical values are set forth as precisely as practicable.
For each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this specification, the entire contents are hereby incorporated by reference. Application history documents that are inconsistent with or conflict with the contents of this specification are excluded, as are documents (currently or later appended to this specification) that limit the broadest scope of the claims of this specification. It should be noted that if the descriptions, definitions, and/or use of terms in the accompanying materials of this specification are inconsistent or in conflict with those stated in this specification, the descriptions, definitions, and/or use of terms in this specification shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. A method of image reconstruction, the method comprising:
scanning at least one scanning layer of a scanning object based on the deflection positions of the scanning source and the detector to acquire scanning data;
determining segmentation points of the imaging regions on the detector respectively corresponding to the at least one scanning layer;
determining data to be reconstructed based on the scanning data and the segmentation points respectively corresponding to the at least one scanning layer, wherein the data to be reconstructed is used for reconstructing an image;
wherein the deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection.
2. The method of claim 1, wherein the determining the data to be reconstructed based on the segmentation points corresponding to the scan data and the at least one scan layer respectively comprises:
for each of the at least one scanning layer, determining the scanning data between a segmentation point corresponding to the scanning layer and a first endpoint of an imaging region corresponding to the scanning layer as the data to be reconstructed, wherein the first endpoint is determined based on a deflection direction.
3. The method of claim 1, wherein the determining the segmentation points of the imaging regions on the detector respectively corresponding to the at least one scanning layer comprises:
for one of the at least one scanning layer, determining a mapping point of the rotation center point of the scanning layer on the imaging region as the segmentation point.
4. The method of claim 1, wherein the at least one scan layer comprises a first scan layer and a second scan layer, and wherein determining the segmentation points of the imaging regions of the at least one scan layer corresponding to the detector respectively comprises:
and fitting the segmentation points respectively corresponding to at least two first scanning layers, and determining the segmentation points corresponding to the second scanning layer.
5. An image reconstruction system, characterized in that the system comprises:
the acquisition module is used for scanning at least one scanning layer of a scan object based on the deflection positions of the scanning source and the detector to acquire scan data, wherein the deflection position is such that the reconstructed field of view after deflection is larger than the reconstructed field of view before deflection;
the determination module is used for determining the segmentation points of the imaging regions corresponding to the at least one scanning layer on the detector respectively, and determining data to be reconstructed based on the scanning data and the segmentation points corresponding to the at least one scanning layer respectively, wherein the data to be reconstructed is used for reconstructing an image.
6. The system of claim 5, wherein the determination module is further configured to:
for each of the at least one scanning layer, determining the scanning data between a segmentation point corresponding to the scanning layer and a first endpoint of an imaging region corresponding to the scanning layer as the data to be reconstructed, wherein the first endpoint is determined based on a deflection direction.
7. The system of claim 5, wherein the determination module is further configured to:
for one of the at least one scanning layer, determining a mapping point of the rotation center point of the scanning layer on the imaging region as the segmentation point.
8. The system of claim 5, wherein the at least one scan layer comprises a first scan layer and a second scan layer, and wherein the determination module is further configured to:
and fitting the segmentation points respectively corresponding to at least two first scanning layers, and determining the segmentation points corresponding to the second scanning layer.
9. An image reconstruction device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-4 when executing the computer program.
10. A computer-readable storage medium storing computer instructions, wherein when the computer instructions in the storage medium are read by a computer, the computer performs the method of any one of claims 1-4.
CN202111674837.0A (priority and filing date 2021-12-31) Image reconstruction method and system, Pending, published as CN114332281A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111674837.0A 2021-12-31 2021-12-31 Image reconstruction method and system


Publications (1)

Publication Number Publication Date
CN114332281A 2022-04-12

Family

ID=81021556


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination