CN113884519A - Self-navigation X-ray imaging system and imaging method - Google Patents


Info

Publication number
CN113884519A
Authority
CN
China
Prior art keywords: ray, module, imaging, target object, navigation
Prior art date
Legal status
Granted
Application number
CN202111153262.8A
Other languages
Chinese (zh)
Other versions
CN113884519B (en)
Inventor
邢宇翔 (Xing Yuxiang)
张丽 (Zhang Li)
陈志强 (Chen Zhiqiang)
高河伟 (Gao Hewei)
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202111153262.8A
Publication of CN113884519A
Application granted
Publication of CN113884519B
Status: Active

Classifications

    • G: Physics
    • G01: Measuring; testing
    • G01N: Investigating or analysing materials by determining their chemical or physical properties
    • G01N23/00: Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 to G01N17/00, G01N21/00 or G01N22/00
    • G01N23/02: by transmitting the radiation through the material
    • G01N23/04: by transmitting the radiation through the material and forming images of the material
    • G01N23/046: using tomography, e.g. computed tomography [CT]
    • G01N2223/00: Investigating materials by wave or particle radiation
    • G01N2223/03: by transmission
    • G01N2223/10: Different kinds of radiation or particles
    • G01N2223/101: electromagnetic radiation
    • G01N2223/1016: X-ray
    • G01N2223/40: Imaging
    • G01N2223/401: Imaging image processing
    • G01N2223/419: Imaging computed tomograph

Abstract

The application provides a self-navigation X-ray imaging system and an imaging method, wherein the imaging system comprises: an X-ray source module that scans a target object; an X-ray detection module that collects detection data; a scanning track navigation positioning module that acquires a depth image and/or a current position of the target object in real time and guides the movement of the X-ray detection module relative to the target object; a light source/detector relative position navigation positioning module that acquires the relative position between the X-ray source module and the X-ray detection module by using a calibration phantom and guides the relative movement between the X-ray source module and the X-ray detection module; a signal acquisition synchronization module that synchronizes the data acquisition times of the X-ray detection module and the scanning track navigation positioning module; a geometric locus data processing module that constructs a projection relation matrix of the imaging system; and an image reconstruction and image processing module that performs image processing and/or image reconstruction using the projection relation matrix, based on the detection data of a plurality of moments acquired after time synchronization, to obtain the imaging result of the target object. The system can flexibly and effectively implement X-ray transmission imaging and CT imaging.

Description

Self-navigation X-ray imaging system and imaging method
Technical Field
The present application relates to the field of radiation imaging technology, and in particular, to a self-navigation X-ray imaging system and an imaging method.
Background
The X-ray CT imaging system is widely applied in fields such as medical treatment, security inspection, and industrial nondestructive testing. The ray source and the detector collect a series of projection data along a certain orbit, and the three-dimensional spatial distribution of the object's linear attenuation coefficient can be recovered through an image reconstruction algorithm followed by processes such as image denoising. A conventional CT imaging system generally comprises an X-ray source, mechanical motion devices, detectors, control circuitry, and a data processing system.
Related-art X-ray CT has three main modes: 1. the physical positions of the X-ray light source and the detector are fixed, and the imaged object rotates or translates along a pre-designed fixed track under the control of the mechanical motion device; 2. the imaged object is fixed, and the X-ray light source and the detector rotate or translate along a pre-designed fixed track under the control of the mechanical motion device; 3. a combination of the two modes, such as spiral CT, in which the imaged object, the X-ray light source, and the detector all move along designed tracks, and the two sets of tracks together form a determined relative motion trajectory. One common feature of these modes is a deterministically preset CT imaging trajectory: any actual motion away from the preset trajectory will cause artifacts and errors in the CT images. This high-precision trajectory requirement greatly limits the flexibility of CT imaging and restricts its applicable scenarios. In many practical settings there are special imaging requirements, for example the shape and size of the object make a conventional scanning trajectory impossible, the cost of the mechanical apparatus for achieving a special trajectory is large, or a workpiece must be inspected in situ at a position where a mechanical motion rail apparatus cannot be constructed.
With the rising detection requirements of various industries, the flexibility and adaptability of CT scanning imaging have become a major problem across the whole field and a bottleneck for broadening its applications. Therefore, a new system design and method are urgently needed to establish a novel X-ray CT imaging system with a high degree of freedom and high precision.
Summary
The application provides a self-navigation X-ray imaging system and an imaging method, aiming to solve problems in the related art such as the need to construct a mechanical motion track device, high cost, inconvenient operation, and the inability to flexibly and effectively implement X-ray transmission imaging and CT imaging.
An embodiment of a first aspect of the present application provides a self-navigation X-ray imaging system, including: the X-ray source module is used for scanning a target object while moving according to a preset or real-time track; the X-ray detection module is used for collecting detection data of the target object and the calibration phantom while moving according to a preset or real-time track; the scanning track navigation positioning module is used for acquiring the depth image and/or the current position of the target object in real time and guiding the movement of the X-ray detection module relative to the target object in the imaging process; the light source/detector relative position navigation positioning module comprises a calibration phantom and is used for acquiring the relative position between the X-ray source module and the X-ray detection module by utilizing the calibration phantom and guiding the relative movement between the X-ray source module and the X-ray detection module; the signal acquisition synchronization module is used for synchronizing the data acquisition time of the X-ray detection module and the data acquisition time of the scanning track navigation positioning module; the geometric locus data processing module is used for constructing a projection relation matrix of the imaging system according to the depth image and/or the current position and the relative position; and the image reconstruction and image processing module is used for processing and/or reconstructing an image by utilizing the projection relation matrix based on the detection data of a plurality of moments acquired after time synchronization so as to obtain an imaging result of the target object.
According to the embodiment of the application, the scanning track navigation positioning module and the light source/detector relative position navigation positioning module are respectively or both loaded to the X-ray source module or the X-ray detection module.
According to the embodiment of the application, in the imaging process, detection data of the calibration phantom on the X-ray detection module are extracted, a first adjustment value of the position of the X-ray detection module or a second adjustment value of the position of the X-ray source module is generated according to the detection data of the calibration phantom, and after the X-ray detection module and/or the X-ray source module are repositioned according to the first adjustment value and/or the second adjustment value, the projection of the region of interest in the target object is located in the detection field of view of the X-ray detection module.
According to an embodiment of the application, the image reconstruction and image processing module is further configured to perform data sequence arrangement integration or fusion noise processing on the detection data at the multiple moments, and solve to obtain a transmission image or a reconstructed image.
According to the embodiment of the application, the scanning track navigation positioning module is further configured to determine the actual coordinate of the focal spot of the X-ray source module in a world coordinate system through imaging information of a preset phantom acquired by the scanning track navigation positioning module and the X-ray source module.
An embodiment of a second aspect of the present application provides a self-navigation X-ray imaging method, including the following steps: in the imaging system, a calibration phantom is used for acquiring a depth image and/or a current position of the target object, and a motion track of the target object is determined according to the depth image and/or the current position; calculating the coordinates of the pixels of the X-ray detection module in a world coordinate system according to the real-time detection data of the calibration phantom, and determining the collection ray path of the imaging system according to the actual coordinates of the X-ray source module in the world coordinate system and the coordinates of the pixels of the X-ray detection module in the world coordinate system; constructing a projection relation matrix of the imaging system according to the motion track of the target object and the collection ray path; removing projection data components of the calibration phantom from the detection data of the target object and the calibration phantom collected by the X-ray detection module, and obtaining the detection data of the target object according to the removed detection data and the projection relation matrix; and carrying out image processing and/or image reconstruction on the detection data of the target object to obtain an imaging result of the target object.
According to the embodiment of the application, the method further comprises the following steps: in the imaging process, the detection data of the calibration phantom on the X-ray detection module is extracted, a first adjustment value of the position of the X-ray detection module or a second adjustment value of the position of the X-ray source module is generated according to the detection data of the calibration phantom, and after the X-ray detection module and/or the X-ray source module are repositioned according to the first adjustment value and/or the second adjustment value, the projection of the region of interest in the target object is positioned in the detection field of view of the X-ray detection module.
According to the embodiment of the application, the method further comprises the following steps: determining the actual coordinate of the X-ray source module in a world coordinate system through the imaging information of a preset phantom acquired by a scanning track navigation positioning module and the X-ray source module.
An embodiment of a third aspect of the present invention provides an electronic device, including: a processor and a memory; wherein the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to implement the self-navigation X-ray imaging method according to the above embodiment.
A fourth aspect of the present invention provides a computer readable storage medium having a computer program stored thereon, wherein the program is executed by a processor for implementing the self-navigation X-ray imaging method according to the above embodiments.
The self-navigation X-ray imaging system and the imaging method have the following beneficial effects:
1) by introducing the navigation modules, the CT system does not need a fixed mechanical motion framework, which greatly reduces the constraints on the CT system and increases its flexibility;
2) by introducing the light source/detector relative position navigation positioning module, accurate mutual positioning and maintenance of a proper scan geometry can be realized without a rigid connection between the optical-mechanical module and the detector module, which greatly widens the application scenarios of the CT system and enables CT in settings where it was previously impossible;
3) by using an imaging mode without a fixed scanning track, the cost of the CT equipment can be reduced to a great extent.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic structural diagram of a self-navigation X-ray imaging system provided according to an embodiment of the present application;
fig. 2 is a schematic diagram of an integrated light source and binocular camera provided according to an embodiment of the present application;
FIG. 3 is a two-dimensional schematic diagram illustrating the determination of coordinates of an X-ray source by constructing a CT imaging system according to an embodiment of the present disclosure;
FIG. 4 is a schematic view of a self-guided X-ray imaging system including a calibration phantom provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic diagram of an X-ray detection module loaded with a navigation and positioning module (calibration phantom) for relative position of a light source/detector according to an embodiment of the present application;
fig. 6 is a schematic diagram of a self-navigation X-ray imaging system with a quad-rotor drone as a platform for manipulating a motion trajectory according to an embodiment of the present application;
FIG. 7 is a block diagram illustrating an exemplary workflow of a self-navigation X-ray imaging system provided in accordance with an embodiment of the present application;
FIG. 8 is a photograph of a texture of a surface of an object being imaged, with different photographs obtained from movement of the object at different times, according to an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating a small texture image with a plurality of mark points defined thereon, wherein the mark points on the two images have different positions according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of a scanning track according to an embodiment of the present application, where dots are positions of light sources and beam directions thereof;
FIG. 11 is a schematic diagram illustrating a normal position of an X-ray detector according to an embodiment of the present application;
FIG. 12 illustrates deviations of the position of an X-ray detector from the normal position provided in accordance with an embodiment of the present application: (a) the angle to the central ray deviates from 90 degrees; (b) the distance between the detector and the ray source is larger than normal;
FIG. 13 is a schematic diagram of a virtual probe setup provided in accordance with an embodiment of the present application;
FIG. 14 is a flow chart of a method of self-guided X-ray imaging according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals: the system comprises an X-ray source module-100, an X-ray detection module-200, a scanning track navigation positioning module-300, a light source/detector relative position navigation positioning module-400, a signal acquisition synchronization module-500, a geometric track data processing module-600, an image reconstruction and image processing module-700, a memory-151, a processor-152 and a communication interface-153.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
Fig. 1 is a schematic structural diagram of a self-navigation X-ray imaging system according to an embodiment of the present application.
As shown in fig. 1, the self-navigation X-ray imaging system includes: the system comprises an X-ray source module 100, an X-ray detection module 200, a scanning track navigation positioning module 300, a light source/detector relative position navigation positioning module 400, a signal acquisition synchronization module 500, a geometric track data processing module 600 and an image reconstruction and image processing module 700.
The X-ray source module 100 is configured to scan a target object while moving according to a preset or real-time trajectory. In a specific embodiment, the X-ray source module is formed by a platform loading X-ray machine with a controllable motion track, such as a quad-rotor unmanned aerial vehicle carrying an X-ray source, or an intelligent AGV carrying an X-ray source.
The X-ray detection module 200 is configured to collect detection data of a target object and a calibration phantom while moving according to a preset or real-time track. In a specific embodiment, the X-ray detection module is formed by a platform-loaded X-ray detector with a controllable motion track, such as a quad-rotor unmanned aerial vehicle carrying an X-ray detector, or an intelligent AGV carrying an X-ray detector. The X-ray detectors are not limited in shape or number, and may be in a plurality of rows or an area array, a flat plate or an arc, etc.
The scanning track navigation positioning module 300 is configured to acquire a depth image and/or a current position of the target object in real time during an imaging process and guide a movement of the X-ray detection module relative to the target object. In one embodiment, the scan trajectory navigation module may be a navigation device and system having three-dimensional positioning or imaging functionality such that object distance and position may be measured, such as a depth camera or lidar navigation. The scanning trajectory navigation positioning module may be loaded on the X-ray source module or on the X-ray detection module, as shown in fig. 2 and 3, the scanning trajectory navigation positioning module is a binocular camera and is loaded on the X-ray source module. Other arrangements may also implement the functions of the embodiments of the present application, and are not particularly limited.
The light source/detector relative position navigation positioning module 400 includes a calibration phantom and is used to acquire the relative position between the X-ray source module and the X-ray detection module and to guide the relative movement between the two modules. As shown in fig. 4, fig. 4 (a) is a schematic diagram of a calibration phantom A, and fig. 4 (b) is a schematic diagram of an X-ray light source module loaded with the scanning trajectory navigation positioning module (binocular camera) and the light source/detector relative position navigation positioning module. The light source/detector relative position navigation positioning module mainly comprises a calibration phantom (which may be referred to as phantom A) whose whole or partial geometric size and shape are known, for example a phantom composed of several spheres, cylinders, etc. at different positions, as shown in fig. 4 (a). This module can be fixed to the X-ray source module (as shown in fig. 4 (b)) or to the X-ray detection module (as shown in fig. 5); fig. 5 is a schematic diagram of the X-ray detection module loaded with the light source/detector relative position navigation positioning module (calibration phantom). In a self-navigation imaging system, only one light source/detector relative position navigation positioning module (calibration phantom) is needed, and its placement is not particularly limited. The phantom is preferably sized such that its projection lies within the field of view during CT beam imaging.
And a signal acquisition synchronization module 500 for synchronizing data acquisition time of the X-ray detection module and the scanning track navigation positioning module. It can be understood that, in order to make the data acquisition time of the X-ray detection module and the data acquisition time of the scanning track navigation positioning module consistent, the signal acquisition synchronization module synchronizes the acquisition time of the two modules, thereby reducing the error of data acquisition and improving the subsequent imaging quality.
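As an illustrative sketch (not part of the claimed embodiments), the role of the signal acquisition synchronization module 500 can be approximated in software by resampling the navigation measurements onto the detector frame timestamps; the function name and the choice of linear interpolation are assumptions:

```python
import numpy as np

def sync_poses_to_frames(nav_times, nav_positions, frame_times):
    """Linearly interpolate navigation positions onto detector frame times.

    nav_times     : (N,) sorted acquisition times of the navigation module
    nav_positions : (N, 3) positions measured at nav_times
    frame_times   : (M,) acquisition times of the X-ray detector frames
    Returns an (M, 3) array of positions aligned with the detector frames.
    """
    nav_times = np.asarray(nav_times, dtype=float)
    nav_positions = np.asarray(nav_positions, dtype=float)
    frame_times = np.asarray(frame_times, dtype=float)
    out = np.empty((frame_times.size, nav_positions.shape[1]))
    # interpolate each coordinate axis independently
    for k in range(nav_positions.shape[1]):
        out[:, k] = np.interp(frame_times, nav_times, nav_positions[:, k])
    return out
```

In a hardware implementation the same effect is obtained by a common trigger; the software resampling above is only a fallback when the two modules free-run on their own clocks.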
And the geometric trajectory data processing module 600 is used for constructing a projection relation matrix of the imaging system according to the depth image and/or the current position and the relative position.
It can be understood that the geometric locus data processing module determines the signal acquisition ray path of the X-ray transmission imaging and CT imaging system by using the signals of the scanning locus navigation positioning module and the light source/detector relative position navigation positioning module, and constructs the projection matrix of the X-ray imaging system. The specific determination process is described by the following specific examples.
And the image reconstruction and image processing module 700 is configured to perform image processing and/or image reconstruction by using the projection relation matrix based on the detection data acquired after time synchronization at multiple times, so as to obtain an imaging result of the target object. The image reconstruction and processing module completes image processing of X-ray transmission imaging or image reconstruction and processing of X-ray CT data to obtain a final image of the target object.
As shown in fig. 6, a schematic diagram of a self-navigation X-ray imaging system with a quad-rotor unmanned aerial vehicle as a platform capable of manipulating a motion trajectory is shown, and the functions of the self-navigation X-ray imaging system are described in detail through an imaging process of the self-navigation X-ray imaging system.
Because the scanning track navigation positioning module and the light source/detector relative position navigation positioning module can be loaded on the X-ray source module or the X-ray detection module, the basic working principle is similar. In the embodiment of the application, the working principle and the flow of the self-navigation system are explained by taking the example that the scanning track navigation positioning module and the light source/detector relative position navigation positioning module are loaded on the X-ray source module.
The working principle and the flow of the self-navigation X-ray CT system are as follows:
step 1: defining a world coordinate system of a scanning track navigation positioning module: simultaneously imaging a fixed structure die body (marked as die body B) through a scanning track navigation positioning module and X rays, determining the coordinates of a focal spot of an X-ray source module in a world coordinate system, and marking as PSrc=(xSrc,ySrc,zSrc)T. This step may be done prior to the actual imaging.
In an embodiment of the present application, the scanning track navigation positioning module is further configured to determine an actual coordinate of a focal spot of the X-ray source module in the world coordinate system according to imaging information of a preset phantom by the scanning track navigation positioning module and the X-ray source module.
Step 2: integrating the light source/detector relative position navigation positioning module on the X-ray source module, measuring the relative position of the X-ray detection module, and calibrating the positions of the feature points of the calibration phantom (denoted phantom A) in the world coordinate system, P_A^m = (x_A^m, y_A^m, z_A^m)^T, where m is the index number of the feature point. A projection image g_A* of the fixed phantom A in the CT system of step 1 is also obtained. This step can also be done before the actual imaging.
Step 3: in the CT imaging process, the track navigation positioning module acquires a real-time image or position of the target object in the world coordinate system; the motion trajectory q(t) of the scanned target object in the world coordinate system is calculated from the acquired image or position. At any time t, q(t) can be written in matrix form [R t], a combination of a rotation and a translation. Defining the starting position as time 0, the position [x(0), y(0), z(0)]^T of a point on the object at time 0 and the position [x(t), y(t), z(t)]^T of this point at time t are related, in homogeneous coordinates, by:

[x(t), y(t), z(t), 1]^T = q(t) [x(0), y(0), z(0), 1]^T

In practice, the latest position can also be obtained by cascading the motion changes between successive moments, i.e. the position is updated as:

[x(t_i), y(t_i), z(t_i), 1]^T = Δq_{i-1→i} [x(t_{i-1}), y(t_{i-1}), z(t_{i-1}), 1]^T

where Δq_{i-1→i} denotes the motion trajectory change matrix from time i-1 to time i.
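The homogeneous-coordinate relations of step 3 can be sketched as follows; this is a minimal illustration assuming q(t) is promoted to a 4x4 matrix [R t; 0 1], with hypothetical helper names:

```python
import numpy as np

def make_q(R, t):
    """Assemble the 4x4 homogeneous motion matrix q = [R t; 0 1]."""
    q = np.eye(4)
    q[:3, :3] = R
    q[:3, 3] = t
    return q

def cascade(deltas):
    """Cascade per-interval motion changes into the cumulative q(t_i):
    q(t_i) = Δq_{i-1→i} @ q(t_{i-1}), starting from q(t_0) = I."""
    q = np.eye(4)
    for dq in deltas:
        q = dq @ q
    return q

def move_point(q, p0):
    """Map a point from its time-0 position to its position at time t."""
    ph = np.append(np.asarray(p0, float), 1.0)  # homogeneous coordinates
    return (q @ ph)[:3]
```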
Step 4: synchronizing the signal acquisition of the scanning track navigation positioning module with the signal acquisition of the beam-emitting ray detector.
In an embodiment of the application, in an imaging process, detection data of a calibration phantom on an X-ray detection module is extracted, a first adjustment value of a position of the X-ray detection module or a second adjustment value of the position of an X-ray source module is generated according to the detection data of the calibration phantom, and after the X-ray detection module and/or the X-ray source module is changed in position according to the first adjustment value and/or the second adjustment value, a projection of a region of interest in a target object is located in a detection field of view of the X-ray detection module.
Step 5: the X-ray source module moves according to the scanning requirement (the movement can be controlled in a wired or wireless remote-control mode) so as to scan the object; the distribution of the line attenuation coefficient of the scanned target object is represented by a vector μ, and the data acquired by the X-ray detection module at a certain moment is represented by I(t). While beam-emitting imaging is carried out continuously during the movement, an image of the calibration phantom A is extracted from I(t), and the position of the X-ray detection module (as the driven module) is adjusted according to the positions of the feature points of the image of the calibration phantom A, so that the projection of the region of interest of the object lies within the field of view of the X-ray detection module. Alternatively, the X-ray detection module moves to scan the target object according to the scanning requirement, the X-ray source module continuously emits beams for imaging during the movement, and the position of the X-ray source module (as the driven module) is adjusted according to the image of the calibration phantom A on the X-ray detection module, so that the projection of the region of interest of the target object lies within the field of view of the X-ray detection module. For example, if the target object is a sphere and the region of interest is a partial position of the sphere, the projection of that partial position is kept in the field of view of the X-ray detection module during adjustment, so as to better acquire the detection data of that position.
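The feedback adjustment of the driven module in step 5 can be illustrated by a simplified first-order rule: shift the driven module so that the centroid of the detected phantom-A feature points returns to the detector centre. This sketch ignores magnification and out-of-plane effects, and the function name is hypothetical:

```python
import numpy as np

def detector_adjustment(feature_px, det_shape, px_pitch):
    """Compute an in-plane (du, dv) shift, in the same length unit as
    px_pitch, that would move the driven module so the centroid of the
    phantom-A feature projections lands on the detector centre.

    feature_px : (K, 2) detected pixel coordinates of phantom-A features
    det_shape  : (rows, cols) of the detector
    px_pitch   : pixel pitch
    """
    centroid = np.mean(np.asarray(feature_px, float), axis=0)
    center = (np.asarray(det_shape, float) - 1.0) / 2.0
    return (centroid - center) * px_pitch
```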
Step 6: from the real-time image measurement of phantom A on the detector module during the CT imaging process, the positions of the detector pixels in the world coordinate system are calculated as P_Det^n(t) = (x_Det^n(t), y_Det^n(t), z_Det^n(t))^T, where n is the index number of the detector pixel.
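The pixel-position computation of step 6 can be sketched by applying the detector pose estimated from phantom A to the pixel grid in the detector's local frame; the names and the pose parameterization (rotation R_det, translation t_det) are assumptions:

```python
import numpy as np

def detector_pixels_world(R_det, t_det, n_u, n_v, pitch):
    """Map detector pixel centres from the detector's local frame to the
    world frame: P_Det^n(t) = R_det @ p_local^n + t_det.

    R_det : (3, 3) detector orientation at time t (from phantom-A measurement)
    t_det : (3,) detector origin in world coordinates at time t
    n_u, n_v : number of pixels along the two detector axes
    pitch : pixel pitch
    Returns an (n_u * n_v, 3) array of world coordinates.
    """
    u = (np.arange(n_u) - (n_u - 1) / 2.0) * pitch
    v = (np.arange(n_v) - (n_v - 1) / 2.0) * pitch
    uu, vv = np.meshgrid(u, v, indexing="ij")
    # pixel centres in the detector plane (local z = 0)
    local = np.stack([uu.ravel(), vv.ravel(), np.zeros(uu.size)], axis=1)
    return local @ np.asarray(R_det, float).T + np.asarray(t_det, float)
```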
Step 7: determining the system matrix of the CT imaging process according to the actual target object position and the signal acquisition of the X-ray detection module. The coordinates of the object in the world coordinate system at the initial scanning time are defined as x(0), y(0), z(0). At a given time t, the light source is at a fixed position in the world coordinate system, namely P_Src from step 1. The coordinates of the pixels of the actual detector from step 6 are combined into the set {P_Det^n(t)}. From this, each ray path is determined: from P_Src to P_Det^n(t). Combining the motion trajectory q(t) of the object (or its cascaded form), the projection relation matrix h(t) of the imaging at this moment is obtained, and the measurement can be expressed as:

I(t) = I_0 exp(-h(t)μ - g_A(t)) + I_noise    (1)

Here, I_0 is the incident photon intensity, I_noise represents possible noise signals, and g_A(t) is the projection (also referred to as the line integral) of the line attenuation coefficient distribution of the calibration phantom A.
And 8: removing projection components of the calibration phantom A from the data I (t) acquired by the X-ray detection module, namely according to the structure and g of the known calibration phantom AA*And PSrcTo
Figure BDA0003287858040000076
Estimate g ofA(t), g (t) h (t) μ is estimated according to the formula (1).
Step 9: Perform transmission image processing (e.g. stitching, denoising, enhancement) on the data acquired by the X-ray detection module, and then carry out detection (e.g. defect detection).
In an embodiment of the application, the image reconstruction and image processing module is further configured to perform data sequence integration or fusion noise processing on the detection data at multiple times, and solve the detection data to obtain a transmission image or a reconstructed image.
Step 10: Integrate the data collected at a plurality of moments to obtain:

g = Hμ, with g = [g(t_1); g(t_2); …; g(t_K)] and H = [h(t_1); h(t_2); …; h(t_K)]    (2)

i.e. the data are arranged and integrated in sequence. Because the data contain noise, the above formula can also be written as a mathematical model incorporating the noise:

g = Hμ + n    (3)

where n is a noise vector.
Image reconstruction according to formula (2) or formula (3) can be performed by various methods: 1) rearranging the data into fan beams or parallel beams and reconstructing with an analytic method; 2) solving g = Hμ + n with a statistical (iterative) method known in the field, where g is the projection value obtained from the data acquired by the detector, μ is the line attenuation coefficient distribution of the object to be solved, and n is the noise; 3) solving with a neural network method. Other reconstruction methods may also be used; there is no particular limitation.
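As an illustration of method 2), solving g = Hμ + n iteratively can be sketched with a Kaczmarz-type (ART) update. The patent does not prescribe a specific solver; the small dense H below is a synthetic stand-in for the real system matrix assembled from the h(t):

```python
import numpy as np

def art_reconstruct(H, g, n_sweeps=200, relax=1.0):
    """Kaczmarz/ART sweeps for g ~= H @ mu; returns the estimate of mu."""
    mu = np.zeros(H.shape[1])
    row_norms = np.einsum("ij,ij->i", H, H)
    for _ in range(n_sweeps):
        for i in range(H.shape[0]):
            if row_norms[i] == 0.0:
                continue
            # project mu onto the hyperplane of ray i: H[i] @ mu = g[i]
            mu += relax * (g[i] - H[i] @ mu) / row_norms[i] * H[i]
    return mu

# Synthetic stand-in system: 40 rays, 10 unknown attenuation values.
rng = np.random.default_rng(0)
H = rng.standard_normal((40, 10))
mu_true = rng.random(10)
g = H @ mu_true                      # consistent (noise-free) data
mu_est = art_reconstruct(H, g)
print(np.max(np.abs(mu_est - mu_true)))
```

With noise-free, consistent data the sweeps converge to the exact μ; with noisy data the relaxation factor is typically reduced.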
After the imaging principle of the self-navigation X-ray imaging system is clarified through the above description, the self-navigation X-ray imaging method is described through a specific embodiment with reference to the attached drawings.
The following introduction takes a binocular (depth) camera as the scanning trajectory navigation positioning module, and a light source/detector relative position navigation positioning module constructed from several small cubes, as an example. The overall system architecture is shown in fig. 7, and a specific embodiment proceeds as follows:
1) Define the world coordinate system of the scanning track navigation positioning module, and determine the coordinates of the focal spot of the X-ray source in the world coordinate system, denoted P_Src = (x_Src, y_Src, z_Src)^T.
1-1) Define [x y z]^T as the coordinates of a point in the world coordinate system. The relationship between camera coordinates and world coordinates can be expressed as:

[x_c, y_c, z_c]^T = R [x, y, z]^T + t_0

Here [R t_0] is commonly known as the extrinsic parameter matrix of the camera, with R a rotation matrix and t_0 a translation vector; the extrinsic parameter matrix reflects the relationship between the camera coordinate system and the artificially defined world coordinate system. The imaging position of a point of the world coordinate system on the camera is therefore:

s [u, v, 1]^T = A [R t_0] [x, y, z, 1]^T

Here s is a scale factor, and A is commonly referred to as the intrinsic parameter matrix of the camera, reflecting the relation between the photo (image) coordinate system and the camera coordinate system. The intrinsic and extrinsic parameter matrices of the camera are calibrated by methods known in the art. For a binocular camera, define the intrinsic parameter matrices of the two cameras as A_left, A_right, and the extrinsic parameter matrices (determining the conversion between the world coordinate system and each camera coordinate system) as [R_left t_left], [R_right t_right].
1-2) As shown in FIG. 2, a binocular camera is fixed together with the X-ray source module as the scanning trajectory navigation positioning module. To determine the position of the X-ray source module (fixed with the binocular camera) in the world coordinate system, an experimental CT system is constructed and a phantom with a fixed structure (denoted phantom B) is machined. Phantom B is placed on a rotary table and imaged by CT with the X-ray source module while the binocular camera simultaneously takes pictures. The world coordinates of a key structure point of the light source position calibration piece, P_CalSrc = [x_CalSrc, y_CalSrc, z_CalSrc]^T, are determined from the two camera observations:

s_left [u_left, v_left, 1]^T = A_left [R_left t_left] [x_CalSrc, y_CalSrc, z_CalSrc, 1]^T

s_right [u_right, v_right, 1]^T = A_right [R_right t_right] [x_CalSrc, y_CalSrc, z_CalSrc, 1]^T

where u, v are the image positions in the left and right cameras of the binocular pair. The CT data are then reconstructed, and the coordinates of the key structure points (corner points may be chosen) of the X-ray source module position calibration piece in the experimental CT system coordinate system, denoted P̃_CalSrc, are determined. From a plurality of structure points, the relationship

P_CalSrc = Q_CT_W { P̃_CalSrc }

determines the transformation Q_CT_W. Since the coordinates P̃_Src of the light source in the experimental CT system coordinate system are known in the constructed CT system, one thus obtains P_Src = Q_CT_W { P̃_Src }.
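Determining P_CalSrc from the left and right observations is a standard two-view triangulation. A minimal linear (DLT) sketch, under illustrative camera matrices rather than the patent's calibration, is:

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one point from two 3x4 projection matrices."""
    rows = []
    for P, (u, v) in ((P_left, uv_left), (P_right, uv_right)):
        rows.append(u * P[2] - P[0])   # u * (p3 . X) = p1 . X
        rows.append(v * P[2] - P[1])   # v * (p3 . X) = p2 . X
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]                         # null-space vector of the 4x4 system
    return X[:3] / X[3]                # dehomogenize

def proj(P, X):
    """Synthesize an image observation from a projection matrix."""
    h = P @ np.append(X, 1.0)
    return h[:2] / h[2]

# Illustrative stereo rig: identical intrinsics, right camera shifted 0.2 m in x.
A = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 480.0],
              [0.0, 0.0, 1.0]])
P_left = A @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = A @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X_true = np.array([0.05, -0.03, 1.5])
X_rec = triangulate(P_left, P_right, proj(P_left, X_true), proj(P_right, X_true))
print(X_rec)
```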
2) The light source/detector relative position navigation positioning module is integrated on the X-ray source module. Fig. 4 shows a schematic diagram of the calibration phantom A. The positions of the feature points of the calibration phantom A (the corner points of phantom A may be taken) in the world coordinate system, denoted P_A^(m), are measured with the binocular vision camera, where m is the index number of the feature point. After the calibration phantom A is fixed in the CT system of step 1), its projection image g_A* is obtained.
3) During CT imaging, the trajectory navigation positioning module acquires real-time images of the scanned object, or its position in the world coordinate system, in real time. As shown in fig. 8, the surface of the scanned target object may be photographed to obtain a texture image of the surface. If the texture of the imaged object's surface is completely uniform, a texture image can be drawn or attached on the surface of the object. The object surface corresponding to a small image patch (as indicated by the box in fig. 8) can be regarded as a "marker point"; its corresponding coordinates in the two binocular views are obtained by registration, from which the world coordinates of the marker point follow. Denote the marker point position at the initial time 0 as P(0) = [x(0), y(0), z(0)]^T, and at time t as P(t) = [x(t), y(t), z(t)]^T. In this way, the positions of a plurality of marker points before and after the motion can be obtained (the case of a plurality of marker points is shown in fig. 9). The position change of the scanned object in the world coordinate system can be represented by a combination of rotation and translation; for example, the motion from time 0 to time t can be written as:

P(t) = R(t) P(0) + T(t)    (4)

Assuming the imaged object is a rigid body, the motion of every point on the object surface is described by this equation. From the position changes of the plurality of marker points between the two times, several equations of the form (4) are obtained:

P_j(t) = R(t) P_j(0) + T(t)

where the subscript j is the index number of the marker point. Combining these equations, the motion matrix Q(t) (composed of R(t) and T(t)) can be solved.
In practice, the latest position can also be obtained by cascading the motion changes between successive moments, i.e.:

Q(t) = Q_{t−1→t} · … · Q_{1→2} · Q_{0→1}

where Q_{i−1→i} denotes the motion trajectory change matrix from time i−1 to time i.
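Solving equations of the form (4) for R(t) and T(t) from several marker correspondences is a classical rigid-registration problem. One common closed-form choice (the Kabsch/Procrustes solution, not named in the patent) can be sketched as:

```python
import numpy as np

def rigid_motion(P0, Pt):
    """Least-squares R, T with Pt ~= R @ P0 + T (Kabsch algorithm).

    P0, Pt : (3, J) arrays of J marker positions at time 0 and time t.
    """
    c0 = P0.mean(axis=1, keepdims=True)
    ct = Pt.mean(axis=1, keepdims=True)
    H = (Pt - ct) @ (P0 - c0).T               # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))        # guard against a reflection
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    T = ct - R @ c0
    return R, T

# Synthetic check: rotate four non-coplanar markers 30 deg about z, then translate.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta), np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([[0.1], [0.2], [-0.05]])
P0 = np.array([[0.0, 1.0, 0.0, 0.3],
               [0.0, 0.0, 1.0, 0.7],
               [0.0, 0.0, 0.0, 1.1]])
Pt = R_true @ P0 + T_true
R_est, T_est = rigid_motion(P0, Pt)
print(np.allclose(R_est, R_true), np.allclose(T_est, T_true))
```

With noisy marker positions the same formula gives the least-squares fit, which is why several markers are preferred over the minimum of three.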
4) Synchronously acquire the signals of the scanning trajectory navigation positioning module and the signals of the ray beam-emission detector.
5) The X-ray source module moves according to the scanning requirement (under wired or wireless remote control) so as to scan the object. Fig. 10 is a simplified schematic of a two-dimensional source motion trajectory. The line attenuation coefficient distribution of the scanned object is denoted by a vector μ, and the data acquired by the detector at a given moment by I(t). While beam-emission imaging proceeds continuously during the motion, the image of the calibration phantom A is extracted from I(t), and the position of the detector (as the driven module) is adjusted according to the positions of the feature points of that image, so that the projection of the region of interest of the object stays within the field of view of the detector. Alternatively, the detector module moves to scan the object according to the scanning requirement, the ray source emits beams continuously for imaging during the motion, and the position of the light source (as the driven module) is adjusted according to the image of the calibration phantom A on the detector, so that the projection of the region of interest of the object stays within the field of view of the detector. Fig. 11 shows the ideal normal position of the X-ray detector in a two-dimensional view, and fig. 12 shows an offset position. The method allows the position of the X-ray detector to deviate from the ideal normal position, but requires that the image of the calibration phantom A always remain inside the detector, i.e. C_1, C_2, C_3, C_4 in fig. 11 always lie within the imaging field of view of the detector.
6) From the real-time image of the phantom A measured in the detector module during the CT imaging process, measure the position of each detector pixel in the world coordinate system, P_Det^(n)(t), where n is the index number of the detector pixel. The detector plane can be written as:

l_x x + l_y y + l_z z = 1

Define the projection position of feature point P_A^(m) of phantom A on the detector plane as P̃_A^(m)(t). Solve for l_x, l_y, l_z and determine the world coordinates P_Det^(n)(t) from the following relationships:

6-1) For all m, P̃_A^(m)(t) lies in the detector plane, i.e. it satisfies l_x x̃_A^(m)(t) + l_y ỹ_A^(m)(t) + l_z z̃_A^(m)(t) = 1.

6-2) For all m, the three points P_Src, P_A^(m), P̃_A^(m)(t) lie on one straight line.

6-3) The distance between the projection positions P̃_A^(m1)(t) and P̃_A^(m2)(t) of any two feature points equals the distance between them on the actual detector, computed from their coordinates (u_m1, v_m1, 0)^T and (u_m2, v_m2, 0)^T in the detector's own coordinate system. For example, if the pixel sizes in the detector row and column directions are Δ_u and Δ_v respectively, the pixel in the first row and first column of the detector can be defined as the origin of its own coordinate system.

On the basis of P̃_A^(m)(t) and (u_m, v_m, 0)^T, determine the transfer relation function Q of the detector plane between the world coordinate system and the detector's own coordinate system, such that P_Det^(n)(t) = Q{(u_n, v_n, 0)^T}. In this way, the world position P_Det^(n)(t) corresponding to each detector unit (u_n, v_n, 0)^T at time t is obtained.
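Constraints 6-1) and 6-2) together say that each P̃_A^(m)(t) is the intersection of the ray from P_Src through P_A^(m) with the detector plane. A sketch of that intersection, with illustrative plane coefficients and positions, is:

```python
import numpy as np

def ray_plane_intersection(p_src, p_feat, l):
    """Intersect the ray through p_src and p_feat with the plane l . x = 1.

    l = (l_x, l_y, l_z) are the plane coefficients; returns the 3-D point.
    """
    d = p_feat - p_src                    # ray direction
    denom = l @ d
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the detector plane")
    s = (1.0 - l @ p_src) / denom         # solve l . (p_src + s d) = 1
    return p_src + s * d

# Illustrative geometry: detector plane z = 2 (so l = (0, 0, 0.5)), source at origin.
l = np.array([0.0, 0.0, 0.5])
p_src = np.array([0.0, 0.0, 0.0])
p_feat = np.array([0.05, -0.02, 1.0])     # a phantom-A corner between source and plane
p_proj = ray_plane_intersection(p_src, p_feat, l)
print(p_proj)  # the corner's shadow on the plane z = 2
```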
7) Determine the system matrix of the CT imaging process from the actual object position and the signal acquisition of the detector. Define the coordinates of the object in the world coordinate system at the initial scanning time as x(0), y(0), z(0). At any time t the light source is fixed in the world coordinate system, i.e. P_Src from 1). Combining the pixel coordinates P_Det^(n)(t) of the actual detector from 6), each ray path is determined: from P_Src to P_Det^(n)(t). Combining the motion trajectory of the object, {R(t), T(t)} or Q(t), the projection relation matrix h(t) for imaging at this moment is obtained, and the measurement can be expressed as:

I(t) = I_0 exp(−h(t)μ − g_A(t)) + I_noise    (5)

Here I_0 is the incident photon intensity, I_noise represents a possible noise signal, and g_A(t) is the projection (also called the line integral) of the line attenuation coefficient distribution of the calibration phantom A.
8) As shown in fig. 13, a virtual detector (abbreviated VD) is constructed, corresponding to the detector position of step 1. A pixel P_VD on the virtual detector and P_Src together define a ray; intersecting this ray with the actual detector plane obtained in step 6 gives the intersection point (x_Det(t), y_Det(t), z_Det(t))^T, and then (u, v, 0)^T = Q^{−1}{(x_Det(t), y_Det(t), z_Det(t))^T} gives the ray's projection pixel position on the actual detector. An estimate of g_A(t) can therefore be obtained by interpolating g_A* at (u, v). Removing the projection component exp(−g_A(t)) of the calibration phantom A from the data I(t) acquired by the detector, g(t) = h(t)μ is then estimated according to equation (5).
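The interpolate-and-remove step can be sketched as follows. The reference projection g_A* and the sub-pixel position returned by Q^{−1} are synthetic stand-ins, and equation (5) is applied without the noise term:

```python
import numpy as np

def bilinear(img, u, v):
    """Bilinearly interpolate img (rows=v, cols=u) at fractional pixel (u, v)."""
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    du, dv = u - u0, v - v0
    return ((1 - du) * (1 - dv) * img[v0, u0] + du * (1 - dv) * img[v0, u0 + 1]
            + (1 - du) * dv * img[v0 + 1, u0] + du * dv * img[v0 + 1, u0 + 1])

def remove_phantom(I_t, I_0, gA_ref, uv):
    """Estimate g(t) = h(t) mu from one measurement I(t).

    Per equation (5) without noise: g(t) = -ln(I(t)/I_0) - g_A(t),
    where g_A(t) is interpolated from the reference projection gA_ref.
    """
    gA_t = bilinear(gA_ref, *uv)
    return -np.log(I_t / I_0) - gA_t

# Synthetic stand-in: a smooth reference projection of phantom A.
uu, vv = np.meshgrid(np.arange(8.0), np.arange(8.0))
gA_ref = 0.1 * uu + 0.05 * vv
g_true = 0.7                               # object line integral along this ray
uv = (2.5, 3.5)                            # sub-pixel position from Q^{-1}
I_t = 1e5 * np.exp(-(g_true + bilinear(gA_ref, *uv)))
print(remove_phantom(I_t, 1e5, gA_ref, uv))  # recovers ~0.7
```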
9) Perform image processing (e.g. stitching, denoising, enhancement) on the data acquired by the detector, and then carry out detection (e.g. defect detection).
10) Integrate the data collected at a plurality of moments to obtain:

g = Hμ, with g = [g(t_1); g(t_2); …; g(t_K)] and H = [h(t_1); h(t_2); …; h(t_K)]    (6)

i.e. the data are arranged and integrated in sequence. Since the data contain noise, the above formula can also be written as a mathematical model incorporating the noise:

g = Hμ + n    (7)

Reconstruction can then be performed by ART-TV according to equation (6), or by PWLS (penalized weighted least squares) according to equation (7).
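A minimal PWLS sketch for equation (7), with illustrative statistical weights and a small quadratic penalty (the patent fixes neither), is:

```python
import numpy as np

def pwls(H, g, W, beta=1e-6):
    """PWLS estimate: argmin (g - H mu)^T W (g - H mu) + beta * ||mu||^2.

    The objective is quadratic, so this small demo solves the normal
    equations directly; large CT systems would use an iterative solver.
    """
    A = H.T @ W @ H + beta * np.eye(H.shape[1])
    b = H.T @ W @ g
    return np.linalg.solve(A, b)

rng = np.random.default_rng(1)
H = rng.random((60, 12))                 # stand-in system matrix
mu_true = rng.random(12)
W = np.diag(rng.uniform(0.5, 1.5, 60))   # statistical weights, e.g. ~ photon counts
g = H @ mu_true                          # noiseless sanity check
mu_est = pwls(H, g, W)
print(np.max(np.abs(mu_est - mu_true)))
```

In practice W is chosen from the measurement statistics (higher weight for rays with more photons), which is the point of PWLS over plain least squares.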
According to the self-navigation X-ray imaging system provided by the embodiments of the application, by using the scanning trajectory navigation positioning module and the light source/detector relative position navigation positioning module, and by combining synchronous acquisition of the light source and detector signals and real-time trajectory positioning with image processing and image reconstruction methods, the system can implement X-ray transmission imaging and CT imaging flexibly and effectively, which greatly facilitates practical application.
Next, a self-navigation X-ray imaging method proposed according to an embodiment of the present application is described with reference to the drawings.
FIG. 14 is a flow chart of a self-navigation X-ray imaging method according to an embodiment of the application.
As shown in fig. 14, the self-navigation X-ray imaging method includes the steps of:
step S101, in an imaging system, a depth image and/or a current position of a target object are/is acquired by utilizing a calibration phantom, and a motion track of the target object is determined according to the depth image and/or the current position.
Step S102, calculating coordinates of pixels of the X-ray detection module in a world coordinate system according to real-time detection data of the calibrated phantom, and determining a collection ray path of the imaging system according to actual coordinates of the X-ray source module in the world coordinate system and coordinates of the pixels of the X-ray detection module in the world coordinate system.
And step S103, constructing a projection relation matrix of the imaging system according to the motion track of the target object and the collection ray path.
And step S104, removing the projection data of the calibration phantom from the target object and the detection data of the calibration phantom collected by the X-ray detection module, and obtaining the detection data component of the target object according to the removed detection data and the projection relation matrix.
Step S105, image processing and/or image reconstruction are carried out on the detection data of the target object, and the imaging result of the target object is obtained.
In one embodiment of the present application, the self-navigation X-ray imaging method further comprises: in the imaging process, extracting the detection data of the calibration phantom on the X-ray detection module, generating a first adjustment value of the position of the X-ray detection module or a second adjustment value of the position of the X-ray source module according to the detection data of the calibration phantom, and, after the X-ray detection module and/or the X-ray source module changes position according to the first adjustment value and/or the second adjustment value, positioning the projection of the region of interest in the target object within the detection field of view of the X-ray detection module.

In one embodiment of the present application, the self-navigation X-ray imaging method further comprises: determining the actual coordinates of the X-ray source module in the world coordinate system from the imaging information of a preset phantom acquired by the scanning trajectory navigation positioning module and the X-ray source module.
It should be noted that the foregoing explanation of the self-navigation X-ray imaging system embodiment also applies to the self-navigation X-ray imaging method of the embodiment, and details are not repeated here.
According to the self-navigation X-ray imaging method provided by the embodiment of the application, the scanning track navigation positioning module and the light source/detector relative position navigation positioning module are used for carrying out auxiliary positioning on a scanned object, and the system can flexibly and effectively implement X-ray transmission imaging and CT imaging by combining the acquisition synchronization of various signals of the light source and the detector, the real-time track positioning method, the image processing method and the image reconstruction method, so that the practical application is greatly facilitated.
In order to implement the above embodiments, the present invention further provides an electronic device, including: a processor and a memory. Wherein the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory for implementing the self-navigation X-ray imaging method as in the foregoing embodiments.
Fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device may include: a memory 151, a processor 152, and a computer program stored on the memory 151 and executable on the processor 152.
The processor 152, when executing the program, implements the self-navigation X-ray imaging method provided in the above-described embodiments.
Further, the electronic device comprises:
a communication interface 153 for communication between the memory 151 and the processor 152.
A memory 151 for storing computer programs operable on the processor 152.
The memory 151 may comprise high-speed RAM, and may also include non-volatile memory, such as at least one disk storage.
If the memory 151, the processor 152 and the communication interface 153 are implemented independently, the communication interface 153, the memory 151 and the processor 152 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 15, but this is not intended to represent only one bus or type of bus.
Optionally, in a specific implementation, if the memory 151, the processor 152, and the communication interface 153 are integrated on a chip, the memory 151, the processor 152, and the communication interface 153 may complete communication with each other through an internal interface.
The processor 152 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present invention.
The present embodiment also provides a computer-readable storage medium having stored thereon a computer program, characterized in that the program, when executed by a processor, implements the self-navigation X-ray imaging method as above.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of implementing the embodiments of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.

Claims (10)

1. A self-navigating X-ray imaging system, comprising:
the X-ray source module is used for scanning a target object while moving according to a preset or real-time set track;
the X-ray detection module is used for collecting detection data of the target object and the calibration phantom while moving according to the preset or real-time set trajectory;
the scanning track navigation positioning module is used for acquiring the depth image and/or the current position of the target object in real time and guiding the movement of the X-ray detection module relative to the target object in the imaging process;
the light source/detector relative position navigation positioning module comprises a calibration phantom and is used for acquiring the relative position between the X-ray source module and the X-ray detection module by means of the calibration phantom and guiding the relative movement between the X-ray source module and the X-ray detection module;
the signal acquisition synchronization module is used for synchronizing the data acquisition time of the X-ray detection module and the data acquisition time of the scanning track navigation positioning module;
the geometric locus data processing module is used for constructing a projection relation matrix of the imaging system according to the depth image and/or the current position and the relative position; and
and the image reconstruction and image processing module is used for processing and/or reconstructing an image by utilizing the projection relation matrix based on the detection data of a plurality of moments acquired after time synchronization so as to obtain an imaging result of the target object.
2. The system of claim 1, wherein the scanning trajectory navigation and positioning module and the light source/detector relative position navigation and positioning module are loaded to the X-ray source module or the X-ray detection module respectively or both.
3. The system according to claim 1, wherein during the imaging process, detection data of the calibration phantom on the X-ray detection module is extracted, a first adjustment value of the position of the X-ray detection module or a second adjustment value of the position of the X-ray source module is generated according to the detection data of the calibration phantom, and after the X-ray detection module and/or the X-ray source module is changed to the position according to the first adjustment value and/or the second adjustment value, a projection of a region of interest in the target object is located within a detection field of view of the X-ray detection module.
4. The system of claim 1, wherein the image reconstruction and image processing module is further configured to perform data sequence integration or fusion noise processing on the detection data at the multiple time instants, and solve the detection data to obtain a transmission image or a reconstructed image.
5. The system of claim 1, wherein the scanning track navigation and positioning module is further configured to determine the actual coordinates of the focal spot of the X-ray source module in the world coordinate system according to the imaging information of the scanning track navigation and positioning module and the X-ray source module on a preset phantom.
6. A self-navigation X-ray imaging method for a self-navigation X-ray imaging system according to any one of claims 1 to 5, characterized by comprising the steps of:
in the imaging system, a calibration phantom is used for acquiring a depth image and/or a current position of the target object, and a motion track of the target object is determined according to the depth image and/or the current position;
calculating the coordinates of the pixels of the X-ray detection module in a world coordinate system according to the real-time detection data of the calibration phantom, and determining the collection ray path of the imaging system according to the actual coordinates of the X-ray source module in the world coordinate system and the coordinates of the pixels of the X-ray detection module in the world coordinate system;
constructing a projection relation matrix of the imaging system according to the motion track of the target object and the collection ray path;
removing projection data components of the calibration phantom from the detection data of the target object and the calibration phantom collected by the X-ray detection module, and obtaining the detection data of the target object according to the removed detection data and the projection relation matrix;
and carrying out image processing and/or image reconstruction on the detection data of the target object to obtain an imaging result of the target object.
7. The method of claim 6, further comprising: in the imaging process, extracting the detection data of the calibration phantom on the X-ray detection module, generating a first adjustment value of the position of the X-ray detection module or a second adjustment value of the position of the X-ray source module according to the detection data of the calibration phantom, and, after the X-ray detection module and/or the X-ray source module changes position according to the first adjustment value and/or the second adjustment value, positioning the projection of the region of interest in the target object within the detection field of view of the X-ray detection module.
8. The method of claim 6, further comprising: and determining the actual coordinate of the X-ray source module in a world coordinate system through the imaging information of a scanning track navigation positioning module and the X-ray source module to a preset die body.
9. An electronic device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the self-navigation X-ray imaging method as claimed in any one of claims 6 to 8.
10. A computer-readable storage medium, on which a computer program is stored, the program being executed by a processor to implement the self-navigation X-ray imaging method as claimed in any one of claims 6 to 8.
CN202111153262.8A 2021-09-29 2021-09-29 Self-navigation X-ray imaging system and imaging method Active CN113884519B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111153262.8A CN113884519B (en) 2021-09-29 2021-09-29 Self-navigation X-ray imaging system and imaging method

Publications (2)

Publication Number Publication Date
CN113884519A true CN113884519A (en) 2022-01-04
CN113884519B CN113884519B (en) 2022-07-12

Family

ID=79008183


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114414598A (en) * 2022-03-09 2022-04-29 河南省科学院同位素研究所有限责任公司 Steel structure corrosion positioning non-contact evaluation method in high-altitude closed space
CN114636715A (en) * 2022-03-09 2022-06-17 河南省科学院同位素研究所有限责任公司 High-altitude steel structure corrosion positioning evaluation method based on synchronous positioning of shed on shed and shed under shed
WO2023165074A1 (en) * 2022-03-01 2023-09-07 上海涛影医疗科技有限公司 Image positioning and dynamic image generation methods, apparatuses and systems, and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1971414A (en) * 2005-11-21 2007-05-30 清华大学 Imaging system
CN101561405A (en) * 2008-04-17 2009-10-21 清华大学 Straight-line track scanning imaging system and method
WO2010012441A1 (en) * 2008-07-31 2010-02-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e. V. X-ray image recording system and x-ray image recording method for recording image data using x-ray devices for volume reconstruction
CN203720109U (en) * 2014-01-27 2014-07-16 东南大学 Real-time online industrial CT (Computed Tomography) detection system based on X-ray source array
US20170164910A1 (en) * 2014-08-26 2017-06-15 Nanovision Technology (Beijing) Co., Ltd. Stationary real time ct imaging system and method thereof
CN109671128A (en) * 2018-12-07 2019-04-23 广州华端科技有限公司 Data processing, image rebuilding method and device in image reconstruction process
CN112568918A (en) * 2019-09-27 2021-03-30 西门子医疗有限公司 Method for determining tomographic image, image generation unit, program product, and medium
CN113081265A (en) * 2021-03-24 2021-07-09 重庆博仕康科技有限公司 Surgical navigation space registration method and device and surgical navigation system

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
KUGA, MAMORU et al.: "Navigation system for ACL reconstruction using registration between multi-viewpoint X-ray images and CT images", INTERNATIONAL CONGRESS SERIES *
XIE, XM (XIE, XIAOMIAN) et al.: "3D navigation of CTVE and correction of MinIP methods in non-invasive diagnostic detection", COMPUTERIZED MEDICAL IMAGING AND GRAPHICS *
ZHANG, WQ (ZHANG, WENQIANG) et al.: "Real-time visualization based on deformed X-ray fluoroscopy", FIRST INTERNATIONAL MULTI-SYMPOSIUMS ON COMPUTER AND COMPUTATIONAL SCIENCES (IMSCCS 2006), PROCEEDINGS *
OUYANG, Zhaoxuan: "Geometric Distortion Correction of X-ray Images", China Master's Theses Full-text Database, Information Science and Technology *
WANG, Gang: "Radioactive Region Reconstruction Based on Multi-view Visual Information Fusion", China Master's Theses Full-text Database, Engineering Science and Technology II *
CHEN, Zhiqiang et al.: "Non-deterministic Trajectory CT System Based on Visual Positioning", Journal of Tsinghua University (Science and Technology) *

Also Published As

Publication number Publication date
CN113884519B (en) 2022-07-12

Similar Documents

Publication Publication Date Title
CN113884519B (en) Self-navigation X-ray imaging system and imaging method
JP5580164B2 (en) Optical information processing apparatus, optical information processing method, optical information processing system, and optical information processing program
CN104019829B (en) Vehicle-mounted panorama camera based on POS (position and orientation system) and external parameter calibrating method of linear array laser scanner
CN109272574B (en) Construction method and calibration method of linear array rotary scanning camera imaging model based on projection transformation
CN105378794A (en) 3d recording device, method for producing 3d image, and method for setting up 3d recording device
CN111189415B (en) Multifunctional three-dimensional measurement reconstruction system and method based on line structured light
CN111060006A (en) Viewpoint planning method based on three-dimensional model
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN111445529A (en) Calibration equipment and method based on multi-laser ranging
CN114283203A (en) Calibration method and system of multi-camera system
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN112254680B (en) Multi freedom's intelligent vision 3D information acquisition equipment
CN107421503B (en) Single-detector three-linear-array three-dimensional mapping imaging method and system
JP2023505891A (en) Methods for measuring environmental topography
CN113706635B (en) Long-focus camera calibration method based on point feature and line feature fusion
CN116625258A (en) Chain spacing measuring system and chain spacing measuring method
CN113643436B (en) Depth data splicing and fusion method and device
CN111445528A (en) Multi-camera common calibration method in 3D modeling
CN112164119B (en) Calibration method for multi-camera system placed in surrounding mode and suitable for narrow space
CN111583388A (en) Scanning method and device of three-dimensional scanning system
EP3529977B1 (en) A bundle adjustment system
Wu Photogrammetry: 3-D from imagery
CN112304250B (en) Three-dimensional matching equipment and method between moving objects
KR20240056516A (en) Method and system for generating camera model for camera calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant