EP2881917A1 - Device and method for obtaining densitometric images of objects by a combination of x-ray systems and depth-sensing cameras - Google Patents
Device and method for obtaining densitometric images of objects by a combination of x-ray systems and depth-sensing cameras
- Publication number
- EP2881917A1 (application EP13825508.8A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- radiological
- images
- systems
- depth
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 238000000034 method Methods 0.000 title claims abstract description 52
- 239000000463 material Substances 0.000 claims abstract description 22
- 238000010521 absorption reaction Methods 0.000 claims abstract description 10
- 238000012545 processing Methods 0.000 claims abstract description 5
- 230000005855 radiation Effects 0.000 claims description 10
- 238000007689 inspection Methods 0.000 claims description 6
- 238000003908 quality control method Methods 0.000 claims description 4
- 238000001959 radiotherapy Methods 0.000 claims description 4
- 238000012937 correction Methods 0.000 claims description 3
- 238000010603 microCT Methods 0.000 claims description 3
- 238000003325 tomography Methods 0.000 claims description 3
- 238000004497 NIR spectroscopy Methods 0.000 claims description 2
- 238000002083 X-ray spectrum Methods 0.000 claims description 2
- 238000010276 construction Methods 0.000 claims description 2
- 238000001506 fluorescence spectroscopy Methods 0.000 claims description 2
- 238000003384 imaging method Methods 0.000 claims description 2
- 238000002460 vibrational spectroscopy Methods 0.000 claims description 2
- 238000012800 visualization Methods 0.000 claims description 2
- 238000003858 X-ray microfluorescence spectroscopy Methods 0.000 claims 1
- 230000008569 process Effects 0.000 description 9
- 238000004458 analytical method Methods 0.000 description 6
- 230000003287 optical effect Effects 0.000 description 6
- 230000004075 alteration Effects 0.000 description 3
- 230000008901 benefit Effects 0.000 description 3
- 230000015572 biosynthetic process Effects 0.000 description 3
- 239000000203 mixture Substances 0.000 description 3
- 238000001228 spectrum Methods 0.000 description 3
- 239000003086 colorant Substances 0.000 description 2
- 239000003814 drug Substances 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 238000005259 measurement Methods 0.000 description 2
- 238000002603 single-photon emission computed tomography Methods 0.000 description 2
- 210000003625 skull Anatomy 0.000 description 2
- 238000002835 absorbance Methods 0.000 description 1
- 239000000654 additive Substances 0.000 description 1
- 230000000996 additive effect Effects 0.000 description 1
- 201000009310 astigmatism Diseases 0.000 description 1
- 230000002238 attenuated effect Effects 0.000 description 1
- 210000000988 bone and bone Anatomy 0.000 description 1
- 238000003759 clinical diagnosis Methods 0.000 description 1
- 230000000052 comparative effect Effects 0.000 description 1
- 238000002591 computed tomography Methods 0.000 description 1
- 230000006378 damage Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000001066 destructive effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000009792 diffusion process Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000009826 distribution Methods 0.000 description 1
- 238000009547 dual-energy X-ray absorptiometry Methods 0.000 description 1
- 238000011156 evaluation Methods 0.000 description 1
- 230000005251 gamma ray Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 230000003993 interaction Effects 0.000 description 1
- 230000001678 irradiating effect Effects 0.000 description 1
- 238000005304 joining Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000001634 microspectroscopy Methods 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 230000000737 periodic effect Effects 0.000 description 1
- 230000002285 radioactive effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 238000010408 sweeping Methods 0.000 description 1
- 210000001519 tissue Anatomy 0.000 description 1
- 238000012876 topography Methods 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
- 238000004876 x-ray fluorescence Methods 0.000 description 1
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/005—Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/08—Volume rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/501—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of the head, e.g. neuroimaging or craniography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/41—Medical
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2211/00—Image generation
- G06T2211/40—Computed tomography
- G06T2211/416—Exact reconstruction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2211/00—Image generation
- G06T2211/40—Computed tomography
- G06T2211/464—Dual or multimodal imaging, i.e. combining two or more imaging modalities
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Biomedical Technology (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Animal Behavior & Ethology (AREA)
- High Energy & Nuclear Physics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Computer Hardware Design (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- General Engineering & Computer Science (AREA)
- Geometry (AREA)
- Image Processing (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Analysing Materials By The Use Of Radiation (AREA)
Abstract
Description
- The present invention, as expressed in the title of this descriptive document, firstly refers to a device for obtaining densitometric images of objects by combining radiological systems and depth cameras. Secondly, it refers to the method which, by means of the mentioned device, obtains densitometric images of objects using a combination of radiological systems and depth cameras. In addition, various uses of the device and method mentioned above are provided. The main fields of application of the device and method described herein are industry, medicine and security, particularly food quality control, quality control of industrial parts and the improvement of medical radiological devices. Note that throughout this document the term "objects" also covers individuals and scenes, and the present invention applies equally to any of them.
- The image formation model mathematically characterizes the process by which a point in the scene is projected onto an image.
- Depending on the required accuracy and the application, different formation models are used, such as the "pinhole" camera model, the thin-lens model or the thick-lens model.
- The "pinhole" camera model (
Figure 3 ) is the simplest and the most widely used in the field of computer vision. - Among the main model parameters are the optical center or center of projection and the focal length, which is the distance between the image plane and the optical center. The optical center is generally used as the origin of the camera's coordinate system. In this model, all points of
Figure 3 contained in any of the dotted lines are projected onto the same point on the image plane. Therefore, each point in the image represents a line in space that contains all the points projected onto it. In stereo vision this important property is used to obtain the 3D coordinates of a point by triangulation (a straight line for each camera). - The "pinhole" camera model ("Multiple View Geometry in Computer Vision", Hartley, R.I. and Zisserman, A.) may be too simple when the lens used in the camera produces aberrations in the projected image. Some of the most common aberrations include spherical aberration, astigmatism, radial/tangential distortion and field curvature.
- The set of all the parameters which allow modelling the projective process and the distortions caused by the lenses is known as the intrinsic parameters of the camera.
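- By way of illustration, the following is a minimal sketch in Python of the ideal pinhole projection described above; the intrinsic matrix K, the extrinsics R and t, and the numeric values are illustrative assumptions, not parameters taken from the patent:

```python
import numpy as np

def project_pinhole(point_3d, K, R, t):
    """Project a 3D world point onto the image plane of an ideal pinhole camera.
    K is the 3x3 intrinsic matrix (focal length, principal point); R (3x3) and
    t (3,) map world coordinates into the camera frame, whose origin is the
    optical centre."""
    p_cam = R @ point_3d + t            # world -> camera coordinates
    p_hom = K @ p_cam                   # perspective projection (homogeneous)
    return p_hom[:2] / p_hom[2]         # pixel coordinates (u, v)

# Illustrative camera: 800 px focal length, principal point at (320, 240)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
print(project_pinhole(np.array([0.1, -0.05, 2.0]), K, R, t))  # -> [360. 220.]
```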
- Depending on the camera type, the information projected at each point of the image may be of a different nature. For example, in an RGB image, each pixel provides information about the zone where the incident light interacts and is reflected towards the plane of the camera ( FIG. 5 ). In X-ray images, each pixel corresponds to the intensity attenuated by absorption and diffraction phenomena between two surfaces limiting a volume. - Depth cameras are sensors that create two-dimensional images in which each pixel contains the distance between the represented scene point and the plane of the camera. In some cases, after a calibration process, it is possible to obtain the three spatial coordinates of the scene points from the depth information.
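- Conversely, a depth camera calibrated in this way can back-project a pixel and its depth value into a 3D point. A minimal sketch, assuming the depth value is measured along the optical axis (some sensors report range along the ray instead) and using illustrative names:

```python
import numpy as np

def backproject_depth(u, v, depth, K):
    """Recover the 3D camera-frame coordinates of the scene point seen at
    pixel (u, v), given its measured depth and the intrinsic matrix K."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```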
- There are multiple techniques for obtaining the depth of a scene. A first classification distinguishes between passive and active techniques. In the first case depth is obtained in most cases by triangulation: the images obtained by two or more RGB cameras are used and the correspondence problem is solved. The fundamental advantage of passive methods is that special lighting conditions are not required, so they are suitable for working outdoors in daylight. Their disadvantage is that the correspondence problem is difficult to solve in zones that are homogeneous in intensity and color.
- In the case of active techniques, the scene is illuminated artificially with a light pattern that, by means of adequate processing, allows the depth to be determined.
- One of the pioneering techniques in this field illuminates the scene with a linear light beam. The deformation of the projection of the beam where it strikes objects in the scene can be related to the depth by triangulation if the positions of the light source and of the camera that captures the image are known. By moving the light beam relative to the objects to be measured, it is possible to obtain a set of profiles that form the depth image. The drawback of this technique is that the acquisition time of a depth image is long, because only one intensity profile is obtained at each instant.
- An alternative way to obtain the depth image from a single image is to use structured light. In this case a known light pattern, such as a set of horizontal or vertical lines, is usually projected. Again, the analysis of the pattern deformations allows the depth to be recovered along many profiles.
- Time-of-flight cameras employ an alternative technique similar to the one used in radar systems. In this case a specific sensor is used to measure the time of flight of a light pulse. The advantage over radar systems is that it is possible to obtain the depth of all the image points simultaneously, so sweeping a spot beam is not necessary.
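- The relation behind this measurement (standard for time-of-flight sensing, though not stated explicitly in the text) links the round-trip time Δt of the light pulse and the speed of light c to the depth d:

$$ d = \frac{c \, \Delta t}{2} $$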
- Recently a new type of low-cost depth camera has appeared on the market. These cameras use a different type of structured light known as coded light. Although they were initially designed for leisure-related applications, their low cost has enabled a great number of new applications in many different areas ( U.S. 2010/0199228 A1 , Kinect patent). Such cameras are also known as RGB-D cameras, because each point provides both color and depth information. This is possible because the coded light pattern is in the near infrared. - 3D cameras have numerous applications in fields such as industrial design and medicine. In these cases the cameras are used either for registration or for object modelling. In other fields of application, such as video surveillance or assisted driving, the depth information is very useful for resolving ambiguities that are very difficult to solve using only the information of a conventional RGB image.
- The set of systems for the reconstruction of the surface of an object is what is called "depth camera". If this measuring system includes the texture of the object it is called "texture and depth camera".
- X-ray (and gamma-ray) techniques have been commonly used in non-destructive analysis since the early twentieth century (Industrial Radiology: Theory and Practice, R. Halmshaw; Nietdestructief onderzoek, ISBN 90-407-1147-X (Dutch), W.J.P. Vink; Application of machine vision to food and agriculture: a review, E.R. Davies), both in clinical diagnosis and in the inspection of objects, and have led to major technological advances in the development of detectors and in production methods. Radiographic images are obtained by placing a natural or artificial source of gamma or X-rays so that the radiation passes through part or all of the examined object, with a generally flat or linear detector on the other side. Absorption differences due to the nature and thickness of the material generate an intensity image on the detector.
- The fundamental difference between X-rays and gamma rays is that the former come from a source that generates a continuous spectrum of photons, while gamma rays, produced by the natural de-excitation of atomic nuclei or of deep electron shells of the atom, have known energies, i.e. they behave as monochromatic sources.
- The use of radiographic X-ray sources has the advantage that the intensity can be modulated and the emission can be cut off automatically, whereas gamma sources cannot interrupt their emission, because decay is a natural process whose intensity varies over time:
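The expression itself is not reproduced in the extracted text; assuming the standard exponential decay law, a plausible reconstruction (stated here as an assumption, and noting that with "w" read literally as the inverse of the half-life the base would be 2 rather than e) is:

$$ I(t) = I_0(t_0)\, e^{-w\,(t - t_0)} $$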
where "w" is the inverse of the half-life of the isotope, "Io (t0)" is the source intensity measured at time "t0" and "t" is the time where the current measurement is performed. - This feature allows applications such as measuring the diffusion of tracers in live systems and allows applications such as the SPECT (cameras that detect the projection of the isotope in a plane) or PET (geometrically paired cameras that detect coincidences of photons from positrons produced by the decay of the nucleus).
- There are two ways commonly used to measure the source intensity that passes through the scene: either the total intensity is measured without discriminating the energy of each individual photon, or each photon is counted and its energy measured using gamma cameras. The latter type of technique is applied in the aforementioned PET and SPECT applications. When X-rays interact with matter they are partly absorbed and partly transmitted. The probability of interaction with the material depends on the electron density and is a function of the incident photon energy and of the elemental composition (atomic number Z) of the material.
- Thus the absorption of X-rays depends on the distance they cross, and the characteristics of the material. The transmitted intensity is determined by the following expression:
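The expression itself is not reproduced in the extracted text; given the parameter definitions that follow, it is presumably the Beer-Lambert attenuation law, reconstructed here as an assumption:

$$ I = I_0 \, e^{-k(r)\,x'}, \qquad k(r) = \sum_i w_i \, k_i(r) $$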
where "w" refers to the fraction of each component and the subscript "i" indicates the "k" characteristic for each of them. To find the distance "x' in the acquisition of planar X-ray images is not possible ("The X-ray Inspection" Dr.Ing. M. Purschke. Castell-Verlag GmbH) unless the geometry is known. - From the point of view of the end user, it is necessary to establish an optimization protocol of both, the maximum energy of the beam and its intensity, in order to prevent the saturation of the image by an excess of intensity or the lack of contrast by default therein. Optimal working parameters depend on the density or density variations expected.
- In clinical diagnostic environments, for example, the calibrations are performed by means of mannequins with known densities. Once the equipment is calibrated, the defined parameters are used to highlight different injuries and to analyze the state of bones and tissues, or to locate foreign bodies.
- In some cases to calibrate the detector response, easily detectable elements with different thicknesses are included in the scene either manually or automatically to establish a link between the measured intensity in the pixel of the detector and the material thickness.
- Another technique is the use of X-ray sources that emit at two different energies (multi-voltage). These techniques rely on the fact that X-ray absorption in the medium depends on both the electron density of the material and the energy of the X-ray beam. By comparing images of the same scene acquired at different voltages, information on the material composition can be obtained, which makes a densitometric study possible. This method, which allows an evaluation by comparing ( Figure 1 ) the distance travelled by the radiation in the material, is inaccurate because the measurements obtained are relative. It also requires extra steps, and either the time for two acquisitions or two radiological devices. - In general, the X-ray or gamma-ray image obtained in a detector is a mixture of different frequencies (colors) from the emitting source. It is a multispectral image in which multiple wavelengths overlap. For applications where it is necessary to select a small continuous band of wavelengths, diffraction techniques or radioactive sources of known energy are applied. The former have the disadvantage of drastically reducing the beam intensity, while the latter have the disadvantages associated with natural gamma-ray sources.
- The resulting image has a camera equation characterized by the intrinsic parameters, i.e. the distance from source to detector and radial corrections in case image intensifiers are used, and by extrinsic parameters which locate the radiological system in a coordinate system.
- When two radiographic images registered from different locations are combined, triangulation methods analogous to those used with visible light can be applied. The manual or automatic identification of scene elements in the two projections allows the spatial position of the element to be reconstructed ("Trajectory triangulation: 3D reconstruction of moving points from a monocular image sequence", Avidan and Shashua). Thus, by applying algebraic methods, the pixels obtained in the image sections generate a stereoscopic image with three-dimensional information ( figure 2 ). - 3D image reconstruction can be performed from 2D images acquired at different angular positions ("tomography systems"). Devices in which the pixel size is in the micrometer range are termed "micro-tomography systems". The image acquisition can be based on "gantry systems", with one or more detectors and one or more X-ray sources rotating about an axis, or on systems in which the scene rotates about its axis.
- Sensors which provide planar images are detectors sensitive to the radiation intensity, arranged either in a plane or linearly; the latter rely on the relative movement of the object and the detector, synchronizing the displacement speed with the read-out speed. Although the calibration of these images is slightly different in order to obtain the registration (in the linear case there is a different focal length for each vertical/horizontal orientation of the image), the mathematical differences in their treatment are not relevant.
- The assembly between the image sensor and the source of X-rays or gamma rays is called 'radiological system', and provides 'radiological images'. If these images contain information related to a 'reference frame' they are called 'registered radiological images'.
- As already mentioned, there are well-known techniques for image registration in the state of the art. Depth camera systems require knowing the calibration parameters of the camera in order to reconstruct the image. There are techniques that use symbols identifiable and recognizable by image processing which allow this calibration matrix to be obtained. These methods make it possible to assign a line in space to each pixel, on the basis of the distance between the focus and the center of the camera, as well as the rotation and translation matrices and the distortion coefficients. These systems are called "reference frame for depth cameras" in this document.
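- A minimal sketch of the "line in space for each pixel" just mentioned, ignoring distortion and using illustrative names for the calibration parameters:

```python
import numpy as np

def pixel_ray(u, v, K, R, t):
    """Return (origin, unit direction) of the 3D ray assigned to pixel (u, v).
    K are the intrinsics; R (3x3) and t (3,) are the extrinsics that map world
    coordinates to camera coordinates, so the camera centre is -R.T @ t."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing direction, camera frame
    origin = -R.T @ t                                 # camera centre in world frame
    direction = R.T @ d_cam                           # rotate direction to world frame
    return origin, direction / np.linalg.norm(direction)
```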
- For the same purpose, in X-ray imaging the calibration is performed using systems of fiducials or marks that identify coordinates which are known and visible in different imaging modalities and are adapted to the various acquisition modes. When the acquisition is volumetric, a technique used in medicine is the use of calibrated frames adapted to the patient ( Figure 4 ). These systems are called "reference frame for radiological image", herein named "reference frame". - The surface information obtained with different wavelengths in the visible region of the spectrum may also be used for texture analysis, i.e. to study the periodic spatial distribution patterns that locally form the surface topography of the object.
- Throughout this document, when the images of two systems or devices provide spatial information in the same coordinate system, those systems will be referred to as "registered systems or devices" and the images captured by such systems will be referred to as "registered images".
- However, no devices are found in the state of the art that integrate radiological systems and depth cameras working together to provide densitometric images of objects, scenes or individuals.
- To achieve the objectives and avoid the drawbacks stated above, the invention consists of a device and a method for obtaining densitometric images of objects combining radiological systems and depth cameras.
- A first object of the present invention is a device that obtains information on the surface of an object by using depth cameras, or depth cameras with texture, combined with one or more radiological systems. By registering the radiological system and the depth cameras, spatial information is introduced into the analysis of the X-ray images.
- As additional information, the process also yields a parameterization of the object surface, which may be provided in applications where this surface is of interest.
- More specifically, the first object of the present invention is a device for obtaining densitometric images of objects by the combination of radiological systems and depth cameras. That device comprises at least:
- at least one radiological device, which provides a set of registered radiological images where the radiological images comprise information about the radiological absorbance of the objects;
- at least one depth sensor, which provides a set of registered depth images that allow the three-dimensional reconstruction of part of the surfaces that constitute the objects;
- image processing means that combine the information of radiological absorption of the set of radiological images with the width of the material traversed provided by the three-dimensional reconstruction of the objects.
- In a particular embodiment of the invention, at least one of the depth sensors comprises tools for moving its relative position with respect to the rest of the device, while keeping the registration.
- In another particular embodiment of the invention, at least one of the radiological systems comprises tools for moving its relative position with respect to the rest of the device, while keeping the registration.
- In another particular embodiment of the invention, the device comprises a system of marks in pre-set fixed locations recognizable for the at least one radiological device and for the at least one depth sensor. This system of marks allows the calibration of the elements in the device to obtain densitometric images.
- In another particular embodiment of the invention, the objects comprise frames of known materials, positioned in proximity to said objects, that allow the X-ray spectrum to be filtered and regions of pixels with modulated energy to be obtained in the same radiological image.
- In another particular embodiment of the invention, at least one radiological system comprises a radiation source of gamma rays.
- In another particular embodiment of the invention, the device for obtaining radiological images comprises a system of registered images that enables texture information to be incorporated into the object surfaces.
- In another particular embodiment of the invention, the radiation source comprises a radiotherapy accelerator.
- In another particular embodiment of the invention, the device comprises at least one registered image in the visible light region of the spectrum for applying information on the texture to the object surface.
- A second object of the invention is the method for obtaining densitometric images of at least one object by combination of radiographic images and depth cameras. Such a procedure is carried out by using the device described above.
- This method includes the following steps (a minimal computational sketch is given after the list):
- acquire at least one radiological image registered by at least one radiological system/device;
- generate the three-dimensional reconstruction of the object from the set of registered depth images acquired by the depth-image acquisition system;
- calculate, for each pixel in the radiological image, a straight line in the space which represents that pixel;
- calculate one entry point and one exit point, corresponding to the intersection of each straight line calculated for each pixel in the radiological image with the surface obtained by the three-dimensional reconstruction of the object;
- calculate the width as the length of the segment joining the entry and exit points in the radiological image;
- combine, for each pixel in the radiological image, information on the length of the segment calculated with the information on the surface obtained from the three-dimensional reconstruction of the object.
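- A minimal computational sketch of these steps follows. It is only an illustration under simplifying assumptions (the registered source position and per-pixel rays are already available, and the reconstructed surface exposes an intersection helper); the names pixel_rays and surface.intersect and the Beer-Lambert normalization are illustrative choices, not taken from the patent:

```python
import numpy as np

def densitometric_image(xray_img, I0, source_pos, pixel_rays, surface):
    """For each pixel: take its back-projected ray, intersect it with the
    surface reconstructed by the depth camera, compute the entry/exit points
    and the traversed width, and normalize the measured absorption by that
    width to obtain a density-like value."""
    out = np.zeros(xray_img.shape, dtype=float)
    for (row, col), intensity in np.ndenumerate(xray_img):
        ray_dir = pixel_rays[row, col]                 # unit vector, source towards pixel
        hits = surface.intersect(source_pos, ray_dir)  # assumed helper: sorted hit distances
        if len(hits) < 2:
            continue                                   # the ray misses the object
        width = hits[-1] - hits[0]                     # entry-to-exit segment length
        # Beer-Lambert: I = I0 * exp(-mu * width)  =>  mu = -ln(I / I0) / width
        out[row, col] = -np.log(max(intensity, 1e-12) / I0) / width
    return out
```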
- In a particular embodiment of the invention, at least one sensor changes its relative position with respect to other components of the system, keeping the registration.
- In another particular embodiment of the invention, at least one of the radiological systems changes its relative position with respect to other components of the system, keeping the registration.
- In another particular embodiment of the invention, the method comprises combining the visible images of the objects with the three-dimensional image from the depth sensors, obtaining images with texture.
- Furthermore, a third object of the present invention is the use of the device and the process described above. Among the possible uses of the present invention, and not limited only to them, its use is provided in the following systems or devices: axial tomography systems, micro-tomography systems, fluorescence spectrometry systems, X-ray micro-fluorescence spectrometry systems, vibrational spectroscopy systems, near-infrared spectroscopy systems, multispectral camera systems, hyperspectral camera systems, accelerators for radiotherapy, display systems, quality control systems, inspection systems for food, products and mechanical constructions, side-scatter correction systems, inspection of mechanical parts and support systems for diagnostic devices.
-
-
Figure 1 .- Shows contrasts obtained with calibrated patterns and different voltages in an X-ray tube. -
Figure 2 .- Shows the process, which belongs to the state of the art, for the registration of X-ray images using recognizable cross-shaped fiducials (x, +), for the camera equation of each projection. -
Figure 3 .- Shows the formation of images in the geometric pinhole camera model. -
Figure 4 .- Shows a perspective view of a volumetric reconstruction of a human skull, showing a fiducial system for 3D volumes. -
Figure 5 .- Shows a reconstruction of the surface of a human head by the use of depth cameras according to the state of the art. -
Figure 6 .- Shows a particular embodiment of the device object of the present invention. -
Figure 7 .- Shows the same X-ray image taken by conventional devices and techniques (7a) and by the device and method object of the present invention (7b). - Subsequently, a description of various embodiments of the invention is carried out, with an illustrative and non-limiting character, with reference to the notation adopted in the figures.
- X-ray images do not provide information about the surface delimiting the inspected object. This information is relevant for the analysis of the density of the materials. Under certain circumstances, the geometry is known or can be obtained in a comparative way. But usually this information is not available, reducing the applicability of the X-ray images.
- The most common solution to this problem is dual-energy X-ray absorptiometry, which involves the comparison of two X-ray images taken at different voltages. The cost of these devices, however, is high, because two X-ray sources and two detectors are required. Likewise, this method does not allow spatial information of the scene to be obtained. Another technique is the use of X-ray sources emitting at least two different energies (multi-voltage). Figure 1 shows the contrasts obtained with calibrated templates for two different voltages. Specifically, the comparison of the distances travelled by the radiation in the material for two different voltages, 150 kV and 80 kV, is presented, showing the material with higher electron density in darker colors. -
Figure 2 shows the process for obtaining the registered X-ray images for the camera equation of each projection. This type of image registration is commonly used in clinical protocols. Thus, two X-ray sources (1, 2) are used to irradiate an object (3) provided with a recognizable cross-shaped fiducial system (x, +), which is in turn projected onto the radiation receptors (4, 5) respectively. To avoid confusion in the spatial orientation of this application, an extra fiducial could be added to either of the two types of fiducials. - The projection process for the "pinhole" camera model is represented in
figure 3 . The optical center or projection center (6) and the focal length (7), which is the distance between the image plane (8) and the optical center, are among the main parameters of the model. The optical center is generally used as the origin of the coordinate system referred to the camera. The figure shows how all the points contained in any of the dotted lines are projected onto the same point (10) in the image plane (8). Therefore, each point (10) in the image represents a straight line in space that contains all the points of the object (9) that are projected onto it. - A volumetric reconstruction of a human skull showing a fiducial system for 3D volumes is shown in
figure 4 . That fiducial system is fixed with a calibrated frame (11) that is adapted to the subject (13). "N-shaped" items (12) establish a pattern recognizable in computed tomography or magnetic resonance images. These fiducials facilitate the mathematical inference of the object surface, the human head in this case, for which the 3D coordinates must be expressed in the same reference system. To this end, the inference is carried out by the combined use of patterns visible in both systems: the depth cameras, which locate elements of the scene, and the radiological systems. - The fiducial system and the calibrated frame are particular embodiments; other fiducial systems and frames available in the state of the art may be used.
Figure 5 shows a partial reconstruction of the surface of a human head obtained with depth cameras. The techniques used for this type of reconstruction are well known in the state of the art.
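As an illustrative sketch of what such a reconstruction involves (not part of the patent; the intrinsics are placeholder values), a depth image can be back-projected into a 3D point cloud of the visible surface as follows:

```python
import numpy as np

# Hypothetical depth-camera intrinsics (placeholder values).
FX = FY = 570.0
CX, CY = 320.0, 240.0

def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project a depth image (metres) into an (N, 3) point cloud in the camera frame."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]            # discard pixels with no depth reading

if __name__ == "__main__":
    depth = np.full((480, 640), 0.8)           # a synthetic flat surface 0.8 m away
    cloud = depth_to_point_cloud(depth)
    print(cloud.shape)                         # (307200, 3)
```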
Figure 6 shows a particular embodiment of the device which is the object of the present invention. Specifically, it includes a radiation source (14) that emits an X-ray or gamma-ray beam onto an object (15), from which a densitometric radiological image is to be obtained on a radiological receptor (16). It also includes a depth camera (17) and a receptor (18) of the depth camera (17); that receptor (18) is responsible for providing the 3D reconstruction of the object (15). The device also includes a unit for image registration (19). As an example, three beams (20, 21, 22) of X-rays or gamma rays irradiating the object are represented. For the central beam (21), the width of material passed through by the beam is represented, with an entry point into the object (23) and an exit point (24); these two points define a straight line (25) whose length corresponds to the width of the material. Starting from the surface measured with the depth camera (17) and the receptor (18), and from the X-ray image captured by the source (14) and the receptor (16), the equation of the line that represents each pixel of the radiological image is calculated. Intersecting that line with the 3D reconstruction captured by the camera (17) and the receptor (18) yields the segment (25) that defines the width of material traversed by the beam (21). Finally, the widths calculated for each pixel of the radiological image are used to correct the original radiological image.
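A minimal sketch of this per-pixel step follows; it is illustrative only and not a literal implementation of the patented device. To keep the sketch self-contained, the reconstructed object surface is stood in for by an analytic sphere (centre and radius are placeholder values) so that the entry point, exit point and traversed width can be computed in closed form:

```python
import numpy as np

# Illustrative stand-in for the 3D reconstruction: an analytic sphere (placeholder values).
SPHERE_CENTRE = np.array([0.0, 0.0, 1.0])
SPHERE_RADIUS = 0.1

def traversed_thickness(origin: np.ndarray, direction: np.ndarray) -> float:
    """Length of the segment between the entry point (23) and exit point (24) of a ray.

    Solves ||origin + s*direction - centre||^2 = r^2 for s; returns 0 if the ray misses.
    """
    d = direction / np.linalg.norm(direction)
    oc = origin - SPHERE_CENTRE
    b = np.dot(oc, d)
    c = np.dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - c
    if disc <= 0:
        return 0.0                          # the beam does not traverse the object
    s_entry, s_exit = -b - np.sqrt(disc), -b + np.sqrt(disc)
    return float(s_exit - s_entry)          # equals the length of segment (25)

if __name__ == "__main__":
    source = np.array([0.0, 0.0, 0.0])               # position of the radiation source (14)
    central_ray = np.array([0.0, 0.0, 1.0])          # beam through the sphere centre
    oblique_ray = np.array([0.05, 0.0, 1.0])         # a slightly oblique beam
    print(traversed_thickness(source, central_ray))  # 0.2 (the full diameter)
    print(traversed_thickness(source, oblique_ray))
```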
The result of the preceding steps is clearly shown in Figures 7a and 7b, where the same X-ray image taken by conventional devices and techniques (Fig. 7a) and by the device and method of the present invention (Fig. 7b) is shown. Starting from the radiological image of Figure 7a and combining it with the information provided by the surface reconstruction obtained with the depth cameras shown in Figure 5, Figure 7b is obtained, which shows that the dynamic range of the image has been compacted by representing density instead of absorption. Adjusting the dynamic range for visualization allows a simplification of the diagnostic systems.
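As an illustrative sketch (assuming an idealized monochromatic beam and a flat-field reference image, neither of which is stated in the patent), the per-pixel conversion from absorption to a density-like quantity that compacts the dynamic range can be written as the attenuation divided by the traversed thickness:

```python
import numpy as np

def density_image(i_raw: np.ndarray, i_flat: np.ndarray, thickness: np.ndarray) -> np.ndarray:
    """Per-pixel effective attenuation coefficient (a density-like quantity).

    i_raw:     radiological image of the object
    i_flat:    reference image without the object (flat field)
    thickness: traversed material width per pixel, from the 3D reconstruction
    """
    absorbance = -np.log(np.clip(i_raw / i_flat, 1e-6, 1.0))   # total attenuation per pixel
    with np.errstate(divide="ignore", invalid="ignore"):
        mu_eff = np.where(thickness > 0, absorbance / thickness, 0.0)
    return mu_eff

if __name__ == "__main__":
    # Synthetic example: same material, two different thicknesses -> same density value.
    thickness = np.array([[1.0, 4.0]])
    i_flat = np.ones_like(thickness)
    i_raw = np.exp(-0.5 * thickness)                 # Beer-Lambert with mu = 0.5
    print(density_image(i_raw, i_flat, thickness))   # [[0.5 0.5]]
```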
Claims (14)
- Device for obtaining densitometric images of objects by a combination of radiological systems and depth-sensing cameras, characterized in that the device at least comprises:
  • at least one radiological device, which provides a set of registered radiological images wherein the radiological images comprise radiological absorption information of the objects;
  • at least one depth sensor that provides a set of registered depth images that allow the three-dimensional reconstruction of the surfaces that constitute the objects;
  • image processing means that combine the radiological absorption information from the set of registered radiological images with distances of traversed material provided by the three-dimensional reconstruction of the objects.
- Device for obtaining densitometric images, according to claim 1, characterized in that at least one of the depth sensors comprises means for changing its relative position with respect to the rest of the device while keeping the registration.
- Device for obtaining densitometric images, according to claims 1 or 2, characterized in that at least one of the radiological systems comprises means for changing its relative position with respect to the rest of the device while keeping the registration.
- Device for obtaining densitometric images according to any of the preceding claims, characterized in that it comprises a system of marks in pre-set fixed locations, recognizable by the at least one radiological device and the at least one depth sensor, which allows the calibration of the device for obtaining densitometric images.
- Device for obtaining densitometric images according to any of the preceding claims, characterized in that the objects comprise, positioned in proximity, frames of known materials that allow part of the X-ray spectrum to be filtered and regions of pixels with modulated energy to be obtained in the same radiological image.
- Device for obtaining densitometric images according to any one of claims 1 to 4, wherein the at least one radiological system comprises a radiation source of gamma rays.
- Device for obtaining densitometric images according to any of the preceding claims, characterized in that it comprises a system of registered images that enables texture information to be incorporated into the object surface.
- Device for obtaining densitometric images according to any of the preceding claims, characterized in that the radiation source comprises a radiotherapy accelerator.
- Device for obtaining densitometric images according to any of the preceding claims, characterized in that it comprises at least one registered image in the visible region for applying texture information to the object surface.
- Method for obtaining densitometric images of at least one object by combination of radiological images and depth cameras, which is performed by the device described in any of the preceding claims, characterized in that it comprises the following phases:
  • acquire at least one registered radiological image by means of at least one radiological system;
  • generate the three-dimensional reconstruction of the at least one object from the set of registered depth images acquired through a depth imaging system;
  • calculate, for each pixel of the at least one radiological image, a straight line in space that represents the pixel;
  • calculate one entry point and one exit point, corresponding to the intersection of each straight line calculated for each pixel of the at least one radiological image with the surface obtained by the three-dimensional reconstruction of the at least one object;
  • calculate the length of the segment that joins the entry point and the exit point in the three-dimensional reconstruction;
  • combine, for each pixel of the radiological image, the information about the calculated segment length with the information obtained from the surface acquired by the three-dimensional reconstruction of the at least one object.
- Method for obtaining densitometric images according to claim 10, characterized in that at least one of the sensors changes its relative position with respect to the rest of the components of the system while keeping the registration.
- Method for obtaining densitometric images according to any of claims 10 or 11, characterized in that at least one of the radiological systems changes its relative position with respect to the rest of the system while keeping the registration.
- Method for obtaining densitometric images according to any of the preceding claims, characterized in that it comprises combining the visible images of the objects with the three-dimensional images from the depth sensors, obtaining textured images.
- Use of the device and method described in any of the preceding claims in systems selected from:
  • axial tomography systems;
  • micro-tomography systems;
  • fluorescence spectrometry systems;
  • X-ray microfluorescence spectrometry systems;
  • vibrational spectrometry systems;
  • near-infrared spectroscopy systems;
  • hyperspectral camera systems;
  • radiotherapy accelerators;
  • visualization systems;
  • quality control systems;
  • inspection systems for food, products and mechanical constructions;
  • side scatter correction systems;
  • inspection systems for mechanical pieces; and
  • support systems for diagnostics.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ES201231243A ES2445490B1 (en) | 2012-07-31 | 2012-07-31 | DEVICE AND PROCEDURE FOR OBTAINING DENSITOMETRIC IMAGES OF OBJECTS THROUGH COMBINATION OF RADIOLOGICAL SYSTEMS AND DEPTH CAMERAS |
PCT/ES2013/070502 WO2014020202A1 (en) | 2012-07-31 | 2013-07-12 | Device and method for obtaining densitometric images of objects by a combination of x-ray systems and depth-sensing cameras |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2881917A1 true EP2881917A1 (en) | 2015-06-10 |
EP2881917A4 EP2881917A4 (en) | 2015-08-12 |
Family
ID=50027310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13825508.8A Ceased EP2881917A4 (en) | 2012-07-31 | 2013-07-12 | Device and method for obtaining densitometric images of objects by a combination of x-ray systems and depth-sensing cameras |
Country Status (4)
Country | Link |
---|---|
US (1) | US10009593B2 (en) |
EP (1) | EP2881917A4 (en) |
ES (1) | ES2445490B1 (en) |
WO (1) | WO2014020202A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109472853A (en) * | 2018-11-16 | 2019-03-15 | 厦门大学 | A kind of lambert's body microcosmic surface reconstructing method based on image irradiation intensity |
CN109499010B (en) * | 2018-12-21 | 2021-06-08 | 苏州雷泰医疗科技有限公司 | Radiotherapy auxiliary system based on infrared and visible light three-dimensional reconstruction and method thereof |
CN114745642A (en) | 2019-12-30 | 2022-07-12 | 美商楼氏电子有限公司 | Balanced armature receiver |
CN112229827B (en) * | 2020-09-07 | 2022-02-08 | 南京大学 | Real-time multispectral tomography method and device |
EP4190245A1 (en) * | 2021-12-06 | 2023-06-07 | Koninklijke Philips N.V. | Quantitative radiographic imaging using a 3d camera |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5774521A (en) * | 1996-07-08 | 1998-06-30 | Cedars-Sinai Medical Center | Regularization technique for densitometric correction |
FR2849241B1 (en) * | 2002-12-20 | 2005-06-24 | Biospace Instr | RADIOGRAPHIC IMAGING METHOD AND DEVICE |
US7265356B2 (en) * | 2004-11-29 | 2007-09-04 | The University Of Chicago | Image-guided medical intervention apparatus and method |
FR2881255B1 (en) * | 2005-01-25 | 2007-03-30 | Gen Electric | METHOD FOR PRODUCING A RENAL IMAGE |
JP2010033296A (en) * | 2008-07-28 | 2010-02-12 | Namco Bandai Games Inc | Program, information storage medium, and image generation system |
US20100199228A1 (en) | 2009-01-30 | 2010-08-05 | Microsoft Corporation | Gesture Keyboarding |
KR101661934B1 (en) * | 2010-07-29 | 2016-10-04 | 삼성전자주식회사 | Image processing apparatus and method |
US9433395B2 (en) * | 2012-07-12 | 2016-09-06 | Samsung Electronics Co., Ltd. | X-ray imaging apparatus and method for controlling X-ray imaging apparatus |
2012
- 2012-07-31 ES ES201231243A patent/ES2445490B1/en active Active

2013
- 2013-07-12 US US14/418,608 patent/US10009593B2/en active Active
- 2013-07-12 EP EP13825508.8A patent/EP2881917A4/en not_active Ceased
- 2013-07-12 WO PCT/ES2013/070502 patent/WO2014020202A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20150222875A1 (en) | 2015-08-06 |
EP2881917A4 (en) | 2015-08-12 |
ES2445490B1 (en) | 2014-12-10 |
WO2014020202A1 (en) | 2014-02-06 |
US10009593B2 (en) | 2018-06-26 |
ES2445490A1 (en) | 2014-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6437286B2 (en) | Image processing apparatus, image processing program, image processing method, and treatment system | |
US9084568B2 (en) | Radiation imaging apparatus and imaging method using radiation | |
US10009593B2 (en) | Device and method for obtaining densitometric images of objects by a combination of radiological systems and depth-sensing cameras | |
US8977026B2 (en) | Methods and systems for locating a region of interest in an object | |
US20130168570A1 (en) | Device and method for combined optical and nuclear image acquisition | |
US10132764B2 (en) | System and method for reconstructing the surface topography of an object embedded within a scattering medium | |
CN104939848B (en) | The generation of monochrome image | |
JP7107974B2 (en) | Systems and methods for volume of distribution and isotope identification in radioactive environments | |
US20160049216A1 (en) | Method and Apparatus for Ion Beam Bragg Peak Measurement | |
US8824759B2 (en) | Correcting axial tilt based on object positions in axial slices of three dimensional image | |
EP3541285B1 (en) | Apparatus for generating multi energy data from phase contrast imaging data | |
CN107533766A (en) | Image improvement method for the view data from dental imaging generation system | |
Koch et al. | Cross-camera comparison of SPECT measurements of a 3-D anthropomorphic basal ganglia phantom | |
US10182776B2 (en) | System and method for correlating object information with X-ray images | |
CN112822983A (en) | Apparatus and method for editing panoramic radiographic image | |
Rogers et al. | Reduction of wobble artefacts in images from mobile transmission x-ray vehicle scanners | |
CN114650774B (en) | Device for processing data acquired by a dark field and/or phase contrast X-ray imaging system | |
US11367227B2 (en) | Method and apparatus for computer vision based attenuation map generation | |
JP2014012131A (en) | X-ray apparatus and method for measuring x-ray | |
EP2882343B1 (en) | Chronic obstructive pulmonary disease (copd) phantom for computed tomography (ct) and methods of using the same | |
KR101937651B1 (en) | Tomotherapy apparatus using phantom inserting phosphorescent plate | |
JP7008325B2 (en) | Radiation fluoroscopy non-destructive inspection method and radiation fluoroscopy non-destructive inspection equipment | |
JP7437337B2 (en) | Internal state imaging device and internal state imaging method | |
Thürauf et al. | Tuning of X-ray parameters for noise reduction of an image-based focus position measurement of a C-arm X-ray system | |
US20070064869A1 (en) | Laminography apparatus |
Legal Events

Code | Title | Description
---|---|---
PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012
17P | Request for examination filed | Effective date: 20150219
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX | Request for extension of the European patent | Extension state: BA ME
RA4 | Supplementary search report drawn up and despatched (corrected) | Effective date: 20150713
RIC1 | Information provided on IPC code assigned before grant | Ipc: G06T 11/00 20060101AFI20150707BHEP; Ipc: G06T 15/04 20110101ALI20150707BHEP; Ipc: G06T 15/00 20110101ALI20150707BHEP; Ipc: G06T 15/06 20110101ALI20150707BHEP
DAX | Request for extension of the European patent (deleted) |
17Q | First examination report despatched | Effective date: 20160902
STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE
APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E
APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E
APAV | Appeal reference deleted | Free format text: ORIGINAL CODE: EPIDOSDREFNE
APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E
REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003
STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED
18R | Application refused | Effective date: 20230207