WO2016018825A1 - Method and apparatus for producing a three-dimensional image - Google Patents


Info

Publication number
WO2016018825A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional image
sensor
shadow
radiation
dimensional
Prior art date
Application number
PCT/US2015/042296
Other languages
English (en)
Inventor
Jeremy HORST
Thomas GAL
Marcin Swiatek
Original Assignee
Oraviz, Inc.
Priority date
Filing date
Publication date
Application filed by Oraviz, Inc. filed Critical Oraviz, Inc.
Publication of WO2016018825A1


Classifications

    • A61B 6/12: Arrangements for detecting or locating foreign bodies
    • A61B 6/025: Tomosynthesis
    • A61B 6/4452: Mounting of source units and detector units, the source unit and the detector unit being able to move relative to each other
    • A61B 6/512: Intraoral means (radiation diagnosis specially adapted for dentistry)
    • A61B 6/5205: Processing of raw data to produce diagnostic data
    • A61B 6/5241: Combining overlapping images of the same imaging modality, e.g. by stitching
    • A61B 90/16: Bite blocks
    • A61B 2090/367: Creating a 3D dataset from 2D images using position information
    • A61B 2090/3966: Radiopaque markers visible in an X-ray image
    • G03B 15/14: Special procedures for taking photographs during medical operations
    • G03B 17/561: Support-related camera accessories
    • G03B 42/042: Holders for X-ray films for dental applications
    • G06T 7/593: Depth or shape recovery from multiple images, from stereo images
    • G06T 2207/30036: Dental; Teeth

Definitions

  • This disclosure generally relates to three-dimensional imaging. More particularly, it relates to systems and methods for combining two-dimensional images into a three-dimensional image.
  • fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image.
  • the reference objects have physical properties that allow effective normalization of images from diverse sources and positions.
  • some embodiments of the devices and methods may function with various types and manufacturers of radiation projectors and sensors. This may allow dentists to create 3D images using some of the existing 2D equipment already ubiquitous in dental practices. This can expand access to 3D imaging to patients for whom X-ray computed tomography machines are unavailable for reasons of cost or geography. Increased access to 3D imaging may in turn improve clinical outcomes for these patients.
  • a 3D image can be generated using small, mobile equipment that does not confine a person within a large machine. Some people, including children or those with claustrophobia or anxiety disorders, for example, may be afraid of large medical devices, including traditional CT machines. Some people have special healthcare needs that limit their ability to enter a CT machine.
  • the radiation source can be at nearly any angle and/or distance relative to the radiation sensor when capturing a 2D image.
  • the radiation source does not need to be mounted to a track or a rig, does not need to rotate about a fixed axis, and/or the angle of the radiation source relative to the radiation sensor does not need to be known at the time an image is captured.
  • some embodiments of the devices and methods do not require the fiducial markers to be glued or affixed to the object being imaged, but instead are fixed in a position relative to the radiation sensor. This may reduce patient discomfort during procedures.
  • some embodiments of the devices and methods herein do not require medical professionals and staff to receive significant additional training to operate.
  • some embodiments of the devices and methods disclosed herein may allow for the creation of 3D images from fewer images or scans than other 3D imaging equipment, and therefore with less radiation exposure.
  • the desired resolution of the resulting 3D image can be adjusted by increasing or decreasing the number of 2D images incorporated into the 3D image.
  • the resolution of particular anatomical features can also be adjusted by selecting particular capture angles and radiation exposures for each 2D image incorporated into the 3D image.
  • an apparatus for capturing radiation includes a support, a radiation sensor coupled to the support, and a fiducial marker held by the support at a set distance from the sensor.
  • a method for capturing a two-dimensional radiographic image includes inserting an apparatus of any of the embodiments described herein into a mouth and capturing the image.
  • the sensor, sensor holder, and/or the fiducial markers are held static with regard to the object being imaged.
  • an apparatus for holding a radiation sensor includes a support, a radiation sensor holder coupled to the support, and a fiducial marker held by the support at a set distance from the radiation sensor holder.
  • the fiducial marker is pre-aligned with respect to the sensor or sensor holder so that a location of the fiducial marker is known with respect to the sensor.
  • the set distance is less than 10 mm.
  • the fiducial marker is less than one cubic centimeter in volume.
  • a side of the sensor has less than four square inches of surface area. In some embodiments, a side of the sensor that can detect radiation has less than four square inches of surface area. In some embodiments, the sensor holder is configured to hold a sensor, wherein a radio sensitive side of the sensor has less than four, three, two, or one square inches of total surface area.
  • the apparatus includes a biting portion extending from the support.
  • the biting portion holds the sensor, sensor holder, and/or the fiducial markers in a static position relative to the object being imaged.
  • the fiducial marker includes a shape selected from the group consisting of a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc.
  • the fiducial marker comprises a radiopaque or a semi-radiopaque material.
  • the radiopaque or the semi-radiopaque material is selected from the group consisting of lead, steel, compounds of barium, barium sulfate, compounds of bismuth, plastic, and thermoplastic.
  • the support includes at least one, at least two, at least three, at least four, at least five, at least six, at least seven, at least eight, at least nine, or at least ten fiducial markers.
  • the apparatus includes a radiation source for emitting radiation.
  • the radiation source is an X-ray source.
  • the apparatus further comprises a sensor in the sensor holder.
  • a method of generating a three dimensional image by combining a plurality of two dimensional images includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, and combining the two dimensional images into a three dimensional image using the polar angle and azimuth angle of each two-dimensional image.
  • the method includes determining a position and radiodensity of each pixel on a two-dimensional image, mapping the densities of each pixel across a plurality of voxels, and creating the three-dimensional image including radiodensity information.
  • analyzing the shadow includes determining a length of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the length of the shadow. In some embodiments, analyzing the shadow includes determining an angle of the shadow, and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder from the angle of the shadow.
  • calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder includes determining a shadow displacement for each of the plurality of two dimensional images, wherein determining a shadow displacement of each two-dimensional image includes determining a position of a center of a shadow in a two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder, and comparing a position of a center of a shadow in each image to the center of the shadow in the two-dimensional image taken when the radiation source is orthogonal to the radiation sensor and/or radiation sensor holder.
  • a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor and/or sensor holder.
  • an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor and/or sensor holder.
  • the direction of the elliptical shadow indicates the azimuth angle of the radiation source relative to the sensor and/or sensor holder.
  • the length of the major axis of the elliptical shadow indicates a capture angle of the source relative to the sensor and/or sensor holder.
  • a first two-dimensional image of the plurality of two dimensional images has a first capture angle and a second two dimensional image of the plurality of two dimensional images has a second capture angle, wherein the first capture angle and second capture angle are different.
  • the method includes displaying the three dimensional image on a display.
  • the method of generating an image includes obtaining two-dimensional images, determining, for each two-dimensional image, a relative position of a sensor and/or sensor holder and a radiation source used to capture the respective two-dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the relative position of the sensor and/or sensor holder and the radiation source used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
  • the method includes normalizing each two-dimensional image by analyzing a radiodensity gradient created by an object captured in each two-dimensional image.
  • the object can be captured in one or more of the plurality of images.
  • normalizing each two-dimensional image includes adjusting a parameter in each two-dimensional image.
  • the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the two-dimensional images.
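The normalization parameters listed above (gamma, brightness, contrast) can be sketched as a single adjustment step. The formula below is one common convention, not the patent's specification, and the function name is illustrative:

```python
import numpy as np

def normalize_image(img, gamma=1.0, brightness=0.0, contrast=1.0):
    """Adjust gamma, brightness, and contrast of a radiograph.

    img is a 2D array of intensities in [0, 1]. The parameter names
    mirror the ones listed above, but this particular formula is only
    one common convention, not taken from the patent.
    """
    out = np.clip(img, 0.0, 1.0) ** gamma        # gamma correction
    out = contrast * (out - 0.5) + 0.5           # contrast about mid-gray
    out = out + brightness                       # brightness offset
    return np.clip(out, 0.0, 1.0)

# Example: brighten and flatten an underexposed radiograph.
img = np.array([[0.1, 0.2], [0.3, 0.4]])
norm = normalize_image(img, gamma=0.8, brightness=0.1, contrast=0.9)
```

With the default parameters the function is an identity on in-range inputs, which makes it easy to verify before tuning per-image.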
  • the relative position includes a polar angle between a plane of the sensor and/or sensor holder and a direction of a beam emitted from the radiation source.
  • generating the three-dimensional image includes overlapping the three dimensional volumes.
  • generating the three-dimensional image includes identifying empty voxels.
  • generating the three-dimensional image includes estimating an intensity value of a non-empty voxel.
  • estimating the intensity value of the non-empty voxel includes averaging an array of potential values for the non-empty voxel.
  • estimating the intensity value of the non-empty voxel includes selecting the highest value from an array of potential values for the non-empty voxel. In some embodiments, generating the three-dimensional image includes iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value among one or more related voxels. In some embodiments, generating the three-dimensional image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone.
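The averaging and highest-value strategies above can be sketched in a few lines; the function name and the empty-voxel convention (returning 0) are ours:

```python
import numpy as np

def estimate_voxel(candidates, mode="average"):
    """Combine an array of potential intensity values for one voxel.

    'average' and 'max' correspond to the two estimation strategies
    described above; the function name is illustrative only.
    """
    candidates = np.asarray(candidates, dtype=float)
    if candidates.size == 0:
        return 0.0  # empty voxel: no back-projected value touches it
    if mode == "average":
        return float(candidates.mean())
    if mode == "max":
        return float(candidates.max())
    raise ValueError(f"unknown mode: {mode}")

vals = [0.2, 0.6, 0.4]
```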
  • the method includes applying an anti-aliasing algorithm to the three-dimensional image. In some embodiments, the method includes displaying the three dimensional image on a display.
  • the method includes obtaining a plurality of two dimensional images, wherein each two dimensional image comprises a shadow from a fiducial marker, analyzing the shadow to determine shape characteristics of the shadow, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor and/or radiation sensor holder for each two dimensional image, creating, for each two-dimensional image, a three-dimensional volume by projecting each two-dimensional image in a direction of the polar angle and azimuth angle of the radiation source relative to the radiation sensor and/or radiation sensor holder used to capture the respective two-dimensional image, and generating a three-dimensional image by correlating three-dimensional volumes associated with the two-dimensional images.
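The projection step in the method above can be sketched as a naive unfiltered back-projection. The parallel-beam assumption, nearest-neighbor sampling, and grid size are simplifications of ours; the patent does not prescribe them:

```python
import numpy as np

def back_project(images, angles, size=32):
    """Accumulate 2D radiographs into a voxel grid.

    Each image is smeared through the volume along its capture
    direction (theta, phi); voxels then hold the mean of the values
    projected onto them. This is a minimal parallel-beam sketch, not
    the patent's reconstruction.
    """
    vol = np.zeros((size, size, size))
    count = np.zeros_like(vol)
    zs = np.arange(size) - size / 2.0
    ii, jj = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
    for img, (theta, phi) in zip(images, angles):
        # In-plane shift per unit depth for this capture direction.
        dx = np.tan(theta) * np.cos(phi)
        dy = np.tan(theta) * np.sin(phi)
        for k, z in enumerate(zs):
            # Shift the image footprint for each depth slice.
            u = np.clip(np.round(ii + dx * z).astype(int), 0, size - 1)
            v = np.clip(np.round(jj + dy * z).astype(int), 0, size - 1)
            vol[:, :, k] += img[u, v]
            count[:, :, k] += 1
    return vol / np.maximum(count, 1)

# One orthogonal exposure smears a bright pixel straight through the volume.
img = np.zeros((32, 32))
img[10, 10] = 1.0
vol = back_project([img], [(0.0, 0.0)])
```

With several exposures at different angles, voxels where the projected rays agree accumulate consistently high values, which is the correlation step the bullet above describes.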
  • a non-transitory computer-readable storage medium includes computer-readable instructions, which when executed by one or more processors, causes the one or more processors to perform the method of any one of the embodiments described herein.
  • Figure 1 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Figure 2A shows the difference in shadows created on a radiation sensor by a spherical fiducial marker, in accordance with an embodiment.
  • Figure 2B shows an overhead view of the different shadows created by changing the polar angle θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment.
  • Figure 2C shows the shadows created by two metal bars.
  • Figure 2D shows the shadow created by two metal bars when one obscures the other from the beam of radiation.
  • Figure 3 illustrates a method of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Figure 4A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 4B shows a front view of an exemplary device in accordance with an embodiment.
  • Figure 4C shows a side view of an exemplary device in accordance with an embodiment.
  • Figure 4D shows an isometric rear view of an exemplary device in accordance with an embodiment.
  • Figure 5A shows an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 5B shows a cut-away front view of an exemplary device in accordance with an embodiment.
  • Figure 6 shows two radiographs produced by a fixed radiation sensor when imaging the same metal bars from different capture angles.
  • methods and devices utilize fiducial markers to determine the incident angle of a radiation source for each of a plurality of 2D images.
  • the fiducial markers may provide reference points on the 2D images to allow the 2D images to be combined into a 3D image.
  • some of the embodiments may be agnostic to the system used to capture the 2D images. Such embodiments allow for the use of traditional 2D imaging technology, which reduces both the cost and space required to generate a 3D image. Further, current 2D imaging techniques provide an acceptable level of radiation; leveraging current 2D imaging techniques may allow a 3D image to be generated without exposing the subject to more radiation than needed to create a typical set of 2D images.
  • a fiducial marker can be understood to be an object placed in the field of view of an imaging system that provides a reference point on an image produced by the imaging system.
  • the fiducial marker appears as a shadow in the image produced, for use as a point of reference or a measure.
  • the position, size, and/or shape of the shadow created by a fiducial marker changes depending on the relative angle of the radiation source and the radiation sensor.
  • the fiducial markers allow for the relative position of one image to be correlated to the relative position of another image in 3D space.
  • the position of the radiation source can be defined relative to the radiation sensor using polar coordinates or spherical coordinates.
  • the spherical coordinates are defined as ρ, θ, and/or φ, where ρ is radial distance, θ is the polar angle measured from a fixed zenith orthogonal to a reference plane, e.g. a plane of the sensor, and φ is the azimuthal angle.
  • Figure 1 illustrates a method 100 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Method 100 includes obtaining a plurality of 2D images 101, wherein each 2D image includes a shadow from a fiducial marker.
  • obtaining a plurality of 2D images 101 may include inserting a device in accordance with any of the embodiments described herein into the mouth of a patient.
  • obtaining a plurality of 2D images 101 may include placing a device in accordance with any of the embodiments described herein next to an object to be imaged.
  • obtaining a plurality of 2D images 101 may include affixing a device in accordance with any of the embodiments described herein to an object to be imaged.
  • Method 100 also includes analyzing the shadow to determine shape characteristics of the shadow 102, calculating, from the shape characteristics, a polar angle and an azimuth angle of a radiation source relative to the radiation sensor for each 2D image 103, and combining the 2D images into a 3D image using the polar angle and azimuth angle of each 2D image 104.
  • the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.
  • FIG. 2A shows the difference in shadows created on a radiation sensor 201 by a spherical fiducial marker 202, in accordance with an embodiment.
  • a spherical fiducial marker 202 can create a circular shadow 204 when a radiation source 221 is positioned orthogonally to the radiation sensor 201.
  • 203 depicts the beam of radiation emitted by radiation source 221.
  • a radiation source 222 that is not orthogonal to the radiation sensor 201 can create an elliptical shadow 205.
  • 230 depicts the beam of radiation emitted by radiation source 222. This can occur because the plane of the radiation sensor 201 intersects the cone of the shadow in 3D space at an incline.
  • 206 depicts the distance between the center of the shadow created by radiation source 221 and the center of the shadow created by radiation source 222.
  • 207 depicts the distance between the sensor 201 and the fiducial marker 202.
  • FIG. 2B shows an overhead view of the different shadows created by changing the polar angle θ and the azimuthal angle φ of the radiation source relative to the radiation sensor, in accordance with an embodiment.
  • when θ is 0, the radiation source is orthogonal to the radiation sensor. This creates a circular shadow 208 on the radiation sensor, which in this view is obscured by the fiducial marker 223.
  • Shadows 209, 210, and 211 depict shadows created by fiducial markers 224, 225, and 226, respectively, using radiation sources with progressively increasing θ. As θ increases, the shadow can become more elliptical.
  • shadows 209, 210, and 211 have progressively longer major axes because they are produced by radiation sources with progressively increasing θ and constant φ relative to the plane of the radiation sensor.
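The relation between ellipse elongation and the polar angle can be made concrete under an idealized parallel-beam geometry, in which the minor axis stays equal to the sphere's diameter and the major axis is stretched by 1/cos θ. That simplification is ours; the text above only states that the major-axis length indicates the capture angle:

```python
import math

def polar_angle_from_axes(major, minor):
    """Estimate the polar angle theta (radians) from the axes of a
    sphere's elliptical shadow, assuming an idealized parallel beam:
    minor axis = sphere diameter, major axis = diameter / cos(theta).
    """
    return math.acos(min(minor / major, 1.0))

# A shadow twice as long as it is wide implies theta = 60 degrees.
theta = polar_angle_from_axes(major=2.0, minor=1.0)
```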
  • the azimuthal angle φ can be determined by the direction of the fiducial shadow. Shadows 212, 213, and 214 each represent a shadow created by fiducial markers 227, 228, and 229, respectively, using radiation sources with the same θ, but with different φ. Thus, in some embodiments, the angle of the major axis of an elliptical shadow relative to the frame of an image is determined by the azimuthal angle φ of the radiation source relative to the sensor.
  • the position of the center of the fiducial shadow can also be dependent on the spherical coordinates θ and φ of the radiation source relative to the radiation sensor.
  • a shadow created by a radiation source 221 orthogonal to the radiation sensor creates a circular shadow 204 directly below the fiducial marker.
  • the center of a shadow created by a radiation source with different θ and/or φ coordinates 222 creates a shadow with a center position displaced by a distance determined by θ and in a direction determined by φ.
  • the spherical coordinates of a radiation source relative to a radiation sensor used to create an image can be calculated by the shape, position, and size characteristics of fiducial shadows.
  • fiducial markers create shadows on each 2D image, wherein the shape, position, and/or size of the shadow are determined by the relative position of the radiation source and the radiation sensor (defined as the capture angle) when the image was captured.
  • some embodiments of the devices and methods herein allow for the generation of 3D images by determining the capture angle of each image from the image itself, and therefore without physically measuring the capture angle of the radiation source and the radiation sensor at the time each image is taken.
  • combining the 2D images into a 3D image 104 can include determining a position and radiodensity of each pixel on a 2D image, mapping the densities of each pixel across a plurality of voxels, and creating the 3D image.
  • Analyzing each image to identify and characterize the shadow created by the fiducial marker 102 can include manually tracing the outline of a shadow created by the fiducial markers. Analyzing each image to identify and characterize the shadow created by the fiducial marker 102 can include manually inputting the location of the shadows.
  • software determines the location of shadows created by the fiducial markers.
  • the software can locate the shadow by blob detection methods.
  • blob detection methods detect regions in a digital image that differ in properties, such as brightness or color, compared to surrounding regions.
  • the blob detection method can be a difference of Gaussians method.
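A difference-of-Gaussians blob detector, as named above, subtracts a wide blur from a narrow blur so that blob-sized regions stand out, then thresholds. The sigma, ratio, and threshold values below are illustrative, not from the patent:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_blobs(img, sigma=2.0, k=1.6, thresh=0.05):
    """Locate dark fiducial shadows with a difference-of-Gaussians
    filter. Shadows are darker than their surround, so the image is
    inverted before filtering; parameter values are illustrative.
    """
    inv = img.max() - img                 # shadows are dark: invert first
    dog = gaussian_filter(inv, sigma) - gaussian_filter(inv, k * sigma)
    return dog > thresh                   # boolean mask of blob-like regions

# Example: a bright radiograph with one dark circular shadow.
img = np.ones((64, 64))
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 16] = 0.0   # radius-4 shadow
mask = dog_blobs(img)
```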
  • edges of shadows created by fiducial markers can be determined by any of the edge detection methods known in the art. Exemplary methods include Canny edge detection, morphological thinning, a wavelet transform, or a combination of any of these methods.
  • shape characteristics for the shadow created by the fiducial marker can be determined by a Hough transform method. In some embodiments, the shape characteristics determined by a Hough transform method include, for example, shadow dimensions, including the lengths of the major and minor axes.
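The patent names a Hough transform but does not detail it; a common lightweight alternative for recovering major- and minor-axis lengths from a segmented shadow is the second-central-moments method sketched below (the choice of method and the function name are ours). For a filled ellipse, 4·sqrt of each eigenvalue of the pixel-coordinate covariance gives a full axis length:

```python
import numpy as np

def shadow_axes(mask):
    """Major- and minor-axis lengths of a shadow from a binary mask,
    via eigenvalues of the second central moments of its pixels.
    """
    ys, xs = np.nonzero(mask)
    cov = np.cov(np.vstack([xs, ys]))
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # largest first
    major, minor = 4.0 * np.sqrt(evals)
    return float(major), float(minor)

# A circular "shadow" of radius 10 should yield both axes near 20.
yy, xx = np.mgrid[:64, :64]
disk = (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2
major, minor = shadow_axes(disk)
```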
  • the method includes determining ρ, θ, and/or φ for the position of the radiation source relative to the sensor by analyzing the size, shape and/or position characteristics of shadows created by the fiducial markers.
  • analyzing the shadow 102 can include determining a length of the shadow and calculating the polar angle and azimuth angle of the radiation source relative to the radiation sensor from the length of the shadow.
  • Analyzing the shadow 102 can include determining an angle of the shadow and calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor from the angle of the shadow.
  • φ can be calculated from an angle of a major axis of an elliptical shadow relative to the frame of the image.
  • Analyzing the shadow 102 can include determining a shadow displacement for each of the plurality of 2D images, wherein determining a shadow displacement (e.g., shadow displacement 206 in Figure 2A) of each 2D image includes determining a position of a center of a shadow in a 2D image taken when the radiation source is orthogonal to the radiation sensor, and comparing a position of a center of a shadow in each image to the center of the shadow in the 2D image taken when the radiation source is orthogonal to the radiation sensor. Calculating the polar angle and azimuth angle 103 of the radiation source relative to the radiation sensor can include analyzing the shadow displacement.
  • the difference in major axis angles of shadows in two or more images created by the same fiducial marker can be used to calculate the relative difference in φ between the images (e.g. the angles of shadows 212, 213, and 214 in Figure 2B).
  • can be calculated by the length 206 of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in an image and the actual or expected position of a shadow from the same fiducial marker in an actual or theoretical image taken using a radiation source orthogonal to the radiation sensor.
  • the actual or theoretical positions are the x and y coordinates relative to the frame of the image sensor.
  • the relative difference in θ between two images can be calculated by the length of a line representing the shift or displacement of the position of the center of the elliptical shadow from a fiducial marker in a first image and the position of a shadow from the same fiducial marker in a second image.
  • the length 206 divided by the distance 207 between the fiducial marker and the plane of the sensor equals tan(θ).
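The displacement relation above can be sketched numerically: the shift length over the marker-to-sensor distance gives the tangent of the polar angle, and the shift direction gives the azimuth. The helper below is an illustrative assumption (names and units are not from the source).

```python
import math

def source_angles_from_shift(dx_mm, dy_mm, marker_height_mm):
    """Recover the polar angle (theta) and azimuth (phi) of the source
    from the shift (dx, dy) of a fiducial shadow's center relative to
    its position under orthogonal illumination, using the relation:
    shift length / marker-to-sensor distance = tan(theta)."""
    shift = math.hypot(dx_mm, dy_mm)
    theta = math.atan(shift / marker_height_mm)  # polar angle
    phi = math.atan2(dy_mm, dx_mm)               # azimuth from shift direction
    return math.degrees(theta), math.degrees(phi)

# Marker held 5 mm above the sensor plane, shadow center shifted 5 mm
# along +x: the source sits at theta = 45 degrees, phi = 0.
theta_deg, phi_deg = source_angles_from_shift(5.0, 0.0, 5.0)
```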
  • can be calculated from a combination of any or all of the methods described herein.
  • calculating φ of the radiation source relative to the radiation sensor includes calculating the angle of a line 206 drawn between the position of the center of the elliptical shadow from a fiducial marker in a first image with a first φ and the expected position of the center of a shadow from the same fiducial marker in an actual or theoretical second image wherein the radiation source is orthogonal to the sensor.
  • p can be calculated by the length of the minor axis of an elliptical shadow created by a spherical fiducial marker.
  • a radiation source positioned at a larger p will create an ellipse with a smaller minor axis than when radiating the same fiducial marker from a position with a smaller p relative to the radiation sensor.
  • converting shadow information into p, θ, and/or φ information involves a linear regression calculation using the shape characteristics.
  • precision and/or accuracy can be improved by analyzing shape characteristics for a plurality of shadows created by a plurality of fiducial markers in each of one or more images.
  • analyzing the shape characteristics for a plurality of shadows includes a regression analysis.
  • a circular shadow created by a spherical fiducial marker indicates the radiation source is orthogonal to the sensor (e.g. shadow 204 in Figure 2A).
  • an elliptical shadow created by a spherical fiducial marker indicates the radiation source is not orthogonal to the sensor (e.g. shadow 205 in Figure 2A).
  • the direction of the elliptical shadow 205 indicates the azimuth angle of the radiation source relative to the sensor.
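The circular-versus-elliptical observation above can be made concrete. The sketch below assumes near-parallel rays, under which the shadow of a spherical marker elongates along the tilt direction so that minor/major ≈ cos(θ); a true cone beam would need a correction, and the function name is illustrative.

```python
import math

def polar_angle_from_ellipse(major, minor):
    """Estimate the polar angle theta from the axes of the elliptical
    shadow cast by a spherical fiducial marker, assuming near-parallel
    rays: minor/major ~= cos(theta), so a circle implies theta = 0."""
    return math.degrees(math.acos(minor / major))

# A circular shadow means the source is orthogonal to the sensor:
t0 = polar_angle_from_ellipse(2.0, 2.0)
# A major axis twice the minor axis implies a strongly oblique source:
t60 = polar_angle_from_ellipse(4.0, 2.0)
```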
  • a first 2D image of the plurality of 2D images has a first capture angle
  • a second 2D image of the plurality of 2D images has a second capture angle
  • the plurality of images are captured from different angles using a radiation source rotated about an axis of rotation.
  • some or all of the images are captured with the radiation source having a different set of spherical coordinates relative to a reference plane, wherein the spherical coordinates are defined as p, θ, and/or φ, or a different combination of p, θ, and/or φ.
  • the reference plane is the plane of the sensor.
  • Figure 3 illustrates a method 300 of generating a 3D image by combining a plurality of 2D images, in accordance with an embodiment.
  • Method 300 includes obtaining a plurality of 2D images 301, determining a relative position of a sensor and a radiation source used to capture each 2D image 302, creating a 3D volume for each 2D image by projecting each 2D image in a direction of a relative position of the sensor and the radiation source used to capture the 2D image 303, and generating a 3D image by correlating the volumes associated with the 2D images 304.
  • the relative position of a fiducial marker and the radiation sensor and/or radiation sensor holder is maintained between different 2D images.
  • the method 300 can include normalizing each 2D image by analyzing a radiodensity gradient created by an object captured in each 2D image.
  • the radiodensity gradient includes radiodensities equivalent to the radiodensity of features.
  • the radiodensity gradient is created by a plurality of fiducial markers with varying radiodensities.
  • the radiodensity gradient is created by fiducial markers with radiodensities equivalent to the radiodensity of one or more features.
  • the radiodensity gradient and/or fiducial markers with varying radiodensities help identify features in a 2D or 3D image.
  • normalizing each 2D image includes adjusting a parameter in each 2D image.
  • the parameter includes at least one selected from the group consisting of a gamma correction, a brightness, and a contrast of each of the 2D images. These parameters can vary between images due to differences in the distance between the radiation source and the radiation sensor, the amount of radiation generated by the radiation source, the focus of the beam of radiation, and variability between different radiation sources and sensors. In some embodiments, these adjustments are used to normalize each image so that the range of values on each image representing features are approximately consistent.
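A minimal sketch of this normalization, assuming each image contains two reference radiodensities (for example, readings from fiducial markers of known radiodensity) that can be mapped to a common range; the function name, parameters, and values are illustrative.

```python
import numpy as np

def normalize_image(image, low_ref, high_ref, gamma=1.0):
    """Rescale an image so that two reference radiodensities map to
    0 and 1 (a brightness/contrast adjustment), then apply an optional
    gamma correction, so feature values are comparable across exposures."""
    scaled = (image.astype(float) - low_ref) / (high_ref - low_ref)
    return np.clip(scaled, 0.0, 1.0) ** gamma

# Two exposures of the same scene captured with different brightness
# and contrast normalize to the same feature values:
a = np.array([[10.0, 60.0, 110.0]])
b = np.array([[20.0, 120.0, 220.0]])
na = normalize_image(a, low_ref=10.0, high_ref=110.0)
nb = normalize_image(b, low_ref=20.0, high_ref=220.0)
```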
  • Example anatomical features in dental applications include caries, dentin, carious dentin, enamel, carious enamel, cementum, carious cementum, bone, gum, and other tissues.
  • a radiation sensor used to obtain a plurality of images 301 will be exposed non-uniformly if the radiation sensor is not orthogonal to the radiation source in at least one of the plurality of images.
  • the radiation source is a cone beam source or emits radiation in the shape of a cone.
  • the radiation source emits parallel rays of radiation.
  • the radiation source includes a collimator.
  • a side of the radiation sensor further away from the radiation source will be exposed by less radiation than a side closer to the radiation source.
  • normalizing the 2D images includes adjusting each pixel for the distance to the radiation source.
  • differences in radiation exposure of each pixel caused by the differences in distance to the source can be used to determine p, θ, and/or φ of the radiation source relative to the radiation sensor 302. In some embodiments, differences in radiation exposure caused by distance to the radiation source can be used to normalize each 2D image.
  • the relative position includes a polar angle between a plane of the sensor and a direction of a beam emitted from the radiation source.
  • Step 303 can include creating, for each 2D image, a 3D volume by projecting each 2D image in a direction of the relative position of the sensor and the radiation source used to capture the respective 2D image.
  • each 2D image is projected from a boundary.
  • the projection creates a 3D space for each image where each pixel maps to a line of voxels, and each voxel along the line of voxels is assigned the same intensity value as the corresponding pixel for the image.
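The pixel-to-voxel-line projection can be sketched for the simplest, orthogonal case; an oblique capture would shear this box into a parallelepiped along the source direction, as the surrounding text describes. The function name and array values are illustrative assumptions.

```python
import numpy as np

def project_to_volume(image, depth):
    """Extrude a 2D image into a 3D volume for an orthogonal capture:
    each pixel maps to a line of voxels along z, and every voxel on
    that line is assigned the pixel's intensity value."""
    volume = np.empty((depth,) + image.shape, dtype=float)
    volume[:] = image  # broadcast the image down the z axis
    return volume

img = np.array([[0.2, 0.8],
                [0.5, 1.0]])
vol = project_to_volume(img, depth=3)  # shape (3, 2, 2)
```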
  • an intensity value represents the amount of radiation that reaches the sensor.
  • an intensity value has an inverse relationship to a radiopacity value, which can represent the amount of radiation obstructed from reaching the sensor.
  • the shape of each projection can be determined by a boundary and a direction of the projection.
  • an image taken using a radiation source orthogonal to the sensor generates a rectangular 3D space.
  • an image taken using a radiation source at any other angle generates a parallelepiped shaped 3D space, wherein one wall of the 3D space represents the boundary, and the parallelepiped extends from the sensor in a direction toward a point having θ and φ calculated for that particular image.
  • each image is projected from a rear boundary of the volumetric model.
  • the rear boundary corresponds to the position of the radiation sensor.
  • the position of the radiation sensor does not change within the model for each projection.
  • the position of the radiation sensor may be held constant while the plurality of images are captured.
  • the position of the radiation source used to capture the 2D images can move along spherical coordinates defined as p, θ, and/or φ, where p is radial distance, θ is the polar angle, and φ is the azimuthal angle.
  • each 2D image includes a series of pixels arranged in 2D space along x and y coordinates.
  • each pixel has an intensity value corresponding to the combined radiopacity of all radiopaque material between the radiation source and the radiation sensor.
  • Generating the 3D image 304 can include overlapping the 3D volumes.
  • the method includes generating a 3D image by correlating 3D volumes associated with the 2D images.
  • an image may comprise lines of voxels that intersect lines of voxels from images with different θ and φ values.
  • each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines that intersect within each voxel's boundaries.
  • Generating the 3D image 304 can include identifying empty voxels.
  • outlines are created around empty volumetric regions of the 3D image.
  • areas of an image, including, for example, a 2D image, that appear black or dark may not have a radiopaque object obstructing the radiation directed to that part of the sensor. In some embodiments, this indicates a clear path or relatively clear path between the source and the sensor.
  • any voxels that overlap a black line in a 2D image may be empty space regardless of the other intersecting line values corresponding to other 2D images.
  • the radiopacity or intensity values of those other intersecting lines can be attributed to the other voxels.
  • Generating the 3D image 304 can include estimating an intensity value of a nonempty voxel.
  • estimating an intensity value of a non-empty voxel includes selecting the highest intensity value from the array of intensity values for that voxel.
  • each pixel of a 2D image has an intensity value determined by the amount of radiation that the sensor detects.
  • each pixel in a 2D image represents the entire radiation that passes through the radiopaque material along a path in 3D space. The path follows a line from the radiation source through the radiopaque material to the sensor.
  • each voxel within the 3D image can have an array of intensity values corresponding to the intensity values of the lines projected from pixels in each of the plurality of 2D images that intersect within each voxel's boundaries.
  • darker areas of an image indicate higher intensity values, which can indicate less radiopaque material along the path of radiation that extends from the sensor to a point having θ and φ calculated for that particular image. Therefore, in some embodiments, the method includes selecting the highest intensity value, which can represent the clearest path between the radiation source and the sensor.
  • the radiopacity associated with unselected values in the array of values assigned to a voxel may belong to other voxels along other paths of radiation with different θ and φ values that intersect that voxel. Therefore, in some embodiments, the smallest radiopacity value in the array of values is selected to represent that voxel, which represents the highest amount of radiation exposure.
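With the back-projected volumes stacked, the highest-intensity selection described above reduces to an element-wise maximum (equivalently, picking the smallest radiopacity). The small arrays below are illustrative only.

```python
import numpy as np

# Two back-projected volumes contribute one intensity value per voxel;
# selecting the highest intensity (the least-obstructed path, i.e. the
# smallest radiopacity) is an element-wise maximum over the stack.
vol_a = np.array([[0.1, 0.9],
                  [0.4, 0.2]])
vol_b = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
estimate = np.maximum(vol_a, vol_b)
```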
  • Generating the 3D image 304 can include a Radon transform and/or a Penrose transform.
  • outlines are created around volumetric regions determined to have higher intensity values relative to other volumetric regions. In some embodiments, this process is repeated with progressively smaller intensity values.
  • estimating the intensity value of the non-empty voxel includes averaging the array of potential values for each voxel.
  • Generating the 3D image 304 can include iteratively adjusting the intensity value of the non-empty voxel to distribute a total intensity value of the overlapping voxels.
  • a voxel relates to another voxel if a path of radiation intersects both voxels before reaching the same pixel on the radiation sensor in the same 2D image.
  • each voxel can relate to a different set of voxels for each image.
  • each value for each voxel can be cross-checked against all of the other voxels it relates to in order to ensure that radiopacity attributed to a voxel is consistent with each overlapping pixel.
  • each voxel's intensity value can be iteratively crosschecked against each of the paths of voxels determined to have less radiopaque material than other paths in order to distribute radiopacity to other voxels that intersect with other paths determined to have relatively more radiopaque material.
  • each voxel's intensity value can be iteratively crosschecked against one or more fiducial markers with varying radiopacity.
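The iterative cross-checking described above resembles algebraic reconstruction. The sketch below uses a SIRT-style update, an assumed stand-in for the patent's procedure rather than its literal method: rows of A mark which voxels each radiation path crosses, b holds the measured path sums, and each iteration spreads a path's residual across the voxels it intersects.

```python
import numpy as np

def redistribute(x, A, b, n_iters=200, step=0.1):
    """SIRT-style iterative correction: nudge voxel radiopacities x so
    that every path sum A @ x approaches its measured value b, spreading
    each path's residual over the voxels that the path crosses."""
    for _ in range(n_iters):
        residual = b - A @ x
        x = np.clip(x + step * (A.T @ residual), 0.0, None)  # no negative radiopacity
    return x

# Two voxels crossed by three paths: two paths see one voxel each, the
# third crosses both. Measurements are consistent with voxels (2, 1).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 1.0, 3.0])
x = redistribute(np.zeros(2), A, b)
```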
  • generating a 3D image includes identifying whether the non-empty voxel includes one selected from the group consisting of dentin, enamel, cavity, gum, and bone. Additionally, in some embodiments, once a voxel is identified as dentin, enamel, or other feature, the value for that voxel can be set and remaining radiopacity redistributed among other voxels in lines associated with the identified voxel.
  • it may be desirable to create the 3D image using the region of the projection of a 2D image that overlaps with a region of at least one other projection.
  • the 3D image can be corrected for radiopacity that only appears in a single projection.
  • the intensity values attributable to areas that only appear in one projection can be removed and the remaining voxel values adjusted
  • the relative position is a position calculated as p, θ, and/or φ by any of the methods contained herein.
  • the 3D image is created by using the relative difference in p, θ, and/or φ between two or more images.
  • generating the 3D image 304 can include identifying whether the non-empty voxel includes one selected from the group consisting of: dentin, enamel, cavity, gum, and bone.
  • Generating the 3D image 304 can include applying an anti-aliasing algorithm to the 3D image.
  • a 3D contour is created from the 3D image.
  • Figure 4A represents an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 4B represents a front view of an exemplary device in accordance with an embodiment.
  • Figure 4C represents a side view of an exemplary device in accordance with an embodiment.
  • Figure 4D represents an isometric rear view of an exemplary device in accordance with an embodiment.
  • An exemplary embodiment includes a plurality of supports for a sensor 402, and one or more fiducial markers 403.
  • the fiducial markers are held in a known and/or fixed position and/or distance relative to the supports for the sensor 402.
  • the supports can be any size.
  • the supports have a width of between 0.1 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • the supports can be configured to hold a sensor of any size.
  • the plurality of supports for a sensor are configured to hold a sensor with a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • Some embodiments comprise a bite plate 404.
  • the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches.
  • a sensor can be coupled to the plurality of supports 402 in some embodiments.
  • Figure 5 illustrates an exemplary device in accordance with an embodiment.
  • Figure 5A represents an isometric view of the front of an exemplary device in accordance with an embodiment.
  • Figure 5B represents a cut-away front view of an exemplary device in accordance with an embodiment.
  • An exemplary embodiment includes a radiation sensor 501, and one or more fiducial markers 505.
  • the one or more fiducial markers are contained within supports 502 coupled to the radiation sensor.
  • the sensor can be any size.
  • the sensor can have a height and/or width of a radio-sensitive side between 0.25 inches and 4 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, and 4 inches.
  • the fiducial markers are held in a known and/or fixed position and/or distance relative to the sensor 501.
  • Some embodiments comprise a bite plate 504.
  • the bite plate can be any length between 0.1 and 6 inches, including, for example, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3, 4, 5, or 6 inches.
  • the sensor can be any type of radiation sensor.
  • the radiation sensor is an X-ray sensor adapted for use in capturing dental images.
  • the radiation sensor is a digital sensor.
  • the position of at least one of the one or more fiducial markers are fixed relative to the position of the sensor.
  • the fiducial marker obstructs or partially obstructs radiation emitted from the radiation source from reaching the sensor, thereby creating a shadow in the resulting image.
  • the one or more fiducial markers comprise 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, or more fiducial markers.
  • the one or more fiducial markers 405 are each aligned at a set distance and/or position from the sensor and from the other fiducial markers so that a location of each fiducial marker is known with respect to the sensor and the other fiducial markers.
  • the distance between one or more of the fiducial markers and the sensor is less than 10 mm, e.g.
  • one or more of the fiducial markers are in contact with the sensor, i.e. the one or more fiducial markers are immediately adjacent to the sensor.
  • the sensor includes one or more fiducial markers.
  • the one or more fiducial markers have a known size.
  • the size of the fiducial markers can be any size that creates a shadow detectable by the radiation sensor.
  • the size of the fiducial marker is the same size or between 1 and 100 times (e.g. about 1, 2, 3, 4, 5, 6, 7, 8, 10, 15, 20, 25, 30, 40, 50, 60, 70, 80, 90, or 100 times) the size of the resolution of a pixel in the radiation sensor.
  • the size is between 0.01 and 1 cubic centimeters in volume.
  • the fiducial marker has a diameter between about 0.1 mm and about 5.0 mm long (e.g.
  • the shape of the fiducial markers can be any radiopaque mass that creates a shadow.
  • the shape of the one or more fiducial markers can be a sphere, a cylinder, a cross, a cube, a pyramid, a hexahedron, or a disc.
  • the one or more fiducial markers comprise a plurality of shapes.
  • the fiducial marker is radiopaque. In some embodiments, the fiducial marker is partially radiopaque.
  • the position of the sensor relative to the object being imaged is held constant or relatively constant.
  • the location of objects in 3D space captured in 2D images can be determined by using the "buccal object rule," also known as the SLOB rule (Same Lingual; Opposite Buccal), demonstrated in Figure 6.
  • whether a first object is in front of or behind a second object from the perspective of the radiation source can be determined by analyzing two images captured with a radiation source with different θ and/or φ angles. If an object moves in the same direction as the source of the x-ray beam, it is behind (lingual to) the other object. If the object moves in the opposite direction of the source, it is in front of (buccal to) the other object.
  • Figure 6 depicts two different resulting radiographs (605 and 607) produced by a radiation sensor 601 resulting from the imaging the same metal bars (602 and 603) using radiation sources from two different angles (604 and 606).
  • the radiation source 604 is orthogonal to the radiation sensor, and shadows from the metal bars 602 and 603 appear in the radiograph in almost the same relationship that they share in reality.
  • shadows from the metal bars 602 and 603 appear on the film in a distorted relationship in the resulting radiograph 607.
  • the object closer to the radiation source 603 will create a shadow at a greater distance than the object farther from the radiation source 602, and the relative shift of the shadow created by the object closer to the radiation source 603 will be greater than the relative shift of the shadow created by the object 602 closer to the radiation sensor.
  • the amount of displacement of a shadow created by an object can be correlated to the distance between the object and the sensor by comparing the displacement to the displacement of an object with a known distance to the sensor.
  • the object with a known distance to the sensor is a fiducial marker.
  • this comparison is a linear regression.
  • the relative size and direction an object shifts from one image to another relative to other objects captured in the same image can place the object in 3D space.
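A minimal numeric sketch of the parallax comparison above, assuming (consistent with the linear-regression comparison the text mentions) that shadow displacement between two exposures scales linearly with an object's distance from the sensor plane; the function name and values are illustrative.

```python
def depth_from_parallax(object_shift_mm, marker_shift_mm, marker_depth_mm):
    """Calibrate shadow displacement against a fiducial marker at a
    known distance from the sensor, assuming displacement scales
    linearly with distance from the sensor plane. The sign of the shift
    (with or against the marker's shift) reflects the SLOB rule."""
    return marker_depth_mm * (object_shift_mm / marker_shift_mm)

# A marker 5 mm from the sensor shifts 2 mm between two exposures.
# An object whose shadow shifts 4 mm the same way lies about 10 mm out;
# a shift in the opposite direction places the object on the other side.
d_same = depth_from_parallax(4.0, 2.0, 5.0)
d_opposite = depth_from_parallax(-2.0, 2.0, 5.0)
```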
  • computer program product may be used generally to refer to media such as memory storage devices or storage units. These, and other forms of computer-readable media, may be involved in storing one or more instructions for use by a processor to cause the processor to perform specified operations. Such instructions, generally referred to as "computer program code" (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system to perform the specified operations.
  • computer readable storage may be a non-transitory medium.


Abstract

Devices and methods are disclosed for generating a three-dimensional image by combining a plurality of two-dimensional images. An exemplary apparatus used to capture radiation comprises a support, a radiation sensor holder coupled to the support, and a fiducial marker supported by the support at a determined distance relative to the sensor holder. The apparatus can also comprise a radiation sensor. The methods for generating a three-dimensional image comprise combining a plurality of two-dimensional images, each two-dimensional image comprising a shadow from a fiducial marker.
PCT/US2015/042296 2014-07-28 2015-07-27 Procédé et appareil de production d'une image tridimensionnelle WO2016018825A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462029843P 2014-07-28 2014-07-28
US62/029,843 2014-07-28
US201562183104P 2015-06-22 2015-06-22
US62/183,104 2015-06-22

Publications (1)

Publication Number Publication Date
WO2016018825A1 true WO2016018825A1 (fr) 2016-02-04

Family

ID=55218220

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/042296 WO2016018825A1 (fr) 2014-07-28 2015-07-27 Procédé et appareil de production d'une image tridimensionnelle

Country Status (1)

Country Link
WO (1) WO2016018825A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190231285A1 (en) * 2018-01-26 2019-08-01 Palodex Group Oy Portable bite part for correcting a motion of an object in panoramic, computed tomography, or cephalometric x-ray imaging
US20190231284A1 (en) * 2018-01-26 2019-08-01 Palodex Group Oy Portable bite part for determining an imaging area of a patient in panoramic, computed tomography, or cephalometric x-ray imaging

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020130274A1 (en) * 2001-03-15 2002-09-19 International Busines Machines Corporation Spatial phase locking with shaped electron beam lithography
US20040264648A1 (en) * 2003-06-25 2004-12-30 General Electric Company Method, apparatus, and medium for calibration of tomosynthesis system geometry using fiducial markers with non-determined position
US7014361B1 (en) * 2005-05-11 2006-03-21 Moshe Ein-Gal Adaptive rotator for gantry
US20090274272A1 (en) * 2005-11-09 2009-11-05 Dexela Limited Methods and apparatus for obtaining low-dose imaging
US20110255661A1 (en) * 2010-04-20 2011-10-20 Hans Schweizer Imaging fluoroscopy method and system using a navigation system marker device



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15826706

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15826706

Country of ref document: EP

Kind code of ref document: A1