WO2023115516A1 - Imaging systems and methods of operation - Google Patents

Imaging systems and methods of operation

Info

Publication number
WO2023115516A1
Authority
WO
WIPO (PCT)
Prior art keywords
image sensor
partial images
radiation
image
circular orbit
Prior art date
Application number
PCT/CN2021/141110
Other languages
French (fr)
Inventor
Peiyan CAO
Yurun LIU
Original Assignee
Shenzhen Xpectvision Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xpectvision Technology Co., Ltd. filed Critical Shenzhen Xpectvision Technology Co., Ltd.
Priority to PCT/CN2021/141110 priority Critical patent/WO2023115516A1/en
Priority to TW111143634A priority patent/TW202326175A/en
Publication of WO2023115516A1 publication Critical patent/WO2023115516A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/003: Reconstruction from projections, e.g. tomography
    • G06T11/006: Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/282: Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems

Definitions

  • a radiation detector is a device that measures a property of a radiation. Examples of the property may include a spatial distribution of the intensity, phase, and polarization of the radiation.
  • the radiation measured by the radiation detector may be a radiation that has transmitted through an object.
  • the radiation measured by the radiation detector may be electromagnetic radiation such as infrared light, visible light, ultraviolet light, X-ray, or γ-ray.
  • the radiation may be of other types such as α-rays and β-rays.
  • An imaging system may include one or more image sensors each of which may have one or more radiation detectors.
  • the image sensor captures the Ni partial images from Ni different image capturing positions on the circular orbit (i) .
  • the method further comprises reconstructing a 3D (3-dimensional) image of the object based on the multiple partial images.
  • said reconstructing the 3D image of the object comprises: for each value of i, reconstructing a partial 3D image of the object based on the Ni partial images; and combining the resulting M partial 3D images resulting in the 3D image of the object.
  • each point of the object is in at least a partial image of the multiple partial images.
  • the image sensor captures the multiple partial images one by one.
  • the image sensor captures all the Ni partial images without leaving the circular orbit (i) .
  • said capturing with the image sensor the multiple partial images comprises rotating the image sensor about the axis.
  • said capturing with the image sensor the multiple partial images further comprises translating the image sensor with respect to the object along a direction parallel to the axis.
  • the imaging system comprises a radiation source configured to send radiation toward the object and toward the image sensor, wherein in capturing the multiple partial images of the object, the image sensor uses radiation of the radiation from the radiation source that has transmitted through the object, and wherein said capturing the multiple partial images comprises rotating the radiation source and the image sensor about the axis while the radiation source and the image sensor remain stationary with respect to each other.
  • the radiation sent by the radiation source comprises X-rays.
  • the radiation sent by the radiation source comprises radiation pulses, and wherein radiation of each radiation pulse of the radiation pulses that has transmitted through the object is used by the image sensor for capturing a partial image of the multiple partial images.
  • the image sensor comprises P active areas, wherein each active area of the P active areas comprises multiple sensing elements, wherein the P active areas are arranged in Q active area rows, wherein each active area row of the Q active area rows comprises multiple active areas of the P active areas, wherein for each active area row of the Q active area rows, a straight line perpendicular to the axis intersects all active areas of said each active area row, and wherein P and Q are integers greater than 1.
  • any two adjacent active areas of any active area row of the Q active area rows overlap each other with respect to a direction which is perpendicular to a best-fit plane that intersects all sensing elements of the image sensor.
  • the image sensor further comprises a row gap between any two adjacent active area rows of the Q active area rows, and wherein said row gap is along a direction perpendicular to the axis.
  • Fig. 1 schematically shows a radiation detector, according to an embodiment.
  • Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector, according to an embodiment.
  • Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector, according to an embodiment.
  • Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector, according to an alternative embodiment.
  • Fig. 5 schematically shows a top view of a radiation detector package including the radiation detector and a printed circuit board (PCB) , according to an embodiment.
  • Fig. 6 schematically shows a cross-sectional view of an image sensor including the packages of Fig. 5 mounted to a system PCB (printed circuit board) , according to an embodiment.
  • Fig. 7A – Fig. 8D schematically show perspective views of an imaging system in operation, according to an embodiment.
  • Fig. 9 shows a flowchart generalizing the operation of the imaging system, according to an embodiment.
  • Fig. 10 schematically shows a top view of the image sensor of Fig. 6, according to an embodiment.
  • Fig. 11 shows a cross-sectional view of the image sensor of Fig. 10, according to an alternative embodiment.
  • Fig. 1 schematically shows a radiation detector 100, as an example.
  • the radiation detector 100 may include an array of pixels 150 (also referred to as sensing elements 150) .
  • the array may be a rectangular array (as shown in Fig. 1) , a honeycomb array, a hexagonal array, or any other suitable array.
  • the array of pixels 150 in the example of Fig. 1 has 4 rows and 7 columns; however, in general, the array of pixels 150 may have any number of rows and any number of columns.
  • Each pixel 150 may be configured to detect radiation from a radiation source (not shown) incident thereon and may be configured to measure a characteristic (e.g., the energy of the particles, the wavelength, and the frequency) of the radiation.
  • a radiation may include particles such as photons and subatomic particles.
  • Each pixel 150 may be configured to count numbers of particles of radiation incident thereon whose energy falls in a plurality of bins of energy, within a period of time. All the pixels 150 may be configured to count the numbers of particles of radiation incident thereon within a plurality of bins of energy within the same period of time. When the incident particles of radiation have similar energy, the pixels 150 may be simply configured to count numbers of particles of radiation incident thereon within a period of time, without measuring the energy of the individual particles of radiation.
  • Each pixel 150 may have its own analog-to-digital converter (ADC) configured to digitize an analog signal representing the energy of an incident particle of radiation into a digital signal, or to digitize an analog signal representing the total energy of a plurality of incident particles of radiation into a digital signal.
  • the pixels 150 may be configured to operate in parallel. For example, when one pixel 150 measures an incident particle of radiation, another pixel 150 may be waiting for a particle of radiation to arrive. The pixels 150 may not have to be individually addressable.
  • the radiation detector 100 described here may have applications such as in an X-ray telescope, X-ray mammography, industrial X-ray defect detection, X-ray microscopy or microradiography, X-ray casting inspection, X-ray non-destructive testing, X-ray weld inspection, X-ray digital subtraction angiography, etc. It may be suitable to use this radiation detector 100 in place of a photographic plate, a photographic film, a PSP plate, an X-ray image intensifier, a scintillator, or another semiconductor X-ray detector.
  • Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector 100 of Fig. 1 along a line 2-2, according to an embodiment.
  • the radiation detector 100 may include a radiation absorption layer 110 and an electronics layer 120 (which may include one or more ASICs or application-specific integrated circuits) for processing or analyzing electrical signals which incident radiation generates in the radiation absorption layer 110.
  • the radiation detector 100 may or may not include a scintillator (not shown) .
  • the radiation absorption layer 110 may include a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof.
  • the semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
  • the radiation absorption layer 110 may include one or more diodes (e.g., p-i-n or p-n) formed by a first doped region 111 and one or more discrete regions 114 of a second doped region 113.
  • the second doped region 113 may be separated from the first doped region 111 by an optional intrinsic region 112.
  • the discrete regions 114 may be separated from one another by the first doped region 111 or the intrinsic region 112.
  • the first doped region 111 and the second doped region 113 may have opposite types of doping (e.g., region 111 is p-type and region 113 is n-type, or region 111 is n-type and region 113 is p-type) .
  • each of the discrete regions 114 of the second doped region 113 forms a diode with the first doped region 111 and the optional intrinsic region 112.
  • the radiation absorption layer 110 has a plurality of diodes (more specifically, 7 diodes corresponding to 7 pixels 150 of one row in the array of Fig. 1, of which only 2 pixels 150 are labeled in Fig. 3 for simplicity) .
  • the plurality of diodes may have an electrical contact 119A as a shared (common) electrode.
  • the first doped region 111 may also have discrete portions.
  • the electronics layer 120 may include an electronic system 121 suitable for processing or interpreting signals generated by the radiation incident on the radiation absorption layer 110.
  • the electronic system 121 may include analog circuitry such as a filter network, amplifiers, integrators, and comparators, or digital circuitry such as a microprocessor and memory.
  • the electronic system 121 may include one or more ADCs (analog to digital converters) .
  • the electronic system 121 may include components shared by the pixels 150 or components dedicated to a single pixel 150.
  • the electronic system 121 may include an amplifier dedicated to each pixel 150 and a microprocessor shared among all the pixels 150.
  • the electronic system 121 may be electrically connected to the pixels 150 by vias 131. Space among the vias may be filled with a filler material 130, which may increase the mechanical stability of the connection of the electronics layer 120 to the radiation absorption layer 110. Other bonding techniques are possible to connect the electronic system 121 to the pixels 150 without using the vias 131.
  • When the radiation hits the radiation absorption layer 110 including diodes, particles of the radiation may be absorbed and generate one or more charge carriers (e.g., electrons, holes) by a number of mechanisms.
  • the charge carriers may drift to the electrodes of one of the diodes under an electric field.
  • the electric field may be an external electric field.
  • the electrical contact 119B may include discrete portions each of which is in electrical contact with the discrete regions 114.
  • the term “electrical contact” may be used interchangeably with the word “electrode”.
  • the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete regions 114 (“not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow to a different one of the discrete regions 114 than the rest of the charge carriers).
  • Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete regions 114 are not substantially shared with another of these discrete regions 114.
  • a pixel 150 associated with a discrete region 114 may be an area around the discrete region 114 in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99% of) charge carriers generated by a particle of the radiation incident therein flow to the discrete region 114. Namely, less than 2%, less than 1%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel 150.
  • Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, according to an alternative embodiment.
  • the radiation absorption layer 110 may include a resistor of a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof, but does not include a diode.
  • the semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
  • the electronics layer 120 of Fig. 4 is similar to the electronics layer 120 of Fig. 3 in terms of structure and function.
  • When the radiation hits the radiation absorption layer 110 including the resistor but not diodes, it may be absorbed and generate one or more charge carriers by a number of mechanisms.
  • a particle of the radiation may generate 10 to 100,000 charge carriers.
  • the charge carriers may drift to the electrical contacts 119A and 119B under an electric field.
  • the electric field may be an external electric field.
  • the electrical contact 119B may include discrete portions.
  • the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete portions of the electrical contact 119B (“not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow to a different one of the discrete portions than the rest of the charge carriers).
  • a pixel 150 associated with a discrete portion of the electrical contact 119B may be an area around the discrete portion in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99% of) charge carriers generated by a particle of the radiation incident therein flow to the discrete portion of the electrical contact 119B. Namely, less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel associated with the one discrete portion of the electrical contact 119B.
  • Fig. 5 schematically shows a top view of a radiation detector package 500 including the radiation detector 100 and a printed circuit board (PCB) 510.
  • PCB printed circuit board
  • the term “PCB” as used herein is not limited to a particular material.
  • a PCB may include a semiconductor.
  • the radiation detector 100 may be mounted to the PCB 510.
  • the wiring between the radiation detector 100 and the PCB 510 is not shown for the sake of clarity.
  • the package 500 may have one or more radiation detectors 100.
  • the PCB 510 may include an input/output (I/O) area 512 not covered by the radiation detector 100 (e.g., for accommodating bonding wires 514) .
  • the radiation detector 100 may have an active area 190 which is where the pixels 150 (Fig. 1) are located.
  • the radiation detector 100 may have a perimeter zone 195 near the edges of the radiation detector 100.
  • the perimeter zone 195 has no pixels 150, and the radiation detector 100 does not detect particles of radiation incident on the perimeter zone 195.
  • Fig. 6 schematically shows a cross-sectional view of an image sensor 600, according to an embodiment.
  • the image sensor 600 may include one or more radiation detector packages 500 of Fig. 5 mounted to a system PCB 650.
  • the electrical connection between the PCBs 510 and the system PCB 650 may be made by bonding wires 514.
  • the PCB 510 may have the I/O area 512 not covered by the radiation detectors 100.
  • the packages 500 may have gaps in between. The gaps may be approximately 1 mm or more.
  • a dead zone of a radiation detector (e.g., the radiation detector 100) is the area of the radiation-receiving surface of the radiation detector, on which incident particles of radiation cannot be detected by the radiation detector.
  • a dead zone of a package (e.g., package 500) is the area of the radiation-receiving surface of the package, on which incident particles of radiation cannot be detected by the radiation detector or detectors in the package.
  • the dead zone of the package 500 includes the perimeter zones 195 and the I/O area 512.
  • a dead zone (e.g., 688) of an image sensor (e.g., image sensor 600) with a group of packages (e.g., packages 500 mounted on the same PCB and arranged in the same layer or in different layers) includes the combination of the dead zones of the packages in the group and the gaps between the packages.
  • the radiation detector 100 (Fig. 1) operating by itself may be considered an image sensor.
  • the package 500 (Fig. 5) operating by itself may be considered an image sensor.
  • the image sensor 600 including the radiation detectors 100 may have the dead zone 688 among the active areas 190 of the radiation detectors 100. However, the image sensor 600 may capture multiple partial images of an object or scene (not shown) one by one, and then these captured partial images may be stitched to form a stitched image of the entire object or scene.
  • The word “image” in the present specification is not limited to a spatial distribution of a property of a radiation (such as intensity).
  • image may also include the spatial distribution of density of a substance or element.
  • Fig. 7A –Fig. 8D schematically show perspective views of an imaging system 700 in operation, according to an embodiment.
  • the imaging system 700 may include a radiation source 710 and the image sensor 600.
  • an object 720 may be positioned between the radiation source 710 and the image sensor 600.
  • the radiation source 710 may send a radiation beam 712 toward the object 720 and toward the image sensor 600.
  • the image sensor 600 may capture an image of the object 720.
  • the radiation beam 712 may include X-rays. In an embodiment, the radiation beam 712 may include radiation pulses wherein the radiation of each radiation pulse of the radiation pulses that has transmitted through the object 720 may be used by the image sensor 600 for capturing an image of the object 720.
  • the operation of the imaging system 700 may be as follows.
  • the radiation source 710 and the image sensor 600 may rotate counterclockwise about an axis 730 while the radiation source 710 and the image sensor 600 remain stationary with respect to each other.
  • the image sensor 600 moves along a first circular orbit (not shown) from a first starting position as shown in Fig. 7A, then through a first image capturing position as shown in Fig. 7B, then through a second image capturing position as shown in Fig. 7C, and then back to the first starting position as shown in Fig. 7D.
  • the rotation of the radiation source 710 and the image sensor 600 does not have to complete a full circle.
  • the axis 730 may be stationary with respect to the object 720. In an embodiment, the axis 730 may intersect the object 720 as shown. In general, the axis 730 may or may not intersect the object 720.
  • the image sensor 600 may capture a first partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
  • the image sensor 600 may capture a second partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
  • the image sensor 600 may be translated with respect to the object 720 along a direction parallel to the axis 730 (e.g., into the page) from the first starting position as shown in Fig. 7D to a second starting position as shown in Fig. 8A.
  • the dashed lines (except the dashed line on the axis 730) represent the image sensor 600 at the first starting position.
  • the radiation source 710 and the object 720 may be stationary with respect to each other while the image sensor 600 is translated from the first starting position to the second starting position as described above.
  • the radiation source 710 and the image sensor 600 may rotate counterclockwise about the axis 730 while the radiation source 710 and the image sensor 600 remain stationary with respect to each other.
  • the image sensor 600 moves along a second circular orbit (not shown) from the second starting position as shown in Fig. 8A, then through a third image capturing position as shown in Fig. 8B, then through a fourth image capturing position as shown in Fig. 8C, and then back to the second starting position as shown in Fig. 8D.
  • the rotation of the radiation source 710 and the image sensor 600 does not have to complete a full circle.
  • the number of image capturing positions on the first circular orbit is the same as the number of image capturing positions on the second circular orbit (both numbers are 2) .
  • the number of image capturing positions on each circular orbit is greater than 1 and does not have to be the same as the number of image capturing positions on another circular orbit.
  • the number of image capturing positions on the first circular orbit may be 2 as described above, and the number of image capturing positions on the second circular orbit may be 3 (instead of 2 as described above) .
  • the centers of the first and second circular orbits are on the axis 730.
  • the first and second circular orbits have the same radius and are respectively on 2 different planes that are perpendicular to the axis 730.
  • the image sensor 600 may capture a third partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
  • the image sensor 600 may capture a fourth partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
  • Fig. 9 shows a flowchart 900 generalizing the operation of the imaging system 700, according to an embodiment.
  • an image sensor of an imaging system captures multiple partial images of an object.
  • the image sensor 600 of the imaging system 700 captures the first, second, third, and fourth partial images of the object 720.
  • the 2 centers of the first and second circular orbits are on the same axis 730.
  • the first and second circular orbits have the same radius and are respectively on 2 different planes perpendicular to the axis 730.
  • the image sensor captures Ni partial images of the multiple partial images while the image sensor is on the circular orbit (i) .
  • a 3D image of the object 720 may be reconstructed based on the first, second, third, and fourth partial images of the object 720. Specifically, in an embodiment, a first partial 3D image of the object 720 may be reconstructed based on the first and second partial images of the object 720. Similarly, a second partial 3D image of the object 720 may be reconstructed based on the third and fourth partial images of the object 720. Then, the 3D image of the object 720 may be created by combining the first and second partial 3D images of the object 720. Note that a partial 3D image of the object 720 is a 3D image of a portion of the object 720.
  • the object 720 may be entirely imaged or scanned.
  • each point of the object is in at least a partial image of the multiple partial images.
  • each and every point of the object 720 is in at least one of the first, second, third, and fourth partial images.
  • each and every point of the object 720 is in at least 2 partial images which the image sensor 600 captures while the image sensor 600 is on a circular orbit.
  • each and every point of the object 720 is in at least (A) the first and second partial images or (B) the third and fourth partial images.
  • Fig. 10 schematically shows a top view of the image sensor 600 of Fig. 6, according to an embodiment.
  • Fig. 6 schematically shows a cross-sectional view of the image sensor 600 of Fig. 10 along a line 6-6, according to an embodiment.
  • the perimeter zones 195 are not shown.
  • the image sensor 600 may have 2 radiation detector packages 500 each of which may have 3 active areas 190.
  • the image sensor 600 may have multiple radiation detector packages 500 each of which may have multiple active areas 190.
  • the 6 active areas 190 of the image sensor 600 may be arranged in 2 active area rows each of which has 3 active areas 190 as shown in Fig. 10.
  • the 2 active area rows may be respectively on the 2 row PCBs 510.
  • the 2 row PCBs 510 may be on the system PCB 650.
  • a direction 1091 may be parallel to the active area rows of the image sensor 600. In other words, for each active area row of the 2 active area rows of the image sensor 600, a straight line parallel to the direction 1091 intersects all the 3 active areas 190 of said each active area row.
  • the axis 730 (Fig. 7A –Fig. 8D) may be chosen such that the direction 1091 is perpendicular to the axis 730.
  • the 2 active area rows of the image sensor 600 are perpendicular to the axis 730.
  • a straight line perpendicular to the axis 730 intersects all 3 active areas 190 of said each active area row.
  • the image sensor 600 may include a column gap 192 between any 2 adjacent active areas 190 of any active area row of the 2 active area rows of the image sensor 600.
  • each of the column gaps 192 of the image sensor 600 may be along a direction 1092 perpendicular to the direction 1091.
  • the image sensor 600 includes the 2 row I/O areas 512 respectively for the 2 active area rows.
  • the 2 row I/O areas 512 and the 2 active area rows may be arranged in an alternating manner as shown in Fig. 10.
  • any 2 adjacent active areas 190 of any active area row of the 2 active area rows of the image sensor 600 may overlap each other with respect to a direction which is perpendicular to a best-fit plane that intersects all sensing elements 150 of the image sensor 600 (the best-fit plane is not shown but should be parallel to the page of Fig. 10) .
  • Fig. 11 which shows a cross-sectional view of the image sensor 600 of Fig. 10 along a line 11-11 in case of the alternative embodiment described above
  • the 2 left active areas 190 (which are adjacent) overlap each other with respect to a direction 1120 that is perpendicular to the best-fit plane.
  • the 2 left active areas 190 mentioned above may be respectively in two different wafer layers 1101 and 1102.
  • the fabrication of the image sensor 600 may be as follows. The components of the image sensor 600 may be formed on the two separate wafer layers 1101 and 1102, and then the two wafer layers 1101 and 1102 may be bonded together resulting in the image sensor 600 of Fig. 11.
  • the image sensor 600 may include a row gap 194 between the 2 adjacent active area rows.
  • the row gap 194 may be along a direction perpendicular to the axis 730 (Fig. 7A –Fig. 8D) .
  • no 2 image capturing positions respectively on 2 chronologically consecutive circular orbits may be at the same angle.
  • a straight line going through the 2 image capturing positions is not parallel to the axis 730.
  • the first and second circular orbits are 2 chronologically consecutive circular orbits because the image sensor 600 moves on the first circular orbit and then moves on the second circular orbit without moving on a third circular orbit after moving on the first circular orbit and before moving on the second circular orbit.
  • the image sensor 600 captures all the partial images for a circular orbit without leaving the circular orbit. In other words, the image sensor 600 does not leave the circular orbit until the image sensor 600 captures all the partial images for that circular orbit. For example, the image sensor 600 does not leave the first circular orbit until the image sensor 600 captures all the partial images for the first circular orbit (i.e., the first and second partial images) .
  • the image sensor 600 may leave a circular orbit before the image sensor 600 captures all the partial images for that circular orbit.
  • the image sensor 600 may capture the first partial image while the image sensor 600 is moving through the first image capturing position along the first circular orbit. Then, the image sensor 600 may be translated from the first circular orbit to the second circular orbit. Then, the image sensor 600 may capture the third partial image while the image sensor 600 is moving through the third image capturing position along the second circular orbit.
  • the image sensor 600 may be translated from the second circular orbit back to the first circular orbit. Then, the image sensor 600 may capture the second partial image while the image sensor 600 is moving through the second image capturing position along the first circular orbit. Then, the image sensor 600 may be translated from the first circular orbit to the second circular orbit again. Then, the image sensor 600 may capture the fourth partial image while the image sensor 600 is moving through the fourth image capturing position along the second circular orbit.
  • the image sensor 600 moves along the first and second circular orbits in the same angular direction (i.e., counterclockwise) .
  • the image sensor 600 may move along the first and second circular orbits in different angular directions.
  • the image sensor 600 may move counterclockwise along the first circular orbit for capturing the first and second partial images as described above, but the image sensor 600 may move clockwise along the second circular orbit for capturing the third and fourth partial images.
  • the image sensor 600 may reverse its angular direction on a circular orbit.
  • the image sensor 600 may be moving counterclockwise through the first image capturing position along the first circular orbit when the image sensor 600 captures the first partial image of the object 720, but the image sensor 600 may reverse its angular direction and then may be moving clockwise through the second image capturing position along the first circular orbit when the image sensor 600 captures the second partial image of the object 720.
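The orbit-by-orbit capture scheme described above (M coaxial circular orbits of equal radius, Ni image capturing positions on orbit i, with consecutive orbits staggered so that no two positions lie at the same angle) can be sketched as follows. The function name, the evenly spaced angles, and the fixed angular stagger are illustrative assumptions; the specification does not prescribe any particular placement of the capture positions on an orbit.

```python
import math

def capture_schedule(m, n_per_orbit, radius, z_step, angle_offset=0.3):
    """Enumerate (orbit index i, angle theta, z along the axis, x, y) for
    every image capturing position.

    m            -- number of circular orbits (M, greater than 1)
    n_per_orbit  -- list of Ni, the capture positions on orbit i (each > 1)
    radius       -- shared radius of all the circular orbits
    z_step       -- spacing between the parallel orbit planes along the axis
    angle_offset -- angular stagger (radians) applied per orbit so that
                    positions on consecutive orbits avoid the same angle
                    (an illustrative choice, not from the specification)
    """
    assert m > 1 and len(n_per_orbit) == m and all(n > 1 for n in n_per_orbit)
    positions = []
    for i in range(m):
        n = n_per_orbit[i]
        for k in range(n):
            # Evenly spaced angles on orbit i, staggered by i * angle_offset.
            theta = 2 * math.pi * k / n + i * angle_offset
            z = i * z_step  # orbit planes are perpendicular to the axis
            positions.append((i, theta, z,
                              radius * math.cos(theta),
                              radius * math.sin(theta)))
    return positions

# Example matching the text: M = 2 orbits, with N1 = 2 and N2 = 3
# image capturing positions.
schedule = capture_schedule(m=2, n_per_orbit=[2, 3], radius=100.0, z_step=10.0)
```

Note that the number of positions per orbit may differ (here 2 and 3), and every position on a given orbit shares that orbit's plane, as required by the method.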

Abstract

Disclosed herein is a method that includes capturing with an image sensor of an imaging system multiple partial images of an object. The image sensor captures each partial image of the multiple partial images while the image sensor is on a circular orbit of circular orbits (i), i=1, …, M. All centers of the circular orbits (i), i=1, …, M are on a same axis. All the circular orbits (i), i=1, …, M have a same radius and are respectively on M different planes perpendicular to the axis. For each value of i, the image sensor captures Ni partial images of the multiple partial images while the image sensor is on the circular orbit (i). M, Ni, i=1, …, M are integers greater than 1.

Description

IMAGING SYSTEMS AND METHODS OF OPERATION

Background
A radiation detector is a device that measures a property of a radiation. Examples of the property may include a spatial distribution of the intensity, phase, and polarization of the radiation. The radiation measured by the radiation detector may be a radiation that has transmitted through an object. The radiation measured by the radiation detector may be electromagnetic radiation such as infrared light, visible light, ultraviolet light, X-ray, or γ-ray. The radiation may be of other types such as α-rays and β-rays. An imaging system may include one or more image sensors each of which may have one or more radiation detectors.
Summary
Disclosed herein is a method comprising: capturing with an image sensor of an imaging system multiple partial images of an object, wherein the image sensor captures each partial image of the multiple partial images while the image sensor is on a circular orbit of circular orbits (i) , i=1, …, M, wherein all centers of the circular orbits (i) , i=1, …, M are on a same axis, wherein all the circular orbits (i) , i=1, …, M have a same radius and are respectively on M different planes perpendicular to the axis, wherein for each value of i, the image sensor captures Ni partial images of the multiple partial images while the image sensor is on the circular orbit (i) , and wherein M, Ni, i=1, …, M are integers greater than 1.
In an aspect, said capturing the multiple partial images comprises moving the image sensor among the circular orbits (i) , i=1, …, M.
In an aspect, for each value of i, the image sensor captures the Ni partial images from Ni different image capturing positions on the circular orbit (i) .
In an aspect, the method further comprises reconstructing a 3D (3-dimensional) image of the object based on the multiple partial images.
In an aspect, said reconstructing the 3D image of the object comprises: for each value of i, reconstructing a partial 3D image of the object based on the Ni partial images; and combining the resulting M partial 3D images into the 3D image of the object.
In an aspect, each point of the object is in at least a partial image of the multiple partial images.
In an aspect, each point of the object is in at least 2 partial images of the multiple partial images which the image sensor captures while the image sensor is on a circular orbit of the circular orbits (i) , i=1, …, M.
In an aspect, the image sensor captures the multiple partial images one by one.
In an aspect, the image sensor captures each partial image of the multiple partial images while the image sensor is moving with respect to the object along a circular orbit of the circular orbits (i) , i=1, …, M.
In an aspect, for each value of i, the image sensor captures all the Ni partial images without leaving the circular orbit (i) .
In an aspect, there is an angular direction for the image sensor, and wherein for each value of i, the image sensor moves in the angular direction as the image sensor captures the Ni partial images.
In an aspect, for at least a value of i, the image sensor captures at least a partial image of the Ni partial images but not all the Ni partial images and then moves to another circular orbit of the circular orbits (i) , i=1, …, M.
In an aspect, said capturing with the image sensor the multiple partial images comprises rotating the image sensor about the axis.
In an aspect, said capturing with the image sensor the multiple partial images further comprises translating the image sensor with respect to the object along a direction parallel to the axis.
In an aspect, the imaging system comprises a radiation source configured to send radiation toward the object and toward the image sensor, wherein in capturing the multiple partial images of the object, the image sensor uses radiation of the radiation from the radiation source that has transmitted through the object, and wherein said capturing the multiple partial images comprises rotating the radiation source and the image sensor about the axis while the radiation source and the image sensor remain stationary with respect to each other.
In an aspect, said capturing the multiple partial images further comprises translating the image sensor with respect to the object along a direction parallel to the axis from a circular orbit of the circular orbits (i) , i=1, …, M to another circular orbit of the circular orbits (i) , i=1, …, M while the radiation source and the object are stationary with respect to each other.
In an aspect, the radiation sent by the radiation source comprises X-rays.
In an aspect, the radiation sent by the radiation source comprises radiation pulses, and wherein radiation of each radiation pulse of the radiation pulses that has transmitted through the object is used by the image sensor for capturing a partial image of the multiple partial images.
In an aspect, the image sensor comprises P active areas, wherein each active area of the P active areas comprises multiple sensing elements, wherein the P active areas are arranged in Q active area rows, wherein each active area row of the Q active area rows comprises multiple active areas of the P active areas, wherein for each active area row of the Q active area rows, a straight line perpendicular to the axis intersects all active areas of said each active area row, and wherein P and Q are integers greater than 1.
In an aspect, any two adjacent active areas of any active area row of the Q active area rows overlap each other with respect to a direction which is perpendicular to a best-fit plane that intersects all sensing elements of the image sensor.
In an aspect, the image sensor further comprises a row gap between any two adjacent active area rows of the Q active area rows, and wherein said row gap is along a direction perpendicular to the axis.
In an aspect, the image sensor moves on the circular orbit (1) , then on the circular orbit (2) , …, and then on the circular orbit (M) as the image sensor captures the multiple partial images, and wherein for each value of i, i=1, …, (M-1) , for every pair of (A) a first image capturing position of the image sensor on the circular orbit (i) and (B) a second image capturing position of the image sensor on the circular orbit (i+1) , a straight line going through the first and second capturing positions is not parallel to the axis.
Brief Description of Figures
Fig. 1 schematically shows a radiation detector, according to an embodiment.
Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector, according to an embodiment.
Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector, according to an embodiment.
Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector, according to an alternative embodiment.
Fig. 5 schematically shows a top view of a radiation detector package including the radiation detector and a printed circuit board (PCB) , according to an embodiment.
Fig. 6 schematically shows a cross-sectional view of an image sensor including the packages of Fig. 5 mounted to a system PCB (printed circuit board) , according to an embodiment.
Fig. 7A –Fig. 8D schematically show perspective views of an imaging system in operation, according to an embodiment.
Fig. 9 shows a flowchart generalizing the operation of the imaging system, according to an embodiment.
Fig. 10 schematically shows a top view of the image sensor of Fig. 6, according to an embodiment.
Fig. 11 shows a cross-sectional view of the image sensor of Fig. 10, according to an alternative embodiment.
Detailed Description
RADIATION DETECTOR
Fig. 1 schematically shows a radiation detector 100, as an example. The radiation detector 100 may include an array of pixels 150 (also referred to as sensing elements 150) . The array may be a rectangular array (as shown in Fig. 1) , a honeycomb array, a hexagonal array, or any other suitable array. The array of pixels 150 in the example of Fig. 1 has 4 rows and 7 columns; however, in general, the array of pixels 150 may have any number of rows and any number of columns.
Each pixel 150 may be configured to detect radiation from a radiation source (not shown) incident thereon and may be configured to measure a characteristic (e.g., the energy of the particles, the wavelength, and the frequency) of the radiation. A radiation may include particles such as photons and subatomic particles. Each pixel 150 may be configured to count numbers of particles of radiation incident thereon whose energy falls in a plurality of bins of energy, within a period of time. All the pixels 150 may be configured to count the numbers of particles of radiation incident thereon within a plurality of bins of energy within the same period of time. When the incident particles of radiation have similar energy, the pixels 150 may be simply configured to count numbers of particles of radiation incident thereon within a period of time, without measuring the energy of the individual particles of radiation.
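The per-pixel counting into energy bins described above can be illustrated with a short sketch. The particle energies and bin edges below are made-up values, and `numpy.histogram` stands in for the counting done in the pixel electronics.

```python
import numpy as np

# Each pixel counts incident particles whose energies fall into a set of
# energy bins within a period of time. Energies (keV) and bin edges are
# illustrative assumptions.
def count_into_bins(energies_kev, bin_edges_kev):
    counts, _ = np.histogram(energies_kev, bins=bin_edges_kev)
    return counts

# One particle lands in the 0-20 keV bin, two in 20-50 keV, two in 50-100 keV.
counts = count_into_bins([12.0, 25.0, 48.0, 52.0, 70.0], [0, 20, 50, 100])
```

When all incident particles have similar energy, the same routine degenerates to a single bin, matching the simple-counting mode described in the paragraph above.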
Each pixel 150 may have its own analog-to-digital converter (ADC) configured to digitize an analog signal representing the energy of an incident particle of radiation into a digital signal, or to digitize an analog signal representing the total energy of a plurality of incident particles of radiation into a digital signal. The pixels 150 may be configured to operate in parallel. For example, when one pixel 150 measures an incident particle of radiation, another pixel 150 may be waiting for a particle of radiation to arrive. The pixels 150 may not have to be individually addressable.
The radiation detector 100 described here may have applications such as in an X-ray telescope, X-ray mammography, industrial X-ray defect detection, X-ray microscopy or microradiography, X-ray casting inspection, X-ray non-destructive testing, X-ray weld inspection, X-ray digital subtraction angiography, etc. It may be suitable to use this radiation detector 100 in place of a photographic plate, a photographic film, a PSP plate, an X-ray image intensifier, a scintillator, or another semiconductor X-ray detector.
Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector 100 of Fig. 1 along a line 2-2, according to an embodiment. Specifically, the radiation detector 100 may include a radiation absorption layer 110 and an electronics layer 120 (which may include one or more ASICs or application-specific integrated circuits) for processing or analyzing electrical signals which incident radiation generates in the radiation absorption layer 110. The radiation detector 100 may or may not include a scintillator (not shown) . The radiation absorption layer 110 may include a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, as an example. Specifically, the radiation absorption layer 110 may include one or more diodes (e.g., p-i-n or p-n) formed by a first doped region 111, one or more discrete regions 114 of a second doped region 113. The second doped region 113 may be separated from the first doped region 111 by an optional intrinsic region 112. The discrete regions 114 may be separated from one another by the first doped region 111 or the intrinsic region 112. The first doped region 111 and the second doped region 113 may have opposite types of doping (e.g., region 111 is p-type and region 113 is n-type, or region 111 is n-type and region 113 is p-type) . In the example of Fig. 3, each of the discrete regions 114 of the second doped region 113 forms a diode with the first doped region 111 and the optional intrinsic region 112. Namely, in the example in Fig. 3, the radiation absorption layer 110 has a plurality of diodes (more specifically, 7 diodes corresponding to 7 pixels 150 of one row in the array of Fig. 1, of which only 2 pixels 150 are labeled in Fig. 3 for simplicity) . The plurality of diodes may have an electrical contact 119A as a shared (common) electrode. The first doped region 111 may also have discrete portions.
The electronics layer 120 may include an electronic system 121 suitable for processing or interpreting signals generated by the radiation incident on the radiation absorption layer  110. The electronic system 121 may include an analog circuitry such as a filter network, amplifiers, integrators, and comparators, or a digital circuitry such as a microprocessor, and memory. The electronic system 121 may include one or more ADCs (analog to digital converters) . The electronic system 121 may include components shared by the pixels 150 or components dedicated to a single pixel 150. For example, the electronic system 121 may include an amplifier dedicated to each pixel 150 and a microprocessor shared among all the pixels 150. The electronic system 121 may be electrically connected to the pixels 150 by vias 131. Space among the vias may be filled with a filler material 130, which may increase the mechanical stability of the connection of the electronics layer 120 to the radiation absorption layer 110. Other bonding techniques are possible to connect the electronic system 121 to the pixels 150 without using the vias 131.
When radiation from the radiation source (not shown) hits the radiation absorption layer 110 including diodes, particles of the radiation may be absorbed and generate one or more charge carriers (e.g., electrons, holes) by a number of mechanisms. The charge carriers may drift to the electrodes of one of the diodes under an electric field. The electric field may be an external electric field. The electrical contact 119B may include discrete portions each of which is in electrical contact with the discrete regions 114. The term “electrical contact” may be used interchangeably with the word “electrode.” In an embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete regions 114 (“not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow to a different one of the discrete regions 114 than the rest of the charge carriers). Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete regions 114 are not substantially shared with another of these discrete regions 114. A pixel 150 associated with a discrete region 114 may be an area around the discrete region 114 in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99% of) charge carriers generated by a particle of the radiation incident therein flow to the discrete region 114. Namely, less than 2%, less than 1%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel 150.
Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, according to an alternative embodiment. More specifically, the radiation absorption layer 110 may include a resistor of a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof, but does not include a  diode. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest. In an embodiment, the electronics layer 120 of Fig. 4 is similar to the electronics layer 120 of Fig. 3 in terms of structure and function.
When the radiation hits the radiation absorption layer 110 including the resistor but not diodes, it may be absorbed and generate one or more charge carriers by a number of mechanisms. A particle of the radiation may generate 10 to 100,000 charge carriers. The charge carriers may drift to the electrical contacts 119A and 119B under an electric field. The electric field may be an external electric field. The electrical contact 119B may include discrete portions. In an embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete portions of the electrical contact 119B (“not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow to a different one of the discrete portions than the rest of the charge carriers). Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete portions of the electrical contact 119B are not substantially shared with another of these discrete portions of the electrical contact 119B. A pixel 150 associated with a discrete portion of the electrical contact 119B may be an area around the discrete portion in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99% of) charge carriers generated by a particle of the radiation incident therein flow to the discrete portion of the electrical contact 119B. Namely, less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel associated with the one discrete portion of the electrical contact 119B.
RADIATION DETECTOR PACKAGE
Fig. 5 schematically shows a top view of a radiation detector package 500 including the radiation detector 100 and a printed circuit board (PCB) 510. The term “PCB” as used herein is not limited to a particular material. For example, a PCB may include a semiconductor. The radiation detector 100 may be mounted to the PCB 510. The wiring between the radiation detector 100 and the PCB 510 is not shown for the sake of clarity. The package 500 may have one or more radiation detectors 100. The PCB 510 may include an input/output (I/O) area 512 not covered by the radiation detector 100 (e.g., for accommodating bonding wires 514) . The radiation detector 100 may have an active area 190 which is where the pixels 150 (Fig. 1) are located. The radiation detector 100 may have a perimeter zone 195 near the edges of the  radiation detector 100. The perimeter zone 195 has no pixels 150, and the radiation detector 100 does not detect particles of radiation incident on the perimeter zone 195.
IMAGE SENSOR
Fig. 6 schematically shows a cross-sectional view of an image sensor 600, according to an embodiment. The image sensor 600 may include one or more radiation detector packages 500 of Fig. 5 mounted to a system PCB 650. The electrical connection between the PCBs 510 and the system PCB 650 may be made by bonding wires 514. In order to accommodate the bonding wires 514 on the PCB 510, the PCB 510 may have the I/O area 512 not covered by the radiation detectors 100. In order to accommodate the bonding wires 514 on the system PCB 650, the packages 500 may have gaps in between. The gaps may be approximately 1 mm or more. Particles of radiation incident on the perimeter zones 195, on the I/O area 512, or on the gaps cannot be detected by the packages 500 on the system PCB 650. A dead zone of a radiation detector (e.g., the radiation detector 100) is the area of the radiation-receiving surface of the radiation detector, on which incident particles of radiation cannot be detected by the radiation detector. A dead zone of a package (e.g., package 500) is the area of the radiation-receiving surface of the package, on which incident particles of radiation cannot be detected by the radiation detector or detectors in the package. In this example shown in Fig. 5 and Fig. 6, the dead zone of the package 500 includes the perimeter zones 195 and the I/O area 512. A dead zone (e.g., 688) of an image sensor (e.g., image sensor 600) with a group of packages (e.g., packages 500 mounted on the same PCB and arranged in the same layer or in different layers) includes the combination of the dead zones of the packages in the group and the gaps between the packages.
In an embodiment, the radiation detector 100 (Fig. 1) operating by itself may be considered an image sensor. In an embodiment, the package 500 (Fig. 5) operating by itself may be considered an image sensor.
The image sensor 600 including the radiation detectors 100 may have the dead zone 688 among the active areas 190 of the radiation detectors 100. However, the image sensor 600 may capture multiple partial images of an object or scene (not shown) one by one, and then these captured partial images may be stitched to form a stitched image of the entire object or scene.
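The stitching of partial images mentioned above can be sketched as pasting each captured partial image into a full-frame array while skipping dead-zone pixels. The NaN dead-zone convention and the pixel offsets are assumptions made for illustration; the disclosure does not specify a stitching algorithm.

```python
import numpy as np

def stitch(partials, offsets, full_shape):
    # Paste each partial image at its (row, col) offset in the full frame.
    # Pixels that fell on a dead zone are marked NaN in the partial image
    # and are left unfilled, so a later capture can cover them.
    out = np.full(full_shape, np.nan)
    for img, (r, c) in zip(partials, offsets):
        h, w = img.shape
        valid = ~np.isnan(img)
        out[r:r + h, c:c + w][valid] = img[valid]
    return out
```

In practice the offsets would come from the known motion of the image sensor between captures.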
The term “image” in the present specification is not limited to spatial distribution of a property of a radiation (such as intensity) . For example, the term “image” may also include the spatial distribution of density of a substance or element.
IMAGING SYSTEM
Fig. 7A –Fig. 8D schematically show perspective views of an imaging system 700 in operation, according to an embodiment. In an embodiment, with reference to Fig. 7A, the imaging system 700 may include a radiation source 710 and the image sensor 600. In an embodiment, an object 720 may be positioned between the radiation source 710 and the image sensor 600.
In an embodiment, the radiation source 710 may send a radiation beam 712 toward the object 720 and toward the image sensor 600. Using the radiation of the radiation beam 712 that has transmitted through the object 720, the image sensor 600 may capture an image of the object 720.
In an embodiment, the radiation beam 712 may include X-rays. In an embodiment, the radiation beam 712 may include radiation pulses wherein the radiation of each radiation pulse of the radiation pulses that has transmitted through the object 720 may be used by the image sensor 600 for capturing an image of the object 720.
OPERATION OF IMAGING SYSTEM
IMAGE SENSOR ON FIRST CIRCULAR ORBIT
In an embodiment, with reference to Fig. 7A –Fig. 7D, the operation of the imaging system 700 may be as follows. In an embodiment, the radiation source 710 and the image sensor 600 may rotate counterclockwise about an axis 730 while the radiation source 710 and the image sensor 600 remain stationary with respect to each other. As a result, the image sensor 600 moves along a first circular orbit (not shown) from a first starting position as shown in Fig. 7A, then through a first image capturing position as shown in Fig. 7B, then through a second image capturing position as shown in Fig. 7C, and then back to the first starting position as shown in Fig. 7D. The rotation of the radiation source 710 and the image sensor 600 does not have to complete a full circle.
In an embodiment, the axis 730 may be stationary with respect to the object 720. In an embodiment, the axis 730 may intersect the object 720 as shown. In general, the axis 730 may or may not intersect the object 720.
In an embodiment, with reference to Fig. 7B, while the image sensor 600 is moving through the first image capturing position along the first circular orbit, the image sensor 600 may capture a first partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
Similarly, in an embodiment, with reference to Fig. 7C, while the image sensor 600 is moving through the second image capturing position along the first circular orbit, the image sensor 600 may capture a second partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
IMAGE SENSOR IS TRANSLATED TO ANOTHER CIRCULAR ORBIT
In an embodiment, after the image sensor 600 comes back to the first starting position as shown in Fig. 7D, the image sensor 600 may be translated with respect to the object 720 along a direction parallel to the axis 730 (e.g., into the page) from the first starting position as shown in Fig. 7D to a second starting position as shown in Fig. 8A. In Fig. 8A, the dashed lines (except the dashed line on the axis 730) represent the image sensor 600 at the first starting position.
In an embodiment, the radiation source 710 and the object 720 may be stationary with respect to each other while the image sensor 600 is translated from the first starting position to the second starting position as described above.
IMAGE SENSOR ON SECOND CIRCULAR ORBIT
In an embodiment, with reference to Fig. 8A –Fig. 8D, after the image sensor 600 reaches the second starting position as shown in Fig. 8A, the radiation source 710 and the image sensor 600 may rotate counterclockwise about the axis 730 while the radiation source 710 and the image sensor 600 remain stationary with respect to each other. As a result, the image sensor 600 moves along a second circular orbit (not shown) from the second starting position as shown in Fig. 8A, then through a third image capturing position as shown in Fig. 8B, then through a fourth image capturing position as shown in Fig. 8C, and then back to the second starting position as shown in Fig. 8D. The rotation of the radiation source 710 and the image sensor 600 does not have to complete a full circle.
For simplicity, in the embodiments described above, the number of image capturing positions on the first circular orbit is the same as the number of image capturing positions on the second circular orbit (both numbers are 2) . In general, the number of image capturing positions on each circular orbit is greater than 1 and does not have to be the same as the number of image capturing positions on another circular orbit. For example, the number of image capturing positions on the first circular orbit may be 2 as described above, and the number of image capturing positions on the second circular orbit may be 3 (instead of 2 as described above) .
As a result of the rotations and the translation of the image sensor 600 as described above, the centers of the first and second circular orbits are on the axis 730. In addition, the first and second circular orbits have the same radius and are respectively on 2 different planes that are perpendicular to the axis 730.
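The geometry stated above can be written in coordinates by taking the axis 730 as the z-axis. The sketch below is a geometric illustration under that assumption; the radius and plane heights are arbitrary values.

```python
import math

def orbit_position(radius, theta, z_plane):
    # A point at angle theta on a circular orbit of the given radius,
    # centered on the z-axis and lying in the plane z = z_plane, which is
    # perpendicular to the axis.
    return (radius * math.cos(theta), radius * math.sin(theta), z_plane)

# Positions on the first and second circular orbits: same distance from
# the axis (same radius), but in two different planes.
p_first = orbit_position(5.0, 0.0, 0.0)
p_second = orbit_position(5.0, math.pi / 2, 1.0)
```

Because the two orbits differ only in their z-plane, the translation between them is a pure displacement parallel to the axis, as described for the image sensor 600.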
In an embodiment, with reference to Fig. 8B, while the image sensor 600 is moving through the third image capturing position along the second circular orbit, the image sensor 600 may capture a third partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
Similarly, in an embodiment, with reference to Fig. 8C, while the image sensor 600 is moving through the fourth image capturing position along the second circular orbit, the image sensor 600 may capture a fourth partial image of the object 720 by using the radiation of the radiation beam 712 from the radiation source 710 that has transmitted through the object 720.
FLOWCHART FOR GENERALIZATION OF OPERATION OF IMAGING SYSTEM
Fig. 9 shows a flowchart 900 generalizing the operation of the imaging system 700, according to an embodiment. In step 910, an image sensor of an imaging system captures multiple partial images of an object. For example, in the embodiments described above, with reference to Fig. 7A -Fig. 8D, the image sensor 600 of the imaging system 700 captures the first, second, third, and fourth partial images of the object 720.
In addition, also in step 910, the image sensor captures each partial image of the multiple partial images while the image sensor is on a circular orbit of circular orbits (i) , i=1, …, M. For example, in the embodiments described above, with reference to Fig. 7A -Fig. 8D, the image sensor 600 captures each of the first, second, third, and fourth partial images while the image sensor 600 is on one of the first and second circular orbits (here, M=2) .
In addition, also in step 910, all centers of the circular orbits (i) , i=1, …, M are on a same axis. For example, in the embodiments described above, with reference to Fig. 7A -Fig. 8D, the 2 centers of the first and second circular orbits are on the same axis 730.
In addition, also in step 910, all the circular orbits (i) , i=1, …, M have a same radius and are respectively on M different planes perpendicular to the axis. For example, in the embodiments described above, with reference to Fig. 7A -Fig. 8D, all the first and second circular orbits have the same radius and are respectively on 2 different planes perpendicular to the axis 730.
In addition, also in step 910, for each value of i, the image sensor captures Ni partial images of the multiple partial images while the image sensor is on the circular orbit (i) . For example, in the embodiments described above, with reference to Fig. 7A -Fig. 8D, for i=1, the image sensor 600 captures N1=2 partial images of the 4 partial images (i.e., the first and second partial images) while the image sensor 600 is on the first circular orbit. For i=2, the image sensor 600 captures N2=2 partial images of the 4 partial images (i.e., the third and fourth partial images) while the image sensor 600 is on the second circular orbit.
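Step 910 can be condensed into a loop over the M circular orbits and the Ni image capturing positions on each. Here `capture_at` is a hypothetical stand-in for the actual hardware capture call; the orbit-by-orbit ordering is one of several orderings the embodiments allow.

```python
def capture_all(N, capture_at):
    # N[i-1] is the number Ni of image capturing positions on circular
    # orbit (i); M = len(N). Each capture_at call returns one partial image.
    partials = []
    for i, n_i in enumerate(N, start=1):   # orbit index i = 1, ..., M
        for k in range(1, n_i + 1):        # positions 1, ..., Ni on orbit (i)
            partials.append(capture_at(orbit=i, position=k))
    return partials
```

In the Fig. 7A - Fig. 8D example, M = 2 and N1 = N2 = 2, so the loop yields 4 partial images.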
OTHER EMBODIMENTS
3D (3-DIMENSIONAL) IMAGE OF THE OBJECT
In an embodiment, a 3D image of the object 720 may be reconstructed based on the first, second, third, and fourth partial images of the object 720. Specifically, in an embodiment, a first partial 3D image of the object 720 may be reconstructed based on the first and second partial images of the object 720. Similarly, a second partial 3D image of the object 720 may be reconstructed based on the third and fourth partial images of the object 720. Then, the 3D image of the object 720 may be created by combining the first and second partial 3D images of the object 720. Note that a partial 3D image of the object 720 is a 3D image of a portion of the object 720.
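The two-stage reconstruction above can be sketched as follows. `reconstruct_partial` stands in for whatever tomographic routine is used per orbit (e.g. filtered back projection), and stacking the partial volumes along the rotation axis is an assumption about how the combination is done; neither detail is fixed by the disclosure.

```python
import numpy as np

def reconstruct_3d(partials_per_orbit, reconstruct_partial):
    # Stage 1: one partial 3D image per circular orbit, reconstructed from
    # that orbit's partial images. Stage 2: combine the M partial volumes
    # into the full 3D image, here by stacking slabs along the axis.
    partial_volumes = [reconstruct_partial(p) for p in partials_per_orbit]
    return np.concatenate(partial_volumes, axis=0)
```

With the four partial images of the embodiments above, `partials_per_orbit` would hold two lists of two partial images each, yielding the first and second partial 3D images before combination.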
OBJECT IS ENTIRELY IMAGED
In an embodiment, the object 720 may be entirely imaged or scanned. In other words, in the general case, with reference to step 910 of Fig. 9, each point of the object is in at least a partial image of the multiple partial images. In the embodiments described above, each and every point of the object 720 is in at least one of the first, second, third, and fourth partial images.
In an alternative embodiment, each and every point of the object 720 is in at least 2 partial images which the image sensor 600 captures while the image sensor 600 is on a circular orbit. In other words, in the embodiments described above, each and every point of the object 720 is in at least (A) the first and second partial images or (B) the third and fourth partial images.
IMAGE SENSOR IN DETAILS
Fig. 10 schematically shows a top view of the image sensor 600 of Fig. 6, according to an embodiment. Note that Fig. 6 schematically shows a cross-sectional view of the image sensor 600 of Fig. 10 along a line 6-6, according to an embodiment. However, in Fig. 10, for simplicity, the perimeter zones 195 are not shown.
In an embodiment, with reference to Fig. 6 and Fig. 10, the image sensor 600 may have 2 radiation detector packages 500 each of which may have 3 active areas 190. In general, the image sensor 600 may have multiple radiation detector packages 500 each of which may have multiple active areas 190. The 6 active areas 190 of the image sensor 600 may be arranged in 2 active area rows each of which has 3 active areas 190 as shown in Fig. 10.
In an embodiment, the 2 active area rows may be respectively on the 2 row PCBs 510. In an embodiment, the 2 row PCBs 510 may be on the system PCB 650.
In an embodiment, a direction 1091 may be parallel to the active area rows of the image sensor 600. In other words, for each active area row of the 2 active area rows of the image sensor 600, a straight line parallel to the direction 1091 intersects all the 3 active areas 190 of said each active area row.
In an embodiment, the axis 730 (Fig. 7A – Fig. 8D) may be chosen such that the direction 1091 is perpendicular to the axis 730. As a result, the 2 active area rows of the image sensor 600 are perpendicular to the axis 730. In other words, for each active area row of the 2 active area rows of the image sensor 600, a straight line perpendicular to the axis 730 intersects all 3 active areas 190 of said each active area row.
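Expressed with direction vectors, the perpendicularity of the direction 1091 to the axis 730 is simply a zero dot product. The particular coordinate values below are illustrative choices, not values given by the embodiment.

```python
def is_perpendicular(u, v, tol=1e-9):
    """True if vectors u and v are perpendicular (zero dot product)."""
    return abs(sum(a * b for a, b in zip(u, v))) < tol

axis_730 = (0.0, 0.0, 1.0)        # axis direction (illustrative choice)
direction_1091 = (1.0, 0.0, 0.0)  # along the active area rows
row_perpendicular_to_axis = is_perpendicular(direction_1091, axis_730)  # True
```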
In an embodiment, with reference to Fig. 10, the image sensor 600 may include a column gap 192 between any 2 adjacent active areas 190 of any active area row of the 2 active area rows of the image sensor 600. In an embodiment, each of the column gaps 192 of the image sensor 600 may be along a direction 1092 perpendicular to the direction 1091.
In an embodiment, with reference to Fig. 10, the image sensor 600 includes the 2 row I/O areas 512 respectively for the 2 active area rows. In an embodiment, the 2 row I/O areas 512 and the 2 active area rows may be arranged in an alternating manner as shown in Fig. 10.
In the embodiments described above, with reference to Fig. 10, there is a column gap 192 between any 2 adjacent active areas 190 of any active area row of the 2 active area rows of the image sensor 600. In an alternative embodiment, any 2 adjacent active areas 190 of any active area row of the 2 active area rows of the image sensor 600 may overlap each other with respect to a direction which is perpendicular to a best-fit plane that intersects all sensing elements 150 of the image sensor 600 (the best-fit plane is not shown but should be parallel to the page of Fig. 10). In other words, for said any 2 adjacent active areas 190, there is a straight line that is perpendicular to the best-fit plane and intersects both of said 2 adjacent active areas 190.
For example, with reference to Fig. 11 (which shows a cross-sectional view of the image sensor 600 of Fig. 10 along a line 11-11 in the case of the alternative embodiment described above), the 2 left active areas 190 (which are adjacent) overlap each other with respect to a direction 1120 that is perpendicular to the best-fit plane.
In an embodiment, with reference to Fig. 11, the 2 left active areas 190 mentioned above may be respectively in two different wafer layers 1101 and 1102. In an embodiment, the fabrication of the image sensor 600 may be as follows. The components of the image sensor 600 may be formed on the two separate wafer layers 1101 and 1102, and then the two wafer layers 1101 and 1102 may be bonded together resulting in the image sensor 600 of Fig. 11.
In an embodiment, with reference back to Fig. 10, the image sensor 600 may include a row gap 194 between the 2 adjacent active area rows. In an embodiment, the row gap 194 may be along a direction perpendicular to the axis 730 (Fig. 7A – Fig. 8D).
NO 2 IMAGE CAPTURING POSITIONS ON 2 CHRONOLOGICALLY CONSECUTIVE CIRCULAR ORBITS ARE AT THE SAME ANGLE
In an embodiment, with reference to Fig. 7A – Fig. 8D, no 2 image capturing positions respectively on 2 chronologically consecutive circular orbits may be at the same angle. In other words, in the embodiments described above, for every pair of (A) an image capturing position of the image sensor 600 on the first circular orbit and (B) an image capturing position of the image sensor 600 on the second circular orbit, a straight line going through the 2 image capturing positions is not parallel to the axis 730.
Note that in the embodiment described above, the first and second circular orbits are 2 chronologically consecutive circular orbits because the image sensor 600 moves on the first circular orbit and then moves on the second circular orbit without moving on a third circular orbit after moving on the first circular orbit and before moving on the second circular orbit.
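Because the circular orbits share the same axis and the same radius, two image capturing positions on consecutive orbits lie on a line parallel to the axis exactly when their angular coordinates about the axis are equal. The condition above can therefore be checked by comparing angles; the angle values below are illustrative, and the simple tolerance test is a sketch rather than a prescribed check.

```python
def no_shared_angles(angles_orbit_a, angles_orbit_b, tol=1e-9):
    """True if no capture position on orbit A is at the same angle
    (about the shared axis, in degrees) as any position on orbit B."""
    return all(abs((a - b) % 360.0) > tol
               for a in angles_orbit_a
               for b in angles_orbit_b)

# E.g. captures at 0 and 180 degrees on the first circular orbit, and
# at 90 and 270 degrees on the second circular orbit: no angle shared.
staggered = no_shared_angles([0.0, 180.0], [90.0, 270.0])  # True
```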
ALTERNATIVE EMBODIMENTS
IMAGE SENSOR LEAVES A CIRCULAR ORBIT BEFORE CAPTURING ALL PARTIAL IMAGES FOR THAT CIRCULAR ORBIT
In the embodiments described above, the image sensor 600 captures all the partial images for a circular orbit without leaving the circular orbit. In other words, the image sensor 600 does not leave the circular orbit until the image sensor 600 captures all the partial images for that circular orbit. For example, the image sensor 600 does not leave the first circular orbit until the image sensor 600 captures all the partial images for the first circular orbit (i.e., the first and second partial images).
In an alternative embodiment, the image sensor 600 may leave a circular orbit before the image sensor 600 captures all the partial images for that circular orbit. For example, the image sensor 600 may capture the first partial image while the image sensor 600 is moving through the first image capturing position along the first circular orbit. Then, the image sensor 600 may be translated from the first circular orbit to the second circular orbit. Then, the image sensor 600 may capture the third partial image while the image sensor 600 is moving through the third image capturing position along the second circular orbit.
Then, the image sensor 600 may be translated from the second circular orbit back to the first circular orbit. Then, the image sensor 600 may capture the second partial image while the image sensor 600 is moving through the second image capturing position along the first circular orbit. Then, the image sensor 600 may be translated from the first circular orbit to the second circular orbit again. Then, the image sensor 600 may capture the fourth partial image while the image sensor 600 is moving through the fourth image capturing position along the second circular orbit.
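The interleaved schedule of this alternative embodiment can be summarized as an ordered list of (circular orbit, image capturing position) steps; the position labels are merely shorthand for the positions described above.

```python
# Orbit/position visited at each capture step of the alternative
# embodiment: the image sensor alternates between the two circular
# orbits instead of finishing one orbit before moving to the other.
interleaved_schedule = [
    (1, "first image capturing position"),
    (2, "third image capturing position"),
    (1, "second image capturing position"),
    (2, "fourth image capturing position"),
]
orbit_visits = [orbit for orbit, _ in interleaved_schedule]  # [1, 2, 1, 2]
```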
IMAGE SENSOR MOVES IN DIFFERENT ANGULAR DIRECTIONS
In the embodiments described above, the image sensor 600 moves along the first and second circular orbits in the same angular direction (i.e., counterclockwise). In an alternative embodiment, the image sensor 600 may move along the first and second circular orbits in different angular directions. For example, the image sensor 600 may move counterclockwise along the first circular orbit for capturing the first and second partial images as described above, but the image sensor 600 may move clockwise along the second circular orbit for capturing the third and fourth partial images.
In yet another alternative embodiment, the image sensor 600 may reverse its angular direction on a circular orbit. For example, the image sensor 600 may be moving counterclockwise through the first image capturing position along the first circular orbit when the image sensor 600 captures the first partial image of the object 720, but the image sensor 600 may reverse its angular direction and then may be moving clockwise through the second image capturing position along the first circular orbit when the image sensor 600 captures the second partial image of the object 720.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (22)

  1. A method, comprising:
    capturing with an image sensor of an imaging system multiple partial images of an object,
    wherein the image sensor captures each partial image of the multiple partial images while the image sensor is on a circular orbit of circular orbits (i), i=1, …, M,
    wherein all centers of the circular orbits (i), i=1, …, M are on a same axis,
    wherein all the circular orbits (i), i=1, …, M have a same radius and are respectively on M different planes perpendicular to the axis,
    wherein for each value of i, the image sensor captures Ni partial images of the multiple partial images while the image sensor is on the circular orbit (i), and
    wherein M, Ni, i=1, …, M are integers greater than 1.
  2. The method of claim 1,
    wherein said capturing the multiple partial images comprises moving the image sensor among the circular orbits (i), i=1, …, M.
  3. The method of claim 1,
    wherein for each value of i, the image sensor captures the Ni partial images from Ni different image capturing positions on the circular orbit (i).
  4. The method of claim 1, further comprising reconstructing a 3D (3-dimensional) image of the object based on the multiple partial images.
  5. The method of claim 4, wherein said reconstructing the 3D image of the object comprises:
    for each value of i, reconstructing a partial 3D image of the object based on the Ni partial images; and
    combining the resulting M partial 3D images, resulting in the 3D image of the object.
  6. The method of claim 1,
    wherein each point of the object is in at least a partial image of the multiple partial images.
  7. The method of claim 1,
    wherein each point of the object is in at least 2 partial images of the multiple partial images which the image sensor captures while the image sensor is on a circular orbit of the circular orbits (i), i=1, …, M.
  8. The method of claim 1,
    wherein the image sensor captures the multiple partial images one by one.
  9. The method of claim 1,
    wherein the image sensor captures each partial image of the multiple partial images while the image sensor is moving with respect to the object along a circular orbit of the circular orbits (i), i=1, …, M.
  10. The method of claim 1,
    wherein for each value of i, the image sensor captures all the Ni partial images without leaving the circular orbit (i).
  11. The method of claim 10,
    wherein there is an angular direction for the image sensor, and
    wherein for each value of i, the image sensor moves in the angular direction as the image sensor captures the Ni partial images.
  12. The method of claim 1,
    wherein for at least a value of i, the image sensor captures at least a partial image of the Ni partial images but not all the Ni partial images and then moves to another circular orbit of the circular orbits (i), i=1, …, M.
  13. The method of claim 1,
    wherein said capturing with the image sensor the multiple partial images comprises rotating the image sensor about the axis.
  14. The method of claim 13,
    wherein said capturing with the image sensor the multiple partial images further comprises translating the image sensor with respect to the object along a direction parallel to the axis.
  15. The method of claim 1,
    wherein the imaging system comprises a radiation source configured to send radiation toward the object and toward the image sensor,
    wherein in capturing the multiple partial images of the object, the image sensor uses radiation of the radiation from the radiation source that has transmitted through the object, and
    wherein said capturing the multiple partial images comprises rotating the radiation source and the image sensor about the axis while the radiation source and the image sensor remain stationary with respect to each other.
  16. The method of claim 15,
    wherein said capturing the multiple partial images further comprises translating the image sensor with respect to the object along a direction parallel to the axis from a circular orbit of the circular orbits (i), i=1, …, M to another circular orbit of the circular orbits (i), i=1, …, M while the radiation source and the object are stationary with respect to each other.
  17. The method of claim 15,
    wherein the radiation sent by the radiation source comprises X-rays.
  18. The method of claim 15,
    wherein the radiation sent by the radiation source comprises radiation pulses, and
    wherein radiation of each radiation pulse of the radiation pulses that has transmitted through the object is used by the image sensor for capturing a partial image of the multiple partial images.
  19. The method of claim 1,
    wherein the image sensor comprises P active areas,
    wherein each active area of the P active areas comprises multiple sensing elements,
    wherein the P active areas are arranged in Q active area rows,
    wherein each active area row of the Q active area rows comprises multiple active areas of the P active areas,
    wherein for each active area row of the Q active area rows, a straight line perpendicular to the axis intersects all active areas of said each active area row, and
    wherein P and Q are integers greater than 1.
  20. The method of claim 19,
    wherein any two adjacent active areas of any active area row of the Q active area rows overlap each other with respect to a direction which is perpendicular to a best-fit plane that intersects all sensing elements of the image sensor.
  21. The method of claim 19,
    wherein the image sensor further comprises a row gap between any two adjacent active area rows of the Q active area rows, and
    wherein said row gap is along a direction perpendicular to the axis.
  22. The method of claim 1,
    wherein the image sensor moves on the circular orbit (1) , then on the circular orbit (2) , …, and then on the circular orbit (M) as the image sensor captures the multiple partial images, and
    wherein for each value of i, i=1, …, (M-1), for every pair of (A) a first image capturing position of the image sensor on the circular orbit (i) and (B) a second image capturing position of the image sensor on the circular orbit (i+1), a straight line going through the first and second image capturing positions is not parallel to the axis.
Priority Applications (2)

PCT/CN2021/141110, filed 2021-12-24 (priority date 2021-12-24): Imaging systems and methods of operation, published as WO2023115516A1.
TW111143634, filed 2022-11-15 (priority date 2021-12-24): Methods of operation of imaging systems, published as TW202326175A (2023-07-01).
