WO2024020829A1 - Imaging system and method for sorting animals by anatomy - Google Patents

Imaging system and method for sorting animals by anatomy

Info

Publication number
WO2024020829A1
Authority
WO
WIPO (PCT)
Prior art keywords
animal
image sensor
radiation
radiation source
perspective
Prior art date
Application number
PCT/CN2022/108138
Other languages
French (fr)
Inventor
Yurun LIU
Peiyan CAO
Original Assignee
Shenzhen Xpectvision Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shenzhen Xpectvision Technology Co., Ltd. filed Critical Shenzhen Xpectvision Technology Co., Ltd.
Priority to PCT/CN2022/108138 (WO2024020829A1)
Priority to TW112120397A (TW202404464A)
Publication of WO2024020829A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B07 SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 Sorting according to other particular properties
    • B07C5/342 Sorting according to other particular properties according to optical properties, e.g. colour

Definitions

  • a radiation detector is a device that measures a property of a radiation. Examples of the property may include a spatial distribution of the intensity, phase, and polarization of the radiation.
  • the radiation measured by the radiation detector may be a radiation that has transmitted through an object.
  • the radiation measured by the radiation detector may be electromagnetic radiation such as infrared light, visible light, ultraviolet light, X-ray, or γ-ray.
  • the radiation may be of other types such as α-rays and β-rays.
  • An imaging system may include one or more image sensors each of which may have one or more radiation detectors.
  • a system for sorting animals by anatomy comprising a radiation source, an image sensor, a path to lead an animal through a space between the radiation source and the image sensor, an actuator movable between a first position and a second position and a controller.
  • the image sensor is configured to capture a first radiographic image of the animal from a first perspective and a second radiographic image of the animal from a second perspective.
  • the image sensor is configured to generate a three-dimensional representation of the animal based on the first and second radiographic images.
  • the image sensor is configured to detect a predetermined feature in the three-dimensional representation of the animal.
  • the controller is configured to locate the actuator in the first position when the image sensor detects the predetermined feature in the three-dimensional representation of the animal and to locate the actuator in the second position when the image sensor detects absence of the predetermined feature in the three-dimensional representation of the animal.
  • the path comprises multiple compartments each of which is configured to hold the animal, each compartment of the multiple compartments comprises a marker distinguishing said each compartment from the remaining compartments of the multiple compartments, and each image of the first radiographic image and the second radiographic image comprises an image of the marker of the compartment holding the animal.
  • the path includes a conveyor.
  • the radiation source is above the conveyor and the image sensor is beneath the conveyor.
  • the radiation source is on one lateral side of the conveyor and the image sensor is on another lateral side of the conveyor opposite the radiation source.
  • the path includes an incline.
  • the system further comprises a first incentive at the first destination and a second incentive at the second destination.
  • the radiation source emits a fan beam having a small angular divergence transverse to the path and a large angular divergence along the path.
  • the first perspective is achieved when a first periphery of the fan beam projects through the animal
  • the second perspective is achieved when a second periphery of the fan beam projects through the animal, such that an angular distance between the first and second perspectives approaches the large angular divergence of the fan beam.
  • the large angular divergence is from 45 degrees to 135 degrees.
  • the large angular divergence is 90 degrees and the angular distance between the first and second perspectives is greater than 85 degrees.
  • the first perspective is achieved when the animal is at a proximal end of the path in the space between the radiation source and the image sensor
  • the second perspective is achieved when the animal is at a distal end of the path in the space between the radiation source and the image sensor.
  • the radiation source emits a first beam and a second beam separated from the first beam by a separation angle, the first perspective is achieved when the first beam projects through the animal, the second perspective is achieved when the second beam projects through the animal, and an angular distance between the first and second perspectives corresponds to the separation angle between the first and second beams.
  • the separation angle is from 45 degrees to 135 degrees.
  • the separation angle is 90 degrees.
  • the radiation source emits a third beam and a fourth beam respectively toward a first portion and a second portion of the image sensor, the first portion and the second portion are discrete from each other, the first portion of the image sensor captures the first radiographic image based on an interaction between the animal and the third beam, the second portion of the image sensor captures the second radiographic image based on an interaction between the animal and the fourth beam, and the first radiographic image and the second radiographic image are captured simultaneously.
  • the predetermined feature includes sex-specific anatomy.
  • the predetermined feature includes species-specific anatomy.
  • a method of sorting animals by anatomy comprising: passing an animal through a space between a radiation source and an image sensor; capturing a first radiographic image of the animal from a first perspective; capturing a second radiographic image of the animal from a second perspective; generating a three-dimensional representation of the animal based on the first and second radiographic images; determining whether a predetermined feature appears in the three-dimensional representation of the animal; directing the animal to a first destination when the predetermined feature appears in the three-dimensional representation of the animal; directing the animal to a second destination when the predetermined feature does not appear in the three-dimensional representation of the animal.
  • the predetermined feature includes sex-specific anatomy.
  • the predetermined feature includes species-specific anatomy.
  • Fig. 1 schematically shows a radiation detector, according to an embodiment.
  • Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector, according to an embodiment.
  • Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector, according to an embodiment.
  • Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector, according to an alternative embodiment.
  • Fig. 5 schematically shows a top view of a radiation detector package including the radiation detector and a printed circuit board (PCB) , according to an embodiment.
  • Fig. 6 schematically shows a cross-sectional view of an image sensor including the packages of Fig. 5 mounted to a system PCB (printed circuit board) , according to an embodiment.
  • Fig. 7 schematically shows a perspective view of an imaging system, according to an embodiment.
  • Fig. 8 shows a flowchart generalizing the operation of the imaging system, according to an embodiment.
  • Fig. 9 schematically shows another arrangement of the imaging system, according to an embodiment.
  • Fig. 10 schematically shows the imaging system, according to an alternative embodiment.
  • Fig. 1 schematically shows a radiation detector 100, as an example.
  • the radiation detector 100 may include an array of pixels 150 (also referred to as sensing elements 150) .
  • the array may be a rectangular array (as shown in Fig. 1) , a honeycomb array, a hexagonal array, or any other suitable array.
  • the array of pixels 150 in the example of Fig. 1 has 4 rows and 7 columns; however, in general, the array of pixels 150 may have any number of rows and any number of columns.
  • Each pixel 150 may be configured to detect radiation from a radiation source (not shown) incident thereon and may be configured to measure a characteristic (e.g., the energy of the particles, the wavelength, and the frequency) of the radiation.
  • the radiation may include radiation particles such as photons (X-rays, gamma rays, etc. ) and subatomic particles (alpha particles, beta particles, etc. )
  • Each pixel 150 may be configured to count numbers of particles of radiation incident thereon whose energy falls in a plurality of bins of energy, within a period of time. All the pixels 150 may be configured to count the numbers of particles of radiation incident thereon within a plurality of bins of energy within the same period of time. When the incident particles of radiation have similar energy, the pixels 150 may be simply configured to count numbers of particles of radiation incident thereon within a period of time, without measuring the energy of the individual particles of radiation.
  • Each pixel 150 may have its own analog-to-digital converter (ADC) configured to digitize an analog signal representing the energy of an incident particle of radiation into a digital signal, or to digitize an analog signal representing the total energy of a plurality of incident particles of radiation into a digital signal.
  • the pixels 150 may be configured to operate in parallel. For example, when one pixel 150 measures an incident particle of radiation, another pixel 150 may be waiting for a particle of radiation to arrive. The pixels 150 may not have to be individually addressable.
  • the radiation detector 100 described here may have applications such as in an X-ray telescope, X-ray mammography, industrial X-ray defect detection, X-ray microscopy or microradiography, X-ray casting inspection, X-ray non-destructive testing, X-ray weld inspection, X-ray digital subtraction angiography, etc. It may be suitable to use this radiation detector 100 in place of a photographic plate, a photographic film, a PSP plate, an X-ray image intensifier, a scintillator, or another semiconductor X-ray detector.
  • Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector 100 of Fig. 1 along a line 2-2, according to an embodiment.
  • the radiation detector 100 may include a radiation absorption layer 110 and an electronics layer 120 (which may include one or more ASICs or application-specific integrated circuits) for processing and analyzing electrical signals which incident radiation generates in the radiation absorption layer 110.
  • the radiation detector 100 may or may not include a scintillator (not shown) .
  • the radiation absorption layer 110 may include a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof.
  • the semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
  • the radiation absorption layer 110 may include one or more diodes (e.g., p-i-n or p-n) formed by a first doped region 111 and one or more discrete regions 114 of a second doped region 113.
  • the second doped region 113 may be separated from the first doped region 111 by an optional intrinsic region 112.
  • the discrete regions 114 may be separated from one another by the first doped region 111 or the intrinsic region 112.
  • the first doped region 111 and the second doped region 113 may have opposite types of doping (e.g., region 111 is p-type and region 113 is n-type, or region 111 is n-type and region 113 is p-type) .
  • each of the discrete regions 114 of the second doped region 113 forms a diode with the first doped region 111 and the optional intrinsic region 112.
  • the radiation absorption layer 110 has a plurality of diodes (more specifically, 7 diodes corresponding to 7 pixels 150 of one row in the array of Fig. 1, of which only 2 pixels 150 are labeled in Fig. 3 for simplicity) .
  • the plurality of diodes may have an electrical contact 119A as a shared (common) electrode.
  • the first doped region 111 may also have discrete portions.
  • the electronics layer 120 may include an electronic system 121 suitable for processing or interpreting signals generated by the radiation incident on the radiation absorption layer 110.
  • the electronic system 121 may include an analog circuitry such as a filter network, amplifiers, integrators, and comparators, or a digital circuitry such as a microprocessor, and memory.
  • the electronic system 121 may include one or more ADCs (analog to digital converters) .
  • the electronic system 121 may include components shared by the pixels 150 or components dedicated to a single pixel 150.
  • the electronic system 121 may include an amplifier dedicated to each pixel 150 and a microprocessor shared among all the pixels 150.
  • the electronic system 121 may be electrically connected to the pixels 150 by vias 131. Space among the vias may be filled with a filler material 130, which may increase the mechanical stability of the connection of the electronics layer 120 to the radiation absorption layer 110. Other bonding techniques are possible to connect the electronic system 121 to the pixels 150 without using the vias 131.
  • When radiation from the radiation source (not shown) hits the radiation absorption layer 110 including diodes, particles of the radiation may be absorbed and generate one or more charge carriers (e.g., electrons, holes) by a number of mechanisms.
  • the charge carriers may drift to the electrodes of one of the diodes under an electric field.
  • the electric field may be an external electric field.
  • the electrical contact 119B may include discrete portions each of which is in electrical contact with the discrete regions 114.
  • the term “electrical contact” may be used interchangeably with the word “electrode.”
  • the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete regions 114 ( “not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow to a different one of the discrete regions 114 than the rest of the charge carriers) .
  • Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete regions 114 are not substantially shared with another of these discrete regions 114.
  • a pixel 150 associated with a discrete region 114 may be an area around the discrete region 114 in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99%of) charge carriers generated by a particle of the radiation incident therein flow to the discrete region 114. Namely, less than 2%, less than 1%, less than 0.1%, or less than 0.01%of these charge carriers flow beyond the pixel 150.
  • Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, according to an alternative embodiment.
  • the radiation absorption layer 110 may include a resistor of a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof, but does not include a diode.
  • the semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
  • the electronics layer 120 of Fig. 4 is similar to the electronics layer 120 of Fig. 3 in terms of structure and function.
  • When the radiation hits the radiation absorption layer 110 including the resistor but not diodes, it may be absorbed and generate one or more charge carriers by a number of mechanisms.
  • a particle of the radiation may generate 10 to 100,000 charge carriers.
  • the charge carriers may drift to the electrical contacts 119A and 119B under an electric field.
  • the electric field may be an external electric field.
  • the electrical contact 119B may include discrete portions.
  • the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete portions of the electrical contact 119B ( “not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow to a different one of the discrete portions than the rest of the charge carriers) .
  • a pixel 150 associated with a discrete portion of the electrical contact 119B may be an area around the discrete portion in which substantially all (more than 98%, more than 99.5%, more than 99.9%or more than 99.99%of) charge carriers generated by a particle of the radiation incident therein flow to the discrete portion of the electrical contact 119B. Namely, less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow beyond the pixel associated with the one discrete portion of the electrical contact 119B.
  • Fig. 5 schematically shows a top view of a radiation detector package 500 including the radiation detector 100 and a printed circuit board (PCB) 510.
  • the term “PCB” as used herein is not limited to a particular material.
  • a PCB may include a semiconductor.
  • the radiation detector 100 may be mounted to the PCB 510.
  • the wiring between the radiation detector 100 and the PCB 510 is not shown for the sake of clarity.
  • the package 500 may have one or more radiation detectors 100.
  • the PCB 510 may include an input/output (I/O) area 512 not covered by the radiation detector 100 (e.g., for accommodating bonding wires 514) .
  • the radiation detector 100 may have an active area 190 which is where the pixels 150 (Fig. 1) are located.
  • the radiation detector 100 may have a perimeter zone 195 near the edges of the radiation detector 100.
  • the perimeter zone 195 has no pixels 150, and the radiation detector 100 does not detect particles of radiation incident on the perimeter zone 195.
  • Fig. 6 schematically shows a cross-sectional view of an image sensor 600, according to an embodiment.
  • the image sensor 600 may include one or more radiation detector packages 500 of Fig. 5 mounted to a system PCB 650.
  • the electrical connection between the PCBs 510 and the system PCB 650 may be made by bonding wires 514.
  • the PCB 510 may have the I/O area 512 not covered by the radiation detectors 100.
  • the packages 500 may have gaps in between. The gaps may be approximately 1 mm or more.
  • a dead zone of a radiation detector (e.g., the radiation detector 100) is the area of the radiation-receiving surface of the radiation detector, on which incident particles of radiation cannot be detected by the radiation detector.
  • a dead zone of a package (e.g., package 500) is the area of the radiation-receiving surface of the package, on which incident particles of radiation cannot be detected by the radiation detector or detectors in the package.
  • the dead zone of the package 500 includes the perimeter zones 195 and the I/O area 512.
  • a dead zone (e.g., 688) of an image sensor (e.g., image sensor 600) with a group of packages (e.g., packages 500 mounted on the same PCB and arranged in the same layer or in different layers) includes the combination of the dead zones of the packages in the group and the gaps between the packages.
  • the radiation detector 100 (Fig. 1) operating by itself may be considered an image sensor.
  • the package 500 (Fig. 5) operating by itself may be considered an image sensor.
  • the image sensor 600 including the radiation detectors 100 may have the dead zone 688 among the active areas 190 of the radiation detectors 100. However, the image sensor 600 may capture multiple partial images of an object or scene (not shown) one by one, and then these captured partial images may be stitched to form a stitched image of the entire object or scene.
  • the term “image” in the present patent application (including the claims) is not limited to spatial distribution of a property of a radiation (such as intensity).
  • the term “image” may also include the spatial distribution of density of a substance or element.
  • Fig. 7 schematically shows a perspective view of an imaging system 700, according to an embodiment.
  • the imaging system 700 may include the image sensor 600, a radiation source 710, a path 720, an actuator 730, and a controller 740.
  • the radiation source 710 may send a radiation beam 712 toward an animal 790 (e.g., a newly hatched chicken) .
  • the path 720 may move or lead the animal 790 through a space 725 between the radiation source 710 and image sensor 600 so that the image sensor 600 can capture radiographic images of the animal 790 based on the interactions between the animal 790 and the radiation beam 712.
  • the interactions between the animal 790 and the radiation beam 712 may include scenarios such as: (A) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 are blocked by the animal 790, (B) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 travel through the animal 790 without changing their directions, and (C) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 collide with atoms of the animal 790 and thereby change their directions.
  • the path 720 may include a conveyor 723. Note that the animal 790 is shown at different places in Fig. 7 to show the movement of the animal 790 over time.
  • the actuator 730 may move between a first position (as shown in Fig. 7) and a second position (not shown) .
  • when the actuator 730 is in the first position (as shown in Fig. 7), the path 720 may lead the animal 790 from the conveyor 723 to a first destination X; and when the actuator 730 is in the second position (not shown), the path 720 may lead the animal 790 from the conveyor 723 to a second destination Y separate from the first destination X.
  • when the animal 790 is at a position A on the conveyor 723, the image sensor 600 may capture a first radiographic image of the animal 790 from a first perspective. Later when the animal 790 is at a position B on the conveyor 723, the image sensor 600 may capture a second radiographic image of the animal 790 from a second perspective.
  • the image sensor 600 may generate a three-dimensional representation of the animal 790 based on the first and second radiographic images.
  • the image sensor 600 may generate the three-dimensional representation of the animal 790 using the controller 740.
  • the controller 740 may be part of the image sensor 600.
  • the image sensor 600 may detect a predetermined feature in the three-dimensional representation of the animal 790.
  • the image sensor 600 may detect the predetermined feature in the three-dimensional representation of the animal 790 using the controller 740.
  • the controller 740 may be part of the image sensor 600.
  • the predetermined feature may include sex-specific anatomy (e.g., testes of a male chicken) .
  • the predetermined feature may include species-specific anatomy.
  • the controller 740 may (A) locate (i.e., to position or arrange) the actuator 730 in the first position when the image sensor 600 (or the controller 740) detects the presence of the predetermined feature in the three-dimensional representation of the animal 790, and (B) locate the actuator 730 in the second position when the image sensor 600 (or the controller 740) detects the absence of the predetermined feature in the three-dimensional representation of the animal 790.
  • the controller 740 may be part of the image sensor 600.
  • the animal 790 is a newly hatched chicken.
  • the image sensor 600 detects the presence of testes in the three-dimensional representation of the chicken.
  • the controller 740 causes the actuator 730 to move to the first position thereby causing the path 720 to lead the chicken from the conveyor 723 to the first destination X (as shown in Fig. 7) .
  • the imaging system 700 may be used to sort newly hatched chickens based on gender.
  • Fig. 8 shows a flowchart 800 generalizing the operation of the imaging system 700, according to an embodiment.
  • the operation may include passing an animal through a space between a radiation source and an image sensor.
  • the animal 790 is passed through the space 725 between the radiation source 710 and the image sensor 600.
  • the operation may include capturing a first radiographic image of the animal from a first perspective.
  • For example, in the embodiments described above, with reference to Fig. 7, when the animal 790 is at position A on the conveyor 723, the image sensor 600 captures the first radiographic image of the animal 790 from the first perspective.
  • the operation may include capturing a second radiographic image of the animal from a second perspective.
  • For example, when the animal 790 is at position B on the conveyor 723, the image sensor 600 captures the second radiographic image of the animal 790 from the second perspective.
  • the operation may include generating a three-dimensional representation of the animal based on the first and second radiographic images. For example, in the embodiments described above, the image sensor 600 (or the controller 740) generates the three-dimensional representation of the animal 790 based on the first and second radiographic images.
  • the operation may include determining whether a predetermined feature appears in the three-dimensional representation of the animal. For example, in the embodiments described above, with reference to Fig. 7, the image sensor 600 (or the controller 740) determines whether the predetermined feature (e.g., testes) appears in the three-dimensional representation of the animal 790.
  • the operation may include directing the animal to a first destination when the predetermined feature appears in the three-dimensional representation of the animal.
  • the animal 790 is directed to the first destination X when the predetermined feature (e.g., testes) appears in the three-dimensional representation of the animal 790.
  • the operation may include directing the animal to a second destination when the predetermined feature does not appear in the three-dimensional representation of the animal.
  • the animal 790 is directed to the second destination Y when the predetermined feature (e.g., testes) does not appear in the three-dimensional representation of the animal 790.
  • step 870 is not necessarily performed after step 860.
  • the path 720 may further include multiple compartments (not shown) fixed to the belt of the conveyor 723, with each of the multiple compartments being configured to hold the animal 790.
  • each compartment of the multiple compartments may include a marker (e.g., a number) distinguishing said each compartment from the remaining compartments of the multiple compartments.
  • the imaging system 700 may operate as follows. Assume the animal 790 is in a compartment marked with number 5 (called compartment #5) . As the conveyor 723 moves the multiple compartments (including compartment #5) from left to right through the space 725, the image sensor 600 may capture multiple radiographic images of the multiple compartments. Next, in an embodiment, the image sensor 600 (or the controller 740) may analyze the captured radiographic images to pick out at least 2 radiographic images each of which has the image of the #5 marker. As a result, each image of these at least 2 radiographic images should have the image of the animal 790. Then, the image sensor 600 (or the controller 740) may generate a three-dimensional representation of the animal 790 based on these at least 2 radiographic images.
  • the radiation source 710 may be above the conveyor 723, and the image sensor 600 may be beneath the conveyor 723 as shown.
  • the radiation source 710 may be on one lateral side of the conveyor 723, and the image sensor 600 may be on another lateral side of the conveyor 723 opposite the radiation source 710 as shown.
  • the path 720 may include an incline (not shown) instead of the conveyor 723.
  • the incline causes the animal 790 to move downhill from left to right (i.e., from position A to position B) as the image sensor 600 captures the first radiographic image and then the second radiographic image of the animal 790.
  • the imaging system 700 may include (A) a first incentive 750x at the first destination X for motivating the animal 790 to move toward the first destination X, and (B) a second incentive 750y at the second destination Y for motivating the animal 790 to move toward the second destination Y.
  • the radiation beam 712 from the radiation source 710 may be a fan beam having a small angular divergence (narrow beam width) transverse to the path 720, and a large angular divergence (wide beam width) along the path 720.
  • the first perspective is achieved (i.e., the animal 790 is at position A) when a first periphery 712p1 of the radiation beam 712 projects through the animal 790
  • the second perspective is achieved (i.e., the animal 790 is at position B) when a second periphery 712p2 of the radiation beam 712 projects through the animal 790, such that an angular distance between the first and second perspectives (i.e., the angle between the first periphery 712p1 and the second periphery 712p2) approaches the large angular divergence of the radiation beam 712 (see the geometry sketch after this list).
  • the large angular divergence may be from 45 degrees to 135 degrees. In an embodiment, the large angular divergence may be 90 degrees and the angular distance between the first and second perspectives may be greater than 85 degrees.
  • the first perspective is achieved when the animal 790 is at a proximal end (i.e., position A) of the path 720 in the space 725 between the radiation source 710 and the image sensor 600
  • the second perspective is achieved when the animal 790 is at a distal end (i.e., position B) of the path 720 in the space 725 between the radiation source 710 and the image sensor 600.
  • the radiation source 710 sends a radiation beam (i.e., the radiation beam 712) toward the space 725 (in which the animal 790 moves) .
  • the periphery 712p1 and the periphery 712p2 are 2 portions of the radiation beam 712.
  • the periphery 712p1 and the periphery 712p2 may be 2 discrete radiation beams sent by the radiation source 710 toward the space 725 (in which the animal 790 moves) .
  • the periphery 712p1 and the periphery 712p2 may be referred to as the radiation beam 712p1 and the radiation beam 712p2, respectively.
  • the radiation beams 712p1 and 712p2 may be separated from each other by a separation angle.
  • the first perspective may be achieved when the radiation beam 712p1 projects through the animal 790; and the second perspective may be achieved when the radiation beam 712p2 projects through the animal 790.
  • an angular distance between the first and second perspectives corresponds to the separation angle between the radiation beams 712p1 and 712p2.
  • the radiation source 710 may send the radiation beams 712p1 and 712p2 simultaneously or in sequence (i.e., one after another) .
  • the separation angle between the radiation beams 712p1 and 712p2 may be from 45 degrees to 135 degrees. In an embodiment, the separation angle may be 90 degrees.
  • the radiation source 710 may emit a radiation beam 712a and a radiation beam 712b respectively toward a first portion 600a and a second portion 600b of the image sensor 600, wherein the first portion 600a and the second portion 600b are discrete from each other as shown (i.e., the first portion 600a and the second portion 600b have no common point) .
  • the first portion 600a of the image sensor 600 captures the first radiographic image based on the interaction between the animal 790 and the radiation beam 712a
  • the second portion 600b of the image sensor 600 captures the second radiographic image based on the interaction between the animal 790 and the radiation beam 712b, wherein the first radiographic image and the second radiographic image are captured simultaneously.
  • the first radiographic image and the second radiographic image are 2 separate portions of a larger image captured by the image sensor 600 in a single exposure and based on the interactions between the animal 790 and the radiation beams 712a and 712b.
  • the image sensor 600 (or the controller 740) may generate the three-dimensional representation of the animal 790 based on the first and second radiographic images.
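The angular distance between the two perspectives in the fan-beam arrangement is plain geometry. As a sketch only: assume a point source at height h above the conveyor, with positions A and B at horizontal offsets d_A and d_B on opposite sides of the point directly below the source; the angle between the two peripheral rays is then atan(d_A/h) + atan(d_B/h). The symbols and numbers below are assumptions for illustration, not values from the patent.

```python
import math

def angular_distance_deg(h, d_a, d_b):
    """Angle (degrees) between two rays from a point source at height h that
    pass through positions A and B, located at horizontal offsets d_a and d_b
    on opposite sides of the point directly below the source."""
    return math.degrees(math.atan2(d_a, h) + math.atan2(d_b, h))

# Example: source 0.5 m above the conveyor, A and B each 0.5 m from the midpoint
# of the imaged span -> the 90-degree case mentioned above.
print(angular_distance_deg(h=0.5, d_a=0.5, d_b=0.5))  # 90.0
```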

Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)

Abstract

Disclosed herein is a system (700) for sorting animals by anatomy. The system (700) has a radiation source (710), an image sensor (600), a path (720) to lead an animal (790) through a space (725) between them, an actuator (730) movable between different positions and a controller (740). The positions of the actuator (730) cause the path (720) to lead to different destinations. The image sensor (600) can capture radiographic images of the animal (790) from different perspectives, generate a three-dimensional representation of the animal (790) based on the radiographic images, and detect a predetermined feature in the three-dimensional representation of the animal (790). The controller (740) can locate the actuator (730) in one position when the image sensor (600) detects the predetermined feature in the three-dimensional representation of the animal (790) and locate the actuator (730) in another position when the image sensor (600) detects absence of the predetermined feature in the three-dimensional representation of the animal (790).

Description

IMAGING SYSTEM AND METHOD FOR SORTING ANIMALS BY ANATOMY
Background
A radiation detector is a device that measures a property of a radiation. Examples of the property may include a spatial distribution of the intensity, phase, and polarization of the radiation. The radiation measured by the radiation detector may be a radiation that has transmitted through an object. The radiation measured by the radiation detector may be electromagnetic radiation such as infrared light, visible light, ultraviolet light, X-ray, or γ-ray. The radiation may be of other types such as α-rays and β-rays. An imaging system may include one or more image sensors each of which may have one or more radiation detectors.
Summary
Disclosed herein is a system for sorting animals by anatomy, comprising a radiation source, an image sensor, a path to lead an animal through a space between the radiation source and the image sensor, an actuator movable between a first position and a second position and a controller. When the actuator is in the first position the path leads to a first destination and when the actuator is in the second position the path leads to a second destination separate from the first destination. The image sensor is configured to capture a first radiographic image of the animal from a first perspective and a second radiographic image of the animal from a second perspective. The image sensor is configured to generate a three-dimensional representation of the animal based on the first and second radiographic images. The image sensor is configured to detect a predetermined feature in the three-dimensional representation of the animal. The controller is configured to locate the actuator in the first position when the image sensor detects the predetermined feature in the three-dimensional representation of the animal and to locate the actuator in the second position when the image sensor detects absence of the predetermined feature in the three-dimensional representation of the animal.
In an aspect, the path comprises multiple compartments each of which is configured to hold the animal, each compartment of the multiple compartments comprises a marker distinguishing said each compartment from the remaining compartments of the multiple compartments, and each image of the first radiographic image and the second radiographic image comprises an image of the marker of the compartment holding the animal.
In an aspect, the path includes a conveyor.
In an aspect, the radiation source is above the conveyor and the image sensor is beneath the conveyor.
In an aspect, the radiation source is on one lateral side of the conveyor and the image sensor is on another lateral side of the conveyor opposite the radiation source.
In an aspect, the path includes an incline.
In an aspect, the system further comprises a first incentive at the first destination and a second incentive at the second destination.
In an aspect, the radiation source emits a fan beam having a small angular divergence transverse to the path and a large angular divergence along the path.
In an aspect, the first perspective is achieved when a first periphery of the fan beam projects through the animal, and the second perspective is achieved when a second periphery of the fan beam projects through the animal, such that an angular distance between the first and second perspectives approaches the large angular divergence of the fan beam.
In an aspect, the large angular divergence is from 45 degrees to 135 degrees.
In an aspect, the large angular divergence is 90 degrees and the angular distance between the first and second perspectives is greater than 85 degrees.
In an aspect, the first perspective is achieved when the animal is at a proximal end of the path in the space between the radiation source and the image sensor, and the second perspective is achieved when the animal is at a distal end of the path in the space between the radiation source and the image sensor.
In an aspect, the radiation source emits a first beam and a second beam separated from the first beam by a separation angle, the first perspective is achieved when the first beam projects through the animal, the second perspective is achieved when the second beam projects through the animal, and an angular distance between the first and second perspectives corresponds to the separation angle between the first and second beams.
In an aspect, the separation angle is from 45 degrees to 135 degrees.
In an aspect, the separation angle is 90 degrees.
In an aspect, the radiation source emits a third beam and a fourth beam respectively toward a first portion and a second portion of the image sensor, the first portion and the second portion are discrete from each other, the first portion of the image sensor captures the first radiographic image based on an interaction between the animal and the third beam, the second portion of the image sensor captures the second radiographic image based on an  interaction between the animal and the fourth beam, and the first radiographic image and the second radiographic image are captured simultaneously.
In an aspect, the predetermined feature includes sex-specific anatomy.
In an aspect, the predetermined feature includes species-specific anatomy.
Disclosed herein is a method of sorting animals by anatomy, comprising: passing an animal through a space between a radiation source and an image sensor; capturing a first radiographic image of the animal from a first perspective; capturing a second radiographic image of the animal from a second perspective; generating a three-dimensional representation of the animal based on the first and second radiographic images; determining whether a predetermined feature appears in the three-dimensional representation of the animal; directing the animal to a first destination when the predetermined feature appears in the three-dimensional representation of the animal; directing the animal to a second destination when the predetermined feature does not appear in the three-dimensional representation of the animal.
In an aspect, the predetermined feature includes sex-specific anatomy.
In an aspect, the predetermined feature includes species-specific anatomy.
Brief Description of Figures
Fig. 1 schematically shows a radiation detector, according to an embodiment.
Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector, according to an embodiment.
Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector, according to an embodiment.
Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector, according to an alternative embodiment.
Fig. 5 schematically shows a top view of a radiation detector package including the radiation detector and a printed circuit board (PCB) , according to an embodiment.
Fig. 6 schematically shows a cross-sectional view of an image sensor including the packages of Fig. 5 mounted to a system PCB (printed circuit board) , according to an embodiment.
Fig. 7 schematically shows a perspective view of an imaging system, according to an embodiment.
Fig. 8 shows a flowchart generalizing the operation of the imaging system, according to an embodiment.
Fig. 9 schematically shows another arrangement of the imaging system, according to an embodiment.
Fig. 10 schematically shows the imaging system, according to an alternative embodiment.
Detailed Description
RADIATION DETECTOR
Fig. 1 schematically shows a radiation detector 100, as an example. The radiation detector 100 may include an array of pixels 150 (also referred to as sensing elements 150) . The array may be a rectangular array (as shown in Fig. 1) , a honeycomb array, a hexagonal array, or any other suitable array. The array of pixels 150 in the example of Fig. 1 has 4 rows and 7 columns; however, in general, the array of pixels 150 may have any number of rows and any number of columns.
Each pixel 150 may be configured to detect radiation from a radiation source (not shown) incident thereon and may be configured to measure a characteristic (e.g., the energy of the particles, the wavelength, and the frequency) of the radiation. The radiation may include radiation particles such as photons (X-rays, gamma rays, etc.) and subatomic particles (alpha particles, beta particles, etc.). Each pixel 150 may be configured to count numbers of particles of radiation incident thereon whose energy falls in a plurality of bins of energy, within a period of time. All the pixels 150 may be configured to count the numbers of particles of radiation incident thereon within a plurality of bins of energy within the same period of time. When the incident particles of radiation have similar energy, the pixels 150 may be simply configured to count numbers of particles of radiation incident thereon within a period of time, without measuring the energy of the individual particles of radiation.
Each pixel 150 may have its own analog-to-digital converter (ADC) configured to digitize an analog signal representing the energy of an incident particle of radiation into a digital signal, or to digitize an analog signal representing the total energy of a plurality of incident particles of radiation into a digital signal. The pixels 150 may be configured to operate in parallel. For example, when one pixel 150 measures an incident particle of radiation, another pixel 150 may be waiting for a particle of radiation to arrive. The pixels 150 may not have to be individually addressable.
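The per-pixel counting described above amounts to binning incident-particle energies over an exposure period. The sketch below is a minimal illustration of that bookkeeping; the bin edges, event energies, and function name are assumptions for illustration, not taken from the patent.

```python
from bisect import bisect_right

def count_in_bins(energies_keV, bin_edges_keV):
    """Count particles whose energy falls in each bin defined by bin_edges_keV.
    Energies outside the outermost edges are ignored (a simplification)."""
    counts = [0] * (len(bin_edges_keV) - 1)
    for e in energies_keV:
        i = bisect_right(bin_edges_keV, e) - 1
        if 0 <= i < len(counts):
            counts[i] += 1
    return counts

# Hypothetical particle energies recorded by one pixel in one period of time:
events = [12.3, 25.1, 33.9, 47.2, 61.0, 18.4]
edges = [10, 20, 30, 40, 50, 60, 70]   # keV bin edges (assumed)
print(count_in_bins(events, edges))     # [2, 1, 1, 1, 0, 1]
```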
The radiation detector 100 described here may have applications such as in an X-ray telescope, X-ray mammography, industrial X-ray defect detection, X-ray microscopy or microradiography, X-ray casting inspection, X-ray non-destructive testing, X-ray weld inspection, X-ray digital subtraction angiography, etc. It may be suitable to use this radiation detector 100 in place of a photographic plate, a photographic film, a PSP plate, an X-ray image intensifier, a scintillator, or another semiconductor X-ray detector.
Fig. 2 schematically shows a simplified cross-sectional view of the radiation detector 100 of Fig. 1 along a line 2-2, according to an embodiment. Specifically, the radiation detector 100 may include a radiation absorption layer 110 and an electronics layer 120 (which may include one or more ASICs or application-specific integrated circuits) for processing and analyzing electrical signals which incident radiation generates in the radiation absorption layer 110. The radiation detector 100 may or may not include a scintillator (not shown) . The radiation absorption layer 110 may include a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
Fig. 3 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, as an example. Specifically, the radiation absorption layer 110 may include one or more diodes (e.g., p-i-n or p-n) formed by a first doped region 111 and one or more discrete regions 114 of a second doped region 113. The second doped region 113 may be separated from the first doped region 111 by an optional intrinsic region 112. The discrete regions 114 may be separated from one another by the first doped region 111 or the intrinsic region 112. The first doped region 111 and the second doped region 113 may have opposite types of doping (e.g., region 111 is p-type and region 113 is n-type, or region 111 is n-type and region 113 is p-type) . In the example of Fig. 3, each of the discrete regions 114 of the second doped region 113 forms a diode with the first doped region 111 and the optional intrinsic region 112. Namely, in the example in Fig. 3, the radiation absorption layer 110 has a plurality of diodes (more specifically, 7 diodes corresponding to 7 pixels 150 of one row in the array of Fig. 1, of which only 2 pixels 150 are labeled in Fig. 3 for simplicity) . The plurality of diodes may have an electrical contact 119A as a shared (common) electrode. The first doped region 111 may also have discrete portions.
The electronics layer 120 may include an electronic system 121 suitable for processing or interpreting signals generated by the radiation incident on the radiation absorption layer  110. The electronic system 121 may include an analog circuitry such as a filter network, amplifiers, integrators, and comparators, or a digital circuitry such as a microprocessor, and memory. The electronic system 121 may include one or more ADCs (analog to digital converters) . The electronic system 121 may include components shared by the pixels 150 or components dedicated to a single pixel 150. For example, the electronic system 121 may include an amplifier dedicated to each pixel 150 and a microprocessor shared among all the pixels 150. The electronic system 121 may be electrically connected to the pixels 150 by vias 131. Space among the vias may be filled with a filler material 130, which may increase the mechanical stability of the connection of the electronics layer 120 to the radiation absorption layer 110. Other bonding techniques are possible to connect the electronic system 121 to the pixels 150 without using the vias 131.
When radiation from the radiation source (not shown) hits the radiation absorption layer 110 including diodes, particles of the radiation may be absorbed and generate one or more charge carriers (e.g., electrons, holes) by a number of mechanisms. The charge carriers may drift to the electrodes of one of the diodes under an electric field. The electric field may be an external electric field. The electrical contact 119B may include discrete portions each of which is in electrical contact with the discrete regions 114. The term “electrical contact” may be used interchangeably with the word “electrode. ” In an embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete regions 114 ( “not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow to a different one of the discrete regions 114 than the rest of the charge carriers) . Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete regions 114 are not substantially shared with another of these discrete regions 114. A pixel 150 associated with a discrete region 114 may be an area around the discrete region 114 in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99%of) charge carriers generated by a particle of the radiation incident therein flow to the discrete region 114. Namely, less than 2%, less than 1%, less than 0.1%, or less than 0.01%of these charge carriers flow beyond the pixel 150.
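The "not substantially shared" definition above is a simple fraction test. A minimal sketch of that check, with the carrier tallies invented for illustration:

```python
def substantially_shared(carriers_to_other_region, total_carriers, threshold=0.02):
    """True if the fraction of charge carriers drifting to a different discrete
    region exceeds the threshold (2% here, the loosest bound quoted above)."""
    return carriers_to_other_region / total_carriers > threshold

# Example: 50 of 40,000 carriers reach a neighboring region (0.125%), so the
# carriers are not substantially shared under the 2% criterion.
print(substantially_shared(50, 40_000))  # False
```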
Fig. 4 schematically shows a detailed cross-sectional view of the radiation detector 100 of Fig. 1 along the line 2-2, according to an alternative embodiment. More specifically, the radiation absorption layer 110 may include a resistor of a semiconductor material such as  silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof, but does not include a diode. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest. In an embodiment, the electronics layer 120 of Fig. 4 is similar to the electronics layer 120 of Fig. 3 in terms of structure and function.
When the radiation hits the radiation absorption layer 110 including the resistor but not diodes, it may be absorbed and generate one or more charge carriers by a number of mechanisms. A particle of the radiation may generate 10 to 100,000 charge carriers. The charge carriers may drift to the  electrical contacts  119A and 119B under an electric field. The electric field may be an external electric field. The electrical contact 119B may include discrete portions. In an embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single particle of the radiation are not substantially shared by two different discrete portions of the electrical contact 119B ( “not substantially shared” here means less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow to a different one of the discrete portions than the rest of the charge carriers) . Charge carriers generated by a particle of the radiation incident around the footprint of one of these discrete portions of the electrical contact 119B are not substantially shared with another of these discrete portions of the electrical contact 119B. A pixel 150 associated with a discrete portion of the electrical contact 119B may be an area around the discrete portion in which substantially all (more than 98%, more than 99.5%, more than 99.9%or more than 99.99%of) charge carriers generated by a particle of the radiation incident therein flow to the discrete portion of the electrical contact 119B. Namely, less than 2%, less than 0.5%, less than 0.1%, or less than 0.01%of these charge carriers flow beyond the pixel associated with the one discrete portion of the electrical contact 119B.
RADIATION DETECTOR PACKAGE
Fig. 5 schematically shows a top view of a radiation detector package 500 including the radiation detector 100 and a printed circuit board (PCB) 510. The term “PCB” as used herein is not limited to a particular material. For example, a PCB may include a semiconductor. The radiation detector 100 may be mounted to the PCB 510. The wiring between the radiation detector 100 and the PCB 510 is not shown for the sake of clarity. The package 500 may have one or more radiation detectors 100. The PCB 510 may include an input/output (I/O) area 512 not covered by the radiation detector 100 (e.g., for accommodating bonding wires 514) . The radiation detector 100 may have an active area 190 which is where the pixels 150 (Fig. 1) are  located. The radiation detector 100 may have a perimeter zone 195 near the edges of the radiation detector 100. The perimeter zone 195 has no pixels 150, and the radiation detector 100 does not detect particles of radiation incident on the perimeter zone 195.
IMAGE SENSOR
Fig. 6 schematically shows a cross-sectional view of an image sensor 600, according to an embodiment. The image sensor 600 may include one or more radiation detector packages 500 of Fig. 5 mounted to a system PCB 650. The electrical connection between the PCBs 510 and the system PCB 650 may be made by bonding wires 514. In order to accommodate the bonding wires 514 on the PCB 510, the PCB 510 may have the I/O area 512 not covered by the radiation detectors 100. In order to accommodate the bonding wires 514 on the system PCB 650, the packages 500 may have gaps in between. The gaps may be approximately 1 mm or more. Particles of radiation incident on the perimeter zones 195, on the I/O area 512, or on the gaps cannot be detected by the packages 500 on the system PCB 650. A dead zone of a radiation detector (e.g., the radiation detector 100) is the area of the radiation-receiving surface of the radiation detector, on which incident particles of radiation cannot be detected by the radiation detector. A dead zone of a package (e.g., package 500) is the area of the radiation-receiving surface of the package, on which incident particles of radiation cannot be detected by the radiation detector or detectors in the package. In this example shown in Fig. 5 and Fig. 6, the dead zone of the package 500 includes the perimeter zones 195 and the I/O area 512. A dead zone (e.g., 688) of an image sensor (e.g., image sensor 600) with a group of packages (e.g., packages 500 mounted on the same PCB and arranged in the same layer or in different layers) includes the combination of the dead zones of the packages in the group and the gaps between the packages.
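Since the dead zone of an image sensor is defined above as a combination of areas, it can be tallied as a simple sum. The sketch below does so; all of the areas are hypothetical numbers, not dimensions from the patent.

```python
def image_sensor_dead_zone(package_dead_zones_mm2, gap_areas_mm2):
    """Total dead zone of an image sensor: the dead zones of its packages
    (perimeter zones + I/O areas) plus the gaps between the packages."""
    return sum(package_dead_zones_mm2) + sum(gap_areas_mm2)

# Example: two packages, each with a 120 mm^2 dead zone, and one ~1 mm gap
# spanning 30 mm of detector width (30 mm^2). All numbers assumed.
print(image_sensor_dead_zone([120, 120], [30]))  # 270
```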
In an embodiment, the radiation detector 100 (Fig. 1) operating by itself may be considered an image sensor. In an embodiment, the package 500 (Fig. 5) operating by itself may be considered an image sensor.
The image sensor 600 including the radiation detectors 100 may have the dead zone 688 among the active areas 190 of the radiation detectors 100. However, the image sensor 600 may capture multiple partial images of an object or scene (not shown) one by one, and then these captured partial images may be stitched to form a stitched image of the entire object or scene.
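One straightforward way to realize the stitching described above is to paste each partial image into a full-size canvas at the offset at which it was captured. The numpy sketch below shows the idea; the image sizes and offsets are assumptions, and a real implementation would also handle overlap and the dead zone 688.

```python
import numpy as np

def stitch(partials, offsets, full_shape):
    """Paste partial images (2-D arrays) into a canvas at the given (row, col)
    offsets; areas covered by no capture remain 0."""
    canvas = np.zeros(full_shape, dtype=float)
    for img, (r, c) in zip(partials, offsets):
        canvas[r:r + img.shape[0], c:c + img.shape[1]] = img
    return canvas

# Two 4x7 partial captures stitched side by side into a 4x14 image (assumed sizes).
a, b = np.ones((4, 7)), 2 * np.ones((4, 7))
print(stitch([a, b], [(0, 0), (0, 7)], (4, 14)).shape)  # (4, 14)
```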
The term “image” in the present patent application (including the claims) is not limited to spatial distribution of a property of a radiation (such as intensity) . For example, the term “image” may also include the spatial distribution of density of a substance or element.
IMAGING SYSTEM
Fig. 7 schematically shows a perspective view of an imaging system 700, according to an embodiment. In an embodiment, the imaging system 700 may include the image sensor 600, a radiation source 710, a path 720, an actuator 730, and a controller 740.
In an embodiment, the radiation source 710 may send a radiation beam 712 toward an animal 790 (e.g., a newly hatched chicken) . In an embodiment, the path 720 may move or lead the animal 790 through a space 725 between the radiation source 710 and image sensor 600 so that the image sensor 600 can capture radiographic images of the animal 790 based on the interactions between the animal 790 and the radiation beam 712.
The interactions between the animal 790 and the radiation beam 712 may include scenarios such as: (A) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 are blocked by the animal 790, (B) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 travel through the animal 790 without changing their directions, and (C) some of the radiation particles of the radiation beam 712 that are incident on the animal 790 collide with atoms of the animal 790 and thereby change their directions.
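The patent describes these interactions only qualitatively. For orientation, scenarios (A) and (B) are commonly summarized by the textbook Beer-Lambert attenuation law; this is a standard model offered for context, not something stated in the disclosure:

```latex
I = I_0 \, e^{-\mu x}
```

where I_0 is the intensity incident on the animal 790, \mu is the linear attenuation coefficient of the traversed material, x is the path length through the animal, and I is the transmitted intensity reaching the image sensor 600.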
In an embodiment, the path 720 may include a conveyor 723. Note that the animal 790 is shown at different places in Fig. 7 to show the movement of the animal 790 over time.
In an embodiment, the actuator 730 may move between a first position (as shown in Fig. 7) and a second position (not shown) . When the actuator 730 is in the first position (as shown in Fig. 7) , the path 720 may lead the animal 790 from the conveyor 723 to a first destination X; and when the actuator 730 is in the second position (not shown) , the path 720 may lead the animal 790 from the conveyor 723 to a second destination Y separate from the first destination X.
OPERATION OF IMAGING SYSTEM
In an embodiment, when the animal 790 is at a position A on the conveyor 723, the image sensor 600 may capture a first radiographic image of the animal 790 from a first perspective. Later, when the animal 790 is at a position B on the conveyor 723, the image sensor 600 may capture a second radiographic image of the animal 790 from a second perspective.
In an embodiment, after the image sensor 600 captures the first and second radiographic images, the image sensor 600 may generate a three-dimensional representation of the animal 790 based on the first and second radiographic images. In an example, the image sensor 600 may generate the three-dimensional representation of the animal 790 using the controller 740. The controller 740 may be part of the image sensor 600.
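By way of illustration only, one way to obtain such a three-dimensional representation is to triangulate corresponding features found in the first and second radiographic images. The Python sketch below estimates the 3D position of a single feature as the point of closest approach of the two source-to-detector rays; the coordinate inputs and the use of NumPy are assumptions made for this sketch, not part of the disclosed embodiments.

```python
import numpy as np

def triangulate_feature(source_a, pixel_a, source_b, pixel_b):
    """Estimate the 3D location of one feature from two radiographic projections.

    source_a, source_b : 3D positions of the radiation source for the two exposures
    pixel_a, pixel_b   : 3D positions on the detector where the feature appears
                         in the first and second radiographic images
    Returns the midpoint of the closest approach of the two source-to-detector rays.
    """
    o1, o2 = np.asarray(source_a, float), np.asarray(source_b, float)
    d1 = np.asarray(pixel_a, float) - o1
    d2 = np.asarray(pixel_b, float) - o2
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; the two perspectives are too close")
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    # midpoint between the closest points on the two rays
    return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0
```

Repeating this for many corresponding features yields a sparse three-dimensional representation in which a predetermined feature can then be searched for.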
In an embodiment, after the three-dimensional representation of the animal 790 is generated, the image sensor 600 may detect a predetermined feature in the three-dimensional representation of the animal 790. In an example, the image sensor 600 may detect the predetermined feature in the three-dimensional representation of the animal 790 using the controller 740. The controller 740 may be part of the image sensor 600.
In an embodiment, the predetermined feature may include sex-specific anatomy (e.g., testes of a male chicken) . In an embodiment, the predetermined feature may include species-specific anatomy.
In an embodiment, the controller 740 may (A) locate (i.e., position or arrange) the actuator 730 in the first position when the image sensor 600 (or the controller 740) detects the presence of the predetermined feature in the three-dimensional representation of the animal 790, and (B) locate the actuator 730 in the second position when the image sensor 600 (or the controller 740) detects the absence of the predetermined feature in the three-dimensional representation of the animal 790. The controller 740 may be part of the image sensor 600.
For example, with reference to Fig. 7, assume the animal 790 is a newly hatched chicken. Assume further that the image sensor 600 detects the presence of testes in the three-dimensional representation of the chicken. As a result, the controller 740 causes the actuator 730 to move to the first position thereby causing the path 720 to lead the chicken from the conveyor 723 to the first destination X (as shown in Fig. 7) .
Assume a second newly hatched chicken moves through the imaging system 700 in a similar manner. Assume further that the image sensor 600 detects the absence of testes in the three-dimensional representation of the second chicken. As a result, the controller 740 causes the actuator 730 to move to the second position thereby causing the path 720 to lead the second chicken from the conveyor 723 to the second destination Y. In other words, the imaging system 700 may be used to sort newly hatched chickens by sex.
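A minimal Python sketch of this routing decision, assuming a hypothetical actuator object with a move_to () method (the actuator interface is not specified in this disclosure):

```python
FIRST_POSITION = "first"    # routes the animal to destination X
SECOND_POSITION = "second"  # routes the animal to destination Y

def route_animal(actuator, feature_detected: bool) -> None:
    """Position the actuator according to the feature-detection result."""
    if feature_detected:
        actuator.move_to(FIRST_POSITION)   # predetermined feature present
    else:
        actuator.move_to(SECOND_POSITION)  # predetermined feature absent
```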
FLOWCHART GENERALIZING OPERATION OF IMAGING SYSTEM
Fig. 8 shows a flowchart 800 generalizing the operation of the imaging system 700, according to an embodiment. In step 810, the operation may include passing an animal through a space between a radiation source and an image sensor. For example, in the embodiments described above, with reference to Fig. 7, the animal 790 is passed through the space 725 between the radiation source 710 and the image sensor 600.
In step 820, the operation may include capturing a first radiographic image of the animal from a first perspective. For example, in the embodiments described above, with reference to Fig. 7, when the animal 790 is at position A on the conveyor 723, the image sensor 600 captures the first radiographic image of the animal 790 from the first perspective.
In step 830, the operation may include capturing a second radiographic image of the animal from a second perspective. For example, in the embodiments described above, with reference to Fig. 7, when the animal 790 is at position B on the conveyor 723, the image sensor 600 captures the second radiographic image of the animal 790 from the second perspective.
In step 840, the operation may include generating a three-dimensional representation of the animal based on the first and second radiographic images. For example, in the embodiments described above, with reference to Fig. 7, the image sensor 600 (or the controller 740) generates the three-dimensional representation of the animal 790 based on the first and second radiographic images.
In step 850, the operation may include determining whether a predetermined feature appears in the three-dimensional representation of the animal. For example, in the embodiments described above, with reference to Fig. 7, the image sensor 600 (or the controller 740) determines whether the predetermined feature (e.g., testes) appears in the three-dimensional representation of the animal 790.
In step 860, the operation may include directing the animal to a first destination when the predetermined feature appears in the three-dimensional representation of the animal. For example, in the embodiments described above, with reference to Fig. 7, the animal 790 is directed to the first destination X when the predetermined feature (e.g., testes) appears in the three-dimensional representation of the animal 790.
In step 870, the operation may include directing the animal to a second destination when the predetermined feature does not appear in the three-dimensional representation of the animal. For example, in the embodiments described above, with reference to Fig. 7, the animal 790 is directed to the second destination Y when the predetermined feature (e.g., testes) does not appear in the three-dimensional representation of the animal 790. Note that step 870 is not necessarily performed after step 860.
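The flowchart 800 may be summarized by the following Python sketch; the method names (capture_image, reconstruct_3d, detect_feature, direct_to) are hypothetical placeholders for the steps described above and are not defined by this disclosure.

```python
def sort_animal(image_sensor, controller, animal):
    """One pass through steps 810-870 of flowchart 800 for a single animal."""
    # steps 810-830: pass the animal through the beam and capture two views
    image_a = image_sensor.capture_image(animal, perspective="first")
    image_b = image_sensor.capture_image(animal, perspective="second")

    # step 840: generate a three-dimensional representation from the two views
    model_3d = image_sensor.reconstruct_3d(image_a, image_b)

    # step 850: determine whether the predetermined feature (e.g., testes) appears
    feature_present = image_sensor.detect_feature(model_3d)

    # steps 860-870: direct the animal to destination X or destination Y
    controller.direct_to("X" if feature_present else "Y")
```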
OTHER EMBODIMENTS
COMPARTMENTS FOR ANIMAL
In an embodiment, with reference to Fig. 7, the path 720 may further include multiple compartments (not shown) fixed to the belt of the conveyor 723, with each of the multiple compartments being configured to hold the animal 790. In an embodiment, each compartment of the multiple compartments may include a marker (e.g., a number) distinguishing said each compartment from the remaining compartments of the multiple compartments.
In this case, the imaging system 700 may operate as follows. Assume the animal 790 is in a compartment marked with number 5 (called compartment #5) . As the conveyor 723 moves the multiple compartments (including compartment #5) from left to right through the space 725, the image sensor 600 may capture multiple radiographic images of the multiple compartments. Next, in an embodiment, the image sensor 600 (or the controller 740) may analyze the captured radiographic images to pick out at least 2 radiographic images each of which has the image of the #5 marker. As a result, each image of these at least 2 radiographic images should have the image of the animal 790. Then, the image sensor 600 (or the controller 740) may generate a three-dimensional representation of the animal 790 based on these at least 2 radiographic images.
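A minimal sketch of this selection step in Python, assuming a hypothetical read_marker () function that returns the compartment number visible in a radiographic image (how the marker is read is not specified in this disclosure):

```python
def images_of_compartment(images, compartment_number, read_marker):
    """Return the radiographic images in which the given compartment marker appears."""
    selected = [img for img in images if read_marker(img) == compartment_number]
    if len(selected) < 2:
        raise ValueError("need at least 2 images of the compartment for 3D reconstruction")
    return selected
```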
ARRANGEMENTS OF RADIATION SOURCE AND IMAGE SENSOR
In an embodiment, with reference to Fig. 7, the radiation source 710 may be above the conveyor 723, and the image sensor 600 may be beneath the conveyor 723 as shown. Alternatively, with reference to Fig. 9, the radiation source 710 may be on one lateral side of the conveyor 723, and the image sensor 600 may be on another lateral side of the conveyor 723 opposite the radiation source 710 as shown.
INCLINE
In an embodiment, with reference to Fig. 7, the path 720 may include an incline (not shown) instead of the conveyor 723. With the animal 790 on the incline, the incline causes the animal 790 to move downhill from left to right (i.e., from position A to position B) as the image sensor 600 captures the first radiographic image and then the second radiographic image of the animal 790.
INCENTIVE FOR MOTIVATING ANIMAL TO MOVE
In an embodiment, with reference to Fig. 7, the imaging system 700 may include (A) a first incentive 750x at the first destination X for motivating the animal 790 to move toward the first destination X, and (B) a second incentive 750y at the second destination Y for motivating the animal 790 to move toward the second destination Y.
RADIATION BEAM FROM RADIATION SOURCE
In an embodiment, with reference to Fig. 7, the radiation beam 712 from the radiation source 710 may be a fan beam having a small angular divergence (narrow beam width) transverse to the path 720, and a large angular divergence (wide beam width) along the path 720.
In an embodiment, with reference to Fig. 7, the first perspective is achieved (i.e., the animal 790 is at position A) when a first periphery 712p1 of the radiation beam 712 projects through the animal 790, and the second perspective is achieved (i.e., the animal 790 is at position B) when a second periphery 712p2 of the radiation beam 712 projects through the animal 790, such that an angular distance between the first and second perspectives (i.e., angle α between the first periphery 712p1 and the second periphery 712p2) approaches the large angular divergence (angle θ) of the radiation beam 712.
In an embodiment, with reference to Fig. 7, the large angular divergence (angle θ) may be from 45 degrees to 135 degrees. In an embodiment, the large angular divergence (angle θ) may be 90 degrees and the angular distance between the first and second perspectives (angle α) may be greater than 85 degrees.
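The angular distance (angle α) can be checked from the geometry of Fig. 7 as the angle, measured at the radiation source 710, between the rays toward position A and position B. The Python sketch below uses illustrative coordinates (a source 0.5 m above the conveyor and positions A and B 1.0 m apart); these values are assumptions for the sketch, not values from this disclosure.

```python
import numpy as np

def angular_distance(source, position_a, position_b):
    """Angle (degrees) at the source between the rays toward positions A and B."""
    v1 = np.asarray(position_a, float) - np.asarray(source, float)
    v2 = np.asarray(position_b, float) - np.asarray(source, float)
    cos_alpha = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_alpha, -1.0, 1.0)))

# Illustrative check: a source 0.5 m above the conveyor with positions A and B
# 1.0 m apart and centered under it gives alpha = 90 degrees, i.e., alpha
# approaches a large angular divergence theta of 90 degrees.
print(angular_distance(source=(0.0, 0.5), position_a=(-0.5, 0.0), position_b=(0.5, 0.0)))
```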
PROXIMAL END AND DISTAL END OF PATH
In an embodiment, with reference to Fig. 7, the first perspective is achieved when the animal 790 is at a proximal end (i.e., position A) of the path 720 in the space 725 between the radiation source 710 and the image sensor 600, and the second perspective is achieved when the animal 790 is at a distal end (i.e., position B) of the path 720 in the space 725 between the radiation source 710 and the image sensor 600.
ALTERNATIVE EMBODIMENTS
TWO RADIATION BEAMS INSTEAD OF ONE
In the embodiments described above, with reference to Fig. 7, the radiation source 710 sends a radiation beam (i.e., the radiation beam 712) toward the space 725 (in which the animal 790 moves) . Note that the periphery 712p1 and the periphery 712p2 are 2 portions of the radiation beam 712. Alternatively, the periphery 712p1 and the periphery 712p2 may be 2 discrete radiation beams sent by the radiation source 710 toward the space 725 (in which the animal 790 moves) . Hence, hereafter, in this alternative embodiment, the periphery 712p1 and the periphery 712p2 may be referred to as the radiation beam 712p1 and the radiation beam 712p2, respectively.
In an embodiment, with reference to Fig. 7, the radiation beams 712p1 and 712p2 may be separated from each other by a separation angle (angle α) . In addition, the first perspective may be achieved when the radiation beam 712p1 projects through the animal 790; and the second perspective may be achieved when the radiation beam 712p2 projects through the animal 790. In addition, an angular distance between the first and second perspectives corresponds to the separation angle (angle α) between the radiation beams 712p1 and 712p2. In an embodiment, the radiation source 710 may send the radiation beams 712p1 and 712p2 simultaneously or in sequence (i.e., one after another) .
In an embodiment, with reference to Fig. 7, the separation angle (angle α) between the radiation beams 712p1 and 712p2 may be from 45 degrees to 135 degrees. In an embodiment, the separation angle (angle α) may be 90 degrees.
TWO RADIATION BEAMS NOT FROM ONE POINT
In an embodiment, with reference to Fig. 10, the radiation source 710 may emit a radiation beam 712a and a radiation beam 712b respectively toward a first portion 600a and a second portion 600b of the image sensor 600, wherein the first portion 600a and the second portion 600b are discrete from each other as shown (i.e., the first portion 600a and the second portion 600b have no common point) .
In an embodiment, the first portion 600a of the image sensor 600 captures the first radiographic image based on the interaction between the animal 790 and the radiation beam 712a, and the second portion 600b of the image sensor 600 captures the second radiographic image based on the interaction between the animal 790 and the radiation beam 712b, wherein the first radiographic image and the second radiographic image are captured simultaneously. Note that in this case, the first radiographic image and the second radiographic image are 2 separate portions of a larger image captured by the image sensor 600 in a single exposure and based on the interactions between the animal 790 and the radiation beams 712a and 712b. Next, in an embodiment, the image sensor 600 (or the controller 740) may generate the three-dimensional representation of the animal 790 based on the first and second radiographic images.
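A minimal Python sketch of this split, assuming the single-exposure capture is a 2D NumPy array and that the column ranges covered by the first portion 600a and the second portion 600b are known (the specific indices below are illustrative assumptions):

```python
import numpy as np

def split_exposure(full_image, cols_a, cols_b):
    """Extract the two radiographic images from one single-exposure capture.

    full_image : 2D NumPy array captured by the whole image sensor in one exposure
    cols_a     : (start, stop) columns covered by the first portion 600a
    cols_b     : (start, stop) columns covered by the second portion 600b
    """
    first_image = full_image[:, cols_a[0]:cols_a[1]]
    second_image = full_image[:, cols_b[0]:cols_b[1]]
    return first_image, second_image

# illustrative usage with assumed portion boundaries
exposure = np.zeros((128, 256))
img_a, img_b = split_exposure(exposure, cols_a=(0, 100), cols_b=(156, 256))
```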
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (21)

  1. A system for sorting animals by anatomy, comprising:
    a radiation source;
    an image sensor;
    a path to lead an animal through a space between the radiation source and the image sensor;
    an actuator movable between a first position and a second position, wherein when the actuator is in the first position the path leads to a first destination and when the actuator is in the second position the path leads to a second destination separate from the first destination;
    wherein the image sensor is configured to capture a first radiographic image of the animal from a first perspective and a second radiographic image of the animal from a second perspective;
    wherein the image sensor is configured to generate a three-dimensional representation of the animal based on the first and second radiographic images;
    wherein the image sensor is configured to detect a predetermined feature in the three-dimensional representation of the animal; and
    a controller configured to locate the actuator in the first position when the image sensor detects the predetermined feature in the three-dimensional representation of the animal and to locate the actuator in the second position when the image sensor detects absence of the predetermined feature in the three-dimensional representation of the animal.
  2. The system of claim 1,
    wherein the path comprises multiple compartments each of which is configured to hold the animal,
    wherein each compartment of the multiple compartments comprises a marker distinguishing said each compartment from the remaining compartments of the multiple compartments, and
    wherein each image of the first radiographic image and the second radiographic image comprises an image of the marker of the compartment holding the animal.
  3. The system of claim 1, wherein the path includes a conveyor.
  4. The system of claim 3, wherein the radiation source is above the conveyor and the image sensor is beneath the conveyor.
  5. The system of claim 3, wherein the radiation source is on one lateral side of the conveyor and the image sensor is on another lateral side of the conveyor opposite the radiation source.
  6. The system of claim 1, wherein the path includes an incline.
  7. The system of claim 1, further comprising a first incentive at the first destination and a second incentive at the second destination.
  8. The system of claim 1, wherein the radiation source emits a fan beam having a small angular divergence transverse to the path and a large angular divergence along the path.
  9. The system of claim 8,
    wherein the first perspective is achieved when a first periphery of the fan beam projects through the animal, and
    the second perspective is achieved when a second periphery of the fan beam projects through the animal,
    such that an angular distance between the first and second perspectives approaches the large angular divergence of the fan beam.
  10. The system of claim 9, wherein the large angular divergence is from 45 degrees to 135 degrees.
  11. The system of claim 9, wherein the large angular divergence is 90 degrees and the angular distance between the first and second perspectives is greater than 85 degrees.
  12. The system of claim 8,
    wherein the first perspective is achieved when the animal is at a proximal end of the path in the space between the radiation source and the image sensor, and
    wherein the second perspective is achieved when the animal is at a distal end of the path in the space between the radiation source and the image sensor.
  13. The system of claim 1,
    wherein the radiation source emits a first beam and a second beam separated from the first beam by a separation angle,
    the first perspective is achieved when the first beam projects through the animal,
    the second perspective is achieved when the second beam projects through the animal, and
    an angular distance between the first and second perspectives corresponds to the separation angle between the first and second beams.
  14. The system of claim 13, wherein the separation angle is from 45 degrees to 135 degrees.
  15. The system of claim 13, wherein the separation angle is 90 degrees.
  16. The system of claim 1,
    wherein the radiation source emits a third beam and a fourth beam respectively toward a first portion and a second portion of the image sensor, wherein the first portion and the second portion are discrete from each other,
    wherein the first portion of the image sensor captures the first radiographic image based on an interaction between the animal and the third beam,
    wherein the second portion of the image sensor captures the second radiographic image based on an interaction between the animal and the fourth beam, and
    wherein the first radiographic image and the second radiographic image are captured simultaneously.
  17. The system of claim 1, wherein the predetermined feature includes sex-specific anatomy.
  18. The system of claim 1, wherein the predetermined feature includes species-specific anatomy.
  19. A method of sorting animals by anatomy, comprising:
    passing an animal through a space between a radiation source and an image sensor;
    capturing a first radiographic image of the animal from a first perspective;
    capturing a second radiographic image of the animal from a second perspective;
    generating a three-dimensional representation of the animal based on the first and second radiographic images;
    determining whether a predetermined feature appears in the three-dimensional representation of the animal;
    directing the animal to a first destination when the predetermined feature appears in the three-dimensional representation of the animal;
    directing the animal to a second destination when the predetermined feature does not appear in the three-dimensional representation of the animal.
  20. The method of claim 19, wherein the predetermined feature includes sex-specific anatomy.
  21. The method of claim 19, wherein the predetermined feature includes species-specific anatomy.