WO2016013018A1 - High accuracy infrared measurements - Google Patents


Info

Publication number
WO2016013018A1
Authority
WO
WIPO (PCT)
Prior art keywords
infrared
regions
imaging system
cameras
images
Prior art date
Application number
PCT/IL2015/050760
Other languages
French (fr)
Inventor
Ernest Grimberg
Omer YANAI
Original Assignee
Opgal Optronic Industries Ltd.
Priority date
Filing date
Publication date
Application filed by Opgal Optronic Industries Ltd. filed Critical Opgal Optronic Industries Ltd.
Publication of WO2016013018A1 publication Critical patent/WO2016013018A1/en

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14Arrangements specially adapted for eye photography
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/01Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B5/015By temperature mapping of body part
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/0022Radiation pyrometry, e.g. infrared or optical thermometry for sensing the radiation of moving bodies
    • G01J5/0025Living bodies
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/02Constructional details
    • G01J5/07Arrangements for adjusting the solid angle of collected radiation, e.g. adjusting or orienting field of view, tracking position or encoding angular position
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging


Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Radiation Pyrometers (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Imaging systems and methods are provided, which use visual range and infrared thermal camera(s), set at fixed spatial relations and focused at an intersection of their optical axes, to focus the infrared camera(s). The cameras are moved together, and the object is positioned at the focus of all cameras upon identifying an overlap of the visual range images. An image processor fuses images and derives infrared measurements of different regions of the object. Prior knowledge concerning the temperature distribution in different regions of the object is utilized to enhance infrared measurement accuracy.

Description

HIGH ACCURACY INFRARED MEASUREMENTS
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of Israeli Patent Application No. 233807 filed on July 24, 2014, which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION
1. TECHNICAL FIELD
[0002] The present invention relates to the field of infrared imagery, and more particularly, to enhancing the accuracy of infrared temperature measurements.
2. DISCUSSION OF RELATED ART
[0003] In various medical, industrial and agricultural cases, a very slight temperature difference may be of diagnostic value, yet the actual differences are below the measurement accuracy of current infrared cameras, particularly commercially available, uncooled infrared cameras. For example, various diseases of the eye involving local inflammation, dryness or other conditions may potentially be identified by infrared imaging. However, the temperature differences occurring in such inflammation are very hard to identify by current infrared imaging, as they are too slight.
SUMMARY OF THE INVENTION
[0004] One aspect of the present invention provides an imaging system comprising: (i) at least two visual range cameras and one infrared camera, set at fixed spatial relations and focused at an intersection of their optical axes, (ii) a controller arranged to move the cameras together, maintaining the fixed spatial relations, to determine a capturing distance of the infrared camera from an object by achieving focused overlapping images of the object by the at least two visual range cameras and (iii) an image processor arranged to derive infrared measurements of at least one visually identified region of the captured object.
[0005] These, additional, and/or other aspects and/or advantages of the present invention are set forth in the detailed description which follows; possibly inferable from the detailed description; and/or learnable by practice of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] For a better understanding of embodiments of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections throughout.
[0007] In the accompanying drawings:
[0008] Figure 1 is a high level schematic block diagram of an imaging system according to some embodiments of the invention.
[0009] Figure 2 is a high level flowchart illustrating a method, according to some embodiments of the invention.
DETAILED DESCRIPTION OF THE INVENTION
[0010] With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
[0011] Before at least one embodiment of the invention is explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is applicable to other embodiments and can be practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
[0012] Imaging systems and methods are provided, which use visual range cameras, set at fixed spatial relations and focused at an intersection of their optical axes, to focus an infrared camera set at the same spatial relation. The cameras are moved together, and the object is positioned at the focus of all cameras upon identifying an overlap of the visual range images. An image processor derives infrared measurements of different regions of the object. Prior knowledge concerning the temperature distribution in different regions of the object is utilized to enhance infrared measurement accuracy.
[0013] Figure 1 is a high level schematic block diagram of an imaging system 100 according to some embodiments of the invention.
[0014] Imaging system 100 comprises at least two visual range cameras 110 and one infrared camera 120, set at fixed spatial relations and focused at an intersection 115 of their optical axes 111, 112, respectively. The exact details of the configuration of the cameras may vary according to the geometrical features of the objects that are to be imaged and according to parameters of cameras 110, 120. More than two visual range cameras 110 and more than one infrared camera 120 may be used in certain embodiments.
[0015] For example, in case of imaging eyes, infrared camera 120 may be positioned centrally and flanked by visual range cameras 110. Visual range cameras 110 may be positioned on either side of infrared camera 120 along the horizontal axis, the vertical axis or obliquely.
[0016] Imaging system 100 further comprises a controller 130 arranged to move (135) cameras 110, 120 together, maintaining the fixed spatial relations, to determine a capturing distance D of infrared camera 120 from an object 92 by achieving focused overlapping images of the object by at least two visual range cameras 110. As illustrated in the bottom part of Figure 1 (schematically representing a user interface 150 that provides an interface to controller 130 and/or image processor 140, see below), when object 92 (in the illustrated case the limbus of the eye) is at intersection 115 of optical axes 111, the images captured by visual range cameras 110 overlap (152) to yield a single image in which object 92 is focused. When intersection 115 of optical axes 111 is not on object 92 (151), two non-overlapping images are captured by visual range cameras 110, in which areas peripheral to object 92, rather than object 92 itself, are focused. The fixed spatial relations of visual range cameras 110 are selected to enable bringing intersection 115 of optical axes 111 onto object 92 by bringing the images captured by cameras 110 to overlap via movement 135 (which may be parallel to axis 121 of infrared camera 120 or in a different direction). The distances and angles between cameras 110, 120 are selected to make it easier to adjust capturing distance D by inspecting the non-overlapping images. For example, cameras 110 may be positioned close enough to each other to yield non-overlapping images which are similar enough to be easily brought together. Cameras 110 may also be positioned to avoid obstruction of the images by adjacent anatomical objects such as the nose or the eyelashes.
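Once the two visual images are brought to overlap, the capturing distance D is fixed by the camera geometry. A minimal sketch of that triangulation, assuming a symmetric two-camera layout with a known baseline and toe-in angle (the function name and the numbers are illustrative, not taken from the patent, which fixes these relations mechanically):

```python
import math

def capture_distance(baseline_m: float, toe_in_deg: float) -> float:
    """Distance from the camera baseline to the intersection of the two
    optical axes, for two visual range cameras separated by baseline_m
    whose axes are each rotated toe_in_deg toward the center line.
    Illustrative geometry only."""
    return (baseline_m / 2.0) / math.tan(math.radians(toe_in_deg))

# Example: cameras 6 cm apart, each toed in by 5 degrees.
D = capture_distance(0.06, 5.0)  # roughly 0.34 m to the intersection point
```

When movement 135 brings the images to overlap, the object sits at this intersection, so the infrared camera, focused at the same point, is in focus as well.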
[0017] In certain embodiments, the arrangement of infrared camera 120 to be focused at intersection 115 of optical axes 111 of visual range cameras 110 allows focusing infrared camera 120 on regions which have slight or unrecognizable thermal details and thus cannot be used for infrared image focusing. Moreover, certain embodiments of the disclosed method of operation are completely passive, not delivering any form of radiation. However, in certain embodiments, focusing infrared camera 120 may be carried out actively, using any type of range finder employing, e.g., electro-optical radiation or ultrasound.
[0018] Imaging system 100 further comprises an image processor 140 arranged to derive infrared measurements of at least one visually identified region of the captured object. In particular, the visually identified region(s) may be indistinguishable from their surroundings in infrared.
[0019] In certain embodiments, image processor 140 may be further arranged to compare derived infrared measurements of at least two visually identified regions 156, 157 of the captured object. In the example illustrated at the bottom of Figure 1, a user interface module 155 may enable marking regions with respect to object(s) 92 which may be used for infrared measurements. In certain embodiments, one (156) of the visually identified regions may be a corneal limbus. Utilizing the physiological knowledge that the temperature of the limbus is equal along the circumference of the limbus (at least at the exposed sections), image processor 140 may be configured to define region 156 to have a constant temperature and use this assumption to measure the temperature with enhanced precision. Image processor 140 may be configured to improve the signal to noise ratio (SNR) in region 156, such as a limbus, based on knowledge (e.g., physiological knowledge) that region 156 has a uniform temperature. Defining regions 156, 157 may be carried out manually, automatically, or interactively, with image processor 140 complementing manual selections. In certain embodiments, another one (157) of the visually identified regions may comprise an area within the limbus and/or an area outside the limbus. Processor 140 may be configured to sample different regions 157 and improve signal to noise ratios in regions 156 considered to have a relatively uniform temperature (e.g., using physiological knowledge or generally external knowledge). Image processor 140 may be further configured to improve signal to noise ratios in specified image regions 156 considered to have a relatively uniform temperature by selectively combining infrared temperature measurements from specified regions 156.
[0020] The inventors have discovered that the disclosed principles make it possible to increase the temperature measurement accuracy by an order of magnitude.
SNR is calculated as the ratio Signal/Noise, with Noise calculated (for example) as the standard deviation. By selecting a sample of N measurements from a uniform population with a normal distribution, the Noise (standard deviation) decreases by a factor of √N, and the SNR improves accordingly. In a non-limiting example, by using a region of 100 pixels, one can improve the accuracy by a factor of 10, and reach an accuracy of ±0.01 °C using commercially available, uncooled infrared cameras having an accuracy of ±0.1 °C.
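The √N averaging argument above can be checked with a short simulation; the temperature, per-pixel noise, and region size below mirror the non-limiting example and are otherwise arbitrary:

```python
import random
import statistics

random.seed(0)
TRUE_TEMP = 34.0      # assumed uniform temperature of the region, in deg C
PIXEL_SIGMA = 0.1     # per-pixel noise of an uncooled camera, in deg C
N_PIXELS = 100        # pixels in the visually identified uniform region

# One noisy infrared reading per pixel of the region.
readings = [random.gauss(TRUE_TEMP, PIXEL_SIGMA) for _ in range(N_PIXELS)]

# Averaging N independent readings shrinks the noise of the estimate by
# sqrt(N): here 0.1 / sqrt(100) = 0.01 deg C, a tenfold improvement.
region_temp = statistics.fmean(readings)
```

This relies on the region truly being at one temperature and the pixel noise being independent; correlated noise (e.g., fixed-pattern noise) would reduce the gain.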
[0021] In certain embodiments, imaging system 100 may be associated with a chin rest, onto which patients may rest their chin. The association with the chin rest supports maintaining fixed spatial relations between the cameras during their movement. System 100 may be configured for quick and easy focusing and capturing operations, supported by user interface 150 (e.g., graphical user interface, touch screen etc.).
[0022] Imaging system 100 may comprise user interface 150 configured to display captured visual images and/or captured infrared images and to allow a user to define, possibly iteratively, regions 156, 157. Processor 140 may be configured to suggest region boundaries and/or to identify parts of region 156 (e.g., of the limbus), possibly interactively with a physician.
[0023] Figure 2 is a high level flowchart illustrating a method 200, according to some embodiments of the invention. Data processing stages and control stages of method 200 may be implemented by respective processors, and algorithms may be implemented by respective computer program product(s) comprising a non-transitory computer usable medium having computer usable program code tangibly embodied thereon, the computer usable program code configured to carry out at least part of the respective stages of method 200.
[0024] Method 200 may comprise moving together at least two visual range cameras and one infrared camera (stage 215), set at fixed spatial relations and focused at an intersection of their optical axes (stage 210), to determine a capturing distance of the infrared camera from an object by achieving focused overlapping images of the object by the at least two visual range cameras (stage 220); and deriving infrared measurements of at least one visually identified region of the captured object (stage 230). In certain embodiments, method 200 may further comprise fusing images from the visual and infrared cameras (stage 231).
[0025] The at least one visually identified region may be indistinguishable from its surroundings in infrared; however, method 200 may still enable infrared measurements of the region based on its visual identification and the spatial configuration of the cameras. Moreover, method 200 may comprise increasing SNR by sampling regions of known or presumed identical infrared measurement values (stage 232).
[0026] In certain embodiments, method 200 may further comprise comparing derived infrared measurements of at least two visually identified regions of the captured object (stage 235). Method 200 may further comprise improving signal to noise ratios in specified image regions considered to have a relatively uniform temperature (e.g., due to physiological knowledge) by selectively combining infrared temperature measurements from the specified regions (stage 236). Exemplary embodiments may comprise measuring the limbus, at least parts of which have a uniform temperature, and using the limbus as a temperature reference for regions inside and outside the circumference of the limbus. In certain embodiments, method 200 may further be used to measure and compare temperatures of different regions over time.
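The limbus-as-reference comparison of stages 235–236 can be sketched as follows; the region names and temperature values are hypothetical, chosen only to illustrate the comparison:

```python
import statistics

def relative_temperatures(limbus_px, other_regions):
    """Average the (assumed uniform) limbus pixels into a noise-reduced
    reference temperature, then report each other region's offset from
    that reference. Inputs are per-pixel temperatures in deg C."""
    reference = statistics.fmean(limbus_px)
    return {name: statistics.fmean(px) - reference
            for name, px in other_regions.items()}

deltas = relative_temperatures(
    limbus_px=[34.0, 34.1, 33.9, 34.0],
    other_regions={
        "inside_limbus": [34.5, 34.4],    # e.g., a suspected inflamed area
        "outside_limbus": [33.6, 33.7],
    },
)
```

Reporting offsets from a uniform in-image reference, rather than absolute temperatures, also cancels slow drifts that affect the whole frame equally.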
[0027] In certain embodiments, method 200 may further comprise configuring a user interface to display, and displaying, at least one of the captured visual and infrared images, and allowing a user to define the regions (stage 240).
[0028] Certain embodiments comprise a computer program product comprising a non-transitory computer readable storage medium having a computer readable program embodied therewith, the computer readable program configured to determine a capturing distance of an infrared camera from an object by achieving focused overlapping images of the object by at least two visual range cameras, which are set with the infrared camera at fixed spatial relations and focused at an intersection of the optical axes, and moved together to yield the images.
[0029] In the above description, an embodiment is an example or implementation of the invention. The various appearances of "one embodiment", "an embodiment", "certain embodiments" or "some embodiments" do not necessarily all refer to the same embodiments.
[0030] Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination.
Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.
[0031] Certain embodiments of the invention may include features from different embodiments disclosed above, and certain embodiments may incorporate elements from other embodiments disclosed above. The disclosure of elements of the invention in the context of a specific embodiment is not to be taken as limiting their use to the specific embodiment alone.
[0032] Furthermore, it is to be understood that the invention can be carried out or practiced in various ways and that the invention can be implemented in certain embodiments other than the ones outlined in the description above.
[0033] The invention is not limited to those diagrams or to the corresponding descriptions. For example, flow need not move through each illustrated box or state, or in exactly the same order as illustrated and described.
[0034] Meanings of technical and scientific terms used herein are to be commonly understood as by one of ordinary skill in the art to which the invention belongs, unless otherwise defined.
[0035] While the invention has been described with respect to a limited number of embodiments, these should not be construed as limitations on the scope of the invention, but rather as exemplifications of some of the preferred embodiments. Other possible variations, modifications, and applications are also within the scope of the invention. Accordingly, the scope of the invention should not be limited by what has thus far been described, but by the appended claims and their legal equivalents.

Claims

What is claimed is:
1. An imaging system comprising:
at least two visual range cameras and one infrared camera, set at fixed spatial relations and focused at an intersection of their optical axes;
a controller arranged to move the cameras together, maintaining the fixed spatial relations, to determine a capturing distance of the infrared camera from an object by achieving focused overlapping images of the object by the at least two visual range cameras; and
an image processor arranged to derive infrared temperature measurements of at least one visually identified region of the captured object.
2. The imaging system of claim 1, wherein the image processor is further arranged to fuse images from the at least two visual range cameras and the infrared camera.
3. The imaging system of claim 1, wherein the at least one visually identified region is indistinguishable from its surroundings in infrared.
4. The imaging system of claim 1, wherein the imaging system is configured to compare derived infrared measurements of at least two visually identified regions of the captured object.
5. The imaging system of claim 1, wherein the image processor is further configured to improve signal to noise ratios in specified image regions considered to have a relatively uniform temperature by selectively combining infrared temperature measurements from the specified regions.
6. The imaging system of any one of claims 1-5, wherein one of the visually identified regions is a corneal limbus.
7. The imaging system of any one of claims 1-5, wherein another one of the visually identified regions comprises at least one of an area within the limbus and an area outside the limbus.
8. The imaging system of claim 7, further comprising a user interface configured to display at least one of the captured visual and infrared images and allow a physician to define the regions other than the limbus.
9. A method comprising:
moving together at least two visual range cameras and one infrared camera, set at fixed spatial relations and focused at an intersection of their optical axes, to determine a capturing distance of the infrared camera from an object by achieving focused overlapping images of the object by the at least two visual range cameras; and
deriving infrared temperature measurements of at least one visually identified region of the captured object, wherein the at least one visually identified region is indistinguishable from its surroundings in infrared.
10. The method of claim 9, further comprising fusing images from the visual and infrared cameras.
11. The method of claim 9, further comprising increasing signal to noise ratio (SNR) by sampling regions of known or presumed identical infrared measurement values.
12. The method of claim 9, further comprising comparing derived infrared measurements of at least two visually identified regions of the captured object.
13. The method of claim 9, further comprising improving signal to noise ratios in specified image regions considered to have a relatively uniform temperature by selectively combining infrared temperature measurements from the specified regions.
14. The method of claim 9, further comprising configuring a user interface to display, and displaying, at least one of the captured visual and infrared images and allow a user to define the regions.
15. A computer program product comprising a non-transitory computer readable storage medium having a computer readable program embodied therewith, the computer readable program configured to determine a capturing distance of an infrared camera from an object by achieving focused overlapping images of the object by at least two visual range cameras, which are set with the infrared camera at fixed spatial relations and focused at an intersection of their optical axes, and moved together to yield the images.
PCT/IL2015/050760 2014-07-24 2015-07-23 High accuracy infrared measurements WO2016013018A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IL233807 2014-07-24
IL23380714 2014-07-24
IL236593A IL236593A0 (en) 2014-07-24 2015-01-05 High accuracy infrared measurements
IL236593 2015-01-05

Publications (1)

Publication Number Publication Date
WO2016013018A1 (en) 2016-01-28

Family

ID=54347409

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2015/050760 WO2016013018A1 (en) 2014-07-24 2015-07-23 High accuracy infrared measurements

Country Status (2)

Country Link
IL (1) IL236593A0 (en)
WO (1) WO2016013018A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021204947A1 (en) 2020-04-08 2021-10-14 Heba Bevan Infection and disease sensing systems
GB2603467A (en) * 2021-01-28 2022-08-10 Sita Advanced Travel Solutions Ltd A method and system for screening a user's temperature
US11519602B2 (en) 2019-06-07 2022-12-06 Honeywell International Inc. Processes and systems for analyzing images of a flare burner

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5603328A (en) * 1993-01-18 1997-02-18 The State Of Israel, Ministry Of Defence, Armament Development Authority Infra-red vascular angiography system
US20030020871A1 (en) * 2001-07-30 2003-01-30 Gregg Niven Anterior chamber diameter measurement system from limbal ring measurement
US20090065695A1 (en) * 2007-09-11 2009-03-12 Demarco Robert Infrared camera for locating a target using at least one shaped light source
US20120062842A1 (en) * 2009-04-01 2012-03-15 Centervue S.P.A Instrument for eye examination
US20140002793A1 (en) * 2011-11-04 2014-01-02 Joshua Noel Hogan Non-invasive optical monitoring

Also Published As

Publication number Publication date
IL236593A0 (en) 2015-10-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application — Ref document number: 15825145; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase — Ref country code: DE
122 Ep: pct application non-entry in european phase — Ref document number: 15825145; Country of ref document: EP; Kind code of ref document: A1