WO2017046796A1 - Image data correction based on different viewpoints - Google Patents


Info

Publication number
WO2017046796A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
picture
optical data
image
elements
Prior art date
Application number
PCT/IL2016/051022
Other languages
French (fr)
Inventor
Israel Boaz Arnon
Original Assignee
Real Imaging Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Real Imaging Ltd. filed Critical Real Imaging Ltd.
Publication of WO2017046796A1 publication Critical patent/WO2017046796A1/en
Priority to IL258135A priority Critical patent/IL258135B2/en

Classifications

    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0091: Measuring for diagnostic purposes using light (e.g. diagnosis by transillumination, diascopy, fluorescence), adapted for mammography
    • A61B 5/015: Measuring temperature of body parts by temperature mapping of body part
    • A61B 5/743: Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B 2576/00: Medical imaging apparatus involving image processing or analysis
    • G06V 10/143: Sensing or illuminating at different wavelengths
    • G06V 10/16: Image acquisition using multiple overlapping images; Image stitching
    • G06V 10/98: Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V 2201/03: Recognition of patterns in medical or anatomical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images, e.g. editing

Definitions

  • the present invention, in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a subject-specific function.
  • A thermographic image is typically obtained by receiving, from the body of the subject, radiation at any one of several infrared wavelength ranges, and analyzing the radiation to provide a two-dimensional temperature map of the surface.
  • the thermographic image can be in the form of either or both of a visual image and corresponding temperature data.
  • the output from infrared cameras used for infrared thermography typically provides an image comprising a plurality of pixel data points, each pixel providing temperature information which is visually displayed, using a color code or grayscale code.
  • the temperature information can be further processed by computer software to generate for example, mean temperature for the image, or a discrete area of the image, by averaging temperature data associated with all the pixels or a sub-collection thereof.
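By way of non-limiting illustration, the averaging described above, a mean temperature over the whole image or over a discrete area of it, can be sketched as follows (Python; the function name and the toy values are hypothetical):

```python
import numpy as np

def mean_temperature(temps, mask=None):
    """Mean temperature of a thermographic image, or of a discrete
    area of the image selected by a boolean mask."""
    temps = np.asarray(temps, dtype=float)
    if mask is None:
        return float(temps.mean())
    return float(temps[np.asarray(mask, dtype=bool)].mean())

# A toy 2x3 temperature map (degrees Celsius), one value per pixel.
image = np.array([[36.5, 36.7, 36.6],
                  [37.1, 37.3, 36.8]])
roi = np.array([[False, False, False],
                [True,  True,  False]])  # a discrete area of the image

print(mean_temperature(image))        # mean over all pixels
print(mean_temperature(image, roi))   # mean over the sub-collection
```

In practice the mask would be derived from a segmentation of the site of interest rather than written by hand.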
  • Based on the thermographic image, a physician diagnoses the site and determines, for example, whether or not the site includes an inflammation, while relying heavily on experience and intuition.
  • the non-thermographic image data acquisition functionality acquires non-thermographic image data
  • the thermographic image data acquisition functionality acquires thermographic image data.
  • U.S. Patent No. 7,292,719 discloses a system for determining presence or absence of one or more thermally distinguishable objects in a living body.
  • a combined image generator is configured to combine non-thermographic three-dimensional data of a three-dimensional tissue region in the living body with thermographic two-dimensional data of the tissue region, so as to generate three-dimensional temperature data associated with the three-dimensional tissue region.
  • a method of correcting image data comprises: obtaining at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; identifying picture-elements at overlap portions of the datasets; comparing optical data associated with each of at least some of the identified picture-elements among the datasets, and based on the comparison, correcting optical data associated with picture-elements outside the overlap portions.
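By way of non-limiting illustration, the steps of the method, identifying the overlap, comparing optical data there, and correcting picture-elements outside the overlap, can be sketched as follows, assuming a simple multiplicative correction derived from the overlap means (all names are hypothetical):

```python
import numpy as np

def correct_by_overlap(a, b, overlap_a, overlap_b):
    """Compare optical data of picture-elements that both datasets
    see (the overlap), derive a gain from the comparison, and apply
    it to dataset a, thereby correcting the picture-elements outside
    the overlap as well."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    # Comparison over the overlap: ratio of mean optical data.
    gain = b[overlap_b].mean() / a[overlap_a].mean()
    # Correction applied to the whole of dataset a, including the
    # picture-elements outside the overlap portion.
    return a * gain
```

Here the two datasets are flattened 1-D arrays and `overlap_a`/`overlap_b` index the picture-elements the two views share; a real implementation would operate on registered 2-D or 3-D grids.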
  • the method comprises: for each dataset, calculating a viewing angle for each picture-element over the dataset; and constructing, based on the comparison, a correction function which is specific to the scene and which describes a dependence of the correction of the optical data on the viewing angle.
  • Each of at least two of the datasets is a 3D representation of the scene, the representation having 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data.
  • the optical data comprises visible light data, wherein the comparing and the correcting is applied to visible light data.
  • the optical data comprises ultraviolet light data wherein the comparing and the correcting is applied to ultraviolet light data.
  • the optical data comprises X ray data wherein the comparing and the correcting is applied to X ray data.
  • the optical data comprises thermal data, wherein the comparing and the correcting are applied to thermal data.
  • the method comprises determining presence or absence of a thermally distinguished region in the portion of the body based on the corrected thermal data.
  • the method comprises generating a temperature map of the portion of the body using the corrected thermal data.
  • the method comprises displaying the temperature map on a display and/or transmitting the temperature map to a non-transitory computer readable medium.
  • the body is a living body.
  • the body is a living body and the method comprises determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the portion of the body is a breast of a human subject.
  • the method comprises regenerating at least one of the datasets using the corrected optical data.
  • the method comprises displaying the re-generated dataset on a display and/or transmitting the re-generated dataset to a non-transitory computer readable medium.
  • the method comprises searching for two picture-elements within the overlap that are at viewing angles which are the same in absolute value but opposite in sign; comparing optical data associated with the two picture-elements; and correcting optical data of at least one of the two picture-elements to ensure that the two picture-elements have the same or similar optical data.
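By way of non-limiting illustration, the search for picture-element pairs whose viewing angles are equal in absolute value but opposite in sign, followed by a correction that makes their optical data agree, can be sketched as follows (hypothetical names; here the pair is equalized by simple averaging):

```python
import numpy as np

def equalize_symmetric_pairs(angles, values, tol=1e-6):
    """For picture-elements in the overlap, find pairs whose viewing
    angles are equal in absolute value but opposite in sign, and
    replace both optical values with their average so that the pair
    has the same optical data."""
    angles = np.asarray(angles, dtype=float)
    values = np.asarray(values, dtype=float).copy()
    for i in range(len(angles)):
        for j in range(i + 1, len(angles)):
            # Opposite-sign pair: angles sum to (approximately) zero.
            if abs(angles[i] + angles[j]) < tol and abs(angles[i]) > tol:
                mean = 0.5 * (values[i] + values[j])
                values[i] = values[j] = mean
    return values
```

The quadratic angle dependence described elsewhere in the record could be fitted to the adjustments this pairing produces.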
  • the computer software product comprises a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a data processor, cause the data processor to receive at least two image datasets and execute the method as delineated above and optionally and preferably as further exemplified below.
  • an image correction system comprising: a digital input for receiving at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; and a data processor configured for identifying picture-elements at overlap portions of the datasets, for comparing optical data associated with each of at least some of the identified picture-elements among the datasets, and for correcting, based on the comparison, optical data associated with picture-elements outside the overlap portions.
  • the data processor is configured for calculating, for each dataset, a viewing angle for each picture-element over the dataset, and constructing, based on the comparison, a correction function which is specific to the scene and which describes a dependence of the correction of the optical data on the viewing angle.
  • the correction function is nonlinear with respect to the angle.
  • the correction function comprises a quadratic function.
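By way of non-limiting illustration, a scene-specific quadratic correction function of the viewing angle can be constructed by least-squares fitting the optical-data discrepancies observed in the overlap (hypothetical names and toy data):

```python
import numpy as np

def fit_quadratic_correction(view_angles, discrepancies):
    """Construct a scene-specific correction function: a quadratic in
    the viewing angle, fitted to the optical-data discrepancies
    observed at overlap picture-elements. Returns a callable."""
    coeffs = np.polyfit(view_angles, discrepancies, deg=2)
    return np.poly1d(coeffs)

# Hypothetical overlap measurements: discrepancy grows with |angle|.
angles = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])
discrepancy = 0.002 * angles ** 2      # a clean quadratic, for illustration
correction = fit_quadratic_correction(angles, discrepancy)
```

Once constructed, `correction(angle)` can be evaluated at any viewing angle, including angles of picture-elements outside the overlap.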
  • Each of at least two of the datasets is a 3D representation of the scene, the representation having 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data.
  • the optical data comprises thermal data
  • the data processor is configured to apply the comparison and the correction to thermal data.
  • the data processor is configured for determining presence or absence of a thermally distinguished region in the portion of the body based on the corrected thermal data.
  • the system comprises an image generator configured for generating a temperature map of the portion of the body using the corrected thermal data.
  • the system comprises at least one of a display for displaying the temperature map, and a non-transitory computer readable medium for storing the temperature map.
  • the data processor is configured for determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the system comprises an image generator for re-generating at least one of the datasets using the corrected optical data.
  • the system comprises at least one of a display for displaying the re-generated dataset, and a non-transitory computer readable medium for storing the re-generated dataset.
  • an imaging system comprising: an image generator for generating at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; and the image correction system as delineated above and optionally and preferably as further detailed and exemplified below.
  • Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
  • For example, selected tasks could be performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
  • a network connection is provided as well.
  • a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
  • FIGs. 1A-C are schematic illustrations of a synthesized thermospatial image, according to some embodiments of the present invention;
  • FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention;
  • FIG. 3 is a graph of measured temperatures of several points on a human's skin as a function of the angle between the normal to the skin and the optical axis of a thermal camera;
  • FIG. 4 is a schematic illustration of an image correction system, according to some embodiments of the present invention;
  • FIG. 5 shows experimental results in which thermal data of images were corrected in accordance with some embodiments of the present invention;
  • FIG. 6 is a thermographic image obtained during experiments performed according to some embodiments of the present invention;
  • FIGs. 7A and 7B are visible light images from two different viewing angles, obtained during experiments performed according to some embodiments of the present invention;
  • FIG. 7C shows registration of the images of FIGs. 7A and 7B, obtained during experiments performed according to some embodiments of the present invention;
  • FIG. 7D shows picture-elements for which a registration difference calculated during experiments performed according to some embodiments of the present invention was less than 2 mm;
  • FIGs. 7E and 7F show grey level differences between thermal images before (FIG. 7E) and after (FIG. 7F) thermal data correction performed according to some embodiments of the present invention;
  • FIG. 7G shows the difference between the absolute values of the images in FIGs. 7E and 7F;
  • FIG. 7H shows regions at which the thermal correction of the present embodiments provides improvement;
  • FIG. 8 is a flowchart diagram of a method suitable for correcting image data, in embodiments of the invention in which the correction is based, at least in part, on different viewpoints; and
  • FIG. 9 is a schematic illustration showing two image datasets, according to some embodiments of the present invention.

DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION
  • the present invention, in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a subject-specific function.
  • the present inventors have devised an approach which enables the correction of image data.
  • the image data typically include two or more image datasets, each dataset being arranged gridwise over a plurality of picture-elements, wherein two or more of the datasets are acquired from different viewpoints relative to a scene.
  • the picture-elements form spatial data, wherein each picture-element is associated with optical data that corresponds to an optical field received from a respective portion of the imaged scene.
  • the optical field can be at any wavelength or range of wavelengths. Representative examples of types of optical field suitable for the present embodiments include, without limitation, infrared light (e.g., far infrared light, mid infrared light, near infrared light), visible light, ultraviolet light (e.g., near ultraviolet light, extreme ultraviolet light), and an X ray field (e.g., soft X ray field, hard X ray field).
  • one or more of datasets is optionally and preferably a 3D representation of the scene.
  • the representation has 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data.
  • the body can be any object that reflects, transmits or emits an optical field of the respective type.
  • the body can be a living body of a human subject or animal.
  • the body can be an inanimate object in the scene.
  • the image data is optionally and preferably corrected based on a comparison of optical data associated with picture-elements that describe the same portion of an imaged scene but from different viewpoints.
  • the compared optical data can describe any property of a body or bodies within the imaged scene that can be measured optically. Representative examples include, without limitation, reflectivity to the optical field under analysis, transparency to the optical field under analysis, emissivity of the optical field under analysis, radiance of the optical field under analysis, color of the body or bodies within the imaged scene, and temperature of the body or bodies within the imaged scene.
  • the optical data that is compared according to some embodiments of the present invention is preferably angle-dependent. Specifically, the optical data depends on the viewing angle, more preferably on the absolute value of the viewing angle, of a picture-element under analysis.
  • the viewing angle is conveniently defined between the normal to the surface of a respective body in the imaged scene, at the picture-element under analysis, and the optical axis of the imaging system that provides the optical data associated with that picture-element.
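By way of non-limiting illustration, the viewing angle so defined, the angle between the surface normal at the picture-element and the optical axis of the imaging system, can be computed from the two direction vectors (hypothetical names):

```python
import math

def viewing_angle(normal, optical_axis):
    """Angle (degrees) between the surface normal at a picture-element
    and the optical axis of the imaging system, per the definition
    above, computed from the normalized dot product."""
    dot = sum(n * a for n, a in zip(normal, optical_axis))
    nn = math.sqrt(sum(n * n for n in normal))
    na = math.sqrt(sum(a * a for a in optical_axis))
    return math.degrees(math.acos(dot / (nn * na)))

# A surface patch facing straight up, camera axis tilted by 45 degrees.
print(viewing_angle((0, 0, 1), (0, 1, 1)))  # approximately 45 degrees
```

In a full pipeline the normals would come from the 3D spatial representation (e.g., per-vertex normals of the reconstructed mesh).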
  • corrected image data can be used to determine presence or absence of one or more distinguishable regions in a portion of a body, more preferably a living body.
  • the technique of the present embodiments is applied for correcting thermal data, which can include temperature data or some proxy thereof, such as, but not limited to, grayscale data and/or wavelength data.
  • the corrected thermal data is used for determining the likelihood for the presence of a thermally distinguishable region in the portion of the body.
  • the analysis of the present embodiments can be used to extract properties of the underlying tissue.
  • determination of the likelihood that a thermally distinguished region is present in the portion of the body can be used for assessing whether or not the portion of the body has a pathology such as a tumor or an inflammation.
  • An elevated temperature or a temperature pattern is generally associated with a tumor due to the metabolic abnormality of the tumor and proliferation of blood vessels (angiogenesis) at and/or near the tumor. In a cancerous tumor the cells proliferate faster and thus are more active and generate more heat. This tends to enhance the temperature differential between the tumor itself and the surrounding tissue.
  • the present embodiments can therefore be used for diagnosis of cancer, particularly, but not exclusively breast cancer.
  • the corrected optical data can be used for recoloring an image constituted by the data.
  • the technique of the present embodiments is optionally and preferably applied to surface information that describes the surface of the body.
  • the surface information optionally and preferably comprises optical information and spatial information.
  • the optical information can pertain to any of the aforementioned properties.
  • the optical information comprises data pertaining to heat evacuated from or absorbed by the surface. Since different parts of the surface generally evacuate or absorb different amounts of heat, the thermal information comprises a set of tuples, each comprising the coordinates of a region or a point on the surface and a thermal numerical value (e.g., temperature, thermal energy) associated with the point or region.
  • the thermal information can be transformed to visible signals, in which case the thermal information is in the form of a thermographic image.
  • the optical information can comprise data pertaining to intensity and/or wavelength range and/or color of light emitted, transmitted or reflected by the surface.
  • the optical information can comprise data pertaining to intensities or grey levels of X rays transmitted through the surface.
  • the optical data (e.g., thermal data, visible light data, X ray data) is typically arranged gridwise in a plurality of picture-elements (e.g., pixels, arrangements of pixels, subpixels) representing the image (e.g., thermographic image, when the optical data includes thermal data).
  • Each picture-element is represented by an intensity value or a grey-level and/or a spectrum over the grid.
  • when the picture-element is represented by a spectrum, it typically includes a plurality of intensity values, each corresponding to a range of wavelengths (e.g., a color, when the optical data is visible light data).
  • a spectrum can be obtained by means of picture-elements that are divided into sub-picture-elements; for example, a picture-element can comprise three sub-picture-elements, one corresponding to red light, one corresponding to green light, and one corresponding to blue light, according to the so-called RGB format.
  • the number of different intensity values can be different from the number of grey-levels.
  • an 8-bit display can generate 256 different grey-levels.
  • the number of different intensity values can be much larger.
  • the optical information includes thermal information that spans over a range of 37 °C and is digitized with a resolution of 0.1 °C.
  • Use of higher formats (e.g., 10 bit, 12 bit, 14 bit or higher) is also contemplated.
  • a photon thermal camera can provide information pertaining to the number of photons detected by the camera detector. Such information can extend over a range of about 6000-8000 intensity values.
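By way of non-limiting illustration, the level counts mentioned above, a 37 °C span digitized at 0.1 °C resolution versus the 256 grey-levels of an 8-bit display, can be checked with a short calculation:

```python
# Distinct intensity values needed to digitize a 37 degree C span at a
# resolution of 0.1 degree C, compared with the number of grey-levels
# an 8-bit display can generate.
span_c = 37.0
resolution_c = 0.1
intensity_levels = round(span_c / resolution_c) + 1  # endpoints included
grey_levels_8bit = 2 ** 8

print(intensity_levels, grey_levels_8bit)  # 371 levels vs. 256 grey-levels
assert intensity_levels > grey_levels_8bit
```

This is why the record contemplates 10-bit and higher formats: the thermal intensity range can exceed what an 8-bit grey-level scale distinguishes.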
  • the optical data is provided in 4 bytes (in distinction, for example, from the 3 bytes that are used in conventional RGB images), creating a representation similar to floating point, where the first three bytes represent three wavelength (e.g., color) channels and the fourth byte represents a common exponent to the three wavelength channels.
  • Such a representation is known as High Dynamic Range (HDR).
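By way of non-limiting illustration, a shared-exponent 4-byte representation of this kind, three channel bytes plus one common-exponent byte, similar to the well-known Radiance RGBE format, can be sketched as follows (hypothetical names; quantization details vary between implementations):

```python
import math

def encode_rgbe(r, g, b):
    """Pack three non-negative channel values into three mantissa
    bytes plus one shared exponent byte (RGBE-style)."""
    m = max(r, g, b)
    if m < 1e-38:
        return (0, 0, 0, 0)
    # m = frac * 2**exp with frac in [0.5, 1).
    exp = math.frexp(m)[1]
    scale = 256.0 / 2 ** exp
    return (int(r * scale), int(g * scale), int(b * scale), exp + 128)

def decode_rgbe(r8, g8, b8, e8):
    """Inverse of encode_rgbe, up to quantization error."""
    if e8 == 0:
        return (0.0, 0.0, 0.0)
    scale = 2 ** (e8 - 128) / 256.0
    return (r8 * scale, g8 * scale, b8 * scale)
```

The shared exponent is what gives the representation its floating-point-like dynamic range while spending only one extra byte for all three wavelength channels.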
  • In some embodiments of the present invention the correction technique is applied to intensity values, in some embodiments the correction technique is applied to grey-levels, and in some embodiments the correction technique is applied to hue values calculated based on a combination of sub-pixels. Also contemplated are multi-processing procedures in which these types of values or levels are combined.
  • The term “pixel” is sometimes used herein as an abbreviation for a picture-element. However, this is not intended to limit the meaning of the term “picture-element”, which refers to a unit of the composition of an image.
  • The term “thermographic image” is used throughout the specification without limiting the scope of the present embodiments in any way. Specifically, unless otherwise defined, the use of the term “thermographic image” is not to be considered as limited to the transformation of the thermal information into visible signals.
  • a thermographic image can be stored in the memory of a computer readable medium, preferably a non-transitory computer readable medium, as a set of tuples as described above.
  • the spatial information comprises data pertaining to geometric properties of a surface which at least partially encloses a three-dimensional volume.
  • the surface is non-planar, e.g. , curved.
  • the surface is a two-dimensional object embedded in a three-dimensional space.
  • a surface is a metric space induced by a smooth connected and compact Riemannian 2-manifold.
  • the geometric properties of the surface can be provided explicitly, for example, the slope and curvature (or even other spatial derivatives or combinations thereof) for every point of the surface.
  • the spatial information of the surface is a reduced version of a 3D spatial representation, which may be either a point-cloud or a 3D reconstruction (e.g. , a polygonal mesh or a curvilinear mesh) based on the point cloud.
  • the 3D spatial representation is expressed via a 3D coordinate-system, such as, but not limited to, Cartesian, Spherical, Ellipsoidal, 3D Parabolic or Paraboloidal coordinate 3D system.
  • the spatial data, in some embodiments of the present invention, can be in the form of an image. Since the spatial data represent the surface, such an image is typically a two-dimensional image which, in addition to indicating the lateral extent of body members, further indicates the relative or absolute distance of the body members, or portions thereof, from some reference point, such as the location of the imaging device. Thus, the image typically includes information residing on a surface of a three-dimensional body and not necessarily in the bulk. Yet, it is commonly acceptable to refer to such an image as "a three-dimensional image" because the surface is conveniently defined over a three-dimensional coordinate system. Thus, throughout this specification and in the claims section that follows, the terms "three-dimensional image" and "three-dimensional representation" primarily relate to surface entities.
  • the lateral dimensions of the spatial data are referred to as the x and y dimensions, and the range data (the depth or distance of the body members from a reference point) is referred to as the z dimension.
  • the surface information of a body comprises wavelength (e.g., color) information and spatial information
  • the surface information (wavelength and spatial) of a body is typically in the form of a synthesized representation which includes both wavelength data representing the image (e.g., visible light image) and spatial data representing the surface, where the wavelength data is associated with the spatial data (i.e., a tuple of the spatial data is associated with a color or set of colors of the wavelength data).
  • a spectrospatial representation can be in the form of digital data (e.g., a list of tuples associated with digital data describing spectral quantities) or in the form of a three-dimensional image.
  • the surface information of a body comprises thermal information and spatial information
  • the surface information (thermal and spatial) of a body is typically in the form of a synthesized representation which includes both thermal data representing the thermal image and spatial data representing the surface, where the thermal data is associated with the spatial data (i.e. , a tuple of the spatial data is associated with a heat-related value of the thermal data).
  • Such a representation is referred to as a thermospatial representation.
  • the thermospatial representation can be in the form of digital data (e.g. , a list of tuples associated with digital data describing thermal quantities) or in the form of an image (e.g. , a three-dimensional image color-coded or grey-level coded according to the thermal data).
  • A thermospatial representation in the form of an image is referred to hereinafter as a thermospatial image.
  • the thermospatial representation is optionally and preferably defined over a 3D spatial representation of the body and has thermal data associated with a surface of the 3D spatial representation, and arranged gridwise over the surface in a plurality of picture-elements (e.g. , pixels, arrangements of pixels) each represented by an intensity value or a grey-level over the grid.
  • When the thermospatial representation is in the form of digital data, the digital data describing thermal properties can also be expressed either in terms of intensities or in terms of grey-levels, as described above. A digital thermospatial representation can also correspond to a thermospatial image, whereby each tuple corresponds to a picture-element of the image.
  • thermographic images are mapped or projected onto the surface of the 3D spatial representation to form the thermospatial representation.
  • the thermographic image to be projected onto the surface of the 3D spatial representation preferably comprises thermal data which are expressed over the same coordinate-system as the 3D spatial representation. Any type of thermal data can be used.
  • the thermal data comprises absolute temperature values
  • the thermal data comprises relative temperature values each corresponding, e.g., to a temperature difference between a respective point of the surface and some reference point
  • the thermal data comprises local temperature differences.
  • the thermal data can comprise both absolute and relative temperature values, and the like.
  • the information in the thermographic image also includes the thermal conditions (e.g. , temperature) at one or more reference markers.
  • the acquisition of surface data is typically performed by positioning the reference markers, e.g., by comparing their coordinates in the thermographic image with their coordinates in the 3D spatial representation, to thereby match, at least approximately, other points as well, and hence to form the synthesized thermospatial representation.
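By way of non-limiting illustration, matching reference-marker coordinates between the thermographic image and the 3D spatial representation can be sketched for the simplest case, a pure translation, where the least-squares alignment reduces to the mean offset between corresponding markers (hypothetical names and toy coordinates):

```python
import numpy as np

def marker_translation(markers_thermo, markers_3d):
    """Estimate the translation that best aligns reference-marker
    coordinates in the thermographic image with their coordinates on
    the 3D spatial representation. For a pure translation, the
    least-squares solution is the mean per-marker offset."""
    a = np.asarray(markers_thermo, dtype=float)
    b = np.asarray(markers_3d, dtype=float)
    return (b - a).mean(axis=0)

# Hypothetical markers seen at a constant offset between modalities.
t = marker_translation([[0, 0], [10, 5]], [[2, 1], [12, 6]])
print(t)  # offset used to map thermographic points onto the surface
```

A full registration would also estimate rotation and possibly scale (e.g., via a Procrustes-type fit over the markers) before mapping the remaining points.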
  • A representative thermospatial image, for the case in which the body comprises the breasts of a female or male subject, is illustrated in FIGs. 1A-C, showing a 3D spatial representation illustrated as a non-planar surface (FIG. 1A), a thermographic image illustrated as planar isothermal contours (FIG. 1B), and a synthesized thermospatial image formed by mapping the thermographic image on a surface of the 3D spatial representation (FIG. 1C).
  • the thermal data of the thermospatial image is represented as grey-level values over a grid generally shown at 102. It is to be understood that the representation according to grey-level values is for illustrative purposes and is not to be considered as limiting. As explained above, the processing of thermal data can also be performed using intensity values.
  • A reference marker 101 is also shown, which optionally, but not obligatorily, can be used for the mapping.
  • a series of thermal images of a section of a living body is obtained.
  • Different thermal images of the series include thermal data acquired from the portion of the body at different time instants.
  • Such a series of thermal images can be used by the present embodiments to determine thermal changes that occurred in the portion of the body over time.
  • a series of thermospatial representations of a section of a living body is obtained.
  • Different thermospatial representations of the series include thermal data acquired from the portion of the body at different time instants.
  • Such series of thermospatial representations can be used by the present embodiments to determine thermal and optionally spatial changes that occurred in the portion of the body over time.
  • the series can include any number of thermal images or thermospatial representations. It was found by the inventors of the present invention that two thermal images or thermospatial representations are sufficient to perform the analysis, but more than two thermal images or thermospatial representations (e.g. , 3, 4, 5 or more) can also be used, for example, to increase accuracy of the results and/or to allow selection of best acquisitions.
  • thermospatial representations of a section of the body is obtained, where at least two different thermospatial representations correspond to different viewpoints of the body.
  • the thermographic image and synthesized thermospatial image can be obtained using any technique known in the art, such as the techniques disclosed in International Patent Publication No. WO 2006/003658, U.S. Published Application No. 20010046316, and U.S. Patent Nos. 6,442,419, 6,765,607, 6,965,690, 6,701,081, 6,801,257, 6,201,541, 6,167,151, 6,094,198 and 7,292,719.
  • Some embodiments of the invention can be embodied on a tangible medium such as a computer for performing the method steps. Some embodiments of the invention can be embodied on a computer readable medium, preferably a non-transitory computer readable medium, comprising computer readable instructions for carrying out the method steps. Some embodiments of the invention can also be embodied in an electronic device having digital computer capabilities arranged to run the computer program on the tangible medium or execute the instructions on a computer readable medium. Computer programs implementing method steps of the present embodiments can commonly be distributed to users on a tangible distribution medium. From the distribution medium, the computer programs can be copied to a hard disk or a similar intermediate storage medium. The computer programs can be run by loading the computer instructions either from their distribution medium or their intermediate storage medium into the execution memory of the computer, configuring the computer to act in accordance with the method of this invention. All these operations are well known to those skilled in the art of computer systems.
  • the methods described below can be applied to at least one 3D thermospatial representation of a portion of a body, preferably a living body.
  • the portion of the body can include one or more organs, e.g., a breast or a pair of breasts, or a part of an organ, e.g., a part of a breast.
  • the 3D thermospatial representation has 3D spatial data representing a non-planar surface of the portion of the living body and thermal data associated with the 3D spatial data.
  • FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention.
  • the method begins at 10 and continues to 11 at which one or more image datasets are obtained.
  • the obtained image datasets include at least one 3D thermospatial representation.
  • the obtained image datasets include at least one 3D spectrospatial representation.
  • the method can obtain the image dataset by receiving it from an external source such as an image dataset generator (e.g. , a 3D thermospatial or spectrospatial representation generator), or by generating the image dataset for example, by imaging.
  • an image dataset generator e.g. , a 3D thermospatial or spectrospatial representation generator
  • the method can generate the dataset by combining 3D and thermal imaging.
  • the image dataset includes a 3D spectrospatial representation
  • the method can generate the dataset by visible light 3D imaging.
  • the method optionally continues to 12 at which the spatial and/or optical (e.g. , thermal) data is preprocessed.
  • the preprocessing operation includes definition of one or more spatial boundaries for the surface, so as to define a region-of-interest for the analysis.
  • the preprocessing operation can include defining a spatial boundary between the surface of the portion of the body and surface of the nearby tissue.
  • the surface of the nearby tissue is preferably excluded from the analysis.
  • the preprocessing comprises transformation of coordinates.
  • when the method is executed for correcting image data pertaining to more than one body portion having similar shapes, the method preferably transforms the coordinates of one or more portions of the body so as to ensure that all body portions are described by the same coordinate-system.
  • a representative example is a situation in which the surface data describe a left breast and a right breast. In this situation, the system of coordinates describing one of the breasts can be flipped so as to describe both breasts using the same coordinate-system.
  • the preprocessing can comprise normalization of thermal data. The normalization is useful, for example, when it is desired not to work with too high values of intensities.
  • the normalization is performed so as to transform the range of thermal values within the thermal data to a predetermined range between a predetermined minimal thermal value and a predetermined maximal thermal value. This can be done using a linear transformation as known in the art. A typical value for the predetermined minimal thermal value is 1, and a typical value for the predetermined maximal thermal value is 10. Other ranges or normalization schemes are not excluded from the scope of the present invention.
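A minimal sketch of such a linear normalization, mapping the observed range of thermal values onto a predetermined range such as [1, 10] (Python/NumPy; the function name is illustrative):

```python
import numpy as np

def normalize_thermal(values, lo=1.0, hi=10.0):
    """Linearly map the range of thermal values onto [lo, hi]."""
    values = np.asarray(values, dtype=float)
    vmin, vmax = values.min(), values.max()
    if vmax == vmin:                       # flat data: map everything to lo
        return np.full_like(values, lo)
    return lo + (values - vmin) * (hi - lo) / (vmax - vmin)
```

Other target ranges or normalization schemes can be substituted without changing the structure of the transformation.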
  • the preprocessing operation includes slicing of the surface described by the spatial data to a plurality of slices.
  • the correction procedure described below can be applied separately for each slice.
  • the slicing can be along a normal direction (away from the body), parallel direction or azimuthal direction as desired.
  • the slicing can also be according to anatomical information (for example a different slice for a nipple region). Also contemplated is arbitrary slicing, in which case the surface is sliced to N regions.
  • the preprocessing comprises normalization of the spatial data.
  • the normalization is useful when it is desired to compare between optical data of different portions of the body, for example, body portions having similar shapes but different sizes. These embodiments are particularly useful when the portion of the body is a breast and it is desired to compare the optical (e.g. , thermal) data of breasts of different sizes (e.g., a left breast to a right breast of the same subject, or a breast of one subject to a breast of another subject).
  • the method preferably continues to 13 at which a viewing angle is calculated based on the spatial data for one or more of a plurality of picture-elements over the thermospatial or spectrospatial representation.
  • the viewing angle θ of a given picture-element p is conveniently defined between the normal to the surface of the body at picture-element p and the optical axis of the thermal imaging system that provides the thermal data associated with picture-element p.
  • the viewing angle θ is calculable because the shape of the surface is known from the spatial data, and because the optical axis is known from the optical data (e.g., the thermal data).
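The viewing angle of a picture-element can be sketched, under the stated definition, as the angle between the surface normal at the element and the optical axis of the imaging system (Python/NumPy; illustrative):

```python
import numpy as np

def viewing_angle(normal, optical_axis):
    """Angle (radians) between the surface normal at a picture-element and
    the camera's optical axis; the vectors need not be unit length."""
    n = np.asarray(normal, dtype=float) / np.linalg.norm(normal)
    a = np.asarray(optical_axis, dtype=float) / np.linalg.norm(optical_axis)
    # Clip guards against rounding pushing the dot product outside [-1, 1]
    return np.arccos(np.clip(np.dot(n, a), -1.0, 1.0))
```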
  • the method optionally and preferably continues to 14 at which a correction function g(θ) is applied to each of at least some of the picture-elements for correcting optical (e.g., thermal, spectral, color) data associated with the respective picture-element.
  • when a pixel sensor of the thermal imaging system receives light from the surface, the associated thermal data typically relates to the luminosity of the light multiplied by the thermal imaging system's response to the light, integrated over the wavelength:
  • P(T) = ∫ R(λ)·L(λ,T) dλ
  • where R(λ) is the response of the thermal imaging system to light at wavelength λ,
  • and L(λ,T) is the luminosity of light of wavelength λ arriving from a surface being at a temperature T.
  • the thermal data associated with the pixel sensor s is typically a linear function of P(T).
  • When the pixel sensor of the thermal imaging system receives light that propagates along the optical axis of the thermal imaging system, the corresponding grey level is typically as indicated above. For light rays that arrive from directions that deviate from the optical axis, the thermal imaging system typically employs a Lambertian correction that is proportional to the fourth power of the cosine of the deviation angle. It was found by the present inventors that for some curved objects, particularly living bodies, the Lambertian correction is insufficient, since different parts of the surface have different primary light emission directions.
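The conventional off-axis correction mentioned above, proportional to the fourth power of the cosine of the deviation angle, can be sketched as follows (Python; illustrative only, and, as noted, insufficient on its own for curved bodies):

```python
import math

def lambertian_correct(grey_level, deviation_angle):
    """Conventional cos^4 off-axis correction: divide the observed grey level
    by the fourth power of the cosine of the deviation angle (radians)."""
    return grey_level / math.cos(deviation_angle) ** 4
```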
  • the grey levels provided by the thermal imaging system do not adequately describe the temperature map of the surface.
  • the grey level provided by the thermal imaging system when the picture-element is at a viewing angle θ1 differs from the grey level that would have been provided by the thermal imaging system had the picture-element been at a viewing angle θ2.
  • the optical data provided by the imaging system are corrected such that the corrected optical data of all picture-elements (for which the correction is employed) are the optical data that would have been provided by the imaging system had all these picture-elements been at the same viewing angle relative to the imaging system.
  • the correction can be employed such that for all the picture-elements the corrected optical data are the optical data that would have been provided by the imaging system had all these picture-elements been at zero viewing angle.
  • the above procedure can be written mathematically as follows. Denote the optical (e.g., thermal) data provided by the imaging system (e.g., thermal imaging system) for viewing angle θ by GL(T,θ), and the optical data that would have been provided by the imaging system for zero viewing angle by GL(T,0), where T represents a property of the body in the imaged scene (e.g., temperature), and where both GL(T,θ) and GL(T,0) correspond to the same value of T.
  • the relation between GL(T,θ) and GL(T,0) can be expressed as:
  • GL(T,0) = g(θ)·GL(T,θ)
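A non-limiting sketch of applying the relation GL(T,0) = g(θ)·GL(T,θ), with g supplied either analytically or as an interpolated lookup table (Python/NumPy; function names are illustrative):

```python
import numpy as np

def correct_optical_data(grey_levels, angles, g):
    """Apply the per-element correction GL(T,0) = g(theta) * GL(T,theta)."""
    factors = np.vectorize(g)(np.asarray(angles, dtype=float))
    return np.asarray(grey_levels, dtype=float) * factors

def g_from_lookup(table_angles, table_factors):
    """Build g(theta) by linear interpolation over a stored lookup table."""
    return lambda theta: float(np.interp(theta, table_angles, table_factors))
```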
  • the correction function g(θ) can be applied either before or after the preprocessing.
  • the correction function g(θ) is preferably nonlinear with respect to the angle.
  • FIG. 3 is a graph of measured temperatures of several points on a human's skin as a function of the angle between the normal to the skin and the optical axis of a thermal camera. The data shown were collected from several human subjects. As shown, the angular dependence of the temperature is generally the same for all the tested subjects. Without being bound to any particular theory, it is assumed that a universal correction function g(θ) can be employed for correcting the thermal data irrespective of the subject under analysis.
  • the correction function g(θ) can be stored in a non-transitory computer readable medium as a lookup table, or it can be provided as an analytical function. Such an analytical function can be obtained by parameterizing the correction function g(θ) and calculating the parameters based on experimentally observed angular dependence.
  • the correction function can be predetermined based on the universal angular dependence discovered by the present inventor. For example, the same correction function can be used for any subject.
  • the present inventor also found that the correction of image data can be based on image datasets describing the body from different viewpoints. In these embodiments, it is not necessary to employ the aforementioned universal function for correcting the data. Yet, the form of the universal function (e.g., the type of non-linearity) can be used, in combination with the information extracted from the different viewpoints, to correct the data.
  • the method proceeds to 15 at which the image dataset (e.g., 3D thermospatial representation) is regenerated using said corrected optical (e.g., thermal) data, and/or 16 at which a temperature map of the portion of the body is generated using the corrected thermal data.
  • the regenerated dataset and/or temperature map can optionally be displayed on a display device.
  • the method can continue to 17 at which the corrected data are compared 17 to data of a reference image dataset (e.g., a reference thermospatial representation), which can be obtained from a library or can be constructed by the method of the present embodiments.
  • the reference image dataset can describe a reference portion of the body other than the portion of the body being analyzed.
  • the reference portion of the body can be a portion of the body which is similar in shape to the portion of the body being analyzed.
  • the portion of the body is a breast
  • the reference portion of the body can be the other breast of the same subject.
  • the aforementioned transformation of coordinates is preferably employed so as to facilitate conceptual overlap of one portion of the body over the other.
  • the reference image dataset includes history data of the portion of the body.
  • the reference portion of the body can be the same portion of the body as captured at an earlier time.
  • the inclusion of history data in the image dataset can be achieved by recording the reference image dataset at a date earlier than the date at which the method is executed. This embodiment can also be useful for monitoring changes in the portion of the body over time.
  • the reference image dataset can be one of the image datasets of the series. This is particularly useful when the optical data includes thermal data.
  • the ambient temperature at the surface of the portion of the body can be changed between two successive captures of surface information, and the corresponding thermospatial representations are obtained. In these embodiments, the corrected thermal data of two such successive thermospatial representations are compared.
  • a change in the ambient temperature corresponds to different boundary conditions for different thermospatial representations.
  • two successive thermospatial representations describe the portion of the body while the subject is exposed to two different ambient temperatures.
  • a change in the ambient temperature can be imposed, for example, by establishing contact between a cold object and the portion of the body or directing a flow of cold gas (e.g., air) to the surface of the portion of the body between successive data acquisitions.
  • cold gas e.g., air
  • a procedure in which the portion of the body is immersed in cold liquid (e.g. , water) between successive data acquisitions.
  • a procedure in which another portion of the body is exposed to a different (e.g. , lower) temperature so as to ensure transient thermal condition.
  • the subject can immerse his or her limb in a cold liquid (e.g. , water).
  • the method can optionally and preferably continue to 18 at which the presence or absence of a thermally distinguished region in the portion of the living body is determined based on the corrected thermal data. This can be achieved using any technique known in the art, except that the uncorrected thermal data used in known techniques is replaced with data corrected according to some embodiments of the present invention.
  • the method can also determine whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the set of criteria can include at least one of the temperatures of the region, the temperature difference between the region and its immediate surrounding, the temperature difference between the region and the average temperature of the body portion or some region-of-interest thereof, the size of the region and the like.
  • FIG. 8 is a flowchart diagram of a method suitable for correcting image data, in embodiments of the invention in which the correction is based, at least in part, on different viewpoints.
  • the method begins at 80 and continues to 81 at which two or more partially overlapping image datasets are obtained.
  • the partially overlapping image datasets correspond to different viewpoints of the living body.
  • the image datasets are "partially overlapping" in the sense that each image dataset provides a field-of-view wherein the field-of-views of the image datasets partially overlap.
  • the method can obtain the image datasets by receiving them from an external source, or by generating the datasets, for example, by imaging.
  • the image datasets are 3D thermospatial or spectrospatial representations, as further detailed hereinabove.
  • the method can obtain the 3D representations by receiving them from an external source such as a 3D thermospatial or spectrospatial representation generator, or by generating the 3D thermospatial or spectrospatial representations, for example, by combining 3D and thermal imaging or by 3D imaging.
  • the method optionally continues to 82 at which the spatial and/or optical (e.g. , thermal, spectral, color) data of at least one, more preferably all the datasets, is preprocessed.
  • the preprocessing operation includes definition of one or more spatial boundaries for the surface, so as to define a region-of-interest for the analysis, as further detailed hereinabove; in some embodiments of the present invention the preprocessing comprises transformation of coordinates, as further detailed hereinabove; in some embodiments of the present invention the preprocessing comprises normalization of the optical (e.g., thermal) data, as further detailed hereinabove;
  • in some embodiments of the present invention the preprocessing operation includes slicing of the surface described by the spatial data to a plurality of slices, as further detailed hereinabove; and in some embodiments of the present invention the preprocessing comprises normalization of the spatial data, as further detailed hereinabove.
  • the method optionally and preferably continues to 83 at which the viewing angle θ is calculated based on the spatial data for each of a plurality of picture-elements over the image dataset, as further detailed hereinabove.
  • the method optionally and preferably continues to 84 at which picture-elements at overlap portions of the datasets are identified. This can be done by analyzing each of the datasets to identify one or more objects within the overlap portion of the respective fields-of-view, and marking the picture-elements that correspond to these objects, and optionally and preferably also picture-elements that are near the objects, as picture-elements of the overlap.
  • the identified objects can be part of the body or bodies in the imaged scene (e.g., an identifiable blood vessel that is close to the skin, a nipple, a mole, a nevus, a birthmark, a tattoo) or they can be reference markers (e.g., marker 101).
  • the method optionally and preferably continues to 85 at which optical (e.g., thermal, spectral, color) data associated with each of at least some of the identified picture-elements are compared among datasets.
  • Shown in FIG. 9 are two datasets 202 and 204, which can be thermospatial representations or other image datasets, as further detailed hereinabove.
  • Datasets 202 and 204 are illustrated in a planar manner; however, this need not necessarily be the case, since, for some applications, both datasets 202 and 204 are non-planar (see, e.g., FIGs. 1A-C) so as to allow them to describe non-planar surfaces.
  • Each representation is defined over a plurality of picture-elements 206.
  • the picture-elements at the overlap portions of datasets 202 and 204 are shown at 208 and 210, respectively. Shown in FIG. 9 are a picture-element p1 at overlap 208 and a picture-element p2 at overlap 210.
  • Elements p1 and p2 correspond to the same elementary area over the surface of the body.
  • the optical data associated with elements p1 and p2 are supposed to be the same. However, due to the angular dependence of the data (for example, non-Lambertian radiation), these optical data typically differ. This applies also to any other pair of picture-elements at 208 and 210 that correspond to the same area or elementary area over the surface of the body, except perhaps a few pairs of picture-elements at 208 and 210 for which the optical data are not different.
  • the method compares the optical (e.g., thermal, spectral, color) data associated with pairs of picture-elements that correspond to the same area over the surface of the body, and makes a record of the comparison.
  • the method can, for example, make a record of the difference or ratio between the grey-level values or hue values of the respective picture-elements.
  • the method compares the optical data for at least 50% or at least 60% or at least 70% or at least 80% or at least 90% or at least 95%, e.g. , all the picture-elements within the overlaps 208 and 210.
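A minimal sketch of recording the comparison for overlap pairs, here as the grey-level ratio tagged with both viewing angles (Python/NumPy; the record layout is an illustrative assumption):

```python
import numpy as np

def record_overlap_comparisons(gl_a, theta_a, gl_b, theta_b):
    """For each pair of picture-elements (one from each dataset) imaging the
    same elementary area, record (angle in dataset A, angle in dataset B,
    grey-level ratio). Such records can later drive the fit of g(theta)."""
    gl_a = np.asarray(gl_a, dtype=float)
    gl_b = np.asarray(gl_b, dtype=float)
    ok = gl_b != 0.0                      # avoid division by zero
    ratios = gl_a[ok] / gl_b[ok]
    return list(zip(np.asarray(theta_a, dtype=float)[ok],
                    np.asarray(theta_b, dtype=float)[ok], ratios))
```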
  • the method proceeds to 86 at which optical data associated with picture- elements outside the overlap portions are corrected based on the comparison.
  • This can be done, for example, by extrapolation.
  • the recorded comparison can be fitted to a function that provides a correction factor for each picture-element at the overlap, and the function can be applied also to picture-element outside the overlap.
  • the function is preferably non-linear.
  • prior knowledge regarding the expected dependence of the correction factor on the location of the picture-element can be used for the fit.
  • the recorded comparison can be fitted to a correction function which describes a dependence of the correction on the viewing angle.
  • the correction function is optionally and preferably symmetric for a change in sign of the viewing angle.
  • the value of the correction factor is known for some angles (e.g. , when there are picture-elements at a zero or relatively small, for example, less than 10°, viewing angle, in which case the correction factor for these elements can be set to 1)
  • the known correction factor can be used as an input for the fit.
  • the correction factor is applied such that the optical data, after the correction, is the same for these two picture- elements.
  • the present embodiments contemplate a procedure which searches for two picture-elements that are at viewing angles which are the same in absolute value but opposite in sign, compares the optical data associated with these two picture-elements and, if the optical data are significantly different (for example, differing by more than 10%), corrects the optical data of at least one of these two picture-elements to ensure that the two picture-elements have the same or similar optical data.
  • the correction applied for these two picture-elements is optionally and preferably used for constructing the correction function.
  • since the correction function is constructed based on the comparison at 85, the function is specific to the body of the subject under investigation, even when prior knowledge is used for its construction.
  • the subject-specific correction function g(θ) can be constructed as a lookup table that can be stored in a non-transitory computer readable medium, or it can be constructed as an analytical function, e.g., a polynomial function of θ.
  • a representative procedure for constructing an analytical subject-specific correction function is as follows.
  • the recorded comparisons are fitted to a function g(θ) which is parameterized according to a predetermined parameterization (e.g., a polynomial function of the angle θ, optionally and preferably symmetric with respect to a change in sign).
  • the parameters of the function are determined by the fitting procedure such that when the function is applied to the optical data of picture-elements within the overlaps the resulting corrected optical data of pairs of picture-elements at the overlap that correspond to the same area element over the surface of the body are matched (e.g. , having the same grey level) within a predetermined tolerance (e.g. , a tolerance of 30% or 20% or 10% or 5% or less).
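As a non-limiting sketch, parameterizing g(θ) = 1 + s1·θ² + s2·θ⁴ (even powers only, hence symmetric in the sign of θ) makes the matching condition g(θ1)·GL1 ≈ g(θ2)·GL2 linear in the parameters, so the fit reduces to a least-squares solve (Python/NumPy; the specific parameterization is an assumption for illustration):

```python
import numpy as np

def fit_correction(gl1, t1, gl2, t2):
    """Fit g(theta) = 1 + s1*theta^2 + s2*theta^4 so that, for every overlap
    pair, g(t1)*gl1 ~= g(t2)*gl2. Rearranging the matching condition gives a
    system that is linear in (s1, s2), solved here by least squares."""
    gl1, t1 = np.asarray(gl1, dtype=float), np.asarray(t1, dtype=float)
    gl2, t2 = np.asarray(gl2, dtype=float), np.asarray(t2, dtype=float)
    A = np.column_stack([gl1 * t1**2 - gl2 * t2**2,
                         gl1 * t1**4 - gl2 * t2**4])
    b = gl2 - gl1
    s1, s2 = np.linalg.lstsq(A, b, rcond=None)[0]
    return lambda theta: 1.0 + s1 * theta**2 + s2 * theta**4
```

Residuals of the fitted function against the tolerance criterion (e.g., 30%, 20%, 10% or 5%) can then serve as the stopping test described below.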
  • the set ⁇ s ⁇ of parameters can include any number of parameters.
  • the fitting procedure varies the set of parameters ⁇ s ⁇ to reduce the value of the objective function h.
  • the stopping criterion for the fitting process is to achieve matching of optical (thermal, spectral, color) data, within the predetermined tolerance, for at least 50% or at least 60% or at least 70% or at least 80% or at least 90% of the pairs of corresponding picture-elements at the overlap.
  • the function g(θ) can be applied also to picture-elements outside the overlap.
  • the correction function g(θ) can be applied either before or after the preprocessing.
  • the method proceeds to 87 at which the image dataset is regenerated using the corrected optical data.
  • the method can proceed to 88 at which a temperature map of the portion of the body is generated using the corrected thermal data.
  • the regenerated dataset and/or temperature map can optionally be displayed on a display device.
  • the method can continue to 89 at which the corrected data are compared to data of a reference dataset, as further detailed hereinabove.
  • the method can optionally and preferably continue to 90 at which the presence or absence of a thermally distinguished region in the portion of the living body is determined based on the corrected thermal data, as further detailed hereinabove.
  • the method ends at 91.
  • System 20 preferably comprises a digital input 22 for receiving one or more image datasets (e.g., 3D thermospatial or spectrospatial representations) as further detailed hereinabove.
  • System 20 can further comprise a data processor 24.
  • data processor 24 calculates, based on the spatial data of the input dataset (e.g., the input 3D thermospatial or spectrospatial representation), a viewing angle θ, and applies a correction function g(θ) for correcting the optical (e.g., thermal) data as further detailed hereinabove.
  • data processor 24 identifies picture-elements at the overlap portions of the representations, compares optical data associated with the identified picture-elements among the datasets, and corrects, based on the comparison, optical data associated with picture-elements outside the overlap portions.
  • Data processor 24 can also construct a subject-specific correction function based on the comparison, as further detailed hereinabove.
  • System 20 typically comprises a non-transitory computer readable medium 26, which can store computer readable data types and/or computer readable instructions for carrying out the operations of data processor 24.
  • when the correction function g(θ) is in the form of a lookup table, the lookup table can be stored in medium 26, and when the correction function g(θ) is in the form of an analytical function, computer instructions for calculating g(θ) for a given angle θ can be stored in medium 26.
  • data processor 24 optionally and preferably determines the presence or absence of a thermally distinguished region in the portion of the living body based on the corrected thermal data, as further detailed hereinabove.
  • data processor 24 determines whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • system 20 comprises an image generator 28 that re-generates the dataset (e.g., the 3D thermospatial or spectrospatial representation) using the corrected optical (e.g., thermal, spectral, color) data. Additionally or alternatively, when the optical data includes thermal data, image generator 28 can generate a temperature map of the portion of the body using the corrected thermal data.
  • the re-generated dataset and/or temperature map can be stored in memory medium 26.
  • System 20 can further comprise a digital output 30 that outputs the regenerated dataset and/or temperature map in any known data format.
  • the re-generated dataset and/or temperature map can be transmitted to an external system such as a cloud storage facility or a remote computer.
  • the re-generated dataset and/or temperature map can also be transmitted to a display 32 which displays the dataset and/or temperature map, for example, as color or gray scale images or as contour plots.
  • compositions, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
  • a compound or “at least one compound” may include a plurality of compounds, including mixtures thereof.
  • range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • Visible light and thermal images of one female subject are shown in FIGs. 7A-H.
  • FIGs. 7A and 7B are visible light images from the two different viewing angles
  • FIG. 7C shows the registration of the two images
  • FIG. 7D shows picture-elements for which the registration difference is less than 2 mm
  • FIGs. 7E and 7F show grey level differences between the thermal images before (FIG. 7E) and after (FIG. 7F) the correction of thermal data
  • FIG. 7G shows the difference between the absolute values of the images in FIGs. 7E and 7F
  • FIG. 7H marks regions at which the thermal correction provides improvement.

Abstract

A method of correcting image data, the method comprises: obtaining at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; identifying picture-elements at overlap portions of the datasets; comparing optical data associated with each of at least some of the identified picture-elements among the datasets, and based on the comparison, correcting optical data associated with picture-elements outside the overlap portions.

Description

IMAGE DATA CORRECTION BASED ON DIFFERENT
VIEWPOINTS
RELATED APPLICATIONS
This application claims the benefit of priority of U.S. Provisional Patent Application Nos. 62/218,026 and 62/218,020, both filed on September 14, 2015, the contents of which are incorporated herein by reference in their entirety.
FIELD AND BACKGROUND OF THE INVENTION
The present invention, in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a subject-specific function.
The use of imaging in medicine is known. Presently there are numerous different imaging modalities at the disposal of a physician, allowing imaging of hard and soft tissues and characterization of both normal and pathological tissues. For example, infrared imaging is utilized for characterizing a thermally distinguishable site in a human body for the purposes of identifying inflammation. Infrared cameras produce two-dimensional images known as thermographic images. A thermographic image is typically obtained by receiving from the body of the subject radiation at any one of several infrared wavelength ranges and analyzing the radiation to provide a two-dimensional temperature map of the surface. The thermographic image can be in the form of either or both of a visual image and corresponding temperature data. The output from infrared cameras used for infrared thermography typically provides an image comprising a plurality of pixel data points, each pixel providing temperature information which is visually displayed using a color code or grayscale code. The temperature information can be further processed by computer software to generate, for example, a mean temperature for the image, or for a discrete area of the image, by averaging the temperature data associated with all the pixels or a sub-collection thereof.
Based on the thermographic image, a physician diagnoses the site, and determines, for example, whether or not the site includes an inflammation while relying heavily on experience and intuition. International Patent Publication No. 2006/003658, the contents of which are hereby incorporated by reference, discloses a system which includes non-thermographic image data acquisition functionality and thermographic image data acquisition functionality. The non-thermographic image data acquisition functionality acquires non-thermographic image data, and the thermographic image data acquisition functionality acquires thermographic image data.
U.S. Patent No. 7,292,719, the contents of which are hereby incorporated by reference, discloses a system for determining presence or absence of one or more thermally distinguishable objects in a living body. A combined image generator is configured to combine non-thermographic three-dimensional data of a three-dimensional tissue region in the living body with thermographic two-dimensional data of the tissue region so as to generate three-dimensional temperature data associated with the three-dimensional tissue region.
SUMMARY OF THE INVENTION
According to an aspect of some embodiments of the present invention there is provided a method of correcting image data. The method comprises: obtaining at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; identifying picture-elements at overlap portions of the datasets; comparing optical data associated with each of at least some of the identified picture-elements among the datasets, and based on the comparison, correcting optical data associated with picture-elements outside the overlap portions.
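By way of a non-limiting illustration, the obtaining, identifying, comparing and correcting operations described above can be sketched as follows. The data layout (each dataset as a mapping from picture-element coordinates to an optical value, with the overlap defined by shared coordinates, and the comparison reduced to a mean difference split between the two viewpoints) is an assumption made for this sketch only and is not mandated by the method:

```python
def correct_datasets(ds_a, ds_b):
    """Compare optical data at the overlap of two datasets and use the
    comparison to correct picture-elements outside the overlap.

    ds_a, ds_b: mappings from picture-element coordinates to an optical
    value (e.g., a temperature or grey-level)."""
    overlap = ds_a.keys() & ds_b.keys()
    if not overlap:
        return dict(ds_a), dict(ds_b)
    # Compare optical data over the overlap; a mean difference is the
    # simplest possible comparison statistic.
    mean_diff = sum(ds_a[p] - ds_b[p] for p in overlap) / len(overlap)
    corrected_a = dict(ds_a)
    corrected_b = dict(ds_b)
    # Correct picture-elements outside the overlap, splitting the observed
    # discrepancy between the two viewpoints.
    for p in ds_a.keys() - overlap:
        corrected_a[p] = ds_a[p] - mean_diff / 2.0
    for p in ds_b.keys() - overlap:
        corrected_b[p] = ds_b[p] + mean_diff / 2.0
    return corrected_a, corrected_b
```

In practice the comparison and the correction may be considerably more elaborate, e.g., angle-dependent, as described below.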
According to some embodiments of the invention the method comprises: for each dataset, calculating a viewing angle for each picture-element over the dataset; and constructing, based on the comparison, a correction function which is specific to the scene and which describes a dependence of the correction of the optical data on the viewing angle.
According to some embodiments of the invention each of at least two of the datasets is a 3D representation of the scene, the representation having 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data.
According to some embodiments of the invention the optical data comprises visible light data, wherein the comparing and the correcting are applied to visible light data.
According to some embodiments of the invention the optical data comprises ultraviolet light data, wherein the comparing and the correcting are applied to ultraviolet light data.
According to some embodiments of the invention the optical data comprises X ray data, wherein the comparing and the correcting are applied to X ray data.
According to some embodiments of the invention the optical data comprises thermal data, wherein the comparing and the correcting are applied to thermal data.
According to some embodiments of the invention the method comprises determining presence or absence of a thermally distinguished region in the portion of the body based on the corrected thermal data.
According to some embodiments of the invention the method comprises generating a temperature map of the portion of the body using the corrected thermal data.
According to some embodiments of the invention the method comprises displaying the temperature map on a display and/or transmitting the temperature map to a non-transitory computer readable medium.
According to some embodiments of the invention the body is a living body.
According to some embodiments of the invention the body is a living body and the method comprises determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
According to some embodiments of the invention the portion of the body is a breast of a human subject.
According to some embodiments of the invention the method comprises regenerating at least one of the datasets using the corrected optical data.
According to some embodiments of the invention the method comprises displaying the re-generated dataset on a display and/or transmitting the re-generated dataset to a non-transitory computer readable medium.
According to some embodiments of the invention the method comprises searching for two picture-elements within the overlap that are at viewing angles which are the same in absolute value but opposite in sign; comparing optical data associated with the two picture-elements; and correcting optical data of at least one of the two picture-elements to ensure that the two picture-elements have the same or similar optical data.
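A non-limiting sketch of this embodiment follows, in which picture-elements whose viewing angles are equal in absolute value but opposite in sign are each assigned the mean of their optical data. The pairing tolerance and the averaging rule are illustrative assumptions:

```python
import math

def equalize_opposite_angles(elements, tol=1e-6):
    """elements: mapping from picture-element id to (viewing_angle, value).

    For each pair of picture-elements whose viewing angles are the same in
    absolute value but opposite in sign, replace both optical values by
    their mean, so that the pair carries the same optical data."""
    corrected = dict(elements)
    ids = list(elements)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ang_a, val_a = elements[a]
            ang_b, val_b = elements[b]
            # angles equal in absolute value, opposite in sign
            if ang_a != 0 and math.isclose(ang_a, -ang_b, abs_tol=tol):
                mean = (val_a + val_b) / 2.0
                corrected[a] = (ang_a, mean)
                corrected[b] = (ang_b, mean)
    return corrected
```

Other update rules (e.g., correcting only one element of the pair, as the text permits) fit the same skeleton.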
According to an aspect of some embodiments of the present invention there is provided a computer software product. The computer software product comprises a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a data processor, cause the data processor to receive at least two image datasets and execute the method as delineated above and optionally and preferably as further exemplified below.
According to an aspect of some embodiments of the present invention there is provided an image correction system. The system comprises: a digital input for receiving at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; and a data processor configured for identifying picture-elements at overlap portions of the datasets, for comparing optical data associated with each of at least some of the identified picture-elements among the datasets, and for correcting, based on the comparison, optical data associated with picture-elements outside the overlap portions.
According to some embodiments of the invention the data processor is configured for calculating, for each dataset, a viewing angle for each picture-element over the dataset, and constructing, based on the comparison, a correction function which is specific to the scene and which describes a dependence of the correction of the optical data on the viewing angle.
According to some embodiments of the invention the correction function is nonlinear with respect to the angle.
According to some embodiments of the invention the correction function comprises a quadratic function.
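As a non-limiting numerical illustration of a quadratic correction function, the coefficient of a correction of the form c(θ) = a·θ² can be estimated by least squares from overlap measurements. The synthetic data below are illustrative only:

```python
# Hypothetical overlap data: viewing angle (radians) of each picture-element
# and the deviation of its optical value from the value seen at normal
# incidence.  The synthetic deviations follow an exact quadratic law.
angles = [-0.6, -0.4, -0.2, 0.0, 0.2, 0.4, 0.6]
deviations = [0.8 * t * t for t in angles]

# Least-squares estimate of the coefficient a in c(theta) = a * theta**2:
a = sum(d * t * t for t, d in zip(angles, deviations)) / sum(t ** 4 for t in angles)

def correction(theta):
    """Quadratic, viewing-angle-dependent correction term."""
    return a * theta * theta

# Correcting an optical value observed at a 0.5 rad viewing angle:
corrected_value = 36.2 - correction(0.5)
```

A full quadratic (with linear and constant terms) or any other nonlinear form can be fitted in the same way; the single-coefficient fit is chosen here only to keep the example short.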
According to some embodiments of the invention each of at least two of the datasets is a 3D representation of the scene, the representation having 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data.
According to some embodiments of the invention the optical data comprises thermal data, and the data processor is configured to apply the comparison and the correction to thermal data.
According to some embodiments of the invention the data processor is configured for determining presence or absence of a thermally distinguished region in the portion of the body based on the corrected thermal data.
According to some embodiments of the invention the system comprises an image generator configured for generating a temperature map of the portion of the body using the corrected thermal data.
According to some embodiments of the invention the system comprises at least one of a display for displaying the temperature map, and a non-transitory computer readable medium for storing the temperature map.
According to some embodiments of the invention the data processor is configured for determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
According to some embodiments of the invention the system comprises an image generator for re-generating at least one of the datasets using the corrected optical data.
According to some embodiments of the invention the system comprises at least one of a display for displaying the re-generated dataset, and a non-transitory computer readable medium for storing the re-generated dataset.
According to an aspect of some embodiments of the present invention there is provided an imaging system. The system comprises: an image generator for generating at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping field-of-views; and the image correction system as delineated above and optionally and preferably as further detailed and exemplified below.
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings and images. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
FIGs. 1A-C are schematic illustrations of a synthesized thermospatial image, according to some embodiments of the present invention;
FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention;
FIG. 3 is a graph of measured temperatures of several points on a human's skin as a function of angle between the normal to skin and the optical axis of a thermal camera;
FIG. 4 is a schematic illustration of an image correction system, according to some embodiments of the present invention;
FIG. 5 shows experimental results in which thermal data of images were corrected in accordance with some embodiments of the present invention;
FIG. 6 is a thermographic image obtained during experiments performed according to some embodiments of the present invention;
FIGs. 7A and 7B are visible light images from the two different viewing angles, obtained during experiments performed according to some embodiments of the present invention;
FIG. 7C shows registration of the images of FIGs. 7A and 7B, obtained during experiments performed according to some embodiments of the present invention;
FIG. 7D shows picture-elements for which a registration difference calculated during experiments performed according to some embodiments of the present invention was less than 2 mm;
FIGs. 7E and 7F show grey level differences between thermal images before (FIG. 7E) and after (FIG. 7F) thermal data correction performed according to some embodiments of the present invention;
FIG. 7G shows difference between the absolute values of the images in FIGs. 7E and 7F;
FIG. 7H shows regions at which the thermal correction of the present embodiments provides improvement;
FIG. 8 is a flowchart diagram of a method suitable for correcting image data, in embodiments of the invention in which the correction is based, at least in part, on different viewpoints; and
FIG. 9 is a schematic illustration showing two image datasets, according to some embodiments of the present invention.
DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION
The present invention, in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a subject-specific function.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
The present inventors have devised an approach which enables the correction of image data. The image data typically include two or more image datasets, each dataset being arranged gridwise over a plurality of picture-elements, wherein two or more of the datasets are acquired from different viewpoints relative to a scene. The picture-elements form spatial data, wherein each picture-element is associated with optical data that corresponds to an optical field received from a respective portion of the imaged scene. The optical field can be at any wavelength or range of wavelengths. Representative examples of types of optical field suitable for the present embodiments include, without limitation, infrared light (e.g., far infrared light, mid infrared light, near infrared light), visible light, ultraviolet light (e.g., near ultraviolet light, extreme ultraviolet light), and X ray field (e.g., soft X ray field, hard X ray field).
In any of the embodiments of the present invention one or more of the datasets is optionally and preferably a 3D representation of the scene. Preferably, the representation has 3D spatial data representing a non-planar surface of a portion of a body in the scene and optical data associated with the 3D spatial data. The body can be any object that reflects, transmits or emits an optical field of the respective type. For example, the body can be a living body of a human subject or animal. Alternatively, the body can be an inanimate object in the scene.
The image data is optionally and preferably corrected based on a comparison of optical data associated with picture-elements that describe the same portion of an imaged scene but from different viewpoints. The compared optical data can describe any property of a body or bodies within the imaged scene that can be measured optically. Representative examples include, without limitation, reflectivity to the optical field under analysis, transparency to the optical field under analysis, emissivity of the optical field under analysis, radiance of the optical field under analysis, color of the body or bodies within the imaged scene, and temperature of the body or bodies within the imaged scene.
The optical data that is compared according to some embodiments of the present invention is preferably angle-dependent. Specifically, the optical data depends on the viewing angle, more preferably the absolute value of the viewing angle, of a picture-element under analysis. The viewing angle is conveniently defined between the normal to the surface of a respective body in the imaged scene, at the picture-element under analysis, and the optical axis of the imaging system that provides the optical data associated with that picture-element.
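For clarity, under the assumption that the surface normal at the picture-element and the optical axis are available as 3D vectors, the viewing angle so defined can be computed as:

```python
import math

def viewing_angle(normal, optical_axis):
    """Angle (radians) between the surface normal at a picture-element and
    the optical axis of the imaging system."""
    dot = sum(n * a for n, a in zip(normal, optical_axis))
    norm_n = math.sqrt(sum(n * n for n in normal))
    norm_a = math.sqrt(sum(a * a for a in optical_axis))
    # clamp to [-1, 1] to guard against floating-point drift
    return math.acos(max(-1.0, min(1.0, dot / (norm_n * norm_a))))
```

A picture-element viewed head-on thus has a viewing angle of zero, and the angle grows as the surface tilts away from the camera.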
It was found by the present inventors that corrected image data according to some embodiments of the present invention can be used to determine presence or absence of one or more distinguishable regions in a portion of a body, more preferably a living body. Preferably, but not necessarily, the technique of the present embodiments is applied for correcting thermal data, which can include temperature data or some proxy thereof, such as, but not limited to, grayscale data and/or wavelength data.
While the embodiments below are described with a particular emphasis to correction of thermal data or some proxy thereof, it is to be understood that any type of optical data as further detailed hereinabove can be corrected in some embodiments of the present invention.
In some embodiments of the present invention, the corrected thermal data is used for determining the likelihood for the presence of a thermally distinguishable region in the portion of the body. When the thermal image is of a portion of a living body such as a breast of a male or female subject, the analysis of the present embodiments can be used to extract properties of the underlying tissue. For example, determination of the likelihood that a thermally distinguished region is present in the portion of the body can be used for assessing whether or not the portion of the body has a pathology such as a tumor or an inflammation. An elevated temperature or a temperature pattern is generally associated with a tumor due to the metabolic abnormality of the tumor and proliferation of blood vessels (angiogenesis) at and/or near the tumor. In a cancerous tumor the cells proliferate faster and thus are more active and generate more heat. This tends to enhance the temperature differential between the tumor itself and the surrounding tissue. The present embodiments can therefore be used for diagnosis of cancer, particularly, but not exclusively, breast cancer.
When the technique of the present embodiments is applied for correcting visible light data, such as, but not limited to, data collected by a visible light sensor, e.g., a visible light imager, the corrected optical data can be used for recoloring an image constituted by the data.
The technique of the present embodiments is optionally and preferably applied to surface information that describes the surface of the body. The surface information optionally and preferably comprises optical information and spatial information. The optical information can pertain to any of the aforementioned properties.
For example, when the optical data include thermal data, the optical information comprises data pertaining to heat evacuated from or absorbed by the surface. Since different parts of the surface generally evacuate or absorb different amounts of heat, the thermal information comprises a set of tuples, each comprising the coordinates of a region or a point on the surface and a thermal numerical value (e.g., temperature, thermal energy) associated with the point or region. The thermal information can be transformed to visible signals, in which case the thermal information is in the form of a thermographic image.
When the optical data include visible light data, the optical information can comprise data pertaining to intensity and/or wavelength range and/or color of light emitted, transmitted or reflected by the surface. When the optical data include X ray data, the optical information can comprise data pertaining to intensities or grey levels of X rays transmitted through the surface.
The optical data (e.g., thermal data, visible light data, X ray data) is typically arranged gridwise in a plurality of picture-elements (e.g., pixels, arrangements of pixels, subpixels) representing the image (e.g., a thermographic image, when the optical data includes thermal data). Each picture-element is represented by an intensity value or a grey-level and/or a spectrum over the grid. When the picture-element is represented by a spectrum, it typically includes a plurality of intensity values each corresponding to a range of wavelengths (e.g., a color, when the optical data is visible light data). Such a spectrum can be obtained by means of picture-elements that are divided into sub-picture-elements (e.g., sub-pixels) each represented by an intensity value corresponding to a different range of wavelengths. For example, a picture-element can comprise three sub-picture-elements, one corresponding to red light, one corresponding to green light, and one corresponding to blue light, according to the so-called RGB format.
It is appreciated that the number of different intensity values can be different from the number of grey-levels. For example, an 8-bit display can generate 256 different grey-levels. However, in principle, the number of different intensity values can be much larger. As a representative example, suppose that the optical information includes thermal information that spans over a range of 37 °C and is digitized with a resolution of 0.1 °C. In this case, there are 370 different intensity values and the use of grey-levels is less accurate by a factor of approximately 1.4. Use of higher formats (e.g., 10 bit, 12 bit, 14 bit or higher) is also contemplated. For example, a photon thermal camera can provide information pertaining to the number of photons detected by the camera detector. Such information can extend over a range of about 6000-8000 intensity values.
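The arithmetic behind this representative example is straightforward and can be checked directly:

```python
span_c = 37.0        # temperature span, in degrees Celsius
resolution_c = 0.1   # digitization resolution, in degrees Celsius

intensity_levels = round(span_c / resolution_c)   # 370 distinct intensity values
grey_levels = 2 ** 8                              # 256 grey-levels on an 8-bit display

# Quantizing 370 intensity values onto 256 grey-levels loses resolution
# by this factor (370 / 256 = 1.4453..., i.e., approximately 1.4):
loss_factor = intensity_levels / grey_levels
```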
Also contemplated are embodiments in which the optical data is provided in 4 bytes (in distinction, for example, from the 3 bytes that are used in conventional RGB images), creating a representation similar to floating point, where the first three bytes represent three wavelength (e.g., color) channels and the fourth byte represents a common exponent for the three wavelength channels. Such a format is particularly useful when the optical data describes High Dynamic Range (HDR) images, having a dynamic range of at least 4 or at least 5 or at least 6 or more orders of magnitude.
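One possible realization of such a 4-byte shared-exponent representation, loosely modeled on the classic RGBE encoding used for HDR images, is sketched below. The specific byte layout, the exponent bias of 128, and the 255-step mantissa are illustrative assumptions, not part of the description above:

```python
import math

def encode_shared_exponent(c0, c1, c2):
    """Pack three non-negative channel values into three mantissa bytes plus
    one shared-exponent byte (biased by 128)."""
    m = max(c0, c1, c2)
    if m <= 0.0:
        return (0, 0, 0, 0)
    exp = math.frexp(m)[1]            # m = frac * 2**exp with 0.5 <= frac < 1
    scale = 255.0 / (2.0 ** exp)
    return (int(c0 * scale), int(c1 * scale), int(c2 * scale), exp + 128)

def decode_shared_exponent(b0, b1, b2, eb):
    """Inverse of the encoder above."""
    if eb == 0:
        return (0.0, 0.0, 0.0)
    scale = (2.0 ** (eb - 128)) / 255.0
    return (b0 * scale, b1 * scale, b2 * scale)
```

Because all three channels share one exponent, the format covers many orders of magnitude with only four bytes per picture-element, at the cost of relative precision in the dimmer channels.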
In some embodiments of the present invention the correction technique is applied to intensity values, in some embodiments of the present invention the correction technique is applied to grey-levels, and in some embodiments of the present invention the correction technique is applied to hue values calculated based on a combination of sub-pixels. Also contemplated are multi-processing procedures in which these types of values or levels are combined. The term "pixel" is sometimes used herein to indicate a picture-element. However, this is not intended to limit the meaning of the term "picture-element" which refers to a unit of the composition of an image.
The terms "thermographic image", "thermal image", "thermal information" and "thermal data" are used interchangeably throughout the specification without limiting the scope of the present embodiments in any way. Specifically, unless otherwise defined, the use of the term "thermographic image" is not to be considered as limited to the transformation of the thermal information into visible signals. For example, a thermographic image can be stored in the memory of a computer readable medium, preferably a non-transitory computer readable medium, as a set of tuples as described above.
In embodiments in which the surface information also comprises spatial information, the spatial information comprises data pertaining to geometric properties of a surface which at least partially encloses a three-dimensional volume. In some embodiments of the present invention the surface is non-planar, e.g., curved. Generally, the surface is a two-dimensional object embedded in a three-dimensional space. Formally, a surface is a metric space induced by a smooth connected and compact Riemannian 2-manifold. Ideally, the geometric properties of the surface would be provided explicitly, for example, the slope and curvature (or even other spatial derivatives or combinations thereof) for every point of the surface. Yet, such information is rarely attainable, and the spatial information is provided for a sampled version of the surface, which is a set of points on the Riemannian 2-manifold and which is sufficient for describing the topology of the 2-manifold. Typically, the spatial information of the surface is a reduced version of a 3D spatial representation, which may be either a point-cloud or a 3D reconstruction (e.g., a polygonal mesh or a curvilinear mesh) based on the point cloud. The 3D spatial representation is expressed via a 3D coordinate-system, such as, but not limited to, a Cartesian, Spherical, Ellipsoidal, 3D Parabolic or Paraboloidal 3D coordinate system.
The spatial data, in some embodiments of the present invention, can be in the form of an image. Since the spatial data represent the surface, such an image is typically a two-dimensional image which, in addition to indicating the lateral extent of body members, further indicates the relative or absolute distance of the body members, or portions thereof, from some reference point, such as the location of the imaging device. Thus, the image typically includes information residing on a surface of a three-dimensional body and not necessarily in the bulk. Yet, it is commonly acceptable to refer to such an image as "a three-dimensional image" because the surface is conveniently defined over a three-dimensional system of coordinates. Thus, throughout this specification and in the claims section that follows, the terms "three-dimensional image" and "three-dimensional representation" primarily relate to surface entities.
The lateral dimensions of the spatial data are referred to as the x and y dimensions, and the range data (the depth or distance of the body members from a reference point) is referred to as the z dimension.
When the surface information of a body comprises wavelength (e.g., color) information and spatial information, the surface information (wavelength and spatial) of the body is typically in the form of a synthesized representation which includes both wavelength data representing the image (e.g., a visible light image) and spatial data representing the surface, where the wavelength data is associated with the spatial data (i.e., a tuple of the spatial data is associated with a color or set of colors of the wavelength data). Such a representation is referred to as a spectrospatial representation. The spectrospatial representation can be in the form of digital data (e.g., a list of tuples associated with digital data describing spectral quantities) or in the form of a three-dimensional image.
When the surface information of a body comprises thermal information and spatial information, the surface information (thermal and spatial) of the body is typically in the form of a synthesized representation which includes both thermal data representing the thermal image and spatial data representing the surface, where the thermal data is associated with the spatial data (i.e., a tuple of the spatial data is associated with a heat-related value of the thermal data). Such a representation is referred to as a thermospatial representation. The thermospatial representation can be in the form of digital data (e.g., a list of tuples associated with digital data describing thermal quantities) or in the form of an image (e.g., a three-dimensional image color-coded or grey-level coded according to the thermal data). A thermospatial representation in the form of an image is referred to hereinafter as a thermospatial image. The thermospatial representation is optionally and preferably defined over a 3D spatial representation of the body and has thermal data associated with a surface of the 3D spatial representation, arranged gridwise over the surface in a plurality of picture-elements (e.g., pixels, arrangements of pixels) each represented by an intensity value or a grey-level over the grid.
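A toy illustration of a digital thermospatial representation as described above, with each tuple pairing a surface point with a heat-related value. The coordinates and temperatures below are fabricated for illustration only:

```python
# Illustrative surface points (x, y, z) of a 3D spatial representation and
# the temperature (degrees Celsius) associated with each point.
surface_points = [(0.0, 0.0, 1.2), (0.5, 0.0, 1.1), (0.0, 0.5, 1.3)]
temperatures_c = [36.4, 36.9, 36.2]

# The thermospatial representation as a list of (point, value) tuples:
thermospatial = list(zip(surface_points, temperatures_c))

# A simple derived quantity: mean temperature over the represented patch.
mean_t = sum(t for _, t in thermospatial) / len(thermospatial)
```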
When the thermospatial representation is in the form of digital data, the digital data describing thermal properties can also be expressed either in terms of intensities or in terms of grey-levels as described above. A digital thermospatial representation can also correspond to a thermospatial image, whereby each tuple corresponds to a picture-element of the image.
Preferably, one or more thermographic images are mapped or projected onto the surface of the 3D spatial representation to form the thermospatial representation. The thermographic image to be projected onto the surface of the 3D spatial representation preferably comprises thermal data which are expressed over the same coordinate-system as the 3D spatial representation. Any type of thermal data can be used. In one embodiment the thermal data comprises absolute temperature values; in another embodiment the thermal data comprises relative temperature values, each corresponding, e.g., to a temperature difference between a respective point of the surface and some reference point; in an additional embodiment, the thermal data comprises local temperature differences. Also contemplated are combinations of the above types of temperature data; for example, the thermal data can comprise both absolute and relative temperature values, and the like.
Typically, but not obligatorily, the information in the thermographic image also includes the thermal conditions (e.g., temperature) at one or more reference markers. The mapping of the thermographic image onto the 3D spatial representation is typically performed using the reference markers, e.g., by comparing their coordinates in the thermographic image with their coordinates in the 3D spatial representation, to thereby match, at least approximately, also the other points, and hence to form the synthesized thermospatial representation.
A representative example of a synthesized thermospatial image for the case in which the body comprises the breasts of a female or male subject is illustrated in FIGs. 1A-C, showing a 3D spatial representation illustrated as a non-planar surface (FIG. 1A), a thermographic image illustrated as planar isothermal contours (FIG. 1B), and a synthesized thermospatial image formed by mapping the thermographic image on a surface of the 3D spatial representation (FIG. 1C). As illustrated, the thermal data of the thermospatial image is represented as grey-level values over a grid generally shown at 102. It is to be understood that the representation according to grey-level values is for illustrative purposes and is not to be considered as limiting. As explained above, the processing of thermal data can also be performed using intensity values. Also shown in FIGs. 1A-C is a reference marker 101, which optionally, but not obligatorily, can be used for the mapping.
In some embodiments of the present invention a series of thermal images of a portion of a living body is obtained. Different thermal images of the series include thermal data acquired from the portion of the body at different time instants. Such a series of thermal images can be used by the present embodiments to determine thermal changes that occurred in the portion of the body over time.
In some embodiments of the present invention a series of thermospatial representations of a portion of a living body is obtained. Different thermospatial representations of the series include thermal data acquired from the portion of the body at different time instants. Such a series of thermospatial representations can be used by the present embodiments to determine thermal and optionally spatial changes that occurred in the portion of the body over time.
The series can include any number of thermal images or thermospatial representations. It was found by the inventors of the present invention that two thermal images or thermospatial representations are sufficient to perform the analysis, but more than two thermal images or thermospatial representations (e.g., 3, 4, 5 or more) can also be used, for example, to increase the accuracy of the results and/or to allow selection of the best acquisitions.
In some embodiments of the present invention two or more thermospatial representations of a portion of the body are obtained, where at least two different thermospatial representations correspond to different viewpoints of the body.
The 3D spatial representation, thermographic image and synthesized thermospatial image can be obtained by any technique known in the art, such as the techniques disclosed in International Patent Publication No. WO 2006/003658, U.S. Published Application No. 20010046316, and U.S. Patent Nos. 6,442,419, 6,765,607, 6,965,690, 6,701,081, 6,801,257, 6,201,541, 6,167,151, 6,094,198 and 7,292,719.
Some embodiments of the invention can be embodied on a tangible medium such as a computer for performing the method steps. Some embodiments of the invention can be embodied on a computer readable medium, preferably a non-transitory computer readable medium, comprising computer readable instructions for carrying out the method steps. Some embodiments of the invention can also be embodied in an electronic device having digital computer capabilities arranged to run the computer program on the tangible medium or execute the instructions on the computer readable medium. Computer programs implementing method steps of the present embodiments can commonly be distributed to users on a tangible distribution medium. From the distribution medium, the computer programs can be copied to a hard disk or a similar intermediate storage medium. The computer programs can be run by loading the computer instructions either from their distribution medium or their intermediate storage medium into the execution memory of the computer, configuring the computer to act in accordance with the method of this invention. All these operations are well-known to those skilled in the art of computer systems.
The methods described below can be applied to at least one 3D thermospatial representation of a portion of a body, preferably a living body. The portion of the body can include one or more organs, e.g., a breast or a pair of breasts, or a part of an organ, e.g., a part of a breast. The 3D thermospatial representation has 3D spatial data representing a non-planar surface of the portion of the living body and thermal data associated with the 3D spatial data.
It is to be understood that, unless otherwise defined, the operations of the methods described hereinbelow can be executed either contemporaneously or sequentially in many combinations or orders of execution. Specifically, the ordering of the flowchart diagrams is not to be considered as limiting. For example, two or more operations, appearing in the following description or in the flowchart diagrams in a particular order, can be executed in a different order (e.g., a reverse order) or substantially contemporaneously. Additionally, several operations described below are optional and may not be executed.

FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention.
The method begins at 10 and continues to 11 at which one or more image datasets are obtained. In some embodiments of the present invention the obtained image datasets include at least one 3D thermospatial representation. In some embodiments of the present invention the obtained image datasets include at least one 3D spectrospatial representation.
The method can obtain the image dataset by receiving it from an external source such as an image dataset generator (e.g., a 3D thermospatial or spectrospatial representation generator), or by generating the image dataset, for example, by imaging. When the image dataset includes a 3D thermospatial representation, the method can generate the dataset by combining 3D and thermal imaging. When the image dataset includes a 3D spectrospatial representation, the method can generate the dataset by visible light 3D imaging.
The method optionally continues to 12 at which the spatial and/or optical (e.g., thermal) data is preprocessed. In some embodiments of the present invention the preprocessing operation includes definition of one or more spatial boundaries for the surface, so as to define a region-of-interest for the analysis. For example, when the spatial data of the dataset comprises data representing a surface of tissue nearby the portion of the body, the preprocessing operation can include defining a spatial boundary between the surface of the portion of the body and the surface of the nearby tissue. In this embodiment, the surface of the nearby tissue is preferably excluded from the analysis.
In some embodiments of the present invention the preprocessing comprises transformation of coordinates. For example, when the method is executed for correcting image data pertaining to more than one body portion having similar shapes, the method preferably transforms the coordinates of one or more portions of the body so as to ensure that all body portions are described by the same coordinate system. A representative example is a situation in which the surface data describe a left breast and a right breast. In this situation, the system of coordinates describing one of the breasts can be flipped so as to describe both breasts using the same coordinate system. When the optical data includes thermal data, the preprocessing can comprise normalization of the thermal data. The normalization is useful, for example, when it is desired not to work with excessively high intensity values. In various exemplary embodiments of the invention the normalization is performed so as to transform the range of thermal values within the thermal data to a predetermined range between a predetermined minimal thermal value and a predetermined maximal thermal value. This can be done using a linear transformation as known in the art. A typical value for the predetermined minimal thermal value is 1, and a typical value for the predetermined maximal thermal value is 10. Other ranges or normalization schemes are not excluded from the scope of the present invention.
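The linear normalization onto a predetermined range can be sketched as follows; the defaults follow the representative [1, 10] range mentioned above, while the function name and the handling of flat data are our own choices:

```python
import numpy as np

def normalize_thermal(values, lo=1.0, hi=10.0):
    """Linear transformation mapping the range of thermal values
    onto [lo, hi]."""
    values = np.asarray(values, float)
    vmin, vmax = values.min(), values.max()
    if vmax == vmin:                      # flat data: place mid-range
        return np.full_like(values, (lo + hi) / 2.0)
    return lo + (values - vmin) * (hi - lo) / (vmax - vmin)
```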
In some embodiments of the present invention the preprocessing operation includes slicing of the surface described by the spatial data into a plurality of slices. In these embodiments, the correction procedure described below can be applied separately for each slice. The slicing can be along a normal direction (away from the body), a parallel direction or an azimuthal direction, as desired. The slicing can also be according to anatomical information (for example, a different slice for a nipple region). Also contemplated is arbitrary slicing, in which case the surface is sliced into N regions.
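Slicing the surface into N regions can be sketched, for the azimuthal case, as follows; the axis convention and the function name are our assumptions:

```python
import numpy as np

def slice_azimuthal(points, n_slices=8):
    """Assign each surface point (x, y, z) to one of `n_slices`
    azimuthal sectors around the z axis, so that the correction
    can later be applied separately per slice."""
    points = np.asarray(points, float)
    phi = np.arctan2(points[:, 1], points[:, 0])      # in (-pi, pi]
    return ((phi + np.pi) / (2 * np.pi) * n_slices).astype(int) % n_slices
```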
In some embodiments of the present invention the preprocessing comprises normalization of the spatial data. The normalization is useful when it is desired to compare optical data of different portions of the body, for example, body portions having similar shapes but different sizes. These embodiments are particularly useful when the portion of the body is a breast and it is desired to compare the optical (e.g., thermal) data of breasts of different sizes (e.g., a left breast to a right breast of the same subject, or a breast of one subject to a breast of another subject).
The method preferably continues to 13 at which a viewing angle is calculated based on the spatial data for one or more of a plurality of picture-elements over the thermospatial or spectrospatial representation. The viewing angle θ of a given picture-element p is conveniently defined between the normal to the surface of the body at picture-element p and the optical axis of the thermal imaging system that provides the thermal data associated with picture-element p. The viewing angle θ is calculable because the shape of the surface is known from the spatial data, and because the optical axis is known from the optical data (e.g., the thermal data).

The method optionally and preferably continues to 14 at which a correction function g(θ) is applied to each of at least some of the picture-elements for correcting optical (e.g., thermal, spectral, color) data associated with the respective picture-element. The concept of the correction function will now be explained in greater detail for the case of thermal data, but the skilled person, provided with the details described herein, would know how to construct a correction function also for other types of optical data. For example, when the optical data is visual light data, the variable T below can be replaced by a variable representing a color.
For a light ray having a range of wavelengths [λ1, λ2] and arriving at a pixel sensor s of a thermal imaging system from the surface of the body, the associated thermal data typically relates to the luminosity of the light multiplied by the thermal imaging system's response to the light and integrated over the wavelength:

P(T) = ∫ R(λ)·L(λ,T) dλ,

where the integration is from λ1 to λ2, R(λ) is the response of the thermal imaging system to light at wavelength λ, and L(λ,T) is the luminosity of light of wavelength λ arriving from a surface at a temperature T. The thermal data associated with the pixel sensor s (for example, the grey level GL) is typically a linear function of P(T):

GL(T) = a·P(T) + b,

where a and b are constants that do not depend on the temperature.
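The grey-level model above can be sketched numerically. The sketch assumes, purely for illustration, a black-body luminosity L(λ,T) (Planck's law), a flat sensor response R(λ) ≡ 1 over a representative 8–14 µm band, and illustrative constants a and b; none of these specific choices come from the source:

```python
import numpy as np

def planck(lam, T):
    """Black-body spectral radiance, used as an illustrative
    stand-in for the luminosity L(lam, T)."""
    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
    return (2.0 * h * c**2 / lam**5) / (np.exp(h * c / (lam * k * T)) - 1.0)

def P(T, lam1=8e-6, lam2=14e-6, n=2000):
    """P(T) = integral of R(lam)*L(lam, T) over [lam1, lam2],
    with R(lam) taken as 1, evaluated by the trapezoid rule."""
    lam = np.linspace(lam1, lam2, n)
    vals = planck(lam, T)
    return float(np.sum((vals[1:] + vals[:-1]) / 2.0 * np.diff(lam)))

def grey_level(T, a=1.0, b=0.0):
    """GL(T) = a*P(T) + b, with illustrative constants a and b."""
    return a * P(T) + b
```

As expected for the linear model, a warmer surface produces a larger integrated signal and hence a larger grey level.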
When the pixel sensor of the thermal imaging system receives light that propagates along the optical axis of the thermal imaging system, the corresponding grey level is typically as indicated above. For light rays that arrive from directions that deviate from the optical axis, the thermal imaging system typically employs a Lambertian correction that is proportional to the fourth power of the cosine of the deviation angle. It was found by the present inventors that for some curved objects, particularly living bodies, the Lambertian correction is insufficient, since different parts of the surface have different primary light emission directions.
In these situations, the grey levels provided by the thermal imaging system do not adequately describe the temperature map of the surface. In other words, for a picture-element at temperature T, the grey level provided by the thermal imaging system when the picture-element is at a viewing angle θ1 differs from the grey level that would have been provided by the thermal imaging system had the picture-element been at a viewing angle θ2.
According to some embodiments of the present invention the optical data provided by the imaging system are corrected such that the corrected optical data of all picture-elements (for which the correction is employed) are the optical data that would have been provided by the imaging system had all these picture-elements been at the same viewing angle relative to the imaging system. For example, the correction can be employed such that for all the picture-elements the corrected optical data are the optical data that would have been provided by the imaging system had all these picture-elements been at zero viewing angle.
The above procedure can be written mathematically as follows. Denote the optical (e.g., thermal) data provided by the imaging system (e.g., thermal imaging system) for viewing angle θ by GL(T,θ), and the optical data that would have been provided by the imaging system for zero viewing angle by GL(T,0), where T represents a property of the body in the imaged scene (e.g., temperature), and where both GL(T,θ) and GL(T,0) correspond to the same value of T. The relation between GL(T,θ) and GL(T,0) can be expressed as:

GL(T,θ) = f(θ)·GL(T,0), and

GL(T,0) = g(θ)·GL(T,θ),

where g(θ) = f⁻¹(θ) is the correction function applied to the optical data GL(T,θ) to provide the corrected optical data GL(T,0).
When the preprocessing operation 12 is executed, the correction function g(θ) can be applied either before or after the preprocessing.
The correction function g(θ) is preferably symmetric with respect to a change in sign of its argument θ. Specifically, in these embodiments g(θ) = g(-θ). The correction function g(θ) is preferably nonlinear with respect to the angle. A representative example of such nonlinear dependence for the case in which the optical data include thermal data is shown in FIG. 3, which is a graph of measured temperatures of several points on a human's skin as a function of the angle between the normal to the skin and the optical axis of a thermal camera. The data shown were collected from several human subjects. As shown, the angular dependence of the temperature is generally the same for all the tested subjects. Without being bound to any particular theory, it is assumed that a universal correction function g(θ) can be employed for correcting the thermal data irrespective of the subject under analysis.
The correction function g(θ) can be stored in a non-transitory computer readable medium as a lookup table, or it can be provided as an analytical function. Such an analytical function can be obtained by parameterizing the correction function g(θ) and calculating the parameters based on experimentally observed angular dependence. A representative example of a nonlinear correction function is a quadratic function. In experiments performed by the present inventors it was found that a quadratic function of the form g(θ) = K·θ² or an equivalent thereof can be employed, where θ is measured in radians and K is a constant ranging from -2 to -0.1 or from -1 to -0.1 or from -0.8 to -0.2 or from -0.5 to -0.4. In some embodiments of the present invention K is about -0.4686.
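The source leaves the exact combination rule for the quadratic term open. The sketch below assumes, as one plausible reading of FIG. 3 (temperature falling off with angle), that g(θ) = K·θ² acts as an additive temperature offset that is subtracted from the measured value, and it also illustrates the lookup-table storage option; the combination rule, units and grid are our assumptions:

```python
import numpy as np

K = -0.4686  # representative constant from the text, in rad^-2

def g(theta):
    """Quadratic correction term g(theta) = K*theta**2, symmetric
    with respect to a change in sign of theta."""
    return K * np.asarray(theta, float) ** 2

def correct_temperature(t_measured, theta):
    """ASSUMPTION: measured temperature falls off roughly as
    K*theta**2 (cf. FIG. 3), so subtracting g(theta) estimates the
    value that would have been measured at zero viewing angle."""
    return t_measured - g(theta)

# Lookup-table alternative: tabulate g on an angle grid, interpolate.
theta_grid = np.linspace(0.0, np.pi / 2, 91)
g_table = g(theta_grid)

def g_lut(theta):
    return np.interp(abs(theta), theta_grid, g_table)
```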
The correction function can be predetermined based on the universal angular dependence discovered by the present inventor. For example, the same correction function can be used for any subject. The present inventor also found that the correction of image data can be based on image datasets describing the body from different viewpoints. In these embodiments, it is not necessary to employ the aforementioned universal function for correcting the data. Yet, the form of the universal function (e.g., the type of non-linearity) can be used, in combination with the information extracted from the different viewpoints, to correct the data. These embodiments are described hereinunder with reference to the flowchart diagram of FIG. 8.
In some embodiments of the present invention the method proceeds to 15 at which the image dataset (e.g., 3D thermospatial representation) is regenerated using said corrected optical (e.g., thermal) data, and/or 16 at which a temperature map of the portion of the body is generated using the corrected thermal data. The regenerated dataset and/or temperature map can optionally be displayed on a display device.
The method can continue to 17 at which the corrected data are compared to data of a reference image dataset (e.g., a reference thermospatial representation), which can be obtained from a library or can be constructed by the method of the present embodiments. The reference image dataset can describe a reference portion of the body other than the portion of the body being analyzed. For example, the reference portion of the body can be a portion of the body which is similar in shape to the portion of the body being analyzed. When the portion of the body is a breast, the reference portion of the body can be the other breast of the same subject. In this embodiment, the aforementioned transformation of coordinates is preferably employed so as to facilitate conceptual overlap of one portion of the body over the other.
In some embodiments of the present invention the reference image dataset includes history data of the portion of the body. Thus, the reference portion of the body can be the same portion of the body as captured at an earlier time. The inclusion of history data in the image dataset can be achieved by recording the reference image dataset at a date earlier than the date at which the method is executed. This embodiment can also be useful for monitoring changes in the portion of the body over time.
When a series of image datasets (e.g., of thermospatial representations) is obtained, the reference image dataset can be one of the image datasets of the series. This is particularly useful when the optical data includes thermal data. The ambient temperature at the surface of the portion of the body can be changed between two successive captures of surface information, and the corresponding thermospatial representations are obtained. In these embodiments, the corrected thermal data of two such successive thermospatial representations are compared.

A change in the ambient temperature corresponds to different boundary conditions for different thermospatial representations. Specifically, in these embodiments, two successive thermospatial representations describe the portion of the body while the subject is exposed to two different ambient temperatures. A change in the ambient temperature can be imposed, for example, by establishing contact between a cold object and the portion of the body or directing a flow of cold gas (e.g., air) to the surface of the portion of the body between successive data acquisitions. Also contemplated is a procedure in which the portion of the body is immersed in a cold liquid (e.g., water) between successive data acquisitions. Also contemplated is a procedure in which another portion of the body is exposed to a different (e.g., lower) temperature so as to ensure a transient thermal condition. For example, the subject can immerse his or her limb in a cold liquid (e.g., water).
When the compared data include thermal data, the method can optionally and preferably continue to 18 at which the presence or absence of a thermally distinguished region in the portion of the living body is determined based on the corrected thermal data. This can be achieved using any technique known in the art, except that the uncorrected thermal data used in known techniques is replaced with data corrected according to some embodiments of the present invention. The method can also determine whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria. The set of criteria can include at least one of: the temperature of the region, the temperature difference between the region and its immediate surroundings, the temperature difference between the region and the average temperature of the body portion or some region-of-interest thereof, the size of the region, and the like.
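One of the listed criteria (temperature difference between a region and the average of the region-of-interest) can be sketched as a simple threshold over the corrected temperature map; the threshold value and function name are illustrative choices, not from the source:

```python
import numpy as np

def thermally_distinguished(temp_map, delta=2.0):
    """Flag picture-elements whose corrected temperature exceeds the
    average temperature of the region-of-interest by more than
    `delta` (illustrative threshold)."""
    temp_map = np.asarray(temp_map, float)
    return temp_map - temp_map.mean() > delta
```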
The method ends at 19.
FIG. 8 is a flowchart diagram of a method suitable for correcting image data, in embodiments of the invention in which the correction is based, at least in part, on different viewpoints.
The method begins at 80 and continues to 81 at which two or more partially overlapping image datasets are obtained. The partially overlapping image datasets correspond to different viewpoints of the living body. The image datasets are "partially overlapping" in the sense that each image dataset provides a field-of-view, wherein the fields-of-view of the image datasets partially overlap.
The method can obtain the image datasets by receiving them from an external source, or by generating the datasets, for example, by imaging. In some embodiments of the present invention the image datasets are 3D thermospatial or spectrospatial representations, as further detailed hereinabove. In these embodiments the method can obtain the 3D representations by receiving them from an external source such as a 3D thermospatial or spectrospatial representation generator, or by generating the 3D thermospatial or spectrospatial representations, for example, by combining 3D and thermal imaging or by 3D imaging.
The method optionally continues to 82 at which the spatial and/or optical (e.g., thermal, spectral, color) data of at least one, more preferably all, of the datasets is preprocessed. In some embodiments of the present invention the preprocessing operation includes definition of one or more spatial boundaries for the surface, so as to define a region-of-interest for the analysis, as further detailed hereinabove; in some embodiments of the present invention the preprocessing comprises transformation of coordinates, as further detailed hereinabove; in some embodiments of the present invention the preprocessing comprises normalization of the optical (e.g., thermal, spectral, color) data, as further detailed hereinabove; in some embodiments of the present invention the preprocessing operation includes slicing of the surface described by the spatial data into a plurality of slices, as further detailed hereinabove; and in some embodiments of the present invention the preprocessing comprises normalization of the spatial data, as further detailed hereinabove.
The method optionally and preferably continues to 83 at which the viewing angle Θ is calculated based on the spatial data for each of a plurality of picture-elements over the image dataset, as further detailed hereinabove.
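The viewing-angle calculation at 83 follows directly from the definition given earlier: the angle between the surface normal at the picture-element and the optical axis of the imaging system. A minimal sketch, with the vector representation as our assumption:

```python
import numpy as np

def viewing_angle(surface_normal, optical_axis):
    """Viewing angle (radians) between the surface normal at a
    picture-element and the optical axis of the imaging system."""
    n = np.asarray(surface_normal, float)
    a = np.asarray(optical_axis, float)
    cos_theta = np.dot(n, a) / (np.linalg.norm(n) * np.linalg.norm(a))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```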
The method optionally and preferably continues to 84 at which picture-elements at overlap portions of the datasets are identified. This can be done by analyzing each of the datasets to identify one or more objects within the overlap portion of the respective fields-of-view, and marking the picture-elements that correspond to these objects, and optionally and preferably also picture-elements that are nearby the objects, as picture-elements of the overlap. The identified objects can be part of the body or bodies in the imaged scene (e.g., an identifiable blood vessel that is close to the skin, a nipple, a mole, a nevus, a birthmark, a tattoo) or they can be a reference marker (e.g., marker 101).
Once the picture-elements at the overlap are identified, the method optionally and preferably continues to 85 at which optical (e.g., thermal, spectral, color) data associated with each of at least some of the identified picture-elements are compared among datasets. The procedure is schematically illustrated in FIG. 9. Shown in FIG. 9 are two datasets 202 and 204, which can be thermospatial representations or other image datasets, as further detailed hereinabove.
Datasets 202 and 204 are illustrated in a planar manner; however, this need not necessarily be the case since, for some applications, both datasets 202 and 204 are non-planar (see, e.g., FIGs. 1A-C) so as to allow them to describe non-planar surfaces. Each representation is defined over a plurality of picture-elements 206. The picture-elements at the overlap portions of datasets 202 and 204 are shown at 208 and 210, respectively. Shown in FIG. 9 are a picture-element p1 at overlap 208 and a picture-element p2 at overlap 210. Elements p1 and p2 correspond to the same elementary area over the surface of the body. Thus, ideally, the optical data associated with elements p1 and p2 are supposed to be the same. However, due to the angular dependence of the data (for example, non-Lambertian radiation), these optical data typically differ. This applies also to any other pair of picture-elements at 208 and 210 that correspond to the same area or elementary area over the surface of the body, except perhaps a few pairs of picture-elements at 208 and 210 for which the optical data are not different. At 85, the method compares the optical (e.g., thermal, spectral, color) data associated with pairs of picture-elements that correspond to the same area over the surface of the body, and makes a record of the comparison. The method can, for example, make a record of the difference or ratio between the grey-level values or hue values of the respective picture-elements. Preferably, the method compares the optical data for at least 50% or at least 60% or at least 70% or at least 80% or at least 90% or at least 95%, e.g., all the picture-elements within the overlaps 208 and 210.
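The comparison at 85 can be sketched as follows, assuming the pairs (p1, p2) have already been matched and their optical values collected into aligned arrays; the array layout and function name are ours:

```python
import numpy as np

def compare_overlap(gl1, gl2):
    """For pairs of picture-elements (one from each overlap) covering
    the same elementary surface area, record the difference and the
    ratio of their optical values (e.g., grey levels).  `gl1` and
    `gl2` are aligned 1-D arrays, one entry per matched pair."""
    gl1, gl2 = np.asarray(gl1, float), np.asarray(gl2, float)
    ratio = np.divide(gl1, gl2, out=np.full_like(gl1, np.nan),
                      where=gl2 != 0)     # NaN where gl2 is zero
    return {"difference": gl1 - gl2, "ratio": ratio}
```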
The method proceeds to 86 at which optical data associated with picture-elements outside the overlap portions are corrected based on the comparison. This can be done, for example, by extrapolation. Specifically, the recorded comparison can be fitted to a function that provides a correction factor for each picture-element at the overlap, and the function can be applied also to picture-elements outside the overlap. The function is preferably non-linear.
In some embodiments of the present invention, prior knowledge regarding the expected dependence of the correction factor on the location of the picture-element can be used for the fit. For example, when operation 83 is executed and the viewing angle of each picture-element is calculated, the recorded comparison can be fitted to a correction function which describes a dependence of the correction on the viewing angle. The correction function is optionally and preferably symmetric for a change in sign of the viewing angle. When the value of the correction factor is known for some angles (e.g., when there are picture-elements at a zero or relatively small, for example less than 10°, viewing angle, in which case the correction factor for these elements can be set to 1), the known correction factor can be used as an input for the fit.
Another example is when two picture-elements are at viewing angles which are the same in absolute value but opposite in sign. In these cases, the correction factor is applied such that the optical data, after the correction, is the same for these two picture-elements. Thus, the present embodiments contemplate a procedure which searches for two picture-elements that are at viewing angles which are the same in absolute value but opposite in sign, compares the optical data associated with these two picture-elements and, if the optical data are significantly different (for example, with a tolerance of less than 10%), corrects the optical data of at least one of these two picture-elements to ensure that the two picture-elements have the same or similar optical data. The correction applied for these two picture-elements is optionally and preferably used for constructing the correction function.
Since the correction function is constructed based on the comparison 85, the function is specific to the body of the subject under investigation, even when prior knowledge is used for its construction.
The subject-specific correction function g(θ) can be constructed as a lookup table that can be stored in a non-transitory computer readable medium, or it can be constructed as an analytical function, e.g., a polynomial function of θ. In various exemplary embodiments of the invention g(θ) is symmetric with respect to a change in sign of its argument θ. Specifically, in these embodiments g(θ) = g(-θ).
A representative procedure for constructing an analytical subject-specific correction function is as follows. The recorded comparisons are fitted to a function g(θ) which is parameterized according to a predetermined parameterization (e.g., a polynomial function of the angle θ, optionally and preferably symmetric with respect to a change in sign). The parameters of the function are determined by the fitting procedure such that, when the function is applied to the optical data of picture-elements within the overlaps, the resulting corrected optical data of pairs of picture-elements at the overlap that correspond to the same area element over the surface of the body are matched (e.g., having the same grey level) within a predetermined tolerance (e.g., a tolerance of 30% or 20% or 10% or 5% or less).
A typical objective function for performing the fit can be written mathematically as h({s}) = Σθ1,θ2 |g(θ1)·p1(θ1) - g(θ2)·p2(θ2)|, where {s} is the set of parameters that define the parameterized function g, p1(θ1) is a picture-element at overlap 208 characterized by a viewing angle θ1, p2(θ2) is a picture-element at overlap 210 characterized by a viewing angle θ2, and p1(θ1) and p2(θ2) correspond to the same area element on the surface of the body. Another example of an objective function is h({s}) = Σθ1,θ2 [g(θ1)·p1(θ1)/g(θ2)·p2(θ2)]. Other examples are also contemplated. The set {s} of parameters can include any number of parameters. For example, when g(θ) is parameterized to a quadratic form, e.g., g(θ) = a0 + a1·θ + a2·θ² or g(θ) = a0 + a1·|θ| + a2·θ², the set {s} includes three parameters ({s} = {a0, a1, a2}, in the present example). The fitting procedure varies the set of parameters {s} to reduce the value of the objective function h. Preferably, the stopping criterion for the fitting process is to achieve matching of optical (thermal, spectral, color) data, within the predetermined tolerance, for at least 50% or at least 60% or at least 70% or at least 80% or at least 90% of the pairs of corresponding picture-elements at the overlap.
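A minimal sketch of the fit, under simplifying assumptions not mandated by the source: g is parameterized as g(θ) = 1 + a2·θ² (symmetric, with g(0) = 1 so that picture-elements at zero viewing angle are left unchanged), which turns the matching condition g(θ1)·p1(θ1) = g(θ2)·p2(θ2) into a linear least-squares problem for a2 instead of a general minimization of h({s}):

```python
import numpy as np

def fit_correction(p1, th1, p2, th2):
    """Fit g(theta) = 1 + a2*theta**2 so that corrected pairs match:
    g(th1)*p1 ~ g(th2)*p2 for picture-elements covering the same
    surface area in the two overlaps.  Requiring equality per pair
    gives a2*(p1*th1**2 - p2*th2**2) = p2 - p1, a linear system in
    a2, solved here in the least-squares sense."""
    p1, th1 = np.asarray(p1, float), np.asarray(th1, float)
    p2, th2 = np.asarray(p2, float), np.asarray(th2, float)
    A = p1 * th1**2 - p2 * th2**2
    b = p2 - p1
    a2 = float(A @ b) / float(A @ A)
    return lambda theta: 1.0 + a2 * np.asarray(theta, float) ** 2
```

With exact synthetic data generated from a quadratic g, the least-squares solve recovers the true coefficient; with real data, the residual plays the role of the objective h.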
Once the function g(θ) is obtained by fitting, the function can be applied also to picture-elements outside the overlap. When the preprocessing operation 82 is executed, the correction function g(θ) can be applied either before or after the preprocessing.
In some embodiments of the present invention the method proceeds to 87 at which the image dataset is regenerated using the corrected optical data. In embodiments in which the optical data includes thermal data, the method can proceed to 88 at which a temperature map of the portion of the body is generated using the corrected thermal data. The regenerated dataset and/or temperature map can optionally be displayed on a display device. The method can continue to 89 at which the corrected data are compared to data of a reference dataset, as further detailed hereinabove. The method can optionally and preferably continue to 90 at which the presence or absence of a thermally distinguished region in the portion of the living body is determined based on the corrected thermal data, as further detailed hereinabove.
The method ends at 91.
Reference is now made to FIG. 4 which is a schematic illustration of an image correction system 20, according to some embodiments of the present invention. System 20 preferably comprises a digital input 22 for receiving one or more image datasets (e.g., 3D thermospatial or spectrospatial representations) as further detailed hereinabove. System 20 can further comprise a data processor 24.
In some embodiments of the present invention data processor 24 calculates, based on the spatial data of the input dataset (e.g., the input 3D thermospatial or spectrospatial representation), a viewing angle θ, and applies a correction function g(θ) for correcting the optical (e.g., thermal) data as further detailed hereinabove. In some embodiments of the present invention data processor 24 identifies picture-elements at the overlap portions of the representations, compares optical data associated with the identified picture-elements among the datasets, and corrects, based on the comparison, optical data associated with picture-elements outside the overlap portions. Data processor 24 can also construct a subject-specific correction function based on the comparison, as further detailed hereinabove.
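One plausible way for the data processor to derive the viewing angle θ from the spatial data — assuming the 3D representation supplies an outward surface normal for each picture-element and the camera position is known — is to take the angle between the normal and the line of sight. This is a sketch of that geometric computation, not the patent's prescribed method:

```python
import numpy as np

def viewing_angle(normal, surface_point, camera_pos):
    """Angle between the outward surface normal at a picture-element and the
    line of sight from that element to the camera (0 when viewed head-on)."""
    v = camera_pos - surface_point
    cos_t = np.dot(normal, v) / (np.linalg.norm(normal) * np.linalg.norm(v))
    # Clip guards against tiny floating-point excursions outside [-1, 1].
    return np.arccos(np.clip(cos_t, -1.0, 1.0))
```

For a camera directly along the normal the angle is 0; a camera offset 45° from the normal yields π/4.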
System 20 typically comprises a non-transitory computer readable medium 26 that stores computer readable data types and/or computer readable instructions for carrying out the operations of data processor 24. For example, when the correction function g(θ) is in the form of a lookup table, the lookup table can be stored in medium 26, and when the correction function g(θ) is in the form of an analytical function, computer instructions for calculating g(θ) for a given angle θ can be stored in medium 26.
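The two storage options can be illustrated side by side: the analytical form evaluates g(θ) directly, while the lookup-table form tabulates g once over the expected angle range and interpolates at run time. The quadratic coefficients below are placeholders for illustration, not values from the disclosure:

```python
import numpy as np

# Placeholder coefficients (assumed, not taken from the patent).
a0, a1, a2 = 1.0, 0.02, 0.15

def g_analytic(theta):
    """Analytical form of the correction function g(theta)."""
    return a0 + a1 * np.abs(theta) + a2 * theta ** 2

# Lookup-table form: tabulate once, interpolate at run time.
grid = np.linspace(0.0, np.pi / 2, 256)
table = g_analytic(grid)

def g_lut(theta):
    return np.interp(np.abs(theta), grid, table)
```

With a reasonably dense table the two forms agree to well within typical measurement tolerance.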
When the optical data includes thermal data, data processor 24 optionally and preferably determines the presence or absence of a thermally distinguished region in the portion of the living body based on the corrected thermal data, as further detailed hereinabove. Optionally and preferably, data processor 24 determines whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
In some embodiments of the present invention system 20 comprises an image generator 28 that re-generates the dataset (e.g., the 3D thermospatial or spectrospatial representation) using the corrected optical (e.g., thermal, spectral, color) data. Additionally or alternatively, when the optical data includes thermal data, image generator 28 can generate a temperature map of the portion of the body using the corrected thermal data. The re-generated dataset and/or temperature map can be stored in memory medium 26. System 20 can further comprise a digital output 30 that outputs the re-generated dataset and/or temperature map in any known data format. The re-generated dataset and/or temperature map can be transmitted to an external system such as a cloud storage facility or a remote computer. The re-generated dataset and/or temperature map can also be transmitted to a display 32 which displays the dataset and/or temperature map, for example, as color or gray-scale images or as contour plots.
As used herein the term "about" refers to ± 10 %. The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments." Any particular embodiment of the invention may include a plurality of "optional" features unless such features conflict.
The terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to".
The term "consisting of" means "including and limited to".
The term "consisting essentially of" means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
As used herein, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
Throughout this application, various embodiments of this invention may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
Whenever a numerical range is indicated herein, it is meant to include any cited numeral (fractional or integral) within the indicated range. The phrases "ranging/ranges between" a first indicated number and a second indicated number and "ranging/ranges from" a first indicated number "to" a second indicated number are used herein interchangeably and are meant to include the first and second indicated numbers and all the fractional and integral numerals therebetween.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Various embodiments and aspects of the present invention as delineated hereinabove and as claimed in the claims section below find experimental support in the following examples.
EXAMPLES
Reference is now made to the following examples, which together with the above descriptions illustrate some embodiments of the invention in a non-limiting fashion.
An experiment was conducted according to some embodiments of the present invention for correcting thermal data. Three women participated in the experiment. The breasts of each woman were imaged using two thermal cameras, wherein two different viewing angles were used for each camera. 3D spatial data for the breasts were also obtained from a 3D imager, and 3D thermospatial representations were obtained from the thermal and spatial data. The images of each camera were registered by translation and rotation relative to anchor points marked on each image.
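Registration by translation and rotation against anchor points is commonly performed with a least-squares rigid fit such as the Kabsch algorithm; the sketch below is one plausible implementation of such a fit, not necessarily the procedure used in the experiment:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t (Kabsch algorithm) mapping
    anchor points `src` onto `dst`, i.e. dst ~= src @ R.T + t."""
    # Center both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # SVD of the cross-covariance gives the optimal rotation.
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - src.mean(axis=0) @ R.T
    return R, t
```

Given anchor points marked in two images, the recovered (R, t) aligns one image's coordinates onto the other's before the thermal data are compared.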
The thermal data of each image was corrected in accordance with some embodiments of the present invention and the mean and standard deviation of the difference between the grey levels in each pair of images from the same camera were calculated both before and after the correction. The results of the calculations are summarized in Table 1, below, and graphically illustrated in FIG. 5. Table 1
[Table 1 appears as an image (imgf000032_0001) in the original publication; its numerical values are not reproduced in this text.]
As shown in Table 1 and FIG. 5, the mean and standard deviation are consistently reduced by the correction procedure, indicating an improvement of the thermal uniformity between picture-elements that describe the same region from two different viewing angles. This observation was supported by a paired t-test which showed, at a confidence level of 95%, that the mean of the differences is higher before the correction than after the correction. Some cases exhibited relatively high standard deviation values, presumably due to artifacts in the thermal data, such as the artifact that is marked with an oval in FIG. 6.
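A paired t-test of the kind described can be reproduced with SciPy (assuming SciPy ≥ 1.6 for the `alternative` argument); the before/after values below are illustrative stand-ins, not the values of Table 1:

```python
import numpy as np
from scipy import stats

# Hypothetical per-image-pair mean grey-level differences before and after
# the correction (stand-in numbers, not the values of Table 1).
before = np.array([4.1, 3.8, 5.2, 4.7, 3.9, 4.4])
after = np.array([2.0, 1.9, 2.6, 2.3, 2.1, 2.2])

# One-sided paired t-test: is the mean difference higher before correction
# than after it?
t_stat, p_value = stats.ttest_rel(before, after, alternative="greater")
significant = p_value < 0.05  # 95% confidence level
```

A paired test is appropriate here because each before/after pair comes from the same image pair, so the per-pair differences, not the pooled samples, carry the evidence.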
Visible light and thermal images of one woman subject are shown in FIGs. 7A- H. FIGs. 7A and 7B are visible light images from the two different viewing angles, FIG. 7C shows the registration of the two images, FIG. 7D shows picture-elements for which the registration difference is less than 2 mm, FIGs. 7E and 7F show grey level differences between the thermal images before (FIG. 7E) and after (FIG. 7F) the correction of thermal data, FIG. 7G shows the difference between the absolute values of the images in FIGs. 7E and 7F, and FIG. 7H marks regions at which the thermal correction provides improvement.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

Claims

WHAT IS CLAIMED IS:
1. A method of correcting image data, the method comprising:
obtaining at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping fields-of-view;
identifying picture-elements at overlap portions of said datasets;
comparing optical data associated with each of at least some of said identified picture-elements among said datasets, and
based on said comparison, correcting optical data associated with picture- elements outside said overlap portions.
2. The method of claim 1, further comprising:
for each dataset, calculating a viewing angle for each picture-element over said dataset; and
constructing, based on said comparison, a correction function which is specific to said scene and which describes a dependence of said correction of said optical data on said viewing angle.
3. The method of claim 2, wherein said correction function is nonlinear with respect to said angle.
4. The method of claim 2, wherein said correction function comprises a quadratic function.
5. The method according to any of claims 1-4, wherein each of at least two of said datasets is a 3D representation of said scene, said representation having 3D spatial data representing a non-planar surface of a portion of a body in said scene and optical data associated with said 3D spatial data.
6. The method according to any of claims 1-4, wherein said optical data comprises visible light data, and wherein said comparing and said correcting is applied to visible light data.
7. The method according to any of claims 1-4, wherein said optical data comprises ultraviolet light data and wherein said comparing and said correcting is applied to ultraviolet light data.
8. The method according to any of claims 1-4, wherein said optical data comprises X ray data and wherein said comparing and said correcting is applied to X ray data.
9. The method according to any of claims 1-4, wherein said optical data comprises thermal data, and wherein said comparing and said correcting is applied to thermal data.
10. The method according to claim 9, further comprising determining presence or absence of a thermally distinguished region in said portion of said body based on said corrected thermal data.
11. The method according to claim 5, further comprising generating a temperature map of said portion of said body using said corrected thermal data.
12. The method according to any of claims 6-10, further comprising generating a temperature map of said portion of said body using said corrected thermal data.
13. The method according to claim 11, further comprising displaying said temperature map on a display and/or transmitting said temperature map to a non- transitory computer readable medium.
14. The method according to claim 12, further comprising displaying said temperature map on a display and/or transmitting said temperature map to a non- transitory computer readable medium.
15. The method according to claim 5, wherein said body is a living body.
16. The method according to any of claims 6-13, wherein said body is a living body.
17. The method according to claim 10, wherein said body is a living body and the method comprises determining whether or not said thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
18. The method according to claim 11, wherein said portion of said body is a breast of a human subject.
19. The method according to claim 12, wherein said portion of said body is a breast of a human subject.
20. The method according to claim 1, further comprising re-generating at least one of said datasets using said corrected optical data.
21. The method according to any of claims 2-18, further comprising regenerating at least one of said datasets using said corrected optical data.
22. The method according to claim 20, further comprising displaying said regenerated dataset on a display and/or transmitting said re-generated dataset to a non-transitory computer readable medium.
23. The method according to claim 21, further comprising displaying said regenerated dataset on a display and/or transmitting said re-generated dataset to a non-transitory computer readable medium.
24. The method according to claim 1, comprising:
searching for two picture-elements within said overlap that are at viewing angles which are the same in absolute values but opposite in signs;
comparing optical data associated with said two picture-elements; and correcting optical data of at least one of said two picture-elements to ensure that said two picture-elements have the same or similar optical data.
25. The method according to any of claims 2-22, comprising:
searching for two picture-elements within said overlap that are at viewing angles which are the same in absolute values but opposite in signs;
comparing optical data associated with said two picture-elements; and
correcting optical data of at least one of said two picture-elements to ensure that said two picture-elements have the same or similar optical data.
26. A computer software product, comprising a non-transitory computer- readable medium in which program instructions are stored, which instructions, when read by a data processor, cause the data processor to receive at least two image datasets and execute the method according to any of claims 1-24.
27. An image correction system, the system comprising:
a digital input for receiving at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping fields-of-view; and
a data processor configured for identifying picture-elements at overlap portions of said datasets, for comparing optical data associated with each of at least some of said identified picture-elements among said datasets, and for correcting, based on said comparison, optical data associated with picture-elements outside said overlap portions.
28. The system of claim 27, wherein said data processor is configured for calculating, for each dataset, a viewing angle for each picture-element over said dataset, and constructing, based on said comparison, a correction function which is specific to said scene and which describes a dependence of said correction of said optical data on said viewing angle.
29. The system of claim 28, wherein said correction function is nonlinear with respect to said angle.
30. The system of claim 28, wherein said correction function comprises a quadratic function.
31. The system according to any of claims 27-30, wherein each of at least two of said datasets is a 3D representation of said scene, said representation having 3D spatial data representing a non-planar surface of a portion of a body in said scene and optical data associated with said 3D spatial data.
32. The system according to any of claims 27-30, wherein said optical data comprises visible light data, and wherein said data processor is configured to apply said comparison and said correction to visible light data.
33. The system according to any of claims 27-30, wherein said optical data comprises ultraviolet light data and wherein said data processor is configured to apply said comparison and said correction to ultraviolet light data.
34. The system according to any of claims 27-30, wherein said optical data comprises X ray data and wherein said data processor is configured to apply said comparison and said correction to X ray data.
35. The system according to any of claims 27-30, wherein said optical data comprises thermal data, and wherein said data processor is configured to apply said comparison and said correction to thermal data.
36. The system according to claim 35, wherein said data processor is configured for determining presence or absence of a thermally distinguished region in said portion of said body based on said corrected thermal data.
37. The system according to any of claims 31-36, further comprising an image generator configured for generating a temperature map of said portion of said body using said corrected thermal data.
38. The system according to claim 37, further comprising at least one of a display for displaying said temperature map, and a non-transitory computer readable medium for storing said temperature map.
39. The system according to any of claims 31-38, wherein said body is a living body.
40. The system according to claim 36, wherein said body is a living body and wherein said data processor is configured for determining whether or not said thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
41. The system according to any of claims 27-40, further comprising an image generator for re-generating at least one of said datasets using said corrected optical data.
42. The system according to claim 41, further comprising at least one of a display for displaying said re-generated dataset, and a non-transitory computer readable medium for storing said re-generated dataset.
43. An imaging system, comprising:
an image generator for generating at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping fields-of-view; and
the image correction system according to claim 27.
44. An imaging system, comprising:
an image generator for generating at least two image datasets respectively captured from at least two different viewpoints of a scene and having partially overlapping fields-of-view; and
the image correction system according to any of claims 28-42.
PCT/IL2016/051022 2015-09-14 2016-09-14 Image data correction based on different viewpoints WO2017046796A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IL258135A IL258135B2 (en) 2015-09-14 2018-03-14 Image data correction based on different viewpoints

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562218026P 2015-09-14 2015-09-14
US201562218020P 2015-09-14 2015-09-14
US62/218,020 2015-09-14
US62/218,026 2015-09-14

Publications (1)

Publication Number Publication Date
WO2017046796A1 true WO2017046796A1 (en) 2017-03-23

Family

ID=58288253

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/IL2016/051022 WO2017046796A1 (en) 2015-09-14 2016-09-14 Image data correction based on different viewpoints
PCT/IL2016/051021 WO2017046795A1 (en) 2015-09-14 2016-09-14 Method and system for correcting image data

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/IL2016/051021 WO2017046795A1 (en) 2015-09-14 2016-09-14 Method and system for correcting image data

Country Status (2)

Country Link
IL (2) IL258135B2 (en)
WO (2) WO2017046796A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019136908A1 (en) * 2018-01-12 2019-07-18 平安科技(深圳)有限公司 Cancer identification method, device and storage medium
WO2020204784A1 (en) * 2019-04-05 2020-10-08 Delaval Holding Ab Method and control arrangement for detecting a health condition of an animal
CN113592798A (en) * 2021-07-21 2021-11-02 山东理工大学 Road disease intelligent identification method, system, terminal and medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6201541B1 (en) * 1997-12-11 2001-03-13 Cognitens, Ltd. System and method for “Stitching” a plurality of reconstructions of three-dimensional surface features of object(s) in a scene defined relative to respective coordinate systems to relate them to a common coordinate system
US6283917B1 (en) * 1998-10-01 2001-09-04 Atl Ultrasound Ultrasonic diagnostic imaging system with blurring corrected spatial compounding
US20040061774A1 (en) * 2002-04-10 2004-04-01 Wachtel Robert A. Digital imaging system using overlapping images to formulate a seamless composite image and implemented using either a digital imaging sensor array
US20050047544A1 (en) * 2003-08-29 2005-03-03 Dongshan Fu Apparatus and method for registering 2D radiographic images with images reconstructed from 3D scan data
US20050129325A1 (en) * 2003-11-27 2005-06-16 Sony Corporation Image processing apparatus and method
US20060250389A1 (en) * 2005-05-09 2006-11-09 Gorelenkov Viatcheslav L Method for creating virtual reality from real three-dimensional environment
US20070236595A1 (en) * 2006-04-10 2007-10-11 Sony Taiwan Limited. Method for Improving Image Stitching Accuracy with Lens Distortion Correction and Device for Implementing the Same
US20100054627A1 (en) * 2008-09-04 2010-03-04 Pro Track Ltd. Methods and systems for creating an aligned bank of images with an iterative self-correction technique for coordinate acquisition and object detection
US20100135550A1 (en) * 2007-06-25 2010-06-03 Real Imaging Ltd. Method, device and system for thermography
US20120157800A1 (en) * 2010-12-17 2012-06-21 Tschen Jaime A Dermatology imaging device and method
WO2015029022A1 (en) * 2013-08-29 2015-03-05 Real Imaging Ltd. Surface simulation
US20150160342A1 (en) * 2012-10-05 2015-06-11 Faro Technologies, Inc. Using two-dimensional camera images to speed registration of three-dimensional scans



Also Published As

Publication number Publication date
IL258134B (en) 2022-04-01
IL258134A (en) 2018-05-31
IL258135B1 (en) 2023-03-01
IL258135A (en) 2018-05-31
WO2017046795A1 (en) 2017-03-23
IL258135B2 (en) 2023-07-01

Similar Documents

Publication Publication Date Title
US10264980B2 (en) Method apparatus and system for determining a data signature of 3D image
US8620041B2 (en) Method apparatus and system for analyzing thermal images
EP2164385B1 (en) Method, device and system for thermography
US20160206211A1 (en) Surface simulation
US20110021944A1 (en) Method apparatus and system for analyzing thermal images
GB2496255A (en) Generating an enhanced image from medical imaging data
IL258135B2 (en) Image data correction based on different viewpoints
Kręcichwost et al. Chronic wounds multimodal image database
Juszczyk et al. Wound 3D geometrical feature estimation using Poisson reconstruction
JP5800549B2 (en) Image processing apparatus, operation method of image processing apparatus, and image processing program
Ben-Hamadou et al. Construction of extended 3D field of views of the internal bladder wall surface: A proof of concept
Strakowska et al. Cross-correlation based movement correction method for biomedical dynamic infrared imaging
Ali et al. Robust bladder image registration by redefining data-term in total variational approach
EP4258207A1 (en) Automatic rib fracture detection from unfolded scan images
You Quantitative infrared thermography for infantile hemangioma assessment
Gutierrez Fusion of thermal and three-dimensional data for chronic wound monitoring
WO2023194267A1 (en) Automatic rib fracture detection from unfolded scan images
IL213326A (en) Method, apparatus and system for determining a theremospatial signature
Chemaly Multimodality image registration for visualization in robotic assisted breast biopsy
Rettmann et al. Toward standardized mapping for left atrial analysis and cardiac ablation guidance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16845838

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 258135

Country of ref document: IL

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16845838

Country of ref document: EP

Kind code of ref document: A1