WO2017046795A1 - Procédé et système pour corriger des données d'image - Google Patents

Procédé et système pour corriger des données d'image

Info

Publication number
WO2017046795A1
Authority
WO
WIPO (PCT)
Prior art keywords
thermospatial
data
thermal
representation
living body
Prior art date
Application number
PCT/IL2016/051021
Other languages
English (en)
Inventor
Israel Boaz Arnon
Yoel Arieli
Oria KAHANA
Original Assignee
Real Imaging Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Real Imaging Ltd.
Publication of WO2017046795A1
Priority to IL258134A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0082 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B 5/0091 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for mammography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/01 Measuring temperature of body parts; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • A61B 5/015 By temperature mapping of body part
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/143 Sensing or illuminating at different wavelengths
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/98 Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2576/00 Medical imaging apparatus involving image processing or analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/16 Image acquisition using multiple overlapping images; Image stitching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/03 Recognition of patterns in medical or anatomical images
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • the present invention in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a universal function.
  • A thermographic image is typically obtained by receiving, from the body of the subject, radiation at any one of several infrared wavelength ranges and analyzing the radiation to provide a two-dimensional temperature map of the surface.
  • the thermographic image can be in the form of either or both of a visual image and corresponding temperature data.
  • the output from infrared cameras used for infrared thermography typically provides an image comprising a plurality of pixel data points, each pixel providing temperature information which is visually displayed, using a color code or grayscale code.
  • the temperature information can be further processed by computer software to generate, for example, a mean temperature for the image, or for a discrete area of the image, by averaging the temperature data associated with all the pixels or a sub-collection thereof.
  • Based on the thermographic image, a physician diagnoses the site and determines, for example, whether or not the site includes an inflammation, while relying heavily on experience and intuition.
  • In known systems, non-thermographic image data acquisition functionality acquires non-thermographic image data, and thermographic image data acquisition functionality acquires thermographic image data.
  • U.S. Patent No. 7,292,719 discloses a system for determining presence or absence of one or more thermally distinguishable objects in a living body.
  • A combined image generator is configured to combine non-thermographic three-dimensional data of a three-dimensional tissue region in the living body with thermographic two-dimensional data of the tissue region so as to generate three-dimensional temperature data associated with the three-dimensional tissue region.
  • a method of correcting image data comprises obtaining at least one 3D thermospatial representation having 3D spatial data representing a non-planar surface of a portion of a living body and thermal data associated with the 3D spatial data; based on the spatial data, calculating a viewing angle for each of a plurality of picture-elements over the thermospatial representation; and for each of at least some of the picture-elements, applying a predetermined correction function which describes a dependence of a correction of the thermal data on the viewing angle, for correcting thermal data associated with the picture-element.
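The flow of the method just summarized can be sketched in a few lines of Python. This is a minimal sketch only: the array names, the additive form of the correction, and the quadratic example for g are assumptions for illustration and are not taken from the patent text.

```python
import numpy as np

def correct_thermal_data(normals, thermal, optical_axis, g):
    """Sketch of the claimed correction: compute the viewing angle of every
    picture-element from its surface normal and the camera's optical axis,
    then apply the predetermined correction function g(theta)."""
    axis = np.asarray(optical_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    cos_theta = np.clip(normals @ axis, -1.0, 1.0)
    theta = np.arccos(cos_theta)        # viewing angle per picture-element
    return thermal + g(theta)           # additive form assumed for the sketch

# One possible correction function; a quadratic form is mentioned in the text,
# but this coefficient is a placeholder, not a value from the patent.
g_quadratic = lambda theta: 0.5 * theta**2
```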
  • the predetermined correction function is stored in a non-transitory computer readable medium as a lookup table.
  • In some embodiments, obtaining the at least one 3D thermospatial representation comprises obtaining two or more 3D thermospatial representations of the same portion of the living body, and applying the predetermined correction function is repeated for at least two of the two or more 3D thermospatial representations, each time using the same predetermined correction function.
  • the method comprises determining presence or absence of a thermally distinguished region in the portion of the living body based on the corrected thermal data.
  • the method comprises determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the portion of the living body is a breast of a human subject.
  • the method comprises re-generating the at least one 3D thermospatial representation using the corrected thermal data.
  • the method comprises displaying the re-generated 3D thermospatial representation on a display and/or transmitting the re-generated 3D thermospatial representation to a non-transitory computer readable medium.
  • the method comprises generating a temperature map of the portion of the body using the corrected thermal data.
  • the method comprises displaying the temperature map on a display and/or transmitting the temperature map to a non-transitory computer readable medium.
  • a computer software product comprises a non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a data processor, cause the data processor to receive at least one 3D thermospatial representation of a portion of a living body and execute the method as delineated above and optionally and preferably as further exemplified below.
  • an image correction system comprising: a digital input for receiving at least one 3D thermospatial representation having 3D spatial data representing a non- planar surface of a portion of a living body and thermal data associated with the 3D spatial data; and a data processor configured for calculating, based on the spatial data, a viewing angle for each of a plurality of picture-elements over the thermospatial representation, and for applying, for each of at least some of the picture-elements, a predetermined correction function which describes a dependence of a correction of the thermal data on the viewing angle, for correcting thermal data associated with the picture-element.
  • the predetermined correction function is nonlinear with respect to the angle.
  • the predetermined correction function comprises a quadratic function.
  • the system comprises a non- transitory computer readable medium, wherein the predetermined correction function is stored in the computer readable medium as a lookup table.
  • the input receives two or more 3D thermospatial representations of the same portion of the living body, and wherein the data processor is configured for applying the predetermined correction function separately for at least two of the two or more 3D thermospatial representations, each time using the same predetermined correction function.
  • the data processor is configured for determining presence or absence of a thermally distinguished region in the portion of the living body based on the corrected thermal data.
  • the data processor is configured for determining whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the system comprises an image generator for re-generating the at least one 3D thermospatial representation using the corrected thermal data.
  • the system comprises at least one of a display for displaying the re-generated 3D thermospatial representation, and a non-transitory computer readable medium for storing the re-generated 3D thermospatial representation.
  • the system comprises an image generator for generating a temperature map of the portion of the body using the corrected thermal data.
  • the system comprises at least one of a display for displaying the temperature map, and a non-transitory computer readable medium for storing the temperature map.
  • an imaging system comprising: a thermospatial generator for generating a 3D thermospatial representation of a portion of a living body, the 3D thermospatial representation having 3D spatial data representing a non-planar surface of the portion of the living body and thermal data associated with the 3D spatial data; and the image correction system as delineated above and optionally and preferably as further exemplified below.
  • Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
  • In some embodiments, one or more tasks are performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • the data processor includes a volatile memory for storing instructions and/or data and/or a non-volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data.
  • Optionally, a network connection is provided as well.
  • a display and/or a user input device such as a keyboard or mouse are optionally provided as well.
  • FIGs. 1A-C are schematic illustrations of a synthesized thermospatial image, according to some embodiments of the present invention.
  • FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention
  • FIG. 3 is a graph of measured temperatures of several points on a human's skin as a function of angle between the normal to skin and the optical axis of a thermal camera;
  • FIG. 4 is a schematic illustration of an image correction system, according to some embodiments of the present invention.
  • FIG. 5 shows experimental results in which thermal data of images were corrected in accordance with some embodiments of the present invention
  • FIG. 6 is a thermographic image obtained during experiments performed according to some embodiments of the present invention.
  • FIGs. 7A and 7B are visible light images from the two different viewing angles, obtained during experiments performed according to some embodiments of the present invention.
  • FIG. 7C shows registration of the images of FIGs. 7A and 7B, obtained during experiments performed according to some embodiments of the present invention;
  • FIG. 7D shows picture-elements for which a registration difference calculated during experiments performed according to some embodiments of the present invention was less than 2 mm;
  • FIGs. 7E and 7F show grey level differences between thermal images before (FIG. 7E) and after (FIG. 7F) thermal data correction performed according to some embodiments of the present invention
  • FIG. 7G shows difference between the absolute values of the images in FIGs. 7E and 7F.
  • FIG. 7H shows regions at which the thermal correction of the present embodiments provides improvement.
  • the present invention in some embodiments thereof, relates to data analysis and, more particularly, but not exclusively, to a method and system for correcting image data, preferably using a universal function.
  • corrected image data can be used to determine presence or absence of one or more distinguishable regions in a portion of a body, more preferably a living body.
  • the technique of the present embodiments is applied for correcting thermal data, which can include temperature data or some proxy thereof, such as, but not limited to, grayscale data and/or wavelength data.
  • the corrected thermal data is used for determining the likelihood for the presence of a thermally distinguishable region in the portion of the body.
  • the analysis of the present embodiments can be used to extract properties of the underlying tissue. For example, determination of the likelihood that a thermally distinguished region is present in the portion of the body can be used for assessing whether or not the portion of the body has a pathology such as a tumor or an inflammation.
  • An elevated temperature is generally associated with a tumor due to the metabolic abnormality of the tumor and proliferation of blood vessels (angiogenesis) at and/or near the tumor.
  • the cells proliferate faster and thus are more active and generate more heat. This tends to enhance the temperature differential between the tumor itself and the surrounding tissue.
  • the present embodiments can therefore be used for diagnosis of cancer, particularly, but not exclusively breast cancer.
  • the technique of the present embodiments is optionally and preferably applied to surface information that describes the surface of the body.
  • the surface information optionally and preferably comprises thermal information and spatial information.
  • the thermal information comprises data pertaining to heat evacuated from or absorbed by the surface. Since different parts of the surface generally evacuate or absorb different amounts of heat, the thermal information comprises a set of tuples, each comprising the coordinates of a region or a point on the surface and a thermal numerical value (e.g., temperature, thermal energy) associated with the point or region.
  • the thermal information can be transformed to visible signals, in which case the thermal information is in the form of a thermographic image.
  • the thermal data is typically arranged gridwise in a plurality of picture-elements (e.g. , pixels, arrangements of pixels) representing the thermographic image.
  • Each picture-element is represented by an intensity value or a grey-level over the grid.
  • the number of different intensity values can be different from the number of grey-levels.
  • an 8-bit display can generate 256 different grey- levels.
  • the number of different intensity values corresponding to thermal information can be much larger.
  • the thermal information spans over a range of 37 °C and is digitized with a resolution of 0.1 °C. In this case, there are 370 different intensity values and the use of grey-levels is less accurate by a factor of approximately 1.4.
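The arithmetic behind the factor of roughly 1.4 can be restated directly; the snippet below only reproduces the example given in the preceding paragraph.

```python
# A 37 degC thermal span digitized at 0.1 degC resolution:
intensity_levels = int(37.0 / 0.1)   # 370 distinct intensity values
grey_levels = 2 ** 8                 # 256 grey levels on an 8-bit display

print(intensity_levels / grey_levels)  # ~1.45, i.e. roughly the factor of 1.4
```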
  • a photon thermal camera can provide information pertaining to the number of photons detected by the camera detector. Such information can extend over a range of about 6000-8000 intensity values.
  • the correction technique is applied to intensity values, and in some embodiments of the present invention the correction technique is applied to grey-levels. Combinations of the two (such as double processing) are also contemplated.
  • The term "pixel" is sometimes used herein as an abbreviation for a picture-element. However, this is not intended to limit the meaning of the term "picture-element", which refers to a unit of the composition of an image.
  • thermographic image is used interchangeably throughout the specification without limiting the scope of the present embodiments in any way. Specifically, unless otherwise defined, the use of the term “thermographic image” is not to be considered as limited to the transformation of the thermal information into visible signals.
  • a thermographic image can be stored in the memory of a computer readable medium, preferably a non-transitory computer readable medium, as a set of tuples as described above.
  • the spatial information comprises data pertaining to geometric properties of a surface which at least partially encloses a three-dimensional volume.
  • the surface is non-planar, e.g. , curved.
  • the surface is a two-dimensional object embedded in a three-dimensional space.
  • a surface is a metric space induced by a smooth connected and compact Riemannian 2-manifold.
  • the geometric properties of the surface would be provided explicitly, for example, the slope and curvature (or even other spatial derivatives or combinations thereof) for every point of the surface.
  • the spatial information of the surface is a reduced version of a 3D spatial representation, which may be either a point-cloud or a 3D reconstruction (e.g. , a polygonal mesh or a curvilinear mesh) based on the point cloud.
  • the 3D spatial representation is expressed via a 3D coordinate-system, such as, but not limited to, Cartesian, Spherical, Ellipsoidal, 3D Parabolic or Paraboloidal coordinate 3D system.
  • the spatial data, in some embodiments of the present invention, can be in the form of an image. Since the spatial data represent the surface, such an image is typically a two-dimensional image which, in addition to indicating the lateral extent of body members, further indicates the relative or absolute distance of the body members, or portions thereof, from some reference point, such as the location of the imaging device. Thus, the image typically includes information residing on a surface of a three-dimensional body and not necessarily in the bulk. Yet, it is commonly acceptable to refer to such an image as "a three-dimensional image" because the surface is conveniently defined over a three-dimensional system of coordinates. Thus, throughout this specification and in the claims section that follows, the terms "three-dimensional image" and "three-dimensional representation" primarily relate to surface entities.
  • the lateral dimensions of the spatial data are referred to as the x and y dimensions, and the range data (the depth or distance of the body members from a reference point) is referred to as the z dimension.
  • the thermospatial representation can be in the form of digital data (e.g., a list of tuples associated with digital data describing thermal quantities) or in the form of an image (e.g., a three-dimensional image color-coded or grey-level coded according to the thermal data).
  • a thermospatial representation in the form of an image is referred to hereinafter as a thermospatial image.
  • the thermospatial representation is defined over a 3D spatial representation of the body and has thermal data associated with a surface of the 3D spatial representation, arranged gridwise over the surface in a plurality of picture-elements (e.g., pixels, arrangements of pixels), each represented by an intensity value or a grey-level over the grid.
  • When the thermospatial representation is in the form of digital data, the digital data describing thermal properties can also be expressed either in terms of intensities or in terms of grey-levels, as described above. A digital thermospatial representation can also correspond to a thermospatial image, whereby each tuple corresponds to a picture-element of the image.
  • thermographic images are mapped or projected onto the surface of the 3D spatial representation to form the thermospatial representation.
  • the thermographic image to be projected onto the surface of the 3D spatial representation preferably comprises thermal data which are expressed over the same coordinate- system as the 3D spatial representation. Any type of thermal data can be used.
  • the thermal data comprises absolute temperature values
  • the thermal data comprises relative temperature values each corresponding, e.g., to a temperature difference between a respective point of the surface and some reference point
  • the thermal data comprises local temperature differences.
  • the thermal data can comprise both absolute and relative temperature values, and the like.
  • the information in the thermographic image also includes the thermal conditions (e.g. , temperature) at one or more reference markers.
  • the mapping is typically performed by registering the reference markers, e.g., by comparing their coordinates in the thermographic image with their coordinates in the 3D spatial representation, to thereby match, at least approximately, also other points, hence forming the synthesized thermospatial representation.
  • A synthesized thermospatial image, for the case in which the body portion comprises the breasts of a female or male subject, is illustrated in FIGs. 1A-C, showing a 3D spatial representation illustrated as a non-planar surface (FIG. 1A), a thermographic image illustrated as planar isothermal contours (FIG. 1B), and a synthesized thermospatial image formed by mapping the thermographic image onto a surface of the 3D spatial representation (FIG. 1C).
  • the thermal data of the thermospatial image is represented as grey-level values over a grid generally shown at 102. It is to be understood that the representation according to grey-level values is for illustrative purposes and is not to be considered as limiting. As explained above, the processing of thermal data can also be performed using intensity values.
  • Also shown is a reference marker 101 which optionally, but not obligatorily, can be used for the mapping.
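The marker-based mapping described above can be realized, for example, by estimating the thermal camera's pose from the marker correspondences and then projecting the thermal pixels onto the 3D surface. The sketch below uses OpenCV's PnP solver for the pose step; this specific solver and the helper names are illustrative assumptions, not part of the patent.

```python
import numpy as np
import cv2  # OpenCV, used here only as one convenient pose solver

def thermal_camera_pose(markers_3d, markers_px, camera_matrix):
    """Estimate the thermal camera pose from reference-marker correspondences
    (at least four markers assumed), so that thermal pixels can afterwards be
    projected onto the 3D spatial representation."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(markers_3d, dtype=np.float32),  # marker coordinates on the 3D surface
        np.asarray(markers_px, dtype=np.float32),  # the same markers in the thermographic image
        camera_matrix, None)
    if not ok:
        raise RuntimeError("pose estimation from markers failed")
    return rvec, tvec
```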
  • a series of thermal images of a section of a living body is obtained.
  • Different thermal images of the series include thermal data acquired from the portion of the body at different time instants.
  • Such a series of thermal images can be used by the present embodiments to determine thermal changes that occurred in the portion of the body over time.
  • a series of thermospatial representations of a section of a living body is obtained.
  • Different thermospatial representations of the series include thermal data acquired from the portion of the body at different time instants.
  • Such a series of thermospatial representations can be used by the present embodiments to determine thermal and, optionally, spatial changes that occurred in the portion of the body over time.
  • the series can include any number of thermal images or thermospatial representations. It was found by the inventors of the present invention that two thermal images or thermospatial representations are sufficient to perform the analysis, but more than two thermal images or thermospatial representations (e.g. , 3, 4, 5 or more) can also be used, for example, to increase accuracy of the results and/or to allow selection of best acquisitions.
  • The thermographic image and synthesized thermospatial image can be obtained using any technique known in the art, such as the techniques disclosed in International Patent Publication No. WO 2006/003658, U.S. Published Application No. 20010046316, and U.S. Patent Nos. 6,442,419, 6,765,607, 6,965,690, 6,701,081, 6,801,257, 6,201,541, 6,167,151, 6,094,198 and 7,292,719.
  • Some embodiments of the invention can be embodied on a tangible medium such as a computer for performing the method steps. Some embodiments of the invention can be embodied on a computer readable medium, preferably a non-transitory computer readable medium, comprising computer readable instructions for carrying out the method steps. Some embodiments of the invention can also be embodied in electronic device having digital computer capabilities arranged to run the computer program on the tangible medium or execute the instruction on a computer readable medium. Computer programs implementing method steps of the present embodiments can commonly be distributed to users on a tangible distribution medium. From the distribution medium, the computer programs can be copied to a hard disk or a similar intermediate storage medium. The computer programs can be run by loading the computer instructions either from their distribution medium or their intermediate storage medium into the execution memory of the computer, configuring the computer to act in accordance with the method of this invention. All these operations are well-known to those skilled in the art of computer systems.
  • FIG. 2 is a flowchart diagram of a method suitable for correcting image data, according to some embodiments of the present invention.
  • the method is applied to at least one 3D thermospatial representation of a portion of a body, preferably a living body.
  • the portion of the body can include one or more organs, e.g., a breast or a pair of breasts, or a part of an organ, e.g., a part of a breast.
  • the 3D thermospatial representation has 3D spatial data representing a non-planar surface of the portion of the living body and thermal data associated with the 3D spatial data.
  • the method begins at 10 and continues to 11 at which the 3D thermospatial representation is obtained.
  • the method can obtain the 3D thermospatial representation by receiving it from an external source such as a 3D thermospatial representation generator, or by generating the 3D thermospatial representation, for example, by combining 3D and thermal imaging.
  • the method optionally continues to 12 at which the spatial and/or thermal data is preprocessed.
  • the preprocessing operation includes definition of one or more spatial boundaries for the surface, so as to define a region-of-interest for the analysis.
  • the preprocessing operation can include defining a spatial boundary between the surface of the portion of the body and surface of the nearby tissue.
  • the surface of the nearby tissue is preferably excluded from the analysis.
  • the preprocessing comprises transformation of coordinates.
  • When the method is executed for correcting image data pertaining to more than one body portion having similar shapes, the method preferably transforms the coordinates of one or more portions of the body so as to ensure that all body portions are described by the same coordinate-system.
  • a representative example is a situation in which the surface data describe a left breast and a right breast. In this situation, the system of coordinates of the 3D thermospatial representation of one of the breasts can be flipped so as to describe both thermal images and both 3D thermospatial representations using the same coordinate-system.
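A flip of this kind is a one-line coordinate transformation; the sketch below mirrors one representation about a coordinate plane. Which plane to mirror about depends on how the representations were acquired, so the axis choice here is an assumption.

```python
import numpy as np

def mirror_representation(vertices, normals, axis=0):
    """Mirror a 3D thermospatial representation about a coordinate plane so
    that, e.g., a left breast and a right breast share one coordinate-system."""
    v = np.array(vertices, dtype=float, copy=True)
    n = np.array(normals, dtype=float, copy=True)
    v[:, axis] *= -1.0
    n[:, axis] *= -1.0      # the normals flip together with the geometry
    return v, n
```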
  • the preprocessing comprises normalization of the thermal data.
  • the normalization is useful, for example, when it is desired not to work with too high values of intensities.
  • the normalization is performed so as to transform the range of thermal values within the thermal data to a predetermined range between a predetermined minimal thermal value and a predetermined maximal thermal value. This can be done using a linear transformation as known in the art.
  • a typical value for the predetermined minimal thermal value is 1, and a typical value for the predetermined maximal thermal value is 10.
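The linear normalization mentioned above, with the typical range of 1 to 10 used as defaults, can be written as a standard min-max mapping; the sketch below is only an illustration.

```python
import numpy as np

def normalize_thermal(values, lo=1.0, hi=10.0):
    """Linearly map thermal values to the predetermined range [lo, hi]."""
    v = np.asarray(values, dtype=float)
    vmin, vmax = v.min(), v.max()
    if vmax == vmin:
        return np.full_like(v, lo)
    return lo + (v - vmin) * (hi - lo) / (vmax - vmin)
```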
  • Other ranges or normalization schemes are not excluded from the scope of the present invention.
  • the preprocessing operation includes slicing of the surface described by the spatial data to a plurality of slices.
  • the correction procedure described below can be applied separately for each slice.
  • the slicing can be along a normal direction (away from the body), parallel direction or azimuthal direction as desired.
  • the slicing can also be according to anatomical information (for example a different slice for a nipple region). Also contemplated is arbitrary slicing, in which case the surface is sliced to N regions.
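One of the slicing options mentioned above, azimuthal slicing into N sectors, can be sketched as follows; the choice of center point and of the azimuth plane are assumptions made for illustration.

```python
import numpy as np

def azimuthal_slices(vertices, n_slices, center=None):
    """Assign every picture-element to one of n_slices azimuthal sectors
    around a chosen center (the centroid by default)."""
    v = np.asarray(vertices, dtype=float)
    c = v.mean(axis=0) if center is None else np.asarray(center, dtype=float)
    phi = np.arctan2(v[:, 1] - c[1], v[:, 0] - c[0])       # azimuth in [-pi, pi]
    return ((phi + np.pi) / (2 * np.pi) * n_slices).astype(int) % n_slices
```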
  • the preprocessing comprises normalization of the spatial data.
  • the normalization is useful when it is desired to compare between thermal data of different portions of the body, for example, body portions having similar shapes but different sizes. These embodiments are particularly useful when the portion of the body is a breast and it is desired to compare the thermal data of breasts of different sizes (e.g. , a left breast to a right breast of the same subject, or a breast of one subject to a breast of another subject).
  • the method preferably continues to 13 at which a viewing angle is calculated based on the spatial data for one or more of a plurality of picture-elements over the thermospatial representation.
  • the viewing angle θ of a given picture-element p is conveniently defined between the normal to the surface of the body at picture-element p and the optical axis of the thermal imaging system that provides the thermal data associated with picture-element p.
  • the viewing angle θ is calculable because the shape of the surface is known from the spatial data, and because the optical axis is known from the thermal data.
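When the spatial data are given as a triangular mesh, the per-element viewing angle can be obtained by first estimating vertex normals and then taking the angle to the optical axis. Averaging face normals, as below, is one standard estimator; the patent does not prescribe a particular one.

```python
import numpy as np

def viewing_angles(vertices, faces, optical_axis):
    """Return the viewing angle theta (radians) between the estimated surface
    normal at every vertex and the optical axis of the thermal imaging system."""
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    face_n = np.cross(v[f[:, 1]] - v[f[:, 0]], v[f[:, 2]] - v[f[:, 0]])
    vert_n = np.zeros_like(v)
    for i in range(3):                  # accumulate each face normal on its vertices
        np.add.at(vert_n, f[:, i], face_n)
    vert_n /= np.linalg.norm(vert_n, axis=1, keepdims=True)
    axis = np.asarray(optical_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return np.arccos(np.clip(vert_n @ axis, -1.0, 1.0))
```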
  • the method optionally and preferably continues to 14 at which a predetermined correction function g(θ) is applied to each of at least some of the picture-elements for correcting thermal data associated with the respective picture-element.
  • the associated thermal data (e.g., the grey level GL) typically relates to the luminosity of the light multiplied by the thermal imaging system's response to the light and integrated over the wavelength: GL ∝ ∫ R(λ)·L(λ,T) dλ, where R(λ) is the response of the thermal imaging system to light at wavelength λ, and L(λ,T) is the luminosity of light of wavelength λ arriving from a surface being at a temperature T.
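The relation above can be evaluated numerically. The sketch below takes Planck's law for L(λ,T) and a flat response R(λ) = 1 over an 8-14 μm long-wave infrared band; both choices are illustrative assumptions, since the patent does not specify the camera's spectral response.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck constant, speed of light, Boltzmann constant

def planck_radiance(lam, T):
    """Spectral radiance L(lambda, T), Planck's law, in W * sr^-1 * m^-3."""
    return (2.0 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * T))

def grey_level(T, lam_lo=8e-6, lam_hi=14e-6, samples=2000):
    """GL proportional to the integral of R(lambda) * L(lambda, T) d(lambda),
    with a flat response R = 1 over the chosen band."""
    lam = np.linspace(lam_lo, lam_hi, samples)
    return np.trapz(planck_radiance(lam, T), lam)

# Example: radiance integral for skin at 33 degC versus 34 degC.
print(grey_level(306.15), grey_level(307.15))
```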
  • When the pixel sensor of the thermal imaging system receives light that propagates along the optical axis of the thermal imaging system, the corresponding grey level is typically as indicated above.
  • the thermal imaging system typically employs a Lambertian correction that is proportional to the fourth power of the cosine of the deviation angle. It was found by the present inventors that for some curved objects, particularly living bodies, the Lambertian correction is insufficient, since different parts of the surface have different primary light emission directions.
  • the grey levels provided by the thermal imaging system do not adequately describe the temperature map of the surface.
  • the grey level provided by the thermal imaging system when the picture-element is at a viewing angle θ1 differs from the grey level that would have been provided by the thermal imaging system had the picture-element been at a viewing angle θ2.
  • the thermal data provided by the thermal imaging system are corrected such that the corrected thermal data of all picture-elements (for which the correction is employed) are the thermal data that would have been provided by the thermal imaging system had all these picture-elements been at the same viewing angle relative to the thermal imaging system.
  • the correction can be employed such that, for all the picture-elements, the corrected thermal data are the thermal data that would have been provided by the thermal imaging system had all these picture-elements been at a zero viewing angle.
  • the predetermined correction function g(θ) can be applied either before or after the preprocessing.
  • the predetermined correction function g(θ) is preferably nonlinear with respect to the angle.
  • a representative example of such nonlinear dependence is shown in FIG. 3.
  • the correction function g(θ) can be stored in a non-transitory computer readable medium as a lookup table, or it can be provided as an analytical function. Such an analytical function can be obtained by parameterizing the correction function g(θ) and calculating the parameters based on experimentally observed angular dependence.
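Both forms of g(θ) mentioned above, a lookup table and a parameterized analytical function, can be derived from a measured angular dependence such as the one behind FIG. 3. The measurement arrays below are placeholders, and the quadratic parameterization is just one of the options contemplated in the text.

```python
import numpy as np

# Placeholder measurements: viewing angle (radians) versus observed correction.
theta_meas = np.deg2rad([0, 10, 20, 30, 40, 50, 60])
corr_meas = np.array([0.00, 0.02, 0.08, 0.18, 0.33, 0.52, 0.78])  # illustrative values

# Analytical form: fit g(theta) = a*theta**2 + b*theta + c.
a, b, c = np.polyfit(theta_meas, corr_meas, deg=2)
g_analytic = lambda t: a * t**2 + b * t + c

# Lookup-table form: store the samples and interpolate at run time.
def g_lut(t):
    return np.interp(t, theta_meas, corr_meas)
```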
  • the method proceeds to 15 at which the 3D thermospatial representation is regenerated using said corrected thermal data, and/or 16 at which a temperature map of the portion of the body is generated using the corrected thermal data.
  • the 3D thermospatial representation and/or temperature map can optionally be displayed on a display device.
  • the method can continue to 17, at which the corrected data are compared to data of a reference thermospatial representation, which can be obtained from a library or can be constructed by the method of the present embodiments.
  • the reference thermospatial representation can describe a reference portion of the body other than the portion of the body being analyzed.
  • the reference portion of the body can be a portion of the body which is similar in shape to the portion of the body being analyzed.
  • the portion of the body is a breast
  • the reference portion of the body can be the other breast of the same subject.
  • the aforementioned transformation of coordinates is preferably employed so as to facilitate conceptual overlap of one portion of the body over the other.
  • the reference thermospatial representation includes history data of the portion of the body.
  • the reference portion of the body can be the same portion of the body as captured at an earlier time.
  • the inclusion of history data in the thermospatial representation can be achieved by recording the reference thermospatial representation at a date earlier than the date at which the method is executed. This embodiment can also be useful for monitoring changes in the portion of the body over time.
  • the reference thermospatial representation can be one of the thermospatial representations of the series.
  • the ambient temperature at the surface of the portion of the body is changed between two successive captures of surface information, and the corresponding thermospatial representations are obtained.
  • the corrected thermal data of two such successive thermospatial representations are compared.
  • a change in the ambient temperature corresponds to different boundary conditions for different thermospatial representations.
  • two successive thermospatial representations describe the portion of the body while the subject is exposed to two different ambient temperatures.
  • a change in the ambient temperature can be imposed, for example, by establishing contact between a cold object and the portion of the body or directing a flow of cold gas (e.g., air) to the surface of the portion of the body between successive data acquisitions.
  • a procedure in which the portion of the body is immersed in cold liquid (e.g. , water) between successive data acquisitions.
  • a procedure in which another portion of the body is exposed to a different (e.g. , lower) temperature so as to ensure transient thermal condition.
  • the subject can immerse his or her limb in a cold liquid (e.g. , water).
  • the method can optionally and preferably continue to 18 at which the presence or absence of a thermally distinguished region in the portion of the living body is determined based on the corrected thermal data. This can be achieved using any technique known in the art, except that the uncorrected thermal data used in known techniques is replaced with data corrected according to some embodiments of the present invention.
  • the method can also determine whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • the set of criteria can include at least one of the temperatures of the region, the temperature difference between the region and its immediate surrounding, the temperature difference between the region and the average temperature of the body portion or some region-of-interest thereof, the size of the region and the like.
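Criteria of this kind lend themselves to a simple rule-based check, sketched below. The thresholds and the requirement that all criteria hold simultaneously are assumptions for illustration; the patent leaves the criterion or set of criteria open.

```python
def is_suspected_tumor(region_temp, surround_temp, body_avg_temp, region_area_cm2,
                       min_delta_surround=1.0, min_delta_avg=0.5, min_area_cm2=0.5):
    """Combine the criteria named in the text: the region temperature relative
    to its immediate surrounding and to the body-portion average, and the
    region size. Threshold values are placeholders, not taken from the patent."""
    return (region_temp - surround_temp >= min_delta_surround
            and region_temp - body_avg_temp >= min_delta_avg
            and region_area_cm2 >= min_area_cm2)
```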
  • Referring now to FIG. 4, image correction system 20 preferably comprises a digital input 22 for receiving one or more 3D thermospatial representations as further detailed hereinabove.
  • System 20 can further comprise a data processor 24 that calculates, based on the spatial data of the input 3D thermospatial representation, a viewing angle θ, and that applies a predetermined correction function g(θ) for correcting the thermal data as further detailed hereinabove.
  • System 20 typically comprises a non-transitory computer readable medium 26, which can store computer readable data types and/or computer readable instructions for carrying out the operations of data processor 24.
  • When the correction function g(θ) is in the form of a lookup table, the lookup table can be stored in medium 26, and when the correction function g(θ) is in the form of an analytical function, computer instructions for calculating g(θ) for a given angle θ can be stored in medium 26.
  • Data processor 24 can, in some embodiments of the present invention, determine presence or absence of a thermally distinguished region in the portion of the living body based on the corrected thermal data, as further detailed hereinabove. Optionally and preferably, data processor 24 determines whether or not the thermally distinguished region is a tumor based on a predetermined criterion or a predetermined set of criteria.
  • In some embodiments, system 20 comprises an image generator 28 that re-generates the 3D thermospatial representation using the corrected thermal data. Additionally or alternatively, image generator 28 can generate a temperature map of the portion of the body using the corrected thermal data. The re-generated 3D thermospatial representation and/or temperature map can be stored in memory medium 26.
  • System 20 can further comprise a digital output 30 that outputs the re-generated 3D thermospatial representation and/or temperature map, in any known data format.
  • the re-generated 3D thermospatial representation and/or temperature map can be transmitted to an external system such as a cloud storage facility or a remote computer.
  • the re-generated 3D thermospatial representation and/or temperature map can also be transmitted to a display 32 which displays the 3D thermospatial representation and/or temperature map, for example, as color or gray scale images or as contour plots.
  • The phrase "consisting essentially of" means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
  • For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
  • The description of a range in a range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the invention. Accordingly, the description of a range should be considered to have specifically disclosed all the possible subranges as well as individual numerical values within that range. For example, description of a range such as from 1 to 6 should be considered to have specifically disclosed subranges such as from 1 to 3, from 1 to 4, from 1 to 5, from 2 to 4, from 2 to 6, from 3 to 6 etc., as well as individual numbers within that range, for example, 1, 2, 3, 4, 5, and 6. This applies regardless of the breadth of the range.
  • Visible light and thermal images of one woman subject are shown in FIGs. 7A-H.
  • FIGs. 7A and 7B are visible light images from the two different viewing angles
  • FIG. 7C shows the registration of the two images
  • FIG. 7D shows picture-elements for which the registration difference is less than 2 mm
  • FIGs. 7E and 7F show grey level differences between the thermal images before (FIG. 7E) and after (FIG. 7F) the correction of thermal data
  • FIG. 7G shows the difference between the absolute values of the images in FIGs. 7E and 7F
  • FIG. 7H marks regions at which the thermal correction provides improvement.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

A method of correcting image data is disclosed. The method comprises: obtaining at least one 3D thermospatial representation comprising 3D spatial data representing a non-planar surface of a portion of a living body and thermal data associated with the 3D spatial data; calculating, based on the spatial data, a viewing angle for each of a plurality of picture-elements over the thermospatial representation; and, for each of at least some of the picture-elements, applying a predetermined correction function which describes a dependence of a correction of the thermal data on the viewing angle, so as to correct thermal data associated with the picture-element.
PCT/IL2016/051021 2015-09-14 2016-09-14 Procédé et système pour corriger des données d'image WO2017046795A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
IL258134A IL258134B (en) 2015-09-14 2018-03-14 Method and system for correcting image data

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562218020P 2015-09-14 2015-09-14
US201562218026P 2015-09-14 2015-09-14
US62/218,020 2015-09-14
US62/218,026 2015-09-14

Publications (1)

Publication Number Publication Date
WO2017046795A1 true WO2017046795A1 (fr) 2017-03-23

Family

ID=58288253

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/IL2016/051022 WO2017046796A1 (fr) 2015-09-14 2016-09-14 Correction de données d'image sur la base de différents points de vue
PCT/IL2016/051021 WO2017046795A1 (fr) 2015-09-14 2016-09-14 Procédé et système pour corriger des données d'image

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/IL2016/051022 WO2017046796A1 (fr) 2015-09-14 2016-09-14 Correction de données d'image sur la base de différents points de vue

Country Status (2)

Country Link
IL (2) IL258135B2 (fr)
WO (2) WO2017046796A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108154509B (zh) * 2018-01-12 2022-11-11 平安科技(深圳)有限公司 癌症识别方法、装置及存储介质
SE543210C2 (en) * 2019-04-05 2020-10-27 Delaval Holding Ab Method and control arrangement for detecting a health condition of an animal
CN113592798B (zh) * 2021-07-21 2023-08-15 山东理工大学 一种道路病害智能辨识方法、系统、终端及介质

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150071513A1 (en) * 2007-06-25 2015-03-12 Real Imaging Ltd. Method and apparatus for analyzing images

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6201541B1 (en) * 1997-12-11 2001-03-13 Cognitens, Ltd. System and method for “Stitching” a plurality of reconstructions of three-dimensional surface features of object(s) in a scene defined relative to respective coordinate systems to relate them to a common coordinate system
US6283917B1 (en) * 1998-10-01 2001-09-04 Atl Ultrasound Ultrasonic diagnostic imaging system with blurring corrected spatial compounding
WO2003087929A1 (fr) * 2002-04-10 2003-10-23 Pan-X Imaging, Inc. Systeme d'imagerie numerique
US7756567B2 (en) * 2003-08-29 2010-07-13 Accuray Incorporated Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data
JP4069855B2 (ja) * 2003-11-27 2008-04-02 ソニー株式会社 画像処理装置及び方法
US20060250389A1 (en) * 2005-05-09 2006-11-09 Gorelenkov Viatcheslav L Method for creating virtual reality from real three-dimensional environment
TW200740212A (en) * 2006-04-10 2007-10-16 Sony Taiwan Ltd A stitching accuracy improvement method with lens distortion correction
IL193906A (en) * 2008-09-04 2012-06-28 Pro Track Ltd Methods and systems for creating an aligned bank of images with an iterative self-correction technique for coordinate acquisition and object detection
US20120157800A1 (en) * 2010-12-17 2012-06-21 Tschen Jaime A Dermatology imaging device and method
DE102012109481A1 (de) * 2012-10-05 2014-04-10 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
GB2517720B (en) * 2013-08-29 2017-09-27 Real Imaging Ltd Surface Simulation

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150071513A1 (en) * 2007-06-25 2015-03-12 Real Imaging Ltd. Method and apparatus for analyzing images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PETROCELLI, SAMUEL; ET AL.: "3D Thermal Imaging: An approach towards true field temperature measurement", PROCEEDINGS OF THE 2014 INTERNATIONAL CONFERENCE ON QUANTITATIVE INFRARED THERMOGRAPHY, 31 May 2015 (2015-05-31), XP055370273, Retrieved from the Internet <URL:http://www.ndt.net/search/docs.php3?showForm=off&id=17712> [retrieved on 20161225] *

Also Published As

Publication number Publication date
WO2017046796A1 (fr) 2017-03-23
IL258134A (en) 2018-05-31
IL258135A (en) 2018-05-31
IL258135B2 (en) 2023-07-01
IL258134B (en) 2022-04-01
IL258135B1 (en) 2023-03-01

Similar Documents

Publication Publication Date Title
US10264980B2 (en) Method apparatus and system for determining a data signature of 3D image
US8620041B2 (en) Method apparatus and system for analyzing thermal images
US10299686B2 (en) Method apparatus and system for analyzing images
Xia et al. Automated bone segmentation from large field of view 3D MR images of the hip joint
US20160206211A1 (en) Surface simulation
US20180300890A1 (en) Ultrasonic image processing system and method and device thereof, ultrasonic diagnostic device
IL258134A (en) Method and system for correcting image data
Suzani et al. Semi-automatic segmentation of vertebral bodies in volumetric MR images using a statistical shape+ pose model
JP6262869B2 (ja) 乳房パラメータマップの生成
Lee et al. Three-dimensional analysis of acetabular orientation using a semi-automated algorithm
CN105684040B (zh) 支持肿瘤响应测量的方法
Bichinho et al. A computer tool for the fusion and visualization of thermal and magnetic resonance images
JP2015136480A (ja) 3次元医用画像表示制御装置およびその作動方法並びに3次元医用画像表示制御プログラム
Skalski et al. LEFMIS: locally-oriented evaluation framework for medical image segmentation algorithms
EP4254331A1 (fr) Traitement combiné des images des côtes et de la colonne vertébrale pour une évaluation rapide des acquisitions
You Quantitative infrared thermography for infantile hemangioma assessment
WO2023194267A1 (fr) Détection automatique de fracture de côte à partir d&#39;images de balayage non pliées
WO2023186729A1 (fr) Traitement combiné d&#39;image de côte et de colonne vertébrale pour évaluation rapide de balayages
Kaczmarek Integration of thermographic data with the 3D object model
WO2024017742A1 (fr) Visualisation efficace de plan de nervure par augmentation automatique
Chaisaowong et al. Automatic spatiotemporal matching of detected pleural thickenings
IL213326A (en) Method, device and system for finding thermospheric signature

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16845837

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 258134

Country of ref document: IL

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16845837

Country of ref document: EP

Kind code of ref document: A1