WO2021117013A1 - Determination of medical condition risk using thermographic images


Info

Publication number
WO2021117013A1
Authority
WO
WIPO (PCT)
Prior art keywords
patient
thermographic
image
rules
image data
Application number
PCT/IB2020/061860
Other languages
French (fr)
Inventor
Luis Enrique HERNÁNDEZ
Kevin Andrés HERNÁNDEZ
Jorge Antonio JUÁREZ
Pedro Abraham SÁNCHEZ
Jan Andrei MERINO
Sergio Armando SIFUENTES
José Carlos LUNA
Ricardo NIÑO DE RIVERA
Enrique MARTIN DEL CAMPO
Christian Axel VERA
Alberto Eduardo JUÁREZ
Original Assignee
Hearthcore S.A.P.I. De C.V.
Application filed by Hearthcore S.A.P.I. De C.V. filed Critical Hearthcore S.A.P.I. De C.V.
Publication of WO2021117013A1


Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • Breast cancer is one of the most common forms of cancer worldwide, particularly among females. While the chances of surviving a breast cancer diagnosis are increasing, those chances fall when the cancer is not detected at a sufficiently early stage.
  • Mammography is widely used for non-invasive breast cancer detection.
  • x-rays of the breast are analyzed by a trained mammographer or radiologist.
  • the level of skill required for accurate interpretation of an x-ray image renders access to providers difficult in many parts of the world.
  • women often feel discomfort or even pain when having a mammogram.
  • many patients end up forgoing recommended periodic mammograms.
  • Embodiments described herein overcome the aforementioned disadvantages in multiple ways.
  • different AI rules, or different combinations thereof, from a plurality of AI rules may be applied to images of different parts of a patient’s body. Images may be taken from different angles, such as frontal, left oblique, and right oblique. Also, images may be taken quiescently as basal images, or with the subject body part stimulated or conditioned in various ways, such as by heating, cooling, or moistening. Images may be processed to create statistical representations of both full images and regions of interest, rather than attempting to classify raw images.
  • results obtained therefrom may be used as common inputs to a subsequent set of AI rules, in order to minimize effects of any errors from potentially unexpected results from any one of the previously applied sets of AI rules. It may even be ensured that only high-quality images are presented to the AI classification methods by using information from previous images to determine if a subject of a new image is, for example, correctly positioned or aligned.
  • a processor-implemented method for determining a risk of a medical condition for a patient includes receiving, at the processor, one or more thermographic images. The method also includes applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body. The method further includes determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
  • the method may further include receiving, at the processor, clinical data for the patient.
  • the AI rules may define a statistical assessment of temperature values and the clinical data.
  • the temperature values may be extracted from the respective sets of thermographic image data.
  • the statistical assessment may include first-order statistical descriptors of histograms of the respective sets of thermographic image data.
  • the first-order statistical descriptors may include at least one of average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and statistical range.
  • the respective sets of thermographic image data may comprise pixel values.
  • the statistical assessment may include second-order statistical descriptors derived from co-occurrence matrices of the thermographic image data.
  • the co-occurrence matrices may be derived from matrices of the pixel values of the respective sets of thermographic image data, or directly from matrices of the temperature values.
  • the second-order statistical descriptors may include at least one of second angular moment, contrast, correlation, variance, entropy, variances’ difference, and homogeneity. It should be noted that, as a first-order statistical descriptor, entropy may be calculated directly from the temperature values, or from the pixel values. It should be further noted that, as a second-order statistical descriptor, entropy may be calculated from the co-occurrence matrix values. Variances’ difference may be defined as a difference between variance values from a pair of distinct image samples.
  • the method may include filtering a preliminary sample of thermographic images for a measure of image quality, and subsequently capturing the one or more thermographic images with an imaging device.
  • the filtering may include comparing the preliminary sample with a reference image.
  • the reference image may include at least one of a positional marking or an alignment mask.
  • the alignment mask may include a depiction of space between a head, a shoulder, and a raised arm of a patient on respective sides of the patient’s body.
  • the preliminary sample may include a live feed from the imaging device.
  • the measure of image quality may include at least one of alignment, angle of rotation, height, or inclination of the patient’s body with respect to an orientation of an imaging device, and interference within an area of the image.
  • the measure of image quality may further include at least one of distance between the patient’s body and the imaging device, image contrast, and a background or core temperature identified within the preliminary sample.
  • the measure of image quality may include a measure of symmetry between parts of left and right sides of the patient’s body.
  • the method may include prompting a user to adjust a parameter of the patient or of the imaging device to improve the measure of image quality prior to capturing the one or more thermographic images.
  • the one or more thermographic images may be obtained from a machine vision module.
  • the plurality of anatomical parts of the patient’s body may be defined by medical knowledge including regions commonly affected either directly or indirectly by the medical condition.
  • the plurality of anatomical parts of the patient’s body may include at least a portion of a breast.
  • the AI rules may include detecting a bottom contour of the breast.
  • Image processing may include detecting a bottom contour of the breast.
  • the plurality of anatomical parts of the patient’s body may be further defined by the processor automatically generating image masks including a combination of points, lines, and curves.
  • the method may further include adjusting points, lines, or curves on, adding points, lines, or curves to, or removing points, lines, or curves from the image masks to match a part of the patient’s body having a unique anatomical structure or manifestation.
  • the method may include outputting the level of risk of the medical condition for the patient to a user. Outputting the level of risk may be performed by outputting a report including the level of risk, a plurality of subjective analysis selections made by the user, the thermographic images, and the respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body.
  • the medical condition may include at least one of breast cancer, fibrosis, mastitis, duct ectasia, and adenosis.
  • Applying a plurality of respective AI rules to the respective sets of thermographic image data may include applying multiple sets of AI rules in parallel to produce a plurality of mutually independent results, and subsequently applying final inference rules to the plurality of mutually independent results, as sketched below.
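  • For illustration only, the following sketch shows one way such a parallel-then-final-inference arrangement could be organized, assuming scikit-learn-style classifiers and simple probability averaging as the final inference rule; the classifier types and the combination method are assumptions, not choices fixed by this disclosure.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def train_parallel_rules(X, y):
    """Train several independent classifiers ("AI rule" sets) in parallel."""
    rules = [
        LogisticRegression(max_iter=1000),
        RandomForestClassifier(n_estimators=100, random_state=0),
        SVC(probability=True, random_state=0),
    ]
    return [rule.fit(X, y) for rule in rules]

def final_inference(rules, x):
    """Final inference rule: combine the mutually independent results,
    here by averaging each rule's probability for the at-risk class."""
    probs = [rule.predict_proba(x.reshape(1, -1))[0, 1] for rule in rules]
    return float(np.mean(probs))

# Synthetic per-image feature vectors (e.g., 38 statistical descriptors):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 38))
y = rng.integers(0, 2, size=200)        # 0 = healthy, 1 = at risk (synthetic)
rules = train_parallel_rules(X, y)
print(f"level of risk: {100 * final_inference(rules, X[0]):.1f}%")
```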
  • Applying image processing may include applying AI machine vision rules to the one or more thermographic images.
  • the plurality of respective AI rules may be a plurality of respective AI classification rules.
  • a system for determining a risk of a medical condition for a patient includes a processor configured to receive one or more thermographic images.
  • the processor is also configured to apply image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body.
  • the processor is further configured to determine a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
  • the processor may be further configured to receive clinical data of the patient. This embodiment may further optionally include any features described herein in relation to the method described above.
  • a system for determining a risk of a medical condition for a patient includes means for receiving, at a processor, one or more thermographic images.
  • the system also includes means for applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body.
  • the system also includes means for determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
  • the system may also include means for receiving, at the processor, clinical data of the patient. This embodiment may further optionally include any features described herein in relation to the method described above.
  • a processor-implemented method for training an AI- enabled system to determine a risk of a medical condition for a patient may include receiving, at a processor, one or more thermographic images. The method also includes applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a training subject’s body. The method also includes establishing, based on the respective sets of thermographic image data for a plurality of anatomical parts of the training subject’s body, a plurality of respective artificial intelligence (AI) rules to be used in determining a level of risk of a medical condition for the patient. The method may also include receiving, at the processor, clinical data of the training subject. This embodiment may further optionally include any features described herein in connection with any of the other embodiments described herein.
  • FIG. 1A illustrates elements of an example system for determining a risk of a medical condition for a patient.
  • FIG. 1B illustrates elements of another example system for determining a risk of a medical condition for a patient.
  • FIG. 2A is a schematic block diagram illustrating example methods and systems for performing a statistical assessment of image data and clinical data according to AI rules.
  • FIG. 2B shows a construction of an example co-occurrence matrix used in performing a statistical assessment of image data and clinical data according to AI rules.
  • FIG. 3 is a schematic block diagram illustrating example methods and systems for filtering a preliminary sample of thermographic image data for image quality.
  • FIG. 4A is a depiction of an example thermographic medical imaging system.
  • FIG. 4B is a depiction of an example thermographic medical imaging system, shown as viewed by a patient.
  • FIG. 4C is a depiction of an example thermographic medical imaging system, shown as viewed by an operator.
  • FIG. 5 illustrates elements of an example system for determining a risk of a medical condition for a patient.
  • FIG. 6 is a schematic block diagram illustrating example methods and systems for capturing a segmented image of an anatomical part of a patient’s body.
  • FIG. 7 is an illustration of example imaging angles.
  • FIG. 8 is a depiction of an example binarized image for use in alignment of an imaging system.
  • FIG. 9A is a depiction of a user interface for automatic segmentation of thermographic images of an anatomical part of a patient’s body according to AI rules.
  • FIG. 9B is a color depiction of a user interface for manual adjustment of segmented areas of thermographic images.
  • FIG. 10A is a depiction of a user interface displaying a segmented image in color.
  • FIG. 10B is a depiction of a user interface displaying a segmented image in grayscale.
  • FIG. 11 is a depiction of a user interface displaying results of a segmented image.
  • FIG. 12 is a depiction of an example output of a thermographic medical imaging system.
  • FIG. 13 is a depiction of an example report provided by a thermographic imaging system.
  • FIG. 14A is a depiction of an example subjective analysis section that may be included in an output of a thermographic medical imaging system.
  • FIGS. 14B and 14C are depictions of example temperature comparisons that may be included in an output of a thermographic medical imaging system.
  • FIG. 15 is a schematic block diagram showing example medical conditions that may be detected through use of a method or system for determining risk of a medical condition.
  • FIG. 16 is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
  • FIG. 17 is a schematic block diagram showing an example method for evaluating alignment between a patient and a thermographic medical imaging system.
  • FIGS. 18A and 18B are schematic block diagrams showing an example method for training an AI-enabled thermographic medical imaging system.
  • FIG. 19A is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
  • FIG. 19B is a schematic block diagram showing an example method for classifying thermographic medical image data by application of AI rules.
  • FIG. 20 illustrates an example computer network over which embodiments of the claimed systems and methods may operate.
  • FIG. 21 is a system block diagram illustrating an example computer network over which embodiments of the claimed systems and methods may operate.
  • FIG. 22 is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
  • Infrared thermography is typically used to noninvasively measure the temperature of an object. The technique is based on a proportional relationship between the infrared light emitted and the surface temperature of the object. Infrared radiation falls within the non-visible wavelength range of approximately 750 nm to 1 mm. All bodies emit infrared radiation, making thermography useful in different areas such as medicine, heavy industry, sports, agriculture, military applications, astronomy, and electronics, among others. Infrared radiation is measured using infrared light sensors, which operate similarly to cameras used to capture light from the visible spectrum to form images such as photographs. The cameras include sensors comprising a matrix arrangement of different transducers capable of sensing point temperature measurements. These measurements are interpreted using pseudo-color, which comprises a color palette directly related to changes in surface temperature. A grayscale may also be used to distinguish the temperatures, where dark tones reveal lower temperatures relative to the higher temperatures represented by light tones.
  • Example implementations of systems for detecting thermal abnormalities utilize infrared sensors to capture infrared images similarly to the manner in which a conventional camera captures images from detection of visible light.
  • the images are processed digitally to manipulate digital arrays linked to an image that can be projected onto the screen of a computer system.
  • the following concepts are described below to enhance understanding of the application of digital image processing in example implementations:
  • Image: a two-dimensional arrangement of pixels with different light intensity (grayscale). If the light intensity of a pixel is represented by n bits, then there will be 2^n different grayscales.
  • Color: a combination of red, green, and blue.
  • a color can be expressed by a triplet of component variables R, G, and B, respectively having intensity values for red, green, and blue, with the intensity values defined on a scale ranging from 0 to 1.
  • Brightness: the extent to which an area is illuminated.
  • Tone: the relative intensity of color.
  • Luminosity: the brightness of one area relative to another.
  • Chroma: the coloration of an area relative to the brightness of a reference white.
  • the combination R + G + B may be graphically represented in three-dimensional space.
  • the maximum gain for each component corresponds to the wavelength of the respective component color.
  • a component color may be herein interchangeably referred to as a basic color.
  • Histogram of an image: a representation of the distribution of pixels over the different tones or fixed ranges in the given scale (gray, color, temperature, etc.).
  • FIG. 1A depicts an example system 100a for determining a level of risk 110 of a medical condition for a patient 102.
  • the level of risk 110 may be expressed as a percent risk of a medical condition, a percent chance of a medical condition, or by any other means or scale by which a probability can be quantified.
  • One or more images 104 of the patient’s 102 body or parts thereof are received at a processor 106.
  • the processor 106 can be implemented as part of a computer, tablet, smartphone, or similar electronic device, or as an embedded module within an imaging system or other piece of dedicated equipment.
  • the processor 106 is shown in FIG. 1A as a single unit, but it should be understood that the processor functions of FIG. 1A may be performed by a network of processors in communication with each other.
  • the processor 106 applies image processing to the images 104 to define respective sets of thermographic image data 108 for a plurality of anatomical parts 109 of the patient’s 102 body, a process also referred to herein as segmentation.
  • a plurality of respective artificial intelligence (AI) rules 116 is applied to the respective sets of thermographic image data 108 to determine a level of risk 110 of a medical condition in the patient 102.
  • a plurality can be taken to mean two or more, and as such, two AI rules 116a, 116b are shown in FIG. 1A, but it should be understood that other embodiments may employ larger numbers of AI rules, such as three or more, four or more, or five or more AI rules, and so on.
  • FIG. 1B depicts another example system 100b for determining a level of risk 110 of a medical condition for a patient 102.
  • the level of risk 110 is depicted in FIG. 1B as a percentage, but, as in FIG. 1A, the level of risk 110 may be embodied by any means or scale by which a probability can be quantified.
  • One or more images 104 of the patient’s 102 body or parts thereof are received at a processor 106.
  • Clinical data 112 of a patient is also received by the processor 106.
  • the clinical data 112 may be provided directly by a patient to the system, or indirectly through an imaging technician or other medical professional. Alternatively, the clinical data 112 may be read from an electronic or paper record by a medical professional or by a computer, and thus duly provided to the processor 106.
  • the processor 106 is shown in FIG. 1B as being part of a computer.
  • the processor 106 may be implemented as part of a tablet, smartphone, or similar electronic device, or as an embedded module within an imaging system or other piece of dedicated equipment.
  • the processor 106 is shown in FIG. 1B as a single unit, but it should be understood that the processor functions of FIG. 1B may be performed by a network of processors in communication with each other.
  • the processor 106 applies image processing to the images 104 to define respective sets of thermographic image data 108 for a plurality of anatomical parts 109 of the patient’s 102 body.
  • a plurality of respective AI rules 116 is applied to the respective sets of thermographic image data 108 to determine a level of risk 110 of a medical condition in the patient 102.
  • a plurality can be taken to mean two or more, and as such, two AI rules 116a, 116b are shown in FIG. 1B, but it should be understood that other embodiments may employ larger numbers of AI rules, such as three or more, four or more, or five or more AI rules, and so on.
  • Functions of the processor 106 are shown in FIG. 1B to occur within a looping methodology wherein the processor outputs a result of a first function, which result in turn becomes an input for a second function.
  • intermediate data generated by performance of processor functions such as application of image processing or application of AI rules 116 may remain internal to the processor 106 until a level of risk 110 is determined, or such intermediate data may exit the processor 106, or even the host computer, and subsequently re-enter the processor as an input to a following process.
  • Such intermediate data may also traverse a network between multiple processors 106 to be processed in a distributed manner or in another manner to make efficient use of network resources.
  • Digital image processing provides information of interest from captured images, or from any sections of an image.
  • the information may be processed, for example, by determining first-order statistical descriptors, which may be extracted from the image data.
  • the first-order statistical descriptors may be obtained using a histogram of the image data.
  • Second-order statistical descriptors may also be extracted from the image, which may be obtained using co-occurrence matrices.
  • An H(i) histogram represents the frequency of occurrence of a specific value in a given set of events.
  • i represents the value of each pixel based on the depth of color of the image.
  • the value i may be expressed in, for example, 8, 12, 16, or another number of bits.
  • the first-order statistical descriptors shown in Table 1 may therefore be determined based on the normalized histogram p(i) = H(i)/N, where N is the total number of pixels in the image.
  • the number of values a pixel may have is denoted by L.
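  • As a concrete illustration, the Table 1 descriptors can be computed from the normalized histogram p(i) = H(i)/N. The following sketch assumes NumPy and an 8-bit image (L = 256), with population skewness and kurtosis conventions standing in for the asymmetry and kurtosis entries.

```python
import numpy as np

def first_order_descriptors(img, bits=8):
    """Table 1-style descriptors from the normalized image histogram
    p(i) = H(i)/N, for pixel values on 2**bits gray levels."""
    L = 2 ** bits
    H, _ = np.histogram(img, bins=L, range=(0, L))
    p = H / H.sum()
    i = np.arange(L)
    mean = np.sum(i * p)
    std = np.sqrt(np.sum((i - mean) ** 2 * p))
    nz = p > 0                            # avoid log(0) in the entropy term
    return {
        "average": mean,
        "standard_deviation": std,
        "asymmetry": np.sum((i - mean) ** 3 * p) / std ** 3,   # skewness
        "kurtosis": np.sum((i - mean) ** 4 * p) / std ** 4,
        "energy": np.sum(p ** 2),
        "entropy": -np.sum(p[nz] * np.log2(p[nz])),
        "mode": int(np.argmax(H)),
        "median": float(np.median(img)),
        "maximum": int(img.max()),
        "statistical_range": int(img.max()) - int(img.min()),
    }

# Example on a synthetic 8-bit thermogram-sized array:
img = np.random.default_rng(0).integers(0, 256, size=(240, 320))
print(first_order_descriptors(img))
```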
  • a co-occurrence matrix is a matrix array that represents the joint probability that two pixels have intensity values of i and j, respectively, at a distance d, in a given direction.
  • Second-order statistical descriptors refer to a set of values obtained from the statistical processing of co-occurrence matrices in digital images. These characteristics are referred to as texture characteristics and are shown in Table 2 (second-order statistical descriptors). Co-occurrence matrices are described hereinbelow in further detail with reference to FIG. 2B.
  • Statistical descriptors may be used in a processor-implemented method capable of discriminating abnormal thermograms from the analysis of thermal asymmetry.
  • the analysis of asymmetry may be performed by comparing the means of the pixel values. Additional characteristics such as mean, variance, statistical asymmetry, and kurtosis may further enhance such methods. Further enhancements to the analysis of abnormal thermograms may be achieved using AI and, in particular, neural networks in a predictive model that is trained to process the image data.
  • One aspect of using a neural network involves validating and monitoring performance of the neural network.
  • Different techniques can be used to monitor the training process of an AI system.
  • cross-validation or, more specifically, cross fold validation may be used.
  • Cross fold validation determines the responsiveness of a machine learning system to training using two sets of data that are mutually independent. The first set is called a training set, and the second is referred to as a validation set.
  • Cross fold validation is a known machine learning monitoring method and need not be described in detail.
  • the system will continue the training process, as new data is received, until a threshold training error is reached.
  • the threshold may be set to less than 10%.
  • this error should not be interpreted as being a conventional error rate. That is, defining the threshold error simply as the rate of mismatches could lead to erroneous conclusions due to the high probability of obtaining a relatively small proportion of positive cases in the data pool.
  • the error for purposes of embodiments described herein is defined to be an F1 score indicator.
  • the F1 score is based on a goal of increasing sensitivity and positive predictive value. These parameters may be defined in the standard manner: sensitivity = TP/(TP + FN), positive predictive value (PPV) = TP/(TP + FP), and F1 = 2 × (PPV × sensitivity)/(PPV + sensitivity), where TP, FP, and FN are the counts of true positives, false positives, and false negatives, respectively.
  • the parameters will be extracted from the average of the values obtained in each of the folds.
  • An F1 score greater than 90% indicates correct training of the intelligent system.
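  • A minimal sketch of this training-monitoring criterion, assuming k-fold cross-validation with scikit-learn and a logistic-regression classifier as an illustrative stand-in; the F1 score is averaged over the folds and compared against the 90% target.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import StratifiedKFold

def cross_fold_f1(X, y, n_folds=5):
    """Average F1 (harmonic mean of sensitivity and positive predictive
    value) over the folds; the classifier choice is illustrative."""
    folds = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=0)
    scores = []
    for train_idx, val_idx in folds.split(X, y):
        clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
        scores.append(f1_score(y[val_idx], clf.predict(X[val_idx])))
    return float(np.mean(scores))

# Training is deemed correct when the averaged F1 exceeds 0.90:
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 38))
y = rng.integers(0, 2, size=300)          # synthetic labels for illustration
print(f"mean F1 over folds: {cross_fold_f1(X, y):.2f}")
```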
  • FIG. 2A depicts an example method 200a of performing image processing for segmentation based on a statistical assessment 218 of temperature values 214 and clinical data 212.
  • the statistical assessment 218 may be defined by AI rules 216.
  • AI rules 216 may also be referred to as statistical AI rules, inference rules, or AI classification rules.
  • AI rules 216 that define the statistical assessment 218 may be based on various combinations of RGB representations of temperature data for regions of interest and of full images, and patient clinical data and identification data.
  • the AI rules may be established through analysis of training datasets according to the aforementioned various combinations of bases, and may be applied to classify new datasets according to the aforementioned various combinations of bases.
  • the bases of AI rules are described hereinbelow in further detail with respect to FIGs. 19A and 19B.
  • the temperature values 214 of FIG. 2A may be derived from the respective sets of thermographic image data 208 for a plurality of anatomical parts 109 of a patient’s 102 body.
  • Histograms 220 may be created from the sets of thermographic image data 208.
  • the histograms may comprise pixel values 234 of the sets of thermographic image data 208, or other parameters by which the sets of thermographic image data 208 can be quantified.
  • the statistical assessment 218 may include first-order statistical descriptors 222, which in turn may include average 224, standard deviation 225, asymmetry 226, kurtosis 227, energy 228, entropy 229, mode 230, median 231, maximum 232, and statistical range 233.
  • the aforementioned first-order statistical descriptors 222 are described hereinabove in further detail with respect to Table 1.
  • the statistical assessment 218 may also include second-order statistical descriptors 238.
  • the second-order statistical descriptors may be derived from co-occurrence matrices 236 of the pixel values 234.
  • the second-order statistical descriptors may include second angular moment 240, contrast 241, correlation 242, variance 243, entropy 244, variances’ difference 245, and homogeneity 246.
  • the aforementioned second-order statistical descriptors 238 are described hereinabove in further detail with respect to Table 2.
  • entropy 229 may be calculated directly from the temperature values 214, or from the pixel values 234, which may include RGB representations of the temperature values 214. It should be further noted that as a second-order statistical descriptor 238, entropy 244 may be calculated from values of the co-occurrence matrices 236. Variances’ difference 245 may be defined as a difference between variance values from a pair of distinct image samples.
  • a co-occurrence matrix represents the joint probability that two pixels have intensity values of i and j, respectively, at a distance d, in a given direction. This matrix not only considers information about intensity levels, but also positions of pixels with similar intensity values.
  • FIG. 2B shows a 4x4 matrix 200b with three intensity levels.
  • a co-occurrence matrix 200c is displayed, with distance d = 1, a diagonal direction 200d of 135°, a dimension of 3x3, and intensity levels of 0, 1, and 2.
  • the element with a value of 2 at the position (1,1) of the co-occurrence matrix indicates that level 0 is next to level 0 in the intensity matrix 200b two times, in the direction 200d mentioned above.
  • the element at position (2,3) relates level 1 next to level 2 in the same direction, presenting itself once.
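  • The following sketch builds such a co-occurrence matrix for an arbitrary offset. The 4x4 intensity matrix below is illustrative rather than the exact matrix of FIG. 2B, and the sign convention chosen for the 135° offset is an assumption, since conventions differ between references.

```python
import numpy as np

def cooccurrence_matrix(img, levels, dr, dc):
    """Count, for every ordered pair of intensity levels (i, j), how often
    a pixel with value i has a neighbor with value j at offset (dr, dc).
    The offset encodes distance d and direction; (1, 1) is used below for
    the 135-degree diagonal with d = 1 under one common convention."""
    glcm = np.zeros((levels, levels), dtype=int)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            rn, cn = r + dr, c + dc
            if 0 <= rn < rows and 0 <= cn < cols:
                glcm[img[r, c], img[rn, cn]] += 1
    return glcm

# An illustrative 4x4 intensity matrix with three levels (0, 1, 2):
img = np.array([[0, 0, 1, 2],
                [0, 0, 1, 2],
                [1, 2, 2, 0],
                [2, 1, 0, 1]])
print(cooccurrence_matrix(img, levels=3, dr=1, dc=1))
```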
  • FIG. 3 shows an example embodiment of a method 300 for filtering 352 a preliminary sample 350 of an image 374 for image quality.
  • the preliminary sample 350 may include a live feed 348.
  • the live feed 348 may be provided by a thermographic imaging device or another type of imaging device.
  • An image may be binarized by a processor to simplify analysis.
  • a binarization threshold, for example, may be set to 20.
  • a measure of image quality 354 may include alignment 354a, angle of rotation 354b, height 354c, and/or inclination 354d of a patient 102 with respect to an orientation of the imaging device.
  • a measure of image quality 354 may include a detection of interference 354e, physical or otherwise, within an image area.
  • a measure of image quality 354 may include a distance 354f from a patient 102 to an imaging device.
  • a measure of image quality 354 may include image contrast 354g, background or core temperature 354h, and/or a measure of left-right symmetry 354i.
  • the filtering 352 for image quality may include comparing 362 the preliminary sample 350 with a reference image 364.
  • the reference image 364 may include a positional marking 366, and/or an alignment mask 367.
  • the alignment mask 367 may include a space 368 defined between a head, shoulder, and raised arm of a patient 102.
  • the positional marking 366 may include a rectangle displayed and centered over the preliminary sample 350. The top edge of the rectangle may be placed, for example, at a distance from the top of the preliminary sample 350 equal to, for example, 6% of the height of the preliminary sample 350. The height of the rectangle may be equal to, for example, 17% of the height of the preliminary sample 350.
  • the width of the rectangle may be equal to, for example, 40% of the width of the preliminary sample 350.
  • the filtering process stage 352 may deem the image quality acceptable if, by a shoulder point detection method, the processor 106 determines that both shoulder points are within the rectangle.
  • the shoulder point detection method is described in more detail below in view of image segmentation.
  • the method 300 may include displaying a message to alert a user that the patient 102 is in an incorrect position, prompting 370 the user to adjust a parameter of the patient 102 or of the imaging device. Another preliminary sample 350 can subsequently be filtered 352 for image quality. If the filtering 352 determines that image quality is acceptable, the method 300 may include displaying a message to alert the user that the patient 102 is in a correct position. An image 374 can subsequently be captured.
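  • A sketch of the positional-marking check described above, assuming the example proportions (top offset 6%, rectangle height 17%, width 40%, horizontally centered) and shoulder points supplied by the detection method described later; the function names are illustrative.

```python
def positional_rectangle(img_h, img_w, top_pct=0.06, h_pct=0.17, w_pct=0.40):
    """Rectangle centered horizontally over the preliminary sample,
    using the example proportions given in the text."""
    top = int(top_pct * img_h)
    width = int(w_pct * img_w)
    left = (img_w - width) // 2
    return left, top, left + width, top + int(h_pct * img_h)

def patient_correctly_positioned(shoulder_points, img_h, img_w):
    """Image quality is acceptable when both detected shoulder points
    fall inside the positional rectangle."""
    x0, y0, x1, y1 = positional_rectangle(img_h, img_w)
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in shoulder_points)

# Example: a 480x640 live-feed frame and two candidate shoulder points
# given as (x, y) pixel coordinates:
print(patient_correctly_positioned([(250, 60), (390, 65)], 480, 640))
```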
  • an example system 400 includes thermal imaging equipment 474, a processor 406, an operator console 478 and a patient monitor 476 as shown in FIG. 4A.
  • the operator console employs a touchscreen for user input, but a keyboard and/or a mouse may be used in addition to or instead of a touchscreen.
  • FIG. 5 is a schematic diagram of an example system 500 for determining a level of risk of a medical condition for a patient 502 that includes an infrared sensor 574, a patient positioning monitor 576, a data acquisition system and terminal 578, a report printer 592, a network interface 584 such as a wireless access point, a network server 586, a data analysis system 588 using, for example, a computer configured to perform data analysis using AI, and an analysis terminal 590 for operating the data analysis system 588 and viewing results. It should be understood that the data analysis system 588 and the analysis terminal 590 may be implemented separately or in a single unit.
  • the processor 506 may be a stand-alone or embedded unit, or implemented in any suitable computing device that uses a processor 506 to execute computer program instructions and includes an interface to the infrared sensor 574 to collect image data.
  • the computing device may operate in its own console or may be integrated with other components of the system.
  • the system 400 is mounted in a unit having wheels 496 for transport and a wheel lock to secure its position (not pictured).
  • the system 400 is capable of adjustable vertical positioning of the thermal imaging equipment 474, as well as rotational and lateral positioning 480 at a selected angle to facilitate imaging of a patient 402 as shown in FIG. 4B.
  • FIG. 4B illustrates operation of an example system from a patient’s 402 perspective
  • FIG. 4C illustrates operation from the operator’s 403 position.
  • the system 400 images patients 402 positioned in front of the system 400 under control of the operator 403.
  • the operator console 478 includes at least one monitor to allow the operator 403 to operate the system 400.
  • the patient monitor 476 displays information relevant to the patient 402 and the capture and processing of the patient images.
  • the system 400 may include an ambient temperature control system to facilitate image capture (not pictured).
  • the ambient temperature control system may be implemented in software executed by the processor 406, within or separately from the operator console 478.
  • the patient 402 sits within a field of view of the IR camera 474.
  • “thermal imaging equipment” and “IR camera” are herein used interchangeably, and may include components directed to the physical adjustment of IR camera orientation.
  • the operator 403 adjusts the IR camera 474 while monitoring the image detected by the IR camera 474 using the patient positioning monitor 476.
  • the operator 403 captures a set of images sufficient to analyze relevant areas of the patient’s 402 body.
  • the operator 403 monitors each view and performs adjustments to the IR camera 474 position to obtain suitable images.
  • the imaging views or conditions may include angles of incidence such as frontal, left oblique, and right oblique.
  • the imaging views or conditions may also include images wherein the subject part of a patient’s 402 body is stimulated, for example by application of or removal of heat. Such intentional temperature changes may be effected through example means such as application of moisture to the subject part of a patient’s 402 body, or by actuation of the ambient temperature control system.
  • Images taken while the subject part of a patient’s body is stimulated may be referred to as functional, while corresponding images taken without stimulation may be referred to as basal.
  • the operator 403 may then be provided with imaging functions to select sections of the image of the subject part of patient’s 402 body for detailed analysis.
  • the process of selecting sections of the image may be referred to as segmentation.
  • the imaging functions may be accessible via a graphical user interface (GUI) with GUI objects such as buttons, menus, etc.
  • GUI graphical user interface
  • the subject parts of a patient’s 402 body may be the patient’s breasts, especially if the system 400 is employed in the detection of breast cancer.
  • the system 400 analyzes the image data to obtain information regarding any thermal abnormalities that may correlate with a manifestation of a medical condition such as cancerous tissue.
  • the system software pre-processes the image data to obtain statistical parameters described above as first-order and second-order statistical descriptors.
  • the data is then processed using a predictive model trained using historical data collected from patients having varying degrees of abnormal tissue. Further training may be performed using any new data. Results may then be provided to show the condition of the subject tissue corresponding to the image data collected by the system.
  • the first-order statistical descriptors are characteristics derived from a histogram of the thermographic image.
  • a histogram H(i) may be defined to represent the frequency at which a thermographic image includes a specific pixel value.
  • the variable i represents the value of the pixel, which further represents the color intensity of the pixel location in the image, having a resolution based on the data width of the system (8 bits, 12 bits, 16 bits, etc.).
  • a characterization function may receive the thermographic image data from the IR camera 474 to obtain a desired set of statistical, morphological, and mathematical characteristics. These characteristics are ultimately compared with corresponding characteristics learned by an AI engine from a set of historical data.
  • the historical data may include a set of statistical parameters such as mean, standard deviation, variance, or other similar parameters.
  • the historical data may even include the whole matrix of temperatures as a set of characteristics.
  • This historical data may be obtained, for example, from six different images of a sample group or sample population of patients.
  • the images may include, for example, a basal frontal section, a functional frontal section, two oblique basal (right and left) sections, and two oblique functional (right and left) sections. In an example implementation, these six image sections correspond to a set of images captured in a patient scan.
  • Characterization of temperature data from the obtained thermographic images may begin by performing a linear transformation on the temperature data to enable the data to be displayed by pixels functioning on eight or more bits.
  • Several example implementations of the transformation may be used.
  • One example solution is to identify the maximum and minimum temperatures of the basal and functional images of each patient for all patients in the sample population. The values are rounded to their nearest integer, after which, 0.5°C is added to values representing a maximum temperature within each image, and 0.5°C is subtracted from values representing a minimum temperature within each image.
  • An alternative example solution is to determine maximum and minimum average temperatures of the basal and functional images of each patient for all patients in the sample population. As with the first method, the values are rounded to their nearest integer, 0.5°C is added to values representing a maximum temperature within each image, and 0.5°C is subtracted from values representing a minimum temperature within each image. Two more example solutions would follow similar respective procedures to the two solutions described above, except that instead of respectively adding 0.5°C to or subtracting 0.5°C from the maximum or minimum temperature values, the standard deviation would be calculated for the maximums and minimums for the basal and functional images of all patients in the sample population. It should be noted that basal images may also be referred to as baseline images.
  • The solutions described above are used to calculate the terms T1, T2, L1, and L2, which are then used in the following relationship to perform the desired linear transformation: L = L1 + (T − T1) × (L2 − L1) / (T2 − T1), where:
  • T: Temperature value of interest from the temperature matrix; T1: Minimum temperature value minus a deviation value of 0.5°C; T2: Maximum temperature value plus a deviation value of 0.5°C; L1: Low end of pixel range (e.g. 0)
  • L2: High end of pixel range (e.g. 255 for 8-bit pixels)
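  • A sketch of this transformation, assuming the first example solution above (per-image temperature extremes rounded to the nearest integer and padded by the 0.5°C deviation value) and an 8-bit pixel range.

```python
import numpy as np

def temperature_to_pixels(T, bits=8):
    """Map a temperature matrix T (deg C) linearly onto the pixel range
    [L1, L2] using L = L1 + (T - T1)(L2 - L1)/(T2 - T1). T1 and T2 follow
    the first example solution: per-image extremes rounded to the nearest
    integer, minus/plus the 0.5 deg C deviation value."""
    T1 = np.rint(T.min()) - 0.5
    T2 = np.rint(T.max()) + 0.5
    L1, L2 = 0, 2 ** bits - 1
    L = L1 + (T - T1) * (L2 - L1) / (T2 - T1)
    return np.clip(np.rint(L), L1, L2).astype(np.uint16)

# Example: a synthetic 28-36 deg C thermogram displayed on 8 bits.
T = 28.0 + 8.0 * np.random.default_rng(0).random((240, 320))
P = temperature_to_pixels(T)
print(P.min(), P.max())
```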
  • histograms may be constructed from thermographic image data for each breast of a patient.
  • Example characteristic parameters that may be determined from a histogram of each breast include the first-order statistical descriptors previously listed in Table 1.
  • Co-occurrence matrices may be defined to indicate a number of instances in which pixels having the same value are adjacent to one another in 0°, 45°, 90°, and 135° directions.
  • the second-order descriptors may be obtained from such a set of four matrices as shown above in Table 2.
  • Particularly useful second-order descriptors may include second angular moment, contrast, correlation, variance, entropy, variances’ difference, and homogeneity.
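  • A sketch computing several of these descriptors from a normalized co-occurrence matrix. The formulas follow common Haralick-style definitions, which may differ in detail from those of Table 2; variances’ difference is omitted because it is defined across a pair of image samples.

```python
import numpy as np

def second_order_descriptors(glcm):
    """Texture descriptors from a co-occurrence matrix normalized so its
    entries form a joint probability p(i, j)."""
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt((((i - mu_i) ** 2) * p).sum())
    sd_j = np.sqrt((((j - mu_j) ** 2) * p).sum())
    nz = p > 0
    return {
        "second_angular_moment": (p ** 2).sum(),
        "contrast": (((i - j) ** 2) * p).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j),
        "variance": (((i - mu_i) ** 2) * p).sum(),
        "entropy": -(p[nz] * np.log2(p[nz])).sum(),
        "homogeneity": (p / (1.0 + np.abs(i - j))).sum(),
    }

glcm = np.array([[2.0, 1, 0], [0, 3, 1], [1, 0, 2]])
print(second_order_descriptors(glcm))
```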
  • First- and second-order descriptors of the thermographic data may be normalized, to enable future comparison with historical data, using the relationship x' = (x − x̄)/s, where x' is the new normalized value of a given descriptor, x is the original value of the descriptor, x̄ is the mean of the descriptor, and s is its characteristic standard deviation.
  • the normalized historical data set may be introduced to a processor-implemented method that performs a principal components analysis (PCA) to reduce the data from n dimensions to k dimensions.
  • a covariance matrix may be calculated from the normalized descriptor matrix, for example as Σ = (1/m) XᵀX for m samples, and decomposed as [U, S, V] = svd(Σ), where svd is a singular value decomposition function.
  • the number of retained dimensions k may then be chosen as the smallest value for which the cumulative sum of the singular values reaches var, where var represents a desired percentage value for the variance, which in an example implementation may be from 90% to 99%.
  • a variance analysis is then performed to obtain a new projection onto fewer descriptors, with the restriction of obtaining the minimum number of characteristics that allows retention of 99% of the variance of the sample data of a training sample comprising approximately 80% of the sample size.
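  • A sketch of the normalization and SVD-based reduction described above, assuming the covariance-matrix formulation given earlier and choosing the smallest k whose singular values retain the desired fraction of the variance.

```python
import numpy as np

def pca_reduce(X, var=0.99):
    """Normalize descriptors (x' = (x - mean)/std), form the covariance
    matrix, and keep the smallest k components whose singular values
    retain at least `var` of the total variance."""
    Xn = (X - X.mean(axis=0)) / X.std(axis=0)
    sigma = (Xn.T @ Xn) / Xn.shape[0]            # covariance matrix
    U, S, _ = np.linalg.svd(sigma)               # singular value decomposition
    k = int(np.searchsorted(np.cumsum(S) / S.sum(), var)) + 1
    return Xn @ U[:, :k], k

# Example: reduce 38 descriptors per image to k dimensions at 99% variance.
X = np.random.default_rng(0).normal(size=(120, 38))
Z, k = pca_reduce(X, var=0.99)
print(f"reduced from {X.shape[1]} to {k} dimensions")
```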
  • a processor is then configured as a classifier to receive, in vector form, the characteristics obtained from the PCA of the full vector comprising the full data set.
  • the classifier in an example implementation may include at least the following classifiers:
  • classifiers may be used to obtain a probability that a patient belongs to a predefined class (e.g. healthy, with risk of breast cancer, abnormal, etc.). This probability may be presented as a percentage.
  • FIG. 6 is a schematic block diagram depicting an example processor-implemented method 600 for segmentation of images of anatomical parts 609 of a patient’s 502 body.
  • the anatomical parts may include at least a portion of a patient’s 502 breast 611, especially in breast cancer detection applications.
  • the anatomical parts 609 may be defined by medical knowledge 607, which may include information about regions 605 commonly affected by a given medical condition. Such regions 605 may include areas local to or adjacent to the breast, or areas on the neck or in the armpits.
  • the anatomical parts 609 may be defined by the processor 506 automatically generating image masks 617. These image masks serve as a starting point for segmentation and may include a combination of points 619, lines 621, and curves 623 strategically placed according to a plurality of AI rules 616.
  • the AI rules 616 may include detecting a bottom contour of a breast 613.
  • Image processing may include detecting a bottom contour of a breast 613.
  • the AI rules 616 may also be referred to as AI machine vision rules and may include various edge detection processes such as determining locations of body parts such as a bottom contour of a breast 613.
  • the method 600 may subsequently include adjusting 623 positions of points 619, lines 621, and curves 623 along the bottom contour of a breast 613 or along the presently defined perimeter of the image mask 617 to better match 631 an anatomical part 609 unique to a patient’s 502 body.
  • the method may likewise include adding 625 points 619, lines 621, and curves 623 to, or removing 627 points 619, lines 621, and curves 623 from the bottom contour of a breast 613 or along the presently defined perimeter of the image mask 617 to better match the anatomical part 609 to the patient’s 502 body. If the match 631 is not sufficient, further adjustments 623, additions 625, or removals 627 can be made. If the match 631 is sufficient, the segmented image 633 of the anatomical part 609 can be captured.
  • FIG. 7 depicts anatomical guides 700 that may be displayed to the operator 403 on the patient monitor 476.
  • the anatomical guides 700 allow the operator 403 to select from six different images for analysis. These images are functional 701a and basal 701b images of the patient’s right oblique 704a, frontal 704b, and left oblique 704c views.
  • the user interface then may allow the operator to perform imaging, maintenance, or calibration functions such as, for example, infrared record capture, isotherm modification, focusing, sensor calibration, infrared sensor connection and disconnection, and reading and verification of a sensor serial number to verify its compatibility with an authorized sensor database.
  • the processor 506 may be configured to binarize a captured image similarly to the binarization performed during filtering 352 as described above.
  • the processor may be further configured to dilate the binarized image by 2 pixels, and may store the resulting image in memory.
  • the data values of the original binary image may be subtracted from corresponding values of the dilated image to obtain a contour.
  • Such a contour may be stored in memory as an external border image for each angle from which original images are taken, including frontal, right oblique, and left oblique.
  • the binarization, dilation, and subtraction process stages may be performed for an image of only one side, and a mirror image may be created for the complementary side.
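  • A sketch of these border-extraction stages, assuming OpenCV, the example binarization threshold of 20 mentioned earlier, and a 5x5 structuring element as one way to dilate by 2 pixels; the synthetic input stands in for a captured frame.

```python
import cv2
import numpy as np

def external_border(gray, threshold=20):
    """Binarize, dilate by 2 pixels, and subtract the original binary
    image from the dilated one, leaving a contour (external border)."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    kernel = np.ones((5, 5), np.uint8)       # 5x5 element grows ~2 px per side
    dilated = cv2.dilate(binary, kernel, iterations=1)
    return cv2.subtract(dilated, binary)

# Synthetic stand-in for a captured frame: a warm subject on a dark field.
gray = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(gray, (160, 120), 60, 200, -1)
border = external_border(gray)
mirrored = cv2.flip(border, 1)               # mirror for the complementary side
```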
  • the example method for automatically generating image masks 617 continues as follows.
  • a set of internal borders is found by applying the Canny edge detection method, a process known in the art, to a grayscale conversion of the original RGB image, and the result is stored. Pre-defined portions of the top and bottom sections of the internal border image are removed. These portions may be, for example, 60% for the top and 12% for the bottom.
  • the image that results from the removal of these portions contains the region where most of a breast is expected to appear in any patient.
  • Data representing these internal borders is subsequently added to corresponding data for the previously determined external borders, creating a reference image. Again, for oblique images, this process is performed for one side, with a mirror image created for the complementary side.
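  • A sketch of the internal-border stage, assuming OpenCV's Canny implementation with illustrative thresholds and the example 60%/12% trim proportions.

```python
import cv2
import numpy as np

def internal_borders(bgr, top_pct=0.60, bottom_pct=0.12):
    """Canny edges of the grayscale conversion, with the pre-defined top
    and bottom portions removed, leaving the region where most of a
    breast is expected to appear."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)         # thresholds are illustrative
    h = edges.shape[0]
    edges[: int(top_pct * h), :] = 0         # remove top 60%
    edges[h - int(bottom_pct * h):, :] = 0   # remove bottom 12%
    return edges

def reference_image(external, internal):
    """Add the internal borders to the previously determined external
    borders to create the reference image."""
    return cv2.bitwise_or(external, internal)

# Example on a synthetic frame:
bgr = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.circle(bgr, (160, 120), 60, (200, 200, 200), -1)
edges = internal_borders(bgr)
```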
  • the method continues through a process of armpit and shoulder point detection as follows (a code sketch follows this sequence of steps). Having the binary image as a base, a point by point sweep may be performed in the top line of the image, from left to right. Values different from zero may be detected and clustered. Three different clusters should be found, corresponding to the arms and neck of the patient. Coordinates of the midpoint of each cluster may then be calculated. If the expected three clusters are not present such that their midpoints cannot be calculated, the system may display a “non-valid image” message.
  • the distance between the central cluster midpoint and one selected midpoint of the sides may be calculated.
  • the distance may be stored in variable A.
  • Point A may then be created, below the central midpoint at a distance equal to the distance “A”.
  • a right triangle is defined by lines connecting point A, the central midpoint, and the selected side midpoint.
  • a point by point sweep may be performed from bottom to top of the defined right triangle area, searching for the first point equal to zero.
  • the coordinates of the first zero point may be stored. This point defines a shoulder point on a selected side.
  • the same process may be performed to find and store coordinates of the opposite side shoulder point.
  • a point by point sweep from left to right, and from the top to 30% of the image may then be performed to define the right external border coordinates.
  • another point by point sweep, this sweep moving from right to left from the top to 30% of the image may be performed to define the left external border coordinates.
  • the distances between a selected shoulder point and each one of the corresponding external border points may then be calculated.
  • the coordinates of the point that is at the shortest distance may be stored as the corresponding side armpit point.
  • the previous process may be repeated to find the coordinates of the opposite armpit point.
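  • A compact sketch of the cluster and shoulder-point stages, under the stated simplification that the right-triangle sweep is approximated by the triangle's bounding box; all names are illustrative.

```python
import numpy as np

def top_line_clusters(binary):
    """Sweep the top line left to right and cluster nonzero runs; three
    clusters are expected (the two arms and the neck). Returns the
    column midpoint of each cluster."""
    cols = np.flatnonzero(binary[0, :])
    if cols.size == 0:
        return []
    runs = np.split(cols, np.where(np.diff(cols) > 1)[0] + 1)
    return [int(run.mean()) for run in runs]

def shoulder_point(binary, mids, side="right"):
    """Distance A between the central midpoint and a side midpoint places
    point A below the central midpoint; the right-triangle area those
    three points define is swept bottom to top for the first zero pixel,
    the shoulder point. The sweep below covers the triangle's bounding
    box, a simplification of the triangle-only sweep in the text."""
    if len(mids) != 3:
        raise ValueError("non-valid image")   # expected three clusters
    center = mids[1]
    side_mid = mids[2] if side == "right" else mids[0]
    A = abs(side_mid - center)
    lo, hi = sorted((center, side_mid))
    for r in range(min(A, binary.shape[0] - 1), -1, -1):   # bottom to top
        for c in range(lo, hi + 1):
            if binary[r, c] == 0:
                return (r, c)
    return None

# Example: three vertical strips stand in for the arms and neck.
binary = np.zeros((200, 300), dtype=np.uint8)
binary[:, 40:60] = binary[:, 140:160] = binary[:, 240:260] = 255
mids = top_line_clusters(binary)
print(mids, shoulder_point(binary, mids, side="right"))
```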
  • FIG. 8 is a depiction of the above calculated points being employed for automatic segmentation of an image.
  • the armpit 839 and shoulder 841 points, along with the external 837a and internal 837b borders, may now be used to perform the automatic phase of image segmentation.
  • a point by point sweep may be performed from left to right, from right to left, and from top to bottom to detect the left and right external borders 837a.
  • the coordinates of those borders may be stored in a new vector.
  • the borders’ coordinate vectors may then be trimmed, for example, by 30% from both ends.
  • the distances between each point of both trimmed borders may then be measured.
  • the average distance may be calculated and halved to find the midpoint of the image.
  • the whole Canny image may be divided by the symmetry axis and stored in two half images 800. Taking one half image 800 as a base, the image may be trimmed, for example, by 65% from the top and 8% from the bottom. Starting from a line of the image that is above the bottom part of the image by, for example, a measure of 15%, a point by point sweep may be performed to detect the external vertical border 837a.
  • the remaining borders are the horizontal internal borders.
  • the coordinates and the length of each horizontal internal border may be stored and sorted by size.
  • the horizontal borders may be used to generate a series of second-degree polynomials that describe a vertical parabola 837c. As many parabolas may be generated as horizontal borders are found.
  • a parabola 837c may be chosen that best adjusts to a series of fixed parameters that better describe the lower curve of a breast.
  • the parameters may include vertex, focus, and width.
  • intersection coordinates are detected between the parabola 837c and the external vertical border 837a (point 1 819a), as well as between the parabola 837c and a vertical axis 835 defined as being parallel to the symmetry axis and separated by a distance equal to, for example, 2% of the image width (point 2 819b).
  • the coordinates of points 1 819a and 2 819b may be stored in a coordinates vector. With points 1 819a and 2 819b, two additional points (points 3 819c and 4 819d) that are needed for a 4-point Bezier curve may be calculated. This is an optimization cycle that outputs coordinates of two points (points 3 819c and 4 819d) that generate a curve that best adjusts to the parabola 837c.
  • the coordinates of points 3 819c and 4 819d may be stored in a coordinates vector. From the coordinates of the shoulder point 841 and the corresponding armpit point 839, the coordinates of a midpoint between them (point 5 819e) may be calculated. From the coordinates of point 5 819e, a point may be projected over the vertical axis at a distance equal to, for example, 2% of the image's height below its original coordinates in the vertical axis (point 6 819f). From the coordinates of points 5 819e and 6 819f, the coordinates of a midpoint between them (point 7 819g) may be calculated.
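  • A sketch of the 4-point Bezier optimization cycle: points 1 and 2 serve as the end points, and the two control points (points 3 and 4) are chosen to minimize the squared distance between the Bezier curve and sampled parabola points. SciPy's general-purpose minimizer is an assumed stand-in for whatever optimizer an implementation would use.

```python
import numpy as np
from scipy.optimize import minimize

def bezier(t, p0, p1, p2, p3):
    """Cubic (4-point) Bezier curve evaluated at parameter values t."""
    t = t[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def fit_control_points(p0, p3, parabola_pts):
    """Optimization cycle: find the two control points (points 3 and 4)
    so the curve through the end points (points 1 and 2) best adjusts
    to the sampled parabola."""
    t = np.linspace(0.0, 1.0, len(parabola_pts))

    def cost(flat):
        return np.sum((bezier(t, p0, flat[:2], flat[2:], p3)
                       - parabola_pts) ** 2)

    start = np.concatenate([p0 + (p3 - p0) / 3, p0 + 2 * (p3 - p0) / 3])
    res = minimize(cost, start)
    return res.x[:2], res.x[2:]

# Example: fit to a vertical parabola sampled between two intersections.
xs = np.linspace(0.0, 100.0, 50)
parabola = np.stack([xs, 0.02 * (xs - 50.0) ** 2 + 30.0], axis=1)
ctrl3, ctrl4 = fit_control_points(parabola[0], parabola[-1], parabola)
print(ctrl3, ctrl4)
```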
  • the external vertical border 837a vector may be divided into 4 equal segments, limited by the armpit point 839 and the intersection with the parabola 837c (point 1 819a).
  • the coordinates of the 3 points that define the limits of the inner segments may be stored as point 8 819h, point 9 819i, and point 10 819j respectively.
  • a region of interest (ROI) 843 may be defined by the following 3 curves and 3 lines:
  • An example user interface of an image processing function is shown in FIG. 9A.
  • the thermal imaging interface 945 in FIG. 9A may be used to automatically generate image masks 917a, 917b from ROIs 943.
  • images corresponding to a patient’s 402 left breast, right breast, left armpit, right armpit, left groove, right groove, left nipple, right nipple, left half of the neck and half right neck may be sectioned and separated from an original image, which is referred to here as segmentation.
  • the images shown can be modified, adjusted and repositioned by the operator 403 using a computer graphical interface.
  • An image segmentation method may include defining a vertical axis of symmetry of the image, and establishing points that define edges of an anatomical part 609 of a patient’s 402 body.
  • An automated processor-implemented method defines up to a predetermined number of points 919 and curves 923 to define the section for the region of interest 943. The number of points 919 and curves 923 defined may vary for different implementations, but in one example, ten points and two curves may be defined.
  • a second mask 917b may be automatically adjusted to mirror the first adjusted mask 917a.
  • the overlaid image in FIG. 9A depicts operation of the automated segmentation process.
  • the underlaid images in FIG. 9A depict images prior to segmentation, but are also capable of depicting images after segmentation wherein the adjusted masks for the regions of interest are shown in full brightness with surrounding areas darkened.
  • Another example user interface of an image processing function is shown in FIG. 9B.
  • an ROI 943 is shown as being defined by points 919 and curves 923.
  • the example user interface of FIG. 10A displays segmented images for a plurality of parts 1009 of a patient’s 502 body.
  • the segmented images are shown in full brightness with surrounding areas darkened.
  • the six images shown correspond to the six example imaging angles of FIG. 7.
  • FIG. 10B shows an example user interface displaying grayscale representations of the same images of a plurality of parts 1009 of a patient’s 502 body as in FIG. 10A.
  • Hot zones 1094, which may be identified by AI image analysis, are shown in FIG. 10B.
  • FIG. 11 is an example of the process shown in FIG. 9A, with grayscale representations of segmented images 1133.
• the segmented images of the regions of interest may be stored in a database. Once the segmentation of the patient's areas of interest is performed, the system may automatically obtain the maximum and minimum temperatures, first- and second-order statistical descriptor values, and other image characteristics to be analyzed using an AI implementation previously trained with historical data, including positive cases documented by mammography and confirmed histopathologically through a biopsy.
• the first method begins by defining first-order statistical descriptors (average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and range) from the image histogram of pixel intensities, for a total of 10 characteristics. Then, for the whole image, the co-occurrence matrices in the four angular directions (0°, 45°, 90°, and 135°) are obtained. For each co-occurrence matrix, seven second-order statistical descriptors (second moment, contrast, correlation, variance, entropy, variances' difference, and homogeneity) are obtained, for a total of 38 characteristics (a sketch of this feature extraction follows below).
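As a minimal sketch of that feature extraction, the following assumes an 8-bit grayscale integer image and uses scikit-image for the co-occurrence matrices; the variances' difference descriptor, which compares a pair of distinct image samples, is omitted, and the homogeneity formula is one common variant.

```python
import numpy as np
from skimage.feature import graycomatrix  # scikit-image >= 0.19

def first_order(img, levels=256):
    """The 10 first-order descriptors from the pixel-intensity histogram."""
    h, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = h / h.sum()                                 # p(i) = H(i) / T
    i = np.arange(levels)
    mu = (i * p).sum()
    sd = np.sqrt(((i - mu) ** 2 * p).sum())
    return np.array([
        mu, sd,
        ((i - mu) ** 3 * p).sum() / sd ** 3,        # asymmetry (skewness)
        ((i - mu) ** 4 * p).sum() / sd ** 4,        # kurtosis
        (p ** 2).sum(),                             # energy
        -(p[p > 0] * np.log2(p[p > 0])).sum(),      # entropy
        i[p.argmax()],                              # mode
        np.median(img), img.max(),                  # median, maximum
        img.max() - img.min(),                      # range
    ])

def second_order(img, levels=256):
    """Six of the seven descriptors for each co-occurrence matrix at
    0, 45, 90, and 135 degrees (variances' difference needs two samples)."""
    glcm = graycomatrix(img, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, normed=True)
    i, j = np.indices((levels, levels))
    feats = []
    for k in range(4):                              # one matrix per direction
        m = glcm[:, :, 0, k]
        mu_i, mu_j = (i * m).sum(), (j * m).sum()
        var_i = (((i - mu_i) ** 2) * m).sum()
        var_j = (((j - mu_j) ** 2) * m).sum()
        feats += [
            (m ** 2).sum(),                         # angular second moment
            (((i - j) ** 2) * m).sum(),             # contrast
            (((i - mu_i) * (j - mu_j) * m).sum()
             / np.sqrt(var_i * var_j)),             # correlation
            var_i,                                  # variance
            -(m[m > 0] * np.log2(m[m > 0])).sum(),  # entropy
            (m / (1.0 + np.abs(i - j))).sum(),      # homogeneity
        ]
    return np.array(feats)
```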
  • the second method is a machine learning method called “End to End.”
• an entire set of image data for each region of interest is entered into a convolutional neural network that treats each pixel as a raw characteristic, from which further characteristics between pixels are derived, or convolved. This is referred to as a background acquisition of statistical variables (a toy sketch follows below).
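A toy sketch of the "End to End" idea, assuming PyTorch; the architecture below is purely illustrative and stands in for whatever network an actual implementation would use.

```python
import torch
import torch.nn as nn

class RoiClassifier(nn.Module):
    """Minimal convolutional classifier for a fixed-size ROI: each pixel
    enters as a raw characteristic, and the stacked convolutions derive
    higher-order characteristics between pixels."""
    def __init__(self, in_channels=1, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):
        return self.head(self.features(x))

# Example: a batch of 8 single-channel 128x128 ROI images.
logits = RoiClassifier()(torch.randn(8, 1, 128, 128))
```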
  • FIG. 12 depicts a user interface 1200 for classification and medical evaluation.
• a doctor or diagnostician may use this interface 1200 to observe results of an automatic classification performed by the system.
  • the doctor may make annotations for observations 1298a and enter subjective variables 1298b for the issuance of a medical recommendation.
  • a report may be generated from the interface 1200.
  • results of an automatic classification may be provided through a user interface, such as the user interface 1200 of FIG. 12, by being displayed and presented to a user on a computer monitor or similar display device, visual or otherwise.
• This computer monitor may be the operator console 478 of FIGS. 4A-4C, the patient monitor 476 of FIGS. 4A-4C or the patient positioning monitor 576 of FIG. 5, the data acquisition system and terminal 578 or the analysis terminal 590 of FIG. 5, or any other suitable display device within an embodiment.
  • a report 1349 is shown in FIG. 13.
• a report 1349 may include the patient's 402 clinical data 1312, credentials 1396 of the operator and the physician issuing the recommendations, as well as the images 1333, relevant patient temperature measurements 1399, hot zones 1094 detected (not pictured in this example), medical observations 1398, and a tissue classification 1310 as a percentage from 0 to 100% based on any detected temperature abnormalities associated with the presence of a metabolically active tumor.
  • FIG. 14A depicts an example subjective analysis section 1496 that may be present in a report 1349. Observations can include areola symmetry, thermovascular network characteristics, and evidence of lumps in the breast profile. Separate entries for left and right breasts may be provided.
• FIG. 14B shows an example temperature comparison 1447b between left and right breasts that may be included in a report 1349. The example imaging angles of FIG. 7 can be selected and compared in terms of maximum and average temperature in FIG. 14B.
• FIG. 14C shows an example temperature comparison 1447c between left and right areolas that may be included in a report 1349. The example imaging angles of FIG. 7 again may be used.
  • FIG. 15 depicts several example medical conditions 1551 that may be detected through use of a method or system 1500 for determining risk of a medical condition according to the present disclosure.
• Breast cancer 1551a, fibrosis 1551b, mastitis 1551c, duct ectasia 1551d, and adenosis 1551e may be so detected.
  • FIG. 16 offers a high-level depiction of an example method for training AI components of the claimed systems and methods to automatically classify image data with respect to thermal anomalies that may be present in the data.
  • Image data 1653 may first be generated from thermal images of a training subject’s body. Such data may include RGB and grayscale depictions of full images as well as ROIs.
• First- and second-order statistical descriptors 1655 may then be calculated according to the method 200a of FIG. 2A for each of the four data types (RGB full image, RGB ROI, grayscale full image, grayscale ROI). Normalization and data visualization 1657 may then be performed for each data type, exemplifying a first filtering process stage wherein image data that is not viable for training is excluded from the training data set.
  • a second filtering process stage may follow, wherein data outside the normal distribution for each data type is excluded 1659 from the data set.
  • An ANOSIM 1661 may then be performed for each dataset to assess the linear independence of each statistical descriptor 1655. Descriptors 1655 with the highest differentiation degree 1663 may then be selected for each data type. Training and validation universes 1665 may then be created for each data type.
  • training 1667 may be performed for each data set, followed by a validation and evaluation process stage 1669, wherein correctly and incorrectly classified datasets are identified. Incorrectly classified datasets may then be analyzed 1671 by searching for patterns. A comparison 1673 is then performed to investigate correlation between incorrect classifications and clinical data. Stages 1659 through 1671 may then be repeated 1675 with a modified differentiability threshold until classification error is minimized.
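The exclusion of data outside the normal distribution (stage 1659) is not tied to a specific criterion in the description; a common choice is a z-score cutoff, sketched here with illustrative names.

```python
import numpy as np

def exclude_outside_normal(X, z_max=3.0):
    """Drop rows whose descriptor values fall outside the normal
    distribution, taken here as any column z-score exceeding z_max
    (the cutoff is an assumption, not specified in the disclosure)."""
    z = (X - X.mean(axis=0)) / X.std(axis=0)
    keep = (np.abs(z) <= z_max).all(axis=1)
    return X[keep], keep
```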
  • FIG. 17 shows an example method 1700 for detecting depictions of space between a patient’s head, shoulder, and raised arm within a set of image data. Such a detection may be used in either an alignment procedure prior to capturing and saving image data, or in segmenting images that have been previously saved.
  • the method begins with capturing an image 1777, which may include either saving a set of image data, or simply being connected to a live stream from an imaging device.
  • Image data may then be binarized 1779 so that holes between a patient’s head, neck, shoulder, and raised arm may be detected 1781. Existence of holes may be queried 1783. If the holes are not found to exist, a message to that effect may be displayed 1785.
  • a lower point 1787 may be detected in each hole. The location of each lower point 1787 may then be queried 1789 with respect to a rectangle superimposed upon the image. If at least one of the holes is not found to be located within the rectangle, a message to that effect may be displayed 1791. If the holes are found within the rectangle, an affirmative message may be displayed 1793.
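A sketch of stages 1779 through 1789, assuming a numeric thermal frame, scipy for connected-component labeling, and an illustrative binarization threshold; reading a "hole" as an enclosed background region is one plausible interpretation of the description.

```python
import numpy as np
from scipy import ndimage

def lowest_hole_points(frame, threshold=20):
    """Binarize a thermal frame and locate the lowest point of each enclosed
    background region ("hole"), e.g. the gaps between head, shoulder, and
    raised arm."""
    body = frame > threshold                        # binarization
    labels, n = ndimage.label(~body)                # background components
    points = []
    for lab in range(1, n + 1):
        rows, cols = np.nonzero(labels == lab)
        # A hole is a background component that does not touch the border.
        if (rows.min() == 0 or rows.max() == frame.shape[0] - 1
                or cols.min() == 0 or cols.max() == frame.shape[1] - 1):
            continue
        k = rows.argmax()                           # image rows grow downward
        points.append((rows[k], cols[k]))           # lower point of the hole
    return points

def holes_within_rectangle(points, top, left, height, width):
    """Query: do all detected lower points lie inside the rectangle?"""
    return all(top <= r <= top + height and left <= c <= left + width
               for r, c in points)
```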
  • FIG. 18A offers an alternative high-level example method 1800a for training AI components of the claimed systems and methods to automatically classify image data with respect to thermal anomalies that may be present in the data.
  • Data may first be generated 1895a from thermal images for a number of data types, including RGB, grayscale, ROI RGB, and ROI grayscale.
• Output labels may then be created 1895b for different takes, including frontal, left oblique, and right oblique.
  • Universes for training, testing, and validation may then be created 1895d.
  • a machine learning model may subsequently be developed 1895e by testing and validating classifications performed by the AI components. Model validation and evaluation 1895f may then follow.
  • FIG. 18B shows a high-level example method 1800b for classifying image data using AI components.
  • Data may first be generated 1895a from thermal images for a number of data types, including RGB, grayscale, ROI RGB, and ROI grayscale.
  • Output labels may then be created 1895c for different quality parameters of the thermal images, including height, focus, rotation, position of a patient’s arms, lateral displacement, patient tilt, and distance between the patient and the imaging device.
  • Universes for training, testing, and validation may then be created 1895d.
• a machine learning model may subsequently be developed 1895e by testing and validating classifications performed by the AI components. Model validation and evaluation 1895f may then follow. Classification and error analysis 1895g may then take place, followed by an efficiency evaluation 1895h of the model.
  • FIG. 19A depicts an example method 1900a of training the AI components of the claimed systems and methods, which will now be described.
  • This series of training process stages establishes inference rules that will be used to automatically classify image data with respect to thermal anomalies that may be present in the data, and, therefrom, to determine a level of risk for a medical condition.
  • Thermal image data for one or more training subjects may be present in a database 1997a and may be initially processed or segmented 1997b.
  • a first filtering process stage 1997c may include removal of data for thermal images from the database 1997a that may not be viable for inclusion in a training data set, due to the presence of exclusion parameters in the corresponding clinical record.
  • Exclusion parameters may include, for example, indications of recent total or partial mastectomy, indications of recent surgical interventions on the thorax area, indications of previous breast cancer detection, absence of a clinical record, or absence of mammography results.
• a first example approach may use a binary classification system, meaning that every input will be assigned to a class A (positive for a thermal anomaly) or a class B (negative for a thermal anomaly).
• the output may be a membership value; if the value surpasses a typical threshold of 0.5, the input is assigned to class A, and it is otherwise assigned to class B (see the snippet below).
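In code, the binary assignment reduces to a threshold test (a sketch; names are illustrative):

```python
def assign_class(membership, threshold=0.5):
    """Class A (positive for a thermal anomaly) when the membership value
    surpasses the threshold; class B (negative) otherwise."""
    return "A" if membership > threshold else "B"
```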
• the system may perform a multiclass classification based on the Birads scale. Labels may be generated for classes Birads I, Birads II, Birads III, Birads IVa, Birads IVb, Birads IVc, and Birads V, including a subsystem to classify a previously assigned Birads IV within its different subclassifications.
  • a second filtering process stage 1997h may proceed as follows. With the created clusters of data 1997d, 1997e, 1997f, 1997g, which contain data for thermal images in RGB and grayscale, as well as a temperature matrix from the ROIs, and clinical data, a plurality of statistical descriptors may be calculated. An objective of the calculation may be to enable subsequent elimination of data from the edges of the distributions. Another objective may be to evaluate the similarity between data sets by performing a correlation analysis of the statistical descriptors and an ANOSIM in two or more clusters 1997d, 1997e, 1997f, 1997g to select statistical descriptors with more linear independence.
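For reference, a bare-bones version of the standard ANOSIM R statistic (permutation testing omitted), computed over Euclidean distances between descriptor vectors; this is the textbook formulation, not anything specific to the disclosure.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import rankdata

def anosim_r(X, groups):
    """ANOSIM R = (mean between-group rank - mean within-group rank) / (M/2),
    where M = n(n-1)/2 is the number of pairwise distances. R near 1 means
    well-separated groups; near 0, indistinguishable groups."""
    d = pdist(X)                       # condensed Euclidean distance matrix
    ranks = rankdata(d)
    n = len(groups)
    # Pair ordering below matches pdist's condensed ordering (i < j).
    same = np.array([groups[i] == groups[j]
                     for i in range(n) for j in range(i + 1, n)])
    r_within = ranks[same].mean()
    r_between = ranks[~same].mean()
    return (r_between - r_within) / (len(d) / 2)
```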
• the aforementioned objectives of the statistical descriptor calculations will support Deep Neural Network (DNN) models 1997i in generalizing the knowledge as inference rules 1997j for new input data from patients at the inference stage.
  • the DNN models may be mathematical functions that can learn from data, such as image or text data, to define an n-dimensional decision boundary between n descriptors from given data sets. Predictive accuracy of the DNNs can increase when the DNNs are provided with more data.
• the statistical descriptors, as previously described herein, may include average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, minimum, range, angular second moment, contrast, correlation, and homogeneity.
  • the statistical descriptors may further include a measure of dissimilitude. Dissimilitude may be demonstrated by a correlation matrix composed of distance measurements such as Euclidean distances between values of respective descriptors.
• a method of training a supervised learning system may include a data set for training and a data set for validation, with their respective labels (or classes), which are presented to the system.
  • the data input may interact with inference rules, which may take forms of weights and biases.
  • a weight may be defined as the slope of a decision boundary, and a bias may be defined as an offset of a decision boundary.
  • Weights and biases may be adjusted to reduce classification errors in the following way: inputs given to the system may be evaluated, a global classification error may then be calculated, and an optimization function may then be used to update the weights and the biases, thus updating the inference rules.
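A minimal sketch of that update loop for a single logistic decision boundary, with plain gradient descent standing in for whichever optimization function is actually used:

```python
import numpy as np

def training_step(w, b, x, y, lr=0.01):
    """One update of the inference rules (weights and biases): evaluate the
    inputs, compute a global classification error, and let the optimization
    function (here, gradient descent) update w and b."""
    z = x @ w + b                          # evaluate inputs against the rules
    p = 1.0 / (1.0 + np.exp(-z))           # membership values in (0, 1)
    error = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))  # global error
    grad_w = x.T @ (p - y) / len(y)        # gradient of the error w.r.t. w
    grad_b = np.mean(p - y)                # gradient w.r.t. the bias
    return w - lr * grad_w, b - lr * grad_b, error
```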
  • FIG. 19B shows an example method 1900b similar to the training method 1900a, as applied to classification of thermal image data as positive or negative regarding any presence of thermal abnormalities.
  • the aforementioned similarity lies in an initial application of inference rules 1997j to obtain a plurality of results 1997m, followed by application of final inference rules 1997n to arrive at a final classification 1997o with minimal classification error.
  • the inference rules 1997j may be implemented as sets of inference rules, wherein a set of inference rules may include one or more inference rules.
  • FIG. 19B shows n sets of inference rules, labeled respectively as “Inference Rules 1,” “Inference Rules 2,” and “Inference Rules n.”
  • the quantity n of sets of inference rules may encompass, for example, two sets of inference rules, four sets of inference rules, ten sets of inference rules, or any other number of sets of inference rules that can be stored and executed properly on the system. Multiple sets of inference rules may thus be applied in parallel to the data being classified, such that results of the plurality of results 1997m are mutually independent. Such mutual independence contributes to minimization of classification errors when final inference rules 1997n are applied to the plurality of results 1997m.
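Schematically, and with purely illustrative rule sets, the parallel application might look like the following; any trained models could stand in for the lambdas.

```python
import numpy as np

def classify(features, rule_sets, final_rules):
    """Apply n sets of inference rules in parallel to the same input; the
    mutually independent results are then combined by the final inference
    rules into the final classification."""
    results = np.array([rules(features) for rules in rule_sets])
    return final_rules(results)

# Example with three hypothetical rule sets and a majority-style final rule.
rule_sets = [
    lambda f: float(f.mean() > 0.6),
    lambda f: float(f.max() > 0.9),
    lambda f: float(f[-1] > 0.5),
]
final_rules = lambda r: "positive" if r.mean() > 0.5 else "negative"
print(classify(np.array([0.2, 0.7, 0.8]), rule_sets, final_rules))
```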
  • inference rules described herein are examples of AI rules, and may be employed at least as AI classification rules.
  • FIG. 20 illustrates a computer network (or system) 1000 or similar digital processing environment, according to some embodiments of the present disclosure.
  • Client computer(s)/devices 50 and server computer(s) 60 provide processing, storage, and input/output devices executing application programs and the like.
  • the client computer(s)/devices 50 can also be linked through communications network 70 to other computing devices, including other client devices/processes 50 and server computer(s) 60.
  • the communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, local area or wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth®, etc.) to communicate with one another.
  • Client computers/devices 50 may be configured with a computing module (located at one or more of elements 50, 60, and/or 70).
• a user may access the computing module executing on the server computers 60 from a user device, such as a mobile device, a personal computer, or any computing device known to one skilled in the art without limitation.
  • the client devices 50 and server computers 60 may be distributed across a computing module.
  • Server computers 60 may be configured as the computing modules which communicate with client devices 50 for providing access to (and/or accessing) databases that include data associated with thermographic images or other types of image data.
  • the server computers 60 may not be separate server computers but part of cloud network 70.
  • the server computer may enable users to determine location, size, or number of physical objects (including but not limited to target objects and/or reference objects) by allowing access to data located on the client 50, server 60, or network 70 (e.g., global computer network).
  • the client (configuration module) 50 may communicate data representing the physical objects back to and/or from the server (computing module) 60.
  • the client 50 may include client applications or components executing on the client 50 for determining location, size, or number of physical objects, and the client 50 may communicate corresponding data to the server (e.g., computing module) 60.
  • Some embodiments of the system 1000 may include a computer system for determining a level of risk of a medical condition for a patient based on image data.
  • the system 1000 may include a plurality of processors 84.
  • the system 1000 may also include a memory 90.
  • the memory 90 may include: (i) computer code instructions stored thereon; and/or (ii) data representing thermographic images or other types of image data.
  • the data may include segments including portions of the thermographic images or other types of image data.
  • the memory 90 may be operatively coupled to the plurality of processors 84 such that, when executed by the plurality of processors 84, the computer code instructions may cause the computer system 1000 to implement a computing module (the computing module being located on, in, or implemented by any of elements 50, 60, 70 of FIG. 20 or elements 82, 84, 86, 90, 92, 94, 95 of FIG. 21) configured to perform one or more functions.
• FIG. 21 is a diagram of an example internal structure of a computer (e.g., client processor/device 50 or server computers 60) in the computer system 1000 of FIG. 20.
  • Each computer 50, 60 contains a system bus 79, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system.
  • the system bus 79 is essentially a shared conduit that connects different elements of a computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) that enables the transfer of information between the elements.
  • Attached to the system bus 79 is an I/O device interface 82 for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer 50, 60.
  • a network interface 86 allows the computer to connect to various other devices attached to a network (e.g., network 70 of FIG. 20).
  • Memory 90 provides volatile storage for computer software instructions 92 and data 94 used to implement some embodiments (e.g., input and output video streams described herein).
  • Disk storage 95 provides non-volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present disclosure.
  • a central processor unit 84 is also attached to the system bus 79 and provides for the execution of computer instructions.
• the processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROMs, CD-ROMs, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the present disclosure.
  • the computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art.
  • at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection.
• Other embodiments may include a computer program propagated signal product 107 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)).
  • Such carrier medium or signals provide at least a portion of the software instructions for the routines/program 92 of the present disclosure.
  • the propagated signal is an analog carrier wave or digital signal carried on the propagated medium.
  • the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network.
  • the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer.
  • the computer readable medium of computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for computer program propagated signal product.
  • carrier medium or transient carrier encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium and the like.
  • Embodiments or aspects thereof may be implemented in the form of hardware (including but not limited to hardware circuitry), firmware, or software. If implemented in software, the software may be stored on any non-transient computer readable medium that is configured to enable a processor to load the software or subsets of instructions thereof. The processor then executes the instructions and is configured to operate or cause an apparatus to operate in a manner as described herein.
  • FIG. 22 shows an example method 2200 for training an AI-enabled thermographic medical imaging system to determine a risk of a medical condition for a patient 502.
  • One or more thermographic images 2204 may be received at a processor 2206.
  • Image processing 2206a may be applied to the one or more thermographic images 2204 to define respective sets of thermographic image data 2208 for a plurality of anatomical parts 2209 of a training subject’s body.
  • a plurality of respective AI rules 2297j may be established based on the respective sets of thermographic image data 2208 for a plurality of anatomical parts 2209 of a training subject’s body. The plurality of respective AI rules 2297j may then be used in determining a level of risk of a medical condition for the patient 502.

Abstract

A processor-implemented method and medical imaging system include various features adapted to determine a risk of a medical condition for a patient. A processor can receive one or more thermographic images and process the images to define respective sets of thermographic image data for a plurality of anatomical parts of the patient's body. The processor can then execute a set of artificial intelligence (AI) rules to determine a risk of a medical condition based on the image data. Clinical data pertaining to the patient may also serve as an input to the determination of risk by the processor.

Description

DETERMINATION OF MEDICAL CONDITION RISK USING THERMOGRAPHIC IMAGES
RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Application No. 62/948,176, filed on 13 December 2019. The entire teachings of the above application are incorporated herein by reference.
BACKGROUND
[0002] Breast cancer is one of the most common forms of cancer worldwide, particularly among females. While the chances of surviving a breast cancer diagnosis are increasing, those chances fall when the cancer is not detected at a sufficiently early stage.
[0003] Current methodologies for detecting breast cancer include palpation and mammography, followed by analysis of a biopsy when a closer examination is warranted. While palpation performed in a self-examination may be sufficient to lead a patient to seek care from a doctor, patients can miss signs of problematic lumps or miss tissue that may not be felt by palpation. As a result, many patients do not seek a doctor’s care in time for successful treatment.
[0004] Mammography is widely used for non-invasive breast cancer detection. In mammography, x-rays of the breast are analyzed by a trained mammographer or radiologist. However, the level of skill required for accurate interpretation of an x-ray image renders access to providers difficult in many parts of the world. In addition, women often feel discomfort or even pain when having a mammogram. Thus, many patients end up forgoing recommended periodic mammograms.
[0005] Other methodologies, such as ultrasound and infrared thermography, can provide higher sensitivity and specificity than mammography with respect to smaller tumors typical of early-stage breast cancer.
SUMMARY
[0006] Both ultrasound and infrared thermography still require analysis and interpretation of images by skilled technicians or even doctors. Further, many areas of the world, especially rural or developing areas, suffer from a shortage of providers with the required skills for ultrasound or infrared thermography to meet the needs of the community. In order to draw conclusions regarding the presence of breast cancer or other medical conditions from ultrasound or thermographic images, the images must be analyzed by a specially trained entity. Typically, this entity is a person, such as a technician or a doctor.
[0007] Some advancements in machine learning have allowed machines to be trained to perform the image analysis automatically using artificial intelligence (AI). Automation of image analysis through AI can bring significant improvements in capabilities of disease detection to communities that are medically understaffed or otherwise underserved.
However, existing methods of AI image analysis and classification suffer from unpredictable levels of classification error due to issues such as inconsistent image quality and image processing techniques that may occasionally result in statistical outliers.
[0008] Embodiments described herein overcome the aforementioned disadvantages in multiple ways. For example, different AI rules, or different combinations thereof, from a plurality of AI rules may be applied to images of different parts of a patient’s body. Images may be taken from different angles, such as frontal, left oblique, and right oblique. Also, images may be taken quiescently as basal images, or with the subject body part stimulated or conditioned in various ways, such as by heating, cooling, or moistening. Images may be processed to create statistical representations of both full images and regions of interest, rather than attempting to classify raw images. Additionally, after applying different AI rules to images of different subject body parts, results obtained therefrom may be used as common inputs to a subsequent set of AI rules, in order to minimize effects of any errors from potentially unexpected results from any one of the previously applied sets of AI rules. It may even be ensured that only high-quality images are presented to the AI classification methods by using information from previous images to determine if a subject of a new image is, for example, correctly positioned or aligned.
[0009] In one embodiment, a processor-implemented method for determining a risk of a medical condition for a patient includes receiving, at the processor, one or more thermographic images. The method also includes applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body. The method further includes determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
[0010] The method may further include receiving, at the processor, clinical data for the patient. The AI rules may define a statistical assessment of temperature values and the clinical data. The temperature values may be extracted from the respective sets of thermographic image data. The statistical assessment may include first-order statistical descriptors of histograms of the respective sets of thermographic image data. The first-order statistical descriptors may include at least one of average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and statistical range. The respective sets of thermographic image data may comprise pixel values. The statistical assessment may include second-order statistical descriptors derived from co-occurrence matrices of the thermographic image data. The co-occurrence matrices may be derived from matrices of the pixel values of the respective sets of thermographic image data, or directly from matrices of the temperature values. The second-order statistical descriptors may include at least one of second angular moment, contrast, correlation, variance, entropy, variances' difference, and homogeneity. It should be noted that, as a first-order statistical descriptor, entropy may be calculated directly from the temperature values, or from the pixel values. It should be further noted that, as a second-order statistical descriptor, entropy may be calculated from the co-occurrence matrix values. Variances' difference may be defined as a difference between variance values from a pair of distinct image samples.
[0011] The method may include filtering a preliminary sample of thermographic images for a measure of image quality, and subsequently capturing the one or more thermographic images with an imaging device. The filtering may include comparing the preliminary sample with a reference image. The reference image may include at least one of a positional marking or an alignment mask. The alignment mask may include a depiction of space between a head, a shoulder, and a raised arm of a patient on respective sides of the patient’s body. The preliminary sample may include a live feed from the imaging device. The measure of image quality may include at least one of alignment, angle of rotation, height, or inclination of the patient’s body with respect to an orientation of an imaging device, and interference within an area of the image. The measure of image quality may further include at least one of distance between the patient’s body and the imaging device, image contrast, and a background or core temperature identified within the preliminary sample. The measure of image quality may include a measure of symmetry between parts of left and right sides of the patient’s body.
The method may include prompting a user to adjust a parameter of the patient or of the imaging device to improve the measure of image quality prior to capturing the one or more thermographic images.
[0012] The one or more thermographic images may be obtained from a machine vision module. The plurality of anatomical parts of the patient’s body may be defined by medical knowledge including regions commonly affected either directly or indirectly by the medical condition. The plurality of anatomical parts of the patient’s body may include at least a portion of a breast. The AI rules may include detecting a bottom contour of the breast.
Image processing may include detecting a bottom contour of the breast. The plurality of anatomical parts of the patient’s body may be further defined by the processor automatically generating image masks including a combination of points, lines, and curves. The method may further include adjusting points, lines, or curves on, adding points, lines, or curves to, or removing points, lines, or curves from the image masks to match a part of the patient’s body having a unique anatomical structure or manifestation.
[0013] The method may include outputting the level of risk of the medical condition for the patient to a user. Outputting the level of risk may be performed by outputting a report including the level of risk, a plurality of subjective analysis selections made by the user, the thermographic images, and the respective sets of thermographic image data for a plurality of anatomical parts of a patient's body. The medical condition may include at least one of breast cancer, fibrosis, mastitis, duct ectasia, and adenosis.
[0014] Applying a plurality of respective AI rules to the respective sets of thermographic image data may include applying multiple sets of AI rules in parallel to produce a plurality of mutually independent results, and subsequently applying final inference rules to the plurality of mutually independent results.
[0015] Applying image processing may include applying AI machine vision rules to the one or more thermographic images. The plurality of respective AI rules may be a plurality of respective AI classification rules.
[0016] In another embodiment, a system for determining a risk of a medical condition for a patient includes a processor configured to receive one or more thermographic images. The processor is also configured to apply image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body. The processor is further configured to determine a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body. The processor may be further configured to receive clinical data of the patient. This embodiment may further optionally include any features described herein in relation to the method described above.
[0017] In another embodiment, a system for determining a risk of a medical condition for a patient includes means for receiving, at a processor, one or more thermographic images.
The system also includes means for applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient's body. The system also includes means for determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient's body. The system may also include means for receiving, at the processor, clinical data of the patient. This embodiment may further optionally include any features described herein in relation to the method described above.
[0018] In another embodiment, a processor-implemented method for training an AI-enabled system to determine a risk of a medical condition for a patient may include receiving, at a processor, one or more thermographic images. The method also includes applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a training subject's body. The method also includes establishing, based on the respective sets of thermographic image data for a plurality of anatomical parts of the training subject's body, a plurality of respective artificial intelligence (AI) rules to be used in determining a level of risk of a medical condition for the patient. The method may also include receiving, at the processor, clinical data of the training subject. This embodiment may further optionally include any features described herein in connection with any of the other embodiments described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
[0020] The foregoing will be apparent from the following more particular description of example embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments.
[0021] FIG. 1A illustrates elements of an example system for determining a risk of a medical condition for a patient.
[0022] FIG. 1B illustrates elements of another example system for determining a risk of a medical condition for a patient.
[0023] FIG. 2A is a schematic block diagram illustrating example methods and systems for performing a statistical assessment of image data and clinical data according to AI rules.
[0024] FIG. 2B shows a construction of an example co-occurrence matrix used in performing a statistical assessment of image data and clinical data according to AI rules.
[0025] FIG. 3 is a schematic block diagram illustrating example methods and systems for filtering a preliminary sample of thermographic image data for image quality.
[0026] FIG. 4A is a depiction of an example thermographic medical imaging system.
[0027] FIG. 4B is a depiction of an example thermographic medical imaging system, shown as viewed by a patient.
[0028] FIG. 4C is a depiction of an example thermographic medical imaging system, shown as viewed by an operator.
[0029] FIG. 5 illustrates elements of an example system for determining a risk of a medical condition for a patient.
[0030] FIG. 6 is a schematic block diagram illustrating example methods and systems for capturing a segmented image of an anatomical part of a patient’s body.
[0031] FIG. 7 is an illustration of example imaging angles.
[0032] FIG. 8 is a depiction of an example binarized image for use in alignment of an imaging system.
[0033] FIG. 9A is a depiction of a user interface for automatic segmentation of thermographic images of an anatomical part of a patient’s body according to AI rules.
[0034] FIG. 9B is a color depiction of a user interface for manual adjustment of segmented areas of thermographic images.
[0035] FIG. 10A is a depiction of a user interface displaying a segmented image in color.
[0036] FIG. 10B is a depiction of a user interface displaying a segmented image in grayscale.
[0037] FIG. 11 is a depiction of a user interface displaying results of a segmented image.
[0038] FIG. 12 is a depiction of an example output of a thermographic medical imaging system.
[0039] FIG. 13 is a depiction of an example report provided by a thermographic imaging system.
[0040] FIG. 14A is a depiction of an example subjective analysis section that may be included in an output of a thermographic medical imaging system.
[0041] FIGS. 14B and 14C are depictions of example temperature comparisons that may be included in an output of a thermographic medical imaging system.
[0042] FIG. 15 is a schematic block diagram showing example medical conditions that may be detected through use of a method or system for determining risk of a medical condition.
[0043] FIG. 16 is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
[0044] FIG. 17 is a schematic block diagram showing an example method for evaluating alignment between a patient and a thermographic medical imaging system.
[0045] FIGS. 18A and 18B are schematic block diagrams showing example methods for training an AI-enabled thermographic medical imaging system and for classifying image data using AI components.
[0046] FIG. 19A is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
[0047] FIG. 19B is a schematic block diagram showing an example method for classifying thermographic medical image data by application of AI rules.
[0048] FIG. 20 illustrates an example computer network, over which, embodiments of the claimed systems and methods may operate.
[0049] FIG. 21 is a system block diagram illustrating an example computer network, over which, embodiments of the claimed systems and methods may operate.
[0050] FIG. 22 is a schematic block diagram showing an example method for training an AI-enabled thermographic medical imaging system.
DETAILED DESCRIPTION
[0051] A description of example embodiments follows.
[0052] Infrared thermography is typically used to noninvasively measure the temperature of an object. The technique is based on a proportional relationship between the infrared light emitted and the surface temperature of the object. Infrared radiation falls within the non-visible wavelength range of approximately 1 mm to 750 nm. All bodies emit infrared radiation, making thermography useful in different areas such as medicine, heavy industry, sports, agriculture, military applications, astronomy, and electronics, among others. Infrared radiation is measured using infrared light sensors, which operate similarly to cameras used to capture light from the visible spectrum, forming images as photographs. Cameras include sensors comprising a matrix arrangement of different transducers capable of sensing point temperature measurements. These measurements are interpreted using pseudo-color, which comprises a color palette directly related to changes in surface temperature. A grayscale may also be used to distinguish the temperatures, where dark tones reveal lower temperatures relative to the high temperatures represented by light tones.
[0053] The interest in using infrared thermography in medical diagnostics is largely based on the extent to which temperature variations indicate the presence of abnormal physiological activity. This correlation stems from alterations of metabolism or blood flow that make the abnormalities detectable at an earlier stage than with other techniques. In one aspect, systems and methods are described below for detecting thermal abnormalities to screen for abnormal tissue in the early stages of disease.
[0054] Example implementations of systems for detecting thermal abnormalities utilize infrared sensors to capture infrared images similarly to the manner in which a conventional camera captures images from detection of visible light. The images are processed digitally to manipulate digital arrays linked to an image that can be projected onto the screen of a computer system. The following concepts are described below to enhance understanding of the application of digital image processing in example implementations:
• Computer vision: acquisition, processing, classification, and recognition of digital images.
• Pixel: the basic element of an image.
• Image: a two-dimensional arrangement of pixels with different light intensity (grayscale). If the light intensity of a pixel is represented by n bits, then there will be 2^n different grayscales.
• Color: a combination of red, green, and blue. A color can be expressed by a triplet of component variables R, G, and B, respectively having intensity values for red, green, and blue, with the intensity values defined on a scale ranging from 0 to 1.
• Brightness: the extent to which an area is illuminated.
• Tone: the relative intensity of color.
• Luminosity: the brightness of one area relative to another.
• Chroma: the coloration of an area relative to the brightness of a reference white.
• RGB Space: a specific color is a combination of amounts of component colors red, green, and blue. Color may be expressed as an arithmetic sum of the component variables X = R + G + B, and may be graphically represented in three-dimensional space. The maximum gain for each component corresponds to the wavelength of the respective component color. A component color may be herein interchangeably referred to as a basic color.
• Histogram of an image: a representation of the distribution of pixels in the different tones or fixed ranges in the given scale (gray, color, temperature, etc.).
[0055] FIG. 1A depicts an example system 100a for determining a level of risk 110 of a medical condition for a patient 102. The level of risk 110 may be expressed as a percent risk of a medical condition, a percent chance of a medical condition, or by any other means or scale by which a probability can be quantified. One or more images 104 of the patient's 102 body or parts thereof are received at a processor 106. The processor 106 can be implemented as part of a computer, tablet, smartphone, or similar electronic device, or as an embedded module within an imaging system or other piece of dedicated equipment. The processor 106 is shown in FIG. 1A as a single unit, but it should be understood that the processor functions of FIG. 1A may alternatively be performed by a network of processors in communication with each other. The processor 106 applies image processing to the images 104 to define respective sets of thermographic image data 108 for a plurality of anatomical parts 109 of the patient's 102 body, a process also referred to herein as segmentation. A plurality of respective artificial intelligence (AI) rules 116 is applied to the respective sets of thermographic image data 108 to determine a level of risk 110 of a medical condition in the patient 102. A plurality can be taken to mean two or more, and as such, two AI rules 116a, 116b are shown in FIG. 1A, but it should be understood that other embodiments may employ larger numbers of AI rules, such as three or more, four or more, or five or more AI rules, and so on.
[0056] FIG. 1B depicts another example system 100b for determining a level of risk 110 of a medical condition for a patient 102. The level of risk 110 is depicted in FIG. 1B as a percentage, but, as in FIG. 1A, the level of risk 110 may be embodied by any means or scale by which a probability can be quantified. One or more images 104 of the patient's 102 body or parts thereof are received at a processor 106. Clinical data 112 of a patient is also received by the processor 106. The clinical data 112 may be provided directly by a patient to the system, or indirectly through an imaging technician or other medical professional. Alternatively, the clinical data 112 may be read from an electronic or paper record by a medical professional or by a computer, and thus duly provided to the processor 106.
[0057] The processor 106 is shown in FIG. 1B as being part of a computer.
Alternatively, the processor 106 may be implemented as part of a tablet, smartphone, or similar electronic device, or as an embedded module within an imaging system or other piece of dedicated equipment. The processor 106 is shown in FIG. 1B as a single unit, but it should be understood that the processor functions of FIG. 1B may be performed by a network of processors in communication with each other. The processor 106 applies image processing to the images 104 to define respective sets of thermographic image data 108 for a plurality of anatomical parts 109 of the patient's 102 body. A plurality of respective AI rules 116 is applied to the respective sets of thermographic image data 108 to determine a level of risk 110 of a medical condition in the patient 102. A plurality can be taken to mean two or more, and as such, two AI rules 116a, 116b are shown in FIG. 1B, but it should be understood that other embodiments may employ larger numbers of AI rules, such as three or more, four or more, or five or more AI rules, and so on.
[0058] Functions of the processor 106 are shown in FIG. 1B to occur within a looping methodology wherein the processor outputs a result of a first function, which result in turn becomes an input for a second function. It should be understood that intermediate data generated by performance of processor functions, such as application of image processing or application of AI rules 116, may remain internal to the processor 106 until a level of risk 110 is determined, or such intermediate data may exit the processor 106, or even the host computer, and subsequently re-enter the processor as an input to a following process. Such intermediate data may also traverse a network between multiple processors 106 to be processed in a distributed manner or in another manner to make efficient use of network resources.
[0059] Digital image processing provides information of interest from captured images, or from any sections of an image. The information may be processed, for example, by determining first-order statistical descriptors, which may be extracted from the image data. The first-order statistical descriptors may be obtained using a histogram of the image data. Second-order statistical descriptors may also be extracted from the image, which may be obtained using co-occurrence matrices.
[0060] An H(i) histogram represents the frequency of occurrence of a specific value in a given set of events. For an image, i represents the value of each pixel based on the depth of color of the image. The value i may be expressed in, for example, 8, 12, 16, or another number of bits. The histogram may be viewed as being a probability density function by defining the probability of occurrence, p(i), of the specific value, i, of a pixel in the image, where p(i) = H(i)/T, and where T is the number of pixels in the image. The first-order statistical descriptors shown in Table 1 may therefore be determined based on p(i). In Table 1, the number of values a pixel may have is denoted by L.
[0061] Table 1. First-order statistical descriptors.
[0062] A co-occurrence matrix is a matrix array that represents the joint probability that two pixels have intensity values of i and j, respectively, at a distance d, in a given direction. Second-order statistical descriptors refer to a set of values obtained from the statistical processing of co-occurrence matrices in digital images. These characteristics are referred to as texture characteristics and are shown in Table 2. Co-occurrence matrices are described hereinbelow in further detail with reference to FIG. 2B.
[0063] Table 2. Second-order statistical descriptors.
[0064] Statistical descriptors may be used in a processor-implemented method capable of discriminating abnormal thermograms from the analysis of thermal asymmetry. The analysis of asymmetry may be used by analyzing the means of the values of the pixels. Additional characteristics such as mean, variance, statistical asymmetry and kurtosis may further enhance such methods. Further enhancements to the analysis of abnormal thermograms may be achieved using AI, and, in particular, neural networks in a predictive model that is trained to process the image data.
[0065] One aspect of using a neural network involves validating and monitoring performance of the neural network. Different techniques can be used to monitor the training process of an AI system. For purposes of this disclosure, cross-validation or, more specifically, cross fold validation may be used. Cross fold validation determines the responsiveness of a machine learning system to training using two sets of data that are mutually independent. The first set is called a training set, and the second is referred to as a validation set. Cross fold validation is a known machine learning monitoring method and need not be described in detail.
[0066] When processing image data, the system will continue the training process, as new data is received, until a threshold training error is reached. In an example implementation, the threshold may be set to less than 10%. However, this error should not be interpreted as being a conventional error rate. That is, defining the threshold error as based on the rate of mismatches could lead to erroneous conclusions due to the high probability of obtaining a relatively small proportion of positive cases in the data pool. The error for purposes of embodiments described herein is defined to be an F1 Score indicator.
[0067] The F1 Score is based on a goal of increasing sensitivity and positive predictive value. These parameters are defined below:
sensitivity = TP / (TP + FN)
positive predictive value (PPV) = TP / (TP + FP)
F1 Score = 2 × (PPV × sensitivity) / (PPV + sensitivity)
where TP, FP, and FN denote true positives, false positives, and false negatives, respectively.
The parameters will be extracted from the average of the values obtained in each of the folds. An F1 score greater than 90% will indicate a correct training of the intelligent system.
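A small sketch of the fold-averaged score, with hypothetical confusion counts:

```python
def fold_f1(tp, fp, fn):
    """F1 from one validation fold's confusion counts."""
    sensitivity = tp / (tp + fn)           # true-positive rate
    ppv = tp / (tp + fp)                   # positive predictive value
    return 2 * ppv * sensitivity / (ppv + sensitivity)

# Average over the k folds of the cross-fold validation; training is deemed
# correct when the averaged score exceeds 0.90.
folds = [(42, 5, 3), (40, 4, 6), (44, 6, 2)]   # hypothetical (TP, FP, FN)
mean_f1 = sum(fold_f1(*f) for f in folds) / len(folds)
```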
[0068] FIG. 2A depicts an example method 200a of performing image processing for segmentation based on a statistical assessment 218 of temperature values 214 and clinical data 212. The statistical assessment 218 may be defined by AI rules 216. AI rules 216 may also be referred to as statistical AI rules, inference rules, or AI classification rules. AI rules 216 that define the statistical assessment 218 may be based on various combinations of RGB representations of temperature data for regions of interest and of full images, and patient clinical data and identification data. The AI rules may be established through analysis of training datasets according to the aforementioned various combinations of bases, and may be applied to classify new datasets according to the aforementioned various combinations of bases. The bases of AI rules are described hereinbelow in further detail with respect to FIGS. 19A and 19B.
[0069] The temperature values 214 of FIG. 2A may be derived from the respective sets of thermographic image data 208 for a plurality of anatomical parts 109 of a patient's 102 body. Histograms 220 may be created from the sets of thermographic image data 208. The histograms may comprise pixel values 234 of the sets of thermographic image data 208, or other parameters by which the sets of thermographic image data 208 can be quantified. The statistical assessment 218 may include first-order statistical descriptors 222, which in turn may include average 224, standard deviation 225, asymmetry 226, kurtosis 227, energy 228, entropy 229, mode 230, median 231, maximum 232, and statistical range 233. The aforementioned first-order statistical descriptors 222 are described hereinabove in further detail with respect to Table 1. The statistical assessment 218 may also include second-order statistical descriptors 238.
[0070] The second-order statistical descriptors may be derived from co-occurrence matrices 236 of the pixel values 234. The second-order statistical descriptors may include second angular moment 240, contrast 241, correlation 242, variance 243, entropy 244, variances' difference 245, and homogeneity 246. The aforementioned second-order statistical descriptors 238 are described hereinabove in further detail with respect to Table 2.
It should be noted that, as a first-order statistical descriptor 222, entropy 229 may be calculated directly from the temperature values 214, or from the pixel values 234, which may include RGB representations of the temperature values 214. It should be further noted that as a second-order statistical descriptor 238, entropy 244 may be calculated from values of the co-occurrence matrices 236. Variances’ difference 245 may be defined as a difference between variance values from a pair of distinct image samples.
[0071] A co-occurrence matrix, as mentioned above, represents the joint probability that two pixels have intensity values of i and j, respectively, at a distance d, in a given direction. This matrix not only considers information about intensity levels, but also positions of pixels with similar intensity values. FIG. 2B shows a 4x4 matrix 200b with three intensity levels. Next to it, a co-occurrence matrix 200c is displayed, with distance equal to d = 1, a diagonal direction 200d of 135°, a dimension of 3x3, and intensity levels of 0, 1, and 2. The element with a value of 2 at position (1,1) of the co-occurrence matrix indicates that level 0 is next to level 0 in the intensity matrix 200b two times, in the direction 200d mentioned above. The element at position (2,3) relates level 1 being next to level 2 in the same direction, occurring once.
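A direct implementation of this counting is sketched below, with the 135° direction taken as the up-left offset (a convention choice) and an illustrative intensity matrix, since the figure itself is not reproduced here.

```python
import numpy as np

def cooccurrence(intensity, offset, levels):
    """Count, for every pixel pair separated by `offset` (dr, dc), how often
    intensity i co-occurs with intensity j; normalizing element (i, j) by
    the total count yields the joint probability."""
    m = np.zeros((levels, levels), dtype=int)
    rows, cols = intensity.shape
    dr, dc = offset
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[intensity[r, c], intensity[r2, c2]] += 1
    return m

# A 4x4, three-level matrix at distance d = 1 along 135 degrees, taken here
# as the up-left offset (-1, -1); values are illustrative, not from FIG. 2B.
img = np.array([[0, 0, 1, 2],
                [1, 0, 0, 2],
                [2, 1, 0, 1],
                [0, 2, 1, 0]])
print(cooccurrence(img, (-1, -1), levels=3))
```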
[0072] FIG. 3 shows an example embodiment of a method 300 for filtering 352 a preliminary sample 350 of an image 374 for image quality. The preliminary sample 350 may include a live feed 348. The live feed 348 may be provided by a thermographic imaging device or another type of imaging device. An image may be binarized by a processor to simplify analysis. A binarization threshold, for example, may be set to 20. A measure of image quality 354 may include alignment 354a, angle of rotation 354b, height 354c, and/or inclination 354d of a patient 102 with respect to an orientation of the imaging device. A measure of image quality 354 may include a detection of interference 354e, physical or otherwise, within an image area. A measure of image quality 354 may include a distance 354f from a patient 102 to an imaging device. A measure of image quality 354 may include image contrast 354g, background or core temperature 354h, and/or a measure of left-right symmetry 354i. The filtering 352 for image quality may include comparing 362 the preliminary sample 350 with a reference image 364.
[0073] The reference image 364 may include a positional marking 366, and/or an alignment mask 367. The alignment mask 367 may include a space 368 defined between a head, shoulder, and raised arm of a patient 102. The positional marking 366 may include a rectangle displayed and centered over the preliminary sample 350. The top edge of the rectangle may be placed, for example, at a distance from the top of the preliminary sample 350 equal to, for example, 6% of the height of the preliminary sample 350. The height of the rectangle may be equal to, for example, 17% of the height of the preliminary sample 350.
The width of the rectangle may be equal to, for example, 40% of the width of the preliminary sample 350. The filtering process stage 352 may deem the image quality acceptable if, by a shoulder point detection method, the processor 106 determines that both shoulder points are within the rectangle. The shoulder point detection method is described in more detail below in view of image segmentation.
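A minimal sketch of this rectangle test, assuming shoulder points are given as (x, y) pixel coordinates with the origin at the image’s top-left corner; the percentages are those stated above, and the function name is an illustrative assumption.

```python
def shoulders_in_rectangle(shoulder_points, img_width, img_height,
                           top_frac=0.06, height_frac=0.17, width_frac=0.40):
    """True if both shoulder points fall inside the positional marking:
    a horizontally centered rectangle whose top edge sits at 6% of the
    image height, with height 17% and width 40% of the image."""
    top = top_frac * img_height
    bottom = top + height_frac * img_height
    left = (1.0 - width_frac) / 2.0 * img_width
    right = left + width_frac * img_width
    return all(left <= x <= right and top <= y <= bottom
               for x, y in shoulder_points)

# Example with two hypothetical shoulder coordinates on a 640x480 frame.
acceptable = shoulders_in_rectangle([(250, 60), (390, 64)], 640, 480)
```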
[0074] If the filtering 352 process stage determines that image quality is unacceptable according to at least one of the measures of image quality 354, the method 300 may include displaying a message to alert a user that the patient 102 is in an incorrect position, prompting 370 the user to adjust a parameter of the patient 102 or of the imaging device. Another preliminary sample 350 can subsequently be filtered 352 for image quality. If the filtering 352 determines that image quality is acceptable, the method 300 may include displaying a message to alert the user that the patient 102 is in a correct position. An image 374 can subsequently be captured.
[0075] Referring to FIGs. 4A, 4B, and 4C, an example system 400 includes thermal imaging equipment 474, a processor 406, an operator console 478, and a patient monitor 476 as shown in FIG. 4A. In one example, the operator console employs a touchscreen for user input, but a keyboard and/or a mouse may be used in addition to or instead of a touchscreen.

[0076] FIG. 5 is a schematic diagram of an example system 500 for determining a level of risk of a medical condition for a patient 502 that includes an infrared sensor 574, a patient positioning monitor 576, a data acquisition system and terminal 578, a report printer 592, a network interface 584 such as a wireless access point, a network server 586, a data analysis system 588 using, for example, a computer configured to perform data analysis using AI, and an analysis terminal 590 for operating the data analysis system 588 and viewing results. It should be understood that the data analysis system 588 and the analysis terminal 590 may be implemented separately or in a single unit.
[0077] The processor 506 may be a stand-alone or embedded unit, or implemented in any suitable computing device that uses a processor 506 to execute computer program instructions and includes an interface to the infrared sensor 574 to collect image data. The computing device may operate in its own console or may be integrated with other components of the system. In one example, the system 400 is mounted in a unit having wheels 496 for transport and a wheel lock to secure its position (not pictured).
[0078] In an example implementation, the system 400 is capable of adjustable vertical positioning of the thermal imaging equipment 474, as well as rotational and lateral positioning 480 at a selected angle to facilitate imaging of a patient 402 as shown in FIG. 4B. FIG. 4B illustrates operation of an example system from a patient’s 402 perspective and FIG. 4C illustrates operation from the operator’s 403 position. As shown in FIGS. 4B and 4C, the system 400 images patients 402 positioned in front of the system 400 under control of the operator 403.
[0079] The operator console 478 includes at least one monitor to allow the operator 403 to operate the system 400. The patient monitor 476 displays information relevant to the patient 402 and the capture and processing of the patient images. The system 400 may include an ambient temperature control system to facilitate image capture (not pictured). The ambient temperature control system may be implemented in software executed by the processor 406, within or separately from the operator console 478.
[0080] As shown in FIGs. 4A, 4B, and 4C, the patient 402 sits within a field of view of the IR camera 474. It should be noted that the terms “thermal imaging equipment” and “IR camera” are herein used interchangeably, and may include components directed to the physical adjustment of IR camera orientation. The operator 403 adjusts the IR camera 474 while monitoring the image detected by the IR camera 474 using the patient positioning monitor 476. The operator 403 captures a set of images sufficient to analyze relevant areas of the patient’s 402 body.
[0081] When the system 400 acquires image data from the IR camera 474, for each of several imaging views or conditions, the operator 403 monitors each view and performs adjustments to the IR camera 474 position to obtain suitable images. The imaging views or conditions may include angles of incidence such as frontal, left oblique, and right oblique. The imaging views or conditions may also include images wherein the subject part of a patient’s 402 body is stimulated, for example by application or removal of heat. Such intentional temperature changes may be effected through example means such as application of moisture to the subject part of a patient’s 402 body, or by actuation of the ambient temperature control system.
[0082] Images taken while the subject part of a patient’s body is stimulated may be referred to as functional, while corresponding images taken without stimulation may be referred to as basal. The operator 403 may then be provided with imaging functions to select sections of the image of the subject part of patient’s 402 body for detailed analysis. The process of selecting sections of the image may be referred to as segmentation. The imaging functions may be accessible via a graphical user interface (GUI) with GUI objects such as buttons, menus, etc. In some embodiments, the subject parts of a patient’s 402 body may be the patient’s breasts, especially if the system 400 is employed in the detection of breast cancer. Once suitable images and image sections are obtained, the system 400 analyzes the image data to obtain information regarding any thermal abnormalities that may correlate with a manifestation of a medical condition such as cancerous tissue.
[0083] In an example implementation, the system software pre-processes the image data to obtain statistical parameters described above as first-order and second-order statistical descriptors. The data is then processed using a predictive model trained using historical data collected from patients having varying degrees of abnormal tissue. Further training may be performed using any new data. Results may then be provided to show the condition of the subject tissue corresponding to the image data collected by the system.
[0084] The first-order statistical descriptors are characteristics derived from a histogram of the thermographic image. For example, a histogram H(i) may be defined to represent the frequency at which a thermographic image includes a specific pixel value. The variable i represents the value of the pixel, which in turn represents the color intensity of the pixel location in the image, having a resolution based on the data width of the system (8 bits, 12 bits, 16 bits, etc.). In this example, the histogram may be processed as a probability density function that defines p(i) as the probability of the occurrence of a specific pixel value in the image, p(i) = H(i)/T, where T is the number of pixels in the image.
[0085] A characterization function may receive the thermographic image data from the IR camera 474 to obtain a desired set of statistical, morphological, and mathematical characteristics. These characteristics are ultimately compared with corresponding characteristics learned by an AI engine from a set of historical data. The historical data may include a set of statistical parameters such as mean, standard deviation, variance, or other similar parameters. The historical data may even include the whole matrix of temperatures as a set of characteristics. This historical data may be obtained, for example, from six different images of a sample group or sample population of patients. The images may include, for example, a basal frontal section, a functional frontal section, two oblique basal (right and left) sections, and two oblique functional (right and left) sections. In an example implementation, these six image sections correspond to a set of images captured in a patient scan.
[0086] Characterization of temperature data from the obtained thermographic images may begin by performing a linear transformation on the temperature data to enable the data to be displayed by pixels functioning on eight or more bits. Several example implementations of the transformation may be used. One example solution is to identify the maximum and minimum temperatures of the basal and functional images of each patient for all patients in the sample population. The values are rounded to their nearest integer, after which, 0.5°C is added to values representing a maximum temperature within each image, and 0.5°C is subtracted from values representing a minimum temperature within each image.
[0087] An alternative example solution is to determine maximum and minimum average temperatures of the basal and functional images of each patient for all patients in the sample population. As with the first method, the values are rounded to their nearest integer, 0.5°C is added to values representing a maximum temperature within each image, and 0.5°C is subtracted from values representing a minimum temperature within each image. Two more example solutions would follow similar respective procedures to the two solutions described above, except that instead of respectively adding 0.5°C to or subtracting 0.5°C from the maximum or minimum temperature values, the standard deviation would be calculated for the maximums and minimums for the basal and functional images of all patients in the sample population and used as the deviation value. It should be noted that basal images may also be referred to as baseline images.

[0088] The solutions described above are used to calculate the terms T1, T2, L1, and L2, which are then used in the following relationship to perform the desired linear transformation:
$$P = L_1 + \frac{(T - T_1)(L_2 - L_1)}{T_2 - T_1}$$

where:
• P: pixel value of interest for the corresponding temperature value from the temperature matrix
• T: temperature value of interest from the temperature matrix
• T1: minimum temperature value minus a deviation value of 0.5°C
• T2: maximum temperature value plus a deviation value of 0.5°C
• L1: low end of pixel range (e.g. 0)
• L2: high end of pixel range (e.g. 255 for 8-bit pixels)
It should be noted that respective sets of the above described minimum and maximum values are calculated separately for the sets of basal and functional images.
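A sketch of this linear transformation under the assumptions above; the function name and the example temperature bounds are illustrative, not part of the disclosure.

```python
import numpy as np

def temperatures_to_pixels(temps, t1, t2, l1=0.0, l2=255.0):
    """Linearly map a temperature matrix onto the pixel range [l1, l2].
    t1/t2 are the adjusted minimum/maximum temperatures (rounded, then
    widened by the deviation value described above), computed separately
    for the basal and functional image sets."""
    temps = np.asarray(temps, dtype=float)
    p = l1 + (temps - t1) * (l2 - l1) / (t2 - t1)
    return np.clip(np.rint(p), l1, l2).astype(np.uint8)

# Example: map a small hypothetical temperature matrix to 8-bit pixels.
t_matrix = np.array([[30.2, 31.7], [33.4, 35.9]])
pixels = temperatures_to_pixels(t_matrix, t1=29.5, t2=36.5)
```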
[0089] In some embodiments, especially in breast cancer detection applications, histograms may be constructed from thermographic image data for each breast of a patient. Example characteristic parameters that may be determined from the histogram of each breast include the following first-order statistical descriptors, previously listed in Table 1:
• Average
• Standard deviation
• Asymmetry
• Kurtosis
• Energy
• Entropy
The above descriptors may be supplemented with the following descriptors, obtained not from the histogram but directly from the pixel distribution (a computational sketch follows the list):
• Mode
• Median
• Maximum
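A minimal NumPy sketch computing the listed descriptors, treating the histogram as the probability density p(i) = H(i)/T described above. The bin count, dictionary keys, and function name are illustrative, and a non-constant region is assumed.

```python
import numpy as np

def first_order_descriptors(pixels, bins=256):
    """First-order descriptors of a region's pixel values."""
    pixels = np.asarray(pixels, dtype=float).ravel()
    hist, edges = np.histogram(pixels, bins=bins)
    p = hist / pixels.size                      # p(i) = H(i)/T
    centers = (edges[:-1] + edges[1:]) / 2.0    # representative bin values
    mean = float(np.sum(p * centers))
    var = float(np.sum(p * (centers - mean) ** 2))
    std = var ** 0.5                            # assumes a non-constant region
    nz = p[p > 0]
    vals, counts = np.unique(pixels, return_counts=True)
    return {
        "average": mean,
        "standard deviation": std,
        "asymmetry": float(np.sum(p * (centers - mean) ** 3)) / std ** 3,
        "kurtosis": float(np.sum(p * (centers - mean) ** 4)) / var ** 2,
        "energy": float(np.sum(p ** 2)),
        "entropy": float(-np.sum(nz * np.log2(nz))),
        # Obtained directly from the pixel distribution:
        "mode": float(vals[np.argmax(counts)]),
        "median": float(np.median(pixels)),
        "maximum": float(pixels.max()),
        "statistical range": float(pixels.max() - pixels.min()),
    }
```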
[0090] Co-occurrence matrices may be defined to indicate a number of instances in which pixels having the same value are adjacent to one another in 0°, 45°, 90°, and 135° directions. The second-order descriptors may be obtained from such a set of four matrices as shown above in Table 2. Particularly useful second-order descriptors may include second angular moment, contrast, correlation, variance, entropy, variances’ difference, and homogeneity.

[0091] First- and second-order descriptors of the thermographic data may be normalized, to enable future comparison with historical data, using the following relationship:
$$x' = \frac{x - \bar{x}}{s}$$

where:
• x′: new normalized value of the given descriptor
• x: original value of the given descriptor
• x̄: mean of the given descriptor
• s: characteristic standard deviation
The normalized historical data set may be introduced to a processor-implemented method that performs a principal component analysis (PCA) to reduce the data from n dimensions to k dimensions. In an example implementation, a covariance matrix is calculated using
$$\Sigma = \frac{1}{m}\sum_{i=1}^{m} x^{(i)} \left(x^{(i)}\right)^{T}$$
Then the eigenvalues and eigenvectors of the covariance matrix are computed using

$$[U, S, V] = \mathrm{svd}(\Sigma)$$
where svd is a singular value decomposition function. The smallest value of k is then selected such that
$$\frac{\sum_{i=1}^{k} S_{ii}}{\sum_{i=1}^{n} S_{ii}} \geq var$$
where var represents a desired percentage value for the variance, which in an example implementation may be from 90% to 99%. A variance analysis is then performed to obtain a new projection to fewer descriptors, with the restriction of obtaining the minimum number of characteristics that allow retention of 99% of the variance of the sample data of a training sample comprising approximately 80% of the sample size. A processor is then configured as a classifier to receive, in vector form, the characteristics obtained from the PCA of the full vector comprising the full data set. The classifier in an example implementation may include at least the following classifiers:
• Anomaly detector
• Neural networks
• Support vector machine
• Logistic regression module
These classifiers may be used to obtain a probability that a patient belongs to a predefined class (e.g., healthy, at risk of breast cancer, abnormal, etc.). This probability may be presented as a percentage.
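The normalization and dimensionality-reduction stages above might be sketched as follows. The descriptor matrix is hypothetical, np.linalg.svd plays the role of the svd function in the relationship above, and the function names are illustrative.

```python
import numpy as np

def normalize(X):
    """z-score each descriptor column: x' = (x - mean) / s."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

def pca_reduce(X, var=0.99):
    """Project n descriptors onto the smallest k dimensions retaining
    the requested fraction of variance, via SVD of the covariance."""
    m = X.shape[0]
    sigma = (X.T @ X) / m                       # covariance of z-scored data
    U, S, _ = np.linalg.svd(sigma)
    ratio = np.cumsum(S) / np.sum(S)            # retained-variance curve
    k = int(np.searchsorted(ratio, var)) + 1    # smallest k meeting the target
    return X @ U[:, :k], k

# Hypothetical 38-descriptor matrix for 100 training samples.
X = normalize(np.random.default_rng(0).normal(size=(100, 38)))
Z, k = pca_reduce(X, var=0.99)
```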
[0092] FIG. 6 is a schematic block diagram depicting an example processor-implemented method 600 for segmentation of images of anatomical parts 609 of a patient’s 502 body. The anatomical parts may include at least a portion of a patient’s 502 breast 611, especially in breast cancer detection applications. The anatomical parts 609 may be defined by medical knowledge 607, which may include information about regions 605 commonly affected by a given medical condition. Such regions 605 may include areas local to or adjacent to the breast, or areas on the neck or in the armpits. The anatomical parts 609 may be defined by the processor 506 automatically generating image masks 617. These image masks serve as a starting point for segmentation and may include a combination of points 619, lines 621, and curves 623 strategically placed according to a plurality of AI rules 616.
[0093] The AI rules 616, which may also be referred to as AI machine vision rules, may include various edge detection processes, such as detecting a bottom contour of a breast 613; image processing may likewise include detecting the bottom contour of a breast 613. The method 600 may subsequently include adjusting 623 positions of points 619, lines 621, and curves 623 along the bottom contour of a breast 613 or along the presently defined perimeter of the image mask 617 to better match 631 an anatomical part 609 unique to a patient’s 502 body. The method may likewise include adding 625 points 619, lines 621, and curves 623 to, or removing 627 points 619, lines 621, and curves 623 from, the bottom contour of a breast 613 or the presently defined perimeter of the image mask 617 to better match the anatomical part 609 to the patient’s 502 body. If the match 631 is not sufficient, further adjustments 623, additions 625, or removals 627 can be made. If the match 631 is sufficient, the segmented image 633 of the anatomical part 609 can be captured.
[0094] FIG. 7 depicts anatomical guides 700 that may be displayed to the operator 403 on the patient monitor 476. The anatomical guides 700 allow the operator 403 to select from six different images for analysis. These images are functional 701a and basal 701b images of the patient’s right oblique 704a, frontal 704b, and left oblique 704c views. The user interface then may allow the operator to perform imaging, maintenance, or calibration functions such as for example, infrared record capture, isotherm modification, focusing, sensor calibration, infrared sensor connection and disconnection, and reading and verification of a sensor serial number to verify its compatibility with an authorized sensor database.
[0095] An example embodiment of a processor-implemented method for automatically generating image masks 617 is described as follows. The processor 506 may be configured to binarize a captured image similarly to the binarization performed during filtering 352 as described above. The processor may be further configured to dilate the binarized image by 2 pixels, and may store the resulting image in memory. The data values of the original binary image may be subtracted from corresponding values of the dilated image to obtain a contour. Such a contour may be stored in memory as an external border image for each angle from which original images are taken, including frontal, right oblique, and left oblique. For oblique images, the binarization, dilation, and subtraction process stages may be performed for an image of only one side, and a mirror image may be created for the complementary side.
[0096] The example method for automatically generating image masks 617 continues as follows. A set of internal borders is found by applying the Canny Edge Detection method, a process known in the art, to a grayscale conversion of the original RGB image, and the result is stored. Pre-defined portions of top and bottom sections of the internal border image are removed. These portions may be, for example, 60% for the top, and 12% for the bottom.
The image that results from the removal of these portions contains the region where most of a breast is expected to appear in any patient. Data representing these internal borders is subsequently added to corresponding data for the previously determined external borders, creating a reference image. Again, for oblique images, this process is performed for one side, with a mirror image created for the complementary side.
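A sketch of the binarize–dilate–subtract external contour and the trimmed Canny internal borders, written with OpenCV. The Canny hysteresis thresholds are illustrative assumptions, as the disclosure does not specify them; the binarization threshold of 20 is the example value given earlier.

```python
import cv2
import numpy as np

def external_border(gray, thresh=20):
    """Binarize, dilate by 2 pixels, and subtract the original binary
    image from the dilated one, leaving the external contour."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    kernel = np.ones((3, 3), np.uint8)
    dilated = cv2.dilate(binary, kernel, iterations=2)
    return cv2.subtract(dilated, binary)

def internal_borders(gray, top=0.60, bottom=0.12):
    """Canny edges with the pre-defined top and bottom portions removed,
    keeping the region where most of a breast is expected to appear."""
    edges = cv2.Canny(gray, 50, 150)        # hysteresis thresholds assumed
    h = edges.shape[0]
    edges[: int(top * h), :] = 0
    edges[h - int(bottom * h):, :] = 0
    return edges

# Reference image: external borders plus trimmed internal borders.
# reference = cv2.bitwise_or(external_border(gray), internal_borders(gray))
```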
[0097] The method continues through a process of armpit and shoulder point detection as follows. Having the binary image as a base, a point by point sweep may be performed in the top line of the image, from left to right. Values different from zero may be detected and clustered. Three different clusters should be found, corresponding to the arms and neck of the patient. Coordinates of the midpoint of each cluster may then be calculated. If the expected three clusters are not present such that their midpoints cannot be calculated, the system may display a “non-valid image” message.
[0098] With the coordinates of the three midpoints now known, the distance between the central cluster midpoint and one selected side midpoint may be calculated. The distance may be stored in variable “A”. Point A may then be created below the central midpoint, at a distance equal to the distance “A”. With point A, the central midpoint, and the selected side midpoint, a right triangle is defined by lines connecting each point. A point by point sweep may be performed from bottom to top of the defined right triangle area, searching for the first point equal to zero. When found, the coordinates of the first zero point may be stored. This point defines a shoulder point on a selected side.
[0099] The same process may be performed to find and store coordinates of the opposite side shoulder point. A point by point sweep from left to right, and from the top to 30% of the image may then be performed to define the right external border coordinates. Next, another point by point sweep, this sweep moving from right to left from the top to 30% of the image, may be performed to define the left external border coordinates. The distances between a selected shoulder point and each one of the corresponding external border points may then be calculated. The coordinates of the point that is at the shortest distance may be stored as the corresponding side armpit point. The previous process may be repeated to find the coordinates of the opposite armpit point.
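The top-line clustering and the shortest-distance armpit rule might be sketched as follows; the right-triangle sweep for the shoulder point itself is omitted for brevity, and the function names are illustrative.

```python
import numpy as np

def top_row_cluster_midpoints(binary):
    """Sweep the top line of the binary image from left to right,
    cluster the non-zero runs, and return each cluster's midpoint
    column. Three clusters are expected: two arms and the neck."""
    cols = np.flatnonzero(binary[0])
    if cols.size == 0:
        return []                              # would trigger "non-valid image"
    breaks = np.where(np.diff(cols) > 1)[0] + 1
    return [int(run.mean()) for run in np.split(cols, breaks)]

def nearest_border_point(shoulder, border_points):
    """The armpit point is the external-border point at the shortest
    distance from the corresponding shoulder point."""
    sx, sy = shoulder
    distances = [np.hypot(x - sx, y - sy) for x, y in border_points]
    return border_points[int(np.argmin(distances))]
```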
[00100] FIG. 8 is a depiction of the above calculated points being employed for automatic segmentation of an image. The armpit 839 and shoulder 841 points, along with the external 837a and internal 837b borders, may now be used to perform the automatic phase of image segmentation. Taking the Canny border images as a base, a point by point sweep may be performed from left to right, from right to left, and from top to bottom to detect the left and right external borders 837a. The coordinates of those borders may be stored in a new vector.

[00101] The borders’ coordinate vectors may then be trimmed, for example, by 30% from both ends. The distances between each point of both trimmed borders may then be measured. The average distance may be calculated and halved to find the midpoint of the image. From this point, a symmetry axis for the image is created. The whole Canny image may be divided by the symmetry axis and stored in two half images 800. Taking one half image 800 as a base, the image may be trimmed, for example, by 65% from the top and 8% from the bottom. Starting from a line of the image that is above the bottom of the image by, for example, a measure of 15%, a point by point sweep may be performed to detect the external vertical border 837a. The remaining borders are the horizontal internal borders. The coordinates and the length of each horizontal internal border may be stored and sorted by size. The horizontal borders may be used to generate a series of second-degree polynomials that describe a vertical parabola 837c. As many parabolas may be generated as horizontal internal borders are found. A parabola 837c may be chosen that best adjusts to a series of fixed parameters that best describe the lower curve of a breast. The parameters may include vertex, focus, and width.
[00102] Once the points that describe the chosen parabola 837c are obtained, intersection coordinates are detected between the parabola 837c and the external vertical border 837a (point 1 819a), as well as between the parabola 837c and a vertical axis 835 defined as being parallel to the symmetry axis and separated from it by a distance equal to, for example, 2% of the image width (point 2 819b). The coordinates of points 1 819a and 2 819b may be stored in a coordinates vector. With points 1 819a and 2 819b, the two additional points (points 3 819c and 4 819d) that are needed for a 4-point Bezier curve may be calculated. This is an optimization cycle that outputs coordinates of two points (points 3 819c and 4 819d) that generate a curve that best adjusts to the parabola 837c.
[00103] The coordinates of points 3 819c and 4 819d may be stored in a coordinates vector. From the coordinates of the shoulder point 841 and the corresponding armpit point 839, the coordinates of a midpoint between them (point 5 819e) may be calculated. From the coordinates of point 5 819e, a point may be projected over the vertical axis at a distance equal to, for example, 2% of the image's height below its original coordinates in the vertical axis (point 6 819f). From the coordinates of points 5 819e and 6 819f, the coordinates of a midpoint between them (point 7 819g) may be calculated. Finally, the external vertical border 837a vector may be divided into 4 equal segments, limited by the armpit point 839 and the intersection with the parabola 837c (point 1 819a). The coordinates of the 3 points that define the limits of the inner segments may be stored as point 8 819h, point 9 819i, and point 10 819j respectively. A region of interest (ROI) 843 may be defined by the following 3 curves and 3 lines (an illustrative Bezier sampling sketch follows the list):
• Bezier curve 1, defined by points 1, 2, 3, and 4
• Bezier curve 2, defined by points 1, 9, and 10
• Bezier curve 3, defined by points 5, 8, and 9
• Line 1, defined between points 5 and 7
• Line 2, defined between points 6 and 7
• Line 3, defined between points 6 and 4.
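The 4-point Bezier evaluation used for curve 1 may be sketched as follows; all coordinates are hypothetical, and curves 2 and 3, each defined by three points, would use the quadratic analogue.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=100):
    """Sample a 4-point (cubic) Bezier curve at n parameter values."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Hypothetical coordinates: curve 1 runs from point 1 to point 2, with
# points 3 and 4 as the optimized interior control points.
pt1, pt3, pt4, pt2 = (np.array(p, dtype=float)
                      for p in ([120, 300], [105, 335], [75, 338], [60, 310]))
curve_1 = cubic_bezier(pt1, pt3, pt4, pt2)
```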
[00104] An example user interface of an image processing function is shown in FIG. 9A. The thermal imaging interface 945 in FIG. 9A may be used to automatically generate image masks 917a, 917b from ROIs 943. In some implementations, images corresponding to a patient’s 402 left breast, right breast, left armpit, right armpit, left groove, right groove, left nipple, right nipple, left half of the neck, and right half of the neck may be sectioned and separated from an original image, which is referred to here as segmentation. The images shown can be modified, adjusted, and repositioned by the operator 403 using a computer graphical interface. An image segmentation method may include defining a vertical axis of symmetry of the image, and establishing points that define edges of an anatomical part 609 of a patient’s 402 body. An automated processor-implemented method defines up to a predetermined number of points 919 and curves 923 to define the section for the region of interest 943. The number of points 919 and curves 923 defined may vary for different implementations, but in one example, ten points and two curves may be defined. Once a first mask 917a is adjusted to match an anatomical part 609 of a patient’s 402 body, a second mask 917b may be automatically adjusted to mirror the first adjusted mask 917a. The overlaid image in FIG. 9A depicts operation of the automated segmentation process. The underlaid images in FIG. 9A depict images prior to segmentation, but are also capable of depicting images after segmentation wherein the adjusted masks for the regions of interest are shown in full brightness with surrounding areas darkened.
[00105] Another example user interface of an image processing function is shown in FIG. 9B. Within the thermal imaging interface 945 in FIG. 9B, an ROI 943 is shown as being defined by points 919 and curves 923.
[00106] The example user interface of FIG. 10A displays segmented images for a plurality of parts 1009 of a patient’s 502 body. The segmented images are shown in full brightness with surrounding areas darkened. The six images shown correspond to the six example imaging angles of FIG. 7.
[00107] FIG. 10B shows an example user interface displaying grayscale representations of the same images of a plurality of parts 1009 of a patient’s 502 body as in FIG. 10A. Hot zones 1094, which may be identified by AI image analysis, are shown in FIG. 10B.
[00108] FIG. 11 is an example of the process shown in FIG. 9A, with grayscale representations of segmented images 1133. The segmented images of the regions of interest may be stored in a database. Once the segmentation of the patient's areas of interest is performed, the system may automatically obtain the temperature, maximum, minimum, first- and second-order statistical descriptor values, as well as other image characteristics that would be analyzed using an AI implementation previously trained with historical data including positive cases documented by mammography and confirmed histopathologically through a biopsy.
[00109] Two example methods for obtaining characteristics of the image will now be described. The first method begins by defining first-order statistical descriptors, including average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and range, from the image histogram for pixel intensity, for a total of 10 characteristics. Then, for the whole of the image, the co-occurrence matrices in the four angular directions (0°, 45°, 90°, and 135°) are obtained. For each co-occurrence matrix, seven second-order statistical descriptors, including second angular moment, contrast, correlation, variance, entropy, variances’ difference, and homogeneity, are obtained, for a total of 38 characteristics.

[00110] The second method is a machine learning method called “End to End.” In this method, an entire set of image data for each region of interest is entered into a convolutional neural network that treats each pixel as a particular characteristic, from which other characteristics between pixels are derived by convolution. This is called a background acquisition of statistical variables.
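A minimal PyTorch reading of the “End to End” approach; the layer sizes, depth, and class count are illustrative assumptions rather than a disclosed architecture.

```python
import torch
import torch.nn as nn

class EndToEndROIClassifier(nn.Module):
    """Every ROI pixel enters as an input characteristic; convolutions
    derive further characteristics between pixels (sizes illustrative)."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example: classify a batch of single-channel ROI images.
logits = EndToEndROIClassifier()(torch.randn(8, 1, 128, 128))
```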
[00111] FIG. 12 depicts a user interface 1200 for classification and medical evaluation. A doctor, or diagnostician, may use this interface 1200 to observe results of an automatic classification performed by the system. The doctor may make annotations for observations 1298a and enter subjective variables 1298b for the issuance of a medical recommendation. A report may be generated from the interface 1200. Alternatively, results of an automatic classification may be provided through a user interface, such as the user interface 1200 of FIG. 12, by being displayed and presented to a user on a computer monitor or similar display device, visual or otherwise. This computer monitor may be the operator console 478 of FIGS. 4A-4C, the patient monitor 476 or patient positioning monitor 576 of FIGs. 4A-4C and FIG. 5 respectively, or the data acquisition system and terminal 578 or the analysis terminal 590 of FIG. 5, or any other suitable display device within an embodiment.
[00112] An example report 1349 is shown in FIG. 13. A report 1349 may include the patient's 402 clinical data 1312, credentials 1396 of the operator and the physician issuing the recommendations, as well as the images 1333, relevant patient temperature measurements 1399, hot zones 1094 detected (not pictured in this example), medical observations 1398, and a tissue classification 1310 as a percentage from 0 to 100% based on any detected temperature abnormalities associated with the presence of a metabolically active tumor.
[00113] FIG. 14A depicts an example subjective analysis section 1496 that may be present in a report 1349. Observations can include areola symmetry, thermovascular network characteristics, and evidence of lumps in the breast profile. Separate entries for left and right breasts may be provided. FIG. 14B shows an example temperature comparison 1447b between left and right breasts that may be included in a report 1349. The example imaging angles of FIG. 7 can be selected and compared in terms of maximum and average temperature in FIG. 14B. FIG. 14C shows an example temperature comparison 1447c between left and right areolas that may be included in a report 1349. The example imaging angles of FIG. 7 again may be used.
[00114] FIG. 15 depicts several example medical conditions 1551 that may be detected through use of a method or system 1500 for determining risk of a medical condition according to the present disclosure. Breast cancer 1551a, fibrosis 1551b, mastitis 1551c, duct ectasia 1551d, and adenosis 1551e may be so detected.
[00115] FIG. 16 offers a high-level depiction of an example method for training AI components of the claimed systems and methods to automatically classify image data with respect to thermal anomalies that may be present in the data. Image data 1653 may first be generated from thermal images of a training subject’s body. Such data may include RGB and grayscale depictions of full images as well as ROIs. First- and second-order statistical descriptors 1655 may then be calculated according to the method 200 of FIG. 2 for each of the four data types (RGB full image, RGB ROI, grayscale full image, grayscale ROI). Normalization and data visualization 1657 may then be performed for each data type, exemplifying a first filtering process stage wherein image data that is not viable for training is excluded from the training data set. A second filtering process stage may follow, wherein data outside the normal distribution for each data type is excluded 1659 from the data set. An ANOSIM 1661 may then be performed for each dataset to assess the linear independence of each statistical descriptor 1655. Descriptors 1655 with the highest differentiation degree 1663 may then be selected for each data type. Training and validation universes 1665 may then be created for each data type. Next, training 1667 may be performed for each data set, followed by a validation and evaluation process stage 1669, wherein correctly and incorrectly classified datasets are identified. Incorrectly classified datasets may then be analyzed 1671 by searching for patterns. A comparison 1673 is then performed to investigate correlation between incorrect classifications and clinical data. Stages 1659 through 1671 may then be repeated 1675 with a modified differentiability threshold until classification error is minimized.
[00116] FIG. 17 shows an example method 1700 for detecting depictions of space between a patient’s head, shoulder, and raised arm within a set of image data. Such a detection may be used in either an alignment procedure prior to capturing and saving image data, or in segmenting images that have been previously saved. The method begins with capturing an image 1777, which may include either saving a set of image data, or simply being connected to a live stream from an imaging device. Image data may then be binarized 1779 so that holes between a patient’s head, neck, shoulder, and raised arm may be detected 1781. Existence of holes may be queried 1783. If the holes are not found to exist, a message to that effect may be displayed 1785. If the holes are found, a lower point 1787 may be detected in each hole. The location of each lower point 1787 may then be queried 1789 with respect to a rectangle superimposed upon the image. If at least one of the holes is not found to be located within the rectangle, a message to that effect may be displayed 1791. If the holes are found within the rectangle, an affirmative message may be displayed 1793.
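The hole-detection stage might be sketched with SciPy connected-component labeling, assuming the patient silhouette is non-zero in the binarized image; enclosed background regions are the holes, and the lowest point of each is returned. The function name is illustrative.

```python
import numpy as np
from scipy import ndimage

def hole_lowest_points(binary):
    """Label enclosed background regions ('holes'), such as the space
    bounded by head, shoulder, and raised arm, and return the lowest
    point (largest row index) of each hole as (x, y)."""
    labels, n = ndimage.label(binary == 0)
    edge = np.concatenate([labels[0], labels[-1], labels[:, 0], labels[:, -1]])
    outer = set(np.unique(edge))               # background touching the border
    points = []
    for lab in range(1, n + 1):
        if lab in outer:
            continue                           # not enclosed: not a hole
        ys, xs = np.nonzero(labels == lab)
        i = int(np.argmax(ys))
        points.append((int(xs[i]), int(ys[i])))
    return points
```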
[00117] FIG. 18A offers an alternative high-level example method 1800a for training AI components of the claimed systems and methods to automatically classify image data with respect to thermal anomalies that may be present in the data. Data may first be generated 1895a from thermal images for a number of data types, including RGB, grayscale, ROI RGB, and ROI grayscale. Output labels may then be created 1895b for different views, including frontal, left oblique, and right oblique. Universes for training, testing, and validation may then be created 1895d. A machine learning model may subsequently be developed 1895e by testing and validating classifications performed by the AI components. Model validation and evaluation 1895f may then follow.
[00118] FIG. 18B shows a high-level example method 1800b for classifying image data using AI components. Data may first be generated 1895a from thermal images for a number of data types, including RGB, grayscale, ROI RGB, and ROI grayscale. Output labels may then be created 1895c for different quality parameters of the thermal images, including height, focus, rotation, position of a patient’s arms, lateral displacement, patient tilt, and distance between the patient and the imaging device. Universes for training, testing, and validation may then be created 1895d. A machine learning model may subsequently be developed 1895e by testing and validating classifications performed by the AI components. Model validation and evaluation 1895f may then follow. Classification and error analysis 1895g may then take place, followed by an efficiency evaluation 1895h of the model.
[00119] FIG. 19A depicts an example method 1900a of training the AI components of the claimed systems and methods, which will now be described. This series of training process stages establishes inference rules that will be used to automatically classify image data with respect to thermal anomalies that may be present in the data, and, therefrom, to determine a level of risk for a medical condition. Thermal image data for one or more training subjects may be present in a database 1997a and may be initially processed or segmented 1997b. A first filtering process stage 1997c may include removal of data for thermal images from the database 1997a that may not be viable for inclusion in a training data set, due to the presence of exclusion parameters in the corresponding clinical record. Exclusion parameters may include, for example, indications of recent total or partial mastectomy, indications of recent surgical interventions on the thorax area, indications of previous breast cancer detection, absence of a clinical record, or absence of mammography results.
[00120] Using an identification number (ID) for each training subject, extraction of the data may be performed, and clusters of data 1997d, 1997e, 1997f, 1997g may be created with respective labels. A first example approach may use a binary classification system, meaning that every input will be assigned to a class A (positive for a thermal anomaly) or a class B (negative for a thermal anomaly). The output may be a membership value which, if surpassing a typical threshold of 0.5, will be assigned to class A, and will otherwise be assigned to class B.
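The binary membership rule reduces to a one-line threshold test; the 0.5 value is the typical threshold named above, and the class labels follow the A/B convention of this paragraph.

```python
def assign_class(membership, threshold=0.5):
    """Class A (positive for a thermal anomaly) when the membership
    value surpasses the threshold; otherwise class B (negative)."""
    return "A" if membership > threshold else "B"

assert assign_class(0.73) == "A" and assign_class(0.41) == "B"
```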
[00121] In a second approach the system may perform a multiclass classification, based on the Birads scale. Labels may be generated for classes in Birads I, Birads II, Birads III, Birads IVa, Birads IVb, Birads IVc, Birads V, including a subsystem to classify the previously assigned Birads IV within its different subclassifications.
[00122] A second filtering process stage 1997h may proceed as follows. With the created clusters of data 1997d, 1997e, 1997f, 1997g, which contain data for thermal images in RGB and grayscale, as well as a temperature matrix from the ROIs, and clinical data, a plurality of statistical descriptors may be calculated. An objective of the calculation may be to enable subsequent elimination of data from the edges of the distributions. Another objective may be to evaluate the similarity between data sets by performing a correlation analysis of the statistical descriptors and an ANOSIM in two or more clusters 1997d, 1997e, 1997f, 1997g to select statistical descriptors with more linear independence.
[00123] The aforementioned objectives of the statistical descriptor calculations will support Deep Neural Network (DNN) models 1997i in generalizing the knowledge as inference rules 1997j for the new input data from the patients on the inference stage. The DNN models may be mathematical functions that can learn from data, such as image or text data, to define an n-dimensional decision boundary between n descriptors from given data sets. Predictive accuracy of the DNNs can increase when the DNNs are provided with more data. The statistical descriptors, as previously described herein, may include average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, minimum, range, angular second moment, contrast, correlation, and homogeneity. The statistical descriptors may further include a measure of dissimilitude. Dissimilitude may be demonstrated by a correlation matrix composed of distance measurements such as Euclidean distances between values of respective descriptors.

[00124] A method of training a supervised learning system may include a data set for training and a data set for validation, with their respective labels (or classes) which are presented to the system. The data input may interact with inference rules, which may take forms of weights and biases. A weight may be defined as the slope of a decision boundary, and a bias may be defined as an offset of a decision boundary. Weights and biases may be adjusted to reduce classification errors in the following way: inputs given to the system may be evaluated, a global classification error may then be calculated, and an optimization function may then be used to update the weights and the biases, thus updating the inference rules.
[00125] After the weights and the bias are updated, a similar process may be applied to the validation dataset, without updating the weights and the bias. This process may repeat until an error near zero is obtained, where almost 100% of the input data is classified correctly in both the training and validation universes.
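One hedged PyTorch reading of the loop described in the two preceding paragraphs; the loader names, loss function, and epoch count are assumptions, and early stopping on near-zero validation error is left implicit.

```python
import torch

def train_supervised(model, loss_fn, optimizer, train_loader, val_loader,
                     epochs=100):
    """Evaluate inputs, compute a global classification error, and let
    an optimization function update the weights and biases (the
    inference rules); validation measures error without updates."""
    for _ in range(epochs):
        model.train()
        for x, y in train_loader:
            optimizer.zero_grad()
            loss_fn(model(x), y).backward()    # global classification error
            optimizer.step()                   # weights and biases updated
        model.eval()
        with torch.no_grad():                  # no updates on validation data
            val_error = sum(loss_fn(model(x), y).item()
                            for x, y in val_loader)
    return model
```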
[00126] As many supervised systems may be trained as there are data sets available, obtaining a plurality of inference rules 1997j as outputs, which may in turn be used as inputs to a final supervised training system 1997k to obtain a model of final inference rules 1997l for the classification. An advantage over previous systems, which could present an undesirable output for a certain dataset, is that an undesirable output can be complemented with the output of other, more adequate systems that will diminish the prediction error.
[00127] FIG. 19B shows an example method 1900b similar to the training method 1900a, as applied to classification of thermal image data as positive or negative regarding any presence of thermal abnormalities. The aforementioned similarity lies in an initial application of inference rules 1997j to obtain a plurality of results 1997m, followed by application of final inference rules 1997n to arrive at a final classification 1997o with minimal classification error.
[00128] The inference rules 1997j may be implemented as sets of inference rules, wherein a set of inference rules may include one or more inference rules. FIG. 19B shows n sets of inference rules, labeled respectively as “Inference Rules 1,” “Inference Rules 2,” and “Inference Rules n.” The quantity n of sets of inference rules may encompass, for example, two sets of inference rules, four sets of inference rules, ten sets of inference rules, or any other number of sets of inference rules that can be stored and executed properly on the system. Multiple sets of inference rules may thus be applied in parallel to the data being classified, such that results of the plurality of results 1997m are mutually independent. Such mutual independence contributes to minimization of classification errors when final inference rules 1997n are applied to the plurality of results 1997m. It should be understood that inference rules described herein are examples of AI rules, and may be employed at least as AI classification rules.
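A compact sketch of applying n sets of inference rules in parallel and combining their mutually independent results with the final inference rules; the rule sets shown are hypothetical stand-ins for trained classifiers, and the majority vote is only one possible form of final rules.

```python
def classify_with_rule_sets(sample, rule_sets, final_rules):
    """Apply each set of inference rules to the sample in parallel,
    then apply the final inference rules to the plurality of results."""
    results = [rules(sample) for rules in rule_sets]   # mutually independent
    return final_rules(results)                        # final classification

# Hypothetical usage: three trained classifiers and a majority vote.
rule_sets = [lambda s: s > 0.4, lambda s: s > 0.5, lambda s: s > 0.6]
final_rules = lambda votes: sum(votes) >= 2
positive = classify_with_rule_sets(0.55, rule_sets, final_rules)
```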
[00129] FIG. 20 illustrates a computer network (or system) 1000 or similar digital processing environment, according to some embodiments of the present disclosure. Client computer(s)/devices 50 and server computer(s) 60 provide processing, storage, and input/output devices executing application programs and the like. The client computer(s)/devices 50 can also be linked through communications network 70 to other computing devices, including other client devices/processes 50 and server computer(s) 60.
The communications network 70 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, local area or wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth®, etc.) to communicate with one another. Other electronic device/computer network architectures are suitable.
[00130] Client computers/devices 50 may be configured with a computing module (located at one or more of elements 50, 60, and/or 70). In some embodiments, a user may access the computing module executing on the server computers 60 from a user device, such as a mobile device, a personal computer, or any computing device known to one skilled in the art without limitation. According to some embodiments, the client devices 50 and server computers 60 may be distributed across a computing module.
[00131] Server computers 60 may be configured as the computing modules which communicate with client devices 50 for providing access to (and/or accessing) databases that include data associated with thermographic images or other types of image data. The server computers 60 may not be separate server computers but part of cloud network 70. In some embodiments, the server computer (e.g., computing module) may enable users to determine location, size, or number of physical objects (including but not limited to target objects and/or reference objects) by allowing access to data located on the client 50, server 60, or network 70 (e.g., global computer network). The client (configuration module) 50 may communicate data representing the physical objects back to and/or from the server (computing module) 60. In some embodiments, the client 50 may include client applications or components executing on the client 50 for determining location, size, or number of physical objects, and the client 50 may communicate corresponding data to the server (e.g., computing module) 60.

[00132] Some embodiments of the system 1000 may include a computer system for determining a level of risk of a medical condition for a patient based on image data. The system 1000 may include a plurality of processors 84. The system 1000 may also include a memory 90. The memory 90 may include: (i) computer code instructions stored thereon; and/or (ii) data representing thermographic images or other types of image data. The data may include segments including portions of the thermographic images or other types of image data. The memory 90 may be operatively coupled to the plurality of processors 84 such that, when executed by the plurality of processors 84, the computer code instructions may cause the computer system 1000 to implement a computing module (the computing module being located on, in, or implemented by any of elements 50, 60, 70 of FIG. 20 or elements 82, 84, 86, 90, 92, 94, 95 of FIG. 21) configured to perform one or more functions.
[00133] According to some embodiments, FIG. 21 is a diagram of an example internal structure of a computer (e.g., client processor/device 50 or server computers 60) in the computer system 1000 of FIG. 20. Each computer 50, 60 contains a system bus 79, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system. The system bus 79 is essentially a shared conduit that connects different elements of a computer system (e.g., processor, disk storage, memory, input/output ports, network ports, etc.) that enables the transfer of information between the elements. Attached to the system bus 79 is an I/O device interface 82 for connecting various input and output devices (e.g., keyboard, mouse, displays, printers, speakers, etc.) to the computer 50, 60. A network interface 86 allows the computer to connect to various other devices attached to a network (e.g., network 70 of FIG. 20). Memory 90 provides volatile storage for computer software instructions 92 and data 94 used to implement some embodiments (e.g., input and output video streams described herein). Disk storage 95 provides non-volatile storage for computer software instructions 92 and data 94 used to implement an embodiment of the present disclosure. A central processor unit 84 is also attached to the system bus 79 and provides for the execution of computer instructions.
[00134] In one embodiment, the processor routines 92 and data 94 are a computer program product (generally referenced 92), including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROM's, CD-ROM's, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the present disclosure. The computer program product 92 can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection. Other embodiments may include a computer program propagated signal product 107 (of FIG. 20) embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)). Such carrier medium or signals provide at least a portion of the software instructions for the routines/program 92 of the present disclosure.
[00135] In alternate embodiments, the propagated signal is an analog carrier wave or digital signal carried on the propagated medium. For example, the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network. In one embodiment, the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer. In another embodiment, the computer readable medium of computer program product 92 is a propagation medium that the computer system 50 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for computer program propagated signal product.
[00136] Generally speaking, the term "carrier medium" or transient carrier encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium and the like.
[00137] Embodiments or aspects thereof may be implemented in the form of hardware (including but not limited to hardware circuitry), firmware, or software. If implemented in software, the software may be stored on any non-transient computer readable medium that is configured to enable a processor to load the software or subsets of instructions thereof. The processor then executes the instructions and is configured to operate or cause an apparatus to operate in a manner as described herein.
[00138] Further, hardware, firmware, software, routines, or instructions may be described herein as performing certain actions and/or functions of the data processors. However, it should be appreciated that such descriptions contained herein are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.

[00139] It should be understood that the flow diagrams, block diagrams, and network diagrams may include more or fewer elements, be arranged differently, or be represented differently. But it further should be understood that certain implementations may dictate the block and network diagrams and the number of block and network diagrams illustrating the execution of the embodiments be implemented in a particular way.
[00140] Accordingly, further embodiments may also be implemented in a variety of computer architectures, physical, virtual, cloud computers, and/or some combination thereof, and, thus, the data processors described herein are intended for purposes of illustration only and not as a limitation of the embodiments.
[00141] FIG. 22 shows an example method 2200 for training an AI-enabled thermographic medical imaging system to determine a risk of a medical condition for a patient 502. One or more thermographic images 2204 may be received at a processor 2206. Image processing 2206a may be applied to the one or more thermographic images 2204 to define respective sets of thermographic image data 2208 for a plurality of anatomical parts 2209 of a training subject’s body. A plurality of respective AI rules 2297j may be established based on the respective sets of thermographic image data 2208 for a plurality of anatomical parts 2209 of a training subject’s body. The plurality of respective AI rules 2297j may then be used in determining a level of risk of a medical condition for the patient 502.
[00142] While example embodiments have been particularly shown and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the embodiments encompassed by the appended claims.

Claims

What is claimed is:
1. A processor-implemented method for determining a risk of a medical condition for a patient, the method comprising:
receiving, at a processor, one or more thermographic images;
applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body; and
determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
2. The method of Claim 1 further comprising receiving, at the processor, clinical data of the patient.
3. The method of Claim 2 wherein the AI rules define a statistical assessment of temperature values and the clinical data, the temperature values extracted from the respective sets of thermographic image data.
4. The method of Claim 3 wherein the statistical assessment includes first-order statistical descriptors of histograms of the respective sets of thermographic image data.
5. The method of Claim 4 wherein the first-order statistical descriptors include at least one of average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and statistical range.
6. The method of Claim 3 wherein the respective sets of thermographic image data comprise pixel values, and wherein the statistical assessment includes second-order statistical descriptors derived from co-occurrence matrices of the pixel values of the respective sets of thermographic image data.
7. The method of Claim 6 wherein the second-order statistical descriptors include at least one of second angular moment, contrast, correlation, variance, entropy, variance of the difference, and homogeneity.
8. The method of Claim 1 further comprising filtering a preliminary sample of thermographic images for a measure of image quality, and subsequently capturing the one or more thermographic images with an imaging device.
9. The method of Claim 8 wherein the filtering includes comparing the preliminary sample with a reference image.
10. The method of Claim 9 wherein the reference image includes at least one of a positional marking or an alignment mask.
11. The method of Claim 10 wherein the alignment mask includes a depiction of space between a head, a shoulder, and a raised arm of a patient on respective sides of the patient’s body.
12. The method of Claim 8 wherein the preliminary sample includes a live feed from the imaging device.
13. The method of Claim 8 wherein the measure of image quality includes at least one of alignment, angle of rotation, height, or inclination of the patient’s body with respect to an orientation of an imaging device, and interference within an area of the image.
14. The method of Claim 13 wherein the measure of image quality further includes at least one of distance between the patient’s body and the imaging device, image contrast, and a background or core temperature identified within the preliminary sample.
15. The method of Claim 8 wherein the measure of image quality includes a measure of symmetry between parts of left and right sides of the patient’s body.
16. The method of Claim 8 further comprising prompting a user to adjust a parameter of the patient or of the imaging device to improve the measure of image quality prior to capturing the one or more thermographic images.
17. The method of Claim 1 wherein the one or more thermographic images are obtained from a machine vision module.
18. The method of Claim 1 wherein the plurality of anatomical parts of the patient’s body is defined by medical knowledge including regions commonly affected either directly or indirectly by the medical condition.
19. The method of Claim 1 wherein the plurality of anatomical parts of the patient’s body includes at least a portion of a breast, and wherein the AI rules include detecting a bottom contour of the breast.
20. The method of Claim 1 wherein the plurality of anatomical parts of the patient’s body is further defined by the processor automatically generating image masks including a combination of points, lines, and curves.
21. The method of Claim 20 further comprising adjusting points, lines or curves on, adding points, lines, or curves to, or removing points, lines, or curves from the image masks to match a part of the patient’s body having a unique anatomical structure or manifestation.
22. The method of Claim 1 further comprising outputting the level of risk of the medical condition for the patient to a user.
23. The method of Claim 22 wherein outputting the level of risk is performed by outputting a report including the level of risk, a plurality of subjective analysis selections made by the user, the thermographic images or representations thereof, and the respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body.
24. The method of Claim 1 wherein the medical condition includes at least one of breast cancer, fibrosis, mastitis, duct ectasia, and adenosis.
25. The method of Claim 1 wherein applying a plurality of respective AI rules to the respective sets of thermographic image data includes applying multiple sets of AI rules in parallel to produce a plurality of mutually independent results, and subsequently applying final inference rules to the plurality of mutually independent results.
26. The method of Claim 1 wherein applying image processing includes applying AI machine vision rules to the one or more thermographic images.
27. The method of Claim 1 wherein the plurality of respective AI rules is a plurality of respective AI classification rules.
28. A system for determining a risk of a medical condition for a patient, the system comprising a processor configured to: receive one or more thermographic images; apply image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body; and determine a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
29. The system of Claim 28 wherein the processor is further configured to receive clinical data of the patient.
30. The system of Claim 29 wherein the AI rules define a statistical assessment of temperature values and the clinical data, the temperature values extracted from the respective sets of thermographic image data.
31. The system of Claim 30 wherein the statistical assessment includes first-order statistical descriptors of histograms of the respective sets of thermographic image data.
32. The system of Claim 31 wherein the first-order statistical descriptors include at least one of average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and statistical range.
33. The system of Claim 30 wherein the respective sets of thermographic image data comprise pixel values, and wherein the statistical assessment includes second-order statistical descriptors derived from co-occurrence matrices of the pixel values of the respective sets of thermographic image data.
34. The system of Claim 33 wherein the second-order statistical descriptors include at least one of second angular moment, contrast, correlation, variance, entropy, variance of the difference, and homogeneity.
35. The system of Claim 28 wherein the processor is further configured to filter a preliminary sample of thermographic images for a measure of image quality, the system further comprising an IR camera configured to subsequently capture the one or more thermographic images and transmit the one or more thermographic images to the processor.
36. The system of Claim 35 further comprising a reference image to which the preliminary sample is compared to filter the preliminary sample for the measure of image quality.
37. The system of Claim 36 wherein the reference image includes at least one of a positional marking or an alignment mask.
38. The system of Claim 37 wherein the alignment mask includes a depiction of space between a head, a shoulder, and a raised arm of a patient on respective sides of the patient’s body.
39. The system of Claim 35 wherein the preliminary sample includes a live feed from the IR camera.
40. The system of Claim 35 wherein the measure of image quality includes at least one of alignment, angle of rotation, height, or inclination of the patient’s body with respect to an orientation of the IR camera, and interference within an area of the image.
41. The system of Claim 40 wherein the measure of image quality further includes at least one of distance between the patient’s body and the IR camera, image contrast, and a background or core temperature identified within the preliminary sample.
42. The system of Claim 35 wherein the measure of image quality includes a measure of symmetry between parts of left and right sides of the patient’s body.
43. The system of Claim 35 further comprising a parameter of the patient or of the IR camera that is adjustable by a user to improve the measure of image quality prior to capturing the one or more thermographic images.
44. The system of Claim 28 further comprising a machine vision module from which the one or more thermographic images are obtained.
45. The system of Claim 28 wherein the plurality of anatomical parts of the patient’s body is defined by medical knowledge including regions commonly affected either directly or indirectly by the medical condition.
46. The system of Claim 28 wherein the plurality of anatomical parts of the patient’s body includes at least a portion of a breast, and wherein the AI rules include detecting a bottom contour of the breast.
47. The system of Claim 28 wherein the plurality of anatomical parts of the patient’s body is further defined by the processor automatically generating image masks including a combination of points, lines, and curves.
48. The system of Claim 47 wherein the processor is further configured to adjust the location of points, lines, or curves on, add points, lines, or curves to, or remove points, lines, or curves from the image masks to match a part of the patient’s body having a unique anatomical structure or manifestation.
49. The system of Claim 28 wherein the processor is further configured to output the level of risk of the medical condition for the patient to a user.
50. The system of Claim 49 wherein the processor is configured to output the level of risk by outputting a report including the level of risk, a plurality of subjective analysis selections made by the user, the thermographic images or representations thereof, and the respective sets of thermographic image data for the plurality of anatomical parts of the patient’s body.
51. The system of Claim 28 wherein the medical condition includes at least one of breast cancer, fibrosis, mastitis, duct ectasia, and adenosis.
52. The system of Claim 28 wherein applying a plurality of respective AI rules to the respective sets of thermographic image data includes applying multiple sets of AI rules in parallel to produce a plurality of mutually independent results, and subsequently applying final inference rules to the plurality of mutually independent results.
53. The system of Claim 28 wherein applying image processing includes applying AI machine vision rules to the one or more thermographic images.
54. The system of Claim 28 wherein the plurality of respective AI rules is a plurality of respective AI classification rules.
55. A system for determining a risk of a medical condition for a patient, the system comprising: means for receiving, at a processor, one or more thermographic images; means for applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a patient’s body; and means for determining a level of risk of a medical condition for the patient by applying a plurality of respective artificial intelligence (AI) rules to the respective sets of thermographic image data for respective parts of the plurality of anatomical parts of the patient’s body.
56. A processor-implemented method for training an artificial intelligence (AI)-enabled system to determine a risk of a medical condition for a patient, the method comprising: receiving, at a processor, one or more thermographic images; applying image processing to the one or more thermographic images to define respective sets of thermographic image data for a plurality of anatomical parts of a training subject’s body; and establishing, based on the respective sets of thermographic image data for the plurality of anatomical parts of the training subject’s body, a plurality of respective artificial intelligence (AI) rules to be used in determining a level of risk of a medical condition for the patient.
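
The second-order statistical descriptors recited in Claims 6-7 and 33-34 correspond to classical Haralick texture features derived from gray-level co-occurrence matrices. The Python sketch below is one minimal way such descriptors could be computed; the use of scikit-image, the 32-level quantization, the single distance/angle pair, and all function names are assumptions for illustration, not the claimed implementation (entropy and the variance of the difference are computed by hand because graycoprops does not expose them).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def second_order_descriptors(region: np.ndarray, levels: int = 32) -> dict:
    """Haralick-style features from one quantized thermographic region."""
    # Quantize the region's pixel values to a small number of gray levels.
    bins = np.linspace(region.min(), region.max(), levels)
    q = (np.digitize(region, bins) - 1).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                      # normalized co-occurrence matrix
    nz = p[p > 0]
    i, j = np.indices(p.shape)
    # Distribution of |i - j| for the difference-based statistics.
    k = np.arange(levels)
    p_diff = np.array([p[np.abs(i - j) == d].sum() for d in k])
    mu = np.sum(i * p)                        # mean gray level under p
    mu_diff = np.sum(k * p_diff)
    return {
        "second_angular_moment": float(graycoprops(glcm, "ASM")[0, 0]),
        "contrast": float(graycoprops(glcm, "contrast")[0, 0]),
        "correlation": float(graycoprops(glcm, "correlation")[0, 0]),
        "variance": float(np.sum(p * (i - mu) ** 2)),
        "entropy": float(-np.sum(nz * np.log2(nz))),
        "variance_of_difference": float(np.sum(p_diff * (k - mu_diff) ** 2)),
        "homogeneity": float(graycoprops(glcm, "homogeneity")[0, 0]),
    }
```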
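
Claims 8-11 and 35-38 describe filtering a preliminary sample against a reference image carrying a positional marking or alignment mask. One plausible reading is a silhouette-overlap test such as the sketch below; thresholding the warm body against a cooler background and scoring overlap by intersection-over-union are assumptions for illustration, and the threshold values are hypothetical.

```python
import numpy as np

def alignment_ok(frame: np.ndarray, ref_mask: np.ndarray,
                 body_temp_c: float = 30.0, min_iou: float = 0.8) -> bool:
    """True when the patient silhouette overlaps the alignment mask enough."""
    silhouette = frame > body_temp_c            # warm pixels = patient's body
    inter = np.logical_and(silhouette, ref_mask).sum()
    union = np.logical_or(silhouette, ref_mask).sum()
    return bool(union > 0 and inter / union >= min_iou)
```

A live feed (Claims 12 and 39) would run this test per frame, and a failing score could drive the user prompt of Claim 16, for example asking the patient to raise an arm or the operator to re-aim the camera.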
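
For the symmetry measure of Claims 15 and 42, a minimal sketch is to mirror one half of the frame about the midline and compare it with the other half; the midline split and the normalized absolute-difference score are simplifying assumptions.

```python
import numpy as np

def symmetry_score(frame: np.ndarray) -> float:
    """Score in [0, 1]; 1.0 means the left and right halves mirror exactly."""
    h, w = frame.shape
    left = frame[:, : w // 2]                    # left half, width w // 2
    right = np.fliplr(frame[:, -(w // 2):])      # mirrored right half, same width
    spread = np.ptp(frame) or 1.0                # avoid division by zero
    return float(1.0 - np.mean(np.abs(left - right)) / spread)
```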
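
Claims 20-21 and 47-48 describe automatically generated image masks built from points, lines, and curves that can then be adjusted to an individual anatomy. The sketch below rasterizes only the straight-edged case, a closed polygon of (x, y) vertices; representing true curves (for example the bottom breast contour of Claims 19 and 46) would need splines or similar, and the helper name is hypothetical.

```python
import numpy as np
from skimage.draw import polygon

def mask_from_points(shape: tuple, points: list) -> np.ndarray:
    """Rasterize a closed contour of (x, y) vertices into a boolean mask."""
    rows = [y for _, y in points]               # image row = y coordinate
    cols = [x for x, _ in points]               # image column = x coordinate
    rr, cc = polygon(rows, cols, shape)
    mask = np.zeros(shape, dtype=bool)
    mask[rr, cc] = True
    return mask
```

Adjusting, adding, or removing vertices (Claims 21 and 48) then amounts to editing the points list and re-rasterizing.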
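
Claims 25 and 52 recite applying multiple sets of AI rules in parallel to produce mutually independent results, followed by final inference rules. A minimal ensemble sketch is given below; the three scikit-learn model choices and the averaged soft vote are assumptions standing in for whatever rule sets and inference rules an embodiment actually uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def fit_parallel_rules(X: np.ndarray, y: np.ndarray) -> list:
    """Train each rule set independently on the same per-region features."""
    models = [
        RandomForestClassifier(n_estimators=200, random_state=0),
        LogisticRegression(max_iter=1000),
        SVC(probability=True, random_state=0),
    ]
    return [m.fit(X, y) for m in models]

def final_inference(models: list, features: np.ndarray) -> float:
    """Combine the mutually independent risk estimates into one risk level."""
    probs = [m.predict_proba(features.reshape(1, -1))[0, 1] for m in models]
    return float(np.mean(probs))
```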
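
The first-order descriptors enumerated in Claims 31-32 (average, standard deviation, asymmetry, kurtosis, energy, entropy, mode, median, maximum, and statistical range) can all be read off a region's temperature histogram. A minimal sketch follows, assuming NumPy and SciPy (version 1.9 or later for the keepdims argument of stats.mode) and a 64-bin histogram; none of these choices come from the disclosure.

```python
import numpy as np
from scipy import stats

def first_order_descriptors(temps: np.ndarray) -> dict:
    """First-order statistics over one region's extracted temperature values."""
    temps = temps.ravel()
    hist, _ = np.histogram(temps, bins=64)
    p = hist / hist.sum()                       # normalized histogram
    nz = p[p > 0]
    return {
        "average": float(np.mean(temps)),
        "standard_deviation": float(np.std(temps)),
        "asymmetry": float(stats.skew(temps)),  # a.k.a. skewness
        "kurtosis": float(stats.kurtosis(temps)),
        "energy": float(np.sum(p ** 2)),
        "entropy": float(-np.sum(nz * np.log2(nz))),
        "mode": float(stats.mode(temps, keepdims=False).mode),
        "median": float(np.median(temps)),
        "maximum": float(np.max(temps)),
        "statistical_range": float(np.ptp(temps)),
    }
```
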
PCT/IB2020/061860 2019-12-13 2020-12-12 Determination of medical condition risk using thermographic images WO2021117013A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962948176P 2019-12-13 2019-12-13
US62/948,176 2019-12-13

Publications (1)

Publication Number Publication Date
WO2021117013A1 (en) 2021-06-17

Family ID=76329674

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/061860 WO2021117013A1 (en) 2019-12-13 2020-12-12 Determination of medical condition risk using thermographic images

Country Status (1)

Country Link
WO (1) WO2021117013A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180193667A1 (en) * 2016-05-04 2018-07-12 Brainlab Ag Monitoring a Patient's Position Using a Planning Image and Subsequent Thermal Imaging
WO2018158504A1 (en) * 2017-03-01 2018-09-07 Thermidas Oy Multimodal medical imaging and analyzing system, method and server

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113470818A (en) * 2021-07-08 2021-10-01 建信金融科技有限责任公司 Disease prediction method, device, system, electronic device and computer readable medium
CN116309501A (en) * 2023-03-27 2023-06-23 北京鹰之眼智能健康科技有限公司 Sore surface type prediction method, electronic equipment and storage medium
CN116309501B (en) * 2023-03-27 2024-02-02 北京鹰之眼智能健康科技有限公司 Sore surface type prediction method, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US11253171B2 (en) System and method for patient positioning
Adal et al. An automated system for the detection and classification of retinal changes due to red lesions in longitudinal fundus images
Almazroa et al. Optic disc and optic cup segmentation methodologies for glaucoma image detection: a survey
US9445713B2 (en) Apparatuses and methods for mobile imaging and analysis
EP2888718B1 (en) Methods and systems for automatic location of optic structures in an image of an eye, and for automatic retina cup-to-disc ratio computation
US20210174505A1 (en) Method and system for imaging and analysis of anatomical features
JP4184842B2 (en) Image discrimination device, method and program
US8340388B2 (en) Systems, computer-readable media, methods, and medical imaging apparatus for the automated detection of suspicious regions of interest in noise normalized X-ray medical imagery
Hassan et al. Joint segmentation and quantification of chorioretinal biomarkers in optical coherence tomography scans: A deep learning approach
US20200134820A1 (en) Tumor boundary reconstruction using hyperspectral imaging
US9401021B1 (en) Method and system for identifying anomalies in medical images especially those including body parts having symmetrical properties
US20180000462A1 (en) Classifying hormone receptor status of malignant tumorous tissue from breast thermographic images
US11615508B2 (en) Systems and methods for consistent presentation of medical images using deep neural networks
US20200320692A1 (en) Predicting a pathological condition from a medical image
WO2021117013A1 (en) Determination of medical condition risk using thermographic images
He et al. Segmenting diabetic retinopathy lesions in multispectral images using low-dimensional spatial-spectral matrix representation
Wong et al. Learning-based approach for the automatic detection of the optic disc in digital retinal fundus photographs
WO2014171830A1 (en) Method and system for determining a phenotype of a neoplasm in a human or animal body
Abdel-Nasser et al. Automatic nipple detection in breast thermograms
Bochko et al. Lower extremity ulcer image segmentation of visual and near‐infrared imagery
US20210209755A1 (en) Automatic lesion border selection based on morphology and color features
WO2018223069A1 (en) Bilirubin estimation using sclera color and accessories therefor
CN115844436A (en) CT scanning scheme self-adaptive formulation method based on computer vision
Asem et al. Blood vessel segmentation in modern wide-field retinal images in the presence of additive Gaussian noise
Merickel Jr et al. Segmentation of the optic nerve head combining pixel classification and graph search

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20898227

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 20898227

Country of ref document: EP

Kind code of ref document: A1

32PN EP: public notification in the EP bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22/11/2022)