WO2007051299A1 - Surface analysis method and system

Info

Publication number: WO2007051299A1
Authority: WIPO (PCT)
Application number: PCT/CA2006/001795
Other languages: French (fr)
Inventor: Ronald Perrault
Original assignee: Cryos Technology, Inc.; PERRAULT, Frédérick
Prior art keywords: digital image, filter, intensity value, color component, value
Application filed by Cryos Technology, Inc. and PERRAULT, Frédérick
Priority to US12/092,480 (US20100020164A1)
Priority to EP06804670.5A (EP1958150B1)
Priority to CA2628087A (CA2628087C)
Publication of WO2007051299A1


Classifications

    • G06T 7/0012 — Biomedical image inspection (G06T — Image data processing or generation, in general; G06T 7/00 — Image analysis; G06T 7/0002 — Inspection of images, e.g. flaw detection)
    • A61B 5/0059 — Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence (A61B — Diagnosis; surgery; identification; A61B 5/00 — Measuring for diagnostic purposes; identification of persons)
    • A61B 5/442 — Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment (A61B 5/44 — Detecting, measuring or recording for evaluating the integumentary system; A61B 5/441 — Skin evaluation, e.g. for skin disorder diagnosis)
    • G06T 5/75
    • A61B 5/4561 — Evaluating static posture, e.g. undesirable back curvature (A61B 5/45 — For evaluating or diagnosing the musculoskeletal system or teeth; A61B 5/4538 — Evaluating a particular part of the musculoskeletal system or a particular medical condition)
    • G06T 2207/10024 — Color image (G06T 2207/10 — Image acquisition modality)
    • G06T 2207/30088 — Skin; dermal (G06T 2207/30 — Subject of image; G06T 2207/30004 — Biomedical image processing)


Abstract

A surface analysis method and system, the system comprising a digital imaging system for generating a digital image of a surface, a processor so coupled to the digital imaging system as to receive the digital image, to process the digital image and to generate a processed digital image highlighting relief variations in the surface, and a display system for displaying the processed digital image. The processor is so configured as to apply at least one digital filter to the digital image to highlight relief variations in the surface.

Description

SURFACE ANALYSIS METHOD AND SYSTEM
CROSS-REFERENCE TO RELATED APPLICATION
[0001] The present application claims the benefit of U.S. provisional patent application No. 60/733,178, filed November 4, 2005, which is hereby incorporated by reference.
TECHNICAL FIELD
[0002] The present invention relates to a surface analysis method and system. More specifically, the present invention relates to a surface analysis method and system for the diagnosis of postural abnormalities in the structure of the human body.
BACKGROUND
[0003] Various non-invasive biomedical investigation and diagnosis methods and systems have been explored, in particular optical methods and systems because of the relative simplicity and affordability of the equipment employed.
[0004] Different known optical methods involve both passive and active means of optically investigating the organism. In the first case, the organism's own radiation in the infrared (IR) range is recorded, while in the second case, external illumination of insignificant density, absolutely harmless to the human organism, is employed.
[0005] In the case of IR investigation, the recorded thermal radiation results from the metabolic generation of heat emanating from the human body. The patterns of such thermal emissions are affected by the activities of the tissues, organs and vessels inside the body. The amount of radiation can reflect the metabolic rate of the human body.
[0006] For example, US patent No. 6,023,637 entitled "Method and apparatus for thermal radiation imaging", issued to Liu et al. on February 8, 2002, discloses a method and apparatus for obtaining images reflecting the metabolic activity within the body of a patient. This is accomplished using digital images indicative of the patient's body IR intensity. The various IR intensities are assigned distinct colors, forming a new image reflecting the patient's metabolic activity.
[0007] In the case of external illumination, one of the most common applications is the investigation of skin condition. For example, angled lighting has been used to generate a gradient of the illuminating field on the skin in order to enhance the visualization of wrinkles and fine lines. Depending on the direction of the gradient (vertical or horizontal), different sets of wrinkles and fine lines may be visually enhanced.
[0008] Polarized light photography has also been developed to selectively enhance either surface or subsurface features of the skin. These results are accomplished by placing a polarizing filter (typically a linear polarizing filter) both in front of the flash unit, and in front of the camera. When the polarizing filters are in the same orientation with each other, surface features of the skin such as scales, wrinkles, fine lines, pores, and hairs are visually enhanced. When the polarizing filters are aligned perpendicular to each other, subsurface features of the skin such as erythema, pigmentation and blood vessels are visually enhanced.
[0009] Ultraviolet photography, where the flash unit is filtered to produce ultraviolet A (UVA) light and the camera is filtered so that only visible light enters the lens, has been used to visually enhance the appearance of pigmentation, the bacteria p. acnes, and horns. A variation of ultraviolet photography has been termed the "sun camera" where UVA light is used to illuminate the skin and an UVA sensitive film or a digital camera is used to record the reflected ultraviolet light from the skin. In this arrangement, both the pigment distribution and the surface features of the skin are visually enhanced.
[0010] For example, US Patent No. 6,907,193, entitled "Method of taking polarized images of the skin and the use thereof", issued to Kollias et al. on June 14, 2006, discloses a method of investigation of the skin using first a white light, followed by an ultraviolet light and finally a phosphorescent blue light. Each time a specific lighting is used, a picture of the patient is taken at an angle between 35 and 55 degrees. The angle allows the amplification of skin characteristics such as fine lines, skin texture, hairs, etc. Furthermore, the use of filters, such as polarizing filters, is described. High frequency filters, red light blocking filters, etc. are also used to amplify some characteristics of the skin.
[0011] Another example is US patent No. 5,747,789, entitled "Method for investigation of distribution of physiological components in human body tissues and apparatus for its realization", issued to Godik on May 5, 1998, which discloses a method for the investigation of a region of a patient's body. The method begins by illuminating the region under investigation and recording, at regular intervals, the spatial distribution of the intensity of the reflected light using, for example, a digital camera. The sequence of spatial distributions of the intensity of the reflected light thus obtained gives information on a spatial picture of the functional dynamics of the arterial and venous capillary blood content. Depending on the physiological component to be investigated, a light source composed of specific wavelengths is used in order to heighten the sensitivity of the method. This wavelength specific light source is produced with the use of optical filters.
[0012] Finally, US patent application No. 2004/0125996, entitled "Skin diagnostic imaging method and apparatus", naming Eddowes et al. as inventors and published on July 1, 2004, discloses a method and apparatus for facial skin diagnosis, the method consisting in illuminating the face of a patient with white light combined with red and blue or red and green filters, and taking digital images of the patient's face thus illuminated. A digital image of the patient's face is also taken using an ultraviolet light source. The images thus obtained are analyzed by a computer program which identifies skin regions requiring preventive skin treatment.
[0013] There is a need for a simple non-invasive analysis method and system, which does not require sophisticated equipment, for the diagnosis of postural abnormalities in the structure of the human body.
SUMMARY
[0014] The present invention relates to a surface analysis system, comprising:
a digital imaging system for generating a digital image of a surface;
a processor so coupled to the digital imaging system as to receive the digital image, to process the digital image and to generate a processed digital image highlighting relief variations in the surface; the processor being so configured as to apply at least one digital filter to the digital image to highlight relief variation in the surface; and
[0015] a display system for displaying the processed digital image.
[0016] The present invention further relates to the above described surface analysis system wherein the at least one digital filter includes a custom filter, the processor, when applying the custom filter to the digital image, performing the steps of:
a. applying a first set of rules to a selected color component of each pixel of the digital image, the first set of rules comprising:
i. adding to an intensity value of the selected color component a value equal to the product of the intensity value, to which is subtracted the sum of intensity values of eight neighboring pixels, multiplied by a predetermined fraction associated with the selected color component;
b. applying a second set of rules to the remaining color components of each pixel of the digital image, the second set of rules comprising:
i. adding to an intensity value of each of the remaining color components a value equal to the product of the intensity value multiplied by a predetermined fraction associated with each of the remaining color components;
c. setting to the minimum value any color component intensity value lower than the minimum value; and
d. setting to a maximum value any color component intensity value greater than the maximum value.
[0017] The present invention also relates to a surface analysis method, comprising:
a. capturing a digital image of a surface;
b. processing the digital image using at least one digital filter to highlight relief variations in the surface; and
c. displaying the processed digital image.
[0018] As well, the present invention relates to a digital filtering method for filtering a digital image provided with at least two color components, the digital filtering method comprising:
a. selecting a color component;
b. applying a first set of rules to the selected color component of each pixel of a digital image, the first set of rules comprising:
i. adding to an intensity value of the selected color component a value equal to the product of the intensity value, to which is subtracted the sum of intensity values of eight neighboring pixels, multiplied by a predetermined fraction associated with the selected color component;
c. applying a second set of rules to the remaining color components of each pixel of the digital image, the second set of rules comprising:
i. adding to an intensity value of each of the remaining color components a value equal to the product of the intensity value multiplied by a predetermined fraction associated with each of the remaining color components;
d. setting to a minimum value any color component intensity value lower than the minimum value; and
e. setting to a maximum value any color component intensity value greater than the maximum value.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] A non-limitative illustrative embodiment of the invention will now be described by way of example only with reference to the accompanying drawings, in which:
[0020] Figure 1 is a flow diagram of a surface analysis method according to a non-limitative illustrative embodiment of the present invention;
[0021] Figure 2 is a digital image of the front view of a patient's feet;
[0022] Figure 3 is the digital image of Figure 2 to which an inverse filter was applied;
[0023] Figure 4 is a flow diagram of an inverse filter algorithm;
[0024] Figure 5 is the digital image of Figure 2 to which a solarize filter with a level equal to 0 was applied;
[0025] Figure 6 is the digital image of Figure 2 to which a solarize filter with a level equal to 128 was applied;
[0026] Figure 7 is a flow diagram of a solarize filter algorithm;
[0027] Figure 8 is the digital image of Figure 2 to which an edge detect and inverse filters were applied;
[0028] Figures 9a and 9b show a flow diagram of an edge detect filter algorithm;
[0029] Figure 10 is the digital image of Figure 2 to which a custom filter was applied;
[0030] Figure 11 is the digital image of Figure 2 to which the custom and inverse filters were applied;
[0031] Figures 12a and 12b show a flow diagram of the custom filter algorithm;
[0032] Figure 13 is a digital image of the back of a patient to which was applied the custom filter with a level of 255 followed by the inverse filter;
[0033] Figure 14 is a digital image of the front of a patient;
[0034] Figure 15 is the digital image of Figure 14 to which was applied a custom filter with a level of 255 followed by the inverse filter;
[0035] Figure 16 is a digital image of the front view of a patient's feet, showing eversion of the lower limbs, to which was applied a custom filter with a level of 255 followed by the inverse filter;
[0036] Figure 17 is a digital image of the front view of a patient's feet, showing normal lower limbs, to which was applied the custom filter with a level of 255 followed by the inverse filter;
[0037] Figure 18 is a digital image of the back view of a patient's feet, showing eversion of the lower limbs, to which was applied the custom filter with a level of 255 followed by the inverse filter;
[0038] Figure 19 is a digital image of the back view of a patient's feet, showing normal lower limbs, to which was applied the custom filter with a level of 255 followed by the inverse filter; and
[0039] Figure 20 is a schematic view of a surface analysis system.
DETAILED DESCRIPTION
[0040] Generally stated, a method and system according to a non-limitative illustrative embodiment of the present invention provide a surface analysis system and method for the diagnosis of postural abnormalities in the structure of the human body. The method generally consists in using a digital filter, or a combination of digital filters, applied to digital images of a human body in order to highlight deformities and asymmetries on the surface of the skin covering the human body structure by accentuating the reflection of light upon the relief of the skin surface. It is to be understood that such a method may also be used in other contexts such as, for example, the analysis of the surface of the metallic body of a vehicle in order to identify any warping or indentations caused by an impact or an applied torque.
[0041] A system 1 that may be used to implement the method is shown in Figure 20 and advantageously consists of a digital camera 2, at least one light source 4 (a flash unit, a constant direct or diffuse source of light, or a combination thereof) and a processing unit 6, such as, for example, a personal computer, to process digital images taken of a patient 8 by the digital camera 2 by applying the various filters to the digital images. It is to be understood that in an alternative embodiment the patient may be replaced by an object, for example a vehicle in the case where the surface under analysis is the metallic body of a vehicle.
[0042] Referring to Figure 1, there is shown a flow diagram depicting the steps involved in the surface analysis method according to an illustrative embodiment of the present invention, which is indicated by blocks 102 to 106.
[0043] At block 102 the method starts by importing one or more digital images of the surface to be analyzed, for example a digital image of the body of a patient or the body of a vehicle. The digital image may be obtained using a digital imaging system such as, for example, a digital camera or a digital scanner, or by scanning a conventional photograph or image.
[0044] Then, at block 104, one or more digital filters are applied to the digital image using, for example, a dedicated processor or a personal computer, in order to accentuate the reflection of light upon the surface. Four filters, plus combinations of filters, may be used in particular; these will be detailed further below.
[0045] At block 106, the filtered digital image is displayed, for example on a computer screen or a color printer.
[0046] Optionally, at block 108, the filtered digital image may be analyzed so as to detect deformities and asymmetries on the surface under analysis. The filtered digital image may be analyzed, for example, by a skilled technician observing the display or by an automated process recognizing certain colored structures and/or patterns.
Filters
[0047] As mentioned above, four filters and combinations of filters may be used in particular although it is to be understood that other filters or other combinations of filters may be used as well.
[0048] The first three filters are common filters, namely: the inverse filter, the solarize filter and the edge detect filter. The fourth filter is a custom type of filter. As for the combinations of filters, they are the application of the inverse filter to a digital image on which the edge detect filter has already been applied and the application of the inverse filter to a digital image on which the custom filter has already been applied.
[0049] The effects of the four filters, and the combinations of filters, will now be described with reference to Figure 2 which is an original digital image 10 of the front view of the feet of a patient.
[0050] In the following description, reference will be made to the Red Green Blue (RGB) color model, for which the intensity value of each component varies from a minimum value of 0 to a maximum value of 255. It is to be understood that other color models, having different ranges of values, may be used as well.
Inverse filter
[0051] The inverse filter helps with the viewing of contrast by producing a negative image of the original digital image 10. This is achieved by inverting the intensity of the Red Green Blue (RGB) components of each pixel of the original digital image 10, i.e. the new intensity value of each of the RGB components of a given pixel will be 255 (the maximum intensity value) minus the original intensity value of that component of the pixel. For example, Figure 3 shows the inversed image 12 of the original digital image 10 of Figure 2 after the application of the inverse filter.
[0052] An illustrative example of an inverse filter algorithm that may be used is depicted by the flow diagram shown in Figure 4. The steps of the algorithm are indicated by blocks 202 to 208.
[0053] The algorithm starts at block 202 by selecting a pixel "p" of the original digital image 10 which has not yet been selected. At block 204, new intensity values of the RGB components are computed for pixel p using the following equations:
R'(p) = 255 - R(p); Equation 1
G'(p) = 255 - G(p); Equation 2
B'(p) = 255 - B(p); Equation 3
where
R'(p) is the new red component intensity value of pixel p after the application of the inverse filter, R(p) being the original red component intensity value of pixel p;
G'(p) is the new green component intensity value of pixel p after the application of the inverse filter, G(p) being the original green component intensity value of pixel p; and
B'(p) is the new blue component intensity value of pixel p after the application of the inverse filter, B(p) being the original blue component intensity value of pixel p.
[0054] Then, at block 206, the algorithm verifies if there are any remaining pixels that have not yet been selected, if so it returns to block 202, if not it exits at block 208.
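By way of a non-limitative illustration, the inverse filter described above may be expressed in a few lines of code. The following Python sketch assumes an 8-bit RGB image held in a NumPy array; the function name is illustrative only and does not form part of the described method.

    import numpy as np

    def inverse_filter(image):
        # Negative image (Equations 1 to 3): each RGB component intensity
        # is replaced by 255 (the maximum intensity value) minus its
        # original value. `image` is a uint8 array of shape (height, width, 3).
        return 255 - image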
Solarize filter
[0055] The solarize filter is similar in concept to the inverse filter with the difference that, for each pixel, the solarize filter only inverses the intensity values of the RGB components which are smaller than or equal to a predetermined level "L", the level having a value between 0 (the minimum intensity value) and 255 (the maximum intensity value). Basically, the solarize filter may be used to invert the RGB intensity values for low intensity pixels of a digital image. Thus, if the level is set at 255, the solarize filter's effect is the same as that of the inverse filter. Figures 5 and 6 show examples of effects of the solarize filter upon the original digital image 10 of Figure 2. In Figure 5 the level is set to 0, resulting in digital image 14, while in Figure 6 the level is set to 128, resulting in digital image 15.
[0056] An illustrative example of a solarize filter algorithm that may be used is depicted by the flow diagram shown in Figure 7. The steps of the algorithm are indicated by blocks 302 to 328.
[0057] The algorithm starts at block 302 by setting the level L and then, at block 304, selecting a pixel "p" of the original digital image 10 which has not yet been selected. At block 306, the red component intensity value of pixel p, R(p), is compared with level L, if R(p) is lower than L, then the algorithm proceeds to block 308 and computes the new red component intensity value of pixel p using Equation 1, if not, the algorithm proceeds to block 310 where the new red component intensity value of pixel p is computed using the following equation:
R'(p) = R(p). Equation 4
[0058] At block 312, the green component intensity value of pixel p, G(p), is compared with level L, if G(p) is lower than L, then the algorithm proceeds to block 314 and computes the new green component intensity value of pixel p using Equation 2, if not, the algorithm proceeds to block 316 where the new green component intensity value of pixel p is computed using the following equation:
G'(p) = G(p). Equation 5
[0059] Similarly, at block 318, the blue component intensity value of pixel p, B(p), is compared with level L, if B(p) is lower than L, then the algorithm proceeds to block 320 and computes the new blue component intensity value of pixel p using Equation 3, if not, the algorithm proceeds to block 322 where the new blue component intensity value of pixel p is computed using the following equation:
B'(p) = B(p). Equation 6
[0060] Then, at block 324, the algorithm verifies if there are any remaining pixels that have not yet been selected, if so it returns to block 304, if not it exits at block 328.
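Again by way of a non-limitative illustration, a minimal Python sketch of the solarize filter, under the same assumptions as the inverse filter sketch above, might read:

    def solarize_filter(image, level):
        # Invert only the RGB component intensities that are smaller than
        # or equal to the level L (Equations 1 to 3) and leave the other
        # components unchanged (Equations 4 to 6). With level = 255 the
        # result is the same as that of the inverse filter.
        out = image.copy()
        mask = image <= level
        out[mask] = 255 - image[mask]
        return out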
Edge detect filter
[0061] The purpose of the edge detect filter is to highlight edges between high intensity and low intensity areas of the original digital image 10, i.e. the limit between areas having high RGB intensity variations. For each RGB component intensity value of a given pixel, the value of the difference between the intensity value of that RGB component and the average of the intensity values of the eight (8) neighboring pixels, for that same RGB component, is computed. If that difference value is greater than a certain level "L", then it is set to 255 (the maximum intensity value). Finally, the pixel's three new RGB intensity values are set to the value of the RGB component having the greatest difference value. This results in a shades-of-grey image where the lighter lines identify edges and contours in the original digital image 10. For example, Figure 8 shows the resulting image 16 after the application of the edge detect filter and the inverse filter to the original digital image 10 of Figure 2, the inverse filter simply being applied for added clarity in order to show the edge lines as dark lines over a light background instead of light lines on a dark background.
[0062] It is to be understood that the background of the original digital image 10 may be selected according to the surface being photographed so as to provide improved contrast.
[0063] An illustrative example of an edge detect filter algorithm that may be used is depicted by the flow diagram shown in Figures 9a and 9b. The steps of the algorithm are indicated by blocks 402 to 436.
[0064] The algorithm starts at block 402 by setting the level L and then, at block 404, selecting a pixel "p" of the original digital image 10 which has not yet been selected. At block 406, the average of the red component intensity values of the eight (8) neighboring pixels to pixel p, Avg8[R(p)], is computed using the following equation:
Avg8[R(p)] = Avg8[R(x,y)] = (1/8) Σ R(x+i, y+j), the sum being taken over i = -1..+1 and j = -1..+1 with (i,j) ≠ (0,0); Equation 7
where
x and y are the coordinates of pixel p.
[0065] Following which, at block 408, the absolute difference between the red component intensity value of pixel p, R(p), and the average Avg8[R(p)] of block 406 is computed as Diff8[R(p)]. More specifically:
Diff8[R(p)] = | R(p) - Avg8[R(p)] |. Equation 8
[0066] Then, at block 410, Diff8[R(p)] is compared with level L, if Diff8[R(p)] is greater than L, then the algorithm proceeds to block 412 where it sets Diff8[R(p)] to 255 and then proceeds to block 414, if not, the algorithm proceeds to block 414.
[0067] At block 414, the average of the green component intensity values of the eight (8) neighboring pixels to pixel p, Avg8[G(p)], is computed using the following equation:
Avg8[G(p)] = Avg8[G(x,y)] = (1/8) Σ G(x+i, y+j), the sum being taken over i = -1..+1 and j = -1..+1 with (i,j) ≠ (0,0); Equation 9
where
x and y are the coordinates of pixel p.
[0068] Following which, at block 416, the absolute difference between the green component intensity value of pixel p, G(p), and the average Avg8[G(p)] of block 414 is computed as Diff8[G(p)]. More specifically:
Diff8[G(p)] = | G(p) - Avg8[G(p)] |. Equation 10
[0069] Then, at block 418, Diff8[G(p)] is compared with level L, if Diff8[G(p)] is greater than L, then the algorithm proceeds to block 420 where it sets Diff8[G(p)] to 255 (the maximum intensity value) and then proceeds to block 422, if not, the algorithm proceeds to block 422.
[0070] At block 422, the average of the blue component intensity values of the eight (8) neighboring pixels to pixel p, Avg8[B(p)], is computed using the following equation:
Avg8[B(p)] = Avg8[B(x,y)] = (1/8) Σ B(x+i, y+j), the sum being taken over i = -1..+1 and j = -1..+1 with (i,j) ≠ (0,0); Equation 11
where
x and y are the coordinates of pixel p.
[0071] Following which, at block 424, the absolute difference between the blue component intensity value of pixel p, B(p), and the average Avg8[B(p)] of block 422 is computed as Diff8[B(p)]. More specifically:
Diff8[B(p)] = | B(p) - Avg8[B(p)] |. Equation 12
[0072] Then, at block 426, Diff8[B(p)] is compared with level L, if Diff8[B(p)] is greater than L, then the algorithm proceeds to block 428 where it sets Diff8[B(p)] to 255 (the maximum intensity value) and then proceeds to block 430, if not, the algorithm proceeds to block 430.
[0073] At block 430, the algorithm identifies the maximum absolute difference MaxDiff8[RGB(p)] among Diff8[R(p)], Diff8[G(p)] and Diff8[B(p)], and, at block 432, assigns MaxDiff8[RGB(p)] to each new individual RGB component intensity value of pixel p, i.e. R'(p), G'(p) and B'(p). Therefore, if one of the absolute differences Diff8[R(p)], Diff8[G(p)] and Diff8[B(p)] is greater than level L, all the individual RGB component intensity values of pixel p will be set to 255 (the maximum intensity value).
[0074] Then, at block 434, the algorithm verifies if there are any remaining pixels that have not yet been selected, if so it returns to block 404, if not it exits at block 436.
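The edge detect filter may likewise be sketched in Python under the same assumptions. This illustrative sketch skips the border pixels, for which the eight neighbors are not all defined; the description above does not specify how borders are handled.

    def edge_detect_filter(image, level):
        # For each pixel and each RGB component, compute the absolute
        # difference between the component intensity and the average of
        # the eight neighboring pixels (Equations 7 to 12). Differences
        # greater than L saturate to 255, and the three output components
        # are all set to the largest difference, giving a shades-of-grey
        # image.
        img = image.astype(np.float64)
        height, width, _ = img.shape
        out = np.zeros_like(image)
        for y in range(1, height - 1):
            for x in range(1, width - 1):
                window = img[y - 1:y + 2, x - 1:x + 2, :]
                avg8 = (window.sum(axis=(0, 1)) - img[y, x, :]) / 8.0
                diff8 = np.abs(img[y, x, :] - avg8)
                diff8[diff8 > level] = 255.0
                out[y, x, :] = diff8.max()
        return out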
Custom filter
[0075] For each pixel the custom filter applies two different sets of rules, one for the green and blue components and one for the red component. It is to be understood that in the illustrative embodiment the red component is particularly present in the skin of a patient; in other applications it may be the green or the blue component which may warrant a different rule.
[0076] For the green and blue components of a given pixel "p", a value equal to the product of the component's intensity value and a predetermined level "LG" or "LB" divided by 100 is added to that component's original intensity value to yield the resulting component intensity value. It is to be understood that any resulting intensity value lower than the minimum value, in this case 0, is set to 0 (possible in the case where LG or LB has a negative value) and that any resulting intensity value greater than the maximum value, in this case 255, is set to 255.
[0077] For the red component of the given pixel, a value equal to the product of the red intensity value, to which is subtracted the sum of the red component intensity values of the eight (8) neighboring pixels, and a predetermined level "LR" divided by 100 is added to the red component's original intensity value to yield the resulting red intensity value.
[0078] Again, it is to be understood that any resulting intensity value lower than 0 is set to 0 and that any resulting intensity value greater than 255 is set to 255.
[0079] For example, Figure 10 shows the resulting image 18 after the application of the custom filter to the original digital image 10 of Figure 2. As for Figure 11, it shows the resulting image 20 after the application of both the custom filter and the inverse filter to the original digital image 10 of Figure 2, the inverse filter simply being applied for added clarity.
[0080] An illustrative example of the custom filter algorithm that may be used is depicted by the flow diagram shown in Figures 12a and 12b. The steps of the algorithm are indicated by blocks 502 to 540.
[0081] The algorithm starts at block 502 by setting the levels LR, LG and LB, and then, at block 504, selecting a pixel "p" of the original digital image 10 which has not yet been selected. At block 506, the sum of the red component intensity values of the eight (8) neighboring pixels to pixel p, Sum8[R(p)], is computed using the following equation:
Sum8[R(p)] = Sum8[R(x,y)] = Σ R(x+i, y+j), the sum being taken over i = -1..+1 and j = -1..+1 with (i,j) ≠ (0,0); Equation 13
where
x and y are the coordinates of pixel p.
[0082] Following which, at block 508, the new value of the red component intensity value of pixel p, R'(p), is computed using the following equation:
R'(p) = R(p) + (R(p) - Sum8[R(p)]) · LR / 100. Equation 14
[0083] Then, at block 510, the algorithm verifies if R'(p) is greater than 255 (the maximum intensity value), if so then the algorithm proceeds to block 512 where it sets R'(p) to 255, if not, the algorithm proceeds to block 514 where it verifies if R'(p) is lower than 0 (the minimum intensity value), if so then the algorithm proceeds to block 516 where it sets R'(p) to 0.
[0084] At block 518, the new value of the green component intensity value of pixel p, G'(p), is computed using the following equation:
G'(p) = G(p) + G(p) · LG / 100. Equation 15
[0085] Then, at block 520, the algorithm verifies if G'(p) is greater than 255 (the maximum intensity value), if so then the algorithm proceeds to block 522 where it sets G'(p) to 255, if not, the algorithm proceeds to block 524 where it verifies if G'(p) is lower than 0 (the minimum intensity value), if so then the algorithm proceeds to block 526 where it sets G'(p) to 0.
[0086] At block 528, the new value of the blue component intensity value of pixel p, B'(p), is computed using the following equation:
$B'(p) = B(p) + \dfrac{B(p) \times L_B}{100}$;   Equation 16
[0087] Then, at block 530, the algorithm verifies whether B'(p) is greater than 255 (the maximum intensity value); if so, the algorithm proceeds to block 532, where it sets B'(p) to 255. If not, the algorithm proceeds to block 534, where it verifies whether B'(p) is lower than 0 (the minimum intensity value); if so, the algorithm proceeds to block 536, where it sets B'(p) to 0.
[0088] Then, at block 538, the algorithm verifies whether there are any remaining pixels that have not yet been selected; if so, it returns to block 504; if not, it exits at block 540.
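As a minimal sketch of the above (not the patentee's implementation), the per-pixel loop of blocks 502 to 540 can be vectorized with NumPy as follows. The sum8 helper is the hypothetical one sketched after Equation 13, and floor division is assumed for the division by 100, a rounding choice the flow diagram leaves unspecified.

```python
import numpy as np

def custom_filter(image: np.ndarray, lr: int, lg: int, lb: int) -> np.ndarray:
    """Apply Equations 14-16 to an H x W x 3 RGB image of dtype uint8."""
    rgb = image.astype(np.int64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    r_new = r + (r - sum8(r)) * lr // 100  # Equation 14: red, neighbour contrast
    g_new = g + g * lg // 100              # Equation 15: green, plain scaling
    b_new = b + b * lb // 100              # Equation 16: blue, plain scaling
    out = np.stack([r_new, g_new, b_new], axis=-1)
    # Blocks 510 to 536: clamp every component to the [0, 255] intensity range.
    return np.clip(out, 0, 255).astype(np.uint8)
```

One observation on the red rule: in a uniform region, R(p) − Sum8[R(p)] ≈ −7 × R(p), so with a large positive LR most flat areas clamp to 0 and only pixels brighter than their neighbourhood survive, which is what makes the specular reflection patterns stand out.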
Filter combinations
[0089] As mentioned previously, the various filters may be used individually or in combination. For example, Figure 8 illustrates the combination of the edge detect filter with the inverse filter while Figure 11 illustrates the combination of the custom filter with the inverse filter.
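Composing filters is then simple function composition. A sketch follows, reusing the hypothetical custom_filter above and assuming the inverse filter replaces each component value v with 255 − v, the conventional definition; `original` stands in for a captured RGB image such as digital image 10.

```python
import numpy as np

def inverse_filter(image: np.ndarray) -> np.ndarray:
    """Conventional inverse (negative) filter: each component v becomes 255 - v."""
    return (255 - image.astype(np.int16)).astype(np.uint8)

# Figure 11-style output: custom filter followed by the inverse filter,
# with levels of 255 as used in the Analysis examples below.
treated = inverse_filter(custom_filter(original, lr=255, lg=255, lb=255))
```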
Analysis
[0090] In the illustrative embodiment described herein, the surface analysis method is used in the context of the diagnostic of postural abnormalities in the structure of the human body. The patient's postural evaluation is based on the detection of light reflection pattern changes on the surface of his or her skin. Those changes are influenced by the position of the patient's different body segments relative to each other and by differences in muscle mass and/or tension.

[0091] Referring to Figure 13, there is shown a treated image 30 of the back of a patient after the application of the custom filter with levels "LR", "LG" and "LB" of 255, followed by the inverse filter. It may be seen that the light reflection patterns on the left side of the patient, more particularly in areas 32a, 34a and 36a, differ from those on the right side, that is areas 32b, 34b and 36b. It may be observed that there is less light reflected off the right scapula area 32b compared to the left scapula area 32a. From this it may be deduced that the right scapula area 32b is further away from the camera, indicating a possible postural problem. The treated image 30 also permits the identification of abnormalities of the underlying muscle structure on the right side of the patient, by comparing lines 35a and 35b.
[0092] Referring now to Figures 14 and 15, there is shown an untreated digital image 40 of a patient (Figure 14) and the resulting treated image 50 (Figure 15) after the application of the custom filter with levels "LR", "LG" and "LB" of 255, followed by the inverse filter. Referring to Figure 14, when observing the right and left shoulder areas, 42a and 42b, respectively, and the right and left upper leg areas, 44a and 44b, respectively, no obvious abnormalities or asymmetries may be easily observed. Referring now to Figure 15, the treated image 50 shows clear abnormalities and asymmetries in the same corresponding areas, namely right and left shoulders 52a, 52b and right and left upper legs 54a, 54b, which may help a practitioner in establishing a diagnostic.
[0093] An example of the application of the custom filter, with levels "LR", "LG" and "LB" of 255, followed by the inverse filter to a digital image of a patient for the diagnostic of a physiological condition is illustrated in Figures 16 to 19. Figures 16 and 17 show front views of the lower limbs of two different patients while Figures 18 and 19 respectively show back views of the lower limbs of the same patients.

[0094] Referring to Figure 16, it may be seen that the treated image 60 shows signs of eversion of the lower limbs while the treated image 70 of Figure 17 shows normal lower limbs. This may be deduced from various factors, such as, for example, observing that the calf 62 of treated image 60 is less illuminated and its reflection less uniform than that of the calf 72 of treated image 70, which is an indicator of the presence of tibial rotation.
[0095] Another sign of eversion may be seen by examining the ankle regions 64 and 74, and observing that in treated image 60 the internal malleolus and the navicular bone, illustrated by line 65, are medially positioned compared to normal, which is illustrated by line 75 on treated image 70. A further sign of eversion may be seen by examining the foot region 66 of treated image 60 and tracing a line 67 in the center of the brightest portion of the light reflection, indicating the direction of the foot's center of gravity. As may be observed, line 67 is at an angle to the vertical, which is an indication that the foot's center of gravity is not centered. Conversely, examining the foot region 76 of treated image 70 and tracing a line 77 in the center of the brightest portion of the light reflection, it may be observed that line 77 is vertical, indicating that the foot's center of gravity is in the middle of the foot and thus normal.
[0096] Referring now to Figure 18, it may be seen that the treated image 80 also shows signs of eversion of the lower limbs while the treated image 90 of Figure 19 shows normal lower limbs. An indication of eversion may be seen by examining the heel region 82 of treated image 80 and tracing a line 83 in the center of the brightest portion of the light reflection, indicating the alignment of the Achilles tendon. As may be observed, line 83 is at an angle to the vertical, which is an indication that the Achilles tendon is inclined. Conversely, examining the heel region 92 of treated image 90 and tracing a line 93 in the center of the brightest portion of the light reflection, it may be observed that line 93 is vertical, indicating that the Achilles tendon is straight and thus normal.

[0097] Treated images such as those shown above may be taken before each treatment given to a patient in order to observe the progress of the treatment and, if necessary, readjust it.
[0098] It is to be understood that a physician or other skilled professional may use other reference structures highlighted by the treated image to help him or her establish a diagnostic, as well as compute values such as the hallux abductus angle or the Q angle, which commonly require an X-ray image of the patient. It is also to be understood that other body parts or regions may be examined such as, for example, the underfoot in order to analyze the arch of the foot. It may further be understood that the above-described operations may be automated using, for example, an algorithm to identify the highlighted structures and compute values such as the hallux abductus angle or the Q angle.
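As a sketch of how one such automated measurement might look (our construction, not the patentee's algorithm): given a cropped grayscale region of interest such as heel region 82, the inclination of the brightest reflection band relative to the vertical can be estimated from row-wise intensity centroids.

```python
import numpy as np

def reflection_axis_angle(gray: np.ndarray, frac: float = 0.9) -> float:
    """Estimate the angle, in degrees from vertical, of the brightest
    reflection band in a cropped grayscale region of interest.

    For each row, keep only pixels within `frac` of that row's maximum
    intensity (the "brightest portion"), take their column centroid,
    and fit a straight line through the centroids.
    """
    gray = gray.astype(np.float64)
    cols = np.arange(gray.shape[1])
    row_max = gray.max(axis=1, keepdims=True)
    bright = np.where(gray >= frac * row_max, gray, 0.0)  # brightest portion only
    mass = bright.sum(axis=1)
    valid = mass > 0  # skip rows with no reflection at all
    centroids = (bright[valid] * cols).sum(axis=1) / mass[valid]
    rows = np.arange(gray.shape[0])[valid]
    slope, _ = np.polyfit(rows, centroids, 1)  # column drift per row
    return float(np.degrees(np.arctan(slope)))  # 0 degrees means vertical

# An angle near 0 corresponds to a vertical line such as line 93 (normal);
# a marked deviation, as with line 83, would flag an inclined Achilles tendon.
```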
[0099] Although the present invention has been described by way of a non-restrictive illustrative embodiment and examples thereof, it should be noted that it will be apparent to persons skilled in the art that modifications may be applied to the present illustrative embodiment without departing from the scope of the present invention. It is also to be understood that the present invention may be used for the detection of abnormalities in other types of surfaces such as, for example, vehicle bodywork or the surfaces of high precision metal components.

Claims

WHAT IS CLAIMED IS:
1. A surface analysis system, comprising:
a digital imaging system for generating a digital image of a surface;
a processor so coupled to the digital imaging system as to receive the digital image, to process the digital image and to generate a processed digital image highlighting relief variations in the surface, the processor being so configured as to apply at least one digital filter to the digital image to highlight relief variations in the surface; and
a display system for displaying the processed digital image.
2. A surface analysis system according to claim 1, wherein the surface is a surface of a patient's body.
3. A surface analysis system according to claim 1, wherein the relief variations are indicative of structures underlying the surface.
4. A surface analysis system according to claim 1, wherein the at least one digital filter includes a filter selected from a group consisting of an inverse filter, a solarize filter and an edge detect filter.
5. A surface analysis system according to claim 1, wherein the at least one digital filter includes a custom filter, the processor, when applying the custom filter to the digital image, performing the steps of:
a. applying a first set of rules to a selected color component of each pixel of the digital image, the first set of rules comprising:
i. adding to an intensity value of the selected color component a value equal to the product of the intensity value, from which is subtracted the sum of intensity values of eight neighboring pixels, multiplied by a predetermined fraction associated with the selected color component;
b. applying a second set of rules to the remaining color components of each pixel of the digital image, the second set of rules comprising:
i. adding to an intensity value of each of the remaining color components a value equal to the product of the intensity value multiplied by a predetermined fraction associated with each of the remaining color components;
c. setting to a minimum value any color component intensity value lower than the minimum value; and
d. setting to a maximum value any color component intensity value greater than the maximum value.
6. A surface analysis system according to claim 5, wherein the selected color component is the red component.
7. A surface analysis system according to claim 5, wherein the at least one digital filter further includes an inverse filter.
8. A surface analysis method, comprising:
a. capturing a digital image of a surface;
b. processing the digital image using at least one digital filter to highlight relief variations in the surface; and
c. displaying the processed digital image.
9. A surface analysis method according to claim 8, wherein the surface is a surface of a patient's body.
10. A surface analysis method according to claim 8, wherein the relief variations are indicative of structures underlying the surface.
11. A surface analysis method according to claim 8, wherein the processing step includes processing the digital image using at least one digital filter selected from a group consisting of an inverse filter, a solarize filter and an edge detect filter.
12. A surface analysis method according to claim 8, wherein the processing step includes processing the digital image using a custom filter, the application of the custom filter to the digital image comprising:
a. applying a first set of rules to a selected color component of each pixel of the digital image, the first set of rules comprising:
i. adding to an intensity value of the selected color component a value equal to the product of the intensity value, from which is subtracted the sum of intensity values of eight neighboring pixels, multiplied by a predetermined fraction associated with the selected color component;
b. applying a second set of rules to the remaining color components of each pixel of the digital image, the second set of rules comprising:
i. adding to an intensity value of each of the remaining color components a value equal to the product of the intensity value multiplied by a predetermined fraction associated with each of the remaining color components;
c. setting to a minimum value any color component intensity value lower than the minimum value; and
d. setting to a maximum value any color component intensity value greater than the maximum value.
13. A surface analysis method according to claim 12, wherein the selected color component is the red component.
14. A surface analysis method according to claim 12, wherein the processing step further includes processing the digital image with an inverse filter.
15. A digital filtering method for filtering a digital image provided with at least two color components, the digital filtering method comprising:
a. selecting a color component;
b. applying a first set of rules to the selected color component of each pixel of a digital image, the first set of rules comprising:
i. adding to an intensity value of the selected color component a value equal to the product of the intensity value, from which is subtracted the sum of intensity values of eight neighboring pixels, multiplied by a predetermined fraction associated with the selected color component;
c. applying a second set of rules to the remaining color components of each pixel of the digital image, the second set of rules comprising:
i. adding to an intensity value of each of the remaining color components a value equal to the product of the intensity value multiplied by a predetermined fraction associated with each of the remaining color components;
d. setting to a minimum value any color component intensity value lower than the minimum value; and
e. setting to a maximum value any color component intensity value greater than the maximum value.
PCT/CA2006/001795 2005-11-04 2006-11-01 Surface analysis method and system WO2007051299A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/092,480 US20100020164A1 (en) 2005-11-04 2006-11-01 Surface Analysis Method and System
EP06804670.5A EP1958150B1 (en) 2005-11-04 2006-11-01 Surface analysis method and system
CA2628087A CA2628087C (en) 2005-11-04 2006-11-01 Surface analysis method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US73317805P 2005-11-04 2005-11-04
US60/733,178 2005-11-04

Publications (1)

Publication Number Publication Date
WO2007051299A1 (en)

Family

ID=38005389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2006/001795 WO2007051299A1 (en) 2005-11-04 2006-11-01 Surface analysis method and system

Country Status (4)

Country Link
US (1) US20100020164A1 (en)
EP (1) EP1958150B1 (en)
CA (1) CA2628087C (en)
WO (1) WO2007051299A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10574883B2 (en) 2017-05-31 2020-02-25 The Procter & Gamble Company System and method for guiding a user to take a selfie
US10818007B2 (en) 2017-05-31 2020-10-27 The Procter & Gamble Company Systems and methods for determining apparent skin age
US11055762B2 (en) 2016-03-21 2021-07-06 The Procter & Gamble Company Systems and methods for providing customized product recommendations

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2625775A1 (en) 2005-10-14 2007-04-19 Applied Research Associates Nz Limited A method of monitoring a surface feature and apparatus therefor
CA2845015C (en) * 2011-08-11 2020-10-06 University Of Virginia Image-based identification of muscle abnormalities
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US10013527B2 (en) 2016-05-02 2018-07-03 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
EP4183328A1 (en) 2017-04-04 2023-05-24 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
CN111369455B (en) * 2020-02-27 2022-03-18 复旦大学 Highlight object measuring method based on polarization image and machine learning
CN113008470B (en) * 2020-07-22 2024-02-13 威盛电子股份有限公司 Gas leakage detection device and gas leakage detection method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6697516B1 (en) 1997-03-28 2004-02-24 Sollac Method for inspecting the surface of a moving strip by prior classification of the detected surface irregularity
WO2005065293A2 (en) 2003-12-29 2005-07-21 Syris Scientific. L.L.C. Polarized material inspection apparatus and system

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4519041A (en) * 1982-05-03 1985-05-21 Honeywell Inc. Real time automated inspection
US4693255A (en) * 1985-04-22 1987-09-15 Beall Harry C Medical apparatus method for assessing the severity of certain skin traumas
DE3831630A1 (en) * 1988-09-17 1990-03-22 Ulrich M Landwehr METHOD AND DEVICE FOR DETERMINING THE DIMENSIONS OF AN OBJECT BY PHOTOGRAPHIC WAY
GB9304058D0 (en) * 1993-03-01 1993-04-14 Orthotics Limited Improvements relating to foot orthoses
DE69428148T2 (en) * 1993-03-24 2002-05-29 Fujifilm Electronic Imaging Color change of pictures
US5747789A (en) * 1993-12-01 1998-05-05 Dynamics Imaging, Inc. Method for investigation of distribution of physiological components in human body tissues and apparatus for its realization
AU1925595A (en) * 1994-02-18 1995-09-04 Imedge Technology, Inc. Method of producing and detecting high-contrast images of the surface topography of objects and a compact system for carrying out the same
US5590660A (en) * 1994-03-28 1997-01-07 Xillix Technologies Corp. Apparatus and method for imaging diseased tissue using integrated autofluorescence
US6061463A (en) * 1995-02-21 2000-05-09 Imedge Technology, Inc. Holographic fingerprint device
US6023637A (en) * 1997-03-31 2000-02-08 Liu; Zhong Qi Method and apparatus for thermal radiation imaging
US6567682B1 (en) * 1999-11-16 2003-05-20 Carecord Technologies, Inc. Apparatus and method for lesion feature identification and characterization
KR100397483B1 (en) * 2000-09-21 2003-09-13 이희만 Noncontact-type foot-shape measuring device using line scanning, and measuring method therefor
US6907193B2 (en) * 2001-11-08 2005-06-14 Johnson & Johnson Consumer Companies, Inc. Method of taking polarized images of the skin and the use thereof
FR2840686B1 (en) * 2002-06-10 2008-11-14 Oreal METHOD FOR DETERMINING THE ABILITY TO SPREAD AND / OR ABSORB THE LIGHT OF A COSMETIC PRODUCT
US20040125996A1 (en) * 2002-12-27 2004-07-01 Unilever Home & Personal Care Usa, Division Of Conopco, Inc. Skin diagnostic imaging method and apparatus
DE10315923A1 (en) * 2003-04-08 2004-10-28 Tbs Holding Ag Procedure to detect data of uneven surfaces for biometric data, using non-contact optical sensing of surface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1958150A4

Also Published As

Publication number Publication date
EP1958150A1 (en) 2008-08-20
US20100020164A1 (en) 2010-01-28
EP1958150B1 (en) 2013-05-22
CA2628087A1 (en) 2007-05-10
EP1958150A4 (en) 2011-09-21
CA2628087C (en) 2016-11-01

Similar Documents

Publication Publication Date Title
CA2628087C (en) Surface analysis method and system
EP1566142A1 (en) Imaging of buried structures
JP6501915B2 (en) Method and system for laser speckle imaging of tissue using color image sensor
JP6763719B2 (en) Biometric information measuring device, biometric information measuring method and program
US20160086380A1 (en) Hyperspectral imager
EP1433418A1 (en) Skin diagnostic imaging method and apparatus
CN102083362B (en) Locating and analyzing perforator flaps for plastic and reconstructive surgery
WO2014175154A1 (en) Blood flow image diagnosis device and diagnosis method
JP2004321793A (en) Method and system for computational analysis of skin image
CN112218576A (en) Device and method for acquiring and analyzing images of the skin
JP2009297295A (en) Evaluation method of smoothness of skin
JP6907398B2 (en) Endoscope system
JP2009528148A (en) Acne manifestation method before appearance
EP3219251A1 (en) Organ image capture device and program
CN106714651B (en) Evaluate value calculation apparatus and electronic endoscope system
Wieringa et al. Remote non-invasive stereoscopic imaging of blood vessels: first in-vivo results of a new multispectral contrast enhancement technology
JP4649965B2 (en) Health degree determination device and program
JP4574195B2 (en) Fundus diagnosis device
JP2018109759A (en) Electron microscope
JP5399874B2 (en) Image processing apparatus and image processing method
JP2018000587A (en) Information acquisition device, imaging device and information acquisition method
US20030069485A1 (en) Optical image measuring device
van Herpt et al. Burn imaging with a whole field laser Doppler perfusion imager based on a CMOS imaging array
JP2529539B2 (en) Bio-reflected light weak difference detection device
CN107405050B (en) Endoscopic system and evaluation value calculation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2628087

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2006804670

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2006804670

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 12092480

Country of ref document: US