WO2019092509A1 - UV device for evaluation of skin conditions - Google Patents

UV device for evaluation of skin conditions

Info

Publication number
WO2019092509A1
Authority
WO
WIPO (PCT)
Prior art keywords
lighting
filtering device
light
camera
external device
Prior art date
Application number
PCT/IB2018/001517
Other languages
French (fr)
Inventor
Laurent Petit
Laurent Chantalat
Matthieu JOMIER
Romain VERNET
Cyprien ADNET
Original Assignee
Galderma Research & Development
Priority date
Filing date
Publication date
Application filed by Galderma Research & Development filed Critical Galderma Research & Development
Publication of WO2019092509A1 publication Critical patent/WO2019092509A1/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0071 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface

Definitions

  • the present disclosure generally relates to a device configured for use with a camera for capturing an image of a patient's skin.
  • a Woods lamp, also known as a black light or ultraviolet (UV) light, is a lamp which emits long-wave UV light and only a small amount of visible light.
  • a Woods lamp is helpful in diagnosing bacterial infections, including Propionibacterium acnes (also referred to as "p. acnes"), a bacterium that causes acne. This bacterium exhibits an orange-type glow under a Woods lamp. More specifically, acne fluoresces orange-red under a Woods lamp due to the presence of propionibacteria on a person's hair follicles or skin.
  • features of a facial image of a subject may be extracted by illuminating a portion of the subject's skin using a Woods lamp.
  • the Woods lamp may help identify an amount of acne-causing bacteria present on the portion of the subject's skin by rendering visible on the subject's skin residues, such as porphyrin, which are excreted by the bacteria.
  • the presence and/or amount of porphyrin present on a subject's skin is a proxy for the presence and/or amount of p-acnes.
  • An image of the illuminated residue is captured and then analyzed to determine the presence and/or amount of p-acnes present on the subject's skin.
  • Rubinstenn et al. does not teach how to use new technologies to improve consumer engagement with treatment of a patient's skin in order to improve patient compliance with the treatments.
  • the lighting and filtering device is configured to be physically mounted to the camera.
  • the lighting and filtering device is configured to be physically mounted to an external device that comprises the camera.
  • the external device is a cellular phone, a tablet computer, a handheld media device, a laptop computer, or a desktop computer.
  • the external device is a cellular phone.
  • the lighting and filtering device is configured to be physically mounted to the cellular phone via a magnet.
  • the lighting and filtering device also includes a housing having a first lateral side configured to extend along a first lateral side of the cellular phone, a second lateral side configured to extend along a second lateral side of the cellular phone, and a third lateral side extending from the first lateral side to the second lateral side.
  • the housing includes an annular portion, the at least one light source is located at a surface of the annular portion, and the annular portion surrounds the filter.
  • the at least one light source includes a plurality of light sources, and the plurality of light sources are evenly distributed around the surface of the annular portion.
  • the plurality of light sources includes 8 light sources.
  • the lighting and filtering device also includes a battery configured to supply power to the at least one light source.
  • the lighting and filtering device also includes a processor configured to control the at least one light source.
  • the lighting and filtering device also includes a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
  • the transceiver operates using a frequency-hopping technique.
  • the external device is configured to receive imaging data from the camera via the transceiver and further configured to process the imaging data.
  • the external device is further configured to determine a presence and an amount of a substance on the portion of the patient's skin based on the imaging data.
  • the external device is further configured to compare the imaging data to imaging data of a control image.
  • the lighting and filtering device also includes a connector configured to allow wired communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
  • the connector is a USB type connector.
  • the lighting and filtering device also includes a processor, a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor, and a connector configured to allow wired communication between the processor and the external device, such that the at least one light source is controllable by the external device via the processor.
  • the transceiver operates using a frequency-hopping technique
  • the connector is a USB type connector.
  • the band-pass filter is configured to attenuate light that is less than a second predetermined wavelength and greater than a third predetermined wavelength, the second predetermined wavelength being in a range of 550 to 650 nm, and the third predetermined wavelength being in a range of 650 to 750 nm.
  • the first predetermined wavelength is in a range of 430 nm to 440 nm.
  • the first predetermined wavelength is 435 nm.
  • the at least one light source is configured to emit light having a peak wavelength in a range between 385 nm and 425 nm.
  • the at least one light source is configured to emit light having a peak wavelength in a range between 400 nm and 415 nm.
  • the at least one light source is configured to emit light having a peak wavelength of 405 nm.
  • the peak wavelength of light emitted by the band-pass filter is in a range between 600 nm and 650 nm.
  • the peak wavelength of light emitted by the band-pass filter is in a range between 630 nm and 640 nm.
  • the peak wavelength of light emitted by the band-pass filter is 635 nm.
  • the band-pass filter is configured to facilitate transmission of light having a full width at half maximum in a range of 70 nm to 130 nm therethrough.
  • the lighting and filtering device comprises the camera.
  • the lighting and filtering device also includes a black housing.
  • a method of diagnosing p. acnes using an external device and a lighting and filtering device, the external device including a processor and a camera, and the lighting and filtering device including a light source configured to emit light having a peak wavelength in a range between 365 nm and 445 nm, the lighting and filtering device being configured to be connected to the external device, the method including connecting, by a user, the lighting and filtering device to the external device, positioning, by the user, the lighting and filtering device adjacent to a target area of the user's skin, emitting, by the lighting and filtering device, the light onto the target area, and determining, by the processor, whether the target area contains any p. acnes.
  • the method further includes obtaining, by the camera, an image of the target area while the light is emitted onto the target area.
  • the method further includes instructing, by the processor, the camera to obtain the image, where the processor does not instruct the camera to obtain the image until the light has been emitted onto the target area for a target amount of time.
  • the method further includes determining, by the processor, whether each pixel of the image has an attribute that is above an attribute threshold.
  • the attribute is a red component of the pixel.
  • the method further includes recording, by the processor, each pixel that has an attribute that is above the attribute threshold.
  • the method further includes calculating, by the processor, an area percentage for the image by comparing a total number of pixels for the image and a number of pixels for which the attribute is above the attribute threshold.
  • the method further includes calculating, by the processor, an area score by comparing the area percentage to an area percentage threshold.
  • the method further includes calculating, by the processor, an area score for the image by comparing the area percentage and an area percentage threshold and correlating, by the processor, the area score to a plurality of area score ranges to determine if the target area contains any p-acnes.
  • the method further includes calculating, by the processor, an average fluorescence for the image.
  • the method further includes displaying, by the external device, the image, the area score, and the average fluorescence.
  • FIG. 1 is a front perspective view of a lighting and filtering device configured for use with a camera, according to one exemplary embodiment.
  • FIG. 2 is a front perspective view of a lighting and filtering device configured for use with a camera, similar to the lighting and filtering device configured for use with a camera shown in FIG. 1.
  • FIG. 3 is a front cross-sectional perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 4 is a front perspective view of a lighting and filtering device configured for use with a camera, according to a second exemplary embodiment.
  • FIG. 5 is a front perspective view of a lighting and filtering device configured for use with a camera, according to a third exemplary embodiment.
  • FIG. 6 is a rear perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 5.
  • FIG. 7 is a graph illustrating an internal transmittance of a 435 nm long-pass filter useable with a lighting and filtering device configured for use with a camera.
  • FIG. 8 is a graph illustrating a transmission rate for wavelengths of light of a 630 nm band-pass filter useable with a lighting and filtering device configured for use with a camera.
  • FIG. 9 is a front perspective view of an assembly including the lighting and filtering device configured for use with a camera shown in FIG. 1 mounted on a cellular phone.
  • FIG. 10 is a rear perspective view of the assembly shown in FIG. 9.
  • FIG. 11 is a schematic illustrating components of an external device on which the lighting and filtering device configured for use with a camera, shown in FIG. 1, is mountable.
  • FIG. 12 is a front perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 1, according to one aspect.
  • FIG. 13 is a front perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 4 mounted on a cellular phone.
  • FIG. 14 is a rear view of the assembly shown in FIG. 13.
  • FIG. 15A is a photograph showing an image captured by a camera of a handheld media device.
  • FIG. 15B is a photograph showing an image captured by the camera of a handheld media device using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 16A is a photograph showing an image captured by a camera of a cellular phone.
  • FIG. 16B is a photograph showing an image captured by a camera of the cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 17 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 18 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 19 is a block diagram of a process for analyzing p-acnes on a user's skin.
  • FIG. 20 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 21 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
  • FIG. 22 is a front perspective view of a lighting and filtering device configured for use with a camera and having a blocking tube, according to one exemplary embodiment.
  • FIG. 23 is a side perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 24 is a rear perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 25 is a detailed view of the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 26 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 27 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 28 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
  • FIG. 29 is a front view of a blocking tube configured for use with a lighting and filtering device, according to one exemplary embodiment.
  • FIG. 30 is a cross-sectional view of the blocking tube shown in FIG. 29 taken about plane A-A.
  • FIG. 31 is a bottom view of the blocking tube shown in FIG. 29.
  • FIG. 32 is a rear view of the blocking tube shown in FIG. 29.
  • the description below relates to a lighting and filtering device configured for use with a camera.
  • the lighting and filtering device configured for use with a camera is configured to provide imaging of a portion of a patient's skin and determine a level (e.g., an intensity, a density, etc.) of fluorescence illuminating the portion of the patient's skin.
  • the lighting and filtering device configured for use with a camera (such as a device for providing imaging of a patient's skin or device for evaluating a skin condition of a patient) herein disclosed is configured to assist a patient in determining an effectiveness of skin treatment by, for example, determining an existence or amount of a substance present on the skin of the patient.
  • the lighting and filtering device configured for use with a camera is useable in determining an existence and/or amount of a substance on the patient's skin indicative of a state and/or extent of a skin condition.
  • a skin condition includes, but is not limited to, increase in pigmentation (e.g., melasma, postinflammatory pigmentation, etc.), loss of pigmentation (e.g., Vitiligo, ash-leaf macules in tuberous sclerosis, and hypomelanosis of Ito), pityriasis versicolor, malassezia folliculitis, tinea capitis, head lice, scabies, erythrasma, pseudomonas (wound infection), acne (e.g., p. acnes), porphyria, presence of porphyrins, photodamage, and actinic keratosis.
  • the lighting and filtering device configured for use with a camera is useable in determining an existence and/or amount of a substance on the patient's skin that is unrelated to a skin condition of the patient, for example, a presence and/or amount of sunscreen applied to a portion of the patient's skin or application of salicylic acid (e.g., application of a chemical peel) on a portion of the patient's skin.
  • Acne, particularly acne caused by p. acnes, is particularly prone to poor patient compliance and low consumer engagement with treatment.
  • One type of acne treatment involves applying a topical product to the acne, such as a product that includes benzoyl peroxide as an active ingredient.
  • a severity of acne caused by p. acnes can be determined based on a presence and amount of porphyrin present on a patient's skin because p. acnes bacteria excrete porphyrin.
  • a device 1 (e.g., a lighting and filtering device configured for use with a camera) is shown in FIG. 1.
  • the device 1 is configured to be physically mounted to the camera.
  • the device 1 is configured to be mounted by any suitable means to an external device that includes a camera.
  • the external device may be a computing device such as a cellular phone (e.g., smartphone, etc.), a tablet computer, a handheld media device, a laptop computer, or a desktop computer.
  • the device 1 is configured to illuminate a portion of a patient's skin to facilitate imaging the portion of the patient's skin so that a presence and/or amount of a substance (for example, a substance secreted by bacteria that cause acne) present on the portion of the patient's skin can be detected and determined, thereby allowing for the patient to determine or visualize an effectiveness of treatment of the patient's skin.
  • the device 1 includes a housing (e.g., case) 5 which includes a front, closed surface 6 and a back, open surface 7 (shown in FIG. 3).
  • the housing 5 is made of any suitable material, for example, molded plastic or metal, such that the housing 5 is configured to be mounted to a computing device.
  • the device 1 is configured to be physically mounted to an external device via a magnet.
  • the device 1 may be configured to be physically mounted to an external device via adhesive (e.g., fugitive glue, double sided adhesive, etc.) or via a hook and loop fastener (e.g., Velcro, etc.).
  • the device 1 may be configured to be physically mounted to an external device via fasteners (e.g., snap members, clasps, bolts, etc.).
  • the housing 5 is of any suitable size or shape, for example, a size or shape for mounting device 1 onto at least one side of an external device (e.g., a cellular phone) so that the device 1 is used in conjunction with the external device (as described below) without adversely affecting the patient's ability to use the external device.
  • the housing 5 includes a first lateral side (e.g., a right side) 8 configured to extend along a first lateral side 128 of an external device 120 (shown in FIG. 9), which is, in this case, a cellular phone.
  • as shown in FIG. 1, the housing 5 also includes a second lateral side (e.g., left side) 9 configured to extend along a second lateral side 129 of the external device 120, and a third lateral side (e.g., bottom side) 10 extending from the first lateral side 8 to the second lateral side 9 of the housing 5.
  • the housing 5 is colored black (e.g., by painting the housing 5 with a black paint, dying the housing 5 black, or other suitable coloration process) to avoid external light reflection off of the housing 5 when the device 1 is in use.
  • the housing 5 is formed of molded black plastic.
  • the housing 5 includes a recess or opening 15 configured to house a power button 20 (shown in FIG. 1) for controlling an electric power provided to the device 1 (for example, a button for turning the device 1 on or off).
  • the housing 5 also includes a recess or opening 16 configured to house a hub or plug for connecting the device 1 to an electric power supply or to house a USB type connector.
  • the recess 16 is a port configured to receive a USB type connector 25 configured to provide a wired connection between the device 1 and an external device (such as the external device 120 shown in FIGS. 9-10) and/or an external power source.
  • the housing 5 also includes an annular portion 30.
  • the annular portion 30 is of any suitable size.
  • the annular portion 30 is configured to house or support at least one lighting element, such as at least one light source 40 or filter 45.
  • the annular portion 30 includes a surface 31 configured to support the at least one light source 40 and/or the filter 45.
  • the surface 31 has a circular cross section and is configured to be received into an interior region 32 of the annular portion 30 (for example, the interior region 32 of annular portion 30 defined by an outer circumference and an inner circumference of annular portion 30).
  • the annular portion 30 is configured to hold the at least one light source 40 (described below) at the surface 31 and is further configured to surround the filter 45.
  • as shown in FIG. 1, the annular portion 30 is disposed on the first lateral side 8 of the housing 5, but the annular portion 30 is not limited to this particular orientation.
  • the annular portion 30 is disposed on the second lateral side 9 of the housing 5.
  • the at least one light source 40 is, for example, a Woods lamp for providing UV light and of a size appropriate for mounting on an external device (e.g., cellular phone, etc.).
  • the at least one light source 40 includes a plurality of light sources 40 which are evenly distributed around the surface 31 of the annular portion 30 of the housing 5.
  • although FIG. 1 shows eight light sources distributed along a circumference of the annular portion 30, the at least one light source 40 is not limited to this particular example.
  • the at least one light source 40 includes one light source.
  • the at least one light source 40 includes two, three, four, five, six, seven, or eight light sources.
  • the at least one light source 40 includes more light sources, such as sixteen light sources, thirty-two light sources, or sixty-four light sources.
  • the at least one light source 40 is a light emitting diode (LED)
  • the at least one light source 40 is a UV flash light.
  • the at least one light source 40 is configured to emit light having a predetermined wavelength, for example, light having at least a component of UV light (e.g., light having a wavelength of between 100 nm and 400 nm).
  • the at least one light source 40 emits light having a wavelength outside the UV wavelength ranges to avoid damaging a skin of a patient using the device 1.
  • the at least one light source 40 is configured, for example, to emit light having a peak wavelength in a range between 365 nm and 445 nm.
  • the at least one light source 40 is configured to emit light having a peak wavelength in a range between 385 nm and 425 nm. More preferably, the at least one light source 40 is configured to emit light having a peak wavelength in a range between 400 nm and 420 nm. More preferably, the at least one light source 40 is configured to emit light having a peak wavelength in a range between 400 nm and 415 nm. More preferably, the at least one light source 40 is configured to emit light having a wavelength of 405 nm. As another specific example, the at least one light source 40 is configured to emit a light component having a peak wavelength of 415 nm.
  • the at least one light source 40 is configured such that it can emit light of different peak wavelengths, depending on the wavelength of light needed or desired by a patient, for example, wavelengths of 365 nm, 385 nm, 405 nm, and 415 nm.
  • the at least one light source 40 is configured to provide light so as to create or cause a fluorescence on the portion of the skin of the patient.
  • the fluorescence on the portion of skin is determined based on the wavelength of light emitted by the at least one light source 40 and a substance present on the skin of the patient.
  • the light filter 45 is any suitable filter for filtering out light along an optical path from an imaging object to a camera.
  • the light filter 45 is configured to be disposed in front of a lens of a camera.
  • the light filter 45 is, for example, a band-pass filter, a long-pass filter, or an ultraviolet light filter configured to filter out short-wavelength light (such as UV light).
  • the light filter 45 is configured to attenuate light that is less than a predetermined wavelength.
  • the predetermined wavelength is, for example, within a range of 400 nm to 450 nm.
  • the predetermined wavelength is, for another example, within a range of 430 nm to 435 nm.
  • the predetermined wavelength is, for another example, 435 nm.
  • the predetermined wavelength of light is in a range of 400 nm to 450 nm.
  • the predetermined wavelength of light is in a range of 430 nm to 440 nm. More preferably, the predetermined wavelength of light is 435 nm.
  • the light filter 45 is calibrated for specific color ranges and may be a commercially available light filter. The light filter 45 is configured to be positioned along an optical path between a camera and the portion of the patient's skin and thereby filter light from the portion of the patient's skin to the camera of the mobile computing device.
  • the light filter 45 is a band-pass filter
  • the light filter 45 is configured to be disposed in front of the lens of the camera and is configured to facilitate transmission of light having a peak wavelength in a range between 550 nm and 750 nm therethrough, as shown in FIG. 8.
  • the band-pass filter is configured to attenuate light that is less than a first predetermined wavelength and greater than a second predetermined wavelength.
  • the first predetermined wavelength is in a range of 550 nm to 650 nm
  • the second predetermined wavelength is in a range of 650 nm to 750 nm.
  • the band-pass filter emits light having a peak wavelength in a range between 600 nm and 650 nm. As a further specific example, the band-pass filter emits light having a peak wavelength in a range between 630 nm and 640 nm. As a still further specific example, the band-pass filter emits light having a peak wavelength of 635 nm. As another example, the band-pass filter is configured to facilitate transmission of light having a full width at half maximum in a range of 70 nm to 130 nm therethrough; a rough numerical illustration of such a pass band is sketched below.
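As a rough numerical illustration of the band-pass characteristics described above (a 635 nm peak and a 70 nm to 130 nm full width at half maximum), the sketch below approximates the pass band as a Gaussian transmission curve. The function name, the Gaussian shape, and the default values are illustrative assumptions; the measured curve of FIG. 8 is not reproduced here.

```python
import math

def bandpass_transmission(wavelength_nm: float,
                          peak_nm: float = 635.0,
                          fwhm_nm: float = 100.0) -> float:
    """Approximate relative transmission of a band-pass filter.

    The pass band is modeled as a Gaussian centred at peak_nm with the
    given full width at half maximum (values taken from the ranges above).
    Returns a value between 0 (blocked) and 1 (fully transmitted).
    """
    sigma = fwhm_nm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-((wavelength_nm - peak_nm) ** 2) / (2.0 * sigma ** 2))

# Porphyrin fluorescence near 635 nm passes essentially unattenuated,
# while the ~405 nm excitation light from the light sources 40 is blocked.
print(round(bandpass_transmission(635.0), 3))   # ~1.0
print(round(bandpass_transmission(405.0), 6))   # ~0.0
```

Under this approximation the filter passes the red fluorescence band while rejecting the violet/UV excitation, which is the behaviour the device relies on for imaging porphyrin.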
  • the housing 5 is configured to house at least one electric power source (e.g., battery) 55.
  • the at least one battery 55 is configured to supply electric power to a circuit board 50 and/or the at least one light source 40.
  • the battery 55 is any suitable commercially available battery.
  • the housing 5 is also configured to house a circuit board 50 (which may act as a processor or controller).
  • the circuit board 50 is configured to control the various components of the device 1, such as the at least one light source 40.
  • the circuit board 50 is also configured to control a power output of the battery 55 to the at least one light source 40 and to control a power input to the at least one battery 55 from an external power supply for charging the battery 55.
  • the circuit board 50 is configured to control the at least one light source 40, such as controlling a wavelength, a light or flash intensity (such as the number of LEDs emitting light at a given time), and/or a flash duration of light emitted by the at least one light source 40.
  • the circuit board 50 includes a LED driver, a USB battery charging module, and a Bluetooth module.
  • the LED driver may receive power (e.g., 3.7 Volts, etc.) from the battery 55 and increase that power to a level required by the at least one light source 40.
  • the USB battery charging module may facilitate charging of the battery 55 using an external USB charger (e.g., may facilitate intermittent charging of the battery 55, etc.).
  • the Bluetooth module may be configured to facilitate interaction between the device 1 and an external device (described below).
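As a hypothetical sketch only, the kind of flash parameters the circuit board 50 might expose (wavelength selection, intensity via the number of LEDs lit, and flash duration) could be represented as follows; the class and field names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FlashSettings:
    """Hypothetical control parameters for the at least one light source 40."""
    peak_wavelength_nm: int = 405   # e.g., one of 365, 385, 405, 415
    leds_enabled: int = 8           # intensity control: number of LEDs lit at once
    flash_duration_s: float = 10.0  # how long the light is emitted

def validate(settings: FlashSettings) -> None:
    """Keep the requested settings inside the ranges described above."""
    if not 365 <= settings.peak_wavelength_nm <= 445:
        raise ValueError("peak wavelength outside the 365-445 nm range")
    if not 1 <= settings.leds_enabled <= 8:
        raise ValueError("the embodiment of FIG. 1 has eight light sources")
    if settings.flash_duration_s <= 0:
        raise ValueError("flash duration must be positive")
```

Such settings would be communicated to the circuit board 50 over the Bluetooth transceiver or the USB type connector described below.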
  • the housing also includes holes 11 and 12.
  • the device 1 includes a transceiver configured to allow wireless communication between the processor and the external device (described below) such that the at least one light source is controllable by the external device via the circuit board (e.g., processor) 50.
  • the transceiver operates using a frequency-hopping technique.
  • the transceiver may be a Bluetooth ® transceiver.
  • the external device is configured to receive imaging data from the camera (described below) via the transceiver and further configured to process the imaging data (as described below).
  • the device 1 also includes a connector 25 configured to allow wired communication between the processor and the external device such that the at least one light source is controllable by the external device via the circuit board 50.
  • the connector 25 may be a USB type connector.
  • referring to FIG. 4, a second exemplary embodiment of the lighting and filtering device configured for use with a camera is shown.
  • the device 2 shown in FIG. 4 is similar to the device 1, except that the device 2 is of a different size and shape.
  • the device 2 includes housing 5a having a front surface 6a, a first lateral side 8a, a second lateral side 9a, and a third lateral side 10a.
  • the housing 5a also includes an annular portion 30a disposed on the first lateral side 8a of the housing 5a.
  • the annular portion 30a houses at least one light source 40 and a light filter (e.g., band-pass filter) 45.
  • the light filter 45 is configured to be selectively coupled to, and selectively uncoupled from, the annular portion 30a via a threaded interface.
  • referring to FIGS. 5-6, a third exemplary embodiment of the lighting and filtering device configured for use with a camera is shown.
  • the device 3 is similar to the device 1 herein disclosed, except for the differences herein described.
  • the device 3 includes a housing 5b which includes a front, closed surface 6b and a rear, closed surface 7b.
  • the device 3 also includes a first lateral side 8b, a second lateral side 9b, and a third lateral side 10b between the first lateral side 8b and the second lateral side 9b.
  • the device 3 also includes an annular portion 30b which is disposed between the first lateral side 8b and the second lateral side 9b.
  • the annular portion 30b is configured to house the at least one light source 40 (described above) and a light filter (e.g., band-pass filter) 45 (described above).
  • the annular portion 30b is also configured to house a camera 60 which is configured to capture an image of a portion of a patient's skin illuminated by the at least one light source 40.
  • the device 3 also includes a button 13b for controlling the camera 60.
  • the device 3 also includes the circuit board 50 (described above), except that the circuit board 50 also includes a processor configured to process an image captured by the camera.
  • the circuit board 50 is configured to execute programmable logic (such as an algorithm of a mobile software application) stored in a memory of the device 3.
  • the device 3 is usable for evaluating a skin condition of the patient because the processor processes and analyzes the image provided by the camera to identify skin fluorescence produced by the device 3 on the skin of the patient.
  • the processor is then capable of mapping skin parameters and trends in relation to the patient by comparing the captured image or images to a control image or images.
  • the device 3 is useable to compute a severity index of the acne based on a presence and/or amount of porphyrin on the skin of the patient based on a fluorescence level detected by the camera 60 on the surface of the skin.
  • the fluorescence level, in one example, corresponds to an intensity of the fluorescence (such as an overall brightness, an overall amount of fluorescence produced, etc.) or, in another example, corresponds to a density of the fluorescence (such as an amount of fluorescence distributed over the surface area of the skin of a user).
  • an assembly 100 includes a device 1 (as described above) and an external device 120, such as a smartphone.
  • the external device 120 includes a front side 121 which includes an interface 123 which allows a user of the external device 120 (such as a patient) to use software (e.g., mobile software applications) on the external device 120.
  • the external device 120 also includes a back side 122 on which the device 1 is mountable.
  • the device 1 is mountable on the external device 120 by any suitable means, for example, by use of a magnet to mount the device 1 to the back side 122 of the external device 120.
  • the device 1 is in electrical connection with the external device 120, for example, by a physical connection or by a wireless connection or a combination of both.
  • the external device 120 also includes a first lateral side 128, a second lateral side 129, and a third lateral side 130 disposed between the first lateral side 128 and the second lateral side 129.
  • the external device 120 includes a camera 160, a memory 170, and a processor 180.
  • the camera 160 is capable of imaging a portion of the skin of a patient by capturing an image of the portion of skin.
  • the camera 160 is an 8 Mega-pixel (Mpixel) camera with a 2.4 aperture.
  • the camera 160 is a 12 Mpixel camera with a 2.2 aperture.
  • the camera 160 is a 12 Mpixel camera with a 1.8 aperture.
  • the camera 160 is configured to provide the captured image to the memory 170 of the external device 120.
  • the memory 170 is configured to store the captured image as a stored captured image.
  • the stored image is, according to one example, a two-dimensional image.
  • the stored image includes multiple images superimposed on each other to create a single two-dimensional image.
  • a facial stored image includes a left facial side image, a right facial side image, and a frontal facial image, superimposed to create a single two-dimensional image.
  • the stored image is a three-dimensional image created by superimposing a left facial side image, a right facial side image, and frontal facial image.
  • camera 160 includes dual-camera technology.
  • the external device 120 also includes a programmable memory 170, as shown in FIG. 11.
  • the programmable memory 170 is configured to store software, such as an operating system (OS).
  • the OS is Apple's iOS® or Google's Android®, but in other examples, the OS is another mobile OS.
  • the memory 170 is configured to store programmable logic such as a mobile software application, (e.g., an application for evaluating a skin condition of a patient or evaluating an effectiveness of a skin treatment of the patient) configured to process and/or analyze an image stored in the memory 170 and thereby determine a presence and/or an amount of a substance present on the patient's skin based on the image.
  • the memory 170 is also configured to store images captured by the camera 160 of the external device 120 or control images provided to the external device 120.
  • the memory 170 is, as one example, transitory memory and/or non-transitory memory.
  • memory 170 is implemented as RAM, ROM, flash, ferromagnetic, phase-change memory, and/or other suitable memory technologies.
  • the external device 120 also includes a processor 180 which is configured to execute programmable logic (such as an algorithm of a mobile software application) stored in the memory 170 of the external device 120.
  • when the processor 180 executes the mobile software application for evaluating the skin condition of the patient, the processor processes and analyzes the image to identify skin fluorescence produced by the device 1 on the skin of the patient.
  • the processor is then capable of mapping skin parameters and trends in relation to the patient by comparing the captured image or images to a control image or images.
  • the processor 180 may facilitate mobile communication between the device 1 and a cloud based server.
  • the cloud based server may be configured to perform analysis of images taken by the external device 120, rather than relying on the mobile software application performing such analysis. This may reduce the operating requirements of the device 1, thereby facilitating use of the device 1 with external devices 120 having varying capabilities.
  • the device 1 also includes a mirror 90 which is positioned on the front surface 6 of the housing 5, and is also positioned relative to the annular portion 30. Accordingly, when the device 1 is mounted to the external device 120, the mirror 90 is positioned relative to the camera 160 of the external device 120 such that the patient or user can use the mirror 90 to facilitate appropriate positioning of the device 1 relative to the portion of the patient's skin which the patient desires to image.
  • FIGS. 13-14 show an alternative aspect of the assembly.
  • Assembly 200 includes an external device 220, which is similar to the external device 120 described above, and the device 2 described above.
  • Assembly 200 is useable in the same manner as assembly 100 for patient treatment, as disclosed herein.
  • FIGS. 15A-18 show photographs taken using the camera 160 of an external device 120 compared to high contrast and/or filtered images.
  • FIG. 15A shows an image captured by a camera of a handheld media device.
  • FIG. 15B shows an image captured by the camera of a handheld media device using the device 1.
  • FIG. 16A shows an image captured by a camera of a cellular phone.
  • FIG. 16B shows an image captured by the camera of the cellular phone using the device 1.
  • FIG. 17 shows an image captured by the camera 160 of the external device 120 using the device 1.
  • FIG. 18 shows an image captured by the camera 160 of the external device 120 using the device 1.
  • FIGS. 17 and 18 are useable to detect a fluorescence from the skin of the face of the patient.
  • the assembly 100 is useable to compute a severity index of the acne based on a presence and/or amount of porphyrin on the skin of the patient based on a fluorescence level detected by the camera 160 or camera 60 on the surface of the skin.
  • the fluorescence level in one example, corresponds to an intensity of the fluorescence (such as an overall brightness, an overall amount of fluorescence produced, etc.) or, in another example, corresponds to a density of the fluorescence (such as an amount of fluorescence distributed over the surface area of the skin of a user).
  • the circuit board 50, the processor 180, or the mobile software application determines the presence or amount of porphyrin present on the skin based on a captured image provided by the camera 60 or camera 160 and compares that captured image to a control image (for example, a control image stored in the memory 170 of the external device 120).
  • the control image is, as one example, a stock image showing a predetermined amount of porphyrin.
  • the predetermined amount of porphyrin is an amount that represents a low (e.g., zero) severity index of acne.
  • the circuit board 50, the processor 180, or mobile software application compares the captured image or images to the control image and calculates a difference in the amount of porphyrin detected in the captured images relative to the amount of porphyrin in the control image or images.
  • the mobile software application then computes a severity index (e.g., scaled from 1 to 10, etc.) based on the calculated difference and provides the severity index to the patient via an interface 123 of the external device 120.
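One way to picture this severity index computation (comparing a porphyrin measure from the captured image to that of the control image and scaling the difference to an index) is sketched below; the linear mapping and the parameter max_expected_excess are assumptions, since the disclosure does not fix a particular formula.

```python
def severity_index(captured_porphyrin: float,
                   control_porphyrin: float,
                   max_expected_excess: float = 0.10) -> int:
    """Map a porphyrin difference to a 1-10 severity index (illustrative only).

    captured_porphyrin and control_porphyrin are fractional measures of
    porphyrin fluorescence (e.g., the fraction of fluorescing pixels) for the
    captured image and the control image; max_expected_excess is an assumed
    difference corresponding to the most severe score.
    """
    excess = max(0.0, captured_porphyrin - control_porphyrin)
    scaled = 1 + round(9 * min(excess, max_expected_excess) / max_expected_excess)
    return int(scaled)

# Example: 4.4% of pixels fluorescing versus a near-zero control image.
print(severity_index(0.044, 0.0))  # -> 5 on the assumed 1-10 scale
```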
  • the mobile software application and the processor 180 track serial images of the skin of the patient and provide overall treatment progress to the patient in any suitable visual format, such as graphical presentation using interface 123.
  • the mobile software application compares one or more captured images of the patient's skin to an initial image of the patient's skin prior to or concurrent with the beginning of any treatment and which is stored in the memory of the external device 120.
  • the mobile software application calculates a difference in the amount of porphyrin detected in the captured image or images and the amount of porphyrin detected in the initial image.
  • the mobile software application communicates the calculated difference to the patient by, for example, graphically displaying the calculated difference to indicate an overall treatment progress via the interface 123.
  • the graphical display is, for example, in the form of a numerical value (e.g., a percent difference, a severity index difference), a trend line, or an image comparison (e.g., a side-by-side comparison of the captured image or images to the initial image or images, or a graphical morphing of the initial image or images, showing the initial amount of porphyrin detected, into the captured image or images, showing a current amount of porphyrin detected).
  • the mobile software application graphically displays overall progress based on a series of images captured of the patient's skin.
  • the algorithm of the mobile software application is configured to analyze an image of the skin of the patient; in particular, the mobile software application analyzes the porphyrin amount that is visible or otherwise detectable due to the fluorescence of the porphyrin, then counts, measures, and/or estimates the amount of p. acnes, and provides a quantitative output to the user via the interface 123 of the external device 120.
  • treatment product usage and effectiveness is improved, particularly because the patient is able to confirm that an acne treatment is working even when improvement is not immediately visually noticeable, as the improvement is made evident by the severity index showing a declining trend in the incidence or severity of p. acnes.
  • if the treatment is not working effectively, the patient is informed by the mobile software application of the ineffective treatment and, for example, of alternative or additional treatment methods or other lifestyle tips that may improve the acne condition.
  • the assembly 100 is useable to evaluate treatment of a patient's skin for the condition of photodamage.
  • Photodamage refers to structural and/or functional deterioration of sun-exposed skin, specifically damage to skin and/or DNA of the skin caused by exposure to UV radiation.
  • the assembly 100 is used to identify portions of a patient's skin which have suffered from photodamage.
  • the mobile software application provides an algorithm that computes a quantitative measure of photodamage using a captured image of the portion of skin. The quantitative measure is communicated to the patient by the mobile software application using the interface 123 of the external device 120.
  • the mobile software application is also configured to provide the patient with recommended treatment products that can be used to treat photodamage and/or lifestyle changes that can be taken to reduce a future amount of photodamage to the skin.
  • the mobile software application compares the captured image to a control image having a low or zero level of photodamage which is stored in the memory 170 of the external device 120. Specifically, the mobile software application compares a detected amount of photodamage based on a captured image or images of the patient's skin to a detected amount of photodamage based on the control image. The mobile software application then graphically displays an indication of the photodamage present on the patient. The mobile software application, for example, graphically displays a scaling factor or index level (e.g., a relative scale from 1 to 10) of the severity of photodamage on the patient.
  • the mobile software application compares a detected amount of photodamage based on a captured image or images of the patient's skin to a detected amount of photodamage based on an initial or previous captured image or images of the patient's skin.
  • the mobile software application then graphically displays an indication of a progress or change of severity of photodamage present on the patient and/or provides an indication in reduction of photodamage over time.
  • the assembly 100 is useable to evaluate or determine a presence or severity of actinic keratosis (also called solar keratosis) lesions, which are scaly, crusty growths (lesions) caused by UV radiation damage to a patient's skin.
  • actinic keratosis lesions typically appear on sun-exposed areas of skin (e.g., face, bald scalp, lips, and the back of the hands, etc.) and are often elevated, rough in texture, and resemble warts.
  • Endogenous photosensitizer protoporphyrin IX (PpIX) is a proxy for a condition of actinic keratosis lesions.
  • Fluorescence emitted by methyl ester of ALA (MAL)-induced PpIX is useful in providing a fluorescence-based diagnosis of lesions. This permits the detection of otherwise occult areas of abnormal skin, including tumor margins which may be related to the actinic keratosis lesions.
  • the mobile software application includes an algorithm which computes a quantitative measure of actinic keratosis lesions based on a captured image or images of a portion of skin of the patient.
  • the quantitative measure is displayed by the mobile software application, for example, on the interface 123 of the external device 120, along with information related to recommended treatment products and/or lifestyle changes that can be taken by the patient to reduce or lower a severity or incidence of actinic keratosis lesions in the future.
  • the mobile software application computes the quantitative measure by comparing a detected amount of actinic keratosis lesions based on the captured image or images to a detected amount of actinic keratosis lesions based on a control image or images stored in the memory 170 of the external device 120.
  • the quantitative measure is based on the determined difference in the detected amounts of the skin condition.
  • the mobile software application provides a graphical display showing a scaling factor or index (e.g., a relative scale of 1 to 10) of the severity of the actinic keratosis lesions on the patient.
  • the mobile software application analyzes a previously captured image or images of the user's skin stored in the memory 170 of the external device 120.
  • the mobile software application compares a detected amount of actinic keratosis lesions based on the previously captured image or images to the captured image or images and graphically displays an indication of the progress or reduction of severity of actinic keratosis lesions present on the patient over time.
  • the assembly 100 is useable to evaluate or determine an appropriate application of a substance to the surface of the skin of a patient even when the patient is not suffering from a skin condition. For example, the assembly 100 is useable to determine or verify a correct application of sunscreen on a skin of the user.
  • the mobile software application provides an algorithm which computes the effectiveness of the sunscreen application quantitatively. For example, the assembly 100 analyzes a captured image from the camera 160 of the external device 120 and compares the captured image to a control image which illustrates or shows an optimal amount of sunscreen application (e.g., an optimal overall density of sunscreen applied to skin) stored in the memory 170 of the external device 120.
  • the mobile software application determines an effectiveness of the sunscreen application and graphically displays an indication of the effectiveness via the interface 123 of the external device 120.
  • the mobile software application provides a graphic showing a scaling factor or index level (e.g., a relative scale from 1 to 10) on the effectiveness of the application of sunscreen.
  • the graphical display provides relative coloring on the images of the skin of the patient to indicate areas of effective and ineffective sunscreen application (e.g., red areas indicate poor application of sunscreen while green areas indicate optimal or appropriate application of sunscreen).
  • the present disclosure relates to a method for evaluating skin condition on a skin area of a patient using a device herein disclosed (such as a device connectable to an external device).
  • the method includes illuminating a skin area of the user with the device, capturing an image of the illuminated skin area with a camera, processing the image with a processing program to determine the level of fluorescence on the skin area, and mapping progress of treatment of the skin condition based on the processed image compared to a control image.
  • the skin condition is acne.
  • the level of fluorescence on the skin area of the user detects the presence of Propionibacterium acnes via porphyrins.
  • the mapping of progress of treatment includes calculation of an acne severity index for the patient.
  • FIG. 19 illustrates a process 1900 for using the assembly 100 to analyze p-acnes on a user's skin.
  • the process 1900 begins, in block 1902, with connecting the device 1 to the external device 120.
  • the device 1 may be plugged into the external device 120 via a connector (e.g., a USB connector, a Lightning connector, etc.).
  • the device 1 may also be connected to an external power source (e.g., wall outlet, etc.).
  • the process 1900 continues, in block 1904, with initiating a mobile software application on the external device 120.
  • the mobile software application may, for example, be stored in the memory 170 of the external device 120.
  • the mobile software application may be initiated by a user selecting, via a touch screen of the external device 120, an icon associated with the mobile software application.
  • the mobile software application may also be initiated when the device 1 is connected to the external device 120.
  • the process 1900 then continues, in block 1906, with positioning the device 1 adjacent a target area of the user's skin.
  • the target area may be a cheek or nose area of a user's skin.
  • the process 1900 continues, in block 1908, with emitting, by the device 1, UV light onto the target area.
  • the UV light may be emitted by the device 1 via the at least one light source 40.
  • the UV light is emitted by the at least one light source 40 and then passed through the filter 45 prior to being emitted onto the target area.
  • certain locations on the target area may exhibit a fluorescence.
  • for example, if p. acnes is present in the target area, the p. acnes may exhibit a fluorescence when UV light is emitted onto the target area.
  • Different fluorescence may correspond to different conditions existing in the target area.
  • a concentration of fluorescence in the target area when UV light is emitted onto the target area may indicate a concentration of p-acnes present in the target area.
  • the process 1900 continues, in block 1910, with obtaining (e.g., capturing, etc.), by the camera 160, an image (e.g., photograph, high resolution image, etc.) of the target area.
  • the camera 160 does not obtain the image until the UV light has been emitted onto the target area for a target amount of time (e.g., 2 seconds, 5 seconds, 10 seconds, 15 seconds, etc.).
  • the image may be obtained in response to an interaction by the user (e.g., a user pressing a button on the device 1, a user pressing a button on the external device 120, etc.) or in response to the target amount of time elapsing (e.g., after the UV light has been emitted onto the target area for 10 seconds, etc.); a minimal timing sketch follows below.
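A minimal sketch of that timing behaviour is shown below; turn_on_light and capture_image are hypothetical stand-ins for the device 1 and camera 160 calls, which the disclosure does not name.

```python
import time

def capture_after_warmup(turn_on_light, capture_image, target_seconds=10.0):
    """Emit the UV light, wait for the target amount of time, then capture.

    The capture is deliberately deferred until the light has illuminated the
    target area for target_seconds, mirroring block 1910 as described above.
    """
    turn_on_light()
    time.sleep(target_seconds)
    return capture_image()
```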
  • the external device 120 performs analysis (e.g., processing, etc.) of the image to determine whether any conditions exist in the target area. Specifically, the process 1900 continues, in block 1912, with determining, by the processor 180 of the external device 120, as instructed by the mobile software application, whether each pixel of the image has an attribute that is above an attribute threshold.
  • the attribute may be, for example, an amount of red light (e.g., the "R" component in an RGB value for the pixel, etc.) or an amount of blue light (e.g., the "B" component in an RGB value for the pixel, etc.).
  • the attribute may be the red component of the RGB values for each of the pixels in an image and the processor 180 may be configured to utilize an attribute threshold of 90.
  • the processor 180 may, for a pixel with a red component of 85, determine that the pixel does not have an attribute that is above the attribute threshold, and may, for a pixel with a red component of 100, determine that the pixel has an attribute that is above the attribute threshold.
  • the process 1900 continues in block 1914 with recording (e.g., cataloging, etc.) each pixel that the processor 180 has determined to have an attribute above the attribute threshold. Where the attribute of a pixel exceeds the attribute threshold in block 1912, the processor 180 may record the pixel by coordinate in an array. When a pixel is recorded, the attribute for the pixel is also recorded.
  • an image may be a square having a width of 1024 pixels and a height of 1024 pixels such that each pixel in the image has a coordinate (x, y) that corresponds to a location within the image, where 0 ≤ x < 1024 and 0 ≤ y < 1024.
  • the process 1900 repeats blocks 1912 and 1914 for each pixel in the image.
  • block 1914 has been performed for the last pixel in the image
  • the process 1900 continues in block 1916 with calculating, by the processor 180, an area percentage for the image.
  • the area percentage is calculated by dividing the number of pixels for which the associated attribute was greater than the attribute threshold by the total number of pixels. For example, an image that is 2048 pixels wide by 3072 pixels tall has 6,291,456 pixels. Following this example, if 250,000 pixels each have an attribute (e.g., "R" component, etc.) that is above an attribute threshold (e.g., 90, 80, 70, 110, 150, etc.), the area percentage is approximately 3.97%.
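Blocks 1912 through 1916 can be summarised by the sketch below, which assumes the image is available as an iterable of (x, y, (r, g, b)) pixel values; the helper name and data layout are assumptions rather than part of the disclosure.

```python
def analyze_red_channel(pixels, attribute_threshold=90):
    """Threshold each pixel's red component and compute the area percentage.

    pixels: iterable of (x, y, (r, g, b)) tuples for the captured image.
    Returns the recorded pixels (block 1914) and the area percentage
    (block 1916) as a value between 0 and 100.
    """
    recorded = []   # (x, y, red) for every pixel above the attribute threshold
    total = 0
    for x, y, (r, g, b) in pixels:
        total += 1
        if r > attribute_threshold:
            recorded.append((x, y, r))
    area_percentage = 100.0 * len(recorded) / total if total else 0.0
    return recorded, area_percentage

# Worked example from the text: 250,000 qualifying pixels in a
# 2048 x 3072 image (6,291,456 pixels total) give roughly 3.97%.
print(round(100.0 * 250_000 / (2048 * 3072), 2))  # 3.97
```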
  • the process 1900 continues, in block 1918, with calculating, by the processor 180, an area score for the image based on a comparison between the area percentage and an area percentage threshold (e.g., 1%, 2%, 3%, 4%, 5%, etc.).
  • the processor 180 is configured to assign an area score of between 0 and 10 to the image, where an area score of 0 indicates substantially no p-acne is shown in the image, an area score of 1-3 indicates
  • the process 1900 continues, in block 1920, with calculating, by the processor 180, an average fluorescence for the image by dividing the sum of all the attributes for pixels where the attribute is greater than the attribute threshold by the number of pixels where the attribute is greater than the attribute threshold.
  • the average fluorescence may be related to a condition existing in the target area. For example, for a given area score, a greater average fluorescence may indicate a different type of p-acne is present in the target area.
  • the process 1900 ends by, in block 1922, displaying, by the external device 120, the image, area score, and/or average fluorescence, to a user.
  • the user may be provided with a direct visual representation of the target area and, through repeated iterations of the process 1900, visualize the effects of preventative care, thereby increasing the incentive for the user to perform preventative care.
  • FIGS. 20 and 21 illustrate images of a target area.
  • the process 1900 was implemented for both FIGS. 20 and 21.
  • for FIG. 20, the process 1900 produced an area score of 1.79, an area percentage of 1.09%, and an average fluorescence of 164.2.
  • for FIG. 21, the process 1900 produced an area score of 1.82, an area percentage of 1.33%, and an average fluorescence of 137.2.
  • FIGS. 22-25 illustrate the device 1 according to some embodiments.
  • the device 1 includes a blocking tube 2200 (e.g., cylinder, pipe, etc.).
  • the blocking tube 2200 is coupled to the annular portion 30 and disposed about the light sources 40 and the filter 45.
  • the blocking tube 2200 is configured to be selectively coupled to, and selectively removed (e.g., decoupled, etc.) from, the device 1.
  • the device 1 is utilized by placing a distal end of the blocking tube 2200 in contact with, or in close proximity to, a target area. Such a placement of the blocking tube 2200 blocks light from extraneous sources and, therefore, makes the image more representative of the target area.
  • the area score, the area percentage, and the average fluorescence may be more accurate, thereby providing the user with more accurate information so that, for example, the user is less surprised if preventative treatment is less effective than would otherwise be expected.
  • FIGS. 26-28 illustrate images obtained by the device 1 with the blocking tube 2200 coupled to the device.
  • the process 1900 was implemented for FIGS. 26-28.
  • for FIG. 26, the process 1900 produced an area score of 5.3, an area percentage of 4.37%, and an average fluorescence of 121.1.
  • for FIG. 27, the process 1900 produced an area score of 1.85, an area percentage of 1.73%, and an average fluorescence of 108.1.
  • for FIG. 28, the process 1900 produced an area score of 0.54, an area percentage of 1.07%, and an average fluorescence of 50.8.
  • the external device 120 may automatically determine when the blocking tube 2200 is coupled to the device 1 and when the blocking tube 2200 is not coupled to the device 1.
  • the device 1 may include a sensor in the annular portion 30 that senses presence of the blocking tube 2200 (e.g., via electrical resistance, via electrical contact, etc.).
  • the mobile software application may automatically or manually be adjusted depending on whether or not the blocking tube 2200 is coupled to the device 1.
  • the mobile software application may be configured to receive an input from a user to switch between a "Macro" mode for using the device 1 with the blocking tube 2200 coupled to the device 1 and a "Full Face" mode for using the device 1 with the blocking tube 2200 not coupled to the device 1.
  • FIGS. 29-32 illustrate the blocking tube 2200 in greater detail, with various dimensions shown in millimeters.
  • FIG. 30 is a cross-sectional view of FIG. 29, taken about plane A-A.
  • the blocking tube 2200 is configured for an external device 120 having a thickness of approximately 9 millimeters.
  • the height of the blocking tube 2200 is shown in FIGS. 29-32 as 64 millimeters. It is understood that other similar heights and other dimensions of the blocking tube 2200 could be selected such that the blocking tube 2200 could be tailored for a target application.
  • at least an internal surface 3000 of the blocking tube 2200 is black, not translucent, not transparent, and non-reflective.
  • when the blocking tube 2200 is coupled to the device 1, the area score, the area percentage, and the average fluorescence may be more accurate, thereby providing the user with more accurate information so that, for example, the user is less surprised if preventative treatment is less effective than would otherwise be expected.
  • Coupled means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members or the two members and any additional intermediate members being integrally formed as a single unitary body with one another or with the two members or the two members and any additional intermediate members being attached to one another.
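
The pixel analysis of blocks 1912 through 1920 described in the list above can be summarized in a short sketch. This is a minimal illustration only: it assumes the image is loaded as an RGB array, that the attribute is the red (R) component with a threshold of 90, and that the 0-to-10 area score is obtained by a simple capped scaling against an area percentage threshold; the function name and the exact score scaling are assumptions, since the disclosure does not fix a particular formula.

    from PIL import Image
    import numpy as np

    def analyze_target_area(path, attribute_threshold=90, area_pct_threshold=1.0):
        # Load the captured image as an RGB array; the attribute analyzed here
        # is the red (R) component of each pixel (block 1912).
        rgb = np.asarray(Image.open(path).convert("RGB"))
        red = rgb[:, :, 0].astype(float)

        # Block 1914: record which pixels have an attribute above the threshold.
        mask = red > attribute_threshold
        flagged = int(mask.sum())
        total = red.size

        # Block 1916: area percentage = flagged pixels / total pixels.
        area_percentage = 100.0 * flagged / total

        # Block 1918: map the area percentage onto a 0-10 area score by comparing
        # it to an area percentage threshold (illustrative scaling, capped at 10).
        area_score = min(10.0, area_percentage / area_pct_threshold)

        # Block 1920: average fluorescence = mean attribute over the flagged pixels.
        average_fluorescence = float(red[mask].mean()) if flagged else 0.0

        return area_score, round(area_percentage, 2), round(average_fluorescence, 1)

For a 2048 by 3072 pixel image with 250,000 flagged pixels, such a sketch reproduces the approximately 3.97% area percentage given in the example above; the reported area score, however, depends on the assumed scaling.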

Abstract

A lighting and filtering device configured for use with a camera including at least one light source configured to emit light having a peak wavelength in a range between 365 nm and 445 nm, and thereby illuminate a portion of a patient's skin to produce a fluorescence on the portion of the patient's skin, and a filter configured to be disposed in front of a lens of the camera, the filter being (i) a band-pass filter configured to facilitate transmission of light having a peak wavelength in a range between 550 nm and 750 nm therethrough, or (ii) a long-pass filter configured to attenuate light that is less than a first predetermined wavelength, the first predetermined wavelength being in a range of 400 nm to 450 nm.

Description

UV DEVICE FOR EVALUATION OF SKIN CONDITIONS
CROSS-REFERENCE TO RELATED PATENT APPLICATION
[0001] This application claims the benefit of and priority to U.S. Provisional Patent
Application No. 62/584,404, filed November 10, 2017, the entire disclosure of which is incorporated herein by reference.
FIELD OF INVENTION
[0002] The present disclosure generally relates to a device configured for use with a camera for capturing an image of a patient's skin.
BACKGROUND
[0003] Many medical treatments, whether prescription or not, are known to have significant patient compliance issues which reduce the effectiveness of the treatments. In some cases, poor patient compliance results from the patient's inability to detect or observe improvements in medical conditions even though such improvements are occurring. For example, some medical conditions do not manifest immediate or significant visual improvement despite the fact that the conditions are improving in the patient. In other cases, a patient simply has poor "consumer engagement" with the specific treatment appropriate for the patient.
[0004] A Woods lamp, also known as a black-light (or black light), or ultraviolet (UV) light, is a lamp which emits long-wave (UV) light and only a small amount of visible light. A Woods lamp is helpful in diagnosing bacterial infections, including Propionibacterium acnes (also referred to as "p. acnes"), a bacterium that causes acne. This bacterium exhibits an orange-type glow under a Woods lamp. More specifically, acne fluoresces orange-red under a Woods lamp due to the presence of propionibacteria on a person's hair follicles or skin. [0005] As described in U.S. Patent Publication No. 20036/0063801 ("Rubinstenn, et al"), features of a facial image of a subject may be extracted by illuminating a portion of the subject's skin using a Woods lamp. The Woods lamp may help identify an amount of acne-causing bacteria present on the portion of the subject's skin by rendering visible on the subject's skin residues, such as porphyrin, which are excreted by the bacteria. In this manner, the presence and/or amount of porphyrin present on a subject's skin is a proxy for the presence and/or amount of p-acnes. An image of the illuminated residue is captured and then analyzed to determine the presence and/or amount of p-acnes present on the subject's skin. However, Rubinstenn, et al. does not teach how to use new technologies to improve consumer engagement with treatment of a skin of the patient in order to improve patient compliance with the treatments.
[0006] There is a need for a device which improves patient compliance with treatments for various skin conditions or other skin treatments by providing for good consumer engagement with specific treatment appropriate for the patient.
SUMMARY OF THE INVENTION
[0007] The following presents a general summary of aspects of the present disclosure in order to provide a basic understanding of the disclosure. This summary is not an extensive overview of the disclosure and is not intended to identify key or critical elements of the invention or to delineate the scope of the present disclosure. The following summary merely presents some concepts of the present disclosure in a general form as a prelude to the more detailed description provided below.
[0008] According to one exemplary embodiment, a lighting and filtering device configured for use with a camera includes at least one light source configured to emit light having a peak wavelength in a range between 365 nm and 445 nm, and thereby illuminate a portion of a patient's skin to produce a fluorescence on the portion of the patient's skin, and a filter configured to be disposed in front of a lens of the camera, the filter being (i) a band-pass filter configured to facilitate transmission of light having a peak wavelength in a range between 550 nm and 750 nm therethrough, or (ii) a long-pass filter configured to attenuate light that is less than a first predetermined wavelength, the first predetermined wavelength being in a range of 400 nm to 450 nm.
[0009] According to one aspect, the lighting and filtering device is configured to be physically mounted to the camera.
[0010] According to another aspect, the lighting and filtering device is configured to be physically mounted to an external device that comprises the camera.
[0011] According to another aspect, the external device is a cellular phone, a tablet computer, a handheld media device, a laptop computer, or a desktop computer.
[0012] According to another aspect, the external device is a cellular phone.
[0013] According to another aspect, the lighting and filtering device is configured to be physically mounted to the cellular phone via a magnet.
[0014] According to another aspect, the lighting and filtering device also includes a housing having a first lateral side configured to extend along a first lateral side of the cellular phone, a second lateral side configured to extend along a second lateral side of the cellular phone, and a third lateral side extending from the first lateral side to the second lateral side.
[0015] According to another aspect, the housing includes an annular portion, the at least one light source is located at a surface of the annular portion, and the annular portion surrounds the filter.
[0016] According to another aspect, the at least one light source includes a plurality of light sources, and the plurality of light sources are evenly distributed around the surface of the annular portion.
[0017] According to another aspect, the plurality of light sources includes 8 light sources. [0018] According to another aspect, the lighting and filtering device also includes a battery configured to supply power to the at least one light source.
[0019] According to another aspect, the lighting and filtering device also includes a processor configured to control the at least one light source.
[0020] According to another aspect, the lighting and filtering device also includes a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
[0021] According to another aspect, the transceiver operates using a frequency-hopping technique.
[0022] According to another aspect, the external device is configured to receive imaging data from the camera via the transceiver and further configured to process the imaging data.
[0023] According to another aspect, the external device is further configured to determine a presence and an amount of a substance on the portion of the patient's skin based on the imaging data.
[0024] According to another aspect, the external device is further configured to compare the imaging data to imaging data of a control image.
[0025] According to another aspect the lighting and filtering device also includes a connector configured to allow wired communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
[0026] According to another aspect, the connector is a USB type connector.
[0027] According to another aspect, the lighting and filtering device also includes a processor, a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor, and a connector configured to allow wired communication between the processor and the external device, such that the at least one light source is controllable by the external device via the processor.
[0028] According to another aspect, the transceiver operates using a frequency-hopping technique, and the connector is a USB type connector.
[0029] According to another aspect, the band-pass filter is configured to attenuate light that is less than a second predetermined wavelength and greater than a third predetermined wavelength, the second predetermined wavelength being in a range of 550 to 650 nm, and the third predetermined wavelength being in a range of 650 to 750 nm.
[0030] According to another aspect, the first predetermined wavelength is in a range of 430 nm to 440 nm.
[0031] According to another aspect, the first predetermined wavelength is 435 nm.
[0032] According to another aspect, the at least one light source is configured to emit light having a peak wavelength in a range between 385 nm and 425 nm.
[0033] According to another aspect, the at least one light source is configured to emit light having a peak wavelength in a range between 400 nm and 415 nm.
[0034] According to another aspect, the at least one light source is configured to emit light having a peak wavelength of 405 nm.
[0035] According to another aspect, the peak wavelength of light transmitted by the band-pass filter is in a range between 600 nm and 650 nm.
[0036] According to another aspect, the peak wavelength of light transmitted by the band-pass filter is in a range between 630 nm and 640 nm.
[0037] According to another aspect, the peak wavelength of light transmitted by the band-pass filter is 635 nm. [0038] According to another aspect, the band-pass filter is configured to facilitate transmission of light having a full width at half maximum in a range of 70 nm to 130 nm therethrough.
[0039] According to another aspect, the lighting and filtering device comprises the camera.
[0040] According to another aspect, the lighting and filtering device also includes a black housing.
[0041] According to one exemplary embodiment, a method of diagnosing p-acnes using an external device and a lighting and filtering device, the external device including a processor and a camera and the lighting and filtering device including a light source configured to emit a light having a peak wavelength in a range between 365 nm and 445 nm, the lighting and filtering device configured to be connected to the external device, the method including connecting, by a user, the lighting and filtering device to the external device, positioning, by a user, the lighting and filtering device adjacent a target area of a user's skin, emitting, by the lighting and filtering device, the light onto the target area, and determining, by the processor, whether the target area contains any p-acnes.
[0042] According to one aspect, the method further includes obtaining, by the camera, an image of the target area while the light is emitted onto the target area.
[0043] According to another aspect, the method further includes instructing, by the processor, the camera to obtain the image, where the processor does not instruct the camera to obtain the image until the light has been emitted onto the target area for a target amount of time.
[0044] According to another aspect, the method further includes determining, by the processor, whether each pixel of the image has an attribute that is above an attribute threshold.
[0045] According to another aspect, the attribute is a red component of the pixel. [0046] According to another aspect, the method further includes recording, by the processor, each pixel that has an attribute that is above the attribute threshold.
[0047] According to another aspect, the method further includes calculating, by the processor, an area percentage for the image by comparing a total number of pixels for the image and a number of pixels for which the attribute is above the attribute threshold.
[0048] According to another aspect, the method further includes calculating, by the processor, an area score by comparing the area percentage to an area percentage threshold.
[0049] According to another aspect, the method further includes calculating, by the processor, an area score for the image by comparing the area percentage and an area percentage threshold and correlating, by the processor, the area score to a plurality of area score ranges to determine if the target area contains any p-acnes.
[0050] According to another aspect, the method further includes calculating, by the processor, an average fluorescence for the image.
[0051] According to another aspect, the method further includes displaying, by the external device, the image, the area score, and the average fluorescence.
BRIEF DESCRIPTION OF THE DRAWINGS
[0052] The accompanying drawings are not intended to be drawn to scale. Like reference numbers and designations in the various drawings indicate like elements. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
[0053] FIG. 1 is a front perspective view of a lighting and filtering device configured for use with a camera, according to one exemplary embodiment.
[0054] FIG. 2 is a front perspective view of a lighting and filtering device configured for use with a camera, similar to the lighting and filtering device configured for use with a camera shown in FIG. 1. [0055] FIG. 3 is a front cross-sectional perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0056] FIG. 4 is a front perspective view of a lighting and filtering device configured for use with a camera, according to a second exemplary embodiment.
[0057] FIG. 5 is a front perspective view of a lighting and filtering device configured for use with a camera, according to a third exemplary embodiment.
[0058] FIG. 6 is a rear perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 5.
[0059] FIG. 7 is a graph illustrating an internal transmittance of a 435 nm long-pass filter useable with a lighting and filtering device configured for use with a camera.
[0060] FIG. 8 is a graph illustrating a transmission rate for wavelengths of light of a 630 nm band-pass filter useable with a lighting and filtering device configured for use with a camera.
[0061] FIG. 9 is a front perspective view of an assembly including the lighting and filtering device configured for use with a camera shown in FIG. 1 mounted on a cellular phone.
[0062] FIG. 10 is a rear perspective view of the assembly shown in FIG. 9.
[0063] FIG. 11 is a schematic illustrating components of an external device on which the lighting and filtering device configured for use with a camera, shown in FIG. 1, is mountable.
[0064] FIG. 12 is a front perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 1, according to one aspect.
[0065] FIG. 13 is a front perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 4 mounted on a cellular phone.
[0066] FIG. 14 is a rear view of the assembly shown in FIG. 13. [0067] FIG. 15A is a photograph showing an image captured by a camera of a handheld media device.
[0068] FIG. 15B is a photograph showing an image captured by the camera of a handheld media device using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0069] FIG. 16A is a photograph showing an image captured by a camera of a cellular phone.
[0070] FIG. 16B is a photograph showing an image captured by a camera of the cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0071] FIG. 17 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0072] FIG. 18 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0073] FIG. 19 is a block diagram of a process for analyzing p-acnes on a user's skin.
[0074] FIG. 20 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0075] FIG. 21 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 2.
[0076] FIG. 22 is a front perspective view of a lighting and filtering device configured for use with a camera and having a blocking tube, according to one exemplary embodiment.
[0077] FIG. 23 is a side perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 22. [0078] FIG. 24 is a rear perspective view of the lighting and filtering device configured for use with a camera shown in FIG. 22.
[0079] FIG. 25 is a detailed view of the lighting and filtering device configured for use with a camera shown in FIG. 22.
[0080] FIG. 26 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
[0081] FIG. 27 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
[0082] FIG. 28 is a photograph showing an image captured by a camera of a cellular phone using the lighting and filtering device configured for use with a camera shown in FIG. 22.
[0083] FIG. 29 is a front view of a blocking tube configured for use with a lighting and filtering device, according to one exemplary embodiment.
[0084] FIG. 30 is a cross-sectional view of the blocking tube shown in FIG. 29 taken about plane A-A.
[0085] FIG. 31 is a bottom view of the blocking tube shown in FIG. 29.
[0086] FIG. 32 is a rear view of the blocking tube shown in FIG. 29.
DETAILED DESCRIPTION
[0087] The description below relates to a lighting and filtering device configured for use with a camera. The lighting and filtering device configured for use with a camera is configured to provide imaging of a portion of a patient's skin and determine a level (e.g., an intensity, a density, etc.) of fluorescence illuminating the portion of the patient's skin. The lighting and filtering device configured for use with a camera thereby effectively and efficiently
communicates information regarding the portion of the patient's skin such that the patient can determine an effectiveness of treatment for the patient's skin. Accordingly, patient compliance with the treatment is facilitated.
[0088] The lighting and filtering device configured for use with a camera (such as a device for providing imaging of a patient's skin or device for evaluating a skin condition of a patient) herein disclosed is configured to assist a patient in determining an effectiveness of skin treatment by, for example, determining an existence or amount of a substance present on the skin of the patient. For example, the lighting and filtering device configured for use with a camera is useable in determining an existence and/or amount of a substance on the patient's skin indicative of a state and/or extent of a skin condition. A skin condition includes, but is not limited to, increase in pigmentation (e.g., melasma, postinflammatory pigmentation, etc.), loss of pigmentation (e.g., vitiligo, ash-leaf macules in tuberous sclerosis, and hypomelanosis of Ito), pityriasis versicolor, malassezia folliculitis, tinea capitis, head lice, scabies, erythrasma, pseudomonas (wound infection), acne (e.g., p. acne), porphyria, presence of porphyrins, photodamage, and actinic keratosis. As an additional example, the lighting and filtering device configured for use with a camera is useable in determining an existence and/or amount of a substance on the patient's skin that is unrelated to a skin condition of the patient, for example, a presence and/or amount of sunscreen applied to a portion of the patient's skin or application of salicylic acid (e.g., application of chemical peel) on a portion of the patient's skin.
[0089] Acne, particularly p. acne, is particularly prone to poor patient compliance and consumer engagement of treatment for the acne. One type of acne treatment involves applying a topical product to the acne, such as a product that includes as an active ingredient benzoyl peroxide. A severity of p. acne can be determined based on a presence and amount of porphyrin present on a patient's skin because the bacteria which cause p. acne excrete porphyrin.
[0090] As shown in FIGS. 1-3, herein disclosed is a device 1 (e.g., a lighting and filtering device configured for use with a camera). In some implementations, the device 1 is configured to be physically mounted to the camera. In some implementations, the device 1 is configured to be mounted by any suitable means to an external device that includes a camera. The external device may be a computing device such as a cellular phone (e.g., smartphone, etc.), a tablet computer, a handheld media device, a laptop computer, or a desktop computer. The device 1 is configured to illuminate a portion of a patient's skin to facilitate imaging the portion of the patient's skin so that a presence and/or amount of a substance (for example, a substance secreted by bacteria that cause acne) present on the portion of the patient's skin can be detected and determined, thereby allowing for the patient to determine or visualize an effectiveness of treatment of the patient's skin.
[0091] Referring to FIGS. 1-2, the device 1 includes a housing (e.g., case) 5 which includes a front, closed surface 6 and a back, open surface 7 (shown in FIG. 3). Referring to FIGS. 1-3, the housing 5 is made of any suitable material, for example, molded plastic or metal, such that the housing 5 is configured to be mounted to a computing device. As one example, the device 1 is configured to be physically mounted to an external device via a magnet. In other examples, the device 1 may be configured to be physically mounted to an external device via adhesive (e.g., fugitive glue, double sided adhesive, etc.) or via a hook and loop fastener (e.g., Velcro, etc.). In still other examples, the device 1 may be configured to be physically mounted to an external device via fasteners (e.g., snap members, clasps, bolts, etc.). The housing 5 is of any suitable size or shape, for example, a size or shape for mounting device 1 onto at least one side of an external device (e.g., a cellular phone) so that the device 1 is used in conjunction with the external device (as described below) without adversely affecting the patient's ability to use the external device. For example, the housing 5 includes a first lateral side (e.g., a right side) 8 configured to extend along a first lateral side 128 of an external device 120 (shown in FIG. 9), which is, in this case, a cellular phone. As shown in FIG. 1, the housing 5 also includes a second lateral side (e.g., left side) 9 configured to extend along a first lateral side 129 of an external device 120 (shown in FIG. 9). As shown in FIG. 1, the housing 5 also includes a third lateral side (e.g., bottom side) 10 extending from the first lateral side 8 to the second lateral side 9 of the housing 5.
[0092] According to one example, the housing 5 is colored black (e.g., by painting the housing 5 with a black paint, dying the housing 5 black, or other suitable coloration process) to avoid external light reflection off of the housing 5 when the device 1 is in use. As a specific example, the housing 5 is formed of molded black plastic.
[0093] As shown in FIG. 3, the housing 5 includes a recess or opening 15 configured to house a power button 20 (shown in FIG. 1) for controlling an electric power provided to the device 1 (for example, a button for turning the device 1 on or off). The housing 5 also includes a recess or opening 16 configured to house a hub or plug for connecting the device 1 to an electric power supply or to house a USB type connector. For example, the recess 16 is a port configured to receive a USB type connector 25 configured to provide a wired connection between the device 1 and an external device (such as the external device 120 shown in FIGS. 9-10) and/or an external power source.
[0094] The housing 5 also includes an annular portion 30. The annular portion 30 is of any suitable size. The annular portion 30 is configured to house or support at least one lighting element, such as at least one light source 40 or filter 45. As shown in FIG. 3, the annular portion 30 includes a surface 31 configured to support the at least one light source 40 and/or the filter 45. The surface 31 has a circular cross section and is configured to be received into an interior region 32 of the annular portion 30 (for example, the interior region 32 of annular portion 30 defined by an outer circumference and an inner circumference of annular portion 30). The annular portion 30 is configured to hold the at least one light source 40 (described below) at the surface 31 and is further configured to surround the filter 45. As shown in FIG. 1, the annular portion 30 is disposed on the first lateral side 8 of the housing 5, but the annular portion 30 is not limited to this particular orientation. For example, as shown in FIGS. 2-3, the annular portion 30 is disposed on the second lateral side 9 of the housing 5.
[0095] Referring back to FIG. 1., the at least one light source 40 is, for example, a Woods lamp for providing UV light and of a size appropriate for mounting on an external device (e.g., cellular phone, etc.). The at least one light source 40, in some implementations, includes a plurality of light sources 40 which are evenly distributed around the surface 31 of the annular portion 30 of the housing 5. Although FIG. 1 shows eight light sources distributed along a circumference of the annular portion 30, the at least one light source 40 is not limited to this particular example. According to one example, the at least one light source 40 includes one light source. According to another example, the at least one light source 40 includes two, three, four, five, six, seven, or eight light sources. According to a still another example, the at least one light source 40 includes more light sources, such as sixteen light sources, thirty-two light sources, or sixty-four light sources.
[0096] As one specific example, the at least one light source 40 is a light emitting diode (LED) In a particular example, the at least one light source 40 is a UV flash light. The at least one light source 40 is configured to emit light having a predetermined wavelength, for example, light having at least a component of UV light (e.g., light having a wavelength of between 100 nm and 400 nm). As another example, the at least one light source 40 emits light having a wavelength outside the UV wavelength ranges to avoid damaging a skin of a patient using the device 1. Specifically, the at least one light source 40 is configured, for example, to emit light having a peak wavelength in a range between 365 nm and 445 nm. Preferably, the at least one light source 40 is configured to emit light having a peak wavelength in a range between 385 nm and 425 nm. More preferably, the at least one light source 40 is configured to emit light having a peak wavelength in a range between 400 nm and 420 nm. More preferably, the at least one light source 40 is configured to emit light having a peak wavelength in a range between 400 nm and 415 nm. More preferably, the at least one light source 40 is configured to emit light having a wavelength of 405 nm. As another specific example, the at least one light source 40 is configured to emit a light component having a peak wavelength of 415 nm. The at least one light source 40 is configured such that it can emit light of different peak wavelengths, depending on the wavelength of light needed or desired by a patient, for example, wavelengths of 365 nm, 385 nm, 405 nm, and 415 nm. The at least one light source 40 is configured to provide light so as to create or cause a fluorescence on the portion of the skin of the patient. The fluorescence on the portion of skin is determined based on the wavelength of light emitted by the at least one light source 40 and a substance present on the skin of the patient. [0097] The light filter 45 is any suitable filter for filtering out light along an optical path from an imaging object to a camera. The light filter 45 is configured to be disposed in front of a lens of a camera. The light filter 45 is, for example, a band-pass or long-pass filter or ultraviolet light filter to filter light having a long wavelength (such as UV light).
[0098] In the case in which the light filter 45 is a long-pass filter, the light filter 45 is configured to attenuate light that is less than a predetermined wavelength. As shown in FIG. 7, the predetermined wavelength is, for example, within a range of 400 nm to 450 nm. The predetermined wavelength is, for another example, within a range of 430 nm and 435 nm. The predetermined wavelength is, for another example, 435 nm. Accordingly, when the light filter 45 is a long-pass filter, the light filter 45 is configured to attenuate light that is less than a predetermined wavelength. Preferably, the predetermined wavelength of light is in a range of 400 nm to 450 nm. More preferably, the predetermined wavelength of light is in a range of 430 nm to 440 nm. More preferably, the predetermined wavelength of light is 435 nm. To avoid variations in color intensity or shade of the lighting produced by the device 1 or captured by a camera of an external device 120 (shown in FIGS. 9-10), the light filter 45 is calibrated for specific color ranges and may be a commercially available light filter. The light filter 45 is configured to be positioned along an optical path between a camera and the portion of the patient's skin and thereby filter light from the portion of the patient's skin to the camera of the mobile computing device.
[0099] In the case in which the light filter 45 is a band-pass filter, the light filter 45 is configured to be disposed in front of the lens of the camera and is configured to facilitate transmission of light having a peak wavelength in a range between 550 nm and 750 nm therethrough, as shown in FIG. 8. Accordingly, the band-pass filter is configured to attenuate light that is less than a first predetermined wavelength and greater than a second predetermined wavelength. For example, the first predetermined wavelength is in a range of 550 nm to 650 nm, and the second predetermined wavelength is in a range of 650 nm to 750 nm. As a more specific example, the band-pass filter transmits light having a peak wavelength in a range between 600 nm and 650 nm. As a further specific example, the band-pass filter transmits light having a peak wavelength in a range between 630 nm and 640 nm. As a still further specific example, the band-pass filter transmits light having a peak wavelength of 635 nm. As another example, the band-pass filter is configured to facilitate transmission of light having a full width at half maximum in a range of 70 nm to 130 nm therethrough.
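As a rough illustration of the two filter options, the long-pass behaviour can be modelled as a cutoff near 435 nm and the band-pass behaviour as a window centred near 635 nm with a full width at half maximum of about 100 nm. The idealized step and Gaussian shapes below are assumptions used only for illustration; the actual transmission curves are those shown in FIGS. 7 and 8.

    import math

    def longpass_transmittance(wavelength_nm, cutoff_nm=435.0):
        # Idealized long-pass filter: attenuates light below the cutoff (cf. FIG. 7).
        return 0.0 if wavelength_nm < cutoff_nm else 0.95  # ~95% passband, illustrative

    def bandpass_transmittance(wavelength_nm, center_nm=635.0, fwhm_nm=100.0):
        # Idealized band-pass filter centred at 635 nm with ~100 nm FWHM (cf. FIG. 8).
        sigma = fwhm_nm / (2.0 * math.sqrt(2.0 * math.log(2.0)))  # convert FWHM to sigma
        return 0.95 * math.exp(-((wavelength_nm - center_nm) ** 2) / (2.0 * sigma ** 2))

    # The porphyrin fluorescence near 635 nm is passed, while the 405 nm excitation
    # light is strongly rejected by either filter model.
    print(bandpass_transmittance(635), bandpass_transmittance(405))
    print(longpass_transmittance(635), longpass_transmittance(405))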
[0100] Referring again to FIG. 3, the housing 5 is configured to house at least one electric power source (e.g., battery) 55. The at least one battery 55 is configured to supply electric power to a circuit board 50 and/or the at least one light source 40. The battery 55 is any suitable commercially available battery.
[0101] The housing 5 is also configured to house a circuit board 50 (which may act as a processor or controller). The circuit board 50 is configured to control the various component of the device 1, such as the at least one light source 40. The circuit board 50 is also configured to control a power output of the battery 55 to the at least one light source 40 and to control a power input to the at least one battery 55 from an external power supply for charging the battery 55. The circuit board 50 is configured to control the at least one light source 40, such as controlling a wavelength, a light or flash intensity (such as the number of LEDs emitting light at a given time), and/or a flash duration of light emitted by the at least one light source 40.
[0102] In various embodiments, the circuit board 50 includes a LED driver, a USB battery charging module, and a Bluetooth module. The LED driver may receive power (e.g., 3.7 Volts, etc.) from the battery 55 and increase that power to a level required by the at least one light source 40. The USB battery charging module may facilitate charging of the battery 55 using an external USB charger (e.g., may facilitate intermittent charging of the battery 55, etc.). The Bluetooth module may be configured to facilitate interaction between the device 1 and an external device (described below).
[0103] The housing also includes holes 11 and 12.
[0104] In some implementations, the device 1 includes a transceiver configured to allow wireless communication between the processor and the external device (described below) such that the at least one light source is controllable by the external device via the circuit board (e.g., processor) 50. As one example, the transceiver operates using a frequency-hopping technique. For example, the transceiver may be a Bluetooth® transceiver. The external device is configured to receive imaging data from the camera (described below) via the transceiver and further configured to process the imaging data (as described below). The device 1 also includes a connector 25 configured to allow wired communication between the processor and the external device such that the at least one light source is controllable by the external device via the circuit board 50. For example, the connector 25 may be a USB type connector.
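The control path just described, in which the at least one light source is controllable by the external device via the circuit board 50 over either the transceiver or the connector 25, can be pictured with a small device-side sketch. Everything here is hypothetical: the command names and the handler structure are assumptions used only to illustrate the logic, and no particular Bluetooth or USB programming interface is implied.

    class LightSourceController:
        # Hypothetical sketch of the control logic on circuit board 50. Commands
        # would arrive over the Bluetooth transceiver or the USB type connector 25;
        # the command vocabulary below is an assumption.

        def __init__(self, num_leds=8):
            self.num_leds = num_leds   # e.g., eight LEDs around the annular portion 30
            self.leds_on = 0           # intensity expressed as the number of LEDs lit
            self.flash_ms = 0          # flash duration in milliseconds

        def handle_command(self, command, value=None):
            if command == "SET_INTENSITY":
                self.leds_on = max(0, min(self.num_leds, int(value)))
            elif command == "SET_FLASH_DURATION":
                self.flash_ms = max(0, int(value))
            elif command == "LIGHT_OFF":
                self.leds_on = 0
            else:
                raise ValueError("unknown command: " + command)
            return self.leds_on, self.flash_ms

    # Example: the mobile application requests a 10 second exposure at full intensity.
    controller = LightSourceController()
    controller.handle_command("SET_INTENSITY", 8)
    controller.handle_command("SET_FLASH_DURATION", 10_000)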
[0105] As shown in FIG. 4, a second exemplary embodiment of the lighting and filtering device configured for use with a camera is shown. The device 2 shown in FIG. 4 is similar to the device 1, except that the device 2 is of a different size and shape. The device 2 includes housing 5a having a front surface 6a, a first lateral side 8a, a second lateral side 9a, and a third lateral side 10a. The housing 5a also includes an annular portion 30a disposed on the first lateral side 8a of the housing 5a. The annular portion 30a houses at least one light source 40 and a light filter (e.g., band-pass filter) 45. In various embodiments, the light filter 45 is configured to be selectively coupled to, and selectively uncoupled from, the annular portion 30a via a threaded interface.
[0106] As shown in FIGS. 5-6, a third exemplary embodiment of the lighting and filtering device configured for use with a camera is shown. The device 3 is similar to the device 1 herein disclosed, except for the differences herein described. The device 3 includes a housing 5b which includes a front, closed surface 6b and a rear, closed surface 7b. The device 3 also includes a first lateral side 8b, a second lateral side 9b, and a third lateral side 10b between the first lateral side 8b and the second lateral side 9b. The device 3 also includes an annular portion 30b which is disposed between the first lateral side 8b and the second lateral side 9b. The annular portion 30b is configured to house the at least one light source 40 (described above) and a light filter (e.g., band-pass filter) 45 (described above). The annular portion 30b is also configured to house a camera 60 which is configured to capture an image of a portion of a patient's skin illuminated by the at least one light source 40. The device 30 also includes a button 13b for controlling the camera 60.
[0107] The device 3 also includes the circuit board 50 (described above), except that the circuit board 50 also includes a processor configured to process an image captured by the camera. For example, the circuit board 50 is configured to execute programmable logic (such as an algorithm of a mobile software application) stored in a memory stored in device 3. When the processor executes the programmable logic, the device 3 is usable for evaluating a skin condition of the patient because the processor processes and analyzes the image provided by the camera to identify skin fluorescence produced by the device 3 on the skin of the patient. The processor is then capable of mapping skin parameters and trends in relation to the patient by comparing the captured image or images to a control image or images.
[0108] When a patient is undergoing treatment for p. acnes, the device 3 is useable to compute a severity index of the acne based on a presence and/or amount of porphyrin on the skin of the patient based on a fluorescence level detected by the camera 60 on the surface of the skin. The fluorescence level, in one example, corresponds to an intensity of the fluorescence (such as an overall brightness, an overall amount of fluorescence produced, etc.) or, in another example, corresponds to a density of the fiuorescence (such as an amount of fluorescence distributed over the surface area of the skin of a user). To compute the severity index, the processor compares the presence or amount of porphyrin present on the skin based on a captured image provided by the camera 60 and compares that captured image to a control image stored in the memory of the device 3. The control image is, as one example, a stock image showing a predetermined amount of porphyrin. The predetermined amount of porphyrin is an amount that represents a low (e.g., zero) severity index of acne. The processor compares the captured image or images to the control image and calculates a difference in the amount of porphyrin detected in the captured images relative to the amount of porphyrin in the control image or images. The processor then computes a severity index (e.g., scaled from 1 to 10, etc.) based on the calculated difference. [0109] As shown in FIGS. 9-11, an assembly 100 includes a device 1 (as described above) and an external device 120, such as a smartphone.
[0110] Referring to FIGS. 9-10, the external device 120 includes a front side 121 which includes an interface 123 which allows a user of the external device 120 (such as a patient) to use software (e.g., mobile software applications) on the external device 120. The external device 120 also includes a back side 122 on which the device 1 is mountable. The device 1 is mountable on the external device 120 by any suitable means, for example, by use of a magnet to mount the device 1 to the back side 122 of the external device 120. The device 1 is in electrical connection with the external device 120, for example, by a physical connection or by a wireless connection or a combination of both. The external device 120 also includes a first lateral side 128, a second lateral side 129, and a third lateral side 130 disposed between the first lateral side 128 and the second lateral side 129.
[0111] As shown in FIG. 11, the external device 120 includes a camera 160, a memory 170, and a processor 180. The camera 160 is capable of imaging a portion of the skin of a patient by capturing an image of the portion of skin. As one specific example, the camera 160 is an 8 Mega-pixel (Mpixel) camera with a 2.4 aperture. As a further specific example, the camera 160 is a 12 Mpixel camera with a 2.2 aperture. As a still further specific example, the camera 160 is a 12 Mpixel camera with a 1.8 aperture. The camera 160 is configured to provide the captured image to the memory 170 of the external device 120. The memory 170 is configured to store the captured image as a stored captured image. The stored image is, according to one example, a two-dimensional image. According to a further example, the stored image includes multiple images superimposed on each other to create a single two-dimensional image. For example, a facial stored image includes a left facial side image, a right facial side image, and a frontal facial image, superimposed to create a single two-dimensional image. As a still further example, the stored image is a three-dimensional image created by superimposing a left facial side image, a right facial side image, and frontal facial image. According to one aspect, camera 160 includes dual-camera technology. [0112] The external device 120 also includes a programmable memory 170, as shown in FIG. 11. The programmable memory 170 is configured to store software, such as an operating system (OS). In some examples, the OS is Apple's iOS® or Google's Android®, but in other examples, the OS is another mobile OS. The memory 170 is configured to store programmable logic such as a mobile software application, (e.g., an application for evaluating a skin condition of a patient or evaluating an effectiveness of a skin treatment of the patient) configured to process and/or analyze an image stored in the memory 170 and thereby determine a presence and/or an amount of a substance present on the patient's skin based on the image. The memory 170 is also configured to store images captured by the camera 160 of the external device 120 or control images provided to the external device 120. The memory 170 is, as one example, transitory memory and/or non-transitory memory. For example, memory 170 is implemented as RAM, ROM, flash, ferromagnetic, phase-change memory, and/or other suitable memory technologies.
[0113] The external device 120 also includes a processor 180 which is configured to execute programmable logic (such as an algorithm of a mobile software application) stored in the memory 170 of the external device 120. When the processor 180 executes the mobile software application for evaluating the skin condition of the patient, the processor processes and analyzes the image to identify skin fluorescence produced by the device 1 on the skin of the patient. The processor is then capable of mapping skin parameters and trends in relation to the patient by comparing the captured image or images to a control image or images.
[0114] The processor 180 may facilitate mobile communication between the device 1 and a cloud based server. The cloud based server may be configured to perform analysis of images taken by the external device 120, rather than relying on the mobile software application performing such analysis. This may reduce the operating requirements of the device 1, thereby facilitating use of the device 1 with external devices 120 having varying capabilities.
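Where image analysis is delegated to a cloud based server as described above, the external device would simply upload the captured image and read back the computed results. The endpoint URL, field names, and response keys in the sketch below are hypothetical placeholders and are not part of the disclosure.

    import requests  # third-party HTTP client, used here only for illustration

    ANALYSIS_URL = "https://example.invalid/api/skin-analysis"  # hypothetical endpoint

    def upload_for_analysis(image_path):
        # Send a captured image to a (hypothetical) cloud analysis service and
        # return its JSON response, e.g. {"area_score": ..., "average_fluorescence": ...}.
        with open(image_path, "rb") as f:
            response = requests.post(ANALYSIS_URL, files={"image": f}, timeout=30)
        response.raise_for_status()
        return response.json()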
[0115] As shown in FIG. 12, according to one aspect, the device 1 also includes a mirror 90 which is positioned on the front surface 6 of the housing 5, and is also positioned relative to the annular portion 30. Accordingly, when the device 1 is mounted to the external device 120, the mirror 90 is positioned relative to the camera 160 of the external device 120 such that the patient or user can use the mirror 90 to facilitate appropriate positioning of the device 1 relative to the portion of the patient's skin which the patient desires to image.
[0116] FIGS. 13-14 show an alternative aspect of the assembly. Assembly 200 includes an external device 220, which is similar to the external device 120 described above, and the device 2, described above. Assembly 200 is useable, as is assembly 100, for patient treatment, as disclosed herein.
[0117] FIGS. 15A-18 show photographs taken using the camera 160 of an external device 120 compared to high contrast and/or filtered images. FIG. 15A shows an image captured by a camera of a handheld media device. FIG. 15B shows an image captured by the camera of a handheld media device using the device 1. FIG. 16A shows an image captured by a camera of a cellular phone. FIG. 16B shows an image captured by the camera of the cellular phone using the device 1. FIG. 17 shows an image captured by the camera 160 of the external device 120 using the device 1. FIG. 18 shows an image captured by the camera 160 of the external device 120 using the device 1. FIGS. 17 and 18 are useable to detect a fluorescence from the skin of the face of the patient.
[0118] When a patient is undergoing treatment for p. acnes, the assembly 100 is useable to compute a severity index of the acne based on a presence and/or amount of porphyrin on the skin of the patient based on a fluorescence level detected by the camera 160 or camera 60 on the surface of the skin. The fluorescence level, in one example, corresponds to an intensity of the fluorescence (such as an overall brightness, an overall amount of fluorescence produced, etc.) or, in another example, corresponds to a density of the fluorescence (such as an amount of fluorescence distributed over the surface area of the skin of a user). To compute the severity index, the circuit board 50, the processor 180 or mobile software application compares the presence or amount of porphyrin present on the skin based on a captured image provided by the camera 60 or camera 160 and compares that captured image to a control image (for example, a control image stored in the memory 170 of the external device 120). The control image is, as one example, a stock image showing a predetermined amount of porphyrin. The predetermined amount of porphyrin is an amount that represents a low (e.g., zero) severity index of acne. The circuit board 50, the processor 180, or mobile software application compares the captured image or images to the control image and calculates a difference in the amount of porphyrin detected in the captured images relative to the amount of porphyrin in the control image or images. The mobile software application then computes a severity index (e.g., scaled from 1 to 10, etc.) based on the calculated difference and provides the severity index to the patient via an interface 123 of the external device 120.
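One way to picture the severity index computation described above is to compare a porphyrin measure extracted from the captured image against the same measure from the control image and map the difference onto a 1 to 10 scale. The choice of porphyrin measure (here, the fluorescent area percentage from the earlier pixel analysis) and the linear mapping are assumptions for illustration; the disclosure does not fix a particular formula.

    def severity_index(captured_measure, control_measure, max_expected_excess=10.0):
        # Illustrative severity index on a 1-10 scale. Both arguments are any
        # consistent porphyrin measure (e.g., a fluorescent area percentage);
        # the control image represents a low (near-zero) severity. The linear
        # scaling via max_expected_excess is an assumption.
        excess = max(0.0, captured_measure - control_measure)
        scaled = 1.0 + 9.0 * min(excess, max_expected_excess) / max_expected_excess
        return round(scaled, 1)

    # Example: 4.4% fluorescent area in the captured image versus 0.5% in the control.
    print(severity_index(4.4, 0.5))   # about 4.5 on the 1-10 scale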
[0119] According to one aspect, the mobile software application and the processor 180 track serial images of the skin of the patient and provide overall treatment progress to the patient in any suitable visual format, such as graphical presentation using interface 123. For example, the mobile software application compares one or more captured images of the patient's skin to an initial image of the patient's skin prior to or concurrent with the beginning of any treatment and which is stored in the memory of the external device 120. The mobile software application calculates a difference in the amount of porphyrin detected in the captured image or images and the amount of porphyrin detected in the initial image. The mobile software application communicates the calculated difference to the patient by, for example, graphically displaying the calculated difference to indicate an overall treatment progress via the interface 123. The graphical display is, for example, in the form of a numerical value (e.g., a percent difference, a severity index difference), a trend line, or an image comparison (e.g., side-by-side image comparison of the captured image or images to the initial image or images, a graphical morphing of the initial image or images to show the initial amount of porphyrin detected in the captured image or images showing a current amount of porphyrin detected).
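The serial tracking described in the preceding paragraph amounts to storing one porphyrin measure per imaging session and reporting the change relative to the initial, pre-treatment image. The session record structure and the percent-difference display below are illustrative assumptions.

    from datetime import date

    def treatment_progress(sessions):
        # sessions is a chronologically ordered list of (date, porphyrin_measure)
        # pairs; the first entry is the pre-treatment baseline. Returns the percent
        # change of each later session relative to that baseline (negative values
        # indicate improvement).
        (_, baseline), *rest = sessions
        progress = []
        for day, measure in rest:
            change_pct = 100.0 * (measure - baseline) / baseline if baseline else 0.0
            progress.append((day, round(change_pct, 1)))
        return progress

    history = [(date(2018, 1, 1), 4.4), (date(2018, 2, 1), 3.1), (date(2018, 3, 1), 2.2)]
    print(treatment_progress(history))   # percent change per session: -29.5, then -50.0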
[0120] As an additional example, the mobile software application graphically displays overall progress based on a series of images captured of the patient's skin. Accordingly, the algorithm of the mobile software application is configured to analyze an image of the skin of the patient, in particular, the mobile software application analyzes the porphyrin amount that is visible or otherwise detectable due to the fluorescence of the porphyrin and then counts, measures, and/or estimates the amount of p. acnes and provides a quantitative output to the user via the interface 123 of the external device 120.
[0121] By more actively engaging the patient with the acne treatment using the mobile software application, treatment product usage and effectiveness is improved, particularly because the patient is able to confirm that an acne treatment is working, even if improvement is not immediately visually noticeable but is made evident by the severity index showing a declining trend in an incidence or severity of p. acnes. Similarly, if the treatment is not working effectively, the patient is informed by the mobile software application of the ineffective treatment and, for example, of alternative or additional treatment methods or other lifestyle tips that may improve the acne condition.
[0122] The assembly 100 is useable to evaluate treatment of a patient's skin for the condition of photodamage. Photodamage refers to structural and/or functional deterioration of sun- exposed skin, specifically damage to skin and/or DNA of the skin caused by exposure to UV radiation. The assembly 100 is used to identify portions of a patient's skin which have suffered from photodamage. The mobile software application provides an algorithm that computes a quantitative measure of photodamage using a captured image of the portion of skin. The quantitative measure is communicated to the patient by the mobile software application using the interface 123 of the external device 120. The mobile software application is also configured to provide the patient with recommended treatment products that can be used to treat photodamage and/or lifestyle changes that can be taken to reduce a future amount of photodamage to the skin. For example, the mobile software application compares the captured image to a control image having a low or zero level of photodamage which is stored in the memory 170 of the external device 120. Specifically, the mobile software application compares a detected amount of photodamage based on a captured image or images of the patient's skin to a detected amount of photodamage based on the control image. The mobile software application then graphically displays an indication of the photodamage present on the patient. The mobile software application, for example, graphically displays a scaling factor or index level (e.g., a relative scale from 1 to 10) of the severity of photodamage on the patient. [0123] The mobile software application, according to another example, compares a detected amount of photodamage based on a captured image or images of the patient's skin to a detected amount of photodamage based on an initial or previous captured image or images of the patient's skin. The mobile software application then graphically displays an indication of a progress or change of severity of photodamage present on the patient and/or provides an indication in reduction of photodamage over time.
[0124] The assembly 100 is useable to evaluate or determine a presence or severity of actinic keratosis (also called solar keratosis) lesions, which are scaly, crusty growths (lesions) caused by UV radiation damage to a patient's skin. Actinic keratosis lesions typically appear on sun- exposed areas of skin (e.g., face, bald scalp, lips, and the back of the hands, etc.) and are often elevated, rough in texture, and resemble warts. Endogenous photosensitizer protoporphyrin IX (PpIX) is a proxy for a condition of actinic keratosis lesions. Fluorescence emitted by methyl ester of ALA (MAL)-induced PpIX is useful in providing a fluorescence-based diagnosis of lesions. This permits the detection of otherwise occult areas of abnormal skin, including tumor margins which may be related to the actinic keratosis lesions.
[0125] The mobile software application includes an algorithm which computes a quantitative measure of actinic keratosis lesions based on a captured image or images of a portion of skin of the patient. The quantitative measure is displayed by the mobile software application, for example, on the interface 123 of the external device 120, along with information related to recommended treatment products and/or lifestyle changes that can be made by the patient to reduce or lower a severity or incidence of actinic keratosis lesions in the future. The mobile software application, for example, computes the quantitative measure by comparing a detected amount of actinic keratosis lesions based on the captured image or images to a detected amount of actinic keratosis lesions based on a control image or images stored in the memory 170 of the external device 120. The quantitative measure is based on the determined difference in the detected amounts of the skin condition. The mobile software application provides a graphical display showing a scaling factor or index (e.g., a relative scale of 1 to 10) of the severity of the actinic keratosis lesions on the patient.

[0126] According to another example, the mobile software application analyzes a previously captured image or images of the user's skin stored in the memory 170 of the external device 120. The mobile software application then compares a detected amount of actinic keratosis lesions based on the previously captured image or images to that based on the currently captured image or images and graphically displays an indication of the progress or reduction of severity of actinic keratosis lesions present on the patient over time.
[0127] The assembly 100 is useable to evaluate or determine an appropriate application of a substance to the surface of the skin of a patient even when the patient is not suffering from a skin condition. For example, the assembly 100 is useable to determine or verify a correct application of sunscreen on the skin of the user. The mobile software application provides an algorithm which quantitatively computes the effectiveness of the sunscreen application. For example, the assembly 100 analyzes a captured image from the camera 160 of the external device 120 and compares the captured image to a control image, stored in the memory 170 of the external device 120, which illustrates an optimal amount of sunscreen application (e.g., an optimal overall density of sunscreen applied to the skin). Based on the comparison, the mobile software application determines an effectiveness of the sunscreen application and graphically displays an indication of the effectiveness via the interface 123 of the external device 120. For example, the mobile software application provides a graphic showing a scaling factor or index level (e.g., a relative scale from 1 to 10) of the effectiveness of the application of sunscreen. As a specific example, the graphical display provides relative coloring on the images of the skin of the patient to indicate areas of effective and ineffective sunscreen application (e.g., red areas indicate poor application of sunscreen while green areas indicate optimal or appropriate application of sunscreen).
[0128] When an image of skin is captured using UV light, the darker the skin in the image, the less UV light is absorbed by the skin, meaning that the skin is better protected against UV radiation by sunscreen. Similarly, white patches in the image captured using UV light show areas of skin where a substantial amount of UV light is getting through, indicating poor application of sunscreen. These principles are incorporated into the algorithm of the mobile software application to provide the quantitative evaluation for the patient.
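The following Python sketch illustrates this principle; the brightness threshold of 120 and the red/green overlay colors are assumptions chosen only for illustration and are not taken from the disclosure.

from PIL import Image

def sunscreen_coverage_overlay(uv_image_path, threshold=120):
    """Mark dark pixels (UV absorbed by sunscreen) green and bright pixels
    (UV getting through) red, and report the fraction of covered pixels."""
    gray = Image.open(uv_image_path).convert("L")
    pixels = list(gray.getdata())
    overlay = Image.new("RGB", gray.size)
    overlay.putdata([(0, 160, 0) if p < threshold else (200, 0, 0) for p in pixels])
    covered = sum(1 for p in pixels if p < threshold)
    print(f"{100.0 * covered / len(pixels):.1f}% of pixels read as covered")
    return overlay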
[0129] According to another exemplary embodiment, the present disclosure relates to a method for evaluating a skin condition on a skin area of a patient using a device herein disclosed (such as a device connectable to an external device). The method includes illuminating a skin area of the user with the device, capturing an image of the illuminated skin area with a camera, processing the image with a processing program to determine the level of fluorescence on the skin area, and mapping progress of treatment of the skin condition based on a comparison of the processed image to a control image. In one aspect, the skin condition is acne. In another aspect, the level of fluorescence on the skin area of the user indicates the presence of Propionibacterium acnes via porphyrins. In yet another aspect, the mapping of progress of treatment includes calculation of an acne severity index for the patient.
[0130] FIG. 19 illustrates a process 1900 for using the assembly 100 to analyze p-acnes on a user's skin. The process 1900 begins, in block 1902, with connecting the device 1 to the external device 120. For example, the device 1 may be plugged into the external device 120 via a connector (e.g., a USB connector, a Lightning connector, etc.). Additionally, as part of block 1902, the device 1 may be connected to an external power source (e.g., a wall outlet, etc.).
[0131] The process 1900 continues, in block 1904, with initiating a mobile software application on the external device 120. The mobile software application may, for example, be stored in the memory 170 of the external device 120. The mobile software application may be initiated by a user selecting, via a touch screen of the external device 120, an icon associated with the mobile software application. The mobile software application may also be initiated when the device 1 is connected to the external device 120.
[0132] The process 1900 then continues, in block 1906, with positioning the device 1 adjacent a target area of the user's skin. For example, the target area may be a cheek or nose area of a user's skin. Once the device 1 is positioned adjacent the target area, the process 1900 continues, in block 1908, with emitting, by the device 1, UV light onto the target area. The UV light may be emitted by the device 1 via the at least one light source 40. In some applications, the UV light is emitted by the at least one light source 40 and then passed through the filter 45 prior to being emitted onto the target area.
[0133] After the UV light has been emitted onto the target area, certain locations on the target area may exhibit a fluorescence. For example, if a p-acne is present in the target area, the p-acne may exhibit a fluorescence when UV light is emitted onto the target area. Different fluorescence may correspond to different conditions existing in the target area. For example, a concentration of fluorescence in the target area when UV light is emitted onto the target area may indicate a concentration of p-acnes present in the target area.
[0134] While the UV light is emitted onto the target area, the process 1900 continues, in block 1910, with obtaining (e.g., capturing, etc.), by the camera 160, an image (e.g., photograph, high resolution image, etc.) of the target area. In some applications, the camera 160 does not obtain the image until the UV light has been emitted onto the target area for a target amount of time (e.g., 2 seconds, 5 seconds, 10 seconds, 15 seconds, etc.). The image may be obtained in response to an interaction by the user (e.g., a user pressing a button on the device 1, a user pressing a button on the external device 120, etc.) or in response to the target amount of time elapsing (e.g., after the UV light has been emitted onto the target area for 10 seconds, etc.).
[0135] After the image of the target area has been obtained, the external device 120 performs analysis (e.g., processing, etc.) of the image to determine whether any conditions exist in the target area. Specifically, the process 1900 continues, in block 1912, with determining, by the processor 180 of the external device 120, as instructed by the mobile software application, whether each pixel of the image has an attribute that is above an attribute threshold. The attribute may be, for example, an amount of red light (e.g., the "R" component in an RGB value for the pixel, etc.) or an amount of blue light (e.g., the "B" component in an RGB value for the pixel, etc.). For example, the attribute may be the red component of the RGB values for each of the pixels in an image and the processor 180 may be configured to utilize an attribute threshold of 90. According to this example, the processor 180 may, for a pixel with a red component of 85, determine that the pixel does not have an attribute that is above the attribute threshold, and may, for a pixel with a red component of 100, determine that the pixel has an attribute that is above the attribute threshold.
[0136] The process 1900 continues in block 1914 with recording (e.g., cataloging, etc.) each pixel that the processor 180 has determined has an attribute above the attribute threshold. Where the attribute of a pixel exceeds the attribute threshold in block 1912, the processor 180 may record the pixel by coordinate in an array. When a pixel is recorded, the attribute for the pixel is also recorded. In an example, an image may be a square having a width of 1024 pixels and a height of 1024 pixels such that each pixel in the image has a coordinate (x, y) that corresponds to a location within the image, where 0 ≤ x < 1024 and 0 ≤ y < 1024.
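Blocks 1912 and 1914 might be sketched as follows in Python; the red channel is used as the attribute with the example threshold of 90 from paragraph [0135], while the (x, y, attribute) list layout is an assumption made for this sketch.

from PIL import Image

def record_bright_pixels(image_path, attribute_threshold=90):
    """Blocks 1912-1914: test the red component of every pixel against the
    attribute threshold and record the coordinates and attribute of each
    pixel that exceeds it."""
    img = Image.open(image_path).convert("RGB")
    width, height = img.size
    recorded = []  # entries of the form (x, y, attribute)
    for y in range(height):
        for x in range(width):
            r, g, b = img.getpixel((x, y))
            if r > attribute_threshold:
                recorded.append((x, y, r))
    return recorded, width * height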
[0137] The process 1900 repeats blocks 1912 and 1914 for each pixel in the image. When block 1914 has been performed for the last pixel in the image, the process 1900 continues in block 1916 with calculating, by the processor 180, an area percentage for the image. The area percentage is calculated by dividing the number of pixels for which the associated attribute was greater than the attribute threshold by the total number of pixels. For example, an image that is 2048 pixels wide by 3072 pixels tall has 6,291,456 pixels. Following this example, if 250,000 pixels each have an attribute (e.g., "R" component, etc.) that is above an attribute threshold (e.g., 90, 80, 70, 110, 150, etc.), the area percentage is approximately 3.97%.
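To make the arithmetic of block 1916 concrete, a minimal worked check in Python using the example numbers above:

# Worked check of the area-percentage example in paragraph [0137].
total_pixels = 2048 * 3072                 # 6,291,456 pixels
pixels_above_threshold = 250_000
area_percentage = 100.0 * pixels_above_threshold / total_pixels
print(f"{area_percentage:.2f}%")           # prints "3.97%"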
[0138] The process 1900 continues, in block 1918, with calculating, by the processor 180, an area score for the image based on a comparison between the area percentage and an area percentage threshold (e.g., 1%, 2%, 3%, 4%, 5%, etc.). In some embodiments, the processor 180 is configured to assign an area score of between 0 and 10 to the image, where an area score of 0 indicates substantially no p-acne is shown in the image, an area score of 1-3 indicates
substantially normal p-acne is shown in the image, an area score of 4-6 indicates advanced p-acne is shown in the image, and an area score of greater than 6 indicates severe p-acne is shown in the image.

[0139] After the area score is calculated, the process 1900 continues, in block 1920, with calculating, by the processor 180, an average fluorescence for the image by dividing the sum of all the attributes for pixels where the attribute is greater than the attribute threshold by the number of pixels where the attribute is greater than the attribute threshold. The average fluorescence may be related to a condition existing in the target area. For example, for a given area score, a greater average fluorescence may indicate a different type of p-acne is present in the target area.
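Blocks 1918 and 1920 might be sketched as follows, reusing the recorded list from the earlier sketch; mapping the area percentage onto the 0-to-10 score by dividing by the area percentage threshold is an assumption, since the disclosure gives only the 0-to-10 range and the qualitative bands noted above.

def area_score(area_percentage, area_percentage_threshold=1.0):
    """Block 1918: scale the measured area percentage against the area
    percentage threshold, capped at 10 (assumed mapping)."""
    return min(10.0, area_percentage / area_percentage_threshold)

def average_fluorescence(recorded):
    """Block 1920: mean attribute over the pixels recorded in block 1914."""
    if not recorded:
        return 0.0
    return sum(attribute for _, _, attribute in recorded) / len(recorded)

def interpret_area_score(score):
    """Qualitative bands from paragraph [0138]."""
    if score == 0:
        return "substantially no p-acne"
    if score <= 3:
        return "substantially normal p-acne"
    if score <= 6:
        return "advanced p-acne"
    return "severe p-acne"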
[0140] After block 1920, the process 1900 ends by, in block 1922, displaying, by the external device 120, the image, area score, and/or average fluorescence, to a user. In this way, the user may be provided with a direct visual representation of the target area and, through repeated iterations of the process 1900, visualize the effects of preventative care, thereby increasing the incentive for the user to perform preventative care.
[0141] FIGS. 20 and 21 illustrate images of a target area. The process 1900 was implemented for both FIGS. 20 and 21. For the image shown in FIG. 20, the process 1900 produced an area score of 1.79, an area percentage of 1.09%, and an average fluorescence of 164.2. For the image shown in FIG. 21, the process 1900 produced an area score of 1.82, an area percentage of 1.33%, and an average fluorescence of 137.2.
[0142] FIGS. 22-25 illustrate the device 1 according to some embodiments. In these embodiments, the device 1 includes a blocking tube 2200 (e.g., cylinder, pipe, etc.). The blocking tube 2200 is coupled to the annular portion 30 and disposed about the light sources 40 and the filter 45.
[0143] The blocking tube 2200 is configured to be selectively coupled to, and selectively removed (e.g., decoupled, etc.) from, the device 1. When the blocking tube 2200 is coupled to the device 1, the device 1 is utilized by placing a distal end of the blocking tube 2200 in contact with, or in close proximity to, a target area. Such a placement of the blocking tube 2200 blocks light from extraneous sources and, therefore, causes the image to be more representative of the target area. As a result, when the blocking tube 2200 is coupled to the device 1, the area score, the area percentage, and the average fluorescence may be more accurate, thereby providing the user with more accurate information so that, for example, the user is less surprised if preventative treatment is less effective than otherwise would be expected.
[0144] FIGS. 26-28 illustrate images obtained by the device 1 with the blocking tube 2200 coupled to the device. The process 1900 was implemented for FIGS. 26-28. For the image shown in FIG. 26, the process 1900 produced an area score of 5.3, an area percentage of 4.37%, and an average fluorescence of 121.1. For the image shown in FIG. 27, the process 1900 produced an area score of 1.85, an area percentage of 1.73%, and an average fluorescence of 108.1. For the image shown in FIG. 28, the process 1900 produced an area score of 0.54, an area percentage of 1.07%, and an average fluorescence of 50.8.
[0145] The external device 120 may automatically determine when the blocking tube 2200 is coupled to the device 1 and when the blocking tube 2200 is not coupled to the device 1. For example, the device 1 may include a sensor in the annular portion 30 that senses presence of the blocking tube 2200 (e.g., via electrical resistance, via electrical contact, etc.). The mobile software application may automatically or manually be adjusted depending on whether or not the blocking tube 2200 is coupled to the device 1. For example, the mobile software application may be configured to receive an input from a user to switch between a "Macro" mode for using the device 1 with the blocking tube 2200 coupled to the device 1 and a "Full Face" mode for using the device 1 with the blocking tube 2200 not coupled to the device 1.
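As an illustrative sketch only, the automatic or manual mode selection described above could look like the following; the sensor read-out, the mode names taken from the example above, and the per-mode analysis parameters are assumptions and are not specified by the disclosure.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AnalysisSettings:
    mode: str                        # "Macro" (blocking tube) or "Full Face"
    attribute_threshold: int
    area_percentage_threshold: float

def select_mode(blocking_tube_detected: bool, user_override: Optional[str] = None) -> AnalysisSettings:
    """Choose settings from the blocking-tube sensor unless the user has
    manually selected a mode in the mobile software application."""
    mode = user_override or ("Macro" if blocking_tube_detected else "Full Face")
    if mode == "Macro":
        return AnalysisSettings(mode, attribute_threshold=90, area_percentage_threshold=1.0)
    return AnalysisSettings(mode, attribute_threshold=90, area_percentage_threshold=2.0)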
[0146] FIGS. 29-32 illustrate the blocking tube 2200 in greater detail, with various dimensions shown in millimeters. FIG. 30 is a cross-sectional view of FIG. 29, taken about plane A-A. As shown in FIGS. 29-32, the blocking tube 2200 is configured for an external device 120 having a thickness of approximately 9 millimeters. Additionally, the height of the blocking tube 2200 shown in FIGS. 29-32 is 64 millimeters. It is understood that other similar heights and other dimensions of the blocking tube 2200 could be selected such that the blocking tube 2200 could be tailored for a target application.

[0147] In various embodiments, at least an internal surface 3000 of the blocking tube 2200 is black, not translucent, not transparent, and non-reflective. In this way, light emitted by the light sources 40 is directed only to the target area, and is not wasted through transmission through the blocking tube 2200. As a result, when the blocking tube 2200 is coupled to the device 1, the area score, the area percentage, and the average fluorescence may be more accurate, thereby providing the user with more accurate information so that, for example, the user is less surprised if preventative treatment is less effective than otherwise would be expected.
[0148] As utilized herein, the terms "approximately," "about," "substantially," and similar terms are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of the disclosure pertains. It should be understood by those of skill in the art who review this disclosure that these terms are intended to allow a description of certain features described and claimed without restricting the scope of these features to the precise numerical ranges provided. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of this disclosure as recited in the appended claims.
[0149] The terms "coupled," "connected," and the like are used herein to mean the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members or the two members and any additional intermediate members being integrally formed as a single unitary body with one another or with the two members or the two members and any additional intermediate members being attached to one another.
[0150] References herein to the position of elements (e.g., "top," "bottom," "above," "below," etc.) are merely used to describe the orientation of various elements in the Figures. It should be noted that the orientation of various elements may differ according to other exemplary embodiments and that such variations are intended to be encompassed by the present disclosure.

[0151] It is to be understood that although the present invention has been described with regard to embodiments thereof, various other embodiments and variants may occur to those skilled in the art, which are within the scope and spirit of the invention, and such other embodiments and variants are intended to be covered by corresponding claims. Those skilled in the art will readily appreciate that many modifications are possible (e.g., variations in sizes, structures, shapes and proportions of the various elements, mounting arrangements, use of materials, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter described herein. For example, the order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions,
modifications, changes, and omissions may also be made in the design, operating conditions and arrangement of the various exemplary embodiments without departing from the scope of the present disclosure.
[0152] With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context or application. The various singular/plural permutations may be expressly set forth herein for clarity.

Claims

WHAT IS CLAIMED IS:
1. A lighting and filtering device configured for use with a camera, the lighting and filtering device comprising:
at least one light source configured to emit light having a peak wavelength in a range between 365 nm and 445 nm, and thereby illuminate a portion of a patient's skin to produce a fluorescence on the portion of the patient's skin; and
a filter configured to be disposed in front of a lens of the camera, the filter being (i) a band-pass filter configured to facilitate transmission of light having a peak wavelength in a range between 550 nm and 750 nm therethrough, or (ii) a long-pass filter configured to attenuate light that is less than a first predetermined wavelength, the first predetermined wavelength being in a range of 400 nm to 450 nm.
2. The lighting and filtering device according to claim 1, wherein the lighting and filtering device is configured to be physically mounted to the camera.
3. The lighting and filtering device according to claim 1, wherein the lighting and filtering device is configured to be physically mounted to an external device that comprises the camera.
4. The lighting and filtering device according to claim 3, wherein the external device is a cellular phone, a tablet computer, a handheld media device, a laptop computer, or a desktop computer.
5. The lighting and filtering device according to claim 3, wherein the external device is a cellular phone.
6. The lighting and filtering device according to claim 5, wherein the lighting and filtering device is configured to be physically mounted to the cellular phone via a magnet.
7. The lighting and filtering device according to claim 5, further comprising a housing having:
a first lateral side configured to extend along a first lateral side of the cellular phone;
a second lateral side configured to extend along a second lateral side of the cellular phone; and
a third lateral side extending from the first lateral side to the second lateral side.
8. The lighting and filtering device according to claim 7, wherein:
the housing comprises an annular portion,
the at least one light source is located at a surface of the annular portion, and
the annular portion surrounds the filter.
9. The lighting and filtering device according to claim 7, wherein
the at least one light source comprises a plurality of light sources, and
the plurality of light sources are evenly distributed around the surface of the annular portion.
10. The lighting and filtering device according to claim 9, wherein the plurality of light sources comprises 8 light sources.
11. The lighting and filtering device according to claim 1, further comprising:
a battery configured to supply power to the at least one light source.
12. The lighting and filtering device according to claim 1, further comprising:
a processor configured to control the at least one light source.
13. The lighting and filtering device according to claim 12, further comprising:
a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
14. The lighting and filtering device according to claim 13, wherein the transceiver operates using a frequency-hopping technique.
15. The lighting and filtering device according to claim 13, wherein the external device is configured to receive imaging data from the camera via the transceiver and further configured to process the imaging data.
16. The lighting and filtering device according to claim 15, wherein the external device is further configured to determine a presence and an amount of a substance on the portion of the patient's skin based on the imaging data.
17. The lighting and filtering device according to claim 16, wherein the external device is further configured to compare the imaging data to imaging data of a control image.
18. The lighting and filtering device according to claim 12, further comprising:
a connector configured to allow wired communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor.
19. The lighting and filtering device according to claim 18, wherein the connector is a USB type connector.
20. The lighting and filtering device according to claim 1, further comprising:
a processor;
a transceiver configured to allow wireless communication between the processor and an external device, such that the at least one light source is controllable by the external device via the processor; and
a connector configured to allow wired communication between the processor and the external device, such that the at least one light source is controllable by the external device via the processor.
21. The lighting and filtering device according to claim 20, wherein:
the transceiver operates using a frequency-hopping technique, and
the connector is a USB type connector.
22. The lighting and filtering device according to claim 1, wherein the band-pass filter is configured to attenuate light that is less than a second predetermined wavelength and greater than a third predetermined wavelength, the second predetermined wavelength being in a range of 550 to 650 nm, and the third predetermined wavelength being in a range of 650 to 750 nm.
23. The lighting and filtering device according to claim 1, wherein the first predetermined wavelength is in a range of 430 nm to 440 nm.
24. The lighting and filtering device according to claim 23, wherein the first predetermined wavelength is 435 nm.
25. The lighting and filtering device according to claim 1, wherein the at least one light source is configured to emit light having a peak wavelength in a range between 385 nm and 425 nm.
26. The lighting and filtering device according to claim 25, wherein the at least one light source is configured to emit light having a peak wavelength in a range between 400 nm and 415 nm.
27. The lighting and filtering device according to claim 26, wherein the at least one light source is configured to emit light having a peak wavelength of 405 nm.
28. The lighting and filtering device according to claim 1, wherein the peak wavelength of light emitted by the band-pass filter is in a range between 600 nm and 650 nm.
29. The lighting and filtering device according to claim 28, wherein the peak wavelength of light emitted by the band-pass filter is in a range between 630 nm and 640 nm.
30. The lighting and filtering device according to claim 29, wherein the peak wavelength of light emitted by the band-pass filter is 635 nm.
31. The lighting and filtering device according to claim 1, wherein the band-pass filter is configured to facilitate transmission of light having a full width at half maximum in a range of 70 nm to 130 nm therethrough.
32. The lighting and filtering device according to claim 1, wherein the lighting and filtering device comprises the camera.
33. The lighting and filtering device according to claim 1, further comprising a black housing.
PCT/IB2018/001517 2017-11-10 2018-11-12 Uv device for evaluation of skin conditions WO2019092509A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762584404P 2017-11-10 2017-11-10
US62/584,404 2017-11-10

Publications (1)

Publication Number Publication Date
WO2019092509A1 true WO2019092509A1 (en) 2019-05-16

Family

ID=65279582

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/001517 WO2019092509A1 (en) 2017-11-10 2018-11-12 Uv device for evaluation of skin conditions

Country Status (1)

Country Link
WO (1) WO2019092509A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030063801A1 (en) 2001-10-01 2003-04-03 Gilles Rubinstenn Feature extraction in beauty analysis
US20170236281A1 (en) * 2014-07-24 2017-08-17 University Health Network Collection and analysis of data for diagnostic purposes
WO2016204417A1 (en) * 2015-06-15 2016-12-22 서울바이오시스 주식회사 Hyper-spectral image measurement device and calibration method therefor, photographing module and device for skin diagnosis, skin diagnosis method, and skin image processing method
WO2017012675A1 (en) * 2015-07-23 2017-01-26 Latvijas Universitate Method and device for smartphone mapping of tissue compounds

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11954861B2 (en) 2014-07-24 2024-04-09 University Health Network Systems, devices, and methods for visualization of tissue and collection and analysis of data regarding same
US11961236B2 (en) 2014-07-24 2024-04-16 University Health Network Collection and analysis of data for diagnostic purposes
US20220245848A1 (en) * 2021-02-04 2022-08-04 Fibonacci Phyllotaxis Inc. System and method for evaluating tumor stability
US11908154B2 (en) * 2021-02-04 2024-02-20 Fibonacci Phyllotaxis Inc. System and method for evaluating tumor stability
EP4282326A1 (en) * 2022-05-23 2023-11-29 Sanders S.r.l. Process for coding an image of a scalp illuminated with a fluorescence-based wood's light

Similar Documents

Publication Publication Date Title
US11274966B2 (en) Hyper-spectral image measurement device and calibration method therefor, photographing module and device for skin diagnosis, skin diagnosis method, and skin image processing method
US20230394660A1 (en) Wound imaging and analysis
WO2019092509A1 (en) Uv device for evaluation of skin conditions
EP3160327B1 (en) Acne imaging apparatus
US20230256260A1 (en) Skin Care Implement and System
KR100853655B1 (en) Apparatus, light source system and method for photo-diagnosis and phototherapy of skin disease
KR102512358B1 (en) Portable electronic device, accessory and operation method thereof
KR100785279B1 (en) Apparatus for photo-diagnosis of skin disease using uniform illumination
JP6868160B2 (en) Acquisition of images for use in determining one or more properties of the skin of interest
JP2007047045A (en) Apparatus, method and program for image processing
CN111433603B (en) Imaging method and system for intraoperative surgical margin assessment
JP2020022763A (en) Applicator for treating skin condition
EP2967329B1 (en) Device for use in skin and scalp diagnosis, and method using said device
CA3230185A1 (en) Systems, devices, and methods for imaging and measurement
WO2022169544A1 (en) System and devices for multispectral 3d imaging and diagnostics of tissues, and methods thereof
JP7055129B2 (en) UV device for smartphone-based assessment of skin condition
Demirli et al. RBX® technology overview
EP3481261B1 (en) An apparatus for providing semantic information and a method of operating the same
JP2016540622A (en) Medical imaging
WO2022266145A1 (en) Multi-modal skin imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18845312

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18845312

Country of ref document: EP

Kind code of ref document: A1