US20210366156A1 - Wound characterization system - Google Patents

Wound characterization system

Info

Publication number
US20210366156A1
US20210366156A1 US17/195,323
Authority
US
United States
Prior art keywords
image
wound
calibration
calibration pattern
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/195,323
Inventor
Mansoor Ghazizadeh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PIXAMETER CORP
Original Assignee
PIXAMETER CORP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/720,260 external-priority patent/US9410827B2/en
Priority claimed from US15/807,705 external-priority patent/US10417785B2/en
Priority claimed from US15/807,693 external-priority patent/US10298780B2/en
Priority claimed from US15/807,713 external-priority patent/US10565735B2/en
Application filed by PIXAMETER CORP filed Critical PIXAMETER CORP
Priority to US17/195,323 priority Critical patent/US20210366156A1/en
Assigned to PIXAMETER CORP. reassignment PIXAMETER CORP. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GHAZIZADEH, MANSOOR
Publication of US20210366156A1 publication Critical patent/US20210366156A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0223Operational features of calibration, e.g. protocols for calibrating sensors
    • A61B2560/0228Operational features of calibration, e.g. protocols for calibrating sensors using calibration standards
    • A61B2560/0233Optical standards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Definitions

  • U.S. patent application Ser. No. 15/994,895 is a continuation-in-part of U.S. patent application Ser. No. 15/587,891 filed May 5, 2017, now U.S. Pat. No. 9,989,952 issued Jun. 5, 2018, which is a continuation of U.S. patent application Ser. No. 13/791,987, filed Mar. 9, 2013, now U.S. Pat. No. 9,672,623 issued Jun. 6, 2017, which is a continuation-in-part of U.S. patent application Ser. No. 13/720,260, filed Dec. 19, 2012, now U.S. Pat. No. 9,410,827 issued Aug. 9, 2016, which claims the benefit of U.S. Provisional Patent App. No. 61/795,013, filed Oct. 9, 2012.
  • Smart mobile devices such as smartphones, feature phones, tablet, e-readers, media players, and so on, combine capabilities from multiple single function devices into a single device.
  • such smart mobile devices include various combinations of the capability found in devices such as a cell phone, a programmable computer, a camera, a media player and a portable Internet access device.
  • Many smart mobile devices contain one or more digital cameras that allow a user of the smart mobile device to take high resolution and high fidelity digital pictures.
  • some smart mobile devices include two cameras, one in the front of the smart mobile device and one in the back of the smart mobile device.
  • typical smartphones are able to capture images with a digital resolution of, for example, five to eight megapixels. The trend is to increase the digital resolution of cameras on smart mobile devices.
  • Some cameras for smart mobile digital devices allow for 3D image capture.
  • a captured digital image typically is represented as a two-dimensional matrix of dots, also called pixels.
  • FIG. 1 and FIG. 2 show the front and back, respectively, of a smart mobile device, in accordance with an implementation.
  • FIG. 3 shows a smart mobile device used to make a calibrated measurement in accordance with an implementation.
  • FIG. 4 shows an example of a calibration pattern useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 4A shows another two-dimensional calibration pattern.
  • FIG. 5 and FIG. 6 show, respectively, a front view and a back view of a case for a smart mobile device with imprinted calibration patterns useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 7 and FIG. 8 show, respectively, a front view and a back view of a case for a smart mobile device with alternative imprinted calibration patterns useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 9 and FIG. 10 show, respectively, a back view and a side view of a case for a smart mobile device with suction cups and a foldable pin useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 11 and FIG. 12 show, respectively, a front view and a top view of a case for a smart mobile device to which a hanging string may be attached so as to be useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 13 shows a smart mobile device used to make a calibrated measurement of the distance between two walls in accordance with an implementation.
  • FIG. 14 shows a simplified example of an image that includes a case for a smart mobile device used as a calibration target useful when making measurements on other objects within the image in accordance with an implementation.
  • FIG. 15 shows a simplified example of an image that shows a house on which has been mounted a calibration pattern in a window in accordance with an implementation.
  • FIG. 16 shows an example of a two-dimensional bar code used as a calibration pattern in accordance with an implementation.
  • FIG. 17 shows another example of a two-dimensional bar code used as a calibration pattern in accordance with an implementation.
  • FIG. 18 , FIG. 19 , and FIG. 20 illustrate a calibration pattern being used to extract camera information about an image that is applicable to other images using a same image set-up in accordance with an embodiment.
  • FIG. 21 illustrates an image with a large QR code rendered on a display of a mobile device.
  • FIG. 22 illustrates an image with a small QR code rendered on a display of a mobile device.
  • FIG. 23 illustrates an image with a small QR code rendered on a display of a mobile device together with a visual indicator.
  • FIG. 24 illustrates an image with an enlarged centered QR code rendered on a display of a mobile device.
  • FIG. 25 illustrates a technique for locating and decoding a QR code using a mobile device.
  • FIG. 26 illustrates wound classification.
  • FIG. 27 illustrates a QR code and a wound.
  • FIG. 28 illustrates a technique for wound characterization.
  • FIG. 29 illustrates a QR code with color indicators.
  • FIG. 30 illustrates color modification of images including a wound.
  • FIG. 31 illustrates a projective transformation of a QR code.
  • FIG. 32 illustrates a patient with a QR code based identification.
  • FIG. 33 illustrates a system for the use of QR code based patient identification.
  • FIG. 34 illustrates a leg wound and a pair of calibration patterns imaging the entire wound.
  • FIG. 35 illustrates a leg wound and a pair of calibration patterns imaging a first portion of the wound.
  • FIG. 36 illustrates a leg wound and a pair of calibration patterns imaging a second portion of the wound.
  • FIG. 37 illustrates a calibration target for a curved surface.
  • FIG. 38 illustrates a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 39 illustrates a segmented image of a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 40 illustrates an unwrapped image of a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 41 illustrates a wound classification system.
  • FIG. 42 illustrates wound classifications.
  • FIG. 1 and FIG. 2 show the front and back, respectively, of a smart mobile device 10 .
  • smart mobile device 10 includes a front facing camera 12 , and a touch sensitive display 11 , as shown in FIG. 1 .
  • Smart mobile device 10 also includes, for example, a back facing camera 22 and a back facing flash 21 , as shown in FIG. 2 .
  • smart mobile device 10 is a smart phone, a tablet, an e-reader, a media player, a digital camera or any other portable device that includes a camera and has processing capability sufficient to run a software application that performs measurements based on a calibration pattern.
  • app 23 represents a software application, stored in smart mobile device 10 , that performs measurements based on a calibration pattern, as described further below.
  • the mobile device may be a mobile phone, a smart phone, a tablet, a laptop, or otherwise.
  • images captured by smart mobile device 10 can be used for measuring object size in three dimensions, for measuring a distance between objects and for measuring color and brightness level of objects in a captured image.
  • inclusion of one or more calibration patterns within an image captured by smart mobile device 10 allows for appropriate calibration.
  • the calibration pattern is placed within a focus plane of a camera that captures the digital image. Placement within the focus plane allows for calibrated measurements of other objects in the digital image.
  • FIG. 3 shows a smart mobile device 10 used to make a calibrated measurement.
  • back facing camera 22 is shown to include a camera lens 31 and a camera sensor 32 .
  • Dotted lines 37 define a field of view 33 for back facing camera 22 .
  • An object of measurement 36 is located on a focus plane 34 , as shown in FIG. 3 .
  • a calibration target 35 is also shown located on focus plane 34 .
  • Focus plane 34 of back facing camera 22 is in a parallel plane to the plane on which camera sensor 32 resides.
  • the distance of focus plane from camera 22 is determined by focus of camera lens 31 of camera 22 .
  • a camera is best placed parallel with a focus plane (e.g., an X-Y plane) in which measurements will occur.
  • measurements on objects close to the focus plane (e.g., in which a location on the Z axis is close to the X-Y plane) will generally be more accurate than measurements made on objects farther from the focus plane (e.g., in which a location on the Z axis is at a greater distance from the X-Y plane). Therefore, it is typically best, where possible, to focus the camera lens on the intended object of measurement and to include a calibration pattern within the focus plane of the camera lens.
  • a calibration pattern includes one or more known predetermined sub-patterns that have known or knowable characteristics. Including such a calibration pattern in a captured digital image will indicate information about other pixels in the captured digital image. For example, the indicated information obtained from the calibration pattern may include actual dimensions of geometric shapes in the calibration pattern. This can be used to calculate, for example, actual dimension of sizes represented by each pixel within a captured digital image.
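As a rough illustration of the pixel-scale calculation described above, the following Python sketch (not part of the patent; all coordinates and dimensions are placeholder values) converts the known side length of a calibration square into a millimeters-per-pixel factor and applies it to another object lying in the same focus plane.

```python
# Minimal sketch: derive a mm-per-pixel scale from a calibration square of
# known size, then measure another object in the same focus plane. The corner
# coordinates are placeholders for values an app such as app 23 would obtain
# by detecting the pattern in the captured image.
import math

def pixel_distance(p1, p2):
    """Euclidean distance between two pixel coordinates."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

CAL_SQUARE_MM = 25.0                      # known physical side length (mm)
cal_corner_a = (812.0, 340.0)             # detected corners of that square (px)
cal_corner_b = (1012.0, 342.0)

mm_per_pixel = CAL_SQUARE_MM / pixel_distance(cal_corner_a, cal_corner_b)

# Any other two points in the same focus plane can now be measured.
object_end_1 = (300.0, 500.0)
object_end_2 = (300.0, 980.0)
object_length_mm = pixel_distance(object_end_1, object_end_2) * mm_per_pixel
print(f"Object length: {object_length_mm:.1f} mm")
```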
  • a measurement of dimensional information can be any measurement that takes into account information about dimensions.
  • a measurement of dimensional information can be a measurement of one or more of the following: distance between points, length, width, area, bounding box location and size, centroid, perimeter length, number of holes, form factor (ratio of area to the square of perimeter), elongation, moments, best-fitting ellipse, ratio of best-fitting ellipse axes, orientation, roundness, convexity related, convex area, minimum bounding box location, size and orientation, feret diameters at different angles, convexity (ratio of convex perimeter to raw perimeter), solidity (ratio of net area to convex area), perimeter related, perimeter points (blob's boundary and holes), filled area, sorting and selecting blobs based on any calculated feature, and user selection of group of features to calculate.
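The blob features listed above are standard image-analysis quantities. The sketch below computes a few of them (area, perimeter, form factor, solidity and bounding box) from a synthetic binary mask; OpenCV is an assumed library choice, not one named by the patent.

```python
# Illustrative sketch of a few of the listed blob features, computed with
# OpenCV from a binary mask of a segmented object. The mask here is synthetic;
# in practice it would come from an earlier segmentation step.
import cv2
import numpy as np

mask = np.zeros((200, 200), dtype=np.uint8)
cv2.circle(mask, (100, 100), 60, 255, -1)   # stand-in for a segmented blob

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
blob = max(contours, key=cv2.contourArea)

area = cv2.contourArea(blob)
perimeter = cv2.arcLength(blob, True)
hull = cv2.convexHull(blob)
convex_area = cv2.contourArea(hull)
x, y, w, h = cv2.boundingRect(blob)

form_factor = area / (perimeter ** 2)        # ratio of area to squared perimeter
solidity = area / convex_area                # net area over convex area
print(area, perimeter, form_factor, solidity, (x, y, w, h))
```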
  • the indicated information obtained from the calibration pattern may also include, for example, brightness information for grey levels for objects and color information for objects in the calibration pattern. And so on. This can be used to calculate brightness and color information, etc., of other objects within the captured digital image.
  • FIG. 4 shows an example of a calibration pattern 40 that appears on calibration target 35 .
  • Calibration pattern 40 can include, for example, one or a plurality of various calibration sections used for calibration and can also include encoded or otherwise obtainable information that can be recognized by smart mobile device 10 .
  • An example of a calibration section within calibration pattern 40 is a geometric pattern 42 that has known or knowable physical dimensions.
  • a high gradient pattern 44 can be used by smart mobile device 10 to sharpen image focus.
  • a geometric pattern 45 is another geometric pattern with known physical dimensions that can be used for dimensional measurements.
  • a red area 46, a blue area 47, a green area 48 and a gray area 49 are colorimetry and brightness calibration patterns that can be used by smart mobile device 10 to calibrate color and brightness for a captured image and/or to calibrate smart mobile device 10.
  • identification indicia 43 is visually readable by a user.
  • identification number 43 is a serial number or any other type of number or other identifying indicia that identifies calibration pattern 40 .
  • app 23 can check for identifying indicia 43 in order to use the identifying indicia to obtain information about calibration pattern 40 .
  • different software applications running on smart mobile device 10 may require different calibration patterns.
  • Each unique calibration pattern can be identified, for example, with an identifying indicia.
  • Information for a particular calibration pattern associated with identifying indicia can be stored locally within smart mobile phone 10 or remotely, for example, in a server accessible by smart mobile phone 10 through the Internet.
  • the information for a calibration pattern can be, for example, dimensional measurements from geometric patterns within the calibration pattern, brightness or color values for entities within the calibration pattern, a specification of the layout of the calibration pattern, a specification for a covering case or other entity on which the calibration pattern is embedded or attached and so on.
  • the information can also include, for example, specifications pertaining to smart mobile device 10 , such as packaging specifications and camera specifications.
  • a two-dimensional bar code 41 is a quick response (QR) code or similar code.
  • Two-dimensional bar code 41 can include the identifying indicia for the calibration pattern thus allowing smart mobile device 10 to identify the calibration pattern in a captured image and access from local or remote storage information about the calibration pattern.
  • two-dimensional bar code 41 contains additional information about the calibration pattern.
  • two-dimensional bar code 41, in addition to or instead of the identifying indicia for the calibration pattern, contains specific information about actual measurements for sections of the calibration pattern, information about where the calibration pattern is expected to be located (e.g., on a covering case for mobile device 10) and other information that, for example, may be useful to app 23 when performing measurements.
  • App 23 will capture the information by decoding two-dimensional bar code 41 when two-dimensional bar code 41 is within a captured image.
  • calibration pattern 40 can use other means to encode information such as a one dimensional bar code or another information encoding scheme.
  • another type of two-dimensional bar code may include a data matrix consisting of black and white “cells” or modules preferably arranged in either a square or rectangular pattern, also known as a matrix.
  • the information to be encoded is preferably text or numeric data.
  • a typical data size is from a few bytes up to 1556 bytes.
  • the length of the encoded data depends on the number of cells in the matrix. Error correction codes may be included to increase reliability so that even if one or more cells are damaged so it is unreadable, the message can still be read.
  • Data Matrix symbols are preferably rectangular, more preferably square in shape and composed of square “cells” which represent bits. Depending on the coding used, a “light” cell represents a 0 and a “dark” cell is a 1, or vice versa.
  • Every Data Matrix is preferably composed of two solid adjacent borders in an “L” shape (referred to as the “finder pattern”) and two other borders consisting of alternating dark and light “cells” or modules (referred to as the “timing pattern”). Within these borders are rows and columns of cells encoding information. The finder pattern is used to locate and orient the symbol while the timing pattern provides a count of the number of rows and columns in the symbol. As more data is encoded in the symbol, the number of cells (rows and columns) increases. Each code is preferably unique.
  • a particular calibration pattern can be registered with app 23 so that app 23 assumes that the calibration pattern appearing in an image is the registered calibration pattern.
  • This registration information allows app 23 operating within smart mobile device 10 to access information about the calibration target from local or remote memory, without having to read configuration information or the identifying indicia directly from the calibration target.
  • the identifying indicia can be used to check to see if app 23 is configured to be used with that calibration pattern.
  • if app 23 checks the identifying indicia and determines smart mobile device 10 is not configured to use the calibration pattern, the user of smart mobile device 10 is given, for example, an opportunity to register smart mobile device 10 to be configured to use the calibration pattern. For example, such registration might require a fee.
  • smart mobile device 10 will be able to access information about the calibration pattern. The information can be accessed, for example, from internal memory within smart mobile device 10 or from some external memory source.
  • a captured digital image that includes calibration pattern 40 in the focus plane allows for calibrated measurements, such as two-dimensional measurements of all objects within the focus plane of calibration pattern 40 . Additionally, calibration pattern 40 can then be removed and another digital image captured without the presence of calibration pattern 40 . As long as no other changes are made to the camera set-up, measurements can be made on the newly captured image based on calibration information obtained from the originally captured image.
  • the distance between smart mobile device 10 and an object where calibration pattern 40 resides can be determined by a comparison of pixel sizes in a digital image that includes calibration pattern 40 with the actual size of a known element within calibration pattern 40, while taking into account any magnification performed by camera lens 31.
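A hedged sketch of that distance estimate follows, using the standard pinhole-camera relation; the focal length, element size and pixel measurements below are placeholder values, not figures from the patent.

```python
# Sketch of the distance estimate described above, using the pinhole model:
# distance = focal_length_px * real_width / width_in_pixels. The focal length
# in pixels would come from the device's camera specifications; the values
# here are placeholders.
def distance_to_pattern(focal_length_px, real_width_mm, width_px):
    """Approximate camera-to-pattern distance along the optical axis (mm)."""
    return focal_length_px * real_width_mm / width_px

focal_length_px = 3200.0      # assumed focal length of back facing camera 22
pattern_width_mm = 50.0       # known width of an element in calibration pattern 40
pattern_width_px = 410.0      # width of that element as measured in the image

print(f"{distance_to_pattern(focal_length_px, pattern_width_mm, pattern_width_px):.0f} mm")
```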
  • FIG. 5 and FIG. 6 show, respectively, a front view and a back view of a case 50 for smart mobile device 10 .
  • FIG. 5 shows a calibration pattern 52 included on case 50 .
  • calibration pattern 52 is imprinted within a cavity 51 on the front of case 50 . Including calibration pattern 52 within cavity 51 helps to protect calibration pattern 52 from being eroded through friction when placing smart mobile device 10 into case 50 and removing smart mobile device 10 from case 50 .
  • FIG. 6 shows a calibration pattern 62 imprinted within a cavity 61 on the back of case 50 .
  • Including calibration pattern 62 within cavity 61 helps to protect calibration pattern 62 from being eroded through friction as case 50 interacts with its environment while protecting smart mobile telephone 10 from damage.
  • case 50 is a full outerbox skin case, a four-sided skin case, a three-sided skin case, a perimeter bumper case, a holster case, or any other kind of case designed to protect mobile device 10 .
  • Case 50 is composed of, for example, hard material such as plastic or metal, or is composed of softer material such as leather or cloth composed of natural or synthetic material.
  • sides of case 50 are constructed to allow case 50 to be stood up on a flat surface without tipping, allowing convenient viewing of calibration pattern 52 and calibration pattern 62 .
  • the calibration pattern can be included on case 50 in various ways.
  • the calibration pattern can be imprinted on case 50 at manufacturing time.
  • the calibration pattern can be included on case 50 by, after manufacturing, adhering a label containing the calibration pattern onto case 50 or by any other means which results in calibration pattern being visible on case 50 .
  • case 50 can be carried with mobile device 10 and is used to protect mobile device in addition to providing a ready source for the calibration pattern. Case 50 can be easily detached from smart mobile device 10 without affecting functionality of mobile device 10 .
  • FIG. 7 and FIG. 8 show, respectively, a front view and a back view of a case 70 for smart mobile device 10 .
  • FIG. 7 shows a calibration pattern 72 imprinted within a cavity 71 on the front of case 70 .
  • Calibration pattern 72 is composed, for example, entirely of a two-dimensional bar code, such as a QR pattern. Including calibration pattern 72 within cavity 71 helps to protect calibration pattern 72 from being eroded through friction when placing smart mobile device 10 into case 70 and removing smart mobile device 10 from case 70 .
  • FIG. 8 shows a calibration pattern 82 imprinted within a cavity 81 on the back of case 70.
  • Calibration pattern 82 is composed, for example, entirely of a two-dimensional bar code, such as a QR pattern. Including calibration pattern 82 within cavity 81 helps to protect calibration pattern 82 from being eroded through friction as case 70 interacts with its environment while protecting smart mobile telephone 10 from damage.
  • the two-dimensional bar code includes some or all of the calibration pattern geometries required for, for example, dimensional, brightness/grey level and colorimetry measurements.
  • the two-dimensional bar code thus acts as a calibration pattern.
  • the benefit of using the two-dimensional bar code as a calibration pattern is that the two-dimensional bar code can take up much or all of the space available for a calibration pattern, and thus can be sized so that it is more easily detected within a captured image having a larger field of view.
  • FIG. 9 and FIG. 10 show, respectively, a back view and a side view of a case 90 for smart mobile device 10 .
  • Case 90 has been outfitted with various appurtenances for allowing case 90 to be mounted on a focus plane when making measurements.
  • FIG. 9 shows a suction cup 91 , a suction cup 92 , a suction cup 93 and a suction cup 94 embedded on back of case 90 .
  • Suction cup 91 , suction cup 92 , suction cup 93 and suction cup 94 can be used to temporarily adhere the back of case 90 to a hard smooth surface such as metal or glass.
  • a foldable ring 95 can be used to hang case 90 to a pin, nail, hook and so on. Foldable ring 95 can also be used for hanging by a string, strand, thread, cord, etc.
  • FIG. 10 additionally shows a suction cup 101 , a suction cup 102 and a suction cup 103 , embedded on a side of case 90 .
  • Suction cup 101 , suction cup 102 and suction cup 103 can be used to temporarily adhere the side of case 90 to a smooth surface.
  • a foldable pin 104 allows case 90 to be attached to soft material, such as drywall and cloth.
  • the foldable design allows for foldable pin 104 to be in an embedded cavity while not in use.
  • FIG. 11 and FIG. 12 show, respectively, a front view and a top view of a case 110 for smart mobile device 10 .
  • FIG. 11 shows a hanging string 113 attached to case 110 .
  • Hanging string 113 allows case 110 to be suspended at a desired location when a calibration pattern 112 within an indentation 111 of case 110 is to be used as part of a calibrated measurement performed by mobile device 10 .
  • FIG. 12 shows a hang hole 121 and a hang hole 122 located on top of case 110 . For example, hanging string 113 is placed through hang hole 121 and hang hole 122 to attach hanging string 113 to case 110 .
  • FIG. 13 shows smart mobile device 10 used to make a calibrated measurement of the distance between a wall 131 and a wall 132 .
  • Lines 137 define a field of view 134 for back facing camera 22 .
  • a case 135 is attached to wall 131 .
  • Case 135 includes a calibration pattern that faces towards wall 132 .
  • FIG. 14 shows a simplified example of a recorded image 140 that includes an image of case 145 with an embedded calibration pattern.
  • the calibration can be used for measurements of dimensions, colorimetry, brightness and so on of other objects within recorded image 140.
  • the other objects include, for example, a safety pin 141 , a pencil 144 , a circular object 142 and a square object 143 .
  • in order to activate app 23 within smart mobile device 10, app 23 needs to be transferred to smart mobile device 10 if not installed when smart mobile device 10 is purchased.
  • app 23 can be downloaded from the internet or from an app store. Also, a case with an embedded calibration pattern can be obtained.
  • the camera setting of smart mobile device 10 will need to be set according to any instructions included with app 23 .
  • the calibration pattern may then be included in the field of view of a camera of smart mobile device 10 .
  • a particular background may be specified or suggested to maximize contrast between the calibration pattern and the background.
  • the camera of smart mobile device 10 is focused on the calibration pattern based on the capability of the camera of smart mobile device 10 .
  • the focus capability may be, for example, auto focus, tap to focus, or another focusing capability. Once in focus, an image is captured.
  • App 23 will analyze the captured image. For example, if the captured image has a two-dimensional bar code, app 23 will read and decode the two-dimensional bar code and act in accordance with the encoded instructions. If the two-dimensional bar code includes a calibration code identifying indicia and all calibration information, then the app 23 will decode the information, associate the information with the identifying indicia of the calibration pattern and store the information in the memory of smart mobile device 10 . The information can in the future be accessed based on the associated identifying indicia.
  • app 23 can use the identifying indicia, for example, to access information about the calibration pattern previously stored in smart mobile device 10 or download additional information about the calibration pattern from an App central server (cloud) when smart mobile device 10 is connected to the Internet. For example, once information about the calibration pattern is stored in smart mobile device 10 , the setup procedure of app 23 will prompt user for registering this specific calibration pattern with smart mobile device 10 . If permission is granted, registration will proceed.
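One possible shape of this decode-and-register flow is sketched below with OpenCV's QR detector. The JSON payload layout, field names and local store are illustrative assumptions; the patent does not specify a payload format.

```python
# Sketch of a decode-and-store flow: read the two-dimensional bar code,
# extract the identifying indicia and calibration info, and keep it locally
# keyed by that indicia for later use. Payload format is hypothetical.
import json
import cv2

LOCAL_CALIBRATION_STORE = {}   # identifying indicia -> calibration info

def decode_and_register(image_bgr):
    detector = cv2.QRCodeDetector()
    payload, points, _ = detector.detectAndDecode(image_bgr)
    if not payload:
        return None                       # no readable code in this image
    info = json.loads(payload)            # e.g. {"id": "CAL-0417", "square_mm": 25.0}
    LOCAL_CALIBRATION_STORE[info["id"]] = info
    return info

# Usage: info = decode_and_register(cv2.imread("captured.jpg"))
```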
  • FIG. 3 illustrates the process of measuring object 36 in field of view 33 of back facing camera 22 .
  • calibration target 35 is placed within field of view 33 , preferably in focus plane 34 of measuring object 36 .
  • calibration target 35 is a calibration pattern on a case of smart mobile phone 10 .
  • Smart mobile phone 10 is removed from the case and the case placed so that the calibration pattern plane is parallel to the measurement plane of object 36 and any other objects to be measured.
  • Smart mobile phone 10 is positioned so that object 36, and any other objects to be measured, are maximized within field of view 33.
  • FIG. 14 shows an example of multiple objects within a field of view.
  • back facing camera 22 is focused at focus plane 34 and an image captured.
  • a manual focus or an auto focus capability such as a tap-on-focus, is used to focus camera lens 31 on calibration target 35 .
  • app 23 analyzes the captured image to perform a calibration process. Particularly, app 23 analyzes the captured image to determine an exact location and orientation of calibration target 35. App 23 will also look for a two-dimensional bar code or other source of encoded information within the captured image. From information obtained from, for example, a two-dimensional bar code or other source of encoded information, app 23 will verify smart mobile device 10 has access to the relevant calibration information associated with calibration target 35 and, if so, use the relevant calibration information associated with calibration target 35 for calibrating back facing camera 22. If smart mobile device 10 does not have access to the relevant calibration information associated with calibration target 35, app 23 will try to obtain access to this information, for example, by connecting the user to an online source where access can be obtained.
  • app 23 uses algorithms that use geometrical patterns included within the calibration pattern and their geometrical relationships to calculate measurement values, as is understood in the art.
  • object 36 is measured.
  • the user brings up the calibrated captured image.
  • the calibrated captured image will have calibration information with it.
  • the calibrated captured image can be viewed and processed on smart mobile device 10 or transferred to another computing device such as a personal computer for viewing and measuring.
  • an object measurement menu bar is presented to the user to make the measurement process more convenient.
  • various measurements can be made. For example, a point-to-point measurement can be made using a ruler placement.
  • an area measurement can be made by placing a geometrical shape on an object.
  • Various associated measurements such as dimensions, gray level, density, colorimetry, and so on can be calculated.
  • a user can identify an object and automated object recognition could be performed.
  • the automated object recognition could return detected values for various associated measurements such as dimensions, gray level, density, colorimetry.
  • app 23 can be written so that when run on mobile device 10 mobile device 10 creates a process running on mobile device 10 that can detect a case that does not necessarily include a calibration pattern.
  • the case can be detected by detecting the outline of the case or some prominent feature on the case or pattern on the case.
  • app 23 uses stored information about the case to make a calibrated measurement.
  • the stored information can be dimensional information, brightness information, color information or information about a feature or a pattern on the case.
  • FIG. 13 illustrates measurement of distance between two objects, in this case the distance between wall 131 and wall 132 .
  • the calibration target, i.e., case 135 with an embedded calibration pattern, is attached to the first object, i.e., wall 131.
  • smart mobile device 10 is placed on the second object, i.e., wall 132 .
  • Smart mobile device 10 is mounted on wall 132 so that camera 22 is directly facing in a direction perpendicular to case 135 (the calibration target).
  • the zoom of camera 22 is adjusted to maximize the size of the calibration target in field of view 134 of smart mobile device 10.
  • camera 22 is focused on case 135 and an image captured.
  • a manual focus or an auto focus capability such as a tap-on-focus is used to focus camera lens 31 on case 135 .
  • app 23 analyzes the captured image to perform a calibration process. Particularly, app 23 analyzes the captured image to determine an exact location and orientation of case 135. App 23 will also look for a two-dimensional bar code or other source of encoded information within the captured image. From information obtained from, for example, a two-dimensional bar code or other source of encoded information, app 23 will verify smart mobile device 10 has access to the relevant calibration information associated with the calibration pattern embedded on case 135 and, if so, use the relevant calibration information associated with the calibration pattern embedded on case 135 for calibrating back facing camera 22. If smart mobile device 10 does not have access to the relevant calibration information, app 23 will try to obtain access to this information, for example, by connecting the user to an online source where access can be obtained.
  • app 23 uses algorithms that use specific patterns in the calibration pattern designed for distance measurement through triangulation.
  • a calibration pattern within an image can be used apart from a smart mobile device.
  • FIG. 15 shows a simplified example of an image 150 that shows a house 157 on which has been mounted a calibration pattern 151 in a window of the house.
  • the image is a digital image captured with any digital camera.
  • the image can be displayed on any computer system able to display digital images.
  • Calibration pattern 151 contains information about calibration pattern 151 .
  • calibration pattern 151 is a two-dimensional bar code that contains encoded display information about calibration pattern 151 .
  • the information displayed in calibration pattern 151 is utilized to make one or more calibrated measurements, such as those represented by an arrow 152 , an arrow 153 , an arrow 154 , an arrow 155 , and an arrow 156 .
  • the calibrated measurements are utilized, for example, by a computing system used by a user, or by a remote server accessed by a user.
  • a calibration pattern in a digital image allows for a computer system to make calibrated measurements.
  • the image can contain objects of any size.
  • the calibrated measurements can be made by any computing system with sufficient processing power to make the pertinent calculations.
  • the information displayed in a calibration pattern can also be used to validate user permission to use a particular application to make calibrated measurements.
  • a particular calibration application can be set up to only operate on images that display a particular calibration pattern or group of calibration patterns.
  • each calibration pattern may include a serial number or some other identification indicia that uniquely identifies the calibration pattern.
  • the application making the calibration measurements can use this identification indicia as a pass code to validate user rights to use the application to make calibrated measurements.
  • FIG. 16 shows a two-dimensional bar code 160 used as a calibration pattern. While in FIG. 16 , calibration pattern 160 is in a tilted orientation, app 23 will calculate the orientation and take the orientation into account when making calibrated measurements. For example, information about calibration pattern 160 will include a value for an actual distance, represented by a line 164 , between a point 161 and a point 162 , a value for an actual distance, represented by a line 165 , between point 162 and a point 163 and a value for an actual distance, represented by a line 166 , between point 163 and point 161 .
  • a high gradient pattern can be inserted to be used to sharpen image focus.
  • particular color or grey areas can be added to calibration pattern 160 to allow for calibration of color and/or brightness for a captured image that includes calibration pattern 160 .
  • placing camera 22 and calibration target 35 in parallel planes when capturing an image of calibration target 35 is important to achieve accurate measurements. Since a user may hold mobile device 10 in hand when capturing an image, there may be some variance from the ideal positioning of camera 22 and calibration target 35 in parallel planes. To accommodate this lack of precision, four or more measuring points of calibration target can be used to measure co-planarity of the planes in which camera 22 and calibration target 35 are situated.
  • FIG. 17 shows a two-dimensional bar code 170 used as a calibration pattern.
  • information about calibration pattern 170 will include a value for an actual distance, represented by a line 176 , between a point 171 and a point 172 , a value for an actual distance, represented by a line 177 , between point 172 and a point 173 , a value for an actual distance, represented by a line 178 , between point 173 and a point 174 , and a value for an actual distance, represented by a line 175 , between point 174 and point 171 .
  • Points 171 , 172 and 173 are used for geometrical calibration of the captured image and orientation assessment of the calibration pattern. All four points 171 , 172 , 173 and 174 are used for a co-planarity measurement.
  • the image co-planarity measurement has multiple applications. That is, the co-planarity measurement is used to assess image co-planarity at the time of image capture and provides real-time feedback to the user of smart mobile device 10 on the parallelism of the camera with the calibration pattern image plane when the user is about to capture an image. For example, visual and/or audio feedback is given to the user when the camera and the calibration pattern image plane are co-planar or, alternatively, when they are not co-planar.
  • the co-planarity measurement is used to correct any deviation from co-planarity between the camera and the calibration pattern image plane.
  • the co-planarity measurement can also be used as a factor in calculating and presenting to the user a value that indicates an expected accuracy of the calibrated measurement.
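One simple way to realize the four-point co-planarity check described above is to compare the pixel lengths of opposite sides of the detected point set: for a physically square arrangement, a tilt makes opposite sides unequal. The threshold and corner coordinates in this sketch are illustrative assumptions.

```python
# Rough sketch of a co-planarity indicator from the four points of a pattern
# such as bar code 170: when the camera and pattern planes are parallel,
# opposite sides of the (physically square) point set have equal pixel
# lengths, so their relative difference is a simple tilt measure.
import math

def side(p, q):
    return math.hypot(q[0] - p[0], q[1] - p[1])

def coplanarity_score(p171, p172, p173, p174):
    """Returns approximately 0 when the camera is parallel to the pattern plane."""
    top, bottom = side(p171, p172), side(p174, p173)
    left, right = side(p171, p174), side(p172, p173)
    return abs(top - bottom) / max(top, bottom) + abs(left - right) / max(left, right)

points = [(100, 100), (400, 105), (398, 410), (98, 402)]  # detected corners (px)
aligned = coplanarity_score(*points) < 0.05               # arbitrary feedback threshold
print("camera and pattern are approximately co-planar" if aligned
      else "tilt detected; adjust the camera")
```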
  • calibrated measurements could also be done by any computer implemented system that includes a processor and computer readable medium encoded with processor readable instructions that, when read, implement a process on the processor that can detect a calibration pattern within an image where the process uses information displayed within the calibration pattern to make a calibrated measurement.
  • a server can make a measurement by accessing a digital image, where the digital image includes a calibration pattern and the calibration pattern includes displayed information about the calibration pattern.
  • the server reads the displayed information to obtain the information about the calibration pattern. Then the server utilizes the displayed information to make a calibrated measurement.
  • FIG. 18 shows a shoe 202 within a picture frame 201 .
  • media 203 that includes a calibration pattern.
  • the calibration pattern allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204 , which are not visible in the image, but represent information available from the calibration pattern.
  • the calibration pattern can provide, for example, information such as: pixel size in the X direction; pixel size in the Y direction; distance to the focus plane; location of the focus plane in the image (which can be exposed by placing a graphics overlay to define this plane); if there are multiple focus planes of calibration, the above attributes duplicated for each plane; dimensional measurement information and overlays for premeasured objects; colorimetric calibration information; brightness calibration information; capture-time lighting information (flash, sunlight, etc.); scale with respect to real life (for example, the scale of an architectural drawing for an image of the drawing); camera settings; and so on.
  • a coordinate crosshair can also be superimposed into a picture, as a guide for a user making measurements.
  • the image captured with the calibration pattern is processed to extract the calibration information.
  • This calibration information will be the same for all subsequent images taken from the same image set-up. This allows the subsequent images to be calibrated without physically including in the image media 203 with the calibration pattern.
  • the calibration information can be added subsequently to the image. This could be done by visually superimposing a visible pattern containing the information onto the image or it can be done in a way that does not affect the image, for example, by including the calibration in metadata stored as part of the image.
  • image metadata herein is information stored with an image that gives information about the image but does not affect the appearance of the image as reproduced.
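As an illustration of carrying calibration information in image metadata without touching the pixel data, the sketch below writes a JSON blob into a PNG text chunk using Pillow; the key name, field names and file names are assumptions made for illustration only, not part of the patent.

```python
# Sketch of storing calibration information as image metadata (a PNG tEXt
# chunk) so the image's appearance is unchanged. Field names are hypothetical.
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

calibration = {"mm_per_pixel_x": 0.21, "mm_per_pixel_y": 0.21,
               "focus_plane_mm": 450.0, "pattern_id": "CAL-0417"}

img = Image.open("shoe.png")                     # placeholder file name
meta = PngInfo()
meta.add_text("calibration", json.dumps(calibration))
img.save("shoe_calibrated.png", pnginfo=meta)

# Reading the calibration back later:
stored = json.loads(Image.open("shoe_calibrated.png").text["calibration"])
```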
  • FIG. 19 represents the case where an image has been retaken from the same image set-up (but without media 203 in the picture). In this case the image included only shoe 202 .
  • Using calibration from the previously taken image allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204 , which are not visible in the image, but represent information available from the calibration information from the earlier taken image.
  • the calibration information while not originally part of the image, has been added to the image shown in FIG. 19 by superimposing a two-dimensional bar code 208 on the image shown in FIG. 19 .
  • a two-dimensional bar code is only illustrative as this information could be visibly included on the image in other ways, for example through a one-dimensional bar code, a digitally coded label, an alphanumeric coded label or some other communication methodology visible on an image.
  • FIG. 20 represents another case where an image has been retaken from the same image set-up (but without media 203 in the picture).
  • the image included only shoe 202 .
  • Using calibration from the previously taken image allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204 , which are not visible in the image, but represent information available from the calibration information from the earlier taken image.
  • the calibration information, while not originally part of the image, has been added to the image metadata but not added to the image data. Thus, as shown in FIG. 20, no calibration information appears in the image itself.
  • the calibration information is included only as part of image metadata stored with an image.
  • the original image itself can be altered (e.g., using image processing software) to remove the calibration pattern from the original image.
  • the calibration information could then be re-added to the image in another form, for example, by superimposing the calibration information onto the image, as illustrated in FIG. 19, or by including the calibration information in image metadata stored with the image, as illustrated by FIG. 20.
  • volume manufacturers may want to develop a picture-taking setup where a camera and picture are calibrated once and images of different objects are taken for future at-will measurements.
  • a shoe manufacturer may make a picture taking setup and calibrate the system via a calibration pattern or other means and maintain this setup to take pictures of multiple shoes placed in the focus plane.
  • the ability to extract calibration information from a first taken image and then, in post-image processing, remove the calibration pattern from the original image allows inclusion of the calibration information, for example in image metadata for the image, while maintaining image originality, artistic perspective and cleanliness. Any calibration pattern in the image that distracts the viewer and impacts the artistic perspective of the image is removed.
  • calibration information stored with an image may no longer be valid for subsequent images captured without the calibration pattern, or even for images directly derived from an original image. This may occur, for example, where an image taken at a high resolution is uploaded to an on-line site that limits the resolution of uploaded images. If the calibration information stored with the original image (either visible on the picture or in image metadata) is based on the higher resolution, the calibration information stored with the image needs to be resolution scaled to be accurate. If the resolution scaling information of the original image is included in the calibration data, this allows the change in resolution to be taken into account when subsequently interpreted. Including such information, either visibly or with image metadata for the image, allows for precise interpretation of measurement information.
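A minimal sketch of that resolution scaling follows, assuming the calibration is stored as a linear millimeters-per-pixel value and the original image width is known.

```python
# Simple sketch of the resolution-scaling concern described above: if an image
# calibrated at full resolution is later downscaled, the stored mm-per-pixel
# value must be rescaled before measurements remain valid.
def rescale_calibration(mm_per_pixel, original_width, resized_width):
    """Adjust a linear pixel scale after the image is resized."""
    return mm_per_pixel * (original_width / resized_width)

# 0.21 mm/px at 4000 px wide becomes 0.84 mm/px once uploaded at 1000 px wide.
print(rescale_calibration(0.21, 4000, 1000))
```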
  • the mobile device is capable of locating and decoding a quick response code 300 in an image captured from a scene if the quick response code is sufficiently large.
  • if the quick response code 310 is not sufficiently large, the mobile device may have difficulty identifying the quick response code in the captured image and, even if it is identified, properly decoding the quick response code in the captured image.
  • the mobile device may provide a visual alert 330 to the user on the display indicating that the quick response code should be identified on the display by the user.
  • the visual alert may be a hand with a pointed finger.
  • a stylus or a mouse motion or a set of x & y coordinates may be used to identify the location of the quick response code.
  • the mobile device receives an identification of the location of the quick response code 320 by the user selecting a portion of the image where the quick response code is located.
  • the image presented on the display of the mobile device and/or the image captured by the imaging device of the mobile device is automatically presented on the display with an enlarged quick response code 340 and those portions of the image around the quick response code.
  • the region of the image and/or scene that includes the quick response code is presented in a manner where the quick response code is readily identified and the quick response code is readily decoded.
  • the location indicated by the user may be generally centered on the display when displaying the quick response code at an enlarged scale. Further, a noise reduction process may be included during the scaling process.
  • the increased magnification process may be repeated until the quick response code is sufficiently large to be readily identified and readily decoded. Upon successfully identifying and decoding the quick response code, or otherwise upon reaching a predetermined magnification limit, the process may terminate.
  • the magnification limit may be selected by the user, if desired.
  • an exemplary technique to accommodate QR codes that are relatively small or otherwise hard to discriminate and decode is illustrated.
  • An image is captured and the mobile device applies a QR code reader technique to locate the QR code in the image using the current magnification and location 400 .
  • the technique determines if the QR code is located in the captured image 410 . If the QR code is located, then the technique determines if the QR code is decodable in the captured image 420 . If the QR code is located 410 and the QR code is decodable 420 , then the technique decodes the QR code 430 .
  • the technique automatically prompts the user to indicate the location of the QR code within the captured image 440 .
  • the mobile device uses the received location to center a portion of the image and enlarge the image 450 .
  • if the system determines that a magnification limit is reached 460, such that further magnification is not suitable, then the technique ends the search for the QR code and/or ends the attempt to decode the located QR code 470. If the magnification limit is not reached 460, then the technique repeats the process with the modified region of the image and the modified magnification to attempt to locate and decode the QR code.
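The locate-and-decode loop of FIG. 25 might look roughly like the following sketch, which uses OpenCV's QR detector and represents the user-indicated location as an (x, y) pixel coordinate; the magnification limit, zoom steps and function names are illustrative assumptions, not the patent's implementation.

```python
# Sketch of an iterative locate/decode loop: try to decode, and if that fails
# crop a window around the user-indicated location, enlarge it, and retry
# until the code decodes or a magnification limit is reached.
import cv2

def decode_with_zoom(image, user_point, max_zoom=8):
    detector = cv2.QRCodeDetector()
    zoom = 1
    region = image
    cx, cy = user_point
    while zoom <= max_zoom:
        data, points, _ = detector.detectAndDecode(region)
        if data:
            return data                                  # located and decoded
        # Crop a window centered on the indicated location and enlarge it.
        zoom *= 2
        h, w = image.shape[:2]
        half_w, half_h = w // (2 * zoom), h // (2 * zoom)
        x0, y0 = max(cx - half_w, 0), max(cy - half_h, 0)
        crop = image[y0:y0 + 2 * half_h, x0:x0 + 2 * half_w]
        region = cv2.resize(crop, (w, h), interpolation=cv2.INTER_CUBIC)
    return None                                          # magnification limit reached

# Usage: payload = decode_with_zoom(cv2.imread("scene.jpg"), user_point=(1540, 880))
```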
  • a chronic wound on a person tends to occur when the wound does not heal in an orderly set of stages and in a predictable amount of time in the manner in which most wounds heal.
  • wounds that do not properly heal within three months are typically considered to be chronic wounds.
  • chronic wounds tend to be stalled in one or more of the phases of wound healing.
  • a chronic wound may remain in the inflammatory stage for a significant period of time.
  • chronic wounds may never heal or may take several years to properly heal.
  • Such chronic wounds cause patients severe emotional and physical stress, while also creating significant financial burden on the patient and also on the health care system as a whole. It is estimated that chronic wounds impact over 8.5 million people with an expense in excess of $30 billion a year in the United States.
  • Venous and arterial ulcers, diabetic wounds, and pressure ulcers are some of the major chronic wound categories. Venous and arterial ulcers are primarily caused as a result of aging, diabetic wounds are primarily caused as a result of diabetes, and pressure ulcers are primarily caused as a result of obesity.
  • the burden of treating such chronic wounds is substantially increasing over time with increasing health care costs, an increasingly aging population, a sharp rise in the incidence of diabetes, and a sharp rise in obesity.
  • cutaneous wound measurements are an important metric to track the healing of a wound and, based upon such measurements, to direct appropriate therapy, especially in the first 4 weeks of the wound healing process.
  • the precision of wound measurement has a statistically significant influence on the assessment of the wound healing process.
  • the wound may include any type of abrasion, skin lesions, moles, Psoriasis mark, or otherwise.
  • the aforementioned wound measurement technique is operator dependent with significant variability that is open to interpretation. Different clinicians could look at the same wound and take divergent measurements and/or measure different parts of the same wound over time.
  • the inter-rater measurement error when two separate providers measure the same wound, ranges between 16 and 50%, which again is highly variable.
  • the skin of a patient may include a wound therein, such as on their lower arm as an example.
  • the size of the wound may vary in its shape and size depending on the patient and the particular type of wound.
  • a QR code may be affixed with an adhesive or otherwise placed in contact with the skin of the patient proximate the wound.
  • the QR code may include size related identification information therein, as previously described.
  • with the QR code located proximate the wound, together with information included therein regarding the scale of the QR code, the QR code may act as a reference scale for the wound. In this manner, the scale of the captured image may be determined and, as a result, the scale of the wound in the particular captured image may likewise be determined.
  • if the QR code is decoded to indicate it is 1 inch by 1 inch in size, and the wound has the same height as the QR code and twice the length of the QR code, then the wound would be identified as being 1 inch by 2 inches in size.
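A minimal sketch of that scale transfer follows, with placeholder pixel measurements standing in for the detected QR code and wound extents.

```python
# Minimal sketch: the decoded QR code's known physical side length gives a
# mm-per-pixel factor, which converts the wound's pixel extents into physical
# dimensions. All pixel values are placeholders.
qr_side_mm = 25.4            # decoded from the QR payload (1 inch)
qr_side_px = 180.0           # measured side length of the QR code in the image
mm_per_px = qr_side_mm / qr_side_px

wound_width_px, wound_height_px = 360.0, 180.0   # from the traced wound boundary
wound_width_mm = wound_width_px * mm_per_px      # -> ~50.8 mm (2 inches)
wound_height_mm = wound_height_px * mm_per_px    # -> ~25.4 mm (1 inch)
print(wound_width_mm, wound_height_mm)
```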
  • the QR code is preferably positioned near the wound on the skin of the patient 500 , and the QR code is read and decoded to determine the scale of the QR code 510 .
  • the edges of the wound proximate the QR code are identified 520 .
  • the edges of the wound may be identified in any suitable manner.
  • One technique to identify the edges of the wound is for the technician to use their finger to manually trace around the edges of the wound.
  • Another technique to identify the edges of the wound is for the system to use edge identification and/or segmentation and/or color based discrimination techniques to identify the edges of the wound. For example, the identification of the edges of the wound may be fully automatic.
  • the identification of the edges of the wound may be further based upon the user selecting a generally central region of the wound to assist in a more robust identification of the wound.
  • the identification of the edges of the wound may be based upon defining a bounding box around the wound.
  • the identification of the edges of the wound may be based upon a technician's manual outlining of the wound.
  • machine learning techniques may be used to assist in the automatic determination of the edges of the wound.
  • the technique may include the technician defining the general bounding region of the wound and the subsequent automatic refinement of the bounding region to more accurately identify the actual edges of the wound.
  • wound size characteristics may be determined 530 .
  • the wound size characteristics may be any suitable characteristic.
  • One characteristic may be, for example, the maximum width and the maximum length of the wound.
  • Another characteristic may be, for example, the total area of the wound.
  • Another characteristic may be, for example, the elliptical characteristics of the wound.
  • Another characteristic may be, for example, a measure of undetermined areas of the wound such as cut outs, tunnels, and cavities.
  • Another characteristic may be, for example, the symmetry of the wound.
  • the system may include a boundary refinement process 540 to permit the technician to modify the boundary to more accurately represent the wound which is then used to characterize the wound.
  • the progression of a wound's color over time is an indicator of the wound's response to treatment.
  • the apparent color sensed by the imaging device tends to dramatically vary.
  • the color presented by the imaging device also tends to dramatically vary based upon the time of day, light conditions, and the characteristics of the particular device.
  • it is desirable that the colors sensed by the mobile device are represented generally consistently across the series of images.
  • color indicators may be included, such as color based markers within the bounds of the QR code, and/or colored portions of the QR code itself, and/or color based markers proximate the QR code, such as around the periphery of the QR code.
  • in FIG. 30 an example of three different original images is illustrated, each of which is of the same wound area under different lighting conditions, together with the resulting calibrated color images.
  • a plurality of different images taken on the same or different mobile device may be modified to have more color consistency.
  • the images from the mobile device(s) are provided to a network based computing device, where the color correction is provided at the network based computing device, and the images are available from the network based computing device upon request.
  • the color adjusted image may be used for wound characterization.
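  • A minimal sketch of per-image color correction against reference color patches, assuming Python with NumPy; the "known" patch values and the sampled colors are illustrative, and in practice the patches would be located automatically within or around the QR code.

```python
import numpy as np

# Known reference colors printed with the QR code, and the colors actually
# sampled from those patches in a particular capture (both illustrative).
known_rgb = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255], [128, 128, 128]], dtype=float)
observed_rgb = np.array([[231, 22, 31], [18, 226, 44], [12, 19, 212], [117, 121, 110]], dtype=float)

# Fit an affine transform (4x3) mapping observed colors to their known values.
obs_aug = np.hstack([observed_rgb, np.ones((len(observed_rgb), 1))])
transform, *_ = np.linalg.lstsq(obs_aug, known_rgb, rcond=None)

def correct_colors(image_rgb):
    """Apply the fitted affine color transform to an HxWx3 float image."""
    h, w, _ = image_rgb.shape
    flat = np.hstack([image_rgb.reshape(-1, 3), np.ones((h * w, 1))])
    return np.clip(flat @ transform, 0, 255).reshape(h, w, 3)
```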
  • the accuracy of the metrological analysis of a captured image is based, at least in part, upon the angle between the camera focal axis and the normal vector of the wound surface, with alignment along the normal vector of the wound surface being preferred.
  • Each of the captured images should be modified, as desired, to provide a wound surface that appears to be more consistent with a view along the normal vector of the wound.
  • Rather than attempting to use the characteristics of the patient to determine such a normal vector of the wound, the system preferably uses the detected QR code to determine such a normal vector of the wound.
  • the QR code includes square boxes and rectangular regions, which when viewed off perpendicular tend to appear to be non-square and non-rectangular. Accordingly, a transformation may be determined such that the non-square boxes and/or non-rectangular regions of the QR code are transformed to have the appearance of being square boxes and/or rectangular regions, which would likewise be applied to the other regions of the captured image, including the wound.
  • a homography matrix may be applied to a projective transformation of the captured image of a three-dimensional plane to obtain a substantially orthographic view of the wound.
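  • A minimal sketch of this projective correction, assuming Python with OpenCV; the QR side length, the output resolution, and the file name are illustrative assumptions.

```python
import cv2
import numpy as np

QR_SIDE_MM = 25.4   # assumed physical side length of the QR code
PX_PER_MM = 10.0    # resolution chosen for the rectified (orthographic) view

image = cv2.imread("wound_with_qr.jpg")
_, corners, _ = cv2.QRCodeDetector().detectAndDecode(image)
assert corners is not None, "QR code not found in the image"
src = corners.reshape(4, 2).astype(np.float32)

# Map the (possibly skewed) QR corners back onto an ideal square.
side_px = QR_SIDE_MM * PX_PER_MM
dst = np.array([[0, 0], [side_px, 0], [side_px, side_px], [0, side_px]], dtype=np.float32)

H = cv2.getPerspectiveTransform(src, dst)
ortho = cv2.warpPerspective(image, H, (image.shape[1], image.shape[0]))
# In "ortho", one pixel corresponds to 1 / PX_PER_MM millimeters in the QR code's plane.
```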
  • the wound may be characterized at a non-normal orientation, if desired.
  • the three-dimensional characteristic may be the depth of the wound and/or the volume of the wound.
  • the different angular orientations may be determined, such as for example, by determining the angular orientation using the projective transformations of the QR code of the captured images.
  • the QR code may be used to determine the scale of each of the images.
  • wound depth may include 1st degree epidermal loss (epidermis); 2nd degree partial thickness (superficial loss) dermis (papillary layer); 3rd degree full thickness hypodermis; 4th degree full thickness muscles, tendons, bone.
  • the calibration pattern may be other configurations.
  • the calibration pattern may be a rectangular pattern.
  • the calibration pattern may be a set of concentric circles with the pattern encoded within the concentricity of the circles.
  • the calibration pattern may be a circular shape with different color and/or patterned slices therein, such as each having a pie shape.
  • the calibration pattern may be any other type of geometrical shape.
  • the calibration pattern may be a one-dimensionally encoded pattern or a two-dimensionally encoded pattern.
  • a patient in cardiac arrest may be mistakenly not resuscitated because clinicians confused him with a patient who had a do-not-resuscitate order on file.
  • a patient may be provided an okay to undergo surgery based on a different patient's records and be found dead in his hospital room the next day.
  • Such patient-identification mix-ups are common and can have deadly consequences.
  • opportunities for ID-mix-ups are still increasing as health care becomes more complex.
  • a patient's wristband can be wrong, can be missing, can be illegible, or can be simply not checked. Accordingly, health-care facilities should adopt an improved standardized protocol to verify patient identities.
  • One technique to standardize verification of patient identities is to standardize how the patient's names are displayed. Another technique is to include bar codes on all medications and the wrists of the patient. With the abundance of similar names and the difficulty of attaching bar codes to all tests and procedures, these techniques have limited viability.
  • the patient preferably has a wrist band that includes a QR code affixed to it or otherwise printed on the wrist band 600 .
  • Other patient identification items may likewise be used, such as bracelets, necklaces, identification tags, etc.
  • a unique patient identification is encoded within the two-dimensional QR code.
  • the identity of the patient may be obtained by capturing an image of the QR code on the wrist band using an imaging device 610 , locating and decoding the QR code that is captured in the image 620 , and using the unique patient identification decoded from the QR code to identify the patient 630 .
  • the identification of the patient may be based upon information stored on the imaging device or otherwise obtained from a network based computing device 640 .
  • the data may be stored in a local or a network based database on a storage device.
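  • A minimal sketch of this wristband-based identification, assuming Python with OpenCV; the record table stands in for a local or network based database, and all identifiers and values are illustrative.

```python
import cv2

# Stand-in for a local or network based patient record store.
PATIENT_RECORDS = {
    "PID-000123": {"name": "Example Patient", "ward": "4B"},
}

image = cv2.imread("wristband.jpg")
patient_id, _, _ = cv2.QRCodeDetector().detectAndDecode(image)

record = PATIENT_RECORDS.get(patient_id)
if record is None:
    print("Unknown or unreadable wristband - verify identity manually.")
else:
    print("Patient:", record["name"], "- Ward:", record["ward"])
```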
  • the QR code with the unique patient identification encoded therein may be included as a page of peel off stickers 700 , each of which includes the same information encoded within the QR code.
  • the QR code with the unique patient identification may be removed from the page, such as peeling off the sticker, and located on different items associated with the patient.
  • a QR code may be placed on each of the patient's medicine bottles 710 .
  • a QR code may be placed on the patient's chart 720 .
  • a QR code may be placed on the patient's IV bags, which are connected to an IV line for the patient 730 .
  • the unique patient identification may be integrated into a care provider's electronic medical records system 740 , so that they are synchronized together.
  • the QR code may be used as an identification to retrieve a picture of the patient 750 , which may be presented together with any other suitable information, on a display to further verify the identification of the particular patient.
  • the measurement of objects within an image may be based upon the use of a two-dimensional calibration pattern within the field of view of the image being captured.
  • the image may be captured with any suitable device, such as a camera, a video device, or a camera included within a mobile phone.
  • the calibration pattern is preferably located at the same, or substantially the same, plane as the wound. In such a case, the calibration pattern is preferably affixed to the body of the patient or otherwise in face-to-face contact with the body of the patient.
  • a relatively flat surface of the patient, such as the back, is well suited to locating the calibration pattern at substantially the same plane as the wound.
  • other surfaces of the patient, such as the elbow, ankle, wrist, head, leg, and arm, are not relatively flat, so the wound is in a substantially curved orientation and the calibration pattern in face-to-face contact with the body of the patient in a proximate location is likewise in a substantially curved orientation. Accordingly, the substantially curved orientation tends to add a third dimensional component to the captured two-dimensional image.
  • the scale determination at a single plane consistent with the calibration pattern tends to result in substantial errors when applied to other areas of the wound sufficiently spaced apart from the general plane of the calibration pattern.
  • measuring a skin lesion shape size (length, width, and/or area) on a curved region of the patient results in a skin surface shape that is similarly curved.
  • such a skin lesion shaped area would be a wound around the leg of a patient.
  • the leg is curved rather than flat and thus not especially suitable for a skin lesion wrapped partially around the leg being calibrated with a single surface plane with a sufficiently small measurement error.
  • a suitable calibration structure for a sufficiently non-planar surface preferably includes a two-part calibration pattern.
  • a first calibration pattern is used to calibrate the substantially non-planar image into a single plane.
  • a second calibration pattern is used to measure the skin lesion of the calibrated image.
  • alternatively, the first calibration pattern may measure the skin lesion of the image, with the second calibration pattern being used to calibrate the measured skin lesion of the image.
  • the second calibration pattern is used to effectively “unwrap” the captured image into a single plane, such as using a homography matrix applied to a projective transformation of the captured image of a three-dimensional plane to obtain a substantially orthographic view of the wound.
  • a single calibration structure may be used to both unwrap and measure the wound of a captured image.
  • the skin lesion is not suitable for being imaged using a single captured image.
  • a first image captures a portion of the skin lesion that includes first and second calibration structures.
  • a second image captures another portion of the skin lesion that includes first and second calibration structures. Additional images of the skin lesion may be captured, as desired. Multiple captured images of the same skin lesion may be separately and/or jointly calibrated and unwrapped to provide a sufficiently accurate measure of the skin lesion.
  • an exemplary calibration structure is illustrated that includes both a first calibration pattern and a second calibration pattern.
  • the second calibration structure may be an elongate strip with parallel edges.
  • the curvature of the skin lesion may be determined based upon the resulting distortion in the parallel edges appearing to be non-parallel in the captured image.
  • the second calibration structure may be an elongate strip with parallel markings thereon.
  • the first calibration structure may be a two-dimensional pattern as previously described.
  • the first calibration pattern may be a one-dimensional pattern as previously described.
  • the one-dimensional pattern may define the surface scale in a single direction.
  • the distance between the parallel series of lines illustrated in FIG. 37 may define the mapping of the surface in a first direction.
  • the length of the parallel series of lines illustrated in FIG. 37 may define the mapping of the surface in a second direction.
  • the first and second calibration structures may be integrated into a single structure, such as, an elongate strip with parallel edges together with a two-dimensional calibration pattern included thereon.
  • the first and second calibration structures may be included on two separate physical structures not affixed to one another.
  • the first and second calibration structures may be included on two separate physical structures that are affixed to one another.
  • the first and second calibration structures are included in the captured image(s) obtained of the skin lesion.
  • an integrated strip may include a plurality of two-dimensional calibration patterns included thereon at spaced apart locations, which is especially suitable for curved surfaces.
  • the first and second calibration patterns are consistent with one another if multiple such first and/or second calibration patterns are used for imaging a single wound in a set of images.
  • the imaging device may be at any angle relative to the skin lesion, while still obtaining a relatively accurate measurement, which simplifies the image capture process for the user.
  • An exemplary technique for calibrating a two-dimensional image on a curved surface, where the full surface of the wound is captured in a single image, may be as follows:
  • a first calibration pattern is placed on a generally planar portion of the surface of the skin
  • the segmented image (or the entire captured image) is unwrapped based upon a three-dimensional surface model derived from the second calibration pattern
  • the size of the wound is measured from the unwrapped image, using the scale data from the first calibration pattern (or second calibration pattern).
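  • A minimal sketch of the unwrap step, under the strong simplifying assumption that the limb is locally a cylinder whose axis runs vertically in the already scale-calibrated, orthographic image and whose radius was estimated from the second (strip) calibration pattern; written in Python with NumPy, with illustrative values.

```python
import numpy as np

def unwrap_columns(x_mm, radius_mm):
    """Map projected horizontal positions (mm) to arc lengths (mm) on the cylinder."""
    x = np.clip(np.asarray(x_mm, dtype=float), -radius_mm, radius_mm)
    # A point seen x mm from the cylinder's center line lies at arc length
    # s = R * arcsin(x / R) along the skin surface.
    return radius_mm * np.arcsin(x / radius_mm)

# Example: a wound edge seen 30 mm from the center line of a 45 mm radius leg
# actually lies about 32.8 mm along the skin surface.
print(unwrap_columns(30.0, 45.0))
```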
  • An exemplary technique for calibrating a two-dimensional image on a curved surface, where the full surface of the wound is captured in multiple images, may be as follows:
  • a first calibration pattern is placed on a generally planar portion of the surface of the skin
  • the image scale from the first calibration pattern is used to calculate the size changes in the second calibration pattern.
  • as the surface curves away from the imaging device, the second calibration pattern appears to shrink in size in the captured image.
  • the segmented image (or the entire captured image) is unwrapped based upon the three-dimensional surface model
  • the size of the wound is measured from the unwrapped images, using the scale data from the first calibration pattern(s) (or second calibration pattern(s)), together with stitching the images together.
  • a consistent technique is used for wound assessment in measuring the wound.
  • One technique is linear measurement, generally referred to as a “clock” technique, where a measurement of the longest length is determined, a measurement of the greatest width is determined, and a greatest depth of the wound is determined, using the body as the face of an imaginary clock.
  • the head may be 12 o'clock and the feet may be 6 o'clock.
  • the “clock” technique may be achieved using the calibration pattern(s) by positioning the calibration pattern(s) in specific directions.
  • the corner dots and color patterns may define the orientation of the calibration target with respect to the body.
  • the strip orientation may be used to define the orientation of the calibration target with respect to the body.
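  • A minimal sketch of the "clock" measurements, assuming Python with NumPy; the contour points (already converted to millimeters) and the 12 o'clock direction (derived, for example, from the calibration strip orientation) are illustrative.

```python
import numpy as np

# Wound contour points already converted to millimeters (illustrative values).
contour_mm = np.array([[0, 0], [18, 4], [25, 16], [10, 22], [-3, 12]], dtype=float)
toward_head = np.array([0.0, 1.0])   # unit vector toward 12 o'clock
toward_side = np.array([1.0, 0.0])   # perpendicular vector toward 3 o'clock

length_mm = np.ptp(contour_mm @ toward_head)   # greatest extent along the 12-6 axis
width_mm = np.ptp(contour_mm @ toward_side)    # greatest extent along the 3-9 axis
print("length %.1f mm, width %.1f mm" % (length_mm, width_mm))
```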
  • a first care giver may capture an image of a wound and classify it in a first manner
  • a second care giver may capture an image of the same wound at generally the same time and classify it in a different manner.
  • a care giver may capture an image of a first wound of a first patient and classify it in a first manner
  • the same care giver may capture an image of a second wound of a second patient and classify it in a second manner, while the first wound and the second wound should more properly be classified in the same manner.
  • This inconsistency in the classification of the wound decreases the effectiveness of the care of the patient.
  • An assisted wound classification, partially automated wound classification, and/or fully automated wound classification provides a substantial benefit to the care of the patient.
  • Wounds come in all shapes and in all sizes, so it is desirable to include standardized criteria to classify each wound to assist the care giver to provide more holistic and seamless care for the patients. This is especially the case if multiple care givers are providing various aspects of wound care management for the same patient. While a metric for standardization may exist in some fashion, differences in interpretation of such a metric and differences in training in the use of such a metric still result in substantial variations across care givers and across the different times at which such classification occurs.
  • complex and prevalent wound etiologies and their corresponding stage classifications may be as follows.
  • Pressure ulcer: pre-stage 1; stage 1; stage 2; stage 3; stage 4; DTPI; unstageable.
  • Diabetic ulcer: Wagner 1; Wagner 2; Wagner 3; Wagner 4.
  • Such classifications are primarily based on the wound severity and the depth of tissue exposure. A deeper wound and/or greater exposure of the inner body tissue structure tends to result in a higher severity stage classification. For example, a pressure ulcer classified as stage 4 is associated with a wound having tendon/bone/cartilage exposure. Additional factors tend to be incorporated into wound stage classification; for example, a wound historically classified at a higher severity tends to retain that classification until the closure of the wound.
  • the staging process 4100 may include placing a color calibration pattern next to a wound that is to be captured in an image 4110.
  • An image capture device captures an image of the wound and the calibration pattern 4120 .
  • the system then calibrates the captured image of the wound based upon the calibration pattern 4130 . In this manner, the captured image of the wound is modified in a manner that is more consistent among various image capture conditions.
  • the system detects the wound boundaries within the image, either in a manual or automated manner 4140 .
  • the system classifies the exposed wound tissue type based on a pre-trained algorithm which considers tissue color and other image characteristics 4150.
  • the image processing may be based upon artificial intelligence processing; for example, the system is trained using labeled images and thus, when presented with a new captured image, it may classify the image based upon the training. Beyond tissue color detection, other imaging techniques may be applied to detect tissue type by color, texture, and composition analysis of the wound bed.
  • the system may then perform a wound stage classification based upon pertinent factors 4160 .
  • the factors may include, for example, assigned wound etiology, detected exposure type tissue colors and characteristics, wound staging history, and/or wound location.
  • the resulting classification may be presented to a care giver for verification, if desired, and for consideration of the continued care plan.
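  • A minimal, purely illustrative sketch of the rule-style staging step 4160 in Python; the tissue labels and rules are placeholders for the pre-trained classification described above and are not a clinically validated staging algorithm.

```python
def stage_pressure_ulcer(tissue_types, prior_stage=None):
    """Illustrative rule-of-thumb staging from detected tissue types (step 4150 output)."""
    if tissue_types & {"bone", "tendon", "cartilage"}:
        stage = "stage 4"
    elif "subcutaneous fat" in tissue_types:
        stage = "stage 3"
    elif "dermis" in tissue_types:
        stage = "stage 2"
    elif "non-blanchable redness" in tissue_types:
        stage = "stage 1"
    else:
        stage = "unstageable"
    # Per the convention above, a historically higher-severity classification
    # persists until the wound closes.
    order = ["stage 1", "stage 2", "stage 3", "stage 4"]
    if prior_stage in order and stage in order and order.index(prior_stage) > order.index(stage):
        return prior_stage
    return stage

print(stage_pressure_ulcer({"dermis", "slough"}, prior_stage="stage 3"))  # -> stage 3
```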
  • an exemplary set of wound classifications that may be selected by a care giver, are illustrated.
  • the care giver may obtain an image of a wound and compare that image against the provided selections, to most accurately classify the wound.
  • the system may be fully automated by using a set of images and image analysis of the wound for classification. The images and descriptions provide information to assist the care giver in the classification.
  • the wound classifications are associated with the wound and saved in the management system.

Abstract

A wound characterization system that includes color calibration.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 15/994,895 filed May 31, 2018, which claims the benefit of U.S. Provisional Patent App. No. 62/708,867 filed Dec. 27, 2017;
  • U.S. patent application Ser. No. 15/994,895 is a continuation-in-part of U.S. patent application Ser. No. 15/807,713 filed Nov. 9, 2017, now U.S. Pat. No. 10,565,735 issued Feb. 18, 2020, which claims the benefit of U.S. Provisional Patent App. No. 62/422,999 filed Nov. 16, 2016;
  • U.S. patent application Ser. No. 15/994,895 is a continuation-in-part of U.S. patent application Ser. No. 15/807,705 filed Nov. 9, 2017, now U.S. Pat. No. 10,417,785 issued Sep. 17, 2019, which claims the benefit of U.S. Provisional Patent App. No. 62/422,987 filed Nov. 16, 2016;
  • U.S. patent application Ser. No. 15/994,895 is a continuation-in-part of U.S. patent application Ser. No. 15/807,693 filed Nov. 9, 2017, now U.S. Pat. No. 10,298,780 issued May 21, 2019, which claims the benefit of U.S. Provisional Patent App. No. 62/422,966 filed Nov. 16, 2016; and
  • U.S. patent application Ser. No. 15/994,895 is a continuation-in-part of U.S. patent application Ser. No. 15/587,891 filed May 5, 2017, now U.S. Pat. No. 9,989,952 issued Jun. 5, 2018, which is a continuation of U.S. patent application Ser. No. 13/791,987, filed Mar. 9, 2013, now U.S. Pat. No. 9,672,623 issued Jun. 6, 2017, which is a continuation-in-part of U.S. patent application Ser. No. 13/720,260, filed Dec. 19, 2012, now U.S. Pat. No. 9,410,827 issued Aug. 9, 2016, which claims the benefit of U.S. Provisional Patent App. No. 61/795,013, filed Oct. 9, 2012.
  • BACKGROUND OF THE INVENTION
  • Smart mobile devices such as smartphones, feature phones, tablets, e-readers, media players, and so on, combine capabilities from multiple single function devices into a single device. Typically, such smart mobile devices include various combinations of the capability found in devices such as a cell phone, a programmable computer, a camera, a media player and a portable Internet access device.
  • Many smart mobile devices contain one or more digital cameras that allow a user of the smart mobile device to take high resolution and high fidelity digital pictures. For example, some smart mobile devices include two cameras, one in the front of the smart mobile device and one in the back of the smart mobile device. Currently, typical smartphones are able to capture images with a digital resolution of, for example, five to eight megapixels. The trend is to increase the digital resolution of cameras on smart mobile devices. Some cameras for smart mobile digital devices allow for 3D image capture.
  • Cameras in smart mobile devices are especially handy to capture still or short video clips of memorable events and allow easy storage and sharing with others. A captured digital image typically is represented as a two-dimensional matrix of dots, also called pixels.
  • The foregoing and other objectives, features, and advantages of the invention will be more readily understood upon consideration of the following detailed description of the invention, taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 and FIG. 2 show the front and back, respectively, of a smart mobile device, in accordance with an implementation.
  • FIG. 3 shows a smart mobile device used to make a calibrated measurement in accordance with an implementation.
  • FIG. 4 shows an example of a calibration pattern useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 4A shows another two-dimensional calibration pattern.
  • FIG. 5 and FIG. 6 show, respectively, a front view and a back view of a case for a smart mobile device with imprinted calibration patterns useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 7 and FIG. 8 show, respectively, a front view and a back view of a case for a smart mobile device with alternative imprinted calibration patterns useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 9 and FIG. 10 show, respectively, a back view and a side view of a case for a smart mobile device with suction cups and a foldable pin useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 11 and FIG. 12 show, respectively, a front view and a top view of a case for a smart mobile device to which a hanging string may be attached so as to be useful when a smart mobile device makes a calibrated measurement in accordance with an implementation.
  • FIG. 13 shows a smart mobile device used to make a calibrated measurement of the distance between two walls in accordance with an implementation.
  • FIG. 14 shows a simplified example of an image that includes a case for a smart mobile device used as a calibration target useful when making measurements on other objects within the image in accordance with an implementation.
  • FIG. 15 shows a simplified example of an image that shows a house on which has been mounted a calibration pattern in a window in accordance with an implementation.
  • FIG. 16 shows an example of a two-dimensional bar code used as a calibration pattern in accordance with an implementation.
  • FIG. 17 shows another example of a two-dimensional bar code used as a calibration pattern in accordance with an implementation.
  • FIG. 18, FIG. 19, and FIG. 20 illustrate a calibration pattern being used to extract camera information about an image that is applicable to other images using a same image set-up in accordance with an embodiment.
  • FIG. 21 illustrates an image with a large QR code rendered on a display of a mobile device.
  • FIG. 22 illustrates an image with a small QR code rendered on a display of a mobile device.
  • FIG. 23 illustrates an image with a small QR code rendered on a display of a mobile device together with a visual indicator.
  • FIG. 24 illustrates an image with an enlarged centered QR code rendered on a display of a mobile device.
  • FIG. 25 illustrates a technique for locating and decoding a QR code using a mobile device.
  • FIG. 26 illustrates wound classification.
  • FIG. 27 illustrates a QR code and a wound.
  • FIG. 28 illustrates a technique for wound characterization.
  • FIG. 29 illustrates a QR code with color indicators.
  • FIG. 30 illustrates color modification of images including a wound.
  • FIG. 31 illustrates a projective transformation of a QR code.
  • FIG. 32 illustrates a patient with a QR code based identification.
  • FIG. 33 illustrates a system for the use of QR code based patient identification.
  • FIG. 34 illustrates a leg wound and a pair of calibration patterns imaging the entire wound.
  • FIG. 35 illustrates a leg wound and a pair of calibration patterns imaging a first portion of the wound.
  • FIG. 36 illustrates a leg wound and a pair of calibration patterns imaging a second portion of the wound.
  • FIG. 37 illustrates a calibration target for a curved surface.
  • FIG. 38 illustrates a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 39 illustrates a segmented image of a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 40 illustrates an unwrapped image of a leg with a wound, a first calibration pattern, and a second calibration pattern.
  • FIG. 41 illustrates a wound classification system.
  • FIG. 42 illustrates wound classifications.
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENT
  • FIG. 1 and FIG. 2 show the front and back, respectively, of a smart mobile device 10. For example, smart mobile device 10 includes a front facing camera 12, and a touch sensitive display 11, as shown in FIG. 1. Smart mobile device 10 also includes, for example, a back facing camera 22 and a back facing flash 21, as shown in FIG. 2. For example, smart mobile device 10 is a smart phone, a tablet, an e-reader, a media player, a digital camera or any other portable device that includes a camera and has processing capability sufficient to run a software application that performs measurements based on a calibration pattern. In FIG. 2, app 23 represents a software application, stored in smart mobile device 10, that performs measurements based on a calibration pattern, as described further below. The mobile device may be a mobile phone, a smart phone, a tablet, a laptop, or otherwise.
  • If calibrated appropriately, images captured by smart mobile device 10 can be used for measuring object size in three dimensions, for measuring a distance between objects and for measuring color and brightness level of objects in a captured image. For example, as described further herein, inclusion of one or more calibration patterns within an image captured by smart mobile device 10 allows for appropriate calibration. In order to facilitate making measurements, the calibration pattern is placed within a focus plane of a camera that captures the digital image. Placement within the focus plane allows for calibrated measurements of other objects in the digital image.
  • FIG. 3 shows a smart mobile device 10 used to make a calibrated measurement. In FIG. 3, back facing camera 22 is shown to include a camera lens 31 and a camera sensor 32. Dotted lines 37 define a field of view 33 for back facing camera 22. An object of measurement 36 is located on a focus plane 34, as shown in FIG. 3. A calibration target 35 is also shown located on focus plane 34.
  • Focus plane 34 of back facing camera 22 is in a parallel plane to the plane on which camera sensor 32 resides. The distance of focus plane from camera 22 is determined by focus of camera lens 31 of camera 22. Typically, when capturing an image for the purpose of dimension measurements, a camera is best placed parallel with a focus plane (e.g., an X-Y plane) in which measurements will occur. When the focus plane is an X-Y plane, measurements on objects close to the focus plane (e.g., in which a location on the Z axis is close to the X-Y plane) will typically have higher accuracy than measurements made on objects farther from the focus plane (e.g., in which a location on the Z axis is at a greater distance to the X-Y plane). Therefore, it is typically best, where possible, to focus the camera lens on the intended object of measurement and to include a calibration pattern within the focus plane of the camera lens.
  • A calibration pattern includes one or more known predetermined sub-patterns that have known or knowable characteristics. Including such a calibration pattern in a captured digital image will indicate information about other pixels in the captured digital image. For example, the indicated information obtained from the calibration pattern may include actual dimensions of geometric shapes in the calibration pattern. This can be used to calculate, for example, actual dimension of sizes represented by each pixel within a captured digital image.
  • Knowing the actual dimension of sizes represented by each pixel within a captured digital image allows for making measurements of dimensional information. A measurement of dimensional information can be any measurement that takes into account information about dimensions. For example, a measurement of dimensional information can be a measurement of one or more of the following: distance between points, length, width, area, bounding box location and size, centroid, perimeter length, number of holes, form factor (ratio of area to the square of perimeter), elongation, moments, best-fitting ellipse, ratio of best-fitting ellipse axes, orientation, roundness, convexity related, convex area, minimum bounding box location, size and orientation, feret diameters at different angles, convexity (ratio of convex perimeter to raw perimeter), solidity (ratio of net area to convex area), perimeter related, perimeter points (blob's boundary and holes), filled area, sorting and selecting blobs based on any calculated feature, and user selection of group of features to calculate.
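  • A minimal sketch of a few of these measurements, computed from a binary object mask once the millimeters-per-pixel scale is known, assuming Python with OpenCV; the scale value and mask file name are illustrative.

```python
import cv2

mm_per_px = 0.12  # scale established from the calibration pattern (illustrative)
mask = cv2.imread("object_mask.png", cv2.IMREAD_GRAYSCALE)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
blob = max(contours, key=cv2.contourArea)

area_mm2 = cv2.contourArea(blob) * mm_per_px ** 2
perimeter_mm = cv2.arcLength(blob, closed=True) * mm_per_px
x, y, w, h = cv2.boundingRect(blob)
moments = cv2.moments(blob)
centroid_px = (moments["m10"] / moments["m00"], moments["m01"] / moments["m00"])
form_factor = area_mm2 / perimeter_mm ** 2   # ratio of area to the square of perimeter

print("area %.1f mm^2, perimeter %.1f mm, bounding box %.1f x %.1f mm" %
      (area_mm2, perimeter_mm, w * mm_per_px, h * mm_per_px))
```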
  • The indicated information obtained from the calibration pattern may also include, for example, brightness information for grey levels for objects and color information for objects in the calibration pattern. And so on. This can be used to calculate brightness and color information, etc., of other objects within the captured digital image. For a discussion of use of calibration targets in digital photography, see United States Patent Application 2004/0027456 A1 published Feb. 12, 2004.
  • FIG. 4 shows an example of a calibration pattern 40 that appears on calibration target 35. Calibration pattern 40 can include, for example, one or a plurality of various calibration sections used for calibration and can also include encoded or otherwise obtainable information that can be recognized by smart mobile device 10. An example of a calibration section within calibration pattern 40 is a geometric pattern 42 that has known or knowable physical dimensions. A high gradient pattern 44 can be used by smart mobile device 10 to sharpen image focus. A geometric pattern 45 is another geometric pattern with known physical dimensions that can be used for dimensional measurements. A red area 46, a blue area 47, a green area 48 and a gray area 49 are colorimetry and brightness calibration patterns that can be used by smart mobile device 10 to calibrate color and brightness for a captured image and/or to calibrate smart mobile device 10.
  • An identification indicia 43 is visually readable by a user. For example, identification number 43 is a serial number or any other type of number or other identifying indicia that identifies calibration pattern 40. For example, app 23 can check for identifying indicia 43 in order to use the identifying indicia to obtain information about calibration pattern 40. For example, different software applications running on smart mobile device 10 may require different calibration patterns. Each unique calibration pattern can be identified, for example, with an identifying indicia. Information for a particular calibration patterned associated with identifying indicia can be stored locally within smart mobile phone 10 or remotely, for example, in a server accessible by smart mobile phone 10 through the Internet. The information for a calibration pattern can be, for example, dimensional measurements from geometric patterns within the calibration pattern, brightness or color values for entities within the calibration pattern, a specification of the layout of the calibration pattern, a specification for a covering case or other entity on which the calibration pattern is embedded or attached and so on. The information can also include, for example, specifications pertaining to smart mobile device 10, such as packaging specifications and camera specifications.
  • A two-dimensional bar code 41 is a quick response (QR) code or similar code. Two-dimensional bar code 41 can include the identifying indicia for the calibration pattern thus allowing smart mobile device 10 to identify the calibration pattern in a captured image and access from local or remote storage information about the calibration pattern. Alternatively, or in addition, two-dimensional bar code 41 contains additional information about the calibration pattern. For example, two-dimensional bar code 41, in addition or instead of the identifying indicia for the calibration pattern, contains specific information about actual measurements for sections of the calibration pattern information, information about where the calibration is expected to be located (e.g., on a covering case for mobile device 10) and other information that, for example, may be useful to app 23 when performing measurements. App 23 will capture the information by decoding two-dimensional bar code 41 when two-dimensional bar code 41 is within a captured image. Alternative to two-dimensional bar code 41, calibration pattern 40 can use other means to encode information such as a one dimensional bar code or another information encoding scheme.
  • Referring to FIG. 4A, another type of two-dimensional bar code may include a data matrix consisting of black and white “cells” or modules preferably arranged in either a square or rectangular pattern, also known as a matrix. The information to be encoded is preferably text or numeric data. A typical data size is from a few bytes up to 1556 bytes. The length of the encoded data depends on the number of cells in the matrix. Error correction codes may be included to increase reliability so that even if one or more cells are damaged so it is unreadable, the message can still be read.
  • Data Matrix symbols are preferably rectangular, more preferably square in shape and composed of square “cells” which represent bits. Depending on the coding used, a “light” cell represents a 0 and a “dark” cell is a 1, or vice versa. Every Data Matrix is preferably composed of two solid adjacent borders in an “L” shape (referred to as the “finder pattern”) and two other borders consisting of alternating dark and light “cells” or modules (referred to as the “timing pattern”). Within these borders are rows and columns of cells encoding information. The finder pattern is used to locate and orient the symbol while the timing pattern provides a count of the number of rows and columns in the symbol. As more data is encoded in the symbol, the number of cells (rows and columns) increases. Each code is preferably unique.
  • Any other type of two-dimensional code may be used, as desired.
  • A particular calibration pattern can be registered with app 23 so that app 23 assumes that the calibration pattern in an image is the registered calibration pattern. This registration information allows app 23 operating within smart mobile device 10 to access information about the calibration target from local or remote memory, without having to read configuration information or the identifying indicia directly from calibration target 35.
  • When the calibration pattern includes an identifying indicia, whether encoded in a two-dimensional bar code or otherwise readable by mobile device 10, the identifying indicia can be used to check to see if app 23 is configured to be used with that calibration pattern. When app 23 checks the identifying indicia and determines smart mobile device 10 is not yet configured to use the calibration pattern, the user of smart mobile device 10 is given, for example, an opportunity to register smart mobile device 10 to be configured to use the calibration pattern. For example, such registration might require a fee. Once registered, smart mobile device 10 will be able to access information about the calibration pattern. The information can be accessed, for example, from internal memory within smart mobile device 10 or from some external memory source.
  • A captured digital image that includes calibration pattern 40 in the focus plane allows for calibrated measurements, such as two-dimensional measurements of all objects within the focus plane of calibration pattern 40. Additionally, calibration pattern 40 can then be removed and another digital image captured without the presence of calibration pattern 40. As long as no other changes are made to the camera set-up, measurements can be made on the newly captured image based on calibration information obtained from the originally captured image.
  • It is also possible to measure distances extending perpendicular to the focus plane (e.g., in the Z dimension). For example, the distance between smart mobile device 10 and an object where calibration pattern 40 resides can be determined by a comparison of pixel sizes in a digital image that includes calibration pattern 40 with the actual size of a known element within calibration pattern 40 while taking into account any magnification performed by camera lens 31.
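  • A minimal sketch of this distance estimate using the pinhole-camera relation distance = focal length (in pixels) x real size / size in pixels, in Python; the focal length and the measured pixel size are illustrative values.

```python
# Pinhole-camera distance estimate: distance = focal_length_px * real_size / size_in_pixels.
focal_length_px = 3200.0    # camera focal length expressed in pixels (illustrative)
feature_real_mm = 25.4      # known physical size of a calibration pattern element
feature_size_px = 210.0     # size of that element as measured in the captured image

distance_mm = focal_length_px * feature_real_mm / feature_size_px
print("estimated distance to the calibration plane: %.0f mm" % distance_mm)
```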
  • In order to use smart mobile device 10 as a measuring device, it would be helpful to keep a calibration pattern handy that could be included in an image captured by smart mobile device 10. This is accomplished, for example, by integrating the calibration pattern into a case for smart mobile device 10.
  • FIG. 5 and FIG. 6 show, respectively, a front view and a back view of a case 50 for smart mobile device 10. FIG. 5 shows a calibration pattern 52 included on case 50. For example, calibration pattern 52 is imprinted within a cavity 51 on the front of case 50. Including calibration pattern 52 within cavity 51 helps to protect calibration pattern 52 from being eroded through friction when placing smart mobile device 10 into case 50 and removing smart mobile device 10 from case 50.
  • FIG. 6 shows a calibration pattern 62 imprinted within a cavity 61 on the back of case 50. Including calibration pattern 62 within cavity 61 helps to protect calibration pattern 62 from being eroded through friction as case 50 interacts with its environment while protecting smart mobile telephone 10 from damage.
  • For example, case 50 is a full outerbox skin case, a four-sided skin case, a three-sided skin case, a perimeter bumper case, a holster case, or any other kind of case designed to protect mobile device 10. Case 50 is composed of, for example, hard material such as plastic or metal, or is composed of softer material such as leather or cloth composed of natural or synthetic material. For example, sides of case 50 are constructed to allow case 50 to be stood up on a flat surface without tipping, allowing convenient viewing of calibration pattern 52 and calibration pattern 62.
  • For example, the calibration pattern can be included on case 50 in various ways. For example, the calibration pattern can be imprinted on case 50 at manufacturing time. Alternately, the calibration pattern can be included on case 50 by, after manufacturing, adhering a label containing the calibration pattern onto case 50 or by any other means which results in calibration pattern being visible on case 50.
  • A benefit of including a calibration pattern on case 50 is that case 50 can be carried with mobile device 10 and is used to protect mobile device in addition to providing a ready source for the calibration pattern. Case 50 can be easily detached from smart mobile device 10 without affecting functionality of mobile device 10.
  • FIG. 7 and FIG. 8 show, respectively, a front view and a back view of a case 70 for smart mobile device 10. FIG. 7 shows a calibration pattern 72 imprinted within a cavity 71 on the front of case 70. Calibration pattern 72 is composed, for example, entirely of a two-dimensional bar code, such as a QR pattern. Including calibration pattern 72 within cavity 71 helps to protect calibration pattern 72 from being eroded through friction when placing smart mobile device 10 into case 70 and removing smart mobile device 10 from case 70.
  • FIG. 8 shows a calibration pattern 82 imprinted within a cavity 81 on the back of case 70. Calibration pattern 82 is composed, for example, entirely of a two-dimensional bar code, such as a QR pattern. Including calibration pattern 82 within cavity 81 helps to protect calibration pattern 82 from being eroded through friction as case 70 interacts with its environment while protecting smart mobile telephone 10 from damage.
  • For example, the two-dimensional bar code includes some or all calibration pattern geometries required for, for example, dimensional, brightness/grey level and colorimetry measurements. The two-dimensional bar code thus acts as a calibration pattern. The benefit of using the two-dimensional bar code as a calibration pattern is that the two-dimensional bar code takes up much or all of the space available for a calibration pattern and thus can be a larger two-dimensional bar code that can be more easily detected within a captured image with a larger field of view.
  • FIG. 9 and FIG. 10 show, respectively, a back view and a side view of a case 90 for smart mobile device 10. Case 90 has been outfitted with various appurtenances for allowing case 90 to be mounted on a focus plane when making measurements. For example, FIG. 9 shows a suction cup 91, a suction cup 92, a suction cup 93 and a suction cup 94 embedded on back of case 90. Suction cup 91, suction cup 92, suction cup 93 and suction cup 94 can be used to temporarily adhere the back of case 90 to a hard smooth surface such as metal or glass.
  • A foldable ring 95 can be used to hang case 90 to a pin, nail, hook and so on. Foldable ring 95 can also be used for hanging by a string, strand, thread, cord, etc.
  • FIG. 10 additionally shows a suction cup 101, a suction cup 102 and a suction cup 103, embedded on a side of case 90. Suction cup 101, suction cup 102 and suction cup 103 can be used to temporarily adhere the side of case 90 to a smooth surface.
  • A foldable pin 104 allows case 90 to be attached to soft material, like drywall, and cloth. The foldable design allows for foldable pin 104 to be in an embedded cavity while not in use.
  • FIG. 11 and FIG. 12 show, respectively, a front view and a top view of a case 110 for smart mobile device 10. FIG. 11 shows a hanging string 113 attached to case 110. Hanging string 113 allows case 110 to be suspended at a desired location when a calibration pattern 112 within an indentation 111 of case 110 is to be used as part of a calibrated measurement performed by mobile device 10. FIG. 12 shows a hang hole 121 and a hang hole 122 located on top of case 110. For example, hanging string 113 is placed through hang hole 121 and hang hole 122 to attach hanging string 113 to case 110.
  • FIG. 13 shows smart mobile device 10 used to make a calibrated measurement of the distance between a wall 131 and a wall 132. Lines 137 define a field of view 134 for back facing camera 22. A case 135 is attached to wall 131. Case 135 includes a calibration pattern that faces towards wall 132.
  • FIG. 14 shows a simplified example of a recorded image 140 that includes an image of case 145 with an embedded calibration pattern. The calibration pattern can be used for measurements of dimensions, colorimetry, brightness and so on of other objects within recorded image 140. The other objects include, for example, a safety pin 141, a pencil 144, a circular object 142 and a square object 143.
  • In order to activate app 23 within smart mobile device 10, app 23 needs to be transferred to smart mobile device 10 if not installed when smart mobile device 10 is purchased. For example, app 23 can be downloaded from the internet or from an app store. Also, a case with an embedded calibration pattern can be obtained.
  • The camera setting of smart mobile device 10 will need to be set according to any instructions included with app 23.
  • The calibration pattern may then be included in the field of view of a camera of smart mobile device 10. For example, a particular background may be specified or suggested to maximize contrast between the calibration pattern and the background.
  • The camera of smart mobile device 10 is focused on the calibration pattern based on the capability of the camera of smart mobile device 10. The focus capability may be, for example, auto focus, tap to focus, or another focusing capability. Once in focus, an image is captured.
  • App 23 will analyze the captured image. For example, if the captured image has a two-dimensional bar code, app 23 will read and decode the two-dimensional bar code and act in accordance with the encoded instructions. If the two-dimensional bar code includes a calibration code identifying indicia and all calibration information, then the app 23 will decode the information, associate the information with the identifying indicia of the calibration pattern and store the information in the memory of smart mobile device 10. The information can in the future be accessed based on the associated identifying indicia. Alternatively, if the two-dimensional bar code does not include all available information about the calibration pattern, app 23 can use the identifying indicia, for example, to access information about the calibration pattern previously stored in smart mobile device 10 or download additional information about the calibration pattern from an App central server (cloud) when smart mobile device 10 is connected to the Internet. For example, once information about the calibration pattern is stored in smart mobile device 10, the setup procedure of app 23 will prompt user for registering this specific calibration pattern with smart mobile device 10. If permission is granted, registration will proceed.
  • FIG. 3 illustrates the process of measuring object 36 in field of view 33 of back facing camera 22. In a first step, calibration target 35 is placed within field of view 33, preferably in focus plane 34 of object 36. For example, as described above, calibration target 35 is a calibration pattern on a case of smart mobile phone 10. Smart mobile phone 10 is removed from the case and the case is placed so that the calibration pattern plane is parallel to the measurement plane of object 36 and any other objects to be measured. In a second step, smart mobile phone 10 is positioned so that object 36, and any other objects to be measured, are maximized within field of view 33. For example, FIG. 14 shows multiple objects within field of view 33.
  • In a third step, back facing camera 22 is focused at focus plane 34 and an image captured. For example, a manual focus or an auto focus capability, such as a tap-on-focus, is used to focus camera lens 31 on calibration target 35.
  • Once an image is captured, app 23 analyzes the captured image to perform a calibration process. Particularly, app 23 analyzes the captured image to determine an exact location and orientation of calibration target 35. App 23 will also look for a two-dimensional bar code or other source of encoded information within the captured image. From information obtained from, for example, a two-dimensional bar code or other source of encoded information, app 23 will verify smart mobile device 10 has access to the relevant calibration information associated with calibration target 35 and if so, use the relevant calibration information associated with calibration target 35 for calibrating back facing camera 22. If smart mobile device 10 does not have access to the relevant calibration information associated with calibration target 35, app 23 will try to obtain access to this information, for example, by connecting the user to an online source where access can be obtained.
  • Once app 23 has access to relevant calibration information, app 23 uses algorithms that use geometrical patterns included within the calibration pattern and their geometrical relationships to calculate measurement values, as is understood in the art.
  • In a fourth step, object 36 is measured. To measure object 36, the user brings up the calibrated captured image. The calibrated captured image will have calibration information with it. The calibrated captured image can be viewed and processed on smart mobile device 10 or transferred to another computing device such as a personal computer for viewing and measuring. For example, an object measurement menu bar is presented to the user to make the measurement process more convenient. At the user's option, various measurements can be made. For example, a point-to-point measurement can be made using a ruler placement.
  • Also, an area measurement can be made by placing a geometrical shape on an object. Various associated measurements such as dimensions, gray level, density, colorimetry, and so on can be calculated.
  • Alternatively, a user can identify an object and automated object recognition could be performed. The automated object recognition could return detected values for various associated measurements such as dimensions, gray level, density, colorimetry.
  • Alternatively, app 23 can be written so that, when run on mobile device 10, it creates a process running on mobile device 10 that can detect a case that does not necessarily include a calibration pattern. For example, the case can be detected by detecting the outline of the case or some prominent feature on the case or pattern on the case. In this example, app 23 uses stored information about the case to make a calibrated measurement. For example, the stored information can be dimensional information, brightness information, color information or information about a feature or a pattern on the case.
  • FIG. 13 illustrates measurement of distance between two objects, in this case the distance between wall 131 and wall 132. In a first step, the calibration target, i.e., case 135 with an embedded calibration pattern, is placed on the first object, i.e., wall 131.
  • In a second step, smart mobile device 10 is placed on the second object, i.e., wall 132. Smart mobile device 10 is mounted on wall 132 so that camera 22 is directly facing in a direction perpendicular to case 135 (the calibration target).
  • In a third step, the zoom of camera 22 is adjusted to maximize the size of the calibration target in field of view 134 of smart mobile device 10.
  • In a fourth step, camera 22 is focused on case 135 and an image captured. For example, a manual focus or an auto focus capability, such as a tap-on-focus is used to focus camera lens 31 on case 135.
  • In a fifth step, once an image is captured, app 23 analyzes the captured image to perform a calibration process. Particularly, app 23 analyzes the captured image to determine an exact location and orientation of case 135. App 23 will also look for a two-dimensional bar code or other source of encoded information within the captured image. From information obtained from, for example, a two-dimensional bar code or other source of encoded information, app 23 will verify smart mobile device 10 has access to the relevant calibration information associated with the calibration pattern embedded on case 135 and if so, use the relevant calibration information associated with the calibration pattern embedded on case 135 for calibrating back facing camera 22. If smart mobile device 10 does not have access to the relevant calibration information associated with the calibration pattern embedded on case 135, app 23 will try to obtain access to this information, for example, by connecting the user to an online source where access can be obtained.
  • Once app 23 has access to relevant calibration information, app 23 uses algorithms that use specific patterns in the calibration pattern designed for distance measurement through triangulation.
  • A calibration pattern within an image can be used apart from a smart mobile device. For example, FIG. 15 shows a simplified example of an image 150 that shows a house 157 on which has been mounted a calibration pattern 151 in a window of the house. For example, the image is a digital image captured with any digital camera. The image can be displayed on any computer system able to display digital images. Calibration pattern 151 contains information about calibration pattern 151. For example, calibration pattern 151 is a two-dimensional bar code that contains encoded display information about calibration pattern 151.
  • The information displayed in calibration pattern 151 is utilized to make one or more calibrated measurements, such as those represented by an arrow 152, an arrow 153, an arrow 154, an arrow 155, and an arrow 156. The calibrated measurements are utilized, for example, by a computing system used by a user, or by a remote server accessed by a user.
  • The inclusion of a calibration pattern in a digital image allows for a computer system to make calibrated measurements. For example, the image can contain objects of any size. The calibrated measurements can be made by any computing system with sufficient processing power to make the pertinent calculations.
  • The information displayed in a calibration pattern can also be used to validate user permission to use a particular application to make calibrated measurements. For example, a particular calibration application can be set up to only operate on images that display a particular calibration pattern or group of calibration patterns. For example, each calibration pattern may include a serial number or some other identification indicia that uniquely identifies the calibration pattern. The application making the calibration measurements can use this identification indicia as a pass code to validate user rights to use the application to make calibrated measurements.
  • FIG. 16 shows a two-dimensional bar code 160 used as a calibration pattern. While in FIG. 16, calibration pattern 160 is in a tilted orientation, app 23 will calculate the orientation and take the orientation into account when making calibrated measurements. For example, information about calibration pattern 160 will include a value for an actual distance, represented by a line 164, between a point 161 and a point 162, a value for an actual distance, represented by a line 165, between point 162 and a point 163 and a value for an actual distance, represented by a line 166, between point 163 and point 161. Within calibration pattern 160, a high gradient pattern can be inserted to be used to sharpen image focus. Also, particular color or grey areas can be added to calibration pattern 160 to allow for calibration of color and/or brightness for a captured image that includes calibration pattern 160.
  • As illustrated in FIG. 3, placing camera 22 and calibration target 35 in parallel planes when capturing an image of calibration target 35 is important to achieve accurate measurements. Since a user may hold mobile device 10 in hand when capturing an image, there may be some variance from the ideal positioning of camera 22 and calibration target 35 in parallel planes. To accommodate this lack of precision, four or more measuring points of the calibration target can be used to measure co-planarity of the planes in which camera 22 and calibration target 35 are situated.
  • For example, FIG. 17 shows a two-dimensional bar code 170 used as a calibration pattern. For example, information about calibration pattern 170 will include a value for an actual distance, represented by a line 176, between a point 171 and a point 172, a value for an actual distance, represented by a line 177, between point 172 and a point 173, a value for an actual distance, represented by a line 178, between point 173 and a point 174, and a value for an actual distance, represented by a line 175, between point 174 and point 171.
  • Points 171, 172 and 173 are used for geometrical calibration of the captured image and orientation assessment of the calibration pattern. All four points 171, 172, 173 and 174 are used for a co-planarity measurement. The co-planarity measurement has multiple uses. That is, the co-planarity measurement is used to assess image co-planarity at the time of image capture and provides real-time feedback to the user of smart mobile device 10 on the parallelism of the camera with the calibration pattern image plane when the user is about to capture an image. For example, visual and/or audio feedback is given to the user when the camera and the calibration pattern are co-planar, or alternatively when they are not co-planar.
  • Once an image is captured, the co-planarity measurement is used to correct any deviation from co-planarity between the camera and the calibration pattern image plane. The co-planarity measurement can also be used as a factor in calculating and presenting to the user a value that indicates an expected accuracy of the calibrated measurement.
  • While app 23 within smart mobile device 10 utilizes the calibration pattern to make calibrated measurements, such calibrated measurements could also be made by any computer implemented system that includes a processor and a computer readable medium encoded with processor readable instructions that, when read, implement a process on the processor that can detect a calibration pattern within an image, where the process uses information displayed within the calibration pattern to make a calibrated measurement.
  • For example, a server can make a measurement by accessing a digital image, where the digital image includes a calibration pattern and the calibration pattern includes displayed information about the calibration pattern. The server reads the displayed information to obtain the information about the calibration pattern. Then the server utilizes the displayed information to make a calibrated measurement.
  • It is also possible to calibrate an image once and use extracted calibration information from the image to calibrate other images captured using the same image set-up (e.g., camera position, object location, lighting, etc.). To achieve this, one can calibrate the image at the time of picture taking by placing a calibration pattern in the scene and taking a picture. The calibration pattern can then be used to extract camera information about the image, which will be equally applicable to all other images subsequently captured using the same image set-up.
  • This is illustrated by FIG. 18, FIG. 19 and FIG. 20. FIG. 18 shows a shoe 202 within a picture frame 201. Also, within the picture frame is media 203 that includes a calibration pattern. The calibration pattern allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204, which are not visible in the image, but represent information available from the calibration pattern.
  • The calibration pattern can provide, for example, information such as pixel size in the X direction, pixel size in the Y direction, distance to the focus plane, location of the focus plane in the image (which can be exposed by placing a graphics overlay to define this plane), duplicates of the above attributes for each plane if there are multiple focus planes of calibration, dimensional measurement information and overlays for premeasured objects, colorimetric calibration information, brightness calibration information, capture time lighting information (flash, sunlight, etc.), scale with respect to real life (for example, the scale of an architectural drawing for an image of the drawing), camera settings, and so on. To define a plane of focus, a coordinate crosshair can also be superimposed into a picture, as a guide for a user making measurements.
  • The image captured with the calibration pattern is processed to extract the calibration information. This calibration information will be the same for all subsequent images taken from the same image set-up. This allows the subsequent images to be calibrated without physically including in the image media 203 with the calibration pattern.
  • When a subsequent image has been taken without including the calibration pattern in the image, the calibration information can be added to the image afterwards. This could be done by visually superimposing a visible pattern containing the information onto the image, or it can be done in a way that does not affect the image, for example, by including the calibration information in metadata stored as part of the image. What is meant by “image metadata” herein is information stored with an image that gives information about the image but does not affect the appearance of the image as reproduced.
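  • One way such calibration information could be stored as image metadata without altering the pixel data is a text chunk in a PNG file. The sketch below uses the Pillow library for this purpose; the file names, key name, and calibration fields are illustrative assumptions rather than a prescribed format.

```python
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

calibration = {
    "pixel_size_x_mm": 0.21,            # illustrative values only
    "pixel_size_y_mm": 0.21,
    "focus_plane_distance_mm": 310,
}

img = Image.open("shoe.png")                           # hypothetical file
meta = PngInfo()
meta.add_text("calibration", json.dumps(calibration))  # stored as metadata only
img.save("shoe_calibrated.png", pnginfo=meta)

# Reading the metadata back later does not touch the image pixels.
stored = json.loads(Image.open("shoe_calibrated.png").text["calibration"])
print(stored["pixel_size_x_mm"])
```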
  • FIG. 19 represents the case where an image has been retaken from the same image set-up (but without media 203 in the picture). In this case the image included only shoe 202. Using calibration from the previously taken image allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204, which are not visible in the image, but represent information available from the calibration information from the earlier taken image. The calibration information, while not originally part of the image, has been added to the image shown in FIG. 19 by superimposing a two-dimensional bar code 208 on the image shown in FIG. 19. Use of a two-dimensional bar code is only illustrative as this information could be visibly included on the image in other ways, for example through a one-dimensional bar code, a digitally coded label, an alphanumeric coded label or some other communication methodology visible on an image.
  • FIG. 20 represents another case where an image has been retaken from the same image set-up (but without media 203 in the picture). In this case the image included only shoe 202. Using calibration from the previously taken image allows for calibration of dimensions such as represented by dimensional measurements 205 and 206 and by axis of orientation 204, which are not visible in the image, but represent information available from the calibration information from the earlier taken image. The calibration information, while not originally part of the image, has been added to the image metadata, but not added to the image data. Thus, as shown in FIG. 20, no calibration information appears in the image itself. The calibration information is included only as part of image metadata stored with the image.
  • Alternative to retaking a picture with the same image set-up, the original image itself can be altered (e.g., using image processing software) to remove the calibration pattern from the original image. The calibration information could then be re-added to the image in another form, for example, by superimposing the calibration information back onto the image, as illustrated in FIG. 19, or by including the calibration information in image metadata stored with the image, as illustrated by FIG. 20.
  • The ability to extract calibration information from a first taken image and reuse the calibration information in subsequent images taken with the same image set-up can be advantageous. For example, volume manufacturers may want to develop a picture taking setup where a camera and picture are calibrated once and images of different objects are taken for future at-will measurements. A shoe manufacturer, for example, may make a picture taking setup, calibrate the system via a calibration pattern or other means, and maintain this setup to take pictures of multiple shoes placed in the focus plane.
  • The ability to extract calibration information from a first taken image and then, in post-image processing, remove the calibration pattern from the original image allows inclusion of the calibration information, for example in image metadata for the image, while maintaining image originality, artistic perspective and cleanliness. Any calibration pattern in the image that distracts the viewer and impacts the artistic perspective of the image is removed.
  • Sometimes it may be necessary to alter calibration information stored with an image. For example, for an original image taken with a calibration pattern, the resolution or some other feature of the image set-up may vary from subsequent images captured without the calibration pattern, or even from images directly derived from the original image. This may occur, for example, where an image taken at a high resolution is uploaded to an on-line site that limits the resolution of uploaded images. If the calibration information stored with the original image (either visible on the picture or in image metadata) is based on the higher resolution, the calibration information stored with the image needs to be resolution scaled to be accurate. If the resolution scaling information of the original image is included in the calibration data, this allows the change in resolution to be taken into account when subsequently interpreted. Including such information, either visibly or with image metadata for the image, allows for precise interpretation of measurement information.
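  • A minimal sketch of the resolution scaling described above, assuming the calibration data stores a per-pixel size and the original image width; the numbers are illustrative.

```python
def rescale_pixel_size(pixel_size_mm, original_width_px, resized_width_px):
    """When an image is resampled, each new pixel covers proportionally more
    (or less) of the scene, so the stored pixel size must be rescaled."""
    return pixel_size_mm * original_width_px / resized_width_px

# A 4000 px wide original with 0.2 mm pixels, re-uploaded at 1000 px wide:
print(rescale_pixel_size(0.2, 4000, 1000))  # 0.8 mm per pixel
```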
  • Referring to FIG. 21, the mobile device is capable of locating and decoding a quick response code 300 in an image captured from a scene if the quick response code is sufficiently large. Referring to FIG. 22, however, if the quick response code 310 is not sufficiently large, the mobile device may have difficulty identifying the quick response code in the captured image, and even if identified, properly decoding the quick response code in the captured image.
  • Referring to FIG. 23, in the case that the quick response code 320 is not readily identifiable, or the quick response code is not readily decoded even if identifiable, then the mobile device may provide a visual alert 330 to the user on the display indicating that the quick response code should be identified on the display by the user. In one embodiment, the visual alert may be a hand with a pointed finger. In another embodiment, a stylus or a mouse motion or a set of x & y coordinates may be used to identify the location of the quick response code. The mobile device receives an identification of the location of the quick response code 320 by the user selecting a portion of the image where the quick response code is located.
  • Referring to FIG. 24, upon identification of the location of the quick response code on the display of the mobile device by the user, the image presented on the display of the mobile device and/or the image captured by the imaging device of the mobile device is automatically presented on the display with an enlarged quick response code 340 and those portions of the image around the quick response code. In this manner, the region of the image and/or scene that includes the quick response code is presented in a manner where the quick response code is readily identified and readily decoded. If desired, the location indicated by the user may be generally centered on the display when displaying the quick response code at an enlarged scale. Further, a noise reduction process may be included during the scaling process. It is to be understood that the increased magnification process may be repeated until the quick response code is sufficiently large to be readily identified and readily decoded. The process may terminate upon successfully identifying and decoding the quick response code, or otherwise upon reaching a predetermined magnification limit. The magnification limit may be selected by the user, if desired.
  • Referring to FIG. 25, an exemplary technique to accommodate QR codes that are relatively small or otherwise hard to discriminate and decode is illustrated. An image is captured and the mobile device applies a QR code reader technique to locate the QR code in the image using the current magnification and location 400. The technique then determines if the QR code is located in the captured image 410. If the QR code is located, then the technique determines if the QR code is decodable in the captured image 420. If the QR code is located 410 and the QR code is decodable 420, then the technique decodes the QR code 430. If the QR code is not located 410 or if the QR code is not decodable 420, then the technique automatically prompts the user to indicate the location of the QR code within the captured image 440. Based upon a received indication of the location of the QR code within the image, the mobile device uses the received location to center a portion of the image and enlarge the image 450. The system determines if a magnification limit is reached 460, such that further magnification is not suitable, in which case the technique ends the search for the QR code and/or ends the attempt to decode the located QR code 470. If the magnification limit is not reached 460, then the technique repeats the process with the modified region of the image and the modified magnification to attempt to locate and decode the QR code.
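  • The loop of FIG. 25 could be sketched as follows, here using OpenCV's QR detector; the user-indicated location, the denoising step, and the magnification limit are assumptions made for illustration, and an 8-bit, 3-channel colour input image is assumed.

```python
import cv2

def locate_and_decode(image, user_point, max_zoom=8):
    """Try to decode a QR code; if it cannot be located or decoded, crop a
    window around a user-indicated pixel location, enlarge it, and retry,
    until the code is decoded or a magnification limit is reached."""
    detector = cv2.QRCodeDetector()
    h, w = image.shape[:2]
    cx, cy = user_point
    region, zoom = image, 1
    while zoom <= max_zoom:
        data, points, _ = detector.detectAndDecode(region)
        if data:                                   # located and decoded
            return data
        # Centre a window on the indicated location and enlarge it,
        # with mild noise reduction applied during the scaling.
        win_w, win_h = max(w // (2 * zoom), 8), max(h // (2 * zoom), 8)
        x0 = max(cx - win_w // 2, 0); x1 = min(x0 + win_w, w)
        y0 = max(cy - win_h // 2, 0); y1 = min(y0 + win_h, h)
        crop = cv2.fastNlMeansDenoisingColored(image[y0:y1, x0:x1], None, 5, 5, 7, 21)
        region = cv2.resize(crop, (w, h), interpolation=cv2.INTER_CUBIC)
        zoom *= 2
    return None                                    # magnification limit reached
```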
  • A chronic wound on a person tends to occur when the wound does not heal in an orderly set of stages and in a predictable amount of time in the manner in which most wounds heal. By way of example, wounds that do not properly heal within three months are typically considered to be chronic wounds. Chronic wounds tend to remain stalled in one or more of the phases of wound healing. By way of example, a chronic wound may remain in the inflammatory stage for a significant period of time. In some cases, chronic wounds may never heal or may take several years to properly heal. Such chronic wounds cause patients severe emotional and physical stress, while also creating a significant financial burden on the patient and on the health care system as a whole. It is estimated that chronic wounds impact over 8.5 million people with an expense in excess of $30 billion a year in the United States.
  • Venous and arterial ulcers, diabetic wounds, and pressure ulcers are some of the major chronic wound categories. Venous and arterial ulcers are primarily caused as a result of aging, diabetic wounds are primarily caused as a result of diabetes, and pressure ulcers are primarily caused as a result of obesity. The burden of treating such chronic wounds is substantially increasing over time with increasing health care costs, an increasingly aging population, a sharp rise in the incidence of diabetes, and a sharp rise in obesity.
  • In a 2012 study of US Wound Registry data, a 5-year period of de-identified data from electronic health records originating from 59 hospital-based outpatient wound centers in 18 states was analyzed, covering 5,240 patients with 7,099 wounds. The mean patient age was 61.7 years, with 52.3% being male and the majority being Caucasian (73.1%) and being Medicare beneficiaries (52.6%). The mean number of serious comorbid conditions per patient was 1.8, with the most common being diabetes (46.8%), obesity or overweight (71.3%), and cardiovascular or peripheral vascular disease (51.3%). More than 1.6% of patients died in service or within 4 weeks of the last visit. Almost two thirds of wounds healed (65.8%), with an average time to heal of 15 weeks and 10% of wounds taking 33 weeks or more to heal. The average wound surface area was 19.5 cm2. Half of the wounds that healed did so with only the use of moist wound care (50.8%) and without the need for advanced therapeutics. The mean cost to heal per wound was $3,927.
  • Referring to FIG. 26, cutaneous wound measurement is an important metric for tracking the healing of a wound and, based upon such measurement, for directing appropriate therapy, especially in the first 4 weeks of the wound healing process. The precision of wound measurement has a statistically significant influence on the assessment of the wound healing process. The wound may include any type of abrasion, skin lesion, mole, psoriasis mark, or otherwise.
  • Despite the importance of wound measurement and its direct impact on chronic wound management, a common technique of wound measurement uses disposable paper rulers. A wound care provider holds such a disposable paper ruler against the wound and reads it using their naked eye. Wound measurement is calculated by multiplying the longest length by the widest width of the wound. Studies have indicated that the use of a ruler and naked-eye approximation has a 44% error rate in measuring wound area.
  • The aforementioned wound measurement technique is operator dependent with significant variability that is open to interpretation. Different clinicians could look at the same wound and take divergent measurements and/or measure different parts of the same wound over time. The inter-rater measurement error, when two separate providers measure the same wound, ranges between 16 and 50%, which again is highly variable. These inaccuracies of wound measurement affect therapy decisions, leading to prolonged healing times and increasing costs.
  • Further complicating wound measurement is the need for techniques that can be readily deployed among the thousands of nurses who visit patients in the home environment, which is the most common site of wound care. Mobile-device-based techniques of wound measurement may provide a remedy for these practices.
  • Referring to FIG. 27, the skin of a patient may include a wound therein, such as on their lower arm as an example. The wound may vary in its shape and size depending on the patient and the particular type of wound. To characterize the size and shape of the wound, a QR code may be affixed with an adhesive or otherwise placed in contact with the skin of the patient proximate the wound. The QR code may include size related identification information therein, as previously described. With the QR code located proximate the wound, together with information included therein regarding the scale of the QR code, the QR code may act as a reference scale for the wound. In this manner, the scale of the captured image may be determined and, as a result, the scale of the wound in the particular captured image may likewise be determined. By way of example, if the QR code is decoded to indicate that it is 1 inch by 1 inch in size, and the wound has the same height as the QR code and twice the length of the QR code, then the wound would be identified as being 1 inch by 2 inches in size.
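  • A hedged numeric sketch of using the decoded QR code as the reference scale; the pixel measurements are hypothetical, and the wound and QR code are assumed to lie in substantially the same plane.

```python
def wound_size_mm(qr_side_mm, qr_side_px, wound_length_px, wound_width_px):
    """Use the decoded physical size of the QR code as the reference scale
    for a wound measured in pixels in the same image."""
    mm_per_px = qr_side_mm / qr_side_px
    return wound_length_px * mm_per_px, wound_width_px * mm_per_px

# A QR code decoded as 25.4 mm (1 inch) square and imaged 180 px wide; a wound
# spanning 360 x 180 px then measures about 50.8 x 25.4 mm (2 x 1 inches).
print(wound_size_mm(25.4, 180, 360, 180))
```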
  • Referring to FIG. 28, as noted the QR code is preferably positioned near the wound on the skin of the patient 500, and the QR code is read and decoded to determine the scale of the QR code 510. The edges of the wound proximate the QR code are identified 520. The edges of the wound may be identified in any suitable manner. One technique to identify the edges of the wound is for the technician to use their finger to manually trace around the edges of the wound. Another technique to identify the edges of the wound is for the system to use edge identification and/or segmentation and/or color based discrimination techniques to identify the edges of the wound. For example, the identification of the edges of the wound may be fully automatic. For example, the identification of the edges of the wound may be further based upon the user selecting a generally central region of the wound to assist in a more robust identification of the wound. For example, the identification of the edges of the wound may be based upon defining a bounding box around the wound. For example, the identification of the edges of the wound may be based upon a technician's manual outlining of the wound. If desired, machine learning techniques may be used to assist in the automatic determination of the edges of the wound. For example, the technique may include the technician defining the general bounding region of the wound and the subsequent automatic refinement of the bounding region to more accurately identify the actual edges of the wound.
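  • As one concrete but non-limiting way the bounding-box-plus-automatic-refinement approach could be realised, the sketch below uses OpenCV's GrabCut segmentation; the box coordinates are illustrative, and a machine learning model could replace this step entirely.

```python
import cv2
import numpy as np

def refine_wound_mask(image_bgr, bounding_box, iterations=5):
    """Refine a rough technician-drawn bounding box (x, y, width, height)
    into a per-pixel wound mask using GrabCut colour segmentation."""
    mask = np.zeros(image_bgr.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, bounding_box, bgd_model, fgd_model,
                iterations, cv2.GC_INIT_WITH_RECT)
    # Definite and probable foreground pixels become the wound mask.
    wound = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(wound, 255, 0).astype(np.uint8)

# Illustrative call with a hypothetical box around the wound:
# wound_mask = refine_wound_mask(cv2.imread("wound.jpg"), (120, 90, 300, 260))
```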
  • With the boundary of the wound being identified 520, wound size characteristics may be determined 530. The wound size characteristics may be any suitable characteristic. One characteristic may be, for example, the maximum width and the maximum length of the wound. Another characteristic may be, for example, the total area of the wound. Another characteristic may be, for example, the elliptical characteristics of the wound. Another characteristic may be, for example, a measure of undetermined areas of the wound such as cut outs, tunnels, and cavities. Another characteristic may be, for example, the symmetry of the wound. If desired, the system may include a boundary refinement process 540 to permit the technician to modify the boundary to more accurately represent the wound which is then used to characterize the wound.
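  • A short sketch of deriving such size characteristics from a binary wound mask, assuming the millimetres-per-pixel scale has already been obtained from the QR code; the characteristic set shown (area plus the rotated-bounding-box length and width) is only one of the possibilities listed above.

```python
import cv2

def wound_characteristics(wound_mask, mm_per_px):
    """Compute area and maximum length/width from a binary wound mask
    (uint8, non-zero inside the wound boundary)."""
    contours, _ = cv2.findContours(wound_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)          # largest region
    area_mm2 = cv2.contourArea(contour) * mm_per_px ** 2
    (_, _), (w, h), _ = cv2.minAreaRect(contour)          # rotated bounding box
    return {"area_mm2": area_mm2,
            "length_mm": max(w, h) * mm_per_px,
            "width_mm": min(w, h) * mm_per_px}
```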
  • The progression of a wound's color over time is an indicator of the wound's response to treatment. With images captured using different devices, at different times of day, under different lighting conditions, the apparent color sensed by the imaging device tends to dramatically vary. Further, the color presented by the imaging device also tends to dramatically vary based upon the time of day, light conditions, and the characteristics of the particular device. To provide a temporal review and/or characteristics and/or analysis of a series of images of the progression of a wound, it is desirable that the colors sensed by the mobile device are represented generally consistently across the series of images.
  • Referring to FIG. 29, rather than require the technician to use the same mobile device under the same lighting conditions at the same time of day to obtain a series of images over time, it is preferable to incorporate color indicators such as color based markers within the bounds of the QR code, and/or colored portions of the QR code itself, and/or color based markers proximate the QR code such as around the periphery of the QR code. The color indicators may be used to determine a color calibration matrix, such as C = argmin_C Σ_i |I_i − C(I_i)|², where C is a 3×3 color calibration matrix and I_i is the i-th pixel in the image I.
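  • One common way to estimate such a 3×3 matrix, offered here only as a hedged sketch, is a least-squares fit of the measured colours of the calibration patches to their known reference colours; the patch values and the choice of a plain linear fit (rather than the exact formulation above) are assumptions for illustration.

```python
import numpy as np

def fit_color_matrix(measured_rgb, reference_rgb):
    """Least-squares 3x3 matrix C minimising sum_i |r_i - C m_i|^2, where m_i
    are measured patch colours and r_i the corresponding reference colours."""
    M = np.asarray(measured_rgb, dtype=float)    # shape (n_patches, 3)
    R = np.asarray(reference_rgb, dtype=float)   # shape (n_patches, 3)
    X, *_ = np.linalg.lstsq(M, R, rcond=None)    # solves M @ X ~= R
    return X.T                                   # corrected colour = C @ measured

def apply_color_matrix(image_rgb, C):
    """Apply the calibration matrix to every pixel of an RGB image."""
    flat = image_rgb.reshape(-1, 3).astype(float)
    corrected = flat @ C.T
    return np.clip(corrected, 0, 255).reshape(image_rgb.shape).astype(np.uint8)
```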
  • Referring to FIG. 30, an example of three different original images is illustrated, each of which is of the same wound area under different lighting conditions, together with the resulting calibrated color images. In this manner, a plurality of different images, taken on the same or different mobile devices, may be modified to have more color consistency. Preferably, the images from the mobile device(s) are provided to a network based computing device, where the color correction is performed at the network based computing device, and the images are available from the network based computing device upon request. The color adjusted image may be used for wound characterization.
  • In many situations, the accuracy of the metrological analysis of a captured image is based, at least in part, upon the angle between the camera focal axis and the normal vector of the wound surface, with alignment along the normal vector of the wound surface being preferred. Each of the captured images should be modified, as desired, to provide a wound surface that appears more consistent with the normal vector of the wound. Rather than attempting to use the characteristics of the patient to determine such a normal vector of the wound, the system preferably uses the detected QR code to determine such a normal vector of the wound.
  • Referring to FIG. 31, preferably, the QR code includes square boxes and rectangular regions, which when viewed off perpendicular tend to appear to be non-square and non-rectangular. Accordingly, a transformation may be determined such that the non-square boxes and/or non-rectangular regions of the QR code are transformed to have the appearance of being square boxes and/or rectangular regions, which would likewise be applied to the other regions of the captured image, including the wound. By way of example, a homography matrix may be applied to a projective transformation of the captured image of a three-dimensional plane to obtain a substantially orthographic view of the wound. Moreover, the wound may be characterized at a non-normal orientation, if desired.
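  • A hedged sketch of that rectification step, assuming the four QR corners have already been detected in a consistent order (top-left, top-right, bottom-right, bottom-left) and that the decoded physical QR size is known; the output resolution and the choice to keep the original canvas size are illustrative.

```python
import cv2
import numpy as np

def rectify_from_qr(image, qr_corners_px, qr_side_mm, out_px_per_mm=10):
    """Map the detected QR corners onto a true square via a homography and
    warp the whole image toward an approximately orthographic view."""
    side_px = qr_side_mm * out_px_per_mm
    src = np.asarray(qr_corners_px, dtype=np.float32)          # 4 x 2, detected
    dst = np.array([[0, 0], [side_px, 0],
                    [side_px, side_px], [0, side_px]], dtype=np.float32)
    H, _ = cv2.findHomography(src, dst)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))   # canvas size kept for simplicity
```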
  • Furthermore, using two or more images, especially those taken from different angular orientations, three-dimensional characteristics of a wound may be determined. For example, the three-dimensional characteristic may be the depth of the wound and/or the volume of the wound. The different angular orientations may be determined, for example, by determining the angular orientation using the projective transformations of the QR code of the captured images. As previously described, the QR code may be used to determine the scale of each of the images. By way of example, wound depth may include 1st degree epidermal loss (epidermis); 2nd degree partial thickness (superficial loss) dermis (papillary layer); 3rd degree full thickness hypodermis; 4th degree full thickness muscles, tendons, bone.
  • It is to be understood that the calibration pattern may have other configurations. By way of example, the calibration pattern may be a rectangular pattern. By way of example, the calibration pattern may be a set of concentric circles with the pattern encoded within the concentricity of the circles. By way of example, the calibration pattern may be a circular shape with different color and/or patterned slices therein, such as each having a pie shape. By way of example, the calibration pattern may be any other type of geometrical shape. By way of example, the calibration pattern may be a one-dimensionally encoded pattern or a two-dimensionally encoded pattern.
  • Medical record mistakes account for a significant health issue, one that can result in the death of patients. For example, a patient in cardiac arrest may be mistakenly not resuscitated because clinicians confused him with a patient who had a do-not-resuscitate order on file. For example, a patient may be provided an okay to undergo surgery based on a different patient's records and be found dead in his hospital room the next day. Such patient-identification mix-ups are common and can have deadly consequences. Even with the addition of safety initiatives to make improvements, opportunities for ID mix-ups are still increasing as health care becomes more complex. In many cases, a patient's wristband can be wrong, can be missing, can be illegible, or can be simply not checked. Accordingly, health-care facilities should adopt an improved standardized protocol to verify patient identities.
  • One technique to standardize verification of patient identities is to standardize how the patient's names are displayed. Another technique is to include bar codes on all medications and the wrists of the patient. With the abundance of similar names and the difficulty of attaching bar codes to all tests and procedures, these techniques have limited viability.
  • Referring to FIG. 32, the patient preferably has a wrist band that includes a QR code affixed to it or otherwise printed on the wrist band 600. Other patient identification items may likewise be used, such as bracelets, necklaces, identification tags, etc. A unique patient identification is encoded within the two-dimensional QR code. In this manner, the identity of the patient may be obtained by capturing an image of the QR code on the wrist band using an imaging device 610, locating and decoding the QR code that is captured in the image 620, and using the unique patient identification decoded from the QR code to identify the patient 630. The identification of the patient may be based upon information stored on the imaging device or otherwise obtained from a network based computing device 640. For example, the data may be stored in a local or a network based database on a storage device.
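  • As an illustrative sketch only, the identification steps 610-640 could look like the following; the database name, table, and columns are hypothetical, and a network lookup would replace the local SQL query in a networked deployment.

```python
import cv2
import sqlite3

def identify_patient(image_path, db_path="patients.db"):
    """Decode the wrist-band QR code in an image and look the encoded unique
    patient identification up in a local patient database."""
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(cv2.imread(image_path))
    if not data:
        return None                                # QR code not found/decoded
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT name, photo_path FROM patients WHERE patient_id = ?",
            (data,)).fetchone()
```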
  • Referring to FIG. 33, the QR code with the unique patient identification encoded therein may be included as a page of peel off stickers 700, each of which includes the same information encoded within the QR code. The QR code with the unique patient identification may be removed from the page, such as by peeling off the sticker, and located on different items associated with the patient. For example, a QR code may be placed on each of the patient's medicine bottles 710. For example, a QR code may be placed on the patient's chart 720. For example, a QR code may be placed on the patient's IV bags, which are connected to an IV line for the patient 730. Furthermore, the unique patient identification may be integrated into a care provider's electronic medical records system 740, so that they are synchronized together. In addition, the QR code may be used as an identification to retrieve a picture of the patient 750, which may be presented together with any other suitable information, on a display to further verify the identification of the particular patient.
  • The measurement of objects within an image may be based upon the use of a two-dimensional calibration pattern within the field of view of the image being captured. The image may be captured with any suitable device, such as a camera, a video device, or a camera included within a mobile phone. To more accurately measure a wound on the body of the patient, the calibration pattern is preferably located at the same, or substantially the same, plane as the wound. In such a case, the calibration pattern is preferably affixed to the body of the patient or otherwise in face-to-face contact with the body of the patient. While a relatively flat surface of the patient, such as the back, provides a surface suitable for the calibration pattern and the wound being in a substantially co-planar arrangement, other surfaces of the patient, such as the elbow, ankle, wrist, head, leg, and arm, present relatively curved surfaces, so the wound is in a substantially curved orientation and the calibration pattern, in face-to-face contact with the body of the patient at a proximate location, is likewise in a substantially curved orientation. Accordingly, the substantially curved orientation tends to add a third dimensional component to the captured two-dimensional image.
  • While the entire wound of the patient may be captured in an image together with the calibration pattern, a scale determination at a single plane consistent with the calibration pattern tends to result in substantial errors when applied to other areas of the wound sufficiently spaced apart from the general plane of the calibration pattern. By way of example, measuring a skin lesion's shape and size (length, width, and/or area) on a curved region of the patient results in a skin surface shape that is similarly curved. By way of example, such a skin lesion shaped area would be a wound around the leg of a patient. The leg is cylindrically shaped and thus not especially suitable for a skin lesion wrapped partially around the leg being calibrated with a single surface plane with a sufficiently small measurement error.
  • Referring to FIG. 34, a suitable calibration structure for a sufficiently non-planar surface preferably includes a two-part calibration pattern. A first calibration pattern is used to measure the skin lesion in the image, and a second calibration pattern is used to calibrate the substantially non-planar image into a single plane. That is, the second calibration pattern is used to effectively “unwrap” the captured image into a single plane, such as by using a homography matrix applied to a projective transformation of the captured image of a three-dimensional plane to obtain a substantially orthographic view of the wound, with the first calibration pattern then being used to calibrate the measured skin lesion of the unwrapped image. Alternatively, a single calibration structure may be used to both unwrap and measure the wound of a captured image.
  • In some embodiments the skin lesion is not suitable for being imaged using a single captured image. In this case, referring to FIG. 35, a first image captures a portion of the skin lesion that includes first and second calibration structures. In this case, referring to FIG. 36, a second image captures another portion of the skin lesion that includes first and second calibration structures. Additional images of the skin lesion may be captured, as desired. Multiple captured images of the same skin lesion may be separately and/or jointly calibrated and unwrapped to provide a sufficiently accurate measure of the skin lesion.
  • Referring to FIG. 37, an exemplary calibration structure is illustrated that includes both a first calibration pattern and a second calibration pattern.
  • For example, the second calibration structure may be an elongate strip with parallel edges. The curvature of the skin lesion may be determined based upon the resulting distortion in the parallel edges appearing to be non-parallel in the captured image. For example, the second calibration structure may be an elongate strip with parallel markings thereon.
  • For example, the first calibration structure may be a two-dimensional pattern as previously described. For example, the first calibration pattern may be a one-dimensional pattern as previously described. By way of example, the one-dimensional pattern may define the surface scale in a single direction. For example, the distance between the parallel series of lines illustrated in FIG. 37 may define the mapping of the surface in a first direction. For example, the length of the parallel series of lines illustrated in FIG. 37 may define the mapping of the surface in a second direction.
  • Preferably, the first and second calibration structures may be integrated into a single structure, such as an elongate strip with parallel edges together with a two-dimensional calibration pattern included thereon. Alternatively, the first and second calibration structures may be included on two separate physical structures not affixed to one another. Alternatively, the first and second calibration structures may be included on two separate physical structures that are affixed to one another. Preferably, the first and second calibration structures are included in the captured image(s) obtained of the skin lesion. In some cases, an integrated strip may include a plurality of two-dimensional calibration patterns included thereon at spaced apart locations, which is especially suitable for curved surfaces. In some cases, preferably the first and second calibration patterns are consistent with one another if multiple such first and/or second calibration patterns are used for imaging a single wound in a set of images.
  • As it may be observed, the imaging device may be at any angle relative to the skin lesion, while still obtaining a relatively accurate measurement, which simplifies the image capture process for the user.
  • An exemplary technique for calibrating a two-dimensional image on a curved surface, where the full surface of the wound is captured in a single image, may be as follows (a simplified numeric sketch of the unwrapping idea appears after the list):
  • (1) a first calibration pattern is placed on a generally planar portion of the surface of the skin;
  • (2) a second calibration pattern is placed on the skin starting or passing through the first calibration pattern, covering as much of the surface as possible;
  • (3) Referring to FIG. 38, an image is captured of the surface focusing on the first calibration pattern;
  • (4) the image is calibrated based upon the first calibration pattern;
  • (5) based upon the image scale from the first calibration pattern, calculate the size changes in the second calibration pattern. On a concave surface, as the calibration pattern is further away from the focus plane of the camera, the calibration pattern shrinks in size;
  • (6) based upon the size changes in the second calibration pattern calculate the three-dimensional surface model of the three-dimensional skin surface;
  • (7) referring to FIG. 39, extract the desired object from the image using image segmentation techniques;
  • (8) referring to FIG. 40, the segmented image (or the entire captured image) is unwrapped based upon the three-dimensional surface model;
  • (9) the size of the wound is measured based upon the unwrapped image, based upon the scale data from the first calibration pattern (or second calibration pattern).
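  • The sketch below illustrates, in one dimension only, the idea behind steps (5)-(9): the known physical spacing between successive marks of the second calibration pattern is spread over the (foreshortened) pixel gap each mark pair occupies, giving a piecewise local scale from which a span measured in pixels is converted to surface arc length. The mark positions and spacing are hypothetical, and a full implementation would build the three-dimensional surface model rather than this one-dimensional simplification.

```python
def arc_length_between(u0_px, u1_px, mark_positions_px, mark_spacing_mm):
    """Convert a pixel span along the strip direction into surface arc length,
    using the known spacing between strip marks as a piecewise-local scale
    (mark_positions_px must be sorted in ascending order)."""
    u0, u1 = sorted((u0_px, u1_px))
    length = 0.0
    for a, b in zip(mark_positions_px, mark_positions_px[1:]):
        lo, hi = max(a, u0), min(b, u1)
        if hi > lo:
            # Within each gap the known spacing is spread over (b - a) pixels,
            # so curved regions (smaller gaps) contribute more length per pixel.
            length += (hi - lo) / (b - a) * mark_spacing_mm
    return length

# Marks placed every 10 mm appear closer together where the limb curves away.
marks = [100, 160, 210, 250, 282]                   # hypothetical pixel positions
print(arc_length_between(120, 270, marks, 10.0))    # wound extent, in mm
```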
  • An exemplary technique for calibrating a two-dimensional image on a curved surface, where the full surface of the wound is captured in multiple images, may be as follows:
  • (1) a first calibration pattern is placed on a generally planar portion of the surface of the skin;
  • (2) a second calibration pattern is placed on the skin starting or passing through the first calibration pattern, covering as much of the surface as possible;
  • (3) an image is captured of the surface focusing on the first calibration pattern;
  • (4) the image is calibrated based upon the first calibration pattern;
  • (5) the image scale from the first calibration pattern is used to calculate the size changes in the second calibration pattern. On a concave surface, as the calibration pattern is further away from the focus plane of the camera, the calibration pattern shrinks in size;
  • (6) based upon the size changes in the second calibration pattern calculate the three-dimensional surface model of the three-dimensional skin surface;
  • (7) extract the desired object from the image using image segmentation techniques;
  • (8) the segmented image (or the entire captured image) is unwrapped based upon the three-dimensional surface model;
  • (9) this process is repeated for one or more additional images to capture the remaining portions of the wound;
  • (10) the size of the wound is measured based upon the unwrapped image, based upon the scale data from the first calibration pattern(s) (or second calibration pattern(s)) together with stitching the images together.
  • It is preferable in some cases that a consistent technique is used for wound assessment in measuring the wound. One technique is linear measurement, generally referred to as a “clock” technique, where a measurement of the longest length is determined, a measurement of the greatest width is determined, and the greatest depth of the wound is determined, using the body as the face of an imaginary clock. The head may be 12 o'clock, and the feet may be 6 o'clock.
  • The “clock” technique may be achieved using the calibration pattern(s) by positioning the calibration pattern(s) in specific directions. For example, the corner dots and color patterns may define the orientation of the calibration target with respect to the body. In the case of the strip, the strip orientation may be used to define the orientation of the calibration target with respect to the body. When the two-dimensional pattern or strip is captured in an image, the system knows the orientation of the two-dimensional pattern or strip with respect to the axis of the body. Once the patient's body axis is known, then consistent measurement in accordance with the “clock” technique may be achieved.
  • While the capturing and processing of the wound provides substantial benefits to the care of the patient, the classification of the wound remains inconsistent among different care givers, and even for the same care giver. For example, a first care giver may capture an image of a wound and classify it in a first manner, while a second care giver may capture an image of the same wound at generally the same time and classify it in a different manner. In addition, a care giver may capture an image of a first wound of a first patient and classify it in a first manner, while the same care giver may capture an image of a second wound of a second patient and classify it in a second manner, even though the first wound and the second wound should more properly be classified in the same manner. This inconsistency in the classification of the wound decreases the effectiveness of the care of the patient.
  • An assisted wound classification, partially automated wound classification, and/or fully automated wound classification provides a substantial benefit to the care of the patient. Wounds come in all shapes and in all sizes, so it is desirable to include standardized criteria to classify each wound to assist the care giver in providing more holistic and seamless care for the patients. This is especially the case if multiple care givers are providing various aspects of wound care management for the same patient. While a metric for standardization may exist in some fashion, differences in interpretation of such a metric and differences in training on classifications using such a metric still result in substantial variations across care givers and across the different times at which such classification occurs.
  • By way of example, complex and prevalent wound etiologies and their corresponding stage classifications may be as follows.
  • Pressure ulcer: pre-stage 1; stage 1; stage 2; stage 3; stage 4; DTPI; unstageable.
  • Burn: 1st degree; 2nd degree; 3rd degree; 4th degree.
  • Diabetic ulcer: Wagner 1; Wagner 2; Wagner 3; Wagner 4.
  • Other: superficial; partial thickness; full thickness.
  • Such classifications are primarily based on the wound severity and the depth of tissue exposure. A deeper wound and/or greater exposure of the inner body tissue structure tends to result in a higher-severity stage classification. For example, a pressure ulcer at stage 4 is associated with a wound having tendon/bone/cartilage exposure. Additional factors tend to be incorporated into wound stage classification, such as a historically higher-severity classification, which tends to persist until the closure of the wound.
  • Referring to FIG. 41, a digital wound automated staging process 4100 is illustrated. The light, exposure, and image capture device tend to vary between different environments when capturing an image of a wound. To reduce the variability in image capture conditions, the staging process 4100 may include placing a color calibration pattern next to a wound that is to be captured in an image 4110. An image capture device captures an image of the wound and the calibration pattern 4120. With an image that includes both the calibration pattern and the wound, the system then calibrates the wound based upon the calibration pattern 4130. In this manner, the captured image of the wound is modified in a manner that is more consistent among various image capture conditions. The system detects the wound boundaries within the image, either in a manual or automated manner 4140. Either manually or by auto-detection, the system classifies the exposed wound tissue type based on a pre-trained algorithm which considers tissue color and other image characteristics 4150. In addition, the image processing may be based upon artificial intelligence processing, in which the system is trained using labeled images and thus, when presented with a newly captured image, may classify it based upon the training. Beyond tissue color detection, other imaging techniques may be applied to detect tissue type by color, texture, and composition analysis of the wound bed. The system may then perform a wound stage classification based upon pertinent factors 4160. The factors may include, for example, assigned wound etiology, detected exposed tissue types, colors, and characteristics, wound staging history, and/or wound location. The resulting classification may be presented to a care giver for verification, if desired, and for consideration of the continued care plan.
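  • A deliberately simplified sketch of the tissue-type step 4150, assigning each wound pixel to the nearest of a few illustrative reference colours (red granulation, yellow slough, dark necrotic tissue); in practice a pre-trained classifier or other learned model would replace these hand-picked reference values.

```python
import numpy as np

# Illustrative reference colours (RGB) for common wound-bed tissue types.
TISSUE_REFERENCES = {
    "granulation (red)": (180, 60, 60),
    "slough (yellow)": (200, 180, 90),
    "necrotic (dark)": (40, 30, 30),
}

def tissue_composition(image_rgb, wound_mask):
    """Label each wound pixel with the nearest reference tissue colour and
    report the percentage of each tissue type inside the wound boundary."""
    pixels = image_rgb[wound_mask > 0].astype(float)            # N x 3
    refs = np.array(list(TISSUE_REFERENCES.values()), dtype=float)
    dists = np.linalg.norm(pixels[:, None, :] - refs[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)
    names = list(TISSUE_REFERENCES.keys())
    return {name: float(np.mean(labels == i) * 100) for i, name in enumerate(names)}
```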
  • Referring to FIG. 42, an exemplary set of wound classifications that may be selected by a care giver, are illustrated. In this manner, the care giver may obtain an image of a wound and compare that image against the provided selections, to most accurately classify the wound. Also, the system may be fully automated by using a set of images for classification and image analysis of the wound for classification. The images and descriptions provide information to assist the care giver in the classification. The wound classifications are associated with the wound and saved in the management system.
  • All the references cited herein are incorporated by reference.
  • The terms and expressions which have been employed in the foregoing specification are used therein as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding equivalents of the features shown and described or portions thereof, it being recognized that the scope of the invention is defined and limited only by the claims which follow.

Claims (16)

I/we claim:
1. A method for making a measurement comprising:
(a) capturing a digital image, the captured digital image including both,
(i) a wound associated with a patient,
(ii) a color calibration pattern;
(b) modifying the color characteristics of the captured wound based upon the color calibration pattern;
(c) categorizing said modified captured image into one of a plurality of different wound categories based upon wound severity.
2. The method of claim 1 wherein said modifying includes adjusting for exposure.
3. The method of claim 1 wherein said modifying includes adjusting for light.
4. The method of claim 1 wherein said modifying includes adjusting for the image capture device.
5. The method of claim 1 further comprising determining boundaries of said wound.
6. The method of claim 1 further comprising determining boundaries of said modified wound.
7. The method of claim 1 wherein said categorizing is based upon tissue color.
8. The method of claim 1 wherein said categorizing is based upon etiology.
9. The method of claim 1 wherein said categorizing is based upon wound staging history.
10. The method of claim 1 wherein said categorizing is based upon wound location.
11. The method of claim 1 wherein said categorizing is based upon comparing the captured wound image to images of previously classified wounds.
12. The method of claim 1 wherein said categorizing is based upon wound depth.
13. The method of claim 1 wherein said categorizing is based upon tissue color.
14. The method of claim 1 wherein said categorizing is based upon tissue texture.
15. The method of claim 1 wherein said categorizing is based upon tissue composition.
16. The method of claim 1 wherein said categorizing is based upon an artificial intelligence trained network.
US17/195,323 2012-10-09 2021-03-08 Wound characterization system Abandoned US20210366156A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/195,323 US20210366156A1 (en) 2012-10-09 2021-03-08 Wound characterization system

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
US201261795013P 2012-10-09 2012-10-09
US13/720,260 US9410827B2 (en) 2012-10-09 2012-12-19 Measurement using a calibration pattern
US13/791,987 US9672623B2 (en) 2012-10-09 2013-03-09 Image calibration
US201662422987P 2016-11-16 2016-11-16
US201662422999P 2016-11-16 2016-11-16
US201662422966P 2016-11-16 2016-11-16
US15/587,891 US9989952B2 (en) 2012-10-09 2017-05-05 Image calibration
US15/807,705 US10417785B2 (en) 2016-11-16 2017-11-09 Image calibration for skin lesions
US15/807,693 US10298780B2 (en) 2016-11-16 2017-11-09 Long range image calibration
US15/807,713 US10565735B2 (en) 2016-11-16 2017-11-09 Image calibration patient identification
US201762708867P 2017-12-27 2017-12-27
US15/994,895 US10943366B2 (en) 2012-10-09 2018-05-31 Wound characterization of a patient
US17/195,323 US20210366156A1 (en) 2012-10-09 2021-03-08 Wound characterization system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/994,895 Continuation US10943366B2 (en) 2012-10-09 2018-05-31 Wound characterization of a patient

Publications (1)

Publication Number Publication Date
US20210366156A1 true US20210366156A1 (en) 2021-11-25

Family

ID=64271866

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/994,895 Active 2033-02-08 US10943366B2 (en) 2012-10-09 2018-05-31 Wound characterization of a patient
US17/195,323 Abandoned US20210366156A1 (en) 2012-10-09 2021-03-08 Wound characterization system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/994,895 Active 2033-02-08 US10943366B2 (en) 2012-10-09 2018-05-31 Wound characterization of a patient

Country Status (1)

Country Link
US (2) US10943366B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11106957B1 (en) * 2020-03-20 2021-08-31 Hitachi, Ltd. System and method to track pipes using tags based on high redundancy codes
BE1027624B1 (en) * 2021-02-25 2022-05-09 Dermatoo Image reconstruction device, method and system for monitoring a medical or cosmetic skin condition
WO2022221237A2 (en) * 2021-04-13 2022-10-20 Lawrence Paulette Centralized system for vaccination verification, inventory management, and analysis

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000293602A (en) * 1999-04-08 2000-10-20 Yoshiro Mizuno Electronic chart system
US20090137908A1 (en) * 2007-11-26 2009-05-28 Patwardhan Sachin V Multi-spectral tissue imaging
WO2015030680A1 (en) * 2013-08-28 2015-03-05 Tagbio Pte Ltd System and method for extracting personal information for emergency situation

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7728845B2 (en) 1996-02-26 2010-06-01 Rah Color Technologies Llc Color calibration of color image rendering devices
US6118845A (en) 1998-06-29 2000-09-12 Surgical Navigation Technologies, Inc. System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
EP1176557A1 (en) 2000-07-24 2002-01-30 Setrix AG Method and arrangement for camera calibration
US6985270B1 (en) 2000-08-09 2006-01-10 Eastman Kodak Company Method and photographic element for calibrating digital images
GB2372392A (en) 2001-02-16 2002-08-21 Hewlett Packard Co Improvements to image capture
US7262797B2 (en) 2001-02-22 2007-08-28 Ge Inspection Technologies Lp Method and system for storing calibration data within image files
TW560175B (en) 2001-12-31 2003-11-01 Way Tech Dev Inc Device and method for image color correction
US7248284B2 (en) 2002-08-12 2007-07-24 Edward Alan Pierce Calibration targets for digital cameras and methods of using same
US6845912B2 (en) 2002-12-12 2005-01-25 Thomas Whittier Scannell Bar code for measuring distance and position
US7327390B2 (en) 2003-02-04 2008-02-05 Eastman Kodak Company Method for determining image correction parameters
EP2282510A3 (en) 2004-05-05 2011-03-23 LaserSoft Imaging AG Calibration of imaging devices for minimizing individual color reproducing errors of such devices
US20060222263A1 (en) 2005-04-04 2006-10-05 Carlson Eric A Linear measurement machine-readable medium, method and system
WO2007007559A1 (en) 2005-07-12 2007-01-18 Nikon Corporation Projection device with display monitor
US7616796B2 (en) 2005-08-15 2009-11-10 Apteryx, Inc. System and method for sensor device manufacturer oriented calibration
US7600675B2 (en) 2005-12-28 2009-10-13 Compucredit Intellectual Property Holdings Corp. Ii Method for providing financial instruments to customers of a service provider
GB2436213B8 (en) 2006-03-13 2013-02-13 Clemex Technologies Inc System and method for automatic measurements and calibration of computerized magnifying instruments
US8139117B2 (en) 2006-04-21 2012-03-20 Sick, Inc. Image quality analysis with test pattern
FI123049B (en) 2007-09-03 2012-10-15 Mapvision Ltd Oy Recording Machine Vision System
US8229210B2 (en) 2008-04-02 2012-07-24 Bindu Rama Rao Mobile device with color detection capabilities
TWM354776U (en) 2008-08-06 2009-04-11 Wistron Corp Composite image device and related portable electronic device
US8229228B2 (en) 2008-09-16 2012-07-24 Robert Bosch Gmbh Image analysis using a pre-calibrated pattern of radiation
WO2010045271A1 (en) 2008-10-14 2010-04-22 Joshua Victor Aller Target and method of detecting, identifying, and determining 3-d pose of the target
TWI401409B (en) 2009-12-29 2013-07-11 Avermedia Information Inc Document camera with device size estimation function
US20110179624A1 (en) 2010-01-26 2011-07-28 Z-Line Designs, Inc. Animated assembly system
US8550903B2 (en) 2010-11-15 2013-10-08 Bally Gaming, Inc. System and method for bonus gaming using a mobile device
US9568425B2 (en) 2011-02-01 2017-02-14 Dna Medicine Institute, Inc. Multicoded analytical nanostrips
GB201107385D0 (en) 2011-05-04 2011-06-15 Materialise Nv Medical imaging calibration device
US8836703B2 (en) 2011-09-20 2014-09-16 General Electric Company Systems and methods for accurate measurement with a mobile device
US20130098788A1 (en) 2011-10-19 2013-04-25 Patrick McCarville Cell Phone Case
WO2013059599A1 (en) 2011-10-19 2013-04-25 The Regents Of The University Of California Image-based measurement tools
WO2013096766A2 (en) * 2011-12-21 2013-06-27 Shachaf Catherine M System for imaging lesions aligning tissue surfaces
US8731247B2 (en) 2012-01-20 2014-05-20 Geodigital International Inc. Densifying and colorizing point cloud representation of physical surface using image data
US20130259403A1 (en) 2012-04-03 2013-10-03 Oluwatosin Osinusi Flexible easy-to-use system and method of automatically inserting a photorealistic view of a two or three dimensional object into an image using a cd,dvd or blu-ray disc
US8698901B2 (en) 2012-04-19 2014-04-15 Hewlett-Packard Development Company, L.P. Automatic calibration
US9410827B2 (en) 2012-10-09 2016-08-09 Pixameter Corp. Measurement using a calibration pattern
WO2016149632A1 (en) * 2015-03-18 2016-09-22 Bio1 Systems, Llc Digital wound assessment device and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000293602A (en) * 1999-04-08 2000-10-20 Yoshiro Mizuno Electronic chart system
US20090137908A1 (en) * 2007-11-26 2009-05-28 Patwardhan Sachin V Multi-spectral tissue imaging
WO2015030680A1 (en) * 2013-08-28 2015-03-05 Tagbio Pte Ltd System and method for extracting personal information for emergency situation

Also Published As

Publication number Publication date
US20180336703A1 (en) 2018-11-22
US10943366B2 (en) 2021-03-09

Similar Documents

Publication Publication Date Title
US10565735B2 (en) Image calibration patient identification
US20210366156A1 (en) Wound characterization system
US10417785B2 (en) Image calibration for skin lesions
US11676705B2 (en) Tracking wound healing progress using remote image analysis
US10991096B2 (en) Utilizing personal communications devices for medical testing
US11727547B2 (en) Using patient generated image data to update electronic medical records
US9989952B2 (en) Image calibration
US7657101B2 (en) Devices and methods for identifying and monitoring changes of a suspect area on a patient
US10298780B2 (en) Long range image calibration
US20100165090A1 (en) Frontal Hand Capture of Fingerprints, Palm Prints and Hand Geometry Using Contactless Photography
CN106164929A (en) Method and system for Wound evaluation Yu management
Lee et al. Validation of stereophotogrammetry of the human torso
JP2016523405A (en) System and method using mark analysis in tablet identification
US20140354830A1 (en) System and method for adding scale to photographic images
CA3013602C (en) Method of monitoring medication regimen with portable apparatus
TWI619093B (en) Visual positioning apparatus, method, and computer program product thereof
JP2015191531A (en) Determination method of spatial position of two-dimensional code, and device therefor
CN219878144U (en) Wound measuring device
CN116645330A (en) Scoliosis detection method and device, computer equipment and storage medium
CN108875557A (en) Method, apparatus, system and the computer storage medium veritified for the testimony of a witness

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXAMETER CORP., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GHAZIZADEH, MANSOOR;REEL/FRAME:057146/0307

Effective date: 20181016

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED