CN115810039A - Portable electronic device and wound size measuring method

Info

Publication number: CN115810039A
Application number: CN202111254232.6A
Authority: CN (China)
Prior art keywords: wound, reference object, image, actual, electronic device
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 胡文芯, 杨继仪, 林哲渝, 谢惠琪, 林英期, 黄启伦
Current Assignee: Wistron Corp
Original Assignee: Wistron Corp
Application filed by Wistron Corp
Publication of CN115810039A

Classifications

    • A61B5/1072 Measuring distances on the body, e.g. measuring length, height or thickness
    • A61B5/1075 Measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • A61B5/1079 Measuring physical dimensions using optical or photographic means
    • A61B5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A61B5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • G06F18/23 Clustering techniques
    • G06N3/08 Learning methods (neural networks)
    • G06T7/0012 Biomedical image inspection
    • G06T7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T7/11 Region-based segmentation
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06V10/225 Image preprocessing based on a marking or identifier characterising the area
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/95 Hardware or software architectures for image or video understanding structured as a network, e.g. client-server architectures
    • G16H30/40 ICT specially adapted for processing medical images, e.g. editing
    • H04N17/002 Diagnosis, testing or measuring for television cameras
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/80 Camera processing pipelines; components thereof
    • G06T2207/10024 Color image
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20081 Training; learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30088 Skin; dermal
    • G06T2207/30096 Tumor; lesion
    • G06T2207/30244 Camera pose


Abstract

A wound size measuring method includes: obtaining an input image using a camera device of a portable electronic device; identifying the input image using a convolutional neural network model and selecting the portion of the input image with the highest probability of being a wound as an output wound image; and calculating the actual height and the actual width of the wound area in the output wound image according to a lens focal length parameter reported by the operating system, a plurality of reference calibration parameters corresponding to the horizontal pitch angle of the portable electronic device, and the pixel height ratio and the pixel width ratio of the output wound image.

Description

Portable electronic device and wound size measuring method
Technical Field
Embodiments of the present invention relate to image processing, and more particularly, to a portable electronic device and a method for measuring a wound size.
Background
Hospitals today face several problems in wound care. For example, wound types are diverse and clinical staff are divided into different specialties, so wound records and databases in hospitals are scattered, and no integrated care platform on the market meets these needs. In addition, photographing and recording the size of a wound is cumbersome: conventional practice relies on manual measurement with a ruler or on additional external hardware, which is either inconvenient to use or too costly, and is therefore difficult to adopt widely. Moreover, wound care must be continuous, but the condition of patients at long-term community care sites or after they return home is difficult to track and assess, and continuous follow-up by experts with professional wound-care experience is lacking.
Disclosure of Invention
Embodiments of the present invention provide a portable electronic device and a wound size measuring method to solve the above problems.
An embodiment of the present invention provides a portable electronic device, including: a display panel, an inertial measurement unit, a camera device, a storage device, and a computing unit. The inertial measurement unit is used to detect a horizontal pitch angle of the portable electronic device. The camera device is used to obtain an input image. The storage device stores an operating system, a wound measurement program, a region proposal network model, and a convolutional neural network model. The computing unit executes the wound measurement program to perform the following steps: identifying the input image using the convolutional neural network model and selecting the portion of the input image with the highest probability of being a wound as an output wound image; and calculating the actual height and the actual width of the output wound image according to the lens focal length parameter reported by the operating system, a plurality of reference calibration parameters corresponding to the horizontal pitch angle, and the pixel height ratio and the pixel width ratio of the output wound image.
In some embodiments, the reference calibration parameters include: the actual height of a reference object, the actual width of the reference object, the pixel height ratio of the reference object, and the focus distance of the reference object. In the horizontal calibration process of the wound measurement program, the portable electronic device photographs a reference object, which has the reference object actual height and the reference object actual width, at a horizontal pitch angle of 0 degrees to obtain a first reference object image, and obtains a first reference object focus distance from an application programming interface of the operating system. In the vertical calibration process of the wound measurement program, the portable electronic device photographs the reference object at a horizontal pitch angle of 90 degrees to obtain a second reference object image, and obtains a second reference object focus distance from the application programming interface of the operating system. The computing unit divides a first pixel height, at which the first reference object image or the second reference object image is displayed on the display panel of the portable electronic device, by a second pixel height of the display panel to obtain a first reference object pixel height ratio or a second reference object pixel height ratio.
In some embodiments, in response to the horizontal pitch angle being between 0 and 45 degrees, the computing unit uses the first reference object focus distance as the reference object focus distance and the first reference object pixel height ratio as the reference object pixel height ratio. In response to the horizontal pitch angle being between 45 and 90 degrees, the computing unit uses the second reference object focus distance as the reference object focus distance and the second reference object pixel height ratio as the reference object pixel height ratio.
In some embodiments, the computing unit calculates equations (1) and (2) to obtain the actual height and the actual width of the output wound image:

\[ h_m = \frac{h_c \cdot g_c \cdot p_m}{g_m \cdot p_c} \tag{1} \]

\[ w_m = \frac{w_c \cdot g_c \cdot p_m}{g_m \cdot p_c} \tag{2} \]

where h_c is the actual height of the reference object; g_c is the focus distance of the reference object; p_c is the pixel height ratio of the reference object; h_m is the actual height of the output wound image; g_m is the lens focal length parameter; p_m is the pixel height ratio; w_m is the actual width of the output wound image; and w_c is the actual width of the reference object.
In some embodiments, the computing unit further performs a machine-learning clustering algorithm to divide the output wound image into a wound area and a normal skin area. The computing unit further calculates a first number of pixels in the output wound image and a second number of pixels in the wound area, and divides the second number of pixels by the first number of pixels to obtain a wound area pixel ratio. The computing unit further multiplies the actual height of the output wound image by the actual width to obtain an actual area of the output wound image, and multiplies the actual area by the wound area pixel ratio to obtain an actual area of the wound area.
In some embodiments, the computing unit further calculates a first red average, a first green average, and a first blue average of the red, green, and blue sub-pixels of the pixels in the wound area, calculates a second red average, a second green average, and a second blue average of the red, green, and blue sub-pixels of the pixels in the normal skin area, and calculates the Euclidean distance between the wound area and the normal skin area according to the first red average, the first green average, the first blue average, the second red average, the second green average, and the second blue average to represent the severity of the wound area.
In some embodiments, in response to the computing unit determining that the actual area of the output wound image is larger than the actual area of a previous output wound image by more than a first predetermined ratio, the computing unit notifies a server to add the name of the user of the portable electronic device to a care list for medical staff to review. In some embodiments, in response to the computing unit determining that the severity of the output wound image has increased relative to that of a previous output wound image by more than a second predetermined ratio, the computing unit notifies a server to add the name of the user of the portable electronic device to a care list for medical staff to review.
In some embodiments, before the computing unit identifies the input image using the convolutional neural network model, the computing unit uses the region proposal network model to divide the input image into a plurality of first bounding boxes and filters, from the first bounding boxes, a plurality of second bounding boxes whose probability of containing a wound is greater than a predetermined value. The convolutional neural network model selects the second bounding box with the highest probability of being a wound as the output wound image.
An embodiment of the present invention further provides a wound size measuring method applied to a portable electronic device that includes a display panel and a camera device. The method includes: obtaining an input image using the camera device; identifying the input image using a convolutional neural network model and selecting the portion of the input image with the highest probability of being a wound as an output wound image; and calculating the actual height and the actual width of the wound area in the output wound image according to the lens focal length parameter reported by the operating system, a plurality of reference calibration parameters corresponding to the horizontal pitch angle of the portable electronic device, and the pixel height ratio and the pixel width ratio of the output wound image.
Drawings
Fig. 1 is a block diagram of a portable electronic device according to an embodiment of the invention.
Fig. 2A is a schematic diagram of a model training process of a convolutional neural network model according to an embodiment of the present invention.
Fig. 2B is a diagram illustrating an architecture of a region proposal network model according to an embodiment of the invention.
FIG. 2C is a flow chart of wound identification using a convolutional neural network model and a region proposal network model according to an embodiment of the invention.
FIG. 2D is a diagram illustrating segmentation of an input image into a plurality of first bounding boxes using the region proposal network model in the embodiment of FIG. 2C according to the invention.
FIG. 2E is a diagram illustrating determination of a plurality of second bounding boxes from the plurality of first bounding boxes in the embodiment of FIG. 2C according to the invention.
FIG. 2F is a diagram illustrating use of the convolutional neural network model to determine the probability that each second bounding box is a wound in the embodiment of FIG. 2C according to the invention.
FIG. 2G is a schematic diagram of the output wound image in the embodiment of FIG. 2C according to the invention.
Fig. 3A is a schematic diagram of a portable electronic device according to an embodiment of the invention when the horizontal pitch angle is 0 degrees.
Fig. 3B is a schematic diagram illustrating a horizontal calibration process of the portable electronic device according to an embodiment of the invention.
Fig. 3C is a schematic diagram of the portable electronic device according to an embodiment of the invention when the horizontal pitch angle is 90 degrees.
Fig. 3D is a schematic diagram illustrating a vertical calibration process of the portable electronic device according to an embodiment of the invention.
Fig. 4A and 4B are schematic diagrams illustrating imaging of a camera device at different distances according to an embodiment of the invention.
Fig. 5A-5C are schematic diagrams illustrating clustering of output wound images according to an embodiment of the invention.
Fig. 6 is a block diagram of a wound care system in accordance with an embodiment of the present invention.
Fig. 7A and 7B are schematic views of different severity and area of a wound region according to an embodiment of the invention.
Fig. 8 is a flow chart of a method of wound size measurement according to an embodiment of the invention.
Wherein the reference numerals are as follows:
100: portable electronic device
101: front surface
102: rear surface
105: camera device
110: camera module
111: lens barrel
112: color filter array
113: image sensor
115: controller
116: automatic focusing module
120: arithmetic unit
130: memory unit
140: storage device
141: operation system
142: wound measurement procedure
143: convolution neural network model
144: regional candidate network model
145: database with a plurality of databases
150: communication interface
160: display panel
170: inertial measurement unit
171: gyroscope
172: accelerometer
173: magnetometer
1430: inputting image
1431: characteristic map
1432: network layer
1433: outputting wound images
1441: sliding window
1442: positioning frame
1443: intermediate layer
1444: a classification layer
1445: regression layer
210: inputting image
211: first bounding box
212: second bounding box
230: square frame
240: outputting wound images
310: horizontal line
320: reference object
330: reference object image
510: outputting wound images
511. 521: wound area
512. 522: area of normal skin
520: grouped images
600: wound care system
610: server
615: patient database
620: network
710. 720: outputting wound images
711. 721: wound area
712. 722: area of normal skin
S810-S830: step (ii) of
fd: a predetermined distance
h: actual height
d: distance of object to be measured
fp: focus point
Pm: height of image formation
h1: height
w1: width of
H2, H: height of pixel
W2, W: width of pixel
Detailed Description
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of further features, integers, steps, operations, elements, components, and/or groups thereof.
Fig. 1 is a block diagram of a portable electronic device according to an embodiment of the invention. The portable electronic device 100 is, for example, a smartphone, a tablet computer, or a notebook computer, but the embodiment of the invention is not limited thereto. As shown in fig. 1, the portable electronic device 100 includes a camera device 105, a computing unit 120, a memory unit 130, a storage device 140, a communication interface 150, a display panel 160, and an inertial measurement unit 170. The camera device 105 is used for capturing an object image. For example, the camera device 105 includes at least a camera module 110, a controller 115, and an auto-focus module 116.
The camera module 110 includes a lens 111, a color filter array 112, and an image sensor 113. The color filter array 112 includes a plurality of red, green, and blue filters arranged in a predetermined pattern, such as a Bayer pattern or another type of pattern. The image sensor 113 is a color image sensor and may be implemented by, for example, a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. The controller 115 may be, for example, a microcontroller, but the embodiments of the present invention are not limited thereto.
Incident light from the scene containing the target object passes through the lens 111 and the color filter array 112 and is imaged on the image sensor 113, so that the photoelectric element of each pixel in the image sensor 113 converts the sensed light into an electrical signal and transmits it to the controller 115. The controller 115 transmits each pixel of the captured image to the computing unit 120. The auto-focus module 116 includes, for example, a stepping motor for adjusting the focal length of the lens 111 or of the entire camera module 110 according to a control signal from the controller 115.
The controller 115 can, for example, perform a passive autofocus (passive AF) algorithm (such as a contrast-detection or phase-detection AF algorithm) on the image captured by the image sensor 113, or receive a touch-to-focus control signal from the display panel 160, so as to control the auto-focus module 116 to finely adjust the position of the lens 111 or of the entire camera module 110, so that the image sensor 113 can accurately focus on the target object and capture the object image. In addition, the controller 115 transmits the focusing information of the camera module 110 to the computing unit 120, where the focusing information may be, for example, the focal length, the number of stepping-motor steps, and the like, but the embodiment of the invention is not limited thereto.
In some embodiments, the portable electronic device 100 includes two or more camera modules 110, where the lenses 111 of the different camera modules 110 have, for example, different focal length ranges. The controller 115 can perform auto-focusing using the images captured by the different camera modules 110 and control the auto-focus module 116 to fine-tune the lens 111, or the camera module 110, having the corresponding focal length range, so that the image sensor 113 in that camera module 110 is correctly focused on the target object. The controller 115 may also transmit the focusing information of the camera module 110 selected by the auto-focusing to the computing unit 120, where the focusing information may be, for example, the focal length, the number of stepping-motor steps, and the like, but the embodiment of the invention is not limited thereto.
The computing unit 120 is electrically connected to the camera device 105. The computing unit 120 can be implemented in various ways, such as dedicated hardware circuits or general-purpose hardware (e.g., a single processor, multiple processors with parallel processing capability, or other processors with computing capability), for example a central processing unit (CPU), a general-purpose processor, or a microcontroller, but the embodiment of the invention is not limited thereto.
The storage device 140 is, for example, a non-volatile memory, such as a hard disk drive, a solid-state disk, or a read-only memory, but the embodiment of the invention is not limited thereto. The storage device 140 is used to store an operating system 141 (for example, an Android or iOS operating system) for operating the portable electronic device 100, a wound measurement program 142, a convolutional neural network (CNN) model 143, and a region proposal network (RPN) model 144.
Assuming that both the CNN model 143 and the RPN model 144 have completed the model training process, the RPN model 144 can divide the object image (e.g., the input image) captured by the camera device 105 into a plurality of bounding boxes and find, among them, one or more candidate regions with a higher probability of containing a wound, which are input into the CNN model 143. The CNN model 143 performs image recognition on each candidate region and takes the candidate region with the highest probability (confidence) of being a wound image as the output wound image. The wound measurement program 142 estimates dimension information of the output wound image according to the field of view (FOV) of the camera device 105 and the lens focal length parameter reported by the operating system 141, where the dimension information may be, for example, the width and height of the target object.
The memory unit 130 is, for example, a volatile memory, such as a static random access memory (SRAM) or a dynamic random access memory (DRAM), but the embodiment of the invention is not limited thereto. The memory unit 130 may be used as an execution space for the operating system 141, as a storage space for temporary data generated by the wound measurement program 142, and as an image buffer. For example, the computing unit 120 may read the operating system 141 and the wound measurement program 142 stored in the storage device 140 into the memory unit 130 and execute them. The communication interface 150 includes, for example, a wired/wireless transmission interface for connecting the portable electronic device 100 to other electronic devices or servers.
The display panel 160 may be, for example, a liquid crystal display (LCD) panel, a light-emitting diode (LED) display panel, an organic light-emitting diode (OLED) display panel, or an electronic ink (e-Ink) display panel, but the embodiment of the invention is not limited thereto. In some embodiments, the display panel 160 may be integrated with a touch device (not shown) for performing touch operations, such as a capacitive or resistive touch device, and the display panel 160 may then be referred to as a touch panel, but the embodiments of the invention are not limited thereto.
The inertial measurement unit 170 includes a gyroscope 171, an accelerometer 172, and a magnetometer 173. The gyroscope 171 is used to measure the orientation and angular velocity of the portable electronic device 100, the accelerometer 172 is used to measure the acceleration of the portable electronic device 100, and the magnetometer 173 is used to measure the magnetic field strength and orientation around the portable electronic device 100; the data measured by the gyroscope 171, the accelerometer 172, and the magnetometer 173 constitute the inertial information. For example, the accelerometer 172 and magnetometer 173 in the inertial measurement unit 170 can detect the horizontal pitch angle (pitch) of the portable electronic device 100.
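As an illustration of how a horizontal pitch angle can be derived from accelerometer readings, the following Kotlin sketch computes a pitch value from a gravity vector; the axis convention and the helper names are assumptions for illustration and are not part of this disclosure.

```kotlin
import kotlin.math.abs
import kotlin.math.atan2
import kotlin.math.sqrt

// Minimal sketch: derive a pitch angle (0 deg = device lying flat, 90 deg = device upright)
// from a 3-axis gravity/accelerometer sample. The axis convention (x right, y up along the
// screen, z out of the screen) is a common mobile convention and is an assumption here.
fun pitchDegrees(ax: Double, ay: Double, az: Double): Double {
    val pitchRad = atan2(ay, sqrt(ax * ax + az * az))
    return abs(Math.toDegrees(pitchRad))
}

fun main() {
    println(pitchDegrees(0.0, 0.0, 9.81))   // device flat on a table -> ~0 degrees
    println(pitchDegrees(0.0, 9.81, 0.0))   // device held upright     -> ~90 degrees
}
```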
In an embodiment, the operating system 141 used by the portable electronic device 100 is, for example, an Android operating system. After auto-focusing, the focal length parameter reported by the Android Camera2 application programming interface (API) of the operating system 141 may be, for example, LENS_FOCUS_DISTANCE or LENS_INFO_FOCUS_DISTANCE_CALIBRATION. The lens focal length parameter is, for example, a value calibrated through the API of the operating system 141, and its unit is the diopter (diopter = 1/meter). A value of 0 represents the farthest distance at which the lens 111 can focus, but this farthest distance does not necessarily represent infinity. For example, when the distance between the target object and the lens 111 is within a specific focus distance range (e.g., about 10-25 cm) and the tilt of the portable electronic device 100 is at a specific angle (e.g., 0 degrees or 90 degrees), the focus distance f used by the lens 111 can be calculated according to the following formula (1):

\[ f = \frac{1}{\mathrm{LENS\_FOCUS\_DISTANCE}} \tag{1} \]

where f is expressed in meters because the reported value is in diopters. It should be noted that the value of the lens focus parameter LENS_FOCUS_DISTANCE or LENS_INFO_FOCUS_DISTANCE_CALIBRATION reported by the operating system 141 also changes with the horizontal pitch angle of the portable electronic device 100.
In another embodiment, if the portable electronic device 100 is an iPhone 4S or later model and the operating system 141 is iOS 8 or later, the lens focal length parameter reported by the operating system 141 is, for example, lensPosition. The lens focal length parameter lensPosition is, for example, a value between 0 and 1, where 0 represents the closest distance at which the lens 111 can focus and 1 represents the farthest distance at which the lens 111 can focus, but this farthest distance does not necessarily represent infinity. It should be noted that the lens focal length parameter lensPosition does not directly indicate the focal length of the lens 111; it is a value converted by the operating system 141 and is not equal to (a fixed constant / the focal length). In addition, the value of the lens focal length parameter lensPosition reported by the operating system 141 also changes with the horizontal pitch angle (pitch) of the portable electronic device 100.
Fig. 2A is a schematic diagram of a model training process of a convolutional neural network model according to an embodiment of the present invention.
In one embodiment, during the model training process, medical professionals first mark the wound image in each training image (i.e., the area of the wound image is smaller than or equal to that of the training image), and the marked wound images are assembled into a feature data set. The model training processes of the CNN model 143 and the RPN model 144 may, for example, use the same feature data set to train the CNN model 143 and the RPN model 144 simultaneously. In some embodiments, the CNN model 143 and the RPN model 144 may be implemented using, for example, UNet, Faster R-CNN, or Mask R-CNN models, although the embodiments of the invention are not limited thereto.
The feature data set may include wound images of chronic or acute wounds of patients. Chronic wounds may include diabetic foot ulcers, pressure injuries or bed sores, venous ulcers, and the like. Acute wounds may include chronic non-healing pressure sores, hemangiomas, ulcers, burns, diabetic foot and toe infection necrosis, and the like, although embodiments of the invention are not so limited. It should be noted that, since the same feature data set is used for model training, the CNN model 143 and the RPN model 144 may share a common feature map.
For example, during model training, a wound image marked by a medical professional (e.g., containing only the wound site) is used as the input image 1430 of the CNN model 143. The CNN model 143, which may also be referred to as a wound recognition model, includes a feature map 1431 and a network layer 1432, where the feature map 1431 may be implemented by, for example, a plurality of convolutional layers (Conv) and pooling layers (MaxPool). For example, the convolutional layers perform convolution operations on the input image 1430 and extract features, and the pooling layers can amplify the extracted features. An advantage of using pooling layers is that shifts of a few pixels in the input image do not affect the determination result of the CNN model 143, which also provides good noise resistance. Finally, the CNN model 143 flattens the feature extraction result and inputs it into the network layer 1432. The network layer 1432 includes, for example, at least two fully connected layers (FC). The flattened feature extraction result passes through the network layer 1432 to obtain the output wound image 1433 and its corresponding confidence level.
In some embodiments, to reduce the over-fitting problem of the CNN model 143, the weights of certain layers (e.g., fully connected layers or convolutional layers) in the CNN model 143 may be updated randomly during the model training process. In addition, a data augmentation function may be added during model training to greatly increase the amount of training data. For example, the computing unit 120 may execute a training data augmentation process (not shown) to mirror, rotate by 0/90/180/270 degrees, randomly scale, and adjust the contrast and exposure of each wound image in the feature data set to obtain different augmented training data, and the model training process may use the augmented training data to train the CNN model 143. Training is completed when the CNN model 143 is trained to convergence.
Fig. 2B is a diagram illustrating an architecture of a region proposal network model according to an embodiment of the invention.
The RPN model 144, which may also be referred to as a wound localization model, is configured as shown in fig. 2B. In one embodiment, the input to the RPN model 144 is the feature map (conv feature map) 1431 of the CNN model 143. In another embodiment, the RPN model 144 may include a separate feature map (not shown) that is different from the feature map 1431 of the CNN model 143. In addition, in the model training process of the RPN model 144, the input to the feature map of the RPN model 144 is an image that is not cropped but has been manually marked to indicate the wound position, i.e., each marked training image in the training data set.
For ease of illustration, the input to the RPN model 144 in fig. 2B is the feature map 1431 of the CNN model 143. For example, the RPN model 144 slides a window 1441 (e.g., a 3×3 convolutional layer) over the feature map 1431 and then uses k different anchor boxes 1442 at each position to calculate the probability that each anchor box contains an object; for example, a 256-dimensional intermediate layer 1443 is computed, and a classification layer 1444 and a regression layer 1445 corresponding to each sliding window 1441 are obtained, where the classification layer 1444 outputs 2k scores and the regression layer 1445 outputs 4k coordinate values. The RPN model 144 can obtain the bounding box most likely to contain an object according to the scores corresponding to each sliding window 1441.
FIG. 2C is a flow chart of wound identification using a convolutional neural network model and a region proposal network model according to an embodiment of the invention. FIG. 2D is a diagram illustrating segmentation of an input image into a plurality of first bounding boxes using the region proposal network model in the embodiment of FIG. 2C. FIG. 2E is a diagram illustrating determination of a plurality of second bounding boxes from the plurality of first bounding boxes in the embodiment of FIG. 2C. FIG. 2F is a diagram illustrating use of the convolutional neural network model to determine the probability that each second bounding box is a wound in the embodiment of FIG. 2C. FIG. 2G is a schematic diagram of the output wound image in the embodiment of FIG. 2C.
Please refer to fig. 2C to fig. 2G. In the image recognition stage, the input image 210 is subjected to feature extraction by the CNN model 143 to obtain the feature map 1431, and the RPN model 144 can divide the input image 210 into a plurality of first bounding boxes 211 according to the feature map 1431. As shown in fig. 2D, the probabilities that the first bounding boxes 211, from top to bottom, contain a wound are 32%, 21%, 72%, and 90%, respectively.
The RPN model 144 filters, from the first bounding boxes 211, a plurality of second bounding boxes 212 that are more likely to contain a wound. For example, the RPN model 144 may set a threshold probability; if the probability that a first bounding box 211 contains a wound is greater than the threshold probability, the RPN model 144 places that first bounding box 211 among the candidate regions (proposals), where the first bounding boxes 211 in the candidate regions are the second bounding boxes 212, as shown in fig. 2E.
The RPN model 144 inputs each second bounding box 212 into the network layer 1432 of the CNN model 143 to obtain the probability that the image in each second bounding box 212 is a wound; as shown in fig. 2F, the probabilities for the left and right second bounding boxes 212 are 95% and 62%, respectively. Finally, at block 230, the CNN model 143 takes the second bounding box 212 with the highest probability of being a wound (e.g., 95%, which may also be referred to as the confidence) as the output wound image 240, and the output wound image 240 may be displayed on the display panel 160, as shown in fig. 2G. In detail, since the RPN model 144 first finds the second bounding boxes 212 most likely to contain a wound from the input image and the CNN model 143 then determines the probability that each second bounding box is a wound, the combination of the CNN model 143 and the RPN model 144 can increase the speed at which the portable electronic device 100 determines whether the input image is a wound image and improve the accuracy of wound identification.
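A minimal sketch of this two-stage selection, with hypothetical types standing in for the RPN and CNN outputs, is shown below; the threshold value, type names, and toy scorer are assumptions for illustration only.

```kotlin
// Hypothetical proposal type: a bounding box plus the RPN's estimate that it contains a wound.
data class Proposal(val x: Int, val y: Int, val width: Int, val height: Int, val rpnScore: Double)

// Stage 1: keep only first bounding boxes whose wound probability exceeds a threshold
// (these become the second bounding boxes / candidate regions).
fun filterProposals(boxes: List<Proposal>, threshold: Double = 0.5): List<Proposal> =
    boxes.filter { it.rpnScore > threshold }

// Stage 2: score each candidate with the CNN classifier (cnnScore is a stand-in for the
// CNN model's wound confidence) and keep the candidate with the highest confidence.
fun selectOutputWound(candidates: List<Proposal>, cnnScore: (Proposal) -> Double): Proposal? =
    candidates.maxByOrNull(cnnScore)

fun main() {
    val firstBoxes = listOf(
        Proposal(10, 10, 80, 60, 0.32),
        Proposal(40, 20, 90, 70, 0.21),
        Proposal(15, 50, 60, 60, 0.72),
        Proposal(70, 60, 50, 40, 0.90),
    )
    val secondBoxes = filterProposals(firstBoxes)               // keeps the 0.72 and 0.90 boxes
    val wound = selectOutputWound(secondBoxes) { it.rpnScore }  // toy scorer for the example
    println(wound)
}
```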
Fig. 4A and 4B are schematic views of imaging of a camera device at different distances according to an embodiment of the invention.
In one embodiment, the wound measurement program 142 calculates the distance d to the object to be measured according to equation (2):

\[ d = \frac{f \cdot h \cdot p_w}{p_m \cdot s} \tag{2} \]

where h is the actual height of the object to be measured; f is the focal length used by the camera device 105; p_w is the number of pixels of the image height; p_m is the number of pixels of the target object along the image height; and s is the length of the photosensitive element. Since most lenses 111 of portable electronic devices on the market are fixed-focus lenses, the focal length f, the image pixel height p_w, and the photosensitive element length s can be regarded as fixed values, and these three parameters can be combined into a single lens parameter p, as shown in formula (3):

\[ d = \frac{h \cdot p}{p_m}, \quad p = \frac{f \cdot p_w}{s} \tag{3} \]

However, when the wound measurement program 142 executed by the computing unit 120 calculates the size of the wound in the output wound image, the distance d to the object to be measured (i.e., the distance from the camera device 105 to the wound) and the actual height h of the object to be measured are both unknown, so the wound measurement program 142 estimates the distance d using the lens focal length parameter g reported by the operating system 141. For example, the lens focal length parameter g and the distance d are in a reciprocal relation, but the reciprocal of the lens focal length parameter g is not itself equal to the distance d; the reciprocal of g must be multiplied by an offset w to obtain the distance d, as shown in formula (4):

\[ d = \frac{w}{g} \tag{4} \]

As shown in fig. 4A, assuming that the actual height of the object to be measured (e.g., the wound) is fixed at h and the distance to the object is d, the imaging height of the light reflected from the object, passing through the focal point fp, on the display screen of the display panel 160 of the portable electronic device 100 is p_m. As shown in fig. 4B, if the distance to the object is increased to 2d, the imaging height on the display panel 160 becomes p_m/2. Therefore, the proportional relation of equation (5) can be obtained from equations (2) and (3):

\[ p_m = \frac{h \cdot p}{d} \tag{5} \]

Similarly, according to equation (5), when the distance d is fixed and the heights of the objects are 2h and h, respectively, the imaging heights on the display screen are 2p_m and p_m. In addition, according to formula (5), when the height of the object is fixed at h and the distances to the object are d and 2d, respectively, the imaging heights on the display screen are p_m and p_m/2.
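A small numeric sketch of the pinhole relation in equation (2), on which the later calibration-based formulas rest, follows; all values and names are illustrative assumptions, not values from this disclosure.

```kotlin
// Minimal sketch of equation (2): estimate the object distance from the pinhole model.
// All numeric values below are illustrative assumptions.
fun objectDistance(
    focalLengthMm: Double,     // f: focal length of the lens
    actualHeightMm: Double,    // h: actual height of the object to be measured
    imageHeightPx: Double,     // p_w: image height in pixels
    objectHeightPx: Double,    // p_m: object height in pixels
    sensorHeightMm: Double     // s: length of the photosensitive element along the image height
): Double = focalLengthMm * actualHeightMm * imageHeightPx / (objectHeightPx * sensorHeightMm)

fun main() {
    // e.g., a 4.2 mm lens, a 30 mm wound spanning 600 of 4000 pixels, on a 4.8 mm sensor:
    val dMm = objectDistance(4.2, 30.0, 4000.0, 600.0, 4.8)
    println("estimated distance ≈ ${"%.0f".format(dMm)} mm")   // ≈ 175 mm
}
```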
when the arithmetic unit 120 executes the wound measurement program 142 for the first time, the wound measurement program may enter a calibration mode. In the calibration mode, the computing unit 120 can turn on the camera device 105 to focus on a reference object with a known size and take a picture, and the user can frame the range of the reference object on the display panel 160, so that the computing unit 120 can obtain the reference value of the focus distance and the height of the reference object. Then, according to the formula (4) and with the known proportional relationship of the reference object, the formula (6) can be obtained:
Figure BDA0003323495950000145
wherein h is c Is the actual height of the reference object; g c Focusing distance for the reference object; p is a radical of c Is the pixel height ratio of the reference object; h is m The actual height of the object to be measured; g is a radical of formula m Focusing distance for the object to be measured; p is a radical of formula m Is the pixel height ratio of the object to be measured. In some embodiments, the reference object may be, for example, a health card or identification card, or other object having a fixed known size.
The formula (6) is simplified to obtain the formula (7) to obtain the actual height h of the object to be measured m
Figure BDA0003323495950000151
For example, the health card has a dimension of 53.5mm (width) 85.5mm (height), and thus, the actual height h of the reference object in equation (7) c Are known values. In addition, the position of the lens 111 is adjusted by the step motor of the auto-focus module 116 during the focusing process of the portable electronic device 100, however, the step motor is influenced by the gravity of the earth, so that the focal length parameter of the lens reported by the operating system 141 is different when the horizontal tilt angle of the portable electronic device 100 is 0 degree and 90 degrees. It should be noted that the formula (7) is used to calculate the actual height h of the object to be measured m . If equation (7) is to be used to calculate the actual width w of the object to be measured m Then the actual height h of the object to be measured in the formula (7) can be adjusted m And reference object actual height h c Respectively replaced by the actual width w of the object to be detected m And the actual width w of the reference object c As shown in formula (8):
Figure BDA0003323495950000152
the pixel width ratio of the reference object and the pixel width ratio of the object to be measured are similar to the pixel height ratio p of the reference object c And the pixel height ratio p of the object to be measured m Therefore, the reference object pixel height ratio p can be directly used in the formula (8) c And the pixel height ratio p of the object to be measured m . In some embodiments, the computing unit 120 may additionally calculate the reference object pixel width ratio r c And the pixel width ratio r of the object to be measured m To replace the reference object pixel height ratio p in the formula (8) respectively c And the pixel height ratio p of the object to be measured m
For example, when the front surface 101 and the rear surface 102 of the portable electronic device 100 are completely parallel to the horizontal line 310, the inertial measurement unit 170 detects that the horizontal pitch angle of the portable electronic device 100 is 0 degrees, as shown in fig. 3A.
The horizontal calibration process of the portable electronic device 100 is shown in fig. 3B. The user can use the portable electronic device 100 to photograph the reference object 320 at a horizontal pitch angle of 0 degrees and at a predetermined distance fd (e.g., between 10 and 15 cm) to obtain a reference object image 330, where the height and width of the reference object 320 are h1 and w1, respectively, corresponding to the reference object actual height h_c and the reference object actual width w_c. The user can observe through the display panel 160 whether the captured reference object image is clear and can press the capture button to capture the reference object image 330. The user can also adjust a frame on the display panel 160 to indicate the size range of the reference object image 330, and the calibration parameters are then stored.
For example, the resolution of the display panel 160 is W (pixel width) × H (pixel height), and the reference object image 330 displayed on the display panel 160 has a pixel height h2 and a pixel width w2. If the pixel height h2 of the reference object image 330 is 1344 pixels and the pixel height of the display panel 160 is 1920 pixels, the computing unit 120 can calculate the reference object pixel height ratio p_c = 1344/1920 = 0.70. At this point, if the predetermined distance is about 13 cm, the lens focus parameter LENS_FOCUS_DISTANCE reported by the application programming interface of the operating system 141 is, for example, 7.59, so the computing unit 120 sets the reference object focus distance g_c to 7.59. Therefore, a plurality of reference calibration parameters, such as the reference object actual height h_c, the reference object pixel height ratio p_c, and the reference object focus distance g_c, can be obtained during the horizontal calibration process.
When the front surface 101 and the rear surface 102 of the portable electronic device 100 are perpendicular to the horizontal line 310, the inertial measurement unit 170 detects that the horizontal pitch angle of the portable electronic device 100 is 90 degrees, as shown in fig. 3C.
The vertical calibration process of the portable electronic device 100 is shown in fig. 3D. The user can use the portable electronic device 100 to photograph the reference object 320 at a horizontal pitch angle of 90 degrees and at a predetermined distance fd (e.g., between 10 and 15 cm) to obtain the reference object image 330, where the height and width of the reference object 320 are h1 and w1, respectively, corresponding to the reference object actual height h_c and the reference object actual width w_c. In addition, the resolution of the display panel 160 is W (pixel width) × H (pixel height), and the reference object image 330 displayed on the display panel 160 has a pixel height h2 and a pixel width w2.
For example, if the pixel height h2 of the reference object image 330 is 1382 pixels and the pixel height of the display panel 160 is 1920 pixels, the computing unit 120 can calculate the reference object pixel height ratio p_c = 1382/1920 ≈ 0.72. At this point, if the predetermined distance is about 13 cm, the lens focus parameter LENS_FOCUS_DISTANCE reported by the application programming interface of the operating system 141 is, for example, 8.65, so the computing unit 120 sets the reference object focus distance g_c to 8.65. Therefore, a plurality of reference calibration parameters, such as the reference object actual height h_c, the reference object pixel height ratio p_c, and the reference object focus distance g_c, can be obtained during the vertical calibration process.
It should be noted that the reference object focus distances g_c obtained in the horizontal calibration process and the vertical calibration process are different reference calibration parameters. For example, when the horizontal pitch angle of the portable electronic device 100 is between 0 degrees and less than 45 degrees, the wound measurement program 142 substitutes the reference calibration parameters obtained from the horizontal calibration process into formulas (7) and (8) to obtain the actual height h_m and the actual width w_m of the object to be measured (e.g., in centimeters). When the horizontal pitch angle of the portable electronic device 100 is between 45 degrees (inclusive) and 90 degrees, the wound measurement program 142 substitutes the reference calibration parameters obtained from the vertical calibration process into formulas (7) and (8) to obtain the actual height h_m and the actual width w_m of the object to be measured (e.g., in centimeters).
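A minimal sketch of choosing between the horizontal and vertical calibration sets by pitch angle might look as follows; the 45-degree boundary mirrors the description, while the type and field names are assumptions.

```kotlin
// Hypothetical container for the reference calibration parameters saved by one calibration pass.
data class ReferenceCalibration(
    val refFocusDistance: Double,    // g_c reported by the OS during calibration
    val refPixelHeightRatio: Double  // p_c measured on the display panel
)

// Select which calibration set feeds equations (7) and (8), based on the horizontal pitch angle.
fun selectCalibration(
    pitchDegrees: Double,
    horizontal: ReferenceCalibration,   // captured at a pitch of 0 degrees
    vertical: ReferenceCalibration      // captured at a pitch of 90 degrees
): ReferenceCalibration =
    if (pitchDegrees < 45.0) horizontal else vertical

fun main() {
    val horizontal = ReferenceCalibration(refFocusDistance = 7.59, refPixelHeightRatio = 0.70)
    val vertical = ReferenceCalibration(refFocusDistance = 8.65, refPixelHeightRatio = 0.72)
    println(selectCalibration(30.0, horizontal, vertical))  // uses the horizontal set
    println(selectCalibration(80.0, horizontal, vertical))  // uses the vertical set
}
```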
Fig. 5A-5C are schematic diagrams illustrating clustering of output wound images according to an embodiment of the invention.
Fig. 5A shows an output wound image 510 generated by the CNN model 143 and the RPN model 144, which is, for example, an RGB image, that is, each pixel of the output wound image is composed of red, green and blue sub-pixels with brightness between 0 and 255. The wound measurement program 142 may group the pixels of the output wound image 510 by using a machine learning clustering algorithm, for example, the pixels may be classified into a wound group and a normal skin group. The machine learning clustering algorithm may be, for example, K-Means clustering, hierarchical clustering, or other clustering algorithms in the field of the present invention, but the embodiments of the present invention are not limited thereto. For example, after the output wound image 510 of fig. 5A is subjected to image clustering, a clustered image 520 as shown in fig. 5B can be obtained, which can be divided into a wound area 521 and a normal skin area 522.
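A minimal sketch of this clustering step is given below, using scikit-learn's K-Means with two clusters on the RGB values of the output wound image; the function name, and the later decision of which cluster is the wound (for instance, by the redder mean color), are assumptions rather than details taken from the patent.

import numpy as np
from sklearn.cluster import KMeans

def cluster_wound_image(rgb_image: np.ndarray) -> np.ndarray:
    """rgb_image: H x W x 3 array. Returns an H x W label map with values 0 or 1."""
    h, w, _ = rgb_image.shape
    pixels = rgb_image.reshape(-1, 3).astype(np.float32)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
    return labels.reshape(h, w)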
After the wound measurement program 142 obtains the wound area 521 and the normal skin area 522 from the clustered image 520, the area of the wound area 521 in the clustered image 520 can be calculated. For example, in the foregoing embodiment, the wound measurement program 142 can calculate the actual height and the actual width of the object to be measured in the output wound image. Assuming that the actual height and the actual width of the object to be measured are 3 cm and 4 cm, respectively, the actual area corresponding to the output wound image 510 is 12 square centimeters. If the output wound image 510 has 50,000 pixels and the wound measurement program 142 calculates that the wound area 521 contains 45,000 pixels, the wound measurement program 142 can calculate the wound area pixel ratio as 45000/50000 = 0.9. Thus, the wound measurement program 142 can calculate the actual area of the wound area 521 as 12 × 0.9 = 10.8 square centimeters.
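The arithmetic of this example can be written out directly; the helper below is only a restatement of the numbers above, with a hypothetical function name.

def wound_area_cm2(actual_height_cm: float, actual_width_cm: float,
                   wound_pixels: int, total_pixels: int) -> float:
    # actual area of the output wound image, scaled by the wound area pixel ratio
    image_area = actual_height_cm * actual_width_cm
    return image_area * (wound_pixels / total_pixels)

print(wound_area_cm2(3.0, 4.0, 45_000, 50_000))  # 10.8 square centimeters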
In one embodiment, after the wound measurement program 142 obtains the wound area 521 and the normal skin area 522, the output wound image 510 is divided into a wound area 511 and a normal skin area 512, and the values of the red, green, and blue sub-pixels of each pixel in the wound area 511 and the normal skin area 512 are obtained, as shown in fig. 5C. The wound measurement program 142 calculates the averages of the red, green, and blue sub-pixels of the pixels in the wound area 521, which may be denoted by W_R_avg, W_G_avg, and W_B_avg, respectively. The wound measurement program 142 also calculates the averages of the red, green, and blue sub-pixels of the pixels in the normal skin area 522, which may be denoted by N_R_avg, N_G_avg, and N_B_avg, respectively.
Next, the wound measurement program 142 may use the Euclidean distance formula to calculate the severity of the wound area 521, as shown in equation (9):
severity = √((N_R_avg - W_R_avg)² + (N_G_avg - W_G_avg)² + (N_B_avg - W_B_avg)²)  (9)
wherein N_R_avg represents the average of all red sub-pixels in the normal skin area 522; N_G_avg represents the average of all green sub-pixels in the normal skin area 522; N_B_avg represents the average of all blue sub-pixels in the normal skin area 522; W_R_avg represents the average of all red sub-pixels in the wound area 521; W_G_avg represents the average of all green sub-pixels in the wound area 521; and W_B_avg represents the average of all blue sub-pixels in the wound area 521.
Expressed as a Euclidean distance, the severity is a floating-point number between 0 and 255. The closer the severity is to 255, the more severe the wound area 521 is compared with the normal skin area 522 (i.e., the lower the similarity between them). The closer the severity is to 0, the less severe the wound area 521 is (i.e., the higher the similarity between the wound area 521 and the normal skin area 522).
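Equation (9) reduces to a one-line computation; the sketch below uses the two sets of average RGB values from the examples that follow (figs. 7A and 7B), together with the threshold of 70 mentioned there.

import math

def severity(wound_avg_rgb, normal_avg_rgb) -> float:
    # Euclidean distance between the mean RGB of the wound area and of the normal skin area
    return math.sqrt(sum((n - w) ** 2 for n, w in zip(normal_avg_rgb, wound_avg_rgb)))

print(severity((230, 172, 148), (160, 106, 92)))   # about 111.3, above the 70 threshold
print(severity((169, 114, 121), (176, 143, 119)))  # about 29.9, below the 70 threshold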
Fig. 6 is a block diagram of a wound care system in accordance with an embodiment of the present invention. Please refer to fig. 1 and fig. 6.
In one embodiment, the wound care system 600 includes one or more portable electronic devices 100 and a server 610, wherein each portable electronic device 100 may be connected to the server 610 through a network 620. The patient or the medical staff can regularly take a picture of the patient's wound with the corresponding portable electronic device 100 to obtain an input image. The wound measurement program 142 executed by each portable electronic device 100 can recognize the input image by using the CNN model 143 and the RPN model 144 to obtain the output wound image, i.e., the bounding box with the highest probability of containing a wound, cropped from the input image.
Each time the patient or the medical staff uses the portable electronic device 100 to photograph the patient's wound, the wound measurement program 142 can store, in the database 145, the output wound image generated by the CNN model 143 and the RPN model 144, the corresponding time information (such as the photographing time of the input image), the size information (including the height, the width, and the area), and the severity of the wound area in the output wound image relative to the normal skin area, for use in the subsequent care process.
In some embodiments, each portable electronic device 100 may further synchronize the content of the respective database 145 to the server 610, wherein the server 610 also includes a patient database 615 for recording the user name, the wound location, and the output wound image from each portable electronic device 100 and its history of time information, size information, and severity. In addition, the server 610 may further sort the user names of different patients according to the information in the patient database 615 to establish a care list for the medical staff to review on the server 610.
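As a rough illustration of what such a record might hold on the device or in the patient database 615, one possible layout is sketched below; all field names are assumptions, not taken from the patent.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class WoundRecord:
    user_name: str
    wound_location: str
    captured_at: datetime   # time information, e.g. photographing time of the input image
    height_cm: float        # size information of the wound area
    width_cm: float
    area_cm2: float
    severity: float         # Euclidean distance to normal skin per equation (9)
    image_path: str         # where the output wound image is stored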
For example, if a patient's most recently captured output wound image shows a relatively large wound area and a high severity (a long Euclidean distance, i.e., a low degree of similarity), the server 610 may place the patient near the top of the care list. As shown in fig. 7A, assume that the averages (W_R_avg, W_G_avg, W_B_avg) of the red, green, and blue sub-pixels of the wound area 711 in the output wound image 710 are (230, 172, 148), and the averages (N_R_avg, N_G_avg, N_B_avg) of the red, green, and blue sub-pixels of the normal skin area 712 in the output wound image 710 are (160, 106, 92). The wound measurement program 142 may then calculate the severity of the wound area 711 according to equation (9):
severity = √((160 - 230)² + (106 - 172)² + (92 - 148)²) = √(4900 + 4356 + 3136) = √12392 ≈ 111.3
If the severity is greater than a predetermined threshold (e.g., 70, although not limited thereto), the wound measurement program 142 may determine that the wound area 711 is a more severe wound area.
If a patient's most recently captured output wound image shows a relatively small wound area and a low severity (a short Euclidean distance, i.e., a high degree of similarity), the server 610 may place the patient near the bottom of the care list. As shown in fig. 7B, assume that the averages (W_R_avg, W_G_avg, W_B_avg) of the red, green, and blue sub-pixels of the wound area 721 in the output wound image 720 are (169, 114, 121), and the averages (N_R_avg, N_G_avg, N_B_avg) of the red, green, and blue sub-pixels of the normal skin area 722 in the output wound image 720 are (176, 143, 119). The wound measurement program 142 may then calculate the severity of the wound area 721 according to equation (9):
severity = √((176 - 169)² + (143 - 114)² + (119 - 121)²) = √(49 + 841 + 4) = √894 ≈ 29.9
If the severity is less than the predetermined threshold (e.g., 70, although not limited thereto), the wound measurement program 142 may determine that the wound area 721 is a less severe wound area.
For example, the wound measurement program 142 may compare the size information or severity of the current output wound image with that of the output wound image from one or more previous photographs. In one embodiment, when the area of the current output wound image of a portable electronic device 100 is larger than the area of the previous output wound image by more than a predetermined ratio (e.g., 5%, although not limited thereto), the wound measurement program 142 determines that the wound of the user of the portable electronic device 100 is expanding. The wound measurement program 142 therefore notifies the server 610 to add the name of the user of the portable electronic device 100 (e.g., Zhang San) to a care list and to set an alert notification showing "area" in the care list, so that the medical staff can perform a relevant examination.
In addition, when the severity of the current output wound image of the portable electronic device 100 is greater than the severity of the previous output wound image by more than a predetermined ratio (e.g., 10%, although not limited thereto), the wound measurement program 142 determines that the wound of the user of the portable electronic device 100 is deteriorating. The wound measurement program 142 therefore notifies the server 610 to add the name of the user of the portable electronic device 100 (e.g., Zhang San) to a care list and to set an alert notification showing "severity" in the care list, so that the medical staff can perform a relevant examination.
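The two checks just described amount to simple threshold comparisons against the previous record; a sketch with hypothetical names and the example ratios of 5% and 10% is shown below.

def check_wound_trend(prev_area: float, curr_area: float,
                      prev_severity: float, curr_severity: float,
                      area_ratio: float = 0.05, severity_ratio: float = 0.10) -> list:
    alerts = []
    if curr_area > prev_area * (1 + area_ratio):
        alerts.append("area")       # wound area appears to be expanding
    if curr_severity > prev_severity * (1 + severity_ratio):
        alerts.append("severity")   # wound appears to be deteriorating
    return alerts

print(check_wound_trend(10.8, 11.6, 29.9, 34.0))  # ['area', 'severity']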
Fig. 8 is a flow chart of a method of wound size measurement according to an embodiment of the invention.
In step S810, an input image is obtained by the camera device 105 of the portable electronic device 100. For example, the camera device 105 focuses on and photographs a wound site of the user at a first horizontal pitch angle to obtain the input image. When the portable electronic device 100 is used to take the picture, the horizontal pitch angle of the portable electronic device 100 may vary between 0 and 90 degrees, and the inertial measurement unit 170 can detect the change of the horizontal pitch angle of the portable electronic device 100.
In step S820, the computing unit 120 uses the CNN model 143 to identify the input image and selects the portion of the input image with the highest probability of being a wound as the output wound image. The computing unit 120 uses the RPN model 144 to divide the input image into a plurality of first bounding boxes, and filters out, from the first bounding boxes, a plurality of second bounding boxes whose probability of containing a wound is greater than a predetermined value. For example, as shown in fig. 2D, in the image recognition stage, the input image 210 is subjected to feature extraction by the CNN model 143 to obtain a feature map 1431, and the RPN model 144 can divide the input image 210 into a plurality of first bounding boxes 211 according to the feature map 1431. In addition, the RPN model 144 may set a threshold probability, and if the probability that a first bounding box 211 contains a wound is greater than the threshold probability, the RPN model 144 places that first bounding box 211 into the candidate region (proposals), wherein the first bounding boxes 211 in the candidate region are the second bounding boxes 212, as shown in fig. 2E.
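A compact way to express the filtering and selection just described is sketched below; the box format, names, and the 0.5 threshold are illustrative only.

def select_wound_box(boxes, scores, threshold: float = 0.5):
    """boxes: iterable of (x, y, w, h); scores: wound probabilities for each box."""
    candidates = [(box, score) for box, score in zip(boxes, scores) if score > threshold]
    if not candidates:
        return None
    return max(candidates, key=lambda pair: pair[1])[0]  # highest wound probability wins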
In step S830, the computing unit 120 calculates the actual height and the actual width of the output wound image according to the lens focus parameter reported by the operating system 141 executed by the portable electronic device 100, the plurality of reference calibration parameters corresponding to the horizontal pitch angle of the portable electronic device 100, and the pixel height ratio and the pixel width ratio of the output wound image (for example, as displayed on the display panel 160). For example, when the horizontal pitch angle of the portable electronic device 100 is greater than or equal to 0 degrees and less than 45 degrees, the wound measurement program 142 substitutes the reference calibration parameters obtained from the horizontal calibration process into equations (7) and (8) to obtain the actual height h_m and the actual width w_m of the object to be measured (e.g., in centimeters). When the horizontal pitch angle of the portable electronic device 100 is greater than or equal to 45 degrees and at most 90 degrees, the wound measurement program 142 substitutes the reference calibration parameters obtained from the vertical calibration process into equations (7) and (8) to obtain the actual height h_m and the actual width w_m of the object to be measured (e.g., in centimeters).
In summary, embodiments of the present invention provide a portable electronic device and a method for measuring a wound size, which can calculate an actual height and an actual width of an output wound image according to a focal length parameter of a lens reported by an operating system, a plurality of reference calibration parameters corresponding to the first horizontal pitch angle, and a pixel height ratio and a pixel width ratio of the output wound image. The portable electronic device can obtain the reference calibration parameters through a horizontal calibration process and a vertical calibration process.
The portable electronic device and the wound size measuring method in the embodiments of the invention can objectively and accurately calculate the actual height, the actual width, and the actual area of the wound area in the input image, and can also calculate the severity of the wound area. In addition, the portable electronic device and the wound size measuring method in the embodiments of the invention can compare the area or the severity of the current output wound image with that of the previously captured output wound image to determine whether the wound area shows signs of expansion or deterioration, and can then send an alert notification to the server so that the medical staff can perform a relevant examination, allowing patients to be properly cared for.
The terms "first," "second," "third," and the like in the claims are used to modify elements in the claims; they do not by themselves imply any priority or precedence of one element over another, or the chronological order in which steps of a method are performed, but are merely used to distinguish elements having the same name.
Although the preferred embodiments of the present invention have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (20)

1. A portable electronic device, comprising:
an inertia measuring unit for detecting a horizontal pitch angle of the portable electronic device;
a camera device for obtaining an input image;
a storage device for storing an operating system, a wound measurement program, a region candidate network model and a convolution neural network model; and
an arithmetic unit for executing the wound measurement procedure to perform the following steps:
identifying the input image by using the convolution neural network model, and selecting the part of the input image with the highest probability of being a wound as an output wound image; and
calculating the actual height and the actual width of the output wound image according to the lens focal length parameter reported by the operating system, the plurality of reference correction parameters corresponding to the horizontal pitch angle, and the pixel height ratio and the pixel width ratio of the output wound image.
2. The portable electronic device of claim 1, wherein the reference calibration parameter comprises: the actual height of the reference object, the actual width of the reference object, the pixel height ratio of the reference object and the focusing distance of the reference object.
3. The portable electronic device of claim 2, wherein during the horizontal calibration process of the wound measurement program, the portable electronic device takes a photograph of a reference object with the horizontal tilt angle at 0 degrees to obtain a first reference object image, and obtains a first reference object focus distance from an application programming interface of the operating system, wherein the reference object has the reference object actual height and the reference object actual width;
wherein during the vertical calibration process of the wound measurement program, the portable electronic device takes a photograph of the reference object with the horizontal tilt angle at 90 degrees to obtain a second reference object image, and obtains a second reference object focus distance from the application programming interface of the operating system;
wherein the arithmetic unit divides a first pixel height of the first reference object image or the second reference object image displayed on a display panel of the portable electronic device by a second pixel height of the display panel to obtain a first reference object pixel height ratio or a second reference object pixel height ratio.
4. The portable electronic device as claimed in claim 3, wherein the computing unit uses the first reference object focus distance as the reference object focus distance and uses the first reference object pixel height ratio as the reference object pixel height ratio in response to the horizontal tilt angle being between 0 and 45 degrees;
wherein, in response to the horizontal pitch angle being between 45 and 90 degrees, the arithmetic unit uses the second reference object focus distance as the reference object focus distance and uses the second reference object pixel height ratio as the reference object pixel height ratio.
5. The portable electronic device of claim 4, wherein the computing unit calculates equations (1) and (2) to obtain the actual height and the actual width of the output wound image:
equation (1): reproduced as an image (FDA0003323495940000021) in the original publication, expressing the actual height h_m of the output wound image in terms of h_c, g_c, g_m, p_c, and p_m;
equation (2): reproduced as an image (FDA0003323495940000022) in the original publication, expressing the actual width w_m of the output wound image in terms of w_c and the same parameters;
wherein h_c is the reference object actual height; g_c is the reference object focus distance; p_c is the reference object pixel height ratio; h_m is the actual height of the output wound image; g_m is the lens focal length parameter; p_m is the pixel height ratio; w_m is the actual width of the output wound image; and w_c is the reference object actual width.
6. The portable electronic device of claim 1, wherein the computing unit further performs a machine learning clustering algorithm to divide the output wound image into a wound area and a normal skin area;
wherein the arithmetic unit further calculates a first number of pixels in the output wound image and a second number of pixels in the wound area, and divides the second number of pixels by the first number of pixels to obtain a wound area pixel ratio;
the computing unit further multiplies the actual height of the output wound image by the actual width to obtain an actual area of the output wound image, and multiplies the actual area by the wound area pixel ratio to obtain an actual area of the wound area.
7. The portable electronic device of claim 6, wherein the computing unit further calculates a first red average, a first green average, and a first blue average of the red, green, and blue subpixels of each pixel in the wound area, calculates a second red average, a second green average, and a second blue average of the red, green, and blue subpixels of each pixel in the normal skin area, and calculates the Euclidean distance between the wound area and the normal skin area according to the first red average, the first green average, the first blue average, the second red average, the second green average, and the second blue average to indicate the severity of the wound area.
8. The portable electronic device as claimed in claim 7, wherein in response to the computing unit determining that the actual area of the output wound image is greater than the actual area of a previous output wound image by a first predetermined ratio, the computing unit notifies a server to add a user name of the portable electronic device to a care list for medical staff to perform a related review.
9. The portable electronic device as claimed in claim 7, wherein in response to the computing unit determining that the severity of the output wound image is greater than the severity of a previous output wound image by a second predetermined ratio, the computing unit notifies a server to add a user name of the portable electronic device to a care list for medical staff to perform a related review.
10. The portable electronic device of claim 1, wherein before the computing unit identifies the input image using the convolution neural network model, the computing unit uses the region candidate network model to generate a plurality of first bounding boxes from the input image, and filters, from the first bounding boxes, a plurality of second bounding boxes whose probability of containing a wound is greater than a predetermined value;
wherein the convolution neural network model selects the second bounding box with the highest probability of being a wound as the output wound image.
11. A wound size measuring method is used for a portable electronic device, the portable electronic device comprises a display panel and a camera device, and the method comprises the following steps:
obtaining an input image by using the camera device;
identifying the input image by using a convolution neural network model, and selecting the portion of the input image with the highest probability of being a wound as an output wound image; and
calculating the actual height and the actual width of the wound area in the output wound image according to the lens focal length parameter reported by the operating system, a plurality of reference correction parameters corresponding to the horizontal pitch angle of the portable electronic device, and the pixel height ratio and the pixel width ratio of the output wound image.
12. The wound size measurement method of claim 11, wherein the reference correction parameters comprise: the actual height of the reference object, the actual width of the reference object, the pixel height ratio of the reference object and the focusing distance of the reference object.
13. The method of claim 12, wherein during the horizontal calibration process of the wound measurement program, the portable electronic device takes a photograph of a reference object with the horizontal tilt angle at 0 degrees to obtain a first reference object image, and obtains a first reference object focus distance from an application programming interface of the operating system, the reference object having the reference object actual height and the reference object actual width;
wherein during the vertical calibration process of the wound measurement program, the portable electronic device takes a photograph of the reference object with the horizontal tilt angle at 90 degrees to obtain a second reference object image, and obtains a second reference object focus distance from the application programming interface of the operating system;
the arithmetic unit divides a first pixel height of the first reference object image or the second reference object image displayed on the display panel by a second pixel height of the display panel to obtain a first reference object pixel height ratio or a second reference object pixel height ratio.
14. The method of claim 13, wherein the computing unit uses the first reference object focus distance as the reference object focus distance and the first reference object pixel height ratio as the reference object pixel height ratio in response to the horizontal pitch angle being between 0 and 45 degrees;
wherein, in response to the horizontal pitch angle being between 45 and 90 degrees, the arithmetic unit uses the second reference object focus distance as the reference object focus distance and uses the second reference object pixel height ratio as the reference object pixel height ratio.
15. The wound size measurement method of claim 14, further comprising:
calculating equations (1) and (2) to obtain the actual height and the actual width of the output wound image:
equation (1): reproduced as an image (FDA0003323495940000041) in the original publication, expressing the actual height h_m of the output wound image in terms of h_c, g_c, g_m, p_c, and p_m;
equation (2): reproduced as an image (FDA0003323495940000042) in the original publication, expressing the actual width w_m of the output wound image in terms of w_c and the same parameters;
wherein h_c is the reference object actual height; g_c is the reference object focus distance; p_c is the reference object pixel height ratio; h_m is the actual height of the output wound image; g_m is the lens focal length parameter; p_m is the pixel height ratio; w_m is the actual width of the output wound image; and w_c is the reference object actual width.
16. The wound size measurement method of claim 11, further comprising:
executing a machine learning clustering algorithm to divide the output wound image into a wound area and a normal skin area;
calculating a first number of pixels in the output wound image and a second number of pixels in the wound area, and dividing the second number of pixels by the first number of pixels to obtain a wound area pixel ratio; and
multiplying the actual height of the output wound image by the actual width to obtain an actual area of the output wound image, and multiplying the actual area by the wound area pixel ratio to obtain an actual area of the wound area.
17. The wound size measurement method of claim 16, further comprising:
calculating a first red average, a first green average and a first blue average of the red, green and blue subpixels of each pixel in the wound area;
calculating a second red average, a second green average and a second blue average of the red, green and blue subpixels of each pixel in the normal skin region; and
calculating the Euclidean distance between the wound area and the normal skin area according to the first red average value, the first green average value, the first blue average value, the second red average value, the second green average value and the second blue average value to represent the severity of the wound area.
18. The wound size measurement method of claim 17, further comprising: in response to determining that the actual area of the output wound image is larger than the actual area of a previous output wound image by a first predetermined ratio, a server is notified to add the user name of the portable electronic device to a care list for medical staff to perform related inspection.
19. The wound size measurement method of claim 17, further comprising:
notifying a server to add the user name of the portable electronic device to a care list for medical staff to perform a related review, in response to determining that the severity of the output wound image is greater than the severity of a previous output wound image by a second predetermined ratio.
20. The method of claim 11, wherein prior to identifying the input image using the convolution neural network model, the method further comprises:
using a region candidate network model to generate a plurality of first bounding boxes from the input image, and filtering, from the first bounding boxes, a plurality of second bounding boxes whose probability of containing a wound is greater than a predetermined value; and
selecting, by using the convolution neural network model, the second bounding box with the highest probability of being a wound as the output wound image.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW110130445A TWI783636B (en) 2021-08-18 2021-08-18 Portable electronic device and method of measuring size of wound
TW110130445 2021-08-18

Publications (1)

Publication Number Publication Date
CN115810039A true CN115810039A (en) 2023-03-17

Family

ID=80001394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111254232.6A Pending CN115810039A (en) 2021-08-18 2021-10-27 Portable electronic device and wound size measuring method

Country Status (4)

Country Link
US (1) US20230058754A1 (en)
EP (1) EP4138033A1 (en)
CN (1) CN115810039A (en)
TW (1) TWI783636B (en)



Also Published As

Publication number Publication date
TWI783636B (en) 2022-11-11
TW202309923A (en) 2023-03-01
EP4138033A1 (en) 2023-02-22
US20230058754A1 (en) 2023-02-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination