CN111698401A - Apparatus, image processing apparatus, control method, and storage medium - Google Patents

Apparatus, image processing apparatus, control method, and storage medium

Info

Publication number
CN111698401A
CN111698401A (application CN202010172159.7A)
Authority
CN
China
Prior art keywords
affected part
image
size
control
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010172159.7A
Other languages
Chinese (zh)
Other versions
CN111698401B (en)
Inventor
黑田友树
川合良和
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020023378A (JP2020156082A)
Application filed by Canon Inc filed Critical Canon Inc
Publication of CN111698401A
Application granted
Publication of CN111698401B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/0037 Performing a preliminary scan, e.g. a prescan for identifying a region of interest
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072 Measuring distances on the body, e.g. measuring length, height or thickness
    • A61B5/1075 Measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • A61B5/1079 Measuring physical dimensions using optical or photographic means
    • A61B5/44 Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A61B5/447 Skin evaluation specially adapted for aiding the prevention of ulcer or pressure sore development, i.e. before the ulcer or sore has developed
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Sensors mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Notification to user using visual displays
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B2576/00 Medical imaging apparatus involving image processing or analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06T7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30092 Stomach; Gastric
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/80 Camera processing pipelines; Components thereof

Abstract

The invention provides an apparatus, an image processing apparatus, and a control method. The apparatus comprises: a sensor configured to capture an image of an affected part; and a processor configured to obtain information on a size of the affected part in the captured image, and to control, based on the information on the size of the affected part, a timing to capture the image of the affected part or a timing to prompt a user to perform an imaging operation.

Description

Apparatus, image processing apparatus, control method, and storage medium
Technical Field
Aspects of the embodiments relate to an apparatus, an image processing apparatus, and a control method.
Background
People and animals in a lying position may develop bedsores (pressure ulcers) because parts of the body are pressed against the supporting surface by the body's own weight. A patient who develops a bedsore may need bedsore treatment such as body pressure distribution care and skin care, together with periodic evaluation and management of the bedsore.
A scale called DESIGN-R is described on page 23 of the Bedsore Guidebook (2nd edition), which conforms to the JSPU Guidelines for the Prevention and Management of Bedsores (4th edition) (ISBN-13: 978-...). DESIGN-R is a tool used to evaluate the healing process of wounds, including bedsores. The scale takes its name from the initial letters of its observation items: Depth, Exudate, Size, Inflammation/infection, Granulation tissue, and Necrotic tissue.
There are two types of DESIGN-R scales: one for severity classification, intended for simple daily evaluation, and one for monitoring, which represents the progress of the healing process in detail. DESIGN-R for severity classification includes six evaluation items, each with two classifications: mild and severe. The "mild" classification is indicated by lower-case letters, and the "severe" classification by upper-case letters.
At the initial stage of treatment, an evaluation using DESIGN-R for severity classification gives a rough grasp of the state of the bedsore. Because the evaluation shows which item or items the problem relates to, the treatment policy can be readily determined.
For monitoring purposes, a DESIGN-R scale that also enables comparison of severity between patients has been developed. "R" stands for rating (evaluation, scoring). The items are given different weights, and the total score (0 to 66 points) of the six items other than depth indicates the severity of the bedsore. After the start of treatment, the progress of treatment can be evaluated in detail and objectively, which enables not only individual monitoring but also comparison of severity between patients.
According to DESIGN-R, the size is evaluated by measuring, in centimeters, the major axis and the minor axis (the longest diameter perpendicular to the major axis) of the skin wound area. The size, taken as the numerical value obtained by multiplying the measured major and minor axes, is classified into seven levels: s0: none; s3: less than 4 cm²; s6: 4 cm² or more but less than 16 cm²; s8: 16 cm² or more but less than 36 cm²; s9: 36 cm² or more but less than 64 cm²; s12: 64 cm² or more but less than 100 cm²; and s15: 100 cm² or more.
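For reference, the seven-level size classification can be written as a small function. The sketch below (Python) is illustrative only; the function name and the treatment of a zero-size wound as level s0 are assumptions rather than part of the guideline text.

```python
def design_r_size_score(major_cm: float, minor_cm: float) -> str:
    """Classify wound size into the seven DESIGN-R size levels (s0 to s15)."""
    if major_cm <= 0 or minor_cm <= 0:
        return "s0"                 # no skin wound (assumed handling)
    size = major_cm * minor_cm      # product of major and minor axes, in cm^2
    if size < 4:
        return "s3"
    if size < 16:
        return "s6"
    if size < 36:
        return "s8"
    if size < 64:
        return "s9"
    if size < 100:
        return "s12"
    return "s15"

# Example: a 5.5 cm x 3.0 cm wound gives 16.5 cm^2, i.e. level s8.
print(design_r_size_score(5.5, 3.0))
```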
As discussed in the bedsore guidebook mentioned above, scoring with DESIGN-R once every one to two weeks is recommended to evaluate the healing process of a bedsore and to select appropriate care. Regular evaluation and management of the condition of a bedsore may thus be desirable, and high evaluation accuracy is required to observe changes in the state of the bedsore.
In this situation, the size of a bedsore is often evaluated based on values measured manually by placing a measuring scale against the affected part. Specifically, the longest straight-line distance between two points in the skin wound area is measured as the major axis, and the longest length perpendicular to the major axis is measured as the minor axis. The measurements of the major and minor axes are multiplied to determine the size of the bedsore.
When an image of a bedsore is taken, the bedridden patient is raised halfway, and the photographer captures an image of the patient's back in an unnatural posture, so a proper image may not be captured. In addition, since the shape and area of a bedsore vary with the posture of the patient, the bedsore may appear different each time an image is taken, which makes it difficult to accurately compare the development of the bedsore using captured images. This problem is not limited to bedsores; it also applies to the imaging of burns and lacerations.
Disclosure of Invention
Aspects of the embodiments are directed to appropriately capturing an image of an affected part.
According to an aspect of an embodiment, an apparatus includes: a sensor configured to capture an image of an affected part; and a processor configured to obtain information on the size of the affected part in the captured image, and to control a timing to capture the image of the affected part or to control a timing to prompt a user to perform an imaging operation based on the information on the size of the affected part.
An apparatus, comprising: a sensor configured to capture an image of an affected part; and a processor configured to, in a case where the apparatus is facing a subject including the affected part, control so that an image of the affected part is captured or control so that a user is prompted to perform an imaging operation.
An image processing apparatus comprising: a communication circuit configured to receive an image including an affected part from the image pickup apparatus; and a processor configured to calculate a size of the affected part based on the received image, and to transmit information on the size of the affected part to the image pickup apparatus via the communication circuit in order to control a timing to photograph an image of the affected part by the image pickup apparatus or to control a timing to prompt a user to perform an image pickup operation based on the information on the size of the affected part.
A method, comprising: capturing an image of the affected part by a sensor; obtaining information on the size of the affected part in a captured image; and controlling a timing to photograph an image of the affected part or a timing to prompt a user to perform an imaging operation based on the information on the size of the affected part.
A method, comprising: determining whether a subject including an affected part directly faces the apparatus; and performing control, in a case where the subject including the affected part directly faces the apparatus, so that an image of the affected part is captured by a sensor or so that a user is prompted to perform an imaging operation.
A method, comprising: receiving an image including an affected part from a device; calculating a size of the affected part based on the received image; and transmitting information on the size of the affected part to the apparatus in order to control a timing to capture an image of the affected part or to control a timing to prompt a user to perform an imaging operation based on the information on the size of the affected part.
Other features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Drawings
Fig. 1 is a diagram showing an outline of an image processing system.
Fig. 2 is a diagram showing a hardware configuration of the image pickup apparatus.
Fig. 3 is a diagram showing a hardware configuration of the image processing apparatus.
Fig. 4A and 4B are flowcharts showing the processing of the image processing system.
Fig. 5 is a diagram illustrating a method of calculating the area of the affected area region.
Fig. 6A and 6B are diagrams illustrating a method for superimposing information on image data relating to an affected part.
Fig. 7A, 7B, and 7C are diagrams illustrating a method for superimposing information on image data relating to an affected part.
Fig. 8A and 8B are flowcharts showing the processing of the image processing system.
Detailed Description
Exemplary embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a diagram showing an example of an outline of an image processing system according to a first exemplary embodiment.
The image processing system 1 includes an image capturing apparatus 200, which is a handheld portable device, and an image processing apparatus 300. In the present exemplary embodiment, a bedsore occurring on the hip is described as an example of the affected part 102 of the object 101.
In the image processing system 1, the image pickup apparatus 200 captures a live view image of the affected part 102 of the object 101 and transmits image data of the captured live view image to the image processing apparatus 300. The image processing apparatus 300 extracts an affected part region including the affected part 102 from the received image data, calculates the area of the affected part region, and transmits information on the calculated area to the image pickup apparatus 200. The image pickup apparatus 200 automatically captures an image of the affected part 102 if the received area of the affected part region is greater than or equal to a threshold value. Although the present exemplary embodiment is described using the case where the affected part 102 is a bedsore as an example, the affected part 102 is not limited to a bedsore and may be a burn or a laceration.
Fig. 2 is a diagram illustrating an example of the hardware configuration of the image pickup apparatus 200.
The image capturing apparatus 200 may be a standard single-lens camera, a compact digital camera, or a smartphone or tablet terminal including a camera having an autofocus function.
The image pickup unit 211 includes a lens group 212, a shutter 213, and an image sensor 214. The focus position and zoom magnification can be changed by changing the positions of a plurality of lenses included in the lens group 212. The lens group 212 further includes an aperture for adjusting the exposure amount.
The image sensor 214 includes a solid-state image sensor of a charge accumulation type for converting an optical image into image data. Examples of the solid-state image sensor include a Charge Coupled Device (CCD) sensor and a Complementary Metal Oxide Semiconductor (CMOS) sensor. The reflected light from the object 101 that has passed through the lens group 212 and the shutter 213 forms an image on the image sensor 214. The image sensor 214 generates an electric signal based on the subject image, and outputs image data based on the generated electric signal.
The shutter 213 exposes and shields the image sensor 214 by opening and closing the shutter member to control the exposure time of the image sensor 214. The shutter 213 may be an electronic shutter that controls an exposure time by driving the image sensor 214. In order to implement an electronic shutter on a CMOS sensor, a reset scan for resetting the amount of electric charges accumulated in pixels to zero is performed pixel by pixel or in units of areas each including a plurality of pixels (for example, row by row). Scanning for reading a signal based on the accumulated charge amount is performed pixel by pixel or region by region every time after a predetermined time has elapsed from the reset scanning.
The zoom control circuit 215 controls a motor for driving a zoom lens included in the lens group 212 to control the optical magnification of the lens group 212.
The ranging system 216 calculates distance information on the distance to the object 101. The ranging system 216 may use a typical phase-difference ranging sensor included in a single-lens reflex camera, or a time-of-flight (TOF) sensor. A TOF sensor measures the distance to an object based on the time difference (or phase difference) between the transmission timing of an irradiation wave and the reception timing of the reflected wave, i.e., the irradiation wave reflected back from the object. The ranging system 216 may also use a position sensitive device (PSD) system in which a PSD is used as the light receiving element.
The image sensor 214 may be configured to include a plurality of photoelectric conversion regions in each pixel, so that distance information of each pixel position or region position may be determined according to a phase difference between images obtained by the respective photoelectric conversion regions.
The ranging system 216 may be configured to determine range information in one or more predetermined ranging regions in the image. The ranging system 216 may be configured to determine a range map representing a distribution of range information associated with a large number of pixels or regions in an image.
The ranging system 216 may perform television auto focus (TV-AF) or contrast AF that extracts and integrates high frequency components of image data to determine the position of a focus lens where the integrated value is maximum. The ranging system 216 may obtain distance information based on the position of the focus lens.
The image processing circuit 217 applies predetermined image processing to the image data output from the image sensor 214. The image processing circuit 217 performs various types of image processing such as white balance adjustment, gamma correction, color interpolation or demosaicing, and filtering on image data output from the image capturing unit 211 or image data stored in the internal memory 221. The image processing circuit 217 also performs compression processing conforming to the Joint Photographic Experts Group (JPEG) standard on the image-processed image data.
The AF control circuit 218 determines the position of a focus lens included in the lens group 212 based on distance information obtained by the ranging system 216, and controls a motor for driving the focus lens.
The communication unit 219 is a communication interface for communicating with an external apparatus such as the image processing apparatus 300 via a wireless network. Specific examples of the network include networks based on the Wi-Fi (registered trademark) standard. Wi-Fi based communication can be performed via a router. The communication unit 219 may be implemented by a wired communication interface such as a Universal Serial Bus (USB) interface and a Local Area Network (LAN).
The system control circuit 220 includes a Central Processing Unit (CPU), and controls the entire image pickup apparatus 200 by executing a program stored in the internal memory 221. The system control circuit 220 also controls the image pickup unit 211, the zoom control circuit 215, the distance measurement system 216, the image processing circuit 217, and the AF control circuit 218. The system control circuit 220 is not limited to including a CPU, and a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC) may be used. If a predetermined image capturing condition is satisfied, the system control circuit 220 generates the same internal signal as when the user issues an image capturing instruction via the operation unit 224 based on the image capturing condition.
Rewritable memory such as flash memory and Synchronous Dynamic Random Access Memory (SDRAM) may be used as the internal memory 221. The internal memory 221 temporarily stores various types of setting information such as information on a focusing position during image capturing used for operation of the image capturing apparatus 200, image data captured by the image capturing unit 211, and image data after image processing by the image processing circuit 217. The internal memory 221 can temporarily store analysis data and image data such as information on the size of a subject received by the communication unit 219 through communication with the image processing apparatus 300.
The external memory 222 is a nonvolatile recording medium that can be mounted on the image pickup apparatus 200 or built in the image pickup apparatus 200. Examples of the external memory 222 include a Secure Digital (SD) card and a Compact Flash (CF) card. The external memory 222 records image data after image processing by the image processing circuit 217 and image data and analysis data received by the communication unit 219 through communication with the image processing apparatus 300. During playback, the recorded image data may be read from the external memory 222, and output to the outside of the image capturing apparatus 200.
Examples of the display unit 223 include a Thin Film Transistor (TFT) liquid crystal display, an organic Electroluminescence (EL) display, and an Electronic Viewfinder (EVF). The display unit 223 displays the image data temporarily stored in the internal memory 221, the image data recorded in the external memory 222, and the setting screen of the image capturing apparatus 200.
The operation unit 224 includes buttons, switches, keys, and/or a mode dial arranged on the image pickup apparatus 200, or a touch panel also serving as the display unit 223. User commands such as a mode setting command and an image capturing instruction are notified to the system control circuit 220 via the operation unit 224.
The common bus 225 includes signal lines for transmitting and receiving signals between components of the image pickup apparatus 200.
Fig. 3 is a diagram showing an example of the hardware configuration of the image processing apparatus 300.
The image processing apparatus 300 includes a CPU 310, a storage unit 312, a communication unit 313, an output unit 314, and an auxiliary arithmetic unit 317.
The CPU 310 includes an arithmetic unit 311. The CPU 310 controls the entire image processing apparatus 300 by executing a program stored in the storage unit 312.
The storage unit 312 includes a main storage unit 315, such as a read-only memory (ROM) and a random access memory (RAM), and an auxiliary storage unit 316, such as a disk drive or a solid state drive (SSD).
The communication unit 313 is a wireless communication module for communicating with an external apparatus such as the image capturing apparatus 200 via a wireless network.
The output unit 314 outputs the data processed by the arithmetic unit 311 and the data stored in the storage unit 312 to a display, a printer, or an external network connected to the image processing apparatus 300.
The auxiliary arithmetic unit 317 is an integrated circuit (IC) for auxiliary arithmetic operations that operates under the control of the CPU 310. A graphics processing unit (GPU) may be used as the auxiliary arithmetic unit 317. A GPU, a processor originally intended for image processing, includes many multiply-accumulators and excels at matrix computation, so it can also be used as a processor for learning processing; GPUs are therefore typically used for processing that involves deep learning. For example, a Jetson TX2 module manufactured by NVIDIA Corporation may be used as the auxiliary arithmetic unit 317. Alternatively, an FPGA or an ASIC may be used. The auxiliary arithmetic unit 317 performs the processing for extracting an affected area from the image data.
The image processing apparatus 300 may include one or more CPUs 310 and one or more storage units 312. In other words, the functions described below are realized when at least one CPU and at least one storage unit are connected and the at least one CPU executes a program stored in the at least one storage unit. The processor is not limited to the CPU 310; an FPGA or an ASIC may be used instead.
Fig. 4A and 4B are flowcharts showing an example of the processing of the image processing system 1.
In fig. 4A and 4B, steps S401 to S420 represent processing of the image pickup apparatus 200. Steps S431 to S456 represent processing of the image processing apparatus 300. The flowcharts of fig. 4A and 4B are started by both the image capturing apparatus 200 and the image processing apparatus 300 connected to a network conforming to the Wi-Fi standard, which is a wireless LAN standard.
In step S431, the image processing apparatus 300 performs search processing for searching for the image capturing apparatus 200 to be connected.
In step S401, the image capturing apparatus 200 performs response processing in response to the search processing by the image processing apparatus 300. Here, Universal Plug and Play (UPnP) is used as the technique for searching for devices over the network. In UPnP, each individual device is identified by a Universally Unique Identifier (UUID).
In step S402, the image capturing apparatus 200 communicably connected to the image processing apparatus 300 starts the live view processing. Specifically, the image capturing unit 211 generates image data, and the image processing circuit 217 performs development processing for generating image data for live view display on the generated image data. The image pickup unit 211 and the image processing circuit 217 repeat this processing, whereby a live view image is displayed on the display unit 223 at a predetermined frame rate.
In step S403, the ranging system 216 determines distance information on the distance to the object 101, and the AF control circuit 218 starts AF processing for controlling driving of the lens group 212 to bring the object 101 into focus. If the ranging system 216 adjusts the focus position by TV-AF or contrast AF, the ranging system 216 determines distance information about the distance to the object 101 in the in-focus image based on the position of the focus lens in the in-focus state. The object to be focused may be a subject in the center of the image data or a subject closest to the image capturing apparatus 200. If the ranging system 216 obtains a distance map about the object 101, the ranging system 216 can estimate a region of interest from the distance map and focus the estimated region. If the image processing apparatus 300 has identified the position of the affected area in the live view image, the ranging system 216 may cause the identified position to be focused. The image capturing apparatus 200 repeats the display of the live view image and the AF processing until a release signal is issued in step S413 described below.
In step S404, the image processing circuit 217 subjects one frame of image data of the live view image to development and compression processing to generate image data conforming to, for example, the JPEG standard. The image processing circuit 217 then performs scaling processing on the compressed image data to reduce its size.
In step S405, the communication unit 219 obtains the scaled image data and the distance information determined by the ranging system 216. The communication unit 219 also obtains information on the zoom magnification and the size (number of pixels) of the scaled image data as appropriate.
In step S406, the communication unit 219 transmits the obtained image data and one or more pieces of information including the distance information to the image processing apparatus 300 by wireless communication. The larger the size of the image data to be transmitted here, the longer it takes for wireless communication. Thus, the system control circuit 220 determines the size to which the image processing circuit 217 scales the image data in step S404 based on the allowable communication time. If the size of the image data is too small, the accuracy of the extraction process of the image processing apparatus 300 to extract the affected area in step S442 described below may be affected. Therefore, the system control circuit 220 determines the size of the image data based on the accuracy of the extraction process of the affected area in addition to the communication time. The processing of steps S404 to S406 may be performed on a frame-by-frame basis or several frames at a time.
Then, the processing proceeds to the processing of the image processing apparatus 300.
In step S441, the communication unit 313 of the image processing apparatus 300 receives the image data and one or more pieces of information including the distance information transmitted from the communication unit 219 of the image capturing apparatus 200.
In step S442, the CPU 310 and the auxiliary arithmetic unit 317 of the image processing apparatus 300 extract the affected area region from the received image data.
As the technique for extracting the affected area region, semantic segmentation based on deep learning is performed. Specifically, a high-performance computer for training generates a trained model by training a neural network model using a plurality of images of actual bedsore-affected areas as teaching data. The auxiliary arithmetic unit 317 obtains the trained model from the high-performance computer and estimates the bedsore area, or affected area, from the image data based on the trained model. As an example of the neural network model, a fully convolutional network (FCN), a deep-learning-based segmentation model, may be applied. The deep-learning-based estimation is processed by the auxiliary arithmetic unit 317, which excels at parallel execution of product-sum operations; however, it may also be performed by an FPGA or an ASIC. Other deep learning models may be used for segmentation, and the segmentation technique is not limited to deep learning. For example, graph cuts, region growing, edge detection, or divide-and-conquer segmentation may be used. The auxiliary arithmetic unit 317 may also train the neural network model internally using images of bedsore-affected areas as teaching data.
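As an illustration only (the embodiment does not prescribe a specific framework), the semantic segmentation of step S442 could be realized roughly as follows; the torchvision FCN backbone, the two-class setup, the weights file name, and the 0.5 probability threshold are all assumptions.

```python
import numpy as np
import torch
import torchvision

# Hypothetical two-class (background / affected area) FCN standing in for the
# trained model obtained from the high-performance training computer.
model = torchvision.models.segmentation.fcn_resnet50(num_classes=2)
model.load_state_dict(torch.load("affected_area_fcn.pth"))  # assumed weights file
model.eval()

def extract_affected_region(image_rgb: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the estimated affected area (step S442)."""
    x = torch.from_numpy(image_rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        logits = model(x.unsqueeze(0))["out"]      # shape (1, 2, H, W)
    prob = torch.softmax(logits, dim=1)[0, 1]      # probability of "affected"
    return (prob > 0.5).cpu().numpy()
```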
In step S443, the arithmetic unit 311 of the CPU 310 calculates the area of the affected area as information on the size of the extracted affected area.
Fig. 5 is a diagram illustrating a method for calculating the area of an affected area.
If the image capturing apparatus 200 is a standard camera, it can be treated as a pinhole model, as illustrated in Fig. 5. The incident light 501 passes through the principal point of the lens 212a and is received by the imaging surface of the image sensor 214. If the lens group 212 is approximated by a single thin lens 212a, the two principal points (the front and rear principal points) can be considered to coincide. The image capturing apparatus 200 can bring the subject 504 into focus by adjusting the focus position of the lens 212a so that an image is formed on the plane of the image sensor 214. Changing the focal length 502, i.e., the distance F between the imaging plane and the lens principal point, changes the angle of view θ 503 and thereby the zoom magnification. The width 506 of the subject 504 on the focus plane is determined geometrically from the relationship between the angle of view θ 503 and the subject distance 505 of the image capturing apparatus 200, and is calculated using a trigonometric function; that relationship varies with the focal length 502. The length on the focus plane corresponding to one pixel of image data is obtained by dividing the width 506 of the subject 504 by the number of pixels on the corresponding line of the image data.
The arithmetic unit 311 calculates the area of the affected area as the product of the number of pixels obtained from the affected area extracted in step S442 and the area of each pixel determined according to the length on the focus plane corresponding to one pixel of the image data. The length on the focus plane corresponding to one pixel of the image data may be determined in advance for various combinations of the focal length 502 and the object distance 505, and the resulting length may be provided as table data. Table data corresponding to various image capturing apparatuses 200 may be stored in advance in the image processing apparatus 300.
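A minimal sketch of this geometry is shown below: the field-of-view width from the pinhole model, the length per pixel on the focus plane, and the area from the pixel count. The sensor-width parameter and the assumption of square pixels are illustrative and are not stated in the embodiment.

```python
import math

def length_per_pixel_cm(focal_length_mm: float, subject_distance_mm: float,
                        sensor_width_mm: float, image_width_px: int) -> float:
    """Length on the focus plane corresponding to one pixel of image data (cm)."""
    # Half angle of view from the pinhole model: tan(theta/2) = (sensor width / 2) / F
    half_angle = math.atan((sensor_width_mm / 2.0) / focal_length_mm)
    # Width 506 of the subject on the focus plane at subject distance 505
    subject_width_mm = 2.0 * subject_distance_mm * math.tan(half_angle)
    return (subject_width_mm / image_width_px) / 10.0   # mm -> cm

def affected_area_cm2(pixel_count: int, cm_per_px: float) -> float:
    """Area of the affected region, assuming approximately square pixels."""
    return pixel_count * cm_per_px ** 2

# Example: 35 mm focal length, 500 mm subject distance, 36 mm sensor width,
# 1440-pixel-wide image -> about 0.036 cm per pixel on the focus plane.
cm_px = length_per_pixel_cm(35.0, 500.0, 36.0, 1440)
print(affected_area_cm2(12000, cm_px))   # area of a 12000-pixel region
```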
Note that the arithmetic unit 311 determines the area of the affected area correctly on the premise that the subject 504 is a plane and that this plane is perpendicular to the optical axis.
In step S444, the arithmetic unit 311 superimposes information indicating the extraction result of the affected area region and information on the size of the affected area region on the image data from which the affected area region is extracted to generate image data (superimposing process).
Fig. 6A and 6B are diagrams illustrating a method for superimposing information indicating the extraction result of the affected area region and information on the size of the affected area region on image data.
Fig. 6A shows an image 601 as an example of image data displayed before superimposition processing. The image 601 includes the subject 101 and the affected part 102. Fig. 6B shows an image 602 as an example of image data displayed after the superimposition processing.
The label 611 is superimposed on the upper left corner of the image 602 shown in fig. 6B. The label 611 displays a character string 612 indicating the area of the affected area in white characters on a black background. The character string 612 is the information on the size of the affected area, and indicates the area calculated by the arithmetic unit 311. The background color of the label 611 and the color of the character string 612 are not limited to black and white as long as they are easy to view. Transparency may be set using alpha blending so that the user can still see the underlying image where the label 611 is superimposed.
An index 613 representing the estimated region of the affected area extracted in step S442 is also superimposed on the image 602. The index 613 and the original image data of the image 601 are alpha-blended so that the user can check whether the estimated region is suitable for calculating the area of the affected area. The color of the index 613 is desirably different from the color of the object 101, and the alpha-blending transparency is set in a range that allows the estimated region to be distinguished from the original affected part 102. As long as the index 613 indicating the estimated region of the affected area is displayed in a superimposed manner, step S443 can be omitted, because the user can check whether the estimated region is appropriate even without the label 611.
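One possible way to render the superimposition of Fig. 6B (the alpha-blended estimated region plus the area label) is sketched below with OpenCV; the colors, transparency, and label geometry are illustrative choices rather than values from the embodiment.

```python
import cv2
import numpy as np

def superimpose_result(image_bgr: np.ndarray, mask: np.ndarray,
                       area_cm2: float, alpha: float = 0.4) -> np.ndarray:
    """Blend the estimated affected region into the image and add an area label."""
    overlay = image_bgr.copy()
    overlay[mask] = (0, 255, 255)                       # index 613: estimated region
    blended = cv2.addWeighted(overlay, alpha, image_bgr, 1.0 - alpha, 0.0)
    # Label 611: white character string 612 on a black background, upper-left corner.
    cv2.rectangle(blended, (0, 0), (260, 40), (0, 0, 0), thickness=-1)
    cv2.putText(blended, f"Area: {area_cm2:.1f} cm2", (10, 28),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
    return blended
```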
In step S445, the communication unit 313 of the image processing apparatus 300 transmits information indicating the extraction result of the affected area region and information relating to the size of the affected area region to the image capturing apparatus 200 by wireless communication. In the present exemplary embodiment, the communication unit 313 transmits the image data generated in step S444 and the area of the affected area (i.e., information on the size of the affected area) to the image capturing apparatus 200.
Then, the process returns to the process of the image pickup apparatus 200.
In step S407, the communication unit 219 of the image capturing apparatus 200 receives the data transmitted from the image processing apparatus 300 if the data exists.
In step S408, the system control circuit 220 determines whether or not the image data and the area of the affected area, which is information on the size of the affected area, are received. If the image data and the area of the affected area are received (yes in step S408), the processing proceeds to step S409. Otherwise (no in step S408), the processing proceeds to step S410.
In step S409, the display unit 223 displays the received image data for a predetermined time. Here, the display unit 223 displays the image 602 shown in fig. 6B. Displaying information indicating the extraction result of the affected area superimposed on the live view image before performing image capturing for recording purposes enables a user (hereinafter also referred to as a photographer) to check whether the area of the affected area and the estimated area are appropriate. Although in the present exemplary embodiment, it is described that the index 613 representing the estimated region of the affected area and the label 611 displaying the area of the affected area are to be displayed, the display unit 223 may display any one of the index 613 and the label 611. The display unit 223 may be configured to display neither the indicator 613 nor the label 611 (step S409 may be omitted).
In step S410, the system control circuit 220 controls the timing of capturing an image based on whether the image capturing condition is satisfied. Step S410 includes steps S411 and S412. If the affected area is tilted with respect to the imaging apparatus 200, its area cannot be calculated correctly. Moreover, if the inclination of the affected area with respect to the imaging apparatus 200 changes every time an image is captured, how the size has changed cannot be determined correctly when the captured images are compared later. It is difficult to support a bedridden patient in the same position every time, and it is difficult for the photographer to hold the image pickup apparatus 200 so that it directly faces the affected part. In steps S411 and S412, the imaging apparatus 200 therefore performs processing for estimating whether it is directly facing the affected area.
In step S411, the system control circuit 220 compares the received area of the affected area with a predetermined threshold value, and determines whether the area is greater than or equal to the threshold value. The threshold value is recorded in advance in the external memory 222 as an initial value. The system control circuit 220 of the image pickup apparatus 200 reads the threshold value from the external memory 222 at the time of startup. For example, an area corresponding to the size of the affected area that can be recognized as a decubitus ulcer is set as the threshold value. Later, the user is free to change and set the threshold. Thereby, the threshold value can be changed and set for each subject whose image is to be captured. In step S411, if the area is not greater than or equal to the threshold value (no in step S411), the processing returns to step S404. The image capturing apparatus 200 then performs the processing of step S404 and subsequent steps as described above.
In step S411, if the area is determined to be greater than or equal to the threshold value (yes in step S411), the processing proceeds to step S412. In step S412, the system control circuit 220 determines the ratio or difference between the most recently received area and the area received immediately before it. While the photographer is still moving the image pickup apparatus 200 into a position directly facing the affected area, these two areas differ greatly. Specifically, as the image pickup apparatus 200 moves from a state of being inclined with respect to the affected area toward a state of directly facing it, the area of the affected area increases; conversely, as it moves from a state of directly facing the affected area toward an inclined state, the area decreases. When the image pickup apparatus 200 is close to directly facing the affected area, the change in the area becomes small. In step S412, the system control circuit 220 therefore determines whether the ratio or difference between the most recently received area and the area received immediately before it falls within a predetermined range. If it does, the system control circuit 220 estimates that the image pickup apparatus 200 is directly facing the affected area (yes in step S412), and the processing proceeds to step S413. If the ratio or difference does not fall within the predetermined range (no in step S412), the processing returns to step S404, and the image capturing apparatus 200 performs the aforementioned processing of step S404 and subsequent steps.
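Steps S411 to S413 together amount to a short decision rule, sketched below; the threshold and the 5% stability band are placeholders, since the embodiment only requires "a predetermined range".

```python
def should_release(area_cm2, prev_area_cm2, threshold_cm2=1.0, max_change_ratio=0.05):
    """Return True when the release signal of step S413 should be issued.

    area_cm2      -- most recently received area of the affected region
    prev_area_cm2 -- area received immediately before, or None on the first frame
    """
    if area_cm2 < threshold_cm2:       # step S411: area must reach the threshold
        return False
    if prev_area_cm2 is None:
        return False
    change = abs(area_cm2 - prev_area_cm2) / prev_area_cm2
    return change <= max_change_ratio  # step S412: area has (almost) stopped changing
```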
In step S413, the system control circuit 220 issues a release signal. The release signal is a signal equivalent to a signal issued when the user presses a release button included in the operation unit 224 of the image pickup apparatus 200 to issue an image pickup instruction. A release signal is emitted to simulate the situation where the user presses the release button. In other words, the system control circuit 220 may issue the release signal at a timing when the image pickup apparatus 200 enters a state close to being directly opposite to the affected area during a period when the photographer is moving the image pickup apparatus 200 to bring the image pickup apparatus 200 to a position just opposite to the affected area.
In step S414, the ranging system 216 determines the distance information on the distance to the object 101. The AF control circuit 218 performs AF processing for controlling driving of the lens group 212 so that the object 101 is focused. The process is the same as the process of step S403. Since the AF processing is performed in step S403, the processing of step S414 may be omitted here.
In step S415, the image capturing unit 211 captures a still image for recording at the timing of issuance of the release signal, and generates image data.
In step S416, the image processing circuit 217 subjects the generated image data to development and compression processing to generate image data conforming to, for example, the JPEG standard. The image processing circuit 217 then performs scaling processing on the compressed image data to reduce its size. To prioritize accuracy in measuring the affected area, the size of the image data scaled here is desirably greater than or equal to the size of the image data scaled in step S404. For example, scaled 8-bit red-green-blue (RGB) color image data of 1440 × 1080 pixels (1440 × 1080 pixels × 3 bytes per pixel) has a size of about 4.45 megabytes. However, the size of the scaled image data is not limited to this.
In step S417, the communication unit 219 obtains the scaled image data and the distance information obtained by the ranging system 216 in step S414. The communication unit 219 also obtains information on the zoom magnification and the size (number of pixels) of the scaled image data as appropriate.
In step S418, the communication unit 219 transmits the obtained image data and one or more pieces of information including the distance information to the image processing apparatus 300 by wireless communication.
Next, the process of the image processing apparatus 300 will be described.
In step S451, the communication unit 313 of the image processing apparatus 300 receives the image data and one or more pieces of information including the distance information transmitted from the communication unit 219 of the image capturing apparatus 200.
In step S452, the CPU 310 and the auxiliary arithmetic unit 317 of the image processing apparatus 300 extract the affected area region from the received image data. The details of this processing are the same as those of step S442, and a description thereof is therefore omitted.
In step S453, the arithmetic unit 311 of the CPU 310 calculates the area of the affected area as information on the size of the extracted affected area. The details of this processing are the same as those of step S443, and a description thereof is therefore omitted.
In step S454, the arithmetic unit 311 performs image analysis. Specifically, the arithmetic unit 311 calculates the lengths of the major axis and the minor axis of the extracted affected area region and the area of the rectangle circumscribing the affected area region based on the length corresponding to one pixel of the image data on the focus plane obtained in step S453. The bedsore evaluation index DESIGN-R defines the value of the bedsore size measured as the product of the lengths of the major and minor axes. By analyzing the major axis and the minor axis, the image processing system 1 according to the present exemplary embodiment can ensure compatibility with data measured according to DESIGN-R. Since DESIGN-R does not include a definition of a method for mathematically calculating the major and minor axes, there may be multiple possible methods according to DESIGN-R.
As a first example of the method for calculating the major and minor axes, the arithmetic unit 311 calculates the rectangle having the minimum area (the minimum circumscribed rectangle) among the rectangles circumscribing the affected part region. The arithmetic unit 311 then calculates the lengths of the long side and the short side of the rectangle as the major axis and the minor axis, respectively, and calculates the area of the rectangle based on the length corresponding to one pixel of the image data on the focus plane obtained in step S453.
As a second example of the method for calculating the major and minor axes, the arithmetic unit 311 selects the maximum Feret diameter (the maximum caliper diameter) as the major axis and the minimum Feret diameter as the minor axis. Alternatively, the arithmetic unit 311 may select the maximum Feret diameter as the major axis and the length measured in the direction perpendicular to the axis of the maximum Feret diameter as the minor axis.
Any method of calculating the major and minor axes may be selected based on compatibility with conventional measurements.
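Both calculation methods can be realized with standard geometry routines. The sketch below uses OpenCV's minimum-area rectangle for the first method and a convex-hull search for the maximum Feret diameter; this pairing is one common implementation and is not mandated by the embodiment.

```python
import cv2
import numpy as np

def axes_from_min_rect(mask: np.ndarray, cm_per_px: float):
    """Major/minor axes (cm) and rectangle area (cm^2) from the minimum circumscribed rectangle."""
    points = cv2.findNonZero(mask.astype(np.uint8))
    (_, _), (w, h), _ = cv2.minAreaRect(points)
    major, minor = max(w, h) * cm_per_px, min(w, h) * cm_per_px
    return major, minor, major * minor

def max_feret_diameter_cm(mask: np.ndarray, cm_per_px: float) -> float:
    """Maximum Feret (caliper) diameter from the convex hull of the region."""
    points = cv2.findNonZero(mask.astype(np.uint8))
    hull = cv2.convexHull(points).reshape(-1, 2).astype(np.float64)
    dists = np.linalg.norm(hull[:, None, :] - hull[None, :, :], axis=-1)
    return float(dists.max()) * cm_per_px
```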
The image data received in step S441 is not subjected to the processing for calculating the lengths of the major and minor axes of the affected area and the area of the circumscribing rectangle. The live view is intended only to let the user check the extraction result of the affected area, so the image analysis corresponding to step S454 is omitted for the image data received in step S441 in order to shorten the processing time.
In step S455, the arithmetic unit 311 superimposes information indicating the extraction result of the affected area region and information on the size of the affected area region on the image data from which the affected area region is extracted, thereby generating image data.
Fig. 7A to 7C are diagrams illustrating a method for superimposing information indicating the extraction result of the affected area and information on the size of the affected area including the major axis and the minor axis of the affected area on the image data. Since a plurality of types of information about the size of the affected area can be assumed, the respective methods will be described with reference to fig. 7A to 7C.
Fig. 7A illustrates an image 701 generated by using a minimum bounding rectangle as a method of calculating the major and minor axes. As in fig. 6B, a label 611 for displaying a character string 612 indicating the area of the affected area region in black background white font is superimposed on the upper left corner of the image 701 as information relating to the size of the affected area region.
A label 712 displaying the long axis and the short axis calculated based on the minimum bounding rectangle is superimposed on the upper right corner of the image 701 as information relating to the size of the affected area. The label 712 includes character strings 713 and 714. The character string 713 represents the length of the major axis (in cm). The character string 714 represents the length of the minor axis (in cm). A rectangular box 715 representing a minimum bounding rectangle is superimposed over the affected area in the image 701. The rectangular box 715 together with the superposition of the lengths of the major and minor axes enables the user to check the measurement positions in the image 701 where the lengths are measured.
A scale bar 716 is superimposed on the lower right hand corner of the image 701. The scale bar 716 is intended to measure the size of the affected part 102. The size of the scale bar 716 relative to the image data varies based on the distance information. Specifically, the scale bar 716 is a scale bar up to 5cm in units of 1cm based on the length corresponding to one pixel of the image data on the focus plane obtained in step S453, and corresponds to the size on the focus plane (i.e., the object 101) of the image capturing apparatus 200. The user can calculate the size of the object 101 or the affected area 102 by referring to the scale bar 716.
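Because the scale bar must correspond to real lengths on the focus plane, its pixel length follows directly from the length-per-pixel value. The sketch below computes the tick positions for the 1 cm graduations up to 5 cm; where the bar is drawn within the image is left to the caller.

```python
def scale_bar_ticks_px(cm_per_px: float, max_cm: int = 5):
    """Pixel offsets of the 1 cm tick marks for scale bar 716."""
    return [round(cm / cm_per_px) for cm in range(0, max_cm + 1)]

# Example: with 0.0357 cm per pixel, 1 cm is about 28 pixels,
# so the full 5 cm bar spans roughly 140 pixels.
print(scale_bar_ticks_px(0.0357))
```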
The size evaluation index 717 according to the aforementioned DESIGN-R is superimposed on the lower left corner of the image 701. As described above, there are seven classification levels according to the size evaluation index 717 of DESIGN-R based on the value obtained by measuring the major axis and the minor axis (the maximum diameter perpendicular to the major axis) of the skin wound area (in cm) and multiplying the measurement values. In the present exemplary embodiment, a size evaluation index 717 obtained by replacing the major axis and the minor axis with values output by the respective calculation methods is superimposed.
Fig. 7B shows an image 702 generated by using the maximum Feret diameter as the major axis and the minimum Feret diameter as the minor axis. A label 722 displaying a character string 723 indicating the length of the major axis and a character string 724 indicating the length of the minor axis is superimposed on the upper right corner of the image 702. An auxiliary line 725 indicating the measurement position of the maximum Feret diameter and an auxiliary line 726 indicating the measurement position of the minimum Feret diameter are displayed in the affected area in the image 702. Superimposing the auxiliary lines 725 and 726 together with the character strings 723 and 724 representing the major and minor axis lengths enables the user to check the positions in the image 702 at which the lengths were measured.
Fig. 7C shows an image 703 in which the major axis is the same as in the image 702 shown in fig. 7B, while the minor axis is not the minimum Feret diameter but the length measured in the direction perpendicular to the axis of the maximum Feret diameter. A label 732 displaying the character string 723 indicating the length of the major axis and a character string 734 indicating the length of the minor axis is superimposed on the upper right corner of the image 703. In the affected area in the image 703, the auxiliary line 725 corresponding to the measurement position of the maximum Feret diameter and an auxiliary line 736 corresponding to the length measured in the direction perpendicular to the axis of the maximum Feret diameter are displayed.
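A minimal sketch of the Feret-based measurements used in figs. 7B and 7C might look as follows. The brute-force orientation sweep is an illustrative choice, not the embodiment's stated implementation.

```python
import numpy as np

def feret_measures(points):
    """Feret-diameter based measures for the affected-area contour.

    `points` is an (N, 2) array of contour pixel coordinates. Returns the
    maximum Feret diameter (major axis in Figs. 7B/7C), the minimum Feret
    diameter (minor axis in Fig. 7B), and the width perpendicular to the
    max-Feret axis (minor axis in Fig. 7C), all in pixels.
    """
    pts = np.asarray(points, dtype=float)

    # Maximum Feret diameter: largest pairwise distance between contour points.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.hypot(diffs[..., 0], diffs[..., 1])
    i, j = np.unravel_index(np.argmax(dists), dists.shape)
    max_feret = dists[i, j]

    # Minimum Feret diameter: smallest projected width over sampled directions.
    angles = np.deg2rad(np.arange(0, 180, 1))
    dirs = np.stack([np.cos(angles), np.sin(angles)], axis=1)
    proj = pts @ dirs.T
    min_feret = (proj.max(axis=0) - proj.min(axis=0)).min()

    # Fig. 7C minor axis: width measured perpendicular to the max-Feret axis.
    axis = (pts[j] - pts[i]) / max_feret
    normal = np.array([-axis[1], axis[0]])
    perp = pts @ normal
    perpendicular_width = perp.max() - perp.min()

    return max_feret, min_feret, perpendicular_width
```

Multiplying these pixel lengths by the per-pixel length on the focal plane would give the centimeter values shown in the labels 722 and 732.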
Each of the various types of information related to the image data shown in fig. 7A to 7C may be superimposed individually, or a plurality of types of information may be superimposed in combination. The user may be able to select the information to be displayed. The images shown in fig. 6A and 6B and fig. 7A to 7C are merely examples. The display mode, display position, size, font size, font color, and positional relationship of the information relating to the size of the affected part 102 and the affected part region may be changed based on various conditions.
In step S456, the communication unit 313 of the image processing apparatus 300 transmits information indicating the extraction result of the affected area region and information relating to the size of the affected area region to the image capturing apparatus 200. In the present exemplary embodiment, the communication unit 313 transmits the image data generated in step S455 to the image capturing apparatus 200.
The processing then moves to the image capturing apparatus 200 side.
In step S419, the communication unit 219 of the image capturing apparatus 200 receives the image data transmitted from the image processing apparatus 300.
In step S420, the display unit 223 displays the received image data for a predetermined time. Here, the display unit 223 displays one of the images 701 to 703 shown in fig. 7A to 7C. After the predetermined time has elapsed, the process returns to step S402.
As described above, according to the present exemplary embodiment, the system control circuit 220 controls the timing of capturing the image of the affected part 102 based on the obtained information on the size of the affected part region. Since the system control circuit 220 can perform control such that imaging of the affected part is automatically performed when the imaging apparatus 200 is located at a position that makes the affected part region appropriate in size, it is possible to appropriately capture an image of the affected part 102 even if the user is in an unnatural posture at the time of capturing the image.
Specifically, the system control circuit 220 issues the release signal to automatically execute imaging of the affected part 102 in a case where the newly received area of the affected part region is greater than or equal to the threshold value and the change between the newly received area and the area received immediately before falls within a predetermined range. In other words, if the area of the affected part region is greater than or equal to the threshold value, the image capturing apparatus 200 can be assumed to be at an appropriate position to capture an image of the affected part 102. If the change in the area of the affected part region falls within the predetermined range, the image capturing apparatus 200 can be assumed to be nearly directly facing the affected part region. Since the system control circuit 220 automatically executes imaging of the affected part when these conditions are satisfied, the user can appropriately capture an image of the affected part 102. Alternatively, the image capturing apparatus 200 may determine whether it is at an appropriate position to capture an image of the affected part 102 by determining only whether the change in the area of the affected part region falls within the predetermined range, without determining whether the area is greater than or equal to the threshold value.
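The release condition described above can be summarized in a short sketch. The 5% change ratio and the function name are illustrative assumptions; the text only speaks of a "predetermined range".

```python
def should_release(area, prev_area, area_threshold, max_change_ratio=0.05):
    """Decide whether to issue the release signal automatically.

    The newly received area must be at least the threshold, and its change
    relative to the previously received area must fall within a predetermined
    range (a 5% ratio is an illustrative value).
    """
    if area < area_threshold:
        return False          # apparatus is likely still too far from the affected part
    if prev_area is None or prev_area == 0:
        return False          # a previous measurement is needed to judge stability
    change_ratio = abs(area - prev_area) / prev_area
    return change_ratio <= max_change_ratio   # nearly constant -> roughly facing
```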
If the user manually gives an image capturing instruction, the processing of displaying image data in the aforementioned step S409 is performed. Step S409 may be omitted if the system control circuit 220 issues a release signal. Likewise, the process of generating the superimposed image data in step S444 may be omitted. In step S445, information on the size of the affected area may be transmitted without transmitting image data.
In the image processing system 1 according to the present exemplary embodiment, information on the size of the affected part region is displayed on the display unit 223 of the image capturing apparatus 200 when the user captures an image of the affected part 102 using the image capturing apparatus 200. This can reduce the burden on medical staff and on the patient being evaluated when the size of the affected area of a decubitus ulcer is evaluated. Since the size of the affected part region is calculated by a computer program, individual differences can be reduced and the accuracy of evaluating the size of the bedsore can be improved compared with the case where medical staff measure the size manually. Further, the area of the affected part region, which serves as an index that represents the scale of the bedsore more accurately, can be calculated and displayed. If the function by which the user checks during live view display whether the estimated affected part region is appropriate is not needed, steps S441 to S445 may be omitted.
The image processing apparatus 300 may store, in the storage unit 312, information indicating the extraction result of the affected part region, information relating to the size of the affected part region, and image data on which these pieces of information are superimposed. In this case, the output unit 314 may output any one or more of the pieces of information or the image data stored in the storage unit 312 to a connected output device such as a display. Displaying the images on the display allows users other than the user who captured the image of the affected part 102 to obtain, in real time, image data relating to the affected part 102 and information relating to its size (or data obtained in the past). The arithmetic unit 311 of the image processing apparatus 300 may have a function for displaying, on the image data sent from the output unit 314 to the display, a scale bar whose position and angle can be freely changed. Such a scale bar enables a user viewing the display to measure the length of a desired location of the affected part region. The scale of the scale bar is automatically adjusted based on the distance information, the information on the zoom magnification, and the information on the size (the number of pixels) of the zoomed image data received in step S451.
The image processing apparatus 300 may be a stationary device with a constant power supply. The constant power supply enables the image processing apparatus 300 to receive image data relating to an affected part and information relating to size at any timing, and can prevent battery depletion. Since the stationary device generally has a large storage capacity, the stationary-type image processing apparatus 300 can store a large amount of image data.
(first modification)
In the above-described flowcharts of fig. 4A and 4B, if it is determined in step S411 that the area is not greater than or equal to the threshold value, the processing does not proceed to step S412 but repeats step S404 and the subsequent steps. However, the affected part region is not always larger than a specific size. Therefore, if the determination in step S411 of the flowchart of fig. 4B is "no", the system control circuit 220 may determine whether a certain period of time has elapsed with the area remaining below the threshold value. If the certain period of time has not elapsed, the processing returns to step S404. If the certain period of time has elapsed, the system control circuit 220 estimates that the affected part region is small, and the processing proceeds to step S412. For example, the certain period of time may be defined as a duration such as 5 or 10 seconds, or may be defined based on the number of data receptions in step S408. By issuing the release signal once the certain period of time has elapsed, the image capturing apparatus 200 can capture an image of the affected part 102 even if the affected part region is not larger than a certain size.
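A sketch of this first modification might add a simple timeout to the release decision. The 10-second timeout and 5% change ratio are only examples of the "certain period of time" and the "predetermined range".

```python
import time

def should_release_with_timeout(area, prev_area, threshold, started_at,
                                timeout_s=10.0, max_change_ratio=0.05):
    """Release decision with the first modification's timeout fallback.

    `started_at` is the time (time.monotonic()) when the area was first
    observed below the threshold. Values are illustrative assumptions.
    """
    if area >= threshold:
        # Normal path: require the area to be roughly stable (near facing).
        return (prev_area is not None and prev_area > 0
                and abs(area - prev_area) / prev_area <= max_change_ratio)
    # Area has stayed below the threshold: after the waiting period, assume
    # the affected part region is simply small and release anyway.
    return (time.monotonic() - started_at) >= timeout_s
```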
(second modification)
In the above-described flowcharts of fig. 4A and 4B, the processing of step S410 is described as controlling the image capturing timing based on whether the image capturing condition is satisfied. However, the processing may instead control the timing of prompting the user to perform an image capturing operation based on whether the image capturing condition is satisfied. Specifically, when the processing proceeds from step S412 to step S413, the system control circuit 220 may issue a notification prompting the user to perform an image capturing operation. For example, the system control circuit 220 prompts the user to press a release button included in the operation unit 224 by issuing a voice or alarm notification or by displaying the notification on the display unit 223. According to this modification, after step S413, the system control circuit 220 determines whether the release button included in the operation unit 224 has actually been pressed by the user. If the release button is pressed, the processing proceeds to step S414.
A second exemplary embodiment will be described below. In the first exemplary embodiment, the threshold value to be compared with the area of the affected area is described as being predetermined. The present exemplary embodiment describes a case where history information is used as a threshold value. In the following description, the same components as those of the first exemplary embodiment are assigned the same reference numerals. A detailed description thereof will be omitted.
Specifically, in step S411, the system control circuit 220 compares the received area of the affected part region with a predetermined threshold value. If the area is greater than or equal to the threshold value, the processing proceeds to step S412. In step S412, if the change in area falls within the predetermined range, the processing proceeds to step S413. In step S413, the system control circuit 220 issues a release signal and records the received area in the external memory 222. When the processing of the flowcharts of fig. 4A and 4B reaches step S411 the next time an image of the affected part 102 is captured, the system control circuit 220 reads the area recorded in the external memory 222 and uses it as the threshold value. The system control circuit 220 may determine the threshold value by applying a margin to the read value, for example by subtracting a certain margin from it. The longer the interval since the previous image capture, the larger the margin that may be subtracted.
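As an illustration, deriving the threshold from the recorded area with an interval-dependent margin could look like the following. The margin schedule is an assumption; the description only states that a larger margin may be used the longer the interval.

```python
def threshold_from_history(recorded_area, default_threshold,
                           days_since_last_capture, margin_per_day=0.02):
    """Derive the comparison threshold from the previously recorded area.

    The 2%-per-day margin capped at 50% is an illustrative scheme, not the
    embodiment's stated rule.
    """
    if recorded_area is None:
        return default_threshold          # no history: fall back to the initial value
    margin = min(0.5, margin_per_day * days_since_last_capture)
    return recorded_area * (1.0 - margin)
```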
In recording the area, the system control circuit 220 may record the area to be recorded and the patient information in association with each other. For example, the image capturing apparatus 200 or the image processing apparatus 300 may obtain the patient information by the user inputting the patient information to the image capturing apparatus 200 or the image processing apparatus 300 and/or selecting the patient information on the image capturing apparatus 200 or the image processing apparatus 300. If the image processing apparatus 300 obtains the patient information, in step S445, the image processing apparatus 300 transmits the patient information to the image capturing apparatus 200 in association with the area. This enables the image pickup apparatus 200 to record the area and the patient information in association with each other.
If the patient information is used, the system control circuit 220 reads the area associated with the patient information as a threshold value in step S411 and compares the threshold value with the received area. If the area is greater than or equal to the threshold value (yes in step S411), the processing proceeds to step S412. In step S412, if the change in area falls within the predetermined range (yes in step S412), the processing proceeds to step S413. In step S413, the system control circuit 220 issues a release signal, associates the received area with the patient information, and records the received area for updating. If there is no area associated with the patient information, the system control circuit 220 compares a predetermined threshold value (initial value) with the received area or displays a notification for prompting the user to manually capture an image on the display unit 223.
The image processing apparatus 300 may record the area in association with the patient information. For example, assume that the imaging apparatus 200 photographs a barcode label including patient information about a patient before photographing the affected part 102. The image capturing apparatus 200 transmits image data relating to the barcode label to the image processing apparatus 300 during the transmission in the above-described step S406. The image processing apparatus 300 obtains patient information from the image data relating to the barcode label, reads the area associated with the patient information as a threshold value, and transmits the threshold value to the image capturing apparatus 200 during the transmission in the above-described step S445. The system control circuit 220 of the image capturing apparatus 200 can thus set a threshold value corresponding to the patient by controlling the image capturing timing based on the received threshold value. The CPU 310 of the image processing apparatus 300 may also update the area recorded in association with the patient information with the area of the affected area calculated in step S453.
As described above, according to the present exemplary embodiment, the threshold value is set based on the area of the affected area photographed in the past, and the threshold value is compared with the received area. Since the size of the bedsore generally varies with time, the imaging apparatus 200 can capture an image of the affected part under appropriate conditions by setting a threshold value based on the area of the affected part region captured in the past. The imaging apparatus 200 can also capture an image of an affected part under an optimal condition corresponding to the patient by setting a threshold value based on the area of the affected part region of the same patient captured in the past. Since the accuracy of the threshold value is improved, the image pickup apparatus 200 may be configured such that if the area of the affected area is greater than or equal to the threshold value in step S411, the process skips step S412 and proceeds to step S413 to issue a release signal.
A third exemplary embodiment will be described below. In the first exemplary embodiment, if the change in the area of the affected area region is within a predetermined range, the imaging apparatus 200 is determined to be directly facing the affected area region. The present exemplary embodiment describes a case where it is determined whether the image pickup apparatus 200 is facing an object including an affected part by using distance information. If the image pickup apparatus 200 is facing the object, a release signal is issued. In the following description, the same components as those of the first exemplary embodiment are assigned the same reference numerals. A detailed description thereof will be omitted.
Fig. 8A and 8B are flowcharts showing an example of the processing of the image processing system 1. The same processes as those of fig. 4A and 4B are assigned the same step numbers, and the description thereof is omitted as necessary. Specifically, the flowcharts of fig. 8A and 8B are obtained by replacing step S410 in the flowchart of fig. 4B with step S810.
Step S810 represents a process in which the system control circuit 220 controls the timing of capturing an image based on the orientation of the image capturing apparatus 200 with respect to the object. Step S810 includes step S811.
In step S811, the system control circuit 220 determines whether the image capturing apparatus 200 is directly facing the object including the affected part. As described above, the distance measurement system 216 of the image capturing apparatus 200 can calculate distance information about the distance to the object, or a distance map (distance map information) indicating the distribution of the distance information. Information on the object plane can be obtained by using distance information or distance map information on three or more points. The object and the image capturing apparatus 200 may be determined to be directly facing each other if the pieces of distance information on the respective points coincide with one another or differ by less than a predetermined amount.
The system control circuit 220 can thereby determine whether the image pickup apparatus 200 is currently facing the object based on the distance information or the distance map information calculated by the distance measurement system 216. If the image capturing apparatus 200 is determined to be facing the object (yes in step S811), the processing proceeds to step S413. In step S413, the system control circuit 220 issues a release signal. If the image capturing apparatus 200 is determined not to be facing the object (no in step S811), the processing returns to step S404.
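A minimal sketch of the facing determination in step S811 follows. Sampling three or more points from the distance map and the 10 mm tolerance are illustrative assumptions.

```python
import numpy as np

def is_facing_subject(distances_mm, max_spread_mm=10.0):
    """Judge whether the image capturing apparatus directly faces the object.

    `distances_mm` holds distance information for three or more points on the
    object (e.g. sampled from the distance map). If the values agree to within
    a predetermined difference, the object plane is taken to be roughly
    perpendicular to the optical axis. The 10 mm tolerance is illustrative.
    """
    d = np.asarray(distances_mm, dtype=float)
    if d.size < 3:
        return False
    return (d.max() - d.min()) <= max_spread_mm
```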
As described above, according to the present exemplary embodiment, the timing of capturing an image is controlled based on the orientation of the image capturing apparatus 200 with respect to the object. Since the release signal can be issued while the image capturing apparatus 200 is facing the object, an image of the affected part can be appropriately captured.
Various exemplary embodiments and modifications of the present invention have been described above. However, the present invention is not limited to the above-described exemplary embodiment and modifications, and various modifications may be made without departing from the scope of the present invention. The foregoing exemplary embodiments and modifications may be combined as desired.
For example, the first exemplary embodiment may be combined with the third exemplary embodiment. Specifically, the system control circuit 220 may determine whether the image pickup apparatus 200 is facing an object based on the distance information, and compare the received area with a threshold value if the image pickup apparatus 200 is determined to be facing an object. If the area is greater than or equal to the threshold, the system control circuitry 220 may issue a release signal. Alternatively, the system control circuit 220 may determine whether the received area is greater than or equal to a threshold value, and if the area is greater than or equal to the threshold value, determine whether the image pickup apparatus 200 is facing the object based on the distance information. The system control circuit 220 may issue a release signal if the image pickup apparatus 200 is facing the object.
In the foregoing exemplary embodiment, the information on the size of the affected area is described as the area of the affected area. However, this is not restrictive, and the information on the size of the affected area may be any one of the following: the area of the affected area, the length of the major axis of the affected area, and the length of the minor axis of the affected area. If the length of the major axis of the affected area or the length of the minor axis of the affected area is used as the information on the size of the affected area, the system control circuit 220 obtains the length of the major axis of the affected area or the length of the minor axis of the affected area in step S408. In step S411, the system control circuit 220 may determine whether the length of the long axis of the affected area or the length of the short axis of the affected area is greater than or equal to a predetermined threshold.
In the above-described exemplary embodiment, the image pickup apparatus 200 is described as including the system control circuit 220. However, this is not restrictive. The image pickup apparatus 200 may not include the system control circuit 220, and in this case, the processing to be performed by the system control circuit 220 is executed by various hardware circuits.
OTHER EMBODIMENTS
The embodiments of the present invention can also be realized by a method in which software (programs) that performs the functions of the above-described embodiments is supplied to a system or an apparatus through a network or various storage media, and a computer, or a central processing unit (CPU) or micro processing unit (MPU), of the system or the apparatus reads out and executes the programs.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (20)

1. An apparatus, comprising:
a sensor configured to capture an image of an affected part; and
a processor configured to obtain information on a size of the affected part in the captured image, and control a timing to capture the image of the affected part or a timing to prompt a user to perform an imaging operation based on the information on the size of the affected part.
2. The apparatus according to claim 1, wherein the processor is configured to, in a case where the size of the affected part is greater than or equal to a threshold value, control so that an image of the affected part is captured or control so that a user is prompted to perform an imaging operation.
3. The apparatus according to claim 2, wherein the threshold value is a value based on a size of the affected part of the same patient that has been previously photographed.
4. The apparatus according to claim 2, wherein the processor is configured to, in a case where the size of the affected part is greater than or equal to the threshold value, control so that an image of the affected part is not captured or control so that a user is not prompted to perform an imaging operation in a case where a ratio of change between the size of the affected part and the previously obtained size of the affected part or a difference between the size of the affected part and the previously obtained size of the affected part does not fall within a predetermined range.
5. The apparatus according to claim 4, wherein the processor is configured to, in a case where the size of the affected part is greater than or equal to the threshold value and a ratio of change between the size of the affected part and the previously obtained size of the affected part or a difference between the size of the affected part and the previously obtained size of the affected part falls within the predetermined range, control so that an image of the affected part is captured or control so that a user is prompted to perform an imaging operation.
6. The apparatus according to claim 2, wherein the processor is configured to, in a case where the size of the affected part is not greater than or equal to the threshold value and a predetermined period of time has elapsed, control so that an image of the affected part is captured or control so that a user is prompted to perform an imaging operation.
7. The apparatus according to claim 1, wherein the processor is configured to, in a case where a ratio of change between the size of the affected part and the previously obtained size of the affected part or a difference between the size of the affected part and the previously obtained size of the affected part falls within a predetermined range, control so that an image of the affected part is captured or control so that a user is prompted to perform an imaging operation.
8. The apparatus according to claim 1, wherein the processor is configured to transmit the image including the photographed affected part to the outside via a communication circuit, and receive information on the size of the affected part from the outside via the communication circuit in response to transmitting the image to the outside.
9. The apparatus according to claim 8, wherein the processor is configured to receive information on the size of the affected part via the communication circuit whenever an image is transmitted to the outside via the communication circuit.
10. The apparatus according to claim 1, wherein the processor is configured to, in a case where it is determined that the apparatus is facing the object based on distance information on a distance from the apparatus to the object including the affected part, control so that an image of the affected part is captured or control so that a user is prompted to perform an image capturing operation.
11. The apparatus according to claim 1,
wherein the affected part is a decubitus ulcer, and
wherein the size of the affected part is at least any one of: the area of the affected part, the length of the major axis of the affected part, and the length of the minor axis of the affected part.
12. An apparatus, comprising:
a sensor configured to capture an image of an affected part; and
a processor configured to, in a case where the apparatus is facing an object including the affected part, control so that an image of the affected part is captured or control so that a user is prompted to perform an image capturing operation.
13. The apparatus according to claim 12, wherein the processor is configured to determine whether the object including the affected part is facing the apparatus based on distance information on a distance from the apparatus to the object.
14. An image processing apparatus comprising:
a communication circuit configured to receive an image including an affected part from an image capturing apparatus; and
a processor configured to calculate a size of the affected part based on the received image, and to transmit information on the size of the affected part to the image capturing apparatus via the communication circuit in order to control, based on the information on the size of the affected part, a timing to capture an image of the affected part by the image capturing apparatus or a timing to prompt a user to perform an image capturing operation.
15. A method, comprising:
capturing an image of the affected part by a sensor;
obtaining information on the size of the affected part in a captured image; and
controlling a timing to capture an image of the affected part or controlling a timing to prompt a user to perform an imaging operation based on the information on the size of the affected part.
16. The method of claim 15, further comprising: controlling so that an image of the affected part is captured or controlling so that a user is prompted to perform an imaging operation in a case where the size of the affected part is greater than or equal to a threshold value.
17. The method of claim 15, further comprising: controlling so that an image of the affected part is captured or controlling so that a user is prompted to perform an imaging operation in a case where a ratio of change between the size of the affected part and the previously obtained size of the affected part or a difference between the size of the affected part and the previously obtained size of the affected part falls within a predetermined range.
18. The method of claim 15, further comprising:
transmitting an image including the photographed affected part to the outside via a communication circuit; and
receiving, in response to transmitting the image to the outside, information on the size of the affected part from the outside via the communication circuit.
19. A method, comprising:
determining whether a subject including an affected part is facing an apparatus; and
in a case where the subject including the affected part is facing the apparatus, performing control so that an image of the affected part is captured by a sensor or performing control so that a user is prompted to perform an imaging operation.
20. A method, comprising:
receiving an image including an affected part from a device;
calculating a size of the affected part based on the received image; and
transmitting the information regarding the size of the affected part to the device in order to control, based on the information regarding the size of the affected part, a timing to capture an image of the affected part or a timing to prompt a user to perform an imaging operation.
CN202010172159.7A 2019-03-13 2020-03-12 Apparatus, image processing apparatus, control method, and storage medium Active CN111698401B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2019045955 2019-03-13
JP2019-045955 2019-03-13
JP2020023378A JP2020156082A (en) 2019-03-13 2020-02-14 Imaging apparatus, image processing system, and control method
JP2020-023378 2020-02-14

Publications (2)

Publication Number Publication Date
CN111698401A true CN111698401A (en) 2020-09-22
CN111698401B CN111698401B (en) 2022-08-30

Family

ID=72422837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010172159.7A Active CN111698401B (en) 2019-03-13 2020-03-12 Apparatus, image processing apparatus, control method, and storage medium

Country Status (2)

Country Link
US (1) US11475571B2 (en)
CN (1) CN111698401B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11908154B2 (en) * 2021-02-04 2024-02-20 Fibonacci Phyllotaxis Inc. System and method for evaluating tumor stability

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106485691A (en) * 2015-08-31 2017-03-08 佳能株式会社 Information processor, information processing system and information processing method
US20170147789A1 (en) * 2015-03-23 2017-05-25 Consensus Orthopedics, Inc. System and methods with user interfaces for monitoring physical therapy and rehabilitation
CN108606782A (en) * 2018-04-28 2018-10-02 泰州市榕兴医疗用品股份有限公司 A kind of surface of a wound imaging system
CN108737631A (en) * 2017-04-25 2018-11-02 北京小米移动软件有限公司 The method and device of Quick Acquisition image
CN108742671A (en) * 2018-07-09 2018-11-06 傅君芬 Hand stone age imaging device with photographing module and shielding X-ray function
CN108875648A (en) * 2018-06-22 2018-11-23 深源恒际科技有限公司 A method of real-time vehicle damage and component detection based on mobile video stream
CN109223303A (en) * 2018-10-18 2019-01-18 杭州市余杭区第五人民医院 Full-automatic wound shooting assessment safety goggles and measurement method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6575904B2 (en) * 2000-05-09 2003-06-10 Matsushita Electric Industrial Co., Ltd. Biodata interfacing system
JP3800197B2 (en) * 2003-04-25 2006-07-26 コニカミノルタフォトイメージング株式会社 Imaging device
JP5225065B2 (en) * 2008-12-27 2013-07-03 キヤノン株式会社 Imaging apparatus and imaging method
US20110087110A1 (en) * 2009-10-13 2011-04-14 Cell Genetics, Llc Medical imaging processes for facilitating catheter-based delivery of therapy to affected organ tissue
KR101594298B1 (en) * 2009-11-17 2016-02-16 삼성전자주식회사 Apparatus and method for adjusting focus in digital image processing device
WO2011145296A1 (en) * 2010-05-21 2011-11-24 パナソニック株式会社 Image capturing apparatus, image processing apparatus, image processing method, and image processing program
US9104909B2 (en) * 2010-12-15 2015-08-11 Canon Kabushiki Kaisha Image processing apparatus and method of processing image
JP6351323B2 (en) * 2014-03-20 2018-07-04 オリンパス株式会社 Image processing apparatus, image processing method, and image processing program
US10482592B2 (en) * 2014-06-13 2019-11-19 Nikon Corporation Shape measuring device, structured object manufacturing system, shape measuring method, structured object manufacturing method, shape measuring program, and recording medium
JP6490417B2 (en) * 2014-12-18 2019-03-27 株式会社東芝 Moving body tracking treatment apparatus and moving body tracking treatment program
JP6635690B2 (en) * 2015-06-23 2020-01-29 キヤノン株式会社 Information processing apparatus, information processing method and program

Also Published As

Publication number Publication date
US20200294236A1 (en) 2020-09-17
CN111698401B (en) 2022-08-30
US11475571B2 (en) 2022-10-18

Similar Documents

Publication Publication Date Title
US7764880B2 (en) Pickup apparatus
US7672580B2 (en) Imaging apparatus and method for controlling display device
TWI425828B (en) Image capturing apparatus, method for determing image area ,and computer-readable recording medium
US20080192122A1 (en) Photographing apparatus, method and computer program product
JP7322097B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, PROGRAM AND RECORDING MEDIUM
JP5171468B2 (en) IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
US20080100721A1 (en) Method of detecting specific object region and digital camera
JP2021051573A (en) Image processing apparatus, and method of controlling image processing apparatus
JP4866317B2 (en) IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
WO2019230724A1 (en) Image processing system, imaging device, image processing device, electronic device, control method thereof, and storage medium storing control method thereof
CN111698401B (en) Apparatus, image processing apparatus, control method, and storage medium
US11599993B2 (en) Image processing apparatus, method of processing image, and program
JP2006208443A (en) Automatic focusing apparatus
US20210401327A1 (en) Imaging apparatus, information processing apparatus, image processing system, and control method
JP2003289468A (en) Imaging apparatus
JP2008172732A (en) Imaging apparatus, control method thereof, and program
JP2020156082A (en) Imaging apparatus, image processing system, and control method
JP5499796B2 (en) Electronics
JP2020151461A (en) Imaging apparatus, information processing apparatus, and information processing system
JP2020095600A (en) Processing system, processing device, terminal device, processing method, and program
US20240000307A1 (en) Photography support device, image-capturing device, and control method of image-capturing device
JP2010171841A (en) Imaging apparatus
KR102494696B1 (en) Method and device for generating an image
JP5379885B2 (en) Imaging apparatus and control method
JP2022147595A (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant