WO2023026538A1 - Medical support system, medical support method, and evaluation assistance device - Google Patents
Info

Publication number
WO2023026538A1
WO2023026538A1 (PCT/JP2022/010844; JP2022010844W)
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation
affected area
image
medical support
value
Prior art date
Application number
PCT/JP2022/010844
Other languages
English (en)
Japanese (ja)
Inventor
Shinji Watanabe
Kento Takenaka
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2023026538A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography

Definitions

  • the present disclosure relates to a medical support system, a medical support method, and an evaluation support device.
  • Conventionally, ulcerative colitis is diagnosed and treated using an endoscope. Specifically, the doctor evaluates the disease state of the affected area from the captured images obtained by the endoscope, and performs treatment while adjusting the dosage and the like based on that evaluation. In addition, when it is difficult to evaluate the patient's condition by follow-up observation with such an endoscope, a part of the affected tissue may be excised for pathological diagnosis.
  • In recent years, medical support systems using artificial intelligence (AI) have been proposed: an image of the affected area is captured with an endoscope, and an AI system that functions as a determiner built in advance by machine learning analyzes the affected-area image and evaluates the disease state of the affected area.
  • With such an AI system, it is possible to avoid variations in evaluation among doctors and to treat patients based on evaluation results that are more objective and highly reproducible.
  • the present disclosure proposes a medical support system, a medical support method, and an evaluation support device capable of presenting highly accurate evaluation results in real time.
  • According to the present disclosure, there is provided a medical support system including an information processing device and a program for causing the information processing device to execute information processing related to medical support. According to the program, the information processing device includes: an image acquisition unit that sequentially acquires a plurality of affected-area images obtained by imaging an affected area; a determination unit that determines whether each acquired affected-area image satisfies image quality conditions for evaluating the affected area; an evaluation execution determination unit that decides whether to execute evaluation of the affected area based on affected-area images determined to satisfy the image quality conditions being acquired consecutively; an evaluation unit that, when execution of the evaluation is decided, derives information about an evaluation value for the affected area based on the plurality of affected-area images determined to satisfy the image quality conditions; and an output unit that outputs the information about the evaluation value.
  • Also according to the present disclosure, there is provided a medical support method in which a processor: sequentially acquires a plurality of affected-area images obtained by imaging an affected area; determines whether each acquired affected-area image satisfies image quality conditions for evaluating the affected area; decides whether to execute evaluation of the affected area based on affected-area images determined to satisfy the image quality conditions being acquired consecutively; when execution of the evaluation is decided, derives information about an evaluation value for the affected area based on the plurality of affected-area images determined to satisfy the image quality conditions; and outputs the information about the evaluation value.
  • Further according to the present disclosure, there is provided an evaluation support device including: an image acquisition unit that sequentially acquires a plurality of evaluation-target images obtained by imaging an evaluation target; a determination unit that determines whether each acquired evaluation-target image satisfies image quality conditions for evaluating the evaluation target; an evaluation execution determination unit that decides whether to execute evaluation of the evaluation target based on evaluation-target images determined to satisfy the image quality conditions being acquired consecutively; an evaluation unit that, when execution of the evaluation is decided, derives information about an evaluation value for the evaluation target based on the plurality of evaluation-target images determined to satisfy the image quality conditions; and an output unit that outputs the information about the evaluation value together with the evaluation-target image.
  • FIG. 1 is a schematic diagram showing a technical overview of a medical support system 10 according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example configuration of a medical support system 10 according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart showing the flow of work by a user in the medical support method according to the embodiment of the present disclosure.
  • FIG. 4 is a flowchart (part 1) showing a score derivation method performed by the medical support device 200 in the medical support method according to the embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram illustrating an overview of skip frame determination and parallel processing in the derivation method according to the embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram illustrating an example of step determination in skip frame determination according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram illustrating details of skip frame determination in the derivation method according to the embodiment of the present disclosure.
  • FIG. 8 is a flowchart (part 2) showing a score derivation method performed by the medical support device 200 in the medical support method according to the embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram illustrating an example of a derivation method according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram (part 1) illustrating an example of display in the embodiment of the present disclosure.
  • FIG. 11 is a flowchart showing ranking display performed by the medical support device 200 in the medical support method according to the embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram (part 2) illustrating an example of display in the embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram illustrating an example of an ultrasound image.
  • FIG. 14 is a hardware configuration diagram showing an example of a computer that implements the functions of the medical support device 200.
  • FIG. 1 is a schematic diagram illustrating a technical overview of a medical support system 10 according to the present disclosure.
  • the medical support system 10 according to the embodiment of the present disclosure is a medical evaluation support system or a research evaluation system that can be used to determine the condition of an evaluation target in medical care or research.
  • Examples of evaluation targets for the medical support system 10 include humans (patients) and animals.
  • The doctor 1 examines the inside of the body of the patient 2 using, for example, the endoscope 100 included in the medical support system 10. At that time, an affected-area image 400 of the inside of the body of the patient 2 captured by the endoscope 100 is displayed by the display device 300 included in the medical support system 10. The doctor 1 can examine the inside of the body of the patient 2 by viewing the affected-area image 400 displayed on the display device 300. Furthermore, in the technology according to the present disclosure, the display device 300 displays, together with the affected-area image 400, the evaluation result of the disease state of the affected area derived by an algorithm (determiner) constructed by machine learning. Therefore, the doctor 1 can improve diagnostic accuracy by making a diagnosis that takes into account the evaluation result of the disease state produced by this algorithm.
  • In the present embodiment, the evaluation result is a score (evaluation value) consisting of a specific numerical value indicating the evaluation of the disease state of the affected area determined by a doctor (endoscopist) 1 visually recognizing the surface of the affected area, or a grade based on that score. The evaluation result also includes a value indicating the evaluation obtained by pathological examination of the affected area by a doctor (pathologist).
  • For example, the score (evaluation value) for the disease state of the affected area is a value scored by the doctor 1 for bleeding, ulceration, or the see-through blood vessel pattern of the affected area.
  • an evaluation value obtained by any known scoring method can be used as the score for the disease state of the affected area.
  • a score based on the Ulcerative Colitis Endoscopic Index of Severity can be used as the score for the disease condition of the affected area.
  • the UCEIS score is an index that has recently come into use as an evaluation value of the severity of ulcerative colitis.
  • UCEIS scores are defined for assessment of at least the see-through blood vessel pattern, bleeding, and ulceration of ulcerative colitis. Since the UCEIS score allows fine classification, it makes it possible to formulate a detailed treatment policy and to reduce evaluation variation among endoscopists.
  • the Mayo score can be used as the score for the disease condition of the affected area.
  • The Mayo score classifies the severity of ulcerative colitis on a scale from 0 to 3 for each item. Specifically, the Mayo score consists of evaluation values for the severity of ulcerative colitis (see-through blood vessel pattern, bleeding, and ulceration) based on endoscopic findings, for rectal bleeding, and for a comprehensive assessment by the doctor 1.
  • "Mayo score remission" is defined as the case where the sum of the subscores for the above items is 2 or less and no individual subscore exceeds 1.
  • "Mayo score improvement" is defined as the case where the Mayo score has decreased by 3 points or more, or by 30% or more, from baseline, and the rectal bleeding subscore has decreased by 1 or more or is 0 or 1.
  • "Mucosal healing" is defined as the case where the subscore for endoscopic findings is 0 or 1.
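The three Mayo-score definitions above can be sketched as simple predicates. This is an illustrative reading only; the function names and the subscore layout are assumptions of this sketch, not part of the disclosure:

```python
def mayo_remission(subscores):
    """Remission: sum of subscores is 2 or less and no subscore exceeds 1."""
    return sum(subscores) <= 2 and max(subscores) <= 1

def mayo_improvement(total, baseline_total, rb_sub, baseline_rb_sub):
    """Improvement: total decreased by >= 3 points or >= 30% from baseline,
    and the rectal-bleeding subscore decreased by >= 1 or is 0 or 1."""
    dropped = (baseline_total - total >= 3) or (
        baseline_total > 0 and (baseline_total - total) / baseline_total >= 0.30)
    rb_ok = (baseline_rb_sub - rb_sub >= 1) or rb_sub in (0, 1)
    return dropped and rb_ok

def mucosal_healing(endoscopy_sub):
    """Mucosal healing: endoscopic-findings subscore of 0 or 1."""
    return endoscopy_sub in (0, 1)
```

For example, a total falling from 9 to 5 with a rectal-bleeding subscore falling from 2 to 1 satisfies both conditions of the improvement definition.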
  • For example, the evaluation value of the pathological examination is an evaluation value determined by a doctor grading the diagnosis result of a pathological examination performed by biopsy of the affected area.
  • the Geboes biopsy tissue finding score can be used as the evaluation value of the pathological examination.
  • The Geboes score is an index generally used for scoring pathological findings of ulcerative colitis.
  • the medical support system 10 can be used for diagnosis of digestive system diseases using the endoscope 100, for example.
  • the medical support system 10 can be suitably used for diagnosis of inflammatory bowel disease such as ulcerative colitis or Crohn's disease.
  • In the following, a case where the medical support system 10 according to the embodiment of the present disclosure is used for diagnosing ulcerative colitis will be described as an example; however, application of the present embodiment is not limited to ulcerative colitis.
  • Ulcerative colitis, one of the inflammatory bowel diseases, is a chronic disease in which erosions and ulcers occur due to inflammation of the mucous membrane of the large intestine. It is said to affect more than 200,000 people, mainly young people. Major symptoms of ulcerative colitis include diarrhea, bloody stool, abdominal pain, fever, and anemia, and various complications may also occur. However, the cause of ulcerative colitis has not been identified, no definitive therapy has been established, and it is one of the specific diseases designated by the Ministry of Health, Labour and Welfare.
  • Ulcerative colitis may become chronic, repeating remission (periods in which symptoms are calm) and relapse (periods in which symptoms worsen), and a long disease duration is known to increase the risk of developing colorectal cancer. In light of this, it is considered very important to appropriately estimate the prognosis of ulcerative colitis based on disease activity evaluation by lower gastrointestinal endoscopy.
  • However, evaluation of ulcerative colitis and the like by visual recognition of the affected area depends on subjective judgment and is highly difficult, so it is difficult to avoid variations in evaluation among doctors 1.
  • tissue collection may damage the patient 2 and cause complications.
  • Furthermore, pathological examinations cannot be performed at every facility, and diagnosis takes time.
  • additional costs are incurred for the patient 2 by performing the pathological examination.
  • Therefore, a medical support system has been proposed in which an image of the affected area (affected-area image 400) is captured by the endoscope 100, and an AI system (medical support system 10) that functions as a determiner built in advance by machine learning analyzes the affected-area image 400 and evaluates the disease state of the affected area.
  • By using the AI system, the affected-area image 400 can be analyzed exhaustively compared to analysis by the human doctor 1, so omissions in diagnosis can be prevented.
  • In the course of intensive studies from the viewpoint of image analysis, the present inventors found that consecutive frames (still images making up the affected-area images 400) captured by the endoscope tend to be very similar to each other. Furthermore, they found that some of the frames are out of focus, contain motion blur due to movement of the subject or the endoscope, or show shadow changes caused by those movements, and that in some frames the image of the affected area is unclear due to evaporation or reflection of liquid present on the affected area. They also found that it is difficult to obtain highly accurate evaluation results when such frames are analyzed by an AI system.
  • the present inventors came to create the embodiments of the present disclosure described below based on the unique knowledge as described above.
  • In the embodiments of the present disclosure, the frames (still images) to be analyzed by the AI system are limited, for example, to frames that are properly focused and free of motion blur, rather than all obtained frames.
  • Excluding frames in which the affected area is unclear from the analysis targets is effective in improving the accuracy of evaluation by the AI system, and is also effective in reducing the amount of data to be processed. Therefore, according to the present embodiment, the processing load can be reduced while improving the accuracy of the evaluation, making it possible to display the affected-area image and present the evaluation result in real time.
  • the medical support system 10 mainly includes an endoscope 100, a medical support device 200, and a display device 300.
  • The medical support system 10 receives an affected-area image 400 captured by the endoscope 100 as input, derives a score (evaluation value) for the affected area in the medical support apparatus 200, and can present the score to a user (for example, the doctor 1) via the display device 300.
  • the configuration of the medical support system 10 shown in FIG. 2 is an example, and the configuration is not limited to that shown in FIG. Each device included in the medical support system 10 according to the present embodiment will be sequentially described below.
  • In this embodiment, an example of using the endoscope 100 as the imaging device for imaging the evaluation target is shown; however, this embodiment is not limited to the endoscope 100, and any imaging device capable of imaging the evaluation target, such as a microscope or an ultrasonic inspection device, may be used.
  • a visible light image such as a white light image obtained by the endoscope 100 may be used as the affected area image 400 of the affected area.
  • White-light images are the images that can be captured most easily with the basic functions of an endoscope, so if disease states can be evaluated using white-light images, the time, items, and costs of endoscopic examinations can be reduced.
  • The medical support apparatus 200 can, in parallel, process an affected-area image 400 captured by the endoscope 100 and display it on the display device 300 described later, and analyze the affected-area image 400 to derive an evaluation result of the disease state of the affected area. For example, the medical support apparatus 200 may subdivide the affected-area image 400, derive a score (more specifically, a tile evaluation value) for each subdivided image (more specifically, each tile image 402; see FIG. 9), and further derive the score for the entire image from these scores, thereby deriving the evaluation result of the disease state of the affected area.
  • The medical support apparatus 200 is composed of a computer having, for example, a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit) that perform various types of arithmetic processing, a ROM (Read Only Memory) in which data is stored in advance, and a RAM (Random Access Memory) for temporarily storing data. Details of the medical support device 200 will be described later.
  • the display device 300 can present the affected part image 400 and evaluation results (scores, etc.) to the user (for example, the doctor 1).
  • the display device 300 may be, for example, a touch panel display, a three-dimensional display, a spatial display, a projection display, or the like.
  • the display device 300 may be any of a CRT (Cathode Ray Tube) display, a liquid crystal display, a plasma display, an EL (Electro Luminescence) display, a laser projector, an LED (Light Emitting Diode) projector, and the like.
  • the medical support apparatus 200 includes an image acquisition unit 202, an image processing unit 204, a determination unit 206, an evaluation execution determination unit 208, an evaluation unit 210, an evaluation value determination unit 212, an output unit 214, and a storage unit 216.
  • Each functional unit included in the medical support device 200 according to this embodiment will be described below.
  • the image acquisition unit 202 can sequentially acquire a plurality of affected area images 400 (frames) (evaluation target images) obtained by imaging the affected area with the endoscope 100 and output them to the image processing unit 204 described later.
  • The blur value means a value that indexes the degree of blur occurring in the affected-area image 400.
  • The amount of motion means the amount of movement (relative position change) of the same subject between successive affected-area images (frames) 400.
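The disclosure does not fix concrete formulas for the blur value (F value) and the motion amount (M value). As one plausible sketch, the blur value could be the variance of a Laplacian response (higher for sharper frames, consistent with the later statement that a blur-free image has a sufficiently high F value), and the motion amount the mean absolute difference between consecutive frames; both choices are assumptions for illustration:

```python
import numpy as np

def blur_value(gray):
    """Sharpness index (F value): variance of the 4-neighbor Laplacian
    response over the image interior. A blurred frame scores low."""
    g = gray.astype(float)
    lap = (-4 * g[1:-1, 1:-1] + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def motion_amount(prev_gray, gray):
    """Motion index (M value): mean absolute difference between two
    consecutive grayscale frames; larger means more subject/camera motion."""
    return float(np.abs(gray.astype(float) - prev_gray.astype(float)).mean())
```

In practice any sharpness and motion metric with the same monotonic behavior (e.g. gradient energy, optical-flow magnitude) could serve in the thresholding described below.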
  • The evaluation execution determination unit 208 can decide whether to have the evaluation unit 210, described later, perform evaluation of the affected area, based on affected-area images 400 determined by the determination unit 206 to satisfy the image quality conditions being acquired consecutively. Specifically, the evaluation execution determination unit 208 can decide whether to perform the evaluation of the affected area based on the result of comparing the number of consecutively acquired affected-area images 400 determined to satisfy the image quality conditions with a predetermined threshold (second threshold). Details of the determination performed by the evaluation execution determination unit 208 will be described later.
  • For example, the evaluation unit 210 cuts out the affected-area image 400 into a plurality of tile images 402 (see FIG. 9) and derives, for each of the tile images 402, a tile evaluation value indicating the evaluation of the affected area. The evaluation unit 210 can then select one tile evaluation value from among the derived tile evaluation values and set it as the score (evaluation value) of the affected-area image 400. Alternatively, for example, the evaluation unit 210 may statistically process the derived tile evaluation values and acquire the average value, median value, or the like obtained by the statistical processing as the score of the affected-area image 400.
  • the algorithm (determiner) is generated by machine learning.
  • In this embodiment, the determiner can be generated by machine learning using a large number of affected-area images 400 together with at least one of the evaluation results of a doctor 1 (for example, a specialist) based on visual recognition of those affected-area images 400 and the evaluation results of pathological examinations by a pathologist.
  • By performing machine learning on evaluations from a plurality of doctors 1, it is possible to generate an algorithm (determiner) that performs evaluation with an accuracy comparable to the score differences between those doctors 1. For example, the algorithm (determiner) can be generated by machine learning using a deep neural network or the like.
  • the doctor 1 inserts the endoscope 100 into the body of the patient 2 (for example, the large intestine, etc.) (step S1).
  • FIG. 4 is a flowchart showing a score derivation method performed by the medical support device 200 in the medical support method according to this embodiment.
  • the score derivation method performed by the medical support device 200 can mainly include a plurality of steps from step S31 to step S39. Details of each of these steps according to the present embodiment will be sequentially described below.
  • the medical support device 200 sequentially acquires a plurality of affected area images 400 (frames) obtained by imaging the affected area with the endoscope 100 (step S31).
  • the medical support device 200 outputs the affected area image 400 processed in step S32 described above to the display device 300 (step S33).
  • the medical support apparatus 200 determines whether or not the affected area image 400 processed in step S32 described above satisfies the image quality conditions for evaluating the affected area (step S34). Specifically, the medical support apparatus 200 can make the determination by comparing the blur value and motion amount of the affected area image 400 with a predetermined threshold, for example. Then, the medical support apparatus 200 temporarily stores (buffers) the affected area image 400 that satisfies the image quality conditions for evaluating the affected area, and uses the stored affected area image 400 in subsequent processing. Further, the medical support apparatus 200 counts (accumulates) the number of consecutively acquired affected area images 400 determined to satisfy the image quality conditions for evaluating the affected area.
  • In this embodiment, when affected-area images 400 determined to satisfy the image quality conditions for evaluating the affected area are acquired consecutively, for example, 8 to 10 times (n times, the predetermined threshold), the medical support device 200 decides to perform the evaluation of the affected area.
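The consecutive-acquisition logic above can be sketched as a small counter that buffers frames passing the quality check and fires once n of them arrive in a row. The class name and the reset-on-failure behavior are assumptions for illustration, not taken from the disclosure:

```python
class SkipFrameCounter:
    """Buffers frames that pass the quality check and triggers evaluation
    once n consecutive frames have passed (the text gives n = 8 to 10
    as an example of the second threshold)."""

    def __init__(self, n=8):
        self.n = n
        self.buffer = []

    def push(self, frame, passed_quality):
        """Returns the buffered frames when evaluation should run, else None."""
        if not passed_quality:
            self.buffer.clear()  # a failing frame breaks the consecutive run
            return None
        self.buffer.append(frame)
        if len(self.buffer) >= self.n:
            frames, self.buffer = self.buffer, []
            return frames
        return None
```

A frame that fails the image quality conditions resets the run, so evaluation is only triggered on genuinely consecutive good frames.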
  • the processing of steps S34 and S35 is also processing for determining whether or not to execute the evaluation of the affected area based on the image quality of the frame (affected area image 400), so it is called "skip frame determination". Details of the skip frame determination will be described later.
  • When execution of the evaluation is decided, the medical support apparatus 200 derives a score (evaluation value) for the affected area in real time (step S36). Furthermore, in the present embodiment, the medical support apparatus 200 can use the derived score to derive a histological (G) score, such as a three-level index (for example, "Active" / "Healing" / "Unknown"). Details of the evaluation performed in step S36 will be described later.
  • the medical support apparatus 200 renders the score or the like so as to superimpose the score or the like derived in step S36 on the affected area image 400, and outputs the rendered score or the like to the display device 300 in real time (step S37). Details of the superimposed display executed in step S37 will be described later.
  • Next, the medical support apparatus 200 statistically compares the score (evaluation value) derived in step S36 described above with the scores of the affected-area images 400 acquired consecutively with the affected-area image 400 corresponding to that score, and determines whether or not the score is an outlier (step S38). If the score is not an outlier (step S38: Yes), the medical support apparatus 200 proceeds to step S39 and stores the score in the storage unit 216. On the other hand, if the score is an outlier (step S38: No), the medical support apparatus 200 excludes the score from the values to be stored and returns to the process of step S31.
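The outlier check in step S38 is described only as statistical processing over consecutive scores. One possible sketch uses a mean and standard-deviation test; the choice of statistic and the factor k are illustrative assumptions:

```python
from statistics import mean, stdev

def is_outlier(score, recent_scores, k=2.0):
    """Flags a score as an outlier when it deviates from the mean of the
    recent consecutive scores by more than k standard deviations.
    The concrete statistic is not fixed by the text; this is one option."""
    if len(recent_scores) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(recent_scores), stdev(recent_scores)
    if sigma == 0:
        return score != mu
    return abs(score - mu) > k * sigma
```

A robust variant (median and median absolute deviation) would follow the same pattern and be less sensitive to a single bad historical score.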
  • The medical support apparatus 200 stores the score (evaluation value) derived in step S36 described above and not excluded in step S38 in association with the affected-area image 400 corresponding to the score or the identification information (ID) of that image (step S39). The stored score is then used in the final diagnosis by the doctor 1 or the like.
  • FIG. 5 is a schematic diagram for explaining an overview of skip frame determination and parallel processing in the derivation method according to this embodiment.
  • FIG. 6 is a schematic diagram for explaining an example of step determination in skip frame determination according to this embodiment.
  • FIG. 7 is a schematic diagram for explaining the details of skip frame determination in the derivation method according to this embodiment.
  • the contrast ratio (maximum/minimum brightness) of the affected area image 400 may be used as an index used to determine whether the image quality conditions for evaluating the affected area are satisfied.
  • In this embodiment, the blur value (F value) of the affected-area image 400 is classified into three ranges, and the motion amount (M value) of the affected-area image 400 is compared with a predetermined threshold according to the classification of the blur value, making it possible to appropriately determine whether or not the image quality conditions for evaluating the affected area are satisfied.
  • Specifically, the blur value (F value) of the affected-area image 400 is classified into one of three ranges (upper range, intermediate range, lower range). Which range the blur value falls into can be determined by comparing the blur value with preset threshold values.
  • the affected area image 400 without blur has a sufficiently high blur value (F value). Therefore, as shown in FIG. 6, when the blur value is classified into the upper range, it is determined that the image quality conditions for evaluating the affected area are satisfied.
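The stepwise determination can be sketched as follows. Only the upper-range rule is stated explicitly above; treating the intermediate range as motion-dependent and the lower range as always failing is one plausible reading of the step determination, and all threshold values here are placeholders:

```python
def passes_quality(f_value, m_value,
                   f_upper=200.0, f_lower=50.0, m_limit=5.0):
    """Stepwise skip-frame judgment (thresholds are placeholders).
    Upper F range: sharp enough regardless of motion.
    Intermediate F range: accepted only if the motion amount is small.
    Lower F range: too blurred to evaluate."""
    if f_value >= f_upper:
        return True
    if f_value >= f_lower:
        return m_value <= m_limit
    return False
```

As the text notes, such thresholds would in practice be tuned per endoscope, since each device affects image quality differently.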
  • By appropriately setting each of the threshold values described above, it is possible to appropriately extract the portions that the doctor 1 wants to examine.
  • the image quality is affected by the endoscope 100 , so it is preferable that each threshold is set individually for each endoscope 100 .
  • the above threshold may be set using a Look Up Table (LUT).
  • In this case, the LUT is created as a two-dimensional table covering all normalized patterns along the blur-value and motion-amount axes. Thresholds are then set in the LUT to classify frames according to the two values of blur and motion. Although this method can also handle local values, errors may occur depending on the fineness of the divisions in the LUT.
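A minimal sketch of such a two-dimensional LUT over normalized blur and motion axes; the pass/fail pattern and the bin counts are chosen arbitrarily for illustration and are not specified by the disclosure:

```python
import numpy as np

def build_lut(f_bins=4, m_bins=4):
    """Pass/fail LUT over normalized (blur, motion) axes.
    Placeholder rule: the sharpest band passes at any motion, the next
    band passes only at the lowest motion, everything else fails."""
    lut = np.zeros((f_bins, m_bins), dtype=bool)
    lut[-1, :] = True   # top blur-value band: always acceptable
    lut[-2, 0] = True   # second band: acceptable only with minimal motion
    return lut

def lut_pass(lut, f_norm, m_norm):
    """Look up a normalized F/M pair (both expected in [0, 1])."""
    f_bins, m_bins = lut.shape
    fi = min(int(f_norm * f_bins), f_bins - 1)
    mi = min(int(m_norm * m_bins), m_bins - 1)
    return bool(lut[fi, mi])
```

Finer bin counts reduce the quantization error the text warns about, at the cost of a larger table to calibrate.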
  • Alternatively, the thresholds may be set by calibration. For example, using a color chart or a resolution chart, various adjustments (calibrations) may be performed against thresholds predetermined by the medical support system 10 so as to obtain evaluation results above a certain level. With this method, the white level and black level can also be obtained, so contrast can be adjusted in addition to color and resolution.
  • The medical support apparatus 200 performs the stepwise threshold determination described above on the blur value (F value) and the motion amount (M value) acquired as described above, and determines whether the affected-area image 400 satisfies the image quality conditions for evaluating the affected area (threshold processing). Subsequently, the medical support apparatus 200 stores the consecutively acquired affected-area images 400 determined to satisfy the image quality conditions and accumulates their count (accumulation processing). Then, when affected-area images 400 determined to satisfy the image quality conditions have been obtained n consecutive times (the predetermined threshold), the medical support apparatus 200 decides to execute the evaluation of the affected area (judgment processing).
  • FIG. 8 is a flowchart showing a score (evaluation value) derivation method performed by the medical support device 200 in the medical support method according to this embodiment.
  • FIG. 9 is a schematic diagram illustrating an example of the derivation method according to this embodiment.
  • In this embodiment, the evaluation unit 210 can derive the score for the affected area in real time based on a plurality of consecutive affected-area images 400.
  • an example of the score derivation method performed by the medical support device 200 can mainly include a plurality of steps from step S361 to step S364. The details of each of these steps according to the present embodiment will be sequentially described below.
  • the evaluation unit 210 cuts out the affected area image 400 as a tile-shaped tile image 402 having a polygonal shape, as shown in FIG. 9 (step S362).
  • the tile image 402 is preferably sized such that the disease state of the affected area can be determined from it; for example, it may include an image of an object to be examined such as a blood vessel or an ulcer.
  • the evaluation unit 210 may cut out the affected area image 400 into a rectangular shape such as a square.
  • the evaluation unit 210 may cut out the affected area image 400 so that the circumscribed circle of each quadrangular tile has a diameter of 5 mm or more and 15 mm or less.
  • the shape of the tile image 402 is not limited to a polygon, and may be any shape that can subdivide the affected area image 400.
  • the tile image 402 may be cut out in a random shape that differs depending on the location.
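The rectangular tiling described above (step S362) might look like the following sketch, shown here on a plain list-of-rows image. The function name and tile dimensions are assumptions for illustration only:

```python
def cut_into_tiles(image, tile_h, tile_w):
    """Cut a 2-D image (a list of rows) into non-overlapping rectangular tiles.

    Edge tiles are smaller when the image size is not an exact multiple
    of the tile size, so every pixel belongs to exactly one tile.
    """
    tiles = []
    h = len(image)
    w = len(image[0]) if h else 0
    for top in range(0, h, tile_h):
        for left in range(0, w, tile_w):
            tile = [row[left:left + tile_w] for row in image[top:top + tile_h]]
            tiles.append(tile)
    return tiles
```

In practice the image would be a camera frame (e.g. a NumPy array) and the tile size would be chosen so that each tile covers roughly the 5-15 mm region mentioned above, but the slicing logic is the same.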
  • the evaluation unit 210 derives a tile evaluation value for each evaluation item.
  • the evaluation item may relate to bleeding in the affected area, a tumor, or a fluoroscopic image of blood vessels, or may relate to a pathological examination.
  • the evaluation unit 210 may derive a tile evaluation value that mixes several evaluation items.
  • for example, a tile evaluation value may be derived using the evaluation values of at least two of bleeding, ulceration, and fluoroscopic images of the affected area.
  • the tile evaluation value may be obtained by summing the evaluation values of at least two of the bleeding, ulcer, and fluoroscopic images of the affected area.
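One way to combine per-item evaluation values into a single tile evaluation value, as in the summation described above, is sketched below. The item names and the optional weighting are illustrative assumptions:

```python
def combined_tile_value(item_scores, weights=None):
    """Combine per-item scores (e.g. bleeding, ulcer, vascular pattern)
    into one tile evaluation value by a weighted sum.

    item_scores: dict mapping evaluation item name -> score for this tile.
    weights: optional dict of per-item weights; unlisted items weigh 1.0,
             so the default is a plain sum of the item scores.
    """
    if weights is None:
        weights = {}
    return sum(weights.get(item, 1.0) * value for item, value in item_scores.items())
```

A weight greater than 1.0 for, say, the bleeding item would correspond to the weighting of bleeding sites mentioned later in the score derivation.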
  • the evaluation unit 210 derives the overall score (evaluation value) of the affected area image 400 based on the tile evaluation values (step S364). This allows the user (for example, the doctor 1) to evaluate the entire affected area image 400 rather than only a portion of it, thereby enabling more accurate diagnosis of ulcerative disease.
  • the score may be derived as an average value obtained by averaging the tile evaluation values of the tile images 402 included in the affected area image 400.
  • the evaluation unit 210 may derive a score by weighting the bleeding site.
  • the evaluation unit 210 may derive a score using the maximum value or the median value of the tile evaluation values instead of or in addition to the average value of the tile evaluation values.
  • the evaluation unit 210 may further use the statistical dispersion (variance, standard deviation, etc.) of the tile evaluation values to derive the score.
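The aggregation options just listed (average, maximum, median) can be sketched as a single helper; the function name and the method labels are assumptions for illustration:

```python
import statistics

def aggregate_score(tile_values, method="mean"):
    """Derive the overall score of an affected area image from its
    tile evaluation values, using one of the statistics named in the text."""
    if method == "mean":
        return statistics.mean(tile_values)
    if method == "max":
        return max(tile_values)
    if method == "median":
        return statistics.median(tile_values)
    raise ValueError(f"unknown aggregation method: {method}")
```

The dispersion mentioned above could be computed alongside with `statistics.pstdev(tile_values)` and used, for example, to flag images whose tiles disagree strongly.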
  • the derived score (evaluation value) 500 may be used to derive a three-level index (for example, "Active" / "Healing" / "Unknown"), and this index, instead of the score 500 itself, may be superimposed on the affected area image 400.
  • the whole or part of the affected area image 400 may be displayed with a color or brightness according to the derived score 500.
  • the score 500 superimposed on the affected area image 400 may be displayed with a color, brightness, character type, size, or thickness corresponding to its value.
  • Outlier detection: next, the outlier detection process, which is the process of step S36 described above, will be described.
  • by detecting and excluding outliers, the quality of the scores referred to when making a final diagnosis is improved, and the accuracy of the final diagnosis by the doctor 1 can be further improved.
  • it has been found that detecting as outliers the scores (evaluation values) 500 larger than the position of the second delimiter (the 50th percentile, i.e., the median) yields scores 500 closer to the evaluation results based on clinical findings by the doctor 1 than detecting as outliers the scores 500 larger than the position of the first delimiter from the large-value side (the 75th percentile, i.e., the third quartile). Further, in the present embodiment, it is preferable to appropriately select which range is to be detected as outliers according to the number of obtained scores 500 and the statistical processing method used.
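A quartile-based outlier detector of the kind described above might be sketched as follows. The function name and the cutoff labels are assumptions; note that Python's `statistics.quantiles` uses the exclusive method by default, so other quantile conventions would give slightly different boundaries:

```python
import statistics

def detect_outliers(scores, cutoff="median"):
    """Flag scores above a chosen quartile boundary as outliers.

    cutoff: "q3" -> scores above the 75th percentile (third quartile),
            "median" -> scores above the 50th percentile (median),
            corresponding to the two delimiter positions in the text.
    """
    q1, median, q3 = statistics.quantiles(scores, n=4)  # [Q1, median, Q3]
    limit = q3 if cutoff == "q3" else median
    return [s for s in scores if s > limit]
```

With the "median" cutoff, more of the large scores are excluded, which the text reports brought the remaining scores closer to clinical findings.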
  • the ranking display according to this embodiment can mainly include a plurality of steps from step S41 to step S44. The details of each of these steps according to the present embodiment will be sequentially described below.
  • the top scores 500 from a plurality of scores (evaluation values) 500 generated in real time are displayed in a ranking format.
  • all data related to the score 500, such as the corresponding affected area image 400, the tile images 402, and the tile evaluation values, are displayed. Therefore, the doctor 1 can confirm the diagnosis based on the various data regarding the score 500 that he or she selected.
  • the medical support apparatus 200 selects a predetermined number of top scores 500, in descending order, from among the plurality of stored scores 500 from which the outlier scores (evaluation values) 500 have been excluded, and displays them in a ranking format (step S41).
  • the display device 300 can display a ranking table 502, as shown in FIG. Note that the ranking table 502 may have, for example, a cursor 600 or the like for moving the displayed score 500 up and down.
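Selecting the top scores for the ranking table 502 (step S41) reduces to a sort; the record layout and function name below are illustrative assumptions, with each score paired with an image identifier so a ranking row can link back to its affected area image:

```python
def top_ranking(score_records, n=5):
    """Return the top-n records sorted by score, descending.

    score_records: list of (score, image_id) pairs, one per stored
    affected area image (outliers assumed already excluded).
    """
    return sorted(score_records, key=lambda record: record[0], reverse=True)[:n]
```

Keeping the image identifier in each record is what allows step S43 to fetch the affected area image 400, tile images 402, and tile evaluation values for a selected row.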
  • the medical support apparatus 200 acquires data such as the affected area image 400, the tile image 402, and the tile evaluation value corresponding to the selected score (evaluation value) 500 from the storage unit 216 (step S43).
  • the medical support apparatus 200 displays all the data obtained in step S43, including the affected area image 400, the tile images 402, the score (evaluation value) 500, and the tile evaluation values (step S44).
  • the display device 300 may display the affected area image 400 corresponding to the selected score 500 and a collective image 404 in which the tile images 402 cut out from that affected area image 400 are aggregated. At this time, the collective image 404 may be an image in which the tile images 402 are colored differently according to their tile evaluation values. Further, the collective image 404 may be displayed together with the collective images 404 of the affected area images 400 acquired before and after it.
  • the display device 300 may also display a cursor 600 to chronologically change the collective image 404 to be displayed. In this case, the doctor 1 can change the collective image 404 to be displayed in time series by moving the cursor 600 left and right.
  • adjacent affected area images 400 are very similar to each other; therefore, a point at which the similarity between adjacent affected area images 400 drops may be regarded as a change point (scene change) in the observation of the affected area.
  • one affected area image 400 may be selected from among the plurality of affected area images 400 included in one observation section (scene) and displayed as the affected area image 400 representing that observation section (scene). By doing so, the overall case can be grasped easily.
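Splitting the image sequence into observation sections at similarity drops, as described above, might be sketched like this. The similarity measure itself (e.g. a normalized correlation between frames) is left abstract, and the threshold and function name are assumptions:

```python
def segment_scenes(similarities, threshold=0.8):
    """Segment a sequence of frames into observation sections (scenes).

    similarities: similarities[i] is the similarity between adjacent
    frames i and i+1, so k similarity values describe k+1 frames.
    A value below threshold marks a change point (scene change).
    Returns (first_frame, last_frame) index pairs, one per scene.
    """
    scenes, start = [], 0
    for i, sim in enumerate(similarities):
        if sim < threshold:
            scenes.append((start, i))  # frames start..i form one scene
            start = i + 1
    scenes.append((start, len(similarities)))  # final scene up to the last frame
    return scenes
```

A representative affected area image for each scene could then be chosen from within each returned index range, as the text suggests.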
  • ulcerative colitis patients (770 in total) underwent endoscopy with the endoscope 100 using this embodiment.
  • an endoscopist collected mucosal biopsies from five regions of the large intestine (ascending colon, transverse colon, descending colon, sigmoid colon, and rectum), performed pathological evaluation, and compared the results with the scores according to the present embodiment. Additionally, a UCEIS score was determined for each patient by an expert physician and compared with the score according to the present embodiment.
  • the presence or absence of histological inflammation could be evaluated in real time in 81.0% of all patients, as compared with the results of pathological evaluation. Furthermore, the accuracy for histological remission was 97.9% in sensitivity and 94.6% in specificity.
  • the intraclass correlation coefficient between the scores according to the present embodiment and the expert scores for the UCEIS score was 0.927, indicating that the scores according to the present embodiment have accuracy comparable to the expert scores. It is said that an intraclass correlation coefficient of 0.85 or more can be regarded as clinically consistent.
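The sensitivity and specificity figures quoted above are computed from a standard confusion matrix; a minimal sketch (function name assumed, not from the disclosure) is:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute the two figures of merit quoted for histological remission.

    Sensitivity = TP / (TP + FN): fraction of true remission cases detected.
    Specificity = TN / (TN + FP): fraction of non-remission cases correctly rejected.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

The intraclass correlation coefficient comparing the embodiment's scores with expert scores is a separate agreement statistic; in practice it would typically be computed with a statistics package rather than by hand.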
  • the present embodiment is not limited to application to ulcerative colitis, and can be applied to examinations and diagnoses that require real-time endoscopy (oral, transnasal, or of the large intestine), for example:
  • scirrhous gastric cancer (cancer that progresses by permeating the stomach wall and tissues, making the stomach hard and thick)
  • early esophageal cancer
  • early colon cancer
  • Crohn's disease (a type of inflammatory bowel disease; a chronic disease of unknown cause in which erosion and ulcers occur due to inflammation arising mainly in the digestive tract, such as the small intestine and large intestine)
  • an ultrasound image is acquired by the doctor 1 or the like while moving the head portion of the probe, and for highly accurate diagnosis it is required to be a highly visible image with appropriate contrast.
  • by applying the present embodiment to the ultrasound image 410, it is possible to extract only images suitable for diagnosis and analyze only those extracted images, thereby enabling accurate evaluation in real time.
  • for analysis by the AI system, only affected area images 400 suitable for analysis and evaluation of the affected area are used, for example, affected area images 400 with appropriate focus and affected area images 400 without motion blur.
  • the analysis uses consecutive affected area images 400 that are similar to one another. By doing so, according to the present embodiment, it is possible to reduce the processing load while improving the accuracy of the evaluation, so that the evaluation result can be presented in real time together with the display of the affected area image 400.
  • the CPU 1100 (including the Graphics Processing Unit (GPU)) operates based on programs stored in the ROM 1300 or HDD 1400 and controls each part.
  • the ROM 1300 stores a boot program executed by the CPU 1100 when the computer 1000 is started up, a program depending on the hardware of the computer 1000, and the like.
  • the HDD 1400 stores programs executed by the CPU 1100 and data used by these programs.
  • Communication interface 1500 receives data from another device via a predetermined communication network, sends the data to CPU 1100, and transmits data generated by CPU 1100 to another device via a predetermined communication network.
  • the media interface 1700 reads programs or data stored in the recording medium 1800 and provides them to the CPU 1100 via the RAM 1200.
  • CPU 1100 loads such a program from recording medium 1800 onto RAM 1200 via media interface 1700, and executes the loaded program.
  • the recording medium 1800 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or a PD (Phase-change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, a semiconductor memory, or the like.
  • the CPU 1100 of the computer 1000 executes a program loaded on the RAM 1200 to realize the functions of the image acquisition unit 202, the image processing unit 204, the determination unit 206, the evaluation execution determination unit 208, the evaluation unit 210, the evaluation value determination unit 212, the output unit 214, and the like.
  • CPU 1100 of computer 1000 reads these programs from recording medium 1800 and executes them, but as another example, these programs may be obtained from another device via a predetermined communication network.
  • the HDD 1400 stores programs, data, and the like according to the embodiment of the present disclosure.
  • the above-described embodiment of the present disclosure may include, for example, a medical support method executed by the medical support system 10 as described above, a program for operating the medical support system 10, and a non-transitory tangible medium in which the program is recorded. Also, the program may be distributed via a communication line (including wireless communication) such as the Internet.
  • (1) A medical support system including: an information processing device; and a program for causing the information processing device to execute information processing related to medical support,
  • wherein, according to the program, the information processing device functions as: an image acquisition unit that sequentially acquires a plurality of affected area images obtained by imaging an affected area; a determination unit that determines whether each acquired affected area image satisfies an image quality condition for evaluating the affected area; an evaluation execution determination unit that determines whether or not to perform evaluation of the affected area based on the consecutive acquisition of affected area images determined to satisfy the image quality condition; an evaluation unit that, when execution of the evaluation of the affected area is determined, derives information regarding an evaluation value for the affected area based on the plurality of affected area images determined to satisfy the image quality condition; and an output unit that outputs the information regarding the evaluation value.
  • (6) The medical support system according to any one of (1) to (5) above, wherein the evaluation execution determination unit determines whether or not the evaluation of the affected area can be performed based on a comparison result between the number of consecutively acquired affected area images determined to satisfy the image quality condition and a second threshold.
  • (7) The medical support system according to any one of (1) to (6) above, wherein the evaluation unit analyzes a plurality of consecutively acquired affected area images using an algorithm obtained by machine learning to derive an evaluation value for the affected area.
  • (8) The medical support system according to (7) above, wherein the evaluation unit cuts out the affected area image as a plurality of tiled tile images and derives a tile evaluation value indicating an evaluation of the affected area in each of the plurality of tile images.
  • (9) The medical support system according to (8) above, wherein the evaluation unit selects one of the derived tile evaluation values as the evaluation value.
  • the evaluation unit obtains the evaluation value by statistically processing the plurality of derived tile evaluation values.
  • the output unit displays the evaluation value or an index based on the evaluation value superimposed on the affected area image in real time.
  • (12) The medical support system according to any one of (8) to (11) above, further comprising: an evaluation value determination unit that detects outliers based on statistical processing from among the plurality of derived evaluation values and excludes the detected outliers from the plurality of evaluation values; and a storage unit that stores the plurality of non-excluded evaluation values in association with identification information of the affected area image corresponding to each of the evaluation values.
  • (13) The medical support system according to (12) above, wherein the evaluation value determination unit detects the outliers using one of quartiles, standard deviation, the Smirnov-Grubbs test, and the Dixon test.


Abstract

The invention relates to a medical support system comprising: an image acquisition unit (202) that sequentially acquires a plurality of affected area images obtained by imaging an affected area; a determination unit (206) that determines whether the acquired affected area images satisfy an image quality condition for evaluating the affected area; an evaluation execution determination unit (208) that determines whether evaluation of the affected area can be executed based on the consecutive acquisition of affected area images determined to satisfy the image quality condition; an evaluation unit (210) that, when it is determined that the affected area is to be evaluated, derives information regarding an evaluation value for the affected area based on the plurality of affected area images determined to satisfy the image quality condition; and an output unit (214) that outputs information regarding the evaluation value.
PCT/JP2022/010844 2021-08-27 2022-03-11 Système d'assistance médicale, procédé d'assistance médicale et dispositif d'assistance à l'évaluation WO2023026538A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021139204 2021-08-27
JP2021-139204 2021-08-27

Publications (1)

Publication Number Publication Date
WO2023026538A1 true WO2023026538A1 (fr) 2023-03-02

Family

ID=85322640

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/010844 WO2023026538A1 (fr) 2021-08-27 2022-03-11 Système d'assistance médicale, procédé d'assistance médicale et dispositif d'assistance à l'évaluation

Country Status (1)

Country Link
WO (1) WO2023026538A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017199635A1 (fr) * 2016-05-18 2017-11-23 オリンパス株式会社 Dispositif d'analyse d'image, système d'analyse d'image et procédé de fonctionnement pour dispositif d'analyse d'image
WO2019083019A1 (fr) * 2017-10-26 2019-05-02 富士フイルム株式会社 Dispositif de traitement d'image médicale et dispositif d'endoscope
WO2020008834A1 (fr) * 2018-07-05 2020-01-09 富士フイルム株式会社 Dispositif de traitement d'image, procédé et système endoscopique
WO2020054543A1 (fr) * 2018-09-11 2020-03-19 富士フイルム株式会社 Dispositif et procédé de traitement d'image médicale, système d'endoscope, dispositif de processeur, dispositif d'aide au diagnostic et programme
WO2020121906A1 (fr) * 2018-12-13 2020-06-18 ソニー株式会社 Système d'assistance médicale, dispositif d'assistance médicale, et procédé d'assistance médicale
WO2020218029A1 (fr) * 2019-04-26 2020-10-29 Hoya株式会社 Système d'endoscope électronique et dispositif de traitement de données



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22860836

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE