WO2023209884A1 - Medical support system and image display method - Google Patents


Info

Publication number
WO2023209884A1
WO2023209884A1 (PCT/JP2022/019130)
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
support system
information
medical support
Prior art date
Application number
PCT/JP2022/019130
Other languages
English (en)
Japanese (ja)
Inventor
卓志 永田
聡美 小林
和也 古保
和也 渡辺
功 舘下
珠帆 宮内
諒 小熊
Original Assignee
オリンパスメディカルシステムズ株式会社
Priority date
Filing date
Publication date
Application filed by オリンパスメディカルシステムズ株式会社
Priority to PCT/JP2022/019130
Publication of WO2023209884A1

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/045 — Control thereof

Definitions

  • the present disclosure relates to a medical support system and an image display method that display images taken inside a subject.
  • a doctor uses an endoscope to photograph the inside of a subject and observes images displayed on a display device.
  • an image showing a location to be observed such as a lesion
  • the doctor operates the release switch of the endoscope to capture (save) the endoscopic image.
  • the doctor re-observes (interprets) the captured images, so the more images are captured, the longer it takes to observe the images.
  • Patent Document 1 discloses a technique for identifying reference images that are dissimilar to each other from a plurality of images taken by a capsule endoscope, and displaying a list of the identified reference images arranged at equal intervals in a grid pattern. When one of the images displayed in the list is selected, the selected image is enlarged and displayed, allowing the doctor to observe the details of the image.
  • Displaying a list of multiple endoscopic images can be said to be a suitable display method for doctors to grasp an overview of the entire examination. For example, by displaying a list of multiple images showing lesions, a doctor can efficiently grasp the status of multiple lesions; it is therefore preferable to display the images in a way that does not cause confusion between different lesions.
  • the present disclosure has been made in view of these circumstances, and its purpose is to provide a technique for displaying images taken inside a subject in a manner that is easy for doctors to observe.
  • a medical support system according to one aspect of the present disclosure includes a processor having hardware; the processor acquires imaging information of each of a plurality of images taken inside a subject, calculates the difference in the imaging information between a first image and a second image that are consecutive in chronological order, determines based on the calculated difference whether the first image and the second image belong to the same group, and displays the first image and the second image so as to visually indicate the determination result.
  • Another aspect of the present invention is an image display method in which imaging information of each of a plurality of images taken inside a subject is acquired, the difference between the imaging information of a first image and a second image that are consecutive in time series is calculated, whether the first image and the second image belong to the same group is determined based on the calculated difference, and the first image and the second image are displayed so as to visually indicate the determination result.
  • FIG. 1 is a diagram showing the configuration of a medical support system according to an embodiment.
  • FIG. 2 is a diagram showing functional blocks of a server device.
  • FIG. 3 is a diagram showing functional blocks of an information processing device.
  • FIG. 4 is a diagram showing an example of a report creation screen.
  • FIG. 5 is a flowchart of processing for displaying a plurality of lesion images in groups.
  • FIGS. 6 to 13 are diagrams each showing an example of a list screen of endoscopic images.
  • FIG. 1 shows the configuration of a medical support system 1 according to an embodiment.
  • the medical support system 1 is installed in a medical facility such as a hospital that performs endoscopy.
  • a server device 2, an image analysis device 3, an image storage device 8, an endoscope system 9, and a terminal device 10b are communicably connected via a network 4 such as a LAN (local area network).
  • the endoscope system 9 is installed in an examination room and includes an endoscope observation device 5 and a terminal device 10a.
  • the server device 2, image analysis device 3, and image storage device 8 may be provided outside the medical facility, for example, as a cloud server.
  • An endoscope 7 inserted into the patient's digestive tract is connected to the endoscopic observation device 5.
  • the endoscope 7 has an insertion section that is inserted into a subject, an operation section provided on the proximal end side of the insertion section, and a universal cord extending from the operation section.
  • the endoscope 7 is detachably connected to the endoscope observation device 5 by a scope connector provided at the end of the universal cord.
  • the elongated insertion section has a hard distal end, a curved section formed to be freely curved, and a flexible elongated tube section in order from the distal end to the proximal end.
  • a plurality of magnetic coils are arranged at predetermined intervals along the longitudinal direction of the insertion section inside the distal end, the curved section, and the flexible tube section, and the magnetic coils generate magnetic fields according to coil drive signals supplied from the endoscope observation device 5.
  • the endoscope 7 has a light guide for transmitting the illumination light supplied from the endoscope observation device 5 to illuminate the inside of the digestive tract; its distal end is provided with an illumination window for emitting the illumination light transmitted by the light guide onto the living tissue, and a photographing unit that photographs the living tissue at a predetermined period and outputs an imaging signal to the endoscope observation device 5.
  • the imaging unit includes a solid-state image sensor (for example, a CCD image sensor or a CMOS image sensor) that converts incident light into an electrical signal.
  • the endoscopic observation device 5 generates an endoscopic image by performing image processing on the image signal photoelectrically converted by the solid-state image sensor of the endoscope 7, and displays the endoscopic image on the display device 6 in real time. In addition to normal image processing such as A/D conversion and noise removal, the endoscopic observation device 5 may have a function of performing special image processing for the purpose of highlighting and the like.
  • the imaging frame rate of the endoscope 7 is preferably 30 fps or more, and may be 60 fps.
  • the endoscopic observation device 5 generates endoscopic images at a cycle of the imaging frame rate.
  • the endoscopic observation device 5 may be configured by one or more processors having dedicated hardware, but may also be configured by one or more processors having general-purpose hardware.
  • the endoscope 7 of the embodiment is a flexible endoscope and has a forceps channel for inserting an endoscopic treatment tool. By inserting biopsy forceps into the forceps channel and manipulating the inserted biopsy forceps, the doctor can perform a biopsy and collect a portion of the diseased tissue during an endoscopy.
  • the doctor operates the endoscope 7 according to the examination procedure and observes the endoscopic image displayed on the display device 6.
  • a doctor usually inserts an endoscope 7 for lower part examination from the anus to the terminal ileum, and while withdrawing the endoscope 7, observes the terminal ileum and the large intestine in this order.
  • a doctor inserts an endoscope 7 for upper examination into the duodenum through the mouth, and while pulling out the endoscope 7, observes the duodenum, stomach, and esophagus in order.
  • the doctor may observe the esophagus, stomach, and duodenum in order while inserting the endoscope 7.
  • the endoscopic observation device 5 captures an endoscopic image at the timing when the release switch is operated, and sends the captured endoscopic image, together with information (an image ID) for identifying the endoscopic image, to the image storage device 8.
  • the endoscopic observation device 5 may assign image IDs including serial numbers to endoscopic images in the order in which they are captured. Note that the endoscopic observation device 5 may send a plurality of captured endoscopic images together to the image storage device 8 after the end of the examination.
  • the image storage device 8 records the endoscopic image transmitted from the endoscopic observation device 5 in association with the examination ID that identifies the endoscopic examination.
  • imaging means an operation in which the solid-state imaging device of the endoscope 7 converts incident light into an electrical signal.
  • imaging may include the operation from the converted electrical signal to the endoscope observation device 5 generating an endoscopic image, and may further include the operation until displaying it on the display device 6.
  • capture means an operation of acquiring an endoscopic image generated by the endoscopic observation device 5.
  • capture may include an operation of saving (recording) an acquired endoscopic image.
  • a photographed endoscopic image is captured when the doctor operates the release switch, but the photographed endoscopic image may be automatically captured regardless of the operation of the release switch.
  • the endoscopic observation device 5 has a drive circuit and generates coil drive signals for driving a plurality of magnetic coils provided in the insertion section of the endoscope 7.
  • the endoscopic observation device 5 supplies a coil drive signal to the plurality of magnetic coils of the endoscope 7, so that the plurality of magnetic coils generate a magnetic field.
  • the endoscopic observation device 5 includes a receiving antenna having a plurality of coils that three-dimensionally detects the magnetic field generated by each of the plurality of magnetic coils.
  • the receiving antenna detects the magnetic field generated by each of the plurality of magnetic coils, and generates a magnetic field detection signal according to the strength of the detected magnetic field.
  • the endoscopic observation device 5 acquires the positions of the plurality of magnetic coils within the subject based on the magnetic field detection signal output from the receiving antenna. Specifically, the endoscopic observation device 5 may obtain, as the positions of the plurality of magnetic coils, a plurality of three-dimensional coordinate values in a virtual spatial coordinate system whose origin or reference point is a predetermined position within the subject (the mouth for upper endoscopy, the anus for lower endoscopy). The endoscope observation device 5 has a function of generating, from the three-dimensional coordinate values of the plurality of magnetic coils, insertion shape information indicating the shape of the endoscope inserted into the subject, and of specifying the position of the endoscope tip.
  • the terminal device 10a is provided in the examination room and includes an information processing device 11a and a display device 12a.
  • the terminal device 10a may be used by a doctor, a nurse, or the like to check information regarding the biological tissue being imaged in real time during an endoscopy.
  • the terminal device 10b includes an information processing device 11b and a display device 12b, and is provided in a room other than the examination room.
  • the terminal device 10b is used by a doctor when creating a report of an endoscopy.
  • the terminal devices 10a, 10b may be configured with one or more processors having general-purpose hardware.
  • the endoscopic observation device 5 displays the endoscopic image on the display device 6 in real time, and also supplies the endoscopic image to the image analysis device 3 in real time together with the imaging information of the image.
  • the photographing information includes at least the frame number of the image and photographing time information indicating the time when the image was photographed; the frame number may be information indicating the number of frames since the endoscope 7 started photographing.
  • the endoscope observation device 5 has a function of specifying the position of the endoscope tip in the subject, and includes the position of the endoscope tip in the imaging information provided to the image analysis device 3. In other words, it may include photographing position information indicating the position where the image was photographed.
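  • as a rough sketch, the imaging information described above could be represented as a simple record; the class and field names below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ImagingInfo:
    """Imaging information attached to an endoscopic image (illustrative)."""
    frame_number: int   # number of frames since the endoscope 7 started photographing
    shot_time: float    # photographing time, e.g. seconds since the examination started
    # photographing position information (endoscope-tip coordinates), when available
    position: Optional[Tuple[float, float, float]] = None
```

  • the optional position field reflects that photographing position information is only included when the endoscope observation device 5 can specify the tip position.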
  • the image analysis device 3 is an electronic computer that analyzes endoscopic images, detects lesions included in the endoscopic images, and qualitatively diagnoses the detected lesions.
  • the image analysis device 3 may be a CAD (computer-aided diagnosis) system having an AI (artificial intelligence) diagnosis function.
  • the image analysis device 3 may be composed of one or more processors with dedicated hardware, but may also be composed of one or more processors with general-purpose hardware.
  • the image analysis device 3 uses a trained model generated by machine learning that uses, as training data, endoscopic images for learning, information indicating the organs and parts included in the endoscopic images, and information on the lesion areas included in the endoscopic images. Annotation work on endoscopic images is performed by an annotator with specialized knowledge such as a doctor, and a type of deep learning such as a CNN, RNN, or LSTM may be used for the machine learning.
  • when this trained model receives an endoscopic image, it outputs information indicating the photographed organ, information indicating the photographed region, and information regarding the photographed lesion (lesion information).
  • the lesion information output by the image analysis device 3 includes at least lesion presence/absence information indicating whether a lesion is included in the endoscopic image (a lesion is captured).
  • the lesion information may include information indicating the size of the lesion, information indicating the position of the outline of the lesion, information indicating the shape of the lesion, information indicating the depth of invasion of the lesion, and the qualitative diagnosis result of the lesion.
  • the qualitative diagnosis result of a lesion may include information indicating the type of lesion.
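  • the lesion information fields listed above could likewise be sketched as a record; the names and types are hypothetical and only illustrate the structure:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class LesionInfo:
    """Lesion information output by the image analysis device 3 (illustrative)."""
    has_lesion: bool                                 # lesion presence/absence information
    size_mm: Optional[float] = None                  # size of the lesion
    contour: Optional[List[Tuple[int, int]]] = None  # positions of the lesion outline
    shape: Optional[str] = None                      # shape of the lesion
    invasion_depth: Optional[str] = None             # depth of invasion
    diagnosis: Optional[str] = None                  # qualitative diagnosis, e.g. lesion type
```

  • only the presence/absence field is mandatory here, matching the statement that the lesion information includes at least lesion presence/absence information.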
  • the image analysis device 3 is provided with endoscopic images and imaging information from the endoscopic observation device 5 in real time, and outputs, for each endoscopic image, information indicating the organ, information indicating the site, and lesion information.
  • the information indicating the organ, the information indicating the site, and the lesion information output for each endoscopic image are collectively referred to as "image analysis information".
  • when the capture operation is performed, the endoscopic observation device 5 provides the image analysis device 3 with information indicating that the capture operation has been performed (capture operation information), together with the captured endoscopic image, its photographing information, and its image ID.
  • the photographing information includes a frame number, photographing time information, and photographing position information.
  • when the image analysis device 3 acquires the capture operation information, it provides the server device 2 with the examination ID, the image ID, the imaging information, and the image analysis information generated for the endoscopic image of that image ID.
  • the image ID, imaging information, and image analysis information constitute "additional information" that expresses the characteristics and properties of the endoscopic image.
  • when the image analysis device 3 acquires the capture operation information, it transmits the additional information together with the examination ID to the server device 2, and the server device 2 records the additional information in association with the examination ID.
  • when the user finishes the endoscopic examination, he or she operates the examination end button on the endoscopic observation device 5.
  • the operation information of the examination end button is supplied to the server device 2 and the image analysis device 3, and the server device 2 and the image analysis device 3 recognize the end of the endoscopy.
  • FIG. 2 shows functional blocks of the server device 2.
  • the server device 2 includes a communication section 20, a processing section 30, and a storage device 60.
  • the communication unit 20 transmits and receives information such as data and instructions to and from the image analysis device 3, endoscope observation device 5, image storage device 8, terminal device 10a, and terminal device 10b via the network 4.
  • the processing section 30 includes an order information acquisition section 40 and an additional information acquisition section 42.
  • the storage device 60 includes an order information storage section 62 and an additional information storage section 64.
  • the server device 2 includes a computer, and the various functions shown in FIG. 2 are realized by the computer executing programs.
  • a computer includes a memory for loading a program, one or more processors for executing the loaded program, an auxiliary storage device, other LSI, and the like as hardware.
  • a processor is constituted by a plurality of electronic circuits including a semiconductor integrated circuit and an LSI, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips.
  • the functional blocks shown in FIG. 2 are realized by the cooperation of hardware and software; those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof.
  • the order information acquisition unit 40 acquires order information for endoscopy from the hospital information system. For example, before the start of a day's testing work at a medical facility, the order information acquisition section 40 acquires the order information for that day from the hospital information system and stores it in the order information storage section 62. Before the start of the examination, the endoscopic observation device 5 or the information processing device 11a may read order information for the examination to be performed from the order information storage unit 62 and display it on the display device 6 or the display device 12a.
  • the additional information acquisition unit 42 acquires the examination ID and the additional information of the endoscopic image from the image analysis device 3, and stores the additional information in the additional information storage unit 64 in association with the examination ID.
  • Additional information on the endoscopic image includes an image ID, imaging information, and image analysis information.
  • FIG. 3 shows functional blocks of the information processing device 11b.
  • the information processing device 11b has a function of supporting test report creation work, and includes a communication section 76, an input section 78, a processing section 80, and a storage device 120.
  • the communication unit 76 transmits and receives information such as data and instructions to and from the server device 2, image analysis device 3, endoscope observation device 5, image storage device 8, and terminal device 10a via the network 4.
  • the processing unit 80 includes an operation reception unit 82, an acquisition unit 84, a display screen generation unit 100, an image group identification unit 102, a display control unit 104, and a registration processing unit 106.
  • the acquisition unit 84 includes an image acquisition unit 86 and an additional information acquisition unit 88.
  • the storage device 120 includes an image storage section 122 and an additional information storage section 124.
  • the information processing device 11b includes a computer, and the various functions shown in FIG. 3 are realized by the computer executing programs.
  • a computer includes a memory for loading a program, one or more processors for executing the loaded program, an auxiliary storage device, other LSI, and the like as hardware.
  • a processor is constituted by a plurality of electronic circuits including a semiconductor integrated circuit and an LSI, and the plurality of electronic circuits may be mounted on one chip or on a plurality of chips.
  • the functional blocks shown in FIG. 3 are realized by the cooperation of hardware and software; those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof.
  • after the endoscopic examination is completed, the user, who is a doctor, enters a user ID and password into the information processing device 11b to log in.
  • when the user logs in, an application for creating examination reports is started, and a list of completed examinations is displayed on the display device 12b.
  • examination information such as the patient name, patient ID, examination date and time, and examination items is displayed in the list, and the user operates the input section 78, such as a mouse or keyboard, to select the examination for which a report is to be created.
  • when the operation reception unit 82 receives the operation for selecting an examination, the image acquisition unit 86 acquires, from the image storage device 8, a plurality of endoscopic images linked to the examination ID of the examination selected by the user.
  • the display screen generation unit 100 generates a report creation screen and displays it on the display device 12b.
  • FIG. 4 shows an example of a report creation screen for inputting test results.
  • the report creation screen is displayed on the display device 12b with the report tab 54b selected.
  • information about the patient's name, patient ID, date of birth, test items, test date, and administering doctor is displayed. These pieces of information are included in the inspection order information and may be acquired from the server device 2.
  • the report creation screen consists of two areas: the left area is an attached image display area 56 that displays attached endoscopic images, and the right area is an input area 58 for the user to input examination results.
  • the input area 58 is provided with areas for inputting the diagnostic details of the "esophagus", "stomach", and "duodenum", which are the observation ranges in upper endoscopy.
  • the input area 58 may have a format in which a plurality of options for examination results are displayed and the user inputs the diagnosis content by selecting check boxes, but it may also have a free format in which the user freely inputs text.
  • the attached image display area 56 is an area for displaying endoscopic images attached to a report side by side.
  • the user selects an endoscopic image to be attached to the report from the endoscopic image list screen.
  • when the user selects the recorded image tab 54a, the display screen generation unit 100 generates a list screen in which a plurality of endoscopic images acquired in the examination are arranged, and displays it on the display device 12b.
  • the multiple endoscopic images acquired during the examination are images captured based on instructions from the user who operates the medical device that photographs the inside of the subject, specifically the endoscope 7, but they may instead be images automatically captured by the endoscopic observation device 5.
  • the display screen generation unit 100 may generate a list screen in which, among the plurality of endoscopic images obtained in the examination, images containing lesions (hereinafter also referred to as "lesion images") are arranged, and display it on the display device 12b.
  • the display screen generation unit 100 may generate a list screen in which all lesion images whose lesion presence/absence information indicates that a lesion is included are lined up, or, depending on the purpose of observation, it may generate a list screen of lesion images containing a specific type of lesion, such as a hemorrhagic lesion or a malignant tumor.
  • the image group identification unit 102 has a grouping function that determines, for a plurality of endoscopic images acquired in an examination, whether two chronologically consecutive images belong to the same group. The image group identification unit 102 may apply the grouping function to all of the endoscopic images acquired in the examination to classify all of them into groups, but in the following example the image group identification unit 102 extracts lesion images from the plurality of endoscopic images acquired in the examination and applies the grouping function to the plurality of lesion images. The image group identification unit 102 may identify the lesion images by referring to the lesion presence/absence information included in the additional information of each image stored in the additional information storage unit 124.
  • FIG. 5 is a flowchart of processing for displaying a plurality of lesion images in groups.
  • the image group specifying unit 102 acquires imaging information included in the additional information of each lesion image from the additional information storage unit 124 (S10).
  • the image group specifying unit 102 calculates the difference between the photographing information of two consecutive images in time series (S12).
  • two images that are consecutive in time series are two images that are adjacent to each other when a plurality of lesion images are arranged in order of imaging time.
  • of the two images, the image taken temporally earlier will be referred to as the "first image", and the image taken temporally later will be referred to as the "second image".
  • the imaging information includes a frame number, imaging time information, and imaging position information; the image group identification unit 102 divides the plurality of lesion images into groups using at least one of the imaging time information and the imaging position information. In the following, a case will be described in which the image group identification unit 102 groups the plurality of lesion images using the imaging time information.
  • the image group identifying unit 102 determines whether or not two chronologically consecutive images, that is, the first image and the second image, belong to the same group based on the calculated difference in the photographing times (S14).
  • a small difference in shooting times means that the first image and the second image were taken at close timings, while a large difference in shooting times means that they were taken at distant timings.
  • the image group identifying unit 102 may compare the difference in photographing times with a predetermined threshold, and determine whether the first image and the second image belong to the same group based on the comparison result. In the embodiment, if the first image and the second image belong to the same group, the lesions shown in the first image and the second image are considered to be the same.
  • the threshold may include a first threshold and a second threshold that is smaller than the first threshold.
  • the first threshold may be 60 seconds and the second threshold may be 15 seconds.
  • the image group identification unit 102 determines that the first image and the second image belong to different groups when the difference in shooting times is greater than or equal to the first threshold, and determines that the first image and the second image belong to the same group when the difference in shooting times is less than the second threshold. Note that when the difference in shooting times is greater than or equal to the second threshold and less than the first threshold, the image group identification unit 102 determines that it cannot be decided whether the first image and the second image belong to the same group or to different groups.
  • when the image group identification unit 102 finishes the determination process for the first image and the second image but has not yet processed all of the images (N in S16), it sets the second image as a new first image, selects the image captured temporally next after it as a new second image, and performs steps S12 and S14 for the new combination. When the determination has been performed for all chronologically consecutive pairs of images, the image group identification unit 102 ends the grouping of the plurality of images.
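  • the two-threshold decision (S14) and the pairwise loop (S12 to S16) can be sketched as follows; this is a minimal illustration that assumes the 60-second and 15-second example thresholds and shooting times expressed in seconds, not the actual implementation:

```python
FIRST_THRESHOLD = 60.0   # seconds; at or above this, different groups
SECOND_THRESHOLD = 15.0  # seconds; below this, same group

def judge_pair(time_diff: float) -> str:
    """Decide whether two chronologically consecutive images belong to the same group (S14)."""
    if time_diff >= FIRST_THRESHOLD:
        return "different"
    if time_diff < SECOND_THRESHOLD:
        return "same"
    return "undecidable"  # at or above the second threshold and below the first

def group_images(shot_times: list) -> list:
    """Apply S12 (compute the difference) and S14 (judge) to every consecutive pair,
    advancing the (first image, second image) window one image at a time (S16)."""
    return [judge_pair(second - first)
            for first, second in zip(shot_times, shot_times[1:])]
```

  • for shooting times [0.0, 5.0, 100.0, 130.0], the sketch yields ["same", "different", "undecidable"], since the consecutive differences are 5, 95, and 30 seconds.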
  • the display control unit 104 displays the first image and the second image so as to visually indicate the determination result by the image group identification unit 102 (S18).
  • FIG. 6 shows an example of an endoscopic image list screen.
  • the endoscopic image list screen is displayed on the display device 12b with the recorded image tab 54a selected, and in this example, a list screen of a plurality of lesion images is displayed.
  • the display screen generation unit 100 arranges a plurality of endoscopic images in the list display area 50 in the order in which they were taken. In the list display area 50, the endoscopic images may be displayed as reduced thumbnail images. Note that the display screen generation unit 100 may refer to the additional information of each endoscopic image and display, together with each endoscopic image, its image ID, the site name indicating the site included in the image, and information indicating the lesion type.
  • Check boxes are provided in the endoscopic images displayed in the list display area 50.
  • When the user operates a check box, the operation reception unit 82 accepts the operation, and the corresponding endoscopic image is selected as an attached image of the report.
  • the endoscopic images selected as report-attached images are displayed side by side in the attached-image display area 56 (see FIG. 4) on the report creation screen.
  • When the user performs an operation instructing enlarged display of an endoscopic image, the operation reception unit 82 accepts the operation, and the display screen generation unit 100 generates a display screen including the enlarged endoscopic image, which allows editing, and displays it on the display device 12b.
  • the user operation for instructing enlarged display may be a double-click operation on the endoscopic image.
  • the display control unit 104 determines the placement positions of the first image and the second image by setting the display interval between them based on the determination result of whether the temporally adjacent first image and second image belong to the same group. In the embodiment, when it is determined that the first image and the second image belong to the same group, the display control unit 104 makes the display interval (W1) between them shorter than a predetermined interval (W2); when it is determined that they belong to different groups, it makes the display interval (W3) between them longer than the predetermined interval (W2).
  • the combinations of images whose display interval is set to W1 are the image with ID1 and the image with ID2, the image with ID5 and the image with ID6, the image with ID9 and the image with ID10, and the image with ID10 and the image with ID11.
  • the user can recognize that the two adjacent images belong to the same group, that is, that the same lesion is depicted in the two adjacent images. The user can therefore recognize that the same lesion A appears in the images with ID1 and ID2, that the same lesion B appears in the images with ID5 and ID6, and that the same lesion C appears in the images with ID9, ID10, and ID11.
  • the user can easily select images to be attached to a report.
  • the combinations of images for which the display interval is set to W3 are an image with ID4 and an image with ID5, and an image with ID7 and an image with ID8.
  • the display interval between the images with ID2 and ID3, with ID3 and ID4, with ID8 and ID9, and with ID11 and ID12 is set to an interval W2 that is larger than W1 and smaller than W3. By placing two adjacent images at this intermediate distance, neither close together nor far apart, the user can recognize that the lesions in the two adjacent images must be observed and carefully judged as to whether or not they are the same lesion.
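The interval rule above can be expressed compactly. The pixel widths below are hypothetical; the description only fixes their ordering, W1 < W2 < W3.

```python
# Hypothetical pixel widths; the description only requires W1 < W2 < W3.
W1, W2, W3 = 8, 24, 48

def display_interval(relation):
    """Map the group determination for two adjacent images to the gap
    placed between their thumbnails on the list screen."""
    if relation == "same":
        return W1   # close together: clearly the same lesion
    if relation == "different":
        return W3   # far apart: clearly different lesions
    return W2       # intermediate: the user must judge carefully
```

An undetermined pair thus receives the intermediate gap W2, visually prompting careful inspection.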
  • the display control unit 104 preferably adds, on or around the two images at a line-folding position, information (not shown in FIG. 6) indicating whether those two images belong to the same group.
  • FIG. 7 shows an example of a list screen of endoscopic images.
  • the display control unit 104 displays a time indicating a difference in shooting time between adjacent images.
  • In FIG. 7, the time indicating the difference in shooting times is displayed between all pairs of adjacent images; however, the time may be displayed only when the adjacent images belong to different groups, for example.
  • The above describes a case in which the image group specifying unit 102 groups a plurality of lesion images using imaging time information; a plurality of lesion images may also be grouped using imaging position information.
  • the image group specifying unit 102 calculates the difference between the shooting position of the first image and the shooting position of the second image, and groups the first image and the second image into the same group based on the calculated difference between the shooting positions. Determine whether it belongs.
  • A small difference in the shooting positions means that the first image and the second image were taken at close positions, while a large difference means that they were taken at positions far apart from each other.
  • the image group identifying unit 102 may compare the difference between the photographing positions with a predetermined threshold, and determine whether the first image and the second image belong to the same group based on the comparison result.
  • the threshold may include a first threshold and a second threshold that is smaller than the first threshold.
  • the first threshold may be 7 cm and the second threshold may be 2 cm.
  • the image group identification unit 102 determines that the first image and the second image belong to different groups when the difference in the photographing positions is greater than or equal to the first threshold, and determines that they belong to the same group when the difference is less than the second threshold. When the difference in the photographing positions is greater than or equal to the second threshold and less than the first threshold, the image group identification unit 102 decides that it cannot determine whether the first image and the second image belong to the same group or to different groups.
  • the image group specifying unit 102 calculates the difference in photographing positions for every combination of temporally adjacent first and second images, and determines, based on each calculated difference, whether the adjacent first and second images belong to the same group.
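With the example thresholds given above (7 cm and 2 cm), the position-based variant follows the same two-threshold pattern as the time-based one. Representing a shooting position as a single scalar value in centimetres is a simplifying assumption made for this sketch.

```python
FIRST_THRESHOLD_CM = 7.0    # example value from the description
SECOND_THRESHOLD_CM = 2.0   # example value from the description

def classify_by_position(p1, p2):
    """Relate two temporally adjacent images by the difference of their
    shooting positions (modelled here as scalar positions in cm)."""
    diff = abs(p2 - p1)
    if diff >= FIRST_THRESHOLD_CM:
        return "different"     # far apart: different groups
    if diff < SECOND_THRESHOLD_CM:
        return "same"          # close: same group
    return "undetermined"      # in between: cannot be determined
```

For example, images taken 1.5 cm apart are classified as the same group, 8 cm apart as different groups, and 4 cm apart as undetermined.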
  • When the display screen generation unit 100 generates a list screen in which the lesion images are arranged, the display control unit 104 displays the first image and the second image so as to visually indicate the determination result by the image group identification unit 102 (see FIGS. 6 and 7). Note that although FIG. 7 shows the display control unit 104 displaying the time indicating the difference in shooting times between adjacent images, a distance indicating the difference in shooting positions may be displayed together with that time, or instead of it.
  • FIG. 8 shows another example of the endoscopic image list screen.
  • the display screen generation unit 100 arranges a plurality of endoscopic images in the list display area 50 according to the order in which they were taken.
  • the display control unit 104 may display information indicating that the first image and the second image belong to the same group.
  • the display control unit 104 surrounds multiple images belonging to the same group with a frame 90.
  • the display control unit 104 may add information indicating that the images belong to different groups to a plurality of images belonging to different groups.
  • the display control unit 104 may allow the user to distinguish between groups by changing the color of the frame 90 for each group. By displaying in this manner, it is possible to prevent a doctor from confusing whether lesions shown in a plurality of images are the same lesion or different lesions.
  • the display control unit 104 may add marks of the same shape and color to a plurality of images belonging to the same group, and marks of different shapes and/or different colors to images belonging to different groups.
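One way to realise per-group frames (or marks) of distinct colours, as described for FIG. 8, is to assign colours from a palette in order of each group's first appearance. The palette itself is hypothetical; only the property that same-group images share a colour and neighbouring groups differ matters.

```python
from itertools import cycle

PALETTE = ["red", "blue", "green", "orange"]  # hypothetical frame colours

def assign_frame_colors(group_ids):
    """Give each group one frame colour so that images in the same group
    share a colour and images in different groups are distinguishable."""
    palette = cycle(PALETTE)   # reuse colours if there are many groups
    color_of = {}
    colors = []
    for g in group_ids:
        if g not in color_of:          # first image of a new group
            color_of[g] = next(palette)
        colors.append(color_of[g])
    return colors
```

For group assignments `[0, 0, 1, 1, 1, 2]` this yields `["red", "red", "blue", "blue", "blue", "green"]`.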
  • the image group identifying unit 102 determines whether two consecutive images in time series include the same lesion based on the imaging information.
  • the image group identifying unit 102 may determine whether two consecutive images in time series include the same organ or site based on the imaging information.
  • the same group may be a group of images depicting the same organ or site within the subject.
  • the image group identification unit 102 may display the plurality of endoscopic images divided into regions, based on the position where each endoscopic image was taken or on the region of the subject shown in each endoscopic image.
  • the region may be an organ unit or a site unit.
  • FIG. 9 shows another example of the endoscopic image list screen.
  • the display screen generation unit 100 displays a plurality of endoscopic images by dividing them into regions (organs) according to the order of imaging.
  • the display screen generation unit 100 vertically divides the list display area 50 into regions and displays endoscopic images belonging to each region.
  • images ID1 to ID4 are images of the duodenum, and images ID5 to ID14 are images of the stomach; the display screen generation unit 100 displays the endoscopic images divided by organ.
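The organ-by-organ division of FIG. 9 amounts to splitting the shooting-order list at each organ change. A sketch, assuming each image carries an (id, organ) pair as additional information:

```python
from itertools import groupby

def rows_by_organ(images):
    """Split an image list, kept in shooting order, into one display row
    per contiguous run of the same organ."""
    return [(organ, [img_id for img_id, _ in run])
            for organ, run in groupby(images, key=lambda item: item[1])]
```

For images 1 to 4 of the duodenum followed by images 5 and 6 of the stomach, the result is `[("duodenum", [1, 2, 3, 4]), ("stomach", [5, 6])]`; `itertools.groupby` groups only consecutive items, which matches the requirement that shooting order be preserved within and across rows.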
  • FIG. 10 shows another example of the endoscopic image list screen.
  • the display screen generation unit 100 displays a plurality of endoscopic images by dividing them into regions (organs) according to the order of imaging.
  • the display screen generation unit 100 displays the endoscopic images divided by organ, placing an interval wider than the normal inter-image interval between the part where images including one organ are displayed and the part where images including another organ are displayed.
  • the endoscopic observation device 5 sends the captured image to the image storage device 8, but in a modified example, the image analysis device 3 may send the captured image to the image storage device 8. Further, in the embodiment, the information processing device 11b has the processing section 80, but in a modified example, the server device 2 may have the processing section 80.
  • a method of displaying a plurality of images acquired by a doctor using an endoscope 7 inserted into a patient's gastrointestinal tract has been described.
  • This method can be applied when displaying a plurality of images acquired by a capsule endoscope with a shooting frame rate of 2 fps or more.
  • Images taken in capsule endoscopy are associated, as imaging information, with information indicating the time at which each image was taken and/or information indicating the position at which it was taken; based on this imaging information, it may be determined whether the first image and the second image that are consecutive in time series belong to the same group.
  • the information indicating the position photographed by the capsule endoscope may be derived by a known method using the magnetic sensor of the capsule endoscope and differences in the received power of the wireless signal transmitted from the capsule endoscope.
  • the present disclosure can be used in the technical field of displaying images obtained during inspection.

Abstract

According to the present invention, an image group identification unit 102 acquires imaging information for each of a plurality of images obtained by imaging the inside of a subject, and calculates a difference in imaging information between a first image and a second image that are consecutive in time series. The image group identification unit 102 determines, based on the calculated difference, whether the first image and the second image belong to the same group. A display control unit 104 displays the first image and the second image so as to visually indicate the determination result.
PCT/JP2022/019130 2022-04-27 2022-04-27 Medical support system and image display method WO2023209884A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/019130 WO2023209884A1 (fr) 2022-04-27 2022-04-27 Medical support system and image display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/019130 WO2023209884A1 (fr) 2022-04-27 2022-04-27 Medical support system and image display method

Publications (1)

Publication Number Publication Date
WO2023209884A1 true WO2023209884A1 (fr) 2023-11-02

Family

ID=88518374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/019130 WO2023209884A1 (fr) 2022-04-27 2022-04-27 Système d'assistance médicale et méthode d'affichage d'image

Country Status (1)

Country Link
WO (1) WO2023209884A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006015125A (ja) * 2004-05-31 2006-01-19 Toshiba Corp Group information creation system, group information creation method, and group information creation program
WO2014192504A1 (fr) * 2013-05-31 2014-12-04 Konica Minolta, Inc. Image processing device and program
JP2016517114A (ja) * 2013-04-19 2016-06-09 Koninklijke Philips N.V. Grouping of image annotations

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006015125A (ja) * 2004-05-31 2006-01-19 Toshiba Corp Group information creation system, group information creation method, and group information creation program
JP2016517114A (ja) * 2013-04-19 2016-06-09 Koninklijke Philips N.V. Grouping of image annotations
WO2014192504A1 (fr) * 2013-05-31 2014-12-04 Konica Minolta, Inc. Image processing device and program

Similar Documents

Publication Publication Date Title
JP5394622B2 (ja) Medical guide system
US8049777B2 Insertion support system for specifying a location of interest as an arbitrary region and also appropriately setting a navigation leading to the specified region
JP4875416B2 (ja) Medical guide system
US20080303898A1 Endoscopic image processing apparatus
JP2017108792A (ja) Endoscope operation support system
JP7270658B2 (ja) Image recording device, operation method of image recording device, and image recording program
JP2012024509A (ja) Image processing device, method, and program
WO2021176664A1 (fr) Medical examination support system, medical examination support method, and program
JP2009022446A (ja) System and method for integrated display in medicine
JP2016143194A (ja) Medical image processing device, medical image processing method, and medical image processing program
JP4686279B2 (ja) Medical diagnostic device and diagnosis support device
CN116097287A (zh) Computer program, learning model generation method, surgery assistance device, and information processing method
JP2017099509A (ja) Endoscope operation support system
JP6258026B2 (ja) Ultrasonic diagnostic device
WO2023209884A1 (fr) Medical support system and image display method
JP2022071617A (ja) Endoscope system and endoscope device
JP2011024913A (ja) Medical image processing device, medical image processing program, and X-ray CT device
WO2019088008A1 (fr) Image processing apparatus, image processing method, program, and endoscope system
JP2017086685A (ja) Endoscope operation support system
JP2005131031A (ja) Image display device, image display method, and image display program
JP4615842B2 (ja) Endoscope system and endoscope image processing device
WO2023195103A1 (fr) Examination support system and examination support method
WO2023145078A1 (fr) Medical support system and medical support method
WO2023166647A1 (fr) Medical support system and image display method
WO2023135816A1 (fr) Medical support system and medical support method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22940162

Country of ref document: EP

Kind code of ref document: A1