WO2022195725A1 - Information processing device and image display method - Google Patents

Information processing device and image display method

Info

Publication number
WO2022195725A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, user, unit, images, observed
Application number
PCT/JP2021/010634
Other languages
English (en)
Japanese (ja)
Inventor
珠帆 宮内
卓志 永田
聡美 小林
奈々 杉江
Original Assignee
Olympus Medical Systems Corp.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Olympus Medical Systems Corp.
Priority to PCT/JP2021/010634
Publication of WO2022195725A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/045: Control thereof

Definitions

  • The present disclosure relates to technology for displaying a plurality of images.
  • Patent Document 1 discloses a video playback method in which, when the main playback of video data is interrupted, playback history information identifying the portion of the video data at which the main playback was interrupted is stored in association with the video data; based on the playback history information, provisional playback start information identifying the portion from which provisional playback should start is generated, and the video data is provisionally played back from the portion identified by the provisional playback start information.
  • An information processing apparatus according to one aspect includes a photographed image arrangement unit that arranges a plurality of photographed images side by side in a list display area; a reproduction unit that reproduces the images in the order in which they were photographed; an observation determination unit that determines whether or not a reproduced image has been observed by the user; and a processing execution unit that, when a reproduced image is determined to have been observed, executes processing indicating that the image arranged in the list display area has been observed.
  • An image display method according to another aspect includes displaying a plurality of captured images side by side in a list display area, reproducing the images in the order in which they were captured, determining whether or not a reproduced image has been observed by the user, and, if a reproduced image is determined to have been observed, processing the corresponding image arranged in the list display area to indicate that it has been observed.
  • FIG. 3 is a diagram showing an example of an endoscopic image interpretation screen.
  • FIG. 4 is a diagram showing a playback button for an endoscopic image.
  • FIG. 5 is a diagram showing an example of the peripheral image playback area superimposed on the list display area.
  • FIG. 6 is a diagram showing a display example of the peripheral image playback area during forward playback.
  • FIG. 7 is a diagram showing a display example of the peripheral image playback area while paused.
  • FIG. 8 is a diagram showing an example of the result of marking processing applied to the list display area.
  • FIG. 9 is a diagram showing an example of the captured image display area.
  • FIG. 10 is a diagram showing another example of the result of marking processing applied to the list display area.
  • FIG. 1 is a diagram for explaining the outline of the image observation support system according to the embodiment.
  • As shown in FIG. 1, the image observation support system 1 includes a capsule endoscope 3 that is introduced into the body of a subject (patient) and transmits image data captured inside the subject, a receiving device 4 that receives the image data transmitted from the capsule endoscope 3, and an information processing device 10 that acquires the image data group from the receiving device 4, performs predetermined image processing, and displays endoscopic images on a display device 12.
  • The cradle 5 is connected to the information processing device 10 by a data transfer cable, and the receiving device 4 is connected to the information processing device 10 via the cradle 5 by being inserted into the cradle 5.
  • the subject has multiple receiving antennas (not shown) attached to the abdomen, and swallows the capsule endoscope 3 through the mouth while the receiving device 4 is attached to the waist with a belt.
  • The capsule endoscope 3 comprises an imaging unit that is an ultra-compact camera, an illumination unit that illuminates the inside of the subject's body, a signal processing unit that performs A/D conversion on the imaging signals output from the imaging unit and generates image data to which an image ID and imaging time information are added as metadata, a memory that temporarily stores the image data, a communication module that transmits the image data stored in the memory, and a battery that supplies power to each unit.
  • the capsule endoscope 3 periodically captures still images while moving through the gastrointestinal tract, and transmits the image data to the receiving device 4 via an antenna.
  • The receiving device 4 has a built-in recording medium; it associates the image data received by each receiving antenna with the radio wave intensity at the time of reception by that antenna and records them on the recording medium.
  • Since the capsule endoscope 3 captures an image of the inside of the body every 0.5 seconds, about 60,000 pieces of image data are recorded on the recording medium by the time imaging of the inside of the body is completed after about 8 hours.
  • The image ID is information for identifying an image and may include a serial number indicating the imaging order. For example, "1" may be assigned as the image ID of the first captured endoscopic image and "2" as that of the second.
  • Because the serial number in each image ID expresses the imaging order, duplicate image IDs are avoided.
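  • As a toy illustration (not part of the patent itself), the serial-number scheme can be sketched as follows: numbering frames in capture order keeps every image ID unique while making the imaging order recoverable from the ID alone.

```python
# Hypothetical sketch: assign 1-based serial numbers as image IDs in
# capture order, so IDs are unique and encode the imaging order.
def serial_image_ids(frames):
    return {n + 1: frame for n, frame in enumerate(frames)}

ids = serial_image_ids(["first.raw", "second.raw"])
assert ids == {1: "first.raw", 2: "second.raw"}
```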
  • the image ID and the shooting time information may be added to the image data as metadata by the receiving device 4 when the receiving device 4 receives the image data.
  • the image data captured by the capsule endoscope 3 is recorded in the recording medium with the image ID and capturing time information added as metadata, and associated with the received radio wave intensity.
  • After imaging is completed, the receiving antennas and the receiving device 4 are collected from the subject.
  • When the data terminal of the receiving device 4 is connected to the connector of the cradle 5, the roughly 60,000 pieces of image data and the radio wave intensity information recorded on the recording medium are transferred to the information processing device 10.
  • FIG. 2 shows functional blocks of the information processing device 10 .
  • the information processing apparatus 10 includes an acquisition unit 20, an image diagnosis unit 22, an image processing unit 24, an operation reception unit 26, a screen generation unit 30, an observation determination unit 40, a process execution unit 42, an image capture unit 44, and a storage unit 50.
  • The screen generation unit 30 has a photographed image arrangement unit 32, a reproduction processing unit 34, and a captured image placement unit 36, and generates the display screen.
  • the processing execution unit 42 may be one function of the screen generation unit 30 .
  • Each function of the information processing apparatus 10 may be realized by executing various applications such as an image reproduction application.
  • The configuration of the information processing apparatus 10 can be implemented, in terms of hardware, by arbitrary processors, memories, and other LSIs, and, in terms of software, by programs loaded into memory; the figure depicts functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware only, by software only, or by a combination thereof.
  • the acquisition unit 20 acquires about 60,000 pieces of image data and radio wave intensity information transferred from the cradle 5 and stores them in the storage unit 50 .
  • The storage unit 50 is a large-capacity recording device such as an HDD (hard disk drive) or SSD (solid state drive), and may be a built-in recording device or an external recording device connected to the information processing apparatus 10. The radio wave intensity information is used in processing to identify the movement locus of the capsule endoscope 3.
  • The image data transferred from the cradle 5 is either uncompressed RAW image data or RAW image data that has undergone only lossless compression, so its data size is very large. Therefore, when the acquisition unit 20 acquires the image data, the image processing unit 24 may apply irreversible compression to the RAW image data, generate an endoscopic image to which the image ID and imaging time information are added as metadata, and store it in the storage unit 50. For example, the image processing unit 24 may compress the endoscopic RAW image into an image format such as JPEG.
  • the image diagnosis unit 22 has a function of diagnosing endoscopic images.
  • The image diagnosis function of the image diagnosis unit 22 can be realized using a trained model generated by machine learning, with endoscopic images captured in the past as learning images and labels indicating lesions as teacher data.
  • the trained model outputs lesion information regarding the endoscopic image.
  • the lesion information includes at least information indicating whether or not a lesion has been detected, and may include a lesion name if a lesion has been detected.
  • the image diagnosis unit 22 inputs the endoscopic images to the learned model according to the imaging order included in the image ID, and acquires the lesion information from the learned model.
  • the image diagnosis unit 22 has a function of distinguishing and recognizing lesions included in a plurality of endoscopic images.
  • the image diagnosis unit 22 assigns identification information (lesion ID) to the lesion detected by the learned model, and adds the lesion ID to the lesion information of the endoscopic image.
  • The image diagnosis unit 22 determines whether a detected lesion is the same as a lesion to which a lesion ID has already been assigned. If it is, the image diagnosis unit 22 adds that existing lesion ID to the lesion information of the endoscopic image containing the lesion. If it is not, the image diagnosis unit 22 assigns a new lesion ID to the detected lesion and adds the newly assigned lesion ID to the lesion information of the endoscopic image.
  • In the embodiment, the image diagnosis unit 22 performs image diagnosis using endoscopic images compressed by the image processing unit 24, but it may instead use endoscopic RAW images that have not been compressed. When the image diagnosis unit 22 acquires the lesion information of an endoscopic image, the image processing unit 24 adds the lesion information to the metadata of that endoscopic image. Therefore, the metadata of each endoscopic image stored in the storage unit 50 includes at least the image ID, the imaging time information, and the lesion information.
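  • The lesion-ID bookkeeping described above can be sketched roughly as follows. The detector and the same-lesion test are stand-ins for the trained model (assumptions, not the patent's implementation); only the ID-assignment flow follows the text: reuse an existing lesion ID when the detected lesion matches one already registered, otherwise issue a new ID.

```python
import itertools

def assign_lesion_ids(images, detect, is_same_lesion):
    """detect(image) -> lesion descriptor or None;
    is_same_lesion(a, b) -> True when two descriptors show one lesion."""
    next_id = itertools.count(1)
    known = []          # (lesion_id, descriptor) pairs already registered
    lesion_info = {}
    for image in images:                     # processed in imaging order
        found = detect(image)
        if found is None:
            lesion_info[image] = None        # no lesion detected
            continue
        for lesion_id, descriptor in known:
            if is_same_lesion(descriptor, found):
                lesion_info[image] = lesion_id   # same lesion: reuse its ID
                break
        else:
            lesion_id = next(next_id)            # unseen lesion: new ID
            known.append((lesion_id, found))
            lesion_info[image] = lesion_id
    return lesion_info
```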
  • Doctor B, a user, logs in to the information processing apparatus 10 by entering a user ID and a password.
  • the display device 12 displays a list of capsule endoscopy examinations.
  • the examination list screen displays examination information such as patient ID, patient name, examination ID, examination date and time, and the user selects an examination for which an interpretation report is to be created.
  • When an examination is selected, the screen generation unit 30 generates an interpretation screen on which the user interprets the endoscopic images, and displays it on the display device 12.
  • FIG. 3 shows an example of an endoscopic image interpretation screen.
  • the photographed image arrangement unit 32 arranges a plurality of photographed endoscopic images 80a to 80u (hereinafter referred to as “endoscopic images 80” unless otherwise distinguished) in a grid pattern in the list display area 100. .
  • Endoscopic images that do not fit on the display screen are brought into view when the user operates the scroll bar.
  • The photographed image arrangement unit 32 refers to the metadata of the endoscopic images stored in the storage unit 50, extracts some of the endoscopic images from the storage unit 50, and arranges them in the list display area 100 in imaging order. Specifically, the photographed image arrangement unit 32 extracts endoscopic images in which lesions have been detected and arranges them in the list display area 100 in imaging order.
  • For each endoscopic image 80, the elapsed time from the start of imaging (hereinafter referred to as the "imaging time") is displayed.
  • The photographed image arrangement unit 32 arranges the plurality of endoscopic images 80 in rows from the left end to the right end of the screen in chronological order of imaging time. Since the interpretation screen shown in FIG. 3 displays only endoscopic images containing lesions, the number of displayed images can be reduced significantly compared with displaying all endoscopic images.
  • Here, endoscopic images containing lesions were taken as the example of the subset of endoscopic images arranged in the list display area 100, but the list display area 100 may also include, for example, an endoscopic image of the entrance of an organ site or a bleeding image showing bleeding. Alternatively, among the endoscopic images containing lesions, only those related to a specific lesion may be extracted and included in the list display area 100.
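  • A minimal sketch (data shapes and names are assumptions) of how the photographed image arrangement unit 32 might select images for the list display area 100: keep only the lesion-positive images and order them by imaging time.

```python
def list_view_images(metadata):
    """metadata: dicts with 'id', 'time_s', 'lesion' keys (assumed shape)."""
    with_lesion = [m for m in metadata if m.get("lesion") is not None]
    return sorted(with_lesion, key=lambda m: m["time_s"])  # imaging order

rows = [
    {"id": 2, "time_s": 30, "lesion": "L1"},
    {"id": 1, "time_s": 10, "lesion": None},
    {"id": 3, "time_s": 20, "lesion": "L2"},
]
assert [m["id"] for m in list_view_images(rows)] == [3, 2]
```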
  • the user uses the mouse, which is the input unit 14, to input an operation for the endoscopic image 80 in the list display area 100.
  • the operation accepting unit 26 accepts a user's operation.
  • In FIG. 3, the endoscopic image 80c is selected and surrounded by the selection frame 120. When the user selects another endoscopic image 80, that image 80 is surrounded by the selection frame 120 instead.
  • the time bar display area 102 displays a time bar with the shooting start time at the left end and the shooting end time at the right end.
  • A slider 106 indicates the temporal position of the endoscopic image 80c selected in the list display area 100. The captured image display switching button 104 is operated to display the captured image display area.
  • FIG. 4 shows the playback button 108 for endoscopic images.
  • When the user places the mouse pointer 110 over an endoscopic image 80, a play button 108 is displayed on that endoscopic image 80. In FIG. 4, the mouse pointer 110 is placed over the endoscopic image 80c surrounded by the selection frame 120; when the mouse pointer 110 is moved over another endoscopic image 80, the play button 108 is displayed on that endoscopic image 80 instead.
  • When the user left-clicks the play button 108, the operation reception unit 26 accepts the left-click as a peripheral image playback request, and the playback processing unit 34 superimposes a peripheral image playback area 112, in which images are played back starting from the endoscopic image 80c, on the list display area 100.
  • FIG. 5 shows an example of the peripheral image playback area 112 superimposed on the list display area 100.
  • the reproduction processing unit 34 has a function of reproducing a plurality of endoscopic images in the peripheral image reproduction area 112 in the order in which they were captured.
  • This playback function is called a "peripheral image playback function" because it is a function of continuously playing back images that are close in distance to the shooting position of the endoscopic image 80c that is the selected starting point.
  • the surrounding image reproduction area 112 is displayed as a window, and the user can move the surrounding image reproduction area 112 to a desired position.
  • the reproduction processing unit 34 reproduces the endoscopic image in a reproduction mode according to the user's operation accepted by the operation accepting unit 26 .
  • the reproduction processing unit 34 continuously (animatedly) reproduces the plurality of endoscopic images stored in the storage unit 50 in the order in which they were captured, starting from the endoscopic image 80c for which the reproduction button 108 was operated.
  • the reproduction processing unit 34 reproduces the endoscopic images captured around the imaging position of the endoscopic image 80 c so that the user can observe the endoscopic images that are not displayed in the list display area 100 .
  • In the playback button display area 114, a forward play button 114a and a reverse play button 114b are displayed.
  • When the user selects the forward play button 114a, the operation accepting unit 26 accepts the selection, and the playback processing unit 34 performs forward playback in the peripheral image playback area 112 starting from the endoscopic image 80c; the endoscopic images are switched and played in the forward direction (from older images to newer images).
  • When the user selects the reverse play button 114b, the operation receiving unit 26 accepts the selection, and the playback processing unit 34 performs reverse playback from the endoscopic image 80c in the peripheral image playback area 112; the endoscopic images are switched and played in the reverse direction (from images with newer imaging times to images with older imaging times).
  • the playback speed (the display time of one endoscopic image) can be adjusted with a slider provided in the playback button display area 114 .
  • The endoscopic images can be played not only automatically by operating the forward play button 114a or the reverse play button 114b, but also manually by the user. When the user rotates the mouse wheel, the operation reception unit 26 accepts the rotation, and the reproduction processing unit 34 advances or rewinds the endoscopic images one frame at a time in the peripheral image playback area 112. When the user rotates the wheel forward, the operation accepting unit 26 accepts a frame-advance operation and the playback processing unit 34 steps the endoscopic images forward frame by frame; when the user rotates the wheel backward, the operation receiving unit 26 accepts a frame-rewind operation and the reproduction processing unit 34 steps the endoscopic images backward frame by frame.
  • the reproduction processing unit 34 reproduces the endoscopic image stored in the storage unit 50 in a reproduction mode according to the user's operation received by the operation receiving unit 26 .
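  • The wheel-driven manual mode can be sketched as a simple frame-stepping function (names are assumptions, not from the patent): each forward click of the wheel advances one frame, each backward click rewinds one, clamped to the ends of the image sequence.

```python
def step_frame(index, wheel_delta, num_frames):
    """Return the next frame index after one wheel notch."""
    step = 1 if wheel_delta > 0 else -1      # forward vs. backward rotation
    return max(0, min(num_frames - 1, index + step))

assert step_frame(5, +1, 100) == 6     # frame advance
assert step_frame(0, -1, 100) == 0     # clamped at the first frame
assert step_frame(99, +1, 100) == 99   # clamped at the last frame
```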
  • In the peripheral image playback area 112 shown in FIG. 5, triangular marks are added to the four corners of the endoscopic image 80c; these marks indicate that the displayed image is one of the images shown in the list display area 100.
  • FIG. 6 shows a display example of the peripheral image playback area 112 during forward playback.
  • the operation receiving unit 26 receives a selection operation of the forward reproduction button 114a, and the reproduction processing unit 34 reproduces the endoscopic images in the forward direction, starting from the endoscopic image 80c, according to the shooting order.
  • In FIG. 6, an endoscopic image captured at "51:04" is being played. Since this image is not displayed in the list display area 100, no triangular marks are added to its four corners.
  • During playback, a pause button 114c is displayed in place of the selected forward play button 114a or reverse play button 114b. When the pause button 114c is selected, the playback processing unit 34 pauses the playback of the endoscopic images and continues to display the endoscopic image shown at the time playback stopped.
  • FIG. 7 shows a display example of the paused peripheral image playback area 112.
  • the operation accepting unit 26 accepts a selection operation of the pause button 114c, and the playback processing unit 34 suspends playback of the endoscopic image.
  • an endoscopic image 90 captured at "53:41" is displayed as a still image.
  • When the playback of the peripheral images is ended, the reproduction processing unit 34 ends the superimposed display of the peripheral image playback area 112, and the user can again see the entire list display area 100.
  • The endoscopic image captured at "53:41" is temporally located between the endoscopic image 80e captured at "53:03" and the endoscopic image 80f captured at "1:20:01". Therefore, during the peripheral image playback starting from the endoscopic image 80c, the endoscopic images 80c, 80d, and 80e were displayed in the peripheral image playback area 112. Accordingly, the processing execution unit 42 executes processing indicating that the endoscopic images 80c, 80d, and 80e arranged in the list display area 100 have been observed.
  • the process execution unit 42 may perform a marking process to indicate that the endoscopic images 80c, 80d, and 80e arranged in the list display area 100 have been observed.
  • the marking process may include various forms of display processes that can distinguish between the unobserved endoscopic image 80 and the observed endoscopic images 80c, 80d, 80e.
  • FIG. 8 shows an example of the result of marking processing applied to the list display area 100.
  • an observed mark 122 is added to the observed endoscopic images 80c, 80d, and 80e.
  • the user can easily recognize that the endoscopic images 80c, 80d, and 80e have been observed, and inefficient double reading can be prevented.
  • the observed marks 122 may be placed at positions associated with the endoscopic images 80c, 80d, and 80e, and the processing execution unit 42 may place the observed marks 122 near the imaging time.
  • The processing execution unit 42 may also change the character color or font of the imaging time, change the display color around the image, or the like, so that the observed endoscopic images 80c, 80d, and 80e are displayed in a manner that distinguishes them from the other endoscopic images 80.
  • The processing execution unit 42 of the embodiment may execute the processing indicating that an endoscopic image 80 has been observed only when it is estimated that the user actually observed the reproduced endoscopic image 80.
  • the observation determination unit 40 determines whether or not the endoscopic image 80 reproduced in the peripheral image reproduction area 112 has been observed by the user. When it is determined that the reproduced endoscopic image 80 has been observed by the user, the processing execution unit 42 indicates that the endoscopic image 80 arranged in the list display area 100 has been observed. Execute the indicated process. The determination criteria used by the observation determination unit 40 will be described below.
  • The observation determination unit 40 may determine whether or not a reproduced endoscopic image 80 has been observed by the user based on the playback time taken by the reproduction processing unit 34 (determination criterion 1). Specifically, the observation determination unit 40 determines that the endoscopic image 80 has been observed by the user when the playback time is equal to or longer than a predetermined time, and that it has not been observed when the playback time is shorter than the predetermined time.
  • the user may erroneously operate the forward playback button 114a or the reverse playback button 114b, and in such cases, it is not preferable to evaluate the reproduced endoscopic image 80 as having been observed.
  • Therefore, the observation determination unit 40 may determine that the user is not observing when the playback time is short. For example, the time threshold for determining the presence or absence of observation may be about 3 seconds.
  • In the example above, the reproduction processing unit 34 played back the images from the starting endoscopic image 80c (imaging time 50:15) to the endoscopic image 90 (imaging time 53:41) using the peripheral image playback function. When the reproduction processing unit 34 plays 412 images at 15 fps, as shown in the playback button display area 114, the playback time is approximately 27.5 seconds, well above the 3-second threshold. Therefore, when determination criterion 1 is used, the observation determination unit 40 determines that the displayed endoscopic images 80c, 80d, and 80e have been observed by the user.
  • The observation determination unit 40 may also identify the span to be judged from the interval between the imaging times of the played-back images and, on that basis, determine whether or not the reproduced endoscopic images 80 have been observed by the user. The threshold in this case may be longer than 3 seconds, for example about 15 seconds. Since the imaging-time interval in the example is 3 minutes and 26 seconds, which exceeds the 15-second threshold, the observation determination unit 40 may determine that the displayed endoscopic images 80c, 80d, and 80e have been observed by the user.
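  • Determination criterion 1 and its imaging-time variant can be sketched as simple threshold checks (function names are assumptions; the 3-second and 15-second thresholds and the 15 fps figure come from the description above).

```python
def observed_by_playback_time(num_frames, fps, threshold_s=3.0):
    """Criterion 1: observed only if playback lasted long enough."""
    return num_frames / fps >= threshold_s

def observed_by_imaging_span(start_s, end_s, threshold_s=15.0):
    """Variant: judge by the imaging-time interval of the played images."""
    return end_s - start_s >= threshold_s

# 412 images at 15 fps play for about 27.5 s, above the 3 s threshold.
assert observed_by_playback_time(412, 15)
assert not observed_by_playback_time(30, 15)   # only 2 s of playback
# 50:15 to 53:41 spans 3 min 26 s, above the 15 s threshold.
assert observed_by_imaging_span(50 * 60 + 15, 53 * 60 + 41)
```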
  • Determination criterion 1 may be applied to the playback modes in which the forward play button 114a or the reverse play button 114b is operated.
  • The observation determination unit 40 may determine whether or not the reproduced endoscopic image 80 has been observed by the user based on the playback mode used by the reproduction processing unit 34 (determination criterion 2). Specifically, the observation determination unit 40 determines that the reproduced endoscopic image 80 has been observed by the user when the playback mode is forward playback or reverse playback. On the other hand, it determines that the reproduced endoscopic image 80 has not been observed when the playback mode is frame advance or frame rewind based on manual operation, since there is a relatively high possibility that the user operated the mouse wheel unintentionally.
  • the reproduction processing unit 34 reproduces endoscopic images based on the operation of the forward reproduction button 114a. Therefore, when determination criterion 2 is used, the observation determining unit 40 determines that the displayed endoscopic images 80c, 80d, and 80e have been observed by the user.
  • the user captures (saves) the observed endoscopic images and extracts the endoscopic images for attachment to the inspection report.
  • the capture operation may be an operation of double-clicking the endoscopic image paused in the peripheral image playback area 112 .
  • When the operation reception unit 26 accepts a capture operation, the image capture unit 44 captures the displayed endoscopic image. For example, when the user double-clicks the endoscopic image 90 in the peripheral image playback area 112 shown in FIG. 7, the image capture unit 44 captures the endoscopic image 90.
  • FIG. 9 shows an example of the captured image display area 116.
  • the captured image placement unit 36 arranges the captured images in the captured image display area 116 in order of shooting.
  • the rightmost captured image is an endoscopic image 90 .
  • The observation determination unit 40 may determine whether or not a reproduced endoscopic image has been observed by the user based on whether the image capture unit 44 has captured it (determination criterion 3). Specifically, when the image capture unit 44 captures the endoscopic image 90 displayed in the peripheral image playback area 112, the observation determination unit 40 determines that the reproduced endoscopic images have been observed by the user. When the operation reception unit 26 accepts a capture operation, it is certain that the user is observing the endoscopic image, so the observation determination unit 40 can safely determine that the reproduced endoscopic images have been observed by the user.
  • On the other hand, when no capture operation is performed, the observation determination unit 40 may determine that the reproduced endoscopic image has not been observed by the user.
  • In this example, the image capture unit 44 has captured the endoscopic image 90. Therefore, when determination criterion 3 is used, the observation determination unit 40 determines that the endoscopic images 80c, 80d, and 80e displayed during playback have been observed by the user.
  • Any one of the criteria 1 to 3 above may be used alone, or they may be used in combination.
  • a determination method when two or more determination criteria are combined will be described below.
  • Determination method when determination criterion 3 is combined with determination criterion 1 and/or determination criterion 2:
(a) When an endoscopic image is captured: regardless of the results under determination criteria 1 and 2, the observation determination unit 40 determines under determination criterion 3 that the reproduced endoscopic image 80 has been observed.
(b) When no endoscopic image is captured:
(b-1) When determination criteria 3 and 1 are combined, the observation determination unit 40 determines whether or not the reproduced endoscopic image 80 has been observed according to determination criterion 1.
(b-2) When determination criteria 3 and 2 are combined, the observation determination unit 40 determines whether or not the reproduced endoscopic image 80 has been observed according to determination criterion 2.
  • (b-3) When determination criteria 3, 1, and 2 are combined, the observation determination unit 40 determines that the reproduced endoscopic image 80 has not been observed if the playback mode is frame advance or frame rewind; if the playback mode is forward or reverse playback, it determines whether or not the image has been observed based on determination criterion 1. Therefore, even if the playback mode is forward or reverse playback, the observation determination unit 40 determines that the reproduced endoscopic image 80 has not been observed if the playback time is less than 3 seconds.
  • The same procedure applies when determination criteria 1 and 2 are combined: frame advance and frame rewind are treated as not observed, while forward and reverse playback are judged by the playback time under determination criterion 1.
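  • The combination of the three criteria described above can be sketched as follows (mode names and the function signature are assumptions): a capture always counts as observation, frame-by-frame playback never does, and otherwise the playback-time threshold of criterion 1 decides.

```python
def is_observed(captured, mode, playback_time_s, threshold_s=3.0):
    if captured:                                   # criterion 3
        return True
    if mode in ("frame_advance", "frame_rewind"):  # criterion 2
        return False                               # likely a wheel slip
    return playback_time_s >= threshold_s          # criterion 1

assert is_observed(True, "frame_advance", 0.0)     # capture always wins
assert is_observed(False, "forward", 27.5)
assert not is_observed(False, "forward", 2.0)      # under the 3 s threshold
assert not is_observed(False, "frame_advance", 30.0)
```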
  • the observation determination unit 40 uses one or more determination criteria to determine whether or not the reproduced endoscopic image 80 has been observed by the user.
  • When the observation determination unit 40 determines that a reproduced endoscopic image 80 has been observed by the user, the processing execution unit 42 executes processing indicating that the endoscopic image 80 arranged in the list display area 100 has been observed. As described above, FIG. 8 shows the state in which the observed mark 122 has been added to the endoscopic images 80c, 80d, and 80e determined to have been observed.
  • Lesion information is added to the endoscopic image 80 as metadata.
  • The lesion information includes a lesion ID that identifies a lesion, and the observation determination unit 40 can identify a group of endoscopic images containing the same lesion by referring to the lesion ID of each endoscopic image 80. Therefore, when the observation determination unit 40 determines that the endoscopic images 80c, 80d, and 80e have been observed, it may also regard as observed any other endoscopic image 80 that contains the same lesion as one included in the endoscopic images 80c, 80d, and 80e.
  • FIG. 10 shows an example of the result of the marking processing applied to the list display area 100.
  • The observation determination unit 40 identifies the lesion IDs of the endoscopic images 80c, 80d, and 80e determined to have been observed, and searches for other endoscopic images 80 containing these lesion IDs.
  • The lesion ID of the endoscopic image 80b is the same as the lesion ID of the endoscopic image 80c.
  • Therefore, the endoscopic image 80b is regarded as an observed image even though it has not itself been reproduced.
  • The processing execution unit 42 performs the marking process, which indicates that the images have been observed, on the endoscopic images 80c, 80d, and 80e that have been determined to have been observed, and on the endoscopic image 80b that is regarded as having been observed.
  • Observed marks 122 are thus added to the endoscopic images 80b, 80c, 80d, and 80e.
  • The user can easily recognize that the endoscopic images 80b, 80c, 80d, and 80e have been observed, and inefficient double reading can be prevented.
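The lesion-ID propagation described above can be sketched as follows; the data layout (dictionaries with `id`, `lesion_id`, and `observed` keys) is an assumption for illustration, not the embodiment's actual data structure:

```python
def propagate_observed(images: list[dict]) -> set[str]:
    """Return the IDs of all images to be marked as observed, including
    images that share a lesion ID with a directly observed image."""
    # Collect lesion IDs of images that were determined to be observed.
    observed_lesions = {img.get("lesion_id") for img in images
                        if img.get("observed") and img.get("lesion_id")}
    # An image counts as observed if it was observed directly, or if it
    # contains the same lesion as an observed image (like image 80b above).
    return {img["id"] for img in images
            if img.get("observed") or img.get("lesion_id") in observed_lesions}
```

With this rule, an image such as 80b, which shares a lesion ID with the observed image 80c, is returned even though it was never reproduced itself.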
  • The processing execution unit 42 may have a function of returning the endoscopic images 80b, 80c, 80d, and 80e to the state before the marking processing according to a user operation.
  • The processing execution unit 42 may have a function of collectively returning the plurality of endoscopic images 80b, 80c, 80d, and 80e to the state before the marking processing according to a user operation. Since all the observed marks 122 can be erased at once, the mark erasing operation is facilitated.
  • The processing execution unit 42 may have a function of returning endoscopic images 80 one at a time to the state before the marking processing according to a user operation. By erasing the observed marks 122 one by one, an observed mark 122 added to an endoscopic image 80 that was not actually observed can be erased individually.
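The two undo functions described above (erasing all observed marks collectively, or one at a time) might be organized as in this sketch; the class and method names are illustrative assumptions, not the embodiment's actual interface:

```python
class ObservedMarks:
    """Tracks which images carry the observed mark 122 (illustrative)."""

    def __init__(self) -> None:
        self._marked: set[str] = set()

    def mark(self, *image_ids: str) -> None:
        # Add the observed mark to one or more images.
        self._marked.update(image_ids)

    def unmark_all(self) -> None:
        # Collectively return all images to the pre-marking state.
        self._marked.clear()

    def unmark(self, image_id: str) -> None:
        # Return a single image to the pre-marking state, e.g. a mark
        # added to an image that was not actually observed.
        self._marked.discard(image_id)

    def is_marked(self, image_id: str) -> bool:
        return image_id in self._marked
```

The bulk operation makes erasing every mark easy, while the single-image operation allows correcting an individual mistaken mark.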
  • The "peripheral image reproduction function" of the embodiment reproduces the peripheral images, without thinning them out, in the order in which they were captured, but the reproduction mode is not limited to this.
  • A plurality of specific images, such as predetermined representative images or images of abnormal findings such as lesions, may be extracted from the captured images, and only the extracted specific images may be displayed as a moving image in the order in which they were captured.
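The alternative reproduction mode above, extracting only specific images (for example, representative images or images of abnormal findings) and playing them in capture order, could look like this sketch; the field names are assumptions for illustration:

```python
def select_playback_frames(frames: list[dict],
                           only_specific: bool = False) -> list[dict]:
    """Return the frames to reproduce, sorted into capture order."""
    ordered = sorted(frames, key=lambda f: f["captured_at"])
    if only_specific:
        # Keep only representative images or images with abnormal findings.
        ordered = [f for f in ordered
                   if f.get("representative") or f.get("abnormal_finding")]
    return ordered
```

With `only_specific=False` this matches the embodiment's behavior of playing every peripheral image without thinning; with `only_specific=True` it plays only the extracted subset, still in capture order.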
  • The present disclosure can be used in the field of supporting observation of images.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)

Abstract

A captured-image arrangement unit 32 arranges a plurality of captured images in a row in a list display area. A reproduction processing unit 34 reproduces the images in the order in which they were captured. An observation determination unit 40 determines whether a reproduced image has been observed by a user. When a reproduced image is determined to have been observed by the user, a processing execution unit 42 executes processing to indicate that the image, which is arranged in the list display area, has been observed.
PCT/JP2021/010634 2021-03-16 2021-03-16 Information processing device and image display method WO2022195725A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/010634 WO2022195725A1 (fr) 2021-03-16 2021-03-16 Information processing device and image display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/010634 WO2022195725A1 (fr) 2021-03-16 2021-03-16 Information processing device and image display method

Publications (1)

Publication Number Publication Date
WO2022195725A1 true WO2022195725A1 (fr) 2022-09-22

Family

ID=83320162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010634 WO2022195725A1 (fr) 2021-03-16 2021-03-16 Information processing device and image display method

Country Status (1)

Country Link
WO (1) WO2022195725A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007004392A1 (fr) * 2005-07-01 2007-01-11 Access Co., Ltd. Broadcast program scene reporting system and method, mobile terminal device, and computer program
JP2007336283A (ja) * 2006-06-15 2007-12-27 Toshiba Corp Information processing device, information processing method, and information processing program
WO2009008125A1 (fr) * 2007-07-12 2009-01-15 Olympus Medical Systems Corp. Image processing device, operation method therefor, and program therefor
JP2010082241A (ja) * 2008-09-30 2010-04-15 Olympus Medical Systems Corp Image display device, image display method, and image display program
WO2011013475A1 (fr) * 2009-07-29 2011-02-03 オリンパスメディカルシステムズ株式会社 Image display device, radiographic interpretation support system, and radiographic interpretation support program
JP2013075244A (ja) * 2013-02-01 2013-04-25 Olympus Medical Systems Corp Image display device, image display method, and image display program

Similar Documents

Publication Publication Date Title
JP4971615B2 (ja) System and method for editing an image stream captured in vivo
US7119814B2 (en) System and method for annotation on a moving image
US9186041B2 (en) Medical information recording apparatus that determines medical scene or changing of medical scene, synthesizes images obtained by medical equipment, and records synthesized image
US8467615B2 (en) Image display apparatus
JP5784859B2 (ja) Image management device
WO2006022269A1 (fr) Image display device, image display method, and image display program
US20090051691A1 (en) Image display apparatus
US20090019381A1 (en) Image display apparatus
JP2008119145A (ja) Image display method and image display device
US20090312601A1 (en) Capsule endoscope system
US20080232702A1 (en) Image display apparatus
US20190133425A1 (en) Endoscope system, terminal device, server, and transmission method
US10918260B2 (en) Endoscopic image observation support system
CN101257838A (zh) 图像显示装置
JP4885432B2 (ja) Image display device, image display method, and image display program
JP4477451B2 (ja) Image display device, image display method, and image display program
KR100751160B1 (ko) Medical image recording system
JPWO2019064704A1 (ja) Endoscopic image observation support system, endoscopic image observation support device, and endoscopic image observation support method
WO2022195725A1 (fr) Information processing device and image display method
JP6335412B1 (ja) Endoscopic image observation support system
WO2018230074A1 (fr) Endoscope image observation support system
JP4445742B2 (ja) Image display device, image display method, and image display program
WO2018198525A1 (fr) Endoscopic image observation support system
WO2023166647A1 (fr) Medical assistance system and image display method
JP2006061628A (ja) Data generation device, data browsing system, data generation method, and data generation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 21931477
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 21931477
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: JP