CN110662477B - Information processing apparatus, control method, and program - Google Patents


Info

Publication number
CN110662477B
CN110662477B (application CN201880034288.0A)
Authority
CN
China
Prior art keywords
abnormal region
video frame
detected
region
abnormal
Prior art date
Legal status
Active
Application number
CN201880034288.0A
Other languages
Chinese (zh)
Other versions
CN110662477A (en)
Inventor
高桥郁磨
佐野真贵
奥津元靖
田中千惠美
西光雅弘
今冈仁
上条宪一
斋藤裕
山田真善
Current Assignee
NATIONAL CANCER CENTER
NEC Corp
Original Assignee
NATIONAL CANCER CENTER
NEC Corp
Priority date
Filing date
Publication date
Application filed by NATIONAL CANCER CENTER, NEC Corp filed Critical NATIONAL CANCER CENTER
Publication of CN110662477A (application)
Application granted
Publication of CN110662477B (grant)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B1/00055 Operational features of endoscopes provided with output arrangements for alerting the user
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body combined with photographic or television appliances
    • A61B1/045 Control thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10024 Color image
    • G06T2207/10068 Endoscopic image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Abstract

This information processing apparatus (2000) detects an abnormal region (30) from video data (12). The information processing apparatus (2000) causes a display device (20) to display, in a first region (22) thereof, a video frame (14) in which the abnormal region (30) has been detected. Further, the information processing apparatus (2000) causes the display device to display, in a second region (24) of the display device (20), the video data (12), which includes video frames (14) generated after the video frame (14) displayed in the first region.

Description

Information processing apparatus, control method, and program
Technical Field
The invention relates to an information processing apparatus, a control method, and a program.
Background
Examinations are performed to determine whether an abnormality exists in the body of a human or an animal by using images that capture the inside of the body. For example, patent documents 1 to 3 disclose techniques for displaying an image (a CT or MRI image) acquired in a past examination (for example, one year earlier) side by side with an image acquired in the current examination. Further, patent documents 1 and 4 disclose techniques for detecting a lesion from an image and marking the detected lesion.
Relevant documents
Patent document
[ patent document 1] Japanese patent application laid-open No. 2007-159934
[ patent document 2] Japanese patent application laid-open No. 2016-048426
[ patent document 3] Japanese patent application laid-open No. 2016-202722
[ patent document 4] PCT publication No. WO 2011/132468
Disclosure of Invention
Technical problem
One method of examining the inside of the body is to view, on a display device, video captured using an endoscope system or the like. Specifically, the doctor inserts a scope having a camera at its tip through the subject's nose, mouth, anus, or the like, and then moves it inside the body, so that the camera images the state of the inside of the body. While viewing that state through the video displayed on the display device, the doctor checks whether an abnormal part exists in the subject's body.
As described above, in this method of performing an examination by moving a camera inside the subject's body, the part that the doctor can observe changes over time as the camera moves. The doctor may therefore miss an abnormal part, and in practice the lesion detection rate varies depending on the doctor in charge of the examination. None of the related documents above assumes a situation in which the part observable by the doctor changes over time in this manner.
The present invention has been made in view of the above problems. It is an object of the present invention to provide a technique for improving the quality of an examination using video that images the inside of a subject's body.
Means for solving the problems
An information processing apparatus according to the present invention includes: 1) a detection unit that detects an abnormal region of a body from a video that images the body; and 2) a display control unit that displays, in a first area of the display device, a video frame in which the abnormal area is detected among the video frames constituting the video, and displays, in a second area of the display device, the video including video frames generated after that video frame.
The control method according to the present invention is executed by a computer. The control method includes: 1) a detection step of detecting an abnormal region of a body from a video that images the body; and 2) a display control step of displaying, in a first area of the display device, a video frame in which the abnormal area is detected among the video frames constituting the video, and displaying, in a second area of the display device, the video including video frames generated after that video frame.
The program according to the present invention causes a computer to execute each step of the control method according to the present invention.
The invention has the advantages of
According to the present invention, there is provided a technique for improving the accuracy of an examination that uses video imaging the inside of a subject's body.
Drawings
The above objects as well as other objects, features and advantages will become more apparent from the following description of preferred exemplary embodiments and the accompanying drawings accompanying the exemplary embodiments.
Fig. 1 is a diagram conceptually showing an operation of an information processing apparatus according to example embodiment 1.
Fig. 2 is a block diagram showing a functional configuration of an information processing apparatus.
Fig. 3 is a diagram showing a computer for implementing the information processing apparatus.
Fig. 4 is a diagram showing the configuration of a display device.
Fig. 5 is a diagram showing a specific example of a usage environment of an information processing apparatus.
Fig. 6 is a flowchart showing a flow of processing performed by the information processing apparatus according to example embodiment 1.
Fig. 7 is a diagram showing various overlay marks overlaid on an abnormal region.
Fig. 8 is a diagram showing an indication mark indicating an abnormal region.
Fig. 9 is a diagram showing an information processing apparatus connected to an image storage unit.
Fig. 10 is a block diagram showing an information processing apparatus according to example embodiment 2.
Fig. 11 is a diagram showing the abnormal area information in a table form.
Fig. 12 is a diagram showing a scene in which a display on a display device is updated.
Fig. 13 is a diagram showing a first display in consideration of the difference of the abnormal region.
Fig. 14 is a diagram showing highlighting.
Fig. 15 is a diagram showing an example in which the first display is highlighted.
Fig. 16 is a block diagram showing an information processing apparatus according to example embodiment 3.
Fig. 17 is a diagram showing a format of information to be stored in the image storage unit in a table format.
Fig. 18 is a block diagram showing an information processing apparatus according to example embodiment 4.
Fig. 19 is a diagram showing a scene in which a video frame including an abnormal region for a predetermined action of a user is highlighted.
Detailed Description
Hereinafter, example embodiments of the present invention will be described with reference to the accompanying drawings. Note that the same reference numerals are assigned to the same components throughout the drawings, and the description thereof will not be repeated. In each block diagram, unless otherwise specified, each block represents the configuration of a functional unit, not a hardware unit.
[ example embodiment 1]
Fig. 1 is a diagram conceptually illustrating an operation of an information processing apparatus 2000 according to example embodiment 1. Note that fig. 1 shows only an example of the operation thereof to facilitate easy understanding of the information processing apparatus 2000, and does not limit the function of the information processing apparatus 2000.
The camera 10 is used to examine a human or other animal. Hereinafter, a person or the like to be examined is referred to as an object. The camera 10 is any camera capable of imaging the interior of the body of a subject and generates a video frame 14 representing the imaging result. For example, the camera 10 is an endoscopic camera. The video data 12 is formed of a plurality of video frames 14 generated at different times from one another.
Video data 12 generated by the camera 10 is displayed on a display device 20. Display device 20 is any display device capable of displaying video data 12. Note that the fact that the video data 12 is displayed on the display device 20 indicates that a plurality of video frames 14 constituting the video data 12 are sequentially displayed on the display device 20.
A user (e.g., a doctor) of the information processing apparatus 2000 grasps the state of the inside of the subject's body by viewing the video data 12 displayed on the display device 20. More specifically, the user determines whether an abnormal part exists in the subject's body, the degree of abnormality, and so on. Here, an "abnormal part of the body" is, for example, a part having a lesion, a part having a wound, or a part having a foreign substance. A lesion is a change in the living body caused by disease, such as a tumor.
Here, in an endoscopy or the like, in which an abnormal part is searched for while the inside of the subject's body is observed with a camera, the doctor may miss the abnormal part even though it is imaged by the camera. It is therefore preferable to provide support so that the doctor can easily notice the abnormal part, thereby preventing it from being missed.
The information processing apparatus 2000 according to the present exemplary embodiment operates as follows. The information processing apparatus 2000 acquires the video data 12, and performs image analysis on the video frames 14 constituting the video data 12. Specifically, the information processing apparatus 2000 detects the abnormal region 30 from the video frame 14. The abnormal region 30 is a region that is considered to represent an abnormal part of the subject body. For example, the abnormal region 30 in fig. 1 is a region including a tumor (a region representing a lesion).
The information processing apparatus 2000 displays the video frame 14 from which the abnormal region 30 is detected in the first region 22 of the display device 20, and displays the video data 12 in the second region 24 of the display device 20. In other words, the video frame 14 from which the abnormal region 30 is detected and the video data 12 including the video frame 14 generated thereafter are displayed on the display device 20. The first region 22 and the second region 24 are regions different from each other.
For example, the video data 12 generated by the camera 10 is displayed in real-time in the second region 24. That is, the video data 12 to be displayed in the second area 24 represents the scene of the object at the current point in time in real time. On the other hand, the video frame 14 generated before the current time point in the same examination and imaging the abnormal part of the body is displayed in the first region 22.
In this way, with the information processing apparatus 2000 according to the present exemplary embodiment, a video frame 14 from which the abnormal region 30 is detected is displayed on the display device 20 together with the video data 12, making it easy for the user of the information processing apparatus 2000 to recognize an abnormal part inside the subject's body. Even if the user misses an abnormal part while it is displayed in the second area 24, the video frame 14 including that abnormal part is displayed, and remains, in the first area 22 of the display device 20, so the user can still notice the abnormal part later by reviewing the first region 22. The information processing apparatus 2000 according to the present exemplary embodiment thus reduces the possibility that the user misses an abnormal part, which improves the accuracy of the internal examination of the body performed using the camera 10.
Hereinafter, the present exemplary embodiment will be described in more detail.
< functional configuration >
Fig. 2 is a block diagram showing the functional configuration of the information processing apparatus 2000. The information processing apparatus 2000 includes a detection unit 2020 and a display control unit 2040. The detection unit 2020 detects the abnormal region 30 from the video data 12. The display control unit 2040 displays the video frame 14 from which the abnormal region 30 is detected in the first region 22 of the display device 20. Further, the display control unit 2040 displays, in the second region 24 of the display device 20, the video data 12 including video frames 14 generated after the video frame 14 displayed in the first region.
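As an illustration only, the two functional configuration units might be sketched as follows. The patent does not specify an implementation; every class and method name in this sketch is an assumption introduced here:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Region = Tuple[int, int, int, int]  # assumed (x, y, width, height) bounding box


@dataclass
class VideoFrame:
    index: int             # position of the frame within the video data 12
    pixels: object = None  # image payload, omitted in this sketch


class DetectionUnit:
    """Corresponds to the detection unit 2020: examines a single frame."""

    def detect(self, frame: VideoFrame) -> Optional[Region]:
        # A real implementation would analyze frame.pixels here.
        return None


class DisplayControlUnit:
    """Corresponds to the display control unit 2040: routes frames to the
    first region 22 (detections) and the second region 24 (live video)."""

    def __init__(self) -> None:
        self.first_region: List[VideoFrame] = []         # frames with detections
        self.second_region: Optional[VideoFrame] = None  # currently shown frame

    def show_in_first_region(self, frame: VideoFrame) -> None:
        self.first_region.append(frame)

    def show_in_second_region(self, frame: VideoFrame) -> None:
        self.second_region = frame
```

The stub only fixes the interfaces; an actual detection unit would analyze the frame's pixel data as described later in the detection section.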
< example of hardware configuration of information processing apparatus 2000 >
Each function configuration unit of the information processing apparatus 2000 may be formed of hardware (e.g., a hard-wired electronic circuit or the like) or a combination of hardware and software (e.g., a combination of an electronic circuit and a program for controlling the circuit) which forms each function configuration unit. Hereinafter, a case where each functional configuration unit of the information processing apparatus 2000 is formed by a combination of hardware and software will be further described.
Fig. 3 is a diagram showing a computer 1000 for forming the information processing apparatus 2000. The computer 1000 may be any of various computers, for example, a Personal Computer (PC), a server machine, a tablet terminal, or a smartphone. The computer 1000 may be a dedicated computer designed to form the information processing apparatus 2000, or may be a general-purpose computer.
The computer 1000 includes a bus 1020, a processor 1040, a memory 1060, a storage device 1080, an input/output interface 1100, and a network interface 1120. The bus 1020 is a data transmission path over which the processor 1040, the memory 1060, the storage device 1080, the input/output interface 1100, and the network interface 1120 transmit and receive data to and from each other. The processor 1040 is an arithmetic processing device, such as a Central Processing Unit (CPU) or a Graphics Processing Unit (GPU). The memory 1060 is a main storage device formed of a Random Access Memory (RAM) or the like. The storage device 1080 is an auxiliary storage device formed of a hard disk, a Solid State Drive (SSD), a Read Only Memory (ROM), or a memory card. However, the storage device 1080 may be formed of hardware similar to that of the main storage device, such as a RAM.
The input/output interface 1100 is an interface for connecting the computer 1000 to input and output devices. For example, the camera 10 and the display device 20 are connected to the input/output interface 1100.
The network interface 1120 is an interface for connecting to a communication network such as a Wide Area Network (WAN) or a Local Area Network (LAN).
The storage device 1080 stores program modules that implement each function of the information processing apparatus 2000. Processor 1040 reads each program module into memory 1060 and executes each program module to implement each function corresponding to the program module.
< about display apparatus 20>
The display device 20 may have one screen, or may have a plurality of screens. In the former case, the first area 22 and the second area 24 are areas different from each other on one screen. In the latter case, the first area 22 and the second area 24 may be areas different from each other on one screen, or may be areas on screens different from each other.
Fig. 4 is a diagram showing the configuration of the display device 20. The display device 20 in fig. 4(a) has a display screen 26. The first area 22 and the second area 24 are areas different from each other on the display screen 26. The display device 20 in fig. 4(b) has two display screens 26 (display screen 26-1 and display screen 26-2). The first area 22 is the entire area of the display screen 26-1 or a partial area thereof. The second area 24, on the other hand, is the entire area of the display screen 26-2 or a partial area thereof. In the following description, unless otherwise specified, a case where the display device 20 is constituted by one display screen 26 (a case of fig. 4 (a)) will be described as an example.
< specific example of the usage Environment of the information processing apparatus 2000 >
Fig. 5 is a diagram showing a specific example of the usage environment of the information processing apparatus 2000. For example, the information processing apparatus 2000 is used together with the scope 40 and the endoscope system 50. The scope 40 is connected to the endoscope system 50 and is provided with the camera 10. In this case, the video data 12 is formed of a plurality of video frames 14 generated by the camera 10 provided in the scope 40. The endoscope system 50 outputs the video data 12 to the information processing apparatus 2000. For example, the video data 12 is output from a video output interface (for example, a High-Definition Multimedia Interface (HDMI) (registered trademark) interface) provided in the endoscope system 50 to a video input interface of the information processing apparatus 2000. The information processing apparatus 2000 then processes the video data 12 acquired from the endoscope system 50 to control the display of the display device 20 (refer to fig. 1).
Note that the configuration shown in fig. 5 is merely an example, and the usage environment of the information processing apparatus 2000 is not limited to the configuration shown in fig. 5. For example, the video data 12 may be output from the camera 10 to the information processing apparatus 2000. In this case, the information processing apparatus 2000 may not be connected to the endoscope system 50.
< Process flow >
Fig. 6 is a flowchart showing the flow of processing performed by the information processing apparatus 2000 according to example embodiment 1. Steps S102 to S112 form loop processing A, performed for each video frame 14 acquired from the camera 10. In S102, the information processing apparatus 2000 selects the video frame 14 with the earliest generation time among the video frames 14 not yet subjected to loop processing A. The video frame 14 selected here is denoted as video frame i. Note that when all the video frames 14 have already been subjected to loop processing A, the information processing apparatus 2000 waits, for example, until a new video frame 14 is generated. Alternatively, the process of fig. 6 may end.
The detection unit 2020 detects an abnormal region 30 from the video frame i (S104). In the case where the abnormal region 30 is detected from the video frame i (yes in S106), the display control unit 2040 displays the video frame i in the first region 22 (S108). Thus, the video frame 14 from which the abnormal region 30 is detected is displayed in the first region 22.
In S110, the display control unit 2040 displays the video frame i in the second region 24. Therefore, the video frame i is displayed in the second area 24 regardless of whether the abnormal area 30 is detected.
Since S112 is the end of loop processing A, the processing of fig. 6 returns to S102.
Note that the video frames 14 subjected to the process for detecting the abnormal region 30 (S104) may be all of the video frames 14 included in the video data 12, or only some of them. In the latter case, for example, the detection unit 2020 performs S104 only once every predetermined number of video frames 14 (for example, once every ten video frames).
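The flow of fig. 6, including the option of running detection only on every predetermined number of frames, can be sketched as follows. `process_video`, `ConsoleDisplay`, and the other names are illustrative assumptions introduced for this sketch, not part of the patent:

```python
class ConsoleDisplay:
    """Stand-in for the display device 20, recording what would be shown."""

    def __init__(self):
        self.first_region = []    # (frame, region) pairs kept on screen
        self.second_region = None # frame currently shown as live video

    def show_in_first_region(self, frame, region):
        self.first_region.append((frame, region))

    def show_in_second_region(self, frame):
        self.second_region = frame


def process_video(frames, detect, display, detect_every=1):
    """Loop processing A of fig. 6: run detection (S104) on every
    `detect_every`-th frame, show hits in the first region (S106/S108),
    and always show the current frame in the second region (S110)."""
    for i, frame in enumerate(frames):
        region = detect(frame) if i % detect_every == 0 else None  # S104
        if region is not None:                                     # S106
            display.show_in_first_region(frame, region)            # S108
        display.show_in_second_region(frame)                       # S110
```

Note how a frame reaches the second region regardless of the detection result, matching S110, while only frames with a detected abnormal region accumulate in the first region.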
< acquisition of video data 12 >
Any method of acquiring video data 12 by detection unit 2020 may be employed. For example, the detection unit 2020 accesses a storage device in which the video data 12 is stored to acquire the video data 12. The storage device in which the video data 12 is stored may be provided inside the camera 10, or may be provided outside the camera 10. For example, the detection unit 2020 may receive video data 12 to be transmitted from the camera 10 to acquire the video data 12. Further, the detection unit 2020 may acquire the video data 12 from another device (e.g., the endoscope system 50 described above) connected to the camera 10.
< detection of abnormal region 30: s104>
The detection unit 2020 detects the abnormal region 30 from each video frame 14 constituting the video data 12. As a technique of analyzing an image of the inside of the body and detecting an abnormal part, a related art technique such as feature value matching or template matching may be used. For example, in the case of detecting a tumor by feature value matching, one or more values (feature values) representing appearance features of a tumor (color, pattern, shape, and the like) are defined in advance. The detection unit 2020 detects, from the video frame 14, an image region whose feature value has a high similarity to the predefined feature value of a tumor, and treats the detected image region as the abnormal region 30. The same method can be used to detect a wound or a foreign substance.
Note that, in the case of detecting a foreign object, it is assumed that the foreign object that has entered the body is already known. In this case, the feature value of that foreign object is preferably specifiable to the information processing apparatus 2000. For example, a photograph of the foreign object that has entered the body is input to the information processing apparatus 2000, which performs image analysis on the photograph to compute the feature value of the foreign object to be detected. The detection unit 2020 then detects a foreign object having the computed feature value from the video frame 14.
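A minimal sketch of feature value matching in the spirit described above, using a single predefined feature value (the mean intensity of a small window) on a grayscale frame represented as nested lists. The function name, window size, and tolerance are assumptions for this sketch, not values from the patent:

```python
def find_abnormal_region(frame, lesion_mean, win=2, tol=10.0):
    """Slide a win x win window over a grayscale frame (list of rows) and
    return the (x, y) of the first window whose mean intensity lies within
    `tol` of the predefined lesion feature value, or None if nothing matches."""
    h, w = len(frame), len(frame[0])
    for y in range(h - win + 1):
        for x in range(w - win + 1):
            vals = [frame[y + dy][x + dx]
                    for dy in range(win) for dx in range(win)]
            if abs(sum(vals) / len(vals) - lesion_mean) <= tol:
                return (x, y)
    return None
```

A practical system would of course match richer feature values (color histograms, texture, shape descriptors) rather than a single mean intensity, but the structure — compare a precomputed feature value against candidate image regions and report high-similarity regions — is the same.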
< with respect to the first region 22 and the second region 24>
As described above, the video frame 14 including the abnormal region 30 is displayed in the first region 22. The video data 12 is displayed in the second region 24. The first region 22 and the second region 24 may be any regions that are different from each other. The position and size of the first region 22 and the second region 24 in the display device 20 may or may not be fixed. In the latter case, for example, the display control unit 2040 receives a user operation for changing the positions and sizes of the first region 22 and the second region 24. The display control unit 2040 changes the positions and sizes of the first region 22 and the second region 24 in response to the received user operation.
< display of video frame 14 in first area 22: s108>
The display control unit 2040 displays the video frame 14 from which the abnormal region 30 is detected in the first region 22 of the display device 20. Here, as a technique of displaying an image in an area on the display device 20, a related art technique may be used.
One or more video frames 14 may be displayed in the first region 22. In the latter case, for example, the display control unit 2040 displays the video frames 14 from which the abnormal region 30 is detected in the first region 22 in chronological order of generation time. When the number of such video frames 14 is large, they may not all fit in the first region 22. In that case, for example, the display control unit 2040 may display a scroll bar or the like on the display device 20 so that the video frames 14 shown in the first region 22 can be changed.
Note that the video frame 14 from which the abnormal region 30 is detected may be displayed on the display device 20 at various timings. For example, the display control unit 2040 displays the video frame 14 on the display device 20 at the timing when the abnormal region 30 is detected from the video frame 14. In another example, the display control unit 2040 displays the video frame 14 on the display device 20 after a predetermined time has elapsed since the abnormal region 30 was detected from the video frame 14. The predetermined time may be set in advance in the display control unit 2040, or may be stored in a storage device accessible from the display control unit 2040.
< display of video data 12: s110>
The display control unit 2040 displays the video data 12 in the second region 24 of the display device 20. As a technique for displaying video data on a display device, the related art can be used.
< display showing abnormal region 30 >
When displaying the video frame 14 in the first region 22, the display control unit 2040 may add a display that indicates the abnormal region 30 included in the video frame 14. By doing so, the user can easily identify the abnormal region 30 included in the video frame 14. Hereinafter, this display is referred to as a first display.
Various displays may be used as the first display. For example, the display control unit 2040 superimposes a predetermined mark on the abnormal region 30 of the video frame 14 to be displayed in the first region 22. Hereinafter, this mark is referred to as an overlay mark; in this example, the overlay mark is the first display. Fig. 7 is a diagram showing various overlay marks 60 superimposed on the abnormal region 30.
In another example, the display control unit 2040 may display, in the vicinity of the video frame 14, a first display that points to the abnormal region 30 (hereinafter referred to as an indication mark). Fig. 8 is a diagram showing an indication mark 70 indicating the abnormal region 30.
< recording of video frame 14 >
The detection unit 2020 may record the video frame 14 from which the abnormal region 30 is detected in the storage device. Hereinafter, the storage device for storing the video frames 14 is referred to as an image storage unit 80. Fig. 9 is a diagram showing the information processing apparatus 2000 connected to the image storage unit 80. Note that the image storage unit 80 may be provided inside the information processing apparatus 2000.
The detection unit 2020 may directly record the video frame 14 from which the abnormal region 30 is detected in the image storage unit 80, or may appropriately process the video frame 14 and record the processed video frame 14 in the image storage unit 80. For example, the detection unit 2020 records the video frame 14 on which the image (the superimposition marker 60 or the like) indicating the position of the abnormal region 30 is superimposed in the image storage unit 80. By so doing, the position of the abnormal region 30 in the video frame 14 can be easily identified. In another example, the detection unit 2020 records the video frame 14 in the image storage unit 80 in association with information for determining the position of the abnormal region 30 included in the video frame 14.
[ example embodiment 2]
Fig. 10 is a block diagram showing an information processing apparatus 2000 according to example embodiment 2. The information processing apparatus 2000 according to example embodiment 2 is the same as the information processing apparatus 2000 according to example embodiment 1 except for the following matters.
Typically, a camera generates video frames at a frequency such as 30 frames per second (fps). Thus, a plurality of video frames 14 may include the same part of the body. For example, when a certain abnormal region 30 remains within the imaging range of the camera 10 for one second, the abnormal region 30 is detected from at most 30 video frames 14. As described above, when the same abnormal region 30 is detected from a plurality of video frames 14, it is not always necessary to display all of the plurality of video frames 14 in the first region 22. For example, by displaying only some (e.g., one) of the plurality of video frames 14 in the first region 22, the user can identify the abnormal region 30 included in the video frames 14.
In the case where there are a plurality of video frames 14 including the same abnormal region 30 as described above, the information processing apparatus 2000 according to example embodiment 2 displays only some of the video frames 14 on the display device 20. For this reason, the information processing apparatus 2000 according to example embodiment 2 includes a determination unit 2060. The determination unit 2060 determines whether the abnormal region 30 detected from each of the plurality of video frames 14 is the same. The display control unit 2040 according to example embodiment 2 displays only some of the plurality of video frames 14 including the abnormal region 30 determined to be the same on the display device 20.
< determination by the determination unit 2060 >
The determination unit 2060 compares the abnormal regions 30 detected from the video frame 14 to determine whether the abnormal regions 30 detected from the video frame 14 are the same. For example, the determination unit 2060 calculates the degree of similarity between an image region representing an abnormal region 30 included in a certain video frame 14 and an image region representing an abnormal region 30 included in another video frame 14. The determination unit 2060 determines that the abnormal regions 30 included in the two video frames 14 are the same if the degree of similarity is equal to or greater than the predetermined value. On the other hand, if the degree of similarity is smaller than the predetermined value, the determination unit 2060 determines that the abnormal regions 30 included in the two video frames 14 are different from each other. Here, the conventional technique may be used as a technique of calculating the similarity by comparing image regions.
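As an illustrative sketch only, and not part of the disclosure, the similarity-based determination above can be expressed as follows in Python. The function names and the histogram-intersection measure are assumptions standing in for whatever image-comparison technique of the related art is actually used.

```python
# Hypothetical sketch of the determination unit's similarity check.
# Regions are given as 2D grayscale patches (values 0-255); similarity is
# a simple normalized histogram intersection, used here for illustration.

def histogram(patch, bins=16):
    """Normalized intensity histogram of a 2D patch."""
    counts = [0] * bins
    n = 0
    for row in patch:
        for v in row:
            counts[min(v * bins // 256, bins - 1)] += 1
            n += 1
    return [c / n for c in counts]

def similarity(patch_a, patch_b):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    ha, hb = histogram(patch_a), histogram(patch_b)
    return sum(min(a, b) for a, b in zip(ha, hb))

def same_abnormal_region(patch_a, patch_b, threshold=0.8):
    """Two detected regions are treated as the same abnormal region 30
    when their similarity is equal to or greater than the threshold."""
    return similarity(patch_a, patch_b) >= threshold

bright = [[200, 210], [205, 220]]
dark = [[10, 20], [15, 25]]
assert same_abnormal_region(bright, bright)    # identical patches match
assert not same_abnormal_region(bright, dark)  # very different patches do not
```

The threshold of 0.8 is likewise an illustrative stand-in for the "predetermined value" in the text.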
Note that the determination unit 2060 may compare the image region having a predetermined size or shape including the abnormal region 30 and its periphery with respect to the plurality of video frames 14. In another example, the determination unit 2060 may compare image regions around the abnormal region 30, instead of the abnormal region 30, for a plurality of video frames 14.
A more specific example of a method of determining whether the abnormal area 30 included in each video frame 14 is the same will be described. When the abnormal region 30 is detected from the video frame 14, the detection unit 2020 calculates a feature value of an image region representing the abnormal region 30 (for example, a parameter representing the shape or pattern of the image region). The detection unit 2020 records the calculated feature value in association with a discriminator (e.g., a frame number) of the video frame 14 in the storage device. The storage device may be handled as a database in which information for managing the abnormal area 30 detected from the video data 12 is stored. Hereinafter, information to be stored in the storage device is referred to as abnormal area information.
Fig. 11 is a diagram showing abnormal area information in a table format. The table shown in fig. 11 is referred to as table 300. Table 300 has two columns: an abnormal area discriminator 302 and data 304. The abnormal area discriminator 302 is a discriminator assigned to the abnormal area 30. The data 304 indicates a set of "feature values of the abnormal region 30 and the discriminator of the video frame 14 from which the abnormal region 30 is detected". For example, the record in the first row of the table 300 indicates that the abnormal region 30 having the discriminator r1 is detected from the video frame 14 having the discriminator img001 and the video frame 14 having the discriminator img004. Further, the record in the first row of the table 300 indicates that the feature value of the abnormal region 30 detected from the video frame 14 having the discriminator img001 is v1, and the feature value of the abnormal region 30 detected from the video frame 14 having the discriminator img004 is v5. Since the similarity between the feature values v1 and v5 is high, (v1, img001) and (v5, img004) are stored in the same record.
When the abnormal region 30 is detected from the video frame 14, the detection unit 2020 adds a set of "the feature value of the abnormal region 30 and the discriminator of the video frame 14" to the table 300. To do so, the determination unit 2060 first searches the table 300 for a feature value having a high degree of similarity to the feature value of the detected abnormal region 30. Assume that, as a result of the search, a record indicating a feature value having a high degree of similarity is found in the data 304. In this case, the determination unit 2060 updates the record acquired by the search. Specifically, the determination unit 2060 adds the set of "the feature value of the detected abnormal region 30 and the discriminator of the video frame 14 from which the abnormal region 30 is detected" to the data 304 of the acquired record.
On the other hand, assume that, as a result of the search, no record indicating a feature value having a high degree of similarity to the feature value of the detected abnormal region 30 is found in the data 304. In this case, the determination unit 2060 generates a new record indicating "the feature value of the detected abnormal region 30 and the discriminator of the video frame 14 from which the abnormal region 30 is detected", and adds the record to the table 300.
Note that, in the case where a plurality of abnormal regions 30 are detected from one video frame 14, the above-described processing is performed for each of the plurality of abnormal regions 30.
By managing the abnormal regions 30 detected from each video frame 14 in this manner, it can be easily determined whether the same abnormal region 30 is included in a plurality of video frames 14. Specifically, when the record indicating the discriminator of a specific video frame 14 is the same as the record indicating the discriminator of another video frame 14 in the table 300, the determination unit 2060 determines that the same abnormal area 30 is included in these video frames 14. On the other hand, when the record indicating the discriminator of a specific video frame 14 is different from the record indicating the discriminator of another video frame 14 in the table 300, the determination unit 2060 determines that the abnormal regions 30 different from each other are included in these video frames 14.
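A minimal sketch of the table-300 bookkeeping described above, with two simplifying assumptions made purely for illustration: feature values are single floats, and "high similarity" is a small absolute difference. Actual feature values would be vectors compared by an image-similarity measure of the related art.

```python
# Hypothetical sketch of table 300 and the record-update logic.
class AbnormalRegionTable:
    def __init__(self, sim_threshold=0.1):
        # each record: {"id": "rN", "data": [(feature, frame_id), ...]}
        self.records = []
        self.sim_threshold = sim_threshold
        self._next = 1

    def register(self, feature, frame_id):
        """Called when an abnormal region 30 is detected from a video frame 14."""
        for rec in self.records:
            if any(abs(feature - f) <= self.sim_threshold for f, _ in rec["data"]):
                rec["data"].append((feature, frame_id))  # same abnormal region
                return rec["id"]
        rec = {"id": "r%d" % self._next, "data": [(feature, frame_id)]}
        self._next += 1
        self.records.append(rec)                         # new abnormal region
        return rec["id"]

    def same_region(self, frame_a, frame_b):
        """True if some record lists both frames, i.e. they share an abnormal region 30."""
        for rec in self.records:
            frames = {fid for _, fid in rec["data"]}
            if frame_a in frames and frame_b in frames:
                return True
        return False

table = AbnormalRegionTable()
table.register(0.50, "img001")  # first detection -> new record r1
table.register(0.52, "img004")  # similar feature  -> appended to r1
table.register(0.90, "img002")  # dissimilar       -> new record r2
assert table.same_region("img001", "img004")
assert not table.same_region("img001", "img002")
```

The sameness query at the end mirrors the check described in the text: two video frames 14 include the same abnormal region 30 exactly when they appear in the same record.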
< method of determining video frame 14 to be displayed on display device 20 >
For example, the display control unit 2040 determines one video frame 14 among the plurality of video frames 14 including the abnormal region 30 determined to be the same, in which the abnormal region 30 is most easily recognized by the user, and displays the determined video frame 14 on the display device 20. Various methods may be employed for the above determination. Hereinafter, specific examples of the above-described determination method will be described.
< use of method for indicating possibility of abnormality >)
For a plurality of video frames 14 including the abnormal region 30 determined to be the same, the display control unit 2040 determines the possibility that the image region representing the abnormal region 30 represents an abnormality of the body. For example, in the case where the abnormal region 30 is detected from the video frame 14 by feature value matching or template matching, the possibility that the image region representing the abnormal region 30 represents an abnormality of the body is represented by the similarity between the image region and a feature value or template defined in advance. The display control unit 2040 determines the video frame 14 having the highest probability as the video frame 14 to be displayed on the display device 20.
It is considered that the higher the possibility computed for the abnormal region 30 included in the video frame 14, the more likely it is that the abnormal region 30 represents an actual abnormality of the body. Therefore, by displaying on the display device 20 the video frame 14 whose abnormal region 30 has the highest possibility of representing an abnormality of the body, the user is enabled to recognize the physical abnormality of the subject more accurately.
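The selection rule of this section reduces to taking a maximum over the candidate frames. A sketch, assuming the possibility (likelihood) for each frame is already available as a number; the names are illustrative:

```python
# Illustrative: among video frames 14 showing the same abnormal region 30,
# pick the one whose region most likely represents a bodily abnormality.
def pick_most_likely(frames):
    """frames: list of (frame_id, likelihood) pairs for the same abnormal region."""
    return max(frames, key=lambda f: f[1])[0]

candidates = [("img010", 0.62), ("img011", 0.91), ("img012", 0.74)]
assert pick_most_likely(candidates) == "img011"
```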
< method of Using the position of the abnormal region 30 >)
The display control unit 2040 determines, among the plurality of video frames 14 including the abnormal region 30 determined to be the same, the video frame 14 whose abnormal region 30 is located closest to the center position of the video frame 14, and processes the determined video frame 14 as the video frame 14 to be displayed on the display device 20. Specifically, for each video frame 14, the display control unit 2040 calculates the distance between the abnormal region 30 included in the video frame 14 and the center coordinates of the video frame 14. The display control unit 2040 determines the video frame 14 having the smallest distance as the video frame 14 to be displayed on the display device 20.
In general, the closer an object included in an image generated by a camera is to the center of the image, the more easily the object is seen. Therefore, by displaying on the display device 20 the video frame 14 in which the position of the abnormal region 30 is close to the center position of the video frame 14, the user can more easily see the abnormal region 30.
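A sketch of the center-distance selection, assuming the center coordinates of each abnormal region 30 and the frame size are known; the names and the 640x480 default are illustrative:

```python
import math

def distance_to_center(region_center, frame_size):
    """Distance from the abnormal region's center to the frame's center coordinates."""
    cx, cy = frame_size[0] / 2, frame_size[1] / 2
    return math.hypot(region_center[0] - cx, region_center[1] - cy)

def pick_most_central(frames, frame_size=(640, 480)):
    """frames: list of (frame_id, region_center); returns the frame whose
    abnormal region 30 lies closest to the center of the video frame 14."""
    return min(frames, key=lambda f: distance_to_center(f[1], frame_size))[0]

candidates = [("img020", (50, 40)), ("img021", (310, 235)), ("img022", (600, 400))]
assert pick_most_central(candidates) == "img021"
```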
< method of Using contrast of the entire video frame 14 >)
The display control unit 2040 determines, among the plurality of video frames 14 including the abnormal region 30 determined to be the same, the video frame 14 having the highest contrast over the entire video frame 14 as the video frame 14 to be displayed on the display device 20. Specifically, for each video frame 14, the display control unit 2040 calculates an index value representing the contrast of the entire video frame 14. The display control unit 2040 compares the calculated index values to determine the video frame 14 having the highest contrast, and processes the determined video frame 14 as the video frame 14 to be displayed on the display device 20. Note that, for example, the Michelson contrast or the like may be used as an index value representing the contrast.
In general, when the contrast of an image is high, it is easier to recognize each object included in the image. Therefore, by displaying the video frame 14 having a high contrast in the entire video frame 14 on the display device 20, it is made easier for the user to see the abnormal region.
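The Michelson contrast mentioned above is (Lmax - Lmin) / (Lmax + Lmin) over the luminance values. A sketch of contrast-based frame selection using it; representing pixel values as nested lists is an illustrative simplification:

```python
def michelson_contrast(pixels):
    """(Lmax - Lmin) / (Lmax + Lmin) over the luminance values of a frame."""
    flat = [v for row in pixels for v in row]
    lo, hi = min(flat), max(flat)
    return 0.0 if hi + lo == 0 else (hi - lo) / (hi + lo)

def pick_highest_contrast(frames):
    """frames: list of (frame_id, pixels); returns the frame with the
    highest Michelson contrast over the entire video frame 14."""
    return max(frames, key=lambda f: michelson_contrast(f[1]))[0]

flat_frame = [[120, 130], [125, 135]]   # low contrast
crisp_frame = [[20, 230], [40, 210]]    # high contrast
assert michelson_contrast(crisp_frame) > michelson_contrast(flat_frame)
assert pick_highest_contrast([("img030", flat_frame), ("img031", crisp_frame)]) == "img031"
```

The same functions apply unchanged to the variant in the next section: pass only the pixels of the image region representing the abnormal region 30 instead of the whole frame.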
< method of Using contrast of image region representing abnormal region 30 >)
The display control unit 2040 may use the contrast of the image region representing the abnormal region 30 instead of the contrast of the entire video frame 14. That is, the display control unit 2040 calculates, for each of the plurality of video frames 14 including the abnormal region 30 determined to be the same, an index value representing the contrast of the image region representing the abnormal region 30. The display control unit 2040 compares the calculated index values to determine the video frame 14 having the highest contrast in the image region representing the abnormal region 30, and displays the determined video frame 14 on the display device 20.
In this way, since the abnormal region 30 having a high contrast is displayed on the display device 20, the user can more easily see the inside of the abnormal region 30.
< time of displaying video frame 14 on display device 20 >
As described above, the video frame 14 from which the abnormal region 30 is detected may be displayed on the display device 20 at any of various timings. For example, the display control unit 2040 displays the video frame 14 on the display device 20 at the timing when the abnormal region 30 is detected from the video frame 14. In this case, for example, the display control unit 2040 compares the video frame 14 that has already been displayed on the display device 20 with a new video frame 14 from which the same abnormal region 30 as the abnormal region 30 included in that video frame 14 is detected, so as to determine the video frame 14 to be displayed on the display device 20. In the case where it is determined that the new video frame 14 is to be displayed on the display device 20, the display on the display device 20 is updated. On the other hand, in the case where it is determined that the video frame 14 already displayed is to remain displayed on the display device 20, the display control unit 2040 does not display the new video frame 14 on the display device 20.
For example, the display control unit 2040 compares the possibility that an abnormal region 30 included in the video frame 14 that has been displayed on the display device 20 indicates an abnormality with the possibility that an abnormal region 30 included in a new video frame 14 indicates an abnormality. In a case where the abnormal region 30 included in the new video frame 14 has a higher possibility of representing an abnormality, the display control unit 2040 updates the display on the display device 20 to display the new video frame 14 on the display device 20. On the other hand, in a case where the abnormal region 30 in the video frame 14 that has been displayed on the display device 20 has a higher possibility of indicating an abnormality, the display control unit 2040 does not update the display on the display device 20.
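The update rule of the preceding paragraph can be sketched as a small state machine; the class and method names are hypothetical:

```python
class DisplayState:
    """Illustrative sketch: the first region 22 keeps the video frame 14
    whose abnormal region 30 has the highest likelihood seen so far."""
    def __init__(self):
        self.shown_frame = None
        self.shown_likelihood = -1.0

    def on_detection(self, frame_id, likelihood):
        """Returns True when the display is updated to the new frame."""
        if likelihood > self.shown_likelihood:
            self.shown_frame = frame_id
            self.shown_likelihood = likelihood
            return True
        return False

state = DisplayState()
assert state.on_detection("frame-1", 0.70)      # nothing shown yet -> display
assert state.on_detection("frame-2", 0.85)      # more likely -> replace
assert not state.on_detection("frame-3", 0.60)  # less likely -> keep current
assert state.shown_frame == "frame-2"
```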
Fig. 12 is a diagram showing a scene in which the display of the display device 20 is updated. In the upper portion of the display device 20, the video frame 14-1 is displayed in the first region 22. Thereafter, it is assumed that the same abnormal region 30 as the abnormal region 30 included in the video frame 14-1 is detected from the video frame 14-2 generated after the video frame 14-1. Further, it is assumed that the possibility that the abnormal region 30 included in the video frame 14-2 indicates an abnormality is higher than the possibility that the abnormal region 30 included in the video frame 14-1 indicates an abnormality.
In this case, the display control unit 2040 changes the video frame 14 to be displayed in the first region 22 from the video frame 14-1 to the video frame 14-2 (see the lower part of fig. 12). On the other hand, in the case where the possibility that the abnormal region 30 included in the video frame 14-2 indicates an abnormality is lower than the possibility that the abnormal region 30 included in the video frame 14-1 indicates an abnormality, the display control unit 2040 does not change the video frame 14 to be displayed in the first region 22 (not shown).
The timing at which the display control unit 2040 displays the video frame 14 on the display device 20 is not limited to the above-described timing. For example, while the same abnormal region 30 is detected from a plurality of video frames 14 that are consecutive in time series, the display control unit 2040 does not yet display any of these video frames 14 on the display device 20. After the same abnormal region 30 is no longer detected from the video frames 14, the display control unit 2040 determines one video frame 14 to be displayed on the display device 20 among the plurality of video frames 14 from which the abnormal region 30 has been detected so far. The display control unit 2040 displays the determined video frame 14 on the display device 20.
< storage of video frame 14 in image storage unit 80 >
The detection unit 2020 according to example embodiment 2 may record only some of the video frames 14 from which the same abnormal region 30 is detected in the image storage unit 80. For example, among the video frames 14 from which the same abnormal region 30 is detected, the detection unit 2020 records in the image storage unit 80 only the video frame 14 to be displayed in the first region 22 by the display control unit 2040 (the video frame 14 determined by each of the above-described methods). By doing so, it is possible to save the storage area of the image storage unit 80 while storing a video frame 14 that well represents the abnormal region 30.
Here, the detection unit 2020 may record the video frame 14 in the image storage unit 80 at any of various timings. For example, the detection unit 2020 records the video frame 14 in the image storage unit 80 at the timing when the abnormal region 30 is detected from the video frame 14. In this case, the detection unit 2020 compares the video frame 14 already stored in the image storage unit 80 with a new video frame 14 from which the same abnormal region 30 as the abnormal region 30 included in that video frame 14 is detected, so as to determine the video frame 14 to be stored in the image storage unit 80. In the case where it is determined that the new video frame 14 is to be stored in the image storage unit 80, the detection unit 2020 deletes the video frame 14 that has been stored in the image storage unit 80, and records the new video frame 14 in the image storage unit 80. On the other hand, in the case where it is determined that the video frame 14 already stored in the image storage unit 80 is to be kept, the detection unit 2020 does not record the new video frame 14 in the image storage unit 80.
In another example, while the same abnormal region 30 is detected from a plurality of video frames 14 that are consecutive in time series, the detection unit 2020 does not record the video frames 14 in the image storage unit 80. After the same abnormal region 30 is no longer detected from the video frames 14, the detection unit 2020 determines one video frame 14 to be stored in the image storage unit 80 among the plurality of video frames 14 from which the abnormal region 30 has been detected so far. The detection unit 2020 records the determined video frame 14 in the image storage unit 80.
< about first display >
As described above, the display control unit 2040 can display the first display representing the abnormal region 30 on the display device 20. In this case, it is preferable that the same first display is used for the same abnormal region 30, and that first displays different from each other are used for abnormal regions 30 different from each other. By doing so, it is possible to easily discriminate whether the abnormal regions 30 included in the plurality of video frames 14 displayed on the display device 20 are the same. Therefore, the inspection using the information processing apparatus 2000 can be performed more smoothly.
There are a plurality of methods of making the first display different for each of the abnormal regions 30 different from each other. For example, the display control unit 2040 uses the first displays having the same color or shape for the same abnormal region 30, and uses the first displays having different colors or shapes for the abnormal regions 30 different from each other. Fig. 13 is a diagram showing a first display in consideration of the difference of the abnormal region 30. In fig. 13, an abnormal region 30-1 included in the video frame 14-1 and an abnormal region 30-2 included in the video frame 14-2 are the same abnormal region 30. On the other hand, the abnormal region 30-3 included in the video frame 14-3 is an abnormal region 30 different from the abnormal regions 30-1 and 30-2. The display control unit 2040 displays the overlay mark 60-1 and the overlay mark 60-2 having the same pattern (dot pattern) on the abnormal region 30-1 and the abnormal region 30-2, respectively. On the other hand, the display control unit 2040 displays the overlay mark 60-3 having a lattice pattern different from the dot pattern on the abnormal region 30-3.
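One way to keep the first display consistent per abnormal region 30 is to assign each abnormal-region discriminator a stable mark style. A sketch, with an illustrative palette of patterns (the names are assumptions, not from the disclosure):

```python
# Hypothetical palette of overlay-mark styles, e.g. the dot pattern and
# lattice pattern of fig. 13; cycled through for new abnormal regions.
PALETTE = ["dot-pattern", "lattice-pattern", "stripe-pattern", "solid"]

class MarkStyler:
    def __init__(self):
        self.assigned = {}

    def style_for(self, region_id):
        """Same abnormal region -> same mark; new region -> next distinct mark."""
        if region_id not in self.assigned:
            self.assigned[region_id] = PALETTE[len(self.assigned) % len(PALETTE)]
        return self.assigned[region_id]

styler = MarkStyler()
assert styler.style_for("r1") == styler.style_for("r1")  # same region, same mark
assert styler.style_for("r1") != styler.style_for("r2")  # different regions differ
```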
< abnormal region 30 to be simultaneously displayed in first region 22 and second region 24 >
The user of the information processing apparatus 2000 performs an examination while moving the camera 10 in the body of the subject. Therefore, an abnormal region 30 that has once left the imaging range of the camera 10 may enter the imaging range of the camera 10 again. For example, assume that the doctor views the video frame 14 displayed in the first region 22 and realizes that he or she has missed the abnormal region 30 included in the video frame 14 (i.e., did not notice the abnormal region 30 when it was displayed as part of the video in the second region 24). In this case, the doctor may operate the camera 10 (e.g., operate the scope 40) so that the abnormal region 30 falls within the imaging range of the camera 10 again, in order to check the details by viewing the abnormal region 30 in the video. As a result, the same abnormal region 30 is displayed in both the first region 22 and the second region 24. In other words, the same abnormal region 30 is displayed both in a video frame representing a past scene and in the video representing the real-time scene.
In the case where the same abnormal region 30 is displayed in the first region 22 and the second region 24 in this way, it is preferable that the display control unit 2040 notify the user of this fact. For example, in a case where an abnormal region 30 is detected from the video frame 14 to be displayed in the second region 24, the display control unit 2040 determines whether or not a video frame 14 including the same abnormal region 30 is displayed in the first region 22. In a case where such a video frame 14 is displayed in the first region 22, the display control unit 2040 performs a predetermined notification.
Any notification method may be employed. For example, the display control unit 2040 may highlight the video frame 14 that includes the same abnormal region 30 as the abnormal region 30 detected from the video frame 14 to be displayed in the second region 24 among the video frames 14 displayed in the first region 22.
Fig. 14 is a diagram showing highlighting. In fig. 14, two video frames 14 (video frame 14-1 and video frame 14-2) are displayed in the first area 22. An abnormal region 30-1 and an abnormal region 30-2 are detected from the video frame 14-1 and the video frame 14-2, respectively. These are abnormal regions 30 different from each other. The abnormal region 30-3 is detected from the video frame 14-3, which is the current frame of the video data 12 displayed in the second region 24. The abnormal area 30-2 and the abnormal area 30-3 represent the same abnormal area 30.
In this case, the display control unit 2040 highlights the video frame 14-2 including the same abnormal region 30 as the abnormal region 30 included in the video frame 14-3. In fig. 14, the frame line of the video frame 14-2 is bolded to highlight the video frame 14-2. By doing so, the doctor can easily recognize that the abnormal region 30 currently imaged by the camera 10 (the abnormal region 30 displayed in the second region 24) is the same as the abnormal region 30 included in the video frame 14-2 among the abnormal regions 30 imaged in the past.
Note that the method of highlighting the video frame 14 is not limited to the method of "bolding the frame line of the video frame 14" shown in the example of fig. 14. For example, various methods such as a method of blinking the video frame 14 or a method of changing the color of the video frame 14 may be employed.
The display control unit 2040 may highlight the first display indicating the abnormal region 30 displayed in the first region 22 and the first display indicating the abnormal region 30 displayed in the second region 24. By doing so, the user of the information processing apparatus 2000 can easily recognize the position of the abnormal region 30 included in the past video frame, which is the same as the abnormal region 30 included in the video. Fig. 15 is a diagram showing an example in which the first display is highlighted. Fig. 15 shows the same situation as fig. 14, except that the first display is highlighted. In fig. 15, an overlay mark 60 is displayed on the abnormal area 30.
In fig. 15, similarly to fig. 14, the abnormal region 30-2 included in the video frame 14-2 and the abnormal region 30-3 included in the video frame 14-3 are the same abnormal region 30. The display control unit 2040 thickens the frame lines of the superimposition marks 60 that indicate these two abnormal regions 30. By doing so, the user of the information processing apparatus 2000 can easily recognize that the abnormal region 30-2 indicated by the overlay mark 60-2 and the abnormal region 30-3 indicated by the overlay mark 60-3 are the same.
Note that, when the abnormal region 30 to be displayed in the first region 22 and the abnormal region 30 to be displayed in the second region 24 are the same, the same first display may be used for these abnormal regions 30. For example, the overlay marks 60-2 and 60-3 in fig. 15 have the same shape. By doing so, it is possible to easily recognize whether the abnormal region 30 included in the real-time video is the same as the abnormal region 30 included in a past video frame.
The method of notifying that the same abnormal region 30 is displayed in the first region 22 and the second region 24 is not limited to the above-described highlighting. For example, the notification may be output of a predetermined sound such as a buzzer sound. In another example, the notification may be output of a predetermined vibration.
< hardware configuration >
Similar to example embodiment 1, for example, the hardware configuration of a computer forming the information processing apparatus 2000 according to example embodiment 2 is represented by fig. 3. However, the storage device 1080 of the computer 1000 forming the information processing apparatus 2000 according to the present exemplary embodiment also stores program modules for realizing the functions of the information processing apparatus 2000 according to the present exemplary embodiment.
[ example embodiment 3]
Fig. 16 is a block diagram showing an information processing apparatus 2000 according to example embodiment 3. The information processing apparatus 2000 according to example embodiment 3 is the same as the information processing apparatus 2000 according to example embodiment 1 or 2 except for the following matters.
The information processing apparatus 2000 according to example embodiment 3 has a designation receiving unit 2080. The designation receiving unit 2080 receives an input from a user designating one of the plurality of video frames 14 constituting the video data 12. The designation receiving unit 2080 records the designated video frame 14 in the image storage unit 80. The detection unit 2020 according to example embodiment 3 stores the video frame 14 from which the abnormal region 30 is detected in the image storage unit 80.
Here, the detection unit 2020 records the video frame 14 from which the abnormal region 30 is detected so as to be distinguishable from the video frame 14 to be recorded in the image storage unit 80 by the designation receiving unit 2080 (the video frame 14 designated to the designation receiving unit 2080). In other words, the video frame 14 to be recorded in the image storage unit 80 by the detection unit 2020 and the video frame 14 to be recorded in the image storage unit 80 by the designation receiving unit 2080 are recorded so as to be distinguishable from each other. Hereinafter, the video frame 14 to be recorded in the image storage unit 80 by the detection unit 2020 is referred to as an automatic storage frame. Further, the video frame 14 to be recorded in the image storage unit 80 by the designation receiving unit 2080 is referred to as a designated storage frame.
The designation received by the designation receiving unit 2080 is made by, for example, a doctor who performs an examination. For example, when the doctor finds an abnormal part while viewing the video data 12 displayed in the second region 24 during the examination, the doctor attempts to record the video frame 14 including that part. In another example, the doctor may record a video frame 14 that includes a predetermined site requiring attention, regardless of whether the site is abnormal.
In this case, for example, while the part is included in the video frame 14 displayed in the second region 24, the doctor operates an input device such as a keyboard or a predetermined button to designate the video frame 14 including the part. The designation receiving unit 2080 records the video frame 14 designated in this manner in the image storage unit 80. From the user's perspective, the above-described operation is, for example, similar to taking a picture by releasing the shutter of a camera.
In contrast, the automatic storage frame to be recorded in the image storage unit 80 by the detection unit 2020 is a video frame 14 in which an abnormal region 30 is automatically detected (without being specified by the user) by the image analysis of the information processing apparatus 2000. That is, the automatic storage frame is a video frame 14 automatically recorded in the image storage unit 80 by the information processing apparatus 2000.
As described above, the designated storage frame and the automatic storage frame differ, from the user's point of view, in the trigger for recording and in the meaning of the frame recorded in the image storage unit 80. Therefore, it is preferable that the user can easily recognize whether a video frame 14 stored in the image storage unit 80 is a designated storage frame or an automatic storage frame.
In this regard, with the information processing apparatus 2000 according to the present exemplary embodiment, the video frame 14 to be recorded in the image storage unit 80 by the detection unit 2020 and the video frame 14 to be recorded in the image storage unit 80 by the designation reception unit 2080 are recorded in the image storage unit 80 so as to be distinguishable from each other. Therefore, it is possible to easily discriminate whether the video frame 14 stored in the image storage unit 80 is a designated storage frame or an automatic storage frame.
< method of discrimination >
Any method may be employed that stores the designated storage frames and the automatic storage frames in the image storage unit 80 in a distinguishable manner. For example, the information processing apparatus 2000 records, in association with each video frame 14, a flag indicating whether the video frame 14 is a designated storage frame or an automatic storage frame. Fig. 17 is a diagram showing, in table form, the format of the information stored in the image storage unit 80. The table shown in Fig. 17 is referred to as table 200. Table 200 has two columns: video frame 202 and type flag 204. The video frame 202 column holds the video frame 14 itself. The type flag 204 indicates whether the video frame 14 in the video frame 202 column is a designated storage frame or an automatic storage frame.
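As an illustration only (the patent prescribes no particular implementation), the flag-based scheme of table 200 can be sketched as follows; all names (`ImageStorage`, `FrameType`, and so on) are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class FrameType(Enum):
    DESIGNATED = "designated"  # recorded via the user's shutter-like operation
    AUTOMATIC = "automatic"    # recorded automatically when an abnormal region is detected

@dataclass
class Row:
    video_frame: bytes    # the video frame 14 itself (column "video frame 202")
    type_flag: FrameType  # column "type flag 204"

class ImageStorage:
    """Toy stand-in for the image storage unit 80 holding table 200."""

    def __init__(self):
        self.rows: List[Row] = []

    def record(self, video_frame, type_flag):
        # Store the frame together with the flag that makes the two
        # kinds of frames distinguishable later.
        self.rows.append(Row(video_frame, type_flag))

    def frames_of_type(self, type_flag):
        return [r.video_frame for r in self.rows if r.type_flag is type_flag]

storage = ImageStorage()
storage.record(b"frame-a", FrameType.AUTOMATIC)    # stored by the detection unit 2020
storage.record(b"frame-b", FrameType.DESIGNATED)   # stored by the designation reception unit 2080
print(storage.frames_of_type(FrameType.DESIGNATED))  # [b'frame-b']
```

Keeping the flag in the same row as the frame lets any later viewer filter or label frames without re-running detection.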
< display with respect to display control unit 2040 >
In a case where a video frame 14 to be displayed in the first region 22 includes the same abnormal region 30 as an abnormal region 30 included in a designated storage frame, the display control unit 2040 makes that video frame 14 distinguishable from the other video frames 14. For example, the display control unit 2040 performs a predetermined display in, or in the periphery of, the video frame 14 including the same abnormal region 30 as the abnormal region 30 included in the designated storage frame. This predetermined display is referred to as the second display. For example, the second display highlights the video frame 14 including the same abnormal region 30 as the abnormal region 30 included in the designated storage frame. Note that the methods described in example embodiment 2 may be used to highlight the video frame 14 so determined.
One purpose of displaying, in the first region 22, the video frames 14 from which an abnormal region 30 is detected is to prevent the user from missing the abnormal region 30. In this regard, an abnormal region 30 included in a designated storage frame (i.e., a video frame 14 designated by the user) has already been noticed by the user and is therefore not missed.
When a video frame 14 including the same abnormal region 30 as an abnormal region 30 included in a designated storage frame is displayed in the first region 22, the detection unit 2020 enables the user to identify that video frame 14. By so doing, the user can easily recognize which abnormal regions 30 displayed in the first region 22 the user has already recognized.
Alternatively, among the video frames 14 to be displayed in the first region 22, the display control unit 2040 may perform the second display in the video frames 14 including an abnormal region 30 different from those included in the designated storage frames, and may not perform the second display in the video frames 14 including the same abnormal region 30 as one included in a designated storage frame. By so doing, among the abnormal regions 30 automatically detected by the information processing apparatus 2000, the video frames 14 including abnormal regions 30 that the user has likely not recognized are highlighted. Therefore, the user can easily notice the abnormal regions 30 that the user has not recognized.
In another example, among the video frames 14 in which an abnormal region 30 is detected by the detection unit 2020, the display control unit 2040 may refrain from displaying in the first region 22 (i.e., delete from the first region 22) any video frame 14 including the same abnormal region 30 as one included in a designated storage frame. By so doing, among the video frames 14 in which an abnormal region 30 is detected by the detection unit 2020, only those with abnormal regions 30 that the user has likely not recognized are displayed in the first region 22 of the display device 20. Therefore, the user can easily notice the abnormal regions 30 that the user has not recognized.
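The three display policies described above (highlighting frames whose region the user already recognized, highlighting only frames the user likely missed, or suppressing already-recognized frames) can be sketched as one selection function. This is a minimal illustration, not the apparatus's actual implementation; the name `plan_first_region` and the data shapes are assumptions:

```python
from typing import List, Set, Tuple

def plan_first_region(
    candidates: List[Tuple[str, str]],  # (frame id, abnormal-region id) pairs from the detection unit
    recognized: Set[str],               # region ids matching regions in designated storage frames
    policy: str = "suppress",
) -> List[Tuple[str, bool]]:
    """Return (frame id, highlighted?) pairs to show in the first region 22.

    policy:
      "mark"     - show all frames; highlight those whose region is already recognized
      "invert"   - show all frames; highlight only frames the user likely missed
      "suppress" - drop frames whose region is already recognized
    """
    plan = []
    for frame_id, region_id in candidates:
        seen = region_id in recognized
        if policy == "suppress":
            if not seen:
                plan.append((frame_id, False))
        elif policy == "mark":
            plan.append((frame_id, seen))
        elif policy == "invert":
            plan.append((frame_id, not seen))
        else:
            raise ValueError(f"unknown policy: {policy}")
    return plan

candidates = [("f1", "r1"), ("f2", "r2")]
print(plan_first_region(candidates, {"r1"}, policy="suppress"))  # [('f2', False)]
print(plan_first_region(candidates, {"r1"}, policy="invert"))    # [('f1', False), ('f2', True)]
```

All three variants serve the same goal stated in the text: drawing attention to regions the user may have missed.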
< example of hardware configuration >
Similar to example embodiment 1, for example, the hardware configuration of a computer forming the information processing apparatus 2000 according to example embodiment 3 is represented by fig. 3. However, the storage device 1080 of the computer 1000 forming the information processing apparatus 2000 according to the present exemplary embodiment also stores program modules for realizing the functions of the information processing apparatus 2000 according to the present exemplary embodiment.
[ example embodiment 4]
Fig. 18 is a block diagram showing an information processing apparatus 2000 according to example embodiment 4. The information processing apparatus 2000 according to example embodiment 4 is the same as the information processing apparatus 2000 according to example embodiments 1, 2, or 3 except for the following matters.
The information processing apparatus 2000 according to example embodiment 4 includes a second detection unit 2100. The second detection unit 2100 detects a predetermined action performed by the user on an abnormal region 30 or its periphery. When the user finds a possibly abnormal part inside the body being examined, the user performs various actions to observe the part in more detail. Examples of such actions are 1) changing the color or intensity of the light irradiating the abnormal region 30 or its periphery, 2) spraying a dye onto or staining the abnormal region 30 or its periphery, 3) applying water or a drug to the abnormal region 30 or its periphery, and 4) collecting tissue from the abnormal region 30 or its periphery. Like an abnormal region 30 designated by a user operation on the designation reception unit 2080, an abnormal region 30 targeted by these actions is a part that the user has very likely recognized. Here, the "abnormal region 30 targeted by a predetermined action" refers to an abnormal region 30 for which the second detection unit 2100 has detected a predetermined action of the user on that region or its periphery.
For example, the display control unit 2040 performs, for the abnormal region 30 targeted by the user's predetermined action, the same control as the display control performed for an abnormal region 30 identical to one included in a designated storage frame (refer to example embodiment 3). More specifically, for example, the display control unit 2040 performs a predetermined display in, or in the periphery of, the abnormal region 30 targeted by the user's predetermined action and displayed in the first region 22, or in, or in the periphery of, the video frame 14 including that abnormal region 30. This predetermined display is referred to as the third display. The third display is, for example, a display that highlights the abnormal region 30 or the video frame 14. By so doing, the user can easily recognize, among the abnormal regions 30 displayed in the first region 22, those that the user has already recognized. Note that the various displays described above may be used to highlight the abnormal region 30 or video frame 14 so determined.
Fig. 19 is a diagram showing a scene in which the video frame 14 including an abnormal region 30 targeted by the user's predetermined action is highlighted. In this example, the predetermined action is staining the periphery of the abnormal region 30.
In Fig. 19, an abnormal region 30-3 is detected from the video frame 14-3 displayed in the second region 24. In addition, the periphery of the abnormal region 30-3 is stained.
Here, the abnormal region 30-2, representing the same abnormality as the abnormal region 30-3, is detected from the video frame 14-2 among the video frames 14 displayed in the first region 22. The video frame 14-2 is highlighted by thickening its outline.
Note that the processing executed when the user's predetermined action is detected is not limited to displaying the third display described above. For example, the display control unit 2040 may refrain from displaying, in the first region 22, the video frame 14 including the abnormal region 30 targeted by the user's predetermined action. By so doing, among the video frames 14 in which an abnormal region 30 is detected by the detection unit 2020, only those with abnormal regions 30 that the user has very likely not recognized are displayed in the first region 22 of the display device 20. Therefore, the user can easily notice the abnormal regions 30 that the user has not recognized.
< method of detecting the predetermined action of the user >
The various predetermined actions described above are performed by the user through predetermined input operations on the endoscope system or the like. For example, in a general endoscope system, the scope equipped with the camera is provided with a mechanism for irradiating light (such as a light source), a mechanism for spraying a dye or a staining solution, a mechanism for applying water or a drug, a mechanism for collecting tissue, and the like. These mechanisms operate in response to predetermined input operations performed by the user on the endoscope system. In other words, when any of the various predetermined actions described above is performed, an input operation for operating the mechanism that realizes that action is performed.
Accordingly, the second detection unit 2100 detects that the user has performed a predetermined action by detecting that an input operation for operating one of these mechanisms has been performed. For example, the second detection unit 2100 receives, from the endoscope system or the like, a notification indicating that the input operation has been performed, thereby detecting the input operation.
Here, the second detection unit 2100 treats the part included in the video frame 14 displayed in the second region 24 at the time the input operation is detected (i.e., the part being captured by the camera 10) as the target of the user's predetermined action. That is, when an abnormal region 30 is included in the video frame 14 displayed in the second region 24 at the time the input operation is detected, the second detection unit 2100 treats that abnormal region 30 as the abnormal region 30 targeted by the user's predetermined action.
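A minimal sketch of this timing-based association, assuming the endoscope system pushes a notification carrying the ids of the abnormal regions shown in the second region 24 at that moment (the class and method names are hypothetical):

```python
class SecondDetectionUnit:
    """Marks abnormal regions targeted by a user action, keyed off input-operation notifications."""

    def __init__(self):
        self.targeted_regions = set()

    def on_input_operation(self, displayed_region_ids):
        # displayed_region_ids: ids of the abnormal regions 30 contained in the
        # video frame 14 shown in the second region 24 when the input operation
        # (e.g. triggering the dye-spraying mechanism) was detected.
        self.targeted_regions.update(displayed_region_ids)

unit = SecondDetectionUnit()
unit.on_input_operation(["r3"])       # dye sprayed while region r3 was on screen
print("r3" in unit.targeted_regions)  # True
```

The display control unit could then consult `targeted_regions` to apply the third display or to suppress the corresponding frames in the first region 22.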
Note that the method by which the second detection unit 2100 detects the user's predetermined action is not limited to the detection of input operations described above. For example, the second detection unit 2100 may detect the user's predetermined action by performing image analysis on the video data 12. For example, the second detection unit 2100 compares the luminance distribution or color distribution of successive video frames 14 included in the video data 12 to detect a change in the luminance or color of the imaging range of the camera 10. The second detection unit 2100 thereby detects a change in the color or intensity of the light illuminating the imaging range of the camera 10, or the spraying of a staining solution.
In the case of using image analysis in this way, for example, the second detection unit 2100 treats the part included in the video frame 14 in which the change in luminance or color is detected as the target of the user's predetermined action. That is, when the video frame 14 includes an abnormal region 30 in which a change in luminance or color is detected, the second detection unit 2100 treats that abnormal region 30 as the abnormal region 30 targeted by the user's predetermined action.
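As a toy illustration of the image-analysis approach (the patent does not specify an algorithm), a mean-brightness comparison between consecutive frames might look like this; the threshold value and the function names are assumptions:

```python
def mean_brightness(frame):
    """Average pixel value of a grayscale frame given as a 2D list of ints."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def lighting_changed(prev_frame, curr_frame, threshold=30.0):
    """True when mean brightness jumps between frames, e.g. when the
    illumination is switched or a staining solution is sprayed."""
    return abs(mean_brightness(curr_frame) - mean_brightness(prev_frame)) > threshold

dark = [[10, 12], [11, 13]]
bright = [[200, 210], [205, 215]]
print(lighting_changed(dark, bright))  # True
print(lighting_changed(dark, dark))    # False
```

A real system would compare full luminance or color histograms per region rather than a single global mean, but the trigger logic is the same.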
< example of hardware configuration >
Similar to example embodiment 1, for example, the hardware configuration of a computer forming the information processing apparatus 2000 according to example embodiment 4 is represented by fig. 3. However, the storage device 1080 of the computer 1000 forming the information processing apparatus 2000 according to the present exemplary embodiment also stores program modules for realizing the functions of the information processing apparatus 2000 according to the present exemplary embodiment.
As described above, the exemplary embodiments of the present invention have been described with reference to the drawings, but these are merely examples of the present invention. The present invention may adopt a combination of the above-described exemplary embodiments or various configurations other than those described above.
Some or all of the above exemplary embodiments may be described as, but are not limited to, the following supplements.
1. An information processing apparatus comprising: a detection unit that detects an abnormal region in an interior of a body from a video in which the interior of the body is imaged; and a display control unit that displays, among a plurality of video frames constituting the video, a video frame from which the abnormal region is detected, in a first region of a display device, and displays the video including a video frame generated after the video frame, in a second region of the display device.
2. The information processing apparatus according to supplement 1, wherein the display control unit displays a first display indicating a position of the abnormal region in the video frame from which the abnormal region is detected to be displayed on the display device.
3. The information processing apparatus according to supplement 2, wherein the display control unit displays the plurality of video frames from which the abnormal region is detected in the first region.
4. The information processing apparatus according to supplement 3, further comprising: a determination unit that determines whether or not the abnormal regions detected from the plurality of video frames represent the same abnormality, wherein the display control unit performs: displaying the same first display for the abnormal regions when the abnormal regions detected from the plurality of video frames are determined to be the same, and displaying different first displays for the abnormal regions when the abnormal regions detected from the plurality of video frames are determined to be different from each other.
5. The information processing apparatus according to any one of supplements 1 to 4, further comprising: a determination unit that determines whether abnormal regions detected from a plurality of video frames are the same, wherein when the abnormal regions detected from the plurality of video frames are determined to be the same, the detection unit displays some of the plurality of video frames in the first region.
6. The information processing apparatus according to supplement 5, wherein the detection unit displays the video frame having the highest possibility that the abnormal region represents an abnormality, the video frame having the shortest distance between the abnormal region and a center position of the video frame, the video frame having the highest contrast in an entire image region, or the video frame having the highest contrast in the abnormal region in the first region, among the plurality of video frames from which the same abnormal region is detected.
7. The information processing apparatus according to any one of supplements 1 to 6, wherein the detection unit records the video frame from which the abnormal region is detected in a storage unit.
8. The information processing apparatus according to supplement 7, further comprising: a designation receiving unit that receives an input designating one of a plurality of video frames constituting the video, and records the designated video frame in the storage unit, wherein the detection unit records the video frame from which the abnormal region is detected in the storage unit so as to be distinguishable from the video frame recorded in the storage unit by the designation receiving unit.
9. The information processing apparatus according to supplement 8, further comprising: a determination unit that determines whether abnormal regions detected from a plurality of video frames are the same, wherein the display control unit displays a predetermined display in a first video frame displayed in the first region or in a periphery of the first video frame when a second video frame determined to include the same abnormal region as the abnormal region detected from the first video frame is designated as the input to the designation reception unit.
10. The information processing apparatus according to supplement 8, further comprising: a determination unit that determines whether the abnormal regions detected from a plurality of video frames are the same, wherein the display control unit does not display a first video frame in the first region when a second video frame determined to include the same abnormal region as the abnormal region detected from the first video frame is designated as the input to the designation reception unit.
11. The information processing apparatus according to any one of supplements 1 to 10, further comprising: a second detection unit that detects a predetermined action by a user on the detected abnormal region or a periphery of the abnormal region, wherein when the predetermined action on the detected abnormal region or the periphery of the abnormal region is detected, the display control unit displays a predetermined display in the video frame or the periphery of the video frame that includes the detected abnormal region and is displayed in the first region.
12. The information processing apparatus according to any one of supplements 1 to 10, further comprising: a second detection unit that detects a predetermined motion by a user to the detected abnormal region or a periphery of the abnormal region, wherein the display control unit does not display the video frame including the detected abnormal region in the first region when the predetermined motion by the user to the detected abnormal region or the periphery of the abnormal region is detected.
13. The information processing apparatus according to supplement 11 or 12, wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a medicine to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
14. A control method executed by a computer, the method comprising: a detection step of detecting an abnormal region in an interior of a body from a video in which the interior of the body is imaged; and a display control step of displaying, among video frames constituting the video, a video frame from which the abnormal region is detected, in a first region of a display device, and displaying the video including the video frame generated after the video frame, in a second region of the display device.
15. The control method according to supplement 14, further comprising: in the display control step, a first display indicating a position of the abnormal region in the video frame from which the abnormal region is detected to be displayed on the display device is displayed.
16. The control method according to supplement 15, further comprising: in the display control step, a plurality of video frames from which the abnormal region is detected are displayed in the first region.
17. The control method according to supplement 16, further comprising: a determination step of determining whether or not abnormal regions detected from a plurality of video frames represent the same abnormality, wherein, in the display control step: the same first display is displayed for the abnormal regions when the abnormal regions detected from the plurality of video frames are determined to be the same, and different first displays are displayed for the abnormal regions when the abnormal regions detected from the plurality of video frames are determined to be different from each other.
18. The control method according to any one of supplements 14 to 17, further comprising: a determination step of determining whether abnormal regions detected from a plurality of video frames are the same, in the detection step, when the abnormal regions detected from the plurality of video frames are determined to be the same, some of the plurality of video frames are displayed in the first region.
19. The control method according to supplement 18, wherein, in the detecting step, among the plurality of video frames from which the same abnormal region is detected, the video frame having the highest possibility that the abnormal region represents an abnormality, the video frame having the shortest distance between the abnormal region and a center position of the video frame, the video frame having the highest contrast in an entire image region, or the video frame having the highest contrast in the abnormal region is displayed in the first region.
20. The control method according to any one of supplements 14 to 19, further comprising: in the detecting step, the video frame from which the abnormal region is detected is recorded in a storage unit.
21. The control method according to supplement 20, further comprising: a designation receiving step of receiving an input designating one of a plurality of video frames constituting the video, and recording the designated video frame in the storage unit, in the detecting step, the video frame from which the abnormal region is detected is recorded in the storage unit so as to be distinguishable from the video frame recorded in the storage unit by the designation receiving step.
22. The control method according to supplement 21, further comprising: a determination step of determining whether abnormal regions detected from a plurality of video frames are the same, wherein, in the display control step, a predetermined display is displayed in a first video frame displayed in the first region or in a periphery of the first video frame when a second video frame determined to include the same abnormal region as the abnormal region detected from the first video frame is designated as the input in the designation receiving step.
23. The control method according to supplement 21, further comprising: a determination step of determining whether the abnormal regions detected from a plurality of video frames are the same, wherein in the display control step, when a second video frame is designated as the input in the designation reception step, a first video frame is not displayed in the first region, the second video frame being determined to include the same abnormal region as the abnormal region detected from the first video frame.
24. The control method according to any one of supplements 14 to 23, further comprising: a second detection step of detecting a predetermined motion by a user to the detected abnormal region or a periphery of the abnormal region, in the display control step, when the predetermined motion to the detected abnormal region or the periphery of the abnormal region is detected, a predetermined display is displayed in the video frame or the periphery of the video frame including the detected abnormal region and displayed in the first region.
25. The control method according to any one of supplements 14 to 23, further comprising: a second detection step of detecting a predetermined motion of the detected abnormal region or a periphery of the abnormal region by a user, wherein in the display control step, when the predetermined motion of the detected abnormal region or the periphery of the abnormal region by the user is detected, the video frame including the detected abnormal region is not displayed in the first region.
26. The control method according to supplement 24 or 25, wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a drug to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
27. A program that causes a computer to execute each step of the control method according to any one of supplements 14 to 26.
The present application claims priority based on Japanese patent application No. 2017-103348 filed on May 25, 2017, the entire disclosure of which is incorporated herein.

Claims (13)

1. An information processing apparatus includes:
a detection unit that detects an abnormal region in an interior of a body from a video in which the interior of the body is imaged;
a display control unit that displays, among a plurality of video frames constituting the video, a video frame from which the abnormal region is detected, in a first region of a display device, and displays the video including the video frame generated after the video frame in a second region of the display device; and
a second detection unit that detects a predetermined action by a user on the detected abnormal region or a periphery of the abnormal region,
wherein when the predetermined action on the detected abnormal region or the periphery of the abnormal region is detected, the display control unit displays a predetermined display in the video frame or the periphery of the video frame including the detected abnormal region and displayed in the first region, and
wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a drug to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
2. An information processing apparatus includes:
a detection unit that detects an abnormal region in an interior of a body from a video in which the interior of the body is imaged;
a display control unit that displays, among a plurality of video frames constituting the video, a video frame from which the abnormal region is detected, in a first region of a display device, and displays the video including the video frame generated after the video frame in a second region of the display device; and
a second detection unit that detects a predetermined action by a user on the detected abnormal region or a periphery of the abnormal region,
wherein when the predetermined action on the detected abnormal region or the periphery of the abnormal region by the user is detected, the display control unit does not display the video frame including the detected abnormal region in the first region, and
wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a drug to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
3. The information processing apparatus according to claim 1 or 2,
wherein the display control unit displays a first display indicating a position of the abnormal region in the video frame from which the abnormal region to be displayed on the display device is detected.
4. The information processing apparatus according to claim 3,
wherein the display control unit displays a plurality of the video frames from which the abnormal region is detected in the first region.
5. The information processing apparatus according to claim 4, further comprising:
a determination unit that determines whether or not the abnormal regions detected from the plurality of video frames represent the same abnormality,
wherein the display control unit performs:
when the abnormal regions detected from the plurality of video frames are determined to be the same, displaying the same first display for the abnormal regions, an
When the abnormal regions detected from the plurality of video frames are determined to be different from each other, a different first display is displayed for the abnormal regions.
6. The information processing apparatus according to claim 1 or 2, further comprising:
a determination unit that determines whether the abnormal regions detected from the plurality of video frames are the same,
wherein the detection unit displays some of the plurality of video frames in the first area when the abnormal area detected from the plurality of video frames is determined to be the same.
7. The information processing apparatus according to claim 6,
wherein the detection unit displays the video frame having the highest possibility that the abnormal region represents an abnormality, the video frame having the shortest distance between the abnormal region and a center position of the video frame, the video frame having the highest contrast in the entire image region, or the video frame having the highest contrast in the abnormal region in the first region, among the plurality of video frames from which the same abnormal region is detected.
8. The information processing apparatus according to claim 1 or 2,
wherein the detection unit records the video frame from which the abnormal region is detected in a storage unit.
9. The information processing apparatus according to claim 8, further comprising:
a designation receiving unit that receives an input designating one of a plurality of video frames constituting the video and records the designated video frame in the storage unit,
wherein the detection unit records the video frame from which the abnormal region is detected in the storage unit so as to be distinguishable from the video frame recorded in the storage unit by the specified reception unit.
10. The information processing apparatus according to claim 9, further comprising:
a determination unit that determines whether the abnormal regions detected from the plurality of video frames are the same,
wherein the display control unit displays a predetermined display in a first video frame displayed in the first area or in a periphery of the first video frame when a second video frame determined to include the same abnormal area as the abnormal area detected from the first video frame is designated as the input to the designation reception unit.
11. The information processing apparatus according to claim 9, further comprising:
a determination unit that determines whether the abnormal regions detected from a plurality of video frames are the same,
wherein the display control unit does not display a first video frame in the first area when a second video frame determined to include the same abnormal area as the abnormal area detected from the first video frame is designated as the input to the designation reception unit.
12. A computer-readable storage medium storing a program, the program causing a computer to execute:
detecting an abnormal region in an interior of a body from a video in which the interior of the body is imaged;
displaying, in a first area of a display device, a video frame from which the abnormal area is detected, among a plurality of video frames constituting the video, and displaying the video including the video frame generated after the video frame in a second area of the display device; and
detecting a predetermined action by a user on the detected abnormal region or a periphery of the abnormal region,
wherein when the predetermined action on the detected abnormal region or the periphery of the abnormal region is detected, a predetermined display is displayed in the video frame or the periphery of the video frame that includes the detected abnormal region and is displayed in the first area, and
wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a drug to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
13. A computer-readable storage medium storing a program, the program causing a computer to execute:
detecting an abnormal region in an interior of a body from a video in which the interior of the body is imaged;
displaying, in a first area of a display device, a video frame from which the abnormal region is detected, among a plurality of video frames constituting the video, and displaying the video including the video frame generated after the video frame in a second area of the display device; and
detecting a predetermined action by a user on the detected abnormal region or a periphery of the abnormal region,
wherein when the predetermined action of the user on the detected abnormal region or the periphery of the abnormal region is detected, the video frame including the detected abnormal region is not displayed in the first area, and
wherein the predetermined action by the user is an action of changing a color or intensity of light irradiated to the detected abnormal region or the periphery of the abnormal region, an action of performing dye spraying or coloring in the detected abnormal region or the periphery of the abnormal region, an action of applying water or a drug to the detected abnormal region or the periphery of the abnormal region, or an action of collecting tissue of the detected abnormal region or the periphery of the abnormal region.
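Claims 12 and 13 differ only in what happens once the predetermined user action (changing the light, dye spraying, applying water or a drug, collecting tissue) is detected: claim 12 adds a predetermined display to the frame shown in the first area, while claim 13 stops displaying that frame. A minimal sketch of that branching display logic follows; the class and method names are hypothetical, and this is not the patented implementation.

```python
class DetectionDisplay:
    """Tracks which detected-frame thumbnails are shown in the first area."""

    def __init__(self, mark_instead_of_hide: bool = True):
        # True mirrors claim 12 (attach a predetermined display);
        # False mirrors claim 13 (stop displaying the frame).
        self.mark_instead_of_hide = mark_instead_of_hide
        self.shown = {}  # region_id -> {"frame_id": ..., "marked": bool}

    def on_detection(self, frame_id, region_id):
        # A video frame in which an abnormal region was detected is shown
        # in the first area (the live video continues in the second area).
        self.shown[region_id] = {"frame_id": frame_id, "marked": False}

    def on_user_action(self, region_id):
        # Predetermined action observed on the region or its periphery.
        if region_id not in self.shown:
            return
        if self.mark_instead_of_hide:
            self.shown[region_id]["marked"] = True   # claim 12 behavior
        else:
            del self.shown[region_id]                # claim 13 behavior
```

Either behavior tells the operator that a displayed candidate has already been examined, without requiring any extra input device.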
CN201880034288.0A 2017-05-25 2018-05-18 Information processing apparatus, control method, and program Active CN110662477B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-103348 2017-05-25
JP2017103348 2017-05-25
PCT/JP2018/019309 WO2018216617A1 (en) 2017-05-25 2018-05-18 Information processing device, control method, and program

Publications (2)

Publication Number Publication Date
CN110662477A CN110662477A (en) 2020-01-07
CN110662477B true CN110662477B (en) 2022-06-28

Family

ID=64396438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880034288.0A Active CN110662477B (en) 2017-05-25 2018-05-18 Information processing apparatus, control method, and program

Country Status (5)

Country Link
US (1) US20200129042A1 (en)
EP (1) EP3636134A4 (en)
JP (2) JP6799301B2 (en)
CN (1) CN110662477B (en)
WO (1) WO2018216617A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110662476B (en) * 2017-05-25 2022-02-11 日本电气株式会社 Information processing apparatus, control method, and program
JPWO2019078237A1 (en) * 2017-10-18 2020-10-22 富士フイルム株式会社 Medical image processing equipment, endoscopy system, diagnostic support equipment, and medical business support equipment
CN112512398B (en) * 2018-07-27 2024-03-29 富士胶片株式会社 Medical image processing apparatus
EP3861921A4 (en) * 2018-10-04 2021-12-01 NEC Corporation Information processing device, control method, and program
JP7215504B2 (en) * 2019-02-13 2023-01-31 日本電気株式会社 Treatment support device, treatment support method, and program
WO2020170809A1 (en) * 2019-02-19 2020-08-27 富士フイルム株式会社 Medical image processing device, endoscope system, and medical image processing method
JP7256275B2 (en) * 2019-09-03 2023-04-11 富士フイルム株式会社 Medical image processing device, endoscope system, operating method and program for medical image processing device
JP7179707B2 (en) * 2019-10-03 2022-11-29 富士フイルム株式会社 Medical support device, medical support method, and medical support program
EP4129150A4 (en) * 2020-03-30 2023-05-24 NEC Corporation Information processing device, display method, and non-transitory computer-readable medium having program stored therein
JP7402314B2 (en) * 2020-04-02 2023-12-20 富士フイルム株式会社 Medical image processing system, operating method of medical image processing system
JP7389257B2 (en) 2020-07-15 2023-11-29 富士フイルム株式会社 Endoscope system and its operating method
WO2023153069A1 (en) * 2022-02-09 2023-08-17 富士フイルム株式会社 Medical image device, endoscope system, and medical certificate creation system
WO2023238609A1 (en) * 2022-06-09 2023-12-14 富士フイルム株式会社 Information processing device, endoscopic device, information processing method, and program

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101116608A (en) * 2006-08-03 2008-02-06 奥林巴斯医疗株式会社 Image display device
CN101252874A (en) * 2005-09-02 2008-08-27 奥林巴斯医疗株式会社 Portable simplified image display device and receiving system
CN101541227A (en) * 2005-02-10 2009-09-23 G.I.视频有限公司 Advancement techniques for gastrointestinal tool with guiding element
CN100579443C (en) * 2005-01-19 2010-01-13 奥林巴斯株式会社 Electronic endoscope
CN103068403A (en) * 2010-08-20 2013-04-24 日本国立癌症研究中心 Method and composition for treating, preventing and diagnosing cancer containing cancer stem cells or derived therefrom
CN103153155A (en) * 2011-03-15 2013-06-12 奥林巴斯医疗株式会社 Medical device
CN105407787A (en) * 2013-08-02 2016-03-16 奥林巴斯株式会社 Image processing device, image processing method, and program
CN105512473A (en) * 2015-11-30 2016-04-20 广州三瑞医疗器械有限公司 Intelligent identification method and device of colposcope images
CN106659368A (en) * 2014-07-21 2017-05-10 恩多巧爱思股份有限公司 Multi-focal, multi-camera endoscope systems
WO2017081976A1 (en) * 2015-11-10 2017-05-18 オリンパス株式会社 Endoscope device

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6708054B2 (en) * 2001-04-12 2004-03-16 Koninklijke Philips Electronics, N.V. MR-based real-time radiation therapy oncology simulator
EP1262900A3 (en) * 2001-05-29 2007-03-07 MeVis BreastCare GmbH & Co. KG A method and system for in-service monitoring and training for a radiologic workstation
US6616634B2 (en) * 2001-09-21 2003-09-09 Semler Technologies, Inc. Ergonomic syringe
JP4370121B2 (en) * 2003-06-02 2009-11-25 オリンパス株式会社 Endoscope device
US6997910B2 (en) * 2004-05-03 2006-02-14 Infusive Technologies, Llc Multi-chamber, sequential dose dispensing syringe
WO2007026890A1 (en) * 2005-09-02 2007-03-08 Olympus Medical Systems Corp. Portable simplified image display device and receiving system
JP2007159934A (en) 2005-12-15 2007-06-28 Hitachi Medical Corp Comparative diagnostic reading supporting apparatus
JP4891636B2 (en) * 2006-03-14 2012-03-07 オリンパスメディカルシステムズ株式会社 Image analysis device
JP2008301968A (en) * 2007-06-06 2008-12-18 Olympus Medical Systems Corp Endoscopic image processing apparatus
JP5186929B2 (en) * 2008-01-21 2013-04-24 日本電気株式会社 Authentication imaging device
US9007475B2 (en) * 2009-12-17 2015-04-14 Lenovo Innovations Limited (Hong Kong) Communication apparatus and electronic mail creation method
WO2011132468A1 (en) 2010-04-21 2011-10-27 コニカミノルタエムジー株式会社 Medical-image displaying device and program
JP5537261B2 (en) * 2010-05-25 2014-07-02 株式会社東芝 Medical image diagnostic apparatus, image information processing apparatus, and treatment support data display control program
JP5329593B2 (en) * 2011-04-01 2013-10-30 富士フイルム株式会社 Biological information acquisition system and method of operating biological information acquisition system
JP2012223363A (en) * 2011-04-20 2012-11-15 Tokyo Institute Of Technology Surgical imaging system and surgical robot
JP2012248070A (en) * 2011-05-30 2012-12-13 Sony Corp Information processing device, metadata setting method, and program
US20130044927A1 (en) * 2011-08-15 2013-02-21 Ian Poole Image processing method and system
WO2013039395A1 (en) * 2011-09-14 2013-03-21 Ec Solution Group B.V. Active matrix display smart card
EP2573403B1 (en) * 2011-09-20 2017-12-06 Grundfos Holding A/S Pump
JP5713959B2 (en) * 2012-05-23 2015-05-07 株式会社東芝 Electronic device, method, and program
KR101323646B1 (en) * 2012-08-17 2013-11-05 한국전기연구원 Filter exchanging device for fluorescence endoscopic television camera systems
CN202843579U (en) * 2012-09-28 2013-04-03 北京锐视觉科技有限公司 Slit lamp
WO2015029584A1 (en) * 2013-08-30 2015-03-05 オリンパスメディカルシステムズ株式会社 Image management device
JP2015195845A (en) * 2014-03-31 2015-11-09 富士フイルム株式会社 Endoscope system, operation method of endoscope system, processor device, and operation method of processor device
CN104161493B (en) * 2014-07-22 2016-04-20 清华大学深圳研究生院 Polarization imaging endoscopic system and endoscopic imaging method
JP6503167B2 (en) 2014-08-27 2019-04-17 株式会社Nobori Image interpretation report system
US10302780B2 (en) * 2014-10-17 2019-05-28 Silverside Detectors Inc. Fissile neutron detector
JP6536153B2 (en) 2015-04-27 2019-07-03 コニカミノルタ株式会社 Medical image display device and program
JP6422816B2 (en) * 2015-04-28 2018-11-14 富士フイルム株式会社 Endoscope system
JPWO2017073337A1 (en) * 2015-10-27 2017-11-09 オリンパス株式会社 Endoscope apparatus and video processor
JP6597242B2 (en) 2015-12-02 2019-10-30 株式会社明電舎 Vibration control structure of static induction equipment
WO2018159363A1 (en) * 2017-03-01 2018-09-07 富士フイルム株式会社 Endoscope system and method for operating same


Also Published As

Publication number Publication date
EP3636134A4 (en) 2021-03-10
JP6799301B2 (en) 2020-12-16
CN110662477A (en) 2020-01-07
JPWO2018216617A1 (en) 2020-04-09
US20200129042A1 (en) 2020-04-30
JP2021040324A (en) 2021-03-11
WO2018216617A1 (en) 2018-11-29
EP3636134A1 (en) 2020-04-15

Similar Documents

Publication Publication Date Title
CN110662477B (en) Information processing apparatus, control method, and program
CN110662476B (en) Information processing apparatus, control method, and program
US9959618B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US9445713B2 (en) Apparatuses and methods for mobile imaging and analysis
US8502861B2 (en) Image display apparatus
EP2453790B1 (en) Image processing apparatus, image processing method, and program
US8830308B2 (en) Image management apparatus, image management method and computer-readable recording medium associated with medical images
US11776692B2 (en) Training data collection apparatus, training data collection method, program, training system, trained model, and endoscopic image processing apparatus
CN104350742B (en) For showing the system and method for image stream
CN111214255B (en) Medical ultrasonic image computer-aided method
KR101840350B1 (en) Method and apparatus for aiding reading efficiency using eye tracking information in medical image reading processing
KR20120072961A (en) Method and apparatus for aiding image diagnosis using medical image, image diagnosis aiding system for performing the same
US11957300B2 (en) Information processing apparatus, control method, and program
JP2007275440A (en) Similar image retrieval system, method, and program
JP5698293B2 (en) Portable medical image display terminal and operating method thereof
US20140064563A1 (en) Image processing apparatus, method of controlling image processing apparatus and storage medium
JP2007289335A (en) Medical image diagnosis support device
JP7253152B2 (en) Information processing device, information processing method, and program
US20220172850A1 (en) Aggravation estimation system
JPWO2015194580A1 (en) Endoscope processor
WO2022080141A1 (en) Endoscopic imaging device, method, and program
JP6568375B2 (en) Ophthalmic information processing system, image processing apparatus, and image processing method
CN116849593A (en) Visual laryngoscope system with organ identification function and organ identification method
JP2003290194A (en) Display device for detection result of abnormal shadow

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant