WO2020075254A1 - Endoscope system and screen image generation method - Google Patents

Endoscope system and screen image generation method

Info

Publication number
WO2020075254A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
annotation
endoscope
endoscopic
unit
Prior art date
Application number
PCT/JP2018/037852
Other languages
English (en)
Japanese (ja)
Inventor
Mai OJIMA
Katsuyoshi ISHIBASHI
Original Assignee
Olympus Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corporation
Priority to PCT/JP2018/037852
Publication of WO2020075254A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/045: Control thereof
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00: Teaching not covered by other main groups of this subclass
    • G09B19/24: Use of tools

Definitions

  • The present invention relates to an endoscope system and a display image generation method executed in the endoscope system.
  • An endoscope system is known as a system used in the medical field.
  • The endoscope system performs processing such as capturing an image of the inside of a subject using an endoscope, generating an endoscopic image based on the imaging result, and displaying the generated endoscopic image on a monitor.
  • Among the functions of the endoscope system is a function of superimposing an arrow mark on the endoscopic image displayed on the monitor and moving the arrow mark in response to keyboard operations.
  • This function is intended to be used, for example, when an instructor gives instructions to a doctor performing an endoscopy, when an instructor guides a trainee or an inexperienced doctor, or when a doctor gives an explanation to a patient.
  • Patent Document 1 proposes an endoscope system capable of notifying a user of the inclination of a subject in the depth direction of an image and the size of the subject.
  • Patent Document 2 proposes an electronic endoscope system capable of directly displaying a still image edited using a display means different from the main monitor on the main monitor.
  • An object of the present invention is to provide an endoscope system and a display image generation method that allow drawing information, arbitrarily drawn by a user through an easy operation on a terminal device, to be superimposed on an endoscopic image displayed on a display device.
  • A first aspect of the present invention is an endoscope system including an endoscope device and a terminal device. The endoscope device includes an image generation unit that generates an endoscopic image, a display image generation unit that generates a first display image based on the endoscopic image, and a first communication unit that communicates with the terminal device. The terminal device includes a second communication unit that communicates with the endoscope device, a display unit that displays a second display image, a drawing operation unit that receives a drawing operation on the second display image displayed on the display unit, and a coordinate information generation unit that generates coordinate information related to the drawing operation. The first communication unit transmits the endoscopic image to the terminal device; the second communication unit receives the endoscopic image transmitted from the first communication unit, and the display unit displays the endoscopic image as the second display image. The second communication unit transmits the coordinate information to the endoscope device; the first communication unit receives the coordinate information transmitted from the second communication unit, and the display image generation unit generates, as the first display image, a first superimposed image in which drawing information based on the coordinate information is superimposed on the endoscopic image.
  • A second aspect of the present invention is characterized in that, in the first aspect, a display device for displaying the first display image is further provided.
  • In a third aspect of the present invention, the first communication unit further transmits the first display image to the terminal device, and the second communication unit further receives the first display image transmitted from the first communication unit.
  • A fourth aspect of the present invention is characterized in that, in the third aspect, the display unit further displays the first display image as the second display image.
  • In a fifth aspect of the present invention, the terminal device further includes a superimposing unit that generates a second superimposed image in which drawing information based on the coordinate information is superimposed on the endoscopic image or the first display image transmitted from the first communication unit, and the display unit further displays the second superimposed image as the second display image.
  • In a sixth aspect of the present invention, the terminal device further includes a superimposing unit that generates a second superimposed image in which drawing information based on the coordinate information is superimposed on the endoscopic image transmitted from the first communication unit, and the display unit further displays the second superimposed image as the second display image.
  • In a seventh aspect of the present invention, the coordinate information generation unit further assigns identification information to each piece of coordinate information related to the drawing operation, and a plurality of pieces of the drawing information are managed based on the identification information.
  • An eighth aspect of the present invention is characterized in that, in the seventh aspect, the managed drawing information can be hidden, and the drawing information to be hidden is selected using the identification information.
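As an illustrative sketch only (not the patented implementation), the identification-information management of the seventh and eighth aspects could look like the following; `AnnotationManager`, `add`, `hide`, and `visible` are all assumed names:

```python
from itertools import count

class AnnotationManager:
    """Manage pieces of drawing information by identification information,
    allowing individual pieces to be hidden by selecting their identifier."""

    def __init__(self):
        self._ids = count(1)
        self.items = {}                 # id -> {"coords": ..., "hidden": bool}

    def add(self, coords):
        ann_id = next(self._ids)        # identification information
        self.items[ann_id] = {"coords": coords, "hidden": False}
        return ann_id

    def hide(self, ann_id):             # select drawing info to hide by id
        self.items[ann_id]["hidden"] = True

    def visible(self):
        """Identifiers of the drawing information that is still displayed."""
        return [i for i, a in self.items.items() if not a["hidden"]]

mgr = AnnotationManager()
a = mgr.add([(0, 0), (1, 1)])
b = mgr.add([(5, 5), (6, 6)])
mgr.hide(a)                             # only the second annotation remains visible
```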
  • A ninth aspect of the present invention is a display image generation method executed in an endoscope system having an endoscope device and a terminal device. The endoscope device generates an endoscopic image and transmits it to the terminal device. The terminal device receives and displays the endoscopic image, receives a drawing operation on the displayed endoscopic image, generates coordinate information related to the drawing operation, and transmits the coordinate information to the endoscope device. The endoscope device receives the coordinate information and generates, as a display image, a superimposed image in which drawing information based on the coordinate information is superimposed on the endoscopic image.
  • According to the present invention, it is possible to superimpose drawing information, arbitrarily drawn by the user through an easy operation on the terminal device, on the endoscopic image displayed on the display device.
  • FIG. 1 is a diagram showing a configuration example of the endoscope system according to the first embodiment. FIG. 2 is a diagram showing a processing example of the coordinate information generation unit, the annotation image generation unit, and the superposition unit in the tablet terminal. FIG. 3 is a flowchart showing an example of processing executed in the endoscope system according to the first embodiment. FIG. 4 is a diagram showing a configuration example of the endoscope system according to the second embodiment. FIG. 5 is a flowchart showing an example of processing executed in the endoscope system according to the second embodiment. FIG. 6 is a diagram showing a configuration example of the endoscope system according to the third embodiment.
  • FIG. 1 is a diagram showing a configuration example of the endoscope system according to the first embodiment.
  • The endoscope system 1 includes a videoscope 2, an endoscope video processor (hereinafter referred to as "endoscope processor") 3, an endoscope observation monitor 4, and a tablet terminal device (hereinafter referred to as "tablet terminal") 5.
  • Each of the videoscope 2 and the endoscope observation monitor 4 is connected to the endoscope processor 3.
  • The endoscope processor 3 and the tablet terminal 5 can communicate with each other wirelessly, and the input and output of signals and information between the endoscope processor 3 and the tablet terminal 5 described below are carried out by wireless transmission and reception.
  • The videoscope 2 captures an image of the inside of the subject and outputs an image pickup signal corresponding to the imaging result to the endoscope processor 3.
  • An imaging device included in the videoscope 2 performs the imaging of the inside of the subject.
  • The imaging device is an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • The endoscope processor 3 includes an endoscopic image generation unit 31, an annotation image generation unit 32, an endoscope observation screen generation unit 33, and a video output unit 34.
  • The endoscope processor 3 is an example of an endoscope device.
  • The endoscopic image generation unit 31 generates an endoscopic image based on the image pickup signal input from the videoscope 2, and outputs the endoscopic image to the endoscope observation screen generation unit 33.
  • Each time coordinate information is input from the coordinate information generation unit 51 of the tablet terminal 5, the annotation image generation unit 32 generates an annotation image based on the input coordinate information and the corresponding attribute information (described in detail later), combines it with the previously generated annotation image, and outputs the combined annotation image to the endoscope observation screen generation unit 33. However, if no previously generated annotation image exists yet, the annotation image generated from the input coordinate information and the corresponding attribute information is output to the endoscope observation screen generation unit 33 as it is, without being combined. Note that the annotation image generation unit 32 includes a memory (not shown) that stores the most recently generated annotation image for this processing. The annotation image is an example of drawing information.
  • In other words, each time coordinate information is input, the annotation image generation unit 32 generates an annotation image in which the annotations corresponding to the coordinate information input so far are drawn, and outputs it to the endoscope observation screen generation unit 33.
  • In the present embodiment, the annotation is a line. The annotation image is generated by connecting the vertices given by the coordinate information with a line, and the type and color of the line are determined based on the line type and color information included in the corresponding attribute information.
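The line-drawing behavior described above can be sketched as follows. This is a hypothetical illustration: the sparse pixel-dictionary image representation, the function names, and the use of Bresenham's line algorithm are assumptions, not details taken from the patent.

```python
def bresenham(p0, p1):
    """Integer points on the segment p0-p1 (Bresenham's line algorithm)."""
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    pts = []
    while True:
        pts.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return pts

def generate_annotation_image(coords, attr, previous=None):
    """Connect the vertices in `coords` with a line of the color given in
    the attribute info, and combine the result with the previously
    generated annotation image, as unit 32 is described as doing."""
    image = dict(previous) if previous else {}      # pixel -> color
    for p0, p1 in zip(coords, coords[1:]):
        for pt in bresenham(p0, p1):
            image[pt] = attr["color"]
    return image

# Each incoming batch of coordinate information extends the stored image.
img = generate_annotation_image([(0, 0), (3, 3)], {"color": "yellow"})
img = generate_annotation_image([(3, 3), (3, 6)], {"color": "yellow"}, previous=img)
```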
  • The endoscope observation screen generation unit 33 generates, as the endoscopic observation screen image, a superimposed image in which the annotation image input from the annotation image generation unit 32 is superimposed on the endoscopic image input from the endoscopic image generation unit 31, and outputs the endoscopic observation screen image to the video output unit 34.
  • When no annotation image is input from the annotation image generation unit 32, the endoscopic image input from the endoscopic image generation unit 31 is output to the video output unit 34 as the endoscopic observation screen image as it is.
  • Here, the case where no annotation image is input from the annotation image generation unit 32 refers to the case where no annotation image has been input from the annotation image generation unit 32 from the time the endoscope processor 3 was powered on until the present time.
  • The video output unit 34 outputs a video signal corresponding to the endoscopic observation screen image input from the endoscope observation screen generation unit 33 to the endoscope observation monitor 4 and the tablet terminal 5.
  • The endoscope observation monitor 4 is, for example, a liquid crystal display device.
  • The endoscope observation monitor 4 includes a video display unit 41.
  • The video display unit 41 displays the video (endoscopic observation screen image) corresponding to the video signal input from the video output unit 34 of the endoscope processor 3.
  • In this way, a live image of the inside of the subject, or a live image with annotations, is displayed on the endoscope observation monitor 4, and the user B observes and confirms it.
  • The user B is, for example, a doctor who receives instructions from an instructing doctor, or a patient who receives an explanation from a doctor.
  • The tablet terminal 5 includes a coordinate information generation unit 51, an annotation image generation unit 52, a superposition unit 53, and a video display unit 54.
  • The video display unit 54 is a display device with a touch panel.
  • The touch panel of the video display unit 54 is an example of a drawing operation unit that receives a drawing operation.
  • The coordinate information generation unit 51 generates coordinate information related to the annotation operation of the user A performed on the touch panel of the video display unit 54, and outputs the coordinate information to the annotation image generation unit 52 and the endoscope processor 3.
  • The user A is, for example, an instructing doctor or a doctor who gives an explanation to a patient.
  • The annotation operation is an operation of drawing an annotation on the video (image) displayed on the video display unit 54, and is a touch operation on the touch panel of the video display unit 54.
  • The coordinate information related to the annotation operation is information about the position touched on the touch panel of the video display unit 54 (hereinafter referred to as the "touch position").
  • In the coordinate information generation unit 51, the generation and output of coordinate information are performed as follows.
  • The coordinate information generation unit 51 acquires the coordinates (vertex coordinates) of the touch position every first predetermined period. Then, every second predetermined period (> first predetermined period), it generates and outputs coordinate information containing the coordinates acquired during the latest second predetermined period. This makes it possible to generate and output coordinate information in which the coordinates of a plurality of touch positions are collected.
  • When no coordinates are acquired for the third predetermined period (< second predetermined period) after the time the last coordinates were acquired, the coordinate information generation unit 51 regards those last-acquired coordinates as the end point coordinates of the annotation drawing.
  • When the first coordinates after the tablet terminal 5 is powered on are acquired, or when the first coordinates are acquired after no coordinates have been acquired for the third predetermined period, the coordinate information generation unit 51 regards those coordinates as the start point coordinates of the annotation drawing.
  • The coordinates from the start point coordinates to the end point coordinates of the annotation drawing constitute one unit of annotation drawing, and when the coordinate information including the start point coordinates is output, annotation attribute information is added to that coordinate information.
  • The attribute information includes information about the type and color of the annotation line. The type and color of the line can be arbitrarily selected by the user A through an operation on the tablet terminal 5.
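The sampling and batching behavior described above can be sketched as a small state machine. All names, the event-dictionary shape, and the default attribute values here are illustrative assumptions, not the patented implementation.

```python
class CoordinateBatcher:
    """Collects touch coordinates sampled every first predetermined period,
    emits one batch of coordinate information every second predetermined
    period, and treats a silence of at least the third predetermined period
    as the boundary between annotation-drawing units."""

    def __init__(self, third_period):
        self.third_period = third_period
        self.last_time = None
        self.pending = []

    def sample(self, t, xy, attr=None):
        """Called every first predetermined period with the touch position.
        Attribute info (line type/color) rides with the start point only."""
        is_start = (self.last_time is None
                    or t - self.last_time >= self.third_period)
        point = {"t": t, "xy": xy, "start": is_start}
        if is_start:
            point["attr"] = attr or {"type": "solid", "color": "yellow"}
        self.pending.append(point)
        self.last_time = t

    def flush(self):
        """Called every second predetermined period: emit the batch of
        coordinates collected since the previous flush."""
        batch, self.pending = self.pending, []
        return batch

batcher = CoordinateBatcher(third_period=0.5)
batcher.sample(0.0, (10, 10))
batcher.sample(0.1, (12, 14))
first_batch = batcher.flush()     # start point carries the attribute info
batcher.sample(1.0, (40, 40))     # 0.9 s of silence: a new drawing unit starts
second_batch = batcher.flush()
```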
  • When coordinate information is input from the coordinate information generation unit 51, the annotation image generation unit 52 generates an annotation image based on the coordinate information input during the latest fourth predetermined period (< second predetermined period) and the corresponding attribute information, and outputs the annotation image to the superposition unit 53.
  • The fourth predetermined period is provided so that the annotation operation of the user A is reflected on the display of the tablet terminal 5 (video display unit 54) in real time. If the annotation image were generated and superimposed only in the endoscope processor 3, a time lag could occur between the annotation operation of the user A and its reflection on the display of the tablet terminal 5, and the user A might feel a sense of visual discomfort. Therefore, to avoid this discomfort, the fourth predetermined period is set to a period corresponding to the time of this time lag, or to a somewhat longer time.
  • The superposition unit 53 generates a superimposed image in which the annotation image input from the annotation image generation unit 52 is superimposed on the endoscopic observation screen image corresponding to the video signal input from the endoscope processor 3, and outputs a video signal corresponding to the superimposed image to the video display unit 54.
  • When no annotation image is input from the annotation image generation unit 52, the video signal input from the endoscope processor 3 is output to the video display unit 54 as it is.
  • Here, the case where no annotation image is input from the annotation image generation unit 52 refers to the case where no annotation image has been input from the annotation image generation unit 52 from the time the tablet terminal 5 was powered on until the present time.
  • The video display unit 54 displays the video (the superimposed image or the endoscopic observation screen image) according to the video signal input from the superposition unit 53. Accordingly, the live image of the inside of the subject, or the live image with annotations, is displayed on the tablet terminal 5, and the user A can perform annotation operations on that video and check the operation results.
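A minimal sketch of this superposition, assuming a 2-D list for the screen image and a sparse pixel dictionary for the annotation image (both representations, and the function name, are assumptions):

```python
def superimpose(frame, annotation):
    """Overlay annotation pixels on a frame: pixels drawn in the annotation
    image replace the corresponding frame pixels; everywhere else the live
    frame shows through, as described for the superposition unit 53."""
    out = [row[:] for row in frame]          # copy so the live frame is kept
    for (x, y), color in annotation.items():
        if 0 <= y < len(out) and 0 <= x < len(out[0]):
            out[y][x] = color                # annotation pixel wins
    return out

frame = [["live"] * 4 for _ in range(3)]
annotated = superimpose(frame, {(1, 1): "yellow", (2, 1): "yellow"})
```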
  • FIG. 2 is a diagram showing a processing example of the coordinate information generation unit 51, the annotation image generation unit 52, and the superposition unit 53 in the tablet terminal 5.
  • As shown in FIG. 2, when the user A starts an annotation operation on the tablet terminal 5, the coordinate information generation unit 51 generates and outputs coordinate information corresponding to the annotation operation (coordinate information 1, coordinate information 2, ..., coordinate information N) every second predetermined period.
  • Every time coordinate information is input, the annotation image generation unit 52 generates and outputs an annotation image based on the coordinate information input during the latest fourth predetermined period and the corresponding attribute information. For example, when the coordinate information N is input, the annotation image 501 is generated and output based on the coordinate information N-1 and the coordinate information N, which are the coordinate information input during the fourth predetermined period preceding the time the coordinate information N was input, and the corresponding attribute information.
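The "latest fourth predetermined period" selection can be sketched as a simple time-window filter over timestamped batches of coordinate information (the batch representation and names are assumptions):

```python
def select_recent(batches, now, fourth_period):
    """Keep only the coordinate-information batches whose receipt time falls
    within the latest fourth predetermined period; older batches are already
    reflected in the frame returned by the endoscope processor."""
    return [b for b in batches if now - b["t"] <= fourth_period]

batches = [{"t": 0.0, "id": 1}, {"t": 0.5, "id": 2}, {"t": 0.9, "id": 3}]
recent = select_recent(batches, now=1.0, fourth_period=0.5)
```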
  • The superposition unit 53 generates the superimposed image 503 in which the input annotation image 501 is superimposed on the endoscopic observation screen image 502 corresponding to the video signal input from the endoscope processor 3.
  • Note that the endoscopic observation screen image 502 is an endoscopic observation screen image (a live image with annotations) on which the endoscope processor 3 has already superimposed the annotation image generated based on the coordinate information previously output from the coordinate information generation unit 51.
  • The superimposed image 503 is then displayed on the tablet terminal 5.
  • In this way, the annotation operation of the user A is reflected on the display of the tablet terminal 5 in real time, without causing the user A to feel the visual discomfort of the above-mentioned time lag.
  • FIG. 3 is a flowchart showing an example of processing executed in the endoscope system 1 according to the first embodiment. This processing is executed when the operation mode of the endoscope system 1 is the live mode.
  • The live mode is a mode in which a live image of the inside of the subject can be observed.
  • In the live mode, the endoscope processor 3 first generates an endoscopic image (live image) based on the image pickup signal input from the videoscope 2, and outputs a video signal corresponding to the endoscopic image to the endoscope observation monitor 4 and the tablet terminal 5 as a video signal (live video) corresponding to the endoscopic observation screen image (S301). Thereafter, S301 is repeated until S311, described later, is started.
  • The endoscope observation monitor 4 displays an endoscopic observation screen image according to the video signal output from the endoscope processor 3 in S301 (S302). As a result, live video display is performed on the endoscope observation monitor 4.
  • Similarly, the tablet terminal 5 displays the endoscopic observation screen image according to the video signal output from the endoscope processor 3 in S301 (S303). As a result, live video display is performed on the tablet terminal 5.
  • The tablet terminal 5 then determines whether an annotation operation has been performed (S304); when the determination result is "none", the process returns to S303. On the other hand, when the determination result of S304 is "present", the tablet terminal 5 starts a process of generating coordinate information related to the annotation operation every second predetermined period (S305). In this process, attribute information including information about the type and color of the line is added to the coordinate information including the start point coordinates of the annotation drawing.
  • After S305 is started, every time coordinate information is generated in S305, the tablet terminal 5 generates an annotation image as described above for the annotation image generation unit 52 (S306), generates a superimposed image in which the annotation image is superimposed on the endoscopic observation screen image (live video) corresponding to the video signal output from the endoscope processor 3 (S307), and displays the superimposed image (S308).
  • Note that until S311, described later, is performed, the endoscopic observation screen image corresponding to the video signal output from the endoscope processor 3 is an endoscopic observation screen image on which no annotation image is superimposed (the video signal output in S301); after S311 is performed, it is an endoscopic observation screen image on which an annotation image is superimposed (the video signal output in S312). Therefore, in the latter case, live video display with annotations is performed in S308.
  • Also, every time coordinate information is generated in S305, the tablet terminal 5 outputs the generated coordinate information to the endoscope processor 3 (S309).
  • Each time the coordinate information output from the tablet terminal 5 in S309 is input, the endoscope processor 3 generates an annotation image as described above for the annotation image generation unit 32 (S310), generates a superimposed image by superimposing the annotation image on the endoscopic image (live image) generated based on the image pickup signal input from the videoscope 2 (S311), and outputs a video signal corresponding to the superimposed image to the endoscope observation monitor 4 and the tablet terminal 5 as a video signal (live video) corresponding to the endoscopic observation screen image (S312).
  • The endoscope observation monitor 4 displays the endoscopic observation screen image according to the video signal output from the endoscope processor 3 in S312 (S313). As a result, live video display with annotations is performed on the endoscope observation monitor 4.
  • S307 and S308, performed by the tablet terminal 5 after S312, are as described above.
  • As described above, according to the first embodiment, the user A can superimpose arbitrary annotations on the endoscopic observation screen image displayed on the endoscope observation monitor 4 through annotation operations on the tablet terminal 5, and the user B can confirm their contents. Further, since the annotation operation on the tablet terminal 5 is a touch operation on the touch panel, it is easy, unlike a keyboard operation.
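The overall round trip of FIG. 3 can be sketched end to end. The classes below are illustrative stand-ins for the endoscope processor 3 and the tablet terminal 5, with assumed names and interfaces, not the actual device protocol:

```python
class Processor:
    """Stand-in for the endoscope processor 3: streams frames and
    accumulates drawing information received from the tablet."""

    def __init__(self):
        self.annotation = {}            # accumulated drawing information

    def frame(self):
        """S301 / S311-S312: the endoscopic observation screen image
        (live image plus any accumulated annotation)."""
        return {"image": {"live": True}, "annotation": dict(self.annotation)}

    def receive_coordinates(self, coords, color):
        """S310-S311: turn received coordinate information into drawing
        information superimposed on subsequent frames."""
        for xy in coords:
            self.annotation[xy] = color

class Tablet:
    """Stand-in for the tablet terminal 5."""

    def __init__(self, processor):
        self.processor = processor

    def draw(self, coords, color="yellow"):
        """S304-S305, S309: a touch drawing operation becomes coordinate
        information sent to the processor."""
        self.processor.receive_coordinates(coords, color)

proc = Processor()
tab = Tablet(proc)
tab.draw([(1, 1), (2, 2)])              # user A's annotation operation
shown = proc.frame()                    # what monitor 4 displays (S313)
```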
  • Note that the annotation image generation unit 52 of the tablet terminal 5 may have the same configuration as the annotation image generation unit 32 of the endoscope processor 3 and may perform the same processing.
  • The second embodiment is a modification of the first embodiment. Therefore, in the description of the second embodiment, the same elements as those described in the first embodiment are designated by the same reference numerals, and detailed description thereof is omitted.
  • FIG. 4 is a diagram showing a configuration example of the endoscope system according to the second embodiment.
  • As shown in FIG. 4, the endoscope system 1 according to the second embodiment has a configuration in which the annotation image generation unit 52 and the superposition unit 53 are removed from the tablet terminal 5 of the endoscope system 1 shown in FIG. 1.
  • Accordingly, the coordinate information generation unit 51 of the tablet terminal 5 outputs the generated coordinate information only to the endoscope processor 3.
  • Also, the video signal output from the video output unit 34 of the endoscope processor 3 to the tablet terminal 5 is input directly to the video display unit 54 of the tablet terminal 5.
  • FIG. 5 is a flowchart showing an example of processing executed in the endoscope system 1 according to the second embodiment.
  • As shown in FIG. 5, S306 and S307 of the flowchart shown in FIG. 3 are excluded from this flowchart. Accordingly, in S308, the tablet terminal 5 displays the endoscopic observation screen image corresponding to the video signal output from the endoscope processor 3 in S312. As a result, live video display with annotations is performed on the tablet terminal 5.
  • The rest is the same as in the first embodiment.
  • According to the second embodiment, the endoscope system 1 can be realized with a simpler configuration. This configuration is particularly effective when the endoscope system 1 has performance enabling processing fast enough that the user A does not feel the visual discomfort of the above-mentioned time lag.
  • The third embodiment is a modification of the first embodiment. Therefore, in the description of the third embodiment, the same elements as those described in the first embodiment are designated by the same reference numerals, and detailed description thereof is omitted.
  • FIG. 6 is a diagram showing a configuration example of the endoscope system 1 according to the third embodiment.
  • As shown in FIG. 6, the configuration of the endoscope system 1 according to the third embodiment is basically the same as that shown in FIG. 1, but the contents of the processing performed by the endoscope observation screen generation unit 33 and the video output unit 34 of the endoscope processor 3 and by the annotation image generation unit 52 of the tablet terminal 5 differ from those of the first embodiment.
  • The endoscope observation screen generation unit 33 of the endoscope processor 3 outputs a first endoscopic observation screen image and a second endoscopic observation screen image to the video output unit 34.
  • The first endoscopic observation screen image is the endoscopic image input from the endoscopic image generation unit 31.
  • The second endoscopic observation screen image is a superimposed image in which the annotation image input from the annotation image generation unit 32 is superimposed on the endoscopic image input from the endoscopic image generation unit 31.
  • However, when no annotation image is input from the annotation image generation unit 32, the endoscopic image input from the endoscopic image generation unit 31 is used as the second endoscopic observation screen image as it is.
  • The second endoscopic observation screen image corresponds to the endoscopic observation screen image that the endoscope observation screen generation unit 33 according to the first embodiment outputs to the video output unit 34.
  • The video output unit 34 outputs a video signal corresponding to the first endoscopic observation screen image input from the endoscope observation screen generation unit 33 to the tablet terminal 5, and outputs a video signal corresponding to the second endoscopic observation screen image input from the endoscope observation screen generation unit 33 to the endoscope observation monitor 4.
  • In the third embodiment, the annotation image generation unit 52 of the tablet terminal 5 has the same configuration as the annotation image generation unit 32 of the endoscope processor 3 and performs the same processing.
  • The rest is the same as in the first embodiment.
  • FIG. 7 is a flowchart showing an example of processing executed in the endoscope system 1 according to the third embodiment. As shown in FIG. 7, in this flowchart, the output of the video signal to the tablet terminal 5, which is performed in S312 of the flowchart shown in FIG. 3, is not performed.
  • However, the annotation image generation performed in S306 is performed as described above for the annotation image generation unit 52; that is, the same processing as the annotation image generation performed in S310 is performed.
  • In S307, a superimposed image is generated in which the annotation image generated in S306 is superimposed on the endoscopic observation screen image (live video) corresponding to the video signal output from the endoscope processor 3 in S301.
  • Note that the endoscopic observation screen image corresponding to the video signal output to the tablet terminal 5 in S301 is the above-mentioned first endoscopic observation screen image, and the endoscopic observation screen image corresponding to the video signal output to the endoscope observation monitor 4 in S301 and S312 is the above-mentioned second endoscopic observation screen image.
  • the fourth embodiment is a modification of the first embodiment. Therefore, in the description of the fourth embodiment, the same elements as those described in the first embodiment will be designated by the same reference numerals, and detailed description thereof will be omitted.
  • FIG. 8 is a diagram showing a structural example of the endoscope system 1 according to the fourth embodiment.
  • in addition to the configuration shown in FIG. 1, the tablet terminal 5 further includes a freeze instructing section 55. Further, the contents of the processing performed by the endoscope observation screen generation unit 33 and the video output unit 34 of the endoscope processor 3, and by the annotation image generation unit 52 and the superposition unit 53 of the tablet terminal 5, differ from those of the first embodiment.
  • the freeze instruction unit 55 of the tablet terminal 5 outputs a freeze instruction signal to the endoscope processor 3 in response to a freeze operation on the tablet terminal 5.
  • the freeze operation is performed by a touch operation on the touch panel of the video display unit 54.
  • when the freeze instruction signal is output, the operation mode of the endoscope system 1 transitions from the live mode to the freeze mode.
  • the freeze mode is a mode in which a freeze image and a live image in the subject can be observed.
  • the endoscope observation screen generation section 33 of the endoscope processor 3 performs the following processing.
  • a freeze image is generated from the endoscopic image input from the endoscopic image generating unit 31, and is stored in a memory (not shown).
  • the freeze image can also be referred to as an endoscopic image input from the endoscopic image generating unit 31 at the time when the freeze instruction signal is input.
  • a superimposed image in which the annotation image input from the annotation image generation unit 32 is superimposed on the freeze image read from the memory is generated (A).
  • a two-image display screen image in which the generated superimposed image and the endoscopic image input from the endoscopic image generation unit 31 are arranged side by side is generated (B).
  • if no annotation image has been input, a two-image display screen image is generated in which the freeze image read from the memory, instead of the superimposed image, is arranged side by side with the endoscopic image.
  • the generated two-image display screen image and the freeze image read from the memory are output to the video output unit 34 (C).
  • the above (A) to (C) are repeated until the next freeze instruction signal is input, or until the freeze mode is canceled and the operation mode of the endoscope system 1 transits to the live mode.
  • the freeze mode can be released by, for example, an operation on the touch panel of the video display unit 54 of the tablet terminal 5.
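The per-frame processing (A) to (C) described above can be sketched as follows. This is a minimal illustration using NumPy arrays; the function name, the image formats, the boolean annotation mask, and the red annotation color are assumptions for the sketch, not part of the embodiment:

```python
import numpy as np

def freeze_mode_iteration(freeze_image, live_image, annotation_mask=None):
    """One pass of the freeze-mode processing, steps (A)-(C) (sketch).

    Images are H x W x 3 uint8 arrays; annotation_mask is an optional
    H x W boolean array marking annotated pixels (hypothetical format).
    """
    if annotation_mask is not None:
        # (A) superimpose the annotation image on the stored freeze image
        left = freeze_image.copy()
        left[annotation_mask] = (255, 0, 0)  # assumed annotation color
    else:
        # no annotation yet: show the freeze image itself on the left
        left = freeze_image
    # (B) arrange the freeze image (with or without annotation) and the
    # live endoscopic image side by side as a two-image display screen
    two_image_screen = np.hstack([left, live_image])
    # (C) output the two-image display screen image and the freeze image
    return two_image_screen, freeze_image
```

Repeating this per frame until the next freeze instruction signal, or until freeze mode is released, reproduces the loop described above.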
  • when the two-image display screen image and the freeze image are input from the endoscope observation screen generation unit 33, the video output unit 34 outputs a video signal corresponding to the input two-image display screen image to the endoscope observation monitor 4, and outputs the image file of the freeze image to the tablet terminal 5. However, the image file is output to the tablet terminal 5 only once for a given freeze image; the image file of the same freeze image is not output repeatedly.
  • a two-image display screen image corresponding to the video signal is displayed on the endoscope observation monitor 4 (video display unit 41).
  • a two-image display of a freeze image with annotation and a live image is performed.
  • the annotation image generation unit 52 of the tablet terminal 5 has the same configuration as the annotation image generation unit 32 of the endoscope processor 3 and performs the same processing.
  • the superimposing unit 53 further generates a superimposed image in which the annotation image input from the annotation image generation unit 52 is superimposed on the freeze image of the image file input from the video output unit 34 of the endoscope processor 3, and outputs a video signal corresponding to the superimposed image to the video display unit 54.
  • a video signal corresponding to the freeze image of the image file input from the endoscope processor 3 is output to the video display unit 54.
  • the image (superimposed image or freeze image) corresponding to the video signal is displayed on the tablet terminal 5 (video display unit 54); for example, the freeze image with annotation is displayed.
  • in FIG. 8, the rest is the same as in the first embodiment.
  • FIGS. 9A and 9B are flowcharts showing an example of processing executed in the endoscope system 1 according to the fourth embodiment.
  • if no freeze operation is performed on the tablet terminal 5, the process returns to S301.
  • when the freeze operation is performed, the tablet terminal 5 outputs a freeze instruction signal to the endoscope processor 3.
  • when the freeze instruction signal is input from the tablet terminal 5, the endoscope processor 3 generates a freeze image from the endoscopic image (live video) generated based on the image pickup signal input from the videoscope 2 (S902). Then, the image file of the generated freeze image is output to the tablet terminal 5 (S903). In addition, a two-image display screen image in which the generated freeze image and the endoscopic image (live video) generated based on the image pickup signal input from the videoscope 2 are arranged side by side is generated (S904), and a video signal corresponding to the two-image display screen image is output to the endoscope observation monitor 4.
  • the endoscope observation monitor 4 displays a two-image display screen image corresponding to the video signal output from the endoscope processor 3 (S905). As a result, two images of a freeze image and a live image (so-called two-screen display) are displayed on the endoscope observation monitor 4.
  • the tablet terminal 5 displays the freeze image of the image file output from the endoscope processor 3 in S903 (S906), and performs S304 to S306 as in the flowchart shown in FIG. 3. However, in S306, the annotation image is generated as described above for the annotation image generation unit 52.
  • the tablet terminal 5 generates a superimposed image by superimposing the annotation image generated in S306 on the freeze image (S907), and displays the superimposed image (S908). As a result, the freeze image with annotation is displayed on the tablet terminal 5.
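The superimposition of an annotation image on the freeze image (S907) amounts to compositing a mostly transparent annotation layer onto a base image. A minimal alpha-blending sketch, where the RGBA layer format and the function name are assumptions:

```python
import numpy as np

def superimpose(base_rgb, annotation_rgba):
    """Blend an RGBA annotation layer onto an RGB base image (sketch).

    Pixels with alpha 0 leave the base image untouched; pixels with
    alpha 255 are fully replaced by the annotation color.
    """
    alpha = annotation_rgba[..., 3:4].astype(np.float32) / 255.0
    out = (annotation_rgba[..., :3].astype(np.float32) * alpha
           + base_rgb.astype(np.float32) * (1.0 - alpha))
    return out.astype(np.uint8)
```

The same blending applies whether the base image is a freeze image (this embodiment) or a live frame (the first embodiment).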
  • after S305 starts, the tablet terminal 5 performs S309 and the endoscope processor 3 performs S310, as in the flowchart shown in FIG. 3. Then, the endoscope processor 3 generates a superimposed image by superimposing the annotation image generated in S310 on the freeze image generated in S902 (S909), generates a two-image display screen image in which the superimposed image and the endoscopic image (live video) generated based on the image pickup signal input from the videoscope 2 are arranged side by side (S910), and outputs a video signal corresponding to the two-image display screen image to the endoscope observation monitor 4.
  • the endoscope observation monitor 4 displays a two-image display screen image corresponding to the video signal output from the endoscope processor 3 in S910 (S911). As a result, the endoscopic observation monitor 4 displays two images (a so-called two-screen display) of the freeze image with annotation and the live image.
  • user A can perform the freeze operation on the tablet terminal 5 and then perform an annotation operation, so that an arbitrary annotation is superimposed on the freeze image displayed on the endoscope observation monitor 4 together with the live video, and the content can be confirmed by user B.
  • the fifth embodiment is a modification of the first embodiment. Therefore, in the description of the fifth embodiment, the same elements as those described in the first embodiment will be denoted by the same reference numerals, and detailed description thereof will be omitted.
  • FIG. 10 is a diagram showing a structural example of the endoscope system 1 according to the fifth embodiment.
  • the tablet terminal 5 further includes a deletion instruction unit 56 in addition to the configuration shown in FIG. 1. Further, the processing content of the coordinate information generation unit 51 of the tablet terminal 5 differs from that of the first embodiment. Furthermore, the configurations of the annotation image generation unit 32 of the endoscope processor 3 and of the annotation image generation unit 52 of the tablet terminal 5 differ from those of the first embodiment.
  • the coordinate information generation unit 51 further issues an annotation ID (identifier) every time it generates coordinate information including the start point position coordinates of an annotation drawing. That is, an annotation ID is issued for each set of coordinate information relating to one unit of annotation drawing. Then, when outputting the coordinate information including the start point position coordinates, the coordinate information generation unit 51 adds the annotation ID to that coordinate information before outputting it. Accordingly, the annotation ID and the attribute information are added to the coordinate information including the start point position coordinates.
  • the annotation ID is a serial number.
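The issuing of a serial number per drawing unit can be sketched as follows; the class and field names are illustrative assumptions:

```python
import itertools

class CoordinateInfoGenerator:
    """Issues a fresh serial number each time a drawing unit starts (sketch)."""

    def __init__(self):
        self._serials = itertools.count(1)  # serial numbers: 1, 2, 3, ...
        self._current = None

    def start_point(self, x, y, attributes):
        # Coordinate information containing the start point coordinates
        # gets a new annotation ID plus the attribute information.
        self._current = next(self._serials)
        return {"id": self._current, "xy": (x, y), "attr": attributes}

    def next_point(self, x, y):
        # Later points of the same drawing unit carry the same ID and
        # no attribute information.
        return {"id": self._current, "xy": (x, y)}
```

All coordinate information emitted between two start points thus shares one serial number, which is what later lets a whole annotation be hidden at once.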
  • the annotation image generation unit 32 of the endoscope processor 3 includes an image generation block 321, an image storage unit 322, an effective figure management unit 323, and a synthesis unit 324.
  • each time coordinate information with a serial number and attribute information is input from the coordinate information generation unit 51 of the tablet terminal 5, the image generation block 321 starts a process of generating an annotation image for one unit of annotation drawing (hereinafter referred to as the "1-unit annotation image generation process").
  • in the 1-unit annotation image generation process, an annotation image is first generated based on the coordinate information to which the attribute information is added. After that, every time coordinate information is input from the coordinate information generation unit 51 of the tablet terminal 5, the annotation image generated based on the input coordinate information and the corresponding attribute information is combined with the previously generated annotation image.
  • the image generation block 321 includes a memory (not shown) that stores the latest annotation image generated for this processing.
  • the image storage unit 322 stores the annotation image associated with the serial number. That is, the annotation image for each serial number is stored.
  • the effective figure management unit 323 stores the serial number and the effective figure flag in association with each other.
  • the initial value of the effective figure flag is set to "1”.
  • the effective figure management unit 323 updates the value of the effective figure flag of the serial number included in the deletion instruction signal to “0”.
  • the synthesizing unit 324 reads the serial numbers whose effective figure flag value is "1" from the effective figure management unit 323, reads the annotation images associated with those serial numbers from the image storage unit 322, combines them with the latest annotation image generated by the 1-unit annotation image generation process being executed in the image generation block 321, and outputs the resulting annotation image to the endoscope observation screen generation unit 33.
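The interplay of the image storage unit 322, the effective figure management unit 323, and the synthesizing unit 324 can be sketched with boolean masks standing in for annotation images; the class, its methods, and the mask representation are assumptions for the sketch:

```python
import numpy as np

class AnnotationStore:
    """Stores one annotation image per serial number, plus an effective
    figure flag per serial number (1 = shown, 0 = hidden). Sketch only."""

    def __init__(self, shape):
        self.shape = shape
        self.images = {}  # serial number -> annotation image (bool mask)
        self.flags = {}   # serial number -> effective figure flag

    def store(self, serial, image):
        self.images[serial] = image
        self.flags[serial] = 1  # initial value of the flag is "1"

    def delete(self, serial):
        self.flags[serial] = 0  # hide; the image itself is kept

    def synthesize(self, in_progress=None):
        # Combine only the annotation images whose flag is 1, plus the
        # annotation image of the drawing unit currently being drawn.
        out = np.zeros(self.shape, dtype=bool)
        for serial, image in self.images.items():
            if self.flags[serial] == 1:
                out |= image
        if in_progress is not None:
            out |= in_progress
        return out
```

Deleting an annotation therefore only changes a flag; the next synthesis simply omits that image.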
  • the deletion instruction unit 56 of the tablet terminal 5 outputs a deletion instruction signal including a serial number corresponding to the annotation to be deleted to the annotation image generation unit 52 and the endoscope processor 3 in accordance with the annotation deletion operation on the tablet terminal 5.
  • the annotation deletion operation is performed by a touch operation on the touch panel of the video display unit 54.
  • the touch operation is performed on the annotation displayed in the image (video) with annotation displayed on the video display unit 54.
  • the annotation image generation unit 52 of the tablet terminal 5 has the same configuration as the annotation image generation unit 32 described above and performs the same processing. In FIG. 10, the others are the same as those in the first embodiment.
  • FIGS. 11A and 11B are flowcharts showing an example of processing executed in the endoscope system 1 according to the fifth embodiment. As shown in FIGS. 11A and 11B, in this flowchart, S301 to S305 are first performed as in the flowchart shown in FIG. 3.
  • when the generated coordinate information includes the start point position coordinates, a serial number is further added to the coordinate information. Accordingly, the serial number and the attribute information are added to the coordinate information including the start point position coordinates.
  • the tablet terminal 5 outputs the generated coordinate information to the endoscope processor 3 every time the coordinate information is generated in S305, similarly to the flowchart shown in FIG. 3 (S309). Further, the tablet terminal 5 determines whether or not the annotation drawing is completed (S1101), and if the determination result is NO, this determination is repeated. Note that the end of the annotation drawing corresponds to the end of the 1-unit annotation image generation processing in the annotation image generation unit 52 described above.
  • when the annotation drawing is completed, the tablet terminal 5 stores the serial number and the effective figure flag whose value is "1" in association with each other in the annotation image generation unit 52 (S1102). That is, they are saved to memory.
  • the endoscope processor 3 generates an annotation image as described above for the image generation block 321 (S1102).
  • the endoscope processor 3 combines the latest annotation image generated by the 1-unit annotation image generation process being executed with the annotation images whose effective figure flag value is "1" (S1103), generates a superimposed image by superimposing the combined annotation image on the endoscopic image (live video) generated based on the imaging signal input from the videoscope 2 (S1104), and outputs a video signal corresponding to the superimposed image (live video with annotation) to the endoscope observation monitor 4 and the tablet terminal 5 (S1105).
  • the endoscope processor 3 determines whether or not the annotation drawing is completed (S1106), and if the determination result is NO, this determination is repeated. On the other hand, if the determination result in S1106 is YES, the endoscope processor 3 stores the annotation image in association with its serial number, and stores the serial number in association with an effective figure flag whose value is "1", as described above for the annotation image generation unit 32 (S1107). That is, they are saved to memory.
  • the endoscopic observation monitor 4 displays a superimposed image according to the video signal output in S1105 (S1108). As a result, live video display with annotation is performed on the endoscope observation monitor 4.
  • the tablet terminal 5 displays the superimposed image according to the video signal output in S1105 (S1109). As a result, live video display with annotation is performed on the tablet terminal 5.
  • the tablet terminal 5 determines whether or not the annotation deletion operation has been performed (S1110), and when the determination result is “none”, returns to S1109. On the other hand, when the determination result of S1110 is “present”, the tablet terminal 5 outputs an annotation deletion instruction signal according to the annotation deletion operation to the endoscope processor 3 (S1111).
  • the annotation deletion instruction signal includes a serial number corresponding to the annotation to be deleted.
  • the tablet terminal 5 updates the value of the valid figure flag stored in association with the serial number corresponding to the annotation to be deleted to “0” in the annotation image generation unit 52 (S1112). That is, the value of the effective figure flag is set to "0" and the value is stored in the memory.
  • the endoscope processor 3 updates to "0" the value of the effective figure flag stored in the annotation image generation unit 32 in association with the serial number included in the annotation deletion instruction signal output from the tablet terminal 5 in S1111 (S1113).
  • the endoscope processor 3 combines the latest annotation image generated by the 1-unit annotation image generation process being executed with the annotation images whose effective figure flag value is "1" (S1114), generates a superimposed image by superimposing the combined annotation image on the endoscopic image (live video) generated based on the image pickup signal input from the videoscope 2 (S1115), and outputs a video signal corresponding to the superimposed image (live video with annotation) to the endoscope observation monitor 4 and the tablet terminal 5 (S1116).
  • in S1114, the annotation image associated with the serial number whose effective figure flag value was updated to "0" in S1113 is excluded from the synthesis target.
  • the endoscope observation monitor 4 displays the superimposed image according to the video signal output from the endoscope processor 3 in S1116 (S1117). As a result, the endoscope observation monitor 4 shows the annotated live video with the deleted annotation hidden.
  • the tablet terminal 5 displays the superimposed image according to the video signal output from the endoscope processor 3 in S1116 (S1118). As a result, the tablet terminal 5 likewise shows the annotated live video with the deleted annotation hidden.
  • by the annotation deletion operation on the tablet terminal 5, user A can further delete annotations superimposed on the endoscopic observation screen image displayed on the endoscope observation monitor 4.
  • the display/non-display of the annotation image stored in association with a serial number is managed by the effective figure flag stored in association with that serial number. Therefore, even if the vicinity of a hidden annotation is touched on the touch panel of the video display unit 54 of the tablet terminal 5, the value of the effective figure flag does not change. Further, by using this management, a hidden annotation image may be made redisplayable.
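Because deletion only clears the effective figure flag while the annotation image itself stays stored, redisplaying a hidden annotation reduces to setting the flag back to "1". A sketch of this hide/redisplay management, with a plain dictionary standing in for the effective figure management unit:

```python
# serial number -> effective figure flag (1 = shown, 0 = hidden)
flags = {1: 1, 2: 1, 3: 1}

def hide(serial):
    flags[serial] = 0   # annotation deletion operation

def redisplay(serial):
    flags[serial] = 1   # the stored annotation image was never discarded

hide(2)
visible = sorted(s for s, f in flags.items() if f == 1)   # annotations 1 and 3
redisplay(2)
visible = sorted(s for s, f in flags.items() if f == 1)   # all three again
```

Only annotations whose flag is "1" participate in the next synthesis, so hiding and redisplaying never touch the stored images themselves.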
  • the present embodiment and the fourth embodiment may be combined so that the processing described in the fifth embodiment can be executed in the freeze mode described in the fourth embodiment.
  • the annotation is not limited to a line, and may be another figure such as a circle or a square.
  • the shape, size, color, etc. of the graphic may be selectable according to the operation on the tablet terminal 5.
  • the vertex coordinates forming the figure are generated as coordinate information, and the information regarding the size and color of the figure is added as attribute information.
  • for example, the type of figure can be selected by operating the tablet terminal 5; if the selected figure is a line, the type and color of the line can be further selected, and if the selected figure is a circle, a square, or the like, the size and color of the figure may be further selectable.
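The figure-dependent attribute information described above could be modeled as follows; this is a sketch, and the field names are assumptions:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FigureAttributes:
    """Attribute information attached to the coordinate information (sketch)."""
    figure: str                      # "line", "circle", "square", ...
    color: Tuple[int, int, int]      # RGB drawing color
    line_type: Optional[str] = None  # only meaningful when figure is a line
    size: Optional[int] = None       # only meaningful for circles, squares, etc.

# a line additionally selects a line type; a circle additionally selects a size
line_attr = FigureAttributes(figure="line", color=(255, 0, 0), line_type="dashed")
circle_attr = FigureAttributes(figure="circle", color=(0, 255, 0), size=24)
```

Carrying such a record alongside the vertex coordinates keeps one unit of annotation drawing self-describing when it reaches the endoscope processor.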
  • an annotation is performed on the endoscope observation screen image corresponding to the video signal input from the endoscope processor 3.
  • the display image quality of the tablet terminal 5 may be fixed to a preset single image quality, or coordinate conversion accompanying up-conversion or down-conversion may be performed. If the endoscopic image is an image with mask processing peculiar to endoscopy, coordinate replacement may be performed for the display area of the endoscopic image.
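The coordinate conversion accompanying up- or down-conversion is a scaling by the ratio of the two resolutions. A minimal sketch, with illustrative resolution values:

```python
def convert_coordinates(x, y, src_size, dst_size):
    """Scale a touch coordinate from the tablet display resolution
    (src_size) to the endoscopic image resolution (dst_size). Sketch."""
    src_w, src_h = src_size
    dst_w, dst_h = dst_size
    return round(x * dst_w / src_w), round(y * dst_h / src_h)

# down-conversion example: full-HD tablet display to an HD endoscopic image
x, y = convert_coordinates(960, 540, (1920, 1080), (1280, 720))
```

A mask-processed endoscopic image would additionally need an offset for the position of the display area within the screen, which this sketch omits.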
  • a part of the configuration of the endoscope system 1 may be realized by the hardware configuration shown below.
  • the above-described units of the endoscope processor 3 and the configuration for executing the above-described processing performed by the endoscope processor 3, as well as the video display unit 41 of the endoscope observation monitor 4 and the configuration for executing the above-described processing performed by the endoscope observation monitor 4, may be realized by a circuit such as an FPGA (field-programmable gate array) or an ASIC (application-specific integrated circuit).
  • the above-mentioned respective units of the endoscope processor 3, the configuration for executing the above-described processing performed by the endoscope processor 3, and the tablet terminal 5 may be realized by the hardware configuration shown in FIG. 12.
  • FIG. 12 is a diagram illustrating an example of the hardware configuration.
  • the hardware configuration shown in FIG. 12 includes a CPU (Central Processing Unit) 101, a memory 102, an input / output device 103, an input / output IF (interface) 104, a storage device 105, a portable recording medium driving device 106 that drives a portable recording medium 108, and a communication IF 107, which are connected to each other via a bus 109.
  • the CPU 101 is an arithmetic unit that executes a program for processing performed by the endoscope processor 3 or the tablet terminal 5.
  • the memory 102 includes a RAM (Random Access Memory) and a ROM (Read Only Memory); the RAM is used as a work area of the CPU 101, and the ROM nonvolatilely stores a program and information necessary for executing the program.
  • the input / output device 103 includes an input device such as a touch panel or a keyboard and an output device such as a display device; for example, it is a display device with a touch panel.
  • the input / output IF 104 is an interface for transmitting / receiving a signal to / from an external device.
  • the external device is the videoscope 2, the endoscope observation monitor 4, or the like.
  • the storage device 105 is a storage for nonvolatilely storing a program, information necessary for executing the program, information acquired by executing the program, and the like.
  • the storage device 105 is, for example, an HDD (Hard Disk Drive).
  • the portable recording medium driving device 106 drives the portable recording medium 108 and accesses the recorded contents.
  • the portable recording medium 108 is a memory device, a flexible disk, an optical disk, a magneto-optical disk, or the like.
  • the portable recording medium 108 also includes a CD-ROM (Compact Disk Read Only Memory), a DVD (Digital Versatile Disk), a USB (Universal Serial Bus) memory, and the like.
  • the portable recording medium 108 is a storage for nonvolatilely storing a program, information necessary for executing the program, information acquired by executing the program, and the like.
  • the communication IF 107 is an interface that is connected to a network (for example, a wireless network) and that communicates with an external device via the network.
  • the present invention is not limited to the above-described embodiment as it is, and can be embodied by modifying constituent elements in an implementation stage without departing from the scope of the invention.
  • various inventions can be formed by appropriately combining the plurality of components disclosed in the above embodiments. For example, some of the constituent elements shown in an embodiment may be deleted. Furthermore, constituent elements of different embodiments may be combined appropriately.
  • Reference signs: 1 Endoscope System; 2 Video Scope; 3 Endoscope Processor; 4 Endoscope Observation Monitor; 5 Tablet Terminal; 31 Endoscopic Image Generation Unit; 32 Annotation Image Generation Unit; 33 Endoscope Observation Screen Generation Unit; 34 Video Output Unit; 41 Video Display Unit; 51 Coordinate Information Generation Unit; 52 Annotation Image Generation Unit; 53 Superimposing Unit; 54 Video Display Unit; 55 Freeze Instruction Unit; 56 Deletion Instruction Unit; 101 CPU; 102 Memory; 103 Input/Output Device; 104 Input/Output IF; 105 Storage Device; 106 Portable Recording Medium Driving Device; 107 Communication IF; 108 Portable Recording Medium; 109 Bus; 321 Image Generation Block; 322 Image Storage Unit; 323 Effective Figure Management Unit; 324 Synthesizing Unit; 501 Annotation Image; 502 Endoscope Observation Screen Image; 503 Superimposed Image


Abstract

The present invention relates to an endoscope system that comprises an endoscope device and a terminal device. The endoscope device generates an endoscopic image and transmits the endoscopic image to the terminal device. The terminal device receives the endoscopic image, displays it, accepts a drawing operation on the displayed endoscopic image, generates coordinate information relating to the drawing operation, and transmits the coordinate information to the endoscope device. The endoscope device receives the coordinate information and superimposes drawing information based on the coordinate information on the endoscopic image to generate a superimposed image as a screen image.
PCT/JP2018/037852 2018-10-11 2018-10-11 Système d'endoscope et procédé de génération d'image d'écran WO2020075254A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/037852 WO2020075254A1 (fr) 2018-10-11 2018-10-11 Système d'endoscope et procédé de génération d'image d'écran

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/037852 WO2020075254A1 (fr) 2018-10-11 2018-10-11 Système d'endoscope et procédé de génération d'image d'écran

Publications (1)

Publication Number Publication Date
WO2020075254A1 (fr)

Family

ID=70165221

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/037852 WO2020075254A1 (fr) 2018-10-11 2018-10-11 Système d'endoscope et procédé de génération d'image d'écran

Country Status (1)

Country Link
WO (1) WO2020075254A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022264688A1 (fr) 2021-06-16 2022-12-22 富士フイルム株式会社 Dispositif de traitement d'image médicale, système d'endoscope, et procédé de fonctionnement pour dispositif de traitement d'image médicale

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04341232A * 1991-03-11 1992-11-27 Olympus Optical Co Ltd Electronic endoscope system
JP2011167332A * 2010-02-18 2011-09-01 Hoya Corp Electronic endoscope system for telemedicine
JP2013132514A * 2011-12-27 2013-07-08 Toshiba Corp Medical image display device and medical image storage system
WO2013114660A1 * 2012-01-30 2013-08-08 Hitachi Consumer Electronics Co., Ltd. Education and information terminal support system
US20130338493A1 * 2012-06-19 2013-12-19 Covidien Lp Surgical devices, systems and methods for highlighting and measuring regions of interest
WO2017038241A1 * 2015-08-28 2017-03-09 Fujifilm Corporation Instrument operation device, instrument operation method, and electronic instrument system
WO2018179979A1 * 2017-03-29 2018-10-04 Sony Olympus Medical Solutions Inc. Control device, external device, medical observation system, control method, display method, and program



Similar Documents

Publication Publication Date Title
US20160171162A1 System and Method for Displaying Annotated Capsule Images
JP5290475B2 Endoscope system
JP2007042073A Video presentation system, video presentation method, program for causing a computer to execute the video presentation method, and storage medium
JP2008060731A Camera, output image selection method, and program
JPWO2004039068A1 Image-compositing portable terminal and image-compositing method used therefor
JP4656005B2 Image processing system and image processing method
JP5642457B2 Display control device and display control method
KR20110060821A Information processing apparatus, method, and computer-readable medium
CN104011787A Image processing device, control method therefor, image processing system, and program
JP2001195601A Mixed reality presentation device, mixed reality presentation method, and storage medium
JP2015176559A Information processing method, information processing device, and program
WO2020075254A1 Endoscope system and screen image generation method
JP2009219573A Image processing device for endoscope and image processing method for endoscope
WO2015015560A1 Display control device and control method therefor
JP2011206435A Imaging device, imaging method, imaging program, and endoscope
JP2009122184A Image forming device
JP2005286608A Thumbnail display device
JP4635437B2 Electronic apparatus and image display method
JP2010276977A Imaging device
JP2011110281A Electronic endoscope device
JPH10155737A Electronic endoscope device
JP2005195867A5 (fr)
JP5005980B2 Endoscope device
JP2010097449A Image compositing device, image compositing method, and image compositing program
JP2016038542A Image processing method and image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18936341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18936341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP