WO2016139884A1 - Endoscope and endoscope system - Google Patents

Endoscope and endoscope system

Info

Publication number
WO2016139884A1
WO2016139884A1 (application PCT/JP2016/000422)
Authority
WO
WIPO (PCT)
Prior art keywords
image
fog
signal
endoscope
unit
Prior art date
Application number
PCT/JP2016/000422
Other languages
English (en)
Japanese (ja)
Inventor
小金 春夫
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2016139884A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • This disclosure relates to an endoscope and an endoscope system that display captured images.
  • An insufflation apparatus is known that detects smoke generated in the abdominal cavity from an in-vivo image captured by an endoscope, supplies gas for inflating the abdominal cavity, and discharges the smoke generated in the abdominal cavity to the outside of the body.
  • A technique of controlling such an apparatus to remove the smoke is also known (see, for example, Patent Document 2).
  • A technique for performing fog correction processing, as an example of image processing, is also known, in which fog generated around the subject is made invisible in the image captured by the camera (see, for example, Non-Patent Document 1).
  • In this fog correction process, the brightness histogram is flattened in order to restore the contrast of the entire screen, which is lowered because the occurrence of fog concentrates the brightness histogram in the intermediate gradations.
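  • The histogram flattening described above can be sketched as follows. This is a generic NumPy illustration of brightness-histogram equalization, not the patent's actual implementation; the foggy test image is invented for the example.

```python
import numpy as np

def equalize_histogram(luma):
    """Flatten the brightness histogram of an 8-bit luminance image.

    Fog concentrates pixel values in the mid-tones; mapping each value
    through the normalized cumulative histogram spreads them back over
    the full 0-255 range, restoring overall contrast.
    """
    hist = np.bincount(luma.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf_min = cdf[np.nonzero(cdf)][0]                 # first occupied bin
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0)
    return lut.clip(0, 255).astype(np.uint8)[luma]    # apply lookup table

# A "foggy" image: values crowded into the mid-tones (low contrast).
rng = np.random.default_rng(0)
foggy = rng.integers(110, 146, size=(64, 64), dtype=np.uint8)
clear = equalize_histogram(foggy)
```

After equalization the value range widens from roughly 35 levels to the full 0-255 span, which is what flattening the histogram buys in contrast.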
  • The present disclosure provides an endoscope system including: an endoscope having a scope that has an imaging window at its tip and transmits an optical image incident from the imaging window, and a camera head unit including an imaging element that captures the optical image transmitted through the scope; a control device that performs fog correction processing on the image captured by the imaging element; a display device that displays an image output from the control device; and a switch that instructs the control device whether to execute fog correction processing.
  • When instructed by pressing of the switch to execute fog correction processing, the control device executes fog correction processing on the image; when instructed not to execute fog correction processing, it outputs the image without executing fog correction processing.
  • The present disclosure also provides an endoscope including: a scope unit that has an imaging window at its tip and transmits an optical image incident from the imaging window; a camera head unit that includes an imaging element capturing the optical image transmitted through the scope unit; and a switch that is provided on the camera head unit and instructs whether fog correction processing is to be executed on an image captured by the imaging element.
  • According to the present disclosure, fog correction processing, which has not conventionally been performed on endoscope images, can be applied to an image captured by an endoscope.
  • FIG. 1 is a diagram illustrating an example of a configuration of an endoscope system according to the present embodiment.
  • FIG. 2 is a diagram showing an example of the appearance of the camera head of the present embodiment.
  • FIG. 3 is a diagram illustrating an example of the appearance of the camera control device of the present embodiment.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration of the camera head according to the present embodiment.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of the camera control apparatus according to the present embodiment.
  • FIG. 6 is a diagram illustrating an example of a monitor screen on which the setting change menu of the present embodiment is displayed.
  • FIG. 7 is a diagram illustrating an example of a timing shift during the fog correction process according to the present embodiment.
  • FIG. 8 is a flowchart showing an example of the correction processing procedure of the camera control apparatus of the present embodiment.
  • FIG. 9 is a schematic diagram illustrating an example of an image of a subject including an affected area captured by the endoscope displayed on the monitor according to the present embodiment.
  • In general, when the fog correction process shown in Non-Patent Document 1 is performed, a processing delay occurs because of the processing load of the image processing. Further, an image subjected to the fog correction process is slightly degraded in image quality compared to an image not subjected to it. For these reasons, fog correction processing has conventionally not been performed on images captured by an endoscope.
  • The present disclosure has been made in view of the above-described conventional situation. Its object is to provide an endoscope and an endoscope system that improve user convenience by outputting an image of a site such as an affected part after fog correction processing, even if there is a display delay, when fog is generated according to the imaging situation of the affected part, and by outputting a high-quality image of the site without fog correction processing when no fog is generated.
  • the endoscope system and endoscope of this embodiment are used for observing an affected part in a body such as a person's abdominal cavity during surgery.
  • FIG. 1 is a diagram illustrating an example of a configuration of an endoscope system 5 according to the present embodiment.
  • An endoscope system 5 shown in FIG. 1 includes an endoscope 10, a camera control device 15, a monitor 16 as an example of a display device, and a light source 17 for illumination.
  • the endoscope 10 is used for observing an affected part in the body, and includes a scope 11, a mount adapter 12, a relay lens 13, a camera head 14, a head switch 19, and a light source input unit 18.
  • The scope 11, as an example of the scope part, is the main part of, for example, a rigid endoscope inserted into the body, and is an elongated light guide member capable of guiding light from its base end to its tip.
  • The scope 11 has a photographing window 11z at its tip, and has an integrated structure in which a bundle of optical fibers, through which the optical image incident from the photographing window 11z is transmitted, is disposed at the central portion, and optical fibers that guide the illumination light L to the tip are arranged so as to surround this central bundle. An optical material such as optical glass or optical plastic is used for the photographing window 11z.
  • the mount adapter 12 is a member for attaching the scope 11 to the camera head 14.
  • Various scopes can be detachably attached to the mount adapter 12.
  • a light source input unit 18 is attached to the mount adapter 12.
  • the light source input unit 18 is detachably connected to a light guide cable 17z for guiding light from the light source 17 to the scope 11, and enters the light from the light source 17 and guides it to the scope 11 side.
  • the light source 17 is a light source having an LED (Light Emitting Diode) with a dimming function.
  • Alternatively, a light source such as a laser or a halogen lamp may be provided together with a diaphragm mechanism, and the light emitted from the light source may be dimmed by the diaphragm.
  • the relay lens 13 has a plurality of lenses that converge an optical image transmitted through the scope 11 on the imaging surface, and performs focus adjustment and magnification adjustment by moving these lenses.
  • The camera head 14, as an example of the camera head unit, incorporates an imaging element 50 (see FIG. 4), such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor, that converts an optical image formed on the imaging surface into an image signal. The camera head 14 multiplexes the imaging signal (video signal) from the imaging element 50 with each information signal in the camera head 14 (including, for example, the on / off signal of the head switch 19) and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z. In addition, the camera head 14 demodulates the multiplexed power supply, drive signal, and setting signals for each circuit of the imaging element 50 that are input from the camera control device 15 via the head extension cable 14z.
  • FIG. 2 is a diagram showing an example of the appearance of the camera head 14.
  • the camera head 14 has, for example, a square block-shaped housing, and a head switch 19 is provided on one surface (for example, the upper surface in FIG. 2).
  • The head switch 19 is a waterproof push-button switch that toggles an internal electric circuit (not shown) on and off each time it is pressed in a direction perpendicular to the surface.
  • When the internal electric circuit is turned on (that is, when the on signal of the head switch 19 is output), the camera control device 15 is instructed to execute the fog correction process. When the internal electric circuit is turned off (that is, when the off signal of the head switch 19 is output), the camera control device 15 is instructed not to execute the fog correction process.
  • The on signal generated when the internal electric circuit is turned on by pressing the head switch 19 is multiplexed with the video signal from the imaging element 50 and transmitted to the camera control device 15 via the head extension cable 14z.
  • The off signal generated when the internal electric circuit is turned off by pressing the head switch 19 is transmitted as it is, without being multiplexed with the video signal from the imaging element 50, to the camera control device 15 via the head extension cable 14z.
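  • One way to picture the multiplexing of the head switch state with the video signal is the toy framing scheme below. The byte-level format and function names are assumptions made for illustration; the actual camera head multiplexes electrical signals on the head extension cable rather than byte packets.

```python
def multiplex(frame_payload: bytes, head_switch_on: bool) -> bytes:
    """Prepend a one-byte status header carrying the switch state
    to the video payload, so both travel over a single link."""
    status = bytes([0x01 if head_switch_on else 0x00])
    return status + frame_payload

def demultiplex(packet: bytes):
    """Recover the head-switch state and the video payload."""
    return packet[0] == 0x01, packet[1:]

packet = multiplex(b"\x10\x20\x30", head_switch_on=True)
switch_on, payload = demultiplex(packet)
```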
  • The camera control device 15 multiplexes the power supply, the drive signal, and the setting signals for each circuit of the imaging element 50, sends the multiplexed signal to the camera head 14 via the head extension cable 14z, and demodulates the multiplexed video signal and each information signal in the camera head 14 (including the on / off signal of the head switch 19) received from the camera head 14 via the head extension cable 14z.
  • The camera control device 15 extracts the on / off signal of the head switch 19, performs a correction process described later on the video signal in accordance with the on / off signal, and converts the corrected video signal into a video signal suitable for the display format of the monitor 16.
  • the camera control device 15 sends this video signal to the monitor 16 via the signal cable 15z.
  • FIG. 3 is a diagram showing an example of the appearance of the camera control device 15.
  • a camera setting change button 40 is provided on the front surface of the housing of the camera control device 15.
  • The camera setting change button 40 includes left / right / up / down buttons 41, each represented by an arrow, and a confirm button 42.
  • A gain setting value is determined in the setting change menu (see FIG. 6), described later, according to the operation of the camera setting change button 40. The gain setting value is increased by pressing the right button and decreased by pressing the left button.
  • the position of the correction detection area frame 20 is determined in the menu for moving the correction detection area frame 20 (see FIG. 9) according to the operation amount of the camera setting change button 40.
  • the correction detection area frame 20 moves to the right by pressing the right button of the left and right upper and lower buttons 41 of the camera setting change button 40, and moves to the left by pressing the left button.
  • the correction detection area frame 20 moves upward when the upper button is pressed, and moves downward when the lower button is pressed.
  • the confirmation button 42 is used when confirming the parameters input in the setting change menu. For example, in the setting change menu displayed on the monitor 16 (see FIG. 6), when the operation of the confirmation button 42 is accepted for an item to which a star mark has been added, the screen of the monitor 16 transitions to a lower hierarchy. When the operation of the confirm button 42 is accepted in the item “END” shown in FIG. 6, the setting change menu screen is hidden from the monitor 16, and the original screen is displayed. The confirmation button 42 is also used when changing various settings.
  • The monitor 16 receives, via the signal cable 15z, the video signal captured by the endoscope 10 and output from the camera control device 15, and displays an image based on the video signal on its display panel.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration of the camera head 14.
  • the camera head 14 illustrated in FIG. 4 includes an image sensor 50, various image sensor drive signal generation circuits 51, a signal multiplex transmission circuit 53, and a multiplex signal demodulation circuit 52.
  • The signal multiplex transmission circuit 53 multiplexes the video signal from the image sensor 50 and each information signal in the camera head 14 (including the on / off signal of the head switch 19), and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z.
  • The multiplex signal demodulation circuit 52 demodulates the power supply, the drive signal, and the setting signals for each circuit input from the camera control device 15 via the head extension cable 14z, and outputs them to the various image sensor drive signal generation circuits 51.
  • FIG. 5 is a diagram illustrating an example of a hardware configuration of the camera control device 15.
  • the camera control device 15 shown in FIG. 5 includes a multiplex signal demodulation circuit 60, a signal multiplex transmission circuit 61, a camera synchronization signal generation circuit 81, a pre-processing unit 71, a signal amplification processing unit 65, a YC separation processing / RGB conversion unit 72, A correction processing unit 73, gain level adjustment units 69 and 74, an area detection processing unit 77, an area generation unit 75, and a fog detection data addition / average / storage unit 78 are provided.
  • The camera control device 15 also includes an encoder processing unit 80, an area display signal replacement mixing / selection unit 76, a menu character generation unit 79, a control unit 83, a data storage unit 84, an external switch 88, and an external communication unit 82.
  • the external switch 88 includes a camera setting change button 40 (see FIG. 3).
  • the pre-processing unit 71 performs noise removal, color separation processing using a pixel filter array, and the like on the video signal (that is, the image sensor signal) input from the camera head 14.
  • The signal amplification processing unit 65 performs signal processing, such as amplification, on the image sensor signal processed by the pre-processing unit 71.
  • the gain level adjustment unit 69 adjusts the amplification factor (that is, gain) of the image sensor signal according to the set value from the control unit 83.
  • the YC separation processing / RGB conversion unit 72 separates the luminance signal (Y signal) and the color signal (C signal) from the video signal from the signal amplification processing unit 65, and converts the color signal into an RGB signal. Further, the YC separation processing / RGB conversion unit 72 outputs the separated Y signal to the area display signal replacement mixing / selection unit 76 via the correction processing unit 73.
  • The correction processing unit 73 generates color difference signals (R−Y), (G−Y), and (B−Y) using the RGB signals input from the YC separation processing / RGB conversion unit 72, and outputs them to the encoder processing unit 80.
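  • The color difference signals can be sketched numerically as follows. The BT.601 luminance weights are an assumption made for illustration; the patent does not state which conversion coefficients the correction processing unit 73 uses.

```python
def to_color_difference(r, g, b):
    """Convert an RGB triple to a luminance signal Y and the three
    color difference signals (R-Y), (G-Y), (B-Y).

    The 0.299 / 0.587 / 0.114 weights are the ITU-R BT.601 luminance
    coefficients, assumed here for illustration.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, r - y, g - y, b - y

y, ry, gy, by = to_color_difference(1.0, 0.0, 0.0)   # pure red
```

By construction the weighted sum 0.299(R−Y) + 0.587(G−Y) + 0.114(B−Y) is zero, which is why Y plus any two of the difference signals suffice to recover RGB.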
  • The gain level adjustment unit 74 adjusts, in accordance with the set value from the control unit 83, the gain of a circuit in the correction processing unit 73 that can set the input level separately for each region. Further, the gain level adjustment unit 74 adjusts to different gains in the normal correction process and in the fog correction process described later.
  • The setting value from the control unit 83 that is the source of the gain adjustment is determined by the operation of the camera setting change button 40 in the setting change menu shown in FIG. 6. For example, of the left / right / up / down buttons 41 of the camera setting change button 40, each gain setting value is increased by pressing the right button and decreased by pressing the left button.
  • the setting value set by the camera setting change button 40 or the setting value read by the control unit 83 is stored in the data storage unit 84.
  • the data storage unit 84 has a correction value storage area 84z.
  • In the correction value storage area 84z, both the gain setting value for the normal correction process and the gain setting value for the fog correction process are stored.
  • The area generation unit 75 generates a display area frame signal based on the area designation coordinates input from the control unit 83, and outputs the display area frame signal to the area display signal replacement mixing / selection unit 76 and to the fog detection data addition / average / storage unit 78.
  • the display area frame signal is a signal for displaying the correction detection area frame 20 (see FIG. 9) on the display panel of the monitor 16.
  • the size of the correction detection area frame 20 can be arbitrarily set as will be described later. For example, when the diameter of the scope 11 is small and the photographing range is narrow, the range for performing the fog correction process is also narrowed, so the size of the correction detection area frame 20 is set to a small (proper) value.
  • The correction detection area frame 20 may be set arbitrarily by the user's designation, may be set by automatic detection of the scope diameter, or may be set to the pattern that corresponds to the diameter of the scope 11 among a plurality of patterns registered in advance.
  • The area display signal replacement mixing / selection unit 76 mixes the display area frame signal with the Y signal from the YC separation processing / RGB conversion unit 72 and outputs the result to the encoder processing unit 80. Further, when the area display switching designation coordinates are input from the control unit 83, the area display signal replacement mixing / selection unit 76 determines the position of the correction detection area frame 20 according to the coordinate information.
  • the area display switching designated coordinates are determined by the operation amount of the camera setting change button 40 in the menu for moving the correction detection area frame 20.
  • the correction detection area frame 20 moves to the right by pressing the right button of the left and right upper and lower buttons 41 of the camera setting change button 40, and moves to the left by pressing the left button.
  • the correction detection area frame 20 moves upward by pressing the upper button, and moves downward by pressing the lower button.
  • The area detection processing unit 77 detects an area where fog is generated from the color signal from the YC separation processing / RGB conversion unit 72 (that is, within the display area of the correction detection area frame 20), and outputs the result to the fog detection data addition / average / storage unit 78.
  • The fog detection data addition / average / storage unit 78 adds the fog detection data for the display area in which fog is detected by the area detection processing unit 77, based on the display area frame signal input from the area generation unit 75, calculates the average and the statistical frequency (statistical data), and stores the results.
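  • The add / average step can be sketched as follows: pixel data inside the correction detection area are summed across frames, and an average level and a statistical frequency (histogram) are kept. The function name, region format, and test frames are illustrative, not taken from the patent.

```python
import numpy as np

def accumulate_fog_statistics(frames, region):
    """Accumulate statistics over the correction detection area.

    `frames` is an iterable of 2-D uint8 luminance arrays; `region`
    is (top, bottom, left, right) in pixel coordinates.
    Returns the average level and the per-value frequency histogram.
    """
    top, bottom, left, right = region
    total, count = 0.0, 0
    histogram = np.zeros(256, dtype=np.int64)
    for frame in frames:
        window = frame[top:bottom, left:right]
        total += float(window.sum())           # add the data
        count += window.size
        histogram += np.bincount(window.ravel(), minlength=256)
    return total / count, histogram            # average and frequency

# Three uniform test frames at levels 100, 120, 140.
frames = [np.full((8, 8), v, dtype=np.uint8) for v in (100, 120, 140)]
mean, hist = accumulate_fog_statistics(frames, (0, 4, 0, 4))
```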
  • the menu character generator 79 generates character data for menu display according to the character display designation input from the controller 83.
  • the camera synchronization signal generation circuit 81 generates an image sensor driving signal for driving the image sensor 50 in the camera head 14, a signal processing synchronization signal and a vertical synchronization signal used in the operation of each unit.
  • the external communication unit 82 communicates with an external device (for example, a computer), and enables selection of a display area by the external device and capture of pixel information or color space information of the selected display area by the external device.
  • the display area selected by the external device is not limited to one, and a plurality of display areas may be simultaneously provided. Since the camera control device 15 includes the external communication unit 82, it is possible to perform key operations on the external device side in addition to key operations performed alone, and the degree of freedom of operation is improved. For example, it is possible to interactively select a display area by using a mouse pointer or GUI, perform color correction, and change and save the settings of the camera control device 15.
  • The control unit 83 controls the operation of each unit of the camera control device 15, and has a CPU (Central Processing Unit), a program memory in which a control program for operating the CPU is stored, a RAM (Random Access Memory) used in the operation of the CPU, and peripheral I / O (Input / Output) for connecting with peripheral devices. Further, the external switch 88, including the camera setting change button 40, is connected to the control unit 83.
  • the controller 83 gives a character display designation to the menu character generator 79.
  • the control unit 83 communicates with a computer that is an external device via the external communication unit 82 so that the camera control device 15 can be controlled on the computer side.
  • The control unit 83 reads the statistical data and the luminance level stored in the fog detection data addition / average / storage unit 78, calculates a gain setting value used in the fog correction process, and sets the value in the gain level adjustment unit 74.
  • The fog correction process is a known technique, as shown for example in Non-Patent Document 1, and includes various processes. For example, in the fog correction process, the correction process setting value (here, the gain setting value) is changed in order to restore the contrast of the entire screen, which is reduced because the occurrence of fog concentrates the brightness histogram in the middle gradations. Further, the gain setting values for the normal correction process and the fog correction process calculated by the control unit 83 are stored in the correction value storage area 84z of the data storage unit 84.
  • The control unit 83 reads the gain setting value for the normal correction process from the correction value storage area 84z when the off signal of the head switch 19 is output, and reads the gain setting value for the fog correction process from the correction value storage area 84z when the on signal of the head switch 19 is output, so that the value can be set in the gain level adjustment unit 74 instantaneously. Note that the gain change performed at the time of switching between the normal correction process and the fog correction process may also be performed by the gain level adjustment unit 69.
  • A user (for example, a doctor who performs an operation, or a collaborator) checks the area 30 (see FIG. 9) detected in the correction detection area frame 20 on the image of the subject including the affected part TG displayed on the screen of the monitor 16, and performs an operation for changing the set values of the respective units of the camera control device 15.
  • the control unit 83 designates character display on the menu character generation unit 79 and causes the monitor 16 to display a setting change menu.
  • FIG. 6 is a diagram showing an example of the screen of the monitor 16 on which the setting change menu is displayed.
  • The setting change menu displayed on the screen of the monitor 16 at the time of setup includes menu items such as “CAMERA ID”, “ELC”, “SHUTTER”, “GAIN”, “KNEE”, and “END”.
  • an item to which a star mark is added indicates that there is a lower hierarchy.
  • When the confirmation button 42 is pressed by a user operation for an item to which a star mark is added, the screen of the monitor 16 transitions to a lower hierarchy.
  • The user confirms the setting change menu displayed on the screen of the monitor 16 and makes various settings and changes to the camera control device 15. For example, when performing color correction, if the user selects “KNEE” from the setting change menu, the control unit 83 displays a KNEE screen (not shown) on the monitor 16.
  • The KNEE screen is a screen in the first layer of the setting change menu, and includes the items KNEE POINT and KNEE LEVEL.
  • On this screen, the camera control device 15 displays the KNEE parameters to be adjusted.
  • When the confirmation button 42 is pressed by a user operation in each of the items “CAMERA ID”, “ELC”, “SHUTTER”, and “GAIN”, the camera ID, illumination, shutter, and gain setting values can be adjusted or changed on the screen of the monitor 16.
  • When “END” is selected, the setting change menu is removed and the screen of the monitor 16 transitions back to the original screen.
  • the camera control device 15 sets the correction detection area frame 20 including the region 30 in which the fog correction process is performed on the image of the subject including the affected part TG displayed on the screen of the monitor 16.
  • When the camera control device 15 changes the setting value (here, the gain setting value) of the fog correction processing performed on the region 30 detected in the correction detection area frame 20, it determines which knee area the luminance of the region 30 falls into, adjusts the gain for each corresponding area, and synthesizes the signals of the areas after this adjustment.
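  • The per-knee-area gain adjustment can be sketched as follows. The band boundaries and gain values are invented for the example; the patent only states that the gain is adjusted per knee area and that the signals are then recombined.

```python
def apply_knee_gains(luma_values, knee_points, gains):
    """Apply a separate gain to each luminance band (knee area),
    clip to the 8-bit ceiling, and return the recombined signal.

    `knee_points` are the band boundaries; `gains` holds one gain
    per band (len(knee_points) + 1 entries).
    """
    out = []
    for v in luma_values:
        band = sum(v >= p for p in knee_points)    # which knee area?
        out.append(min(255.0, v * gains[band]))    # adjust, then clip
    return out

# Three bands: shadows (<64), mid-tones (64-191), highlights (>=192).
adjusted = apply_knee_gains([32, 128, 224],
                            knee_points=[64, 192],
                            gains=[1.0, 1.5, 0.9])
```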
  • FIG. 7 is a diagram illustrating an example of a timing shift during fog correction processing.
  • In FIG. 7, squares indicate frames, and black squares indicate frames of the same captured image.
  • The fog correction processing is performed using the previous frame image; that is, since the calculation is based on the statistical data obtained by the fog detection data addition / average / storage unit 78, the image after the fog correction process is output as output video with a delay of at least one frame compared to the normal correction process.
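  • The one-frame delay can be made concrete with a small pipeline sketch: the gain applied to frame N is derived from the statistics (here simply the mean) of frame N−1, so no corrected output exists for the very first frame. The target level and the frames are invented for the example.

```python
def corrected_stream(frames, target=128.0):
    """Yield corrected frames, each using the previous frame's mean.

    Because the statistics for a frame are only available after that
    frame has been processed, the corrected video lags the input by
    one frame, mirroring the timing shift shown in FIG. 7.
    """
    prev_mean = None
    for frame in frames:
        if prev_mean is not None:
            gain = target / prev_mean           # stats from frame N-1
            yield [v * gain for v in frame]
        prev_mean = sum(frame) / len(frame)     # stats for the next frame

outputs = list(corrected_stream([[64, 64], [64, 64], [128, 128]]))
```

Three input frames yield only two corrected frames: the output stream is exactly one frame behind the input.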
  • In switching from the normal correction process to the fog correction process, when the user turns on the head switch 19 (for example, when the head switch 19 is pressed an odd number of times), the camera control device 15 instantaneously switches the gain setting value, following a predetermined algorithm, to the correction processing setting value that gives an optimum combination for the fog correction processing. Accordingly, the user can intuitively adjust the setting value of the fog correction process (in this case, the gain setting value) without specialized knowledge.
  • FIG. 8 is a flowchart showing an example of the correction processing procedure of the camera control device 15.
  • the control unit 83 in the camera control device 15 takes in the state of the head switch 19 (in other words, whether the electric circuit in the head switch 19 is on or off) (S1). Thereby, the control unit 83 detects whether or not the head switch 19 is pressed by a user operation (S2). When the head switch 19 is not depressed (S2, NO), the control unit 83 proceeds to the process of step S4 as it is. On the other hand, when the head switch 19 is pressed (S2, YES), the control unit 83 changes the presence / absence of fog correction processing (S3).
  • the control unit 83 determines whether or not to perform fog correction processing (S4). When the head switch 19 is OFF, that is, when fog correction processing is not performed (S4, NO), the control unit 83 performs normal correction processing (fog correction) stored in the correction value storage area 84z of the data storage unit 84. Various setting values of “no processing” are read (S5).
  • On the other hand, when fog correction processing is to be performed (S4, YES), the control unit 83 reads the various setting values for the fog correction processing stored in the correction value storage area 84z of the data storage unit 84 (S6).
  • Here, a case in which the setting values of the various correction processes are gain setting values is shown as an example; the gain setting value for the gain level adjustment unit 69 may also be changed.
  • the control unit 83 sets the gain setting values read in steps S5 and S6 in the gain level adjustment units 69 and 74 (S7).
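  • The S1-S7 procedure above can be condensed into a few lines. The preset values and the `state` dictionary are illustrative stand-ins for the correction value storage area 84z; only the control flow mirrors the flowchart.

```python
def correction_procedure(head_switch_pressed, state):
    """One pass of the flowchart: read the switch state (S1/S2),
    toggle the fog-correction mode on a press (S3), branch on the
    mode (S4), read the stored preset (S5/S6), and return the gain
    to set in the gain level adjustment units (S7).
    """
    if head_switch_pressed:                      # S2: YES
        state["fog_on"] = not state["fog_on"]    # S3: flip the mode
    if state["fog_on"]:                          # S4: YES
        return state["presets"]["fog"]           # S6: fog preset
    return state["presets"]["normal"]            # S5: normal preset

state = {"fog_on": False, "presets": {"normal": 1.0, "fog": 1.8}}
gain = correction_procedure(head_switch_pressed=True, state=state)
```

Pressing the switch again toggles back to the normal preset, matching the odd/even press behavior described for the head switch 19.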
  • When the gain setting value is set in the gain level adjustment unit 69, the signal amplification processing unit 65 adjusts the gain of the video signal output from the pre-processing unit 71; when the gain value is set in the gain level adjustment unit 74, the correction processing unit 73 adjusts the gain of the color difference signals.
  • The encoder processing unit 80 receives the color difference signals corrected by the correction processing unit 73, the display area frame signal from the area display signal replacement mixing / selection unit 76, and the Y signal from the YC separation processing / RGB conversion unit 72, converts these signals into a video signal suitable for the display format of the monitor 16, and outputs it to the monitor 16.
  • the monitor 16 displays an image based on this video signal.
  • FIG. 9 is a schematic diagram illustrating an example of an image of a subject including the affected part TG captured by the endoscope 10 displayed on the monitor 16.
  • the correction detection area frame 20 is set by the user so as to include the affected part TG with respect to the image displayed on the monitor 16.
  • When fog (water vapor or smoke) is generated, the affected part TG is difficult for the user to see (monitor image on the left side in the figure). When fog correction processing is performed on the region 30 where the generation of fog has been detected, the affected part TG becomes visible (monitor image on the right side in the figure).
  • When fog is not generated, the camera control device 15 performs normal correction processing on the image captured by the endoscope 10, and the image of the affected part TG captured by the endoscope 10 is displayed with optimum characteristics (luminance characteristics, color characteristics).
  • When fog is generated, the user turns on the head switch 19, thereby instructing the camera control device 15 to switch to fog correction processing for the image captured by the endoscope 10.
  • In this way, in the present embodiment, the camera head 14 is provided with the head switch 19, which can instantaneously switch between fog correction processing and normal processing for the video from the endoscope 10. Therefore, when fog is generated, an image in which a site such as the affected part is visible can be obtained quickly, even if a display delay occurs. For example, when the affected area is being excised, the image can be confirmed without stopping the operation and without affecting the excision site. Conversely, when fog is not generated, an image of a site such as the affected part is obtained with appropriate image quality.
  • In this way, fog correction processing, which has not conventionally been performed on endoscope images, can be applied to an image captured by an endoscope.
  • Depending on the imaging situation of the affected part in the human body, the endoscope system 5 can output a fog-corrected image of a site such as the affected part when fog is generated, even if a display delay occurs, or a high-quality image of the site without fog correction processing when fog is not generated; in either case, user convenience is improved.
  • Since the endoscope system 5 of the present embodiment performs fog correction processing on only a part of the image including the affected part, the area in which image quality is slightly degraded can be reduced compared with performing the processing on the entire image.
  • The endoscope system 5 of the present embodiment can also reduce the time required for fog correction processing and suppress the display delay when fog occurs. Furthermore, the user can view the corrected image of the part of the affected area where fog has occurred while comparing it with the other parts of the image where fog is not generated.
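The benefit of correcting only part of the image can be shown with a minimal sketch. The contrast-stretch used here is an assumption for illustration only, since the patent does not disclose the fog correction algorithm itself; the point is that only pixels inside the detected region are modified, so the rest of the frame keeps its original quality and the per-frame workload stays small.

```python
# Minimal sketch of region-limited fog correction (algorithm is an assumed
# contrast stretch, not the patent's method): fog lowers local contrast, so
# stretching intensities inside the detected region restores visibility
# while pixels outside the region are left untouched.
import numpy as np

def defog_region(image: np.ndarray, region: tuple) -> np.ndarray:
    """Stretch contrast inside the (y0, y1, x0, x1) region only."""
    y0, y1, x0, x1 = region
    out = image.astype(np.float32).copy()
    roi = out[y0:y1, x0:x1]
    lo, hi = roi.min(), roi.max()
    if hi > lo:  # avoid division by zero on a perfectly flat region
        out[y0:y1, x0:x1] = (roi - lo) / (hi - lo) * 255.0
    return out.clip(0, 255).astype(np.uint8)

# A "foggy" patch: low-contrast values clustered around mid-gray.
frame = np.full((8, 8), 128, dtype=np.uint8)
frame[2:6, 2:6] = np.array([[120, 130], [125, 135]]).repeat(2, 0).repeat(2, 1)
corrected = defog_region(frame, (2, 6, 2, 6))
```

After the call, the region spans the full 0-255 range while every pixel outside it still holds its original value, which is the comparison-side-by-side property described above.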
  • Since the head switch 19 is provided on the camera head 14, which has a built-in image sensor, it is easy for the person operating the endoscope to operate.
  • In the present embodiment, fog correction processing is performed on the region where fog is detected within the correction detection area frame 20, but fog correction processing may instead be performed on the entire correction detection area frame 20.
  • The head switch 19 need not be a push-button switch that alternates between ON and OFF each time it is pressed; it may instead be a momentary push-button switch that is ON only while it is pressed. In that case, fog correction processing may be performed when the push-button switch is OFF, and normal correction processing may be performed when it is ON.
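The two switch behaviors described above can be modeled in a few lines; the function names are illustrative, not from the patent.

```python
# Sketch of the two switch interpretations (names are illustrative): an
# alternating-action button flips the fog-correction mode on each press,
# while a momentary button follows the pressed state directly -- and the
# ON/OFF-to-mode mapping may even be inverted, as the patent notes.

def toggle_mode(current_fog_on: bool, pressed: bool) -> bool:
    """Alternating-action switch: each press flips the mode."""
    return (not current_fog_on) if pressed else current_fog_on

def momentary_mode(pressed: bool, inverted: bool = False) -> bool:
    """Momentary switch: mode follows the button, optionally inverted."""
    return (not pressed) if inverted else pressed
```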
  • The present disclosure is useful as an endoscope and an endoscope system that can perform fog correction processing on an image picked up by an endoscope.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Endoscopes (AREA)

Abstract

When fog is generated in the area of an affected site by laser cutting or the like, the user turns on a head switch (19) and issues a switching instruction to a camera control device (15) so that fog correction processing is performed on the image captured by an endoscope (10). When fog correction processing is performed, a display delay of at least one frame relative to normal correction processing occurs in the image of the affected site displayed on a monitor (16), but the situation in which the affected site is rendered invisible by the fog is avoided.
PCT/JP2016/000422 2015-03-02 2016-01-28 Endoscope et système d'endoscope WO2016139884A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-040386 2015-03-02
JP2015040386A JP2016158886A (ja) 2015-03-02 2015-03-02 内視鏡及び内視鏡システム

Publications (1)

Publication Number Publication Date
WO2016139884A1 true WO2016139884A1 (fr) 2016-09-09

Family

ID=56843690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/000422 WO2016139884A1 (fr) 2015-03-02 2016-01-28 Endoscope et système d'endoscope

Country Status (2)

Country Link
JP (1) JP2016158886A (fr)
WO (1) WO2016139884A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110392546A (zh) * 2017-03-07 2019-10-29 索尼公司 信息处理设备、辅助系统和信息处理方法
WO2023166742A1 (fr) * 2022-03-04 2023-09-07 オリンパス株式会社 Dispositif de traitement d'image, système de traitement et procédé de traitement d'image

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018157917A (ja) * 2017-03-22 2018-10-11 ソニー株式会社 制御装置、制御方法、制御システム、およびプログラム
WO2018198255A1 (fr) * 2017-04-26 2018-11-01 オリンパス株式会社 Dispositif de traitement d'image, procédé de commande pour dispositif de traitement d'image, et programme
JP6655756B2 (ja) 2017-05-19 2020-02-26 オリンパス株式会社 3d内視鏡装置、及び、3d映像処理装置
WO2021095773A1 (fr) * 2019-11-14 2021-05-20 ソニーグループ株式会社 Appareil de traitement d'informations, procédé de génération et programme de génération

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009125188A (ja) * 2007-11-21 2009-06-11 Panasonic Corp 内視鏡装置および内視鏡用カメラ装置
JP2011004929A (ja) * 2009-06-25 2011-01-13 Hoya Corp 内視鏡装置
JP2012239644A (ja) * 2011-05-19 2012-12-10 Olympus Corp 画像処理装置、内視鏡装置、画像処理方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2933165B2 (ja) * 1988-01-08 1999-08-09 オリンパス光学工業株式会社 電子内視鏡装置
JPH0530413A (ja) * 1991-07-01 1993-02-05 Sharp Corp テレビカメラシステム
JPH06113195A (ja) * 1992-09-29 1994-04-22 Canon Inc ビデオカメラ装置
JPH06268915A (ja) * 1993-03-12 1994-09-22 Hitachi Medical Corp ディジタル画像処理装置
JP5030335B2 (ja) * 2001-03-06 2012-09-19 オリンパス株式会社 医用画像表示装置
JP4632716B2 (ja) * 2004-08-10 2011-02-16 オリンパス株式会社 滅菌耐久性組成物を用いた内視鏡光学系、及びそれを備えた内視鏡
JP2009201587A (ja) * 2008-02-26 2009-09-10 Fujifilm Corp 放射線画像撮影装置
JP5379647B2 (ja) * 2009-10-29 2013-12-25 オリンパス株式会社 撮像装置、及び、画像生成方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TETSUO TANAKA ET AL.: "Development of Fog Image Correction Process for Network Cameras", PANASONIC TECHNICAL JOURNAL, vol. 59, no. 2, November 2013 (2013-11-01), pages 133 - 135 *

Also Published As

Publication number Publication date
JP2016158886A (ja) 2016-09-05

Similar Documents

Publication Publication Date Title
WO2016139884A1 (fr) Endoscope et système d'endoscope
US10485629B2 (en) Endoscope device
US8072505B2 (en) Imaging apparatus
CN110168605B (zh) 用于动态范围压缩的视频信号处理装置、视频信号处理方法和计算机可读介质
US20120201433A1 (en) Image composition system
US10548465B2 (en) Medical imaging apparatus and medical observation system
US10952597B2 (en) Endoscope apparatus and method of detecting edge
US11022859B2 (en) Light emission control apparatus, light emission control method, light emission apparatus, and imaging apparatus
JP7016681B2 (ja) 内視鏡システム
JP6987513B2 (ja) 内視鏡装置
JP2012235962A (ja) 電子内視鏡装置
US10778889B2 (en) Image pickup apparatus, video signal processing apparatus, and video signal processing method
US11864732B2 (en) Medical image processing device and medical observation system
JP6045794B2 (ja) 電子内視鏡システム
JP5967952B2 (ja) 電子内視鏡システム
JP5976342B2 (ja) 電子内視鏡システム
US11778325B2 (en) Image processing apparatus, image processing method, and image processing program
US20190394395A1 (en) Image pickup apparatus, video signal processing apparatus, and video signal processing method
JPWO2019167555A1 (ja) 映像信号処理装置、映像信号処理方法および撮像装置
JP2008307294A (ja) 撮像システム
US11523065B2 (en) Imaging device and gain setting method
JP2011152300A (ja) 電子内視鏡システム
US20190076006A1 (en) Medical observation device and medical observation system
JP4493426B2 (ja) 電子内視鏡システム
JP2010279457A (ja) 電子内視鏡、電子内視鏡システムおよび色調整方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16758585

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16758585

Country of ref document: EP

Kind code of ref document: A1