WO2016139884A1 - Endoscope and endoscope system - Google Patents
- Publication number: WO2016139884A1 (application PCT/JP2016/000422)
- Authority: WO — WIPO (PCT)
- Prior art keywords
- image
- fog
- signal
- endoscope
- unit
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
Definitions
- This disclosure relates to an endoscope and an endoscope system that display captured images.
- An insufflation apparatus is known that detects smoke generated in the abdominal cavity from an in-vivo image captured by an endoscope, supplies gas for inflating the abdominal cavity, and discharges the smoke generated in the abdominal cavity to the outside of the body.
- A technique of controlling such an apparatus to remove smoke is also known (see, for example, Patent Document 2).
- A technique of performing fog correction processing, as an example of image processing, so that fog generated around the subject is not visible in the image captured by a camera is also known (see, for example, Non-Patent Document 1).
- In this fog correction process, the brightness histogram is flattened in order to restore the contrast of the entire screen, which is lowered because the occurrence of fog concentrates the brightness histogram in the intermediate gradations.
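The histogram flattening just described is, in essence, standard histogram equalization. The following Python snippet is illustrative only (the function name and bin parameters are assumptions, not taken from Non-Patent Document 1):

```python
import numpy as np

def equalize_histogram(gray, levels=256):
    """Flatten a luminance histogram concentrated in the mid-tones.

    Fog pushes pixel values toward the intermediate gradations; mapping
    each value through the normalized cumulative distribution spreads
    them back over the full range and restores overall contrast.
    """
    hist, _ = np.histogram(gray.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf_m = np.ma.masked_equal(cdf, 0)          # ignore empty bins
    cdf_m = (cdf_m - cdf_m.min()) * (levels - 1) / (cdf_m.max() - cdf_m.min())
    lut = np.ma.filled(cdf_m, 0).astype(np.uint8)
    return lut[gray]                            # apply the lookup table
```

A fogged frame whose values sit between, say, 100 and 150 comes out spanning the full 0 to 255 range.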
- The present disclosure provides an endoscope system including: a scope that has an imaging window at its tip and transmits an optical image incident from the imaging window; a camera head unit including an imaging element that captures the optical image transmitted through the scope; a control device that performs fog correction processing on the image captured by the imaging element; a display device that displays an image output from the control device; and a switch that instructs the control device whether to execute the fog correction processing.
- When execution of the fog correction processing is instructed by pressing of the switch, the control device executes the fog correction processing on the image; when non-execution is instructed, the control device outputs the image without executing the fog correction processing.
- The present disclosure also provides an endoscope including: a scope unit that has an imaging window at its tip and transmits an optical image incident from the imaging window; a camera head unit including an imaging element that captures the optical image transmitted through the scope unit; and a switch that is provided on the camera head unit and instructs whether fog correction processing is to be executed on an image captured by the imaging element.
- According to the present disclosure, fog correction processing, which has not conventionally been performed, can be applied to an image captured by an endoscope.
- FIG. 1 is a diagram illustrating an example of a configuration of an endoscope system according to the present embodiment.
- FIG. 2 is a diagram showing an example of the appearance of the camera head of the present embodiment.
- FIG. 3 is a diagram illustrating an example of the appearance of the camera control device of the present embodiment.
- FIG. 4 is a diagram illustrating an example of a hardware configuration of the camera head according to the present embodiment.
- FIG. 5 is a diagram illustrating an example of a hardware configuration of the camera control apparatus according to the present embodiment.
- FIG. 6 is a diagram illustrating an example of a monitor screen on which the setting change menu of the present embodiment is displayed.
- FIG. 7 is a diagram illustrating an example of a timing shift during the fog correction process according to the present embodiment.
- FIG. 8 is a flowchart showing an example of the correction processing procedure of the camera control apparatus of the present embodiment.
- FIG. 9 is a schematic diagram illustrating an example of an image of a subject including an affected area captured by the endoscope displayed on the monitor according to the present embodiment.
- In general, when the fog correction process described in Non-Patent Document 1 is performed, a processing delay occurs due to the load of the image processing. Further, an image subjected to the fog correction process is slightly degraded in image quality compared with an image not subjected to it. For these reasons, fog correction processing has not conventionally been performed on images captured by an endoscope.
- The present disclosure has been made in view of the above circumstances, and aims to provide an endoscope and an endoscope system that improve user convenience by outputting, when fog is generated according to the imaging situation of the affected part, an image of a site such as the affected part after fog correction processing even at the cost of a display delay, and by otherwise outputting a high-quality image of the site without fog correction processing.
- the endoscope system and endoscope of this embodiment are used for observing an affected part in a body such as a person's abdominal cavity during surgery.
- FIG. 1 is a diagram illustrating an example of a configuration of an endoscope system 5 according to the present embodiment.
- An endoscope system 5 shown in FIG. 1 includes an endoscope 10, a camera control device 15, a monitor 16 as an example of a display device, and a light source 17 for illumination.
- the endoscope 10 is used for observing an affected part in the body, and includes a scope 11, a mount adapter 12, a relay lens 13, a camera head 14, a head switch 19, and a light source input unit 18.
- the scope 11 as an example of the scope part is a main part of a rigid endoscope inserted into the body, for example, and is an elongated light guide member capable of guiding light from the end to the tip.
- The scope 11 has a photographing window 11z at its tip, and has an integrated structure in which a bundle of optical fibers that transmits the optical image incident from the photographing window 11z is disposed at the center, surrounded by optical fibers that guide the illumination light L to the tip. An optical material such as optical glass or optical plastic is used for the photographing window 11z.
- the mount adapter 12 is a member for attaching the scope 11 to the camera head 14.
- Various scopes can be detachably attached to the mount adapter 12.
- a light source input unit 18 is attached to the mount adapter 12.
- the light source input unit 18 is detachably connected to a light guide cable 17z for guiding light from the light source 17 to the scope 11, and enters the light from the light source 17 and guides it to the scope 11 side.
- the light source 17 is a light source having an LED (Light Emitting Diode) with a dimming function.
- Alternatively, a light source such as a laser or a halogen lamp together with a diaphragm may be provided, and the light emitted from the light source may be dimmed by the diaphragm.
- the relay lens 13 has a plurality of lenses that converge an optical image transmitted through the scope 11 on the imaging surface, and performs focus adjustment and magnification adjustment by moving these lenses.
- The camera head 14, as an example of the camera head unit, incorporates an imaging element 50 (see FIG. 4) such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor that converts the optical image formed on the imaging surface into an image signal. The camera head 14 multiplexes the imaging signal (video signal) from the imaging element 50 with the information signals within the camera head 14 (including, for example, the on/off signal of the head switch 19), and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z. In addition, the camera head 14 demodulates the multiplexed power supply, drive signal, and setting signals for each circuit of the imaging element 50 that are input from the camera control device 15 via the head extension cable 14z.
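The two-way multiplexing over the head extension cable 14z can be pictured with a minimal sketch. The packet layout below is purely illustrative; the disclosure does not specify a transmission format:

```python
def multiplex(video_frame, info_signals):
    """Bundle the video signal with the camera head's information
    signals (e.g. the head switch on/off state) into one packet."""
    return {"video": video_frame, "info": dict(info_signals)}

def demodulate(packet):
    """Recover the video signal and the information signals on the
    receiving side of the cable."""
    return packet["video"], packet["info"]
```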
- FIG. 2 is a diagram showing an example of the appearance of the camera head 14.
- the camera head 14 has, for example, a square block-shaped housing, and a head switch 19 is provided on one surface (for example, the upper surface in FIG. 2).
- the head switch 19 is waterproof and is a push button switch that turns on and off an internal electric circuit (not shown) every time it is pressed in a direction perpendicular to the surface.
- When the internal electric circuit is turned on (that is, when the on signal of the head switch 19 is output), the camera control device 15 is instructed to execute the fog correction process. When the internal electric circuit is turned off (that is, when the off signal of the head switch 19 is output), the camera control device 15 is instructed not to execute the fog correction process.
- The on signal produced when the head switch 19 is pressed and the internal electric circuit turns on is multiplexed with the video signal from the imaging element 50 and transmitted to the camera control device 15 via the head extension cable 14z.
- When the internal electric circuit is turned off by pressing the head switch 19, no such signal is multiplexed, and the video signal from the imaging element 50 is transmitted as it is to the camera control device 15 via the head extension cable 14z.
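The head switch 19 thus behaves as a toggle: each press flips the internal circuit, and the resulting state tells the camera control device 15 whether to run fog correction. A minimal sketch (the class name is illustrative):

```python
class HeadSwitch:
    """Push-button toggle: each press inverts the internal circuit
    state, so an odd number of presses leaves the switch on."""

    def __init__(self):
        self.on = False  # internal electric circuit initially off

    def press(self):
        self.on = not self.on
        return self.on   # state conveyed toward the control device
```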
- The camera control device 15 multiplexes the power supply for the imaging element 50, the drive signal, and the setting signals for each circuit, and sends the multiplexed signal to the camera head 14 via the head extension cable 14z. The camera control device 15 also demodulates the multiplexed video signal and the information signals within the camera head 14 (including the on/off signal of the head switch 19) received from the camera head 14 via the head extension cable 14z.
- The camera control device 15 extracts the on/off signal of the head switch 19, performs the correction process described later on the video signal in accordance with that signal, and converts the corrected video signal into a video signal suitable for the display format of the monitor 16.
- the camera control device 15 sends this video signal to the monitor 16 via the signal cable 15z.
- FIG. 3 is a diagram showing an example of the appearance of the camera control device 15.
- a camera setting change button 40 is provided on the front surface of the housing of the camera control device 15.
- The camera setting change button 40 includes left/right/up/down buttons 41, each represented by an arrow, and a confirm button 42.
- In the setting change menu (see FIG. 6) described later, the gain setting value is determined according to the operation amount of the camera setting change button 40.
- The gain setting value is increased by pressing the right button and decreased by pressing the left button.
- the position of the correction detection area frame 20 is determined in the menu for moving the correction detection area frame 20 (see FIG. 9) according to the operation amount of the camera setting change button 40.
- The correction detection area frame 20 moves to the right when the right button of the left/right/up/down buttons 41 of the camera setting change button 40 is pressed, and moves to the left when the left button is pressed.
- the correction detection area frame 20 moves upward when the upper button is pressed, and moves downward when the lower button is pressed.
- the confirmation button 42 is used when confirming the parameters input in the setting change menu. For example, in the setting change menu displayed on the monitor 16 (see FIG. 6), when the operation of the confirmation button 42 is accepted for an item to which a star mark has been added, the screen of the monitor 16 transitions to a lower hierarchy. When the operation of the confirm button 42 is accepted in the item “END” shown in FIG. 6, the setting change menu screen is hidden from the monitor 16, and the original screen is displayed. The confirmation button 42 is also used when changing various settings.
- The monitor 16 receives the video signal captured by the endoscope 10 and output from the camera control device 15 via the signal cable 15z, and displays an image based on the video signal on its display panel.
- FIG. 4 is a diagram illustrating an example of a hardware configuration of the camera head 14.
- the camera head 14 illustrated in FIG. 4 includes an image sensor 50, various image sensor drive signal generation circuits 51, a signal multiplex transmission circuit 53, and a multiplex signal demodulation circuit 52.
- The signal multiplex transmission circuit 53 multiplexes the video signal from the imaging element 50 with the information signals within the camera head 14 (including the on/off signal of the head switch 19), and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z.
- The multiplex signal demodulation circuit 52 demodulates the power supply, drive signal, and setting signals for each circuit input from the camera control device 15 via the head extension cable 14z, and outputs them to the various image sensor drive signal generation circuits 51.
- FIG. 5 is a diagram illustrating an example of a hardware configuration of the camera control device 15.
- The camera control device 15 shown in FIG. 5 includes a multiplex signal demodulation circuit 60, a signal multiplex transmission circuit 61, a camera synchronization signal generation circuit 81, a pre-processing unit 71, a signal amplification processing unit 65, a YC separation processing/RGB conversion unit 72, a correction processing unit 73, gain level adjustment units 69 and 74, an area detection processing unit 77, an area generation unit 75, and a fog detection data addition/average/storage unit 78.
- The camera control device 15 further includes an encoder processing unit 80, an area display signal replacement mixing/selection unit 76, a menu character generation unit 79, a control unit 83, a data storage unit 84, an external switch 88, and an external communication unit 82.
- the external switch 88 includes a camera setting change button 40 (see FIG. 3).
- the pre-processing unit 71 performs noise removal, color separation processing using a pixel filter array, and the like on the video signal (that is, the image sensor signal) input from the camera head 14.
- The signal amplification processing unit 65 performs signal processing such as amplification on the image sensor signal processed by the pre-processing unit 71.
- the gain level adjustment unit 69 adjusts the amplification factor (that is, gain) of the image sensor signal according to the set value from the control unit 83.
- the YC separation processing / RGB conversion unit 72 separates the luminance signal (Y signal) and the color signal (C signal) from the video signal from the signal amplification processing unit 65, and converts the color signal into an RGB signal. Further, the YC separation processing / RGB conversion unit 72 outputs the separated Y signal to the area display signal replacement mixing / selection unit 76 via the correction processing unit 73.
- The correction processing unit 73 generates color difference signals (R−Y), (G−Y), and (B−Y) from the RGB signals input from the YC separation processing/RGB conversion unit 72, and outputs them to the encoder processing unit 80.
- The gain level adjustment unit 74 adjusts, in accordance with the set value from the control unit 83, the gain of a circuit in the correction processing unit 73 that can divide the input level and set it for each region. Further, the gain level adjustment unit 74 adjusts to different gains in the normal correction process and in the fog correction process described later.
- The setting value from the control unit 83 that is the source of the gain adjustment is determined by the operation amount of the camera setting change button 40 in the setting change menu shown in FIG. 6. For example, with the left/right/up/down buttons 41 of the camera setting change button 40, each gain setting value is increased by pressing the right button and decreased by pressing the left button.
- the setting value set by the camera setting change button 40 or the setting value read by the control unit 83 is stored in the data storage unit 84.
- the data storage unit 84 has a correction value storage area 84z.
- In the correction value storage area 84z, both the gain setting value for the normal correction process and the gain setting value for the fog correction process are stored.
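The correction value storage area 84z can be pictured as a two-entry store keyed by correction mode, read back according to the head switch state. The numeric gain values below are placeholders, not the device's stored settings:

```python
# Illustrative stand-in for the correction value storage area 84z.
CORRECTION_VALUES = {
    "normal": {"signal_gain": 1.0, "region_gain": 1.0},
    "fog":    {"signal_gain": 1.4, "region_gain": 1.8},
}

def gains_for(head_switch_on):
    """Return the pre-stored gain set: the fog-correction gains when
    the head switch is on, the normal gains otherwise."""
    return CORRECTION_VALUES["fog" if head_switch_on else "normal"]
```

Because both sets are stored in advance, switching modes is just a lookup, which is what allows the instantaneous change described later.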
- The area generation unit 75 generates a display area frame signal based on the area designation coordinates input from the control unit 83, and outputs it to the area display signal replacement mixing/selection unit 76 and the fog detection data addition/average/storage unit 78.
- the display area frame signal is a signal for displaying the correction detection area frame 20 (see FIG. 9) on the display panel of the monitor 16.
- the size of the correction detection area frame 20 can be arbitrarily set as will be described later. For example, when the diameter of the scope 11 is small and the photographing range is narrow, the range for performing the fog correction process is also narrowed, so the size of the correction detection area frame 20 is set to a small (proper) value.
- The correction detection area frame 20 may be set arbitrarily by the user's designation, may be set by automatic detection of the scope diameter, or may be set by selecting, from a plurality of pre-registered patterns, the one corresponding to the diameter of the scope 11.
- When the area display signal replacement mixing/selection unit 76 receives the display area frame signal, it mixes the signal with the Y signal from the YC separation processing/RGB conversion unit 72 and outputs the result to the encoder processing unit 80. Further, when the area display signal replacement mixing/selection unit 76 receives the area display switching designation coordinates from the control unit 83, it determines the position of the correction detection area frame 20 according to the coordinate information.
- the area display switching designated coordinates are determined by the operation amount of the camera setting change button 40 in the menu for moving the correction detection area frame 20.
- The correction detection area frame 20 moves to the right when the right button of the left/right/up/down buttons 41 of the camera setting change button 40 is pressed, and moves to the left when the left button is pressed.
- the correction detection area frame 20 moves upward by pressing the upper button, and moves downward by pressing the lower button.
- The area detection processing unit 77 detects an area where fog is generated from the color signal from the YC separation processing/RGB conversion unit 72 (that is, within the display area of the correction detection area frame 20), and outputs the result to the fog detection data addition/average/storage unit 78.
- Based on the display area frame signal input from the area generation unit 75, the fog detection data addition/average/storage unit 78 adds the fog detection data for the display area of the portion where fog was detected by the area detection processing unit 77, calculates the average and statistical frequency (statistical data), and stores the results.
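The addition and averaging of fog detection data inside the correction detection area frame can be sketched as a running statistic over the framed pixels. The region encoding and the use of the mean as the statistic are illustrative assumptions:

```python
import numpy as np

def accumulate_fog_stats(frame, region, history):
    """Add the fog-detection data for the framed region and keep a
    running average across frames.

    region is (y0, y1, x0, x1), the correction detection area frame.
    """
    y0, y1, x0, x1 = region
    roi = frame[y0:y1, x0:x1]
    history.append(float(roi.mean()))             # add this frame's data
    return {"mean": sum(history) / len(history),  # running average
            "samples": len(history)}
```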
- the menu character generator 79 generates character data for menu display according to the character display designation input from the controller 83.
- the camera synchronization signal generation circuit 81 generates an image sensor driving signal for driving the image sensor 50 in the camera head 14, a signal processing synchronization signal and a vertical synchronization signal used in the operation of each unit.
- the external communication unit 82 communicates with an external device (for example, a computer), and enables selection of a display area by the external device and capture of pixel information or color space information of the selected display area by the external device.
- the display area selected by the external device is not limited to one, and a plurality of display areas may be simultaneously provided. Since the camera control device 15 includes the external communication unit 82, it is possible to perform key operations on the external device side in addition to key operations performed alone, and the degree of freedom of operation is improved. For example, it is possible to interactively select a display area by using a mouse pointer or GUI, perform color correction, and change and save the settings of the camera control device 15.
- The control unit 83 controls the operation of each unit of the camera control device 15, and has a CPU (Central Processing Unit), a program memory in which a control program for operating the CPU is stored, a RAM (Random Access Memory) used in the operation of the CPU, and peripheral I/O (Input/Output) for connecting with peripheral devices. Further, the external switch 88 including the camera setting change button 40 is connected to the control unit 83.
- the controller 83 gives a character display designation to the menu character generator 79.
- the control unit 83 communicates with a computer that is an external device via the external communication unit 82 so that the camera control device 15 can be controlled on the computer side.
- The control unit 83 reads the statistical data and the luminance level stored in the fog detection data addition/average/storage unit 78, calculates the gain setting value used in the fog correction process, and sets that value in the gain level adjustment unit 74.
- The fog correction process is a known technique as shown, for example, in Non-Patent Document 1, and includes various processes. For example, in the fog correction process, the correction process setting value (here, the gain setting value) is changed in order to increase the contrast of the entire screen, which is reduced because the occurrence of fog concentrates the brightness histogram in the middle gradations. The gain setting values for the normal correction process and for the fog correction process calculated by the control unit 83 are stored in the correction value storage area 84z of the data storage unit 84.
- When the off signal of the head switch 19 is output, the control unit 83 reads the gain setting value for the normal correction process from the correction value storage area 84z; when the on signal of the head switch 19 is output, the control unit 83 reads the gain setting value for the fog correction process from the correction value storage area 84z and sets it in the gain level adjustment unit 74 instantaneously. Note that the gain change performed when switching between the normal correction process and the fog correction process may also be performed by the gain level adjustment unit 69.
- A user (for example, a doctor performing an operation or a collaborator) performs an operation for changing the setting values of each unit of the camera control device 15 with respect to the area 30 (see FIG. 9) detected in the correction detection area frame 20 in the image of the subject including the affected part TG displayed on the screen of the monitor 16.
- the control unit 83 designates character display on the menu character generation unit 79 and causes the monitor 16 to display a setting change menu.
- FIG. 6 is a diagram showing an example of the screen of the monitor 16 on which the setting change menu is displayed.
- The setting change menu displayed on the screen of the monitor 16 at the time of setup includes menu items such as “CAMERA ID”, “ELC”, “SHUTTER”, “GAIN”, “KNEE”, and “END”.
- an item to which a star mark is added indicates that there is a lower hierarchy.
- When the confirm button 42 is pressed for an item to which a star mark is added by a user operation, the screen of the monitor 16 transitions to the lower hierarchy.
- The user confirms the setting change menu displayed on the screen of the monitor 16 and makes various settings and changes to the camera control device 15. For example, when performing color correction, if the user selects “KNEE” from the setting change menu, the control unit 83 displays a KNEE screen (not shown) on the monitor 16.
- The KNEE screen is a screen in the first lower layer of the setting change menu, and includes the items KNEE POINT and KNEE LEVEL.
- On this screen, the camera control device 15 displays the KNEE parameters to be adjusted.
- When the confirm button 42 is pressed in any of the items “CAMERA ID”, “ELC”, “SHUTTER”, and “GAIN” by a user operation, the camera ID, illumination, shutter, and gain setting values can be adjusted or changed on the screen of the monitor 16.
- When “END” is selected, the setting change menu is erased and the screen of the monitor 16 transitions to the original screen.
- the camera control device 15 sets the correction detection area frame 20 including the region 30 in which the fog correction process is performed on the image of the subject including the affected part TG displayed on the screen of the monitor 16.
- When the camera control device 15 changes the setting value (here, the gain setting value) of the fog correction processing performed on the region 30 detected in the correction detection area frame 20, it determines which knee area the luminance of the region 30 falls in, adjusts the gain for each corresponding area, and synthesizes the signals of the areas after this adjustment.
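The per-knee-area adjustment can be sketched as splitting the luminance range into bands, applying a separate gain to each, and recombining the result. The thresholds and gains below are illustrative values, not the patent's settings:

```python
import numpy as np

def apply_region_gains(luma, thresholds=(64, 160), gains=(1.0, 1.5, 1.2)):
    """Gain each luminance band (knee area) separately, then synthesize.

    thresholds split 0-255 into three bands; gains[i] is applied to
    the pixels falling in band i.
    """
    out = np.empty(luma.shape, dtype=np.float64)
    low, high = thresholds
    bands = [luma < low, (luma >= low) & (luma < high), luma >= high]
    for mask, g in zip(bands, gains):
        out[mask] = luma[mask] * g                # adjust gain per band
    return np.clip(out, 0, 255).astype(np.uint8)  # recombine and clamp
```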
- FIG. 7 is a diagram illustrating an example of a timing shift during fog correction processing.
- In FIG. 7, squares indicate frames, and black squares indicate frames of the same captured image.
- Since the fog correction processing is performed on the basis of the statistical data obtained by the fog detection data addition/average/storage unit 78 for the previous frame image, the image after the fog correction process is output as output video with a delay of at least one frame compared with the normal correction process.
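That one-frame lag falls out naturally when each frame is corrected with statistics gathered from the previous one. In this generator sketch the gain formula is a stand-in for the real fog statistics; only the timing behavior matters:

```python
def fog_corrected_stream(frames):
    """Yield corrected frames, each gained using the statistics of the
    previous frame, so output lags input by one frame."""
    prev_mean = None
    for frame in frames:
        mean = sum(frame) / len(frame)   # stand-in for fog statistics
        if prev_mean is not None:
            # correction uses last frame's statistics, not this frame's
            yield [v * (128 / prev_mean) for v in frame]
        prev_mean = mean
```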
- When switching from the normal correction process to the fog correction process, if the user turns on the head switch 19 (for example, by pressing the head switch 19 an odd number of times), the camera control device 15 instantaneously switches the gain setting value, following a predetermined algorithm, to the correction processing setting value that gives an optimum combination for the fog correction processing. Accordingly, the user can adjust the setting value of the fog correction process (in this case, the gain setting value) intuitively, without specialized knowledge.
- FIG. 8 is a flowchart showing an example of the correction processing procedure of the camera control device 15.
- the control unit 83 in the camera control device 15 takes in the state of the head switch 19 (in other words, whether the electric circuit in the head switch 19 is on or off) (S1). Thereby, the control unit 83 detects whether or not the head switch 19 is pressed by a user operation (S2). When the head switch 19 is not depressed (S2, NO), the control unit 83 proceeds to the process of step S4 as it is. On the other hand, when the head switch 19 is pressed (S2, YES), the control unit 83 changes the presence / absence of fog correction processing (S3).
- Next, the control unit 83 determines whether or not to perform the fog correction processing (S4). When the head switch 19 is off, that is, when the fog correction processing is not performed (S4, NO), the control unit 83 reads the various setting values for the normal correction process (without fog correction processing) stored in the correction value storage area 84z of the data storage unit 84 (S5).
- On the other hand, when the fog correction processing is performed (S4, YES), the control unit 83 reads the various setting values for the fog correction processing stored in the correction value storage area 84z of the data storage unit 84 (S6).
- Here, a case where the setting values of the various correction processes are gain setting values is shown as an example. The gain setting value for the gain level adjustment unit 69 may also be changed.
- the control unit 83 sets the gain setting values read in steps S5 and S6 in the gain level adjustment units 69 and 74 (S7).
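Steps S1 to S7 can be condensed into one sketch: sample the switch, toggle the correction mode on a press, and load the matching stored setting. The numeric set values are illustrative:

```python
def correction_step(switch_pressed, state):
    """One pass through S1-S7 of the flowchart."""
    if switch_pressed:                     # S2: press detected
        state["fog"] = not state["fog"]    # S3: flip fog-correction mode
    stored = {"fog": 1.8, "normal": 1.0}   # S5/S6: stored set values
    state["gain"] = stored["fog" if state["fog"] else "normal"]  # S7
    return state
```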
- When the gain setting value is set in the gain level adjustment unit 69, the signal amplification processing unit 65 adjusts the gain of the video signal output from the pre-processing unit 71; when the gain setting value is set in the gain level adjustment unit 74, the correction processing unit 73 adjusts the gain of the color difference signals.
- The encoder processing unit 80 receives the color difference signals corrected by the correction processing unit 73, the display area frame signal from the area display signal replacement mixing/selection unit 76, and the Y signal from the YC separation processing/RGB conversion unit 72, converts these signals into a video signal suitable for the display format of the monitor 16, and outputs it to the monitor 16.
- the monitor 16 displays an image based on this video signal.
- FIG. 9 is a schematic diagram illustrating an example of an image of a subject including the affected part TG captured by the endoscope 10 displayed on the monitor 16.
- the correction detection area frame 20 is set by the user so as to include the affected part TG with respect to the image displayed on the monitor 16.
- When fog (water vapor or smoke) is generated, the affected part TG is difficult for the user to see (the monitor image on the left side in the figure).
- When fog correction processing is performed on the region 30 where the generation of fog has been detected, the affected part TG can be visually recognized (the monitor image on the right side in the figure).
- The camera control device 15 performs the normal correction process on the image captured by the endoscope 10, so that the image of the affected part TG captured by the endoscope 10 is displayed with optimum characteristics (luminance characteristics, color characteristics).
- The user operates the head switch 19 to turn it on, thereby instructing the camera control device 15 to switch to performing the fog correction processing on the image captured by the endoscope 10.
- the camera head 14 is provided with the head switch 19, which can instantaneously switch between the fog correction processing and the normal processing performed on the video from the endoscope 10. Therefore, when fog is generated, an image in which a site such as an affected part can be visually recognized can be obtained quickly, even if there is a display delay. For example, when the affected area is being excised, the image can be confirmed without stopping the operation and without affecting the excision site. On the other hand, when fog is not generated, an image of a site such as an affected part can be obtained with appropriate image quality.
- this makes it possible to apply fog correction processing, which has not been done conventionally, to an image captured by an endoscope.
- depending on the imaging situation of the affected part in the human body, the endoscope system 5 can output a fog-corrected image of a site such as an affected part even if there is a display delay when fog is generated, and can output a high-quality image of the site without fog correction processing when fog is not generated; in either case, user convenience is improved.
- since the endoscope system 5 of the present embodiment performs fog correction processing on only the part of the image that includes the affected part, the area where image quality is slightly lowered can be reduced compared with applying the processing to the entire image.
- the endoscope system 5 of the present embodiment can also reduce the time required for the fog correction processing and suppress the display delay when fog occurs. Furthermore, the fog-corrected image of the part of the affected area where fog has occurred can be viewed while being compared with the other parts of the image where fog is not generated.
- since the head switch 19 is provided on the camera head 14, which has a built-in image sensor, it is easy for the person operating the endoscope to reach and operate.
- in the present embodiment, the fog correction processing is performed on the region where fog is detected within the correction detection area frame 20, but the fog correction processing may instead be performed on the entire correction detection area frame 20.
- the head switch 19 need not be a push button switch that toggles between on and off each time it is pressed; it may be a momentary push button switch that is on only while it is pressed.
- fog correction processing may be performed when the push button switch is off, and normal correction processing may be performed when the push button switch is on.
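The two switch behaviors described in these variations can be modeled as follows. This is a hypothetical sketch; the patent does not specify an implementation, and the class and function names are assumptions.

```python
class ToggleSwitch:
    """Alternates between on and off on each press (the embodiment's default)."""
    def __init__(self):
        self.on = False

    def press(self):
        self.on = not self.on
        return self.on

class MomentarySwitch:
    """On only while held down (the alternative mentioned above)."""
    def __init__(self):
        self.on = False

    def hold(self):
        self.on = True

    def release(self):
        self.on = False

def correction_mode(switch_on, invert=False):
    """Map the switch state to a correction mode. invert=True models
    the variant where fog correction runs while the switch is OFF."""
    fog = switch_on if not invert else not switch_on
    return "fog" if fog else "normal"

t = ToggleSwitch()
print(correction_mode(t.press()))           # after first press
print(correction_mode(t.press()))           # after second press
print(correction_mode(False, invert=True))  # inverted variant
```

The `invert` flag captures the last variation: the mapping between the physical switch state and the processing mode is a free design choice, independent of which switch mechanism is used.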
- the present disclosure is useful as an endoscope system and an endoscope that can perform fog correction processing on an image picked up by an endoscope.
Abstract
When mist is generated in the area of an affected site by laser cutting or the like, the user turns on a head switch (19), issuing a switching instruction to a camera control device (15) so that mist correction processing is performed on the image captured by an endoscope (10). When mist correction processing is performed, a display delay of at least one frame relative to normal correction processing occurs in the image of the affected site displayed on a monitor (16), but the situation in which the mist renders the affected site invisible is avoided.
Description
This disclosure relates to an endoscope and an endoscope system that display captured images.
Conventionally, when a large amount of water vapor or smoke was generated, for example during laser ablation surgery, while an endoscope was imaging an affected area inside a human body, the operation had to be interrupted, for example by temporarily stopping the laser ablation step until the water vapor or smoke cleared, so that the doctor could properly view the state of the entire affected area.
In contrast, a technique is known in which vibration is applied to the scope portion of an endoscope to remove fogging on the imaging window provided at its tip (see, for example, Patent Document 1).
A technique is also known that detects smoke generated in the abdominal cavity from an in-vivo image captured by an endoscope and removes the smoke by controlling an insufflation apparatus that supplies insufflation gas to inflate the abdominal cavity and discharges the smoke from the abdominal cavity to the outside of the body (see, for example, Patent Document 2).
A technique is also already known that performs fog correction processing, as an example of image processing, on an image captured by a camera so that fog generated around the subject becomes invisible in the image (see, for example, Non-Patent Document 1). In this fog correction processing, the luminance histogram is flattened in order to increase the contrast of the entire screen, which has been lowered because the occurrence of fog concentrates the luminance histogram in the intermediate gradations.
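The histogram-flattening idea behind this kind of fog correction can be sketched with classic histogram equalization. This is an illustrative Python sketch, not the processing of Non-Patent Document 1 or of the patent; the 8-bit grayscale representation and function name are assumptions.

```python
def equalize_luminance(pixels, levels=256):
    """Flatten the luminance histogram of an 8-bit grayscale image.

    Fog compresses pixel values into the mid-tones; equalization
    stretches the cumulative distribution over the full range,
    restoring overall contrast.
    """
    n = len(pixels)
    # Build the luminance histogram.
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function (CDF).
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Classic equalization mapping: scale the CDF onto [0, levels-1].
    def remap(p):
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
    return [remap(p) for p in pixels]

# A "foggy" patch: values crowded into the mid-tones (low contrast).
foggy = [118, 120, 122, 124, 126, 128, 130, 132]
print(equalize_luminance(foggy))
```

After remapping, the mid-tone-crowded values are spread across the full 0-255 range, which is exactly the contrast recovery the fog correction processing aims for.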
The present disclosure is an endoscope system including: an endoscope including a scope unit that has an imaging window at its tip and transmits an optical image incident through the imaging window, and a camera head unit incorporating an imaging element that captures the optical image transmitted through the scope unit; a control device that performs fog correction processing on the image captured by the imaging element; a display device that displays the image output from the control device; and a switch that instructs the control device whether to execute the fog correction processing. In response to pressing of the switch, the control device executes the fog correction processing on the image when execution of the fog correction processing is instructed, and outputs the image without executing the fog correction processing when non-execution is instructed.
The present disclosure is also an endoscope including: a scope unit that has an imaging window at its tip and transmits an optical image incident through the imaging window; a camera head unit incorporating an imaging element that captures the optical image transmitted through the scope unit; and a switch, provided on the camera head unit, that instructs whether to execute fog correction processing on the image captured by the imaging element.
According to the present disclosure, when fog is generated, an image in which a site such as an affected part can be visually recognized can be obtained quickly even if there is a display delay, while when fog is not generated, an image of the site can be obtained with appropriate image quality. This makes it possible to apply fog correction processing, which has not been done conventionally, to images captured by an endoscope.
Prior to the description of the embodiment, problems in the prior art will be briefly described. Physically removing cloudiness arising at the subject of an endoscope image (for example, an affected part in the body and its surroundings) requires attaching equipment such as the vibration generator of Patent Document 1 or the insufflation apparatus of Patent Document 2 described above, which complicates the configuration of the endoscope system and increases the number of parts.
In general, when the fog correction processing of Non-Patent Document 1 is performed, a processing delay occurs because of the display load of the image processing. Furthermore, an image subjected to fog correction processing has slightly degraded image quality compared with an image without it. For these reasons, fog correction processing has conventionally not been applied to images captured by an endoscope.
The present disclosure has been made in view of the conventional situation described above, and aims to provide an endoscope and an endoscope system that improve user convenience by outputting, depending on the imaging situation of the affected part, a fog-corrected image of a site such as an affected part, even with a display delay, when fog is generated, and a high-quality image of the site without fog correction processing when fog is not generated.
Hereinafter, an embodiment that specifically discloses the endoscope system and endoscope according to the present disclosure (hereinafter, the present embodiment) will be described with reference to the drawings. The endoscope system and endoscope of the present embodiment are used, for example, to observe an affected part inside the body, such as in a person's abdominal cavity, during surgery.
FIG. 1 is a diagram illustrating an example of the configuration of the endoscope system 5 of the present embodiment. The endoscope system 5 shown in FIG. 1 includes an endoscope 10, a camera control device 15, a monitor 16 as an example of a display device, and a light source 17 for illumination. The endoscope 10 is used to observe an affected part inside the body, and includes a scope 11, a mount adapter 12, a relay lens 13, a camera head 14, a head switch 19, and a light source input unit 18.
The scope 11, as an example of the scope unit, is the main part of, for example, a rigid endoscope inserted into the body, and is an elongated light guide member capable of guiding light from its proximal end to its tip. The scope 11 has an imaging window 11z at its tip, and has an integrated structure in which a central portion, containing a bundle of optical fibers that transmits the optical image incident through the imaging window 11z, is surrounded by optical fibers that guide the light L from the light source 17 to the tip. An optical material such as optical glass or optical plastic is used for the imaging window 11z.
The mount adapter 12 is a member for attaching the scope 11 to the camera head 14. Various scopes can be detachably attached to the mount adapter 12. A light source input unit 18 is also attached to the mount adapter 12. A light guide cable 17z for guiding light from the light source 17 to the scope 11 is detachably connected to the light source input unit 18, which receives the light from the light source 17 and guides it to the scope 11 side.
The light source 17 is a light source having an LED (Light Emitting Diode) with a dimming function. Instead of the LED light source, a configuration may be used that has a light emitting source such as a laser or a halogen lamp together with a diaphragm, and dims the light emitted from the light emitting source with the diaphragm.
The relay lens 13 has a plurality of lenses that converge the optical image transmitted through the scope 11 onto the imaging surface, and performs focus adjustment and magnification adjustment by moving these lenses.
The camera head 14, as an example of the camera head unit, incorporates an imaging element 50 (see FIG. 4) such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor that converts the optical image formed on the imaging surface into an image signal. The camera head 14 multiplexes the imaging signal (video signal) from the imaging element 50 with the information signals within the camera head 14 (including, for example, the on/off signal of the head switch 19), and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z. The camera head 14 also demodulates the multiplexed power supply for the imaging element 50, drive signals, and setting signals for each circuit, which are input from the camera control device 15 via the head extension cable 14z.
FIG. 2 is a diagram showing an example of the appearance of the camera head 14. The camera head 14 has, for example, a square block-shaped housing, and a head switch 19 is provided on one surface (for example, the upper surface in FIG. 2). The head switch 19 is waterproofed and is a push button switch whose internal electric circuit (not shown) toggles between on and off each time the switch is pressed perpendicular to the surface. As described later, the head switch 19 instructs the camera control device 15 to execute the fog correction processing when the internal electric circuit turns on (that is, when the on signal of the head switch 19 is output), and not to execute the fog correction processing when the internal electric circuit turns off (that is, when the off signal of the head switch 19 is output). As described later, the signal indicating that the internal electric circuit has been turned on by pressing the head switch 19 is multiplexed with the video signal from the imaging element 50 and transmitted to the camera control device 15 via the head extension cable 14z. On the other hand, the signal indicating that the internal electric circuit has been turned off by pressing the head switch 19 is transmitted as-is to the camera control device 15 via the head extension cable 14z, without being multiplexed with the video signal from the imaging element 50.
The camera control device 15 multiplexes the power supply for the imaging element 50, drive signals, and setting signals for each circuit and sends them to the camera head 14 via the head extension cable 14z, and demodulates the multiplexed video signal and the information signals within the camera head 14 (including the on/off signal of the head switch 19) input from the camera head 14 via the head extension cable 14z. The camera control device 15 extracts the on/off signal of the head switch 19, performs correction processing (described later) on the video signal according to this on/off signal, and converts the corrected video signal into a video signal suitable for the display format of the monitor 16. The camera control device 15 sends this video signal to the monitor 16 via the signal cable 15z.
FIG. 3 is a diagram showing an example of the appearance of the camera control device 15. A camera setting change button 40 is provided on the front surface of the housing of the camera control device 15. The camera setting change button 40 consists of left/right/up/down buttons 41, each represented by an arrow, and a confirm button 42.
The amount of operation of the camera setting change button 40 determines, for example, a gain setting value in the setting adjustment menu (see FIG. 6) described later. For example, pressing the right button of the left/right/up/down buttons 41 of the camera setting change button 40 updates the gain setting value so that it increases, and pressing the left button updates it so that it decreases. The amount of operation of the camera setting change button 40 also determines the position of the correction detection area frame 20 (see FIG. 9) in the menu for moving the correction detection area frame 20. The correction detection area frame 20 moves to the right by pressing the right button of the left/right/up/down buttons 41 of the camera setting change button 40, and to the left by pressing the left button. It moves upward by pressing the up button and downward by pressing the down button.
The confirm button 42 is used to confirm parameters entered in the setting change menu. For example, in the setting change menu displayed on the monitor 16 (see FIG. 6), when operation of the confirm button 42 is accepted on an item marked with a star, the screen of the monitor 16 transitions to a lower level of the hierarchy. When operation of the confirm button 42 is accepted on the "END" item shown in FIG. 6, the setting change menu screen is hidden from the monitor 16 and the display returns to the original screen. The confirm button 42 is also used when changing various settings.
The monitor 16 receives, via the signal cable 15z, the video signal captured by the endoscope 10 and output from the camera control device 15, and displays an image based on the video signal on its display panel.
FIG. 4 is a diagram illustrating an example of the hardware configuration of the camera head 14. The camera head 14 shown in FIG. 4 includes the imaging element 50, an imaging element drive signal generation circuit 51, a signal multiplex transmission circuit 53, and a multiplex signal demodulation circuit 52. The signal multiplex transmission circuit 53 multiplexes the video signal from the imaging element 50 with the information signals within the camera head 14 (including the on/off signal of the head switch 19), and sends the multiplexed signal to the camera control device 15 via the head extension cable 14z. The multiplex signal demodulation circuit 52 demodulates the multiplexed power supply for the imaging element 50, drive signals, and setting signals for each circuit, which are input from the camera control device 15 via the head extension cable 14z, and outputs them to the imaging element drive signal generation circuit 51.
FIG. 5 is a diagram illustrating an example of the hardware configuration of the camera control device 15. The camera control device 15 shown in FIG. 5 includes a multiplex signal demodulation circuit 60, a signal multiplex transmission circuit 61, a camera synchronization signal generation circuit 81, a pre-processing unit 71, a signal amplification processing unit 65, a YC separation processing / RGB conversion unit 72, a correction processing unit 73, gain level adjustment units 69 and 74, an area detection processing unit 77, an area generation unit 75, and a fog detection data addition / average / storage unit 78. The camera control device 15 further includes an encoder processing unit 80, an area display signal replacement mixing / selection unit 76, a menu character generation unit 79, a control unit 83, a data storage unit 84, an external switch 88, and an external communication unit 82. The external switch 88 includes the camera setting change button 40 (see FIG. 3).
The pre-processing unit 71 performs noise removal, color separation processing based on the pixel filter array, and the like on the video signal (that is, the imaging element signal) input from the camera head 14.
The signal amplification processing unit 65 performs signal processing such as amplification on the imaging element signal processed by the pre-processing unit 71.
The gain level adjustment unit 69 adjusts the amplification factor (that is, the gain) of the imaging element signal according to the setting value from the control unit 83.
The YC separation processing / RGB conversion unit 72 separates the video signal from the signal amplification processing unit 65 into a luminance signal (Y signal) and a color signal (C signal), and converts the color signal into RGB signals. The YC separation processing / RGB conversion unit 72 also outputs the separated Y signal to the area display signal replacement mixing / selection unit 76 via the correction processing unit 73.
The correction processing unit 73 generates color difference signals (R-Y), (G-Y), and (B-Y) from the RGB signals input from the YC separation processing / RGB conversion unit 72, and outputs them to the encoder processing unit 80.
The gain level adjustment unit 74 adjusts, according to the setting value from the control unit 83, the gain of a circuit in the correction processing unit 73 that can divide and set the input level for each region. The gain level adjustment unit 74 also adjusts the gain to different values for the normal correction processing and the fog correction processing described later. The setting value from the control unit 83 that is the source of this gain adjustment is determined by the amount of operation of the camera setting change button 40 in the setting change menu shown in FIG. 6. For example, pressing the right button of the left/right/up/down buttons 41 of the camera setting change button 40 updates each gain setting value so that it increases, and pressing the left button updates it so that it decreases. The setting value set by the camera setting change button 40, or read out by the control unit 83, is stored in the data storage unit 84.
The data storage unit 84 has a correction value storage area 84z. The correction value storage area 84z stores both the gain setting value for the normal correction processing and the gain setting value for the fog correction processing.
The area generation unit 75 generates a display area frame signal based on the area designation coordinates input from the control unit 83, and outputs the display area frame signal to the area display signal replacement mixing / selection unit 76 and the fog detection data addition / average / storage unit 78. The display area frame signal is a signal for displaying the correction detection area frame 20 (see FIG. 9) on the display panel of the monitor 16. The size of the correction detection area frame 20 can be set arbitrarily, as described later. For example, when the diameter of the scope 11 is small and the imaging range is narrow, the range over which fog correction processing is performed is also narrow, so the size of the correction detection area frame 20 is set to a small (appropriate) value. The correction detection area frame 20 may be set arbitrarily by user designation, may be set by automatic detection of the scope diameter, or may be set to the one of a plurality of pre-registered patterns that corresponds to the diameter of the scope 11.
When the display area frame signal is input from the area generation unit 75, the area display signal replacement mixing / selection unit 76 mixes the display area frame signal with the Y signal from the YC separation processing / RGB conversion unit 72 and outputs the result to the encoder processing unit 80. When area display switching designation coordinates are input from the control unit 83, the area display signal replacement mixing / selection unit 76 determines the position of the correction detection area frame 20 according to that coordinate information.
The area display switching designation coordinates are determined by the amount of operation of the camera setting change button 40 in the menu for moving the correction detection area frame 20. The correction detection area frame 20 moves to the right by pressing the right button of the left/right/up/down buttons 41 of the camera setting change button 40, and to the left by pressing the left button. It moves upward by pressing the up button and downward by pressing the down button.
The area detection processing unit 77 detects, from the color signal from the YC separation processing / RGB conversion unit 72, the region where fog is generated (that is, within the display region of the correction detection area frame 20), and outputs the result to the fog detection data addition / average / storage unit 78.
Based on the display area frame signal input from the area generation unit 75, the fog detection data addition / average / storage unit 78 adds up the fog detection data for the display region of the fog-generating portion detected by the area detection processing unit 77, calculates its average and statistical frequency (statistical data), and stores the results.
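The add/average/store behavior of this unit can be sketched as follows. This is an illustrative Python sketch; the patent does not define a data layout, and the per-frame "fog score" representation and thresholds here are assumptions.

```python
class FogDetectionAccumulator:
    """Accumulates per-frame fog detection data for the display region
    inside the detection area frame, keeping a running sum, average,
    and frequency count (the 'statistical data')."""
    def __init__(self):
        self.total = 0.0       # sum of per-frame fog scores
        self.frames = 0        # number of frames accumulated
        self.fog_frames = 0    # frames in which fog was detected

    def add(self, fog_score, fog_detected):
        # Add one frame's detection result for the area.
        self.total += fog_score
        self.frames += 1
        if fog_detected:
            self.fog_frames += 1

    def average(self):
        return self.total / self.frames if self.frames else 0.0

    def fog_frequency(self):
        # Statistical frequency: fraction of frames with fog.
        return self.fog_frames / self.frames if self.frames else 0.0

acc = FogDetectionAccumulator()
for score in (0.1, 0.7, 0.8):  # hypothetical per-frame scores for the area
    acc.add(score, fog_detected=score > 0.5)
print(acc.average(), acc.fog_frequency())
```

Averaging over several frames like this smooths out momentary fluctuations, which is why the stored statistics, rather than a single frame's detection, are what the control unit later reads to compute the correction gain.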
The menu character generation unit 79 generates character data for menu display according to the character display designation input from the control unit 83.
The camera synchronization signal generation circuit 81 generates the imaging element drive signal for driving the imaging element 50 in the camera head 14, as well as the signal processing synchronization signal and vertical synchronization signal used in the operation of each unit.
The external communication unit 82 communicates with an external device (for example, a computer), enabling selection of a display region by the external device and capture of pixel information or color space information of the selected display region by the external device. The display region selected by the external device is not limited to one; a plurality of regions may be selected simultaneously. Because the camera control device 15 includes the external communication unit 82, key operations can be performed on the external device in addition to key operations performed on the device itself, improving the freedom of operation. For example, using a mouse pointer or a GUI, a display region can be selected interactively to perform color correction, and the settings of the camera control device 15 can be changed and saved.
The control unit 83 controls the operation of each unit of the camera control device 15, and includes a CPU (Central Processing Unit), a program memory storing the control program that operates the CPU, a RAM (Random Access Memory) used in the operation of the CPU, and peripheral I/O (Input/Output) for connecting to peripheral devices. An external switch 88 including the camera setting change button 40 is connected to the control unit 83. The control unit 83 gives the character display designation to the menu character generation unit 79. The control unit 83 also communicates via the external communication unit 82 with a computer serving as an external device, allowing the camera control device 15 to be controlled from the computer.
The control unit 83 also reads out the statistical data and luminance level stored in the fog detection data addition/averaging/storage unit 78, calculates the gain setting value used in the fog correction process, and sets this gain setting value in the gain level adjustment unit 74. The fog correction process itself is a known technique, as shown for example in Non-Patent Document 1, and various implementations exist. For example, in the fog correction process, the setting value of the correction process (here, the gain setting value) is changed so as to flatten the luminance histogram, in order to restore the contrast of the entire screen, which is reduced when fog causes the luminance histogram to concentrate in the middle gradations. The correction value storage area 84z of the data storage unit 84 stores the gain setting values for the normal correction process and the fog correction process calculated by the control unit 83.
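The histogram-flattening idea behind this fog correction can be sketched as follows. This is an illustrative example only — the patent cites Non-Patent Document 1 for the actual technique, and the function name, the NumPy representation of the luminance plane, and the use of plain histogram equalization (rather than the patent's stored gain values) are all assumptions made for illustration:

```python
import numpy as np

def fog_correct_luma(y, region=None):
    """Flatten the luminance histogram of `y` (a uint8 plane).

    Fog crowds luminance into the mid-tones; equalizing the histogram
    spreads those values back out and restores contrast.  `region` is
    an optional tuple of slices limiting the correction to a detected
    fog area, analogous to region 30 in the description.
    """
    out = y.copy()
    sel = out[region] if region is not None else out
    hist = np.bincount(sel.ravel(), minlength=256)
    cdf = hist.cumsum()
    nonzero = cdf[cdf > 0]
    if nonzero.size == 0:
        return out
    # Map each level so the cumulative distribution becomes linear.
    cdf_min = nonzero[0]
    lut = np.round((cdf - cdf_min) / max(cdf[-1] - cdf_min, 1) * 255)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    if region is not None:
        out[region] = lut[sel]
    else:
        out = lut[out]
    return out

# A "foggy" patch: luminance crowded into the mid-tones (120-135).
foggy = np.random.default_rng(0).integers(120, 136, size=(64, 64)).astype(np.uint8)
corrected = fog_correct_luma(foggy)
```

After equalization the mid-tone values are spread over the full range, so the standard deviation (a rough contrast measure) increases noticeably.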
Thus, when the off signal of the head switch 19 is output, the control unit 83 reads the gain setting value for the normal correction process from the correction value storage area 84z; when the on signal of the head switch 19 is output, it reads the gain setting value for the fog correction process from the correction value storage area 84z. Either value can be set in the gain level adjustment unit 74 instantaneously. The gain change performed when switching between the normal correction process and the fog correction process may also be performed by the gain level adjustment unit 69.
An imaging operation using the endoscope 10 in the endoscope system 5 having the above configuration will now be described.
First, a user (for example, a doctor performing surgery, or an assistant) performs an operation to change the setting values of each unit of the camera control device 15 in order to apply the fog correction process to the region 30 (see FIG. 9) detected within the correction detection area frame 20 in the image of the subject, including the affected part TG, displayed on the screen of the monitor 16. In accordance with a predetermined user operation, the control unit 83 issues a character display designation to the menu character generation unit 79 and causes the monitor 16 to display the setting change menu.
FIG. 6 is a diagram showing an example of the screen of the monitor 16 on which the setting change menu is displayed. For example, at setup time the setting change menu displayed on the screen of the monitor 16 includes menu items such as “CAMERA ID”, “ELC”, “SHUTTER”, “GAIN”, “NKEE”, and “END”. An item marked with a star, for example, indicates that a lower hierarchy exists. When the user presses the confirmation button 42 on an item marked with a star, the screen of the monitor 16 transitions to the lower hierarchy.
The user checks the setting change menu displayed on the screen of the monitor 16 and makes various settings and changes to the camera control device 15. For example, to perform color correction, the user selects “NKEE” from the setting change menu, whereupon the control unit 83 displays an NKEE (knee) screen (not shown) on the monitor 16. The NKEE screen is a first-hierarchy screen of the setting change menu and includes the items NKEE POINT and NKEE LEVEL. On the NKEE screen, the camera control device 15 displays the NKEE (knee) parameters to be adjusted.
When the user presses the confirmation button 42 on the “CAMERA ID”, “ELC”, “SHUTTER”, or “GAIN” item, the camera ID, illumination, shutter, or gain setting value, respectively, becomes adjustable or changeable on the screen of the monitor 16. When the confirmation button 42 is pressed on the “END” item, the setting change menu is erased from the screen of the monitor 16 and the screen returns to the original display.
In the present embodiment, the camera control device 15 sets, on the image of the subject including the affected part TG displayed on the screen of the monitor 16, a correction detection area frame 20 containing the region 30 to which the fog correction process is applied. When changing the setting value of the fog correction process (here, the gain setting value) applied to the region 30 detected within the correction detection area frame 20, the camera control device 15 determines which knee region the luminance of the region 30 falls into, adjusts the gain for each corresponding region, and, after this adjustment, synthesizes the signals of the regions.
FIG. 7 is a diagram illustrating an example of the timing shift during the fog correction process. In the figure, squares indicate frames, and black squares indicate frames of the same captured image. When the normal correction process or the fog correction process is applied to an input video image and the result is output as output video, the fog correction process is performed on the image of the preceding frame — that is, it is based on the statistical data obtained by the fog detection data addition/averaging/storage unit 78 — so the image after the fog correction process is output as output video delayed by at least one frame compared with the normal correction process.
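The one-frame lag described above can be sketched as a toy pipeline. Everything here (the function name, the `stats:` marker, the list-of-slots output) is an illustrative assumption, not the patent's signal chain; the point is only that statistics gathered from frame k can first influence the output in slot k+1:

```python
def run_pipeline(frames, fog_correction=True):
    """Sketch of the timing shift in FIG. 7 (illustrative only).

    Normal correction uses only the current frame, so frame k is
    emitted in output slot k.  Fog correction is driven by the
    statistics that unit 78 accumulated from the *previous* frame,
    so the corrected result for frame k cannot appear before
    slot k+1.
    """
    out = []
    prev_stats = None  # statistics held by unit 78
    for frame in frames:
        if fog_correction:
            # Correct using the previous frame's statistics; on the
            # very first frame no statistics exist yet, so nothing
            # corrected can be emitted.
            out.append(None if prev_stats is None else ("fog", prev_stats))
            prev_stats = "stats:" + frame  # computed for the next slot
        else:
            out.append(("normal", frame))
    return out

normal = run_pipeline(["f0", "f1", "f2"], fog_correction=False)
fog = run_pipeline(["f0", "f1", "f2"], fog_correction=True)
```

Comparing `normal` and `fog` shows the corrected stream trailing the uncorrected one by exactly one slot.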
When switching from the normal correction process to the fog correction process, if the user turns the head switch 19 on (for example, by pressing the head switch 19 an odd number of times), the camera control device 15 instantaneously switches the gain setting value to the correction-process setting value that gives the combination best suited to the fog correction process, in accordance with a predetermined algorithm. The user can therefore adjust the setting value of the fog correction process (here, the gain setting value) intuitively, without specialized knowledge.
FIG. 8 is a flowchart showing an example of the correction processing procedure of the camera control device 15. The control unit 83 in the camera control device 15 reads the state of the head switch 19 (in other words, whether the electric circuit inside the head switch 19 is on or off) (S1). The control unit 83 thereby detects whether the head switch 19 has been pressed by a user operation (S2). If the head switch 19 has not been pressed (S2, NO), the control unit 83 proceeds directly to step S4. If the head switch 19 has been pressed (S2, YES), the control unit 83 toggles whether the fog correction process is performed (S3).
The control unit 83 determines whether to perform the fog correction process (S4). If the head switch 19 is off, that is, if the fog correction process is not to be performed (S4, NO), the control unit 83 reads the setting values for the normal correction process (no fog correction) stored in the correction value storage area 84z of the data storage unit 84 (S5).
If the head switch 19 is on (S4, YES), that is, if the fog correction process is to be performed, the control unit 83 reads the setting values for the fog correction process stored in the correction value storage area 84z of the data storage unit 84 (S6). Here, as an example, the setting values of the various correction processes are gain setting values. In the normal correction process and the fog correction process, in addition to changing the gain setting value for the gain level adjustment unit 74, the gain setting value for the gain level adjustment unit 69 may also be changed.
The control unit 83 sets the gain setting values read in steps S5 and S6 in the gain level adjustment units 69 and 74 (S7). When a gain setting value is set in the gain level adjustment unit 69, the signal amplification processing unit 65 adjusts the gain of the video signal output from the preprocessing unit 71; when a gain setting value is set in the gain level adjustment unit 74, the correction processing unit 73 adjusts the gain of the color difference signal.
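The S1–S7 flow can be sketched as a small state machine. The class name, the gain numbers, and the dictionary layout are illustrative assumptions — only the step structure follows FIG. 8:

```python
class CameraControlSketch:
    """Minimal sketch of the FIG. 8 procedure (S1-S7).

    The stored gain values below stand in for the contents of the
    correction value storage area 84z; they are placeholders, not
    values from the patent.
    """

    def __init__(self):
        self.fog_mode = False
        # Correction value storage area 84z: precomputed gain sets
        # for the normal and fog correction processes.
        self.storage_84z = {"normal": {"gain_69": 1.0, "gain_74": 1.0},
                            "fog":    {"gain_69": 1.2, "gain_74": 1.8}}
        self.gain_69 = None
        self.gain_74 = None

    def process(self, head_switch_pressed):
        # S1/S2: read head switch 19; S3: toggle fog mode on a press.
        if head_switch_pressed:
            self.fog_mode = not self.fog_mode
        # S4-S6: read the stored setting values for the active mode.
        key = "fog" if self.fog_mode else "normal"
        gains = self.storage_84z[key]
        # S7: set gain level adjustment units 69 and 74 instantly.
        self.gain_69 = gains["gain_69"]
        self.gain_74 = gains["gain_74"]
        return key

ccu = CameraControlSketch()
mode_a = ccu.process(False)  # no press: stay in normal correction
mode_b = ccu.process(True)   # press: switch to fog correction
mode_c = ccu.process(True)   # press again: back to normal
```

Because the gain sets are precomputed and stored, switching modes is a dictionary lookup plus two assignments — which is why the description can call the switch "instantaneous".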
The encoder processing unit 80 receives the color difference signal corrected by the correction processing unit 73 and a signal in which the display area frame signal from the area display signal replacement mixing/selection unit 76 is mixed with the Y signal from the YC separation processing/RGB conversion unit 72, converts these signals into a video signal suited to the display format of the monitor 16, and outputs it to the monitor 16. The monitor 16 displays an image based on this video signal.
FIG. 9 is a schematic diagram showing an example of an image of the subject, including the affected part TG, captured by the endoscope 10 and displayed on the monitor 16. The correction detection area frame 20 is set by the user on the displayed image so as to include the affected part TG. When fog (water vapor or smoke) is generated within the correction detection area frame 20, the affected part TG becomes difficult for the user to see (monitor image on the left in the figure). If the user then turns the head switch 19 on, the fog correction process is performed on the region 30 in which fog has been detected. As a result, the affected part TG remains visible even while fog is present (monitor image on the right in the figure).
As described above, in the endoscope system 5 of the present embodiment, when no fog is generated at the affected part TG, the camera control device 15 performs the normal correction process on the image captured by the endoscope 10, so the image of the affected part TG is displayed with optimum characteristics (luminance characteristics, color characteristics). When fog is generated around the affected part TG by laser ablation or the like, the user turns the head switch 19 on, instructing the camera control device 15 to switch to the fog correction process for images captured by the endoscope 10. Although the fog correction process introduces a display delay of one frame or more in the image of the affected part shown on the monitor 16 compared with the normal correction process, the situation in which the affected part TG becomes invisible due to fog is avoided, and the user can continue the surgery without interruption.
Thus, in the endoscope system 5 of the present embodiment, the head switch 19, which can instantaneously switch between the fog correction process and the normal process applied to the video from the endoscope 10, is provided on the camera head 14. When fog is generated, an image in which a site such as the affected part can be visually recognized is therefore obtained quickly, even at the cost of a display delay. For example, when excising the affected part, the image can be checked without stopping the work and with little impact on the excision site. When no fog is generated, an image of a site such as the affected part is obtained with appropriate image quality.
In other words, when no fog is generated, there is no display delay due to image processing, as in the conventional case, and an image with luminance correction, color correction, and other characteristics optimal for imaging the affected part is obtained. Therefore, when mist such as vapor or smoke is generated during surgery — for example, when the affected part is removed by laser — the fog no longer appears in the video, and the surgery can continue without stopping the work and waiting for the fog to clear. In this way, the fog correction process, which has not conventionally been applied to images captured by an endoscope, can now be applied to them. Put differently, depending on the imaging conditions of the affected part inside the human body, the endoscope system 5 can output a fog-corrected image of a site such as the affected part when fog is generated, even with a display delay, and can output a high-quality image of the site without fog correction when no fog is generated, improving user convenience in either case.
In addition, because the endoscope system 5 of the present embodiment applies the fog correction process to only a part of the image including the affected part, the area whose image quality is slightly degraded is smaller than when the process is applied to the entire image.
The endoscope system 5 of the present embodiment can also shorten the time required for the fog correction process and suppress the display delay when fog occurs. Furthermore, the user can view the image of the part of the affected area where fog has occurred while comparing it with the images of other areas where no fog is present.
Furthermore, in the endoscope system 5 of the present embodiment, the head switch 19 is provided on the camera head 14, which contains the built-in image sensor, making it easy to operate for the person handling the endoscope.
Although various embodiments have been described above with reference to the drawings, it goes without saying that the present disclosure is not limited to these examples. It is evident that those skilled in the art can conceive of various changes and modifications within the scope described in the claims, and these naturally belong to the technical scope of the present disclosure.
For example, in the embodiment described above, the setting value changed during the fog correction process is the gain, but other parameters may be changed instead.
Also, in the embodiment described above, the fog correction process is performed on the region set within the correction detection area frame 20, but the fog correction process may instead be performed on the entire correction detection area frame 20.
The head switch 19 need not be a push-button switch that toggles between on and off each time it is pressed; it may, for example, be a push-button switch that is on only while pressed. Alternatively, the fog correction process may be performed when the push-button switch is off, and the normal correction process when it is on.
The present disclosure is useful as an endoscope system and an endoscope capable of performing a fog correction process on an image captured by an endoscope.
5 Endoscope system
10 Endoscope
11 Scope
11z Imaging window
12 Mount adapter
13 Relay lens
14 Camera head
14z Head extension cable
15 Camera control device
15z Signal cable
16 Monitor
17 Light source
17z Light guide cable
18 Light source input unit
19 Head switch
20 Correction detection area frame
30 Region
40 Camera setting change button
41 Left/right/up/down buttons
42 Confirmation button
50 Image sensor
51 Image sensor drive signal generation circuit
52, 60 Multiplex signal demodulation circuits
53, 61 Signal multiplex transmission circuits
65 Signal amplification processing unit
69, 74 Gain level adjustment units
71 Preprocessing unit
72 YC separation processing / RGB conversion unit
73 Correction processing unit
75 Area generation unit
76 Area display signal replacement mixing/selection unit
77 Area detection processing unit
78 Fog detection data addition/averaging/storage unit
79 Menu character generation unit
80 Encoder processing unit
81 Camera synchronization signal generation circuit
82 External communication unit
83 Control unit
84 Data storage unit
84z Correction value storage area
88 External switch
L Light
TG Affected part
Claims (4)
- An endoscope system comprising:
an endoscope including a scope section that has an imaging window at its tip and transmits an optical image incident through the imaging window, and a camera head section incorporating an image sensor that captures the optical image transmitted through the scope section;
a control device that performs a fog correction process on an image captured by the image sensor;
a display device that displays an image output from the control device; and
a switch that instructs the control device whether to execute the fog correction process,
wherein, in response to pressing of the switch, the control device executes the fog correction process on the image when execution of the fog correction process is instructed, and outputs the image without executing the fog correction process when non-execution of the fog correction process is instructed.
- The endoscope system according to claim 1, wherein the control device executes the fog correction process on a part of the image captured by the image sensor.
- The endoscope system according to claim 1, wherein the switch is provided on the camera head section of the endoscope.
- An endoscope comprising:
a scope section that has an imaging window at its tip and transmits an optical image incident through the imaging window;
a camera head section incorporating an image sensor that captures the optical image transmitted through the scope section; and
a switch, provided on the camera head section, that instructs whether to execute a fog correction process on an image captured by the image sensor.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015040386A JP2016158886A (en) | 2015-03-02 | 2015-03-02 | Endoscope and endoscope system |
JP2015-040386 | 2015-03-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016139884A1 true WO2016139884A1 (en) | 2016-09-09 |
Family
ID=56843690
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/000422 WO2016139884A1 (en) | 2015-03-02 | 2016-01-28 | Endoscope and endoscope system |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2016158886A (en) |
WO (1) | WO2016139884A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110392546A (en) * | 2017-03-07 | 2019-10-29 | 索尼公司 | Information processing equipment, auxiliary system and information processing method |
WO2023166742A1 (en) * | 2022-03-04 | 2023-09-07 | オリンパス株式会社 | Image processing device, treatment system, and image processing method |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018157917A (en) * | 2017-03-22 | 2018-10-11 | ソニー株式会社 | Control device, control method, control system, and program |
WO2018198255A1 (en) * | 2017-04-26 | 2018-11-01 | オリンパス株式会社 | Image processing device, operation method for image processing device, and program |
CN110944566B (en) | 2017-05-19 | 2021-12-21 | 奥林巴斯株式会社 | 3D endoscope apparatus and 3D image processing apparatus |
JP7616064B2 (en) | 2019-11-14 | 2025-01-17 | ソニーグループ株式会社 | Information processing device, generation method, and generation program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009125188A (en) * | 2007-11-21 | 2009-06-11 | Panasonic Corp | Endoscope device and endoscope camera device |
JP2011004929A (en) * | 2009-06-25 | 2011-01-13 | Hoya Corp | Endoscope apparatus |
JP2012239644A (en) * | 2011-05-19 | 2012-12-10 | Olympus Corp | Image processing apparatus, endoscope apparatus and image processing method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2933165B2 (en) * | 1988-01-08 | 1999-08-09 | オリンパス光学工業株式会社 | Electronic endoscope device |
JPH0530413A (en) * | 1991-07-01 | 1993-02-05 | Sharp Corp | Television camera system |
JPH06113195A (en) * | 1992-09-29 | 1994-04-22 | Canon Inc | Video camera device |
JPH06268915A (en) * | 1993-03-12 | 1994-09-22 | Hitachi Medical Corp | Digital picture processor |
JP5030335B2 (en) * | 2001-03-06 | 2012-09-19 | オリンパス株式会社 | Medical image display device |
JP4632716B2 (en) * | 2004-08-10 | 2011-02-16 | オリンパス株式会社 | Endoscopic optical system using sterilization durable composition and endoscope provided with the same |
JP2009201587A (en) * | 2008-02-26 | 2009-09-10 | Fujifilm Corp | Radiographic apparatus |
JP5379647B2 (en) * | 2009-10-29 | 2013-12-25 | オリンパス株式会社 | Imaging apparatus and image generation method |
- 2015-03-02 JP JP2015040386A patent/JP2016158886A/en active Pending
- 2016-01-28 WO PCT/JP2016/000422 patent/WO2016139884A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009125188A (en) * | 2007-11-21 | 2009-06-11 | Panasonic Corp | Endoscope device and endoscope camera device |
JP2011004929A (en) * | 2009-06-25 | 2011-01-13 | Hoya Corp | Endoscope apparatus |
JP2012239644A (en) * | 2011-05-19 | 2012-12-10 | Olympus Corp | Image processing apparatus, endoscope apparatus and image processing method |
Non-Patent Citations (1)
Title |
---|
TETSUO TANAKA ET AL.: "Development of Fog Image Correction Process for Network Cameras", PANASONIC TECHNICAL JOURNAL, vol. 59, no. 2, November 2013 (2013-11-01), pages 133 - 135 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110392546A (en) * | 2017-03-07 | 2019-10-29 | 索尼公司 | Information processing equipment, auxiliary system and information processing method |
WO2023166742A1 (en) * | 2022-03-04 | 2023-09-07 | オリンパス株式会社 | Image processing device, treatment system, and image processing method |
Also Published As
Publication number | Publication date |
---|---|
JP2016158886A (en) | 2016-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016139884A1 (en) | Endoscope and endoscope system | |
US10485629B2 (en) | Endoscope device | |
US8072505B2 (en) | Imaging apparatus | |
CN110168605B (en) | Video signal processing apparatus, video signal processing method, and computer readable medium for dynamic range compression | |
US10548465B2 (en) | Medical imaging apparatus and medical observation system | |
US20120201433A1 (en) | Image composition system | |
US11022859B2 (en) | Light emission control apparatus, light emission control method, light emission apparatus, and imaging apparatus | |
WO2018180573A1 (en) | Surgical image processing device, image processing method, and surgery system | |
US10952597B2 (en) | Endoscope apparatus and method of detecting edge | |
JP7016681B2 (en) | Endoscope system | |
JPWO2019167555A1 (en) | Video signal processing device, video signal processing method and imaging device | |
US11864732B2 (en) | Medical image processing device and medical observation system | |
JP6987513B2 (en) | Endoscope device | |
JP2012235962A (en) | Electric endoscope device | |
US11523065B2 (en) | Imaging device and gain setting method | |
JP6045794B2 (en) | Electronic endoscope system | |
JP5967952B2 (en) | Electronic endoscope system | |
JP5976342B2 (en) | Electronic endoscope system | |
US10778889B2 (en) | Image pickup apparatus, video signal processing apparatus, and video signal processing method | |
US11039067B2 (en) | Image pickup apparatus, video signal processing apparatus, and video signal processing method | |
US20190076006A1 (en) | Medical observation device and medical observation system | |
US11778325B2 (en) | Image processing apparatus, image processing method, and image processing program | |
JP2011152300A (en) | Electronic endoscope system | |
JP4493426B2 (en) | Electronic endoscope system | |
US11979670B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and program for blending plurality of image signals based on a peaking signal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16758585 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16758585 Country of ref document: EP Kind code of ref document: A1 |