US20180309933A1 - Image processing device, image processing method, program, and endoscope device - Google Patents

Image processing device, image processing method, program, and endoscope device

Info

Publication number
US20180309933A1
Authority
US
United States
Prior art keywords
frame
ordinary
unit
special
frames
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/020,756
Other versions
US10306147B2 (en
Inventor
Hisakazu Shiraki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US16/020,756 priority Critical patent/US10306147B2/en
Publication of US20180309933A1 publication Critical patent/US20180309933A1/en
Application granted granted Critical
Publication of US10306147B2 publication Critical patent/US10306147B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N5/23267
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G06K9/2018
    • G06K9/2027
    • G06T5/003
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143 Sensing or illuminating at different wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction
    • H04N23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20201 Motion blur correction
    • H04N2005/2255
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes

Definitions

  • imaging an ordinary image and a special image using time division is described in Japanese Unexamined Patent Application Publication No. 2007-313171.
  • performing composite display of the ordinary image and the special image is described in Japanese Unexamined Patent Application Publication No. 2012-24283.
  • an image processing device which includes an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • the image processing device may further include a feature extraction process unit which generates a feature extraction frame by subjecting the special frame to a feature extraction process
  • the motion correction unit may further subject the feature extraction frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors
  • the compositing unit may subject the ordinary frames to the image compositing process based on the feature extraction frame that is subjected to motion correction.
  • the compositing unit may subject the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process.
  • the compositing unit may add the motion-corrected special frame to the ordinary frames according to the motion-corrected feature extraction frame.
  • the compositing unit may subject the ordinary frames to a color conversion process according to the motion-corrected feature extraction frame.
  • the compositing unit may subject the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process according to a selection by a user.
  • the image processing device may further include a motion vector correction unit which corrects the detected motion vectors based on the plurality of motion vectors that are consecutively detected.
  • an image processing method performed by an image processing device.
  • the method includes inputting ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; detecting motion vectors of the object from a plurality of the ordinary frames with different imaging timing; subjecting the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and subjecting the ordinary frames to an image compositing process based on the special frame.
  • ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period are input; motion vectors of the object from a plurality of the ordinary frames with different imaging timing are detected; the special frame is subjected to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and the ordinary frames are subjected to an image compositing process based on the special frame.
  • an endoscope device which includes a light source unit which irradiates an object with ordinary light or special light; an imaging unit which consecutively images, at a predetermined ratio according to a predetermined frame period, ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • an object is irradiated with ordinary light or special light; ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light are consecutively imaged at a predetermined ratio according to a predetermined frame period; motion vectors of the object are detected from a plurality of the ordinary frames with different imaging timing; the special frame is subjected to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and the ordinary frames are subjected to an image compositing process based on the special frame.
  • FIG. 1 is a block diagram illustrating a configuration example of an endoscope device to which an embodiment of the present disclosure is applied;
  • FIG. 2 is a diagram illustrating imaging timing between ordinary frames and special frames
  • FIG. 3 is a block diagram illustrating a detailed configuration example of an image processing unit of FIG. 1 ;
  • FIG. 4 is a block diagram illustrating a detailed configuration example of a motion vector detection unit of FIG. 3 ;
  • FIG. 5 is a flowchart illustrating an image compositing process
  • FIG. 6 is a diagram illustrating an example of motion correction amount estimation
  • FIG. 7 is a diagram illustrating an impression of correcting dispersion in motion vectors based on a series of motion vectors
  • FIG. 8 is a diagram illustrating an impression of a superposing compositing process.
  • FIG. 9 is a block diagram illustrating a configuration example of a computer.
  • FIG. 1 illustrates a configuration example of an endoscope device, which is an embodiment of the present disclosure, that images an ordinary frame and a special frame using time division, accurately aligns and combines the frames, and displays a composite frame that is obtained as a result.
  • An endoscope device 10 is configured to include a light source unit 11 , an imaging unit 12 , a developing unit 13 , an image processing unit 14 , and a display unit 15 .
  • the light source unit 11 switches between ordinary light such as white light and special light that has a predetermined wavelength for each frame that is imaged, and irradiates the object (an organ or the like in the body) therewith.
  • the light source unit 11 outputs an irradiation identification signal indicating which of the ordinary light and the special light the object is irradiated with to the image processing unit 14 for each frame that is imaged.
  • an optical filter which transmits only a predetermined wavelength may be provided in the light path of the ordinary light.
  • the imaging unit 12 images the object in a state in which the ordinary light or the special light is radiated from the light source unit 11 , and outputs an image signal that is obtained as a result to the developing unit 13 .
  • the developing unit 13 subjects the image signal that is input thereto from the imaging unit 12 to a developing process such as a demosaic process, and outputs the image signal resulting from the process (the ordinary frame when the ordinary light is radiated, and the special frame when the special light is radiated) to the image processing unit 14.
  • in the special frame, the blood vessel or the lesion such as a tumor is made clearer in comparison to an ordinary frame; however, the entire frame is dark and there is much noise.
  • in the ordinary frame, the entire frame is bright in comparison to the special frame, and there is little noise; however, it is difficult to distinguish the blood vessel or the lesion such as a tumor.
  • the image processing unit 14 detects motion vectors using two ordinary frames with different imaging timing. By subjecting the special frame to a differential filter process, the image processing unit 14 generates a frame (hereinafter referred to as a differential filter frame) in which edge portions (specifically, the contours or the like of the blood vessel or the lesion, for example) within the image are emphasized. Furthermore, the image processing unit 14 performs motion correction on each of the special frame and the differential filter frame based on the motion vectors that are detected from the ordinary frame, combines the ordinary frame with the special frame and the differential filter frame which are subjected to the motion correction, and outputs the composite frame that is obtained as a result to the display unit 15 .
  • the display unit 15 displays the composite frame.
  • An example of the imaging timing of the ordinary frames and the special frames is illustrated in FIG. 2.
  • ordinary frames are imaged for several continuous frames, and a special frame is imaged periodically between the ordinary frames.
  • the imaging ratio of ordinary frames to special frames is set to 4:1.
  • Ta illustrates a timing at which an ordinary frame is imaged one frame before a special frame is imaged
  • Tb illustrates a timing at which a special frame is imaged
  • Tc, Td, and Te illustrate timings at which ordinary frames are imaged 1, 2, and 3 frames, respectively, after the special frame is imaged.
  • Ta to Te will be used in the description of the detection of motion vectors described later.
  • A configuration example of the image processing unit 14 is illustrated in FIG. 3.
  • the image processing unit 14 is configured to include a switching unit 21 , a motion vector detection unit 22 , a correction amount estimation unit 23 , a frame memory 24 , a differential filter processing unit 25 , a motion correction unit 26 , and a compositing unit 27 .
  • the ordinary frames and the special frames from the developing unit 13 of the previous stage are input to the switching unit 21, and the irradiation identification signal from the light source unit 11 is input to the switching unit 21, the motion vector detection unit 22, and the correction amount estimation unit 23.
  • the switching unit 21 determines whether or not the input from the developing unit 13 is a special frame based on the irradiation identification signal; when the input is not a special frame (is an ordinary frame), it outputs the ordinary frame to the motion vector detection unit 22 and the compositing unit 27, and when the input is a special frame, it outputs the special frame to the frame memory 24.
  • For each frame period, the motion vector detection unit 22 detects the motion vectors using two ordinary frames with different imaging timing, and outputs the detected motion vectors to the correction amount estimation unit 23.
  • the correction amount estimation unit 23 estimates the motion correction amounts of the special frame and the differential filter frame based on the motion vectors that are detected by the motion vector detection unit 22 , and outputs the estimated motion correction amounts to the motion correction unit 26 .
  • the correction amount estimation unit 23 is capable of correcting motion vectors which may be erroneously detected based on the motion vectors that are detected in succession, and is capable of estimating the motion correction amounts based on the corrected motion vectors.
  • the frame memory 24 holds the special frame that is input thereto from the switching unit 21 , and supplies the held special frame to the differential filter processing unit 25 and the motion correction unit 26 for each frame period.
  • the frame memory 24 updates the held special frame when the next special frame is input thereto from the switching unit 21 .
  • the differential filter processing unit 25 generates a feature extraction frame in which features in the image are emphasized by subjecting the special frame that is supplied thereto from the frame memory 24 to a differential filter process (for example, the Sobel filter process), and outputs the feature extraction frame to the motion correction unit 26 .
  • a differential filter frame in which the edge portions are emphasized is generated as the feature extraction frame.
  • the differential filter processing unit 25 may omit the differential filter process and output the result of the previous differential filter process to the motion correction unit 26 .
  • a process may be executed in which a region in which the variance or the dynamic range in a micro-block (of 3×3 pixels, for example) is greater than or equal to a threshold is extracted, and a feature extraction frame indicating the extraction results is generated.
  • a process may be executed in which a region in which the signal levels of pixels are within a specific threshold, that is, a region with specific RGB levels is extracted, and a feature extraction frame indicating the extraction results is generated.
  • a closed region (corresponding to a tumor or the like) may be subjected to a contour detection process such as snakes, and a feature extraction frame indicating the results may be generated.
  • the motion correction unit 26 subjects the special frame from the frame memory 24 to the motion correction based on the motion correction amounts that are input from the correction amount estimation unit 23 , subjects the differential filter frame from the differential filter processing unit 25 to the motion correction, and outputs the post-motion correction special frame and differential filter frame to the compositing unit 27 .
  • the compositing unit 27 includes a superposing unit 28 and a marking unit 29 , using the ordinary frame and the post-motion correction special frame and differential filter frame as the input, generates a composite frame by performing a superposing compositing process by the superposing unit 28 or a marking compositing process by the marking unit 29 , and outputs the composite frame to the display unit 15 of the subsequent stage.
  • the motion vector detection unit 22 is configured to include frame memories 31 and 32 , a frame selection unit 33 , a block matching unit 34 , and a vector correction unit 35 .
  • the ordinary frame that is input thereto from the switching unit 21 of the previous stage is input to the frame memory 31 and the frame selection unit 33 .
  • For each frame period, the frame memory 31 outputs the ordinary frame that is held therein until that point to the frame memory 32 and the frame selection unit 33, and updates the data held therein with the ordinary frame that is input from the switching unit 21 of the previous stage. In the same manner, for each frame period, the frame memory 32 outputs the ordinary frame that is held therein to the frame selection unit 33, and updates the data held therein with the ordinary frame that is input from the frame memory 31 of the previous stage.
  • the frame memory 31 outputs the ordinary frame that is held until that point to the subsequent stage, and clears the data that is held until that point.
  • since the frame memory 31 is not holding any data, the output to the subsequent stage is not performed.
  • the frame memory 32 outputs the ordinary frame that is held until that point to the subsequent stage, and clears the data that is held until that point.
  • the two ordinary frames that are input from the frame memories 31 and 32 are output to the block matching unit 34.
  • the block matching unit 34 detects the motion vectors between the two ordinary frames using a block matching process.
  • the vector correction unit 35 determines the relationship between the two ordinary frames that are used for the motion vectors based on the irradiation identification signal, corrects the detected motion vectors based on the relationship, and outputs the motion vectors to the correction amount estimation unit 23 .
  • the vector correction unit 35 does not perform the motion vector correction.
  • the frame memory 31 does not perform output.
  • the ordinary frame from the frame memory 32 that is one frame prior to the reference, and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33 , and the motion vectors are detected from the two ordinary frames.
  • the vector correction unit 35 multiplies each of the vertical and horizontal components of the detected motion vectors by 1/2.
  • the reference imaging timing is the Tc illustrated in FIG. 2
  • the reference ordinary frame from the frame memory 31 and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33 , and the motion vectors are detected from the two ordinary frames.
  • the vector correction unit 35 multiplies each of the vertical and horizontal components of the detected motion vectors by -1.
  • the ordinary frame from the frame memory 32 that is one frame prior to the reference, the reference ordinary frame from the frame memory 31 , and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33 , and the motion vectors are detected from the two ordinary frames from the frame memories 31 and 32 .
  • the vector correction unit 35 does not perform the motion vector correction.
  • the motion vectors that are corrected as described above are output from the vector correction unit 35 to the correction amount estimation unit 23 of the subsequent stage; the timing-dependent correction is summarized in the sketch below.
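The block matching and the timing-dependent vector correction lend themselves to a compact illustration. Below is a minimal sketch, assuming exhaustive SAD block matching on grayscale frames; the function names, block size, and search range are illustrative, not from the patent:

```python
import numpy as np

def block_matching(prev: np.ndarray, curr: np.ndarray,
                   block: int = 16, search: int = 8) -> np.ndarray:
    """Exhaustive SAD block matching: one (dy, dx) vector per block of `curr`."""
    h, w = curr.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            ref = curr[y0:y0 + block, x0:x0 + block].astype(np.float32)
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if 0 <= y1 and y1 + block <= h and 0 <= x1 and x1 + block <= w:
                        sad = np.abs(ref - prev[y1:y1 + block, x1:x1 + block]).sum()
                        if sad < best:
                            best, best_v = sad, (dy, dx)
            vectors[by, bx] = best_v
    return vectors

def correct_vectors(vectors: np.ndarray, timing: str) -> np.ndarray:
    """Scale the detected vectors according to which frame pair was matched."""
    if timing == "Ta":  # matched frames lie one before and one after the reference
        return vectors * 0.5
    if timing == "Tc":  # matched the reference against the following frame
        return vectors * -1.0
    return vectors      # Td, Te: two preceding ordinary frames, no correction
```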
  • FIG. 5 is a flowchart illustrating an image compositing process. The image compositing process is executed for each frame period.
  • In step S1, the switching unit 21 determines whether or not the input from the developing unit 13 is a special frame based on the irradiation identification signal, and when the input is a special frame, outputs the special frame to the frame memory 24. Conversely, when it is determined that the input is not a special frame (is an ordinary frame), the switching unit 21 outputs the ordinary frame to the motion vector detection unit 22 and the compositing unit 27.
  • In step S2, the frame memory 24 supplies the special frame that is held until that point to the differential filter processing unit 25 and the motion correction unit 26. Note that the frame memory 24 updates the held special frame when the next special frame is input thereto from the switching unit 21.
  • In step S3, the differential filter processing unit 25 generates a differential filter frame in which edge portions in the image are emphasized by subjecting the special frame that is supplied thereto from the frame memory 24 to a differential filter process (for example, the Sobel filter process such as the one illustrated in the following equation (1)), and outputs the differential filter frame to the motion correction unit 26.
  • Sobel_R(x, y) = Sobel_Rh(x, y) + Sobel_Rv(x, y)
  • Sobel_G(x, y) = Sobel_Gh(x, y) + Sobel_Gv(x, y)
  • Sobel_B(x, y) = Sobel_Bh(x, y) + Sobel_Bv(x, y) (1)
  • Here, the subscripts h and v denote the horizontal and vertical Sobel filter outputs, respectively.
  • R, G, and B in the equation (1) respectively correspond to levels in the R, G, and B planes of the special frame.
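A minimal numpy sketch of the per-plane differential filter of equation (1) follows. The 3×3 kernels and the absolute value taken on each directional response are assumptions; the text only names "the Sobel filter process":

```python
import numpy as np

# Standard 3x3 Sobel kernels (assumed; the patent does not spell them out).
KH = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32)
KV = KH.T

def _filter3(plane: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """3x3 'same' filtering with zero padding."""
    p = np.pad(plane.astype(np.float32), 1)
    out = np.zeros(plane.shape, dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * p[dy:dy + plane.shape[0], dx:dx + plane.shape[1]]
    return out

def differential_filter_frame(special: np.ndarray) -> np.ndarray:
    """Equation (1): Sobel_X = Sobel_Xh + Sobel_Xv for each of the R, G, B
    planes, taking the magnitude of each directional response."""
    return np.stack(
        [np.abs(_filter3(special[..., c], KH)) + np.abs(_filter3(special[..., c], KV))
         for c in range(3)],
        axis=-1)
```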
  • In step S4, the motion vector detection unit 22 detects the motion vectors using two ordinary frames with different imaging timing, and outputs the motion vectors to the correction amount estimation unit 23.
  • In step S5, the correction amount estimation unit 23 determines whether or not the detected motion vectors are less than or equal to a predetermined threshold, and when they are, the process proceeds to step S6 in order to use the motion vectors in the motion correction. Conversely, when the detected motion vectors are greater than the predetermined threshold, the motion vectors are not used in the motion correction. In this case, the image compositing process that corresponds to the present imaging timing ends.
  • In step S6, the correction amount estimation unit 23 estimates the motion correction amounts of the special frame and the differential filter frame based on the motion vectors that are detected by the motion vector detection unit 22, and outputs the estimated motion correction amounts to the motion correction unit 26.
  • the motion correction amounts H_x and H_y are computed as illustrated in the following equation (2).
  • V_x and V_y are the motion vectors that are detected and corrected.
  • In the correction amount estimation unit 23, it is also possible, as another motion correction amount estimation method, to correct the dispersion in the motion vectors and subsequently estimate the motion correction amounts based on the series of motion vectors, as described hereinafter.
  • FIG. 6 is a diagram illustrating a process flow in which the dispersion in the motion vectors is corrected and the motion correction amounts are subsequently estimated based on the series of motion vectors.
  • FIG. 7 illustrates an impression of correcting the dispersion in the motion vectors based on the series of motion vectors.
  • the motion vectors (V′_x,t and V′_y,t) in relation to the imaging timing t are estimated as illustrated in the following equation (3).
  • V′_x,t = a_x·t^3 + b_x·t^2 + c_x·t + d_x
  • V′_y,t = a_y·t^3 + b_y·t^2 + c_y·t + d_y (3)
  • the motion correction amounts H_x and H_y are computed using the following equation (4) by substituting the motion vectors of the equation (2) with the estimated motion vectors (V′_x,t and V′_y,t).
  • the coefficients (a_x, b_x, c_x, and d_x) and (a_y, b_y, c_y, and d_y) in the equation (3) can be calculated using the least squares method from the detected motion vectors (V_x,1, V_y,1), …, (V_x,t, V_y,t).
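Since equation (3) is an ordinary cubic least-squares fit over the vector history, it maps directly onto numpy's polynomial routines. A minimal sketch, assuming at least four past vectors are available; the history representation is illustrative:

```python
import numpy as np

def smooth_motion_vector(vx_hist, vy_hist):
    """Fit a cubic polynomial (equation (3)) to the detected vector history
    by least squares and return the smoothed vector (V'_x,t, V'_y,t) at the
    latest timing t. Requires at least four samples for a well-posed fit."""
    t = np.arange(1, len(vx_hist) + 1, dtype=np.float64)
    cx = np.polyfit(t, np.asarray(vx_hist, dtype=np.float64), deg=3)  # a_x, b_x, c_x, d_x
    cy = np.polyfit(t, np.asarray(vy_hist, dtype=np.float64), deg=3)  # a_y, b_y, c_y, d_y
    return np.polyval(cx, t[-1]), np.polyval(cy, t[-1])

# Example: a noisy, slowly varying horizontal motion is smoothed before use.
vx = [1.0, 1.2, 0.9, 1.1, 3.5, 1.0]  # 3.5 is a likely misdetection
vy = [0.0, 0.1, 0.0, -0.1, 0.0, 0.1]
print(smooth_motion_vector(vx, vy))
```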
  • After the motion correction amounts are estimated as described above, the process proceeds to step S7.
  • In step S7, the motion correction unit 26 subjects the special frame from the frame memory 24 to the motion correction based on the motion correction amounts that are input from the correction amount estimation unit 23, subjects the differential filter frame from the differential filter processing unit 25 to the same motion correction, and outputs the post-motion correction special frame and differential filter frame to the compositing unit 27.
  • the ordinary frame, the special frame, and the differential filter frame that are input to the compositing unit 27 become frames in which the object is accurately aligned in relation to each other.
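The patent does not fix the form of the warp applied with the correction amounts H_x and H_y; a minimal sketch that treats them as a whole-frame translation (an assumption) is:

```python
import numpy as np

def translate(frame: np.ndarray, hx: float, hy: float) -> np.ndarray:
    """Shift `frame` by the motion correction amounts (H_x, H_y), rounding to
    whole pixels and filling the uncovered border with zeros."""
    hx, hy = int(round(hx)), int(round(hy))
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    out[max(hy, 0):min(h + hy, h), max(hx, 0):min(w + hx, w)] = \
        frame[max(-hy, 0):min(h - hy, h), max(-hx, 0):min(w - hx, w)]
    return out
```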
  • In step S8, the compositing unit 27 generates a composite frame by subjecting the ordinary frame and the post-motion correction special frame and differential filter frame to the superposing compositing process or the marking compositing process according to the selection from the user, and outputs the composite frame to the display unit 15 of the subsequent stage.
  • O(x,y) is a pixel value of the composite frame
  • N(x,y) is a pixel value of the ordinary frame
  • Sobel(x,y) is a pixel value of the post-motion correction differential filter frame
  • I(x,y) is a pixel value of the post-motion correction special frame.
  • C_0 and C_1 are coefficients that control the degree of superposition and may be arbitrarily set by the user.
  • FIG. 8 illustrates an impression of the superposing compositing process described above.
  • According to the superposing compositing process, it is possible to obtain the composite frame in which the special frame and the differential filter frame are accurately aligned to the ordinary frame, and the edges of a portion to be focused on (a blood vessel, a lesion, or the like) are emphasized and superposed.
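The superposing equation itself does not survive in this text, so the following sketch is only one plausible reading of it: the motion-corrected special frame I and differential filter frame Sobel are added to the ordinary frame N under the user coefficients C_0 and C_1. The exact formula is an assumption:

```python
import numpy as np

def superpose(n, i, sobel, c0=0.5, c1=0.5):
    """Hypothetical superposing compositing:
        O(x, y) = N(x, y) + C_0 * I(x, y) + C_1 * Sobel(x, y)
    where n, i, and sobel are the aligned ordinary, special, and differential
    filter frames (the patent's superposing equation is not reproduced here)."""
    o = (n.astype(np.float32)
         + c0 * i.astype(np.float32)
         + c1 * sobel.astype(np.float32))
    return np.clip(o, 0, 255).astype(np.uint8)
```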
  • the ordinary frame is subjected to pseudo color conversion using a color matrix process according to color conversion coefficients C that are multiplied by the pixel values of the differential filter frame.
  • O(x,y) is a pixel value of the composite frame
  • N(x,y) is a pixel value of the ordinary frame
  • Sobel(x,y) is a pixel value of the post-motion correction differential filter frame
  • C is a color conversion coefficient.
  • Note that the post-motion correction special frame is not used in the marking compositing process.
  • the edges of the blood vessel or the like are more strongly subjected to the color conversion, and the other regions are not significantly subjected to the color conversion. Accordingly, it is possible to obtain a composite frame in which only the edges of the blood vessel or the like stand out.
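A minimal sketch of the marking compositing process as described: a color matrix process converts the ordinary frame, and the conversion is applied strongly where the differential filter frame is large (edges) and weakly elsewhere. The example matrix, the weight normalization, and the blend rule are assumptions:

```python
import numpy as np

# Example color matrix that pushes marked pixels toward green (an assumption).
COLOR_MATRIX = np.array([[0.2, 0.0, 0.0],
                         [0.0, 1.5, 0.0],
                         [0.0, 0.0, 0.2]], dtype=np.float32)

def marking(n, sobel, c=1.0 / 255.0):
    """Hypothetical marking compositing: pseudo color conversion of the
    ordinary frame n, weighted per pixel by the differential filter frame."""
    nf = n.astype(np.float32)
    converted = nf @ COLOR_MATRIX.T                       # color matrix process
    w = np.clip(c * sobel.astype(np.float32), 0.0, 1.0)   # edge-driven weight
    return np.clip((1.0 - w) * nf + w * converted, 0, 255).astype(np.uint8)
```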
  • In the endoscope device 10, since the motion vectors are detected using only the ordinary frames and the motion correction amounts are estimated after correcting the detected motion vectors, it is possible to accurately execute the motion correction of the special frame and the differential filter frame. Accordingly, since it is possible to accurately align the information of the special frame of the blood vessel, the tumor, or the like in relation to the ordinary frame, it is possible to allow the user (a medical practitioner performing an operation, or the like) to accurately and clearly visually recognize a tumor portion to be removed and the blood vessel portion not to be removed.
  • Since the composite frame that is presented to the user is created based on the ordinary frame, a composite frame that is bright with little noise in comparison to the special frame can be presented to the user.
  • the series of processes described above can be executed using hardware, and can also be executed using software.
  • when the series of processes is executed using software, the program configuring the software is installed on a computer.
  • examples of the computer include a computer embedded within dedicated hardware, and an ordinary personal computer or the like which is capable of executing the various functions due to various programs that are installed thereon.
  • FIG. 9 is a block diagram illustrating a configuration example of the hardware of the computer which executes the series of processes described above using a program.
  • A central processing unit (CPU) 101, a read only memory (ROM) 102, and a random access memory (RAM) 103 are connected to each other by a bus 104.
  • an input-output interface 105 is connected to the bus 104 .
  • the input-output interface 105 is connected to an input unit 106 , an output unit 107 , a storage unit 108 , a communication unit 109 , and a drive 110 .
  • the input unit 106 is formed of a keyboard, a mouse, a microphone, and the like.
  • the output unit 107 is formed of a display, a speaker, and the like.
  • the storage unit 108 is formed of a hard disk, non-volatile memory, or the like.
  • the communication unit 109 is formed of a network interface or the like.
  • the drive 110 drives a removable medium 111 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
  • the series of processes described above is performed by, for example, the CPU 101 loading the program stored in the storage unit 108 into the RAM 103 via the input-output interface 105 and the bus 104, and executing the loaded program.
  • the computer 100 may be a so-called cloud computer that is connected via the Internet, for example.
  • the program which is executed by the computer 100 may be a program in which the processes are performed in time series in the order described in the present specification.
  • the program may be a program in which the processes are performed in parallel or at the necessary timing such as when the process is called.
  • the present disclosure may adopt the following configurations.
  • An image processing device including an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • the image processing device further including a feature extraction process unit which generates a feature extraction frame by subjecting the special frame to a feature extraction process, in which the motion correction unit further subjects the feature extraction frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors, and in which the compositing unit subjects the ordinary frames to the image compositing process based on the feature extraction frame that is subjected to motion correction.
  • the image processing device according to any one of (1) to (7), further including a motion vector correction unit which corrects the detected motion vectors based on the plurality of motion vectors that are consecutively detected.
  • An image processing method performed by an image processing device including inputting ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; detecting motion vectors of the object from a plurality of the ordinary frames with different imaging timing; subjecting the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and subjecting the ordinary frames to an image compositing process based on the special frame.
  • An endoscope device including a light source unit which irradiates an object with ordinary light or special light; an imaging unit which consecutively images, at a predetermined ratio according to a predetermined frame period, ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Endoscopes (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An image processing device includes an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/618,240, filed Feb. 10, 2015, which claims the benefit of Japanese Priority Patent Application No. JP 2014-048336 filed Mar. 12, 2014, the entire contents of which are incorporated herein by reference.
  • BACKGROUND
  • The present disclosure relates to an image processing device, an image processing method, a program, and an endoscope device. In particular, the present disclosure relates to an image processing device, an image processing method, a program, and an endoscope device, each of which is capable of combining and displaying an ordinary image which is imaged by irradiating a human body with ordinary light such as white light, and a special image which is obtained by irradiating the human body with a special light and illustrates the position of blood vessels.
  • In the related art, for example, with the intention of usage in a medical setting, various technologies are proposed in which an ordinary image of an organ or the like that is imaged by an endoscope device is combined with a special image that represents the position of blood vessels or a lesion such as a tumor, which are difficult to discern in the ordinary image, and the result is displayed.
  • For example, imaging an ordinary image and a special image using time division is described in Japanese Unexamined Patent Application Publication No. 2007-313171. As another example, performing composite display of the ordinary image and the special image is described in Japanese Unexamined Patent Application Publication No. 2012-24283.
  • Here, the term “ordinary image” indicates an image which is imaged by irradiating an organ or the like that serves as the object with ordinary light such as white light. Hereinafter, the ordinary image will also be referred to as an ordinary frame. The term “special image” indicates an image which is imaged by irradiating the object with special light of a predetermined wavelength different from that of the ordinary light. Hereinafter, the special image will also be referred to as the special frame. Note that, when imaging the special image, there is a case in which a fluorescent agent or the like which reacts to the irradiation of the special light is mixed into or applied to the blood vessel (the blood) or the lesion that serves as the object.
  • SUMMARY
  • Since the ordinary frame and the special frame that are imaged using time division have a shift in imaging timing between them, when there is hand shaking or the object moves, there is a likelihood that the alignment of the ordinary frame with the special frame may not be performed accurately.
  • Note that, technology also exists which carries out the compositing after detecting motion vectors between the ordinary frame and the special frame that are imaged using time division, and performing motion correction based on the motion vectors. However, since the imaging conditions differ between the ordinary frame and the special frame, errors occur easily in block matching when detecting the motion vectors, and it is difficult to accurately detect the motion vectors.
  • It is desirable to enable the accurate alignment and combination of an ordinary frame and a special frame that are imaged using time division.
  • According to a first embodiment of the present disclosure, there is provided an image processing device which includes an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • In the image processing device, the image processing device may further include a feature extraction process unit which generates a feature extraction frame by subjecting the special frame to a feature extraction process, the motion correction unit may further subject the feature extraction frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors, and the compositing unit may subject the ordinary frames to the image compositing process based on the feature extraction frame that is subjected to motion correction.
  • In the image processing device, the feature extraction process unit may generate a differential filter frame as the feature extraction frame by subjecting the special frame to a differential filter process.
  • In the image processing device, the compositing unit may subject the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process.
  • In the image processing device, as the superposing compositing process, the compositing unit may add the motion-corrected special frame to the ordinary frames according to the motion-corrected feature extraction frame.
  • In the image processing device, as the marking compositing process, the compositing unit may subject the ordinary frames to a color conversion process according to the motion-corrected feature extraction frame.
  • In the image processing device, the compositing unit may subject the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process according to a selection by a user.
  • In the image processing device, the image processing device may further include a motion vector correction unit which corrects the detected motion vectors based on the plurality of motion vectors that are consecutively detected.
  • According to a first embodiment of the present disclosure, there is provided an image processing method performed by an image processing device. The method includes inputting ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; detecting motion vectors of the object from a plurality of the ordinary frames with different imaging timing; subjecting the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and subjecting the ordinary frames to an image compositing process based on the special frame.
  • According to a first embodiment of the present disclosure, there is provided a program for causing a computer to function as an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • In the first embodiments of the present disclosure, ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period are input; motion vectors of the object from a plurality of the ordinary frames with different imaging timing are detected; the special frame is subjected to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and the ordinary frames are subjected to an image compositing process based on the special frame.
  • According to a second embodiment of the present disclosure, there is provided an endoscope device which includes a light source unit which irradiates an object with ordinary light or special light; an imaging unit which consecutively images, at a predetermined ratio according to a predetermined frame period, ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • In the second embodiment of the present disclosure, an object is irradiated with ordinary light or special light; ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light are consecutively imaged at a predetermined ratio according to a predetermined frame period; motion vectors of the object are detected from a plurality of the ordinary frames with different imaging timing; the special frame is subjected to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and the ordinary frames are subjected to an image compositing process based on the special frame.
  • According to the first embodiments of the present disclosure, it is possible to accurately align and combine ordinary frames and a special frame that are imaged using time division.
  • According to the second embodiment of the present disclosure, it is possible to image ordinary frames and a special frame using time division, and to accurately align and combine the frames.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating a configuration example of an endoscope device to which an embodiment of the present disclosure is applied;
  • FIG. 2 is a diagram illustrating imaging timing between ordinary frames and special frames;
  • FIG. 3 is a block diagram illustrating a detailed configuration example of an image processing unit of FIG. 1;
  • FIG. 4 is a block diagram illustrating a detailed configuration example of a motion vector detection unit of FIG. 3;
  • FIG. 5 is a flowchart illustrating an image compositing process;
  • FIG. 6 is a diagram illustrating an example of motion correction amount estimation;
  • FIG. 7 is a diagram illustrating an impression of correcting dispersion in motion vectors based on a series of motion vectors;
  • FIG. 8 is a diagram illustrating an impression of a superposing compositing process; and
  • FIG. 9 is a block diagram illustrating a configuration example of a computer.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Hereafter, detailed description will be given of a favorable embodiment for realizing the present disclosure (referred to below as the “embodiment”) with reference to the drawings.
  • Configuration Example of Endoscope Device
  • FIG. 1 illustrates a configuration example of an endoscope device, which is an embodiment of the present disclosure, that images an ordinary frame and a special frame using time division, accurately aligns and combines the frames, and displays a composite frame that is obtained as a result.
  • An endoscope device 10 is configured to include a light source unit 11, an imaging unit 12, a developing unit 13, an image processing unit 14, and a display unit 15.
  • The light source unit 11 switches between ordinary light such as white light and special light that has a predetermined wavelength for each frame that is imaged, and irradiates the object (an organ or the like in the body) therewith. The light source unit 11 outputs an irradiation identification signal indicating which of the ordinary light and the special light the object is irradiated with to the image processing unit 14 for each frame that is imaged. Note that, when irradiating the object with the special light, an optical filter which transmits only a predetermined wavelength may be provided in the light path of the ordinary light.
  • The imaging unit 12 images the object in a state in which the ordinary light or the special light is radiated from the light source unit 11, and outputs an image signal that is obtained as a result to the developing unit 13. The developing unit 13 subjects the image signal that is input thereto from the imaging unit 12 to a developing process such as a demosaic process, and outputs the image signal resulting from the process (the ordinary frame when the ordinary light is radiated, and the special frame when the special light is radiated) to the image processing unit 14.
  • Here, in the special frame, the blood vessel or the lesion such as a tumor is made clearer in comparison to an ordinary frame; however, the entire frame is dark and there is much noise.
  • Meanwhile, in the ordinary frame, the entire frame is bright in comparison to the special frame, and there is little noise; however, it is difficult to distinguish the blood vessel or the lesion such as a tumor.
  • The image processing unit 14 detects motion vectors using two ordinary frames with different imaging timing. By subjecting the special frame to a differential filter process, the image processing unit 14 generates a frame (hereinafter referred to as a differential filter frame) in which edge portions (specifically, the contours or the like of the blood vessel or the lesion, for example) within the image are emphasized. Furthermore, the image processing unit 14 performs motion correction on each of the special frame and the differential filter frame based on the motion vectors that are detected from the ordinary frame, combines the ordinary frame with the special frame and the differential filter frame which are subjected to the motion correction, and outputs the composite frame that is obtained as a result to the display unit 15.
  • The display unit 15 displays the composite frame.
  • Imaging Timing of Ordinary Frame and Special Frame
  • Next, an example of the imaging timing of the ordinary frames and the special frames is illustrated in FIG. 2.
  • In the endoscope device 10, ordinary frames are imaged for several continuous frames, and a special frame is imaged periodically between the ordinary frames. For example, as illustrated in FIG. 2, the imaging ratio of ordinary frames to special frames is set to 4:1.
  • However, the ratio is not limited to 4:1, and may be variable. In FIG. 2, Ta illustrates a timing at which an ordinary frame is imaged one frame before a special frame is imaged, Tb illustrates a timing at which a special frame is imaged, and Tc, Td, and Te illustrate timings at which ordinary frames are imaged 1, 2, and 3 frames, respectively, after the special frame is imaged. Ta to Te will be used in the description of the detection of motion vectors described later.
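A minimal sketch of the 4:1 schedule and the per-frame irradiation identification signal; the generator and its naming are illustrative, not from the patent:

```python
def irradiation_schedule(num_frames: int, ratio: int = 4):
    """Yield (frame_index, is_special): `ratio` ordinary frames are imaged,
    then one special frame, repeating (4:1 by default). `is_special` plays
    the role of the irradiation identification signal for each frame."""
    for n in range(num_frames):
        yield n, (n % (ratio + 1)) == ratio

# Frames 0-3 are ordinary, frame 4 is special (Tb), frame 5 is ordinary (Tc), ...
print(list(irradiation_schedule(10)))
```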
  • Configuration Example of Image Processing Unit 14
  • Next, a configuration example of the image processing unit 14 is illustrated in FIG. 3.
  • The image processing unit 14 is configured to include a switching unit 21, a motion vector detection unit 22, a correction amount estimation unit 23, a frame memory 24, a differential filter processing unit 25, a motion correction unit 26, and a compositing unit 27.
  • In the image processing unit 14, the ordinary frames and the special frames from the developing unit 13 of the previous stage are input to the switching unit 21, and the irradiation identification signal from the light source unit 11 is input to the switching unit 21, the motion vector detection unit 22, and the correction amount estimation unit 23.
  • The switching unit 21 determines whether or not the input from the developing unit 13 is a special frame based on the irradiation identification signal; when the input is not a special frame (is an ordinary frame), it outputs the ordinary frame to the motion vector detection unit 22 and the compositing unit 27, and when the input is a special frame, it outputs the special frame to the frame memory 24.
  • For each frame period, the motion vector detection unit 22 detects the motion vectors using two ordinary frames with different imaging timing, and outputs the detected motion vectors to the correction amount estimation unit 23.
  • The correction amount estimation unit 23 estimates the motion correction amounts of the special frame and the differential filter frame based on the motion vectors that are detected by the motion vector detection unit 22, and outputs the estimated motion correction amounts to the motion correction unit 26. Note that, the correction amount estimation unit 23 is capable of correcting motion vectors which may be erroneously detected based on the motion vectors that are detected in succession, and is capable of estimating the motion correction amounts based on the corrected motion vectors.
  • The frame memory 24 holds the special frame that is input thereto from the switching unit 21, and supplies the held special frame to the differential filter processing unit 25 and the motion correction unit 26 for each frame period. The frame memory 24 updates the held special frame when the next special frame is input thereto from the switching unit 21.
  • The differential filter processing unit 25 generates a feature extraction frame in which features in the image are emphasized by subjecting the special frame that is supplied thereto from the frame memory 24 to a differential filter process (for example, the Sobel filter process), and outputs the feature extraction frame to the motion correction unit 26. Note that, in the case of the differential filter process, a differential filter frame in which the edge portions are emphasized is generated as the feature extraction frame. As described above, since the frame memory 24 supplies the special frame held therein for each frame period, the same special frames are consecutively supplied. In this case, the differential filter processing unit 25 may omit the differential filter process and output the result of the previous differential filter process to the motion correction unit 26.
  • Note that, as described above, instead of generating the differential filter frame using the differential filter process, for example, a process may be executed in which a region in which the variance or the dynamic range in a micro-block (of 3×3 pixels, for example) is greater than or equal to a threshold is extracted, and a feature extraction frame indicating the extraction results is generated. As another example, a process may be executed in which a region in which the signal levels of the pixels are within a specific range, that is, a region with specific RGB levels, is extracted, and a feature extraction frame indicating the extraction results is generated. As still another example, a closed region (corresponding to a tumor or the like) may be subjected to a contour detection process such as snakes, and a feature extraction frame indicating the results may be generated.
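  • A minimal sketch of the variance-based alternative is shown below, assuming the special frame is given as a 2-D numpy array; the 3×3 micro-block matches the example above, while the threshold value is an arbitrary placeholder.

import numpy as np

def variance_feature_frame(special, threshold=100.0):
    # Flag each pixel whose surrounding 3x3 micro-block has a variance greater
    # than or equal to the threshold; the result serves as a feature extraction frame.
    h, w = special.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if special[y - 1:y + 2, x - 1:x + 2].var() >= threshold:
                out[y, x] = 255
    return out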
  • The motion correction unit 26 subjects the special frame from the frame memory 24 to the motion correction based on the motion correction amounts that are input from the correction amount estimation unit 23, subjects the differential filter frame from the differential filter processing unit 25 to the motion correction, and outputs the post-motion correction special frame and differential filter frame to the compositing unit 27.
  • The compositing unit 27 includes a superposing unit 28 and a marking unit 29. Taking the ordinary frame and the post-motion correction special frame and differential filter frame as input, the compositing unit 27 generates a composite frame through either a superposing compositing process performed by the superposing unit 28 or a marking compositing process performed by the marking unit 29, and outputs the composite frame to the display unit 15 of the subsequent stage.
  • Configuration Example of Motion Vector Detection Unit 22
  • Next, a configuration example of the motion vector detection unit 22 is illustrated in FIG. 4. The motion vector detection unit 22 is configured to include frame memories 31 and 32, a frame selection unit 33, a block matching unit 34, and a vector correction unit 35.
  • In the motion vector detection unit 22, the ordinary frame that is input thereto from the switching unit 21 of the previous stage is input to the frame memory 31 and the frame selection unit 33.
  • For each frame period, the frame memory 31 outputs the ordinary frame that is held therein until that point to the frame memory 32 and the frame selection unit 33, and updates the data held therein until that point with the ordinary frame that is input from the switching unit 21 of the previous stage. In the same manner, for each frame period, the frame memory 32 outputs the ordinary frame that is held therein to the frame selection unit 33, and updates the data held therein with the ordinary frame that is input from the frame memory 31 of the previous stage.
  • However, in a frame period in which no ordinary frame is input to the motion vector detection unit 22 (that is, at the imaging timing of a special frame), the frame memory 31 outputs the ordinary frame that is held until that point to the subsequent stage, and clears the data that is held until that point.
  • At the next timing, since the frame memory 31 is not holding any data, the output to the subsequent stage is not performed. The frame memory 32 outputs the ordinary frame that is held until that point to the subsequent stage, and clears the data that is held until that point.
  • Therefore, two or three ordinary frames with different imaging timing are input to the frame selection unit 33 at the same time.
  • When two ordinary frames are input to the frame selection unit 33 at the same time, the two ordinary frames are output to the block matching unit 34. When three ordinary frames are input to the frame selection unit 33 at the same time, the two ordinary frames that are input from the frame memories 31 and 32 are output to the block matching unit 34. The block matching unit 34 detects the motion vectors between the two ordinary frames using a block matching process.
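  • One possible realization of the block matching process is sketched below as an exhaustive search over a sum-of-absolute-differences (SAD) criterion; the block size, search range, and matching criterion are assumptions, since the patent does not fix them.

import numpy as np

def block_match(prev, curr, block=16, search=8):
    # Return an array of (dy, dx) motion vectors, one per block, from prev to curr.
    h, w = prev.shape
    vectors = []
    for by in range(0, h - block + 1, block):
        row = []
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block].astype(np.float64)
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y and y + block <= h and 0 <= x and x + block <= w:
                        cand = curr[y:y + block, x:x + block].astype(np.float64)
                        sad = np.abs(ref - cand).sum()  # sum of absolute differences
                        if sad < best:
                            best, best_v = sad, (dy, dx)
            row.append(best_v)
        vectors.append(row)
    return np.array(vectors)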
  • The vector correction unit 35 determines the relationship between the two ordinary frames that are used for the motion vectors based on the irradiation identification signal, corrects the detected motion vectors based on the relationship, and outputs the motion vectors to the correction amount estimation unit 23.
  • Detailed description will be given of the correction of the motion vectors by the vector correction unit 35. If the output from the frame memory 31 is used as a reference, when the reference imaging timing is Ta illustrated in FIG. 2, the ordinary frame from the frame memory 32 that is one frame prior to the reference and the reference ordinary frame from the frame memory 31 are input to the frame selection unit 33, and the motion vectors are detected from the two ordinary frames. In this case, the vector correction unit 35 does not perform the motion vector correction.
  • When the reference imaging timing is Tb illustrated in FIG. 2, since Tb is the imaging timing of the special frame, the frame memory 31 does not perform output. The ordinary frame from the frame memory 32 that is one frame prior to the reference and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33, and the motion vectors are detected from the two ordinary frames. In this case, since the detected motion vectors span ordinary frames that are two frame periods apart, the vector correction unit 35 multiplies each of the vertical and horizontal components of the detected motion vectors by ½.
  • When the reference imaging timing is Tc illustrated in FIG. 2, the reference ordinary frame from the frame memory 31 and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33, and the motion vectors are detected from the two ordinary frames. In this case, since the direction of the detected motion vectors is reversed, the vector correction unit 35 multiplies each of the vertical and horizontal components of the detected motion vectors by −1.
  • When the reference imaging timing is Td or Te illustrated in FIG. 2, the ordinary frame from the frame memory 32 that is one frame prior to the reference, the reference ordinary frame from the frame memory 31, and the ordinary frame from the switching unit 21 that is one frame after the reference are input to the frame selection unit 33, and the motion vectors are detected from the two ordinary frames from the frame memories 31 and 32. In this case, the vector correction unit 35 does not perform the motion vector correction.
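  • The correction rules above amount to a small piece of per-timing logic, sketched here for illustration only; the timing labels follow FIG. 2 and the scale factors follow the text.

def correct_vector(vx, vy, timing):
    if timing == "Tb":   # vectors span two frame periods: halve each component
        return 0.5 * vx, 0.5 * vy
    if timing == "Tc":   # frame order is reversed: negate each component
        return -vx, -vy
    return vx, vy        # Ta, Td, Te: no correction is performed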
  • The motion vectors that are corrected as described above are output from the vector correction unit 35 to the correction amount estimation unit 23 of the subsequent stage.
  • Image Compositing Process by Image Processing Unit 14
  • Next, description will be given of the image compositing process by the image processing unit 14 with reference to FIG. 5.
  • FIG. 5 is a flowchart illustrating an image compositing process. The image compositing process is executed for each frame period.
  • In step S1, the switching unit 21 determines, based on the irradiation identification signal, whether or not the input from the developing unit 13 is a special frame. When the input is a special frame, the switching unit 21 outputs it to the frame memory 24. Conversely, when the input is an ordinary frame, the switching unit 21 outputs it to the motion vector detection unit 22 and the compositing unit 27.
  • In step S2, the frame memory 24 supplies the special frame that is held until that point to the differential filter processing unit 25 and the motion correction unit 26. Note that, the frame memory 24 updates the held special frame when the special frame is input thereto from the switching unit 21.
  • In step S3, the differential filter processing unit 25 generates a differential filter frame in which edge portions in the image are emphasized by subjecting the special frame that is supplied thereto from the frame memory 24 to a differential filter process (for example, the Sobel filter process illustrated in the following equation (1)), and outputs the differential filter frame to the motion correction unit 26.

  • SobelRh(x,y)=|−R(x−1,y−1)−2R(x−1,y)−R(x−1,y+1)+R(x+1,y−1)+2R(x+1,y)+R(x+1,y+1)|

  • SobelRv(x,y)=|−R(x−1,y−1)−2R(x,y−1)−R(x+1,y−1)+R(x−1,y+1)+2R(x,y+1)+R(x+1,y+1)|

  • SobelR(x,y)=SobelRh(x,y)+SobelRv(x,y)

  • SobelGh(x,y)=|−G(x−1,y−1)−2G(x−1,y)−G(x−1,y+1)+G(x+1,y−1)+2G(x+1,y)+G(x+1,y+1)|

  • SobelGv(x,y)=|−G(x−1,y−1)−2G(x,y−1)−G(x+1,y−1)+G(x−1,y+1)+2G(x,y+1)+G(x+1,y+1)|

  • SobelG(x,y)=SobelGh(x,y)+SobelGv(x,y)

  • SobelBh(x,y)=|−B(x−1,y−1)−2B(x−1,y)−B(x−1,y+1)+B(x+1,y−1)+2B(x+1,y)+B(x+1,y+1)|

  • SobelBv(x,y)=|−B(x−1,y−1)−2B(x,y−1)−B(x+1,y−1)+B(x−1,y+1)+2B(x,y+1)+B(x+1,y+1)|

  • SobelB(x,y)=SobelBh(x,y)+SobelBv(x,y)  (1)
  • Note that, R, G, and B in the equation (1) respectively correspond to levels in the R, G, and B planes of the special frame.
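  • As a minimal sketch, the equation (1) can be evaluated per plane with vectorized array operations; edge-replication padding at the image border is an assumption, since the patent does not specify border handling.

import numpy as np

def sobel_plane(plane):
    # plane: 2-D array for one of the R, G, B planes of the special frame.
    p = np.pad(plane.astype(np.float64), 1, mode="edge")
    # Horizontal component: |-(x-1,y-1) - 2(x-1,y) - (x-1,y+1)
    #                        + (x+1,y-1) + 2(x+1,y) + (x+1,y+1)|
    h = np.abs(-p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2]
               + p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:])
    # Vertical component: |-(x-1,y-1) - 2(x,y-1) - (x+1,y-1)
    #                      + (x-1,y+1) + 2(x,y+1) + (x+1,y+1)|
    v = np.abs(-p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:]
               + p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:])
    return h + v  # horizontal + vertical magnitude, per equation (1)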
  • In step S4, the motion vector detection unit 22 detects the motion vectors using two ordinary frames with different imaging timing, and outputs the motion vectors to the correction amount estimation unit 23. In step S5, the correction amount estimation unit 23 determines whether or not the detected motion vectors are less than or equal to a predetermined threshold. When they are less than or equal to the threshold, the process proceeds to step S6 so that the motion vectors are used in the motion correction. Conversely, when they are greater than the threshold, the motion vectors are not used in the motion correction, and the image compositing process that corresponds to the present imaging timing ends.
  • In step S6, the correction amount estimation unit 23 estimates the motion correction amounts of the special frame and the differential filter frame based on the motion vectors that are detected by the motion vector detection unit 22, and outputs the estimated motion correction amounts to the motion correction unit 26. Specifically, for example, the motion correction amounts H_x and H_y are computed as illustrated in the following equation (2).
  • H_x = Σ_{t=1}^{N} V_{x,t},  H_y = Σ_{t=1}^{N} V_{y,t}  (2)
  • In the equation (2), V_{x,t} and V_{y,t} are the motion vectors that are detected and corrected, and N represents the imaging timing t = N of the ordinary frame for which the motion vectors are detected, in relation to the imaging timing t = 0 of the special frame for which the correction is performed.
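  • In code form, the equation (2) is a simple accumulation of the corrected per-frame vectors detected since the special frame; this sketch assumes the vectors are collected as a list of (V_x, V_y) pairs for t = 1 to N.

def correction_amount(vectors):
    # vectors: corrected motion vectors (V_x,t, V_y,t) for t = 1 .. N
    hx = sum(vx for vx, _ in vectors)
    hy = sum(vy for _, vy in vectors)
    return hx, hy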
  • Note that, as another motion correction amount estimation method, the correction amount estimation unit 23 may correct the dispersion in the motion vectors based on the series of motion vectors, and subsequently estimate the motion correction amounts, as described hereinafter.
  • FIG. 6 is a diagram illustrating a process flow in which the dispersion in the motion vectors is corrected and the motion correction amounts are subsequently estimated based on the series of motion vectors. FIG. 7 illustrates the concept of correcting the dispersion in the motion vectors based on the series of motion vectors.
  • Specifically, the motion vectors (V′_{x,t}, V′_{y,t}) in relation to the imaging timing t are estimated as illustrated in the following equation (3).

  • V′_{x,t} = a_x t³ + b_x t² + c_x t + d_x

  • V′_{y,t} = a_y t³ + b_y t² + c_y t + d_y  (3)
  • The motion correction amounts H_x and H_y are computed using the following equation (4) by substituting the motion vectors in the equation (2) with the estimated motion vectors (V′_{x,t}, V′_{y,t}).
  • H_x = Σ_{t=1}^{N} V′_{x,t},  H_y = Σ_{t=1}^{N} V′_{y,t}  (4)
  • Note that, the coefficients (a_x, b_x, c_x, d_x) and (a_y, b_y, c_y, d_y) in the equation (3) can be calculated by the least squares method using the detected motion vectors (V_{x,1}, V_{y,1}), . . . , (V_{x,N}, V_{y,N}).
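  • A minimal sketch of the equations (3) and (4) follows, using np.polyfit for the least squares fit; it assumes at least four detected vectors are available so that the cubic coefficients are fully determined.

import numpy as np

def smoothed_correction(vectors):
    # vectors: detected motion vectors (V_x,t, V_y,t) for t = 1 .. N
    t = np.arange(1, len(vectors) + 1, dtype=np.float64)
    vx = np.array([v[0] for v in vectors], dtype=np.float64)
    vy = np.array([v[1] for v in vectors], dtype=np.float64)
    # Least squares cubic fit of equation (3); polyfit returns (a, b, c, d).
    ax, bx, cx, dx = np.polyfit(t, vx, 3)
    ay, by, cy, dy = np.polyfit(t, vy, 3)
    # Accumulate the smoothed vectors as in equation (4).
    hx = float(np.sum(ax * t**3 + bx * t**2 + cx * t + dx))
    hy = float(np.sum(ay * t**3 + by * t**2 + cy * t + dy))
    return hx, hy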
  • After the motion correction amounts are estimated as described above, the process proceeds to step S7.
  • In step S7, the motion correction unit 26 subjects the special frame from the frame memory 24 to the motion correction based on the motion correction amounts that are input from the correction amount estimation unit 23, subjects the differential filter frame from the differential filter processing unit 25 to the same motion correction, and outputs the post-motion correction special frame and differential filter frame to the compositing unit 27. The ordinary frame, the special frame, and the differential filter frame that are input to the compositing unit 27 are thus frames in which the object is accurately aligned across all three.
  • In step S8, the compositing unit 27 generates a composite frame by subjecting the ordinary frame and the post-motion correction special frame and differential filter frame to the superposing compositing process or the marking compositing process according to the selection from the user, and outputs the composite frame to the display unit 15 of the subsequent stage.
  • Description will be given of the superposing compositing process. As illustrated in the following equation (5), in the superposing compositing process, the product of the post-motion correction differential filter frame and special frame is added to the ordinary frame.

  • O_R(x,y) = C_0 × N_R(x,y) + C_1 × SobelR(x,y) × I_R(x,y)

  • O_G(x,y) = C_0 × N_G(x,y) + C_1 × SobelG(x,y) × I_G(x,y)

  • O_B(x,y) = C_0 × N_B(x,y) + C_1 × SobelB(x,y) × I_B(x,y)  (5)
  • In the equation (5), O(x,y) is a pixel value of the composite frame, N(x,y) is a pixel value of the ordinary frame, Sobel(x,y) is a pixel value of the post-motion correction differential filter frame, and I(x,y) is a pixel value of the post-motion correction special frame. C_0 and C_1 are coefficients that control the degree of superposition and may be set arbitrarily by the user.
  • FIG. 8 illustrates the concept of the superposing compositing process described above. In the superposing compositing process, it is possible to obtain a composite frame in which the special frame and the differential filter frame are accurately aligned to the ordinary frame, and the edges of a portion to be focused on (a blood vessel, a lesion, or the like) are emphasized and superposed.
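  • For illustration, the equation (5) translates directly into an elementwise operation; this sketch assumes the three frames are floating-point arrays of identical shape that have already been motion corrected, and the default C_0 and C_1 values are placeholders for the user-controlled weights.

def superpose(ordinary, special, sobel, c0=1.0, c1=0.01):
    # O = C0 * N + C1 * Sobel * I, applied per channel as in equation (5);
    # because the differential filter frame weights the special frame, only
    # edge regions of the special frame contribute noticeably to the composite.
    return c0 * ordinary + c1 * sobel * special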
  • Next, description will be given of the marking compositing process. As illustrated in the following equation (6), in the marking compositing process, the ordinary frame is subjected to pseudo color conversion using a color matrix process in which the color conversion coefficients C are weighted by the pixel values of the differential filter frame.
  • ( O_R(x,y) )   ( C_1R × SobelR(x,y)  C_1G × SobelG(x,y)  C_1B × SobelB(x,y) )   ( N_R(x,y) )
    ( O_G(x,y) ) = ( C_2R × SobelR(x,y)  C_2G × SobelG(x,y)  C_2B × SobelB(x,y) ) × ( N_G(x,y) )
    ( O_B(x,y) )   ( C_3R × SobelR(x,y)  C_3G × SobelG(x,y)  C_3B × SobelB(x,y) )   ( N_B(x,y) )  (6)
  • In the equation (6), O(x,y) is a pixel value of the composite frame, N(x,y) is a pixel value of the ordinary frame, Sobel(x,y) is a pixel value of the post-motion correction differential filter frame, and C is a color conversion coefficient.
  • As is clear from the equation (6), the post-motion correction special frame is not used in the marking compositing process.
  • According to the marking compositing process, since the degree of color conversion is controlled according to the pixel values of the differential filter frame, the edges of the blood vessel or the like are strongly subjected to the color conversion, while the other regions are left largely unconverted. Accordingly, it is possible to obtain a composite frame in which only the edges of the blood vessel or the like stand out.
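  • A minimal sketch of the marking compositing process of the equation (6) follows; the (H, W, 3) array layout and the 3×3 coefficient matrix C are assumptions about the data representation, not something the patent specifies.

import numpy as np

def marking(ordinary, sobel, C):
    # ordinary, sobel: (H, W, 3) float arrays; C: 3x3 color conversion coefficients.
    # Per pixel, the matrix entry (i, j) is C[i, j] * Sobel_j(x, y), so strongly
    # edged regions undergo a strong color conversion, as in equation (6).
    weighted = C[np.newaxis, np.newaxis, :, :] * sobel[:, :, np.newaxis, :]
    return np.einsum("hwij,hwj->hwi", weighted, ordinary)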
  • This concludes the description of the image compositing process.
  • As described above, according to the endoscope device 10 of the present embodiment, since the motion vectors are detected using only the ordinary frames and the motion correction amounts are estimated after correcting the detected motion vectors, it is possible to accurately execute the motion correction of the special frame and the differential filter frame. Accordingly, since the information of the special frame (the blood vessel, the tumor, or the like) can be accurately aligned with the ordinary frame, the user (a medical practitioner performing an operation, or the like) can accurately and clearly visually recognize both the tumor portion to be removed and the blood vessel portion not to be removed.
  • Since the composite frame that is presented to the user is created based on the ordinary frame, a composite frame that is brighter and less noisy than the special frame can be presented to the user.
  • Incidentally, the series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program configuring the software is installed on a computer. Here, examples of the computer include a computer embedded in dedicated hardware, and an ordinary personal computer or the like that is capable of executing various functions by means of the various programs installed thereon.
  • FIG. 9 is a block diagram illustrating a configuration example of the hardware of the computer which executes the series of processes described above using a program.
  • In a computer 100, a central processing unit (CPU) 101, a read only memory (ROM) 102, and a random access memory (RAM) 103 are connected to each other by a bus 104.
  • Furthermore, an input-output interface 105 is connected to the bus 104. The input-output interface 105 is connected to an input unit 106, an output unit 107, a storage unit 108, a communication unit 109, and a drive 110.
  • The input unit 106 is formed of a keyboard, a mouse, a microphone, and the like. The output unit 107 is formed of a display, a speaker, and the like. The storage unit 108 is formed of a hard disk, non-volatile memory, or the like. The communication unit 109 is formed of a network interface or the like. The drive 110 drives a removable medium 111 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.
  • In the computer 100 configured as described above, the series of processes described above is performed by the CPU 101, for example, loading the program stored in the storage unit 108 into the RAM 103 via the input-output interface 105 and the bus 104, and executing the loaded program.
  • The computer 100 may be a so-called cloud computer that is connected via the Internet, for example.
  • Note that, the program which is executed by the computer 100 may be a program in which the processes are performed in time series in the order described in the present specification, or a program in which the processes are performed in parallel or at the necessary timing, such as when the process is called.
  • The embodiments of the present disclosure are not limited to the embodiment described above, and various modifications may be made within a scope not departing from the main concept of the present disclosure.
  • Furthermore, the present disclosure may adopt the following configurations.
  • (1) An image processing device, including an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • (2) The image processing device according to (1), further including a feature extraction process unit which generates a feature extraction frame by subjecting the special frame to a feature extraction process, in which the motion correction unit further subjects the feature extraction frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors, and in which the compositing unit subjects the ordinary frames to the image compositing process based on the feature extraction frame that is subjected to motion correction.
  • (3) The image processing device according to (2), in which the feature extraction process unit generates a differential filter frame as the feature extraction frame by subjecting the special frame to a differential filter process.
  • (4) The image processing device according to any one of (1) to (3), in which the compositing unit subjects the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process.
  • (5) The image processing device according to (4), in which as the superposing compositing process, the compositing unit adds the motion-corrected special frame to the ordinary frames according to the motion-corrected feature extraction frame.
  • (6) The image processing device according to (4), in which as the marking compositing process, the compositing unit subjects the ordinary frames to a color conversion process according to the motion-corrected feature extraction frame.
  • (7) The image processing device according to any one of (4) to (6), in which the compositing unit subjects the ordinary frames to a superposing compositing process or a marking compositing process as the image compositing process according to a selection by a user.
  • (8) The image processing device according to any one of (1) to (7), further including a motion vector correction unit which corrects the detected motion vectors based on the plurality of motion vectors that are consecutively detected.
  • (9) An image processing method performed by an image processing device, the method including inputting ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; detecting motion vectors of the object from a plurality of the ordinary frames with different imaging timing; subjecting the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and subjecting the ordinary frames to an image compositing process based on the special frame.
  • (10) A program for causing a computer to function as an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • (11) An endoscope device, including a light source unit which irradiates an object with ordinary light or special light; an imaging unit which consecutively images, at a predetermined ratio according to a predetermined frame period, ordinary frames in a state in which the object is irradiated with the ordinary light, and a special frame in a state in which the object is irradiated with the special light; a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing; a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
  • It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (1)

What is claimed is:
1. An image processing device, comprising:
an input unit which inputs ordinary frames in a state in which an object is irradiated with ordinary light, and a special frame in a state in which the object is irradiated with special light, which are imaged consecutively at a predetermined ratio according to a predetermined frame period;
a detection unit which detects motion vectors of the object from a plurality of the ordinary frames with different imaging timing;
a motion correction unit which subjects the special frame to motion correction corresponding to the imaging timing of the ordinary frames based on the detected motion vectors; and
a compositing unit which subjects the ordinary frames to an image compositing process based on the special frame.
US16/020,756 2014-03-12 2018-06-27 Image processing device, image processing method, program, and endoscope device Active US10306147B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/020,756 US10306147B2 (en) 2014-03-12 2018-06-27 Image processing device, image processing method, program, and endoscope device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014-048336 2014-03-12
JP2014048336A JP2015171450A (en) 2014-03-12 2014-03-12 Image processing device, image processing method, program, and endoscope apparatus
US14/618,240 US10021306B2 (en) 2014-03-12 2015-02-10 Image processing device, image processing method, program, and endoscope device
US16/020,756 US10306147B2 (en) 2014-03-12 2018-06-27 Image processing device, image processing method, program, and endoscope device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/618,240 Continuation US10021306B2 (en) 2014-03-12 2015-02-10 Image processing device, image processing method, program, and endoscope device

Publications (2)

Publication Number Publication Date
US20180309933A1 true US20180309933A1 (en) 2018-10-25
US10306147B2 US10306147B2 (en) 2019-05-28

Family

ID=54070382

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/618,240 Active 2036-03-01 US10021306B2 (en) 2014-03-12 2015-02-10 Image processing device, image processing method, program, and endoscope device
US16/020,756 Active US10306147B2 (en) 2014-03-12 2018-06-27 Image processing device, image processing method, program, and endoscope device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/618,240 Active 2036-03-01 US10021306B2 (en) 2014-03-12 2015-02-10 Image processing device, image processing method, program, and endoscope device

Country Status (3)

Country Link
US (2) US10021306B2 (en)
JP (1) JP2015171450A (en)
CN (1) CN104915920A (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4171169A1 (en) 2015-08-31 2023-04-26 Ntt Docomo, Inc. User terminal, radio base station and radio communication method
JP2019502419A (en) * 2015-11-05 2019-01-31 コヴィディエン リミテッド パートナーシップ System and method for detection of subsurface blood
KR102192488B1 (en) * 2015-11-25 2020-12-17 삼성전자주식회사 Apparatus and method for frame rate conversion
WO2017212653A1 (en) * 2016-06-10 2017-12-14 オリンパス株式会社 Image processing device, image processing method, and image processing program
WO2020050187A1 (en) * 2018-09-06 2020-03-12 ソニー株式会社 Medical system, information processing device, and information processing method
EP4024115A4 (en) * 2019-10-17 2022-11-02 Sony Group Corporation Surgical information processing device, surgical information processing method, and surgical information processing program
KR102504321B1 (en) * 2020-08-25 2023-02-28 한국전자통신연구원 Apparatus and method for online action detection

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110020536A1 (en) * 2008-03-26 2011-01-27 Taisuke Yamamoto Electrode for lithium secondary battery and method of manufacturing same
US20110023789A1 (en) * 2007-12-10 2011-02-03 The Cooper Union For The Advancement Of Science And Art Gas delivery system for an animal storage container
US20150022370A1 (en) * 2013-07-22 2015-01-22 Google Inc. Methods, systems, and media for projecting light to indicate a device status

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6975898B2 (en) * 2000-06-19 2005-12-13 University Of Washington Medical imaging, diagnosis, and therapy using a scanning single optical fiber system
FR2811791B1 (en) * 2000-07-13 2002-11-22 France Telecom MOTION ESTIMATOR FOR CODING AND DECODING IMAGE SEQUENCES
JP2007313171A (en) 2006-05-29 2007-12-06 Olympus Corp Endoscopic system
US8896701B2 (en) * 2010-02-23 2014-11-25 Ratheon Company Infrared concealed object detection enhanced with closed-loop control of illumination by.mmw energy
JP2011200367A (en) * 2010-03-25 2011-10-13 Fujifilm Corp Image pickup method and device
JP5603676B2 (en) * 2010-06-29 2014-10-08 オリンパス株式会社 Image processing apparatus and program
JP2012024283A (en) 2010-07-22 2012-02-09 Fujifilm Corp Endoscope diagnostic apparatus
US9999355B2 (en) * 2014-02-12 2018-06-19 Koninklijke Philips N.V. Device, system and method for determining vital signs of a subject based on reflected and transmitted light

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110023789A1 (en) * 2007-12-10 2011-02-03 The Cooper Union For The Advancement Of Science And Art Gas delivery system for an animal storage container
US20110020536A1 (en) * 2008-03-26 2011-01-27 Taisuke Yamamoto Electrode for lithium secondary battery and method of manufacturing same
US20150022370A1 (en) * 2013-07-22 2015-01-22 Google Inc. Methods, systems, and media for projecting light to indicate a device status

Also Published As

Publication number Publication date
JP2015171450A (en) 2015-10-01
US10021306B2 (en) 2018-07-10
CN104915920A (en) 2015-09-16
US20150264264A1 (en) 2015-09-17
US10306147B2 (en) 2019-05-28

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4