CN111161852B - Endoscope image processing method, electronic equipment and endoscope system - Google Patents

Endoscope image processing method, electronic equipment and endoscope system

Info

Publication number
CN111161852B
CN111161852B (application CN201911399032.2A)
Authority
CN
China
Prior art keywords
images
image
offset range
series
alignment
Prior art date
Legal status
Active
Application number
CN201911399032.2A
Other languages
Chinese (zh)
Other versions
CN111161852A (en)
Inventor
李宗州
谢天宇
王希光
付野
王晨曦
Current Assignee
Beijing Shuangyiqi Electronics Co ltd
Peking University
Original Assignee
Beijing Shuangyiqi Electronics Co ltd
Peking University
Priority date
Filing date
Publication date
Application filed by Beijing Shuangyiqi Electronics Co ltd and Peking University
Priority to CN201911399032.2A
Publication of CN111161852A
Application granted
Publication of CN111161852B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/20 - ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS

Landscapes

  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Endoscopes (AREA)

Abstract

The present invention relates to the field of medical devices, and in particular to an endoscope image processing method, an electronic device, and an endoscope system. The endoscope image processing method comprises the following steps: identifying a specific region in a target image, wherein the target image is an image arbitrarily selected from a series of images taken of an inspected object under illumination by an illumination unit; calculating the range of offsets that the specific region can undergo in the other images of the series; and aligning the series of images with respect to the specific region within the offset range. The invention greatly reduces the amount of computation needed to align a series of spectral images taken by an endoscope and improves the alignment of the specific region. After alignment, an accurate spectral data cube or spectral curve can be generated, or images with clear features can be generated by fusion, improving the accuracy of spectral pathological analysis and diagnosis.

Description

Endoscope image processing method, electronic equipment and endoscope system
Technical Field
The present invention relates to the field of medical devices, and in particular, to an endoscope image processing method, an electronic device, and an endoscope system.
Background
An endoscope system is an important medical instrument for diagnosing and treating early-stage diseases of body cavities, and comprises an endoscope body and an illumination unit. In use, the soft or hard tubular endoscope body is inserted into the body and images are captured under illumination by the illumination unit, so that the tissue morphology and pathology of the internal organs can be observed from the images and a diagnosis made. Human tissue (including lesion tissue) responds strongly to light of some specific wavelengths and weakly to others. To bring out feature information that is hidden under white-light illumination, the inspected object is therefore usually illuminated in turn with light of several different wavelengths and a series of images is taken, from which a spectral map or spectral data cube of the tissue (including lesion tissue) is obtained; by studying the characteristics of the spectral curves and performing spectral pathological analysis, the dependence of diagnosis on biopsy sampling and pathological examination can be reduced.
At present, such a series of images is analyzed by arbitrarily selecting several images from it, performing image identification, alignment, and pseudo-color assignment, and finally forming a color image. However, when a series of images of the inspected object is captured continuously, the object may move or deform between two illumination imagings because of respiration, heartbeat, gastrointestinal peristalsis, and the like; moreover, the endoscope may shake or rotate as the user operates it remotely, so that there is an obvious offset between the images of the continuously captured series. In addition, the endoscope has a large field of view and barrel distortion, which further increases the offset between images. Currently, the selected images are aligned directly over the whole image area: the amount of computation is large, the alignment of the region of interest is hard to guarantee, and consequently no spectral curve with an accurate position can be obtained, which greatly reduces the accuracy of the subsequent spectral pathological analysis or even renders it completely wrong.
Disclosure of Invention
In view of the above drawbacks of the background art, an object of the present invention is to provide an endoscope image processing method, an electronic device, and an endoscope system that solve the problems that existing endoscope systems process images poorly in use and cannot provide accurate data and good images for subsequent study.
In order to solve the above technical problems, the present invention provides an endoscope image processing method, including:
identifying a specific region in a target image, wherein the target image is an image arbitrarily selected from a series of images taken of an inspected object under illumination by an illumination unit; calculating the range of offsets that the specific region can undergo in the other images of the series; and aligning the series of images with respect to the specific region within the offset range.
In order to solve the above technical problem, the present invention also provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the endoscopic image processing method as described above when executing the program.
In order to solve the above technical problem, the invention also provides an endoscope system comprising an endoscope body and the above electronic device, wherein the processor is communicatively connected to the camera module in the endoscope body.
The technical solution of the invention has the following beneficial effects:
The endoscope image processing method of the invention marks a specific region in an arbitrarily selected image of a captured series, calculates the range of offsets that the specific region can undergo in the other images of the series, and aligns the series with respect to the specific region within that range. Because the search is confined to the computed offset range, the amount of computation is greatly reduced and the alignment of the specific region is improved. From the aligned images an accurate spectral data cube or spectral curve can be generated, providing accurate data for diagnosis and research; images rich in feature information can also be fused after alignment, and the fused image has clear features, which aids rapid and accurate diagnosis.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required by the embodiments are briefly described below. The drawings described below are obviously only some embodiments of the present invention; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of one embodiment of an endoscopic image processing method of the present invention;
FIG. 2 is a spectral diagram of a combination of illumination light from the illumination unit of the embodiment shown in FIG. 1;
FIG. 3 is a spectral diagram of another combination of illumination light from the illumination unit of the embodiment shown in FIG. 1;
FIG. 4 shows the correspondence between the series of images of FIG. 1 and the illumination lights shown in FIG. 2;
FIG. 5 is a schematic diagram of one embodiment of identifying a particular region in a target image;
FIG. 6 is a schematic diagram of the case where a manually designated region in the target image contains only one pixel;
FIG. 7 is a schematic diagram of another embodiment of identifying a particular region in a target image;
FIG. 8 is a flow chart of yet another embodiment of identifying a particular region in a target image;
FIG. 9 is a schematic diagram of a processing result corresponding to the flow shown in FIG. 8;
FIG. 10 is a schematic illustration of identifying a plurality of specific regions in a target image;
FIG. 11 is a schematic diagram of one embodiment of calculating the offset range for the particular region shown in FIG. 5;
FIG. 12 is a schematic diagram of another embodiment of calculating the offset range for the particular region shown in FIG. 5;
FIG. 13 shows pixel points to be calibrated, selected within the real-time image area;
FIG. 14 is a schematic diagram of the calibration process of the offset range circle of a pixel to be calibrated over the time Δt;
FIG. 15 is a flow chart of an embodiment of the alignment step shown in FIG. 1;
FIG. 16 is a schematic illustration of alignment according to the method shown in FIG. 15;
FIG. 17 is a flow chart of another embodiment of the alignment step shown in FIG. 1;
FIG. 18 is a flow chart of another embodiment of an endoscopic image processing method of the present invention;
FIG. 19 is a schematic illustration of a series of images initially aligned in the manner shown in FIG. 18;
FIG. 20 is a schematic diagram of a displacement transformation included in a similarity transformation;
FIG. 21 is a schematic diagram of a rotation transformation included in a similarity transformation;
FIG. 22 is a schematic diagram of a scaling transformation included in a similarity transformation;
FIG. 23 is a flowchart of an embodiment of the image alignment adjustment step of FIG. 18;
FIG. 24 is a schematic view of the displacement trajectory of the set of images shown in FIG. 19 after preliminary alignment;
FIG. 25 is a schematic view of the rotation trajectory of the set of images shown in FIG. 19 after preliminary alignment;
FIG. 26 is a schematic view of the scaling trajectory of the set of images shown in FIG. 19 after preliminary alignment;
FIG. 27 is a schematic view of an embodiment of an endoscope system of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In describing embodiments of the present invention, it should be noted that the terms "first" and "second" are used only to number product components for clarity and do not denote any substantive distinction unless explicitly stated otherwise. Directional terms such as upper, lower, left, and right refer to the orientations shown in the drawings. The specific meaning of the above terms in the embodiments of the present invention will be understood by those of ordinary skill in the art according to the specific circumstances.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted" and "connected" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediate medium, or internal communication between two elements. The specific meaning of these terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
FIG. 1 is a flowchart of an embodiment of the endoscope image processing method of the present invention; the method shown in FIG. 1 comprises the following steps:
In step S110, a specific region is identified in a target image, wherein the target image is an image arbitrarily selected from a series of images taken of an inspected object under illumination by an illumination unit.
In step S120, the range of offsets that the specific region can undergo in the other images of the series is calculated.
In step S130, the series of images is aligned with respect to the specific region within the offset range.
The illumination unit may use a xenon lamp as its light source, or a broad-spectrum LED, a tungsten lamp, or another broad-spectrum light source. The light source sequentially filters out a plurality of illumination lights through a filter set, a filter wheel, and a filter control mechanism. Specifically, narrow-band filters with different center wavelengths, broadband filters with different wavelength ranges, or an empty slot with no filter can be selected as required to form the filter set; the filters are mounted on the filter wheel in a certain order, for example by center wavelength from small to large; the transmittance of the filters may be varied to adjust the intensity of each illumination light; and the filter control mechanism rotates the filter wheel to switch filters, so that the required illumination lights are filtered out in turn. FIG. 2 is a spectral diagram of one combination of illumination lights from the illumination unit of the embodiment of FIG. 1, comprising 25 narrow-band lights 201-225, each with a single center wavelength, which together cover the visible range of 400-760 nm; the filters are designed with different transmittances, and the illumination intensity is adjusted to maximize the brightness of the series of images captured by the camera module inside the endoscope body.
Alternatively, the light source of the illumination unit may be implemented with a plurality of LEDs corresponding to the required illumination lights; in this case the filters may be omitted, and each LED in the illumination light path may be lit in turn electronically, or switched into the illumination light path in turn mechanically. When LEDs are used as the light source, they can also be mounted at the distal tip of the endoscope body, but the space there is small and the number of LEDs that can be fitted is limited.
It should be noted that the illumination lights are not limited to the 25 narrow-band lights shown in FIG. 2: each illumination light may have more than one center wavelength and bandwidth, i.e., each illumination light may be a narrow-band light having one or more center wavelengths and/or any combination of broadband lights having one or more wavelength ranges, especially illumination lights sensitive to a specific disorder or able to highlight some feature of the inspected object. Each illumination light may also appear several times at different intensities while the series of images is captured. FIG. 3 is a spectral diagram of another combination of illumination lights, comprising a narrow-band light 201 with one center wavelength, a narrow-band light 226 with two center wavelengths, a broadband light 227 with one wavelength range, and a broadband light 228 with two wavelength ranges.
FIG. 4 shows the correspondence between the series of images of FIG. 1 and the illumination lights of FIG. 2. Illumination light 201 is guided by the endoscope body onto the inspected object, and the camera module captures the corresponding image 301, completing one illumination imaging; the illumination unit switches to the next illumination light 202 and the camera module captures the corresponding image 302, completing another illumination imaging; the illumination unit then switches to illumination light 203 and the camera module captures image 303. In this way the illumination unit switches illumination lights in turn and the camera module captures one corresponding image each time, until the illumination unit switches to illumination light 225 and the camera module captures image 325, completing the illumination imaging of the whole series, which forms a group of images 330. Let the time interval between two adjacent illumination imagings be Δt. It should be noted that the endoscope image processing method provided by the invention is equally applicable to a plurality of images captured under illumination of a single wavelength.
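For illustration only, and not as part of the original disclosure, the acquisition sequence can be sketched as the following minimal Python loop; light_source.switch_to and camera.grab are assumed hardware-interface calls:

```python
import time

def capture_series(light_source, camera, illumination_lights, delta_t):
    """Capture one image per illumination light at interval delta_t,
    forming a group of images (cf. images 301-325 forming group 330)."""
    images = []
    for light in illumination_lights:    # e.g. the 25 lights 201-225
        light_source.switch_to(light)    # rotate filter wheel / switch LED
        images.append(camera.grab())     # one illumination imaging
        time.sleep(delta_t)              # interval between adjacent imagings
    return images
```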
Following the time order of capture: if the target image is the first image, the offset range of the specific region in the second image is calculated and the second image is aligned to the target image within that range, yielding the accurate corresponding region of the specific region in the second image; the offset range of the third image relative to that corresponding region is then calculated and the third image is aligned to the target image within it; and so on until all images are aligned to the target image. Alternatively, the offset range of every other image relative to the target image may be calculated first, and the images then aligned to the target image one by one within their ranges. Of course, if the target image is the i-th image, the offset ranges of image i+1 and the following images are calculated as above, while the ranges of the preceding i-1 images are calculated by running the same procedure in reverse.
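Purely as an illustration of this propagation order (the helper names are assumptions, not from the patent), a sketch might look like:

```python
def align_series(images, target_idx, region, estimate_range, align_one):
    """Propagate alignment outward from the target image.

    Assumed helpers: estimate_range(last_region) predicts the next offset
    range from the last known corresponding region; align_one(img, target,
    region, rng) aligns one image to the target within that range and
    returns the new corresponding region.
    """
    target = images[target_idx]
    last = region
    for j in range(target_idx + 1, len(images)):   # images i+1, i+2, ...
        last = align_one(images[j], target, region, estimate_range(last))
    last = region
    for j in range(target_idx - 1, -1, -1):        # images i-1, i-2, ... in reverse
        last = align_one(images[j], target, region, estimate_range(last))
```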
For convenience of description, the following refers to the series of images in FIG. 4 with the first image taken as the target image; the case of taking another image as the target is analogous and is not repeated.
FIG. 5 is a schematic diagram of one embodiment of identifying a specific region in a target image: a region is manually designated as the specific region. Specifically, a region 403 is outlined with the mouse 401 in the first image 301 of the group of images 330 and taken as the specific region; region 403 contains a polyp of interest 402. The image in which the specific region is designated serves as the target image for the subsequent alignment; a specific region may equally be designated in any other image of the group 330. Besides the mouse 401, the specific region may be designated manually with any other input device, such as a touch pad, keyboard, voice input device, camera, or scanner; the user may designate it by clicking, outlining, entering coordinates, entering a region, selecting a preset region, moving a selection box of a given shape, and so on.
When the user designates a region through the input device, the region may be particularly small; for example, as shown in FIG. 6, the manually designated region 409 in the target image contains only one pixel. A single pixel is extremely susceptible to noise, and alignment cannot be performed on one pixel alone; in this case region 409 is expanded about its center into a 5×5-pixel region 410, which is used as the final specific region. It may alternatively be expanded to 9×9, 13×13, or any other region slightly larger than the original.
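A minimal sketch of this expansion (illustrative only; the function name and the (x0, y0, x1, y1) return convention are assumptions):

```python
def expand_region(x, y, img_w, img_h, size=5):
    """Expand a single designated pixel (x, y) into a size x size region
    centered on it (5x5 by default, as in the embodiment), clamped to the
    image bounds; returns the region corners (x0, y0, x1, y1)."""
    half = size // 2
    x0, y0 = max(0, x - half), max(0, y - half)
    x1, y1 = min(img_w, x + half + 1), min(img_h, y + half + 1)
    return x0, y0, x1, y1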
Other ways of identifying the specific region in the target image may also be used. FIG. 7 is a schematic diagram of another embodiment, which requires no input device. Specifically, a square region 404 with a side of 100 pixels is preset at the center of the real-time image and indicated by a dashed box. Before capturing the group of images 330 of the inspected object under the various illumination lights, the direction and position of the endoscope tip are adjusted so that the square region covers the region of interest, i.e., polyp 402, with polyp 402 filling square region 404; an image is then taken and the square region 404 in the first image 301 is used as the specific region. Polyp 402 need not fill square region 404, as long as it falls inside it. A region of arbitrary shape and size may likewise be preset at any position of the real-time image, and the corresponding region of any image used as the specific region.
FIG. 8 is a flowchart of yet another embodiment of identifying the specific region in the target image, and FIG. 9 shows the corresponding processing result; again no input device is used. In step S410, edge detection is performed on the first image 301 of the captured group 330 using a Canny, Sobel, Laplacian, Roberts, or Prewitt operator. In step S420, broken edges are connected at their break points under an edge-direction-preservation assumption to obtain closed contours, and the contour of polyp 402 is extracted. In step S430, the extracted polyp contour is dilated by a proportion or a number of pixels to give region 408, which is selected as the specific region.
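The following Python/OpenCV sketch illustrates steps S410-S430 under simplifying assumptions: a morphological closing stands in for the edge-direction-based connection of broken edges, and the polyp is assumed to yield the largest closed contour.

```python
import cv2
import numpy as np

def auto_select_region(image, dilate_iters=2):
    """Edge detection, closed-contour extraction, and dilation of the
    extracted contour (cf. steps S410-S430)."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                           # step S410
    kernel = np.ones((5, 5), np.uint8)
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)  # close edge breaks
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)    # step S420
    contour = max(contours, key=cv2.contourArea)               # assumed: largest = polyp
    mask = np.zeros_like(gray)
    cv2.drawContours(mask, [contour], -1, 255, cv2.FILLED)
    return cv2.dilate(mask, kernel, iterations=dilate_iters)   # step S430, region 408
```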
Two or more of these approaches (designation by user input, by preset position, and by automatic recognition) can also be combined, so that the region of interest is designated as the specific region more quickly and accurately. In one embodiment, several candidate regions are first selected by automatic recognition, and the user then picks one of them through the input device, refines it, and selects it as the specific region.
The specific regions 403, 404, and 408 illustrated in FIGS. 5, 7, and 9 are generally large enough that the expansion illustrated in FIG. 6 is unnecessary.
The specific region is usually a region of interest to the user. The embodiments of FIGS. 5, 6, 7, and 9 show only the case where one region is designated; sometimes the user also needs to designate another region as a reference for the region of interest. For example, a user who wants the spectral curve of a lesion may also want to compare it with the spectral curve of a normal site; in that case the alignment of the region of interest must be guaranteed first, and the alignment of the reference region (which may be worse) considered second. FIG. 10 is a schematic diagram of identifying several specific regions in the target image: specific region 501 is a region of interest containing polyp 402, and specific region 502 is a reference region in which no lesion is found. To guarantee the alignment of the region of interest 501 while still accounting for the reference region 502, different alignment weights are applied to them for the subsequent alignment. In one embodiment the user enters, through the input device, a degree of interest for each specific region under a given rule, and alignment weights are applied in proportion to those degrees of interest. Since the reference regions are often smaller, the area of each specific region can also be computed and weights applied in proportion to the areas, which gives an automatic weighting implementation. When calculating offset ranges, a range is calculated separately for each specific region.
Within the interval Δt between two adjacent illumination imagings, the specific region of the target image may shift under the influence of organ peristalsis, endoscope shake or rotation, and the like, so the range of offsets it can undergo must be calculated in each image to be aligned. Of the identification methods above, only the specific region of FIG. 5 is discussed in detail below; the other methods are handled identically and are not repeated.
FIG. 11 is a schematic diagram of one embodiment of calculating the offset range for the specific region of FIG. 5. First, for the specific region 403 designated in the first image 301 of the group 330, the four pixel points 601 with minimum x-coordinate, maximum x-coordinate, minimum y-coordinate, and maximum y-coordinate are determined. The offset range circles 602 of these four pixels over the time Δt are read, determining the four positions 603 to which the pixels 601 can at most deviate from the specific region 403 in the x and y directions. The four positions 603 define a rectangle 604 with sides parallel to the x- and y-axes, and rectangle 604 is the calculated offset range that specific region 403 can produce in the second image 302. The number of pixel points used must be at least three: three pixels determine a triangular area that can serve as the offset range of region 403 in image 302, and with five, six, or more points a polygonal area may be determined as the offset range.
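A minimal sketch of this rectangle computation, assuming radius_of(pt) is a lookup of the pre-calibrated offset-range-circle radius over Δt (neither name appears in the patent):

```python
def offset_rectangle(region_pts, radius_of):
    """Rectangle 604: take the four extreme pixels (601) of the specific
    region, expand each outward by its calibrated offset-range-circle
    radius, and bound the result with axis-parallel sides."""
    left   = min(region_pts, key=lambda p: p[0])   # pixel with minimum x
    right  = max(region_pts, key=lambda p: p[0])   # pixel with maximum x
    top    = min(region_pts, key=lambda p: p[1])   # pixel with minimum y
    bottom = max(region_pts, key=lambda p: p[1])   # pixel with maximum y
    return (left[0] - radius_of(left),             # x_min of rectangle 604
            top[1] - radius_of(top),               # y_min
            right[0] + radius_of(right),           # x_max
            bottom[1] + radius_of(bottom))         # y_max
```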
The offset range circle 602 of a pixel over the time Δt is obtained by calibration in advance, stored in a storage module, and read from it when needed. Within the offset range 604, the image to be aligned 302 is aligned to the target image 301 with respect to the specific region 403. After alignment, the accurate corresponding region of region 403 in the second image 302 is obtained; from it, the offset range that region 403 can produce in the third image 303 is estimated, and image 303 is in turn aligned to the target image 301 within that range. In this way images 302-325 are aligned one by one to the target image 301 with respect to specific region 403, each within its corresponding offset range.
FIG. 12 is a schematic diagram of another embodiment of calculating the offset range for the specific region of FIG. 5. The offset range circle 602 of every pixel point 601 on the boundary of region 403 over Δt is read from the storage module; all circles 602 are superimposed, and the outer boundary of the superimposed area determines a new region 605, which is the calculated offset range of region 403 in the second image 302. The inner boundary of the superimposed area determines a region 606 that can serve as an additional alignment constraint, namely a bound on how far the boundary pixels 601 of region 403 can shift inward. Alternatively, several pixel points may be sampled at intervals along the boundary of region 403 and their offset range circles 602 read; the boundary simultaneously circumscribing these circles is then used as the offset range of region 403 in image 302. In that case at least three pixel points are needed, since three circles 602 uniquely determine an outer boundary formed by a circumscribing circle or by scaling up the boundary of region 403. The sampled pixel points may all lie on the boundary of region 403, or some may lie near the boundary and others on region 403 itself; the embodiment of the invention does not limit this.
During image capture, respiration, heartbeat, gastrointestinal peristalsis, and the like move or deform the inspected object; the user operates the endoscope from the far end, and even when the endoscope body is kept as steady as possible its tip shakes or rotates slightly; in addition, the barrel distortion of the endoscope adds to the offset, so an obvious offset exists between the continuously captured images. This offset is not unbounded, however: it has an estimable range, and by estimating that range and aligning the images within it, the amount of computation is greatly reduced and the accuracy of the alignment improved.
FIGS. 13 and 14 illustrate how the offset range circle 602 of a pixel over Δt is calibrated. As shown in FIG. 13, 15×15 pixel points 601 to be calibrated are selected, uniformly distributed over the whole image; FIG. 14 is a schematic diagram of the calibration process for one such pixel. At a typical observation distance, the first pixel point 601 to be calibrated is aligned with a point-like site with a distinct feature in the inspected object, such as a small gastric polyp, and two images are taken at an interval of no less than Δt. The center point 701 of the small polyp is found manually in the first image, its center point 702 is found in the second image, and the coordinate offset of point 702 relative to point 701 is determined, completing the first set of calibration data for the first pixel point 601. In this way 100 sets of calibration data are acquired for the first pixel point 601, randomly varying the typical observation distance and the point-like site before each set. The 100 sets are then plotted together, keeping the coordinate offsets of all points 702 relative to their points 701 but making all points 701 coincide; a circle is drawn around 701 that just contains all points 702, giving circle 703 of radius r. Since finding the center points 701 and 702 involves error, the radius r is multiplied by a margin coefficient, for example 1.2, giving the circle 602 of radius R, i.e., the offset range circle of pixel 601 over Δt. Each pixel point 601 to be calibrated is calibrated in turn in the same way. For uncalibrated pixels, the offset range circle radius is obtained by inverse-distance-weighted interpolation of the radii of the neighboring calibrated pixels, yielding the offset range circle 602 of that pixel.
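The inverse-distance-weighted interpolation for an uncalibrated pixel can be sketched as follows; the ((x, y), radius) input format and the function name are assumptions for illustration:

```python
import numpy as np

def idw_radius(pixel, calibrated, power=2):
    """Interpolate the offset-range-circle radius of an uncalibrated pixel
    from nearby calibrated pixels, weighting each by inverse distance.
    `calibrated` is a list of ((x, y), radius) pairs."""
    p = np.asarray(pixel, dtype=float)
    weights, radii = [], []
    for (x, y), r in calibrated:
        d = np.hypot(*(p - (x, y)))
        if d == 0.0:
            return r                    # coincides with a calibrated pixel
        weights.append(d ** -power)
        radii.append(r)
    return float(np.dot(weights, radii) / np.sum(weights))
```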
The number of pixel points to be calibrated is not limited to 15×15 and may be increased or reduced according to the image resolution; the pixels may also be distributed over the image in other arrangements. The 100 sets of calibration data need not be acquired in one session: they can be accumulated during clinical examinations or extracted from clinical video and image data, and the number of sets is not limited to 100. The margin coefficient can be adjusted according to the data acquisition conditions and the number of data sets. Calibration may also be performed separately for different types of endoscopes and organs, such as gastroscope and stomach or colonoscope and colon, to improve accuracy.
In the same manner as the calibration over Δt, offset range circles over NΔt (N ≥ 2) may be calibrated; the circle for NΔt determines the offset range of the specific region for images N-1 places apart. For example, offset range circles for Δt, 2Δt, 3Δt, and so on can be calibrated at once. The captured series is then divided into images with relatively rich features and images with relatively blurred features; the feature-rich images are processed first, each carrying its acquisition time, and for each image the offset range circle of the matching time interval to the target image is selected to determine the offset range of the specific region before aligning. The process repeats until all feature-rich images are aligned, after which the feature-blurred images are aligned in turn by the same method.
In calibration, the offset range of a pixel need not be bounded by a circle: any other shape, such as a square, may be used, giving an offset range square.
Moreover, with the identification method of FIG. 7 the specific region is preset, so the offset ranges it can produce in the other images of the series can be calculated in advance from the calibrated ranges over NΔt (N ≥ 1); after a series of images is captured, step S120 of FIG. 1 can then be omitted and step S130 performed directly.
FIG. 15 is a flowchart of one embodiment of the alignment step of FIG. 1, and FIG. 16 illustrates aligning an image with respect to the specific region of FIG. 5 within the offset range of FIG. 11 by the method of FIG. 15. The alignment proceeds as follows. In step S810, feature points are searched for with the Speeded-Up Robust Features (SURF) algorithm inside the specific region 403 of the target image 301 and inside the offset range 604 of the image to be aligned 302; feature point 810 in FIG. 16 is one of them. In step S820, feature point pairs are matched with the Fast Library for Approximate Nearest Neighbors (FLANN), giving pairs such as 804-809. In step S830, the pairs are screened with the Random Sample Consensus (RANSAC) algorithm, retaining, for example, pairs 805-808. In step S840, the homography transformation matrix from image 302 to image 301 is solved from the coordinates of the screened pairs by simultaneous equations and used as the alignment transformation matrix, giving the alignment mapping and thereby aligning image 302 to the target image 301 with respect to region 403 within range 604. After alignment, the accurate corresponding region 821 of region 403 in the second image 302 is obtained; from it the offset range that region 403 can produce in the third image 303 is estimated, and image 303 is aligned to the target within that range. In this way images 302-325 are aligned in turn to the target image 301 with respect to region 403 within their corresponding offset ranges.
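A Python/OpenCV sketch of steps S810-S840, under stated assumptions: SURF lives in opencv-contrib (cv2.xfeatures2d) and may be unavailable in a stock build (ORB would be a drop-in substitute), and Lowe's ratio test is used as one common way to keep good FLANN matches:

```python
import cv2
import numpy as np

def align_to_target(target, moving, region_mask, range_mask):
    """SURF features restricted by masks (specific region in the target,
    offset range in the image to be aligned), FLANN matching, RANSAC
    screening, and a homography solve (cf. steps S810-S840)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_t, des_t = surf.detectAndCompute(target, region_mask)    # step S810
    kp_m, des_m = surf.detectAndCompute(moving, range_mask)
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    matches = flann.knnMatch(des_m, des_t, k=2)                 # step S820
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]
    pts_m = np.float32([kp_m[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    pts_t = np.float32([kp_t[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts_m, pts_t, cv2.RANSAC, 5.0)    # steps S830-S840
    return cv2.warpPerspective(moving, H, target.shape[1::-1])  # moving -> target frame
```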
The endoscope image processing method provided by the invention searches for feature points, and performs the subsequent processing, only inside the specific region and its corresponding offset range; compared with processing the whole image, the amount of computation is greatly reduced, and the restriction to the offset range guarantees the alignment of the specific region. Alternatively, the alignment transformation 803 may solve, from the coordinates of the screened feature point pairs, a similarity or affine transformation matrix by simultaneous equations and use it as the alignment transformation matrix to align the image to the target.
In addition, remote-sensing image registration based on a Marr-wavelet-improved SIFT algorithm, or an infrared/visible image registration algorithm based on saliency and ORB, can be used to align the image to the target with respect to the specific region within the offset range. In the former, Marr wavelets under scale-space theory extract features from the target image and the image to be aligned; the feature points are first matched by Euclidean distance, and the preliminary result is then refined by random sample consensus. In the latter, an optimized HC-GHS saliency detection algorithm produces a saliency structure map of the image, the ORB algorithm detects feature points on that map, robust points are screened with a Taylor series, the points are matched in groups by orientation, and the matching is finally completed with the Hamming distance. Alternatively, the specific region can serve as an alignment template scanned over the corresponding offset range, with mutual information as the similarity measure; or a gray-level-based elastic alignment method such as B-spline free-form deformation or the demons algorithm may be used.
FIG. 17 is a flowchart of another embodiment of the alignment step of FIG. 1. Before the feature points are searched, step S800 is performed: the distortion coefficients stored in the storage module are read according to the specific endoscope model, and distortion correction is applied to the specific region and the corresponding offset range to remove the barrel distortion introduced by the endoscope's camera module; the distortion coefficients are obtained in advance by calibrating the endoscope with Zhang's camera calibration method and a chessboard calibration board. After the image alignment is completed, step S850 converts the alignment mapping back, using the distortion coefficients again, to the coordinates before distortion correction, thereby aligning the original image to be aligned to the target image.
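For step S800, a minimal sketch using OpenCV is given below; the intrinsics K and distortion coefficients are assumed to come from an offline Zhang chessboard calibration (e.g. cv2.calibrateCamera) and to be stored per endoscope model. Step S850 would apply the same coefficients in reverse to map the alignment result back to the uncorrected coordinates.

```python
import cv2

def undistort_images(target, moving, K, dist_coeffs):
    """Remove barrel distortion from both images before the feature
    search (cf. step S800), using intrinsics K and distortion
    coefficients obtained by chessboard calibration."""
    return (cv2.undistort(target, K, dist_coeffs),
            cv2.undistort(moving, K, dist_coeffs))
```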
When several specific regions are identified, different alignment weights must be applied to them. Taking the alignment flow of FIG. 15 as an example, the weights are used as follows: when screening feature point pairs, the numbers of pairs kept from the specific regions are made proportional to the alignment weights. For example, in FIG. 10 the alignment weights of regions 501 and 502 are 1 and 0.33; if 4 pairs are to be kept, 3 are taken from region 501 and only 1 from region 502. When the simultaneous equations then solve the homography transformation matrix, region 501 dominates and region 502 plays an auxiliary role, so the alignment of region 501 is guaranteed while that of region 502 is still taken into account. In addition, any stored group of images can be recalled from the storage module, a specific region redesignated, the offset range re-estimated, and the images aligned again.
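A sketch of this proportional screening (illustrative only; pairs_by_region is an assumed list of per-region feature-pair lists, each sorted best-first):

```python
def screen_by_weight(pairs_by_region, weights, total=4):
    """Keep feature-point pairs from each specific region in proportion
    to its alignment weight (e.g. weights 1 and 0.33 give 3 pairs and
    1 pair out of 4, as in the example above)."""
    s = sum(weights)
    counts = [max(1, round(total * w / s)) for w in weights]
    selected = []
    for pairs, n in zip(pairs_by_region, counts):
        selected.extend(pairs[:n])
    return selected
```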
FIG. 18 is a flowchart of another embodiment of the endoscope image processing method of the invention. Compared with the method of FIG. 1, step S1010, a preliminary alignment of the images, is performed first, and step S1020, an adjustment of the image alignment, is performed after it, which achieves a better alignment. FIG. 19 is a schematic view of a series of images preliminarily aligned in the manner of FIG. 18. A group of images 1130 is preliminarily aligned with respect to a specific region 1131 within the corresponding offset ranges using similarity transformations: the 24 similarity transformation matrices from the images to be aligned 1102-1125 (only 1112, 1113, and 1114 are shown in FIG. 19) to the target image 1101 are solved, and images 1102-1125 are preliminarily aligned to target image 1101; images 1101-1125 correspond to the 25 illumination lights 201-225 of FIG. 2. After the preliminary alignment, the corresponding regions 1132-1155 (only 1142, 1143, and 1144 are shown) of the specific region 1131 in images 1102-1125 are obtained.
Since human tissue (including lesion tissue) is insensitive to light of certain wavelengths, an image taken under such light may be dark or poor in features; image 1113 in FIG. 19, for example, is dark and its feature information is not obvious. When computing the spectral curve, however, the information in such an image is indispensable, and its alignment accuracy affects the accuracy of the subsequent spectral pathological analysis. Even with the offset range as a constraint, the weak features may still cause a large error in the preliminary alignment; for instance, the corresponding region 1143 of specific region 1131 in image 1113 deviates markedly. Yet the image offset caused by the various factors during capture is continuous and should not change too abruptly. The offset trajectories of the group of images 1130 can therefore be calculated from the 24 preliminary similarity transformation matrices, and the alignment mappings of clearly deviating images adjusted according to those trajectories.
Specifically, a similarity transformation can be decomposed into three basic transformations: displacement, rotation, and scaling. FIGS. 20, 21, and 22 are schematic diagrams of the displacement, rotation, and scaling transformations contained in a similarity transformation, where 1201 is the image before transformation and 1202, 1203, and 1204 are the images after displacement, rotation, and scaling respectively. Accordingly, the offset trajectory of the group of images 1130 consists of a displacement trajectory, a rotation trajectory, and a scaling trajectory.
FIG. 23 is a flowchart of an embodiment of the image alignment adjustment step of FIG. 18. In step S1310, the offset trajectories of the group 1130, i.e., the displacement, rotation, and scaling trajectories, are calculated from the 24 similarity transformation matrices from images 1102-1125 to the target image 1101. FIGS. 24, 25, and 26 are schematic diagrams of the displacement, rotation, and scaling trajectories after preliminary alignment; the abscissa is the center wavelength of the 25 illumination lights, with lights 212, 213, and 214 corresponding to images 1112, 1113, and 1114 of FIG. 19; the ordinate is the displacement, rotation, or scaling amount; 1401 is the displacement trajectory, 1402 the rotation trajectory, and 1403 the scaling trajectory. In step S1320, the variance of the points of each trajectory, or the gradient between adjacent points, is calculated; a point 1410 whose variance or gradient exceeds a set threshold is judged abnormal, and the image 1113 corresponding to abnormal point 1410 is the clearly deviating image on that trajectory. In step S1330, a new displacement transformation matrix is interpolated from the displacement transformation matrices of images 1112 and 1114 to image 1101; a new rotation transformation matrix is interpolated from their rotation matrices; and a new scaling matrix is interpolated from their scaling matrices. In step S1340, the new displacement, rotation, and scaling matrices are composed into a new similarity transformation matrix, used as the new alignment transformation matrix from image 1113 to image 1101, completing the adjustment of the alignment mapping of the clearly deviating image 1113. If abnormal points are detected on only one or two offset trajectories, only the corresponding basic transformation matrices need adjusting before the new similarity matrix is composed; alternatively, only one offset trajectory may be calculated, and only that basic transformation dimension detected and adjusted.
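A simplified sketch of steps S1310-S1340 follows; it assumes 2x3 similarity matrices, flags abnormal points by a mean-plus-k-sigma gradient test (one possible reading of the "set threshold"), and interpolates each flagged point from its two neighbors:

```python
import numpy as np

def adjust_abnormal(similarities, thresh=3.0):
    """Decompose each preliminary 2x3 similarity transform into
    displacement, rotation, and scale trajectories; flag points whose
    gradient exceeds mean + thresh * std; replace them by interpolating
    between the neighboring images' values; resynthesize the matrices."""
    tx = np.array([m[0, 2] for m in similarities])               # displacement x
    ty = np.array([m[1, 2] for m in similarities])               # displacement y
    rot = np.array([np.arctan2(m[1, 0], m[0, 0]) for m in similarities])
    scale = np.array([np.hypot(m[0, 0], m[1, 0]) for m in similarities])
    for track in (tx, ty, rot, scale):                           # step S1320
        grad = np.abs(np.gradient(track))
        bad = np.where(grad > grad.mean() + thresh * grad.std())[0]
        for i in bad:
            if 0 < i < len(track) - 1:                           # step S1330
                track[i] = 0.5 * (track[i - 1] + track[i + 1])
    out = []
    for i in range(len(similarities)):                           # step S1340
        c, s = scale[i] * np.cos(rot[i]), scale[i] * np.sin(rot[i])
        out.append(np.array([[c, -s, tx[i]], [s, c, ty[i]]]))
    return out
```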
Affine or homography transformations can also be used for the preliminary alignment of the series with respect to the specific region within the offset range; besides displacement, rotation, and scaling trajectories, other offset trajectories such as a shear trajectory or a perspective trajectory can then be calculated, and the corresponding basic transformation dimensions detected and adjusted.
The aligned group of images is arranged, by a software program, in order of the center wavelength of the corresponding illumination light to generate the spectral data cube of the specific region. The user may also select any pixel through the input device, or the gray-level centroid pixel of the specific region may be selected automatically, and its spectral curve generated with center wavelength as abscissa and gray value as ordinate. Spectral curves of all pixels of the specific region can be generated directly; or the gray levels of the specific region and of its corresponding regions in the other images can each be averaged to generate the mean spectral curve of the region. When several specific regions are designated as in FIG. 10, a spectral data cube or spectral curve can be generated for each region separately.
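A minimal sketch of assembling the cube and reading out one pixel's curve (function and argument names are assumptions; grayscale images of identical size are assumed):

```python
import numpy as np

def spectral_curve(aligned, wavelengths, px):
    """Stack the aligned images, sorted by center wavelength, into a
    spectral data cube (H x W x bands) and return the spectral curve
    of pixel px = (x, y)."""
    order = np.argsort(wavelengths)
    cube = np.stack([aligned[i] for i in order], axis=-1)
    x, y = px
    return np.asarray(wavelengths)[order], cube[y, x, :]
```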
All or part of the aligned group of images is fused by a software program to generate an image. In one embodiment, the gray variance of the specific region and of its corresponding regions in the other images is calculated, the images are sorted by variance from large to small, and the top 50% are selected automatically for fusion, i.e., the images with relatively large gray differences are fused. In another embodiment, feature points are re-screened from the specific region and the corresponding regions under the same threshold criterion, the images are sorted by the number of feature points they contain from large to small, and the top 30% are selected automatically, i.e., the images rich in feature information are fused. The user may also select any number of images through the input device, and a color image can be generated by pseudo-color assignment and fusion; or all images may each be treated as one color component, white-balanced, and fused to generate a white-light image.
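The first fusion embodiment can be sketched as follows; a simple mean stands in for the fusion rule (an assumption), and a single region mask suffices because the aligned images share the target's frame:

```python
import numpy as np

def fuse_top_variance(aligned, region_mask, keep=0.5):
    """Rank the aligned grayscale images by gray variance inside the
    specific region and average the top `keep` fraction (50% here,
    per the first fusion embodiment)."""
    var = [float(np.var(img[region_mask > 0])) for img in aligned]
    order = np.argsort(var)[::-1]                    # largest variance first
    n = max(1, int(len(aligned) * keep))
    chosen = [aligned[i].astype(np.float32) for i in order[:n]]
    return np.mean(chosen, axis=0).astype(np.uint8)
```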
In addition, an embodiment of the invention provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor. The memory and the processor communicate through a bus, and the processor calls the program instructions in the memory to execute the following method: identifying a specific region in a target image, wherein the target image is an image arbitrarily selected from a series of images taken of an inspected object under illumination by an illumination unit; calculating the range of offsets that the specific region can undergo in the other images of the series; and aligning the series of images with respect to the specific region within the offset range.
The electronic device provided by the embodiment of the invention can execute the specific steps of the endoscope image processing method and achieves the same technical effects, which are not described again here.
In the above embodiment, the memory is a separate module that includes the storage module mentioned earlier; it may be implemented with a nonvolatile memory such as an electrically erasable programmable read-only memory (EEPROM) or Flash memory, or with a hard disk, optical disc, or magnetic tape. In another embodiment, the memory stores only the program of the endoscope image processing method, the storage module belonging to a separate memory unit independent of the memory; when the method needs data during execution, they are retrieved from the storage module.
FIG. 27 is a schematic view of an embodiment of an endoscope system of the present invention, comprising an illumination unit 1501, an endoscope body 1502, and an electronic device 1506. The illumination unit 1501 generates a plurality of illumination lights, which are guided by the endoscope body 1502 and irradiated in turn onto the inspected object 1520. The endoscope body 1502 includes a camera module 1504 that captures a series of images 1530 of the inspected object 1520 corresponding to the plurality of illumination lights; the camera module 1504 has an optical imaging lens 1505 located at the tip of the endoscope body 1502. A processor in the electronic device runs the endoscope image processing method: designating a specific region in any one of the series of images 1530; estimating the range of offsets that the specific region can undergo in the other images of the series; aligning the series 1530 with respect to the specific region within that range; generating a spectral data cube or spectral curve of the specific region from the aligned series; and fusing all or part of the aligned series to generate an image. A memory stores the images, the specific region, the offset range, the spectral data, and the like. The electronic device also has, or can be externally connected to, a display device for showing the images, the specific region, the spectral data, and so on; a Sony LMD-2451MC medical display, or any other display matching the image resolution, may be chosen. For convenient manual operation, the electronic device can be equipped with an input device, which may be a touch pad, keyboard, voice input device, camera, scanner, or any other input aid. The inspected object 1520 shown in FIG. 27 is only an example; the actual object may be any organ or site to which an endoscope applies, such as the stomach or intestine.
The electronic device may be implemented on an FPGA (Field Programmable Gate Array), on a System on Chip (SoC), on an ASIC (Application Specific Integrated Circuit), on an embedded processor, directly on a computer, or on a combination of one or more of these.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. An endoscopic image processing method, comprising:
identifying a specific region in a target image, wherein the target image is an image arbitrarily selected from a series of images taken of an inspected object under illumination by an illumination unit; calculating the range of offsets that the specific region can undergo in other images in the series of images; and aligning the series of images with respect to the specific region within the offset range;
wherein calculating the range of offsets that the specific region can undergo in the other images of the series comprises: calculating that range based on the pixel offset ranges that pre-calibrated pixel points can undergo;
and wherein, before aligning the series of images with respect to the specific region within the offset range, offset trajectories of the series of images are calculated, and the alignment mapping relationships of some images in the series are adjusted based on the offset trajectories.
2. The endoscope image processing method according to claim 1, wherein identifying the specific region in the target image comprises:
manually designating a region of the target image as the specific region; or, when shooting the target image, adjusting the endoscope body so that the target object falls into a preset area, and taking the preset area as the specific region; or automatically identifying the area of the target image where the target object is located as the specific region.
3. The endoscope image processing method according to claim 1 or 2, wherein there are a plurality of specific regions, different alignment weights are applied to the plurality of specific regions, and the series of images is aligned within the offset range based on the alignment weights.
4. The endoscope image processing method according to claim 1, wherein calculating the offset range that the specific region can produce in the other images based on the pixel offset ranges that pre-calibrated pixel points can produce comprises:
acquiring the pixel offset ranges of at least three pre-calibrated pixel points;
taking as the offset range either the closed region formed by the intersection of the outermost tangent lines of the pixel offset ranges, or the outer boundary of the pixel offset ranges, as sketched after the claims;
wherein the at least three pixel points include three pixel points located on the boundary of the specific region.
5. The endoscope image processing method according to claim 1, wherein aligning the series of images to the specific region within the offset range comprises:
searching for feature points in the specific region of the target image and within the offset range of the image to be aligned, matching them into feature point pairs, and calculating an alignment mapping relation between the image to be aligned and the target image based on the matched pairs, as in the alignment sketch after the claims.
6. The endoscope image processing method according to claim 5, further comprising, before searching for the feature points: performing distortion correction, based on a distortion coefficient, on the specific region of the target image and on the offset range of the image to be aligned; and, after the alignment mapping relation has been computed, composing it with the distortion correction based on the distortion coefficient, thereby aligning the image to be aligned to the target image.
7. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor performs the steps of the endoscope image processing method according to any one of claims 1-6 when executing the program.
8. An endoscope system comprising an endoscope body and further comprising the electronic device of claim 7, wherein the processor is communicatively coupled to an imaging module within the endoscope body.
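As an illustration of the offset-range construction in claim 4: the sketch below assumes each pre-calibrated pixel offset range is a disc (calibrated point plus maximum offset radius), and approximates the outer boundary of the union of the discs by the convex hull of points sampled on each disc boundary. The disc representation, the sampling density, and the helper name are assumptions made for this example.

```python
import cv2
import numpy as np

def offset_range_from_calibration(points, radii, samples=64):
    """points: (N, 2) array of pre-calibrated pixel locations (at least
    three, on the boundary of the specific region); radii: per-point
    maximum pixel offset. Returns a polygon enclosing the outer boundary
    of all per-pixel offset ranges."""
    theta = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    ring = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # unit circle
    # Sample the boundary circle of every per-pixel offset disc.
    pts = np.concatenate([p + r * ring for p, r in zip(points, radii)])
    hull = cv2.convexHull(pts.astype(np.float32))
    return hull.reshape(-1, 2)
```

The claim's alternative construction, intersecting the outermost tangent lines, would yield a slightly larger polygon; the convex hull shown here corresponds to the tighter "outer boundary" variant.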
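For claims 5 and 6, one common way to realize feature-point alignment restricted to the specific region and the offset range is ORB matching with masks, followed by a RANSAC-fitted homography; the sketch below uses that approach. The choice of ORB, the mask-based restriction, and the helper name are assumptions: the claims do not prescribe a particular feature detector.

```python
import cv2
import numpy as np

def alignment_mapping(target, moving, region_mask, offset_mask, K=None, dist=None):
    """region_mask / offset_mask: uint8 masks that are nonzero over the
    specific region of the target and the offset range of the image to
    be aligned. K, dist: optional camera matrix and distortion
    coefficients for claim 6 style pre-correction."""
    if K is not None and dist is not None:
        # Distortion correction before the feature search (claim 6).
        target = cv2.undistort(target, K, dist)
        moving = cv2.undistort(moving, K, dist)
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(target, region_mask)
    kp2, des2 = orb.detectAndCompute(moving, offset_mask)
    if des1 is None or des2 is None:
        return None  # not enough texture to align
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < 4:
        return None  # a homography needs at least four point pairs
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    # Robustly fit the alignment mapping from the matched pairs (claim 5).
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Per claim 6, H relates the undistorted images; to align the original
    # frame, compose H with the undistortion and re-distortion mappings.
    return H
```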
CN201911399032.2A 2019-12-30 2019-12-30 Endoscope image processing method, electronic equipment and endoscope system Active CN111161852B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911399032.2A CN111161852B (en) 2019-12-30 2019-12-30 Endoscope image processing method, electronic equipment and endoscope system

Publications (2)

Publication Number Publication Date
CN111161852A CN111161852A (en) 2020-05-15
CN111161852B true CN111161852B (en) 2023-08-15

Family

ID=70559586

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911399032.2A Active CN111161852B (en) 2019-12-30 2019-12-30 Endoscope image processing method, electronic equipment and endoscope system

Country Status (1)

Country Link
CN (1) CN111161852B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113068015A (en) * 2021-03-24 2021-07-02 南京锐普创科科技有限公司 Endoscope image distortion correction system based on optical fiber probe
CN113344987A (en) * 2021-07-07 2021-09-03 华北电力大学(保定) Infrared and visible light image registration method and system for power equipment under complex background

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5715372B2 (en) * 2010-10-15 2015-05-07 オリンパス株式会社 Image processing apparatus, method of operating image processing apparatus, and endoscope apparatus
US10825152B2 (en) * 2017-09-14 2020-11-03 Canon U.S.A., Inc. Distortion measurement and correction for spectrally encoded endoscopy
US11298001B2 (en) * 2018-03-29 2022-04-12 Canon U.S.A., Inc. Calibration tool for rotating endoscope

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5432543A (en) * 1992-03-05 1995-07-11 Olympus Optical Co., Ltd. Endoscopic image processing device for estimating three-dimensional shape of object based on detection of same point on a plurality of different images
JP2001136540A (en) * 1999-11-05 2001-05-18 Olympus Optical Co Ltd Image processor
US6842196B1 (en) * 2000-04-04 2005-01-11 Smith & Nephew, Inc. Method and system for automatic correction of motion artifacts
JP2010041418A (en) * 2008-08-05 2010-02-18 Olympus Corp Image processor, image processing program, image processing method, and electronic apparatus
CN104411229A (en) * 2012-06-28 2015-03-11 奥林巴斯株式会社 Image processing device, image processing method, and image processing program
CN103035004A (en) * 2012-12-10 2013-04-10 浙江大学 Circular target centralized positioning method under large visual field
CN105931237A (en) * 2016-04-19 2016-09-07 北京理工大学 Image calibration method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Qian Qu. Research on image unfolding of industrial pipelines based on endoscope video. China Master's Theses Full-text Database, Information Science and Technology, 2021, No. 1, I138-1415. *

Also Published As

Publication number Publication date
CN111161852A (en) 2020-05-15

Similar Documents

Publication Publication Date Title
JP6150583B2 (en) Image processing apparatus, endoscope apparatus, program, and operation method of image processing apparatus
CN113573654B (en) AI system, method and storage medium for detecting and determining lesion size
JP6045417B2 (en) Image processing apparatus, electronic apparatus, endoscope apparatus, program, and operation method of image processing apparatus
JP6049518B2 (en) Image processing apparatus, endoscope apparatus, program, and operation method of image processing apparatus
CN111275041B (en) Endoscope image display method and device, computer equipment and storage medium
US20220125280A1 (en) Apparatuses and methods involving multi-modal imaging of a sample
JP6967602B2 (en) Inspection support device, endoscope device, operation method of endoscope device, and inspection support program
CN111091562B (en) Method and system for measuring size of digestive tract lesion
JP2010279539A (en) Diagnosis supporting apparatus, method, and program
CN113543694A (en) Medical image processing device, processor device, endoscope system, medical image processing method, and program
US8666135B2 (en) Image processing apparatus
CN111161852B (en) Endoscope image processing method, electronic equipment and endoscope system
JP6952214B2 (en) Endoscope processor, information processing device, endoscope system, program and information processing method
JP6168876B2 (en) Detection device, learning device, detection method, learning method, and program
JP7385731B2 (en) Endoscope system, image processing device operating method, and endoscope
JP6150617B2 (en) Detection device, learning device, detection method, learning method, and program
CN107529962B (en) Image processing apparatus, image processing method, and recording medium
CN114418920B (en) Endoscope multi-focus image fusion method
JP6168878B2 (en) Image processing apparatus, endoscope apparatus, and image processing method
KR101559253B1 (en) Apparatus and method for shooting of tongue using illumination
Ali Total variational optical flow for robust and accurate bladder image mosaicing
JP4856275B2 (en) Medical image processing device
JP2023011303A (en) Medical image processing apparatus and operating method of the same
CN118614857A (en) 3D color fluorescence high-definition imaging method and system for pleuroperitoneal cavity as well as pleuroperitoneal cavity
CN114627045A (en) Medical image processing system and method for operating medical image processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant