CN113763298A - Endoscope image processing method, endoscope image processing device, endoscope, and storage medium - Google Patents

Endoscope image processing method, endoscope image processing device, endoscope, and storage medium

Info

Publication number
CN113763298A
CN113763298A (application CN202110861542.8A)
Authority
CN
China
Prior art keywords
image
frame
target
region
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110861542.8A
Other languages
Chinese (zh)
Inventor
袁耀峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huanuokang Technology Co ltd
Original Assignee
Zhejiang Huanuokang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Huanuokang Technology Co ltd
Priority to CN202110861542.8A
Publication of CN113763298A
Legal status: Pending

Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06T: Image Data Processing or Generation, in General
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/10068: Endoscopic image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Endoscopes (AREA)
  • Image Processing (AREA)

Abstract

The present application relates to an endoscope image processing method, an endoscope image processing device, an endoscope, and a storage medium. The method includes: acquiring target region frame images from raw image encoded data; performing sharpness evaluation on the target region frame images to obtain corresponding sharpness values; screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold; and performing multi-frame weighted fusion on those target frame images to obtain the target image. The method solves the problem in the related art of low output image quality, and outputs a high-quality target image by combining sharpness values with multi-frame weighted fusion.

Description

Endoscope image processing method, endoscope image processing device, endoscope, and storage medium
Technical Field
The present application relates to the field of image processing, and more particularly to an endoscopic image processing method, device, endoscope, and storage medium.
Background
An endoscope is an optical device that enters the body through a natural orifice or a small surgical incision to view a region of interest. Endoscopes of different lengths and diameters are used to examine different regions of the human body and, according to the region examined, are divided into many types, such as laparoscopes, neuroendoscopes, arthroscopes, and esophagoscopes. Endoscopes are widely used in the examination and treatment of many parts of the human body and are an extremely important device in modern medicine.
Endoscope technology has developed over several decades. The main structure of today's medical endoscope is a long flexible tube: medical staff turn a control knob to pull cables inside the tube, which move a ring-shaped part inside the bending section up and down, so that the front end of the endoscope bends. During a procedure, images can be saved with a snapshot button on the handle, but hand shake often leaves the captured image smeared and unclear. The image processing schemes currently adopted to address this process the captured image with algorithms such as block gradient computation and edge detection, but the image quality output by such algorithms is not high.
At present, no effective solution is provided for the problem of low quality of output images in the related art.
Disclosure of Invention
The embodiments of the present application provide an endoscope image processing method, an endoscope image processing device, an endoscope, and a storage medium, to solve the problem in the related art that the quality of the output image is not high.
In a first aspect, an embodiment of the present application provides an endoscope image processing method, including:
acquiring target region frame images from raw image encoded data;
performing sharpness evaluation on the target region frame images to obtain corresponding sharpness values;
and screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and performing multi-frame weighted fusion on those target frame images to obtain the target image.
In some embodiments, acquiring the target region frame images from the raw image encoded data includes:
acquiring encoded image data of a preset region of interest from the raw image encoded data;
and performing digital image processing on the encoded image data of the region of interest to obtain the target region frame image.
In some embodiments, acquiring the region of interest from the raw image encoded data includes:
acquiring photosensitive line information from the raw image encoded data;
and extracting the region of interest from the raw image encoded data based on the photosensitive line information, using a preset image region-of-interest extraction formula.
In some embodiments, performing digital image processing on the encoded image data of the region of interest to obtain the target region frame image includes:
performing basic digital image processing on the encoded image data of the region of interest to obtain a region-of-interest image;
and performing center-aligned fusion of the region-of-interest image and a preprocessed non-region-of-interest image to obtain the target region frame image.
In some embodiments, before performing sharpness evaluation on the target region frame image, the method further includes:
determining the data type of the target region frame image;
and if the data type of the target region frame image is the RGB data type, converting the RGB target region frame image into a YUV target region frame image using a first conversion formula.
In some embodiments, performing sharpness evaluation on the target region frame image to obtain a corresponding sharpness value includes:
determining, using an accumulation formula, the brightness sum of the YUV target region frame image based on a preset sampling value;
determining the brightness average of the target region frame image from the preset sampling value and the brightness sum;
and determining the sharpness value of the target region frame image based on the brightness average, using a normalization formula.
In some embodiments, screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold includes:
sorting the target region frame images by sharpness based on the sharpness values, and selecting from the sorted result the target frame images of the several frames satisfying the preset frame number threshold.
In some embodiments, performing multi-frame weighted fusion on the several target frame images to obtain the target image includes:
performing multi-frame weighted fusion on the target frame images using a weighted fusion formula to obtain a fused image;
and padding the fused image to obtain the target image.
In some embodiments, the endoscope image processing method provided by the present application further includes:
after the fused image is obtained, converting the fused image into a fused image of the RGB data type using a second conversion formula.
In some embodiments, the endoscope image processing method provided by the present application further includes:
after the target region frame image is obtained, outputting a video stream in real time based on the target region frame image.
In a second aspect, an embodiment of the present application provides an endoscope image processing device, including an acquisition module, an evaluation module, and a fusion module;
the acquisition module is used to acquire target region frame images from raw image encoded data;
the evaluation module is used to perform sharpness evaluation on the target region frame images using a sharpness algorithm, to obtain the sharpness value of each target region frame image;
and the fusion module is used to screen out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and to perform multi-frame weighted fusion on those target frame images to obtain the target image.
In a third aspect, an embodiment of the present application provides an endoscope whose image display system comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the endoscope image processing method of the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a storage medium on which a computer program is stored which, when executed by a processor, implements the endoscope image processing method of the first aspect.
Compared with the related art, the endoscope image processing method and device, the endoscope, and the storage medium provided by the embodiments of the present application acquire target region frame images from raw image encoded data; perform sharpness evaluation on the target region frame images to obtain corresponding sharpness values; screen out, based on the sharpness values, several target frame images satisfying a preset frame number threshold; and perform multi-frame weighted fusion on those target frame images to obtain the target image. This solves the problem in the related art of low output image quality and outputs a high-quality target image by combining sharpness values with multi-frame weighted fusion.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a partial hardware configuration block diagram of an endoscopic image display system according to an embodiment of the present application;
FIG. 2 is a flow chart of an endoscopic image processing method provided by an embodiment of the present application;
FIG. 3 is a flowchart of step S210 in FIG. 2;
FIG. 4 is a schematic view of a region of interest being entirely within an active area as provided by an embodiment of the present application;
FIG. 5 is a schematic view of a region of interest partially within an active area provided by an embodiment of the present application;
FIG. 6 is a flowchart of step S230 in FIG. 2;
fig. 7 is a block diagram showing a configuration of an endoscopic image processing apparatus according to an embodiment of the present application.
In the figure: 210, acquisition module; 220, evaluation module; 230, fusion module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by a person of ordinary skill in the art to which this application belongs. The words "a", "an", "the", and similar words in this application do not denote a limitation of quantity and may refer to the singular or the plural. The terms "include", "comprise", "have", and any variations thereof in this application are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or apparatus that comprises a list of steps or modules (units) is not limited to the listed steps or units, but may include other steps or units not expressly listed or inherent to such process, method, product, or apparatus. Words such as "connected" and "coupled" in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "A plurality" herein means two or more. "And/or" describes an association between associated objects and means that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The terms "first", "second", "third", and the like herein merely distinguish similar objects and do not denote a particular ordering of the objects.
The method embodiments provided below can be executed in an endoscope image display system. Taking the image display system as an example, fig. 1 is a block diagram of part of the hardware of an image display system for the endoscope image processing method according to an embodiment of the present application. As shown in fig. 1, the image display system may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input/output device 108. A person skilled in the art will understand that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the image display system; for example, the image display system may include more or fewer components than shown in fig. 1, or have a different configuration. In other embodiments, the method embodiments may be executed on a terminal, a computer, or a similar computing device.
The memory 104 can be used for storing computer programs, for example, software programs and modules of application software, such as a computer program corresponding to the endoscope image processing method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to an image display system via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The specific example of the network described above may include a wireless network provided by a communication provider of the image display system. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
The present embodiment provides an endoscope image processing method. Fig. 2 is a flowchart of the endoscope image processing method according to an embodiment of the present application; as shown in fig. 2, the process includes the following steps:
Step S210, acquiring target region frame images from raw image encoded data;
Step S220, performing sharpness evaluation on the target region frame images to obtain corresponding sharpness values;
Step S230, screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and performing multi-frame weighted fusion on those target frame images to obtain the target image.
Raw image encoded data (RAW data) can be obtained from the image sensor of a medical endoscope; it contains the sensor data needed to create a visible image. A target region frame image is one frame of the image of a region, and the successive target region frame images together form the output video of the current region. It can be obtained by applying image signal processing (ISP), implemented in software and hardware inside the endoscope, to the raw image encoded data, which reduces the resource requirements of the subsequent algorithms. While using a medical endoscope, the user often needs to save the image currently being viewed.
In this embodiment, to improve the quality of the saved image and achieve a 4K high-definition result, sharpness evaluation is performed on the target region frame image of each frame to obtain each frame's sharpness value. Based on the sharpness values, several target frame images satisfying a preset frame number threshold are screened out of the target region frame images: unclear or low-quality target region frame images are removed and sharp target frame images are kept. Multi-frame weighted fusion is then performed on these target frame images to obtain a high-quality target image. In this embodiment, the multi-frame weighted fusion may use a weighted-average fusion algorithm, a machine-learned fusion model, or the like; it is not limited here.
Through the above steps, the problem in the related art of low output image quality is solved, and a high-quality target image is output by combining sharpness values with multi-frame weighted fusion.
The above steps are explained in detail below:
in one embodiment, as shown in fig. 3, step S210 includes the following steps;
step S211, acquiring image coded data of a preset region of interest from the original image coded data;
the method specifically comprises the following steps: acquiring photosensitive line information from original image coded data; and extracting the original image coded data based on the photosensitive line information by using a preset image region-of-interest extraction formula to obtain a region-of-interest.
The photosensitive line information is the number of photosensitive lines of maximum length in each frame of raw image encoded data. This information is obtained from each frame of raw image encoded data, and the radius of the region of interest is determined from it with the image region-of-interest extraction formula:

$$ r = \begin{cases} \dfrac{x}{2}, & a = 1 \\[6pt] \sqrt{\left(\dfrac{x}{2}\right)^{2} + \left(\dfrac{y}{2}\right)^{2}}, & a \geq 2 \end{cases} $$

where x is the length of the longest photosensitive line; y is the maximum line spacing between the longest photosensitive lines; a is the number of photosensitive lines of maximum length; and r is the radius of the region of interest.
As shown in fig. 4, when the region of interest lies entirely within the active area, its center is the circle center, the number of longest photosensitive lines is 1, and the radius of the region of interest is half the length of the longest photosensitive line. As shown in fig. 5, when the region of interest lies only partly within the active area, the number of photosensitive lines of maximum length is greater than or equal to 2: the maximum line spacing y between those lines and their length x are computed, the center of the region of interest is taken as the circle center, and the radius is obtained with the Pythagorean theorem. The raw image encoded data is then cropped according to this radius to obtain the region of interest.
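As a minimal sketch of this extraction step, assuming the raw frame has already been thresholded into a boolean mask of photosensitive pixels and that each image row is one "photosensitive line" (the function and variable names here are illustrative, not from the patent):

```python
import numpy as np

def roi_radius(mask: np.ndarray) -> float:
    """Radius of the circular region of interest from a boolean mask
    of photosensitive pixels, following the two cases above."""
    run_lengths = mask.sum(axis=1)             # photosensitive length of each row
    x = run_lengths.max()                      # length of the longest line(s)
    rows = np.flatnonzero(run_lengths == x)    # rows that reach that length
    a = len(rows)                              # number of maximal lines
    if a == 1:                                 # fig. 4: ROI fully in the active area
        return x / 2.0
    y = rows.max() - rows.min()                # fig. 5: maximum spacing between them
    return float(np.hypot(x / 2.0, y / 2.0))   # Pythagorean theorem
```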
Step S212, digital image processing is carried out on the image coding data of the region of interest to obtain a frame image of the target region.
Specifically: basic digital image processing is performed on the encoded image data of the region of interest to obtain a region-of-interest image, and the region-of-interest image is then center-aligned and fused with a preprocessed non-region-of-interest image to obtain the target region frame image.
Basic digital image processing refers to the conversion of the RAW format into the RGB or YUV format. Because it involves a large amount of data under strict real-time requirements, it can be implemented with a combined software and hardware scheme. In this embodiment it proceeds as follows: apply automatic white balance (AWB) to the encoded image data of the region of interest; convert the white-balanced data to an RGB image with a demosaicing algorithm; perform color interpolation with a RAW-to-RGB algorithm and apply gamma correction; and finally perform color correction and color gamut conversion (CCM and CSC) to obtain the region-of-interest image. In other embodiments, a series of processing blocks connected end to end can run simultaneously at high speed, driven by a clock of several hundred MHz, with data passed from one block to the next until all algorithmic processing is complete and the image, in YUV or RGB format, streams out of the last stage of the pipeline. Digital image processing differs from basic digital image processing in that the latter does not include the center-aligned fusion of the region-of-interest image with the non-region-of-interest image.
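The following is a rough, illustrative sketch of such a RAW-to-RGB chain in Python with OpenCV and NumPy; the white-balance gains, gamma value, Bayer pattern, and CCM are placeholder assumptions, not values from the patent:

```python
import cv2
import numpy as np

def basic_isp(bayer, wb_gains=(1.9, 1.0, 1.6), gamma=2.2, ccm=None):
    """RAW (assumed 8-bit RGGB Bayer) -> RGB: AWB, demosaic, gamma, CCM."""
    raw = bayer.astype(np.float32)
    raw[0::2, 0::2] *= wb_gains[0]            # AWB gain on the R sites
    raw[1::2, 1::2] *= wb_gains[2]            # AWB gain on the B sites
    raw = np.clip(raw, 0, 255).astype(np.uint8)
    rgb = cv2.demosaicing(raw, cv2.COLOR_BayerRG2RGB)   # color interpolation
    rgb = rgb.astype(np.float32) / 255.0
    rgb = rgb ** (1.0 / gamma)                # gamma correction
    if ccm is not None:                       # 3x3 color correction matrix
        rgb = np.clip(rgb @ np.asarray(ccm).T, 0.0, 1.0)
    return (rgb * 255).astype(np.uint8)
```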
Preprocessing of the non-region-of-interest image means that, after the encoded image data of the region of interest is obtained, the encoded image data of the minimum bounding rectangle of the region of interest is cropped out and the region-of-interest data is removed from it; what remains is the encoded image data of the non-region of interest, and basic digital image processing is performed on this part to obtain the non-region-of-interest image. To further improve processing efficiency, the non-region-of-interest image can instead be obtained by simply filling its pixels with 0, without basic digital image processing. The region-of-interest image and the non-region-of-interest image are then aligned on their centers and fused to obtain the target region frame image; this center-aligned fusion can be implemented with existing image stitching techniques and is not described further here.
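A minimal sketch of the zero-filling variant of this composition, assuming the ROI image fits inside the output canvas (names are illustrative):

```python
import numpy as np

def compose_target_frame(roi_img, full_shape, center):
    """Paste the ROI image into a zero-filled (non-ROI) canvas of shape
    full_shape, aligned on the given (row, col) center."""
    canvas = np.zeros(full_shape, dtype=roi_img.dtype)  # non-ROI pixels set to 0
    h, w = roi_img.shape[:2]
    cy, cx = center
    top, left = cy - h // 2, cx - w // 2
    canvas[top:top + h, left:left + w] = roi_img        # center-aligned paste
    return canvas
```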
Before sharpness evaluation of the target region frame images: if no snapshot is requested and no high-quality image needs to be produced, the above steps simply repeat and the video stream is output in real time. When a snapshot occurs, the target region frame images of T consecutive frames are stored in a buffer and handed to the subsequent processing.
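A small sketch of that buffering logic; the buffer depth T and the callback names are illustrative assumptions:

```python
from collections import deque

T = 8                                    # assumed buffer depth
frame_buffer = deque(maxlen=T)           # always holds the last T target-region frames

def on_new_frame(frame, snapshot_requested, display):
    display(frame)                       # real-time video stream output
    frame_buffer.append(frame)
    if snapshot_requested:               # hand T consecutive frames to fusion
        return list(frame_buffer)
    return None
```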
Digital image processing produces two data types, RGB and YUV. To handle target region frame images of either type conveniently, the data type of the target region frame image is determined; if it is the RGB data type, the RGB target region frame image is converted into a YUV target region frame image with the first conversion formula, and processing then continues with steps S220 and S230. If the data type is already YUV, steps S220 and S230 are performed directly.
The first conversion formula is the standard BT.601 RGB-to-YUV transform:

$$ \begin{aligned} Y &= 0.299R + 0.587G + 0.114B \\ U &= -0.147R - 0.289G + 0.436B \\ V &= 0.615R - 0.515G - 0.100B \end{aligned} $$

where R, G, and B are the red, green, and blue channels; Y is the luma (brightness); and U and V are the chroma components.
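In code, this per-pixel transform is just a 3 × 3 matrix product (a sketch assuming float RGB arrays of shape H × W × 3; the function name is illustrative):

```python
import numpy as np

RGB2YUV = np.array([[ 0.299,  0.587,  0.114],   # Y row
                    [-0.147, -0.289,  0.436],   # U row
                    [ 0.615, -0.515, -0.100]])  # V row

def rgb_to_yuv(rgb):
    """BT.601 RGB -> YUV, applied per pixel."""
    return rgb @ RGB2YUV.T
```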
In this embodiment, step S220 includes the following steps:
Step S221, determining, using the accumulation formula, the brightness sum of the YUV target region frame image based on a preset sampling value;
Step S222, determining the brightness average of the target region frame image from the preset sampling value and the brightness sum;
Step S223, determining the sharpness value of the target region frame image based on the brightness average, using the normalization formula.
Specifically, with the sampling value taken as an N × N window and $Y_w$ denoting the sum of the Y values inside window w, the accumulation formula is:

$$ Y_{\mathrm{sum}} = \sum_{w:\, Y_w < \mathrm{threshold}} Y_w, \qquad \mathrm{number} = \bigl|\{\, w : Y_w < \mathrm{threshold} \,\}\bigr| $$

where Y is the brightness; $Y_{\mathrm{sum}}$ is the accumulated sum of the Y values over all qualifying window regions; threshold is the brightness threshold; and number is the accumulation count.
in this embodiment, for the frame image of the target area of each frame, the area with the sampling value of N × N is adopted, and the sum Y of the accumulated Y values of all window areas is sequentially calculated from left to right and from top to bottom in a sliding window mannersumAnd records the number of accumulation times (as in the above accumulation formula). If the cumulative sum of the Y values of the current window is larger than or equal to the brightness threshold (threshold), the current window does not perform the cumulative calculation, and the next window is continued until the current frame is finished. Calculating the average brightness value of the current frame
Figure BDA0003185929720000091
Average value of brightness
Figure BDA0003185929720000092
And calculating the sum of squares of average brightness difference values of all windows participating in the calculation of the brightness average value in the current frame by taking the brightness average value as reference, and then standardizing the sum of squares by using the accumulation times number of the windows, wherein the standardized definition value S represents the average degree of brightness change of the image, and the larger the average degree of the brightness change is, the clearer the image is, the smaller the average degree of the brightness change is, the more fuzzy the image is, thereby quickly determining the definition of the frame image of the target area.
The sharpness value S is therefore:

$$ S = \frac{1}{\mathrm{number}} \sum_{w} \left( \bar{Y}_w - \bar{Y} \right)^{2}, \qquad \bar{Y}_w = \frac{Y_w}{N^{2}} $$

where the sum runs over the windows included in the accumulation.
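Putting step S220 together as a sketch; the window size N, the brightness threshold, and the use of non-overlapping windows are placeholder assumptions:

```python
import numpy as np

def sharpness_value(y_plane, n=16, threshold=200.0 * 16 * 16):
    """Windowed brightness-variance sharpness S of a luma plane."""
    h, w = y_plane.shape
    window_means = []
    for r in range(0, h - n + 1, n):                  # slide the N x N window
        for c in range(0, w - n + 1, n):
            y_w = float(y_plane[r:r + n, c:c + n].sum())
            if y_w < threshold:                       # windows at/above threshold are skipped
                window_means.append(y_w / (n * n))    # average brightness of window w
    if not window_means:
        return 0.0
    means = np.asarray(window_means)
    y_bar = means.mean()                              # frame brightness average
    return float(((means - y_bar) ** 2).mean())       # normalized by `number`
```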
in this embodiment, step S230, as shown in fig. 6, includes the following steps:
and S231, performing definition sequencing on the frame images in the target area based on the definition value, and screening out target frame images of a plurality of frames meeting a preset frame number threshold in a definition sequencing result.
Specifically, there are two forms of the sharpness ordering, which may be a sequential ordering or a reverse ordering. If the sequence is ordered, selecting target frame images with the number of the frame number threshold M arranged in the front; if the sequence is the reverse sequence, then the target frame images with the number of the frame number threshold M arranged at the back are selected.
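A small sketch of step S231 using a descending sort (names are illustrative):

```python
def select_sharpest(frames, s_values, m):
    """Keep the M sharpest buffered frames, preserving their temporal order."""
    order = sorted(range(len(frames)), key=lambda k: s_values[k], reverse=True)
    keep = sorted(order[:m])                          # back to temporal order
    return [frames[k] for k in keep], [s_values[k] for k in keep]
```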
Step S232, performing multi-frame weighted fusion processing on a plurality of target frame images by using a weighted fusion formula to obtain a fused image;
and step S233, filling the fused image to obtain a target image.
Specifically, the weight of each target frame image is determined with the following formula:

$$ W_n = \frac{S_n}{\sum_{m=1}^{M} S_m} $$

where $W_n$ is the weight of the nth target frame image; $S_n$ is the sharpness value of the nth target frame image; and M is the frame number threshold, i.e. M target frame images are selected.
Then the multi-frame weighted fusion is applied to the M target frame images with the weighted fusion formulas:

$$ Y_{i,j} = \sum_{n=1}^{M} W_n\, Y^{(n)}_{i,j}, \qquad U_{i,j} = \sum_{n=1}^{M} W_n\, U^{(n)}_{i,j}, \qquad V_{i,j} = \sum_{n=1}^{M} W_n\, V^{(n)}_{i,j} $$

for 0 ≤ i < width and 0 ≤ j < height, where width and height are the width and height of the M target frame images; i and j index the pixel position along the width and the height; $Y^{(n)}_{i,j}$, $U^{(n)}_{i,j}$, and $V^{(n)}_{i,j}$ are the luma and the two chroma values of the nth frame at position (i, j); and $Y_{i,j}$, $U_{i,j}$, and $V_{i,j}$ are the corresponding weight-fused values over the M frames.
The three formulas above fuse the three channels of the M target frame images into the fused image. The fused image is then padded to the current output resolution, filling 0 on both sides, to obtain the target image.
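A compact sketch of steps S232 and S233, assuming the frames are float YUV arrays of identical shape and that the output width is a configuration value:

```python
import numpy as np

def fuse_and_pad(yuv_frames, s_values, out_width):
    """Sharpness-weighted fusion of M YUV frames, then zero-fill both sides."""
    stack = np.stack([f.astype(np.float64) for f in yuv_frames])  # M x H x W x 3
    w = np.asarray(s_values, dtype=np.float64)
    w /= w.sum()                                  # W_n = S_n / sum of S_m
    fused = np.tensordot(w, stack, axes=1)        # sum over n of W_n * frame_n
    pad = out_width - fused.shape[1]
    left, right = pad // 2, pad - pad // 2
    return np.pad(fused, ((0, 0), (left, right), (0, 0)))  # fill 0 on both sides
```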
Because the M frames satisfying the frame number threshold are taken from T consecutive target frames, their content differs little, so the pixels are weighted and fused directly. If frame skipping is used, or T is large, larger content differences are possible; in that case, before the weighted fusion formula is applied, window features are first matched with the SIFT algorithm and the weighted fusion is performed on the matched windows, which compensates well for content differences between the target frame images.
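One plausible way to realize that matching step with OpenCV's SIFT, sketched here as a whole-frame homography alignment rather than the patent's per-window matching (that simplification is an assumption on my part; inputs are uint8 luma planes):

```python
import cv2
import numpy as np

def align_to_reference(ref_y, mov_y):
    """Warp mov_y onto ref_y using SIFT matches, so the weighted fusion
    operates on corresponding content."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(ref_y, None)
    kp2, des2 = sift.detectAndCompute(mov_y, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des2, des1, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test
    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = ref_y.shape
    return cv2.warpPerspective(mov_y, H, (w, h))
```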
To give the output target image a wider range of uses and let the user view it directly, the method further includes, after the fused image is obtained:
converting the fused image into a fused image of the RGB data type using the second conversion formula.
The second conversion formula is the inverse of the first:

$$ \begin{aligned} R &= Y + 1.140V \\ G &= Y - 0.395U - 0.581V \\ B &= Y + 2.032U \end{aligned} $$
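Mirroring the earlier conversion sketch, the inverse is again a 3 × 3 matrix product (illustrative, assuming the same float YUV layout):

```python
import numpy as np

YUV2RGB = np.array([[1.0,  0.000,  1.140],   # R row
                    [1.0, -0.395, -0.581],   # G row
                    [1.0,  2.032,  0.000]])  # B row

def yuv_to_rgb(yuv):
    """BT.601 YUV -> RGB, the inverse of rgb_to_yuv above."""
    return yuv @ YUV2RGB.T
```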
it should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment also provides an endoscope image processing device, which is used to implement the above embodiments and preferred implementations; what has already been described is not repeated here. As used below, the terms "module", "unit", "sub-unit", and the like may be combinations of software and/or hardware that implement predetermined functions. Although the devices described in the following embodiments are preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.
Fig. 7 is a block diagram of the endoscope image processing device according to an embodiment of the present application. As shown in fig. 7, the device includes: an acquisition module 210, an evaluation module 220, and a fusion module 230;
the acquisition module 210 is used to acquire target region frame images from raw image encoded data;
the evaluation module 220 is used to perform sharpness evaluation on the target region frame images using a sharpness algorithm, to obtain the sharpness value of each target region frame image;
and the fusion module 230 is used to screen out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and to perform multi-frame weighted fusion on those target frame images to obtain the target image.
With this device, the problem in the related art of low output image quality is solved, and a high-quality target image is output by combining sharpness values with multi-frame weighted fusion.
In one embodiment, the acquisition module 210 is further used to acquire encoded image data of a preset region of interest from the raw image encoded data, and to perform digital image processing on the encoded image data of the region of interest to obtain the target region frame image.
In one embodiment, the endoscope image processing device provided by the present application further includes a determining module, configured to determine the data type of the target region frame image and, if it is the RGB data type, to convert the RGB target region frame image into a YUV target region frame image using the first conversion formula.
In one embodiment, the evaluation module 220 is further used to determine, using the accumulation formula, the brightness sum of the YUV target region frame image based on a preset sampling value; to determine the brightness average of the target region frame image from the preset sampling value and the brightness sum; and to determine the sharpness value of the target region frame image based on the brightness average, using the normalization formula.
In one embodiment, the fusion module 230 is further used to perform multi-frame weighted fusion on the several target frame images using the weighted fusion formula to obtain the fused image; to convert the fused image into a fused image of the RGB data type using the second conversion formula; and to pad the fused image to obtain the target image.
In one embodiment, the endoscope image processing device provided by the present application further includes a video output module, used to output a video stream in real time based on the target region frame images after they are obtained.
The present embodiment also provides an endoscope whose image display system comprises a memory and a processor, the memory storing a computer program and the processor being arranged to run the computer program to perform the steps of any of the method embodiments above.
Optionally, the endoscope may further include a transmission device and an input/output device, both connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by means of the computer program:
S1, acquiring target region frame images from raw image encoded data;
S2, performing sharpness evaluation on the target region frame images using a sharpness algorithm to obtain their sharpness values;
S3, screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and performing multi-frame weighted fusion on those target frame images to obtain the target image.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the endoscope image processing method of the above embodiments, an embodiment of the present application can be implemented by providing a storage medium on which a computer program is stored; the computer program, when executed by a processor, implements any of the endoscope image processing methods of the above embodiments.
Those skilled in the art will understand that the features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations are described, but any combination of these features that contains no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application; their description is specific and detailed but should not therefore be understood as limiting the scope of the invention. A person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within its scope of protection. The scope of protection of this patent shall therefore be subject to the appended claims.

Claims (13)

1. An endoscope image processing method, comprising:
acquiring target region frame images from raw image encoded data;
performing sharpness evaluation on the target region frame images to obtain corresponding sharpness values;
and screening out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and performing multi-frame weighted fusion on those target frame images to obtain the target image.
2. The endoscope image processing method of claim 1, wherein the acquiring of target region frame images from raw image encoded data comprises:
acquiring encoded image data of a preset region of interest from the raw image encoded data;
and performing digital image processing on the encoded image data of the region of interest to obtain the target region frame image.
3. The endoscope image processing method of claim 2, wherein the acquiring of the region of interest from the raw image encoded data comprises:
acquiring photosensitive line information from the raw image encoded data;
and extracting the region of interest from the raw image encoded data based on the photosensitive line information, using a preset image region-of-interest extraction formula.
4. The endoscope image processing method of claim 2, wherein the performing of digital image processing on the encoded image data of the region of interest to obtain the target region frame image comprises:
performing basic digital image processing on the encoded image data of the region of interest to obtain a region-of-interest image;
and performing center-aligned fusion of the region-of-interest image and a preprocessed non-region-of-interest image to obtain the target region frame image.
5. The endoscope image processing method of claim 1, further comprising, before performing sharpness evaluation on the target region frame image:
determining the data type of the target region frame image;
and if the data type of the target region frame image is the RGB data type, converting the RGB target region frame image into a YUV target region frame image using a first conversion formula.
6. The endoscope image processing method of claim 5, wherein the performing of sharpness evaluation on the target region frame image to obtain a corresponding sharpness value comprises:
determining, using an accumulation formula, the brightness sum of the YUV target region frame image based on a preset sampling value;
determining the brightness average of the target region frame image from the preset sampling value and the brightness sum;
and determining the sharpness value of the target region frame image based on the brightness average, using a normalization formula.
7. The endoscope image processing method of claim 1, wherein the screening out, from the target region frame images and based on the sharpness values, of several target frame images satisfying a preset frame number threshold comprises:
sorting the target region frame images by sharpness based on the sharpness values, and selecting from the sorted result the target frame images of the several frames satisfying the preset frame number threshold.
8. The endoscope image processing method of claim 1, wherein the performing of multi-frame weighted fusion on the several target frame images to obtain the target image comprises:
performing multi-frame weighted fusion on the target frame images using a weighted fusion formula to obtain a fused image;
and padding the fused image to obtain the target image.
9. The endoscope image processing method of claim 8, further comprising:
after the fused image is obtained, converting the fused image into a fused image of the RGB data type using a second conversion formula.
10. The endoscope image processing method of claim 1, further comprising:
after the target region frame image is obtained, outputting a video stream in real time based on the target region frame image.
11. An endoscope image processing device, characterized by comprising an acquisition module, an evaluation module, and a fusion module;
the acquisition module is used to acquire target region frame images from raw image encoded data;
the evaluation module is used to perform sharpness evaluation on the target region frame images using a sharpness algorithm, to obtain the sharpness value of each target region frame image;
and the fusion module is used to screen out, from the target region frame images and based on the sharpness values, several target frame images satisfying a preset frame number threshold, and to perform multi-frame weighted fusion on those target frame images to obtain the target image.
12. An endoscope whose image display system comprises a memory and a processor, characterized in that the memory stores a computer program and the processor is arranged to run the computer program to perform the endoscope image processing method of any one of claims 1 to 10.
13. A storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, performs the endoscope image processing method of any one of claims 1 to 10.
CN202110861542.8A 2021-07-29 2021-07-29 Endoscope image processing method, endoscope image processing device, endoscope, and storage medium Pending CN113763298A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110861542.8A CN113763298A (en) 2021-07-29 2021-07-29 Endoscope image processing method, endoscope image processing device, endoscope, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110861542.8A CN113763298A (en) 2021-07-29 2021-07-29 Endoscope image processing method, endoscope image processing device, endoscope, and storage medium

Publications (1)

Publication Number Publication Date
CN113763298A 2021-12-07

Family

ID=78788220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110861542.8A Pending CN113763298A (en) 2021-07-29 2021-07-29 Endoscope image processing method, endoscope image processing device, endoscope, and storage medium

Country Status (1)

Country Link
CN (1) CN113763298A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115330644A (en) * 2022-10-13 2022-11-11 极限人工智能有限公司 Adaptive edge enhancement method and system for endoscope image
CN115861299A (en) * 2023-02-15 2023-03-28 浙江华诺康科技有限公司 Electronic endoscope quality control method and device based on two-dimensional reconstruction
CN116095481A (en) * 2023-01-13 2023-05-09 杭州微影软件有限公司 Auxiliary focusing method and device, electronic equipment and storage medium
CN117456000A (en) * 2023-12-20 2024-01-26 杭州海康慧影科技有限公司 Focusing method and device of endoscope, storage medium and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019210707A1 (en) * 2018-05-02 2019-11-07 杭州海康威视数字技术股份有限公司 Image sharpness evaluation method, device and electronic device
CN110717871A (en) * 2019-09-30 2020-01-21 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN112907500A (en) * 2019-12-03 2021-06-04 精微视达医疗科技(武汉)有限公司 Endoscope focusing method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019210707A1 (en) * 2018-05-02 2019-11-07 杭州海康威视数字技术股份有限公司 Image sharpness evaluation method, device and electronic device
CN110717871A (en) * 2019-09-30 2020-01-21 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN112907500A (en) * 2019-12-03 2021-06-04 精微视达医疗科技(武汉)有限公司 Endoscope focusing method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANJIE LIU et al.: "A Fast Fusion Method for Multi-exposure Image in YUV Color Space", IEEE, 14 October 2018 (2018-10-14)
LIU Li; SU Fu; TIAN Fang; LU Ajuan: "Extraction of Image Region of Interest Based on Matlab" (基于Matlab的图像感兴趣区域提取), Modern Electronics Technique (现代电子技术), no. 08, 15 April 2013 (2013-04-15)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115330644A (en) * 2022-10-13 2022-11-11 极限人工智能有限公司 Adaptive edge enhancement method and system for endoscope image
CN115330644B (en) * 2022-10-13 2023-04-07 极限人工智能有限公司 Adaptive edge enhancement method and system for endoscope image
CN116095481A (en) * 2023-01-13 2023-05-09 杭州微影软件有限公司 Auxiliary focusing method and device, electronic equipment and storage medium
CN115861299A (en) * 2023-02-15 2023-03-28 浙江华诺康科技有限公司 Electronic endoscope quality control method and device based on two-dimensional reconstruction
CN117456000A (en) * 2023-12-20 2024-01-26 杭州海康慧影科技有限公司 Focusing method and device of endoscope, storage medium and electronic equipment
CN117456000B (en) * 2023-12-20 2024-03-29 杭州海康慧影科技有限公司 Focusing method and device of endoscope, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN113763298A (en) Endoscope image processing method, endoscope image processing device, endoscope, and storage medium
WO2021147500A1 (en) Endoscope image processing method and apparatus, and electronic device and storage medium
CN110246087B (en) System and method for removing image chroma noise by referring to multi-resolution of multiple channels
CN102598651B (en) Motion video processing unit and method, the camera head of motion video processing unit is installed
WO2021244440A1 (en) Method, apparatus, and system for adjusting image quality of television, and television set
CN109167923B (en) Image transmission method, image transmission device, electronic equipment and storage medium
EP2396764A1 (en) An integrated circuit having a circuit for and method of providing intensity correction for a video
WO2022073282A1 (en) Motion recognition method based on feature interactive learning, and terminal device
CN113297937B (en) Image processing method, device, equipment and medium
CN110766637B (en) Video processing method, processing device, electronic equipment and storage medium
JP2004534434A (en) How to compress and decompress video data
WO2024027287A9 (en) Image processing system and method, and computer-readable medium and electronic device
US20080285868A1 (en) Simple Adaptive Wavelet Thresholding
US20120263356A1 (en) Method for efficient representation and processing of color pixel data in digital pathology images
CN108174084A (en) panoramic video processing method and terminal device
CN110634564B (en) Pathological information processing method, device and system, electronic equipment and storage medium
CN113450270A (en) Correction parameter generation method, electronic device, and storage medium
CN115115526A (en) Image processing method and apparatus, storage medium, and graphic calculation processor
TWI472232B (en) Video transmission by decoupling color components and apparatus thereof and processor readable tangible medium encoded with instructions
CN112565887A (en) Video processing method, device, terminal and storage medium
Babu et al. Novel chroma subsampling patterns for wireless capsule endoscopy compression
CN108055475A (en) Video signal processing method, apparatus and readable storage medium
US8891894B2 (en) Psychovisual image compression
CN113613024B (en) Video preprocessing method and device
CN109740467A (en) A kind of electronic certificate recognition methods, apparatus and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination