CN111598794A - Image imaging method and device for removing underwater overlapping condition


Info

Publication number
CN111598794A
Authority
CN
China
Prior art keywords
image, value, repetition, comparing, graying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010334814.4A
Other languages
Chinese (zh)
Inventor
王海博
葛磊
孙超
Current Assignee
Shandong EHualu Information Technology Co ltd
Original Assignee
Shandong EHualu Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shandong EHualu Information Technology Co ltd filed Critical Shandong EHualu Information Technology Co ltd
Priority to CN202010334814.4A priority Critical patent/CN111598794A/en
Publication of CN111598794A publication Critical patent/CN111598794A/en
Pending legal-status Critical Current

Classifications

    • G06T 5/70 Denoising; Smoothing (G Physics; G06 Computing, Calculating or Counting; G06T Image Data Processing or Generation, in General; G06T 5/00 Image enhancement or restoration)
    • G06F 18/22 Matching criteria, e.g. proximity measures (G Physics; G06 Computing, Calculating or Counting; G06F Electric Digital Data Processing; G06F 18/00 Pattern recognition; G06F 18/20 Analysing)
    • G06T 2207/10 Image acquisition modality (G06T 2207/00 Indexing scheme for image analysis or image enhancement)
    • G06T 2207/30168 Image quality inspection (G06T 2207/30 Subject of image; Context of image processing)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an image imaging method and device for removing the underwater overlapping condition. The method comprises the following steps: acquiring a first image and a second image; graying the first image and the second image; comparing the first image with the second image to obtain a repetition degree; generating a judgment result according to the repetition degree; and, when the judgment result is repetition, executing a deduplication operation on the first image and the second image. The invention solves the technical problem that repeated image data are generated when an underwater camera takes photos.

Description

Image imaging method and device for removing underwater overlapping condition
Technical Field
The invention relates to the field of underwater shooting, in particular to an image imaging method and device for removing underwater overlapping.
Background
With the continuous development of intelligent underwater shooting equipment, the requirements on the imaging quality and degree of intelligence of underwater imaging systems keep rising. At present, when equipment for marine underwater imaging performs underwater operation, the underwater camera generally images continuously in a continuous shooting mode.
However, when an underwater imaging device in the prior art shoots and images, repeated image data are often generated within a continuous period because of the need for continuous imaging. Such image data arise because the camera takes two pictures at a fixed time interval while the displacement of the camera body is zero, i.e. the camera body has not moved from its place; as a result, the image data generated by the underwater camera may contain two or several redundantly repeated images.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides an image imaging method and device for removing underwater overlapping conditions, and aims to at least solve the technical problem that repeated image data are generated when an underwater camera takes photos.
According to an aspect of the embodiments of the present invention, there is provided an image imaging method for removing an underwater overlapping condition, including: acquiring a first image and a second image; graying the first image and the second image; comparing the first image with the second image to obtain a repetition degree; generating a judgment result according to the repetition degree; and when the judgment result is repetition, executing deduplication operation on the first image and the second image.
Optionally, before performing the graying processing on the first image and the second image, the method further includes: and carrying out noise reduction processing on the first image and the second image.
Optionally, the comparing the first image and the second image to obtain the repetition degree includes: performing binarization processing on the first image and the second image after graying; and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
Optionally, the binarization processing includes: acquiring gray values of the first image and the second image; changing the gray value of the pixel points which exceed the preset gray value on the first image and the second image to 255; and changing the gray value of the pixel points which do not exceed the preset gray value on the first image and the second image to 0.
Optionally, the performing the deduplication operation on the first image and the second image includes: acquiring an image quality value of the first image and the second image, wherein the image quality value refers to a resolution value of an image; comparing the image quality values of the first image and the second image to generate a comparison result; and according to the comparison result, removing the image with smaller image quality value.
Optionally, when the determination result is repetition, after performing a deduplication operation on the first image and the second image, the method further includes: and outputting an image with higher image quality according to the comparison result.
According to another aspect of the embodiments of the present invention, there is also provided an image forming apparatus for removing an underwater overlapping condition, including: the acquisition module is used for acquiring a first image and a second image; the graying module is used for performing graying processing on the first image and the second image; the comparison module is used for comparing the first image with the second image to obtain the repetition degree; the judging module is used for generating a judging result according to the repetition degree; and the duplication removing module is used for executing duplication removing operation on the first image and the second image when the judgment result is duplication.
According to another aspect of the embodiments of the present invention, a non-volatile storage medium is further provided, where the non-volatile storage medium includes a stored program, and the program controls, when running, an apparatus in which the non-volatile storage medium is located to execute the method.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method.
In the embodiment of the invention, a first image and a second image are obtained; graying the first image and the second image; comparing the first image with the second image to obtain a repetition degree; generating a judgment result according to the repetition degree; when the judgment result is repeated, the duplicate removal operation is executed on the first image and the second image, and the purpose of judging whether the two compared images are the repeated redundant image data or not is achieved by comparing the grayed images, so that the technical problem that the repeated image data is generated when the underwater camera takes pictures is solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of an image imaging method for removing an underwater overlap condition according to an embodiment of the present invention;
fig. 2 is a block diagram of an image forming apparatus for removing an underwater overlapping situation according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
In accordance with an embodiment of the present invention, there is provided an embodiment of an image imaging method for removing an underwater overlap condition, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than that herein.
Fig. 1 is a flow chart of an image imaging method for removing an underwater overlapping condition according to an embodiment of the present invention. As shown in fig. 1, the method includes the following steps:
step S102, a first image and a second image are obtained.
Specifically, the underwater camera continuously photographs the underwater environment and generates a plurality of image data, and the generated image data are continuous in photographing time, that is, they have a sequence of generation. Therefore, in the embodiment of the present invention, image data captured by an underwater camera at two time points in the underwater environment are first acquired. The first image and the second image may be continuous or discontinuous in time; whether image data generated at continuous or discontinuous time points are selected does not affect the subsequent analysis and processing of the first image and the second image.
It should be noted that the first image and the second image generated by continuous underwater photographing need to be acquired by the same underwater camera; that is, while the first image and the second image differ in picture content and generation time point, other properties such as pixel size and color tone need to be uniform.
Step S104, performing graying processing on the first image and the second image.
Specifically, in the RGB model, if R = G = B, the color is a shade of gray, and the common value of R, G and B is called the gray value, so each pixel of a grayscale image needs only one byte to store its gray value (also called the intensity or brightness value), with a gray range of 0 to 255. A color image can be grayed by four methods: the component method, the maximum value method, the average value method, and the weighted average method. In the embodiment of the present invention, the graying of the first image and the second image may be performed by the average value method, which averages the three component luminances of the color image to obtain one gray value, that is, the associated graying formula is f(i, j) = (R(i, j) + G(i, j) + B(i, j)) / 3.
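As an illustrative sketch (not part of the patent text), the average-value graying formula above can be written in Python with NumPy; the function name and the H x W x 3 array layout are assumptions:

```python
import numpy as np

def gray_average(rgb: np.ndarray) -> np.ndarray:
    """Average-value graying: f(i, j) = (R(i, j) + G(i, j) + B(i, j)) / 3.

    `rgb` is assumed to be an H x W x 3 uint8 array; the result is an
    H x W uint8 grayscale image with values in 0..255.
    """
    channels = rgb.astype(np.float64)
    gray = (channels[..., 0] + channels[..., 1] + channels[..., 2]) / 3.0
    return np.clip(np.rint(gray), 0, 255).astype(np.uint8)

# A 1 x 2 image: one dark pixel and one white pixel.
img = np.array([[[10, 20, 30], [255, 255, 255]]], dtype=np.uint8)
print(gray_average(img))  # the two pixels gray to 20 and 255
```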
It should be noted that the image data obtained by graying the first image and the second image by the above method replaces the original data, so that the subsequent steps can perform the deduplication processing operation on the first image and the second image.
Optionally, before performing the graying processing on the first image and the second image, the method further includes: and carrying out noise reduction processing on the first image and the second image.
Specifically, noise reduction addresses the fact that a real digital image is often degraded during digitization and transmission by noise from the imaging device and the external environment; such an image is called a noisy image. The process of reducing the noise in a digital image is called image denoising. The noise reduction method of the embodiment of the invention may use an adaptive Wiener filter, which adjusts its output according to the local variance of the image: the smaller the local variance, the stronger the smoothing effect of the filter. The final goal is to minimize the mean square error E² = E[(f(x, y) − f̂(x, y))²] between the original image f(x, y) and the restored image f̂(x, y). This filter performs better than the mean filter and is useful for preserving edges and other high-frequency parts of the image, but it is computationally intensive.
It should be noted that the noise reduction processing of the first image and the second image may be performed by transmitting the first image and the second image to a graphics processor, for example the GPU chip of a computer, which performs noise reduction on the first image and the second image separately; the denoised images are then grayed, yielding images that can be used for processing in the subsequent steps.
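A minimal NumPy sketch of an adaptive Wiener filter of this kind follows. The window size and the noise-power estimate are assumptions, and this illustrates the local-variance rule rather than the patent's implementation:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def adaptive_wiener(img, win=3, noise_var=None):
    """Adaptive Wiener filtering of a 2-D grayscale image.

    Each pixel is pulled toward its local mean; the pull is strong where
    the local variance is small (flat regions are smoothed heavily) and
    weak where it is large (edges are preserved).
    """
    x = np.asarray(img, dtype=np.float64)
    pad = win // 2
    windows = sliding_window_view(np.pad(x, pad, mode="reflect"), (win, win))
    local_mean = windows.mean(axis=(-2, -1))
    local_var = windows.var(axis=(-2, -1))
    if noise_var is None:
        # Common heuristic: estimate the noise power as the mean local variance.
        noise_var = float(local_var.mean())
    gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
    return local_mean + gain * (x - local_mean)

flat = np.full((5, 5), 7.0)
print(adaptive_wiener(flat)[0, 0])  # a flat region is left unchanged: 7.0
```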
And S106, comparing the first image with the second image to obtain the repetition degree.
Optionally, the comparing the first image and the second image to obtain the repetition degree includes: performing binarization processing on the first image and the second image after graying; and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
Specifically, after obtaining the grayed first image and second image, the embodiment of the present invention needs to perform binarization processing on the first image and the second image so as to obtain the repetition degree of the first image and the second image, that is, the similarity between the first image and the second image.
It should be noted that binarization of an image means setting the gray value of every pixel point to 0 or 255, so that the whole image shows an obvious visual effect of only black and white. Binarization is the simplest method of image segmentation: it converts a gray image into a binary image by setting pixels whose gray exceeds a critical gray value to the gray maximum and pixels below that value to the gray minimum. According to how the threshold is selected, binarization algorithms are divided into fixed-threshold and adaptive-threshold algorithms; commonly used binarization methods are the bimodal method, the P-parameter method, the iterative method, and the OTSU method. In the field of computer vision, image segmentation refers to the process of subdividing a digital image into a plurality of image sub-regions (sets of pixels, also called superpixels). The purpose of image segmentation is to simplify or change the representation of the image so that it is easier to understand and analyze; it is commonly used to locate objects and boundaries (lines, curves, etc.) in images. More precisely, image segmentation is a process of labeling each pixel in an image such that pixels with the same label share some visual characteristic. The result of image segmentation is a set of sub-regions that together cover the entire image, or a set of contour lines extracted from the image (e.g. by edge detection). Each pixel in a sub-region is similar under some measure of a property, such as color, brightness, or texture, while adjacent regions differ greatly under that measure.
Therefore, the first image and the second image after binarization processing can present images of black and white points, and the similarity of the positions and proportions of black pixel points and white pixel points is obtained by comparing the two images of the black and white points, so that the repetition degree of the first image and the second image is calculated.
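The patent does not fix an exact formula for the repetition degree; one simple reading, sketched here in Python (the function name and the per-pixel agreement measure are assumptions), scores the fraction of pixel positions at which the two binarized images agree:

```python
import numpy as np

def repetition_degree(bin_a: np.ndarray, bin_b: np.ndarray) -> float:
    """Fraction of pixel positions at which two binarized images agree."""
    if bin_a.shape != bin_b.shape:
        raise ValueError("images must have the same pixel size")
    return float(np.mean(bin_a == bin_b))

a = np.array([[0, 255], [255, 0]], dtype=np.uint8)
b = np.array([[0, 255], [0, 0]], dtype=np.uint8)  # differs in one of four pixels
print(repetition_degree(a, a))  # 1.0
print(repetition_degree(a, b))  # 0.75
```

This matches the requirement above that the two images have a uniform pixel size before comparison.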
Optionally, the binarization processing includes: acquiring gray values of the first image and the second image; changing the gray value of the pixel points which exceed the preset gray value on the first image and the second image to 255; and changing the gray value of the pixel points which do not exceed the preset gray value on the first image and the second image to 0.
For example, after the image A and the image B are grayed, binarization processing is performed: a pixel point whose gray value is smaller than the preset gray value a becomes a dark point, that is, its gray value is set to 0, while a point whose gray value is larger than a is called a bright point and its gray value is set to 255. The pixels of the whole image are thus simply divided into black and white, forming a "binary" effect; the similarity between image A and image B is then judged by comparing the coordinate values of all black and white pixel points, that is, the repetition degree x is obtained.
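The threshold rule in this example can be sketched as follows; the helper name binarize and the default threshold value are illustrative assumptions:

```python
import numpy as np

def binarize(gray: np.ndarray, a: int = 128) -> np.ndarray:
    """Points brighter than the preset gray value a become 255 (bright points);
    all other points become 0 (dark points)."""
    return np.where(gray > a, 255, 0).astype(np.uint8)

g = np.array([[10, 200], [128, 129]], dtype=np.uint8)
print(binarize(g))  # [[  0 255] [  0 255]]  (128 is not strictly above a)
```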
And S108, generating a judgment result according to the repetition degree.
Specifically, the judgment result may be that the first image and the second image are repeated, or that they are not repeated. When the repetition degree is higher than the preset repetition degree, the first image and the second image are judged to be repeated; for example, if the preset repetition degree is set to 95%, any two images whose repetition degree is higher than 95% are repeated images. Otherwise, a first image and a second image whose repetition degree does not exceed the preset repetition degree do not meet the repetition condition, and the judgment result is that the images are not repeated.
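A minimal sketch of this judgment step under the 95% example above (function name assumed):

```python
def is_repeated(repetition: float, preset: float = 0.95) -> bool:
    """Judgment step: the two images count as repeated when the repetition
    degree is higher than the preset repetition degree (95% in the example)."""
    return repetition > preset

print(is_repeated(0.97))  # True
print(is_repeated(0.90))  # False
```

Note that a repetition degree exactly equal to the preset value is not "higher than" it, so it does not count as repeated.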
And step S110, when the judgment result is repetition, executing the duplicate removal operation on the first image and the second image.
Optionally, the performing the deduplication operation on the first image and the second image includes: acquiring an image quality value of the first image and the second image, wherein the image quality value refers to a resolution value of an image; comparing the image quality values of the first image and the second image to generate a comparison result; and according to the comparison result, removing the image with smaller image quality value.
Specifically, after it is determined that the first image and the second image are repeated, the two repeated image data need to be reduced to the single better one, so as to save storage resources. To select which of the first image and the second image to remove, the image quality values of the first image and the second image are acquired, where the quality value refers to the resolution value of the image; an image with high resolution is regarded as having high image quality. The purpose of the deduplication operation is to remove the low-resolution image, i.e. the image with the smaller image quality value.
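The deduplication rule, keeping the image with the larger quality value (taken here as pixel-count resolution; the tie-breaking choice is an assumption), can be sketched as:

```python
import numpy as np

def deduplicate(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Keep whichever image has the larger image quality value, taken here
    as the resolution in pixels (height x width); the other is removed."""
    def quality(img: np.ndarray) -> int:
        h, w = img.shape[:2]
        return h * w
    # On a tie, arbitrarily keep the first image (an assumption).
    return first if quality(first) >= quality(second) else second

small = np.zeros((240, 320), dtype=np.uint8)
large = np.zeros((480, 640), dtype=np.uint8)
print(deduplicate(small, large).shape)  # (480, 640)
```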
Optionally, when the determination result is repetition, after performing a deduplication operation on the first image and the second image, the method further includes: and outputting an image with higher image quality according to the comparison result.
Example two
Fig. 2 is a block diagram of an image forming apparatus for removing an underwater overlapping situation according to an embodiment of the present invention, and according to another aspect of the embodiment of the present invention, there is also provided an image forming apparatus for removing an underwater overlapping situation, including:
an acquiring module 20 is configured to acquire a first image and a second image.
Specifically, the underwater camera continuously photographs the underwater environment and generates a plurality of image data, and the generated image data are continuous in photographing time, that is, they have a sequence of generation. Therefore, in the embodiment of the present invention, image data captured by an underwater camera at two time points in the underwater environment are first acquired. The first image and the second image may be continuous or discontinuous in time; whether image data generated at continuous or discontinuous time points are selected does not affect the subsequent analysis and processing of the first image and the second image.
It should be noted that the first image and the second image generated by continuous underwater photographing need to be acquired by the same underwater camera; that is, while the first image and the second image differ in picture content and generation time point, other properties such as pixel size and color tone need to be uniform.
A graying module 22, configured to perform graying processing on the first image and the second image.
Specifically, in the RGB model, if R = G = B, the color is a shade of gray, and the common value of R, G and B is called the gray value, so each pixel of a grayscale image needs only one byte to store its gray value (also called the intensity or brightness value), with a gray range of 0 to 255. A color image can be grayed by four methods: the component method, the maximum value method, the average value method, and the weighted average method. In the embodiment of the present invention, the graying of the first image and the second image may be performed by the average value method, which averages the three component luminances of the color image to obtain one gray value, that is, the associated graying formula is f(i, j) = (R(i, j) + G(i, j) + B(i, j)) / 3.
It should be noted that the image data obtained by graying the first image and the second image by the above method replaces the original data, so that the subsequent steps can perform the deduplication processing operation on the first image and the second image.
Optionally, before performing the graying processing on the first image and the second image, the method further includes: and carrying out noise reduction processing on the first image and the second image.
Specifically, noise reduction addresses the fact that a real digital image is often degraded during digitization and transmission by noise from the imaging device and the external environment; such an image is called a noisy image. The process of reducing the noise in a digital image is called image denoising. The noise reduction method of the embodiment of the invention may use an adaptive Wiener filter, which adjusts its output according to the local variance of the image: the smaller the local variance, the stronger the smoothing effect of the filter. The final goal is to minimize the mean square error E² = E[(f(x, y) − f̂(x, y))²] between the original image f(x, y) and the restored image f̂(x, y). This filter performs better than the mean filter and is useful for preserving edges and other high-frequency parts of the image, but it is computationally intensive.
It should be noted that the noise reduction processing of the first image and the second image may be performed by transmitting the first image and the second image to a graphics processor, for example the GPU chip of a computer, which performs noise reduction on the first image and the second image separately; the denoised images are then grayed, yielding images that can be used for processing in the subsequent steps.
And a comparing module 24, configured to compare the first image and the second image to obtain a repetition degree.
Optionally, the comparing the first image and the second image to obtain the repetition degree includes: performing binarization processing on the first image and the second image after graying; and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
Specifically, after obtaining the grayed first image and second image, the embodiment of the present invention needs to perform binarization processing on the first image and the second image so as to obtain the repetition degree of the first image and the second image, that is, the similarity between the first image and the second image.
It should be noted that binarization of an image means setting the gray value of every pixel point to 0 or 255, so that the whole image shows an obvious visual effect of only black and white. Binarization is the simplest method of image segmentation: it converts a gray image into a binary image by setting pixels whose gray exceeds a critical gray value to the gray maximum and pixels below that value to the gray minimum. According to how the threshold is selected, binarization algorithms are divided into fixed-threshold and adaptive-threshold algorithms; commonly used binarization methods are the bimodal method, the P-parameter method, the iterative method, and the OTSU method. In the field of computer vision, image segmentation refers to the process of subdividing a digital image into a plurality of image sub-regions (sets of pixels, also called superpixels). The purpose of image segmentation is to simplify or change the representation of the image so that it is easier to understand and analyze; it is commonly used to locate objects and boundaries (lines, curves, etc.) in images. More precisely, image segmentation is a process of labeling each pixel in an image such that pixels with the same label share some visual characteristic. The result of image segmentation is a set of sub-regions that together cover the entire image, or a set of contour lines extracted from the image (e.g. by edge detection). Each pixel in a sub-region is similar under some measure of a property, such as color, brightness, or texture, while adjacent regions differ greatly under that measure.
Therefore, the first image and the second image after binarization processing can present images of black and white points, and the similarity of the positions and proportions of black pixel points and white pixel points is obtained by comparing the two images of the black and white points, so that the repetition degree of the first image and the second image is calculated.
Optionally, the binarization processing includes: acquiring gray values of the first image and the second image; changing the gray value of the pixel points which exceed the preset gray value on the first image and the second image to 255; and changing the gray value of the pixel points which do not exceed the preset gray value on the first image and the second image to 0.
For example, after the image A and the image B are grayed, binarization processing is performed: a pixel point whose gray value is smaller than the preset gray value a becomes a dark point, that is, its gray value is set to 0, while a point whose gray value is larger than a is called a bright point and its gray value is set to 255. The pixels of the whole image are thus simply divided into black and white, forming a "binary" effect; the similarity between image A and image B is then judged by comparing the coordinate values of all black and white pixel points, that is, the repetition degree x is obtained.
And the judging module 26 is configured to generate a judging result according to the repetition degree.
Specifically, the judgment result may be that the first image and the second image are repeated, or that they are not repeated. When the repetition degree is higher than the preset repetition degree, the first image and the second image are judged to be repeated; for example, if the preset repetition degree is set to 95%, any two images whose repetition degree is higher than 95% are repeated images. Otherwise, a first image and a second image whose repetition degree does not exceed the preset repetition degree do not meet the repetition condition, and the judgment result is that the images are not repeated.
A duplicate removal module 28, configured to perform a duplicate removal operation on the first image and the second image when the determination result is duplicate.
Optionally, performing the deduplication operation on the first image and the second image includes: acquiring the image quality values of the first image and the second image, where the image quality value refers to the resolution of an image; comparing the image quality values of the first image and the second image to generate a comparison result; and, according to the comparison result, removing the image with the smaller image quality value.
Specifically, after the first image and the second image are determined to be repeated, the two repeated images need to be reduced to a single image in order to save storage resources. To decide which of the first image and the second image to remove, the image quality value of each is obtained, where the quality value refers to the resolution of the image; an image with higher resolution is therefore regarded as having higher image quality. The purpose of the deduplication operation is to remove the low-resolution image, i.e., the image with the smaller image quality value.
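The resolution-based deduplication step above can be sketched as follows. Treating the "image quality value" as the pixel count (width × height) of a pixel grid is an assumption for this illustration.

```python
# Sketch of the deduplication step: keep the higher-resolution image.
# Representing images as 2D pixel grids and using width*height as the
# quality value are assumptions of this sketch.

def image_quality(image):
    """Quality value taken as the pixel resolution of the image."""
    height = len(image)
    width = len(image[0]) if height else 0
    return width * height

def deduplicate(first, second):
    """Return the image to keep; the lower-quality one is removed."""
    return first if image_quality(first) >= image_quality(second) else second

small = [[0] * 4 for _ in range(3)]   # 4x3 image, quality 12
large = [[0] * 8 for _ in range(6)]   # 8x6 image, quality 48
kept = deduplicate(small, large)
print(image_quality(kept))  # 48: the higher-resolution image is kept
```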
Optionally, when the determination result is repetition, after the deduplication operation is performed on the first image and the second image, the method further includes: outputting the image with the higher image quality according to the comparison result.
According to another aspect of the embodiments of the present invention, a non-volatile storage medium is further provided, where the non-volatile storage medium includes a stored program, and the program controls, when running, an apparatus in which the non-volatile storage medium is located to execute the method.
Specifically, the method comprises the following steps: acquiring a first image and a second image; graying the first image and the second image; comparing the first image with the second image to obtain a repetition degree; generating a judgment result according to the repetition degree; and when the judgment result is repetition, executing deduplication operation on the first image and the second image.
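The graying step listed above can be sketched with the standard luminance-weighted RGB-to-gray conversion. The weights are a common convention assumed here; the description does not specify which graying formula is used.

```python
# Sketch of the graying step using the common luminance weights
# 0.299/0.587/0.114 (an assumption; the patent does not fix a formula).

def to_gray(rgb_pixels):
    """Convert a grid of (R, G, B) pixels to rounded gray values."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_pixels]

rgb = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
print(to_gray(rgb))  # [[76, 150, 29]]
```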
Optionally, before performing the graying processing on the first image and the second image, the method further includes: and carrying out noise reduction processing on the first image and the second image.
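The description does not detail the noise-reduction step; one common choice, assumed here purely for illustration, is a 3×3 median filter applied to the pixel grid before graying and comparison:

```python
import statistics

# Assumed noise-reduction sketch: a 3x3 median filter. The patent only
# states that noise reduction happens; the filter choice is ours.
def median_filter(gray, k=1):
    """3x3 median filter; edge pixels use whatever neighbours exist."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = [gray[j][i]
                      for j in range(max(0, y - k), min(h, y + k + 1))
                      for i in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = statistics.median_low(window)
    return out

noisy = [[10, 10, 10], [10, 255, 10], [10, 10, 10]]  # one hot pixel
print(median_filter(noisy)[1][1])  # 10: the outlier is suppressed
```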
Optionally, the comparing the first image and the second image to obtain the repetition degree includes: performing binarization processing on the first image and the second image after graying; and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
Optionally, the binarization processing includes: acquiring gray values of the first image and the second image; changing the gray value of the pixel points on the first image and the second image that exceed a preset gray value to 255; and changing the gray value of the pixel points on the first image and the second image that do not exceed the preset gray value to 0.
Optionally, performing the deduplication operation on the first image and the second image includes: acquiring the image quality values of the first image and the second image, where the image quality value refers to the resolution of an image; comparing the image quality values of the first image and the second image to generate a comparison result; and, according to the comparison result, removing the image with the smaller image quality value.
Optionally, when the determination result is repetition, after the deduplication operation is performed on the first image and the second image, the method further includes: outputting the image with the higher image quality according to the comparison result.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method.
Specifically, the method comprises the following steps: acquiring a first image and a second image; graying the first image and the second image; comparing the first image with the second image to obtain a repetition degree; generating a judgment result according to the repetition degree; and when the judgment result is repetition, executing deduplication operation on the first image and the second image.
Optionally, before performing the graying processing on the first image and the second image, the method further includes: and carrying out noise reduction processing on the first image and the second image.
Optionally, the comparing the first image and the second image to obtain the repetition degree includes: performing binarization processing on the first image and the second image after graying; and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
Optionally, the binarization processing includes: acquiring gray values of the first image and the second image; changing the gray value of the pixel points on the first image and the second image that exceed a preset gray value to 255; and changing the gray value of the pixel points on the first image and the second image that do not exceed the preset gray value to 0.
Optionally, performing the deduplication operation on the first image and the second image includes: acquiring the image quality values of the first image and the second image, where the image quality value refers to the resolution of an image; comparing the image quality values of the first image and the second image to generate a comparison result; and, according to the comparison result, removing the image with the smaller image quality value.
Optionally, when the determination result is repetition, after the deduplication operation is performed on the first image and the second image, the method further includes: outputting the image with the higher image quality according to the comparison result.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and these modifications and improvements shall also fall within the protection scope of the present invention.

Claims (9)

1. An image imaging method for removing an underwater overlap condition, comprising:
acquiring a first image and a second image;
graying the first image and the second image;
comparing the first image with the second image to obtain a repetition degree;
generating a judgment result according to the repetition degree;
and when the judgment result is repetition, executing deduplication operation on the first image and the second image.
2. The method of claim 1, wherein, before the graying processing is performed on the first image and the second image, the method further comprises:
and carrying out noise reduction processing on the first image and the second image.
3. The method of claim 1, wherein comparing the first image and the second image to obtain a degree of repetition comprises:
performing binarization processing on the first image and the second image after graying;
and comparing the first image and the second image after the binarization processing to obtain the repetition degree.
4. The method according to claim 3, characterized in that the binarization process includes:
acquiring gray values of the first image and the second image;
changing the gray value of the pixel points which exceed the preset gray value on the first image and the second image into 255;
and changing the gray value of the pixel points which do not exceed the preset gray value on the first image and the second image into 0.
5. The method of claim 1, wherein the performing the deduplication operation on the first image and the second image comprises:
acquiring an image quality value of the first image and the second image, wherein the image quality value refers to a resolution value of an image;
comparing the image quality values of the first image and the second image to generate a comparison result;
and according to the comparison result, removing the image with the smaller image quality value.
6. The method according to claim 1 or 5, wherein, when the determination result is duplication, after the deduplication operation is performed on the first image and the second image, the method further comprises:
and outputting an image with higher image quality according to the comparison result.
7. An image forming apparatus for removing an underwater overlapping condition, comprising:
the acquisition module is used for acquiring a first image and a second image;
the graying module is used for performing graying processing on the first image and the second image;
the comparison module is used for comparing the first image with the second image to obtain the repetition degree;
the judging module is used for generating a judging result according to the repetition degree;
and the duplication removing module is used for executing duplication removing operation on the first image and the second image when the judgment result is duplication.
8. A non-volatile storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the non-volatile storage medium is located to perform the method of any of claims 1 to 6.
9. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method of any one of claims 1 to 6.
CN202010334814.4A 2020-04-24 2020-04-24 Image imaging method and device for removing underwater overlapping condition Pending CN111598794A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010334814.4A CN111598794A (en) 2020-04-24 2020-04-24 Image imaging method and device for removing underwater overlapping condition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010334814.4A CN111598794A (en) 2020-04-24 2020-04-24 Image imaging method and device for removing underwater overlapping condition

Publications (1)

Publication Number Publication Date
CN111598794A 2020-08-28

Family

ID=72183580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010334814.4A Pending CN111598794A (en) 2020-04-24 2020-04-24 Image imaging method and device for removing underwater overlapping condition

Country Status (1)

Country Link
CN (1) CN111598794A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106327426A (en) * 2016-08-19 2017-01-11 携程计算机技术(上海)有限公司 Image replication removing method and image replication removing system
CN110348264A (en) * 2019-07-04 2019-10-18 北京电子工程总体研究所 A kind of QR image in 2 D code bearing calibration and system
CN110399344A (en) * 2019-07-23 2019-11-01 北京明略软件系统有限公司 Choose the method and device of multiimage
CN110909791A (en) * 2019-11-20 2020-03-24 车智互联(北京)科技有限公司 Similar image identification method and computing device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112565909A (en) * 2020-11-30 2021-03-26 维沃移动通信有限公司 Video playing method and device, electronic equipment and readable storage medium
CN114666506A (en) * 2022-03-29 2022-06-24 广州方舟信息科技有限公司 Screening method and device for continuously shot images and electronic equipment
CN114666506B (en) * 2022-03-29 2023-01-06 广州方舟信息科技有限公司 Screening method and device for continuously shot images and electronic equipment

Similar Documents

Publication Publication Date Title
CN108805829B (en) Image data processing method, device, equipment and computer readable storage medium
CN110599387A (en) Method and device for automatically removing image watermark
CN110246087B (en) System and method for removing image chroma noise by referring to multi-resolution of multiple channels
Feng et al. Measurement of ringing artifacts in JPEG images
JP2007507802A (en) Text-like edge enhancement in digital images
CN107767356B (en) Image processing method and device
CN110717922A (en) Image definition evaluation method and device
Pei et al. A novel image recovery algorithm for visible watermarked images
CN111598794A (en) Image imaging method and device for removing underwater overlapping condition
CN100367770C (en) Method for removing isolated noise point in video
WO2011067755A1 (en) Method and system for automatically recovering chromaticity and image variation of colour clipped image regions
CN112700363A (en) Self-adaptive visual watermark embedding method and device based on region selection
CN111353955A (en) Image processing method, device, equipment and storage medium
KR20140109801A (en) Method and apparatus for enhancing quality of 3D image
CN112927160B (en) Single low-light image enhancement method based on depth Retinex
JP5286215B2 (en) Outline extracting apparatus, outline extracting method, and outline extracting program
CN110136085B (en) Image noise reduction method and device
CN110298812B (en) Image fusion processing method and device
CN112233037A (en) Image enhancement system and method based on image segmentation
CN112541853A (en) Data processing method, device and equipment
US8538163B2 (en) Method and system for detecting edges within an image
CN116468636A (en) Low-illumination enhancement method, device, electronic equipment and readable storage medium
CN115660937A (en) Image processing method, image processing apparatus, electronic device, and storage medium
CN110570343B (en) Image watermark embedding method and device based on self-adaptive feature point extraction
CN110140150B (en) Image processing method and device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200828