CN112150373B - Image processing method, image processing apparatus, and readable storage medium - Google Patents


Info

Publication number: CN112150373B
Authority: CN (China)
Prior art keywords: damaged area, image, match line, area, pixel
Legal status: Active (an assumption, not a legal conclusion)
Application number: CN201910582938.1A
Other languages: Chinese (zh)
Other versions: CN112150373A
Inventor: 魏传振
Current assignee: Beijing Jingdong Zhenshi Information Technology Co Ltd
Original assignee: Beijing Jingdong Zhenshi Information Technology Co Ltd
Events: application filed by Beijing Jingdong Zhenshi Information Technology Co Ltd with priority to CN201910582938.1A; application published as CN112150373A; grant published as CN112150373B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement


Abstract

The present disclosure provides an image processing method, including: acquiring an image to be repaired, wherein the image to be repaired includes a plurality of sub-areas and the pixels in each of the plurality of sub-areas have the same pixel gray value; determining at least one damaged area among the plurality of sub-areas; determining, based on the damaged area, a match line intersecting the damaged area, wherein the difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest; and repairing the image to be repaired based on color components of the pixel gray values in the first and second non-damaged areas. The present disclosure also provides an image processing apparatus and a computer-readable storage medium.

Description

Image processing method, image processing apparatus, and readable storage medium
Technical Field
The present disclosure relates to the field of computer technology, and more particularly, to an image processing method, an image processing apparatus, and a computer-readable storage medium.
Background
With the rapid development of computer technology, image processing technology is increasingly applied in fields such as industrial and agricultural production, construction, logistics, and daily life. For example, image processing may be used to identify an identifier on goods so that the goods can be sorted.
In implementing the concepts of the present disclosure, the inventors found that at least the following problems exist in the prior art: existing image processing techniques have difficulty identifying images in which at least a portion of the area is occluded.
Disclosure of Invention
In view of this, the present disclosure provides an image processing method, an image processing apparatus, and a computer-readable storage medium.
One aspect of the present disclosure provides an image processing method including: acquiring an image to be repaired, wherein the image to be repaired includes a plurality of sub-areas and the pixels in each of the plurality of sub-areas have the same pixel gray value; determining at least one damaged area among the plurality of sub-areas; determining, based on the damaged area, a match line intersecting the damaged area, wherein the difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest; and repairing the image to be repaired based on color components of the pixel gray values in the first and second non-damaged areas.
According to an embodiment of the present disclosure, acquiring an image to be repaired includes: acquiring a plurality of original images, wherein the plurality of original images are images of a target object acquired by an image acquisition device each time it moves a preset distance along a specific direction, and the original images have the same size; arranging the plurality of original images in overlapping fashion according to the acquisition order, and determining a trajectory formed by the respective positions, in the plurality of original images, of the pixels of a same row or a same column of one original image; and generating the image to be repaired based on the trajectory.
According to an embodiment of the present disclosure, the image processing method further includes: selecting one of the plurality of original images as a repair starting image, wherein the repair starting image includes a plurality of damaged rows; determining, for the pixels of each of the plurality of damaged rows, an image to be repaired; and restoring the repair starting image based on the plurality of repaired images.
According to an embodiment of the present disclosure, determining a match line intersecting the damaged area based on the damaged area includes: determining an initial boundary line based on the location of the damaged area; searching, starting from the initial boundary line, the non-damaged area on a second side of the initial boundary line in the image to be repaired for an abnormal pixel whose pixel gray value differs from that of the pixels in the non-damaged area on a first side of the initial boundary line; if an abnormal pixel occurs, setting a hypothetical match line based on the at least one abnormal pixel; determining the pixel gray value difference between the non-damaged area on a first side of the hypothetical match line and the non-damaged area on a second side of the hypothetical match line; and determining, as the match line, the hypothetical match line that maximizes the difference between the non-damaged area on the first side and the non-damaged area on the second side.
According to an embodiment of the present disclosure, determining the pixel gray value difference between the non-damaged area on the first side of the hypothetical match line and the non-damaged area on the second side of the hypothetical match line comprises: establishing a parallelogram with the hypothetical match line as its central axis; determining a first color histogram of the pixels in the non-damaged area of a first area on the first side of the hypothetical match line and a second color histogram of the pixels in the non-damaged area of a second area on the second side of the hypothetical match line, wherein the first area is the area of the parallelogram on the first side of the central axis and the second area is the area of the parallelogram on the second side of the central axis; and determining the pixel gray value distributions of the non-damaged areas on the first and second sides of the hypothetical match line based on the first color histogram and the second color histogram.
According to an embodiment of the present disclosure, determining the first color histogram of the pixels in the non-damaged area in the first area on the first side of the hypothetical match line and the second color histogram of the pixels in the non-damaged area in the second area on the second side of the hypothetical match line comprises: obtaining, for each pixel of the non-damaged regions in the first area and the second area, a horizontal distance from the pixel to the central axis, wherein the horizontal distance indicates the distance from the pixel to the pixel on the central axis in the same row; and taking the horizontal distance as the weight of the pixel's gray value, and establishing the first color histogram of the non-damaged area in the first area and the second color histogram of the non-damaged area in the second area according to the weights and the pixel gray values.
Another aspect of the present disclosure provides an image processing apparatus including: an acquisition module, configured to acquire an image to be repaired, where the image to be repaired includes a plurality of sub-areas, and pixels in each of the plurality of sub-areas have the same pixel gray value; a first determination module for determining at least one damaged area of the plurality of sub-areas; a second determining module, configured to determine, based on the damaged area, a match line intersecting the damaged area, where a difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is largest; and a first repair module for repairing the image to be repaired based on color components of pixel gray values in the first and second non-damaged areas.
According to an embodiment of the present disclosure, the acquisition module includes: an acquisition sub-module for acquiring a plurality of original images, wherein the plurality of original images are images of a target object acquired by the image acquisition device each time it moves a preset distance along a specific direction, and the original images have the same size; a determining sub-module for arranging the plurality of original images in overlapping fashion according to the acquisition order and determining a trajectory formed by the respective positions, in the plurality of original images, of the pixels of a same row or a same column of one original image; and a generation sub-module for generating the image to be repaired based on the trajectory.
According to an embodiment of the present disclosure, the image processing apparatus further includes: a selecting module for selecting one of the plurality of original images as a repair starting image, wherein the repair starting image includes a plurality of damaged rows; a third determining module for determining, for the pixels of each of the plurality of damaged rows, an image to be repaired; and a second repair module for restoring the repair starting image based on the plurality of repaired images.
Another aspect of the present disclosure provides an image processing apparatus including: one or more processors; and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, are configured to implement a method as described above.
Another aspect of the present disclosure provides a computer program comprising computer executable instructions which when executed are for implementing a method as described above.
According to the embodiments of the present disclosure, the problem that images in which part of the area is occluded are difficult to identify can be at least partially solved, and the technical effect of accurately identifying occluded images can therefore be achieved.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments thereof with reference to the accompanying drawings in which:
fig. 1 schematically illustrates an application scenario in which an image processing method may be applied according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow chart of an image processing method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates an example method flow diagram for acquiring a repair image according to an embodiment of this disclosure;
FIG. 4 schematically illustrates a schematic diagram of a plurality of original images arranged in an overlapping manner and determining a locus of positions in the plurality of original images formed by pixels of a same row in the plurality of original images, respectively, in accordance with an embodiment of the present disclosure;
FIG. 5A schematically illustrates a flow chart of determining match lines intersecting the damaged area based on the damaged area, according to an embodiment of the disclosure;
FIG. 5B schematically illustrates a schematic diagram of determining match lines intersecting the damaged area based on the damaged area, according to an embodiment of the disclosure;
FIG. 6 schematically illustrates a flow chart of determining a difference in pixel gray level values between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line, according to an embodiment of the present disclosure;
FIG. 7A schematically illustrates a schematic diagram of repairing the image to be repaired shown in FIG. 5B based on color components of pixel gray values in a first non-damaged region and a second non-damaged region in accordance with an embodiment of the disclosure;
FIG. 7B schematically illustrates a schematic diagram after completion of repairing the image to be repaired in FIG. 5B, according to an embodiment of the disclosure;
FIG. 8A schematically illustrates a schematic view of an image to be repaired according to another embodiment of the present disclosure;
FIG. 8B schematically illustrates an image after an image to be repaired is repaired according to another embodiment of the present disclosure;
fig. 9 schematically illustrates a flowchart of an image processing method according to another embodiment of the present disclosure;
fig. 10 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure; and
fig. 11 schematically shows a block diagram of an image processing system according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is only exemplary and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It should be noted that the terms used herein should be construed to have meanings consistent with the context of the present specification and should not be construed in an idealized or overly formal manner.
Where an expression like "at least one of A, B and C" is used, it should generally be interpreted according to the meaning commonly understood by those skilled in the art (e.g., "a system having at least one of A, B and C" includes, but is not limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together). Where an expression like "at least one of A, B or C" is used, it should likewise be interpreted according to that common understanding (e.g., "a system having at least one of A, B or C" includes, but is not limited to, systems having A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together).
An embodiment of the present disclosure provides an image processing method, including: acquiring an image to be repaired, wherein the image to be repaired includes a plurality of sub-areas and the pixels in each of the plurality of sub-areas have the same pixel gray value; determining at least one damaged area among the plurality of sub-areas; determining, based on the damaged area, a match line intersecting the damaged area, wherein the difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest; and repairing the image to be repaired based on color components of the pixel gray values in the first and second non-damaged areas.
Fig. 1 schematically illustrates an application scenario in which an image processing method may be applied according to an embodiment of the present disclosure. It should be noted that fig. 1 illustrates only an example of an application scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but it does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments, or scenarios.
As shown in fig. 1, the application scene includes an image 100 to be identified. The image 100 to be identified includes a plurality of sub-regions, such as sub-regions 110-130 included in fig. 1. Wherein the pixels in each sub-region have the same pixel gray value.
As shown in fig. 1, the image 100 to be identified includes an occluded area 140. The image 100 to be identified cannot be recognized because the pixel gray values of the pixels in the occluded area 140 cannot be acquired.
The image processing method according to the embodiment of the disclosure can repair the image 100 to be recognized, so that the repaired image 100 to be recognized can be accurately recognized.
Fig. 2 schematically shows a flowchart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the image processing method includes operations S210 to S240.
In operation S210, an image to be repaired is acquired, wherein the image to be repaired includes a plurality of sub-regions, and pixels in each of the plurality of sub-regions have the same pixel gray value.
At least one damaged area of the plurality of sub-areas is determined in operation S220.
In operation S230, a match line intersecting the damaged area is determined based on the damaged area, wherein a difference in pixel gray value between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest.
In operation S240, the image to be repaired is repaired based on color components of pixel gray values in the first and second non-damaged areas.
According to the method, the image to be repaired is repaired according to the texture information of the image to be repaired, so that the accuracy of identifying the image to be repaired is improved.
According to an embodiment of the present disclosure, in operation S210, the image to be repaired may be, for example, an epipolar plane image (EPI) including a plurality of lines, where the pixels in each line have the same pixel gray value, and the pixel gray values of pixels in different lines may or may not be the same. Having the same pixel gray value may mean, for example, having the same R, G and B values.
According to an embodiment of the present disclosure, the image to be repaired may include, for example, a bar code image including a plurality of black bars and spaces having unequal widths, and pixels in the black bars have the same pixel gray value.
Fig. 3 schematically illustrates a flowchart of a method of acquiring a repair image according to operation S210 of an embodiment of the present disclosure.
As shown in fig. 3, the method includes operations S211 to S213.
In operation S211, a plurality of original images are acquired, the plurality of original images being images of a target object acquired by an image acquisition device each time it moves a preset distance along a specific direction, and the original images having the same size.
In operation S212, the plurality of original images are arranged in overlapping fashion according to the acquisition time order, and a trajectory is determined that is formed by the respective positions, in the plurality of original images, of the pixels of a same row or column of one of the original images.
In operation S213, the image to be repaired is generated based on the trajectory.
According to an embodiment of the present disclosure, in operation S211, the plurality of original images may be, for example, images of one target object acquired each time the same camera moves a preset distance in a horizontal or vertical direction, or a plurality of original images obtained by a plurality of cameras arranged at preset intervals along a specific direction, each acquiring the target object.
According to an embodiment of the present disclosure, in operation S212, if, for example, the plurality of original images are obtained by the same camera capturing images of one target object each time it moves a predetermined distance in a horizontal or vertical direction, the plurality of original images may be overlapped in the time order of capture. For another example, if the plurality of original images are obtained by a plurality of cameras positioned on the same straight line capturing one target object, the plurality of original images may be arranged according to the distances between the cameras and the target object.
According to an embodiment of the present disclosure, in operation S212, for example, the same camera is moved in a horizontal direction, and an image of one target object is acquired every predetermined distance of movement to obtain a plurality of original images. In this embodiment, for example, a locus formed by the positions of pixels of a certain line in the first original image in the plurality of original images, respectively, may be determined.
For another example, the same camera is moved in the vertical direction, and an image of a target object is acquired every predetermined distance of movement to obtain a plurality of original images. In this embodiment, for example, a locus formed by the positions of pixels of a certain column in the first original image in the plurality of original images, respectively, may be determined.
According to an embodiment of the present disclosure, in operation S213, an image to be repaired may be determined, for example, by a trajectory formed by positions of pixels of a certain line of a first original image in a plurality of original images.
For the convenience of the understanding of those skilled in the art, operations S212 and S213 will be described below with reference to fig. 4 by taking a camera moving in a horizontal direction as an example.
Fig. 4 schematically illustrates a schematic diagram of a trajectory in which a plurality of original images are arranged in an overlapping manner and the positions of pixels of the same line in the plurality of original images are determined in the plurality of original images, respectively, according to an embodiment of the present disclosure.
As shown in FIG. 4, the plurality of original images may include, for example, images 410-450.
The images 410 to 450 may be acquired, for example, every time the same camera moves a predetermined distance in the horizontal direction. The images 410-450 are arranged in overlapping relation in the temporal order of acquisition.
As shown in fig. 4, an occluded area 411 is included in the plurality of original images, and the occluded area 411 is marked as white.
According to an embodiment of the present disclosure, for example, the positions of the pixels of the 100th row of the first original image 410 in each of the images 410 to 450 may be determined, and the positions of each pixel in the respective original images connected to form an image to be repaired comprising the tracks of the respective pixels. As shown in fig. 4, the damaged area 412 in the image to be repaired may be the white trace formed by the pixels of the occluded area 411.
In this embodiment, the pixels of the first row of the image to be repaired may be the pixels of the 100th row of the first image 410, the pixels of the second row may be the pixels of the 100th row of the second image 420, and so on; the pixels of the last row may be the pixels of the 100th row of the last image.
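The row-stacking construction described above can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation: `build_epi` is a hypothetical helper name, and the original images are assumed to be grayscale NumPy arrays of equal size.

```python
import numpy as np

def build_epi(images, row):
    """Stack the same row from each original image, in acquisition
    order, to form the image to be repaired (an EPI-like image)."""
    return np.stack([img[row] for img in images], axis=0)

# Toy example: 5 "original images", each 4 rows x 6 columns, where
# image k is filled with gray value k.
images = [np.full((4, 6), k, dtype=np.uint8) for k in range(5)]
epi = build_epi(images, row=2)
# Row i of the stacked image is row 2 of the i-th original image.
```

With real captures, row 2 would instead be replaced by the row of interest (e.g. the 100th row in the scenario of fig. 4), and the per-pixel traces appear as the columns or diagonals of the stacked result.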
It should be understood that in the embodiment shown in fig. 4, the number of the original images is merely an example, and in practical application, the number of the original images may be arbitrary.
Referring back to fig. 2, in operation S220, for example, in the scenario shown in fig. 4, the damaged area in the image to be repaired may be the damaged area 412, according to an embodiment of the present disclosure.
According to an embodiment of the present disclosure, in operation S220, the at least one damaged area may be determined from the pixel gray values, for example by marking the damaged area in the image to be repaired with a specific pixel gray value such as (255, 255, 255).
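Detecting a damaged area marked with a sentinel gray value, as described above, might look like the following sketch. The helper name and array shapes are hypothetical; RGB NumPy images are assumed, with (255, 255, 255) as the damage marker.

```python
import numpy as np

DAMAGE_MARK = (255, 255, 255)  # marker gray value from the text

def damaged_mask(image):
    """Boolean mask that is True exactly where a pixel carries the
    damage marker value on all three channels."""
    return np.all(image == np.array(DAMAGE_MARK), axis=-1)

# Toy 3x4 RGB image with a single occluded pixel at (1, 2).
image = np.zeros((3, 4, 3), dtype=np.uint8)
image[1, 2] = 255
mask = damaged_mask(image)
```

Connected regions of `True` values in the mask then correspond to the damaged areas of operation S220.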
According to an embodiment of the present disclosure, in operation S230, for example in the scenario shown in fig. 1, the match line intersecting the damaged area determined based on the damaged area 140 may be, for example, the match line MN, for which the pixel gray value difference between the non-damaged area on the left side of the match line MN and the non-damaged area on the right side of the match line MN is largest.
FIG. 5A schematically illustrates a flow chart for determining match lines intersecting the damaged area based on the damaged area, according to an embodiment of the disclosure.
Fig. 5B schematically illustrates a schematic diagram of determining a match line intersecting the damaged area based on the damaged area, according to an embodiment of the disclosure.
As shown in fig. 5A, the method includes operations S231 to S235.
In operation S231, an initial boundary line is determined based on the position of the damaged area.
In operation S232, starting from the initial boundary line, the non-damaged area on a second side of the initial boundary line in the image to be repaired is searched for an abnormal pixel whose pixel gray value differs from that of the pixels in the non-damaged area on a first side of the initial boundary line.
In operation S233, if an abnormal pixel occurs, a hypothetical match line is set based on the at least one abnormal pixel.
In operation S234, a difference in pixel gray scale values between the non-damaged area on the first side of the hypothetical match line and the non-damaged area on the second side of the hypothetical match line is determined.
In operation S235, it is determined that the hypothetical match line that maximizes the difference between the non-damaged area of the first side and the non-damaged area of the second side is a match line.
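Operation S235, keeping the hypothetical match line that maximizes the left/right difference, can be sketched as below. Candidate generation (operations S231 to S233) is abstracted away, and `hist_difference` is a plain absolute-bin-difference stand-in for the chi-square form of equation (1); all names here are hypothetical.

```python
import numpy as np

def hist_difference(g, h):
    """One simple choice of histogram difference (sum of absolute
    per-bin differences); a stand-in for the chi-square measure."""
    return float(np.abs(np.asarray(g) - np.asarray(h)).sum())

def best_match_line(candidates, difference=hist_difference):
    """candidates: iterable of (line, left_hist, right_hist) triples,
    one per hypothetical match line. Return the line whose left/right
    histogram difference is largest (operation S235)."""
    return max(candidates, key=lambda c: difference(c[1], c[2]))[0]

# Two hypothetical candidates: line A separates very different
# regions, line B separates nearly identical ones.
h_dark = np.array([10.0, 0.0])
h_light = np.array([0.0, 10.0])
candidates = [("lineA", h_dark, h_light), ("lineB", h_dark, h_dark)]
```

Here `best_match_line(candidates)` selects `"lineA"`, since its two side histograms differ the most.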
An implementation of determining a match line intersecting the damaged area based on the damaged area described in fig. 5A according to an embodiment of the disclosure is described below with reference to fig. 5B.
According to an embodiment of the present disclosure, in operation S231, a parallelogram region containing the damaged region may be determined according to the position of the damaged region, and one oblique side of the parallelogram may be taken as the initial boundary line. For example, in the scenario shown in fig. 5B, a parallelogram GFHKG region is determined according to the position of the damaged region 510, and the oblique side GF of the parallelogram GFHKG region is taken as the initial boundary line.
According to an embodiment of the present disclosure, in operation S232, the search may, for example, start from the oblique side GF and proceed to the right, checking whether an abnormal pixel appears in the non-damaged area whose pixel gray value differs from that of the pixels in the first non-damaged area to the left of the oblique side GF, where the first non-damaged area includes the non-damaged area closest to the oblique side GF. For example, it may be checked, moving to the right, whether an abnormal pixel appears among the pixels located on the same diagonal line, the slope of the diagonal line being the same as that of the initial boundary line GF.
According to an embodiment of the present disclosure, in operation S233, for example in the scenario shown in fig. 5B, an abnormal pixel A is found, and the hypothetical match line is set as a straight line passing through the abnormal pixel A with a slope of tan θ. The hypothetical match line may be, for example, the straight line PQ.
According to an embodiment of the present disclosure, in operation S234, the pixel gray value difference between the non-damaged area on the first side of the hypothetical match line and the non-damaged area on the second side of the hypothetical match line may be calculated, for example, according to equation (1):

χ²(g_θ, h_θ) = Σ_l (g_θ(l) − h_θ(l))² / (g_θ(l) + h_θ(l))    (1)

where g_θ(l) is the number of pixels with gray value l to the left of the hypothetical match line, h_θ(l) is the number of pixels with gray value l to the right of the hypothetical match line, and χ²(g_θ, h_θ) is the pixel gray value difference between the non-damaged area on the first side and the non-damaged area on the second side of the hypothetical match line with slope tan θ.
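The χ² comparison of the left and right gray-value histograms g_θ and h_θ can be sketched as follows. The helper name is hypothetical, and bins that are empty in both histograms are skipped to avoid division by zero.

```python
import numpy as np

def chi_square_difference(g, h):
    """Chi-square distance between the left histogram g_theta and the
    right histogram h_theta of a hypothetical match line."""
    g = np.asarray(g, dtype=float)
    h = np.asarray(h, dtype=float)
    denom = g + h
    valid = denom > 0          # skip bins empty on both sides
    return float((((g - h) ** 2)[valid] / denom[valid]).sum())

# Identical histograms give zero difference; disjoint ones are large.
same = chi_square_difference(np.array([5.0, 5.0]), np.array([5.0, 5.0]))
disjoint = chi_square_difference(np.array([10.0, 0.0]), np.array([0.0, 10.0]))
```

The hypothetical match line maximizing this value over all candidate slopes and positions is then selected as the match line.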
Fig. 6 schematically illustrates a flow chart of determining a difference in pixel gray value between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line, according to an embodiment of the present disclosure.
As shown in fig. 6, the method includes operations S610 to S630.
In operation S610, a parallelogram is established with the hypothetical match line as the central axis.
In operation S620, a first color histogram of pixels in a non-damaged area of a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area of a second area on a second side of the hypothetical match line are determined, wherein the first area is an area on the first side of the central axis in the parallelogram and the second area is an area on the second side of the central axis in the parallelogram.
In operation S630, a pixel gray value distribution of a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line is determined based on the first color histogram and the second color histogram.
According to the method of the embodiment of the disclosure, the difference of the pixel gray values at two sides of the assumed matching line is determined according to the color histograms of the first area and the second area in the parallelogram taking the assumed matching line as the central axis.
In accordance with an embodiment of the present disclosure, in operation S610, for example, in the scenario shown in fig. 5B, a parallelogram BDECB may be established with the hypothetical match line PQ as a central axis.
According to an embodiment of the present disclosure, in operation S620 and operation S630, for example, in the scenario shown in fig. 5B, the first region is a BDQPB region on the left side of the central axis PQ of the parallelogram BDECB, and the second region is a PQECP region on the right side of the central axis PQ. According to an embodiment of the present disclosure, for example, color histograms of the BDQPB region and the PQECP region may be determined, respectively, to thereby determine pixel gray value distributions of the BDQPB region and the PQECP region.
According to an embodiment of the present disclosure, determining a first color histogram of pixels in a non-damaged area in a BDQPB area on a first side of a hypothetical match line PQ and a second color histogram of pixels in a non-damaged area in a PQECP area on a second side of the hypothetical match line PQ includes: obtaining a horizontal distance from a pixel of a non-damaged region in the first region (BDQPB region) and the second region (PQECP region) to a central axis (PQ), wherein the horizontal distance indicates a distance from the pixel to another pixel on the central axis (PQ) that is in the same row as the pixel; and establishing a first color histogram of a non-damaged area in the first area (BDQPB area) and a second color histogram of a non-damaged area in the second area (PQECP area) according to the weights and the pixel gray values of the pixels by taking the horizontal distance as the weight of the gray values of the pixels.
According to an embodiment of the present disclosure, an embodiment of determining a first color histogram is described taking a pixel T of a non-damaged area of a first area as an example, as shown in fig. 5B.
The first area is the BDQPB area, the non-damaged area in the BDQPB area comprises the pixel T, another pixel on the central axis in the same row as the pixel T is the pixel U, and the horizontal distance from the pixel T to the central axis is determined as the distance between the pixel T and the pixel U.
According to an embodiment of the present disclosure, the horizontal distance of the pixel of the non-damaged area to the central axis may be calculated according to formula (2), for example, the distance between the pixel T and the pixel U may be calculated according to formula (2).
Z_θ(i, j) = i − (u + (j − v) × tan θ)   formula (2)
Wherein (u, v) are the coordinates of the abnormal pixel point A, (i, j) are the coordinates of the pixel T in the non-damaged area, and Z_θ(i, j) represents the distance between the pixel T and the pixel U.
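Formula (2) can be written directly as a small helper. The function and argument names below are illustrative, and the coordinate convention follows the text: (u, v) is the abnormal pixel A on the central axis, (i, j) is the pixel in the non-damaged area, with i varying along a row:

```python
import math

def horizontal_distance(i, j, u, v, theta):
    """Formula (2): Z_theta(i, j) = i - (u + (j - v) * tan(theta)).

    Signed horizontal distance from pixel (i, j) to the point of the
    hypothetical match line lying in the same row j.
    """
    return i - (u + (j - v) * math.tan(theta))
```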
According to embodiments of the present disclosure, the horizontal distance Z_θ(i, j) may be used, for example, as the weight of the pixel gray value of the pixel T. According to an embodiment of the present disclosure, the weights of the gray values of the respective pixels in the non-damaged area may also be calculated according to the following formula (3), for example.
Wherein c represents a normalization constant, a represents a scale parameter, and ω_θ(i, j) represents the weight of the pixel (i, j).
According to the embodiment of the present disclosure, for example, if the pixel gray value of the pixel T is R and the weight of the pixel T is ω(T), it may be determined that the contribution of the pixel T to the count of pixels with gray value R is ω(T). For another example, if the pixel gray value of another pixel S is R and the weight of the pixel S is ω(S), the contribution of the pixel S to the count of pixels with gray value R is ω(S).
According to an embodiment of the present disclosure, for example, the counts of the pixel gray values in the non-damaged area of the first area may be accumulated and the first color histogram established from these counts, and the counts of the pixel gray values in the non-damaged area of the second area may be accumulated and the second color histogram established from these counts. For example, if the pixels T and S are in the non-damaged area of the first area and both have the gray value R, the count for the pixel gray value R may be ω(T) + ω(S).
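The weighted counting described above can be sketched as follows; `weighted_histogram`, its arguments, and the use of precomputed per-pixel weights (rather than the normalized formula (3), which is not reproduced in this text) are assumptions for illustration:

```python
import numpy as np

def weighted_histogram(gray, weights, mask, levels=256):
    """Accumulate a gray-value histogram where each selected pixel
    contributes its weight omega instead of 1.

    gray:    2-D array of pixel gray values
    weights: 2-D array of per-pixel weights (e.g. derived from the
             horizontal distance to the central axis)
    mask:    boolean 2-D array selecting non-damaged pixels in the region
    """
    hist = np.zeros(levels, dtype=float)
    for value, w in zip(gray[mask].ravel(), weights[mask].ravel()):
        hist[int(value)] += w
    return hist
```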
According to an embodiment of the present disclosure, the difference in pixel gray-scale values between the non-damaged area on the first side of the hypothetical match line and the non-damaged area on the second side of the hypothetical match line may be determined, for example, from the pixel gray-scale value distribution of the non-damaged area on the first side of the hypothetical match line and the non-damaged area on the second side of the hypothetical match line and equation (1) above.
Referring back to fig. 5A, according to an embodiment of the present disclosure, in operation S235, the value of tan θ that maximizes χ²(g_θ, h_θ) may be determined, for example, as the slope of the match line. The match line in fig. 5B, calculated according to equation (1), is, for example, the straight line MN.
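Operation S235 amounts to a one-dimensional search over candidate slopes. The sketch below assumes a chi-square histogram distance for equation (1) and leaves the parallelogram/histogram construction behind a caller-supplied callback (`histograms_for`); both names are illustrative assumptions:

```python
import numpy as np

def best_match_slope(candidate_thetas, histograms_for):
    """Return the theta (hence slope tan(theta)) whose hypothetical match
    line maximizes the left/right histogram difference.

    histograms_for(theta) -> (g_theta, h_theta): gray-value histograms of
    the non-damaged pixels on the two sides of the hypothetical match line.
    """
    def chi2(g, h):
        g, h = np.asarray(g, float), np.asarray(h, float)
        denom = g + h
        m = denom > 0
        return float(np.sum((g[m] - h[m]) ** 2 / denom[m]))

    return max(candidate_thetas, key=lambda t: chi2(*histograms_for(t)))
```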
Referring back to fig. 2, in operation S240, for example, in the scenario shown in fig. 5B, if it is determined in operation S230 that the match line is the straight line MN, and the color component of the pixel gray values in the first non-damaged area is, for example, (A1, A2, A3), the damaged area on the left side of the straight line MN is filled with the color of the color component (A1, A2, A3).
Fig. 7A schematically illustrates a schematic diagram of repairing the image to be repaired illustrated in fig. 5B based on color components of pixel gray values in the first and second non-damaged areas according to an embodiment of the present disclosure.
As shown in fig. 7A, the matching line in fig. 5B is a straight line MN, the color components of the non-damaged area on the left side of the straight line MN are (A1, A2, A3), and the color of the damaged area on the left side of the straight line MN is filled with the color of the color component (A1, A2, A3).
According to an embodiment of the present disclosure, for example, the damaged area in the parallelogram MNHKM may be repaired successively according to the above method, so as to obtain an undamaged image; the repaired image is shown in fig. 7B.
Fig. 8A schematically illustrates a schematic view of an image to be repaired according to another embodiment of the present disclosure.
Fig. 8B schematically illustrates an image of fig. 8A after the image to be repaired is repaired according to another embodiment of the present disclosure.
As shown in fig. 8A, the image to be repaired is a barcode image, and a partial area in the barcode image is blocked, so that the barcode cannot be recognized.
According to an embodiment of the present disclosure, the bar code image in fig. 8A may be repaired, for example, by the method shown in fig. 2.
As shown in fig. 8B, after the repair process is performed on the barcode shown in fig. 8A, the blocked portion in the barcode image is repaired so that the electronic device can recognize the barcode shown in fig. 8B.
Fig. 9 schematically shows a flowchart of an image processing method according to another embodiment of the present disclosure.
As shown in fig. 9, the method further includes operations S910 to S930 on the basis of the foregoing embodiment.
In operation S910, one of the plurality of original images is selected as an original image to be repaired.
In operation S920, one image to be repaired is determined for each damaged row of pixels in the original image to be repaired.
In operation S930, the original image to be repaired is repaired based on the plurality of images to be repaired.
An implementation of the image processing method according to an embodiment of the present disclosure is schematically described below with reference to fig. 4.
According to an embodiment of the present disclosure, in operation S910, one original image may be selected from among the original images 410 to 450 as the original image to be repaired. For example, the original image to be repaired may be the original image 410, and the 10th to 150th rows of pixels in the original image 410 are damaged rows.
According to an embodiment of the present disclosure, in operation S920, the images to be repaired respectively corresponding to the 10th to 150th rows may be determined, for example, according to the method described in fig. 3.
According to an embodiment of the present disclosure, for example, determining the image to be repaired corresponding to the pixels of the 100th row may be determining the positions of the respective pixels of the 100th row of the first original image 410 in the images 410 to 450, respectively, and connecting the positions of each pixel in the respective original images, thereby forming the image to be repaired including the tracks of the respective pixels. As shown in fig. 4, the image to be repaired includes a track formed by the pixels in the white occluded region.
In this embodiment, the pixels of the first row in the image to be repaired may be the pixels of the 100th row of the first image 410, the pixels of the second row may be the pixels of the 100th row of the second image 420, and similarly, the pixels of the last row may be the pixels of the 100th row of the last image.
The images to be repaired corresponding to the 10th to 150th rows are obtained respectively according to a similar method.
According to an embodiment of the present disclosure, in operation S930, for example, the images to be repaired corresponding to the 10th to 150th rows may be repaired respectively, to determine standard pixel gray values of the damaged pixels in the plurality of damaged rows in the original image 410 to be repaired, and to repair the color components of the pixel gray values of the damaged pixels to the standard pixel gray values.
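The construction of the per-row images to be repaired described above can be sketched as follows, assuming the originals are stacked in acquisition order; the function name and the NumPy representation are illustrative:

```python
import numpy as np

def build_row_trajectory_image(originals, row):
    """Form the image to be repaired for one damaged row: line k of the
    result is row `row` of the k-th original image in acquisition order,
    so each column traces one pixel's positions across the originals
    (cf. the tracks of fig. 4).
    """
    return np.stack([img[row] for img in originals], axis=0)
```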
Fig. 10 schematically shows a block diagram of an image processing apparatus 1000 according to an embodiment of the present disclosure.
As shown in fig. 10, the image processing apparatus 1000 includes an acquisition module 1010, a first determination module 1020, a second determination module 1030, and a first repair module 1040.
The obtaining module 1010 is configured to obtain an image to be repaired, for example, performing operation S210 described above with reference to fig. 2, where the image to be repaired includes a plurality of sub-regions, and pixels in each of the plurality of sub-regions have the same pixel gray value.
The first determining module 1020, for example, performs operation S220 described above with reference to fig. 2, for determining at least one damaged area of the plurality of sub-areas.
The second determining module 1030, for example, performs operation S230 described above with reference to fig. 2, for determining, based on the damaged area, a match line intersecting the damaged area, wherein a difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest.
The first repair module 1040, for example, performs operation S240 described above with reference to fig. 2, for repairing the image to be repaired based on the color components of the pixel gray values in the first and second non-damaged areas.
According to an embodiment of the present disclosure, the acquisition module 1010 includes: the acquisition sub-module is used for acquiring a plurality of original images, wherein the plurality of original images are images of a target object acquired by the image acquisition device along a specific direction at each movement preset distance, and the original images have the same size; the first determining submodule is used for overlapping and arranging the plurality of original images according to the acquisition sequence and determining tracks formed by the positions of pixels of the same row or the same column in one original image in the plurality of original images respectively; and a generation sub-module for generating the image to be repaired based on the track.
According to an embodiment of the present disclosure, the image processing apparatus further includes: a selecting module, configured to select one of the plurality of original images as an original image to be repaired, where the original image to be repaired includes a plurality of damaged rows; a third determining module configured to determine one image to be repaired for pixels of each of the plurality of damaged rows; and a second repair module for repairing the original image to be repaired based on a plurality of the images to be repaired.
According to an embodiment of the present disclosure, the second determining module 1030 includes: a second determining sub-module for determining an initial boundary line based on the location of the damaged area; a searching sub-module, configured to search, from the initial boundary line, whether an abnormal pixel that is different from a pixel gray value of a pixel in a non-damaged area on a first side of the initial boundary line appears in a non-damaged area on a second side of the initial boundary line in the image to be repaired; a setting sub-module for setting a hypothetical match line based on the at least one abnormal pixel if the abnormal pixel is present; a third determination sub-module for determining a difference in pixel gray values between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line; and a fourth determination sub-module for determining the hypothetical match line that maximizes the difference between the non-damaged area of the first side and the non-damaged area of the second side to be the match line.
According to an embodiment of the present disclosure, the third determination submodule includes: a setting unit for setting up a parallelogram with the assumed match line as a central axis; a first determining unit configured to determine a first color histogram of pixels in a non-damaged area of a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area of a second area on a second side of the hypothetical match line, wherein the first area is an area on the first side of the central axis in the parallelogram, and the second area is an area on the second side of the central axis in the parallelogram; and a second determining unit configured to determine a pixel gray value distribution of a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line based on the first color histogram and the second color histogram.
According to an embodiment of the present disclosure, the first determination unit includes: an obtaining subunit configured to obtain a horizontal distance from a pixel of a non-damaged area in the first area and the second area to the central axis, wherein the horizontal distance indicates a distance from the pixel to another pixel on the central axis that is in the same row as the pixel; and a determining subunit, configured to use the horizontal distance as a weight of the gray value of the pixel, and establish a first color histogram of the non-damaged area in the first area and a second color histogram of the non-damaged area in the second area according to the weight and the pixel gray value of the pixel.
Any number of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure, or at least part of the functionality of any number of them, may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure may be split into multiple modules when implemented. Any one or more of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, or an Application Specific Integrated Circuit (ASIC), or in hardware or firmware in any other reasonable manner of integrating or packaging the circuit, or in any one of, or a suitable combination of, the three implementation manners of software, hardware, and firmware. Alternatively, one or more of the modules, sub-modules, units, and sub-units according to embodiments of the present disclosure may be at least partially implemented as computer program modules, which, when executed, may perform the corresponding functions.
For example, any of the acquisition module 1010, the first determination module 1020, the second determination module 1030, and the first repair module 1040 may be combined in one module, or any one of the modules may be split into multiple modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of other modules and implemented in one module. According to embodiments of the present disclosure, at least one of the acquisition module 1010, the first determination module 1020, the second determination module 1030, and the first repair module 1040 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, or an Application Specific Integrated Circuit (ASIC), or in hardware or firmware in any other reasonable manner of integrating or packaging the circuit, or in any one of, or a suitable combination of, the three implementation manners of software, hardware, and firmware. Alternatively, at least one of the acquisition module 1010, the first determination module 1020, the second determination module 1030, and the first repair module 1040 may be at least partially implemented as a computer program module that, when executed, performs the corresponding function.
Fig. 11 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing apparatus illustrated in fig. 11 is merely an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present disclosure.
As shown in fig. 11, the image processing apparatus 1100 according to the embodiment of the present disclosure includes a processor 1101 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1102 or a program loaded from a storage section 1108 into a Random Access Memory (RAM) 1103. The processor 1101 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. The processor 1101 may also include on-board memory for caching purposes. The processor 1101 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flow according to embodiments of the present disclosure.
In the RAM 1103, various programs and data necessary for the operation of the system 1100 are stored. The processor 1101, ROM 1102, and RAM 1103 are connected to each other by a bus 1104. The processor 1101 performs various operations of the method flow according to the embodiments of the present disclosure by executing programs in the ROM 1102 and/or the RAM 1103. Note that the program may be stored in one or more memories other than the ROM 1102 and the RAM 1103. The processor 1101 may also perform various operations of the method flow according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the present disclosure, the system 1100 may also include an input/output (I/O) interface 1105, which is also connected to the bus 1104. The system 1100 may also include one or more of the following components connected to the I/O interface 1105: an input section 1106 including a keyboard, a mouse, and the like; an output section 1107 including a Cathode Ray Tube (CRT) or Liquid Crystal Display (LCD) display, a speaker, and the like; a storage section 1108 including a hard disk or the like; and a communication section 1109 including a network interface card such as a LAN card, a modem, and the like. The communication section 1109 performs communication processing via a network such as the internet. A drive 1110 is also connected to the I/O interface 1105 as needed. A removable medium 1111, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 1110 as needed, so that a computer program read therefrom is installed into the storage section 1108 as needed.
According to embodiments of the present disclosure, the method flow according to embodiments of the present disclosure may be implemented as a computer software program. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 1109, and/or installed from the removable media 1111. The above-described functions defined in the system of the embodiments of the present disclosure are performed when the computer program is executed by the processor 1101. The systems, devices, apparatus, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the disclosure.
The present disclosure also provides a computer-readable storage medium that may be embodied in the apparatus/device/system described in the above embodiments; or may exist alone without being assembled into the apparatus/device/system. The computer-readable storage medium carries one or more programs which, when executed, implement methods in accordance with embodiments of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, the computer-readable storage medium may include ROM 1102 and/or RAM 1103 described above and/or one or more memories other than ROM 1102 and RAM 1103.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments of the present disclosure and/or in the claims may be combined and/or integrated in various ways, even if such combinations or integrations are not explicitly recited in the present disclosure. In particular, the features recited in the various embodiments of the present disclosure and/or the claims may be combined and/or integrated in various ways without departing from the spirit and teachings of the present disclosure. All such combinations and/or integrations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described above separately, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the disclosure, and such alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (10)

1. An image processing method, comprising:
acquiring an image to be repaired, wherein the image to be repaired comprises a plurality of subareas, and pixels in each subarea in the plurality of subareas have the same pixel gray value;
determining at least one damaged area of the plurality of sub-areas;
determining a match line intersecting the damaged area based on the damaged area, wherein a difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is greatest; and
repairing the image to be repaired based on color components of pixel gray values in the first non-damaged area and the second non-damaged area;
Wherein said determining a match line intersecting said damaged area based on said damaged area comprises:
determining an initial boundary line based on the location of the damaged area;
searching whether abnormal pixels which are different from the pixel gray values of pixels in the non-damaged area on the first side of the initial boundary line appear in the non-damaged area on the second side of the initial boundary line in the image to be repaired from the initial boundary line;
setting a hypothetical match line based on the at least one abnormal pixel if the abnormal pixel occurs;
determining a pixel gray value difference between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line; and
a hypothetical match line that maximizes the difference between the non-damaged area of the first side and the non-damaged area of the second side is determined to be a match line.
2. The method of claim 1, wherein the acquiring an image to be repaired comprises:
acquiring a plurality of original images, wherein the plurality of original images are images of a target object acquired by an image acquisition device along a specific direction at each movement preset distance, and the original images have the same size;
Overlapping and arranging the plurality of original images according to an acquisition sequence, and determining tracks formed by the positions of pixels of the same row or the same column in one original image in the plurality of original images respectively; and
and generating the image to be repaired based on the track.
3. The method of claim 2, further comprising:
selecting one of the plurality of original images as an original image to be repaired, wherein the original image to be repaired comprises a plurality of damaged rows;
determining one image to be repaired for pixels of each of the plurality of damaged rows; and
repairing the original image to be repaired based on a plurality of the images to be repaired.
4. The method of claim 1, wherein the determining a pixel gray value difference between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line comprises:
establishing a parallelogram with the hypothesized match line as a central axis;
determining a first color histogram of pixels in a non-damaged area of a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area of a second area on a second side of the hypothetical match line, wherein the first area is an area on the first side of the central axis in the parallelogram and the second area is an area on the second side of the central axis in the parallelogram; and
A pixel gray value distribution of a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line is determined based on the first color histogram and the second color histogram.
5. The method of claim 4, wherein the determining a first color histogram of pixels in a non-damaged area in a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area in a second area on a second side of the hypothetical match line comprises:
obtaining a horizontal distance from a pixel of a non-damaged region in the first region and the second region to the central axis, wherein the horizontal distance indicates a distance from the pixel to another pixel on the central axis that is in the same row as the pixel; and
and taking the horizontal distance as the weight of the gray value of the pixel, and establishing a first color histogram of the non-damaged area in the first area and a second color histogram of the non-damaged area in the second area according to the weight and the pixel gray value of the pixel.
6. An image processing apparatus comprising:
an acquisition module, configured to acquire an image to be repaired, where the image to be repaired includes a plurality of sub-areas, and pixels in each of the plurality of sub-areas have the same pixel gray value;
A first determination module for determining at least one damaged area of the plurality of sub-areas;
a second determining module, configured to determine, based on the damaged area, a match line intersecting the damaged area, where a difference in pixel gray values between a first non-damaged area on a first side of the match line and a second non-damaged area on a second side of the match line is largest; and
a first repair module for repairing the image to be repaired based on color components of pixel gray values in the first and second non-damaged areas;
wherein the second determining module includes:
a second determining sub-module for determining an initial boundary line based on the location of the damaged area;
a searching sub-module, configured to search, starting from the initial boundary line, the non-damaged area on a second side of the initial boundary line in the image to be repaired for an abnormal pixel whose pixel gray value differs from that of the pixels in the non-damaged area on a first side of the initial boundary line;
a setting sub-module for setting a hypothetical match line based on the at least one abnormal pixel if the abnormal pixel is present;
a third determining sub-module for determining a difference in pixel gray values between a non-damaged area on a first side of the hypothetical match line and a non-damaged area on a second side of the hypothetical match line; and
a fourth determining sub-module for determining, as the match line, a hypothetical match line that maximizes the difference between the non-damaged area on the first side and the non-damaged area on the second side.
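The search carried out by the third and fourth determining sub-modules can be sketched as below (a simplified, hypothetical illustration: match lines are assumed to be vertical and given as column indices, and the "difference" between the two sides is taken here as the L1 distance between normalized gray-value histograms; the claim itself does not fix these choices):

```python
import numpy as np

def find_match_line(gray, damaged_mask, candidate_cols, bins=256):
    valid = ~damaged_mask                      # non-damaged pixels only
    best_col, best_diff = None, -1.0
    for col in candidate_cols:                 # hypothetical match lines
        left = gray[:, :col][valid[:, :col]]
        right = gray[:, col:][valid[:, col:]]
        if left.size == 0 or right.size == 0:
            continue
        h_l, _ = np.histogram(left, bins=bins, range=(0, bins), density=True)
        h_r, _ = np.histogram(right, bins=bins, range=(0, bins), density=True)
        diff = np.abs(h_l - h_r).sum()         # L1 distance between the sides
        if diff > best_diff:                   # keep the maximizing line
            best_col, best_diff = col, diff
    return best_col
```

On an image whose left half is dark and right half is bright, the candidate lying on the true boundary maximizes this difference, even when the pixels around the boundary are damaged.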
7. The apparatus of claim 6, wherein the acquisition module comprises:
an acquisition sub-module for acquiring a plurality of original images, wherein the plurality of original images are images of a target object acquired by an image acquisition device each time the device moves a preset distance along a specific direction, and the original images have the same size;
a first determining sub-module for overlapping and arranging the plurality of original images according to an acquisition order, and determining tracks formed by the respective positions, in the plurality of original images, of the pixels of a same row or a same column of one original image; and
a generation sub-module for generating the image to be repaired based on the tracks.
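The track-based construction of claim 7 can be sketched roughly as follows (an assumption-laden illustration: the device is taken to shift the scene by a fixed whole number of rows `step` between consecutive captures, so the track of one physical row visits row `row - k*step` of the k-th original image; the sign convention and all names are hypothetical):

```python
import numpy as np

def track_image(originals, row, step):
    track = []
    for k, img in enumerate(originals):
        r = row - k * step                # where the tracked row lies in image k
        if 0 <= r < img.shape[0]:         # keep only in-bounds observations
            track.append(img[r])
    # Each original contributes one observation of the same physical row;
    # stacking them yields the image to be repaired for that row.
    return np.stack(track)
```

Each row of the result is the same scene row seen in a different capture, which is what allows pixels damaged in one capture to be repaired from the others.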
8. The apparatus of claim 7, further comprising:
a selection module for selecting one of the plurality of original images as a repair starting image, wherein the repair starting image includes a plurality of damaged rows;
a third determining module for determining one image to be repaired for the pixels of each of the plurality of damaged rows; and
a second repair module for repairing the repair starting image based on the plurality of images to be repaired determined for the repair starting image.
9. An image processing apparatus comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-5.
10. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1 to 5.
CN201910582938.1A 2019-06-28 2019-06-28 Image processing method, image processing apparatus, and readable storage medium Active CN112150373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910582938.1A CN112150373B (en) 2019-06-28 2019-06-28 Image processing method, image processing apparatus, and readable storage medium

Publications (2)

Publication Number Publication Date
CN112150373A CN112150373A (en) 2020-12-29
CN112150373B true CN112150373B (en) 2023-09-26

Family

ID=73891369

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910582938.1A Active CN112150373B (en) 2019-06-28 2019-06-28 Image processing method, image processing apparatus, and readable storage medium

Country Status (1)

Country Link
CN (1) CN112150373B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651897B (en) * 2020-12-30 2024-05-03 成都星时代宇航科技有限公司 Pixel repairing method, device, electronic equipment and computer readable storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
KR19990016366A * 1997-08-14 1999-03-05 윤종용 Method for restoring a damaged image
CN104966279A (en) * 2015-06-15 2015-10-07 鲁东大学 Image synthesis restoration method based on local structure features
WO2018019194A1 * 2016-07-27 2018-02-01 Tencent Technology (Shenzhen) Co., Ltd. Image recognition method, terminal, and nonvolatile storage medium
CN109272526A (en) * 2017-07-17 2019-01-25 北京京东尚科信息技术有限公司 Image processing method, system and electronic equipment

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9014474B2 (en) * 2012-09-06 2015-04-21 Cyberlink Corp. Systems and methods for multi-resolution inpainting

Non-Patent Citations (1)

Title
Efficient wavelet image inpainting algorithm based on pixel weights; Lü Yongli; Jiang Bin; Bao Jianrong; Information and Control (Issue 01); full text *

Also Published As

Publication number Publication date
CN112150373A (en) 2020-12-29

Similar Documents

Publication Publication Date Title
US11587219B2 (en) Method and apparatus for detecting pixel defect of optical module, and device
US10559090B2 (en) Method and apparatus for calculating dual-camera relative position, and device
US9940509B2 (en) Object detection method and object detection apparatus
CN109658454B (en) Pose information determination method, related device and storage medium
CN111598913B (en) Image segmentation method and system based on robot vision
US9626761B2 (en) Sampling method and image processing apparatus of CS-RANSAC for estimating homography
CN107748882B (en) Lane line detection method and device
CN109102026B (en) Vehicle image detection method, device and system
CN112200851A (en) Point cloud-based target detection method and device and electronic equipment thereof
CN112150373B (en) Image processing method, image processing apparatus, and readable storage medium
CN110909620A (en) Vehicle detection method and device, electronic equipment and storage medium
CN113902740A (en) Construction method of image blurring degree evaluation model
CN112837384B (en) Vehicle marking method and device and electronic equipment
US9648211B2 (en) Automatic video synchronization via analysis in the spatiotemporal domain
CN111709951B (en) Target detection network training method and system, network, device and medium
CN112287905A (en) Vehicle damage identification method, device, equipment and storage medium
CN116017129A (en) Method, device, system, equipment and medium for adjusting angle of light supplementing lamp
CN114694137B (en) Image detection method, three-dimensional imaging method and device
CN114040120B (en) Shooting path determination method, device and equipment for panel element detection
CN115511870A (en) Object detection method and device, electronic equipment and storage medium
CN111145674B (en) Display panel detection method, electronic device and storage medium
US10713808B2 (en) Stereo matching method and system using rectangular window
CN112183563A (en) Image recognition model generation method, storage medium and application server
CN113487594B (en) Sub-pixel corner detection method, system and medium based on deep learning
CN112711973B (en) Assessment method and device for key point detection algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant