CN112150373A - Image processing method, image processing apparatus, and readable storage medium

Info

Publication number
CN112150373A
Authority
CN
China
Prior art keywords
image
damaged
pixel
region
match line
Prior art date
Legal status
Granted
Application number
CN201910582938.1A
Other languages
Chinese (zh)
Other versions
CN112150373B (en)
Inventor
魏传振
Current Assignee
Beijing Jingdong Zhenshi Information Technology Co Ltd
Original Assignee
Beijing Jingdong Zhenshi Information Technology Co Ltd
Application filed by Beijing Jingdong Zhenshi Information Technology Co Ltd
Priority to CN201910582938.1A
Publication of CN112150373A
Application granted
Publication of CN112150373B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • G06T5/94Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides an image processing method, including: acquiring an image to be repaired, where the image to be repaired includes a plurality of sub-regions and the pixels in each of the plurality of sub-regions have the same pixel gray value; determining at least one damaged region among the plurality of sub-regions; determining, based on the damaged region, a match line intersecting the damaged region, where the pixel gray-value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest; and repairing the image to be repaired based on the color components of the pixel gray values in the first and second non-damaged regions. The present disclosure also provides an image processing apparatus and a computer-readable storage medium.

Description

Image processing method, image processing apparatus, and readable storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an image processing method, an image processing apparatus, and a computer-readable storage medium.
Background
With the rapid development of computer technology, image processing technology is increasingly applied in various fields such as industrial and agricultural production, construction, logistics, and daily life, for example, identifying a mark on goods in order to classify the goods.
In the course of implementing the disclosed concept, the inventors found at least the following problem in the prior art: existing image processing technology has difficulty identifying an image in which at least part of the area is occluded.
Disclosure of Invention
In view of the above, the present disclosure provides an image processing method, an image processing apparatus, and a computer-readable storage medium.
One aspect of the present disclosure provides an image processing method, including: acquiring an image to be repaired, where the image to be repaired includes a plurality of sub-regions and the pixels in each of the plurality of sub-regions have the same pixel gray value; determining at least one damaged region among the plurality of sub-regions; determining, based on the damaged region, a match line intersecting the damaged region, where the pixel gray-value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest; and repairing the image to be repaired based on the color components of the pixel gray values in the first and second non-damaged regions.
According to an embodiment of the present disclosure, acquiring an image to be repaired includes: acquiring a plurality of original images, wherein the original images are images of a target object acquired by an image acquisition device every time the image acquisition device moves a preset distance in a specific direction, and the original images have the same size; overlapping and arranging the plurality of original images according to an acquisition sequence, and determining tracks formed by pixels in the same row or the same column in one original image in the plurality of original images at the positions of the plurality of original images respectively; and generating the image to be repaired based on the track.
According to an embodiment of the present disclosure, the image processing method further includes: selecting one of the plurality of original images as an original image to be repaired, where the original image to be repaired includes a plurality of damaged rows; determining an image to be repaired for the pixels of each damaged row of the plurality of damaged rows; and restoring the original image to be repaired based on the plurality of images to be repaired determined for it.
According to an embodiment of the present disclosure, determining, based on the damaged region, a match line intersecting the damaged region includes: determining an initial boundary line based on the location of the damaged region; searching, starting from the initial boundary line, whether an abnormal pixel whose pixel gray value differs from that of the pixels in the non-damaged region on the first side of the initial boundary line appears in the non-damaged region on the second side of the initial boundary line in the image to be repaired; if such an abnormal pixel occurs, setting a hypothetical match line based on the at least one abnormal pixel; determining the pixel gray-value difference between the non-damaged region on a first side of the hypothetical match line and the non-damaged region on a second side of the hypothetical match line; and determining, as the match line, the hypothetical match line that maximizes the difference between the non-damaged region on its first side and the non-damaged region on its second side.
According to an embodiment of the present disclosure, determining a pixel grayscale value difference between a non-damaged region of a first side of the hypothetical match line and a non-damaged region of a second side of the hypothetical match line comprises: establishing a parallelogram with the assumed match line as a central axis; determining a first color histogram of pixels in a non-damaged region of a first region on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged region of a second region on a second side of the hypothetical match line, wherein the first region is a region on the first side of the central axis in the parallelogram and the second region is a region on the second side of the central axis in the parallelogram; and determining pixel gray value distributions of the non-damaged region on the first side of the hypothetical match line and the non-damaged region on the second side of the hypothetical match line based on the first color histogram and the second color histogram.
According to an embodiment of the present disclosure, determining a first color histogram of pixels in a non-damaged area in a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area in a second area on a second side of the hypothetical match line comprises: obtaining a horizontal distance of a pixel of a non-damaged area of the first area and the second area to the central axis, wherein the horizontal distance indicates a distance of the pixel to another pixel on the same row as the pixel on the central axis; and taking the horizontal distance as the weight of the gray value of the pixel, and establishing a first color histogram of the non-damaged area in the first area and a second color histogram of the non-damaged area in the second area according to the weight and the gray value of the pixel.
Another aspect of the present disclosure provides an image processing apparatus, including: an acquisition module configured to acquire an image to be repaired, where the image to be repaired includes a plurality of sub-regions and the pixels in each of the plurality of sub-regions have the same pixel gray value; a first determining module configured to determine at least one damaged region among the plurality of sub-regions; a second determining module configured to determine, based on the damaged region, a match line intersecting the damaged region, where the pixel gray-value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest; and a first repairing module configured to repair the image to be repaired based on the color components of the pixel gray values in the first and second non-damaged regions.
According to an embodiment of the present disclosure, the obtaining module includes: an acquisition sub-module configured to acquire a plurality of original images, which are images of a target object acquired by an image acquisition device every time the image acquisition device moves a preset distance in a specific direction, the original images having the same size; the determining submodule is used for overlapping and arranging the plurality of original images according to the acquisition sequence and determining tracks formed by pixels in the same row or the same column in one of the plurality of original images at the positions of the plurality of original images respectively; and the generation submodule is used for generating the image to be repaired based on the track.
According to an embodiment of the present disclosure, the image processing apparatus further includes: a selecting module configured to select one of the plurality of original images as an original image to be repaired, where the original image to be repaired includes a plurality of damaged rows; a third determining module configured to determine an image to be repaired for the pixels of each damaged row of the plurality of damaged rows; and a second repairing module configured to restore the original image to be repaired based on the plurality of images to be repaired determined for it.
Another aspect of the present disclosure provides an image processing apparatus including: one or more processors; memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the above-described method.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
According to the embodiments of the present disclosure, the problem that an image with a blocked part is difficult to identify can be at least partially solved, thereby achieving the technical effect of accurately identifying a partially blocked image.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments of the present disclosure with reference to the accompanying drawings, in which:
fig. 1 schematically shows an application scenario in which an image processing method may be applied according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flowchart of an example method of acquiring an image to be repaired, in accordance with an embodiment of the present disclosure;
fig. 4 schematically illustrates a plurality of original images arranged in an overlapping manner and the trajectory formed by the positions of pixels in the same row across the plurality of original images, according to an embodiment of the present disclosure;
FIG. 5A schematically illustrates a flow chart for determining a match line that intersects the damaged region based on the damaged region, according to an embodiment of the present disclosure;
FIG. 5B schematically illustrates a schematic diagram of determining a match line that intersects the damaged region based on the damaged region, according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart for determining a pixel grayscale value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line, in accordance with an embodiment of the present disclosure;
FIG. 7A schematically illustrates a schematic diagram of repairing the image to be repaired illustrated in FIG. 5B based on color components of pixel grayscale values in the first and second non-damaged regions according to an embodiment of the disclosure;
fig. 7B schematically illustrates a schematic diagram after completing the restoration of the image to be restored in fig. 5B according to an embodiment of the disclosure;
fig. 8A schematically shows a schematic view of an image to be repaired according to another embodiment of the present disclosure;
fig. 8B schematically shows an image after an image to be repaired is repaired according to another embodiment of the present disclosure;
FIG. 9 schematically shows a flow chart of an image processing method according to another embodiment of the present disclosure;
fig. 10 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure; and
FIG. 11 schematically shows a block diagram of an image processing system according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have a alone, B alone, C alone, a and B together, a and C together, B and C together, and/or A, B, C together, etc.). Where a convention analogous to "A, B or at least one of C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have a alone, B alone, C alone, a and B together, a and C together, B and C together, and/or A, B, C together, etc.).
An embodiment of the present disclosure provides an image processing method, including: acquiring an image to be repaired, where the image to be repaired includes a plurality of sub-regions and the pixels in each of the plurality of sub-regions have the same pixel gray value; determining at least one damaged region among the plurality of sub-regions; determining, based on the damaged region, a match line intersecting the damaged region, where the pixel gray-value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest; and repairing the image to be repaired based on the color components of the pixel gray values in the first and second non-damaged regions.
Fig. 1 schematically shows an application scenario in which an image processing method may be applied according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of an application scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the application scene includes an image to be recognized 100. The image to be recognized 100 comprises a plurality of sub-regions, such as the sub-regions 110 to 130 comprised in FIG. 1. Wherein the pixels in each sub-region have the same pixel grey value.
As shown in fig. 1, the image to be recognized 100 includes an occluded area 140. The image to be recognized 100 cannot be recognized because the pixel gray scale values of the pixels in the occlusion region 140 cannot be obtained.
The image processing method according to the embodiment of the present disclosure can repair the image 100 to be recognized, so that the repaired image 100 to be recognized can be recognized accurately.
Fig. 2 schematically shows a flow chart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the image processing method includes operations S210 to S240.
In operation S210, an image to be repaired is acquired, where the image to be repaired includes a plurality of sub-regions, and pixels in each of the plurality of sub-regions have the same pixel grayscale value.
In operation S220, at least one damaged region of the plurality of sub-regions is determined.
In operation S230, a match line intersecting the damaged region is determined based on the damaged region, wherein a pixel gray value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest.
In operation S240, the image to be repaired is repaired based on the color components of the gray-scale values of the pixels in the first and second non-damaged regions.
According to the method disclosed by the embodiment of the invention, the image to be repaired is repaired according to the texture information of the image to be repaired, so that the accuracy of identifying the image to be repaired is improved.
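The flow of operations S210 to S240 can be sketched end to end. Everything below is an illustrative assumption rather than the patented procedure: the function name, the (255, 255, 255) marker for damaged pixels (introduced later in the discussion of operation S220), and especially the trivial fill step, which copies the nearest non-damaged pixel to the left instead of performing the match-line search of operations S230 and S240.

```python
import numpy as np

def repair_epi(image):
    """Toy end-to-end sketch of operations S210-S240: find damaged
    pixels via the (255, 255, 255) marker, then fill each one from the
    nearest non-damaged pixel to its left in the same row (a stand-in
    for the match-line based fill of S230 and S240)."""
    out = image.copy()
    damaged = np.all(out == 255, axis=-1)      # S220: locate damage
    for r, c in zip(*np.nonzero(damaged)):     # S230/S240 stand-in
        left = c - 1
        while left >= 0 and damaged[r, left]:
            left -= 1
        if left >= 0:
            out[r, c] = out[r, left]
    return out

# A 2x4 RGB image with a gray stripe and one damaged pixel inside it.
img = np.zeros((2, 4, 3), dtype=np.uint8)
img[:, 1:3] = 50
img[0, 2] = 255
print(repair_epi(img)[0, 2].tolist())  # [50, 50, 50]
```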
According to an embodiment of the present disclosure, in operation S210, the image to be repaired may include, for example, an epipolar plane image (EPI). The epipolar plane image includes a plurality of lines; the pixels in each line have the same pixel gray value, and the gray values of pixels in different lines may be the same or different. Pixels having the same gray value may mean, for example, that the R, G, and B values of the pixels are respectively the same.
According to an embodiment of the present disclosure, the image to be repaired may include, for example, a barcode image including a plurality of black bars and spaces of different widths, pixels in the black bars having the same pixel grayscale value.
Fig. 3 schematically shows a flowchart of a method of acquiring the image to be repaired in operation S210 according to an embodiment of the present disclosure.
As shown in fig. 3, the method includes operations S211 to S213.
In operation S211, a plurality of original images, which are images of a target object captured by an image capturing device every movement of a preset distance in a specific direction, are acquired, the original images having the same size.
In operation S212, the plurality of original images are overlapped in the acquisition time order, and a trajectory formed by pixels in the same row or the same column in one of the plurality of original images at positions in the plurality of original images respectively is determined.
In operation S213, the image to be repaired is generated based on the trajectory.
According to an embodiment of the present disclosure, in operation S211, the plurality of original images may be, for example, images of one target object captured by the same camera each time it moves a preset distance in a horizontal or vertical direction. Alternatively, the plurality of original images may be obtained by placing a plurality of cameras at intervals of the preset distance along a specific direction and having each camera capture the target object.
According to an embodiment of the present disclosure, in operation S212, for example, a plurality of original images are obtained by capturing an image of one target object every moving a preset distance by moving the same camera in a horizontal or vertical direction, the plurality of original images may be arranged to be overlapped in a time order of capture. For another example, if a plurality of original images are obtained by capturing a target object by a plurality of cameras located on the same straight line, the plurality of original images may be arranged according to the distance between the cameras and the target object.
According to an embodiment of the present disclosure, in operation S212, for example, the same camera is moved in a horizontal direction, and an image of one target object is acquired every movement of a preset distance to obtain a plurality of original images. In this embodiment, for example, the trajectory of the pixels of a certain line in the first original image at the positions in the plurality of original images, respectively, may be determined.
For another example, the same camera is moved in the vertical direction, and an image of one target object is acquired every movement of a preset distance to obtain a plurality of original images. In this embodiment, for example, the trajectory formed by the positions of the pixels of a certain column in the first original image in the plurality of original images can be determined.
According to the embodiment of the present disclosure, in operation S213, the image to be repaired may be determined, for example, by a trajectory formed by positions of pixels of a certain line of the first original image in the plurality of original images.
For example, to facilitate understanding of those skilled in the art, the operations S212 and S213 are described below with reference to fig. 4, taking the case where the camera moves in the horizontal direction as an example.
Fig. 4 schematically shows a schematic diagram of a plurality of original images arranged in an overlapping manner and determining a trajectory formed by the positions of pixels in the same line in the plurality of original images, respectively, according to an embodiment of the present disclosure.
As shown in FIG. 4, the plurality of original images may include, for example, images 410-450.
The images 410-450 may be acquired, for example, for each predetermined distance of movement of the same camera in the horizontal direction. The images 410-450 are arranged in an overlapping manner according to the time sequence of acquisition.
As shown in fig. 4, an occluded area 411 is included in the plurality of original images, the occluded area 411 being marked white.
According to the embodiment of the disclosure, for example, the positions of the pixels in the 100th row of the first original image 410 in the images 410-450 are determined, and the positions of each pixel in each original image are connected to form the image to be repaired including the track of each pixel. As shown in fig. 4, the damaged area 412 in the image to be repaired may be a track formed by pixels in the white occluded area 411.
In this embodiment, the first row of pixels in the image to be repaired may be the 100th row of pixels of the first image 410, the second row of pixels in the image to be repaired may be the 100th row of pixels of the second image 420, and similarly, the last row of pixels in the image to be repaired may be the 100th row of pixels of the last image.
It should be understood that the number of original images in the embodiment shown in fig. 4 is merely an example, and in practical applications, the number of original images may be arbitrary.
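The construction in operations S211 to S213 can be sketched as follows, assuming the camera moves horizontally so that the same row is taken from every original image; the function name, the row index, and the array shapes are illustrative assumptions.

```python
import numpy as np

def build_epi(original_images, row):
    """Operations S211-S213 for horizontal camera motion: stack row
    `row` from every original image in acquisition order, so each
    pixel's trajectory across the sequence becomes a line in the
    resulting image to be repaired."""
    images = [np.asarray(img) for img in original_images]
    assert all(img.shape == images[0].shape for img in images), \
        "the original images must have the same size"
    return np.stack([img[row] for img in images], axis=0)

# Five 8x12 grayscale originals; frame k is filled with gray value k.
frames = [np.full((8, 12), k, dtype=np.uint8) for k in range(5)]
epi = build_epi(frames, row=3)
print(epi.shape)  # (5, 12): one row per original image
```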
Referring back to fig. 2, according to an embodiment of the present disclosure, in operation S220, for example, in the scenario illustrated in fig. 4, the damaged region in the image to be repaired may be a damaged region 412.
According to the embodiment of the present disclosure, in operation S220, at least one damaged area may be determined according to a pixel gray value, for example, by marking the damaged area in the image to be repaired as a specific pixel gray value, for example, (255, 255, 255).
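Under the marking convention just described, operation S220 reduces to a comparison against the marker value; the function name below is an assumption.

```python
import numpy as np

def damaged_mask(image_rgb):
    """Operation S220 under the marker convention: a pixel belongs to a
    damaged region iff its gray value is the (255, 255, 255) marker."""
    return np.all(image_rgb == 255, axis=-1)

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[1:3, 1:3] = 255          # a 2x2 damaged patch
mask = damaged_mask(img)
print(int(mask.sum()))       # 4
```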
According to an embodiment of the present disclosure, in operation S230, for example, in the scenario shown in fig. 1, the match line determined based on the damaged region 140 and intersecting it may be, for example, the match line MN, where the pixel gray-value difference between the non-damaged region on the left side of the match line MN and the non-damaged region on its right side is the largest.
Figure 5A schematically illustrates a flow chart for determining a match line that intersects the damaged region based on the damaged region, according to an embodiment of the present disclosure.
Figure 5B schematically illustrates a schematic diagram of determining a match line that intersects the damaged region based on the damaged region, according to an embodiment of the present disclosure.
As shown in fig. 5A, the method includes operations S231 to S235.
In operation S231, an initial boundary line is determined based on the location of the damaged region.
In operation S232, starting from the initial boundary line, it is searched whether an abnormal pixel, whose pixel gray value differs from that of the pixels in the non-damaged area on the first side of the initial boundary line, appears in the non-damaged area on the second side of the initial boundary line in the image to be repaired.
In operation S233, a hypothesis match line is set based on the at least one abnormal pixel.
In operation S234, a pixel grayscale value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line is determined.
In operation S235, a hypothetical match line that maximizes the difference between the non-damaged region of the first side and the non-damaged region of the second side is determined to be a match line.
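Operations S233 to S235 amount to an argmax over hypothetical match lines. In the sketch below the candidate lines are identified by the abnormal pixels they pass through, and the gray-value difference measure is supplied by the caller (in the text it is the difference of equation (1)); all names and the toy scores are illustrative assumptions.

```python
def find_match_line(candidates, side_difference):
    """Operations S233-S235 as an argmax: each candidate identifies a
    hypothetical match line through one abnormal pixel; keep the line
    whose two sides differ the most. `side_difference` stands in for
    the chi-squared measure of equation (1)."""
    best_line, best_score = None, float("-inf")
    for line in candidates:
        score = side_difference(line)
        if score > best_score:
            best_line, best_score = line, score
    return best_line, best_score

# Toy scores: the line through abnormal pixel (2, 5) separates best.
scores = {(1, 4): 0.3, (2, 5): 0.9, (3, 6): 0.1}
line, score = find_match_line(scores, scores.get)
print(line, score)  # (2, 5) 0.9
```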
An implementation of determining a match line intersecting the damaged region based on the damaged region as described in fig. 5A in accordance with an embodiment of the present disclosure is illustrated below in conjunction with fig. 5B.
According to the embodiment of the present disclosure, in operation S231, for example, a parallelogram region including the damaged region may be determined according to the position of the damaged region, and one oblique side of the parallelogram may be determined as the initial boundary line. For example, in the scenario shown in fig. 5B, a parallelogram GFHKG region is determined based on the position of the damaged region 510, and the hypotenuse GF of the parallelogram GFHKG region is used as the initial boundary line.
According to an embodiment of the present disclosure, in operation S232, it may be searched rightward from the oblique side GF, for example, whether the non-damaged area contains abnormal pixels whose pixel gray values differ from those of the pixels in the first non-damaged area on the left side of the oblique side GF, where the first non-damaged area includes the non-damaged area closest to the oblique side GF. For example, the search may proceed rightward, checking one oblique line at a time whether an abnormal pixel occurs among the pixels on that line, where the slope of each such oblique line is the same as the slope of the initial boundary line GF.
According to the embodiment of the present disclosure, in operation S233, for example, in the scenario shown in fig. 5B, the abnormal pixel a is found, and the hypothetical match line is set as a straight line passing through the abnormal pixel a and having a slope of tan θ. The hypothetical match line may be, for example, the straight line PQ.
According to an embodiment of the present disclosure, in operation S234, a pixel gray value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line may be calculated, for example, according to equation (1).
χ²(g_θ, h_θ) = Σ_l [g_θ(l) − h_θ(l)]² / [g_θ(l) + h_θ(l)]    (1)
where g_θ(l) is the number of pixels with gray value l on the left side of the hypothetical match line, h_θ(l) is the number of pixels with gray value l on the right side of the hypothetical match line, and χ²(g_θ, h_θ) is the pixel gray-value difference between the non-damaged region on the first side and the non-damaged region on the second side of the hypothetical match line with slope tan θ.
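A direct implementation of equation (1), with empty bins skipped so the ratio is always defined; the function name is an assumption.

```python
import numpy as np

def chi_squared_difference(g, h):
    """Equation (1): chi-squared distance between the gray-value
    histogram g (left of the hypothetical match line) and h (right)."""
    g = np.asarray(g, dtype=float)
    h = np.asarray(h, dtype=float)
    total = g + h
    nz = total > 0                     # skip bins that are empty on both sides
    return float(np.sum((g[nz] - h[nz]) ** 2 / total[nz]))

# Identical histograms give 0; disjoint ones give their total mass.
print(chi_squared_difference([4, 0], [4, 0]))  # 0.0
print(chi_squared_difference([4, 0], [0, 4]))  # 8.0
```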
Figure 6 schematically illustrates a flow chart for determining a pixel grayscale value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line, in accordance with an embodiment of the present disclosure.
As shown in fig. 6, the method includes operations S610 to S630.
In operation S610, a parallelogram is established with the hypothetical match line as a central axis.
In operation S620, a first color histogram of pixels in a non-damaged area of a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area of a second area on a second side of the hypothetical match line are determined, wherein the first area is an area on the first side of the central axis in the parallelogram and the second area is an area on the second side of the central axis in the parallelogram.
In operation S630, pixel grayscale value distributions of a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line are determined based on the first color histogram and the second color histogram.
According to the method, the pixel gray value difference between the two sides of the hypothetical match line is determined from the color histograms of the first area and the second area of the parallelogram whose central axis is the hypothetical match line. Because only the color histograms inside the parallelogram of the image to be repaired are calculated, the calculation amount is reduced, and the influence of other non-damaged areas on the calculation result is also reduced.
According to an embodiment of the present disclosure, in operation S610, for example, in the scenario shown in fig. 5B, a parallelogram BDECB may be established with the assumed match line PQ as a central axis.
According to an embodiment of the present disclosure, in operations S620 and S630, for example, in the scenario shown in fig. 5B, the first region is a BDQPB region on the left side of the central axis PQ of the parallelogram BDECB, and the second region is a PQECP region on the right side of the central axis PQ. According to the embodiment of the present disclosure, for example, color histograms of the BDQPB region and the PQECP region may be determined, respectively, so as to determine pixel gray value distributions of the BDQPB region and the PQECP region.
According to an embodiment of the present disclosure, determining the first color histogram of the pixels in the non-damaged region in the BDQPB region on the first side of the assumed match line PQ and the second color histogram of the pixels in the non-damaged region in the PQECP region on the second side of the assumed match line PQ comprises: obtaining the horizontal distance from a pixel of the non-damaged region in the first region (BDQPB region) or the second region (PQECP region) to the central axis (PQ), wherein the horizontal distance indicates the distance from the pixel to another pixel on the central axis (PQ) located in the same row as the pixel; and taking the horizontal distance as the weight of the gray value of the pixel, and establishing the first color histogram of the non-damaged region in the first region (BDQPB region) and the second color histogram of the non-damaged region in the second region (PQECP region) based on the weights and the pixel gray values.
According to an embodiment of the present disclosure, as shown in fig. 5B, an embodiment of determining the first color histogram is described by taking a pixel T of the non-damaged area of the first area as an example.
The first region is a BDQPB region, the non-damaged region in the BDQPB region includes a pixel T, another pixel on the central axis on the same row as the pixel T is a pixel U, and a horizontal distance from the pixel T to the central axis is determined as a distance between the pixel T and the pixel U.
According to an embodiment of the present disclosure, the horizontal distance of the pixel of the non-damaged region from the central axis may be calculated according to formula (2), for example, the distance between the pixel T and the pixel U may be calculated according to formula (2).
Zθ(i, j) = i − (u + (j − v) × tan θ)    formula (2)
where (u, v) are the coordinates of the abnormal pixel A, (i, j) are the coordinates of the pixel T in the non-damaged area, and Zθ(i, j) represents the distance between pixel T and pixel U.
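Formula (2) is a one-liner in code. In this sketch the function name is illustrative; the coordinate convention (i as column, j as row) follows the formula above.

```python
import math

def horizontal_distance(i, j, u, v, theta):
    """Signed horizontal distance Z_theta(i, j) from pixel (i, j) to the
    hypothetical match line through the abnormal pixel (u, v) with slope
    tan(theta), per formula (2)."""
    return i - (u + (j - v) * math.tan(theta))

# With theta = 0 the match line is vertical, so Z is just the column offset.
print(horizontal_distance(7, 9, 3, 5, 0.0))  # 4.0

# A pixel lying on the line itself has distance ~0.
print(round(horizontal_distance(7, 9, 3, 5, math.pi / 4), 9))  # 0.0
```

The sign of Zθ(i, j) tells which side of the line the pixel falls on, and its magnitude feeds the weighting of formula (3).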
According to embodiments of the present disclosure, the horizontal distance Zθ(i, j) may be used, for example, as the weight of the pixel gray value of the pixel T. According to the embodiment of the present disclosure, the weight of the gray value of each pixel in the non-damaged region may also be calculated, for example, according to the following formula (3).
ωθ(i, j) = c · exp(−Zθ(i, j)² / a²)    formula (3)
where c denotes a normalization constant, a denotes a scale parameter, and ωθ(i, j) represents the weight of the pixel (i, j).
According to the embodiment of the present disclosure, for example, if the pixel gray value of the pixel T is R and the weight of the pixel T is ω(T), it may be determined that the contribution of the pixel T to the count of pixels whose gray value is R is ω(T). For another example, if the gray value of another pixel S is R and the weight of the pixel S is ω(S), the contribution of the pixel S to the count of pixels whose gray value is R is ω(S).
According to the embodiment of the present disclosure, for example, the number of pixels at each gray value in the non-damaged region of the first region may be counted and the first color histogram established from these counts, and the number of pixels at each gray value in the non-damaged region of the second region may be counted and the second color histogram established from these counts. For example, if the pixels T and S in the non-damaged area of the first area both have the pixel gray value R, the count for the gray value R may be ω(T) + ω(S).
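The weighted counting described above is a weighted histogram: each pixel contributes its weight, rather than 1, to the bin of its gray value. A minimal sketch (the function name and the 256-level assumption are illustrative):

```python
import numpy as np

def weighted_gray_histogram(gray_values, weights, levels=256):
    """Accumulate each pixel's weight into the bin of its gray value, so
    pixels close to the hypothetical match line count for more."""
    return np.bincount(np.asarray(gray_values),
                       weights=np.asarray(weights, dtype=float),
                       minlength=levels)

# Two pixels share gray value 200 with weights 0.5 and 0.25, so bin 200
# accumulates 0.75 -- the omega(T) + omega(S) rule described above.
hist = weighted_gray_histogram([200, 200, 10], [0.5, 0.25, 1.0])
print(hist[200], hist[10])  # 0.75 1.0
```

Running this once per side of the hypothetical match line yields the gθ and hθ histograms that enter equation (1).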
According to an embodiment of the present disclosure, the pixel grayscale value difference between the non-damaged region on the first side of the hypothetical match line and the non-damaged region on the second side of the hypothetical match line may be determined, for example, according to the pixel grayscale value distribution of the non-damaged region on the first side of the hypothetical match line and the non-damaged region on the second side of the hypothetical match line and equation (1) above.
Referring back to fig. 5A, according to an embodiment of the present disclosure, in operation S235, for example, the value of tan θ that maximizes χ²(gθ, hθ) may be determined as the slope of the match line. The match line in fig. 5B, calculated according to equation (1), is for example the straight line MN.
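Operation S235 is then an argmax over candidate slopes. In this sketch, `histogram_pair_for` is a caller-supplied helper (an assumption, not from the patent) that builds the left/right gray-value histograms for a given θ:

```python
import numpy as np

def chi_square_difference(g, h):
    """Equation (1): chi-square difference between two gray-value histograms."""
    g, h = g.astype(float), h.astype(float)
    denom = g + h
    mask = denom > 0
    return float(np.sum((g[mask] - h[mask]) ** 2 / denom[mask]))

def best_match_slope(candidate_thetas, histogram_pair_for):
    """Return the theta whose hypothetical match line separates the two
    sides most strongly, i.e. maximizes equation (1)."""
    return max(candidate_thetas,
               key=lambda t: chi_square_difference(*histogram_pair_for(t)))

# Toy helper: only theta = 0.5 yields fully separated histograms.
def toy_pair(theta):
    if theta == 0.5:
        return np.array([5, 0]), np.array([0, 5])
    return np.array([3, 2]), np.array([3, 2])

print(best_match_slope([0.1, 0.5, 0.9], toy_pair))  # 0.5
```

The selected θ gives the slope tan θ of the match line, such as the straight line MN in fig. 5B.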
Referring back to fig. 2, in operation S240, for example, in the scenario shown in fig. 5B, if the match line determined in operation S230 is the straight line MN and the color component of the pixel gray value in the first non-damaged region is, for example, (A1, A2, A3), the damaged region on the left side of the straight line MN is filled with a color whose color component is (A1, A2, A3).
Fig. 7A schematically illustrates a schematic diagram of repairing the image to be repaired illustrated in fig. 5B based on the color components of the gray values of the pixels in the first non-damaged region and the second non-damaged region according to an embodiment of the present disclosure.
As shown in fig. 7A, the match line in fig. 5B is the straight line MN, the color component of the non-damaged area on the left side of the straight line MN is (A1, A2, A3), and the damaged area on the left side of the straight line MN is filled with a color whose color component is (A1, A2, A3).
According to an embodiment of the present disclosure, the repair of the damaged region in the parallelogram MNHKM, for example, may be continued according to the method described above to obtain an undamaged image, which is shown in fig. 7B.
Fig. 8A schematically illustrates a schematic diagram of an image to be repaired according to another embodiment of the present disclosure.
Fig. 8B schematically illustrates an image of fig. 8A after the image to be repaired is repaired according to another embodiment of the present disclosure.
As shown in fig. 8A, the image to be repaired is a barcode image, and a partial area in the barcode image is blocked, so that the barcode cannot be recognized.
According to an embodiment of the present disclosure, the barcode image in fig. 8A may be repaired by the method illustrated in fig. 2, for example.
As shown in fig. 8B, after the barcode shown in fig. 8A is repaired, the blocked portion in the barcode image is repaired, so that the electronic device can recognize the barcode shown in fig. 8B.
Fig. 9 schematically shows a flow chart of an image processing method according to another embodiment of the present disclosure.
As shown in fig. 9, the method further includes operations S910 to S930 based on the foregoing embodiment.
In operation S910, one of the plurality of original images is selected as an original image to be restored.
In operation S920, one to-be-repaired image is determined for each row of pixels in the to-be-repaired original image.
In operation S930, the original image to be restored is restored based on a plurality of the images to be restored of the original image to be restored.
An implementation of the image processing method according to an embodiment of the present disclosure is schematically illustrated below with reference to fig. 4.
According to an embodiment of the present disclosure, in operation S910, for example, an original image may be selected from the original images 410 to 450 as an original image to be restored. For example, the original image to be restored may be the original image 410, and the lines 10 to 150 of the pixels in the original image 410 to be restored are damaged lines.
According to an embodiment of the present disclosure, in operation S920, for example, according to the method described in fig. 3, the images to be repaired corresponding to the 10 th row to the 150 th row are determined.
According to the embodiment of the disclosure, determining the image to be repaired corresponding to the pixels of the 100th row may, for example, consist in determining the position of each pixel of the 100th row of the first original image 410 in each of the images 410 to 450, and connecting the positions of each pixel across the original images, so as to form an image to be repaired comprising the track of each pixel. As shown in fig. 4, the image to be repaired includes the tracks formed by the pixels in the white occluded area.
In this embodiment, the first row of pixels in the image to be repaired may be the 100 th row of pixels of the first image 410, the second row of pixels in the image to be repaired may be the 100 th row of pixels of the second image 420, and similarly, the last row of pixels in the image to be repaired may be the 100 th row of pixels of the last image.
According to a similar method for obtaining the image to be repaired, the images to be repaired corresponding to the 10 th line to the 150 th line are respectively obtained.
According to the embodiment of the present disclosure, in operation S930, for example, the images to be repaired corresponding to the 10th to 150th rows may be repaired respectively, so as to determine the standard pixel gray values of the damaged pixels in the plurality of damaged rows of the original image 410 to be restored, and the damaged pixels are then repaired by setting the color components of their pixel gray values to the standard pixel gray values.
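The row-stacking of operation S920 can be sketched as follows. The originals are assumed to be same-sized numpy arrays in acquisition order; the function name is illustrative.

```python
import numpy as np

def image_to_repair_for_row(originals, row):
    """Take row `row` from every original image, in acquisition order, so
    the k-th row of the result is the `row`-th row of the k-th original --
    the per-row image to be repaired described above."""
    return np.stack([img[row] for img in originals], axis=0)

# Three 4x5 "originals" filled with their own index: row 2 of original k
# becomes row k of the stacked image to be repaired.
originals = [np.full((4, 5), k) for k in range(3)]
stacked = image_to_repair_for_row(originals, 2)
print(stacked.shape)       # (3, 5)
print(int(stacked[1, 0]))  # 1
```

Repairing each such stacked image and writing the repaired row back into the original (operation S930) restores all damaged rows of the original image.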
Fig. 10 schematically shows a block diagram of an image processing apparatus 1000 according to an embodiment of the present disclosure.
As shown in fig. 10, the image processing apparatus 1000 includes an acquisition module 1010, a first determination module 1020, a second determination module 1030, and a first repair module 1040.
The obtaining module 1010, for example, performs the operation S210 described above with reference to fig. 2, to obtain an image to be repaired, where the image to be repaired includes a plurality of sub-regions, and pixels in each of the plurality of sub-regions have the same pixel grayscale value.
The first determining module 1020, for example, performs operation S220 described above with reference to fig. 2, and is configured to determine at least one damaged area of the plurality of sub-areas.
The second determining module 1030, for example, performs operation S230 described above with reference to fig. 2, and is configured to determine a match line intersecting the damaged region based on the damaged region, wherein the pixel gray value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest.
The first repairing module 1040, for example, performs operation S240 described above with reference to fig. 2, and is configured to repair the image to be repaired based on the color components of the pixel gray values in the first non-damaged area and the second non-damaged area.
According to an embodiment of the present disclosure, the obtaining module 1010 includes: an acquisition sub-module configured to acquire a plurality of original images, which are images of a target object acquired by an image acquisition device every time the image acquisition device moves a preset distance in a specific direction, the original images having the same size; the first determining submodule is used for overlapping and arranging the original images according to the acquisition sequence and determining tracks formed by pixels in the same row or the same column in one of the original images at the positions of the original images respectively; and the generation submodule is used for generating the image to be repaired based on the track.
According to an embodiment of the present disclosure, the image processing apparatus further includes: the device comprises a selection module, a restoration module and a processing module, wherein the selection module is used for selecting one of the original images as an original image to be restored, and the original image to be restored comprises a plurality of damaged lines; a third determining module, configured to determine the image to be repaired for the pixels of each of the plurality of damaged rows; and the second restoration module is used for restoring the original image to be restored based on a plurality of images to be restored of the original image to be restored.
According to an embodiment of the present disclosure, the second determining module 1030 includes: a second determining submodule for determining an initial boundary line based on the position of the damaged region; the searching submodule is used for searching whether the non-damaged area on the second side of the initial boundary line in the image to be repaired has abnormal pixels different from the pixel gray value of the pixels in the non-damaged area on the first side of the initial boundary line or not from the initial boundary line; a setting submodule, configured to set a hypothetical match line based on the at least one abnormal pixel if the abnormal pixel occurs; a third determining sub-module for determining a pixel grayscale value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line; and a fourth determination submodule configured to determine a hypothetical match line that maximizes a difference between the non-damaged region on the first side and the non-damaged region on the second side as a match line.
According to an embodiment of the present disclosure, the third determination submodule includes: a setting unit for establishing a parallelogram with the hypothetical match line as a central axis; a first determining unit, configured to determine a first color histogram of pixels in a non-damaged area of a first area on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged area of a second area on a second side of the hypothetical match line, where the first area is an area on the first side of the central axis in the parallelogram and the second area is an area on the second side of the central axis in the parallelogram; and a second determination unit configured to determine a distribution of pixel grayscale values of a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line based on the first color histogram and the second color histogram.
According to an embodiment of the present disclosure, the first determination unit includes: an obtaining subunit, configured to obtain a horizontal distance from a pixel of a non-damaged area in the first area and the second area to the central axis, where the horizontal distance indicates a distance from the pixel to another pixel on the central axis that is located in the same row as the pixel; and the determining subunit is used for taking the horizontal distance as the weight of the gray value of the pixel, and establishing a first color histogram of a non-damaged area in the first area and a second color histogram of the non-damaged area in the second area according to the weight and the gray value of the pixel.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any number of the obtaining module 1010, the first determining module 1020, the second determining module 1030, and the first repairing module 1040 may be combined and implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the obtaining module 1010, the first determining module 1020, the second determining module 1030, and the first repairing module 1040 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented by any one of three implementations of software, hardware, and firmware, or by a suitable combination of any of them. Alternatively, at least one of the obtaining module 1010, the first determining module 1020, the second determining module 1030, and the first repairing module 1040 may be at least partially implemented as a computer program module, which when executed, may perform a corresponding function.
Fig. 11 schematically shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure. The image processing system shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 11, the image processing apparatus 1100 according to the embodiment of the present disclosure includes a processor 1101, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1102 or a program loaded from a storage section 1108 into a Random Access Memory (RAM) 1103. The processor 1101 may comprise, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), among others. The processor 1101 may also include on-board memory for caching purposes. The processor 1101 may comprise a single processing unit or a plurality of processing units for performing the different actions of the method flows according to the embodiments of the present disclosure.
In the RAM 1103, various programs and data necessary for the operation of the system 1100 are stored. The processor 1101, the ROM 1102, and the RAM 1103 are connected to each other by a bus 1104. The processor 1101 performs various operations of the method flow according to the embodiments of the present disclosure by executing programs in the ROM 1102 and/or the RAM 1103. It is noted that the programs may also be stored in one or more memories other than the ROM 1102 and RAM 1103. The processor 1101 may also perform various operations of the method flows according to the embodiments of the present disclosure by executing programs stored in the one or more memories.
System 1100 may also include an input/output (I/O) interface 1105, which input/output (I/O) interface 1105 is also connected to bus 1104, according to an embodiment of the present disclosure. The system 1100 may also include one or more of the following components connected to the I/O interface 1105: an input portion 1106 including a keyboard, mouse, and the like; an output portion 1107 including a signal output unit such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 1108 including a hard disk and the like; and a communication section 1109 including a network interface card such as a LAN card, a modem, or the like. The communication section 1109 performs communication processing via a network such as the internet. A driver 1110 is also connected to the I/O interface 1105 as necessary. A removable medium 1111 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1110 as necessary, so that a computer program read out therefrom is mounted into the storage section 1108 as necessary.
According to embodiments of the present disclosure, method flows according to embodiments of the present disclosure may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable storage medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 1109 and/or installed from the removable medium 1111. The computer program, when executed by the processor 1101, performs the above-described functions defined in the system of the embodiment of the present disclosure. The systems, devices, apparatuses, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the present disclosure.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, a computer-readable storage medium may include the ROM 1102 and/or the RAM 1103 and/or one or more memories other than the ROM 1102 and the RAM 1103 described above.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described separately above, this does not mean that the measures in the embodiments cannot be used in advantageous combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.

Claims (11)

1. An image processing method comprising:
acquiring an image to be repaired, wherein the image to be repaired comprises a plurality of sub-areas, and pixels in each sub-area of the plurality of sub-areas have the same pixel gray value;
determining at least one damaged region of the plurality of sub-regions;
determining a match line intersecting the damaged region based on the damaged region, wherein a pixel gray scale value difference of a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is the largest; and
repairing the image to be repaired based on the color components of the pixel gray values in the first non-damaged area and the second non-damaged area.
2. The method of claim 1, wherein the acquiring the image to be repaired comprises:
acquiring a plurality of original images, wherein the original images are images of a target object acquired by an image acquisition device every time the image acquisition device moves a preset distance in a specific direction, and the original images have the same size;
overlapping and arranging the plurality of original images according to an acquisition sequence, and determining tracks formed by pixels in the same row or the same column in one original image in the plurality of original images at the positions of the plurality of original images respectively; and
generating the image to be repaired based on the track.
3. The method of claim 2, further comprising:
selecting one of the original images as an original image to be repaired, wherein the original image to be repaired comprises a plurality of damaged lines;
determining one of the images to be repaired for the pixels of each of the plurality of damaged lines; and
restoring the original image to be restored based on a plurality of images to be restored of the original image to be restored.
4. The method of claim 1, wherein the determining, based on the damaged region, a match line that intersects the damaged region comprises:
determining an initial boundary line based on the location of the damaged area;
searching whether an abnormal pixel with a pixel gray value different from that of a pixel in a non-damaged area on the first side of the initial boundary line appears in a non-damaged area on the second side of the initial boundary line in the image to be repaired from the initial boundary line;
setting a hypothetical match line based on the at least one anomalous pixel if the anomalous pixel occurs;
determining a pixel grayscale value difference between a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line; and
determining a hypothetical match line that maximizes the difference between the non-damaged region of the first side and the non-damaged region of the second side as a match line.
5. The method of claim 4, wherein the determining a pixel grayscale value difference between a non-damaged region of a first side of the hypothetical match line and a non-damaged region of a second side of the hypothetical match line comprises:
establishing a parallelogram with the assumed match line as a central axis;
determining a first color histogram of pixels in a non-damaged region of a first region on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged region of a second region on a second side of the hypothetical match line, wherein the first region is a region on the first side of the central axis in the parallelogram and the second region is a region on the second side of the central axis in the parallelogram; and
determining a distribution of pixel grayscale values for a non-damaged region on a first side of the hypothetical match line and a non-damaged region on a second side of the hypothetical match line based on the first color histogram and the second color histogram.
6. The method of claim 5, wherein the determining a first color histogram of pixels in a non-damaged region in a first region on a first side of the hypothetical match line and a second color histogram of pixels in a non-damaged region in a second region on a second side of the hypothetical match line comprises:
obtaining, for each pixel in the non-damaged areas of the first region and the second region, a horizontal distance from the pixel to the central axis, wherein the horizontal distance is the distance from the pixel to the pixel of the central axis in the same row; and
taking the horizontal distance as the weight of the pixel's grayscale value, and establishing the first color histogram of the non-damaged area in the first region and the second color histogram of the non-damaged area in the second region according to the weights and the grayscale values of the pixels.
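The distance-weighted histogram of claim 6 can be sketched as follows. Reading the claim literally, each pixel contributes its horizontal distance to the axis as its histogram weight; the function name, the per-row `axis_cols` representation of the (possibly slanted) central axis, and the bin count are assumptions for illustration.

```python
import numpy as np

def weighted_histogram(region, axis_cols, bins=16):
    """Build a gray-value histogram of `region` where each pixel is weighted
    by its horizontal distance to the axis pixel in the same row.
    axis_cols[r] is the column of the central axis in row r."""
    rows, cols = region.shape
    col_idx = np.arange(cols)[None, :].repeat(rows, axis=0)
    # Horizontal distance: |pixel column - axis column of the same row|.
    weights = np.abs(col_idx - np.asarray(axis_cols)[:, None])
    hist, _ = np.histogram(region, bins=bins, range=(0, 256), weights=weights)
    return hist
```

With this weighting, pixels far from the candidate line dominate the histogram, while pixels adjacent to it (weight near zero) contribute little, which makes the side-to-side comparison less sensitive to noise right at the boundary.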
7. An image processing apparatus comprising:
an acquisition module configured to acquire an image to be repaired, wherein the image to be repaired comprises a plurality of sub-areas, and pixels in each of the plurality of sub-areas have the same pixel grayscale value;
a first determining module for determining at least one damaged area of the plurality of sub-areas;
a second determining module configured to determine, based on the damaged region, a match line intersecting the damaged region, wherein a pixel grayscale value difference between a first non-damaged region on a first side of the match line and a second non-damaged region on a second side of the match line is largest; and
a first repairing module configured to repair the image to be repaired based on color components of the pixel grayscale values in the first non-damaged region and the second non-damaged region.
8. The apparatus of claim 7, wherein the acquisition module comprises:
an acquisition sub-module configured to acquire a plurality of original images of the same size, the original images being images of a target object acquired by an image acquisition device each time the image acquisition device moves a preset distance in a specific direction;
a first determining sub-module configured to arrange the original images in an overlapping manner according to the acquisition order, and to determine the track formed by the positions, in the respective original images, of the pixels of the same row or the same column of one of the original images; and
a generating sub-module configured to generate the image to be repaired based on the track.
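The stacking step of claim 8 can be sketched directly: because the camera shifts a fixed distance between shots, following one row through the originals in acquisition order traces the track the claim describes, and stacking those rows yields the composite image to be repaired. The function name and the assumption of a purely vertical, one-row-per-shot displacement are illustrative simplifications.

```python
import numpy as np

def build_image_to_repair(originals, row):
    """Extract the same row from each same-size original image, in
    acquisition order, and stack the rows into the image to be repaired."""
    return np.stack([img[row] for img in originals])
```

A damaged sensor row then shows up in this composite as a damaged region crossing every stacked line, which is the damaged area the match-line search of the earlier claims operates on.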
9. The apparatus of claim 8, further comprising:
a selection module configured to select one of the original images as an original image to be repaired, wherein the original image to be repaired comprises a plurality of damaged rows;
a third determining module configured to determine an image to be repaired for the pixels of each of the plurality of damaged rows; and
a second repairing module configured to repair the original image to be repaired based on the plurality of images to be repaired determined for the original image to be repaired.
10. An image processing apparatus comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform the method of any of claims 1-6.
11. A computer readable storage medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 6.
CN201910582938.1A 2019-06-28 2019-06-28 Image processing method, image processing apparatus, and readable storage medium Active CN112150373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910582938.1A CN112150373B (en) 2019-06-28 2019-06-28 Image processing method, image processing apparatus, and readable storage medium


Publications (2)

Publication Number Publication Date
CN112150373A (en) 2020-12-29
CN112150373B (en) 2023-09-26

Family

ID=73891369


Country Status (1)

Country Link
CN (1) CN112150373B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990016366A (en) * 1997-08-14 1999-03-05 윤종용 How to restore damaged image
US20140064614A1 (en) * 2012-09-06 2014-03-06 Cyberlink Corp. Systems and Methods for Multi-Resolution Inpainting
CN104966279A (en) * 2015-06-15 2015-10-07 鲁东大学 Image synthesis restoration method based on local structure features
WO2018019194A1 (en) * 2016-07-27 2018-02-01 腾讯科技 (深圳) 有限公司 Image recognition method, terminal, and nonvolatile storage medium
CN109272526A (en) * 2017-07-17 2019-01-25 北京京东尚科信息技术有限公司 Image processing method, system and electronic equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LÜ Yongli; JIANG Bin; BAO Jianrong: "Efficient wavelet image inpainting algorithm based on pixel weights", Information and Control (信息与控制), no. 01 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651897A (en) * 2020-12-30 2021-04-13 成都星时代宇航科技有限公司 Pixel repairing method and device, electronic equipment and computer readable storage medium
CN112651897B (en) * 2020-12-30 2024-05-03 成都星时代宇航科技有限公司 Pixel repairing method, device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN112150373B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN110758246A (en) Automatic parking method and device
CN109479082A (en) Image processing method and device
JP2009048629A (en) Detecting method
US9508000B2 (en) Object recognition apparatus
CN110264523B (en) Method and equipment for determining position information of target image in test image
EP4151951B1 (en) Vehicle localization method and device, electronic device and storage medium
CN112950725A (en) Monitoring camera parameter calibration method and device
CN110909620A (en) Vehicle detection method and device, electronic equipment and storage medium
CN115994899A (en) Bolt loosening detection method, device and detection equipment
CN111191482B (en) Brake lamp identification method and device and electronic equipment
CN112150373B (en) Image processing method, image processing apparatus, and readable storage medium
CN112001357B (en) Target identification detection method and system
CN114332349A (en) Binocular structured light edge reconstruction method and system and storage medium
CN110827340B (en) Map updating method, device and storage medium
CN116017129A (en) Method, device, system, equipment and medium for adjusting angle of light supplementing lamp
CN115249345A (en) Traffic jam detection method based on oblique photography three-dimensional live-action map
CN115018926A (en) Method, device and equipment for determining pitch angle of vehicle-mounted camera and storage medium
CN113255405B (en) Parking space line identification method and system, parking space line identification equipment and storage medium
CN113147746A (en) Method and device for detecting ramp parking space
CN115249407A (en) Indicating lamp state identification method and device, electronic equipment, storage medium and product
CN113487594B (en) Sub-pixel corner detection method, system and medium based on deep learning
CN113379591B (en) Speed determination method, speed determination device, electronic device and storage medium
CN112711973B (en) Assessment method and device for key point detection algorithm
CN116402871B (en) Monocular distance measurement method and system based on scene parallel elements and electronic equipment
JP7435258B2 (en) Vehicle speed detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant