CN113570518B - Image correction method, system, computer equipment and storage medium - Google Patents
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
Abstract
The application discloses an image correction method, an image correction system, a computer device, and a computer-readable storage medium. The image correction method comprises the following steps. Hough transform step: according to the acquired distorted first image, find the boundaries and corner points of the distorted first image using the Hough transform, and then determine the boundary range of the distorted first image. Pixel reduction step: traverse all pixel points within the boundary range of the distorted first image, perform angle reduction and distance reduction on the pixel points of the distorted image, and thereby restore the distorted image to a second image free of distortion. Pixel complement step: fill the vacant pixels of partial regions of the second image based on nearest-neighbor interpolation to obtain the restored original image. The application processes each pixel point one by one, so that the restored picture is closer to the real picture.
Description
Technical Field
The present application relates to the field of image correction processing technology, and in particular, to an image correction method, an image correction system, a computer device, and a computer readable storage medium.
Background
Image correction refers to a restorative process performed on a distorted image. Image distortion may be caused by aberration, distortion, and bandwidth limitation of the imaging system; by geometric distortion due to the imaging pose and scanning nonlinearity of the imaging device; or by motion blur, radiometric distortion, introduced noise, and the like. The basic idea of image correction is to build a mathematical model according to the cause of the distortion, extract the required information from the contaminated or distorted image signal, and restore the original appearance of the image by inverting the distortion process. The actual restoration process is to design a filter that computes an estimate of the real image from the distorted image such that, under a predetermined error criterion, the estimate approximates the real image as closely as possible. The image correction methods adopted in the prior art are divided into geometric correction methods and gray-level correction methods.
Fig. 1 is a schematic diagram of a normal business card in the prior art, and Fig. 2 is a schematic diagram of a business card with shooting distortion. As shown in Fig. 1 and Fig. 2, oblique image correction is an important step in business card recognition: image distortion caused by the shooting angle turns the rectangular picture into a trapezoid, which greatly interferes with business card recognition.
Currently, the related technology has the following bottleneck, for which no effective solution has yet been proposed:
in the correction methods adopted in the prior art, if the corrected area contains content, obvious abrupt changes can occur in the triangulated area; because the stretching ratios of the left and right triangles are different, the forced stretching leaves the content of the adjacent regions of the two triangles uncoordinated.
The application provides an image correction solution in which every pixel-level point can be processed one by one, so that the problem of obvious abrupt changes in the triangulated area is solved and the restored picture is closer to the real picture.
Disclosure of Invention
Embodiments of the present application provide an image correction method, system, computer device, and computer-readable storage medium to at least solve the problem of abrupt image changes in the related art.
In a first aspect, an embodiment of the present application provides an image correction method, including the following steps:
Hough transform step: according to the acquired distorted first image, find the boundaries and corner points of the distorted first image using the Hough transform, and then determine the boundary range of the distorted first image;
pixel reduction step: traverse all pixel points within the boundary range of the distorted first image, perform angle reduction and distance reduction on the pixel points of the distorted image, and thereby restore the distorted image to a second image free of distortion;
pixel complement step: fill the vacant pixels of partial regions of the second image based on nearest-neighbor interpolation to obtain the restored original image.
In some embodiments, one corner point of the first image is set as the origin, one of the sides of the first image connected to the origin is set as the bottom edge, the included angle between the bottom edge and the side formed from the pixel point to the origin is θ, and the distance from the pixel point to the origin is d; the pixel point is then denoted (θ, d).
In some embodiments, the angle reduction of the i-th pixel point uses the following formula:
where θi is the included angle between the bottom edge and the side from the i-th pixel point to the origin, and θ′ is the included angle between the bottom edge of the rectangle and the side from the i-th pixel point to the origin after the pixel point is restored to the rectangle.
In some of these embodiments, the distance reduction of the i-th pixel point uses the following formulas. Reduction formula for one corner of the first image:
Reduction formula for the other corner of the first image:
where w is the length of the bottom edge and h is the length of the other side connected to the origin.
In a second aspect, an embodiment of the present application provides an image correction system, which adopts the image correction method described above and includes the following modules:
a Hough transform module: according to the acquired distorted first image, find the boundaries and corner points of the distorted first image using the Hough transform, and then determine the boundary range of the distorted first image;
a pixel reduction module: traverse all pixel points within the boundary range of the distorted first image, perform angle reduction and distance reduction on the pixel points of the distorted first image, and thereby restore the distorted image to a second image free of distortion;
a pixel complement module: fill the vacant pixels of partial regions of the second image based on nearest-neighbor interpolation to obtain the restored original image.
In some embodiments, one corner point of the first image is set as the origin, one of the sides of the first image connected to the origin is set as the bottom edge, the included angle between the bottom edge and the side formed from the pixel point to the origin is θ, and the distance from the pixel point to the origin is d; the pixel point is then denoted (θ, d).
In some embodiments, the angle reduction of the i-th pixel point uses the following formula:
where θi is the included angle between the bottom edge and the side from the i-th pixel point to the origin, and θ′ is the included angle between the bottom edge of the rectangle and the side from the i-th pixel point to the origin after the pixel point is restored to the rectangle.
In some of these embodiments, the distance reduction of the i-th pixel point uses the following formulas. Reduction formula for one corner of the first image:
Reduction formula for the other corner of the first image:
where w is the length of the bottom edge and h is the length of the other side connected to the origin.
In a third aspect, an embodiment of the present application provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the image correction method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the image correction method as described in the first aspect above.
Compared with the prior art, the image correction method, system, computer device, and computer-readable storage medium provided by the embodiments of the application process every pixel-level point one by one, solving the problem of obvious abrupt changes in the triangulated region and making the restored picture closer to the real picture.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below, so as to provide a more thorough understanding of the other features, objects, and advantages of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic diagram of a prior art normal business card;
FIG. 2 is a schematic view of a business card with shooting distortion;
FIG. 3 is a schematic flow chart of the method of the present application;
fig. 4 is a schematic diagram of a hough transform method according to the prior art;
FIG. 5 is a schematic diagram illustrating the principle of image restoration according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an embodiment;
FIG. 7 is a schematic diagram of a system architecture of the present application;
fig. 8 is a schematic diagram of a hardware configuration of a computer device.
In the above figures:
10. Hough transform module; 20. pixel reduction module; 30. pixel complement module;
81. A processor; 82. a memory; 83. a communication interface; 80. a bus.
Detailed Description
The present application will be described and illustrated with reference to the accompanying drawings and examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by a person of ordinary skill in the art based on the embodiments provided by the present application without making any inventive effort, are intended to fall within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art can apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and lengthy, it would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the described embodiments of the application can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," and similar referents in the context of the application are not to be construed as limiting the quantity, but rather as singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in connection with the present application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
The method, system, device, or computer-readable storage medium of the application can process every pixel-level point one by one, so that the problem of obvious abrupt changes in the triangulated region is solved and the restored picture is closer to the real picture.
The application adopts the Hough transform method, which is briefly described as follows:
The Hough Transform is one of the basic methods in image processing for detecting geometric shapes in an image. It is widely used and has many improved variants. It is mainly used to separate geometric shapes with certain common features (e.g., straight lines, circles) from an image. The most basic Hough transform detects straight lines (line segments) in a black-and-white image.
Suppose a straight line is drawn on a black-and-white image and its position is required. The equation of a straight line can be expressed as y = kx + b, where the parameters k and b are the slope and intercept, respectively. All straight lines passing through a point (x0, y0) have parameters satisfying y0 = k·x0 + b; that is, the point (x0, y0) determines a family of straight lines. The equation y0 = k·x0 + b is itself a straight line in the k-b parameter plane (it can also be written as b = −x0·k + y0). Thus, a foreground pixel point in the x-y image plane corresponds to a straight line in the parameter plane. To illustrate, let y = x be a straight line on the image and take three points on it: A(0, 0), B(1, 1), C(2, 2). The parameters of the straight lines passing through point A satisfy b = 0, those through point B satisfy 1 = k + b, and those through point C satisfy 2 = 2k + b. These three equations correspond to three straight lines in the parameter plane, which intersect at the single point (k = 1, b = 0). Similarly, the other points on the line y = x in the original image (e.g., (3, 3), (4, 4)) correspond to straight lines in the parameter plane that also pass through the point (k = 1, b = 0).
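The worked example above can be checked numerically with a minimal sketch (illustrative only, not part of the patent): each image point (x0, y0) contributes the parameter-plane constraint b = −x0·k + y0, and only (k = 1, b = 0) satisfies all three constraints at once.

```python
# Each image point (x0, y0) maps to the parameter-plane line b = -x0*k + y0.
points = [(0, 0), (1, 1), (2, 2)]  # three points on the image line y = x

def on_all_parameter_lines(k, b, pts):
    """True if (k, b) satisfies the constraint y0 = k*x0 + b of every point."""
    return all(y0 == k * x0 + b for x0, y0 in pts)

print(on_all_parameter_lines(1, 0, points))  # True: the common intersection
print(on_all_parameter_lines(2, 0, points))  # False: only the line through A
```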
First, a buffer corresponding to the parameter plane is initialized and all of its data are set to 0.
For each foreground point on the image, the corresponding straight line in the parameter plane is computed, and the counts of all cells on that line are incremented. Finally, the cell with the largest count on the parameter plane is found; its position gives the parameters of the straight line in the original image. This is the basic idea of the Hough transform: points on the image plane are mapped to lines on the parameter plane, and the problem is finally solved through statistics. If there are two straight lines on the image plane, two peak cells will eventually be seen on the parameter plane, and so on.
In practical applications, a linear equation of the form y = k·x + b cannot represent a vertical line of the form x = c (whose slope is infinite). In practice, the parametric equation ρ = x·cos(θ) + y·sin(θ) is therefore used. A point on the image plane then corresponds to a curve in the ρ-θ parameter plane; everything else is the same.
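As an illustration of the voting procedure described above (a sketch with an assumed simple discretization; the function name and parameters are not from the patent), the ρ-θ accumulator can be written as:

```python
import numpy as np

def hough_lines(points, img_size, n_theta=180):
    """Accumulate votes in the (rho, theta) parameter plane.

    Each foreground point (x, y) traces the curve
    rho = x*cos(theta) + y*sin(theta); collinear points yield curves
    that intersect in one accumulator cell, which becomes the peak.
    """
    h, w = img_size
    diag = int(np.ceil(np.hypot(h, w)))          # rho ranges over [-diag, diag]
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        acc[np.round(rho).astype(int) + diag, np.arange(n_theta)] += 1
    return acc, thetas, diag

# Five points on the line y = x; with rho = x*cos(theta) + y*sin(theta),
# the votes pile up at theta = 135 degrees (3*pi/4) and rho = 0.
acc, thetas, diag = hough_lines([(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)], (10, 10))
```

In production code, OpenCV's `cv2.HoughLines` implements the same accumulator idea with further optimizations.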
Now suppose a circle of known radius is to be detected in an image. This problem is more intuitive than the previous one: around each foreground point on the image, draw a circle of the known radius in the parameter plane and accumulate the results. Finally, find the peak point in the parameter plane; its position corresponds to the circle center in the image. In this problem, each point on the image plane corresponds to a circle in the parameter plane.
Now change the problem: if the radius is unknown, find the circles in the image. One approach is to expand the parameter plane into a three-dimensional space; that is, the parameter space becomes the three-dimensional x-y-R space, corresponding to the center and radius of a circle.
Each point on the image plane then corresponds, over all radii, to a circle in the parameter space, which is in fact a cone. Finally, of course, the peak point in the parameter space is found. However, this approach obviously requires a large amount of memory, and the running speed can be a significant problem. Is there a better approach? The images assumed so far are all black-and-white (binary) images; in practice, many such binary images are edge maps extracted from color or gray-scale images. As mentioned above, an image edge carries important direction information in addition to position information, and this is used here. By the nature of a circle, the radius must lie on the line perpendicular to the tangent of the circle, that is, on the normal at any point of the circle. Therefore, to solve the above problem, a two-dimensional parameter space still suffices: for each foreground point on the image, adding its direction information determines a straight line on which the circle center must lie. In this way, the problem becomes much simpler.
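The direction-based center voting just described can be sketched in Python (illustrative only, under the assumption that each edge point carries its gradient angle; the function name and vote range are our own):

```python
import numpy as np

def vote_centers(edge_points, shape, r_max=20):
    """Vote for circle centers using each edge point's gradient direction.

    As described above, the center must lie on the normal at each edge
    point, so a 2-D accumulator suffices even with an unknown radius.
    edge_points: iterable of ((x, y), phi) with phi the gradient angle.
    """
    acc = np.zeros(shape, dtype=np.int32)
    h, w = shape
    for (x, y), phi in edge_points:
        for r in range(1, r_max):
            for s in (1, -1):                      # center may lie on either side
                cx = int(np.rint(x + s * r * np.cos(phi)))
                cy = int(np.rint(y + s * r * np.sin(phi)))
                if 0 <= cx < w and 0 <= cy < h:
                    acc[cy, cx] += 1
    return acc

# Synthetic circle of radius 6 centered at (15, 15); the gradient at each
# sample points radially, so every sample's normal passes through the center.
angles = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
edges = [((15 + 6 * np.cos(a), 15 + 6 * np.sin(a)), a) for a in angles]
acc = vote_centers(edges, (31, 31))
```

On this synthetic input, each of the 36 samples casts one vote on the true center, so the accumulator peaks at (15, 15).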
Fig. 3 is a flow chart of the method of the present application, and as shown in fig. 3, an embodiment of the present application provides an image correction method, which includes the following steps:
hough transform step S10: according to the acquired distorted first image, respectively finding out the boundary and the corner of the distorted first image by using Hough transformation, and then determining the boundary range of the distorted first image;
pixel reduction step S20: traversing all pixel points in the distorted first image boundary range, respectively performing angle reduction and distance reduction on the pixel points in the distorted image, and then reducing the distorted image into a second image which is not distorted;
pixel complement step S30: and complementing the vacant pixels of the partial region of the second image based on the approach difference method to obtain the restored original image.
One corner point of the first image is set as the origin, one of the sides of the first image connected to the origin is set as the bottom edge, the included angle between the bottom edge and the side from the pixel point to the origin is θ, and the distance from the pixel point to the origin is d; the pixel point is then denoted (θ, d).
In some embodiments, the angle reduction of the i-th pixel point uses the following formula:
where θi is the included angle between the bottom edge and the side from the i-th pixel point to the origin, and θ′ is the included angle between the bottom edge of the rectangle and the side from the i-th pixel point to the origin after the pixel point is restored to the rectangle.
In some of these embodiments, the distance reduction of the i-th pixel point uses the following formulas. Reduction formula for one corner of the first image:
Reduction formula for the other corner of the first image:
where w is the length of the bottom edge and h is the length of the other side connected to the origin.
The following describes the image correction process flow in the specific embodiment in detail with reference to the accompanying drawings:
(1) Finding the quadrilateral boundary and corner points using the Hough transform
Fig. 4 is a schematic diagram of a prior-art method based on the Hough transform. As shown in Fig. 4, the prior technical scheme is as follows: the distorted rectangle is divided into two triangular patches for correction, and the distorted quadrilateral (left in Fig. 4) is projected onto the standard A4 rectangle (right in Fig. 4).
The point E′ can be represented by a linear combination with the vectors AB and AD as a basis, and it corresponds to the point E in the original image (where the coefficients a and b are the same). Similarly, the point E can be represented by a combination with CB and CD as a basis.
(2) Restoring each pixel on the polygon to the rectangle point by point
The principle is as follows. Fig. 5 is a schematic diagram of the image restoration principle according to an embodiment of the present application. As shown in Fig. 5, a pixel point on the image is converted into (θ, d), where θ is an angle and d is the distance from the pixel point to the origin, with the lower-left corner point set as the origin. The pixel coordinates (x, y) are thus converted into the (θ, d) representation.
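The (θ, d) conversion described above can be sketched as follows (a minimal illustration; the function name and origin convention are our own assumptions):

```python
import math

def to_polar(x, y, origin=(0.0, 0.0)):
    """Convert pixel coordinates (x, y) to the (theta, d) representation:
    theta is the angle (in degrees) between the bottom edge (x axis) and
    the segment from the origin to the pixel; d is their distance."""
    dx, dy = x - origin[0], y - origin[1]
    return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)

theta, d = to_polar(3, 4)  # a 3-4-5 triangle: d = 5, theta ~ 53.13 degrees
```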
1. angle reduction
Map the angle θi of the i-th pixel point into the 0-90° range, where θi is the current angle and θ′ is the restored angle.
2. Distance reduction
Fig. 6 is a schematic diagram of an embodiment. As shown in Fig. 6:
Method for restoring the lower-right corner:
Method for restoring the upper-left corner:
All pixel points on the picture are traversed according to the actual pixel dimensions of the current picture; if, in a specific embodiment, the picture is 100 × 100 pixels, all 100 × 100 pixel points are traversed and their positions are restored to the rectangle. Afterwards, for the partial regions left in a pixel-vacancy state, the pixels are filled using the nearest-neighbor interpolation method.
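The pixel complement step can be sketched as follows (an illustrative stand-in that fills each vacant pixel from its nearest filled pixel by Manhattan distance; the patent itself does not give code for this step):

```python
import numpy as np

def fill_vacant(img, vacant=-1):
    """Fill each vacant pixel with the value of the nearest filled pixel
    (Manhattan distance) -- a simple stand-in for the adjacent-value
    interpolation used in the completion step."""
    out = img.copy()
    filled = np.argwhere(img != vacant)
    for y, x in np.argwhere(img == vacant):
        dists = np.abs(filled[:, 0] - y) + np.abs(filled[:, 1] - x)
        ny, nx = filled[dists.argmin()]
        out[y, x] = img[ny, nx]
    return out

# Two vacant pixels (marked -1) are filled from their nearest neighbours.
img = np.array([[1,  1, -1],
                [2, -1,  3],
                [4,  4,  4]])
out = fill_vacant(img)
```

For large images, a distance-transform-based fill (e.g., `scipy.ndimage.distance_transform_edt` with `return_indices=True`) achieves the same effect far more efficiently.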
Fig. 7 is a schematic diagram of the system structure of the present application. As shown in Fig. 7, an embodiment of the present application further provides an image correction system 100 that adopts the image correction method described above and includes the following modules: a Hough transform module 10, a pixel reduction module 20, and a pixel complement module 30;
hough transform module 10: according to the acquired distorted first image, respectively finding out the boundary and the corner of the distorted first image by using Hough transformation, and then determining the boundary range of the distorted first image;
the pixel reduction module 20: traversing all pixel points in the distorted first image boundary range, respectively performing angle reduction and distance reduction on the pixel points in the distorted image, and then reducing the distorted image into a second image which is not distorted;
pixel complement module 30: and complementing the vacant pixels of the partial region of the second image based on the approach difference method to obtain the restored original image.
In some embodiments, one corner point of the first image is set as the origin, one of the sides of the first image connected to the origin is set as the bottom edge, the included angle between the bottom edge and the side formed from the pixel point to the origin is θ, and the distance from the pixel point to the origin is d; the pixel point is then denoted (θ, d).
In some embodiments, the angle reduction of the i-th pixel point uses the following formula:
where θi is the included angle between the bottom edge and the side from the i-th pixel point to the origin, and θ′ is the included angle between the bottom edge of the rectangle and the side from the i-th pixel point to the origin after the pixel point is restored to the rectangle.
In some of these embodiments, the distance reduction of the i-th pixel point uses the following formulas. Reduction formula for one corner of the first image:
Reduction formula for the other corner of the first image:
where w is the length of the bottom edge and h is the length of the other side connected to the origin.
In addition, the image correction method of the embodiments of the present application described above may be implemented by a computer device. Fig. 8 is a schematic diagram of a hardware structure of a computer device according to an embodiment of the present application.
The computer device may include a processor 81 and a memory 82 storing computer program instructions.
In particular, the processor 81 may comprise a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC), or may be configured as one or more integrated circuits that implement embodiments of the present application.
Memory 82 may include mass storage for data or instructions. By way of example and not limitation, memory 82 may comprise a hard disk drive (HDD), a floppy disk drive, a solid state drive (SSD), flash memory, an optical disk, a magneto-optical disk, magnetic tape, a universal serial bus (USB) drive, or a combination of two or more of these. The memory 82 may include removable or non-removable (or fixed) media, where appropriate.
The memory 82 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 82 is a non-volatile memory. In a particular embodiment, the memory 82 includes read-only memory (ROM) and random access memory (RAM). Where appropriate, the ROM may be a mask-programmed ROM, a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), an electrically alterable ROM (EAROM), flash memory, or a combination of two or more of these. Where appropriate, the RAM may be a static random access memory (SRAM) or a dynamic random access memory (DRAM), where the DRAM may be a fast page mode DRAM (FPM DRAM), an extended data out DRAM (EDO DRAM), a synchronous DRAM (SDRAM), or the like.
Memory 82 may be used to store or cache various data files that need to be processed and/or communicated, as well as possible computer program instructions for execution by processor 81.
The processor 81 implements any of the image correction methods of the above embodiments by reading and executing the computer program instructions stored in the memory 82.
In some of these embodiments, the computer device may also include a communication interface 83 and a bus 80. As shown in fig. 8, the processor 81, the memory 82, and the communication interface 83 are connected to each other via the bus 80 and perform communication with each other.
The communication interface 83 is used to enable communication between the modules, apparatuses, units, and/or devices in embodiments of the application. The communication interface 83 may also enable data communication with external components, such as external devices, image/data acquisition devices, databases, external storage, and image/data processing workstations.
Bus 80 includes hardware, software, or both, coupling the components of the computer device to each other. Bus 80 includes, but is not limited to, at least one of: a data bus, an address bus, a control bus, an expansion bus, or a local bus. By way of example, and not limitation, bus 80 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Extended Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local bus (VLB), or other suitable bus, or a combination of two or more of these. Bus 80 may include one or more buses, where appropriate. Although embodiments of the application have been described and illustrated with respect to a particular bus, the application contemplates any suitable bus or interconnect.
The computer device may implement the image correction method described in connection with fig. 1.
In addition, in combination with the image correction method of the above embodiments, an embodiment of the present application may also provide a computer-readable storage medium. The computer-readable storage medium has computer program instructions stored thereon; when executed by a processor, the instructions implement any of the image correction methods of the above embodiments.
Compared with the prior art, the image correction method avoids the abrupt changes between regions introduced by conventional restoration methods. It can be applied to distortion caused by the shooting angle in business card recognition and to distortion caused by the camera angle in license plate recognition.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, any combination of them that contains no contradiction should be considered within the scope of this description.
The above examples illustrate only a few embodiments of the application, which are described in detail but are not to be construed as limiting its scope. It should be noted that those skilled in the art can make variations and modifications without departing from the spirit of the application, all of which fall within its scope. Accordingly, the scope of protection of the present application is determined by the appended claims.
Claims (4)
1. An image correction method, comprising the steps of:
a Hough transform step: according to the acquired distorted first image, finding the boundaries and corner points of the distorted first image using the Hough transform, and then determining the boundary range of the distorted first image;
a pixel reduction step: traversing all pixel points within the boundary range of the distorted first image, performing angle reduction and distance reduction on the pixel points of the distorted first image respectively, and then restoring the distorted first image into an undistorted second image;
a pixel complement step: filling in the vacant pixels of partial regions of the second image based on a nearest-neighbor interpolation method to obtain the restored original image;
wherein one corner point of the distorted first image is set as the origin, one edge of the first image connected to the origin is set as the base edge, the included angle between the line from the ith pixel point of the first image to the origin and the base edge is θ_i, the distance from the ith pixel point to the origin is d_i, and the pixel is expressed as (θ_i, d_i);
the angle reduction of the ith pixel point adopts the following formula:
wherein θ_i is the included angle between the line from the ith pixel point to the origin and the base edge; after the pixel points are restored to the rectangle, the restored angle is the angle between the line from the ith pixel point to the origin and the base edge of the rectangle; and the remaining quantity is the angle formed by the boundary at the origin;
the distance reduction of the ith pixel point adopts the following formulas:
a reduction formula for one side of the first image:
a reduction formula for the other side of the first image:
wherein w is the length of the base edge, and h is the length of the other edge connected to the origin.
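The (θ_i, d_i) representation in claim 1 can be sketched in code. The reduction formulas themselves appear only as figures in the original, so this minimal sketch shows only the polar encoding of a pixel about the origin corner; the function and parameter names are illustrative, not taken from the patent.

```python
import math

def to_polar(pixel, origin, base_corner):
    """Encode a pixel as (theta_i, d_i): the angle between the
    origin->pixel line and the base edge, and the distance from
    the pixel to the origin corner."""
    dx, dy = pixel[0] - origin[0], pixel[1] - origin[1]
    d_i = math.hypot(dx, dy)                       # distance to origin
    base = math.atan2(base_corner[1] - origin[1],
                      base_corner[0] - origin[0])  # base-edge direction
    theta_i = math.atan2(dy, dx) - base            # angle to base edge
    return theta_i, d_i
```

For example, with the origin at (0, 0) and the base edge running toward (1, 0), the pixel (1, 1) encodes as an angle of π/4 and a distance of √2.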
2. An image correction system employing the image correction method as claimed in claim 1, comprising the following modules:
a Hough transform module: according to the acquired distorted first image, finding the boundaries and corner points of the distorted first image using the Hough transform, and then determining the boundary range of the distorted first image;
a pixel reduction module: traversing all pixel points within the boundary range of the distorted first image, performing angle reduction and distance reduction on the pixel points of the distorted first image respectively, and then restoring the distorted first image into an undistorted second image;
a pixel complement module: filling in the vacant pixels of partial regions of the second image based on a nearest-neighbor interpolation method to obtain the restored original image;
wherein one corner point of the distorted first image is set as the origin, one edge of the first image connected to the origin is set as the base edge, the included angle between the line from the ith pixel point of the first image to the origin and the base edge is θ_i, the distance from the ith pixel point to the origin is d_i, and the pixel is expressed as (θ_i, d_i);
the angle reduction of the ith pixel point adopts the following formula:
wherein θ_i is the included angle between the line from the ith pixel point to the origin and the base edge; after the pixel points are restored to the rectangle, the restored angle is the angle between the line from the ith pixel point to the origin and the base edge of the rectangle; and the remaining quantity is the angle formed by the boundary at the origin;
the distance reduction of the ith pixel point adopts the following formulas:
a reduction formula for one side of the first image:
a reduction formula for the other side of the first image:
wherein w is the length of the base edge, and h is the length of the other edge connected to the origin.
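The pixel complement module's gap filling can be sketched as a breadth-first nearest-neighbor fill. The patent gives no algorithmic detail for its interpolation beyond "proximity", so this is an assumed stand-in with illustrative names, not the claimed implementation.

```python
from collections import deque

def fill_gaps(img, empty=None):
    """Fill vacant pixels (marked `empty`) with the value of the
    nearest already-filled pixel, expanding outward breadth-first.
    `img` is a list of row lists, modified in place."""
    h, w = len(img), len(img[0])
    # seed the queue with every pixel that already has a value
    queue = deque((r, c) for r in range(h) for c in range(w)
                  if img[r][c] is not empty)
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and img[nr][nc] is empty:
                img[nr][nc] = img[r][c]   # copy nearest filled value
                queue.append((nr, nc))
    return img
```

Because the expansion is breadth-first from all filled pixels at once, each vacant pixel receives the value of (one of) its nearest filled neighbors.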
3. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the image correction method as claimed in claim 1 when executing the computer program.
4. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the image correction method as claimed in claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110830834.5A CN113570518B (en) | 2021-07-22 | 2021-07-22 | Image correction method, system, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113570518A CN113570518A (en) | 2021-10-29 |
CN113570518B true CN113570518B (en) | 2023-11-14 |
Family
ID=78166319
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110830834.5A Active CN113570518B (en) | 2021-07-22 | 2021-07-22 | Image correction method, system, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113570518B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103996172A (en) * | 2014-05-08 | 2014-08-20 | 东北大学 | Fish-eye image correction method based on multistep correction |
CN104657940A (en) * | 2013-11-22 | 2015-05-27 | 中兴通讯股份有限公司 | Method and device for correction remediation and analysis alarm of distorted image |
CN106570834A (en) * | 2016-10-26 | 2017-04-19 | 东南大学 | Image correction method for pixel modulation visible light communication system |
CN110060200A (en) * | 2019-03-18 | 2019-07-26 | 阿里巴巴集团控股有限公司 | Perspective image transform method, device and equipment |
CN110475067A (en) * | 2019-08-26 | 2019-11-19 | Oppo广东移动通信有限公司 | Image processing method and device, electronic equipment, computer readable storage medium |
CN111142751A (en) * | 2019-12-26 | 2020-05-12 | 腾讯科技(深圳)有限公司 | Image processing method and device, intelligent terminal and storage medium |
CN113096192A (en) * | 2021-04-25 | 2021-07-09 | 西安四维图新信息技术有限公司 | Image sensor internal reference calibration method, device, equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
Dai Qin; Wang Yanjie; Han Guangliang. Perspective image correction based on improved Hough transform and perspective transform. Chinese Journal of Liquid Crystals and Displays. 2012, Vol. 27, No. 4, pp. 552-556. *
Zhang Yonghong. Rotation correction method for nameplate OCR images based on Hough transform. Electrical Measurement & Instrumentation. 2015, Vol. 52, No. 8, pp. 125-128. *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10733697B2 (en) | Convolutional neural network for wide-angle camera images | |
CN110414507B (en) | License plate recognition method and device, computer equipment and storage medium | |
CN108805023B (en) | Image detection method, device, computer equipment and storage medium | |
CN111311482B (en) | Background blurring method and device, terminal equipment and storage medium | |
JP4847592B2 (en) | Method and system for correcting distorted document images | |
US8873835B2 (en) | Methods and apparatus for correcting disparity maps using statistical analysis on local neighborhoods | |
US8494297B2 (en) | Automatic detection and mapping of symmetries in an image | |
CN109479082B (en) | Image processing method and apparatus | |
US20160267695A1 (en) | Acceleration of exposure fusion with pixel shaders | |
CN103279952A (en) | Target tracking method and device | |
CN113627428A (en) | Document image correction method and device, storage medium and intelligent terminal device | |
CN111311481A (en) | Background blurring method and device, terminal equipment and storage medium | |
CN111861938A (en) | Image denoising method and device, electronic equipment and readable storage medium | |
CN113570725A (en) | Three-dimensional surface reconstruction method and device based on clustering, server and storage medium | |
CN113506305B (en) | Image enhancement method, semantic segmentation method and device for three-dimensional point cloud data | |
CN111161348B (en) | Object pose estimation method, device and equipment based on monocular camera | |
WO2022199395A1 (en) | Facial liveness detection method, terminal device and computer-readable storage medium | |
US9171227B2 (en) | Apparatus and method extracting feature information of a source image | |
CN113570518B (en) | Image correction method, system, computer equipment and storage medium | |
CN116563172A (en) | VR globalization online education interaction optimization enhancement method and device | |
CN113870190B (en) | Vertical line detection method, device, equipment and storage medium | |
CN114022358A (en) | Image splicing method and device for laser camera and dome camera, and server | |
CN111630569A (en) | Binocular matching method, visual imaging device and device with storage function | |
CN113870292A (en) | Edge detection method and device for depth image and electronic equipment | |
CN110648388A (en) | Scene geometric modeling method, device and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||