CN110599427A - Fisheye image correction method and device and terminal equipment - Google Patents


Info

Publication number
CN110599427A
CN110599427A (application CN201910891227.2A)
Authority
CN
China
Prior art keywords
image
fisheye image
fisheye
circular
mapping table
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910891227.2A
Other languages
Chinese (zh)
Inventor
王晓迪
俞坚才
陈晓辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TP Link Technologies Co Ltd
Original Assignee
TP Link Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TP Link Technologies Co Ltd filed Critical TP Link Technologies Co Ltd
Priority to CN201910891227.2A priority Critical patent/CN110599427A/en
Publication of CN110599427A publication Critical patent/CN110599427A/en
Pending legal-status Critical Current

Classifications

    • G06T3/047
    • G06T5/80
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G06T2207/20072 Graph-based image processing

Abstract

The application belongs to the technical field of image processing and provides a fisheye image correction method, a fisheye image correction device, and terminal equipment. The method comprises the following steps: if the current fisheye image is being corrected for the first time, or the fisheye image parameters have changed, calculating the position mapping relation between a preset corrected image and the circular fisheye image according to the fisheye image parameters and a preset correction algorithm, and generating and storing an image information mapping table from the position mapping relation; otherwise, directly acquiring the pre-stored image information mapping table. The fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image. The image information of the circular fisheye image is then sequentially acquired according to the image information mapping table and used to fill the preset corrected image, yielding a target corrected image corresponding to the circular fisheye image. Embodiments of the application can reduce the consumption of hardware resources while ensuring the accuracy of fisheye image correction.

Description

Fisheye image correction method and device and terminal equipment
Technical Field
The application belongs to the technical field of image processing, and particularly relates to a fisheye image correction method, a fisheye image correction device and terminal equipment.
Background
The fisheye lens is a lens with a short focal length and an angle of view close to or equal to 180°; it is an extreme wide-angle lens. A fisheye camera, formed by pairing an image sensor with a fisheye lens, can capture a circular fisheye image with an ultra-large field of view, enabling panoramic image collection and camera monitoring.
However, the circular fisheye image is severely distorted and has poor observability and visual effect, so it usually needs to be corrected to obtain a rectangular corrected image that accords with visual habits. In the prior art, in order to ensure the accuracy of the corrected image, the position mapping relation between a preset corrected image (a rectangular image to be filled with pixel values) and each circular fisheye image shot by the fisheye camera is calculated in real time every time; the corrected image corresponding to the circular fisheye image is then obtained by filling it with the pixel values at the corresponding positions of the circular fisheye image, and the corrected image is output and displayed.
Because the existing fisheye image correction method must calculate the position mapping relation between the images in real time every time, it involves a large amount of calculation and consumes substantial hardware resources.
Disclosure of Invention
In view of this, embodiments of the present application provide a fisheye image correction method, apparatus, and terminal device, so as to address the prior-art problem of reducing the consumption of hardware resources while ensuring the accuracy of fisheye image correction.
A first aspect of an embodiment of the present application provides a fisheye image rectification method, including:
if the current fisheye image is corrected for the first time or the fisheye image parameters are changed, calculating the position mapping relation between the preset corrected image and the circular fisheye image according to the fisheye image parameters and the preset correction algorithm, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image;
and according to the image information mapping table, sequentially filling the preset correction image with the image information of the obtained circular fisheye image to obtain a target correction image corresponding to the circular fisheye image.
A second aspect of embodiments of the present application provides a fisheye image rectification apparatus, including:
the image information mapping table acquiring unit is used for calculating the position mapping relation between the preset corrected image and the circular fisheye image according to the fisheye image parameters and the preset correction algorithm if the current fisheye image is corrected for the first time or the fisheye image parameters have changed, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image;
and the image information filling unit is used for sequentially acquiring the image information of the circular fisheye image according to the image information mapping table and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image.
A third aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to enable the terminal device to implement the steps of the fisheye image rectification method.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, which, when executed by a processor, causes a terminal device to implement the steps of the fisheye image rectification method as described above.
A fifth aspect of embodiments of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to perform the steps of the fisheye image rectification method.
Compared with the prior art, the embodiments of the application have the following advantages. When the first correction, or a change of the fisheye image parameters, is detected, the circular fisheye image position corresponding to the preset corrected image can be accurately calculated according to the fisheye image parameters and the preset correction algorithm, and an image information mapping table is generated; the image information of the circular fisheye image is then accurately obtained according to the image information mapping table and filled into the preset corrected image, so that the target corrected image corresponding to the circular fisheye image is accurately obtained and the accuracy of fisheye image correction is ensured. Meanwhile, the image information mapping table is stored after being calculated; when the current correction is not the first correction and the fisheye image parameters have not changed, the target corrected image corresponding to the circular fisheye image can be obtained directly from the pre-stored image information mapping table, so that repeated calculation is avoided and the consumption of hardware resources is reduced.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a first fisheye image rectification method according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a circular fisheye image and a corresponding preset correction image according to an embodiment of the disclosure;
fig. 3 is a schematic diagram of a symmetrical bisector of a first preset corrected image and a corresponding symmetrical bisector of a circular fisheye image according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a symmetrical bisector of a second predetermined corrected image and a corresponding symmetrical bisector of a circular fisheye image according to an embodiment of the present application;
fig. 5 is a schematic diagram of a symmetrical bisector of a third preset corrected image and a corresponding symmetrical bisector of a circular fisheye image according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a track calculated by a position mapping relationship between a preset correction image and a circular fisheye image according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a sub-block image partition of a preset corrected image according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram illustrating a position mapping relationship between a first circular fisheye image and a preset correction image according to an embodiment of the present disclosure;
fig. 9 is a schematic diagram illustrating a position mapping relationship between a second circular fisheye image and a preset correction image according to an embodiment of the disclosure;
fig. 10 is a schematic flow chart illustrating an implementation of a second fisheye image rectification method according to an embodiment of the present application;
FIG. 11 is a diagram illustrating an image information mapping table according to an embodiment of the present application;
fig. 12 is a schematic flow chart illustrating an implementation of a third fisheye image rectification method according to an embodiment of the present application;
fig. 13 is a schematic view of a fisheye image rectification device according to an embodiment of the present disclosure;
fig. 14 is a schematic diagram of a terminal device provided in an embodiment of the present application;
fig. 15 is a schematic diagram of another terminal device provided in the embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
The first embodiment is as follows:
fig. 1 shows a schematic flowchart of a first fisheye image rectification method provided in an embodiment of the present application, which is detailed as follows:
in S101, if the current fisheye image parameter is corrected for the first time or the fisheye image parameter changes, calculating a position mapping relation between a preset corrected image and a circular fisheye image according to the fisheye image parameter and a preset correction algorithm, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image.
When fisheye image correction is started for the first time, a prestored image information mapping table does not exist; when the fisheye image parameters are changed, the position mapping relation between the preset correction image and the circular fisheye image is changed. Therefore, when the current first correction is detected or the fish-eye image parameters are changed, the position mapping relation between the preset correction image and the fish-eye image is recalculated, and a new image information mapping table is obtained and stored according to the position mapping relation. If the current correction is not the first correction and the parameters of the fisheye image are not changed, the mapping relationship between the positions of the preset correction image and the circular fisheye image is not changed, and at the moment, the pre-stored image information mapping table is directly obtained without recalculating the mapping relationship between the positions and updating the image information mapping table.
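The cache-or-recompute decision described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `get_mapping_table` and `build_mapping_table` are hypothetical stand-ins for the table construction described later.

```python
# Hypothetical sketch of the decision in S101: recompute the mapping table
# only on first correction or when the fisheye image parameters change;
# otherwise reuse the stored table.

_cached_table = None
_cached_params = None

def get_mapping_table(params, build_mapping_table):
    """Return a (possibly cached) image information mapping table.

    params: tuple of fisheye image parameters, e.g. (resolution, rotation_angle).
    build_mapping_table: callable computing the table from params (stand-in
    for the preset correction algorithm).
    """
    global _cached_table, _cached_params
    first_time = _cached_table is None          # no pre-stored table yet
    params_changed = params != _cached_params   # mapping relation would change
    if first_time or params_changed:
        _cached_table = build_mapping_table(params)
        _cached_params = params
    return _cached_table                        # otherwise: reuse, no recompute
```

Calling this once per frame means the expensive table construction runs only when the parameters actually differ from the previous frame.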
Specifically, the fisheye image parameters include the resolution and image rotation angle of the original image before correction, i.e., the circular fisheye image. The resolution of the preset corrected image is set according to the resolution of the circular fisheye image. Then, for each first pixel point of the preset corrected image, with coordinates (x′, y′), the coordinates (x, y) of the corresponding second pixel point of the circular fisheye image are calculated according to the image rotation angle and the preset correction algorithm, thereby obtaining the position mapping relation between the preset corrected image and the circular fisheye image. The first pixel points are pixel points of the preset corrected image, and the second pixel points are pixel points of the circular fisheye image. For example, as shown in Fig. 2, from the coordinates (x_A′, y_A′) of the first pixel point A′ of the preset corrected image and the preset correction formula, the coordinates (x_A, y_A) of the corresponding second pixel point A in the circular fisheye image are calculated, which gives the position mapping relation between the first pixel point A′ in the preset corrected image and the second pixel point A of the circular fisheye image.
The position mapping relation between the preset corrected image and the circular fisheye image is shown in Table 1.
Table 1:
Coordinates of the first pixel point    Coordinates of the second pixel point
(x_a′, y_a′)    (x_a, y_a)
(x_b′, y_b′)    (x_b, y_b)
(x_c′, y_c′)    (x_c, y_c)
(x_d′, y_d′)    (x_d, y_d)
......    ......
Then, the image information mapping table is obtained according to the position mapping relation between the preset corrected image and the circular fisheye image. The image information mapping table may specifically be a target array T, which sequentially stores the address information of the image information of the circular fisheye image corresponding to each pixel point of the current preset corrected image. Specifically, the image information of the circular fisheye image is stored in a continuous memory and can be regarded as an array I, where each element I[i] holds the image information of one pixel point; the image information may be a gray value, YUV data, an RGB value, or the like. Correspondingly, the address information of the image information of the circular fisheye image is the index i into the array I; that is, each element of the array T is an index i into the array I, so the image information of the circular fisheye image stored in I can be obtained by indexing from the start address of the array I with the index i. Taking the first 4 elements of the array T as an example: according to the position mapping relation between the preset corrected image and the circular fisheye image, the first row of the preset corrected image, starting from the origin, contains the four pixel points (x_a′, y_a′), (x_b′, y_b′), (x_c′, y_c′), (x_d′, y_d′), whose correspondence with the coordinates of the second pixel points is shown in Table 1.
As shown in Table 2, in the circular fisheye image the image information corresponding to (x_a, y_a) is I[5], that corresponding to (x_b, y_b) is I[1], that corresponding to (x_c, y_c) is I[3], and that corresponding to (x_d, y_d) is I[7]; the array T therefore begins {5, 1, 3, 7, ...}.
Table 2:
Coordinates of the second pixel point    Image information
(x_a, y_a)    I[5]
(x_b, y_b)    I[1]
(x_c, y_c)    I[3]
(x_d, y_d)    I[7]
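The lookup through the target array T can be sketched as follows — a minimal illustration in which T and I follow the description above, and `fill_corrected_image` is a hypothetical helper name.

```python
# Illustrative sketch: T[k] holds the index into the flat image-information
# array I for the k-th pixel of the preset corrected image, so filling the
# corrected image is one indexed read per output pixel.

def fill_corrected_image(T, I):
    """Return the corrected image's pixel values in raster order."""
    return [I[i] for i in T]

# Example matching the text: if T begins {5, 1, 3, 7}, the first four
# corrected pixels take the values stored at I[5], I[1], I[3], I[7].
```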
optionally, the calculating a position mapping relationship between the preset correction image and the circular fisheye image according to the fisheye image parameter and the preset correction algorithm includes:
according to the fisheye image parameters and a preset correction algorithm, calculating the region position mapping relation between a first region of the preset corrected image and a corresponding second region of the circular fisheye image, wherein the first region is one of the 2^n column regions into which the preset corrected image is equally divided, and the second region is the sector region of the circular fisheye image corresponding to the first region, n being a positive integer;
and performing coordinate conversion calculation according to the region position mapping relation to obtain the position mapping relation between the preset correction image and the circular fisheye image.
Since the circular fisheye image and the rectangular preset corrected image are both symmetrical patterns, the preset corrected image and the circular fisheye image can each be equally divided into 2^n regions; the region position mapping relation is calculated for only one region, and the region position relations of the remaining 2^n − 1 regions are obtained by coordinate symmetry transformation, thereby yielding the position mapping relation between the complete preset corrected image and the circular fisheye image, where n is a positive integer.
As shown in Fig. 3, if n = 1, the preset corrected image is equally divided into two column regions, the region position mapping relation between the first region and the corresponding second region of the circular fisheye image is calculated, and coordinate conversion is performed according to this region position mapping relation between pixel points in the first region and pixel points in the second region, which yields the position mapping relation between pixel points outside the first region of the preset corrected image and pixel points outside the second region of the circular fisheye image. For example, the point P_1′ of the first region in Fig. 3 has coordinates (x_1′, y_1′), and the coordinates (x_1, y_1) of the corresponding point P_1 of the second region are obtained by the preset correction algorithm. Then, by coordinate symmetry transformation, for the point Q_1′(d − x_1′, y_1′) in the non-first region of the preset corrected image that is symmetrical to P_1′ about the straight line x = d/2, the coordinates of the corresponding point Q_1 are (x_1, d − y_1), where d is the diameter of the circular fisheye image.
Similarly, if n = 2, as shown in Fig. 4, the preset corrected image is equally divided into 4 column regions. The point P_2′ of the first region in Fig. 4 has coordinates (x_2′, y_2′), and the coordinates (x_2, y_2) of the corresponding point P_2 of the second region are obtained by the preset correction algorithm. Then, by coordinate symmetry transformation, for the point Q_2′(d/2 − x_2′, y_2′) in the non-first region of the preset corrected image that is symmetrical to P_2′ about the straight line x = d/4, the coordinates of the corresponding point Q_2 are (d − x_2, y_2).
Similarly, if n = 3, as shown in Fig. 5, the preset corrected image is equally divided into 8 column regions. The point P_3′ of the first region in Fig. 5 has coordinates (x_3′, y_3′), and the coordinates (x_3, y_3) of the corresponding point P_3 of the second region are obtained by the preset correction algorithm. Then, by coordinate symmetry transformation, for the point Q_3′(d/4 − x_3′, y_3′) in the non-first region of the preset corrected image that is symmetrical to P_3′ about the straight line x = d/8, the coordinates of the corresponding point Q_3 are (y_3, x_3).
By analogy, the preset corrected image and the corresponding circular fisheye image can be bisected n times according to the set value of n, giving 2^n regions. Preferably, n = 3, i.e., dividing the preset corrected image and the corresponding circular fisheye image into 8 regions, gives the best trade-off in the complexity of the coordinate transformation.
In the embodiments of the present application, the position mapping relation between the complete preset corrected image and the circular fisheye image can be obtained by calculating only the region position mapping relation between the first region of the preset corrected image and the corresponding second region of the circular fisheye image, followed by coordinate conversion, so that the amount of calculation is reduced, the efficiency of fisheye image correction is improved, and the consumption of hardware resources is lowered.
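The n = 1 case of this symmetry reduction can be sketched as follows — a simplified model for illustration only, where `correct_point` is a hypothetical stand-in for the patent's correction formula and the mirror relation follows the text: if P′(x′, y′) maps to (x, y), then Q′(d − x′, y′) maps to (x, d − y).

```python
# Minimal sketch (assumed simplified model) of computing the mapping for
# the left half of the corrected image only, and mirroring for the right
# half, so the expensive formula runs on half the pixels.

def mirrored_mapping(x_p, y_p, d, correct_point):
    """Map a corrected-image point (x_p, y_p) to fisheye coordinates,
    using the symmetry about the line x' = d/2 for the right half."""
    if x_p < d / 2:                    # first (left) region: direct formula
        return correct_point(x_p, y_p)
    x, y = correct_point(d - x_p, y_p) # mirror point lies in the first region
    return (x, d - y)                  # symmetry transform of the result
```

For n = 2 or n = 3, the same idea applies with additional reflection lines, at the cost of slightly more involved coordinate transforms.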
Optionally, the preset correction algorithm includes a preset correction formula, and the calculating the position mapping relationship between the preset correction image and the circular fisheye image according to the fisheye image parameter and the preset correction algorithm includes:
sequentially calculating the pixel coordinates (x, y) of the circular fisheye image corresponding to each pixel coordinate of the preset corrected image, according to the fisheye image parameters and the preset correction formula, in an order that traverses in each pass the pixel coordinates (x′, y′) of the preset corrected image having the same abscissa value and different ordinate values;
wherein the preset correction formula takes one of two forms, both computing (x, y) from (x′, y′) through the trigonometric terms cos(2πx′ + θ) and sin(2πx′ + θ); θ is the image rotation angle, and "·" denotes the multiplication sign.
In the embodiments of the present application, the preset correction algorithm specifically includes a preset correction formula, which yields the corresponding coordinates (x, y) of the circular fisheye image from the coordinates (x′, y′) of the preset corrected image and the image rotation angle θ. When correcting according to the preset correction formula, the parts that consume the most computing power are mainly the trigonometric evaluations, namely cos(2πx′ + θ) and sin(2πx′ + θ). Because the variable of both terms is x′ — that is, as long as x′ is unchanged, the corresponding trigonometric terms are identical and need not be recomputed — the pixel coordinates (x, y) of the circular fisheye image corresponding to the first column of pixel points (x′, y′) can be calculated by keeping the abscissa x′ fixed (for example, x′ = 0) and letting the ordinate traverse [0, d − 1]; then the pixel coordinates (x, y) of the circular fisheye image corresponding to the second column, at the next abscissa x′ (for example, x′ = 1), are calculated; and so on, until all pixel points of the preset corrected image have been traversed.
In the embodiments of the present application, based on the actual computational cost of the preset formula, the traditional row-by-row traversal of the image is changed to a column-by-column traversal in which the value of x′ is held constant within each pass, so that the amount of computation in calculating the position mapping relation is reduced, the efficiency of fisheye image correction is improved, and the consumption of hardware resources is reduced.
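The column-by-column traversal can be sketched as follows, assuming (as the text states) that the trigonometric terms depend only on the column index x′; `radius_map` is a hypothetical stand-in for the remaining, cheap per-pixel part of the correction formula.

```python
import math

# Sketch of the column-major traversal: the expensive cos/sin terms are
# evaluated once per column and reused for every pixel in that column.

def build_mapping_column_major(width, height, theta, radius_map):
    mapping = []
    for x_p in range(width):                    # one column at a time
        c = math.cos(2 * math.pi * x_p + theta) # computed once per column
        s = math.sin(2 * math.pi * x_p + theta)
        for y_p in range(height):               # reuse c, s down the column
            r = radius_map(y_p)                 # cheap per-pixel part
            mapping.append((r * c, r * s))      # fisheye coordinates
    return mapping
```

Compared with a row-major loop, this evaluates width trigonometric pairs instead of width × height pairs, which is the saving the paragraph above describes.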
Optionally, the calculating, according to the fisheye image parameter and the preset correction algorithm, a position mapping relationship between a preset correction image and a circular fisheye image, and obtaining an image information mapping table according to the position mapping relationship for storing includes:
dividing a preset correction image into a plurality of sub-block images;
obtaining a position mapping relation between a first sub-block image and a circular fisheye image according to the fisheye image parameters and a preset correction algorithm, and correspondingly storing mapping information of the first sub-block image to an image information mapping table, wherein the first sub-block image is any one of the sub-block images;
sequentially obtaining the position mapping relation between other sub-block images and the circular fisheye image according to the position mapping relation between the first sub-block image and the circular fisheye image, and correspondingly storing the mapping information of the other sub-block images to the image information mapping table;
and storing the finally obtained image information mapping table.
As shown in Fig. 6, when the image is traversed row by row and the second pixel points of the circular fisheye image corresponding to the first pixel points on each straight line segment parallel to the x-axis are computed in turn by the preset algorithm, this is equivalent to traversing one circumference of the circular fisheye image. Since the image information of the circular fisheye image is usually stored row by row in its storage unit, the image information along such a circumference, obtained by traversing the preset corrected image row by row, is stored neither row by row nor contiguously. That is, the positions of the circular fisheye image recorded in an image information mapping table built in that calculation order are scattered, so when the preset corrected image is filled with image information obtained through the mapping table, each read must fetch from several widely separated storage units, which incurs a high operation cost and consumes more hardware resources.
In the embodiment of the present application, a preset correction image is divided into a plurality of sub-block images (for example, 9 sub-block images as shown in fig. 7), position mapping relation calculation is performed in a single sub-block image each time, mapping information of the sub-blocks is correspondingly stored in an image information mapping table, and finally, each sub-block image is traversed, so that a complete image information mapping table is obtained and stored.
Specifically, after the preset corrected image is divided into a plurality of sub-block images, one of them is arbitrarily selected as the first sub-block image. When calculating the position mapping relation between the first sub-block image and the circular fisheye image, the calculation is performed according to the fisheye image parameters and the preset correction algorithm, sequentially obtaining the position of the second pixel point of the circular fisheye image corresponding to each first pixel point in the first sub-block image. Then the storage addresses of the image information of the corresponding circular fisheye image are acquired in turn from the positions of these second pixel points, thereby generating the mapping information of the first sub-block image, which is stored into the image information mapping table; the mapping information is the sequentially acquired storage address information of the image information of the circular fisheye image corresponding to each first pixel point of the first sub-block image.
And performing coordinate symmetry conversion calculation according to the position mapping relation between the first sub-block image and the circular fisheye image, sequentially obtaining the position mapping relation between other sub-block images and the circular fisheye image, obtaining the mapping information of the sub-block image according to the position mapping relation, and storing the mapping information into an image information mapping table, wherein the mapping information of the sub-block image is the storage address information of the image information of the circular fisheye image corresponding to each first pixel point of the sub-block image.
Through the calculation, after the mapping information of each sub-block image is sequentially stored in the image information mapping table, a complete image information mapping table is finally obtained, and the image information mapping table is stored.
The positions of the corresponding circular fisheye images searched according to the sub-block images are concentrated every time, so that the positions of the corresponding circular fisheye images stored in the image information mapping table obtained in the calculation sequence are concentrated, and then when the preset correction image is filled with the image information of the circular fisheye images obtained according to the image information mapping table, the positions of the storage units for storing the image information read in sequence are closer to each other, so that the operation cost is reduced, and the consumption of hardware resources is reduced.
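The block-wise construction described above can be sketched as follows. This is a minimal illustration: the `src_address` function uses a hypothetical panoramic-unwrap formula as a stand-in for the patent's preset correction algorithm, and the symmetry shortcut between sub-blocks is not reproduced.

```python
import math

def build_mapping_table(d, blocks=3):
    """Build a mapping (corrected pixel -> fisheye linear storage address)
    one sub-block at a time, so that consecutive lookups stay spatially
    local in the fisheye buffer."""
    def src_address(xp, yp):
        # hypothetical unwrap: column -> circle angle, row -> circle radius
        alpha = 2 * math.pi * xp / d
        r = yp / 2
        x = min(max(int(d / 2 + r * math.cos(alpha)), 0), d - 1)
        y = min(max(int(d / 2 + r * math.sin(alpha)), 0), d - 1)
        return y * d + x  # linear storage address in the fisheye buffer

    table = {}
    step = d // blocks
    for by in range(blocks):          # traverse one sub-block at a time
        for bx in range(blocks):
            for yp in range(by * step, (by + 1) * step):
                for xp in range(bx * step, (bx + 1) * step):
                    table[(xp, yp)] = src_address(xp, yp)
    return table
```

Traversing block by block rather than row by row over the whole image is what keeps the addresses written into consecutive table entries clustered.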
Optionally, the fisheye image parameter further includes a YUV format, the image information mapping table is specifically a target array of storage address information for storing YUV data of a circular fisheye image, and the step of calculating a position mapping relationship between a preset corrected image and the circular fisheye image according to the fisheye image parameter and a preset correction algorithm to obtain and store an image information mapping table includes:
calculating the position mapping relation between a preset correction image and the circular fisheye image according to the resolution of the circular fisheye image, the image rotation angle and a preset correction algorithm;
obtaining and storing the image information mapping table according to the position mapping relation and the YUV format;
correspondingly, the sequentially acquiring the image information of the circular fisheye image according to the image information mapping table to fill the preset correction image, and obtaining the target correction image corresponding to the circular fisheye image, including:
and sequentially acquiring YUV data of the circular fisheye image according to the image information mapping table and the YUV format to fill the preset correction image, so as to obtain a target correction image corresponding to the circular fisheye image.
In the embodiment of the present application, the image information of the circular fisheye image is YUV data, which comprises the Y value, U value and V value of each pixel point. Here "Y" represents the brightness (Luminance or Luma), i.e., the gray scale value, while "U" and "V" represent the chromaticity (Chrominance or Chroma), which describes the color and saturation of the image and specifies the color of a pixel. The YUV data of the circular fisheye image is stored continuously in a designated memory unit, and the storage mode is determined by the YUV format in the fisheye image parameters. The YUV format may include the AYUV format, the YUYV format, the YUV format, the IMC1 format, the IMC3 format and the like. Specifically, the sampling category (YUV444, YUV422, YUV420, etc.) and the storage category of the current YUV data can be determined from the YUV format. The storage category comprises a packed format and a planar format: in the packed format, the Y, U and V values of each pixel point are stored continuously in an interleaved arrangement, pixel by pixel; in the planar format, the Y, U and V values are stored separately and continuously in three respective arrays Y, U and V. From the YUV format, the sampling category and storage category of the YUV data can thus be determined, which determines the storage mode of the YUV data in the designated memory unit and hence the storage address information of the YUV data, from which the image information mapping table is established. When the YUV format changes, the storage address information of the YUV data also changes, i.e., the image information mapping table needs to be re-established.
And then, sequentially and accurately acquiring YUV data of the circular fisheye image according to the YUV format and the image information mapping table, and transferring the acquired YUV data to a preset correction image so as to obtain a target correction image corresponding to the circular fisheye image.
Optionally, the image information mapping table in the embodiment of the present application may consist of three tables, namely a Y value mapping table, a U value mapping table and a V value mapping table, which respectively store the storage address information of the Y values, U values and V values in the YUV data. Preferably, the image information mapping table is only one of the Y value mapping table, the U value mapping table and the V value mapping table; when the preset correction image is filled, the complete YUV data corresponding to each pixel point is then calculated from that one mapping table and the YUV format information before being filled in.
Specifically, if the storage category is detected to be the packed format according to the YUV format, that is, when the Y, U and V values in the YUV data are stored continuously in an interleaved arrangement, the image information mapping table may be a mapping table that stores the storage address information of only one of the Y, U and V values (for example, the Y value). The storage address information of that item (e.g., the Y value) corresponding to each pixel point in the preset correction image is then acquired according to the image information mapping table, the storage address information of the adjacently stored other items (e.g., the U value and the V value) is calculated according to the YUV format, and finally the YUV data is acquired according to the complete YUV data storage address information and filled into the preset correction image to obtain the target correction image corresponding to the circular fisheye image. For example, when the YUV format is the YUYV format, the sampling category of the YUV data is determined from the format to be YUV422 and the storage category to be the packed format, with the specific storage mode as shown in the following table:
table 3:
Y00 U00 Y01 V00 Y02 U01 Y03 V01
Y10 U10 Y11 V10 Y12 U11 Y13 V11
Y20 U20 Y21 V20 Y22 U21 Y23 V21
Y30 U30 Y31 V30 Y32 U31 Y33 V31
When the YUV format is the YUYV format, the image information mapping table may be a Y value mapping table storing the storage address information of the Y values only. Besides acquiring the storage address information of a Y value from the Y value mapping table in order to read the Y value of the circular fisheye image, the storage address information of the U value and V value stored adjacent to that Y value is derived from the Y value mapping table and the YUV format information, so that the U value and the V value can be read as well; the complete YUV data of each pixel point of the circular fisheye image is thus obtained and filled into the corresponding pixel point of the preset correction image, yielding the target correction image corresponding to the circular fisheye image.
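The packed-format address derivation just described can be sketched at the byte level for the YUYV layout of Table 3. This helper is illustrative only; the patent itself works with mapping-table entries rather than this function.

```python
def yuyv_addresses(pixel_index):
    """Byte addresses of Y, U and V for one pixel in a packed YUYV stream:
    every two horizontally adjacent pixels share one U and one V byte
    (YUV422 sampling), laid out as Y U Y V per 4-byte group."""
    y_addr = 2 * pixel_index            # Y samples sit at even byte offsets
    pair_base = (pixel_index // 2) * 4  # start of the pixel's Y U Y V group
    return y_addr, pair_base + 1, pair_base + 3
```

For example, pixels 0 and 1 of a row share the U byte at offset 1 and the V byte at offset 3, while pixel 2 starts a new 4-byte group.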
Specifically, if the storage category is detected to be the planar format according to the YUV format, that is, when the Y, U and V values in the YUV data are each stored continuously in separate arrays, the image information mapping table is a Y value mapping table that stores the storage address information of the Y values only, and the initial storage address information of the U values and of the V values is recorded while the mapping table is established. After the storage address information of the Y value corresponding to each pixel point in the preset correction image is obtained from the mapping table, the ratio corresponding to the sampling category is applied to the initial storage address information of the U and V values (each Y value corresponds to one U value and one V value in YUV444, every two Y values in YUV422, and every four Y values in YUV420), the storage address information of the U value and V value of each pixel point is calculated, and finally the complete YUV data of each pixel point of the circular fisheye image is obtained and filled into the corresponding pixel point of the preset correction image, yielding the target correction image corresponding to the circular fisheye image. For example, when the YUV format is the IMC3 format, the sampling category of the YUV data is determined from the format to be YUV420 and the storage category to be the planar format, with the specific storage mode as shown in the following table:
table 4:
Y00 Y01 Y02 Y03 Y04 Y05 Y06 Y07
Y10 Y11 Y12 Y13 Y14 Y15 Y16 Y17
U00 U01 U02 U03
V00 V01 V02 V03
In the above table, the U value corresponding to the Y values at the four storage addresses Y00, Y01, Y10 and Y11 is stored at address U00, and the corresponding V value at address V00; the U value corresponding to the Y values at Y02, Y03, Y12 and Y13 is stored at U01, with the corresponding V value at V01; the U value corresponding to the Y values at Y04, Y05, Y14 and Y15 is stored at U02, with the corresponding V value at V02; and the U value corresponding to the Y values at Y06, Y07, Y16 and Y17 is stored at U03, with the corresponding V value at V03. The image information mapping table only stores the storage address information of the Y values together with the initial storage address information of the U values and of the V values; from this information and the YUV format, the YUV data of the circular fisheye image corresponding to each pixel point of the preset correction image can be obtained, and thus the target correction image corresponding to the circular fisheye image. For example, according to the image information mapping table, the storage address of the Y value of the pixel point of the circular fisheye image corresponding to the third pixel point of the preset correction image is Y02. From the YUV format (the IMC3 format) it is known that the U value corresponding to Y02 is stored at U01. Starting from the recorded initial U value storage address (i.e., the actual address of U00, for example 0000H), the actual storage address of U01 is obtained by adding one unit of storage length; the U value of that pixel point of the circular fisheye image is read from this address, the corresponding V value is obtained in the same way, and the complete YUV data corresponding to the third pixel point is finally acquired for filling.
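The planar-format address derivation for Table 4 can be sketched as follows. This assumes one byte per sample and ignores the row padding that the real IMC3 layout adds to the U and V planes; the function name is illustrative only.

```python
def planar_yuv420_addresses(y_index, width, u_start, v_start):
    """Derive U and V byte addresses from a Y index in a simplified planar
    YUV420 layout, where each 2x2 block of Y samples shares one U and one
    V sample (IMC3 row padding is ignored for clarity)."""
    row, col = divmod(y_index, width)
    chroma_index = (row // 2) * (width // 2) + (col // 2)
    return u_start + chroma_index, v_start + chroma_index
```

For the third pixel of the first row (Y index 2) of an 8-pixel-wide image, the function returns U/V addresses one unit past the recorded initial addresses, matching the U01/V01 entries of Table 4; Y13 in the second row resolves to the same shared chroma pair.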
In the embodiment of the application, the image information mapping table can be accurately constructed according to the YUV format, and may store the storage address information of only one of the Y value, the U value and the V value in the YUV data; this ensures the accuracy of fisheye image correction while saving storage space and reducing the consumption of hardware resources. Moreover, since the address correspondence of the Y, U and V values does not need to be calculated and stored separately when the mapping table is constructed, the amount of calculation required to obtain the table is reduced, which improves the fisheye image correction efficiency and further reduces the consumption of hardware resources.
In S102, according to the image information mapping table, sequentially obtaining the image information of the circular fisheye image to fill the preset correction image, and obtaining a target correction image corresponding to the circular fisheye image.
Each first pixel point of the preset correction image is traversed in sequence; the image information of the second pixel point of the circular fisheye image corresponding to each first pixel point is acquired according to the image information mapping table and filled into that first pixel point, finally yielding the target correction image corresponding to the circular fisheye image. Specifically, when the image information mapping table is an array T, the label information stored in the array T is queried in sequence, the image information storage address of the circular fisheye image is obtained from the label information, the image information is read from that storage address, and it is filled into the corresponding pixel point of the preset correction image.
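The filling step amounts to a gather operation over the array T; a minimal sketch, assuming grayscale image information stored in a flat list I:

```python
def fill_corrected_image(T, I):
    """Gather: for each corrected pixel in traversal order, read the
    fisheye image information at the index stored in the mapping table T."""
    return [I[t] for t in T]
```

Because T is traversed in corrected-image order, the output list is already the corrected image in scanline order.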
In the embodiment of the application, when the first correction is performed or a change of the fisheye image parameters is detected, the circular fisheye image position corresponding to the preset correction image is accurately calculated according to the fisheye image parameters and the preset correction algorithm, and the image information mapping table is generated; the image information of the circular fisheye image is then accurately obtained according to the mapping table and filled into the preset correction image, so that the target correction image corresponding to the circular fisheye image is accurately obtained and the accuracy of fisheye image correction is ensured. Meanwhile, the image information mapping table is stored after being calculated, so that when the correction is not the first correction and the fisheye image parameters have not changed, the target correction image corresponding to the circular fisheye image can be obtained directly from the prestored mapping table; repeated calculation is thereby avoided and the consumption of hardware resources is reduced.
For ease of understanding, the derivation process of the preset correction formulas is described in detail below.
As shown in fig. 8, the diameter of the circular fisheye image is d, and each circle in the circular fisheye image region corresponds to a straight line segment parallel to the x' axis in the preset region composed of x' ∈ [0, d-1] and y' ∈ [0, d-1] in the preset correction image. Each point (x', y') on such a straight line segment corresponds to a point (x, y) on a circle in the circular fisheye image region; that is, on each straight line segment of the preset correction image, as x' traverses from 0 to d-1, exactly one circle of the circular fisheye image is traversed (the circle angle α increases from 0 degrees to 360 degrees). The following is therefore set:
From formula (1) it is obtained that:
Similarly,
Normalizing formulas (2) and (3), i.e., taking d as 1, gives:
Since the fisheye camera can start a cruise mode, the starting axis of α = 0 of the circular fisheye image may differ accordingly. For example, taking the right half axis of L1 in fig. 8 as the starting axis of α = 0 and rotating by an image rotation angle θ = 90 degrees, the circular fisheye image becomes the image shown in fig. 9, whose starting axis of α = 0 is the upper half axis of L2. Therefore, the image rotation angle θ needs to be added to formula (4), obtaining:
the image rotation angle θ is fixed by taking the right half axis of L1 as the starting axis, and may be any angle of 0-360 degrees, and formula (5) is a preset correction formula. In fig. 8, θ is 0, and in fig. 9, θ is 90 degrees.
Further, formula (5) may be optimized. When calculating according to formula (5), if y' in the preset correction image is close to 0, a plurality of pixel points on one straight line segment may all correspond to the same pixel point of the circular fisheye image (i.e., the radius of the corresponding circle of the circular fisheye image is too small), and the correction effect is poor. The central part of the circular fisheye image can therefore be cropped, so that the portion of the circular fisheye image with a diameter smaller than d/5 is not mapped into the preset correction image, obtaining:
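The formula images themselves are not reproduced in this text. Under the geometric setup above (column x' as circle angle, row y' as circle radius), a plausible reconstruction of formulas (1) through (6) is the following sketch, which should be checked against the original patent drawings:

```latex
% (1) column position -> circle angle, row position -> circle radius
\alpha = \frac{2\pi x'}{d}, \qquad r = \frac{y'}{2}
% (2), (3) point on the corresponding circle of the fisheye image
x = \frac{d}{2} + \frac{y'}{2}\cos\!\left(\frac{2\pi x'}{d}\right), \qquad
y = \frac{d}{2} + \frac{y'}{2}\sin\!\left(\frac{2\pi x'}{d}\right)
% (4) normalized, taking d as 1
x = \frac{1}{2} + \frac{y'}{2}\cos(2\pi x'), \qquad
y = \frac{1}{2} + \frac{y'}{2}\sin(2\pi x')
% (5) with the image rotation angle \theta added
x = \frac{1}{2} + \frac{y'}{2}\cos(2\pi x' + \theta), \qquad
y = \frac{1}{2} + \frac{y'}{2}\sin(2\pi x' + \theta)
% (6) with the central part of diameter d/5 cropped, the radius is
%     remapped from [0, 1/2] onto [1/10, 1/2]
x = \frac{1}{2} + \left(\frac{1}{10} + \frac{2y'}{5}\right)\cos(2\pi x' + \theta), \qquad
y = \frac{1}{2} + \left(\frac{1}{10} + \frac{2y'}{5}\right)\sin(2\pi x' + \theta)
```

In this reconstruction, formula (6) matches the stated cropping: at y' = 0 the mapped radius is 1/10 (diameter d/5), so the degenerate small circles near the center are never sampled.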
example two:
fig. 10 shows a schematic flowchart of a second fisheye image rectification method provided in an embodiment of the present application, which is detailed as follows:
in S1001, if the current fisheye image is corrected for the first time or the fisheye image parameter changes, calculating a position mapping relation between a preset corrected image and a circular fisheye image according to the fisheye image parameter and a preset correction algorithm, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; the image information mapping table is composed of a difference mapping table, a difference dictionary table and initial label information.
When the image information mapping table is a target array storing the labels of the image information array of the circular fisheye image, for example an array T storing labels of the array I, the storage length of each entry of the array T needs to be at least equal to the length of the value determined by the resolution of the circular fisheye image. For example, assuming that the resolution of the circular fisheye image is 1440 × 1440, the length of the array I storing the image information of the circular fisheye image is at least m = 1440 × 1440 = 2073600 (when the image information is a gray scale value), that is, the maximum label of the array I is 2073600 − 1 = 2073599. Since the value 2073599 is larger than the maximum value that can be represented by the 16-bit unsigned type U16, each entry of the array T storing labels of the array I needs at least the 32-bit unsigned type U32 to be represented. In other words, when the image information mapping table directly stores the labels of the array I of the image information of the circular fisheye image, the memory space it occupies is large.
In the embodiment of the present application, instead of directly storing the target array (each element of which stores the label of the image information array of the corresponding circular fisheye image) as the image information mapping table, a difference mapping table, a difference dictionary table and start label information are obtained by conversion from the target array, and these three are stored together as the image information mapping table, as shown in fig. 11. The start label information is the label, in the image information array of the circular fisheye image, corresponding to the start pixel point of the preset correction image. The difference dictionary table and the difference mapping table can each be expressed as an array: each element of the difference dictionary table stores a specific difference value, and each element of the difference mapping table stores the index address, within the difference dictionary table, of the difference between the label corresponding to the current pixel point of the preset correction image and the label corresponding to the previous pixel point.
For example, the difference mapping table may be an array J and the difference dictionary table an array K. Assuming that the original target array is T = {2,5,7,9,6,8,4,1,0,3}, the start label information is determined as the first element "2" of the target array; the difference between every two consecutive elements of the target array is then calculated to obtain the corresponding difference dictionary table, i.e., the array K = {3,2,-3,-4,-1}, where equal differences are stored only once in the difference dictionary table, together with the difference mapping table, i.e., the array J = {0,1,1,2,1,3,2,4,0}. Specifically, when the label of the image information array corresponding to the second pixel point of the preset correction image is calculated as 5, its difference from the label 2 corresponding to the start pixel point is 3; the difference 3 is stored in the first position K[0] of the array K, and the position "0" of the difference 3 in the array K is stored in the first element J[0] of the array J, i.e., J[0] = 0. When the label corresponding to the third pixel point is calculated as 7, its difference from the label 5 of the previous (second) pixel point is 2; the value 2 is stored in the second position K[1] of the array K, and its position "1" in the array K is stored in the second element J[1], i.e., J[1] = 1. When the label corresponding to the fourth pixel point is calculated as 9, its difference from the label 7 of the previous (third) pixel point is again 2; since the value 2 is already stored in K[1], it does not need to be stored again, and the position "1" of the value 2 in the array K is stored in the third element J[2], i.e., J[2] = 1. This continues until the image is traversed, giving the complete difference mapping table and difference dictionary table. The difference mapping table corresponds to the target array; since no difference is calculated for the start pixel point, the total number of entries of the difference mapping table equals the total number of entries of the target array minus 1 (i.e., the array J has one element fewer than the array T). In addition, since repeated differences are stored only once in the difference dictionary table, the total number of entries of the difference dictionary table is much smaller than that of the target array (i.e., the array K has far fewer elements than the array T).
It will be appreciated that the maximum value of an element in the array J (here the value "4") is usually much smaller than the maximum value of an element in the array T (here the value "9"), and the data length of each element of an array usually needs to be uniform, determined by the element requiring the longest data length. The data length occupied by each element of the array J is therefore smaller than that of the array T; that is, the data length of each entry in the difference mapping table is smaller than the data length of each entry in an image information mapping table that directly stores the labels of the image information array, and the storage space occupied by the difference mapping table is much smaller than that occupied by the target array. As for the difference dictionary table, because many of the differences between consecutive elements of the array T are the same and repeated differences are stored only once, the number of elements of the difference dictionary table is far smaller than that of the target array, and the storage space it occupies is small. Composing the image information mapping table of the difference mapping table, the difference dictionary table and the start label information therefore saves memory space and reduces the consumption of hardware resources.
In particular, when the data length of each entry in the difference mapping table is much smaller than the data length occupied by the value determined by the resolution of the circular fisheye image (for example, 1440 × 1440 = 2073600), the memory space occupied by the mapping table can be greatly compressed. For example, for an image with a resolution of 1440 × 1440, each entry of the array T that directly stores labels of the array I needs at least the 32-bit unsigned type U32 to be represented; when represented indirectly by the arrays J and K, the maximum value in the array J is usually much smaller than 2073600, and each entry of the difference mapping table may need only the 8-bit unsigned type U8, thereby greatly compressing the memory space occupied by the image information mapping table.
In S1002, sequentially obtaining image information of the circular fisheye image according to the image information mapping table, and filling the preset corrected image with the image information to obtain a target corrected image corresponding to the circular fisheye image.
When the image information mapping table is read to fill the preset correction image with image information, the label, in the image information array of the circular fisheye image, corresponding to each pixel point of the preset correction image can be obtained from the start label information, the difference mapping table and the difference dictionary table by a calculation process inverse to that used when building the tables; the image information of the circular fisheye image is then acquired and filled in sequence, finally yielding the target correction image corresponding to the circular fisheye image. For example, continuing the example in S1001, when the second pixel point of the preset correction image is filled, the start label information f = 2 is first obtained; the difference mapping table gives J[0] = 0, and querying the difference dictionary table K with the index address "0" gives K[0] = 3, that is, the difference between the label corresponding to the second pixel point and the first label is 3. The image information array label corresponding to the second pixel point is therefore f + K[J[0]] = 2 + K[0] = 2 + 3 = 5; the image information I[5] is then read from the image information array I of the circular fisheye image and filled into the second pixel point of the preset correction image. By analogy, the image information of the corresponding positions of the circular fisheye image is obtained and filled into the preset correction image in sequence, finally yielding the target correction image.
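The inverse calculation described above amounts to a running sum over looked-up differences; a minimal sketch:

```python
def restore_target_array(start, J, K):
    """Invert the difference-table compression: rebuild the array of
    image-information labels from the start label, the difference mapping
    table J and the difference dictionary table K."""
    T = [start]
    for j in J:
        T.append(T[-1] + K[j])   # add back the stored difference
    return T
```

Applied to the example's start label 2, J = {0,1,1,2,1,3,2,4,0} and K = {3,2,-3,-4,-1}, this recovers the original target array {2,5,7,9,6,8,4,1,0,3}.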
In the embodiment of the application, the image information mapping table is formed by the difference mapping table, the difference dictionary table and the initial label information, so that compared with a mode of directly storing the image information array labels, the memory space can be saved, and the consumption of hardware resources is reduced.
Example three:
fig. 12 shows a schematic flowchart of a third fisheye image rectification method provided in an embodiment of the present application, which is detailed as follows:
in S1201, if the current first correction is detected or the fisheye image parameter changes, calculating a position mapping relation between a preset correction image and a circular fisheye image according to the fisheye image parameter and a preset correction algorithm, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image.
In this embodiment, S1201 is the same as S101 in the first embodiment or S1001 in the second embodiment, and please refer to the related description of S101 in the first embodiment or S1001 in the second embodiment, which is not described herein again.
In S1202, if the resolution of the circular fisheye image is smaller than a preset threshold, sequentially obtaining image information of the circular fisheye image according to the image information mapping table, performing bilinear interpolation operation, and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image.
If the resolution of the circular fisheye image is smaller than a preset threshold (e.g., 800 × 600), bilinear interpolation operation is performed when image information is filled in the preset corrected image, so that the image is optimized, and the finally obtained target corrected image is smoother and better in image quality.
In S1203, otherwise, the image information of the circular fisheye image sequentially acquired according to the image information mapping table is directly filled into the preset correction image to obtain the target correction image corresponding to the circular fisheye image.
That is, if the resolution of the circular fisheye image is larger than the preset threshold, the bilinear interpolation operation is skipped. When the resolution of the image is high, the image quality is hardly affected even without bilinear interpolation, so skipping it saves a large amount of operations, reduces the occupation of computing resources, improves the fisheye image correction efficiency, and reduces the consumption of hardware resources.
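The resolution-gated sampling can be sketched as follows for a single-channel image stored row-major; the flag and function names are illustrative, not taken from the patent:

```python
def sample_pixel(img, x, y, width, height, use_bilinear):
    """Sample a grayscale image at fractional coordinates (x, y), using
    bilinear interpolation when enabled and nearest-neighbour otherwise."""
    if not use_bilinear:
        return img[int(y) * width + int(x)]   # nearest neighbour: truncate
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, width - 1), min(y0 + 1, height - 1)
    fx, fy = x - x0, y - y0
    top = img[y0 * width + x0] * (1 - fx) + img[y0 * width + x1] * fx
    bot = img[y1 * width + x0] * (1 - fx) + img[y1 * width + x1] * fx
    return top * (1 - fy) + bot * fy          # blend the two row averages
```

The caller would set `use_bilinear` from the resolution comparison against the preset threshold (e.g., enable it below 800 × 600); at the center of a 2×2 block the bilinear result is the average of the four neighbours, while nearest-neighbour simply truncates the coordinates.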
In the embodiment of the application, whether bilinear interpolation operation is performed or not can be determined according to the resolution of the image, so that the calculated amount can be reduced while the image quality is ensured, the fisheye image correction efficiency is improved, and the consumption of hardware resources is reduced.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example four:
fig. 13 is a schematic structural diagram of a fisheye image rectification device provided in an embodiment of the present application, and for convenience of description, only the portions related to the embodiment of the present application are shown:
this fisheye image orthotic devices includes: an image information mapping table acquiring unit 131 and an image information filling unit 132. Wherein:
the image information mapping table obtaining unit 131 is configured to, if it is detected that the current fisheye image parameter is first corrected or the fisheye image parameter changes, calculate a position mapping relationship between a preset corrected image and a circular fisheye image according to the fisheye image parameter and a preset correction algorithm, and obtain an image information mapping table according to the position mapping relationship for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image.
Optionally, the image information mapping table obtaining unit 131 includes an area position mapping relationship calculating module and a coordinate conversion calculating module:
a region position mapping relation calculating module, configured to calculate a region position mapping relation between a first region of a preset correction image and a second region of a corresponding circular fisheye image according to the fisheye image parameters and a preset correction algorithm, where the first region is one of 2^n column regions into which the preset correction image is equally divided, and the second region is the sector region corresponding to the first region in the circular fisheye image, wherein n is a positive integer;
and the coordinate conversion calculation module is used for carrying out coordinate conversion calculation according to the region position mapping relation to obtain the position mapping relation between the preset correction image and the circular fisheye image.
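The column-region-to-sector correspondence can be illustrated as follows. The 2^n equal division and the sector correspondence come from the text above; the function name, signature, and the assumption of an equal angular split are illustrative:

```python
import math

def sector_for_column(col, n, width):
    """Return the index and angular span of the sector (in the circular
    fisheye image) that a corrected-image column falls into, when the
    corrected image of the given width is equally divided into 2**n
    column regions."""
    k = 2 ** n
    region = col * k // width             # which of the k column regions
    start = 2 * math.pi * region / k      # sector start angle (radians)
    end = 2 * math.pi * (region + 1) / k  # sector end angle (radians)
    return region, start, end
```

The coordinate conversion module can then work within a single sector and transfer the result to the others by rotating through multiples of 2π/2^n.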
Optionally, the preset correction algorithm includes a preset correction formula, and the image information mapping table obtaining unit 131 includes:
the first calculation module is used for calculating, according to the fisheye image parameters and the preset correction formula, the pixel coordinates (x, y) of the circular fisheye image corresponding to each pixel coordinate in the preset correction image, traversing column by column, i.e. visiting on each pass the pixel coordinates (x', y') of the preset correction image that share the same abscissa value and have different ordinate values;
wherein the preset correction formula takes one of two alternative forms (given as figures in the original publication), where θ is the image rotation angle and "·" denotes the multiplication sign.
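The patent's two correction formulas are given as images and are not reproduced in this text. As an illustration only, a standard panoramic-unwrapping mapping that is consistent with the surrounding description (column-wise traversal, rotation angle θ, circular fisheye of radius R centred at (cx, cy)) might look like this; every symbol below other than θ, (x', y'), and (x, y) is an assumption:

```python
import math

def panorama_to_fisheye(xp, yp, W, H, cx, cy, R, theta):
    """Map corrected-image pixel (x', y') = (xp, yp) to a circular fisheye
    pixel (x, y). Standard panoramic-unwrapping model, used here only
    because the patent's formula images are not reproduced in the text."""
    phi = 2 * math.pi * xp / W + theta   # azimuth, offset by rotation angle θ
    r = R * yp / H                       # radius grows linearly with y'
    x = cx + r * math.cos(phi)
    y = cy + r * math.sin(phi)
    return x, y
```

Note how fixing x' (one column of the corrected image) fixes the azimuth phi, so a column-wise traversal walks outward along a single radius of the fisheye circle, which is why the text traverses coordinates of the same abscissa first.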
Optionally, the image information mapping table obtaining unit 131 includes a sub-block image dividing module, a first mapping information calculating unit, a second mapping information calculating unit, and an image information mapping table storing unit:
the sub-block image dividing module is used for dividing a preset correction image into a plurality of sub-block images;
the first mapping information calculation unit is used for obtaining a position mapping relation between a first sub-block image and a circular fisheye image according to the fisheye image parameters and a preset correction algorithm, and correspondingly storing mapping information of the first sub-block image to an image information mapping table, wherein the first sub-block image is any one of the sub-block images;
the second mapping information calculation unit is used for sequentially obtaining the position mapping relations between other sub-block images and the circular fisheye image according to the position mapping relation between the first sub-block image and the circular fisheye image, and correspondingly storing the mapping information of the other sub-block images to the image information mapping table;
and the image information mapping table storage unit is used for storing the finally obtained image information mapping table.
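Deriving the remaining sub-block mappings from the first one can exploit the rotational symmetry of the circular fisheye image: if the corrected image is split into k column sub-blocks, sub-block j's fisheye coordinates are sub-block 0's rotated by j·2π/k about the circle centre. The helper below is a hypothetical sketch of that idea, not the patent's exact procedure:

```python
import math

def rotate_mapping(block0, center, angle):
    """Derive another sub-block's fisheye coordinates by rotating the first
    sub-block's (x, y) mapping about the fisheye-circle centre."""
    cx, cy = center
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    out = []
    for x, y in block0:
        dx, dy = x - cx, y - cy          # offset from the circle centre
        out.append((cx + dx * cos_a - dy * sin_a,
                    cy + dx * sin_a + dy * cos_a))
    return out
```

Only the first sub-block then needs the full correction-formula computation; the others are cheap rotations of its result.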
Optionally, the image information mapping table is composed of a difference mapping table, a difference dictionary table, and start index information.
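One plausible reading of this three-part layout (the text does not spell out the exact encoding) is delta-compression of the per-pixel addresses: store the first absolute address, a small dictionary of the distinct deltas between consecutive addresses, and per-pixel dictionary indices. A sketch under that assumption:

```python
def encode_mapping(addresses):
    """Delta-encode absolute addresses into (start, dictionary, index table)."""
    start = addresses[0]
    deltas = [b - a for a, b in zip(addresses, addresses[1:])]
    dictionary = sorted(set(deltas))              # distinct deltas only
    index = {d: i for i, d in enumerate(dictionary)}
    table = [index[d] for d in deltas]            # one small index per pixel
    return start, dictionary, table

def decode_mapping(start, dictionary, table):
    """Rebuild the absolute addresses from the compact representation."""
    out = [start]
    for i in table:
        out.append(out[-1] + dictionary[i])
    return out
```

Because neighbouring corrected-image pixels map to nearby fisheye addresses, the deltas repeat heavily, so the index table can be much narrower than the raw addresses.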
And an image information filling unit 132, configured to sequentially obtain image information of the circular fisheye image according to the image information mapping table, and fill the preset corrected image to obtain a target corrected image corresponding to the circular fisheye image.
Optionally, the image information filling unit 132 includes a first filling unit and a second filling unit:
the first filling unit is used for sequentially acquiring the image information of the circular fisheye image according to the image information mapping table if the resolution of the circular fisheye image is smaller than a preset threshold, performing bilinear interpolation operation, and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image;
and the second filling unit is used for directly filling the image information of the circular fisheye image which is sequentially acquired according to the image information mapping table into a preset correction image to obtain the target correction image corresponding to the circular fisheye image.
Optionally, the fisheye image parameter further includes a YUV format, and the image information mapping table obtaining unit 131 includes a second calculating module and an image information mapping table calculating module:
the second calculation module is used for calculating the position mapping relation between a preset correction image and the circular fisheye image according to the resolution of the circular fisheye image, the image rotation angle and a preset correction algorithm;
the image information mapping table calculation module is used for obtaining and storing the image information mapping table according to the position mapping relation and the YUV format;
correspondingly, the image information filling unit 132 is specifically configured to sequentially obtain YUV data of the circular fisheye image according to the image information mapping table and the YUV format to fill the preset correction image, so as to obtain a target correction image corresponding to the circular fisheye image.
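For a YUV-aware mapping table, each corrected pixel needs the addresses of its Y sample and of the shared, subsampled U and V samples. The sketch below assumes a planar I420 (YUV 4:2:0) layout; the layout itself is an assumption, since the text only says "YUV format":

```python
def yuv420_offsets(x, y, width, height):
    """Byte offsets of the Y, U and V samples for pixel (x, y) in a planar
    I420 buffer: full-resolution Y plane, then quarter-size U and V planes."""
    y_off = y * width + x
    uv_w = width // 2                    # U/V planes are half width
    u_off = width * height + (y // 2) * uv_w + (x // 2)
    v_off = u_off + (width * height) // 4
    return y_off, u_off, v_off
```

A mapping table storing triples like these lets the filling unit copy YUV data directly, without converting to RGB first.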
In the embodiment of the application, when the first correction or a change of the fisheye image parameters is detected, the circular fisheye image position corresponding to the preset correction image can be accurately calculated according to the fisheye image parameters and the preset correction algorithm, and an image information mapping table is generated. The image information is then accurately acquired according to the image information mapping table and filled into the preset correction image, yielding the target correction image corresponding to the circular fisheye image, so that the accuracy of fisheye image correction can be ensured. Meanwhile, the image information mapping table is stored after being calculated; when the correction is not the first one and the fisheye image parameters have not changed, the target correction image corresponding to the circular fisheye image can be obtained directly from the pre-stored image information mapping table, so that repeated calculation is avoided and the consumption of hardware resources is reduced.
Example five:
fig. 14 is a schematic diagram of a terminal device according to an embodiment of the present application. As shown in fig. 14, the terminal device 14 of this embodiment includes: a processor 140, a memory 141, and a computer program 142, such as a fisheye image correction program, stored in the memory 141 and executable on the processor 140. When executing the computer program 142, the processor 140 implements the steps in the above-mentioned embodiments of the fisheye image correction method, such as steps S101 to S102 shown in fig. 1. Alternatively, when the computer program 142 is executed, the processor 140 implements the functions of each module/unit in the above device embodiments, for example the functions of the units 131 to 132 shown in fig. 13.
Illustratively, the computer program 142 may be partitioned into one or more modules/units that are stored in the memory 141 and executed by the processor 140 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 142 in the terminal device 14. For example, the computer program 142 may be divided into an image information mapping table obtaining unit and an image information filling unit, and the specific functions of each unit are as follows:
the image information mapping table acquiring unit is used for, if the current fisheye image is being corrected for the first time or the fisheye image parameters have changed, calculating the position mapping relationship between the preset correction image and the circular fisheye image according to the fisheye image parameters and the preset correction algorithm, and obtaining and storing an image information mapping table according to the position mapping relationship; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image;
and the image information filling unit is used for sequentially acquiring the image information of the circular fisheye image according to the image information mapping table and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image.
The terminal device 14 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The terminal device may include, but is not limited to, the processor 140 and the memory 141. Those skilled in the art will appreciate that fig. 14 is merely an example of the terminal device 14 and does not constitute a limitation of it; the terminal device may include more or fewer components than shown, combine some components, or use different components. For example, it may also include input/output devices, network access devices, buses, etc.
The processor 140 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
Preferably, as shown in fig. 15, the processor 140 is specifically a CPU. Since the fisheye image correction algorithm in the embodiment of the present application reduces the required computing resources, the fisheye image can be corrected in real time using the CPU alone, without depending on other hardware such as a Graphics Processing Unit (GPU) or an FPGA, thereby reducing hardware cost.
The memory 141 may be an internal storage unit of the terminal device 14, such as a hard disk or memory of the terminal device 14. The memory 141 may also be an external storage device of the terminal device 14, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal device 14. Further, the memory 141 may include both an internal storage unit and an external storage device of the terminal device 14. The memory 141 is used for storing the computer program and other programs and data required by the terminal device. The memory 141 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A fisheye image rectification method, comprising:
if the current fisheye image is corrected for the first time or the fisheye image parameters are changed, calculating the position mapping relation between the preset corrected image and the circular fisheye image according to the fisheye image parameters and the preset correction algorithm, and obtaining an image information mapping table according to the position mapping relation for storage; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image;
and sequentially acquiring the image information of the circular fisheye image according to the image information mapping table to fill the preset correction image, so as to obtain a target correction image corresponding to the circular fisheye image.
2. The method for correcting fisheye images according to claim 1, wherein the calculating the position mapping relationship between the preset correction images and the circular fisheye images according to the fisheye image parameters and the preset correction algorithm comprises:
calculating the region position mapping relation between a first region of a preset correction image and a second region of a corresponding circular fisheye image according to the fisheye image parameters and a preset correction algorithm, wherein the first region is one of 2^n column regions into which the preset correction image is equally divided, and the second region is the sector region corresponding to the first region in the circular fisheye image, wherein n is a positive integer;
and performing coordinate conversion calculation according to the region position mapping relation to obtain the position mapping relation between the preset correction image and the circular fisheye image.
3. The fisheye image rectification method of claim 1, wherein the preset rectification algorithm comprises a preset rectification formula, and the calculating of the position mapping relationship between the preset rectification image and the circular fisheye image according to the fisheye image parameters and the preset rectification algorithm comprises:
calculating, according to the fisheye image parameters and the preset correction formula, the pixel coordinates (x, y) of the circular fisheye image corresponding to each pixel coordinate in the preset correction image, traversing column by column, i.e. visiting on each pass the pixel coordinates (x', y') of the preset correction image that share the same abscissa value and have different ordinate values;
wherein the preset correction formula takes one of two alternative forms (given as figures in the original publication), where θ is the image rotation angle and "·" denotes the multiplication sign.
4. The fisheye image correction method of claim 1, wherein the step of calculating a position mapping relationship between a preset correction image and a circular fisheye image according to the fisheye image parameters and a preset correction algorithm and obtaining an image information mapping table according to the position mapping relationship for storage comprises the steps of:
dividing a preset correction image into a plurality of sub-block images;
obtaining a position mapping relation between a first sub-block image and a circular fisheye image according to the fisheye image parameters and a preset correction algorithm, and correspondingly storing mapping information of the first sub-block image to an image information mapping table, wherein the first sub-block image is any one of the sub-block images;
sequentially obtaining the position mapping relation between other sub-block images and the circular fisheye image according to the position mapping relation between the first sub-block image and the circular fisheye image, and correspondingly storing the mapping information of the other sub-block images to the image information mapping table;
and storing the finally obtained image information mapping table.
5. The fisheye image correction method according to claim 1, wherein the fisheye image parameters further include YUV format, the image information mapping table is a target array of storage address information for storing YUV data of a circular fisheye image, and the calculating a position mapping relationship between a preset correction image and the circular fisheye image according to the fisheye image parameters and a preset correction algorithm to obtain and store an image information mapping table includes:
calculating the position mapping relation between a preset correction image and the circular fisheye image according to the resolution of the circular fisheye image, the image rotation angle and a preset correction algorithm;
obtaining and storing the image information mapping table according to the position mapping relation and the YUV format;
correspondingly, the sequentially acquiring the image information of the circular fisheye image according to the image information mapping table to fill the preset correction image, and obtaining the target correction image corresponding to the circular fisheye image, including:
and sequentially acquiring YUV data of the circular fisheye image according to the image information mapping table and the YUV format to fill the preset correction image, so as to obtain a target correction image corresponding to the circular fisheye image.
6. The fisheye image rectification method of claim 1 wherein the image information mapping table is comprised of a difference mapping table, a difference dictionary table and start index information.
7. The method for correcting fisheye images according to any one of claims 1 to 6, wherein the obtaining image information of the circular fisheye image in sequence according to the image information mapping table to fill the preset correction image to obtain a target correction image corresponding to the circular fisheye image comprises:
if the resolution of the circular fisheye image is smaller than a preset threshold, sequentially obtaining image information of the circular fisheye image according to the image information mapping table, performing bilinear interpolation operation, and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image;
otherwise, directly filling the preset correction image with the image information of the circular fisheye image sequentially acquired according to the image information mapping table to obtain the target correction image corresponding to the circular fisheye image.
8. A fisheye image rectification device, comprising:
the image information mapping table acquiring unit is used for, if the current fisheye image is being corrected for the first time or the fisheye image parameters have changed, calculating the position mapping relationship between the preset correction image and the circular fisheye image according to the fisheye image parameters and the preset correction algorithm, and obtaining and storing an image information mapping table according to the position mapping relationship; otherwise, directly acquiring the pre-stored image information mapping table; wherein the fisheye image parameters comprise the resolution and the image rotation angle of the circular fisheye image;
and the image information filling unit is used for sequentially acquiring the image information of the circular fisheye image according to the image information mapping table and filling the preset correction image to obtain a target correction image corresponding to the circular fisheye image.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the computer program, when executed by the processor, causes the terminal device to carry out the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, causes a terminal device to carry out the steps of the method according to any one of claims 1 to 7.
CN201910891227.2A 2019-09-20 2019-09-20 Fisheye image correction method and device and terminal equipment Pending CN110599427A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910891227.2A CN110599427A (en) 2019-09-20 2019-09-20 Fisheye image correction method and device and terminal equipment


Publications (1)

Publication Number Publication Date
CN110599427A true CN110599427A (en) 2019-12-20

Family

ID=68861574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910891227.2A Pending CN110599427A (en) 2019-09-20 2019-09-20 Fisheye image correction method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110599427A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101783011A (en) * 2010-01-08 2010-07-21 宁波大学 Distortion correction method of fish eye lens
CN103247020A (en) * 2012-02-03 2013-08-14 苏州科泽数字技术有限公司 Fisheye image spread method based on radial characteristics
CN103325110A (en) * 2013-05-29 2013-09-25 山西绿色光电产业科学技术研究院(有限公司) Panoramic image correction algorithm based on panoramic all-in-one speed dome camera
CN107133911A (en) * 2016-02-26 2017-09-05 比亚迪股份有限公司 A kind of reverse image display methods and device
CN108335273A (en) * 2018-02-06 2018-07-27 大唐终端技术有限公司 The real-time removing method of the distortion of big wide-angle flake full shot camera
CN108765282A (en) * 2018-04-28 2018-11-06 北京大学 Real-time super-resolution method and system based on FPGA
CN109308686A (en) * 2018-08-16 2019-02-05 北京市商汤科技开发有限公司 A kind of fish eye images processing method and processing device, equipment and storage medium


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
RAMA SHANKAR等: "A Flexible Architecture for Real-time Fisheye Correction using Soft-core Processors and FPGA’s", 《NCIS-2009》 *
ZHOU BIAO: "Research on Fisheye Lens Monitoring System and Image Correction Technology", 《China Masters' Theses Full-text Database (electronic journal)》 *
WANG ZHIGANG et al.: "A Panoramic Unwrapping Algorithm Based on Fisheye Images", 《Communications Technology》 *
WANG LIGUO: 《Hyperspectral Image Processing Technology》, 31 May 2013 *
DENG SONGJIE: "Method of Generating Panoramic Images Using a Fisheye Lens", 《Journal of Engineering Graphics》 *
CHEN HANYUAN: "Research on Panoramic Image Generation Algorithm Based on Fisheye Lens", 《China Masters' Theses Full-text Database (electronic journal)》 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111263115A (en) * 2020-02-14 2020-06-09 珠海研果科技有限公司 Method and apparatus for presenting images
CN111263115B (en) * 2020-02-14 2024-04-19 珠海研果科技有限公司 Method, apparatus, electronic device, and computer-readable medium for presenting images
CN111738909A (en) * 2020-06-11 2020-10-02 杭州海康威视数字技术股份有限公司 Image generation method and device
CN111738909B (en) * 2020-06-11 2023-09-26 杭州海康威视数字技术股份有限公司 Image generation method and device
CN111899151A (en) * 2020-07-28 2020-11-06 北京中星微电子有限公司 Picture generation method and device, electronic equipment and computer readable medium
CN111899151B (en) * 2020-07-28 2023-06-27 北京中星微电子有限公司 Picture generation method, device, electronic equipment and computer readable medium
CN113313648A (en) * 2021-06-01 2021-08-27 百度在线网络技术(北京)有限公司 Image correction method, image correction device, electronic apparatus, and medium
CN113313648B (en) * 2021-06-01 2023-08-29 百度在线网络技术(北京)有限公司 Image correction method, device, electronic equipment and medium
CN114187437A (en) * 2022-02-11 2022-03-15 阿里巴巴达摩院(杭州)科技有限公司 Text recognition method, image correction method, electronic device, and storage medium
CN114187437B (en) * 2022-02-11 2022-05-13 阿里巴巴达摩院(杭州)科技有限公司 Text recognition method, image correction method, electronic device, and storage medium
WO2023178539A1 (en) * 2022-03-23 2023-09-28 华为技术有限公司 Image processing method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20191220)