CN109308686B - Fisheye image processing method, device, equipment and storage medium - Google Patents
- Publication number
- CN109308686B (application CN201810935885.2A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- image
- dimensional coordinate
- determining
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G06T3/047—Fisheye or wide-angle transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The embodiments of the application provide a fisheye image processing method, device, equipment and storage medium. First, correction parameters of a fisheye image to be processed are determined; then, according to the correction parameters, the fisheye image is unfolded into a corresponding unfolded image on a preset direction axis; finally, the unfolded image is adjusted according to a preset mapping relation and the coordinates of pixel points in the fisheye image, to obtain an output image corresponding to the fisheye image. The preset mapping relation indicates the correspondence between the coordinates of pixel points in the fisheye image and the coordinates of pixel points in the unfolded image.
Description
Technical Field
The embodiments of the application relate to the field of computer vision, and in particular, but not exclusively, to a fisheye image processing method, a fisheye image processing device, fisheye image processing equipment and a storage medium.
Background
A fisheye camera can capture all information within a 185-degree field of view in a single shot, an extremely large amount of information compared with an ordinary camera. The fisheye lens is compact, small in volume and hard to damage; it provides omnidirectional vision with no blind area and avoids problems such as image stitching and blending, so it is widely used in video surveillance, autonomous driving, video conferencing, robot navigation and other fields.
However, images captured by a fisheye camera suffer from very severe distortion, and it is difficult to detect targets such as pedestrians or vehicles from such severely distorted images.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for processing a fisheye image, a device and a storage medium.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a fisheye image processing method, which comprises the following steps:
determining correction parameters of a fisheye image to be processed;
according to the correction parameters, unfolding the fisheye image into a corresponding unfolded image on a preset direction axis;
adjusting the unfolded image according to a preset mapping relation and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image; the preset mapping relation is used for indicating the corresponding relation between the coordinates of the pixel points in the fisheye image and the coordinates of the pixel points in the unfolded image.
In an embodiment of the present application, the correction parameter includes at least one of: circle center coordinates, fisheye radius and distortion coefficient.
In the embodiment of the present application, the mapping relationship includes a correction index table having the same size as the expanded image;
correspondingly, the adjusting the expanded image according to a preset mapping relationship and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image includes:
and adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
In an embodiment of the present application, the method further includes:
establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image;
according to the three-dimensional coordinate system, determining a second two-dimensional coordinate system of the unfolded image and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system;
determining a first corresponding relation between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system and the three-dimensional coordinate system;
determining a second corresponding relation between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relation;
and determining the second corresponding relation as the mapping relation.
In an embodiment of the present application, the establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for acquiring the fisheye image includes:
establishing the first two-dimensional coordinate system by taking the center O of the fisheye image as an origin;
and establishing the three-dimensional coordinate system by taking the center O of the fisheye image as an origin and taking the transverse axis and the longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as the transverse axis, the longitudinal axis and the third coordinate axis respectively.
In an embodiment of the present application, the determining, according to the three-dimensional coordinate system, a second two-dimensional coordinate system of the unfolded image and a perspective projection plane coordinate system on the same plane as the second two-dimensional coordinate system includes:
in the three-dimensional coordinate system, determining a sphere model by taking the center O of the fisheye image as a sphere center and the radius of the fisheye image as a sphere radius;
determining a tangent plane meeting a preset angle in the sphere model;
determining the second two-dimensional coordinate system and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system according to the tangent plane; and the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system meet a preset corresponding relation.
In an embodiment of the present application, the determining a first corresponding relationship between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system includes:
according to the perspective projection plane coordinate system, determining the variable quantity of the coordinate on the three-dimensional coordinate system corresponding to the movement of any point on the plane of the perspective projection plane coordinate system by a preset distance along a preset coordinate axis;
and determining a first corresponding relation between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the variable quantity, the second two-dimensional coordinate system and the three-dimensional coordinate system.
In an embodiment of the present application, the determining a second corresponding relationship between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relationship includes:
determining a third corresponding relation between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the first corresponding relation;
and determining the second corresponding relation according to the first corresponding relation and the third corresponding relation.
In this embodiment of the application, the adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image includes:
dividing the correction index table into M blocks; M is an integer greater than 1;
determining coordinates of the expanded image corresponding to the M blocks;
adjusting the pixel value of the unfolded image according to the M blocks, the coordinates of the pixel points in the unfolded image and the coordinates of the pixel points in the fisheye image to obtain M adjusted unfolded images;
and splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
In an embodiment of the present application, the dividing the correction index table into M blocks includes:
dividing the correction index table into M original blocks according to a preset resolution;
determining a source abscissa value and a source ordinate value contained in any one of the M original blocks;
for any original block, subtracting respectively the minimum source abscissa value and the minimum source ordinate value contained in that block from its source coordinates, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate of the fisheye image contained in that original block.
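The blocking and re-basing described above can be sketched in NumPy; the array names, the storage of the index table as a pair of integer arrays, and the toy table below are illustrative assumptions, not the patent's concrete format:

```python
import numpy as np

def split_index_table(src_y, src_x, block_h, block_w):
    """Split the correction index table into blocks, re-basing each block's
    source coordinates by subtracting the block's per-axis minimum.

    Returns (row0, col0, min_y, min_x, local_y, local_x) tuples, so only the
    small local offsets need to be stored per block.
    """
    blocks = []
    h, w = src_y.shape
    for r in range(0, h, block_h):
        for c in range(0, w, block_w):
            by = src_y[r:r + block_h, c:c + block_w]
            bx = src_x[r:r + block_h, c:c + block_w]
            min_y, min_x = int(by.min()), int(bx.min())
            blocks.append((r, c, min_y, min_x, by - min_y, bx - min_x))
    return blocks

# Toy 2x4 index table split into 2x2 blocks.
src_y = np.array([[0, 0, 1, 1],
                  [2, 2, 3, 3]])
src_x = np.array([[5, 6, 7, 8],
                  [5, 6, 7, 8]])
blocks = split_index_table(src_y, src_x, 2, 2)
```

Because each block stores only offsets relative to its own minimum, the offsets fit in fewer bits, which suits a fixed-point table.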
In this embodiment of the present application, adjusting the pixel value of the expanded image according to the M blocks, the coordinates of the pixel points in the expanded image, and the coordinates of the pixel points in the fisheye image to obtain M blocks of adjusted expanded images, includes:
searching a source coordinate value corresponding to the coordinate value of the expanded image in the mth block according to the coordinate value of the expanded image corresponding to the mth block in the M blocks; wherein M is an integer greater than 0 and less than or equal to M;
determining a first pixel value corresponding to the source coordinate value on the fisheye image;
replacing a second pixel value of the expanded image with the first pixel value to obtain the mth adjusted expanded image among the M adjusted expanded images; wherein the second pixel value is the pixel value of the expanded image at the coordinate value of the expanded image.
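A minimal self-contained sketch of the per-block lookup and replacement, assuming each block carries its per-axis source-coordinate minima plus local offsets (the block layout and all names are assumptions):

```python
import numpy as np

def apply_block(fisheye, out, r0, c0, min_y, min_x, local_y, local_x):
    """Replace the output-image region at (r0, c0) with the fisheye pixels
    addressed by the block's re-based source coordinates (minimum + offset)."""
    h, w = local_y.shape
    out[r0:r0 + h, c0:c0 + w] = fisheye[min_y + local_y, min_x + local_x]

fish = np.arange(16).reshape(4, 4)        # toy 4x4 "fisheye" image
out = np.zeros((2, 2), dtype=fish.dtype)  # one 2x2 unfolded block
# Hypothetical block drawing from source rows 1..2, cols 2..3 of the fisheye image.
apply_block(fish, out, 0, 0, 1, 2,
            np.array([[0, 0], [1, 1]]), np.array([[0, 1], [0, 1]]))
```

Stitching then simply means writing each block's output into its (r0, c0) region of the full image, as above.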
In the embodiment of the present application, the correction index table is a fixed-point two-dimensional table.
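As an illustration of fixed-point storage, a format with 4 fractional bits is assumed below; the patent does not specify the number of fractional bits:

```python
def to_fixed(value, frac_bits=4):
    """Encode a fractional source coordinate as an integer with frac_bits
    fractional bits (multiply by 2**frac_bits and round)."""
    return int(round(value * (1 << frac_bits)))

def from_fixed(raw, frac_bits=4):
    """Decode a fixed-point integer back to a float coordinate."""
    return raw / (1 << frac_bits)

encoded = to_fixed(12.3125)    # 12.3125 * 16 = 197
decoded = from_fixed(encoded)
```

Storing the table fixed-point keeps every entry an integer, which is convenient for hardware or embedded implementations of the lookup.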
The embodiment of the application provides a fisheye image processing apparatus, the apparatus includes: first determining means, first unfolding means and first adjusting means, wherein:
the first determining device is used for determining the correction parameters of the fisheye image to be processed;
the first unfolding device is used for unfolding the fisheye image into a corresponding unfolded image on a preset direction axis according to the correction parameters;
the first adjusting device is used for adjusting the expanded image according to a preset mapping relation and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image; the preset mapping relation is used for indicating the corresponding relation between the coordinates of the pixel points in the fisheye image and the coordinates of the pixel points in the unfolded image.
In an embodiment of the present application, the correction parameter includes at least one of: circle center coordinates, fisheye radius and distortion coefficient.
In the embodiment of the present application, the mapping relationship includes a correction index table having the same size as the expanded image;
correspondingly, the first adjusting device comprises:
and the first adjusting sub-device is used for adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
In an embodiment of the present application, the apparatus further includes:
the first establishing device is used for establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image;
second determining means for determining a second two-dimensional coordinate system of the expanded image and a perspective projection plane coordinate system on the same plane as the second two-dimensional coordinate system, based on the three-dimensional coordinate system;
third determining means for determining a first corresponding relationship between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system;
a fourth determining device, configured to determine a second corresponding relationship between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relationship;
fifth determining means, configured to determine the second corresponding relationship as the mapping relationship.
In an embodiment of the present application, the first establishing apparatus includes:
a first establishing sub-means for establishing the first two-dimensional coordinate system with a center O of the fisheye image as an origin;
and the second establishing sub-device is used for establishing the three-dimensional coordinate system by taking the center O of the fisheye image as an origin and taking the transverse axis and the longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as the transverse axis, the longitudinal axis and the third dimensional coordinate axis respectively.
In an embodiment of the present application, the second determining device includes:
the first determining sub-device is used for determining a spherical model by taking the center O of the fisheye image as a spherical center and the radius of the fisheye image as a spherical radius in the three-dimensional coordinate system;
the second determining sub-device is used for determining a tangent plane meeting a preset angle in the spherical model;
the third determining sub-device is used for determining the second two-dimensional coordinate system and a perspective projection plane coordinate system which is in the same plane as the second two-dimensional coordinate system according to the tangent plane; and the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system meet a preset corresponding relation.
In an embodiment of the present application, the third determining device includes:
a third determining sub-device, configured to determine, according to the perspective projection plane coordinate system, a variation of a coordinate on the three-dimensional coordinate system corresponding to a movement of any point on a plane to which the perspective projection plane coordinate system belongs by a preset distance along a preset coordinate axis;
and the fourth determining sub-device is used for determining a first corresponding relation between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the variable quantity, the second two-dimensional coordinate system and the three-dimensional coordinate system.
In an embodiment of the present application, the fourth determining device includes:
a fifth determining sub-device, configured to determine, according to the first corresponding relationship, a third corresponding relationship between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system;
and a sixth determining sub-device, configured to determine the second corresponding relationship according to the first corresponding relationship and the third corresponding relationship.
In an embodiment of the present application, the first adjusting sub-apparatus includes:
a first dividing unit for dividing the correction index table into M blocks; M is an integer greater than 1;
a first determining unit configured to determine coordinates of the expanded image corresponding to the M blocks;
the first adjusting unit is used for adjusting the pixel value of the expanded image according to the M blocks, the coordinates of the pixel points in the expanded image and the coordinates of the pixel points in the fisheye image to obtain M blocks of the expanded image after adjustment;
and the first splicing unit is used for splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
In an embodiment of the present application, the first dividing unit includes:
the first dividing unit is used for dividing the correction index table into M original blocks according to a preset resolution;
a first determining subunit, configured to determine a source abscissa value and a source ordinate value included in any one of the M original blocks;
a first updating subunit, configured to subtract, from the source coordinates of any original block, the minimum source abscissa value and the minimum source ordinate value contained in that block, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate of the fisheye image contained in that original block.
In an embodiment of the present application, the first adjusting unit includes:
a first searching subunit, configured to search, in the mth block in the M blocks, a source coordinate value corresponding to a coordinate value of the expanded image according to the coordinate value of the expanded image corresponding to the mth block; wherein M is an integer greater than 0 and less than or equal to M;
the second determining subunit is used for determining a first pixel value corresponding to the source coordinate value on the fisheye image;
a first replacing subunit, configured to replace a second pixel value of the expanded image with the first pixel value, to obtain the mth adjusted expanded image among the M adjusted expanded images; wherein the second pixel value is the pixel value of the expanded image at the coordinate value of the expanded image.
In the embodiment of the present application, the correction index table is a fixed-point two-dimensional table.
The embodiment of the present application provides a computer program product, where the computer program product includes computer-executable instructions which, when executed, implement the steps in the fisheye image processing method provided by the embodiments of the present application.
The embodiment of the application provides computer equipment, which comprises a memory and a processor, wherein the memory stores computer-executable instructions, and the processor implements the steps in the fisheye image processing method provided by the embodiments of the present application when executing the computer-executable instructions stored in the memory.
The embodiments of the application provide a fisheye image processing method, device, equipment and storage medium. First, correction parameters of a fisheye image to be processed are determined; then, according to the correction parameters, the fisheye image is unfolded into a corresponding unfolded image on a preset direction axis; finally, the unfolded image is adjusted according to a preset mapping relation and the coordinates of pixel points in the fisheye image, to obtain an output image corresponding to the fisheye image, where the preset mapping relation indicates the correspondence between the coordinates of pixel points in the fisheye image and those in the unfolded image. In this way, the pixels of the unfolded image are replaced with the corresponding pixels of the original image, so that the fisheye image is corrected and unfolded into a perspective projection distortion-free image that accords with people's viewing habits, and the edge information after fisheye distortion correction is well preserved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1A is a schematic structural diagram of a network architecture according to an embodiment of the present application;
fig. 1B is a schematic view illustrating an implementation process of the fisheye image processing method according to the embodiment of the disclosure;
fig. 2 is a schematic flowchart illustrating another implementation of the fisheye image processing method according to the embodiment of the disclosure;
fig. 3 is a schematic flowchart illustrating a method for processing a fisheye image according to an embodiment of the disclosure;
fig. 4 is a coordinate diagram of a method for processing a fisheye image according to an embodiment of the present disclosure;
fig. 5A is a schematic diagram illustrating a coordinate correspondence relationship between a fisheye image and an expanded image according to an embodiment of the disclosure;
fig. 5B is a schematic diagram illustrating a pixel correspondence relationship between a fisheye image and an expanded image according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a fisheye image processing apparatus according to an embodiment of the disclosure;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, specific technical solutions of the present invention will be described in further detail below with reference to the accompanying drawings in the embodiments of the present application. The following examples are intended to illustrate the present application but are not intended to limit the scope of the present application.
In this embodiment, a network architecture is provided first. Fig. 1A is a schematic structural diagram of the network architecture in the embodiment of the present application. As shown in Fig. 1A, the network architecture includes two or more computer devices 11 to 1N and a server 31, where the computer devices 11 to 1N interact with the server 31 through a network 21. In implementation, the computer device may be any of various types of computer devices having information processing capability, for example a mobile phone, a tablet computer, a desktop computer, a personal digital assistant, a navigator, a digital telephone or a television.
The embodiment provides a fisheye image processing method, which can effectively mitigate problems such as severe loss and stretching of edge information when a fisheye image is unfolded into a planar unfolded image.
Fig. 1B is a schematic view of an implementation flow of a fisheye image processing method according to an embodiment of the present application, and as shown in fig. 1B, the method includes the following steps:
step S101, determining the correction parameters of the fisheye image to be processed.
Here, the correction parameter includes at least one of: circle center coordinates, fisheye radius and distortion coefficient. The circle center coordinates and fisheye radius can be obtained by selecting three non-collinear points on the circular outline of the fisheye image and determining the circle equation of the fisheye image from them; the distortion coefficient can be obtained by calibration.
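As a concrete sketch of the three-point circle fit mentioned above, the center and radius follow from the circumcircle (perpendicular-bisector) equations; the sample points below are hypothetical:

```python
def circle_from_three_points(p1, p2, p3):
    """Fit the circle (center, radius) passing through three non-collinear points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Twice the signed area of the triangle; zero means the points are collinear.
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("points are collinear")
    s1, s2, s3 = x1 * x1 + y1 * y1, x2 * x2 + y2 * y2, x3 * x3 + y3 * y3
    cx = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    cy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    r = ((x1 - cx) ** 2 + (y1 - cy) ** 2) ** 0.5
    return (cx, cy), r

# Hypothetical points on the fisheye outline: circle centred at (5, 5), radius 1.
centre, radius = circle_from_three_points((6, 5), (5, 6), (4, 5))
```

With the center and radius fixed, the distortion coefficient would still come from a separate calibration step.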
And S102, unfolding the fisheye image into a corresponding unfolded image on a preset direction axis according to the correction parameters.
Here, the step S102 may be understood as correcting the fisheye image along a plurality of direction axes so as to unfold it into a plurality of undistorted images (unfolded images).
Step S103, adjusting the unfolded image according to a preset mapping relation and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
Here, the preset mapping relationship is used to indicate the correspondence between the coordinates of pixel points in the fisheye image and the coordinates of pixel points in the unfolded image. The step S103 may be understood as replacing the pixel values on the unfolded image with the pixel values on the fisheye image according to the preset mapping relationship, thereby correcting and unfolding the fisheye image into a perspective projection distortion-free image that accords with people's viewing habits.
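The replacement in step S103 can be sketched with NumPy integer-array indexing, assuming the preset mapping relation is materialised as two integer index maps of the unfolded image's size (the nearest-neighbour lookup and all names are assumptions):

```python
import numpy as np

def remap_nearest(fisheye, map_y, map_x):
    """Fill each unfolded-image pixel with the fisheye pixel its map entry names.

    fisheye: H_src x W_src source image.
    map_y, map_x: H_dst x W_dst integer arrays of source row/column indices.
    """
    return fisheye[map_y, map_x]

# Toy example: a 2x2 "fisheye" image and a mapping that flips it horizontally.
src = np.array([[1, 2],
                [3, 4]])
map_y = np.array([[0, 0],
                  [1, 1]])
map_x = np.array([[1, 0],
                  [1, 0]])
out = remap_nearest(src, map_y, map_x)
```

OpenCV's `cv2.remap` performs the same operation with interpolation options; the pure-NumPy form above shows only the nearest-neighbour case.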
In the fisheye image processing method provided by the embodiment of the application, the pixel values on the unfolded image are replaced with the corresponding pixel values on the fisheye image, so that the edge information of the fisheye image is sufficiently represented on the unfolded image, and the fisheye image is corrected and unfolded into a perspective projection distortion-free image that accords with people's viewing habits.
In other embodiments, the mapping includes a rectification index table of the same size as the unfolded image.
Wherein the correction index table stores a correspondence between coordinates of one point on the expanded image and coordinates on the fisheye image in the form of a two-dimensional table.
Correspondingly, the step S103, namely adjusting the expanded image according to a preset mapping relationship and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image, includes:
and adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
An embodiment of the present application provides a fisheye image processing method, and fig. 2 is a schematic diagram illustrating another implementation flow of the fisheye image processing method according to the embodiment of the present application, where as shown in fig. 2, the method includes the following steps:
step S201, determining a correction parameter of the fisheye image to be processed.
Step S202, according to the correction parameters, the fisheye image is unfolded into a corresponding unfolded image on a preset direction axis.
Step S203, establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image.
Here, the first two-dimensional coordinate system is established with the center O of the fisheye image as the origin, and the three-dimensional coordinate system is established with the same center O as the origin, taking the transverse axis and longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as its transverse axis, longitudinal axis and third coordinate axis respectively. For example, with the center O(x0, y0) of the fisheye image as the origin, a fisheye image coordinate system xOy is established; a camera coordinate system (xc, yc, zc) is established with the point O as the origin, where the xc and yc axes coincide with the x-axis and y-axis of the fisheye image coordinate system respectively, and the zc axis coincides with the optical axis of the fisheye lens.
And step S204, determining a second two-dimensional coordinate system of the expanded image and a perspective projection plane coordinate system on the same plane as the second two-dimensional coordinate system according to the three-dimensional coordinate system.
Here, the step S204 may be implemented as follows. First, in the three-dimensional coordinate system, a sphere model (such as the hemisphere shown in Fig. 4) is determined by taking the center O of the fisheye image as the sphere center and the radius of the fisheye image as the sphere radius. Then, a tangent plane UwVw satisfying a preset angle is determined on the sphere model (as shown in Fig. 4). Finally, the second two-dimensional coordinate system and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system are determined according to the tangent plane; the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system satisfy a preset correspondence. That is, the second two-dimensional coordinate system and the perspective projection plane coordinate system both lie on the tangent plane UwVw, but the unit of the second two-dimensional coordinate system is the real dimension in meters (m), while the unit of the perspective projection plane coordinate system is the pixel. In other words, if a point is (m, n) in the second two-dimensional coordinate system (i.e. the coordinates of the unfolded-image coordinate system), its coordinates in the perspective projection plane coordinate system are (m·w_pixel, n·w_pixel); the coordinate values of the point in the two coordinate systems differ by the factor w_pixel, the true physical size of one pixel.
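One possible realisation of the tangent-plane construction, assuming the preset angle is given as a viewing direction in spherical angles (theta, phi); this is a sketch, not the patent's exact formulation:

```python
import math

def tangent_plane_basis(radius, theta, phi):
    """Return the tangency point p on a sphere of the given radius and an
    orthonormal basis (u, v) of the tangent plane at p.

    theta: polar angle from the optical (z) axis; phi: azimuth in the xy plane.
    """
    # Outward unit normal of the tangent plane = viewing direction.
    n = (math.sin(theta) * math.cos(phi),
         math.sin(theta) * math.sin(phi),
         math.cos(theta))
    p = tuple(radius * c for c in n)
    # u: horizontal direction in the plane (perpendicular to n and the z axis);
    # degenerate when n lies along z, so fall back to the x axis there.
    if abs(n[0]) < 1e-12 and abs(n[1]) < 1e-12:
        u = (1.0, 0.0, 0.0)
    else:
        norm = math.hypot(n[0], n[1])
        u = (-n[1] / norm, n[0] / norm, 0.0)
    # v completes the right-handed frame: v = n x u.
    v = (n[1] * u[2] - n[2] * u[1],
         n[2] * u[0] - n[0] * u[2],
         n[0] * u[1] - n[1] * u[0])
    return p, u, v
```

The returned u and v would then be scaled by the physical pixel size w_pixel to step across the unfolded image pixel by pixel.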
Step S205, determining a first corresponding relationship between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system.
Here, step S205 may be implemented as follows: firstly, according to the perspective projection plane coordinate system, determining the variation of coordinates in the three-dimensional coordinate system corresponding to moving any point on the plane to which the perspective projection plane coordinate system belongs by a preset distance along a preset coordinate axis; then, according to the variation, the second two-dimensional coordinate system and the three-dimensional coordinate system, determining a first corresponding relationship between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system. For example, as shown in fig. 4, the variation can be understood as the changes dxu, dyu, dzu in the three-dimensional coordinate system for every movement of a point by w_pixel along the positive U direction of the perspective projection plane coordinate system, and the changes dxv, dyv, dzv for every movement by w_pixel along the positive V direction. Since the coordinates of a point in the perspective projection plane coordinate system and its coordinates in the second two-dimensional coordinate system are in one-to-one correspondence, the coordinates of the point Q'' in the three-dimensional coordinate system corresponding to any point Q(m, n) in the second two-dimensional coordinate system can be obtained from these variations.
Step S206, determining a second corresponding relationship between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relationship.
Here, step S206 may be implemented as follows: firstly, according to the first corresponding relationship, determining a third corresponding relationship between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system; then, determining the second corresponding relationship according to the first corresponding relationship and the third corresponding relationship. For example, the image Q''' of Q'' in the fisheye image (i.e., the third corresponding relationship) may be obtained according to the equidistant projection principle and the first corresponding relationship (i.e., the coordinates, in the three-dimensional coordinate system, of the point Q'' corresponding to any point Q(m, n) in the second two-dimensional coordinate system).
Step S207, adjusting the unfolded image according to the second corresponding relation and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
Here, the second correspondence is referred to as the mapping relationship. In this embodiment of the application, adjusting the expanded image according to the second correspondence and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image can be implemented in the following two ways:
firstly, in step S207, the expanded image is adjusted according to the second correspondence and the coordinates of the pixel points in the fisheye image, so as to obtain an output image corresponding to the fisheye image.
In the method for processing the fisheye image provided by the embodiment of the application, the corresponding relation between a fisheye image coordinate system (namely a first two-dimensional coordinate system) and an expanded image coordinate system (namely a second two-dimensional coordinate system) and a camera coordinate system (namely a three-dimensional coordinate system) is established, and pixels on the fisheye image are replaced by pixels on the expanded image according to the corresponding relation, so that the perspective projection distortion-free image which accords with the observation habit of people is obtained.
Secondly, the second corresponding relation is stored in a correction index table in a form of a two-dimensional table, and then the expanded image is adjusted according to the correction index table and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image, which can be realized by the following steps:
and step S208, storing the second corresponding relation in a correction index table in a two-dimensional table form.
Here, the correction index table stores the correspondence from the coordinates of any point on the expanded image to a point on the fisheye image. The correction index table is a fixed-point two-dimensional table: since a Field-Programmable Gate Array (FPGA) cannot perform floating-point operations, in this embodiment the index table is converted to fixed point. A 4-bit fixed-point scheme is adopted here, in which all values in the index table are right-shifted by 4 bits and rounded.
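As an illustrative sketch only (not part of the patent text; the function names are hypothetical), the fixed-point conversion can be written in Python. The translation above speaks of a right shift; this sketch assumes the common convention of scaling by 2^4 so that 4 fractional bits survive integer-only, FPGA-style arithmetic:

```python
def to_fixed_point(values, frac_bits=4):
    # Scale by 2**frac_bits and round, so the fractional part of each
    # index-table coordinate survives integer-only arithmetic.
    scale = 1 << frac_bits
    return [int(round(v * scale)) for v in values]

def from_fixed_point(values, frac_bits=4):
    # Recover the quantized real value, e.g. for interpolation weights.
    scale = 1 << frac_bits
    return [v / scale for v in values]
```

For example, a source coordinate of 50.25 becomes 804 in fixed point and is recovered exactly, while values that are not multiples of 1/16 are quantized to the nearest 1/16.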
In step S209, the correction index table is divided into M blocks.
Here, M is an integer greater than 1. Step S209 may be implemented as follows: firstly, dividing the correction index table into M original blocks according to a preset resolution; then, determining the source abscissa values and source ordinate values contained in any one of the M original blocks; finally, subtracting from each original block the minimum source abscissa and the minimum source ordinate contained in that block, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate on the fisheye image contained in that block. For example, the index table is divided into 32 × 32 blocks, and for each block the minimum and maximum values of x and y are counted, so that the corresponding block on the source image can be determined from these minimum and maximum values. In addition, for the coordinate information of each index-table block, the minimum values of x and y need to be uniformly subtracted to convert into the coordinate system of the fisheye image block (i.e., the first two-dimensional coordinate system).
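The blocking step just described can be sketched as follows (illustrative pure Python with hypothetical names): the index tables are split into tiles, each tile's source bounding box on the fisheye image is recorded, and the tile's coordinates are rebased to that box:

```python
def block_index_table(mapx, mapy, block=32):
    # mapx/mapy: 2D lists giving, per output pixel, its source x/y on the
    # fisheye image. Returns a list of tiles, each with the bounding box of
    # its source block and coordinates rebased to that block's origin.
    h, w = len(mapx), len(mapx[0])
    tiles = []
    for by in range(0, h, block):
        for bx in range(0, w, block):
            ys = range(by, min(by + block, h))
            xs = range(bx, min(bx + block, w))
            x_min = min(mapx[y][x] for y in ys for x in xs)
            y_min = min(mapy[y][x] for y in ys for x in xs)
            x_max = max(mapx[y][x] for y in ys for x in xs)
            y_max = max(mapy[y][x] for y in ys for x in xs)
            tiles.append({
                "bbox": (x_min, y_min, x_max, y_max),  # block on the fisheye image
                "mapx": [[mapx[y][x] - x_min for x in xs] for y in ys],
                "mapy": [[mapy[y][x] - y_min for x in xs] for y in ys],
            })
    return tiles
```

Each tile then only needs its own small source block loaded, which is the point of the blocking strategy.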
Step S210, determining coordinates of the expanded image corresponding to the M blocks.
Here, the step S210 may be understood as mapping the pixel values in the fisheye image block into the expanded image block according to the information of the coordinates in the index table, and in addition, since the coordinate position of the coordinates in the index table on the fisheye image is not necessarily an integer, the target pixel needs to be obtained by interpolation, and a bilinear interpolation method is used here.
Step S211, adjusting the pixel value of the expanded image according to the M blocks, the coordinates of the pixel points in the expanded image and the coordinates of the pixel points in the fisheye image to obtain the M blocks of the expanded image after adjustment.
Here, step S211 may be implemented as follows: firstly, according to the coordinate values of the expanded image corresponding to the m-th block of the M blocks, searching for the source coordinate values corresponding to those coordinate values in the m-th block, where m is an integer greater than 0 and less than or equal to M; then, determining the first pixel value on the fisheye image corresponding to the source coordinate values; finally, replacing the second pixel value of the expanded image with the first pixel value to obtain the m-th adjusted expanded image of the M adjusted expanded images, where the second pixel value is the pixel value of the expanded image at those coordinate values. For example, if the coordinates of a point of the expanded image in the second two-dimensional coordinate system are (1, 1), and the corresponding coordinate point on the fisheye image is determined to be (50, 60) according to the correction index table, then the pixel at the coordinate point (50, 60) on the fisheye image is substituted for the pixel at the coordinates (1, 1) on the expanded image; the expanded image after pixel substitution is thus obtained, yielding a plurality of expanded image blocks corresponding to the plurality of correction blocks. According to the multiple groups of correction index tables determined above, the expansion correction can be rapidly performed on line by a table look-up method. This embodiment provides a method of block-wise expansion correction for the FPGA implementation: the index table is divided into a plurality of smaller blocks, each block is corrected independently, and after all blocks are corrected they are spliced and a complete corrected expansion diagram is output. The method can fully utilize the hardware characteristics of the FPGA and improve utilization efficiency.
Step S212, splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
Here, in this embodiment, the fixed-point processing of the coordinate points in the index table, the blocking of the index table, and the splicing of the M adjusted expanded images are optional; they serve to adapt to the hardware characteristics of the FPGA and make full use of its performance. Alternatively, remapping can be performed directly on a central processing unit using the index table, without fixed-point processing, index table blocking or image splicing. In addition, when the method is implemented on the FPGA, the fixed-point scheme is not limited to 4-bit and can be replaced by fixed-point schemes of other precisions, such as 8-bit and 16-bit; the block size of the index table is not limited to 32 × 32 pixels and can be any size, depending on the bandwidth capability of the hardware; in the block remapping step, the method is not limited to bilinear interpolation and can be replaced by nearest neighbor interpolation, bicubic interpolation, Lanczos interpolation and the like, depending on speed and precision requirements.
In this embodiment, the fisheye image can be corrected into a plurality of normal undistorted images; the images are corrected into images that accord with normal human observation habits without losing information. Every area of the fisheye image receives the same good correction, whereas previous methods usually achieve a good correction effect in the central area of the image but a poor one at the edges, with serious stretching. By generating the index table, space is traded for time, so the correction speed can be improved and real-time correction realized. With the index table fixed-point and blocking strategies, real-time correction can be carried out efficiently on FPGA hardware.
The embodiment of the application provides a fisheye image processing method, which is used for correcting and unfolding a fisheye image into a plurality of paths of undistorted images on a plurality of direction axes. By adopting the fisheye image processing method provided by the embodiment, the fisheye image is corrected and expanded into a perspective projection distortion-free image which accords with the observation habit of people, and the method has important practical significance for the wide application of fisheye cameras.
Fig. 3 is a schematic flowchart of a method for processing a fisheye image according to an embodiment of the present disclosure, and as shown in fig. 3, the method may be summarized as the following four steps:
Step S301, determining the expansion correction parameters.
The expansion correction parameters include the circle center coordinates of the fisheye image, the fisheye radius and the distortion coefficient. The circle center coordinates and the fisheye radius can be obtained by selecting three non-collinear points on the circular contour of the fisheye image to determine the circle equation of the fisheye image; the distortion correction coefficient can be obtained by calibration.
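The three-point circle determination mentioned here has a closed form. As an illustrative sketch (the function name is hypothetical), the circumcircle through three non-collinear points on the fisheye contour yields the center coordinates and the fisheye radius:

```python
import math

def circle_from_points(p1, p2, p3):
    # Circumcircle of three non-collinear points: solves the circle
    # equation through p1, p2, p3 and returns (center, radius).
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if d == 0:
        raise ValueError("points are collinear")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    return (ux, uy), math.hypot(x1 - ux, y1 - uy)
```

For instance, three rim points such as (5, 0), (0, 5) and (-5, 0) recover the center (0, 0) and radius 5.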
In step S302, a correction index table is determined offline.
Here, the rectification index table is applied to all images of the same fisheye camera, and therefore, in order to avoid repeated determination, the present embodiment adopts a mode of determining once off-line. In the embodiment, a fisheye camera spherical perspective projection model is established, and parameters of elevation angle, azimuth angle, visual angle and rotation angle of an expansion diagram are introduced, so that a correction index table can be determined. In order to obtain the multi-view expansion diagram, a plurality of sets of expansion parameters can be designed.
Step S303, dividing the correction index table into blocks on line, and performing expansion correction block by block.
Here, the expansion correction may be rapidly performed on line by a table look-up method according to the plurality of correction index tables determined in step S302. A method of block-wise expansion correction is provided for the FPGA implementation: the index table is divided into a plurality of smaller blocks, each block is corrected independently, and after all blocks are corrected they are spliced and a complete corrected expansion diagram is output. The method can fully utilize the hardware characteristics of the FPGA and improve utilization efficiency.
Step S304, outputting the multi-path corrected unfolded images.
Here, by configuring a plurality of sets of the spread parameters and determining a plurality of sets of the correction index tables, the multi-channel correction spread graph can be generated.
For the above steps, with respect to "step S302, off-line determination of the correction index table" and "step S303, on-line block spread correction", the explanation is given as follows:
first, in step S302, the process of determining the correction index table offline is as follows:
1. Establishing a spherical imaging model in which the fisheye image conforms to the equidistant projection principle, comprising the following steps:
1) As shown in fig. 4, a two-dimensional coordinate system xoy is established with the center O(x0, y0) of the fisheye image as the origin;

2) a camera coordinate system (xc, yc, zc) is established with the O point as the origin, where the xc and yc axes coincide with the x-axis and y-axis of the two-dimensional coordinate system respectively, and the zc axis coincides with the optical axis of the fisheye lens;

3) a hemisphere is made with the O point as the center and the radius R of the fisheye image as the radius; the hemisphere intersects the zc axis at the point O'(0, 0, R);
4) Suppose P is a point in the camera coordinate system; the ray OP intersects the hemisphere at P'(x_cp', y_cp', z_cp'), and the coordinates of this point P' in the camera coordinate system are:

(x_cp', y_cp', z_cp') = (R·sinθ·cosφ, R·sinθ·sinφ, R·cosθ) (1);
The included angle θ between OP and the zc axis is the incident angle; the angle φ of the projection OP'' of OP on the xoy plane is the azimuth angle (which can be understood as rotation in the horizontal plane), and the elevation angle of OP is β = 90° − θ. Then, according to the equidistant projection principle: |OP''| = k·θ, where k is the distortion correction coefficient.
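The two relations of step 1 — formula (1) for the hemisphere intersection point and the equidistant rule relating radial image distance to incident angle — can be sketched as follows (illustrative Python, hypothetical names):

```python
import math

def hemisphere_point(R, theta, phi):
    # Formula (1): intersection P' of the ray OP with the hemisphere,
    # for incident angle theta and azimuth phi.
    return (R * math.sin(theta) * math.cos(phi),
            R * math.sin(theta) * math.sin(phi),
            R * math.cos(theta))

def equidistant_radius(k, theta):
    # Equidistant projection: the radial distance on the fisheye image
    # grows linearly with the incident angle, r = k * theta.
    return k * theta
```

On the optical axis (theta = 0) the hemisphere point is (0, 0, R) and the image radius is 0, matching the point O' defined above.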
2. Establishing a perspective projection plane coordinate system and an unfolded image coordinate system, and solving the relationship between the two coordinate systems, wherein the method comprises the following steps:
1) The spherical tangent plane UwVw passing through P' is the plane where the expansion window coordinate system lies, and the view angle of the expansion window coordinate system is γ. The axis P'Vw is coplanar with OZ and OP, with its positive direction perpendicular to OP'; the angle between P'Vw and P'O' is denoted α. The positive direction of P'Uw is determined by the cross product of P'Vw and OP'. The azimuth of the tangent plane is defined as the azimuth of OP, and its elevation is the same as the elevation of OP: β = 90° − θ. That is, the tangent plane UwVw carries two corresponding coordinate systems: a perspective projection plane coordinate system and the two-dimensional coordinate system of the unfolded image. The perspective projection plane coordinate system takes P' as the origin and U and V as coordinate axes; it is a spatial coordinate system in units of the true dimension, meter (m). The unfolded image coordinate system lies in the UV plane of the perspective projection plane coordinate system, but in units of pixels. The two coordinate systems are in the same plane, their origins coincide and their axes coincide; the difference is their measurement units, one being the spatial true dimension (m) and the other the pixel. For example, assume the pixel size is w_pixel; if the coordinates of a point in the unfolded image coordinate system are (m, n), then its coordinates in the perspective projection plane coordinate system are (m·w_pixel, n·w_pixel); they differ by a factor, namely the true scale size of one pixel.
2) The two-dimensional coordinate system of the expanded image is U'V', and its width and height are w and h respectively.
3) Points on the perspective projection plane correspond one-to-one to pixels of the unfolded image, so the size of one pixel on the unfolded image corresponds to a size on the perspective projection plane, as shown in formula (2):

w_pixel = R × 2 × tan(β/2)/w (2);
Let the coordinates of P' on the UV plane be the origin (0, 0), and let the origin of the unfolded image be at the center of the image, so that the pixel coordinates on the corrected and unfolded image and the point coordinates on the perspective projection plane correspond one-to-one. Assuming the coordinates of a point on the unfolded image are Q(m, n), the corresponding coordinates on the perspective projection plane are Q'(m·w_pixel, n·w_pixel).
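A minimal sketch of formula (2) and the pixel-to-metric correspondence Q(m, n) → Q'(m·w_pixel, n·w_pixel) (illustrative only; the half-angle argument is passed in generically, since the text introduces both a view angle γ and an elevation β):

```python
import math

def pixel_size(R, angle_rad, w):
    # Formula (2): w_pixel = R * 2 * tan(angle/2) / w, with w the
    # width of the unfolded image in pixels.
    return 2.0 * R * math.tan(angle_rad / 2.0) / w

def to_projection_plane(m, n, w_pixel):
    # Q(m, n) in pixels -> Q'(m*w_pixel, n*w_pixel) in metric plane units.
    return (m * w_pixel, n * w_pixel)
```

With R = 1, a 90° angle and a 2-pixel-wide output, each pixel spans 1 m on the tangent plane.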
3. Establishing a relation between an expanded image coordinate system and a camera coordinate system, comprising the following steps:
1) Establishing the relation between the perspective projection plane coordinate system and the camera coordinate system: for every movement by w_pixel along the positive U direction on the perspective projection plane, the variations dxu, dyu, dzu in the camera coordinate system are:

dxu = -w_pixel·sinφ

dyu = w_pixel·cosφ

dzu = 0 (3);
Similarly, for every movement by w_pixel along the positive V direction on the perspective projection plane, the variations dxv, dyv, dzv in the camera coordinate system are:

dxv = w_pixel·cosθ·cosφ

dyv = w_pixel·cosθ·sinφ

dzv = w_pixel·sinθ (4);
2) The coordinates of the point Q'' in the camera coordinate system corresponding to the point Q(m, n) in the unfolded image are obtained as:

x = x_cp' + m·dxu + n·dxv = sinθ·cosφ - m·sinφ + n·cosθ·cosφ

y = y_cp' + m·dyu + n·dyv = sinθ·sinφ + m·cosφ + n·cosθ·sinφ

z = z_cp' + m·dzu + n·dzv = cosθ + n·sinθ (5);
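Formulas (3)–(5) combine into a direct computation of Q'' from Q(m, n). The sketch below is illustrative (hypothetical names) and follows formula (5) as printed, i.e. with R taken as 1 and the per-pixel step w_pixel absorbed into m and n:

```python
import math

def unfolded_to_camera(m, n, theta, phi):
    # Step increments per formulas (3) and (4), with w_pixel = 1 so the
    # result matches formula (5) term by term.
    dxu, dyu, dzu = -math.sin(phi), math.cos(phi), 0.0
    dxv = math.cos(theta) * math.cos(phi)
    dyv = math.cos(theta) * math.sin(phi)
    dzv = math.sin(theta)
    # Formula (5): start from P' = (sin t cos p, sin t sin p, cos t)
    # and accumulate m steps along U and n steps along V.
    x = math.sin(theta) * math.cos(phi) + m * dxu + n * dxv
    y = math.sin(theta) * math.sin(phi) + m * dyu + n * dyv
    z = math.cos(theta) + m * dzu + n * dzv
    return x, y, z
```

At the window center (m = n = 0) this returns P' itself, as expected.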
4. finding a relationship between the unfolded image coordinate system and the two-dimensional coordinate system, comprising the steps of:
the present embodiment has found the coordinates Q ″ of the coordinates of the point Q in the unfolded image in the camera coordinate system in the above steps. According to the equidistant projection principle, the imaging Q 'of Q' in the fisheye image can be obtained in the embodiment. The incident angle of Q' ″ is:
the azimuth angle is:
the length of Q' "to the center point is r ═ k α. The coordinates of Q' "are therefore:
xQ"'=rcosλ+x0
yQ"'=rsinλ+y0 (8);
Here x0, y0 are the center coordinates of the fisheye image, so any point Q(m, n) on the expanded image corresponds to the point Q'''(r·cosλ + x0, r·sinλ + y0) on the fisheye image. As shown in fig. 5A, the correspondence between the coordinates of a point in the fisheye image 51 and the coordinates of a point in the expanded image 52 satisfies the function (x, y) = f(u, v, φ, θ, γ, α); in the present embodiment, the correction index table includes two parts, a mapping table 53 and a mapping table 54, where the mapping table 53 stores the x-coordinate value on the fisheye image of the coordinates of a point on the expanded image, and the mapping table 54 stores the corresponding y-coordinate value.
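Putting the back-projection step together as a sketch (illustrative; the printed forms of formulas (6) and (7) are figures in the source, so the standard incident-angle and azimuth expressions are assumed here):

```python
import math

def camera_to_fisheye(x, y, z, k, x0, y0):
    # Incident angle of Q'' (assumed form of formula (6)).
    alpha = math.acos(z / math.sqrt(x * x + y * y + z * z))
    # Azimuth angle (assumed form of formula (7)); atan2 handles all quadrants.
    lam = math.atan2(y, x)
    # Equidistant model: distance to the image center is r = k * alpha,
    # then formula (8) gives the fisheye pixel coordinates.
    r = k * alpha
    return (r * math.cos(lam) + x0, r * math.sin(lam) + y0)
```

A point on the optical axis (0, 0, 1) maps to the image center (x0, y0), since its incident angle is zero.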
5. The establishment of the correction index table comprises the following steps:
the mapping relation between the two-dimensional coordinate system of the expanded image and the two-dimensional coordinate system is established through the steps as follows: (x, y) ═ f (u, v, Φ, θ, γ, α). Repeating the mapping relation for all points on the expansion map, the distortion correction index tables mapx and map can be established, wherein mapx stores the x coordinate of a point on the expansion correction map on the fisheye image, and map stores the y coordinate information of a point on the rotation-away correction map on the fisheye image.
In this step, establishing the mapping relationship between the two-dimensional coordinate system of the expanded image and the two-dimensional coordinate system of the fisheye image is essential to realizing the scheme. In the process of establishing the mapping relationship, the rotation angle parameter α is optional: this parameter defines the rotation angle of the output image, and an alternative is to perform the rotation operation after the correction.
In this step, using the mapping relationship to establish the index table is optional; the purpose is to accelerate the correction and reduce repeated operations. If time consumption is not a concern, an alternative is to correct each pixel directly using the mapping relationship, without establishing an index table.
Secondly, in step S303, the online block-by-block unfolding correction process is as follows:
The purpose of block correction is to reduce the hardware Input/Output (I/O) pressure on the FPGA and fully exploit its computational performance. Without blocking, the FPGA would need to load a whole fisheye image at once to perform correction, which puts great pressure on I/O and at the same time fails to make full use of hardware resources. Block correction comprises 4 steps: fixed-point conversion, index table blocking, block remapping and image splicing.
Step one, fixed-point processing: because floating-point operations cannot be performed on the FPGA, this embodiment converts the index table to fixed point. A 4-bit fixed-point scheme is adopted here, in which all values in the index table are right-shifted by 4 bits and rounded.
Step two, index table blocking: in order to perform block correction, this embodiment divides the index table into blocks. As shown in fig. 5B, taking an output resolution of 128 × 128 as an example, the x and y coordinate values on the fisheye image of the pixel at each position of the output image are recorded by mapx and mapy respectively. mapx and mapy are divided into 32 × 32 blocks, and the minimum and maximum values of x and y are counted in each block, so that the corresponding block on the fisheye image can be determined by (x_min, y_min, x_max, y_max). In addition, for the coordinate information of the blocks in the index tables mapx and mapy, x_min and y_min need to be uniformly subtracted to convert into the new fisheye image block coordinate system. As shown in fig. 5B, block 1 of the M corrected expanded images 501 is a block of size 32 × 32, i.e., containing 32 × 32 pixels. By looking up the index tables map_x 503 and map_y 504, where map_x 503 records the x coordinate value of the position of the current pixel on the fisheye image 502 and map_y 504 records the y coordinate value, the corresponding coordinate point (assumed to be position 1 in the fisheye image 502) can be found on the fisheye image, and the pixel value of that coordinate point is then used as the pixel value of the expanded pixel.
Step three, block remapping: a remapping operation is performed for each of the 32 × 32 blocks into which the index table is divided. That is, the pixel values in the fisheye image block are mapped into the output image block according to the information in the index tables mapx and mapy. In addition, since the coordinate positions given by mapx and mapy on the fisheye image are not necessarily integers, the target pixel needs to be obtained by interpolation; a bilinear interpolation method is used here.
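The per-block remapping with bilinear interpolation can be sketched as follows (illustrative pure Python, hypothetical names; src is one grayscale fisheye block, and mapx/mapy are the rebased block index tables):

```python
def remap_bilinear(src, mapx, mapy):
    # For every output pixel, look up its (possibly fractional) source
    # position and blend the four neighbouring source pixels.
    h_out, w_out = len(mapx), len(mapx[0])
    h_src, w_src = len(src), len(src[0])
    out = [[0.0] * w_out for _ in range(h_out)]
    for v in range(h_out):
        for u in range(w_out):
            x, y = mapx[v][u], mapy[v][u]
            x0, y0 = int(x), int(y)
            x1, y1 = min(x0 + 1, w_src - 1), min(y0 + 1, h_src - 1)
            fx, fy = x - x0, y - y0
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            out[v][u] = top * (1 - fy) + bot * fy
    return out
```

A source position of (0.5, 0.5) inside a 2 × 2 block, for example, yields the average of its four neighbours.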
Step four, image splicing: all output blocks obtained by block remapping are spliced and output as a 1-path corrected expansion diagram.
In this step, fixed-point processing, index table blocking and image splicing are optional; they serve to adapt to the hardware characteristics of the FPGA and make full use of its performance. The alternative is to perform remapping directly with the index table on a Central Processing Unit (CPU), without fixed-point processing, index table blocking or image splicing. In addition, when the method is implemented on the FPGA, the fixed-point scheme is not limited to 4-bit; it can be replaced by fixed-point schemes of other precisions such as 8-bit and 16-bit. The block size of the index table is not limited to 32 × 32 pixels and can be any other size, depending on the bandwidth capability of the hardware. In the block remapping step, the method is not limited to bilinear interpolation and can be replaced by nearest neighbor interpolation, bicubic interpolation, Lanczos interpolation and the like, depending on speed and precision requirements.
The embodiment of the present application provides a fisheye image processing apparatus, fig. 6 is a schematic diagram of a composition structure of the fisheye image processing apparatus in the embodiment of the present application, and as shown in fig. 6, the fisheye image processing apparatus 600 includes: a first determining means 601, a first unfolding means 602 and a first adjusting means 603, wherein:
the first determining device 601 is configured to determine a correction parameter of a fisheye image to be processed;
the first unfolding device 602 is configured to unfold the fisheye image into a corresponding unfolded image on a preset directional axis according to the correction parameter;
the first adjusting device 603 is configured to adjust the expanded image according to a preset mapping relationship and coordinates of pixel points in the fisheye image, so as to obtain an output image corresponding to the fisheye image; the preset mapping relation is used for indicating the corresponding relation between the coordinates of the pixel points in the fisheye image and the coordinates of the pixel points in the unfolded image.
In an embodiment of the present application, the correction parameter includes at least one of: circle center coordinates, fish eye radius and distortion coefficient.
In the embodiment of the present application, the mapping relationship includes a rectification index table having the same size as the expanded image;
correspondingly, the first adjusting device 603 includes:
and the first adjusting sub-device is used for adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
In an embodiment of the present application, the apparatus further includes:
the first establishing device is used for establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image;
second determining means for determining a second two-dimensional coordinate system of the expanded image and a perspective projection plane coordinate system on the same plane as the second two-dimensional coordinate system, based on the three-dimensional coordinate system;
third determining means for determining a first corresponding relationship between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system;
a fourth determining device, configured to determine a second corresponding relationship between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relationship;
fifth determining means, configured to determine the second corresponding relationship as the mapping relationship.
In an embodiment of the present application, the first establishing apparatus includes:
a first establishing sub-means for establishing the first two-dimensional coordinate system with a center O of the fisheye image as an origin;
and the second establishing sub-device is used for establishing the three-dimensional coordinate system by taking the center O of the fisheye image as an origin and taking the transverse axis and the longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as the transverse axis, the longitudinal axis and the third dimensional coordinate axis respectively.
In an embodiment of the present application, the second determining device includes:
the first determining sub-device is used for determining a spherical model by taking the center O of the fisheye image as a spherical center and the radius of the fisheye image as a spherical radius in the three-dimensional coordinate system;
the second determining sub-device is used for determining a tangent plane meeting a preset angle in the spherical model;
the third determining sub-device is used for determining the second two-dimensional coordinate system and a perspective projection plane coordinate system which is in the same plane as the second two-dimensional coordinate system according to the tangent plane; and the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system meet a preset corresponding relation.
In an embodiment of the present application, the third determining device includes:
the third determining sub-device is used for determining the coordinate variation quantity of the three-dimensional coordinate system corresponding to the movement of any point on the plane of the perspective projection plane coordinate system by a preset distance along a preset coordinate axis according to the perspective projection plane coordinate system;
and the fourth determining sub-device is used for determining a first corresponding relation between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the variable quantity, the second two-dimensional coordinate system and the three-dimensional coordinate system.
In an embodiment of the present application, the fourth determining device includes:
a fifth determining sub-device, configured to determine, according to the first corresponding relationship, a third corresponding relationship between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system;
and a sixth determining sub-device, configured to determine the second corresponding relationship according to the first corresponding relationship and the third corresponding relationship.
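For illustration, the third corresponding relation (from the three-dimensional coordinate system back to the first two-dimensional coordinate system of the fisheye image) may be sketched under an ASSUMED equidistant fisheye projection model r = f·θ; the embodiment does not fix a particular projection model, so the focal parameter f and the model itself are assumptions for illustration:

```python
import numpy as np

def point3d_to_fisheye(p, f, cx, cy):
    """Map a 3D point in the camera coordinate system (z = optical axis)
    to pixel coordinates on the fisheye image, assuming an equidistant
    projection model (radial distance proportional to incidence angle)."""
    x, y, z = p
    theta = np.arctan2(np.hypot(x, y), z)   # angle from the optical axis
    phi = np.arctan2(y, x)                  # azimuth around the axis
    r = f * theta                           # equidistant projection: r = f * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```

Composing this with the first corresponding relation (unfolded image → 3D) yields the second corresponding relation used as the mapping relation.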
In an embodiment of the present application, the first adjusting sub-apparatus includes:
a first dividing unit for dividing the correction index table into M blocks; M is an integer greater than 1;
a first determining unit configured to determine coordinates of the expanded image corresponding to the M blocks;
the first adjusting unit is used for adjusting the pixel value of the unfolded image according to the M blocks, the coordinates of the pixel points in the unfolded image and the coordinates of the pixel points in the fisheye image to obtain M adjusted unfolded images;
and the first splicing unit is used for splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
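A minimal sketch of the blocked adjustment and splicing described above, assuming the correction index table is stored as two integer index arrays (index_x, index_y) of the same size as the unfolded image, and using nearest-neighbour lookup (the block layout and interpolation are illustrative assumptions):

```python
import numpy as np

def remap_by_blocks(fisheye, index_x, index_y, block_h, block_w):
    """Split the correction index table into M blocks, remap each block of
    the unfolded image from the fisheye image independently, then stitch
    the adjusted blocks back together into the output image."""
    H, W = index_x.shape
    out = np.empty((H, W) + fisheye.shape[2:], dtype=fisheye.dtype)
    for top in range(0, H, block_h):          # iterate over the M blocks
        for left in range(0, W, block_w):
            ys = slice(top, min(top + block_h, H))
            xs = slice(left, min(left + block_w, W))
            # For each destination pixel of this block, look up its source
            # coordinates in the fisheye image and copy the pixel value.
            out[ys, xs] = fisheye[index_y[ys, xs], index_x[ys, xs]]
    return out
```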
In an embodiment of the present application, the first dividing unit includes:
a first dividing subunit, configured to divide the correction index table into M original blocks according to a preset resolution;
a first determining subunit, configured to determine a source abscissa value and a source ordinate value included in any one of the M original blocks;
a first updating subunit, configured to subtract, from the source coordinates of each original block, the minimum source abscissa value and the minimum source ordinate value contained in that block, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate on the fisheye image contained in that original block.
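The per-block index update may be sketched as follows; subtracting the block's minimum source abscissa and ordinate turns the indices into offsets within a small source patch, which is what allows each block to be processed independently (e.g. against a patch loaded into fast local memory — a presumed motivation, not stated verbatim above):

```python
import numpy as np

def normalize_block_indices(src_x, src_y):
    """Given the source abscissa/ordinate arrays of one original block of
    the correction index table, subtract the block's minima so the updated
    source coordinate values index into a patch starting at (min_x, min_y)
    of the fisheye image."""
    min_x, min_y = src_x.min(), src_y.min()
    return src_x - min_x, src_y - min_y, (min_x, min_y)
```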
In an embodiment of the present application, the first adjusting unit includes:
the first searching subunit is configured to search, in the mth block, a source coordinate value corresponding to a coordinate value of the expanded image according to the coordinate value of the expanded image corresponding to the mth block in the M blocks; wherein M is an integer greater than 0 and less than or equal to M;
the second determining subunit is used for determining a first pixel value corresponding to the source coordinate value on the fisheye image;
a first replacing subunit, configured to replace a second pixel value of the expanded image with the first pixel value, to obtain an mth adjusted expanded image in the M adjusted expanded images; wherein the second pixel value is a pixel value of the expanded image at the coordinate value of the expanded image.
In the embodiment of the present application, the correction index table is a fixed-point two-dimensional table.
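As an illustration of a fixed-point table entry (the exact Q-format and number of fractional bits are assumptions; the embodiment only states that the correction index table is a fixed-point two-dimensional table), a fractional source coordinate can be stored as an integer:

```python
def to_fixed_point(value, frac_bits=4):
    """Encode a fractional source coordinate as a fixed-point integer with
    `frac_bits` fractional bits (assumed Q-format, for illustration only)."""
    return int(round(value * (1 << frac_bits)))

def from_fixed_point(fixed, frac_bits=4):
    """Decode a fixed-point table entry back to a fractional coordinate."""
    return fixed / (1 << frac_bits)
```

Storing fixed-point rather than floating-point entries keeps the two-dimensional table compact and amenable to integer-only hardware.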
It should be noted that the above description of the apparatus embodiments is similar to the above description of the method embodiments, and the apparatus embodiments have beneficial effects similar to those of the method embodiments. For technical details not disclosed in the apparatus embodiments of the present application, reference is made to the description of the method embodiments of the present application.
In the embodiment of the present application, if the above fisheye image processing method is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may essentially, or in part, be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling a computer device (which may be a terminal, a server, etc.) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Accordingly, an embodiment of the present application further provides a computer program product, where the computer program product includes computer-executable instructions that, when executed, implement the steps of the fisheye image processing method provided in the embodiments of the present application.
Accordingly, an embodiment of the present application further provides a computer storage medium, where computer-executable instructions are stored on the computer storage medium, and when executed by a processor, the computer-executable instructions implement the steps of the fisheye image processing method provided in the foregoing embodiment.
Accordingly, an embodiment of the present application provides a computer device. Fig. 7 is a schematic structural diagram of the computer device according to the embodiment of the present application. As shown in Fig. 7, the hardware entity of the computer device 700 includes: a processor 701, a communication interface 702, and a memory 703, wherein:
the processor 701 generally controls the overall operation of the computer device 700.
The communication interface 702 may enable the computer device to communicate with other terminals or servers via a network.
The Memory 703 is configured to store instructions and applications executable by the processor 701, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 701 and modules in the computer device 700, and may be implemented by a FLASH Memory (FLASH) or a Random Access Memory (RAM).
The above description of the computer device and storage medium embodiments is similar to the description of the method embodiments above, and these embodiments have beneficial effects similar to those of the method embodiments. For technical details not disclosed in the computer device and storage medium embodiments of the present application, reference is made to the description of the method embodiments of the present application.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element identified by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be completed by program instructions executed on related hardware; the foregoing program may be stored in a computer-readable storage medium, and the program, when executed, performs the steps of the above method embodiments; and the aforementioned storage medium includes: various media that can store program code, such as a removable storage device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof contributing to the prior art may be embodied in the form of a software product stored in a storage medium, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (24)
1. A method for processing a fisheye image, the method comprising:
determining correction parameters of a fisheye image to be processed;
unfolding, according to the correction parameters, the fisheye image into a corresponding unfolded image on a preset direction axis;
adjusting the unfolded image according to a preset mapping relation and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image; the preset mapping relation is used for indicating the corresponding relation between the coordinates of the pixel points in the fisheye image and the coordinates of the pixel points in the unfolded image;
the mapping relation comprises a correction index table with the same size as the unfolded image;
correspondingly, the adjusting the unfolded image according to a preset mapping relation and coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image includes:
and adjusting the unfolded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
2. The method of claim 1, wherein the correction parameters include at least one of: circle center coordinates, fisheye radius and distortion coefficient.
3. The method of claim 1, further comprising:
establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image;
according to the three-dimensional coordinate system, determining a second two-dimensional coordinate system of the unfolded image and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system;
determining a first corresponding relation between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system and the three-dimensional coordinate system;
determining a second corresponding relation between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relation;
and determining the second corresponding relation as the mapping relation.
4. The method of claim 3, wherein establishing the first two-dimensional coordinate system of the fisheye image and the three-dimensional coordinate system of the fisheye camera used to capture the fisheye image comprises:
establishing the first two-dimensional coordinate system by taking the center O of the fisheye image as an origin;
and establishing the three-dimensional coordinate system by taking the center O of the fisheye image as an origin and taking the transverse axis and the longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as the transverse axis, the longitudinal axis and the third coordinate axis respectively.
5. The method of claim 3, wherein determining a second two-dimensional coordinate system of the unfolded image and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system from the three-dimensional coordinate system comprises:
in the three-dimensional coordinate system, determining a sphere model by taking the center O of the fisheye image as a sphere center and the radius of the fisheye image as a sphere radius;
determining a tangent plane meeting a preset angle in the spherical model;
determining the second two-dimensional coordinate system and a perspective projection plane coordinate system in the same plane as the second two-dimensional coordinate system according to the tangent plane; and the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system meet a preset corresponding relation.
6. The method of claim 3, wherein determining a first correspondence between the second two-dimensional coordinate system and the three-dimensional coordinate system from the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system comprises:
according to the perspective projection plane coordinate system, determining the variation of the coordinates in the three-dimensional coordinate system that corresponds to moving any point on the plane of the perspective projection plane coordinate system by a preset distance along a preset coordinate axis;
and determining a first corresponding relation between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the variation, the second two-dimensional coordinate system and the three-dimensional coordinate system.
7. The method of claim 6, wherein determining a second correspondence between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first correspondence comprises:
determining a third corresponding relation between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the first corresponding relation;
and determining the second corresponding relation according to the first corresponding relation and the third corresponding relation.
8. The method of claim 1, wherein the adjusting the expanded image according to the correction index table and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image comprises:
dividing the correction index table into M blocks; M is an integer greater than 1;
determining coordinates of the expanded image corresponding to the M blocks;
adjusting the pixel value of the expanded image according to the M blocks, the coordinates of the pixel points in the expanded image and the coordinates of the pixel points in the fisheye image to obtain M adjusted expanded images;
and splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
9. The method of claim 8, wherein the dividing the correction index table into M blocks comprises:
dividing the correction index table into M original blocks according to a preset resolution;
determining a source abscissa value and a source ordinate value contained in any one of the M original blocks;
subtracting, from the source coordinates of each original block, the minimum source abscissa value and the minimum source ordinate value contained in that block, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate on the fisheye image contained in that original block.
10. The method of claim 8, wherein adjusting the pixel values of the expanded image according to the M blocks, the coordinates of the pixels in the expanded image, and the coordinates of the pixels in the fisheye image to obtain M adjusted expanded images comprises:
searching a source coordinate value corresponding to the coordinate value of the expanded image in the mth block according to the coordinate value of the expanded image corresponding to the mth block in the M blocks; wherein M is an integer greater than 0 and less than or equal to M;
determining a first pixel value corresponding to the source coordinate value on the fisheye image;
replacing a second pixel value of the expanded image with the first pixel value to obtain an mth adjusted expanded image in the M adjusted expanded images; wherein the second pixel value is a pixel value of the expanded image at the coordinate value of the expanded image.
11. The method of any one of claims 8 to 10, wherein the correction index table is a fixed-point two-dimensional table.
12. An apparatus for processing a fisheye image, the apparatus comprising: first determining means, first unfolding means and first adjusting means, wherein:
the first determining device is used for determining the correction parameters of the fisheye image to be processed;
the first unfolding device is used for unfolding the fisheye image into a corresponding unfolded image on a preset direction axis according to the correction parameters;
the first adjusting device is used for adjusting the expanded image according to a preset mapping relation and coordinates of pixel points in the fisheye image to obtain an output image corresponding to the fisheye image; the preset mapping relation is used for indicating the corresponding relation between the coordinates of the pixel points in the fisheye image and the coordinates of the pixel points in the unfolded image;
the mapping relation comprises a correction index table with the same size as the unfolded image;
correspondingly, the first adjusting device comprises:
and the first adjusting sub-device is used for adjusting the expanded image according to the correction index table and the coordinates of the pixel points in the fisheye image to obtain an output image corresponding to the fisheye image.
13. The apparatus of claim 12, wherein the correction parameters include at least one of: circle center coordinates, fisheye radius and distortion coefficient.
14. The apparatus of claim 12, further comprising:
the first establishing device is used for establishing a first two-dimensional coordinate system of the fisheye image and a three-dimensional coordinate system of a fisheye camera for collecting the fisheye image;
second determining means for determining a second two-dimensional coordinate system of the expanded image and a perspective projection plane coordinate system on the same plane as the second two-dimensional coordinate system, based on the three-dimensional coordinate system;
third determining means for determining a first corresponding relationship between the second two-dimensional coordinate system and the three-dimensional coordinate system according to the perspective projection plane coordinate system, the second two-dimensional coordinate system, and the three-dimensional coordinate system;
a fourth determining device, configured to determine a second corresponding relationship between the second two-dimensional coordinate system and the first two-dimensional coordinate system according to the first corresponding relationship;
fifth determining means, configured to determine the second corresponding relationship as the mapping relationship.
15. The apparatus of claim 14, wherein the first establishing means comprises:
a first establishing sub-means for establishing the first two-dimensional coordinate system with a center O of the fisheye image as an origin;
and the second establishing sub-device is used for establishing the three-dimensional coordinate system by taking the center O of the fisheye image as an origin, and taking the transverse axis and the longitudinal axis of the first two-dimensional coordinate system and the optical axis of the lens of the fisheye camera as the transverse axis, the longitudinal axis and the third coordinate axis, respectively.
16. The apparatus of claim 14, wherein the second determining means comprises:
the first determining sub-device is used for determining a spherical model by taking the center O of the fisheye image as a spherical center and the radius of the fisheye image as a spherical radius in the three-dimensional coordinate system;
the second determining sub-device is used for determining a tangent plane meeting a preset angle in the spherical model;
the third determining sub-device is used for determining the second two-dimensional coordinate system and a perspective projection plane coordinate system which is in the same plane as the second two-dimensional coordinate system according to the tangent plane; and the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the perspective projection plane coordinate system meet a preset corresponding relation.
17. The apparatus of claim 14, wherein the third determining means comprises:
the third determining sub-device is used for determining, according to the perspective projection plane coordinate system, the variation of the coordinates in the three-dimensional coordinate system that corresponds to moving any point on the plane of the perspective projection plane coordinate system by a preset distance along a preset coordinate axis;
and the fourth determining sub-device is used for determining a first corresponding relation between the coordinates of each point in the second two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system according to the variation, the second two-dimensional coordinate system and the three-dimensional coordinate system.
18. The apparatus of claim 17, wherein the fourth determining means comprises:
a fifth determining sub-device, configured to determine, according to the first corresponding relationship, a third corresponding relationship between the coordinates of each point in the first two-dimensional coordinate system and the coordinates of each point in the three-dimensional coordinate system;
and a sixth determining sub-device, configured to determine the second corresponding relationship according to the first corresponding relationship and the third corresponding relationship.
19. The apparatus of claim 12, wherein the first adjustment sub-apparatus comprises:
a first dividing unit for dividing the correction index table into M blocks; M is an integer greater than 1;
a first determining unit configured to determine coordinates of the expanded image corresponding to the M blocks;
the first adjusting unit is used for adjusting the pixel value of the expanded image according to the M blocks, the coordinates of the pixel points in the expanded image and the coordinates of the pixel points in the fisheye image to obtain M adjusted expanded images;
and the first splicing unit is used for splicing the M adjusted unfolded images to obtain a normal image corresponding to the fisheye image.
20. The apparatus of claim 19, wherein the first dividing unit comprises:
a first dividing subunit, configured to divide the correction index table into M original blocks according to a preset resolution;
a first determining subunit, configured to determine a source abscissa value and a source ordinate value included in any one of the M original blocks;
a first updating subunit, configured to subtract, from the source coordinates of each original block, the minimum source abscissa value and the minimum source ordinate value contained in that block, to obtain the M blocks containing updated source coordinate values; the source abscissa and ordinate contained in any original block are the abscissa and ordinate on the fisheye image contained in that original block.
21. The apparatus of claim 19, wherein the first adjusting unit comprises:
the first searching subunit is configured to search, in the mth block, a source coordinate value corresponding to a coordinate value of the expanded image according to the coordinate value of the expanded image corresponding to the mth block in the M blocks; wherein M is an integer greater than 0 and less than or equal to M;
the second determining subunit is used for determining a first pixel value corresponding to the source coordinate value on the fisheye image;
a first replacing subunit, configured to replace a second pixel value of the expanded image with the first pixel value, to obtain an mth adjusted expanded image in the M adjusted expanded images; wherein the second pixel value is a pixel value of the expanded image at a coordinate value of the expanded image.
22. The apparatus of any one of claims 19 to 21, wherein the correction index table is a fixed-point two-dimensional table.
23. A computer storage medium having computer-executable instructions stored thereon that, when executed, perform the method steps of any of claims 1 to 11.
24. A computer device comprising a memory and a processor, wherein the memory stores computer-executable instructions, and the processor is configured to perform the method steps of any of claims 1 to 11 when executing the computer-executable instructions stored on the memory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810935885.2A CN109308686B (en) | 2018-08-16 | 2018-08-16 | Fisheye image processing method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810935885.2A CN109308686B (en) | 2018-08-16 | 2018-08-16 | Fisheye image processing method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109308686A CN109308686A (en) | 2019-02-05 |
CN109308686B true CN109308686B (en) | 2022-06-24 |
Family
ID=65224109
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810935885.2A Active CN109308686B (en) | 2018-08-16 | 2018-08-16 | Fisheye image processing method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109308686B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111612812B (en) * | 2019-02-22 | 2023-11-03 | 富士通株式会社 | Target object detection method, detection device and electronic equipment |
CN112132740B (en) * | 2019-06-25 | 2023-08-25 | 杭州海康威视数字技术股份有限公司 | Video image display method, device and system |
CN110458776B (en) * | 2019-07-31 | 2023-05-19 | 深圳市德赛微电子技术有限公司 | Customizable real-time video correction method and electronic equipment thereof |
CN110599427A (en) * | 2019-09-20 | 2019-12-20 | 普联技术有限公司 | Fisheye image correction method and device and terminal equipment |
CN112541861B (en) * | 2019-09-23 | 2024-05-24 | 华为技术有限公司 | Image processing method, device, equipment and computer storage medium |
CN110728622B (en) * | 2019-10-22 | 2023-04-25 | 珠海研果科技有限公司 | Fisheye image processing method, device, electronic equipment and computer readable medium |
CN111277911B (en) * | 2020-01-10 | 2021-10-15 | 聚好看科技股份有限公司 | Image processing method of panoramic video, display device and server |
CN113763530B (en) * | 2020-06-05 | 2024-04-26 | 杭州海康威视数字技术股份有限公司 | Image processing method, device, computing equipment and storage medium |
CN111861904A (en) * | 2020-06-16 | 2020-10-30 | 浙江大华技术股份有限公司 | Equal-proportion fisheye correction method and device, computer equipment and readable storage medium |
CN112530173A (en) * | 2020-12-03 | 2021-03-19 | 北京百度网讯科技有限公司 | Roadside sensing method and device, electronic equipment, storage medium and roadside equipment |
CN114827385A (en) * | 2021-01-18 | 2022-07-29 | 北京猎户星空科技有限公司 | Image processing method and device and electronic equipment |
CN113132708B (en) * | 2021-04-22 | 2022-02-22 | 贝壳找房(北京)科技有限公司 | Method and apparatus for acquiring three-dimensional scene image using fisheye camera, device and medium |
CN114648458A (en) * | 2022-03-24 | 2022-06-21 | 北京理工大学 | Fisheye image correction method and device, electronic equipment and storage medium |
CN116245748B (en) * | 2022-12-23 | 2024-04-26 | 珠海视熙科技有限公司 | Distortion correction method, device, equipment, system and storage medium for ring-looking lens |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101814181A (en) * | 2010-03-17 | 2010-08-25 | 天津理工大学 | Unfolding method for restoration of fisheye image |
CN105554478A (en) * | 2016-02-24 | 2016-05-04 | 深圳市车米云图科技有限公司 | Video monitoring system and method |
CN105744229A (en) * | 2016-02-25 | 2016-07-06 | 江苏科技大学 | Unmanned ship automatic anchoring system and working method thereof based on integration of infrared and panoramic technologies |
US9639935B1 (en) * | 2016-05-25 | 2017-05-02 | Gopro, Inc. | Apparatus and methods for camera alignment model calibration |
CN106780389A (en) * | 2016-12-23 | 2017-05-31 | 浙江宇视科技有限公司 | Fisheye image correction method and device based on coordinate conversion |
CN106780374A (en) * | 2016-12-01 | 2017-05-31 | 哈尔滨工业大学 | Fisheye image distortion correction method based on a fisheye imaging model |
CN106791762A (en) * | 2016-11-21 | 2017-05-31 | 深圳岚锋创视网络科技有限公司 | Stereo image processing method and system |
CN106875339A (en) * | 2017-02-22 | 2017-06-20 | 长沙全度影像科技有限公司 | Fisheye image stitching method based on a strip calibration board |
CN106981050A (en) * | 2016-01-18 | 2017-07-25 | 深圳岚锋创视网络科技有限公司 | Method and apparatus for correcting images captured by a fisheye lens |
CN107346530A (en) * | 2016-05-06 | 2017-11-14 | 完美幻境(北京)科技有限公司 | Projection method and system for correcting fisheye images |
CN107563959A (en) * | 2017-08-30 | 2018-01-09 | 北京林业大学 | Panorama generation method and device |
CN107610045A (en) * | 2017-09-20 | 2018-01-19 | 北京维境视讯信息技术有限公司 | Luminance compensation method, device, equipment and storage medium for fisheye image stitching |
CN107749050A (en) * | 2017-09-30 | 2018-03-02 | 珠海市杰理科技股份有限公司 | Fisheye image correction method, device and computer equipment |
EP3318469A1 (en) * | 2016-11-02 | 2018-05-09 | LG Electronics Inc. | Apparatus for providing around view image, and vehicle |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3239163A1 (en) * | 2015-04-01 | 2016-10-06 | Owl Labs, Inc. | Compositing and scaling angularly separated sub-scenes |
US10021299B2 (en) * | 2016-05-31 | 2018-07-10 | Tower Spring Global Limited | System and method for image stitching |
CN106600549A (en) * | 2016-11-16 | 2017-04-26 | 深圳六滴科技有限公司 | Method and device for correcting fisheye image |
- 2018
- 2018-08-16: CN application CN201810935885.2A, granted as patent CN109308686B (en), status: Active
Non-Patent Citations (3)
Title |
---|
DUAL FISHEYE LENS STITCHING FOR 360 DEGREE IMAGING; Tuan Ho et al.; 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2017-06-19; 2172-2176 *
Dual-fisheye lens stitching and error correction; Guangyao Ni et al.; 2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI); 2018-02-27; 1-6 *
Stereo matching of fisheye images based on MSCR and ASIFT; Zhu Junchao et al.; Process Automation Instrumentation; 2018-02-20; Vol. 39, No. 2; 81-85 *
Also Published As
Publication number | Publication date |
---|---|
CN109308686A (en) | 2019-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109308686B (en) | Fisheye image processing method, device, equipment and storage medium | |
CN109618090B (en) | Method and system for image distortion correction of images captured using wide angle lenses | |
CN110663245B (en) | Apparatus and method for storing overlapping regions of imaging data to produce an optimized stitched image | |
JP5437311B2 (en) | Image correction method, image correction system, angle estimation method, and angle estimation device | |
WO2019024935A1 (en) | Panoramic image generation method and device | |
US9436973B2 (en) | Coordinate computation device and method, and an image processing device and method | |
CN104778656B (en) | Fisheye image correcting method based on spherical perspective projection | |
CN110868541B (en) | Visual field fusion method and device, storage medium and terminal | |
CN103839227B (en) | Fisheye image correcting method and device | |
CN110956583B (en) | Spherical image processing method and device and server | |
TWI811386B (en) | Application processor | |
CN113643414B (en) | Three-dimensional image generation method and device, electronic equipment and storage medium | |
CN112686824A (en) | Image correction method, image correction device, electronic equipment and computer readable medium | |
CN112215880B (en) | Image depth estimation method and device, electronic equipment and storage medium | |
CN111161138B (en) | Target detection method, device, equipment and medium for two-dimensional panoramic image | |
CN113436269B (en) | Image dense stereo matching method, device and computer equipment | |
CN111294580A (en) | Camera video projection method, device and equipment based on GPU and storage medium | |
CN114511447A (en) | Image processing method, device, equipment and computer storage medium | |
CN114648458A (en) | Fisheye image correction method and device, electronic equipment and storage medium | |
CN112565730B (en) | Road side sensing method and device, electronic equipment, storage medium and road side equipment | |
CN111091117B (en) | Target detection method, device, equipment and medium for two-dimensional panoramic image | |
CN107346530B (en) | Projection method and system for correcting fisheye image | |
Zhang et al. | Fisheye lens distortion correction based on an ellipsoidal function model | |
JP2001005956A (en) | Wide field camera device | |
EP3766046A1 (en) | Camera calibration and/or use of a calibrated camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||