CN113888401B - Image conversion method and device - Google Patents

Image conversion method and device

Info

Publication number
CN113888401B
CN113888401B (application number CN202111496232.7A)
Authority
CN
China
Prior art keywords
pixel point
dimensional
distance
determining
fisheye image
Prior art date
Legal status
Active
Application number
CN202111496232.7A
Other languages
Chinese (zh)
Other versions
CN113888401A (en)
Inventor
蔡都
刘国清
杨广
王启程
郑伟
俞吉
周星星
徐希楠
Current Assignee
Shenzhen Youjia Innovation Technology Co.,Ltd.
Original Assignee
Shenzhen Minieye Innovation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Minieye Innovation Technology Co Ltd filed Critical Shenzhen Minieye Innovation Technology Co Ltd
Priority to CN202111496232.7A
Publication of CN113888401A
Application granted
Publication of CN113888401B
Legal status: Active
Anticipated expiration

Classifications

    • G06T3/047
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Abstract

The invention discloses an image conversion method and device, and relates to the technical field of computer vision. The specific scheme comprises the following steps: the computer equipment determines, through a preset three-dimensional rendering target in a virtual three-dimensional space, a three-dimensional map satisfying the Mercator projection principle; determines the position of a pixel point in the three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map; determines the distortion characteristic of the pixel point according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient; determines the position of each pixel point in the fisheye image according to the distortion characteristic, the preset size of the fisheye image and the position of each pixel point in the three-dimensional map; and finally colors each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map. The invention can obtain a fisheye image which conforms to the equidistant projection model, and improves the realism of fisheye images obtained from a virtual three-dimensional space.

Description

Image conversion method and device
Technical Field
The invention relates to the technical field of computer vision, in particular to an image conversion method and device.
Background
In the technical field of computer vision, algorithm training often requires fisheye images shot under a large number of different scene conditions or different fisheye camera parameters, and the fisheye images cannot be quickly and conveniently obtained in reality.
However, in the prior art a fisheye image can only be obtained by distorting the pixels far from the lens center; such an image is merely a visual approximation of a fisheye image, does not conform to the equidistant projection model, cannot be undistorted, and has low realism.
Disclosure of Invention
The invention provides an image conversion method and device, which solve the problem of low realism of fisheye images obtained from a virtual three-dimensional space.
In order to achieve the purpose, the invention adopts the following technical scheme:
in a first aspect, the present invention provides an image conversion method, comprising:
determining a three-dimensional map of a three-dimensional rendering target according to the preset three-dimensional rendering target in the virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle;
determining the position of a pixel point in a three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map;
determining distortion characteristics of the pixel points according to the positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point from the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image;
determining the position of each pixel point in the fisheye image according to the distortion characteristics, the preset size of the fisheye image and the position of each pixel point in the three-dimensional map;
and coloring each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map.
In combination with the first aspect, in a possible implementation manner, the position of the pixel point in the three-dimensional space is a three-dimensional coordinate of the pixel point in a three-dimensional coordinate system, an origin of the three-dimensional coordinate system is a spherical center of a projection spherical surface of the fisheye image, a Z axis of the three-dimensional coordinate system is perpendicular to the fisheye image, and distortion characteristics of the pixel point are determined according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient, including: determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain a first distance; determining a second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient; and determining the distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, determining the second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient includes: determining the radian of the polar coordinate position of the pixel point on the fisheye image according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system; and substituting the radian of the polar coordinates and a preset fisheye camera distortion coefficient into a projection model formula to obtain a second distance.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, the determining, by using the size of the three-dimensional map and the position of any one pixel point in the three-dimensional map, the position of the pixel point in the three-dimensional space includes: determining the longitude direction radian and the latitude direction radian of a pixel point in a three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map; and determining the position of the pixel point in the three-dimensional space according to the longitude direction radian and the latitude direction radian.
With reference to the first aspect and the foregoing possible implementation manners, in another possible implementation manner, determining a three-dimensional map of a three-dimensional rendering target according to a three-dimensional rendering target preset in a virtual three-dimensional space specifically includes: acquiring a three-dimensional map of the three-dimensional rendering target according to a virtual camera preset in the virtual three-dimensional space; the parameter settings of the virtual camera correspond to the three-dimensional rendering target.
With reference to the first aspect and the possible implementation manners described above, in another possible implementation manner, each pixel point in the fisheye image is colored to obtain a fisheye image corresponding to the three-dimensional map, which specifically includes: determining the color at each pixel point in the three-dimensional map as the color at the corresponding position in the fisheye image.
In a second aspect, the present invention provides an image conversion device including:
the first determining module is used for determining a three-dimensional map of a three-dimensional rendering target according to the three-dimensional rendering target preset in the virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle;
the second determining module is used for determining the position of a pixel point in the three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map;
the third determining module is used for determining distortion characteristics of the pixel points according to the positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point from the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image;
the positioning module is used for determining the position of each pixel point in the fisheye image according to the distortion characteristics, the preset size of the fisheye image and the position of each pixel point in the three-dimensional map;
and the coloring module is used for coloring each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map.
In combination with the second aspect, in a possible implementation manner, when the position of the pixel point in the three-dimensional space is a three-dimensional coordinate of the pixel point in a three-dimensional coordinate system, an origin of the three-dimensional coordinate system is a sphere center of a projection spherical surface of the fisheye image, and a Z axis of the three-dimensional coordinate system is perpendicular to the fisheye image, the third determining module is specifically configured to: determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain a first distance; determining a second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient; and determining the distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, the third determining module is specifically configured to: determining the radian of the polar coordinate position of the pixel point on the fisheye image according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system; and substituting the radian of the polar coordinates and a preset fisheye camera distortion coefficient into a projection model formula to obtain a second distance.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, when the size of the three-dimensional map includes a width and a height of the three-dimensional map, the second determining module is specifically configured to: determining the longitude direction radian and the latitude direction radian of a pixel point in a three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map; and determining the position of the pixel point in the three-dimensional space according to the longitude direction radian and the latitude direction radian.
With reference to the second aspect and the foregoing possible implementation manners, in another possible implementation manner, the first determining module is specifically configured to: acquire a three-dimensional map of the three-dimensional rendering target according to a virtual camera preset in the virtual three-dimensional space; the parameter settings of the virtual camera correspond to the three-dimensional rendering target.
With reference to the second aspect and the foregoing possible implementations, in another possible implementation, the coloring module is specifically configured to: determine the color at each pixel point in the three-dimensional map as the color at the corresponding position in the fisheye image.
In a third aspect, the present invention provides a computer apparatus comprising: a processor and a memory. The memory is for storing computer program code, the computer program code including computer instructions. When the processor executes the computer instructions, the computer device performs the image conversion method as described in the first aspect and any one of its possible implementations.
In a fourth aspect, the present invention provides a computer-readable storage medium having stored thereon computer instructions which, when run on a computer device, cause the computer device to perform an image conversion method as in the first aspect or any one of the possible implementations of the first aspect.
According to the image conversion method and device provided by the embodiment of the invention, the computer equipment determines, through a preset three-dimensional rendering target in a virtual three-dimensional space, a three-dimensional map satisfying the Mercator projection principle; determines the position of a pixel point in the three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map; determines the distortion characteristic of the pixel point according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient; determines the position of each pixel point in the fisheye image according to the distortion characteristic, the size of the fisheye image and the position of each pixel point in the three-dimensional map; and finally colors each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map. The distortion characteristic is used for representing the distortion degree of a second distance relative to a first distance, the first distance is the distance of a pixel point from the center point of the fisheye image in the fisheye image plane, and the second distance is the distance of the pixel point from the center point of the spherical surface in the projection sphere of the fisheye image. The distance between a pixel point and the center of the picture in the fisheye image is in direct proportion to the angle between the pixel point and the optical axis in space, so the obtained fisheye image conforms to the equidistant projection model, and the realism of the fisheye image obtained from the virtual three-dimensional space is improved.
Drawings
Fig. 1 is an application scenario diagram of an image conversion method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of an image conversion method according to an embodiment of the present invention;
FIG. 3 is a schematic view of a three-dimensional map according to an embodiment of the present invention;
FIG. 4 is a schematic view of the Mercator projection principle according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a fisheye image equidistant projection imaging model provided by an embodiment of the invention;
fig. 6 is a schematic diagram of a fisheye image conforming to the equidistant projection model according to an embodiment of the invention;
fig. 7 is a schematic structural diagram of an image conversion apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present disclosure, "a plurality" means two or more unless otherwise specified.
Additionally, the use of "based on" or "according to" means open and inclusive, as a process, step, calculation, or other action that is "based on" or "according to" one or more stated conditions or values may in practice be based on additional conditions or on values beyond those stated.
In order to solve the problem in the prior art of the low realism of fisheye images acquired from a virtual three-dimensional space, the embodiment of the invention provides an image conversion method and device. The computer equipment determines, through a preset three-dimensional rendering target in the virtual three-dimensional space, a three-dimensional map satisfying the Mercator projection principle; determines the position of a pixel point in the three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map; determines the distortion characteristic of the pixel point according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient; determines the position of each pixel point in the fisheye image according to the distortion characteristic, the size of the fisheye image and the position of each pixel point in the three-dimensional map; and finally colors each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map. The distortion characteristic is used for representing the distortion degree of a second distance relative to a first distance, the first distance is the distance of a pixel point from the center point of the fisheye image in the fisheye image plane, and the second distance is the distance of the pixel point from the center point of the spherical surface in the projection sphere of the fisheye image.
Therefore, a fisheye image conforming to the equidistant projection model can be obtained, and the realism of the fisheye image obtained from the virtual three-dimensional space is improved.
Fig. 1 is an application scenario diagram of the image conversion method provided by an embodiment of the present invention. Fig. 1 shows a visualization of a virtual three-dimensional space built in a computer device. The scene in the virtual three-dimensional space can be laid out freely according to user requirements; a corresponding three-dimensional map is obtained by capturing the three-dimensional scene with something analogous to a virtual camera set in the virtual three-dimensional space, and the three-dimensional map is converted into a fisheye image by the image conversion method provided by the invention. The virtual three-dimensional space can be built with a game development engine, such as Unreal Engine 4.
The execution subject of the image conversion method provided by the embodiment of the invention is a computer device. The computer device may be a terminal, a server, or a server cluster.
Based on the above description of the structure of the computer device, an embodiment of the present invention provides an image conversion method. As shown in fig. 2, the image conversion method may include the following steps S201 to S205.
S201, determining a three-dimensional map of a three-dimensional rendering target according to the preset three-dimensional rendering target in the virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle.
The three-dimensional map may be a cube map, a sphere map, or a single texture combination of a predetermined three-dimensional shape (such as a cube, a sphere, etc.), which is not limited herein. The present invention will be described by taking a cube map as shown in fig. 3 as an example.
It can be understood that the Mercator projection is a normal-axis conformal cylindrical projection. Taking fig. 4 as an example, a cylinder coaxial with the earth's axis is assumed to be tangent to or secant with the earth, the graticule is projected onto the cylindrical surface under the conformal (equal-angle) condition, and the planar projection of the earth is obtained after the cylindrical surface is unrolled into a plane, where the Mercator projection points corresponding to point A, point B, point C and point D in fig. 4 are point a, point b, point c and point d, respectively.
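For reference, the standard normal-axis Mercator relations (general cartographic formulas stated here for context, not reproduced from the patent) map longitude λ and latitude φ on a sphere of radius R to plane coordinates x = R·λ and y = R·ln(tan(π/4 + φ/2)); conversely, φ = 2·arctan(e^(y/R)) - π/2, which is the kind of relation used when restoring a latitude from a Mercator ordinate.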
Specifically, the computer device may preset a three-dimensional rendering target in the virtual three-dimensional space, and obtain a three-dimensional map of the three-dimensional rendering target satisfying the Mercator projection principle according to a virtual camera preset in the virtual three-dimensional space. Wherein the parameter settings of the virtual camera correspond to the three-dimensional rendering target.
S202, determining the position of a pixel point in a three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map.
Specifically, because the three-dimensional map conforms to the Mercator projection principle, the computer device can reversely restore the position of a pixel point in the three-dimensional space from the size of the three-dimensional map and the position of any pixel point in the three-dimensional map according to the Mercator projection principle.
S203, determining distortion characteristics of the pixel points according to the positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point from the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image.
The preset fisheye camera distortion coefficient may be a distortion coefficient of a real fisheye camera.
Specifically, the computer device can determine the distortion characteristic of a pixel point according to the position of the pixel point in the three-dimensional space and the preset distortion coefficient of the fisheye camera. The distortion characteristic can be used for characterizing the distortion degree of the second distance relative to the first distance; the first distance can be the distance of the pixel point from the center point of the fisheye image in the fisheye image plane, and the second distance can be the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image.
Illustratively, as shown in fig. 5, the origin of the three-dimensional coordinate system in fig. 5 is the sphere center of the projection sphere of the fisheye image, the Z axis of the three-dimensional coordinate system is the optical axis and is perpendicular to the fisheye image, and the distance from the fisheye image to the sphere center of the projection sphere of the fisheye image is the focal length f of the fisheye camera; the focal length f of the fisheye camera may include a horizontal pixel focal length and a vertical pixel focal length. Taking the pixel point P in fig. 5 as an example, the first distance may be the distance r from the projection of point P on the plane OXY to the coordinate origin O, and the second distance may be the distance r_d from the point P_1, at which the line connecting point P and the coordinate origin intersects the projection sphere of the fisheye image, to the center point of the spherical surface.
S204, determining the position of each pixel point in the fisheye image according to the distortion characteristics, the preset size of the fisheye image and the position of each pixel point in the three-dimensional map.
Specifically, because the distortion characteristic in S203 can represent the degree of distortion of the position of the pixel point in the fisheye image relative to the position of the pixel point in the three-dimensional space, the computer device can determine the position of each pixel point in the fisheye image according to the distortion characteristic, the preset size of the fisheye image, the position of each pixel point in the three-dimensional map, and the horizontal pixel focal length and the vertical pixel focal length of the real fisheye camera.
Illustratively, on the basis of the above-described embodiments, suppose the distortion characteristic is d, the preset size of the fisheye image includes the width w_f and the height h_f of the fisheye image, and the horizontal pixel focal length and the vertical pixel focal length of the real fisheye camera are f_u and f_v, respectively. If the coordinates of any pixel point in the three-dimensional map are (x_1, y_1), the coordinates (x_f, y_f) of the pixel point in the fisheye image can be calculated by formula (1):
[formula (1) is published as an image in the original document and is not reproduced here]
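As a concrete illustration of S204, the following Python sketch computes a fisheye pixel position from the distortion characteristic. Because formula (1) is published only as an image, the exact expression is not available here; the sketch assumes a common formulation in which the in-plane components (x, y) of the pixel point's position in the three-dimensional coordinate system of fig. 5 are scaled by the distortion characteristic d and by the pixel focal lengths and then offset to the image center. All function and variable names are illustrative.

```python
def fisheye_pixel_position(x, y, d, w_f, h_f, f_u, f_v):
    """Assumed realization of step S204 (the patent's formula (1) is not reproduced).

    x, y     : in-plane components of the pixel point's position in the
               three-dimensional coordinate system of fig. 5
    d        : distortion characteristic from step S203
    w_f, h_f : width and height of the preset fisheye image
    f_u, f_v : horizontal and vertical pixel focal lengths of the fisheye camera
    """
    x_f = w_f / 2.0 + f_u * d * x   # scale by the distortion characteristic,
    y_f = h_f / 2.0 + f_v * d * y   # then shift to the fisheye image center
    return x_f, y_f
```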
S205, coloring each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map.
Specifically, the computer device can perform coloring processing on each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map. For example, the computer device may determine the color at each pixel point in the three-dimensional map as the color at the corresponding position in the fisheye image. Taking fig. 6 as an example, fig. 6 shows the fisheye image converted from the three-dimensional map shown in fig. 3.
In this embodiment, the computer device determines, through a preset three-dimensional rendering target in a virtual three-dimensional space, a three-dimensional map satisfying the Mercator projection principle; determines the position of a pixel point in the three-dimensional space according to the size of the three-dimensional map and the position of any pixel point in the three-dimensional map; determines the distortion characteristic of the pixel point according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient; determines the position of each pixel point in the fisheye image according to the distortion characteristic, the size of the fisheye image and the position of each pixel point in the three-dimensional map; and finally colors each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map. The distortion characteristic is used for representing the distortion degree of a second distance relative to a first distance, the first distance is the distance of a pixel point from the center point of the fisheye image in the fisheye image plane, and the second distance is the distance of the pixel point from the center point of the spherical surface in the projection sphere of the fisheye image. The distance between a pixel point and the center of the picture in the fisheye image is in direct proportion to the angle between the pixel point and the optical axis in space, so the obtained fisheye image conforms to the equidistant projection model, and the realism of the fisheye image obtained from the virtual three-dimensional space is improved.
In a possible implementation manner, on the basis of the above embodiment, the position of the pixel point in the three-dimensional space is the three-dimensional coordinate of the pixel point in a three-dimensional coordinate system, the origin of the three-dimensional coordinate system is the sphere center of the projection sphere of the fisheye image, the Z axis of the three-dimensional coordinate system is perpendicular to the fisheye image, and the above step S203 includes:
s301, determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain a first distance.
Exemplarily, assuming that the position of the pixel point in the three-dimensional space is the three-dimensional coordinate (x, y, z), the computer device can determine the Euclidean distance between the pixel point and the central point of the fisheye image according to the three-dimensional coordinates to obtain the first distance r. The specific calculation formula may be formula (2):
[formula (2) is published as an image in the original document and is not reproduced here]
s302, according to the pixel points in the three-dimensional coordinate systemZAnd determining a second distance by using the axis coordinate and a preset fisheye camera distortion coefficient.
Illustratively, the computer device may be based on the three-dimensional coordinate system of the pixel pointsZAxial coordinatezAnd a predetermined fisheye camera distortion coefficient, determining a second distancer d
S303, determining distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
Further, the computer device may determine the distortion characteristic d of the pixel point according to the first distance r obtained in step S301 and the second distance r_d obtained in step S302. The specific calculation formula may be formula (3):
[formula (3) is published as an image in the original document and is not reproduced here]
In this embodiment, the computer device obtains the first distance by determining the Euclidean distance between the pixel point and the center point of the fisheye image, determines the second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and the preset fisheye camera distortion coefficient, and finally determines the distortion characteristic of the pixel point according to the ratio of the first distance to the second distance, so that the calculated distortion characteristic can represent the distortion degree of the pixel point in a real fisheye image, which provides an important reference for converting the three-dimensional map into the fisheye image.
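A minimal Python sketch of steps S301 to S303 follows. Formulas (2) and (3) are published only as images, so both the exact form of the first distance and the direction of the ratio are assumptions stated in the comments; second_distance() is the helper sketched after formulas (4) and (5) below.

```python
import math

def distortion_characteristic(x, y, z, coeffs):
    """Assumed realization of steps S301-S303 for a pixel point at (x, y, z).

    S301: first distance r, taken here as the in-plane offset sqrt(x^2 + y^2)
          (assumed form of formula (2)).
    S302: second distance r_d from the Z-axis coordinate and the preset
          fisheye camera distortion coefficients (see second_distance below).
    S303: distortion characteristic as the ratio of the two distances;
          d = r_d / r is assumed (formula (3) is not reproduced).
    """
    r = math.hypot(x, y)
    r_d = second_distance(z, coeffs)
    return r_d / r if r > 0.0 else 1.0  # at the optical axis both distances vanish
```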
In a possible implementation manner, on the basis of the foregoing embodiment, the foregoing step S302 includes:
s401, according to the pixel points in the three-dimensional coordinate systemZAnd determining the radian of the polar coordinate position of the pixel point on the fisheye image through the axis coordinate.
S402, substituting the polar coordinate radian and a preset fisheye camera distortion coefficient into a projection model formula to obtain a second distance.
Illustratively, the computer device may determine the radian θ of the polar coordinate position of the pixel point on the fisheye image through formula (4) according to the Z-axis coordinate z of the pixel point in the three-dimensional coordinate system, and substitute the polar coordinate radian θ and the preset fisheye camera distortion coefficients k1, k2, k3, k4 and k5 into the projection model formula (5) to obtain the second distance r_d. Formula (4) and formula (5) may specifically be as follows:
[formulas (4) and (5) are published as images in the original document and are not reproduced here]
In this embodiment, the computer device determines the radian of the polar coordinate position of the pixel point on the fisheye image according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system, and substitutes the polar coordinate radian and the preset fisheye camera distortion coefficient into the projection model formula to obtain the second distance, which provides an important reference for calculating the distortion characteristic in the foregoing embodiment.
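A minimal Python sketch of steps S401 and S402 follows. Formulas (4) and (5) are published only as images; the sketch assumes the restored pixel position lies on a unit projection sphere, so the polar radian is arccos(z), and assumes a five-coefficient odd polynomial in the polar radian as the projection model, one common form of equidistant-projection distortion model.

```python
import math

def second_distance(z, coeffs):
    """Assumed realization of steps S401-S402 (formulas (4) and (5) not reproduced).

    z      : Z-axis coordinate of the pixel point, assumed to lie on a unit sphere
    coeffs : preset fisheye camera distortion coefficients (k1, ..., k5)
    """
    k1, k2, k3, k4, k5 = coeffs
    theta = math.acos(max(-1.0, min(1.0, z)))          # polar radian (assumed formula (4))
    return (k1 * theta + k2 * theta ** 3 + k3 * theta ** 5
            + k4 * theta ** 7 + k5 * theta ** 9)       # assumed projection model (5)
```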
In a possible implementation manner, on the basis of the foregoing embodiment, the size of the three-dimensional map includes a width and a height of the three-dimensional map, and the foregoing step S202 includes:
s501, according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map, determining the longitude direction radian and the latitude direction radian of the pixel point in the three-dimensional space.
And S502, determining the position of the pixel point in the three-dimensional space according to the longitude direction radian and the latitude direction radian.
For example, the size of the three-dimensional map may include the width w and the height h of the three-dimensional map. The computer device can determine, through formula (6), the longitude direction radian pitch and the latitude direction radian yaw corresponding to the pixel point in the three-dimensional space according to the width w and the height h of the three-dimensional map and the coordinates (x_0, y_0) of any pixel point in the three-dimensional map, and then determine the coordinates (x, y, z) of the pixel point in the three-dimensional space through formula (7) according to the longitude direction radian pitch and the latitude direction radian yaw. Formula (6) and formula (7) may specifically be as follows:
[formulas (6) and (7) are published as images in the original document and are not reproduced here]
In this embodiment, the computer device determines the longitude direction radian and the latitude direction radian of a pixel point in the three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map, and then determines the position of the pixel point in the three-dimensional space according to the longitude direction radian and the latitude direction radian. In this way, a pixel point in a three-dimensional map conforming to the Mercator projection principle can be reversely restored to the three-dimensional space, which provides an important part of the calculation for converting the three-dimensional map into a fisheye image.
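A minimal Python sketch of steps S501 and S502 follows, together with a short driver that chains the helpers sketched above into the overall S201 to S205 pipeline. Formulas (6) and (7) are published only as images; the sketch assumes the map spans a full 2π of longitude across its width, recovers latitude with the standard inverse-Mercator relation, and places the restored point on a unit sphere with the Z axis as the optical axis. Variable names lon and lat are used here instead of the patent's pitch and yaw labels, and all names are illustrative.

```python
import math

def map_pixel_to_3d(x0, y0, w, h):
    """Assumed realization of steps S501-S502 (formulas (6) and (7) not reproduced)."""
    lon = (x0 / w - 0.5) * 2.0 * math.pi                  # longitude direction radian
    y_m = (0.5 - y0 / h) * 2.0 * math.pi                  # assumed Mercator ordinate range
    lat = 2.0 * math.atan(math.exp(y_m)) - math.pi / 2.0  # latitude direction radian
    x = math.cos(lat) * math.sin(lon)                     # point on a unit sphere,
    y = math.sin(lat)                                     # Z axis taken as the optical axis
    z = math.cos(lat) * math.cos(lon)
    return x, y, z
```

With the helpers above, the overall conversion loops over every pixel of the three-dimensional map, restores its position in the three-dimensional space, computes where it falls in the fisheye image, and copies its color (a nearest-neighbor forward mapping; the patent does not prescribe a particular sampling strategy):

```python
def convert_map_to_fisheye(stereo_map, w_f, h_f, f_u, f_v, coeffs):
    """Forward-map every map pixel into the fisheye image (steps S201-S205).

    stereo_map : h x w grid (list of rows) of RGB tuples read from the render target
    """
    h, w = len(stereo_map), len(stereo_map[0])
    fisheye = [[(0, 0, 0) for _ in range(w_f)] for _ in range(h_f)]
    for y0 in range(h):
        for x0 in range(w):
            x, y, z = map_pixel_to_3d(x0, y0, w, h)                          # S202
            d = distortion_characteristic(x, y, z, coeffs)                   # S203
            x_f, y_f = fisheye_pixel_position(x, y, d, w_f, h_f, f_u, f_v)   # S204
            u, v = int(round(x_f)), int(round(y_f))
            if 0 <= u < w_f and 0 <= v < h_f:
                fisheye[v][u] = stereo_map[y0][x0]                           # S205: coloring
    return fisheye
```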
The foregoing has outlined rather broadly the solution provided by an embodiment of the present invention from the perspective of a computer device. It will be appreciated that the computer device, in order to implement the above-described functions, comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the present invention can be implemented in hardware or a combination of hardware and computer software, in conjunction with the exemplary algorithm steps described in connection with the embodiments disclosed herein. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
Fig. 7 shows a schematic structural diagram of an image conversion apparatus. As shown in fig. 7, the apparatus may include: a first determining module 71, a second determining module 72, a third determining module 73, a positioning module 74 and a coloring module 75.
A first determining module 71, configured to determine a three-dimensional map of a three-dimensional rendering target according to a three-dimensional rendering target preset in a virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle;
a second determining module 72, configured to determine, according to the size of the three-dimensional map and a position of any pixel point in the three-dimensional map, a position of the pixel point in the three-dimensional space;
the third determining module 73 is configured to determine distortion characteristics of the pixel points according to positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point shifting the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image;
the positioning module 74 is configured to determine the position of each pixel point in the fisheye image according to the distortion characteristics, the preset size of the fisheye image, and the position of each pixel point in the three-dimensional map;
and the coloring module 75 is configured to color each pixel point in the fisheye image to obtain a fisheye image corresponding to the three-dimensional map.
Optionally, on the basis of the above embodiment, when the position of the pixel point in the three-dimensional space is the three-dimensional coordinate of the pixel point in the three-dimensional coordinate system, the origin of the three-dimensional coordinate system is the sphere center of the projection sphere of the fisheye image, and the Z axis of the three-dimensional coordinate system is perpendicular to the fisheye image, the third determining module 73 is specifically configured to: determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain a first distance; determining a second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient; and determining the distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
Optionally, on the basis of the foregoing embodiment, the third determining module 73 is specifically configured to: determining the radian of the polar coordinate position of the pixel point on the fisheye image according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system; and substituting the radian of the polar coordinates and a preset fisheye camera distortion coefficient into the projection model formula to obtain the second distance.
Optionally, when the size of the three-dimensional map includes a width and a height of the three-dimensional map, the second determining module 72 is specifically configured to: determining the longitude direction radian and the latitude direction radian of a pixel point in a three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map; and determining the position of the pixel point in the three-dimensional space according to the longitude direction radian and the latitude direction radian.
Optionally, the first determining module 71 is specifically configured to: acquiring a three-dimensional map of the three-dimensional rendering target according to a virtual camera preset in the virtual three-dimensional space; the parameter settings of the virtual camera correspond to the three-dimensional rendering target.
Optionally, the coloring module 75 is specifically configured to: determining the color at each pixel point in the three-dimensional map as the color at the corresponding position in the fisheye image.
The image conversion device provided by the embodiment of the invention is used for executing the image conversion method, so that the technical effect same as that of the image conversion method can be achieved.
The embodiment of the invention also provides computer equipment, which comprises a processor and a memory; the memory is for storing computer program code, the computer program code comprising computer instructions; when the processor executes the computer instructions, the computer device executes the image conversion method provided by the foregoing embodiment of the present invention.
Embodiments of the present invention further provide a computer storage medium, which stores one or more computer instructions for implementing the image conversion method provided in the foregoing embodiments of the present invention when executed.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions within the technical scope of the present invention are intended to be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (9)

1. An image conversion method, comprising:
determining a three-dimensional map of a three-dimensional rendering target according to the preset three-dimensional rendering target in a virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle;
determining longitude direction radian and latitude direction radian of any pixel point in a three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map;
determining the position of the pixel point in a three-dimensional space according to the longitude direction radian and the latitude direction radian;
determining distortion characteristics of the pixel points according to the positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point from the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image;
determining the position of each pixel point in the fisheye image according to the distortion characteristics, the size of a preset fisheye image and the position of each pixel point in the three-dimensional map;
and coloring each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map.
2. The method of claim 1, wherein the position of the pixel point in the three-dimensional space is a three-dimensional coordinate of the pixel point in a three-dimensional coordinate system, an origin of the three-dimensional coordinate system is a sphere center of a projection sphere of the fisheye image, a Z-axis of the three-dimensional coordinate system is perpendicular to the fisheye image, and determining the distortion characteristic of the pixel point according to the position of the pixel point in the three-dimensional space and a preset fisheye camera distortion coefficient comprises:
determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain the first distance;
determining the second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient;
and determining the distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
3. The method according to claim 2, wherein the determining the second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient comprises:
determining the radian of the polar coordinate position of the pixel point on the fisheye image according to the Z-axis coordinate of the pixel point in a three-dimensional coordinate system;
and substituting the radian of the polar coordinates and a preset fisheye camera distortion coefficient into a projection model formula to obtain the second distance.
4. The method according to claim 1, wherein the determining the three-dimensional map of the three-dimensional rendering target according to the three-dimensional rendering target preset in the virtual three-dimensional space specifically includes:
acquiring a three-dimensional map of the three-dimensional rendering target according to a preset virtual camera in the virtual three-dimensional space; the parameter settings of the virtual camera correspond to the three-dimensional rendering target.
5. The method according to claim 1, wherein the coloring of each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map specifically comprises: determining the color at each pixel point in the three-dimensional map as the color at the corresponding position in the fisheye image.
6. An image conversion apparatus characterized by comprising:
the system comprises a first determining module, a second determining module and a third determining module, wherein the first determining module is used for determining a three-dimensional map of a three-dimensional rendering target according to the three-dimensional rendering target preset in a virtual three-dimensional space; the three-dimensional map satisfies the Mercator projection principle;
the second determining module is used for determining the longitude direction radian and the latitude direction radian of any pixel point in the three-dimensional space according to the width and the height of the three-dimensional map and the position of any pixel point in the three-dimensional map; determining the position of the pixel point in a three-dimensional space according to the longitude direction radian and the latitude direction radian;
the third determining module is used for determining distortion characteristics of the pixel points according to the positions of the pixel points in the three-dimensional space and a preset fisheye camera distortion coefficient; the distortion characteristic is used for characterizing the distortion degree of the second distance relative to the first distance; the first distance is the distance of the pixel point from the center point of the fisheye image in the fisheye image plane; the second distance is the distance of the pixel point from the center point of the spherical surface in the projection spherical surface of the fisheye image;
the positioning module is used for determining the position of each pixel point in the fisheye image according to the distortion characteristics, the size of a preset fisheye image and the position of each pixel point in the three-dimensional map;
and the coloring module is used for coloring each pixel point in the fisheye image to obtain the fisheye image corresponding to the three-dimensional map.
7. The apparatus according to claim 6, wherein the position of the pixel point in the three-dimensional space is a three-dimensional coordinate of the pixel point in a three-dimensional coordinate system, an origin of the three-dimensional coordinate system is a sphere center of a projection sphere of the fisheye image, a Z-axis of the three-dimensional coordinate system is perpendicular to the fisheye image, and the third determining module is specifically configured to:
determining the Euclidean distance between the pixel point and the central point of the fisheye image to obtain the first distance;
determining the second distance according to the Z-axis coordinate of the pixel point in the three-dimensional coordinate system and a preset fisheye camera distortion coefficient;
and determining the distortion characteristics of the pixel points according to the ratio of the first distance to the second distance.
8. A computer device, characterized in that the computer device comprises: a processor and a memory; the memory for storing computer program code, the computer program code comprising computer instructions; the computer device, when executing the computer instructions by the processor, performs the image conversion method of any of claims 1-5.
9. A computer-readable storage medium comprising computer instructions which, when run on a computer device, cause the computer device to perform the image conversion method of any one of claims 1-5.
CN202111496232.7A 2021-12-09 2021-12-09 Image conversion method and device Active CN113888401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111496232.7A CN113888401B (en) 2021-12-09 2021-12-09 Image conversion method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111496232.7A CN113888401B (en) 2021-12-09 2021-12-09 Image conversion method and device

Publications (2)

Publication Number Publication Date
CN113888401A (en) 2022-01-04
CN113888401B (en) 2022-04-05

Family

ID=79016625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111496232.7A Active CN113888401B (en) 2021-12-09 2021-12-09 Image conversion method and device

Country Status (1)

Country Link
CN (1) CN113888401B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110766170A (en) * 2019-09-05 2020-02-07 国网江苏省电力有限公司 Image processing-based multi-sensor fusion and personnel positioning method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100253685A1 (en) * 2009-04-01 2010-10-07 Lightmap Limited Generating Data for Use in Image Based Lighting Rendering
CN103996172B (en) * 2014-05-08 2016-08-31 东北大学 A kind of fisheye image correcting method based on more corrective
CN106815805A (en) * 2017-01-17 2017-06-09 湖南优象科技有限公司 Rapid distortion bearing calibration based on Bayer images
CN107256535A (en) * 2017-06-06 2017-10-17 斑马信息科技有限公司 The display methods and device of panoramic looking-around image
WO2020224199A1 (en) * 2019-05-08 2020-11-12 四川深瑞视科技有限公司 Fisheye camera calibration system, method and apparatus, electronic device, and storage medium
CN110648274B (en) * 2019-09-23 2024-02-02 创新先进技术有限公司 Method and device for generating fisheye image

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110766170A (en) * 2019-09-05 2020-02-07 国网江苏省电力有限公司 Image processing-based multi-sensor fusion and personnel positioning method

Also Published As

Publication number Publication date
CN113888401A (en) 2022-01-04

Similar Documents

Publication Publication Date Title
CN109658365B (en) Image processing method, device, system and storage medium
CN107113376B (en) A kind of image processing method, device and video camera
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
EP3534336B1 (en) Panoramic image generating method and apparatus
CN108876926B (en) Navigation method and system in panoramic scene and AR/VR client equipment
CN111274943B (en) Detection method, detection device, electronic equipment and storage medium
CN112381919A (en) Information processing method, positioning method and device, electronic equipment and storage medium
CN107133918B (en) Method for generating panorama at any position in three-dimensional scene
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
JP2014520337A (en) 3D image synthesizing apparatus and method for visualizing vehicle periphery
CN112837419B (en) Point cloud model construction method, device, equipment and storage medium
CN112686877B (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
WO2014110954A1 (en) Method, device and computer-readable storage medium for panoramic image completion
CN110490943B (en) Rapid and accurate calibration method and system of 4D holographic capture system and storage medium
CN104994367A (en) Image correcting method and camera
WO2020232971A1 (en) Fisheye camera calibration system, method and apparatus, and electronic device and storage medium
KR101854612B1 (en) Apparatus and Method for Exemplar-Based Image Inpainting for Spherical Panoramic Image
CN113034347B (en) Oblique photography image processing method, device, processing equipment and storage medium
CN108765582B (en) Panoramic picture display method and device
US20180213215A1 (en) Method and device for displaying a three-dimensional scene on display surface having an arbitrary non-planar shape
CN114511447A (en) Image processing method, device, equipment and computer storage medium
CN113920275A (en) Triangular mesh construction method and device, electronic equipment and readable storage medium
CN113888401B (en) Image conversion method and device
CN110163922B (en) Fisheye camera calibration system, fisheye camera calibration method, fisheye camera calibration device, electronic equipment and storage medium
US11288774B2 (en) Image processing method and apparatus, storage medium, and electronic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: Floor 25, Block A, Zhongzhou Binhai Commercial Center Phase II, No. 9285, Binhe Boulevard, Shangsha Community, Shatou Street, Futian District, Shenzhen, Guangdong 518000

Patentee after: Shenzhen Youjia Innovation Technology Co.,Ltd.

Address before: 518051 401, building 1, Shenzhen new generation industrial park, No. 136, Zhongkang Road, Meidu community, Meilin street, Futian District, Shenzhen, Guangdong Province

Patentee before: SHENZHEN MINIEYE INNOVATION TECHNOLOGY Co.,Ltd.
