CN113643414A - Three-dimensional image generation method and device, electronic equipment and storage medium

Info

Publication number: CN113643414A
Application number: CN202010394664.6A (filed by Beijing Dajia Internet Information Technology Co Ltd)
Authority: CN (China)
Prior art keywords: pixel, dimensional image, pixel point, coordinates, coordinate
Legal status: Granted; Active
Priority date / Filing date: 2020-05-11
Other languages: Chinese (zh)
Other versions: CN113643414B (granted publication)
Inventors: 苏泳, 王一
Current Assignee: Beijing Dajia Internet Information Technology Co Ltd
Original Assignee: Beijing Dajia Internet Information Technology Co Ltd

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering

Abstract

The disclosure relates to a three-dimensional image generation method and apparatus, an electronic device, and a storage medium, and relates to image processing technology, to solve the problem in the related art that a generated three-dimensional image is warped or broken because the camera view angle varies over a large range or information is insufficient. The method comprises the following steps: obtaining world coordinates of each pixel point according to a two-dimensional image and depth information corresponding to each pixel point of the two-dimensional image; obtaining texture coordinates of each pixel point according to the pixel coordinates of each pixel point; storing the index number corresponding to each pixel point according to the order of the three pixel vertices of a triangle, wherein the triangle is the basic unit of a three-dimensional model, and the three-dimensional model comprises a plurality of triangles; and reading, according to the stored index numbers, the pixel coordinates corresponding to each index number and the world coordinates and texture coordinates of the corresponding pixel points, and performing rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.

Description

Three-dimensional image generation method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to image processing technologies, and in particular, to a method and an apparatus for generating a three-dimensional image, an electronic device, and a storage medium.
Background
With the continuous development of image processing technology, the depth information of a two-dimensional image, namely the depth value corresponding to each pixel point in the two-dimensional image, can be extracted through techniques such as deep learning, so that partial three-dimensional information of the two-dimensional image can be restored through rendering, and an ordinary planar image can be processed into three dimensions (3D) to generate a corresponding 3D image.
For example, the original two-dimensional image and the depth map may be processed, and the pixel positions under different camera parameters may be calculated by adjusting the depth information corresponding to the original two-dimensional image, so as to render 3D images at different angles. However, the current scheme only supports camera view-angle changes within a very small range; if the view angle changes over a large range, part of the pixel information of the generated 3D image is incomplete or distorted, and the generated 3D image appears warped or broken.
Disclosure of Invention
The present disclosure provides a three-dimensional image generation method and apparatus, an electronic device, and a storage medium, to at least solve the problem in the related art that the generated 3D image is warped or broken due to a large variation range of the camera view angle. The technical solution of the disclosure is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided a three-dimensional image generation method, including: obtaining world coordinates of each pixel point according to the two-dimensional image and depth information corresponding to each pixel point of the two-dimensional image; obtaining texture coordinates of each pixel point according to the pixel coordinates of each pixel point; storing the index numbers corresponding to the pixel points according to the sequence of three pixel vertexes of the triangle, wherein the triangle is a basic unit for forming a three-dimensional model, and the three-dimensional model comprises a plurality of triangles; and reading the pixel coordinate corresponding to the index number, the world coordinate and the texture coordinate of the corresponding pixel point according to the stored index number, and performing rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.
In this technical solution, the world coordinates and texture coordinates of each pixel point are obtained, and an index-number cache of the triangle pixel vertices used for image rendering is generated, thereby forming a processing model of the three-dimensional image. The world coordinates and texture coordinates corresponding to each pixel point can therefore be obtained from the stored pixel index numbers, and the three-dimensional image can be restored through image rendering. When the camera moves or its parameters are adjusted, parameter-adjustment processing can still be carried out on the model generated for the three-dimensional image to produce the three-dimensional image at the corresponding angle, achieving the effect required of a 3D photo.
In an embodiment, the obtaining, according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image, the world coordinate of each pixel point includes: obtaining the world coordinate of each pixel point of the two-dimensional image according to the pixel coordinate corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image. In this possible implementation, the world coordinates of each pixel point of the two-dimensional image are obtained in a simplified manner of calculation, which effectively reduces the computational complexity of constructing the three-dimensional image model.
In an embodiment, the obtaining, according to the pixel coordinate corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image, the world coordinate of each pixel point of the two-dimensional image specifically includes: the world coordinate corresponding to the pixel point M satisfies:

M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)

where M_{i,j} represents the world coordinate corresponding to the pixel point M; i, j represent the pixel coordinates of the pixel point M; P0_{i,j} represents the world coordinate of the point P0 obtained by mapping the pixel point M onto the plane Z = 0; C represents the world coordinate (0, 0, f) of the center point of the camera lens; D_{i,j} represents the depth information of the pixel point M; and f represents the focal length of the camera.
In this possible implementation, based on the assumptions that the world coordinate of the center point of the camera lens is (0, 0, f) and that the center point of the camera lens always faces the center point of the two-dimensional image, the world coordinate corresponding to each pixel point of the two-dimensional image is obtained through a simplified operation, which reduces the computational complexity of constructing the three-dimensional image model.
In one embodiment, obtaining the texture coordinate corresponding to the pixel coordinate of each pixel point according to the pixel coordinate of the pixel point includes: when h is less than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x × h/w, 1 - j × h_y); when h is greater than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x, 1 - j × h_y × w/h), where w_x represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the X axis, h_y represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the Y axis, w represents the pixel resolution of the two-dimensional image on the X axis, and h represents the pixel resolution of the two-dimensional image on the Y axis.
In an embodiment, the storing the index number corresponding to each pixel point according to the order of the three pixel vertices of the triangle specifically includes: sequentially storing the index numbers corresponding to the pixel points into a cache in the clockwise or counterclockwise order of the three pixel vertices of each triangle formed for image rendering. In this possible implementation, the index numbers of the pixel points at the three vertices of the triangle, the basic unit of image rendering, are stored in the cache, so that the world coordinates and texture coordinates of the pixel points can be determined from the index numbers in the cache and the three-dimensional image can be rendered.
In one embodiment, before performing the rendering processing to generate the three-dimensional image corresponding to the two-dimensional image, the method further includes: obtaining, through a linear interpolation algorithm and according to the color values corresponding to the three pixel vertices of a triangle formed for image rendering, the color values of the pixel points within the triangle area, other than the three pixel vertices, whose texture information is missing. In this possible implementation, using the image rendering characteristic of a Graphics Processing Unit (GPU), the color value information of the pixel points in the triangle whose texture information is missing is filled in, in the generated three-dimensional image model, through linear interpolation from the texture information of the three pixel vertices of the triangle, so that a relatively complete three-dimensional image can be restored and the problem of warping or distortion is avoided.
According to a second aspect of the embodiments of the present disclosure, there is provided a three-dimensional image generation apparatus, the apparatus including: an image processing module configured to obtain the world coordinate of each pixel point according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image, the image processing module being further configured to obtain the texture coordinate of each pixel point according to the pixel coordinate of each pixel point; an index cache module configured to store the index number corresponding to each pixel point according to the order of the three pixel vertices of a triangle, wherein the triangle is the basic unit of a three-dimensional model, and the three-dimensional model comprises a plurality of triangles; and an image rendering module configured to read, according to the stored index numbers, the pixel coordinates corresponding to each index number and the world coordinates and texture coordinates of the corresponding pixel points, and to perform rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.
In one embodiment, the image processing module is specifically configured to: and obtaining world coordinates of each pixel point of the two-dimensional image according to the pixel coordinates corresponding to each pixel point of the two-dimensional image, the focal length of the camera and the depth information corresponding to each pixel point of the two-dimensional image.
In one embodiment, the image processing module is specifically configured such that the world coordinate corresponding to the pixel point M satisfies:

M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)

where M_{i,j} represents the world coordinate corresponding to the pixel point M; i, j represent the pixel coordinates of the pixel point M; P0_{i,j} represents the world coordinate of the point P0 obtained by mapping the pixel point M onto the plane Z = 0; C represents the world coordinate (0, 0, f) of the center point of the camera lens; D_{i,j} represents the depth information of the pixel point M; and f represents the focal length of the camera.
In an embodiment, the image processing module is further specifically configured such that: when h is less than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x × h/w, 1 - j × h_y); when h is greater than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x, 1 - j × h_y × w/h), where w_x represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the X axis, h_y represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the Y axis, w represents the pixel resolution of the two-dimensional image on the X axis, and h represents the pixel resolution of the two-dimensional image on the Y axis.
In an embodiment, the index caching module is specifically configured to: sequentially store the index numbers corresponding to the pixel points into a cache in the clockwise or counterclockwise order of the three pixel vertices of each triangle formed for image rendering.
In an embodiment, the image rendering module is specifically configured to: obtain, through a linear interpolation algorithm and according to the color values corresponding to the three pixel vertices of a triangle formed for image rendering, the color values of the pixel points within the triangle area, other than the three pixel vertices, whose texture information is missing.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic device, including: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to implement the three-dimensional image generation method according to the first aspect or any implementation of the first aspect provided by the embodiments of the present disclosure.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a storage medium; when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the three-dimensional image generation method according to the first aspect or any implementation of the first aspect provided by the embodiments of the present disclosure.
According to a fifth aspect of the embodiments of the present disclosure, there is provided a computer program product which, when run on a computer, causes the computer to execute the three-dimensional image generation method according to the first aspect or any implementation of the first aspect provided by the embodiments of the present disclosure.
The technical solution provided by the embodiments of the present disclosure brings at least the following beneficial effects: a three-dimensional image generation model is formed by generating the world coordinates and texture coordinates corresponding to each pixel point of the two-dimensional image together with the triangle image-rendering cache, and the two-dimensional image is used as the texture on the model, so that a complete 3D image can be restored in a three-dimensional scene. Therefore, when the camera moves or other parameters are adjusted, a complete three-dimensional image can still be restored from the model generated for the 3D image, achieving the effect required of a 3D photo. In addition, owing to the image rendering characteristic of the GPU, when rendering the texture, missing pixel information is filled in by linear interpolation or similar processing according to the existing texture, so that 3D image defects such as warping and breakage caused by missing information can be avoided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a flow chart illustrating a three-dimensional image generation method according to an exemplary embodiment.
FIG. 2 is a schematic diagram illustrating various image coordinate systems according to an exemplary embodiment.
Fig. 3 is a schematic diagram illustrating a correspondence between a pixel coordinate system and a texture coordinate system according to an exemplary embodiment.
FIG. 4 is a schematic diagram illustrating triangle vertices of an image rendering according to an example embodiment.
FIG. 5 is a schematic diagram illustrating two-dimensional image imaging principles and coordinate relationships in accordance with an exemplary embodiment.
Fig. 6 is a block diagram illustrating a three-dimensional image generation apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram showing an apparatus (general structure of an electronic device) according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The 3D image is an image having a stereoscopic effect, and may be an image having a sense of reality produced by image processing software. 3D, three-dimensional, refers to a three-dimensional space consisting of three axes X, Y, Z. By rotating the 3D image to convert the viewing angle, the information of the 3D image at different angles can be displayed. The 3D image is relative to a planar (two-dimensional) image that is only long and wide.
The application provides a three-dimensional image generation method and a three-dimensional image generation device. The picture generated by the technical scheme provided by the application can present a complete 3D image under the condition that the visual angle of the camera is greatly changed.
The method may be applied to an electronic device with an image capturing function or image processing capability. The electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a vehicle-mounted device, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, and the like; the embodiments of the present disclosure do not particularly limit the specific form of the electronic device.
Fig. 1 is a flowchart illustrating a three-dimensional image generation method according to an exemplary embodiment, which may be applied to an electronic device, as shown in fig. 1, and includes the following steps.
In step S11, the world coordinates of each pixel point are obtained according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image.
The two-dimensional image is a planar image that does not include depth information. Two dimensions refer to both the length and width dimensions of the image. The two-dimensional image may be obtained by shooting with a camera of the electronic device, or may be a two-dimensional image shot by another electronic device received by the electronic device, such as a camera.
The depth information refers to the distance information between the target object and the observation point (the center point of the camera), and may be acquired by a Time of Flight (ToF) ranging camera or an infrared detection device. That is, the ToF ranging camera or infrared device acquires a depth map of the target object, and each depth value on the depth map represents the distance between a point on the target object and the ToF ranging camera or infrared detection device. In addition, deep learning computation can be performed on the two-dimensional image through an artificial intelligence algorithm to obtain the depth information of the two-dimensional image.
In one embodiment, before specific data calculation and processing are performed, the depth information may be normalized so that the depth values fall within a certain range; the depth value is then neither too small to reflect the difference in depth between different pixel points, nor so large that computational complexity increases. Illustratively, the depth information may be normalized to [1, 6], i.e., the depth values are limited to the range from 1 to 6 inclusive, in meters (m).
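As an illustration of this normalization step, the following is a minimal Python/NumPy sketch; the function name and the linear rescaling rule are illustrative assumptions, not part of the disclosure:

    import numpy as np

    def normalize_depth(depth_map, d_min=1.0, d_max=6.0):
        # Linearly rescale a raw depth map into [d_min, d_max] (meters).
        # depth_map: 2D array of raw depth values, e.g., from a ToF camera
        # or a monocular depth-estimation network.
        lo, hi = depth_map.min(), depth_map.max()
        if hi == lo:  # degenerate map: every pixel at the same depth
            return np.full_like(depth_map, (d_min + d_max) / 2.0)
        return d_min + (depth_map - lo) / (hi - lo) * (d_max - d_min)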
Therefore, through the depth map acquired by the electronic device, the depth information corresponding to each pixel point of the two-dimensional image can be obtained, that is, the distance information between the target object corresponding to each pixel point on the two-dimensional image and the electronic device in the real world can be obtained.
The world coordinate system refers to the three-dimensional spatial coordinates used in graphics and is a three-dimensional coordinate system. The world coordinates of a target object reflect the absolute coordinates of the target object in the real world and can more accurately reflect the relative coordinates between different positions on the target object. For example, M(X, Y, Z) represents the coordinates of a point M in space in the world coordinate system. The world coordinates may be in meters (m).
In an embodiment, the world coordinate of each pixel point can be obtained according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image, so that the world coordinate of the target object in the real world can be obtained, and the 3D image can be obtained through further processing.
As shown in fig. 2, there are also a camera coordinate system, an image coordinate system and a pixel coordinate system with respect to the world coordinate system of the pixel. The camera coordinate system is a three-dimensional space coordinate system established by taking a camera as a reference point, is defined for describing the position of a target object from the angle of the camera, and can be used as an intermediate quantity for establishing the relationship between a world coordinate system and an image coordinate system as well as a pixel coordinate system. The unit of the camera coordinates may be meters (m).
The image coordinate system is introduced for describing a projection transmission relation of the target object from the camera coordinate system to the image coordinate system in the imaging process of the two-dimensional image, so that the coordinates under the pixel coordinate system can be further conveniently obtained. The unit of the image coordinates may be meters (m). According to the projection transmission theory of light, a point on a target object photographed by a camera can be projected by light, thereby generating a corresponding point on a two-dimensional image plane.
The pixel coordinate system is introduced to describe the coordinates of the pixel points of the digital image formed after the target object is imaged. Because the two-dimensional image is composed of individual pixel units (also called pixel points), the pixel coordinate, that is, the position of each pixel point in the image, is the coordinate in which the pixel information of the image is actually read out from the camera. The unit of the pixel coordinates may be one (number of pixels). The pixel coordinate system is a rectangular coordinate system u-v in pixel units established with the upper-left corner of the two-dimensional image as the origin. The abscissa u and the ordinate v of a pixel are, respectively, the column number and the row number of the pixel unit in the two-dimensional image.
The above coordinates can be converted through a certain mathematical operation, and can be converted into coordinates in a corresponding coordinate system according to different calculation requirements. The conversion method and the estimation process between coordinate systems related to the embodiments of the present application will be described in detail below.
In step S12, the texture coordinates of each pixel point are obtained according to the pixel coordinates of each pixel point.
The image resolution of the two-dimensional image may be represented as w × h, which may indicate that the two-dimensional image is divided into w × h pixel units, which may also be referred to as pixels. Wherein w represents the pixel resolution of the two-dimensional image on the X axis, and can represent that each row of the two-dimensional image has w pixel units, h represents the pixel resolution of the two-dimensional image on the Y axis, and can represent that each column of the two-dimensional image has h pixel units, and both w and h are natural numbers greater than 0. The pixel coordinates of the pixel points can be represented by M (i, j) to indicate the pixel point M in the ith row and the jth column, where i and j are natural numbers. The origin (0, 0) of the pixel coordinates represents the pixel point of row 0 and column 0, as shown in fig. 3.
The index number of a pixel point can be represented by a natural number. For example, the origin (0, 0) of the pixel coordinates represents the pixel point in row 0, column 0, and its index number may be 0; the pixel coordinate (0, 1) represents the pixel point in row 0, column 1, and its index number may be 1; the index number of every other pixel point follows by analogy. The pixel coordinates of a pixel point can be obtained from its index number.
A texture may actually be an array whose elements are color values. A single color value is called a texture element, or texel. Each texel has a unique address in the texture, which represents the texture coordinate corresponding to the texture map. When a texture map is applied to a region of a two-dimensional image, its texel addresses need to be mapped into the coordinate system of the two-dimensional image and then translated into the screen or pixel coordinate system. Therefore, the address of a texture element, i.e., its texture coordinate, can be expressed by converting the pixel coordinate; that is, there is a correspondence between texture coordinates and pixel coordinates.
As shown in fig. 3, the texture coordinates are a rectangular coordinate system x-y established with the lower left corner of the two-dimensional image as the origin and with texture elements as units, and the values of the texture coordinates range from 0 to 1 on the x axis and the y axis. Therefore, the texture coordinates start at (0, 0), i.e., the coordinates of the lower left corner of the texture picture are (0, 0), and the texture coordinates of the upper right corner of the texture picture can be (1, 1).
In the above embodiment, the texture coordinate of each pixel point may be obtained according to the pixel coordinate of each pixel point and the correspondence between pixel coordinates and texture coordinates.
In step S13, the index numbers corresponding to each pixel point are stored in the order of the three pixel vertices of the triangle.
According to the rule of image rendering, the index numbers of the pixel points are stored sequentially, for example, into a cache. The pixel coordinates corresponding to each index number, and the world coordinates and texture coordinates of the corresponding pixel points, are then read in the order of the index numbers stored in the cache, so that rendering processing can be performed to generate the three-dimensional image corresponding to the two-dimensional image.
A texture is an array used to store the color values of pixels, and rendering refers to the process of generating an image from data. Texture rendering is the process of generating an image from data such as the color values stored in a cache.
In one embodiment, the three-dimensional model comprises a plurality of triangles; that is, the triangle is the basic unit of the three-dimensional model. In 3D image drawing, an image plane can be defined by at least three points, and a rectangle formed by four pixels can be realized with two triangles. Therefore, the basic unit of image drawing and rendering can be the pixel points corresponding to the three vertices of a triangle, and any complex 3D image can be realized by rendering a number of triangles. The three pixel vertices of each triangle are vertices in three-dimensional space. A triangle is thus the basic rendering unit of a three-dimensional image, which can be restored by rendering the texture information of a plurality of triangles.
To map a texture onto a triangle, the part of the texture that each vertex of the triangle corresponds to can be specified. Each vertex is thus associated with a texture coordinate indicating from which portion of the texture image the pixel color should be sampled.
Assume that four pixels form a rectangle, as shown in fig. 4: the pixel coordinate of the pixel at the upper-left corner of the two-dimensional picture is (i, j), and the index number corresponding to this pixel point is M1. The pixel coordinates of the other three pixel points in the two-dimensional picture are: (i+1, j), corresponding to index number M2; (i, j+1), corresponding to index number M3; and (i+1, j+1), corresponding to index number M4.
An initial pixel vertex for image rendering is selected according to a preset rule, and the index numbers corresponding to the pixel points are stored sequentially into the cache in the clockwise or counterclockwise order of the three pixel vertices of each triangle formed for image rendering. For example, if the upper-left corner is taken as the first vertex of each triangle, the rectangle can be divided into two triangles, and the index numbers of the pixel coordinates of the pixel points of the two triangles are stored into the cache in image-rendering order, which may be:
M1、M4、M2;
M1、M3、M4。
Alternatively, if the lower-left corner is taken as the first vertex of each triangle, the order in which the indices of the pixel coordinates of the two triangles are stored into the buffer may also be:
M3、M1、M2;
M3、M2、M4。
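As a hedged sketch of this step, the following Python function builds such an index cache for a grid of w × h pixel points, using the top-left-first ordering (M1, M4, M2; M1, M3, M4) listed above; the flat index M = j × w + i and the function name are illustrative assumptions consistent with the row-major numbering described earlier:

    import numpy as np

    def build_index_buffer(w, h):
        # Each quad of four neighboring pixels becomes two triangles,
        # stored vertex index by vertex index (top-left-first ordering).
        indices = []
        for j in range(h - 1):           # quad rows
            for i in range(w - 1):       # quad columns
                m1 = j * w + i               # top-left     (i,     j)
                m2 = j * w + (i + 1)         # top-right    (i + 1, j)
                m3 = (j + 1) * w + i         # bottom-left  (i,     j + 1)
                m4 = (j + 1) * w + (i + 1)   # bottom-right (i + 1, j + 1)
                indices += [m1, m4, m2,      # first triangle
                            m1, m3, m4]      # second triangle
        return np.asarray(indices, dtype=np.uint32)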
in step S14, according to the stored index number, the pixel coordinates corresponding to the index number, and the world coordinates and texture coordinates of the corresponding pixel point are read, and a three-dimensional image corresponding to the two-dimensional image is generated by performing rendering processing.
The electronic device can obtain the pixel coordinates corresponding to the index number according to the sequence of the index numbers stored in the cache, and read the world coordinates and the texture coordinates of the pixel points corresponding to the pixel coordinates according to the pixel coordinates so as to perform rendering processing to generate the three-dimensional image corresponding to the two-dimensional image.
According to the method provided by the embodiments of the present application, a 3D image processing model can be formed from the two-dimensional image and the depth information by generating the world coordinates, the texture coordinates, and the cache of index numbers of the pixel coordinates of the three vertices of each triangle; with the two-dimensional image used as the texture on the model, the complete 3D image can finally be restored in the 3D scene. When the camera moves or other parameters are adjusted, the generated model of the picture remains usable, achieving the effect required of the 3D image. Owing to the rendering characteristic of a Graphics Processing Unit (GPU), when rendering a texture, processing such as linear interpolation is performed, according to the existing texture information, on pixel points in a triangular region that lack pixel information. Specifically, according to the color values corresponding to the three pixel vertices of the triangle, the color values of the pixel points within the triangle area, other than the three pixel vertices, whose texture information is missing are obtained through a linear interpolation algorithm, so that image warping or breakage caused by missing pixel information can be avoided.
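To illustrate the interpolation the rasterizer performs, the following sketch computes a linearly interpolated color at a point inside a triangle from the colors of its three vertices using barycentric weights; it mimics on the CPU what the GPU does automatically, and all names are illustrative:

    import numpy as np

    def interpolate_color(p, v0, v1, v2, c0, c1, c2):
        # p: 2D point inside the triangle; v0..v2: 2D vertex positions;
        # c0..c2: RGB color values at the three vertices.
        def signed_area(a, b, c):
            return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])
        total = signed_area(v0, v1, v2)
        w0 = signed_area(p, v1, v2) / total   # barycentric weights
        w1 = signed_area(v0, p, v2) / total
        w2 = 1.0 - w0 - w1
        return w0 * np.asarray(c0) + w1 * np.asarray(c1) + w2 * np.asarray(c2)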
The three-dimensional image can be generated through the above operations and rendering processing. Because the method calculates the world coordinate corresponding to each pixel point of the two-dimensional image from the two-dimensional image and the per-pixel depth information, the position of the target object in real space is determined and 3D images under different camera shooting angles can be generated. The 3D scene can therefore be restored even when the camera view angle changes over a large range, which solves the problem in the prior art that the generated 3D image is warped or broken when the camera view angle varies widely.
In an implementation manner, in step S11 in the foregoing embodiment, the world coordinate of each pixel point of the two-dimensional image may be obtained specifically according to the pixel coordinate corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image.
Specifically, the world coordinate corresponding to the pixel point M can be obtained by the following formula:

M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)

where M_{i,j} represents the world coordinate corresponding to the pixel point M; i, j represent the pixel coordinates of the pixel point M; P0_{i,j} represents the world coordinate of the point P0 obtained by mapping the pixel point M onto the plane Z = 0; C represents the world coordinate (0, 0, f) of the center point of the camera lens; D_{i,j} represents the depth information of the pixel point M; and f represents the focal length of the camera.
The above formula can be derived by the following procedure:
first, assume that the position C of the center point of the camera in the world coordinate system is (0, 0, f), the focal length of the camera is f, and the center point of the camera is always directed toward the center point P of the two-dimensional image.
The focal length of the camera lens is the distance from the rear optical principal point of the lens to the focal point at which light is gathered, and is an important performance index of the lens. Here, the focal length is specifically the distance from the optical center point of the camera lens to the two-dimensional imaging plane; that is, the distance from the camera center point C to the center point P of the two-dimensional image in fig. 5 is f, in meters (m).
The relationship between the focal length f of the camera and the field of view (FOV) of the camera satisfies:

f = width / (2 × tan(FOV / 2))

where width is the width of the two-dimensional image in world coordinates.
the field angle is also called as a viewing angle, and is an included angle formed by two edges of a lens of an optical instrument as a vertex and the maximum range of the object image of a measured object capable of passing through the lens. The size of the field angle determines the field of view of the optical instrument. For example, in the embodiment of the present application, it may be assumed that the range of the angle of view is 40 ° to 80 °.
If the image resolution of the two-dimensional image is w × h, and the physical size of the two-dimensional image is width × height, then width represents the width of the two-dimensional image and height represents its height, both of which may be in meters (m). According to the proportional relationship between the image resolution and the size of the two-dimensional image, width/height = w/h.
Then, when h > w of the two-dimensional image, assuming that width of the two-dimensional image under the world coordinate is 1, length of the two-dimensional image under the world coordinate is h/w; when h < w of the two-dimensional image, assuming that height of the two-dimensional image in world coordinates is 1, width of the two-dimensional image in world coordinates is w/h.
Further, assuming that the depth values of all the pixels of the two-dimensional image are 0, all the pixel points of the two-dimensional image lie on the plane Z = 0 in world coordinates. Then the distance between adjacent pixels in the world coordinate system along the Y axis may be h_y = height/h, and along the X axis w_x = width/w.
The pixel coordinates of the pixel points of the two-dimensional image may be as follows: the pixel coordinate of the upper-left corner of the two-dimensional image is (0, 0), and the coordinates increase toward the lower-right corner, whose pixel coordinate is (w-1, h-1). Assuming the pixel coordinate of the pixel point M is (i, j), the world coordinate of the corresponding point P0 in the plane Z = 0 may be:

P0_{i,j} = (-width/2 + i × w_x, height/2 - j × h_y, 0)
Then, according to the correspondence between a pixel point on the two-dimensional plane and the point it represents in real space, as shown in fig. 5, the triangle similarity principle gives:

(M_{i,j} - C) / (P0_{i,j} - C) = D_{i,j} / f, that is, M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)
in the above embodiment, by assuming the position relationship between the camera coordinate system using the central point of the camera as the origin and the world coordinate system, the position in the real world corresponding to the pixel point on the two-dimensional image, that is, the world coordinate of the pixel point, can be calculated by relatively simplifying the calculation. The world coordinate of each pixel point can be obtained according to the formula.
In an implementation manner, in step S12 in the foregoing embodiment, the texture coordinate of each pixel point is obtained from its pixel coordinate according to the relationship between pixel coordinates and texture coordinates, specifically by the following formulas:
When the pixel resolution of the two-dimensional image on the Y axis is less than or equal to the pixel resolution on the X axis, that is, h ≤ w, the texture coordinate of the pixel point M(i, j) may be: (i × w_x × h/w, 1 - j × h_y).
Or, when the pixel resolution of the two-dimensional image on the Y axis is greater than or equal to the pixel resolution on the X axis, that is, h ≥ w, the texture coordinate of the pixel point M(i, j) may be: (i × w_x, 1 - j × h_y × w/h).
In this embodiment, the texture coordinates are obtained by converting the pixel coordinates of the pixel points; when the image is rendered, the texture map can be rendered according to the texture coordinates, so that the 3D image is generated accurately and image warping and distortion are avoided.
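A corresponding sketch of step S12 computes the texture coordinate of every pixel point from the formulas above; the grid layout mirrors the world-coordinate sketch, and the names are illustrative:

    import numpy as np

    def texture_coordinates(w, h):
        # Texture coordinates in [0, 1], origin at the lower-left corner.
        if h <= w:
            width, height = w / h, 1.0
        else:
            width, height = 1.0, h / w
        wx, hy = width / w, height / h   # w_x and h_y from the text

        i = np.arange(w)[None, :]        # columns
        j = np.arange(h)[:, None]        # rows
        if h <= w:
            u = i * wx * h / w
            v = 1.0 - j * hy
        else:
            u = i * wx
            v = 1.0 - j * hy * w / h
        u, v = np.broadcast_arrays(u, v)
        return np.stack([u, v], axis=-1)  # shape (h, w, 2)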
Fig. 6 is a block diagram illustrating a three-dimensional image generation apparatus according to an exemplary embodiment. Referring to fig. 6, the apparatus 600 includes an image processing module 601, an index buffering module 602, and an image rendering module 603.
The image processing module 601 is configured to obtain the world coordinate of each pixel point according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image.
The image processing module 601 is further configured to obtain the texture coordinate of each pixel point according to the pixel coordinate of each pixel point.
The index caching module 602 is configured to store the index number corresponding to each pixel point in the order of the three pixel vertices of a triangle, wherein the triangle is the basic unit of a three-dimensional model, and the three-dimensional model comprises a plurality of triangles.
The image rendering module 603 is configured to read, according to the stored index numbers, the pixel coordinates corresponding to each index number and the world coordinates and texture coordinates of the corresponding pixel points, and to perform rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.
In one embodiment, the image processing module 601 is further specifically configured to: obtain the world coordinates of each pixel point of the two-dimensional image according to the pixel coordinate corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image.
In one embodiment, the world coordinate corresponding to the pixel point M satisfies:

M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)

where M_{i,j} represents the world coordinate corresponding to the pixel point M; i, j represent the pixel coordinates of the pixel point M; P0_{i,j} represents the world coordinate of the point P0 obtained by mapping the pixel point M onto the plane Z = 0; C represents the world coordinate (0, 0, f) of the center point of the camera lens; D_{i,j} represents the depth information of the pixel point M; and f represents the focal length of the camera.
In one embodiment, the image processing module 601 is further specifically configured such that: when h is less than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x × h/w, 1 - j × h_y); when h is greater than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x, 1 - j × h_y × w/h), where w_x represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the X axis, h_y represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the Y axis, w represents the pixel resolution of the two-dimensional image on the X axis, and h represents the pixel resolution of the two-dimensional image on the Y axis.
In one embodiment, the index caching module 602 is specifically configured to: sequentially store the index numbers corresponding to the pixel points into a cache in the clockwise or counterclockwise order of the three pixel vertices of each triangle formed for image rendering.
In one embodiment, the image rendering module 603 is specifically configured to: obtain, through a linear interpolation algorithm and according to the color values corresponding to the three pixel vertices of a triangle formed for image rendering, the color values of the pixel points within the triangle area, other than the three pixel vertices, whose texture information is missing.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 7 is a schematic diagram illustrating an apparatus 700 according to an exemplary embodiment, where the apparatus 700 may be used to generate three-dimensional images according to the above-described embodiments. As shown in fig. 7, the apparatus 700 may include at least one processor 701, a communication line 702, and a memory 703.
The processor 701 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the present disclosure.
Communication link 702 may include a path to transfer information between the aforementioned components, such as a bus.
The memory 703 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disk read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disk, laser disk, optical disk, digital versatile disk, blu-ray disk, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via a communication line 702. The memory may also be integral to the processor. The memory provided by the disclosed embodiments may generally be non-volatile. The memory 703 is used for storing computer-executable instructions for executing the present disclosure, and is controlled by the processor 701. The processor 701 is configured to execute computer-executable instructions stored in the memory 703 to implement the methods provided by the embodiments of the present disclosure.
Optionally, the computer-executable instructions in the embodiments of the present disclosure may also be referred to as application program codes, which are not specifically limited in the embodiments of the present disclosure.
In particular implementations, for one embodiment, the processor 701 may include one or more CPUs, such as CPU0 and CPU1 in fig. 7.
In particular implementations, as an embodiment, the apparatus 700 may include multiple processors, such as the processor 701 and the processor 707 in fig. 7. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor here may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In particular implementations, apparatus 700 may also include a communication interface 704, as one embodiment. The communication interface 704 may be any device, such as a transceiver, for communicating with other devices or communication networks, such as an ethernet interface, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), etc.
In particular implementations, apparatus 700 may also include an output device 705 and an input device 706 as an example. An output device 705 is in communication with the processor 701 and may display information in a variety of ways. For example, the output device 705 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display device, a Cathode Ray Tube (CRT) display device, a projector (projector), or the like. The input device 706 is in communication with the processor 701 and may receive user input in a variety of ways. For example, the input device 706 may be a mouse, a keyboard, a touch screen device, or a sensing device, among others.
In a specific implementation, the apparatus 700 may be a desktop, a laptop, a web server, a Personal Digital Assistant (PDA), a mobile phone, a tablet, a wireless terminal device, an embedded device, or a device with a similar structure as in fig. 7. The disclosed embodiments do not limit the type of device 700.
In some embodiments, the processor 701 in fig. 7 may cause the apparatus 700 to perform the three-dimensional image generation method in the above method embodiments by calling the computer-executable instructions stored in the memory 703.
Illustratively, the functions/implementation processes of the image processing module 601, the index caching module 602, and the image rendering module 603 in fig. 6 may be implemented by the processor 701 in fig. 7 calling computer-executable instructions stored in the memory 703.
In an exemplary embodiment, there is also provided a storage medium comprising instructions, such as a memory 703 comprising instructions executable by a processor 701 of the apparatus 700 to perform the method described above.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented using a software program, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the corresponding claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A three-dimensional image generation method, comprising:
obtaining world coordinates of each pixel point according to the two-dimensional image and depth information corresponding to each pixel point of the two-dimensional image;
obtaining texture coordinates of each pixel point according to the pixel coordinates of each pixel point;
storing the index number corresponding to each pixel point according to the sequence of three pixel vertexes of a triangle, wherein the triangle is a basic unit forming a three-dimensional model, and the three-dimensional model comprises a plurality of triangles;
and reading the pixel coordinate corresponding to the index number, the world coordinate and the texture coordinate of the corresponding pixel point according to the stored index number, and performing rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.
2. The method according to claim 1, wherein the obtaining the world coordinates of each pixel point according to the two-dimensional image and the depth information corresponding to each pixel point of the two-dimensional image comprises:
and obtaining world coordinates of each pixel point of the two-dimensional image according to the pixel coordinates corresponding to each pixel point of the two-dimensional image, the focal length of the camera and the depth information corresponding to each pixel point of the two-dimensional image.
3. The method according to claim 2, wherein the obtaining the world coordinate of each pixel point of the two-dimensional image according to the pixel coordinate corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image specifically comprises:
the world coordinate corresponding to the pixel point M satisfies:

M_{i,j} = C + (D_{i,j} / f) × (P0_{i,j} - C)

wherein M_{i,j} represents the world coordinate corresponding to the pixel point M; i, j represent the pixel coordinates of the pixel point M; P0_{i,j} represents the world coordinate of the point P0 obtained by mapping the pixel point M onto the plane Z = 0; C represents the world coordinate (0, 0, f) of the center point of the camera lens; D_{i,j} represents the depth information of the pixel point M; and f represents the focal length of the camera.
4. The method according to claim 1 or 2, wherein the obtaining texture coordinates corresponding to the pixel coordinates of each pixel point according to the pixel coordinates of each pixel point comprises:
when h is less than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x × h/w, 1 - j × h_y);
when h is greater than or equal to w, the texture coordinate of the pixel point M satisfies: M(i, j) = (i × w_x, 1 - j × h_y × w/h), wherein w_x represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the X axis, h_y represents the distance in world coordinates between the pixel point M of the two-dimensional image and its adjacent pixel point on the Y axis, w represents the pixel resolution of the two-dimensional image on the X axis, and h represents the pixel resolution of the two-dimensional image on the Y axis.
5. The method according to claim 1 or 2, wherein the storing the index number corresponding to each pixel point according to the order of three pixel vertices of the triangle specifically comprises:
sequentially storing the index numbers corresponding to the pixel points into a cache in the clockwise order or the anticlockwise order of the three pixel vertices of a triangle formed in image rendering.
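A sketch of how the index numbers might be written into the cache for a pixel grid, two triangles per cell, in a fixed winding order. The grid layout and which order counts as clockwise (this depends on whether the Y axis points up or down) are assumptions, not part of the claim.

```python
def grid_indices(w, h, anticlockwise=True):
    """Sketch of claim 5: store the index numbers of the three pixel
    vertices of each rendered triangle into a buffer (the 'cache') in
    a consistent winding order. Which order reads as clockwise depends
    on the coordinate convention; the flag here is illustrative."""
    buf = []
    for j in range(h - 1):
        for i in range(w - 1):
            a = j * w + i        # top-left vertex of the grid cell
            b = a + 1            # top-right
            c = a + w            # bottom-left
            d = c + 1            # bottom-right
            if anticlockwise:
                buf += [a, c, b, b, c, d]
            else:
                buf += [a, b, c, b, d, c]
    return buf

# A 3 x 2 pixel grid yields 2 cells, hence 4 triangles (12 indices).
print(grid_indices(3, 2))
```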
6. The method according to claim 1 or 2, wherein before the rendering process is performed to generate the three-dimensional image corresponding to the two-dimensional image, the method further comprises:
obtaining, through a linear interpolation algorithm and according to the color values corresponding to the three pixel vertices of a triangle formed in image rendering, the color values of the pixel points within the triangle area that lack texture information, other than the three pixel vertices.
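Claim 6 specifies a linear interpolation but not a particular scheme; barycentric weighting over the triangle is the standard realization, and that is what this sketch assumes.

```python
import numpy as np

def interpolate_color(p, v0, v1, v2, c0, c1, c2):
    """One standard realization of claim 6: linearly interpolate the
    color at point p inside the triangle (v0, v1, v2) from the colors
    (c0, c1, c2) at its three pixel vertices, via barycentric weights."""
    v0, v1, v2, p = (np.asarray(x, dtype=float) for x in (v0, v1, v2, p))
    # Solve p = v0 + u*(v1 - v0) + v*(v2 - v0) for (u, v).
    m = np.column_stack([v1 - v0, v2 - v0])
    u, v = np.linalg.solve(m, p - v0)
    w0, w1, w2 = 1 - u - v, u, v      # barycentric weights sum to 1
    return w0 * np.asarray(c0) + w1 * np.asarray(c1) + w2 * np.asarray(c2)

# Example: the color at the centroid is the average of the vertex colors.
print(interpolate_color([1/3, 1/3], [0, 0], [1, 0], [0, 1],
                        [255, 0, 0], [0, 255, 0], [0, 0, 255]))
```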
7. A three-dimensional image generation apparatus, characterized in that the apparatus comprises:
the image processing module is configured to execute depth information corresponding to the two-dimensional image and each pixel point of the two-dimensional image to obtain a world coordinate of each pixel point;
the image processing module is also configured to execute the operation of obtaining the texture coordinate of each pixel point according to the pixel coordinate of each pixel point;
the index cache module is configured to store the index number corresponding to each pixel point according to the sequence of three pixel vertexes of a triangle, wherein the triangle is a basic unit forming a three-dimensional model, and the three-dimensional model comprises a plurality of triangles;
and the image rendering module is configured to read the pixel coordinates corresponding to the index number and the world coordinates and texture coordinates of the corresponding pixel points according to the stored index number, and perform rendering processing to generate a three-dimensional image corresponding to the two-dimensional image.
8. The apparatus of claim 7, wherein the image processing module is specifically configured to:
obtain the world coordinates of each pixel point of the two-dimensional image according to the pixel coordinates corresponding to each pixel point of the two-dimensional image, the focal length of the camera, and the depth information corresponding to each pixel point of the two-dimensional image.
9. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the three-dimensional image generation method of any one of claims 1 to 6.
10. A storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform the three-dimensional image generation method of any one of claims 1 to 6.
CN202010394664.6A 2020-05-11 2020-05-11 Three-dimensional image generation method and device, electronic equipment and storage medium Active CN113643414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010394664.6A CN113643414B (en) 2020-05-11 2020-05-11 Three-dimensional image generation method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113643414A (en) 2021-11-12
CN113643414B CN113643414B (en) 2024-02-06

Family

ID=78415575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010394664.6A Active CN113643414B (en) 2020-05-11 2020-05-11 Three-dimensional image generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113643414B (en)


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000155851A (en) * 1998-11-20 2000-06-06 Sony Corp Texture mapping device and rendering device equipped with the same and information processor
US20020027555A1 (en) * 2000-06-28 2002-03-07 Kenichi Mori Method of rendering motion blur image and apparatus therefor
WO2006095481A1 (en) * 2005-03-07 2006-09-14 Sony Computer Entertainment Inc. Texture processing device, drawing processing device, and texture processing method
WO2009097714A1 (en) * 2008-02-03 2009-08-13 Panovasic Technology Co., Ltd. Depth searching method and depth estimating method for multi-viewing angle video image
US20100289798A1 (en) * 2009-05-13 2010-11-18 Seiko Epson Corporation Image processing method and image processing apparatus
WO2013115463A1 (en) * 2012-02-01 2013-08-08 에스케이플래닛 주식회사 Device and method for processing images
US20160034040A1 (en) * 2014-07-29 2016-02-04 Sony Computer Entertainment Inc. Information processing device, information processing method, and computer program
CN106023302A (en) * 2016-05-06 2016-10-12 刘进 Mobile communication terminal, three-dimensional reconstruction method thereof and server
WO2017083509A1 (en) * 2015-11-13 2017-05-18 Craig Peterson Three dimensional system
CN106813568A (en) * 2015-11-27 2017-06-09 阿里巴巴集团控股有限公司 object measuring method and device
WO2017104984A1 (en) * 2015-12-15 2017-06-22 삼성전자 주식회사 Method and apparatus for generating three-dimensional model using volumetric closest point approach method
WO2019140414A1 (en) * 2018-01-14 2019-07-18 Light Field Lab, Inc. Systems and methods for rendering data from a 3d environment
CN110111262A (en) * 2019-03-29 2019-08-09 北京小鸟听听科技有限公司 A kind of projector distortion correction method, device and projector


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114494550A (en) * 2021-12-30 2022-05-13 北京城市网邻信息技术有限公司 WebGPU-based rendering method, electronic device and storage medium
CN114494550B (en) * 2021-12-30 2022-11-22 北京城市网邻信息技术有限公司 WebGPU-based rendering method, electronic device and storage medium
CN114463475A (en) * 2022-04-08 2022-05-10 山东捷瑞数字科技股份有限公司 Multi-camera rendering image fusion method based on edge correction
CN114463475B (en) * 2022-04-08 2022-07-19 山东捷瑞数字科技股份有限公司 Edge correction-based multi-camera rendering image fusion method
TWI824550B (en) * 2022-06-07 2023-12-01 鴻海精密工業股份有限公司 Method for generating distorted image, electronic device and storage medium
WO2023246863A1 (en) * 2022-06-23 2023-12-28 未来科技(襄阳)有限公司 3d image generating method and apparatus, and computer device

Also Published As

Publication number Publication date
CN113643414B (en) 2024-02-06

Similar Documents

Publication Publication Date Title
CN109242961B (en) Face modeling method and device, electronic equipment and computer readable medium
US10540576B1 (en) Panoramic camera systems
EP3614340B1 (en) Methods and devices for acquiring 3d face, and computer readable storage media
CN113643414B (en) Three-dimensional image generation method and device, electronic equipment and storage medium
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
JP6902122B2 (en) Double viewing angle Image calibration and image processing methods, equipment, storage media and electronics
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
KR20120093063A (en) Techniques for rapid stereo reconstruction from images
JP2014071850A (en) Image processing apparatus, terminal device, image processing method, and program
KR102317182B1 (en) Apparatus for generating composite image using 3d object and 2d background
CN114494388B (en) Three-dimensional image reconstruction method, device, equipment and medium in large-view-field environment
WO2021003807A1 (en) Image depth estimation method and device, electronic apparatus, and storage medium
CN113689578A (en) Human body data set generation method and device
CN113724391A (en) Three-dimensional model construction method and device, electronic equipment and computer readable medium
CN114511447A (en) Image processing method, device, equipment and computer storage medium
JP3629243B2 (en) Image processing apparatus and method for rendering shading process using distance component in modeling
CN113436247B (en) Image processing method and device, electronic equipment and storage medium
US20230005213A1 (en) Imaging apparatus, imaging method, and program
CN116188349A (en) Image processing method, device, electronic equipment and storage medium
JP2017215706A (en) Video synthesis method, video acquisition device, video synthesis system, and computer program
CN112615993A (en) Depth information acquisition method, binocular camera module, storage medium and electronic equipment
CN113706692B (en) Three-dimensional image reconstruction method, three-dimensional image reconstruction device, electronic equipment and storage medium
CN113538655B (en) Virtual face generation method and equipment
US11961266B2 (en) Multiview neural human prediction using implicit differentiable renderer for facial expression, body pose shape and clothes performance capture
US20220319055A1 (en) Multiview neural human prediction using implicit differentiable renderer for facial expression, body pose shape and clothes performance capture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant