CN108062788A - A kind of three-dimensional rebuilding method, device, equipment and medium - Google Patents

A kind of three-dimensional rebuilding method, device, equipment and medium

Info

Publication number
CN108062788A
CN108062788A (application CN201711366015.XA)
Authority
CN
China
Prior art keywords
camera
projection
scene
dimensional
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711366015.XA
Other languages
Chinese (zh)
Inventor
彭国俊
万月亮
王梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ruian Technology Co Ltd
Original Assignee
Beijing Ruian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ruian Technology Co Ltd filed Critical Beijing Ruian Technology Co Ltd
Priority to CN201711366015.XA priority Critical patent/CN108062788A/en
Publication of CN108062788A publication Critical patent/CN108062788A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present invention disclose a three-dimensional reconstruction method, apparatus, device and medium, relating to the field of three-dimensional reconstruction. The method includes: obtaining a depth image of a scene based on a depth camera; determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, where the shooting line-of-sight direction is the direction connecting the camera and the depth image; determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio; and performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio. The embodiments of the present invention thus provide a method, apparatus, device and medium for three-dimensional reconstruction of a three-dimensional object based on a depth camera.

Description

A kind of three-dimensional rebuilding method, device, equipment and medium
Technical field
Embodiments of the present invention relate to the field of three-dimensional reconstruction, and in particular to a three-dimensional reconstruction method, apparatus, device and medium.
Background technology
In traditional imaging, information in three-dimensional space is mapped into two-dimensional space, so the distance (depth) information is lost. To recover three-dimensional information from two-dimensional space, range information must be obtained in addition to orientation information; only then can the three-dimensional information be recovered accurately.
Binocular stereo vision is currently the most widely used three-dimensional data analysis method. It recovers depth information mainly from the parallax produced when cameras observe the same object in a scene from different positions.
However, binocular stereo vision still has many problems. For example, finding and matching corresponding points is the key and the main difficulty of the whole binocular vision pipeline; if the matching is inaccurate, the reconstructed three-dimensional information is also inaccurate.
Summary of the invention
Embodiments of the present invention provide a three-dimensional reconstruction method, apparatus, device and medium, to realize three-dimensional reconstruction of a three-dimensional object based on a depth camera.
In a first aspect, an embodiment of the present invention provides a three-dimensional reconstruction method, which includes:
obtaining a depth image of a scene based on a depth camera;
determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, where the shooting line-of-sight direction is the direction connecting the camera and the depth image;
determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
Further, determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio includes:
determining, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
determining the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
Further, performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio includes:
determining, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
performing three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
Further, determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction includes:
calibrating the camera to obtain a perspective projection matrix;
determining, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
In a second aspect, an embodiment of the present invention further provides a three-dimensional reconstruction apparatus, which includes:
an acquisition module, configured to obtain a two-dimensional depth image of a scene based on a depth camera;
a projection module, configured to determine, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, where the shooting line-of-sight direction is the direction connecting the camera and the depth image;
a ratio determining module, configured to determine, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
a three-dimensional reconstruction module, configured to perform three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
Further, the ratio determining module includes:
a distance determining unit, configured to determine, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
a ratio determining unit, configured to determine the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
Further, the three-dimensional reconstruction module includes:
a coordinate determining unit, configured to determine, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
a three-dimensional reconstruction unit, configured to perform three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
Further, the projection module includes:
a matrix acquiring unit, configured to calibrate the camera and obtain a perspective projection matrix;
a projecting unit, configured to determine, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
In a third aspect, an embodiment of the present invention further provides a device, which includes:
one or more processors;
a storage means for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the three-dimensional reconstruction method described in any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the three-dimensional reconstruction method described in any embodiment of the present invention.
In embodiments of the present invention, a first right triangle is constructed from the position of the depth camera, the foot of the perpendicular from the depth camera to the ground, and the point at which the actual scene is projected onto the horizontal plane along the shooting line-of-sight direction; a second right triangle is constructed from a point of the actual scene in real space, the foot of the perpendicular from that point to the ground, and the same projection point on the horizontal plane. Because the three interior angles of the first right triangle equal the three interior angles of the second right triangle, the two triangles are similar and their corresponding sides are proportional. On this basis, the length of the first hypotenuse is determined from the position coordinates of the depth camera and the projection coordinates, and the length of the second hypotenuse is determined from the depth of each pixel in the depth image of the actual scene. The ratio of corresponding sides is determined from the two hypotenuse lengths. From this ratio, the set of three-dimensional coordinates of the actual scene in the world coordinate system is determined, and the actual scene is reconstructed in three dimensions from the resulting set of three-dimensional coordinates.
Description of the drawings
Fig. 1 is a flowchart of a three-dimensional reconstruction method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a three-dimensional reconstruction method provided by Embodiment 2 of the present invention;
Fig. 3 is a schematic diagram of the positional relationship among the camera, the scene and the scene projection provided by Embodiment 2 of the present invention;
Fig. 4 is a structural diagram of a three-dimensional reconstruction apparatus provided by Embodiment 3 of the present invention;
Fig. 5 is a structural diagram of a device provided by Embodiment 4 of the present invention.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention, not to limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Embodiment one
Fig. 1 is a flowchart of a three-dimensional reconstruction method provided by Embodiment 1 of the present invention. This embodiment is applicable to the case of performing three-dimensional reconstruction of an actual scene. The method can be performed by a three-dimensional reconstruction apparatus, which can be implemented in software and/or hardware. Referring to Fig. 1, the three-dimensional reconstruction method provided by this embodiment includes:
S110: obtain a depth image of a scene based on a depth camera.
Here, the scene is a real scene. The depth image can be obtained by the depth camera. The depth image includes the coordinates of the imaging point of each point of the real scene in the RGB image and the depth of each point of the real scene. Specifically, the depth represents the distance from each point in the real scene to the lens (mirror-surface) center of the depth camera.
To obtain the depth information of the scene accurately and quickly within the real-time detection range, the depth camera includes, but is not limited to, a depth camera based on pumped far-infrared (FIR) laser technology.
S120: determine, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
Here, the shooting line-of-sight direction is the direction connecting the camera and the depth image. Typically, when the projection of a first point in the scene is sought, the shooting line-of-sight direction is the direction connecting the camera position coordinates and the image coordinates of that first point.
Specifically, determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction includes:
calibrating the camera to obtain a perspective projection matrix;
determining, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
Here, the perspective projection matrix represents the mapping between the two-dimensional image coordinate system of the image captured by the camera and the three-dimensional world coordinate system.
Optionally, there are many ways to calibrate the camera: depending on whether a calibration reference object is used, they can be divided into traditional camera calibration methods and camera self-calibration methods; depending on the model used, into linear and non-linear methods; and depending on how the parameters are solved, into explicit and implicit methods. Any of the above calibration methods may be employed in this embodiment to calibrate the camera, and this embodiment places no restriction on the choice.
Typically, the projection matrix and the two-dimensional image coordinates in the depth image are substituted into the conversion formula between the two-dimensional image coordinate system and the three-dimensional world coordinate system; the coordinate value of the third dimension in the three-dimensional world coordinate system is then set to zero, and the formula is solved for the coordinate values of the first and second dimensions, thereby determining the three-dimensional coordinates of the projection in the three-dimensional world coordinate system.
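As a minimal illustration only (not part of the original disclosure), the following Python sketch shows how a pixel (u, v) can be mapped to its intersection with the horizontal plane when a 3x4 perspective projection matrix M is assumed to be available; it sets z_w = 0 and solves the resulting pair of linear equations:

    import numpy as np

    def ground_plane_intersection(M, u, v):
        """Map pixel (u, v) to the point (x_w, y_w, 0) where its line of sight
        meets the horizontal plane, given a 3x4 perspective matrix M.
        Derived by setting z_w = 0 in Z_c*[u, v, 1]^T = M*[x_w, y_w, z_w, 1]^T
        and eliminating Z_c."""
        A = np.array([[M[0, 0] - u * M[2, 0], M[0, 1] - u * M[2, 1]],
                      [M[1, 0] - v * M[2, 0], M[1, 1] - v * M[2, 1]]])
        b = np.array([u * M[2, 3] - M[0, 3], v * M[2, 3] - M[1, 3]])
        x_w, y_w = np.linalg.solve(A, b)
        return x_w, y_w

The same computation reappears with concrete formulas in Embodiment 2 (S220).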
S130: determine, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio.
Specifically, determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio includes:
determining, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
determining the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
Here, the distance from each pixel in the projection to the camera can be determined by substituting the coordinates of each pixel in the projection and the position coordinates of the camera into the two-point distance formula.
S140: perform three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
Specifically, performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio includes:
determining, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
performing three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
Here, the three-dimensional coordinates of each point in the scene, determined from the coordinates of each pixel in the projection, the position of the camera and the similarity ratio, form the set of three-dimensional coordinates of the scene.
According to how the reconstructed surface is represented, reconstruction methods can be divided into parametric reconstruction, implicit reconstruction, deformation-based reconstruction, mesh reconstruction and subdivision reconstruction.
Parametric reconstruction realizes the shape design of industrial products or mechanical parts with curves or surfaces in parametric form; implicit reconstruction has certain advantages when performing intersection operations on curves and surfaces; deformation-based reconstruction starts from an initial surface, determines a deformation direction, and deforms the surface along that direction until the curve and surface reconstruction is complete; mesh reconstruction represents the model with polygons; subdivision reconstruction completes the surface reconstruction from points, lines, faces and other primitives.
Optionally, any of the above reconstruction methods may be employed in this embodiment to realize the three-dimensional reconstruction of the scene.
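For illustration only, a minimal mesh-reconstruction sketch is given below. It assumes the set of three-dimensional coordinates has already been computed as an N x 3 NumPy array and that the Open3D library is available; neither assumption comes from the original disclosure, and Poisson reconstruction is just one of the surface-reconstruction options mentioned above.

    import numpy as np
    import open3d as o3d

    def reconstruct_mesh(points: np.ndarray) -> o3d.geometry.TriangleMesh:
        """Turn the recovered set of 3D coordinates (N x 3 array) into a
        triangle mesh via Poisson surface reconstruction."""
        pcd = o3d.geometry.PointCloud()
        pcd.points = o3d.utility.Vector3dVector(points)
        # Poisson reconstruction needs per-point normals.
        pcd.estimate_normals(
            search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))
        mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
        return mesh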
In the technical solution of this embodiment of the present invention, a first right triangle is constructed from the position of the depth camera, the foot of the perpendicular from the depth camera to the ground, and the point at which the actual scene is projected onto the horizontal plane along the shooting line-of-sight direction; a second right triangle is constructed from a point of the actual scene in real space, the foot of the perpendicular from that point to the ground, and the same projection point on the horizontal plane. Because the three interior angles of the first right triangle equal the three interior angles of the second right triangle, the two triangles are similar and their corresponding sides are proportional. On this basis, the length of the first hypotenuse is determined from the position coordinates of the depth camera and the projection coordinates, the length of the second hypotenuse is determined from the depth of each pixel in the depth image of the actual scene, and the ratio of corresponding sides is determined from the two hypotenuse lengths. From this ratio, the set of three-dimensional coordinates of the actual scene in the world coordinate system is determined, and the actual scene is reconstructed in three dimensions from the resulting set of three-dimensional coordinates.
Embodiment two
Fig. 2 is a flowchart of a three-dimensional reconstruction method provided by Embodiment 2 of the present invention. This embodiment is an alternative solution proposed on the basis of the above embodiment.
For ease of understanding this embodiment, some of the concepts involved are introduced first, as follows:
Camera calibration: calibration of the camera obtains the mapping between the camera's two-dimensional image coordinates and the world coordinate system. A calibrated camera carries more information about the scene and the image, which helps the subsequent reconstruction of the three-dimensional contour.
Perspective imaging model: reflects the mapping between two-dimensional image coordinates and the world coordinate system. The ideal perspective imaging model is the pinhole camera model. Ideally, the above mapping can be regarded as the following perspective linear transformation:

$$Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}m_{11}&m_{12}&m_{13}&m_{14}\\ m_{21}&m_{22}&m_{23}&m_{24}\\ m_{31}&m_{32}&m_{33}&m_{34}\end{bmatrix}\begin{bmatrix}x_w\\ y_w\\ z_w\\ 1\end{bmatrix}$$

where $x_w, y_w, z_w$ are the coordinates in the world coordinate system, $(u, v)$ are the coordinates of a pixel in the image, $m_{ij}$ are the elements of the perspective projection matrix, and $Z_c$ is the Z coordinate in the camera coordinate system.
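To make the notation concrete, here is a small illustrative Python sketch of this forward mapping (the 3x4 matrix M and the world point are assumed inputs, not values given by the disclosure):

    import numpy as np

    def project_world_point(M, xw, yw, zw):
        """Apply Z_c*[u, v, 1]^T = M*[x_w, y_w, z_w, 1]^T and return (u, v, Z_c)."""
        uvw = M @ np.array([xw, yw, zw, 1.0])  # homogeneous image coordinates
        Zc = uvw[2]                            # depth along the camera Z axis
        return uvw[0] / Zc, uvw[1] / Zc, Zc    # pixel coordinates and Z_c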
Perspective projection matrix: the transformation matrix in the perspective imaging model.
Referring to Fig. 2, the three-dimensional reconstruction method provided by this embodiment includes:
S210: calibrate the depth camera using the direct linear transformation and determine the perspective projection matrix, denoted M.
Specifically, the process by which the direct linear transformation method solves the calibration is as follows:
To obtain u and v in the image coordinate system, $Z_c$ must be eliminated; each point then yields the following two equations:

$$m_{11}x_w+m_{12}y_w+m_{13}z_w+m_{14}-u x_w m_{31}-u y_w m_{32}-u z_w m_{33}=u m_{34}$$
$$m_{21}x_w+m_{22}y_w+m_{23}z_w+m_{24}-v x_w m_{31}-v y_w m_{32}-v z_w m_{33}=v m_{34}$$

Using multiple points, a system of linear equations in the unknowns $m_{ij}$ is obtained.
The matrix to be solved has 12 unknowns. Without loss of generality, $m_{34}$ is first set to 1; by choosing six points, 12 equations are constructed, from which the 11 parameters of the perspective matrix $[m'_{11}\; m'_{12}\; m'_{13}\; m'_{14}\; m'_{21}\; m'_{22}\; m'_{23}\; m'_{24}\; m'_{31}\; m'_{32}\; m'_{33}]$ can be obtained, giving the scaled matrix $M'$.
In actual computation, however, because the positions of the selected points contain small errors, more than six points can be chosen to build an overdetermined system, and the least-squares solution is taken as the transformation coefficients of the perspective matrix. In addition, since the elements of the third row of a true perspective matrix must satisfy $m_{31}^2+m_{32}^2+m_{33}^2=1$, the matrix must be rescaled accordingly: $m_{34}=1/\sqrt{m_{31}'^2+m_{32}'^2+m_{33}'^2}$, and multiplying $M'$ by $m_{34}$ then yields the matrix M.
The advantage of the direct linear method is that it requires no iteration and is fast. Applying the direct linear method therefore improves the speed of the three-dimensional reconstruction.
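As an illustrative sketch under the assumptions above (the point correspondences world_pts/image_pts are hypothetical inputs, not data from the disclosure), the over-determined DLT system can be solved by least squares as follows:

    import numpy as np

    def calibrate_dlt(world_pts, image_pts):
        """Estimate the 3x4 perspective projection matrix M from at least six
        world/image point correspondences using the direct linear transformation,
        with m34 fixed to 1 and the system solved by least squares."""
        rows, rhs = [], []
        for (xw, yw, zw), (u, v) in zip(world_pts, image_pts):
            rows.append([xw, yw, zw, 1, 0, 0, 0, 0, -u * xw, -u * yw, -u * zw])
            rhs.append(u)
            rows.append([0, 0, 0, 0, xw, yw, zw, 1, -v * xw, -v * yw, -v * zw])
            rhs.append(v)
        m, *_ = np.linalg.lstsq(np.asarray(rows, float), np.asarray(rhs, float), rcond=None)
        M_prime = np.append(m, 1.0).reshape(3, 4)      # m34 provisionally set to 1
        m34 = 1.0 / np.linalg.norm(M_prime[2, :3])     # rescale so the third row meets the constraint
        return m34 * M_prime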
S220: based on M and the image coordinates of the scene in the image coordinate system, compute the intersection of the line of sight through each image coordinate with the horizontal plane.
This intersection is the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
The coordinates of the intersection are calculated from the perspective model for each pixel:

$$Z_c u=m_{11}x_w+m_{12}y_w+m_{13}z_w+m_{14},\qquad Z_c v=m_{21}x_w+m_{22}y_w+m_{23}z_w+m_{24},\qquad Z_c=m_{31}x_w+m_{32}y_w+m_{33}z_w+m_{34}$$

Eliminating $Z_c$ and rearranging gives the following pair of linear equations in two unknowns (once $z_w$ is fixed):

$$(m_{11}-u m_{31})x_w+(m_{12}-u m_{32})y_w+(m_{13}-u m_{33})z_w=u m_{34}-m_{14}$$
$$(m_{21}-v m_{31})x_w+(m_{22}-v m_{32})y_w+(m_{23}-v m_{33})z_w=v m_{34}-m_{24}$$

Setting $z_w=0$ and solving the above equations for the given u and v yields the coordinates of the intersection in the world coordinate system, denoted $(x_w, y_w)$. Here, u and v are obtained from the depth image captured by the depth camera.
S230: from the coordinates of the intersection in the world coordinate system, determine the distance from the intersection to the depth camera, denoted distance.
The distance is calculated according to the following formula:

$$\mathrm{distance}(u,v)=\sqrt{(x_w-x_c)^2+(y_w-y_c)^2+H_c^2}$$

where $\mathrm{distance}(u,v)$ is the distance from the intersection corresponding to pixel (u, v) to the depth camera, $(x_w, y_w, 0)$ are the coordinates, in the world coordinate system, of the intersection of the line of sight through (u, v) with the horizontal plane, and $(x_c, y_c, H_c)$ are the position coordinates of the depth camera in the world coordinate system.
S240: determine the ratio of the depth of each pixel in the depth image acquired by the depth camera to the distance from the corresponding intersection to the depth camera as the similarity ratio, denoted ratio.
The similarity ratio is calculated according to the following formula:

$$\mathrm{ratio}(u,v)=\frac{\mathrm{depth}(u,v)}{\mathrm{distance}(u,v)}$$

where $\mathrm{ratio}(u,v)$ is the similarity ratio at coordinate (u, v) and $\mathrm{depth}(u,v)$ is the depth of the corresponding pixel.
S250: determine the set of three-dimensional coordinates of the scene from the intersections, the position coordinates of the depth camera and the similarity ratios.
The three-dimensional coordinates of a scene point are calculated according to the following formulas:

$$x_p=(x_w-x_c)\times\mathrm{ratio}(u,v)+x_c$$
$$y_p=(y_w-y_c)\times\mathrm{ratio}(u,v)+y_c$$
$$H_p=(1-\mathrm{ratio}(u,v))\times H_c$$

where $(x_p, y_p, H_p)$ are the three-dimensional coordinates of the corresponding point in the scene.
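Putting S220 through S250 together, a minimal per-pixel sketch might look as follows (illustrative only; it reuses the ground_plane_intersection helper sketched in Embodiment 1 and assumes the camera position (xc, yc, Hc) and the pixel's depth value are given):

    import numpy as np

    def reconstruct_point(M, u, v, depth_uv, xc, yc, Hc):
        """Recover the 3D coordinates (x_p, y_p, H_p) of the scene point seen at
        pixel (u, v), following S220-S250."""
        xw, yw = ground_plane_intersection(M, u, v)                    # S220
        distance = np.sqrt((xw - xc)**2 + (yw - yc)**2 + Hc**2)        # S230
        ratio = depth_uv / distance                                    # S240: similarity ratio
        xp = (xw - xc) * ratio + xc                                    # S250
        yp = (yw - yc) * ratio + yc
        Hp = (1.0 - ratio) * Hc
        return xp, yp, Hp

Applying this to every pixel of the depth image yields the set of three-dimensional coordinates used in S260.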
S260: perform three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
Illustratively, referring to Fig. 3, a first right triangle is constructed from the position of the depth camera 1, the foot of the perpendicular from the depth camera to the ground, and the intersection 3 at which a point 2 in the actual scene is projected onto the horizontal plane along the shooting line-of-sight direction; a second right triangle is constructed from the point 2 in the actual scene, the foot of the perpendicular from point 2 to the ground, and the same intersection. Because the three interior angles of the first right triangle equal the three interior angles of the second right triangle, the two triangles are similar and their corresponding sides are proportional. The coordinates of intersection 3 are determined from the perspective projection matrix and the image coordinates of point 2 in the actual scene, and the distance between depth camera 1 and intersection 3 is computed from their position coordinates. The ratio of the depth of point 2 to this distance is taken as the similarity ratio. From the similarity ratio, the position coordinates of depth camera 1 and the coordinates of intersection 3, the three-dimensional coordinates of point 2 in the actual scene are determined. Proceeding in the same way, the three-dimensional coordinates of all points in the scene are obtained as the set of three-dimensional coordinates, and the scene is reconstructed in three dimensions from this set.
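Written out, the similar-triangle argument is consistent with the formulas of S230 through S250 (a brief derivation added here for clarity, with camera $C=(x_c,y_c,H_c)$, scene point $P=(x_p,y_p,H_p)$ and intersection $I=(x_w,y_w,0)$, where P lies on the segment CI):

$$\frac{|CP|}{|CI|}=\frac{\mathrm{depth}(u,v)}{\mathrm{distance}(u,v)}=\mathrm{ratio}(u,v)$$

The right triangles formed by C, its foot on the ground and I, and by P, its foot on the ground and I, share the angle at I and are therefore similar, so

$$\frac{H_p}{H_c}=\frac{|PI|}{|CI|}=1-\mathrm{ratio}(u,v)\;\Rightarrow\;H_p=(1-\mathrm{ratio}(u,v))\,H_c,$$
$$x_p=x_c+(x_w-x_c)\,\mathrm{ratio}(u,v),\qquad y_p=y_c+(y_w-y_c)\,\mathrm{ratio}(u,v),$$

which are exactly the formulas of S250.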
In the technical solution of this embodiment of the present invention, the depth camera is calibrated by the direct linear transformation method. Because the direct linear method requires no iteration and is fast, applying it improves the speed of the three-dimensional reconstruction.
Embodiment three
Fig. 4 is a structural diagram of a three-dimensional reconstruction apparatus provided by Embodiment 3 of the present invention. Referring to Fig. 4, the three-dimensional reconstruction apparatus provided by this embodiment includes: an acquisition module 10, a projection module 20, a ratio determining module 30 and a three-dimensional reconstruction module 40.
The acquisition module 10 is configured to obtain a two-dimensional depth image of a scene based on a depth camera;
the projection module 20 is configured to determine, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, where the shooting line-of-sight direction is the direction connecting the camera and each pixel of the depth image;
the ratio determining module 30 is configured to determine, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
the three-dimensional reconstruction module 40 is configured to perform three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
In the technical solution of this embodiment of the present invention, a first right triangle is constructed from the position of the depth camera, the foot of the perpendicular from the depth camera to the ground, and the point at which the actual scene is projected onto the horizontal plane along the shooting line-of-sight direction; a second right triangle is constructed from a point of the actual scene in real space, the foot of the perpendicular from that point to the ground, and the same projection point on the horizontal plane. Because the three interior angles of the first right triangle equal the three interior angles of the second right triangle, the two triangles are similar and their corresponding sides are proportional. On this basis, the length of the first hypotenuse is determined from the position coordinates of the depth camera and the projection coordinates, the length of the second hypotenuse is determined from the depth of each pixel in the depth image of the actual scene, and the ratio of corresponding sides is determined from the two hypotenuse lengths. From this ratio, the set of three-dimensional coordinates of the actual scene in the world coordinate system is determined, and the actual scene is reconstructed in three dimensions from the resulting set of three-dimensional coordinates.
Further, the ratio determining module 30 includes: a distance determining unit and a ratio determining unit.
The distance determining unit is configured to determine, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
the ratio determining unit is configured to determine the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
Further, the three-dimensional reconstruction module 40 includes: a coordinate determining unit and a three-dimensional reconstruction unit.
The coordinate determining unit is configured to determine, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
the three-dimensional reconstruction unit is configured to perform three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
Further, the projection module 20 includes: a matrix acquiring unit and a projecting unit.
The matrix acquiring unit is configured to calibrate the camera and obtain a perspective projection matrix;
the projecting unit is configured to determine, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
Example IV
Fig. 5 is a structural diagram of a device provided by Embodiment 4 of the present invention. As shown in Fig. 5, the device includes a processor 70, a memory 71, an input means 72 and an output means 73. The number of processors 70 in the device may be one or more, and one processor 70 is taken as an example in Fig. 5; the processor 70, the memory 71, the input means 72 and the output means 73 in the device may be connected by a bus or in other ways, and connection by a bus is taken as an example in Fig. 5.
As a computer-readable storage medium, the memory 71 can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the three-dimensional reconstruction method in the embodiments of the present invention (for example, the acquisition module 10, the projection module 20, the ratio determining module 30 and the three-dimensional reconstruction module 40 in the three-dimensional reconstruction apparatus). By running the software programs, instructions and modules stored in the memory 71, the processor 70 performs the various functional applications and data processing of the device, that is, implements the above three-dimensional reconstruction method.
The memory 71 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and application programs required for at least one function, and the data storage area may store data created according to the use of the terminal, and the like. In addition, the memory 71 may include high-speed random access memory and may also include non-volatile memory, for example at least one magnetic disk storage device, flash memory device or other non-volatile solid-state storage device. In some examples, the memory 71 may further include memory remotely located relative to the processor 70, and such remote memory may be connected to the device through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network and combinations thereof.
The input means 72 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the device. The output means 73 may include a display device such as a display screen.
Embodiment five
Embodiment 5 of the present invention also provides a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a three-dimensional reconstruction method, the method including:
obtaining a depth image of a scene based on a depth camera;
determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, where the shooting line-of-sight direction is the direction connecting the camera and the depth image;
determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
Of course, in the storage medium containing computer-executable instructions provided by the embodiment of the present invention, the computer-executable instructions are not limited to the method operations described above, and can also perform the relevant operations in the three-dimensional reconstruction method provided by any embodiment of the present invention.
From the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by software plus the necessary general-purpose hardware, and of course can also be implemented by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, or the part that contributes over the prior art, can be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk or optical disk of a computer, and which includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) to perform the methods described in the embodiments of the present invention.
It is worth noting that, in the embodiment of the above three-dimensional reconstruction apparatus, the included units and modules are only divided according to functional logic, but the division is not limited to the above, as long as the corresponding functions can be realized; in addition, the specific names of the functional units are only for ease of mutual distinction and are not intended to limit the protection scope of the present invention.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described here, and that various obvious changes, readjustments and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, the present invention is not limited to the above embodiments; without departing from the inventive concept, it can also include other, more equivalent embodiments, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A three-dimensional reconstruction method, characterized by comprising:
obtaining a depth image of a scene based on a depth camera;
determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, wherein the shooting line-of-sight direction is the direction connecting the camera and the depth image;
determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
2. The three-dimensional reconstruction method according to claim 1, characterized in that determining, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio comprises:
determining, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
determining the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
3. The three-dimensional reconstruction method according to claim 1, characterized in that performing three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio comprises:
determining, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
performing three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
4. The three-dimensional reconstruction method according to claim 1, characterized in that determining, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction comprises:
calibrating the camera to obtain a perspective projection matrix;
determining, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
5. A three-dimensional reconstruction apparatus, characterized by comprising:
an acquisition module, configured to obtain a two-dimensional depth image of a scene based on a depth camera;
a projection module, configured to determine, from the camera and the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction, wherein the shooting line-of-sight direction is the direction connecting the camera and the depth image;
a ratio determining module, configured to determine, from the projection, the position of the camera and the depth image, the ratio of the distance from each pixel in the projection to the camera to its depth in the depth image as the similarity ratio;
a three-dimensional reconstruction module, configured to perform three-dimensional reconstruction of the scene from the projection, the position of the camera and the similarity ratio.
6. The three-dimensional reconstruction apparatus according to claim 5, characterized in that the ratio determining module comprises:
a distance determining unit, configured to determine, from the projection and the position of the camera, the distance from each pixel in the projection to the camera;
a ratio determining unit, configured to determine the ratio of each pixel's distance to its depth in the depth image as the similarity ratio.
7. The three-dimensional reconstruction apparatus according to claim 5, characterized in that the three-dimensional reconstruction module comprises:
a coordinate determining unit, configured to determine, from the projection, the position of the camera and the similarity ratio, the set of three-dimensional coordinates of the scene;
a three-dimensional reconstruction unit, configured to perform three-dimensional reconstruction of the scene from the set of three-dimensional coordinates of the scene.
8. The three-dimensional reconstruction apparatus according to claim 5, characterized in that the projection module comprises:
a matrix acquiring unit, configured to calibrate the camera and obtain a perspective projection matrix;
a projecting unit, configured to determine, from the projection matrix and the two-dimensional coordinates in the depth image, the projection of the scene onto the horizontal plane along the shooting line-of-sight direction.
9. A device, characterized in that the device comprises:
one or more processors;
a storage means for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the three-dimensional reconstruction method according to any one of claims 1-4.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the three-dimensional reconstruction method according to any one of claims 1-4.
CN201711366015.XA 2017-12-18 2017-12-18 A kind of three-dimensional rebuilding method, device, equipment and medium Pending CN108062788A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711366015.XA CN108062788A (en) 2017-12-18 2017-12-18 A kind of three-dimensional rebuilding method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711366015.XA CN108062788A (en) 2017-12-18 2017-12-18 A kind of three-dimensional rebuilding method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN108062788A true CN108062788A (en) 2018-05-22

Family

ID=62139436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711366015.XA Pending CN108062788A (en) 2017-12-18 2017-12-18 A kind of three-dimensional rebuilding method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN108062788A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008135731A1 (en) * 2007-05-04 2008-11-13 Imperial Innovations Limited A method of and apparatus for generating a model of a cardiac surface having a plurality of images representing electrogram voltages
CN102436676A (en) * 2011-09-27 2012-05-02 夏东 Three-dimensional reestablishing method for intelligent video monitoring
CN103578133A (en) * 2012-08-03 2014-02-12 浙江大华技术股份有限公司 Method and device for reconstructing two-dimensional image information in three-dimensional mode
CN107358645A (en) * 2017-06-08 2017-11-17 上海交通大学 Product method for reconstructing three-dimensional model and its system

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827392A (en) * 2018-08-31 2020-02-21 金钱猫科技股份有限公司 Monocular image three-dimensional reconstruction method, system and device with good scene usability
CN110838164A (en) * 2018-08-31 2020-02-25 金钱猫科技股份有限公司 Monocular image three-dimensional reconstruction method, system and device based on object point depth
CN110827392B (en) * 2018-08-31 2023-03-24 金钱猫科技股份有限公司 Monocular image three-dimensional reconstruction method, system and device
CN110838164B (en) * 2018-08-31 2023-03-24 金钱猫科技股份有限公司 Monocular image three-dimensional reconstruction method, system and device based on object point depth
CN110966981A (en) * 2018-09-30 2020-04-07 北京奇虎科技有限公司 Distance measuring method and device
CN110966981B (en) * 2018-09-30 2023-03-24 北京奇虎科技有限公司 Distance measuring method and device
CN111273763A (en) * 2018-12-05 2020-06-12 扩展知识公司 Augmented reality implementation method and computer system for implementing augmented reality
CN110176032A (en) * 2019-04-28 2019-08-27 暗物智能科技(广州)有限公司 A kind of three-dimensional rebuilding method and device
CN110176032B (en) * 2019-04-28 2021-02-26 暗物智能科技(广州)有限公司 Three-dimensional reconstruction method and device
CN111563950A (en) * 2020-05-07 2020-08-21 贝壳技术有限公司 Texture mapping strategy determination method and device and computer readable storage medium
CN111563950B (en) * 2020-05-07 2023-04-21 如你所视(北京)科技有限公司 Texture mapping strategy determination method, device and computer readable storage medium
CN113643328A (en) * 2021-08-31 2021-11-12 北京柏惠维康科技有限公司 Calibration object reconstruction method and device, electronic equipment and computer readable medium

Similar Documents

Publication Publication Date Title
CN108062788A (en) A kind of three-dimensional rebuilding method, device, equipment and medium
WO2019127445A1 (en) Three-dimensional mapping method, apparatus and system, cloud platform, electronic device, and computer program product
CN107223269B (en) Three-dimensional scene positioning method and device
CN102750697B (en) Parameter calibration method and device
US20140015924A1 (en) Rapid 3D Modeling
CN108053469A (en) Complicated dynamic scene human body three-dimensional method for reconstructing and device under various visual angles camera
CN106803267A (en) Indoor scene three-dimensional rebuilding method based on Kinect
CN108288292A (en) A kind of three-dimensional rebuilding method, device and equipment
CN105654547B (en) Three-dimensional rebuilding method
CN104268935A (en) Feature-based airborne laser point cloud and image data fusion system and method
TW200935353A (en) Geospatial modeling system and related method using multiple sources of geographic information
CN112967344B (en) Method, device, storage medium and program product for calibrating camera external parameters
CN107578469A (en) A kind of 3D human body modeling methods and device based on single photo
CN113724391A (en) Three-dimensional model construction method and device, electronic equipment and computer readable medium
US8509522B2 (en) Camera translation using rotation from device
CN109241233B (en) Coordinate matching method and device
CN114419241A (en) Three-dimensional model construction method and device and storage medium
CN107864372A (en) Solid picture-taking method, apparatus and terminal
Kawahara et al. Dynamic 3D capture of swimming fish by underwater active stereo
CN110766731A (en) Method and device for automatically registering panoramic image and point cloud and storage medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
Lieberwirth et al. Applying low budget equipment and open source software for high resolution documentation of archaeological stratigraphy and features
CN115619986A (en) Scene roaming method, device, equipment and medium
CN114494582A (en) Three-dimensional model dynamic updating method based on visual perception
CN116109522B (en) Contour correction method, device, medium and equipment based on graph neural network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180522

RJ01 Rejection of invention patent application after publication