CN108288292A - Three-dimensional reconstruction method, apparatus and device - Google Patents

Three-dimensional reconstruction method, apparatus and device - Download PDF

Info

Publication number
CN108288292A
Authority
CN
China
Prior art keywords: image, camera, three-dimensional, laser, calibration parameters
Prior art date
Legal status: Pending
Application number
CN201711429865.XA
Other languages
Chinese (zh)
Inventor
宋展
张洪龙
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS
Priority to CN201711429865.XA
Publication of CN108288292A


Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis > G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration > G06T 7/85 Stereo camera calibration
    • G06T 17/00 Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement > G06T 2207/10 Image acquisition modality > G06T 2207/10028 Range image; Depth image; 3D point clouds

Abstract

A three-dimensional reconstruction method includes: rotating a bracket and acquiring, by panoramic cameras, images containing a calibration board and a laser line; obtaining calibration parameters of the cameras from the acquired images containing the calibration board; converting the images acquired by the cameras into longitude-latitude images according to the calibration parameters; calculating the disparity between the left and right images from the laser line; obtaining, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images; and forming a sparse three-dimensional point cloud from the three-dimensional coordinates to generate a three-dimensional model. Because panoramic cameras are used, the camera field of view is enlarged, the number of scans is greatly reduced, the complexity of operation is lowered, and reconstruction efficiency is improved. Because the laser line is used as the image feature, feature extraction is fast, matching accuracy is high, and computation is fast; no precise and expensive laser scanner, sensor or high-resolution industrial camera is required, which helps reduce cost.

Description

Three-dimensional reconstruction method, apparatus and device
Technical field
The present invention belongs to the field of three-dimensional reconstruction, and in particular relates to a three-dimensional reconstruction method, apparatus and device.
Background art
Three-dimensional reconstruction is an important research field of computer vision. Its purpose is to recover the surface shape of an object or the distance between the camera and the objects in a scene, to rapidly restore the three-dimensional form of scenes and objects, and to improve the efficiency of measurement and modelling. Three-dimensional reconstruction methods can roughly be divided into stereo vision methods (binocular, trinocular and multi-view vision), structured-light reconstruction methods, structure from motion (SFM, Structure from Motion), and RGB-D based reconstruction methods. Three-dimensional reconstruction is widely used in scenarios such as the digitisation of ancient buildings and cultural relics, robot localisation and navigation, and reverse engineering. With the continuous emergence of new methods and new equipment, the precision, speed and accuracy of three-dimensional reconstruction are improving rapidly.
Stereo vision methods use a combination of two cameras fixed in a certain arrangement. The positional relationship between the two cameras is calibrated by computer vision techniques, the three-dimensional information of the scene is then measured on the basis of this positional relationship, and the three-dimensional model of the scene is reconstructed from that three-dimensional information. The approach also applies to multiple cameras: any two cameras can be paired into a stereo vision device, and the three-dimensional information obtained by all camera pairs can then be fused to build a more detailed and complete three-dimensional model.
Structured-light reconstruction methods actively project controllable light spots, light stripes or light planes with a known coded structure onto the object or scene using a structured-light projector, capture images of the object or scene carrying the projected pattern with a camera, process the images by computer vision techniques, decode the pattern, obtain the three-dimensional information of the object or scene, and build its three-dimensional model.
Structure from motion (SFM) is an image processing technique from the field of photogrammetry that estimates the three-dimensional structure of a scene from a sequence of two-dimensional images containing visual motion information. SFM is mainly applied to the three-dimensional modelling of large scenes, such as building modelling, three-dimensional digital cities and high-precision maps. Because its computational load is large, the reconstruction process is usually carried out offline.
In RGB-D based three-dimensional reconstruction, an RGB-D camera provides both an RGB colour image and a depth image, from which the actual distance from the camera to the object can be computed. RGB-D cameras greatly improve the efficiency of three-dimensional information acquisition.
However, the cameras used in the above methods are conventional narrow-field-of-view cameras, whose camera model is the pinhole model; the models used for stereo vision are therefore mostly conventional stereo vision models. Because such models and methods rely on narrow-field-of-view cameras, acquisition is cumbersome, hardware cost is high, processing is slow, or the measurement range and precision are limited.
Summary of the invention
In view of this, embodiments of the present invention provide a three-dimensional reconstruction method, apparatus and device, to solve the prior-art problems that, when narrow-field-of-view cameras are used, acquisition is cumbersome, hardware cost is high, processing is slow, or the measurement range and precision are limited.
A first aspect of the embodiments of the present invention provides a three-dimensional reconstruction method. The three-dimensional reconstruction method is based on a dual-line structured-light three-dimensional scanning apparatus comprising a bracket, two panoramic cameras fixed on the bracket, and a line laser emitter fixed on the bracket at the middle position between the two cameras. The three-dimensional reconstruction method comprises:
rotating the bracket and acquiring, by the panoramic cameras, images containing a calibration board and the laser line emitted by the line laser emitter;
obtaining calibration parameters of the cameras from the acquired images containing the calibration board;
converting the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras;
calculating the disparity between the left and right images from the laser line in the longitude-latitude images;
obtaining, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images;
forming a sparse three-dimensional point cloud from the obtained three-dimensional coordinates of the laser points, and generating a three-dimensional model from the sparse three-dimensional point cloud.
With reference to the first aspect, in a first possible implementation of the first aspect, the step of obtaining the calibration parameters of the cameras from the acquired images containing the calibration board comprises:
at arbitrary moments, capturing a plurality of pictures containing the calibration board with each of the two cameras;
calibrating the cameras by a calibration function or a calibration tool to obtain the calibration parameters of the cameras, the calibration parameters comprising intrinsic parameters and extrinsic parameters.
With reference to the first aspect, in a second possible implementation of the first aspect, the step of converting the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras comprises:
converting the image acquired by the camera into a longitude-latitude image according to the intrinsic parameters among the calibration parameters, by means of a conversion formula in which P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
With reference to the first aspect, in a third possible implementation of the first aspect, the step of calculating the disparity between the left and right images from the laser line in the longitude-latitude images comprises:
extracting a first laser line in the longitude-latitude image corresponding to the left camera image and a second laser line in the longitude-latitude image corresponding to the right camera image;
calculating the disparity between the left and right images from the difference between matched points on the first laser line and the second laser line.
With reference to the first aspect, in a fourth possible implementation of the first aspect, the step of obtaining, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images comprises:
computing the three-dimensional coordinates of the laser points on the laser line according to a formula in which (fx, fy) denotes the focal length of the camera, (cx, cy) denotes the optical-centre offset of the camera, ξ denotes the distance by which the new projection centre C1 is shifted along the negative Z-axis, P(x, y, z) denotes a point in the world coordinate system, and P2(u, v) denotes a point in the image pixel coordinate system; dl denotes the distance from point P to the left camera and dr the distance from point P to the right camera; the angles between the x-axis and point P in the left and right spherical coordinate systems are θl and θr, respectively; fs is the radius of the disc formed by the intersection of the epipolar plane and the sphere; ds denotes the disparity and dn = θl − θr; b denotes the distance between the centre of the left camera sphere and the centre of the right camera sphere; Sl denotes the arc length on the camera sphere subtended by the angle θl, and Sr the arc length on the camera sphere subtended by the angle θr; P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
A second aspect of the embodiments of the present invention provides a three-dimensional reconstruction apparatus. The three-dimensional reconstruction apparatus is based on a dual-line structured-light three-dimensional scanning apparatus comprising a bracket, two panoramic cameras fixed on the bracket, and a line laser emitter fixed on the bracket at the middle position between the two cameras. The three-dimensional reconstruction apparatus comprises:
an image acquisition unit, configured to rotate the bracket and acquire, by the panoramic cameras, images containing a calibration board and the laser line emitted by the line laser emitter;
a calibration parameter acquisition unit, configured to obtain calibration parameters of the cameras from the acquired images containing the calibration board;
a longitude-latitude image conversion unit, configured to convert the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras;
a disparity calculation unit, configured to calculate the disparity between the left and right images from the laser line in the longitude-latitude images;
a three-dimensional coordinate acquisition unit, configured to obtain, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images;
a three-dimensional model generation unit, configured to form a sparse three-dimensional point cloud from the obtained three-dimensional coordinates of the laser points and to generate a three-dimensional model from the sparse three-dimensional point cloud.
With reference to the second aspect, in a first possible implementation of the second aspect, the calibration parameter acquisition unit comprises:
a capture subunit, configured to capture, at arbitrary moments, a plurality of pictures containing the calibration board with each of the two cameras;
a calibration subunit, configured to calibrate the cameras by a calibration function or a calibration tool to obtain the calibration parameters of the cameras, the calibration parameters comprising intrinsic parameters and extrinsic parameters.
With reference to the second aspect, in a second possible implementation of the second aspect, the longitude-latitude image conversion unit is configured to:
convert the image acquired by the camera into a longitude-latitude image according to the intrinsic parameters among the calibration parameters, by means of a conversion formula in which P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
A third aspect of the embodiments of the present invention provides a three-dimensional reconstruction device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the computer program, implements the steps of the three-dimensional reconstruction method according to any one of the implementations of the first aspect.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the three-dimensional reconstruction method according to any one of the implementations of the first aspect.
Compared with the prior art, the embodiments of the present invention have the following advantageous effects. Two panoramic cameras are arranged at the two ends of the bracket, and the line laser emitter arranged midway between the two cameras provides the laser lines recorded as the cameras rotate. The calibration parameters of the cameras are obtained from the images containing the calibration board acquired by the cameras; with these calibration parameters, the acquired camera images can be converted into longitude-latitude images. From the laser lines in the longitude-latitude images, the disparity between the left and right images of the two cameras can be calculated; from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line can be determined; and from the three-dimensional coordinates determined over repeated measurements, a sparse three-dimensional point cloud is produced, from which a three-dimensional model can be generated. Because panoramic cameras are used, the camera field of view is enlarged, the number of scans is greatly reduced, the complexity of operation is lowered, and reconstruction efficiency is improved. Because the laser line is used as the image feature, feature extraction is fast, matching accuracy is high and computation is fast, and no precise and expensive laser scanner, sensor or high-resolution industrial camera is required, which helps reduce cost.
Description of the drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a structural schematic diagram of the dual-line structured-light three-dimensional scanning apparatus provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of the three-dimensional reconstruction method provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the spherical camera model provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the dual panoramic stereo camera model provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the ideal spherical stereo vision model provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of disparity computation in spherical stereo vision provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of epipolar lines in spherical stereo images provided by an embodiment of the present application;
Fig. 8 is a schematic diagram of the unfolded longitude-latitude image provided by an embodiment of the present application;
Fig. 9 is a schematic diagram of disparity matching provided by an embodiment of the present application;
Fig. 10 is a structural schematic diagram of the three-dimensional reconstruction apparatus provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of the three-dimensional reconstruction device provided by an embodiment of the present invention.
Detailed description of embodiments
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, in order to provide a thorough understanding of the embodiments of the present invention. However, it will be clear to those skilled in the art that the present invention can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits and methods are omitted so that unnecessary detail does not obscure the description of the present invention.
In order to explain the technical solutions of the present invention, specific embodiments are described below.
Fig. 1 is a structural schematic diagram of the dual-line structured-light three-dimensional scanning apparatus provided by an embodiment of the present application. The dual-line structured-light three-dimensional scanning apparatus consists of two 360-degree panoramic cameras 1, a bracket with a guide rail 2, and a line laser emitter 4. The 360-degree panoramic cameras are fixed on the bracket with the guide rail, and their fixed positions on the bracket can be adjusted; the line laser emitter is arranged on the bracket at the middle position between the two panoramic cameras. In addition, to facilitate image acquisition, a controllable motor 3 may be arranged on the bracket, and the line laser emitter is mounted on the shaft of the controllable motor.
The two 360° panoramic cameras are fixed on the bracket with the guide rail, and the distance between them may be greater than 200 mm. The line laser emitter is fixed on the shaft of the controllable motor, and the controllable motor is fixed on the bracket with the guide rail, roughly at the middle position between the two cameras. Each panoramic camera may be composed of two cameras with a 185° field of view: camera Cam1 and camera Cam2 form one panoramic camera, and camera Cam3 and camera Cam4 form the other. During actual measurement, Cam1 and Cam3 constitute one dual wide-angle camera measuring device, and Cam2 and Cam4 constitute another. Based on this dual-line structured-light three-dimensional scanning apparatus, the three-dimensional reconstruction method, as shown in Fig. 2, comprises the following steps.
In step S201, the bracket is rotated, and images containing the calibration board and the laser line emitted by the line laser emitter are acquired by the panoramic cameras.
The cameras, the controllable motor and the line laser emitter are mounted on the fixed bracket with the guide rail; the images acquired by the cameras can be transmitted to a computer, and the controllable motor can receive rotation control instructions sent by the computer. The calibration board may be a chessboard calibration board or another, non-chessboard calibration board, such as a filled-circle array pattern.
The cameras can be placed at arbitrary positions and capture N (N greater than 10) groups of images containing the calibration board. As shown in Fig. 1, camera CAM1 together with camera CAM3, and camera CAM2 together with camera CAM4, i.e. the two cameras located on the same side of the bracket, each capture one group of images containing the same calibration board; the parameters of the cameras can then be calibrated from the multiple groups (generally more than 10 groups) of captured images containing the calibration board.
After the calibration parameters are determined, the line laser emitter can be switched on, the controllable motor started, and the minimum rotation angle of the controllable motor set. Each time the controllable motor rotates by one minimum rotation angle, the cameras are triggered to capture one picture. After the motor has rotated through 180 degrees, image acquisition of the entire space is complete, and a three-dimensional model reconstruction can be triggered.
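For illustration only, the acquisition loop just described can be sketched as follows. The functions rotate_motor_by() and capture_panorama_pair() are hypothetical stand-ins for the motor and camera drivers, and the 0.5-degree minimum step is an assumed value, not one specified by the patent.

```python
# Hypothetical hardware helpers: rotate_motor_by(deg) advances the line laser,
# capture_panorama_pair() triggers both panoramic cameras and returns two images.
MIN_STEP_DEG = 0.5                      # assumed minimum rotation angle of the motor
N_STEPS = int(180 / MIN_STEP_DEG)       # one full 180-degree sweep of the scene

def acquire_scan(rotate_motor_by, capture_panorama_pair):
    frames = []
    for _ in range(N_STEPS):
        rotate_motor_by(MIN_STEP_DEG)                   # advance the laser by one step
        left_img, right_img = capture_panorama_pair()   # one picture per camera per step
        frames.append((left_img, right_img))
    return frames   # image pairs covering the whole space, ready for reconstruction
```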
In step S202, the calibration parameters of the cameras are obtained from the acquired images containing the calibration board.
After the images containing the calibration board have been collected, the cameras can be calibrated to obtain their calibration parameters. As for the calibration method, a calibration function such as the OpenCV calibration functions can be used, or the cameras can be calibrated with the MATLAB calibration toolbox. For example, after the images containing the calibration board are obtained, the corner points on the calibration board can be detected to obtain the three-dimensional points used for calibration and the corresponding two-dimensional points in the images; calibration can then be performed with the cv2.calibrateCamera() function, which returns calibration results such as the intrinsic parameter matrix, distortion coefficients, rotation matrices and translation vectors of the camera.
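As a concrete illustration of this step, the sketch below calibrates one camera from chessboard images with OpenCV; the board size and square size are placeholder values. For the fisheye lenses of the panoramic cameras, an omnidirectional calibration tool (for example the omnidirectional camera module available in opencv-contrib, or the MATLAB toolbox mentioned above) would be the closer fit; this sketch only shows the basic cv2.calibrateCamera workflow named in the text.

```python
import cv2
import numpy as np

def calibrate_camera(images, board_size=(9, 6), square_size=0.025):
    # 3D corner coordinates of the chessboard in its own plane (Z = 0), in metres
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points, image_size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:                       # keep only views where all corners were detected
            obj_points.append(objp)
            img_points.append(corners)

    # Returns the RMS reprojection error, the intrinsic matrix, the distortion
    # coefficients, and per-view rotation/translation vectors (the extrinsics).
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return K, dist, rvecs, tvecs
```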
To address the image distortion introduced by the wide-angle lenses of the panoramic cameras, a spherical camera model can be used to describe the projection from the scene to the image; the spherical model is shown in Fig. 3. The goal of camera calibration is to obtain the transformation from a point in the world coordinate system to a point in the image coordinate system. In the camera model used in the present invention, shown in Fig. 3, a point P(x, y, z) in the world coordinate system is projected, through two projections, to a point P2(u, v) in the image pixel coordinate system. The first projection takes the origin C of the world coordinate system as the projection centre; C1 is the new projection centre obtained by moving a distance ξ along the negative Z-axis of the world coordinate system. The projection process is described as follows:
1. The point P(x, y, z) in the world coordinate system is projected, with C as the projection centre, onto a spherical model centred on the camera, giving the projection point P1.
2. With C1 as the projection centre, P1 is projected to the point P2(u, v) on the imaging plane; this projection is consistent with the pinhole imaging model.
For the above process, i.e. the transformation from the world coordinate system to the image coordinate system, the transformation relation is given by Formula 1, in which (fx, fy) denotes the focal length of the camera, (cx, cy) denotes the optical-centre offset of the camera, P(x, y, z) denotes a point in the world coordinate system, and P2(u, v) denotes a point in the image pixel coordinate system.
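Formula 1 itself is reproduced only as an image in the published document; the sketch below is one reading of the two-step projection just described (project onto the unit sphere centred at C, then pinhole-project from the shifted centre C1) and should be taken as an assumption rather than a verbatim copy of the patent's equation.

```python
import numpy as np

def project_unified(P, fx, fy, cx, cy, xi):
    """Project a world point P = (x, y, z) to pixel coordinates P2 = (u, v)."""
    x, y, z = P
    r = np.sqrt(x * x + y * y + z * z)   # distance of P from the sphere centre C
    xs, ys, zs = x / r, y / r, z / r     # step 1: projection point P1 on the unit sphere
    denom = zs + xi                      # step 2: pinhole projection from C1 = (0, 0, -xi)
    u = fx * xs / denom + cx
    v = fy * ys / denom + cy
    return u, v
```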
Once the single fisheye camera model is determined, a dual panoramic stereo camera model is built from the fisheye camera models, in a way similar to a traditional binocular stereo vision system; the dual panoramic stereo camera model is shown in Fig. 4. For a 3D point P, P1 and P2 denote its projections in the left and right unit-sphere coordinate systems, P1′ and P2′ its projections in the left and right camera coordinate systems, P1″ and P2″ the corresponding points in the image pixel coordinate systems of the left and right images before rectification, and P1‴ and P2‴ the corresponding points in the image pixel coordinate systems of the left and right images after rectification.
To describe the relationship between the two cameras in the spherical stereo vision system, the quantities in Fig. 4 are expressed by formulas. For a point P in the world coordinate system, let its inhomogeneous coordinates in the world coordinate system be Xw, and let XSL and XSR be the corresponding points of P in the left and right camera coordinate systems. The relationship between the inhomogeneous coordinates of P and its corresponding points in the left and right camera coordinate systems can be expressed as XSL = Rw→SL·Xw + tw→SL and XSR = Rw→SR·Xw + tw→SR (Formula 2), where Rw→SL and tw→SL are the rotation matrix and translation vector from the world coordinate system to the left camera coordinate system, and Rw→SR and tw→SR are the rotation matrix and translation vector from the world coordinate system to the right camera coordinate system. Combining the two equations of Formula 2 and eliminating Xw gives XSR = RL→R·XSL + tL→R, with RL→R = Rw→SR·Rw→SL^(-1) and tL→R = tw→SR − RL→R·tw→SL (Formula 3).
Therefore, the coordinate transformation relationship between the left and right cameras can be expressed by the rotation matrix RL→R and the translation vector tL→R.
The goal of spherical stereo vision calibration is to obtain this coordinate transformation relationship between the two panoramic cameras, i.e. the rotation matrix RL→R and the translation vector tL→R (Formula 4).
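A small sketch of Formulas 2 to 4 as read from the definitions above: the per-camera world-to-camera extrinsics (for example the rotation/translation results returned by calibration) are composed into the left-to-right transform that spherical stereo calibration targets.

```python
import numpy as np

def left_to_right(R_w_L, t_w_L, R_w_R, t_w_R):
    # Formula 2: X_SL = R_w_L @ X_w + t_w_L  and  X_SR = R_w_R @ X_w + t_w_R
    # Eliminating X_w gives Formula 3: X_SR = R_LR @ X_SL + t_LR
    R_LR = R_w_R @ R_w_L.T               # rotation matrices: inverse equals transpose
    t_LR = t_w_R - R_LR @ t_w_L
    return R_LR, t_LR                    # the calibration target of Formula 4
```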
In step S203, the images acquired by the cameras are converted into longitude-latitude images according to the calibration parameters of the cameras.
Ideal spherical stereo vision is similar to traditional planar stereo vision: after stereo rectification, the x-axes of the left and right coordinate systems lie on the same line, and the y-axes and z-axes of the left and right coordinate systems are respectively parallel; the ideal spherical stereo vision model is shown in Fig. 5. The baseline of the spherical stereo vision system is the line connecting the focal points of the two spherical cameras. The epipole of each image is the intersection of the spherical image with the baseline and lies on the x-axis; in the normalised spherical image coordinate system the epipoles can be expressed as (±1, 0, 0). The two epipoles in the left image are e1l and e2l, and the two epipoles in the right image are e1r and e2r. An epipolar line is the great circle formed by the intersection of the spherical image with the plane defined by a scene point and the two focal points.
The disparity in binocular stereo vision is used to compute three-dimensional coordinates, and the disparity in a spherical stereo vision system is defined similarly to that in planar binocular stereo vision; the specific computation is illustrated in Fig. 6. Consider the epipolar plane determined by the two camera optical centres and a point in the scene. The angles (i.e. polar angles) between the x-axis and a scene point P in the left and right spherical coordinate systems are denoted θl and θr, respectively, and the disparity is denoted ds. The difference between the arc lengths corresponding to θl and θr is exactly the disparity in spherical stereo vision, which can be expressed as
ds = fs(θl − θr)   (Formula 5)
where fs is the radius of the disc formed by the intersection of the epipolar plane and the sphere. For a unit-sphere image fs = 1, so
dn = θl − θr   (Formula 6)
The magnitude of the disparity dn equals the angle subtended at point P by the lines from P to the two camera optical centres. This angle is independent of the intrinsic parameters of the spherical cameras; by the law of sines, the distances from P to the two cameras can be calculated by Formula 7 and Formula 8.
Therefore, the spherical camera images are rectified so that they satisfy the ideal spherical stereo vision model shown in Fig. 5; the corresponding points of the scene are then matched in the spherical stereo image pair to obtain the disparity of each scene point, and the distances from the scene points to the two cameras are calculated by Formulas 5, 6, 7 and 8.
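Formulas 7 and 8 are likewise reproduced only as images in the source; the sketch below applies the law of sines as the text describes, under an assumed angle convention (θl and θr measured from the baseline at the left and right camera, dn their difference), so only the magnitudes should be relied on.

```python
import math

def spherical_stereo_ranges(theta_l, theta_r, baseline):
    """Distances from scene point P to the left and right cameras (Formulas 5-8)."""
    d_n = theta_l - theta_r                  # angular disparity on the unit sphere (Formula 6)
    if abs(math.sin(d_n)) < 1e-9:
        return None                          # nearly parallel rays: point effectively at infinity
    d_l = baseline * abs(math.sin(theta_r) / math.sin(d_n))   # distance from P to the left camera
    d_r = baseline * abs(math.sin(theta_l) / math.sin(d_n))   # distance from P to the right camera
    return d_l, d_r
```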
To reduce the matching range and improve matching efficiency, the matching search can be restricted to the epipolar lines of the spherical stereo images. As shown in Fig. 7, the epipolar lines in spherical stereo vision are the great circles passing through the epipoles. If the two epipoles are regarded as the two poles of a globe and the image is unfolded by longitude-latitude sampling, every great circle passing through the epipoles is converted into a straight line in the unfolded image; the schematic diagram after unfolding about the poles is shown in Fig. 8. Traditional region-correlation matching methods can then be used to compute the disparity of the spherical stereo image pair. The conversion from a point p(u, v, s) in the normalised spherical image to a point L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) in the rectified longitude-latitude image is given by Formula 9, where m and n denote the width and height of the unfolded longitude-latitude image and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively. In the rectified spherical image, the two epipoles shown in Fig. 7 can be regarded as the two poles of the earth; (u, v, s) are the Cartesian coordinates computed from the polar angle and the azimuth, and the point p(u, v, s) in the rectified spherical image maps to a pixel I(x, y) in the digital image actually captured.
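Since Formula 9 also appears only as an image in the source, the following sketch shows one standard way to realise the sampling described above: each pixel L(i, j) of the m x n longitude-latitude image is taken from the spherical image point at polar angle (i/m)*pi and azimuth (j/n)*pi. The axis convention and the nearest-neighbour sampling are assumptions, and sample_spherical_image is a hypothetical callback.

```python
import numpy as np

def latlong_to_sphere(i, j, m, n):
    """Unit-sphere point p(u, v, s) for pixel L(i, j) of the unfolded image."""
    theta = (i / m) * np.pi                  # polar angle, measured from the pole (epipole)
    phi = (j / n) * np.pi                    # azimuth about the baseline
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def unfold(sample_spherical_image, m, n):
    """Build the m x n longitude-latitude image; sample_spherical_image is a
    hypothetical callback returning the pixel value for a unit-sphere direction."""
    out = np.zeros((m, n, 3), dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            out[i, j] = sample_spherical_image(latlong_to_sphere(i, j, m, n))
    return out
```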
In step S204, the disparity between the left and right images is calculated from the laser line in the longitude-latitude images.
After rectification and unfolding into longitude-latitude images, corresponding points of the stereo pair project onto the same row of the rectified, unfolded images, so the spherical stereo matching problem is reduced to a conventional planar stereo matching problem, to which BM or SGBM methods can be applied directly. However, although distortion correction and stereo rectification have been applied to the images, the two cameras and two wide-angle lenses used in this experiment differ somewhat, so the image deformation cannot be fully corrected, especially at the margins of the image. As a result, region-similarity matching methods such as BM (block matching) or SGBM (semi-global block matching) applied to the unfolded longitude-latitude image pairs cannot obtain an ideal disparity map of the captured scene.
To further improve matching accuracy and efficiency, the present application adds structured light as the matching feature. For each scene, the line laser projector projects a laser stripe onto the scene to be reconstructed, the two cameras each acquire an image of the scene, and the stripes of the image pair are then extracted and matched, as in the sketch below. As shown in Fig. 9, ImageL and ImageR denote the rectified, unfolded left and right longitude-latitude images, and the red curves in the images denote the captured laser stripes. Suppose p1, p2, p3, p4, p5, p6 denote points on the laser stripe extracted from the left image and the corresponding matched points in the right image are p1', p2', p3', p4', p5', p6'. The column value of p1 minus the column value of p1' is exactly the disparity of the scene point P projected at p1: if the pixel coordinates of p1 and p1' are (ul, v) and (ur, v), then (ul − ur) is the disparity of the scene point.
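A minimal sketch of this stripe-based matching, assuming the rectified longitude-latitude images are BGR arrays and the laser stripe is the strongest red response in each row; the brightness threshold is an arbitrary placeholder.

```python
import numpy as np

def stripe_columns(latlong_img, min_response=50):
    """Per row, the column of the strongest red laser response, or -1 if none."""
    red = latlong_img[:, :, 2].astype(np.float32)   # red channel of a BGR image
    cols = red.argmax(axis=1)
    cols[red.max(axis=1) < min_response] = -1       # reject rows without a visible stripe
    return cols

def stripe_disparities(left_img, right_img):
    """Disparity u_l - u_r for every row where both images show the stripe."""
    ul, ur = stripe_columns(left_img), stripe_columns(right_img)
    rows = np.where((ul >= 0) & (ur >= 0))[0]
    return rows, ul[rows] - ur[rows]
```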
In step S205, the three-dimensional coordinates of the laser points on the laser line in the images are obtained from the disparity and the calibration parameters.
According to the above Formula 1, Formula 7, Formula 8 and Formula 9, the three-dimensional coordinates of the laser points on the laser line are computed, where (fx, fy) denotes the focal length of the camera, (cx, cy) denotes the optical-centre offset of the camera, ξ denotes the distance by which the new projection centre C1 is shifted along the negative Z-axis, P(x, y, z) denotes a point in the world coordinate system, and P2(u, v) denotes a point in the image pixel coordinate system; dl denotes the distance from point P to the left camera and dr the distance from point P to the right camera; the angles between the x-axis and point P in the left and right spherical coordinate systems are θl and θr, respectively; fs is the radius of the disc formed by the intersection of the epipolar plane and the sphere; ds denotes the disparity and dn = θl − θr; b denotes the distance between the centre of the left camera sphere and the centre of the right camera sphere; Sl denotes the arc length on the camera sphere subtended by the angle θl, and Sr the arc length on the camera sphere subtended by the angle θr; P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
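Putting the previous pieces together, a hedged sketch of step S205: a matched stripe point gives the two baseline angles θl and θr and the rotation φ of its epipolar plane about the baseline; the law of sines then gives the range, and the angles give the ray direction in the left camera's spherical frame. How image indices map to these angles follows the (i/m)π, (j/n)π convention above and is an assumption, not a statement of the patent's exact formula.

```python
import numpy as np

def laser_point_3d(theta_l, theta_r, phi, baseline):
    """3D coordinates of a laser point in the left spherical frame (x along the baseline)."""
    d_n = theta_l - theta_r
    if abs(np.sin(d_n)) < 1e-9:
        return None                                       # degenerate: no reliable depth
    d_l = baseline * abs(np.sin(theta_r) / np.sin(d_n))   # range from the left camera (Formula 7)
    direction = np.array([np.cos(theta_l),                # ray direction for baseline angle theta_l
                          np.sin(theta_l) * np.cos(phi),  # and epipolar-plane rotation phi
                          np.sin(theta_l) * np.sin(phi)])
    return d_l * direction
```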
In step S206, a sparse three-dimensional point cloud is formed from the obtained three-dimensional coordinates of the laser points, and a three-dimensional model is generated from the sparse three-dimensional point cloud.
Steps S203 to S205 are repeated for each group of left and right images, so that the three-dimensional coordinates of the points on all laser lines in the two groups of camera images are obtained; the two groups of point cloud data are then registered and fused to obtain the sparse three-dimensional point cloud formed by all the points. The point cloud data can be imported into modelling software such as CAD or Geomagic for modelling, so as to obtain the three-dimensional model of the scene to be scanned. The obtained model can be sent directly to a 3D printer for printing, or imported into a VR display device for browsing.
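As a practical note, a sparse point cloud like the one produced here can be handed to modelling tools through a plain ASCII PLY file, a format commonly used for point-cloud interchange; the sketch below writes one (the file name is a placeholder, and the choice of PLY is an assumption rather than a requirement of the patent).

```python
def write_ply(points, path="sparse_cloud.ply"):
    """Write an iterable of (x, y, z) points to an ASCII PLY file."""
    points = list(points)
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write("element vertex %d\n" % len(points))
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write("%f %f %f\n" % (x, y, z))
```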
Compared with prior-art three-dimensional reconstruction methods, the three-dimensional reconstruction method described herein effectively enlarges the camera field of view by using panoramic cameras, which greatly reduces the number of scans, lowers the complexity of operation and helps improve reconstruction efficiency. Using the laser line as the image feature makes feature extraction fast, matching accuracy higher and computation fast. Because the single-point measurement range of the method described herein can reach a large space (5 m x 5 m x 5 m), the method is more adaptable and can be applied to a variety of complex indoor and outdoor scenes. Moreover, there is no need to purchase a precise and expensive scanner, sensor, high-resolution industrial camera or SLR camera, which helps reduce cost.
It should be understood that the magnitude of the step numbers in the above embodiments does not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 10 is a structural schematic diagram of the three-dimensional reconstruction apparatus provided by an embodiment of the present application, described in detail as follows.
The three-dimensional reconstruction apparatus described herein is based on the dual-line structured-light three-dimensional scanning apparatus shown in Fig. 1, which comprises a bracket, two panoramic cameras fixed on the bracket, and a line laser emitter fixed on the bracket at the middle position between the two cameras. The three-dimensional reconstruction apparatus comprises:
an image acquisition unit 1001, configured to rotate the bracket and acquire, by the panoramic cameras, images containing the calibration board and the laser line emitted by the line laser emitter;
a calibration parameter acquisition unit 1002, configured to obtain the calibration parameters of the cameras from the acquired images containing the calibration board;
a longitude-latitude image conversion unit 1003, configured to convert the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras;
a disparity calculation unit 1004, configured to calculate the disparity between the left and right images from the laser line in the longitude-latitude images;
a three-dimensional coordinate acquisition unit 1005, configured to obtain, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images;
a three-dimensional model generation unit 1006, configured to form a sparse three-dimensional point cloud from the obtained three-dimensional coordinates of the laser points and to generate a three-dimensional model from the sparse three-dimensional point cloud.
Preferably, the calibration parameter acquisition unit comprises:
a capture subunit, configured to capture, at arbitrary moments, a plurality of pictures containing the calibration board with each of the two cameras;
a calibration subunit, configured to calibrate the cameras by a calibration function or a calibration tool to obtain the calibration parameters of the cameras, the calibration parameters comprising intrinsic parameters and extrinsic parameters.
Preferably, the longitude-latitude image conversion unit is configured to:
convert the image acquired by the camera into a longitude-latitude image according to the intrinsic parameters among the calibration parameters, by means of a conversion formula in which P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
The three-dimensional reconstruction apparatus shown in Fig. 10 corresponds to the three-dimensional reconstruction method shown in Fig. 2.
Fig. 11 is a schematic diagram of the three-dimensional reconstruction device provided by an embodiment of the present invention. As shown in Fig. 11, the three-dimensional reconstruction device 11 of this embodiment comprises a processor 110, a memory 111, and a computer program 112, such as a three-dimensional reconstruction program, stored in the memory 111 and runnable on the processor 110. When executing the computer program 112, the processor 110 implements the steps in each of the above three-dimensional reconstruction method embodiments, such as steps 201 to 206 shown in Fig. 2; alternatively, when executing the computer program 112, the processor 110 implements the functions of the modules/units in each of the above apparatus embodiments, such as the functions of units 1001 to 1006 shown in Fig. 10.
Illustratively, the computer program 112 can be divided into one or more modules/units, which are stored in the memory 111 and executed by the processor 110 to carry out the present invention. The one or more modules/units can be a series of computer program instruction segments capable of accomplishing specific functions, the instruction segments describing the execution process of the computer program 112 in the three-dimensional reconstruction device 11. For example, the computer program 112 can be divided into an image acquisition unit, a calibration parameter acquisition unit, a longitude-latitude image conversion unit, a disparity calculation unit, a three-dimensional coordinate acquisition unit and a three-dimensional model generation unit, whose specific functions are as follows:
an image acquisition unit, configured to rotate the bracket and acquire, by the panoramic cameras, images containing the calibration board and the laser line emitted by the line laser emitter;
a calibration parameter acquisition unit, configured to obtain the calibration parameters of the cameras from the acquired images containing the calibration board;
a longitude-latitude image conversion unit, configured to convert the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras;
a disparity calculation unit, configured to calculate the disparity between the left and right images from the laser line in the longitude-latitude images;
a three-dimensional coordinate acquisition unit, configured to obtain, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images;
a three-dimensional model generation unit, configured to form a sparse three-dimensional point cloud from the obtained three-dimensional coordinates of the laser points and to generate a three-dimensional model from the sparse three-dimensional point cloud.
The three-dimensional reconstruction device 11 can be a computing device such as a desktop computer, a notebook, a palmtop computer or a cloud server. The three-dimensional reconstruction device may include, but is not limited to, the processor 110 and the memory 111. Those skilled in the art will understand that Fig. 11 is only an example of the three-dimensional reconstruction device 11 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine certain components, or include different components. For example, the three-dimensional reconstruction device may also include input/output devices, network access devices, buses and the like.
The processor 110 can be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor can be a microprocessor, or the processor can be any conventional processor.
The memory 111 can be an internal storage unit of the three-dimensional reconstruction device 11, such as a hard disk or memory of the three-dimensional reconstruction device 11. The memory 111 can also be an external storage device of the three-dimensional reconstruction device 11, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card or a flash card (Flash Card) provided on the three-dimensional reconstruction device 11. Further, the memory 111 can include both an internal storage unit and an external storage device of the three-dimensional reconstruction device 11. The memory 111 is used to store the computer program and other programs and data required by the three-dimensional reconstruction device, and can also be used to temporarily store data that has been output or is to be output.
It will be clear to those skilled in the art that, for convenience and brevity of description, the division into the above functional units and modules is used only as an example. In practical applications, the above functions can be assigned to different functional units or modules as needed, i.e. the internal structure of the apparatus can be divided into different functional units or modules to accomplish all or part of the functions described above. The functional units and modules in the embodiments can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit; the integrated units can be implemented in the form of hardware or in the form of software functional units. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the scope of protection of the present application. For the specific working process of the units and modules in the above system, reference can be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, each embodiment is described with its own emphasis; for parts not detailed or recorded in one embodiment, reference can be made to the relevant descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementations should not be considered to go beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus/terminal device and method can be implemented in other ways. For example, the apparatus/terminal device embodiments described above are merely illustrative; the division of the modules or units is only a division by logical function, and there may be other divisions in actual implementation: multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed can be indirect couplings or communication connections through some interfaces, devices or units, and can be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network elements. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit. The integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, the present invention implements all or part of the processes in the methods of the above embodiments, which can also be accomplished by a computer program instructing the relevant hardware. The computer program can be stored in a computer-readable storage medium, and when executed by a processor, can implement the steps of each of the above method embodiments. The computer program comprises computer program code, which can be in source code form, object code form, an executable file, certain intermediate forms, or the like. The computer-readable medium can include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the content contained in the computer-readable medium can be appropriately increased or decreased according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the scope of protection of the present invention.

Claims (10)

1. A three-dimensional reconstruction method, characterised in that the three-dimensional reconstruction method is based on a dual-line structured-light three-dimensional scanning apparatus, the dual-line structured-light three-dimensional scanning apparatus comprising a bracket, two panoramic cameras fixed on the bracket, and a line laser emitter fixed on the bracket at the middle position between the two cameras, the three-dimensional reconstruction method comprising:
rotating the bracket and acquiring, by the panoramic cameras, images containing a calibration board and the laser line emitted by the line laser emitter;
obtaining calibration parameters of the cameras from the acquired images containing the calibration board;
converting the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras;
calculating the disparity between the left and right images from the laser line in the longitude-latitude images;
obtaining, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images;
forming a sparse three-dimensional point cloud from the obtained three-dimensional coordinates of the laser points, and generating a three-dimensional model from the sparse three-dimensional point cloud.
2. The three-dimensional reconstruction method according to claim 1, characterised in that the step of obtaining the calibration parameters of the cameras from the acquired images containing the calibration board comprises:
at arbitrary moments, capturing a plurality of pictures containing the calibration board with each of the two cameras;
calibrating the cameras by a calibration function or a calibration tool to obtain the calibration parameters of the cameras, the calibration parameters comprising intrinsic parameters and extrinsic parameters.
3. The three-dimensional reconstruction method according to claim 1, characterised in that the step of converting the images acquired by the cameras into longitude-latitude images according to the calibration parameters of the cameras comprises:
converting the image acquired by the camera into a longitude-latitude image according to the intrinsic parameters among the calibration parameters, by means of a conversion formula in which P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
4. The three-dimensional reconstruction method according to claim 1, characterised in that the step of calculating the disparity between the left and right images from the laser line in the longitude-latitude images comprises:
extracting a first laser line in the longitude-latitude image corresponding to the left camera image and a second laser line in the longitude-latitude image corresponding to the right camera image;
calculating the disparity between the left and right images from the difference between matched points on the first laser line and the second laser line.
5. The three-dimensional reconstruction method according to claim 1, characterised in that the step of obtaining, from the disparity and the calibration parameters, the three-dimensional coordinates of the laser points on the laser line in the images comprises:
computing the three-dimensional coordinates of the laser points on the laser line according to a formula in which (fx, fy) denotes the focal length of the camera, (cx, cy) denotes the optical-centre offset of the camera, ξ denotes the distance by which the new projection centre C1 is shifted along the negative Z-axis, P(x, y, z) denotes a point in the world coordinate system, and P2(u, v) denotes a point in the image pixel coordinate system; dl denotes the distance from point P to the left camera and dr the distance from point P to the right camera; the angles between the x-axis and the lines from point P to the left and right sphere centres are θl and θr, respectively; fs is the radius of the disc formed by the intersection of the epipolar plane and the sphere; ds denotes the disparity and dn = θl − θr; b denotes the distance between the centre of the left camera sphere and the centre of the right camera sphere; Sl denotes the arc length on the camera sphere subtended by the angle θl, and Sr the arc length on the camera sphere subtended by the angle θr; P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude-latitude image, m and n denote the width and height of the unfolded longitude-latitude image, and (i/m)π and (j/n)π denote the polar angle and the azimuth, respectively.
6. A three-dimensional reconstruction apparatus, characterized in that the three-dimensional reconstruction apparatus is based on a dual structured-light three-dimensional scanner, the dual structured-light three-dimensional scanner comprising a bracket, two panoramic cameras fixed on the bracket, and a line laser emitter fixed on the bracket at the middle position between the two cameras, and the three-dimensional reconstruction apparatus comprises:
an image acquisition unit, configured to rotate the bracket and acquire, through the panoramic cameras, images including the calibration board and the laser line emitted by the line laser emitter;
a calibrating parameters acquiring unit, configured to obtain the calibrating parameters of the cameras according to the acquired images including the calibration board;
a longitude and latitude image conversion unit, configured to convert the images obtained by the cameras into longitude and latitude images according to the calibrating parameters of the cameras;
a parallax computation unit, configured to calculate the parallax of the left and right images according to the laser line in the longitude and latitude images;
a three-dimensional coordinate acquiring unit, configured to obtain the three-dimensional coordinates of the laser points on the laser line in the image according to the parallax and the calibrating parameters;
a three-dimensional model generation unit, configured to form a sparse three-dimensional point cloud according to the acquired three-dimensional coordinates of the laser points and to generate a three-dimensional model according to the sparse three-dimensional point cloud.
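Claim 6 names the units but not their interfaces. The sketch below is a hypothetical wiring of those units in Python, with the individual stages passed in as callables; every name and signature here is an assumption, not part of the claim:

```python
from dataclasses import dataclass
from typing import Callable

import numpy as np

@dataclass
class ReconstructionPipeline:
    to_lonlat: Callable       # longitude and latitude image conversion unit
    find_parallax: Callable   # parallax computation unit
    triangulate: Callable     # three-dimensional coordinate acquiring unit

    def reconstruct(self, left_img, right_img, calibration):
        # Convert both camera images using the calibrating parameters.
        left_ll = self.to_lonlat(left_img, calibration["left"])
        right_ll = self.to_lonlat(right_img, calibration["right"])
        # Laser-line parallax between the two longitude/latitude images.
        rows, parallax = self.find_parallax(left_ll, right_ll)
        # Sparse point cloud from the parallax and the calibrating parameters.
        points = [self.triangulate(r, d, calibration) for r, d in zip(rows, parallax)]
        return np.asarray(points)   # handed to the three-dimensional model generation unit
```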
7. The three-dimensional reconstruction apparatus according to claim 6, characterized in that the calibrating parameters acquiring unit comprises:
a shooting subunit, configured to shoot, at any time, multiple pictures including the calibration board through the two cameras respectively;
a calibration subunit, configured to calibrate the cameras through a calibration function or a calibration tool to obtain the calibrating parameters of the cameras, the calibrating parameters including intrinsic parameters and extrinsic parameters.
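Claim 7 leaves the "calibration function or calibration tool" open. The sketch below uses OpenCV's chessboard detector together with the ordinary pinhole calibrateCamera call purely as a generic example; the panoramic cameras of this apparatus would normally need an omnidirectional or fisheye model instead, and the pattern size and square size are assumptions:

```python
import cv2
import numpy as np

def calibrate_from_chessboard(images, pattern_size=(9, 6), square_size=0.02):
    """Intrinsic calibration from pictures of a chessboard calibration board."""
    # Object points of one board, on the Z = 0 plane, in metres.
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    objp *= square_size

    obj_points, img_points = [], []
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    h, w = images[0].shape[:2]
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, (w, h), None, None)
    return {"rms": rms, "K": K, "dist": dist, "rvecs": rvecs, "tvecs": tvecs}
```

In a two-camera setup of this kind, each camera would be calibrated separately and the relative pose between the calibrated cameras would supply the extrinsic parameters the claim mentions.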
8. The three-dimensional reconstruction apparatus according to claim 6, characterized in that the longitude and latitude image conversion unit is configured to:
according to the intrinsic parameters in the calibrating parameters, through the formula:
convert the image obtained by the camera into a longitude and latitude image, wherein: P(u, v, s) is a point in the spherical coordinate image, L(i, j) (i = 0, 1, ..., m; j = 0, 1, ..., n) is a point in the longitude and latitude image, m and n respectively denote the width and height of the unfolded longitude and latitude image, and (i/m)π and (j/n)π respectively denote the polar angle and the azimuth angle.
9. A three-dimensional reconstruction device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the three-dimensional reconstruction method according to any one of claims 1 to 5.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the three-dimensional reconstruction method according to any one of claims 1 to 5.
CN201711429865.XA 2017-12-26 2017-12-26 A kind of three-dimensional rebuilding method, device and equipment Pending CN108288292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711429865.XA CN108288292A (en) 2017-12-26 2017-12-26 A kind of three-dimensional rebuilding method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711429865.XA CN108288292A (en) 2017-12-26 2017-12-26 A kind of three-dimensional rebuilding method, device and equipment

Publications (1)

Publication Number Publication Date
CN108288292A true CN108288292A (en) 2018-07-17

Family

ID=62832261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711429865.XA Pending CN108288292A (en) 2017-12-26 2017-12-26 A kind of three-dimensional rebuilding method, device and equipment

Country Status (1)

Country Link
CN (1) CN108288292A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101853521A (en) * 2010-04-22 2010-10-06 武汉微目科技有限公司 Cultural relic rotation structured light three-dimensional digital modeling method
US20130342657A1 (en) * 2012-06-15 2013-12-26 Nikon Corporation Stereo vision camera for laser radar
CN107123156A (en) * 2017-03-10 2017-09-01 西北工业大学 A kind of active light source projection three-dimensional reconstructing method being combined with binocular stereo vision
CN107170043A (en) * 2017-06-19 2017-09-15 电子科技大学 A kind of three-dimensional rebuilding method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Honglong Zhang, Danrong Li, Zhan Song: "Dense 3D reconstruction with an active binocular panoramic vision system", 2017 IEEE International Conference on Information and Automation (ICIA) *
Yan Long: "Research on Technologies Related to Binocular Vision Measurement Systems", 30 April 2017, Jinan: Shandong University Press *

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109087382A (en) * 2018-08-01 2018-12-25 宁波发睿泰科智能科技有限公司 A kind of three-dimensional reconstruction method and 3-D imaging system
CN109444916A (en) * 2018-10-17 2019-03-08 上海蔚来汽车有限公司 The unmanned travelable area determining device of one kind and method
CN109444916B (en) * 2018-10-17 2023-07-04 上海蔚来汽车有限公司 Unmanned driving drivable area determining device and method
WO2020088095A1 (en) * 2018-11-02 2020-05-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Fisheye infrared depth detection
US11611698B2 (en) 2018-11-02 2023-03-21 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus of depth detection, and computer-readable storage medium
CN112912929A (en) * 2018-11-02 2021-06-04 Oppo广东移动通信有限公司 Fisheye infrared depth detection
CN109579871A (en) * 2018-11-14 2019-04-05 中国直升机设计研究所 Inertial navigation components installation error detection method and device based on computer vision
CN109579871B (en) * 2018-11-14 2021-03-30 中国直升机设计研究所 Inertial navigation part installation error detection method and device based on computer vision
CN110675314B (en) * 2019-04-12 2020-08-21 北京城市网邻信息技术有限公司 Image processing method, image processing apparatus, three-dimensional object modeling method, three-dimensional object modeling apparatus, image processing apparatus, and medium
CN111862179A (en) * 2019-04-12 2020-10-30 北京城市网邻信息技术有限公司 Three-dimensional object modeling method and apparatus, image processing device, and medium
CN111862179B (en) * 2019-04-12 2022-02-18 北京城市网邻信息技术有限公司 Three-dimensional object modeling method and apparatus, image processing device, and medium
CN110675314A (en) * 2019-04-12 2020-01-10 北京城市网邻信息技术有限公司 Image processing method, image processing apparatus, three-dimensional object modeling method, three-dimensional object modeling apparatus, image processing apparatus, and medium
CN110148216A (en) * 2019-05-24 2019-08-20 中德(珠海)人工智能研究院有限公司 A kind of method of double ball curtain cameras and double ball curtain camera three-dimensional modelings
CN110148216B (en) * 2019-05-24 2023-03-24 中德(珠海)人工智能研究院有限公司 Three-dimensional modeling method of double-dome camera
CN110288713A (en) * 2019-07-03 2019-09-27 北京机械设备研究所 A kind of quick three-dimensional model reconstruction method and system based on multi-vision visual
CN110288713B (en) * 2019-07-03 2022-12-23 北京机械设备研究所 Rapid three-dimensional model reconstruction method and system based on multi-view vision
CN110458952A (en) * 2019-08-19 2019-11-15 江苏濠汉信息技术有限公司 A kind of three-dimensional rebuilding method and device based on trinocular vision
CN110458952B (en) * 2019-08-19 2022-07-15 江苏濠汉信息技术有限公司 Three-dimensional reconstruction method and device based on trinocular vision
CN112449100A (en) * 2019-09-03 2021-03-05 中国科学院长春光学精密机械与物理研究所 Splicing method and device for aerial camera oblique images, terminal and storage medium
CN112449100B (en) * 2019-09-03 2023-11-17 中国科学院长春光学精密机械与物理研究所 Aviation camera inclined image splicing method, device, terminal and storage medium
CN110913197B (en) * 2019-11-14 2021-04-20 湖北第二师范学院 Automatic three-dimensional high-definition camera shooting cultural relic data protection device and method
CN110913197A (en) * 2019-11-14 2020-03-24 湖北第二师范学院 Automatic three-dimensional high-definition camera shooting cultural relic data protection device and method
CN112262411A (en) * 2019-12-23 2021-01-22 商汤国际私人有限公司 Image association method, system and device
CN112262411B (en) * 2019-12-23 2024-04-05 商汤国际私人有限公司 Image association method, system and device
CN111145095B (en) * 2019-12-25 2023-10-10 深圳市诺达通信技术有限公司 VR (virtual reality) graph generation method with scale measurement and data acquisition device
CN111145095A (en) * 2019-12-25 2020-05-12 深圳无境智能机器人有限公司 VR (virtual reality) diagram generation method with scale measurement and data acquisition device
CN111882655B (en) * 2020-06-19 2023-07-18 杭州易现先进科技有限公司 Method, device, system, computer equipment and storage medium for three-dimensional reconstruction
CN111882655A (en) * 2020-06-19 2020-11-03 杭州易现先进科技有限公司 Method, apparatus, system, computer device and storage medium for three-dimensional reconstruction
WO2021259287A1 (en) * 2020-06-24 2021-12-30 中兴通讯股份有限公司 Depth map generation method, and device and storage medium
CN111882657B (en) * 2020-06-29 2024-01-26 杭州易现先进科技有限公司 Three-dimensional reconstruction scale recovery method, device, system and computer equipment
CN111882657A (en) * 2020-06-29 2020-11-03 杭州易现先进科技有限公司 Three-dimensional reconstruction scale recovery method, device and system and computer equipment
CN112113505A (en) * 2020-09-23 2020-12-22 华中科技大学鄂州工业技术研究院 Portable scanning measurement device and method based on line structured light
CN112113505B (en) * 2020-09-23 2022-02-01 华中科技大学鄂州工业技术研究院 Portable scanning measurement device and method based on line structured light
CN113219489B (en) * 2021-05-13 2024-04-16 深圳数马电子技术有限公司 Point-to-point determination method, device, computer equipment and storage medium for multi-line laser
CN113219489A (en) * 2021-05-13 2021-08-06 深圳数马电子技术有限公司 Method and device for determining point pair of multi-line laser, computer equipment and storage medium
WO2023019833A1 (en) * 2021-08-18 2023-02-23 梅卡曼德(北京)机器人科技有限公司 Laser line scanning-based point cloud processing method and apparatus
CN113808214A (en) * 2021-08-24 2021-12-17 东莞固高自动化技术有限公司 Camera calibration equipment, camera production system, camera and calibration method thereof
WO2023109960A1 (en) * 2021-12-17 2023-06-22 先临三维科技股份有限公司 Three-dimensional scanning processing method and apparatus and three-dimensional scanning device
CN114399597B (en) * 2022-01-12 2022-10-28 贝壳找房(北京)科技有限公司 Method and device for constructing scene space model and storage medium
CN114399597A (en) * 2022-01-12 2022-04-26 贝壳找房(北京)科技有限公司 Method and device for constructing scene space model and storage medium

Similar Documents

Publication Publication Date Title
CN108288292A (en) A kind of three-dimensional rebuilding method, device and equipment
CN109816703B (en) Point cloud registration method based on camera calibration and ICP algorithm
CN107170043B (en) A kind of three-dimensional rebuilding method
CN107862744B (en) Three-dimensional modeling method for aerial image and related product
Sturm et al. Camera models and fundamental concepts used in geometric computer vision
CN104408732B (en) A kind of big depth of field measuring system and method based on omnidirectional's structure light
CN104376552B (en) A kind of virtual combat method of 3D models and two dimensional image
CN104240289B (en) Three-dimensional digitalization reconstruction method and system based on single camera
Hoppe et al. Online Feedback for Structure-from-Motion Image Acquisition.
CN106485690A (en) Cloud data based on a feature and the autoregistration fusion method of optical image
CN102750697A (en) Parameter calibration method and device
CN100417231C (en) Three-dimensional vision semi-matter simulating system and method
CN102054276B (en) Camera calibration method and system for object three-dimensional geometrical reconstruction
CN103839277A (en) Mobile augmented reality registration method of outdoor wide-range natural scene
CN109559349A (en) A kind of method and apparatus for calibration
CN108399631B (en) Scale invariance oblique image multi-view dense matching method
CN103839227A (en) Fisheye image correction method and device
CN107886546A (en) Utilize the method for ball picture and public self-polar triangle demarcation parabolic catadioptric video camera
CN104200476B (en) The method that camera intrinsic parameter is solved using the circular motion in bimirror device
CN101923730B (en) Fisheye camera and multiple plane mirror devices-based three-dimensional reconstruction method
Hafeez et al. Image based 3D reconstruction of texture-less objects for VR contents
CN104166995B (en) Harris-SIFT binocular vision positioning method based on horse pace measurement
CN103955960A (en) Image viewpoint transformation method based on single input image
CN113902802A (en) Visual positioning method and related device, electronic equipment and storage medium
CN107240149A (en) Object dimensional model building method based on image procossing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180717