CN110111413A - Sparse point-cloud 3D model building method for land-water coexistence scenes - Google Patents


Info

Publication number
CN110111413A
CN110111413A
Authority
CN
China
Prior art keywords
coordinate
camera
plane
straight line
indicate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910277233.9A
Other languages
Chinese (zh)
Inventor
Jiang Guang
Li Jiaqi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University
Priority to CN201910277233.9A
Publication of CN110111413A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The embodiments of the invention disclose a sparse point-cloud 3D model building method for scenes in which land and water coexist, in the technical field of computer vision. By undistorting the images, obtaining the camera extrinsic parameters and the space-point coordinates, and then optimizing those coordinates, the method builds a sparse 3D point-cloud model of a land-water coexistence scene.

Description

Sparse point-cloud 3D model building method for land-water coexistence scenes
Technical field
The present invention relates to the technical field of computer vision, and in particular to a method for building a sparse point-cloud 3D model of a scene in which land and water coexist.
Background art
Vision-based 3D model building captures images of the objects in a scene with a camera, analyzes those images, and, using computer-vision techniques, recovers the 3D information of the objects in the real environment. The pipeline usually comprises image preprocessing, feature-point extraction, sparse 3D point-cloud construction, dense 3D point-cloud construction, and texturing. Sparse point-cloud construction is the most critical step of the whole pipeline: it must solve both for the 3D coordinates of the feature points and for the position and orientation of the camera at every shooting location.
In current research, 3D reconstruction of ordinary land scenes is mature, from theory down to the algorithms, and has reached commercial application. In recent years, underwater 3D scene reconstruction has become an important research topic. Its difficulty is that light passes through different media on its way to the camera, so refraction bends the optical path and deforms the imaged objects. Removing the effect of refraction from the images and recovering the 3D model of the submerged objects is the goal of much underwater reconstruction work. Existing studies, however, consider only objects that lie entirely under water, or cameras placed in waterproof housings and submerged for shooting; that is, the light from every scene point reaches the camera after at least one refraction at a water-air interface, and research in this direction is still maturing. Nature is full of scenes in which land and water coexist, such as shores, beaches, islands and reefs, rivers and ponds; in all of them a refraction-free scene (land) coexists with a refractive scene (under water). Building 3D models of these land-water coexistence scenes therefore has practical significance and theoretical value.
Because the physical imaging models of the refraction-free (land) and refractive (underwater) parts of such a scene differ, no unified 3D model building method has so far existed for scenes of this kind.
Summary of the invention
To remedy these deficiencies of the prior art, an embodiment of the invention provides a sparse point-cloud 3D model building method for land-water coexistence scenes, the method comprising:
(1) undistorting the images:
(11) obtaining the intrinsic matrix and the distortion coefficients of the camera by a camera-calibration method;
(12) undistorting, according to the intrinsic matrix and the distortion coefficients, the multiple images captured by the camera, to obtain multiple undistorted images;
(2) obtaining the camera extrinsic parameters and the space-point coordinates:
(21) obtaining initial values of the camera extrinsic parameters by structure from motion, and computing the coordinates of each undistorted image's points in a space coordinate system, wherein the space coordinate system is defined with respect to the water plane: the coordinate plane formed by the horizontal axes is parallel to or coincides with the water plane, and the vertical axis points downward;
(3) optimizing the coordinates:
(31) constructing the constraint function min Σ_{i∈V} Σ_{j∈N_i} d_ij, where V is the set of undistorted images; N_i is the set of coordinates having a projected point on the i-th undistorted image; d_ij = D(x_ij, l_i) is the distance from the observed point x_ij on the water surface to l_i, the projection onto the water surface of the line l; l_i is the projection of l on the XOY plane; l is the line formed by the projected point x of the space point X on the image plane πc and the line at infinity of the current image plane, l = r3 × x; and l_i = H^T l, where the transformation matrix H = [r1, r2, t] is assembled from the camera rotation R = [r1, r2, r3] and the translation vector t = [t1, t2, t3]^T; solving this constraint function yields the camera extrinsic parameters R, t1, t2 and the coordinates on the XOY plane that minimize it;
(32) constructing a second constraint function from the distances D(X, l_in) and D(X, l_ref), and solving it to obtain the camera extrinsic parameter t3 and the vertical components of the coordinates that minimize it, where D(X, l) denotes the distance from a point X to a line l; l_in and l_ref denote the projections of the incident ray and the refracted ray on the XOY plane, respectively; X_z denotes the coordinate value in the vertical direction; and Z_w denotes the coordinate value on the XOY plane;
(4) recovering the actual coordinates of the points under water by the law of refraction n1 sin θ1 = n2 sin θ2, and generating the sparse point-cloud 3D model from the actual coordinates.
The sparse 3D point-cloud building method for land-water coexistence scenes provided by the embodiments of the invention has the following beneficial effect:
it enables a sparse point-cloud 3D model of a scene in which land and water coexist to be built.
Brief description of the drawings
Fig. 1 is a flow diagram of the sparse point-cloud 3D model building method for land-water coexistence scenes provided by an embodiment of the invention;
Fig. 2 is a schematic diagram of a constraint-function relation provided by an embodiment of the invention.
Detailed description of the embodiments
The invention is described in detail below with reference to the drawings and specific embodiments.
Referring to Fig. 1, the sparse point-cloud 3D model building method for land-water coexistence scenes provided by an embodiment of the invention comprises the following steps:
S101: undistort the images.
S1011: obtain the intrinsic matrix and the distortion coefficients of the camera by a camera-calibration method.
S1012: according to the intrinsic matrix and the distortion coefficients, undistort the multiple images captured by the camera, to obtain multiple undistorted images.
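As an illustration of S1011–S1012, the sketch below inverts the radial (Brown) distortion model by fixed-point iteration. The intrinsics fx, fy, cx, cy and the coefficients k1, k2 are hypothetical values of the kind a camera-calibration step would produce, not numbers from the patent:

```python
# Hypothetical intrinsics and radial distortion coefficients, as a
# camera-calibration step (e.g. with a planar target) would produce.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
k1, k2 = -0.25, 0.08

def distort(x, y):
    """Apply the radial (Brown) distortion model to normalized coordinates."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort_point(u, v, iters=20):
    """Map a distorted pixel to its undistorted position by inverting the
    distortion model with fixed-point iteration."""
    xd, yd = (u - cx) / fx, (v - cy) / fy  # normalized, distorted
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return fx * x + cx, fy * y + cy

u_corr, v_corr = undistort_point(100.0, 100.0)
```

In practice the intrinsic matrix and distortion coefficients would come from a standard calibration routine, and a library call would typically replace the hand-written inversion; the sketch only shows what "undistorting" a pixel means.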
S102: obtain the camera extrinsic parameters and the space-point coordinates.
S1021: obtain initial values of the camera extrinsic parameters by structure from motion, and compute the coordinates of each undistorted image's points in a space coordinate system, where the space coordinate system is defined with respect to the water plane: the coordinate plane formed by the horizontal axes is parallel to or coincides with the water plane, and the vertical axis points downward.
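S1021 initializes camera poses and point coordinates by structure from motion; the triangulation it relies on can be sketched as linear (DLT) triangulation from two views. The intrinsic matrix and the two poses below are invented for illustration, and `projection_matrix`/`triangulate` are helper names local to this sketch:

```python
import numpy as np

# Invented intrinsics and two invented camera poses; in the method these
# would come from the structure-from-motion initialization.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def projection_matrix(R, t):
    """P = K [R | t]."""
    return K @ np.hstack([R, t.reshape(3, 1)])

R1, t1 = np.eye(3), np.array([0.0, 0.0, 0.0])
c, s = np.cos(0.1), np.sin(0.1)  # small rotation about the y axis
R2 = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
t2 = np.array([-1.0, 0.0, 0.0])
P1, P2 = projection_matrix(R1, t1), projection_matrix(R2, t2)

# A ground-truth space point (Z positive, i.e. "down" in the water-plane
# frame the method uses) and its two image projections.
X_true = np.array([0.3, -0.2, 4.0, 1.0])
x1 = P1 @ X_true; x1 /= x1[2]
x2 = P2 @ X_true; x2 /= x2[2]

def triangulate(P1, x1, P2, x2):
    """Linear (DLT) triangulation of one point from two views."""
    A = np.stack([x1[0] * P1[2] - P1[0],
                  x1[1] * P1[2] - P1[1],
                  x2[0] * P2[2] - P2[0],
                  x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    Xh = Vt[-1]
    return Xh / Xh[3]

X_est = triangulate(P1, x1, P2, x2)
```

With noiseless projections the DLT recovers the point exactly; a real SfM initialization would add feature matching, robust pose estimation, and bundle adjustment around this core.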
S103: optimize the coordinates.
S1031: construct the constraint function min Σ_{i∈V} Σ_{j∈N_i} d_ij, where V is the set of undistorted images; N_i is the set of coordinates having a projected point on the i-th undistorted image; d_ij = D(x_ij, l_i) is the distance from the observed point x_ij on the water surface to l_i, the projection onto the water surface of the line l; l_i is the projection of l on the XOY plane; l is the line formed by the projected point x of the space point X on the image plane πc and the line at infinity of the current image plane, l = r3 × x; and l_i = H^T l, where the transformation matrix H = [r1, r2, t] is assembled from the camera rotation R = [r1, r2, r3] and the translation vector t = [t1, t2, t3]^T. Solving this constraint function yields the camera extrinsic parameters R, t1, t2 and the coordinates on the XOY plane that minimize it.
S1032: construct a second constraint function from the distances D(X, l_in) and D(X, l_ref), and solve it to obtain the camera extrinsic parameter t3 and the vertical components of the coordinates that minimize it, where D(X, l) denotes the distance from a point X to a line l; l_in and l_ref denote the projections of the incident ray and the refracted ray on the XOY plane, respectively; X_z denotes the coordinate value in the vertical direction; and Z_w denotes the coordinate value on the XOY plane.
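The constraint functions of S1031–S1032 reduce, per observation, to point-to-line distances minimized over a few pose parameters. The toy sketch below uses hypothetical lines and observations, with SciPy's `least_squares` standing in for whatever solver an implementation would choose; the 2-D shift being refined plays the role of the in-plane components t1, t2:

```python
import numpy as np
from scipy.optimize import least_squares

def point_line_distance(x, l):
    """D(x, l): distance from a homogeneous 2-D point x to a homogeneous line l."""
    return abs(l @ x) / (np.hypot(l[0], l[1]) * abs(x[2]))

# Hypothetical setup: each observed water-surface point obs[i] should lie
# on the projected line lines[i]; we refine an in-plane shift so that the
# summed point-to-line distances vanish.
lines = np.array([[1.0, 0.0, -2.0],   # the line x = 2
                  [0.0, 1.0, -3.0]])  # the line y = 3
obs = np.array([[2.4, 2.9],
                [1.9, 3.2]])

def residuals(shift):
    """Signed point-to-line residuals after applying the candidate shift."""
    out = []
    for (px, py), l in zip(obs, lines):
        x = np.array([px + shift[0], py + shift[1], 1.0])
        out.append((l @ x) / np.hypot(l[0], l[1]))
    return out

sol = least_squares(residuals, x0=[0.0, 0.0])
```

A full implementation would build the residuals from the lines l_i = H^T l of the actual constraint, and minimize over R, t1, t2 and the point coordinates jointly; the sketch only shows the shape of the objective.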
S104: recover the actual coordinates of the points under water by the law of refraction n1 sin θ1 = n2 sin θ2, and generate the sparse point-cloud 3D model from the actual coordinates.
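The law of refraction used in S104 is Snell's law, n1 sin θ1 = n2 sin θ2. A minimal numeric sketch (illustrative refractive indices and geometry, not values from the patent) of how the refracted ray, rather than the straight-line extension of the air ray, locates an underwater point:

```python
import math

N_AIR, N_WATER = 1.0, 1.333  # illustrative refractive indices

def refraction_angle(theta_air):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2); returns the underwater
    angle, with both angles measured from the vertical."""
    return math.asin(N_AIR * math.sin(theta_air) / N_WATER)

# A viewing ray meets the flat water surface at 30 degrees from the
# vertical and continues to a point 2 m below the surface.
theta1 = math.radians(30.0)
theta2 = refraction_angle(theta1)
depth = 2.0
offset_true = depth * math.tan(theta2)   # where the point really is
offset_naive = depth * math.tan(theta1)  # straight-line (no-refraction) guess
```

The gap between `offset_true` and `offset_naive` is exactly the displacement that the refraction-correction step compensates for when restoring the actual coordinates of underwater points.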
The sparse point-cloud 3D model building method for land-water coexistence scenes provided by the embodiments of the invention undistorts the images, obtains the camera extrinsic parameters and the space-point coordinates, and optimizes those coordinates, thereby building a sparse 3D point-cloud model of a scene in which land and water coexist.
In the above embodiments, the description of each embodiment has its own emphasis; for details not given in one embodiment, refer to the related descriptions of the other embodiments.
It will be understood that related features in the above methods and apparatus may be referenced against one another. In addition, "first", "second" and the like in the above embodiments serve to distinguish the embodiments and do not indicate their relative merit.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above can be found in the corresponding processes of the foregoing method embodiments, and are not repeated here.
The algorithms and displays provided herein are not inherently related to any particular computer, virtual system or other apparatus. Various general-purpose systems may also be used with the teachings herein, and the structure required to construct such a system is apparent from the description above. Moreover, the present invention is not directed to any particular programming language; it should be understood that the content of the invention described herein can be realized in various programming languages, and the above description of a specific language is given to disclose the best mode of the invention.
In addition, the memory may include non-volatile memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory forms such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
Those skilled in the art will understand that embodiments of the application may be provided as a method, a system or a computer program product. The application may therefore take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical memory) containing computer-usable program code.
The application is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data-processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data-processing device produce an apparatus for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data-processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data-processing device, so that a series of operating steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces and memory.
The memory may include non-volatile memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory forms such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette, magnetic tape or magnetic disk storage or other magnetic storage devices, and any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise" and any of their variants are intended to be non-exclusive, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Unless otherwise limited, an element qualified by "including a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes it.
The above are only embodiments of the application and are not intended to limit it. Various modifications and variations will occur to those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the application shall fall within the scope of its claims.

Claims (1)

1. A sparse point-cloud 3D model building method for land-water coexistence scenes, characterized by comprising:
(1) undistorting the images:
(11) obtaining the intrinsic matrix and the distortion coefficients of the camera by a camera-calibration method;
(12) undistorting, according to the intrinsic matrix and the distortion coefficients, the multiple images captured by the camera, to obtain multiple undistorted images;
(2) obtaining the camera extrinsic parameters and the space-point coordinates:
(21) obtaining initial values of the camera extrinsic parameters by structure from motion, and computing the coordinates of each undistorted image's points in a space coordinate system, wherein the space coordinate system is defined with respect to the water plane: the coordinate plane formed by the horizontal axes is parallel to or coincides with the water plane, and the vertical axis points downward;
(3) optimizing the coordinates:
(31) constructing the constraint function min Σ_{i∈V} Σ_{j∈N_i} d_ij, where V is the set of undistorted images; N_i is the set of coordinates having a projected point on the i-th undistorted image; d_ij = D(x_ij, l_i) is the distance from the observed point x_ij on the water surface to l_i, the projection onto the water surface of the line l; l_i is the projection of l on the XOY plane; l is the line formed by the projected point x of the space point X on the image plane πc and the line at infinity of the current image plane, l = r3 × x; and l_i = H^T l, where the transformation matrix H = [r1, r2, t] is assembled from the camera rotation R = [r1, r2, r3] and the translation vector t = [t1, t2, t3]^T; solving this constraint function yields the camera extrinsic parameters R, t1, t2 and the coordinates on the XOY plane that minimize it;
(32) constructing a second constraint function from the distances D(X, l_in) and D(X, l_ref), and solving it to obtain the camera extrinsic parameter t3 and the vertical components of the coordinates that minimize it, where D(X, l) denotes the distance from a point X to a line l; l_in and l_ref denote the projections of the incident ray and the refracted ray on the XOY plane, respectively; X_z denotes the coordinate value in the vertical direction; and Z_w denotes the coordinate value on the XOY plane;
(4) recovering the actual coordinates of the points under water by the law of refraction n1 sin θ1 = n2 sin θ2, and generating the sparse point-cloud 3D model from the actual coordinates.
CN201910277233.9A 2019-04-08 2019-04-08 Sparse point-cloud 3D model building method for land-water coexistence scenes Pending CN110111413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910277233.9A CN110111413A (en) 2019-04-08 2019-04-08 Sparse point-cloud 3D model building method for land-water coexistence scenes


Publications (1)

Publication Number Publication Date
CN110111413A true CN110111413A (en) 2019-08-09

Family

ID=67483705

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910277233.9A Pending CN110111413A (en) 2019-04-08 2019-04-08 Sparse point-cloud 3D model building method for land-water coexistence scenes

Country Status (1)

Country Link
CN (1) CN110111413A (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090219286A1 (en) * 2008-02-28 2009-09-03 Microsoft Corporation Non-linear beam tracing for computer graphics
CN106101689A (en) * 2016-06-13 2016-11-09 西安电子科技大学 Utilize the method that mobile phone monocular cam carries out augmented reality to virtual reality glasses
US20170115412A1 * 2015-10-27 2017-04-27 ConocoPhillips Company Interactive Salt Model Modification
CN106952341A (en) * 2017-03-27 2017-07-14 中国人民解放军国防科学技术大学 The underwater scene three-dimensional point cloud method for reconstructing and its system of a kind of view-based access control model
CN107256563A (en) * 2017-06-13 2017-10-17 中国人民解放军国防科学技术大学 Underwater 3 D reconstructing system and its method based on difference liquid level image sequence
WO2018166747A1 (en) * 2017-03-15 2018-09-20 Jaguar Land Rover Limited Improvements in vehicle control
CN108648264A (en) * 2018-04-25 2018-10-12 吉林大学 Underwater scene method for reconstructing based on exercise recovery and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LAI KANG et al.: "Two-view underwater 3D reconstruction for cameras with unknown poses under flat refractive interfaces", Pattern Recognition *
WANG Lin: "Application of UAV oblique photography technology in 3D city modeling", Surveying and Mapping & Spatial Geographic Information *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111161426A (en) * 2019-12-31 2020-05-15 中航华东光电有限公司 Three-dimensional display method and system based on panoramic image
CN111161426B (en) * 2019-12-31 2023-08-01 中航华东光电有限公司 Panoramic image-based three-dimensional display method and system
CN111462298A (en) * 2020-02-24 2020-07-28 西安电子科技大学 Method for reconstructing underwater three-dimensional scene
CN111462298B (en) * 2020-02-24 2023-03-28 西安电子科技大学 Method for reconstructing underwater three-dimensional scene
CN111311742A (en) * 2020-03-27 2020-06-19 北京百度网讯科技有限公司 Three-dimensional reconstruction method, three-dimensional reconstruction device and electronic equipment
CN112308962A (en) * 2020-11-05 2021-02-02 山东产研信息与人工智能融合研究院有限公司 Real scene model construction method and device with entity target as minimum unit
CN112308962B (en) * 2020-11-05 2023-10-17 山东产研信息与人工智能融合研究院有限公司 Live-action model construction method and device taking entity target as minimum unit

Similar Documents

Publication Publication Date Title
Tewari et al. Advances in neural rendering
CN110111413A (en) Sparse point-cloud 3D model building method for land-water coexistence scenes
JP7413321B2 (en) Daily scene restoration engine
Hedman et al. Scalable inside-out image-based rendering
CN108509848B Real-time detection method and system for three-dimensional objects
CN110392902A (en) Use the operation of sparse volume data
WO2020047338A1 (en) Computer vision system
WO2018009473A1 (en) Motion capture and character synthesis
JP2018514031A (en) DeepStereo: learning to predict new views from real-world images
CN110135455A (en) Image matching method, device and computer readable storage medium
Yao et al. Neilf: Neural incident light field for physically-based material estimation
Henderson et al. Unsupervised object-centric video generation and decomposition in 3D
Rosu et al. Permutosdf: Fast multi-view reconstruction with implicit surfaces using permutohedral lattices
CN109816732A Calibration method, calibration system, correction method, correction system and vehicle
CN113936091A (en) Method and system for constructing ray tracing acceleration structure
CN110163831A (en) The object Dynamic Display method, apparatus and terminal device of three-dimensional sand table
Häne et al. Hierarchical surface prediction
Shinohara et al. Point2color: 3d point cloud colorization using a conditional generative network and differentiable rendering for airborne lidar
CN109685879A (en) Determination method, apparatus, equipment and the storage medium of multi-view images grain distribution
Toschi et al. Relight my nerf: A dataset for novel view synthesis and relighting of real world objects
CN108876906A (en) The method and device of virtual three-dimensional model is established based on the global plane optimizing of cloud
Choi et al. MAIR: multi-view attention inverse rendering with 3d spatially-varying lighting estimation
CN107703537A (en) A kind of big gun examines methods of exhibiting and device in three-dimensional earth's surface
Pellis et al. Synthetic data generation and testing for the semantic segmentation of heritage buildings
Ahmad et al. Multi-view 3d objects localization from street-level scenes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Jiang Guang

Inventor after: Li Jiaqi

Inventor after: Liu Jianhui

Inventor before: Jiang Guang

Inventor before: Li Jiaqi

AD01 Patent right deemed abandoned

Effective date of abandoning: 20220318