CN109003294A - Virtual-real space position registration and precise matching method - Google Patents

Virtual-real space position registration and precise matching method

Info

Publication number
CN109003294A
CN109003294A
Authority
CN
China
Prior art keywords
information
space
established
real time
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810640281.5A
Other languages
Chinese (zh)
Inventor
杨述成 (Yang Shucheng)
刘玉明 (Liu Yuming)
韩哲 (Han Zhe)
刘大军 (Liu Dajun)
崔玉平 (Cui Yuping)
方亚 (Fang Ya)
丁明伟 (Ding Mingwei)
管文艳 (Guan Wenyan)
杜玮 (Du Wei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Aerospace Industry Technology Inst Co Ltd
CASIC SIMULATION TECHNOLOGY Co Ltd
Original Assignee
Shenzhen Aerospace Industry Technology Inst Co Ltd
CASIC SIMULATION TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Aerospace Industry Technology Inst Co Ltd, CASIC SIMULATION TECHNOLOGY Co Ltd filed Critical Shenzhen Aerospace Industry Technology Inst Co Ltd
Priority to CN201810640281.5A priority Critical patent/CN109003294A/en
Publication of CN109003294A publication Critical patent/CN109003294A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a virtual-real space position registration and precise matching method, comprising the following steps: build a three-dimensional model of the space with measuring tools to establish virtual space information; scan the space in real time with a binocular depth camera to establish a spatial information model; match the object position and shape information from the real-time scan against the established virtual space information in real time; after the position information is matched successfully, use the precise position information of the virtual space to correct, in real time, the spatial information model built by the binocular depth camera; after the scan ends, compare the model built by the real-time scan with the pre-established virtual space information and fill in the space that was not scanned. Compared with the prior art, the method scans spatial positions more accurately, takes less time, and can predict unknown spatial information.

Description

Virtual-real space position registration and precise matching method
Technical field
The present invention relates to the field of spatial vision technology, in particular to a virtual-real space position registration and precise matching method, and more particularly to spatial positioning and the creation of real-time three-dimensional space maps.
Background technique
In the field of AR (augmented reality), SLAM technology provides a space map and the current viewing angle, so that superimposed virtual objects can be rendered accordingly; this makes the superimposed virtual objects look realistic, without a sense of incongruity. However, the map established by SLAM technology is affected by factors such as scanning time, scanned space size, and space complexity: its recognition accuracy is low, the volume of data stored in real time is large, and it cannot make spatial predictions for unscanned space.
Summary of the invention
The object of the present invention is to propose a virtual-real space position registration and precise matching method that overcomes the above deficiencies of the prior art.
To achieve this technical purpose, the technical solution of the present invention is realized as follows:
A virtual-real space position registration and precise matching method, comprising the following steps:
1) Build a three-dimensional model of the space with measuring tools to establish virtual space information;
2) Scan the space in real time with a binocular depth camera to establish a spatial information model;
3) Match the object position and shape information from the real-time scan against the established virtual space information in real time;
4) After the position information is matched successfully, use the precise position information in the virtual space to correct, in real time, the spatial information model obtained by the binocular depth camera;
5) After the scan ends, compare the model built by the real-time scan with the pre-established virtual space information, and fill in the space that was not scanned.
Further, in step 3) a shape-skeleton fast matching method is used to match the object position and shape information of the real-time scan against the established virtual space information in real time.
Beneficial effects of the present invention: the method builds a virtual model of the space and registers the effective factors in the virtual space; when the on-site real-time scan builds its model, the space factor information is registered at the same time and matched in real time against the virtual-space registration data, correcting the data obtained by scanning. After the scan ends, spatial prediction modeling is performed on the real-time scanned spatial model according to the virtual space.
Detailed description of the invention
Fig. 1 is a schematic diagram of head-up (parallel-axis) binocular stereo imaging;
Fig. 2 is a schematic diagram of binocular stereo imaging at a general viewing angle.
Specific embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings.
A virtual-real space position registration and precise matching method according to an embodiment of the present invention comprises the following steps:
In the first step, a three-dimensional model of the space is built with precise measuring tools. A coordinate origin and a spatial coordinate system are established; the unit of stored information is the meter. The modeling information includes the main information of the space structure, such as the structure and length of a room and the position coordinates of indoor objects such as desks, stools, and vases. In addition to position coordinates, the shape and size information of the spatial objects needs to be stored, e.g. cylinder or cube.
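As an illustration, the virtual space information described in this step could be held in a structure like the following Python sketch; the class and field names (`SpaceObject`, `VirtualSpace`, `register`) are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class SpaceObject:
    name: str        # e.g. "desk", "stool", "vase"
    position: tuple  # (x, y, z) position coordinates, in meters
    shape: str       # shape primitive, e.g. "cube", "cylinder"
    size: tuple      # shape dimensions, in meters

@dataclass
class VirtualSpace:
    origin: tuple = (0.0, 0.0, 0.0)  # coordinate origin of the space frame
    objects: list = field(default_factory=list)

    def register(self, obj: SpaceObject) -> None:
        """Register one measured object in the virtual space."""
        self.objects.append(obj)

# A room modeled with precise measurements, stored in meters:
room = VirtualSpace()
room.register(SpaceObject("desk", (1.2, 0.8, 0.0), "cube", (1.4, 0.7, 0.75)))
room.register(SpaceObject("vase", (1.2, 0.8, 0.75), "cylinder", (0.1, 0.1, 0.3)))
```

Storing shape primitives alongside positions is what later lets the scanned model be matched and corrected against this registry.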
In the second step, the space is scanned in real time; the real-time spatial scan can use the RGBD SLAM technique of a depth camera. With binocular stereo vision, the three-dimensional geometric information of objects is obtained from multiple images: two digital images of the surrounding scene are acquired simultaneously from different viewpoints by two cameras, or at different moments from different viewpoints by a single camera, and the three-dimensional geometric information can then be recovered based on the parallax principle, reconstructing the three-dimensional shape and position of the surrounding scene.
Three-dimensional information is acquired by the triangulation principle: the image planes of the two cameras and the measured object form a triangle. Given the positional relationship between the two cameras, the three-dimensional dimensions of objects within the common field of view of the two cameras, and the three-dimensional coordinates of feature points of spatial objects, can be obtained.
Specifically, as shown in Fig. 1, in the simple head-up binocular stereo imaging configuration the distance between the projection centers of the two cameras is the baseline B. The two cameras observe the same feature point P of a spatial object at the same moment, acquiring images of P with the "left eye" and the "right eye" respectively, with image coordinates Pleft = (Xleft, Yleft) and Pright = (Xright, Yright). If the images of the two cameras lie in the same plane, the Y coordinate of the image of feature point P must be identical in both, i.e. Yleft = Yright = Y. From the triangle geometry the following relations are obtained (f is the effective focal length and (x, y, z) are the coordinates of P in the left camera frame):

    Xleft = f · x / z
    Xright = f · (x − B) / z
    Y = f · y / z
The parallax (disparity) is then Disparity = Xleft − Xright. The three-dimensional coordinates of feature point P in the camera coordinate system can thus be computed:

    x = B · Xleft / Disparity
    y = B · Y / Disparity
    z = B · f / Disparity
Therefore, as long as any point in the left camera image plane can find its corresponding matching point in the right camera image plane, the three-dimensional coordinates of that point can be fully determined. The method is a point-to-point operation: as long as corresponding matching points exist for all points in the plane, they can all take part in the above operation to obtain their corresponding three-dimensional coordinates.
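The head-up triangulation above can be sketched in a few lines of Python (function and variable names are illustrative):

```python
def triangulate(x_left, x_right, y, baseline, focal):
    """Recover the (x, y, z) coordinates of a feature point P in the
    left-camera frame from a rectified (head-up) stereo pair.

    x_left, x_right : image x-coordinates of P in the left/right views
    y               : common image y-coordinate (Yleft == Yright == Y)
    baseline        : baseline B between the two projection centers
    focal           : effective focal length f (same units as image coords)
    """
    disparity = x_left - x_right
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity")
    z = baseline * focal / disparity
    x = baseline * x_left / disparity
    y3d = baseline * y / disparity
    return x, y3d, z

# A point seen at Xleft=120, Xright=100 with B=0.1 m and f=500 px
# has disparity 20 and therefore lies at roughly (0.6, 0.25, 2.5) m:
print(triangulate(120.0, 100.0, 50.0, baseline=0.1, focal=500.0))
```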
Having analyzed three-dimensional measurement in the simplest head-up binocular stereo configuration, we are now in a position to consider the general case. As shown in Fig. 2, let the left camera coordinate system O-xyz coincide with the world coordinate system, with no rotation, image coordinate system Ol-XlYl, and effective focal length fl; let the right camera coordinate system be Or-xryrzr, with image coordinate system Or-XrYr and effective focal length fr. From the camera projection model we obtain the following relations:

    Xl = fl · x / z,    Yl = fl · y / z     (left camera)
    Xr = fr · xr / zr,  Yr = fr · yr / zr   (right camera)

The positional relationship between the O-xyz coordinate system and the Or-xryrzr coordinate system can be expressed by the space transformation matrix Mlr:

    [xr]   [r1 r2 r3 tx] [x]
    [yr] = [r4 r5 r6 ty] [y]
    [zr]   [r7 r8 r9 tz] [z]
                         [1]

where r1 … r9 form the rotation and (tx, ty, tz) the translation between the two cameras. Likewise, for a spatial point in the O-xyz coordinate system, the correspondence between the image points of the two cameras can be expressed as:

    zr · Xr = fr · (r1·x + r2·y + r3·z + tx)
    zr · Yr = fr · (r4·x + r5·y + r6·z + ty)
    zr      = r7·x + r8·y + r9·z + tz

Eliminating xr, yr, zr, the three-dimensional coordinates of the spatial point can then be expressed as:

    x = z · Xl / fl
    y = z · Yl / fl
    z = fl · (fr·tx − Xr·tz) / (Xr·(r7·Xl + r8·Yl + r9·fl) − fr·(r1·Xl + r2·Yl + r3·fl))
Therefore, once the intrinsic parameters and effective focal lengths fl, fr of the left and right cameras, and the image coordinates of the spatial point in the left and right cameras, are obtained by camera calibration, the three-dimensional space coordinates of the measured point can be reconstructed.
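The general-case reconstruction amounts to solving a small linear system; one common way to sketch it is linear (DLT) triangulation from the two camera projection matrices. The matrices and numbers below are illustrative stand-ins for calibrated values, not values from the patent:

```python
import numpy as np

def triangulate_point(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one spatial point from two calibrated
    views. P_left/P_right are 3x4 projection matrices; uv_* are the (u, v)
    image coordinates of the matched point."""
    A = np.vstack([
        uv_left[0]  * P_left[2]  - P_left[0],
        uv_left[1]  * P_left[2]  - P_left[1],
        uv_right[0] * P_right[2] - P_right[0],
        uv_right[1] * P_right[2] - P_right[1],
    ])
    # Solve A @ X = 0 for the homogeneous point X: the solution is the
    # right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Left camera at the world origin (no rotation), right camera offset by a
# 0.1 m baseline, both with effective focal length 500 (pixel units):
f = 500.0
P_l = np.array([[f, 0, 0, 0], [0, f, 0, 0], [0, 0, 1, 0]], dtype=float)
P_r = np.array([[f, 0, 0, -f * 0.1], [0, f, 0, 0], [0, 0, 1, 0]], dtype=float)

X_true = np.array([0.6, 0.25, 2.5])            # ground-truth point, meters
uv_l = P_l @ np.append(X_true, 1.0); uv_l = uv_l[:2] / uv_l[2]
uv_r = P_r @ np.append(X_true, 1.0); uv_r = uv_r[:2] / uv_r[2]
print(triangulate_point(P_l, P_r, uv_l, uv_r))   # recovers X_true
```

With noise-free correspondences the SVD solution recovers the point exactly (up to numerical precision); with noisy matches it gives the linear least-squares estimate.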
In the third step, the object position and shape information from the real-time scan is matched in real time against the established virtual space information. The matching technique is shape-skeleton fast matching: the nodes of the skeleton serve as shape elements, and the connection relationships between nodes act as natural constraints on the matching between nodes. Since a skeleton describes the structural information of an object while ignoring the geometric information of its surface, it suits exactly the object space information obtained by this real-time scan. After fast matching against the virtual space, a rough position match between the two spaces is obtained.
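The patent does not spell out the skeleton-matching algorithm, so the following Python sketch only illustrates the idea under stated assumptions: each skeleton node is described by its degree and its sorted incident edge lengths (both invariant to rigid motion), and each scanned node is greedily assigned to the virtual node with the closest descriptor:

```python
def node_descriptor(node, nodes, edges):
    """Describe a skeleton node by connectivity and local geometry:
    (degree, sorted lengths of incident edges). Both are invariant to
    translation and rotation, so a moved scan still matches."""
    lengths = []
    for a, b in edges:
        if node in (a, b):
            other = b if a == node else a
            p, q = nodes[node], nodes[other]
            lengths.append(sum((pi - qi) ** 2 for pi, qi in zip(p, q)) ** 0.5)
    return len(lengths), sorted(lengths)

def match_skeletons(scan_nodes, scan_edges, virt_nodes, virt_edges):
    """Greedily pair each scanned node with the virtual node whose
    descriptor is closest (same degree, nearest edge lengths)."""
    matches = {}
    for s in scan_nodes:
        sd, sl = node_descriptor(s, scan_nodes, scan_edges)
        best, best_cost = None, float("inf")
        for v in virt_nodes:
            vd, vl = node_descriptor(v, virt_nodes, virt_edges)
            if vd != sd:
                continue  # connection relationship constrains the match
            cost = sum(abs(a - b) for a, b in zip(sl, vl))
            if cost < best_cost:
                best, best_cost = v, cost
        matches[s] = best
    return matches

# Virtual skeleton: a T-shape; scanned skeleton: the same shape translated.
virt = {"a": (0, 0, 0), "b": (1, 0, 0), "c": (2, 0, 0), "d": (1, 1, 0)}
virt_e = [("a", "b"), ("b", "c"), ("b", "d")]
scan = {"p": (5, 5, 0), "q": (6, 5, 0), "r": (7, 5, 0), "s": (6, 6, 0)}
scan_e = [("p", "q"), ("q", "r"), ("q", "s")]
print(match_skeletons(scan, scan_e, virt, virt_e))
```

A real implementation would also resolve ties among structurally identical leaf nodes (here the greedy pass cannot distinguish them); this sketch only yields the rough match the step describes.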
In the fourth step, after the position information has been matched successfully, the model that the binocular depth camera builds from the spatial information is corrected in real time using the precise position information of the virtual space. Planar information identified during the real-time scan, such as the floor, desktops, and the ceiling, would otherwise be composed of many triangles and occupy considerable storage; instead, the planes can be stored directly, reducing redundant data storage.
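The plane compression described here can be sketched as a least-squares plane fit: instead of storing the many triangles of a flat region, fit one plane (centroid plus unit normal) to the region's vertices and store just those six numbers. The SVD-based fit below is a standard choice, not one prescribed by the patent:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a cloud of roughly coplanar points.
    Returns (centroid, unit normal): six floats instead of a triangle soup."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the direction of least variance: the last right
    # singular vector of the centered point cloud.
    _, _, Vt = np.linalg.svd(pts - centroid)
    return centroid, Vt[-1]

# A scanned desktop: many vertices, all near the plane z = 0.75 m.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 1.4, size=(200, 2))
z = 0.75 + rng.normal(0, 1e-3, size=200)     # millimeter-level scan noise
desk = np.column_stack([xy, z])

c, n = fit_plane(desk)
print(c, n)   # centroid near z = 0.75, normal near ±(0, 0, 1)
```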
In the fifth step, after the scan ends, the model built by the real-time scan is compared with the virtual space established in advance, and the space that was not scanned is filled in.
The foregoing are merely preferred embodiments of the present invention and are not intended to limit the invention; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (2)

1. A virtual-real space position registration and precise matching method, characterized in that the method comprises the following steps:
1) building a three-dimensional model of the space with measuring tools to establish virtual space information;
2) scanning the space in real time with a binocular depth camera to establish a spatial information model;
3) matching the object position and shape information from the real-time scan against the established virtual space information in real time;
4) after the position information is matched successfully, correcting in real time, using the precise position information in the virtual space, the spatial information model obtained by the binocular depth camera;
5) after the scan ends, comparing the model built by the real-time scan with the pre-established virtual space information, and filling in the space that was not scanned.
2. The method according to claim 1, characterized in that in step 3) a shape-skeleton fast matching method is used to match the object position and shape information of the real-time scan against the established virtual space information in real time.
CN201810640281.5A 2018-06-21 2018-06-21 Virtual-real space position registration and precise matching method Pending CN109003294A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810640281.5A CN109003294A (en) 2018-06-21 2018-06-21 Virtual-real space position registration and precise matching method


Publications (1)

Publication Number Publication Date
CN109003294A true CN109003294A (en) 2018-12-14

Family

ID=64601506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810640281.5A Pending CN109003294A (en) 2018-06-21 2018-06-21 Virtual-real space position registration and precise matching method

Country Status (1)

Country Link
CN (1) CN109003294A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110020624A * 2019-04-08 2019-07-16 Shijiazhuang Tiedao University Image recognition method, terminal device and storage medium
CN110126824A * 2019-05-22 2019-08-16 Henan University of Technology Commercial vehicle AEBS system integrating a binocular camera and millimetre-wave radar

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945564A * 2012-10-16 2013-02-27 Shanghai University True 3D modeling system and method based on video perspective type augmented reality
CN105354854A * 2015-12-01 2016-02-24 State Grid Corporation of China Camera parameter dynamic united calibration method and system based on three-dimensional digital model
CN105741346A * 2014-12-29 2016-07-06 Dassault Systèmes Method for calibrating a depth camera
US20180114363A1 (en) * 2016-10-25 2018-04-26 Microsoft Technology Licensing, Llc Augmented scanning of 3d models


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAI CORDES, ET AL: "Extrinsic calibration of a stereo camera system using a 3D CAD model considering the uncertainties of estimated feature points", 2009 Conference for Visual Media Production *


Similar Documents

Publication Publication Date Title
US11354840B2 (en) Three dimensional acquisition and rendering
CN104504671B (en) Method for generating virtual-real fusion image for stereo display
US11170561B1 (en) Techniques for determining a three-dimensional textured representation of a surface of an object from a set of images with varying formats
CN106101689B Method of performing augmented reality with virtual reality glasses using a mobile phone monocular camera
CN106228538B (en) Binocular vision indoor orientation method based on logo
CN103971408A (en) Three-dimensional facial model generating system and method
Bertel et al. Megaparallax: Casual 360 panoramas with motion parallax
WO2019219014A1 (en) Three-dimensional geometry and eigencomponent reconstruction method and device based on light and shadow optimization
CN101729920B (en) Method for displaying stereoscopic video with free visual angles
CN103810685A (en) Super resolution processing method for depth image
CN104376552A (en) Virtual-real registering algorithm of 3D model and two-dimensional image
CN104361628A (en) Three-dimensional real scene modeling system based on aviation oblique photograph measurement
Forbes et al. Using silhouette consistency constraints to build 3D models
WO2019085022A1 (en) Generation method and device for optical field 3d display unit image
CN104599284A (en) Three-dimensional facial reconstruction method based on multi-view cellphone selfie pictures
CN111009030A (en) Multi-view high-resolution texture image and binocular three-dimensional point cloud mapping method
US8577202B2 (en) Method for processing a video data set
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN108010125A (en) True scale three-dimensional reconstruction system and method based on line-structured light and image information
Zhu et al. Eye contact in video conference via fusion of time-of-flight depth sensor and stereo
JP4996922B2 (en) 3D visualization
TW200828182A (en) Method of utilizing multi-view images to solve occlusion problem for photorealistic model reconstruction
JP4354708B2 (en) Multi-view camera system
CN109003294A (en) Virtual-real space position registration and precise matching method
Palasek et al. A flexible calibration method of multiple Kinects for 3D human reconstruction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181214
