CN105513128A - Kinect-based three-dimensional data fusion processing method - Google Patents

Info

Publication number
CN105513128A
CN105513128A (application CN201610022247.2A)
Authority
CN
China
Prior art keywords
kinect
dimensional
point cloud
data fusion
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610022247.2A
Other languages
Chinese (zh)
Inventor
赵维明
李士伟
梁磊
段丕轩
李雨芮
任晓波
岳廷瑞
尹熹伟
李小艳
梁频
何苗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Low Speed Aerodynamics Institute of China Aerodynamics Research and Development Center
Original Assignee
Low Speed Aerodynamics Institute of China Aerodynamics Research and Development Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2016-01-13
Filing date: 2016-01-13
Publication date: 2016-04-20
Application filed by Low Speed Aerodynamics Institute of China Aerodynamics Research and Development Center
Priority to CN201610022247.2A
Publication of CN105513128A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 — Indexing scheme for image data processing or generation, in general
    • G06T2200/04 — Indexing scheme for image data processing or generation, in general involving 3D image data

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a Kinect-based three-dimensional data fusion processing method comprising the following steps: a. using two Kinects to acquire point clouds A and B of the same chessboard target, and obtaining the spatial measurement coordinates of the same 40 chessboard lattice points in the two point clouds, the two Kinects being placed symmetrically about the chessboard target; b. obtaining the transformation matrix $M_{wc}$ that maps point cloud A into the spatial coordinate system of point cloud B; c. fusing the three-dimensional point clouds acquired by the two Kinects. According to the method, two Kinect devices acquire one set of three-dimensional point cloud data, and the spatial positions of the two Kinects are calibrated to obtain the transformation matrix, whereby the two sets of acquired three-dimensional point clouds are fused.

Description

Kinect-based three-dimensional data fusion processing method
Technical field
The present invention relates to a Kinect-based three-dimensional data fusion processing method.
Background art
In the optical measurement of wind tunnel model attitude and deformation, pasting or spraying pattern markers, or embedding luminescent markers, damages the surface characteristics of the model, and such markers are difficult to retain when temperature and pressure change violently. To measure model attitude and displacement in wind tunnel tests without special treatment of the model, such as spraying or attaching marker points, and to improve the measurement capability in harsh test environments such as high-temperature and high-pressure conditions, a new three-dimensional non-contact measurement method needs to be studied.
Microsoft's Kinect is a depth camera that integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction. Without any controller, it relies on its cameras to capture the motion of the model in three-dimensional space and thereby obtains the three-dimensional attitude and deformation information of the model under test.
Although currently popular depth-information acquisition devices achieve high precision, the conditions they require are often demanding, and factors such as price and operational complexity keep them from everyday use. Moreover, because the infrared camera and the VGA camera of the Kinect sit at different positions, and their lens parameters are not identical, the pictures acquired by the two cameras differ slightly, so the three-dimensional coordinates (X, Y, Z) and the color information cannot be made to correspond to the same point on the model.
Summary of the invention
The object of the present invention is to provide a Kinect-based three-dimensional data fusion processing method that, even in a complex environment, uses two Kinect devices to acquire one set of three-dimensional point cloud data, calibrates the spatial positions of the two Kinects to obtain a transformation matrix, and thereby fuses the two sets of acquired three-dimensional point clouds.
To achieve the above object, the technical solution of the present invention is a Kinect-based three-dimensional data fusion processing method comprising the following steps:
a. acquire point clouds A and B of the same chessboard target with two Kinects respectively, and obtain the spatial measurement coordinates of the same 40 chessboard lattice points in the two point clouds; the two Kinects are placed symmetrically about the chessboard target;
b. obtain the transformation matrix $M_{wc}$ that maps point cloud A into the spatial coordinate system of point cloud B, computed by least squares according to the formula $M_{wc} = (A^T A)^{-1} A^T B$;
c. fuse the three-dimensional point clouds acquired by the two Kinects: transform the point cloud acquired by the Kinect that produced cloud A into the world coordinate system of the other Kinect, expressed as $A M_{wc} = B$.
Preferably, the two Kinects are placed symmetrically about the chessboard target.
The advantage and beneficial effect of the present invention is to provide a Kinect-based three-dimensional data fusion processing method that, even in a complex environment, uses two Kinect devices to acquire one set of three-dimensional point cloud data, calibrates the spatial positions of the two Kinects to obtain a transformation matrix, and thereby fuses the two sets of acquired three-dimensional point clouds.
The present invention acquires three-dimensional point clouds with the Kinect using software written on the basis of OpenNI and Primesense, so that the depth data (Z) and the image data (X, Y) obtained by the Kinect coincide well.
The present invention obtains large model deformations and the three-dimensional attitude and deformation data of a large interrogation region without damaging the surface characteristics of the model; the three-dimensional coordinate results of the constructed measurement platform show good consistency with those of a portable coordinate measuring machine. The method not only makes the depth data and the image data coincide well, but also fuses the point clouds acquired by the two Kinects into one without error.
Brief description of the drawings
Fig. 1 is a schematic diagram of the present invention;
Fig. 2 is a schematic diagram of the target;
Fig. 3 is the target calibration diagram.
Embodiments
The specific embodiments of the present invention are further described below in conjunction with the drawings and examples. The following examples serve only to illustrate the technical solution of the present invention clearly and do not limit the scope of the invention.
As shown in Fig. 1 to Fig. 3, the technical solution specifically implemented by the present invention is as follows:
(1) A computer (4 in Fig. 1) is used, running processing software written on the basis of OpenNI and Primesense; the two Kinects (1 and 2 in Fig. 1) are placed at the positions shown in Fig. 1, and each acquires the three-dimensional spatial coordinates of the chessboard points on the front face of the target 3 (as shown in Fig. 2).
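For illustration, a minimal acquisition sketch in Python is given below. It assumes the OpenNI2 bindings from the "primesense" PyPI package and hypothetical pinhole intrinsics fx, fy, cx, cy; the patent states only that the processing software is written on the basis of OpenNI and Primesense, so none of these details are the inventors' implementation.

```python
# Minimal sketch: grab one depth frame from a Kinect via OpenNI2 and
# back-project it to a 3D point cloud. The intrinsics below are
# hypothetical placeholders, not values from the patent.
import numpy as np
from primesense import openni2

openni2.initialize()                        # load the OpenNI2 runtime
dev = openni2.Device.open_any()             # first attached Kinect
depth_stream = dev.create_depth_stream()
depth_stream.start()

frame = depth_stream.read_frame()
depth = np.frombuffer(frame.get_buffer_as_uint16(), dtype=np.uint16)
depth = depth.reshape(frame.height, frame.width).astype(np.float64)  # mm

# Pinhole back-projection of every pixel (u, v, depth) to (x, y, z).
fx = fy = 585.0                             # hypothetical focal lengths (px)
cx, cy = frame.width / 2.0, frame.height / 2.0
v, u = np.mgrid[0:frame.height, 0:frame.width]
cloud = np.dstack(((u - cx) * depth / fx,   # x
                   (v - cy) * depth / fy,   # y
                   depth)).reshape(-1, 3)   # z; result: (N, 3) point cloud

depth_stream.stop()
openni2.unload()
```

Running this once per Kinect yields the two point clouds (A and B) on which the calibration in step (2) operates; the 40 chessboard lattice points would then be picked out of each cloud.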
(2) The spatial positional relationship of the two Kinects (1 and 2 in Fig. 1), placed one in front of and one behind the target, is calibrated, and the transformation matrix is obtained; the two Kinects are placed symmetrically about the chessboard target. The main steps comprise:
a. Use the two Kinects (1 and 2 in Fig. 1) to acquire point clouds A and B of the same chessboard target (3 in Fig. 1) respectively, and obtain the spatial measurement coordinates of the same 40 chessboard lattice points (as shown in Fig. 3, where 5 denotes a calibration point) in the two point clouds.
b. The transformation that maps point cloud A into the spatial coordinate system of point cloud B is as follows:
$$X_A^W = \frac{m_{11}^{wc} X_B^W + m_{12}^{wc} Y_B^W + m_{13}^{wc} Z_B^W + m_{14}^{wc}}{m_{41}^{wc} X_B^W + m_{42}^{wc} Y_B^W + m_{43}^{wc} Z_B^W + m_{44}^{wc}} \qquad (1)$$

$$Y_A^W = \frac{m_{21}^{wc} X_B^W + m_{22}^{wc} Y_B^W + m_{23}^{wc} Z_B^W + m_{24}^{wc}}{m_{41}^{wc} X_B^W + m_{42}^{wc} Y_B^W + m_{43}^{wc} Z_B^W + m_{44}^{wc}} \qquad (2)$$

$$Z_A^W = \frac{m_{31}^{wc} X_B^W + m_{32}^{wc} Y_B^W + m_{33}^{wc} Z_B^W + m_{34}^{wc}}{m_{41}^{wc} X_B^W + m_{42}^{wc} Y_B^W + m_{43}^{wc} Z_B^W + m_{44}^{wc}} \qquad (3)$$
The transformation matrix is $M_{wc}$; it is obtained by the least squares method according to the formula $M_{wc} = (A^T A)^{-1} A^T B$. $M_{wc}$ is expressed as follows:
$$M_{wc} = \begin{bmatrix} m_{11}^{wc} & m_{12}^{wc} & m_{13}^{wc} & m_{14}^{wc} \\ m_{21}^{wc} & m_{22}^{wc} & m_{23}^{wc} & m_{24}^{wc} \\ m_{31}^{wc} & m_{32}^{wc} & m_{33}^{wc} & m_{34}^{wc} \\ m_{41}^{wc} & m_{42}^{wc} & m_{43}^{wc} & m_{44}^{wc} \end{bmatrix} \qquad (4)$$
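As a worked illustration of the least-squares formula above, here is a minimal sketch assuming the 40 corresponding lattice points are stacked as rows in homogeneous coordinates, so that the row form $A\,M_{wc} = B$ used in the claims holds; the function and variable names are illustrative only.

```python
import numpy as np

def estimate_mwc(points_a: np.ndarray, points_b: np.ndarray) -> np.ndarray:
    """Estimate the 4x4 transformation matrix M_wc = (A^T A)^-1 A^T B.

    points_a, points_b: (40, 3) arrays holding the same chessboard
    lattice points as measured by Kinect 1 and Kinect 2 respectively.
    """
    n = points_a.shape[0]
    A = np.hstack([points_a, np.ones((n, 1))])  # (40, 4) homogeneous rows
    B = np.hstack([points_b, np.ones((n, 1))])  # (40, 4) homogeneous rows
    # Normal equations, exactly as in the patent's formula;
    # np.linalg.lstsq(A, B, rcond=None) is the numerically safer equivalent.
    return np.linalg.inv(A.T @ A) @ A.T @ B     # (4, 4) matrix M_wc
```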
(3) The three-dimensional point clouds acquired by the two Kinects are fused. The main step is to transform the point cloud acquired by the Kinect that produced cloud A into the world coordinate system of the other Kinect, expressed as follows:

$$\begin{bmatrix} X_B^w \\ Y_B^w \\ Z_B^w \\ w \end{bmatrix} = M_{wc} \begin{bmatrix} X_A \\ Y_A \\ Z_A \\ 1 \end{bmatrix},$$

realizing a comprehensive three-dimensional non-contact measurement of the model.
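A minimal sketch of this fusion step follows, continuing the row-vector convention of the least-squares formula (the column form in the equation above differs from it only by a transpose); names are again illustrative.

```python
import numpy as np

def fuse_clouds(cloud_a: np.ndarray, cloud_b: np.ndarray,
                m_wc: np.ndarray) -> np.ndarray:
    """Map cloud_a ((N, 3), frame of Kinect 1) into the world frame of
    Kinect 2 with m_wc, then concatenate with cloud_b ((M, 3))."""
    n = cloud_a.shape[0]
    A = np.hstack([cloud_a, np.ones((n, 1))])  # homogeneous rows
    B = A @ m_wc                               # row form of A * M_wc = B
    B = B[:, :3] / B[:, 3:4]                   # divide by w to de-homogenize
    return np.vstack([B, cloud_b])             # fused (N + M, 3) point cloud
```

In this sketch the fused cloud lives in the coordinate system of the second Kinect, matching step c of the claims.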
The above is only a preferred embodiment of the present invention. It should be pointed out that those skilled in the art can make several improvements and modifications without departing from the technical principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (2)

1. A Kinect-based three-dimensional data fusion processing method, characterized by comprising the following steps:
a. acquiring point clouds A and B of the same chessboard target with two Kinects respectively, and obtaining the spatial measurement coordinates of the same 40 chessboard lattice points in the two point clouds;
b. obtaining the transformation matrix $M_{wc}$ that maps point cloud A into the spatial coordinate system of point cloud B, computed by least squares according to the formula $M_{wc} = (A^T A)^{-1} A^T B$;
c. fusing the three-dimensional point clouds acquired by the two Kinects: transforming the point cloud acquired by the Kinect that produced cloud A into the world coordinate system of the other Kinect, expressed as $A M_{wc} = B$.
2. The Kinect-based three-dimensional data fusion processing method according to claim 1, characterized in that the two Kinects are placed symmetrically about the chessboard target.
CN201610022247.2A 2016-01-13 2016-01-13 Kinect-based three-dimensional data fusion processing method Pending CN105513128A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610022247.2A CN105513128A (en) 2016-01-13 2016-01-13 Kinect-based three-dimensional data fusion processing method

Publications (1)

Publication Number Publication Date
CN105513128A true CN105513128A (en) 2016-04-20

Family

ID=55721080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610022247.2A Pending CN105513128A (en) 2016-01-13 2016-01-13 Kinect-based three-dimensional data fusion processing method

Country Status (1)

Country Link
CN (1) CN105513128A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103106688A (en) * 2013-02-20 2013-05-15 北京工业大学 Indoor three-dimensional scene rebuilding method based on double-layer rectification method
CN103279987A (en) * 2013-06-18 2013-09-04 厦门理工学院 Object fast three-dimensional modeling method based on Kinect
CN103413352A (en) * 2013-07-29 2013-11-27 西北工业大学 Scene three-dimensional reconstruction method based on RGBD multi-sensor fusion
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
吴禄慎 et al.: "Improved ICP three-dimensional point-cloud registration technology based on feature points", Journal of Nanchang University (Engineering & Technology Edition) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106384380A (en) * 2016-08-31 2017-02-08 重庆七腾软件有限公司 3D human body scanning, modeling and measuring method and system
CN107578019A (en) * 2017-09-13 2018-01-12 河北工业大学 A kind of Gait Recognition system of visual tactile fusion and recognition methods
CN107578019B (en) * 2017-09-13 2020-05-12 河北工业大学 Gait recognition system and method based on visual sense and tactile sense fusion
CN108230379A (en) * 2017-12-29 2018-06-29 百度在线网络技术(北京)有限公司 For merging the method and apparatus of point cloud data
CN108230379B (en) * 2017-12-29 2020-12-04 百度在线网络技术(北京)有限公司 Method and device for fusing point cloud data
CN109272572A (en) * 2018-08-30 2019-01-25 中国农业大学 A kind of modeling method and device based on double Kinect cameras
CN109875562A (en) * 2018-12-21 2019-06-14 鲁浩成 A kind of human somatotype monitoring system based on the more visual analysis of somatosensory device
CN112361989A (en) * 2020-09-30 2021-02-12 北京印刷学院 Method for calibrating parameters of measurement system through point cloud uniformity consideration
CN112361989B (en) * 2020-09-30 2022-09-30 北京印刷学院 Method for calibrating parameters of measurement system through point cloud uniformity consideration
CN113198692A (en) * 2021-05-19 2021-08-03 飓蜂科技(苏州)有限公司 High-precision dispensing method and device suitable for batch products
CN113237628A (en) * 2021-07-08 2021-08-10 中国空气动力研究与发展中心低速空气动力研究所 Method for measuring horizontal free flight model attitude of low-speed wind tunnel

Similar Documents

Publication Publication Date Title
CN105513128A (en) Kinect-based three-dimensional data fusion processing method
US20210166495A1 (en) Capturing and aligning three-dimensional scenes
CN103175485A (en) Method for visually calibrating aircraft turbine engine blade repair robot
CN102589530B (en) Method for measuring position and gesture of non-cooperative target based on fusion of two dimension camera and three dimension camera
CN104484887B (en) External parameters calibration method when video camera is used in combination with scanning laser range finder
CN104424630A (en) Three-dimension reconstruction method and device, and mobile terminal
CN108492017B (en) Product quality information transmission method based on augmented reality
WO2008099915A1 (en) Road/feature measuring device, feature identifying device, road/feature measuring method, road/feature measuring program, measuring device, measuring method, measuring program, measured position data, measuring terminal, measuring server device, drawing device, drawing method, drawing program, and drawing data
CN103838437A (en) Touch positioning control method based on projection image
CN110879080A (en) High-precision intelligent measuring instrument and measuring method for high-temperature forge piece
CN103925879A (en) Indoor robot vision hand-eye relation calibration method based on 3D image sensor
CN102508575B (en) Screen writing device, screen writing system and realization method thereof
CN104034269A (en) Monocular vision measuring method and monocular vision measuring device
CN106097433A (en) Object industry and the stacking method of Image model and system
CN105957096A (en) Camera extrinsic parameter calibration method for three-dimensional digital image correlation
CN103955316A (en) Fingertip touch detection system and method
CN107808412A (en) A kind of three-dimensional thermal source environmental model based on low cost determines environmental information method
CN110568934A (en) Low-error high-efficiency multi-label-graph augmented reality system
CN104952105A (en) Method and apparatus for estimating three-dimensional human body posture
KR101496441B1 (en) Apparatus and Method for registration of flat panel display device and imaging sensor, and Electronic device having flat panel display device and imaging sensor which are registered using the method
CN110796702A (en) Industrial equipment identification and positioning method, system and equipment based on machine vision
CN104596486A (en) Target-rotational-symmetry-characteristic-based posture measurement method
CN102831388A (en) Method and system for detecting real-time characteristic point based on expanded active shape model
CN205318475U (en) Three dimensional data fuses processing system based on kinect
CN203933782U (en) Prospect position-recognizing system in a kind of Virtual Studio System

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2016-04-20