CN103528568A - Wireless channel based target pose image measuring method - Google Patents

Wireless channel based target pose image measuring method

Info

Publication number
CN103528568A
Authority
CN
China
Prior art keywords
image
pose
information data
wireless channel
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310464818.4A
Other languages
Chinese (zh)
Other versions
CN103528568B (en)
Inventor
谌德荣
王长元
周广铭
蒋玉萍
高翔霄
杨晓乐
关咏梅
董齐齐
赵燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Beijing Institute of Astronautical Systems Engineering
Original Assignee
Beijing Institute of Technology BIT
Beijing Institute of Astronautical Systems Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT, Beijing Institute of Astronautical Systems Engineering filed Critical Beijing Institute of Technology BIT
Priority to CN201310464818.4A priority Critical patent/CN103528568B/en
Publication of CN103528568A publication Critical patent/CN103528568A/en
Application granted granted Critical
Publication of CN103528568B publication Critical patent/CN103528568B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target pose image measuring method based on a wireless-channel data transmission system, and relates to image processing and pose measurement technology. The method comprises the following steps: a transmitting end, comprising a camera and an image processing unit, and a receiving end are provided; the image processing unit extracts at least three feature points from a captured image, determines the coordinates of these feature points, and establishes the matching relationship between the coordinates and the prior features of the target to generate image feature information data; the image feature information data are transmitted from the transmitting end to the receiving end over a wireless channel; and a pose resolving unit at the receiving end uses the received image feature information data to calculate the pose parameters. The pose image measuring method thus comprises the steps of feature extraction and pose parameter calculation, and satisfies the low-capacity requirement of the wireless channel by transmitting image feature information data instead of whole images. Feature extraction is completed on high-resolution, high-frame-rate images, so the measurement precision of the pose parameters is guaranteed.

Description

Target pose image measuring method based on a wireless channel
Technical field
The present invention relates to image processing and pose measurement technology, and in particular to a target pose image measuring method based on a wireless-channel data transmission system.
Background art
Pose image measurement technology has the advantage of not contacting the measured object and therefore has great application value in fields such as scientific research, defence and space development. In some pose measurement scenarios the observation station must be mounted on a flight vehicle (such as an aircraft, airship or rocket) to measure the pose of a flying target in relative motion (such as a satellite or rocket) relative to the vehicle. The observation station mounted on the vehicle communicates with a ground receiving station over a wireless channel.
The accuracy of image measurement depends heavily on image resolution and frame rate; high measurement accuracy is obtained only when both are sufficiently high. In fields such as spaceflight and defence, when the measuring system transmits data over a wireless channel, the low channel capacity forces the image resolution and frame rate of the measuring system down, so each step of the pose image measurement process must be carefully designed and optimized in order to reach high pose measurement accuracy.
The target pose remote image recording system developed by the aerospace measurement and control laboratory of Beijing Institute of Technology uses data compression to transmit images wirelessly over a low-capacity channel, and the images can be used for qualitative observation of the target pose. However, limited by the wireless channel capacity, the resolution and bit rate of the transmitted images are low and the target pose cannot be measured quantitatively.
The National University of Defense Technology has studied vision-based pose measurement of space targets; the paper "Research on vision-based position and attitude measurement of space targets" proposes a method that measures the target pose accurately under relatively high image resolution, but it does not consider the fact that high-resolution images cannot be transmitted over a low-capacity channel.
The capacity of a typical wireless channel does not exceed 2 Mbps, and none of the above methods can achieve high-precision image measurement of the target pose under such a low channel capacity. A target pose image measuring method therefore needs to be designed for these low-capacity data transmission conditions.
Summary of the invention
In order to overcome the above shortcomings of the prior art, the technical problem to be solved by the present invention is to propose a target pose image measuring method for the low-capacity transmission conditions of a wireless channel. The method achieves high-precision image measurement of the target pose and is applicable to all types of cooperative targets with prior information.
The technical solution adopted by the present invention to solve this technical problem is as follows:
A target pose image measuring method based on a wireless channel, in which the pose measurement is decomposed into two links, feature extraction and pose parameter calculation, and the transmission bit rate of the image feature information data obtained in the feature extraction step satisfies the low-capacity limit of the wireless channel.
The method of the present invention comprises: providing a transmitting end and a receiving end;
The transmitting end comprises a camera and an image processing unit; the camera captures high-resolution images of the flying target, with an image resolution higher than 512 × 512 and a frame rate higher than 50 fps;
The image processing unit extracts at least 3 feature points from the image captured by the camera, determines the coordinates of the feature points in the image, establishes the matching relationship between these coordinates and the prior features of the target, and generates image feature information data.
In the step of extracting features from the image captured by the camera, the extracted feature points may be image features such as corners or edges of the image.
In this step, generating the image feature information data involves links such as image pre-processing, target detection, feature detection, feature matching and output of the image feature information data; each link can be implemented with a mature, known algorithm suited to the measurement scene.
Because the target to be measured in the method of the present invention is a cooperative target, prior feature information is available; the image processing algorithms can adopt known feature matching algorithms appropriate to the particular target features, and the generated image feature information data suffice to calculate the target pose parameters.
The generated image feature information data are transmitted from the transmitting end to the receiving end over the wireless channel.
The receiving end comprises a pose resolving unit, which uses the received image feature information data to calculate the pose parameters.
The capacity of the wireless channel of the data transmission system is low and cannot carry high-resolution, high-frame-rate images in real time, but the transmission bit rate of the image feature information data satisfies the capacity limit of the wireless channel.
The present invention transmits image feature information data instead of a whole-image data stream, thereby meeting the requirement of a low-capacity wireless channel; and the image measurement of the pose is divided into two steps, feature extraction and pose parameter calculation. Feature extraction is performed on the high-resolution, high-frame-rate images, which guarantees the precision of the image feature information data and hence the measurement accuracy of the pose parameters.
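As an illustration of the data stream just described, the following minimal Python sketch (not taken from the patent; the names FeaturePoint and pack_frame are illustrative assumptions) shows what a per-frame image feature information packet could look like when each feature point carries 8 bytes of coordinates and a 2-byte number, giving the 40 bytes per frame used in the bit-rate calculation later in the description.

```python
import struct
from dataclasses import dataclass
from typing import List

@dataclass
class FeaturePoint:
    point_id: int   # number matching a point of the prior target model (e.g. P1..P4)
    u: float        # sub-pixel column coordinate in the image
    v: float        # sub-pixel row coordinate in the image

def pack_frame(points: List[FeaturePoint]) -> bytes:
    """Serialize one frame: 2 bytes of numbering + 2 x 4 bytes of coordinates per point."""
    return b"".join(struct.pack("<Hff", p.point_id, p.u, p.v) for p in points)

frame = [FeaturePoint(i, 100.0 + i, 200.0 + i) for i in range(1, 5)]
print(len(pack_frame(frame)), "bytes per frame")   # 4 points x 10 bytes = 40 bytes
```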
Other features, characteristics and advantages of the present invention will become more apparent from the detailed description of embodiments given below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a system block diagram of the wireless-channel-based target pose image measuring method of the present invention.
Fig. 2 is the measurement flow chart of the method.
Fig. 3 is an external view of the flying target.
Fig. 4 is a schematic diagram of the feature points of the flying target.
Fig. 5 is a schematic diagram of the measurement setup.
Fig. 6 is the central projection model used by the method.
Fig. 7 is the extraction flow chart of the image feature information data.
Fig. 8 is a schematic diagram of the gradient directions in the flying-target image.
Fig. 9 is a schematic diagram of the pixel coordinate system and the image coordinate system.
Embodiment
The present invention is described in detail below with reference to the accompanying drawings and a typical embodiment.
As shown in Fig. 1, the system of the wireless-channel-based target pose image measuring method of the present invention consists of a transmitting end and a receiving end. The transmitting end consists of a camera, an image processing unit, a modulator and power amplifier, and a transmitter; the receiving end consists of a receiver, a detector and demodulator, and a pose resolving unit. The transmitting end performs image acquisition, image feature extraction, signal modulation and amplification, and signal transmission; the receiving end performs signal reception, signal demodulation, pose calculation and parameter display.
As shown in Fig. 2, the measurement flow of the present invention is as follows: a high-resolution camera captures images of the target, with an image resolution higher than 512 × 512 and a frame rate higher than 50 fps; the captured images are passed to the image processing unit, which extracts the features and sends the image feature information data, after modulation and power amplification, through the transmitter. The data are delivered to the receiving end over a low-bandwidth wireless channel whose capacity does not exceed 2 Mbps. After the receiver at the receiving end receives the image feature information data, they are passed through the detector and demodulator to the pose resolving unit, which calculates the pose parameters.
The core of the present invention is to split the pose measurement of the target into two parts, image feature extraction and pose parameter solving, performed respectively at the transmitting end and the receiving end of the system, so that the system as a whole completes the measurement of the target pose parameters.
The target pose measuring method of the present invention is illustrated below taking a conical target with marking patterns as the target to be measured.
Fig. 3 is an external view of the flying target to be measured and Fig. 4 is a schematic diagram of its feature points; the four vertices of the trapezoid-like regions shown in Fig. 4 are used as the feature points of the target.
(1) Pose parameters and coordinate system definitions
Fig. 5 is a sketch of the pose measurement system of the present invention and Fig. 6 is the central projection model of the measuring method. Five coordinate systems are defined for the pose measurement of the flying target to be measured:
(1) the target coordinate system O-XYZ
(2) the measurement coordinate system O_w-X_wY_wZ_w
(3) the camera coordinate system O_c-X_cY_cZ_c
(4) the image coordinate system o-xy
(5) the pixel coordinate system o'-uv
Let the coordinates of a feature point P_i in these five coordinate systems be, respectively, W_i = (X_i, Y_i, Z_i)^T, W_i^w = (X_i^w, Y_i^w, Z_i^w)^T, W_i^c = (X_i^c, Y_i^c, Z_i^c)^T, w_i = (x_i, y_i)^T and w_i' = (u_i, v_i)^T (in this patent i = 1, 2, 3, 4 by default).
W_i^w and W_i are related by:
$$W_i^w = R W_i + T \qquad (1)$$
T is the translation vector, a three-dimensional vector T = [T_x, T_y, T_z]^T that describes the relative position of the two coordinate systems, i.e. the coordinates of the origin of the target coordinate system expressed in the measurement coordinate system.
R is the rotation matrix, a combination of trigonometric functions of the three angles (α, β, γ): rotation by α about the X axis, by β about the Y axis and by γ about the Z axis. Rotating the target coordinate system about its three axes in turn brings its axes into alignment with the corresponding axes of the measurement coordinate system, so R describes the attitude of the target coordinate system relative to the measurement coordinate system. R is related to (α, β, γ) by:
$$R = R_Z R_Y R_X =
\begin{bmatrix} \cos\gamma & \sin\gamma & 0 \\ -\sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \cos\beta & 0 & -\sin\beta \\ 0 & 1 & 0 \\ \sin\beta & 0 & \cos\beta \end{bmatrix}
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & \sin\alpha \\ 0 & -\sin\alpha & \cos\alpha \end{bmatrix}
=
\begin{bmatrix}
\cos\gamma\cos\beta & \sin\gamma\cos\alpha + \cos\gamma\sin\beta\sin\alpha & \sin\gamma\sin\alpha - \cos\gamma\sin\beta\cos\alpha \\
-\sin\gamma\cos\beta & \cos\gamma\cos\alpha - \sin\gamma\sin\beta\sin\alpha & \cos\gamma\sin\alpha + \sin\gamma\sin\beta\cos\alpha \\
\sin\beta & -\cos\beta\sin\alpha & \cos\beta\cos\alpha
\end{bmatrix} \qquad (2)$$
Pose measurement amounts to solving for the translation vector T and the attitude angles (α, β, γ).
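The following Python sketch (an illustrative implementation, not code from the patent) builds R from (α, β, γ) with the same axis convention as formula (2); multiplying out Rz @ Ry @ Rx reproduces the closed-form matrix above.

```python
import numpy as np

def rotation_matrix(alpha: float, beta: float, gamma: float) -> np.ndarray:
    """R = Rz(gamma) @ Ry(beta) @ Rx(alpha), with the axis conventions of formula (2)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, sa], [0, -sa, ca]])
    Ry = np.array([[cb, 0, -sb], [0, 1, 0], [sb, 0, cb]])
    Rz = np.array([[cg, sg, 0], [-sg, cg, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx   # multiplying out reproduces the closed-form matrix in (2)
```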
(2) Feature point extraction
Feature point extraction means extracting the coordinates of the target features in the pixel coordinate system of the image captured by the camera. The appearance of the target to be measured is given a marking-scheme design in advance to produce prior image feature information; the marking design scheme differs for different targets. The prior image features may be feature points or feature lines, and their number determines the wireless channel bandwidth that the image feature information data will occupy. Solving the target pose from an image requires prior information for at least 3 feature points. The upper limit on the number of feature points is related to the channel bandwidth allocated to the measuring system, but more feature points mean greater computational complexity and an excessive load on the image processing unit; the optimal number is 3 or 4.
Taking the target shown in Fig. 3 as an example, the feature point extraction process is described below. The target features are the four vertices P1, P2, P3, P4 of each trapezoid-like region. The high-precision extraction method of the present invention for these feature points follows the flow of Fig. 7; the concrete steps are as follows:
(1) Detect the whole-pixel edges of the trapezoid-like regions. The specific embodiment of the present invention uses the Canny operator to detect the edges, with the following steps:
Step 1: smooth the image with a Gaussian filter. The Gaussian smoothing function is:
$$H(x, y) = e^{-\frac{x^2 + y^2}{2\sigma^2}}, \qquad G(x, y) = f(x, y) * H(x, y) \qquad (3)$$
Step 2: compute the gradient magnitude and direction from first-order finite differences of the partial derivatives. The first-difference convolution kernels are:
$$H_1 = \begin{bmatrix} -1 & -1 \\ 1 & 1 \end{bmatrix}, \qquad H_2 = \begin{bmatrix} 1 & -1 \\ 1 & -1 \end{bmatrix} \qquad (4)$$
$$M[i, j] = \sqrt{P[i, j]^2 + Q[i, j]^2}, \qquad \theta[i, j] = \arctan\frac{Q[i, j]}{P[i, j]} \qquad (5)$$
where P[i, j] and Q[i, j] are the finite-difference approximations of the partial derivatives obtained with H_1 and H_2.
Step 3: apply non-maximum suppression to the gradient magnitude. Knowing the gradient magnitude alone is not sufficient to locate edges; to determine the edges, points where the gradient is a local maximum must be retained and non-maxima suppressed, using the gradient direction:
ξ[i,j]=Sector(θ[i,j]) (6)
Fig. 8 is a schematic diagram of the gradient directions of the flying-target image; the four sectors are labelled 0 to 3 and correspond to the four possible combinations within a 3 × 3 neighbourhood. For each point, the centre pixel M of the neighbourhood is compared with the two pixels along the gradient line; if the gradient value at M is not larger than the gradient values of both neighbours along the gradient line, set M = 0:
N[i,j]=NMS(M[i,j],ξ[i,j]) (7)
Step 4: detect and link edges with a double-threshold algorithm. A typical way to reduce the number of false edge segments is to apply a threshold to N[i, j] and set all values below it to zero. The present invention uses a double threshold: choose thresholds τ1 and τ2 with τ1 ≈ 2τ2, obtain the two thresholded edge images T1[i, j] and T2[i, j], collect the edges in T1, and use T2 to bridge the gaps.
The Canny algorithm thus yields the whole-pixel edge points P_i(m, n) of the trapezoid-like regions.
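As a compact stand-in for steps 1-4 (an assumption, not the patent's own implementation), OpenCV's Gaussian blur and Canny detector reproduce the smoothing, gradient, non-maximum suppression and double-threshold stages; the file name and threshold values below are placeholders.

```python
import cv2
import numpy as np

img = cv2.imread("target_frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input frame
smoothed = cv2.GaussianBlur(img, (5, 5), sigmaX=1.4)          # step 1: Gaussian smoothing
tau2, tau1 = 50, 100                                          # tau1 ~ 2 * tau2, placeholder values
edges = cv2.Canny(smoothed, tau2, tau1)                       # steps 2-4: gradient, NMS, double threshold
edge_points = np.column_stack(np.nonzero(edges))              # whole-pixel edge points P_i(m, n)
```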
(2) Separate the legs, top base and bottom base of the trapezoid-like edges. Because the black trapezoid-like regions are adjacent to the white ones, the two sides of a leg are black and white, whereas the two sides of a top or bottom base are grey (background) on one side and white or black (marking) on the other. Using this property, the leg edge points are separated from the other edge points. The legs are straight lines and are fitted by least squares. Removing the leg points from the edge points leaves the top- and bottom-base edge points. The top and bottom bases are circular arcs, which project into the image as elliptical arcs (or circular arcs); elliptical curves are fitted through the top- and bottom-base points respectively.
(3) Extract the sub-pixel edges of the trapezoid-like regions. After the whole-pixel edge points P_i(m, n) have been obtained with the Canny operator, the gradient direction of the known whole-pixel edge point is used to approximate the unknown gradient direction of the sub-pixel edge point, and interpolation is performed along the gradient direction at the whole-pixel edge point to obtain an interpolation function φ(x, y). Since the grey-level derivative is largest at the image edge, the coordinates of the maximum of φ(x, y) are the coordinates of the sub-pixel edge point; finding the maximum of φ(x, y) therefore gives the sub-pixel edge point coordinates P'_i(m', n').
Let R_0 be the modulus of the grey-level gradient at the edge point P_i(m, n), and R_{-1} and R_1 the gradient magnitudes of the two pixels P_{i-1} and P_{i+1} adjacent to P_i along the gradient direction; R_0, R_{-1} and R_1 are obtained with the eight-template Sobel operator. The sub-pixel coordinates P'_i(m', n') of the edge point P_i(m, n) are then:
$$m' = m + \frac{R_{-1} - R_1}{R_{-1} - 2R_0 + R_1} \cdot \frac{W}{2}\cos\theta, \qquad
n' = n + \frac{R_{-1} - R_1}{R_{-1} - 2R_0 + R_1} \cdot \frac{W}{2}\sin\theta \qquad (8)$$
where W is the distance from the neighbouring pixel to the edge point, W = 1 or √2, and θ is the angle between the gradient direction and the vertical axis of the image.
The above procedure gives the sub-pixel coordinates P'_i(m', n') of every edge point on the legs, the top base and the bottom base.
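A minimal sketch of formula (8), assuming the gradient magnitudes R_{-1}, R_0, R_1 have already been sampled along the gradient direction (e.g. with the Sobel operator mentioned above); the function name is illustrative.

```python
import numpy as np

def subpixel_edge(m, n, r_minus, r0, r_plus, theta, w):
    """Refine a whole-pixel edge point (m, n) with formula (8).

    r_minus, r0, r_plus: gradient magnitudes at the two neighbours and at the point,
    sampled along the gradient direction; theta: angle between the gradient and the
    vertical image axis; w: distance to the neighbouring pixel (1 or sqrt(2)).
    """
    denom = r_minus - 2.0 * r0 + r_plus
    if abs(denom) < 1e-12:              # flat gradient profile: keep the whole-pixel point
        return float(m), float(n)
    offset = (r_minus - r_plus) / denom * (w / 2.0)
    return m + offset * np.cos(theta), n + offset * np.sin(theta)
```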
(4) Solve the sub-pixel coordinates of the feature points
Because the legs of the trapezoid-like regions are straight segments along the generatrix of the cone, each leg can be fitted by least squares to the line model y = kx + b. Let P_{L1}(m_{L1}, n_{L1}) and P_{L2}(m_{L2}, n_{L2}) be whole-pixel points on the two legs and P'_{L1}(m'_{L1}, n'_{L1}), P'_{L2}(m'_{L2}, n'_{L2}) the corresponding sub-pixel edge points; the legs L1 and L2 are then:
$$L_1:\; n'_{L1,i} = k_1 m'_{L1,i} + b_1, \qquad L_2:\; n'_{L2,j} = k_2 m'_{L2,j} + b_2 \qquad (9)$$
where k_1, b_1, k_2, b_2 are given by:
$$k_1 = \frac{\sum (m'_{L1} - \overline{m'_{L1}})(n'_{L1} - \overline{n'_{L1}})}{\sum (m'_{L1} - \overline{m'_{L1}})^2}, \quad b_1 = \overline{n'_{L1}} - k_1\,\overline{m'_{L1}}, \qquad
k_2 = \frac{\sum (m'_{L2} - \overline{m'_{L2}})(n'_{L2} - \overline{n'_{L2}})}{\sum (m'_{L2} - \overline{m'_{L2}})^2}, \quad b_2 = \overline{n'_{L2}} - k_2\,\overline{m'_{L2}} \qquad (10)$$
where the overbar denotes the mean over the corresponding edge-point coordinates.
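The leg fit of formulas (9)-(10) is ordinary least squares; the sketch below implements the closed form of (10) directly (np.polyfit(m_prime, n_prime, 1) would give the same k and b).

```python
import numpy as np

def fit_leg(m_prime: np.ndarray, n_prime: np.ndarray):
    """Least-squares fit n' = k * m' + b to the sub-pixel points of one leg (formula (10))."""
    m_bar, n_bar = m_prime.mean(), n_prime.mean()
    k = np.sum((m_prime - m_bar) * (n_prime - n_bar)) / np.sum((m_prime - m_bar) ** 2)
    b = n_bar - k * m_bar
    return k, b
```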
Because the top and bottom bases of the trapezoid-like regions are circular arcs that project into the image as elliptical arcs, they are fitted to the conic model ax² + bxy + cy² + dx + ey + f = 0. Let P_{c1}(m_{c1}, n_{c1}) and P_{c2}(m_{c2}, n_{c2}) be whole-pixel points on the top and bottom bases, and P'_{c1}(m'_{c1}, n'_{c1}), P'_{c2}(m'_{c2}, n'_{c2}) the corresponding sub-pixel points; the fitted top and bottom bases are denoted L3 and L4 respectively.
When the cone is far from the camera there are few edge points on the top and bottom bases, so the fitted conic is poor and deviates considerably from the true curve. To deal with this, the present invention uses Lagrange interpolation to fit the curves through the top- and bottom-base points.
The intersection P_{L1L3}(m, n) of L1 and L3 is found as follows:
(a) Find the two sub-pixel edge points P'_1(m'_1, n'_1) and P'_2(m'_2, n'_2) on L3 at distances d ≈ 1 and d ≈ 3 from L1.
(b) On the top base of the trapezoid-like region on the other side of L1, take the sub-pixel edge point P'_3(m'_3, n'_3) at distance d ≈ 1 from L1.
(c) Perform Lagrange interpolation through P'_1(m'_1, n'_1), P'_2(m'_2, n'_2), P'_3(m'_3, n'_3) using formula (11):
$$c_1(x) = \frac{(x - m'_2)(x - m'_3)}{(m'_1 - m'_2)(m'_1 - m'_3)}, \qquad
c_2(x) = \frac{(x - m'_1)(x - m'_3)}{(m'_2 - m'_1)(m'_2 - m'_3)}, \qquad
c_3(x) = \frac{(x - m'_1)(x - m'_2)}{(m'_3 - m'_1)(m'_3 - m'_2)} \qquad (11)$$
(d) Using formula (12), obtain the intersection of L1 and L3:
$$\begin{cases} y = n'_1 c_1(x) + n'_2 c_2(x) + n'_3 c_3(x) \\ n'_{L1,i} = k_1 m'_{L1,i} + b_1 \end{cases} \qquad (12)$$
The other three intersections of the legs with the top and bottom bases, P_{L2L3}(m, n), P_{L1L4}(m, n) and P_{L2L4}(m, n), are obtained in the same way.
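A sketch of formulas (11)-(12): the Lagrange interpolant through three points is a quadratic, so the corner can be found by subtracting the leg line and solving the resulting quadratic; using np.polyfit/np.roots for this, and selecting the root nearest P'_1, are implementation choices not spelled out in the text.

```python
import numpy as np

def corner_from_lagrange(p1, p2, p3, k, b):
    """Intersect the Lagrange curve through p1, p2, p3 (formula (11)) with the leg n = k*m + b."""
    (m1, n1), (m2, n2), (m3, n3) = p1, p2, p3
    quad = np.polyfit([m1, m2, m3], [n1, n2, n3], 2)   # exact quadratic through the 3 points
    diff = quad - np.array([0.0, k, b])                # curve minus line, as polynomial coefficients
    roots = np.roots(diff)
    roots = roots[np.isreal(roots)].real
    m = roots[np.argmin(np.abs(roots - m1))]           # take the root nearest the base point P'_1
    return m, k * m + b
```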
According to the specific embodiment of the present invention, the feature point extraction described above, the establishment of the matching relationship between the image feature points and the prior target features, and the generation of the image feature information data are all performed by an image processing unit built around a DSP chip.
(3) Transmission bit rate calculation
After the image feature information data have been generated, they are passed over the wireless channel to the pose resolving unit, which is loaded with pose measurement software.
Feature point extraction yields the image feature information data of the flying target in the images captured by the high-resolution camera. In the present invention, the image feature information data are transmitted over the wireless channel instead of the images themselves, thereby satisfying the capacity limit of the channel. Taking the target of Fig. 3, with its 4 feature points, as an example, the transmission bit rate is calculated as follows.
The image feature information data of each frame comprise the coordinates and numbers of the 4 feature points; each feature-point coordinate occupies 8 bytes and each feature-point number occupies 2 bytes, so the image feature information data of a single frame amount to 40 bytes.
Table 1 gives the output bit rate for 4 feature points at different image frame rates:
Table 1  Bit rate calculation (1)
Image frame rate    Output bit rate
50 fps              15.63 Kbps
100 fps             31.25 Kbps
200 fps             62.50 Kbps
As Table 1 shows, even at the higher frame rate of 200 fps the bandwidth required by the wireless channel to transmit the image feature information data is only 62.5 Kbps, far below the 2 Mbps capacity of a typical wireless channel, leaving considerable bandwidth free.
Table 2 gives the output bit rate at an image frame rate of 100 fps for different numbers of feature points:
Table 2  Bit rate calculation (2)
Number of feature points    Output bit rate
4                           31.25 Kbps
6                           46.88 Kbps
8                           62.50 Kbps
As Table 2 shows, even with 8 feature points the bandwidth required to transmit the image feature information data is 62.5 Kbps, far below the 2 Mbps capacity of a typical wireless channel.
(4) Pose parameter calculation
Pose parameter calculation is performed by the pose resolving unit and can be completed with known pose measurement methods; in practice the target pose parameters can be solved by a computer loaded with pose measurement software. Since the target to be measured in the method of the present invention is a cooperative target, prior image feature information is available: the coordinates W_i = (X_i, Y_i, Z_i)^T of the feature points in the target coordinate system O-XYZ are prior information and therefore known, while the coordinates w'_i = (u_i, v_i)^T of the feature points in the pixel coordinate system are obtained by the feature point extraction step above. Pose parameter calculation uses these known quantities, together with the parameters of the high-resolution camera, to solve for the pose parameters of the target.
Fig. 5 is a sketch of the pose measurement system of the present invention. The camera coordinate system is O_c-X_cY_cZ_c, with the camera principal point as origin, the camera horizontal and vertical directions as the x and y axes, and the camera optical axis as the z axis. The measurement coordinate system is O_w-X_wY_wZ_w.
(1) Solve the image-coordinate-system coordinates of the feature points
Let dx and dy be the distances between adjacent pixels (the physical pixel size) along the x and y directions of the pixel coordinate system, and let (u_0, v_0) be the pixel coordinates of the image centre. As shown in Fig. 9, the transformation between the pixel coordinate system and the image coordinate system converts the pixel coordinates w'_i = (u_i, v_i)^T of any point in the image into its image coordinates w_i = (x_i, y_i)^T:
$$x_i = (u_i - u_0)\,dx, \qquad y_i = (v_i - v_0)\,dy \qquad (13)$$
(2) Solve the camera-coordinate-system coordinates of the feature points
With f the focal length of the camera, the image coordinates (x_i, y_i)^T of the 4 feature points and their camera coordinates W_i^c = (X_i^c, Y_i^c, Z_i^c)^T satisfy:
$$\frac{x_i}{f} = \frac{X_i^c}{Z_i^c}, \qquad \frac{y_i}{f} = \frac{Y_i^c}{Z_i^c} \qquad (14)$$
At the same time, the three-dimensional distance between any two of the 4 points is known (prior information), i.e.:
$$d(P_i, P_j) = \sqrt{(X_i^c - X_j^c)^2 + (Y_i^c - Y_j^c)^2 + (Z_i^c - Z_j^c)^2} = d_{ij} \qquad (15)$$
Formula (14) gives 8 equations and formula (15) gives 6 equations, which together form an overdetermined system from which W_i^c = (X_i^c, Y_i^c, Z_i^c)^T can be solved by least squares.
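One way to solve this overdetermined system (an assumption; the text only states that least squares is used) is to parameterize the unknowns by the four depths Z_i^c, recover X_i^c and Y_i^c from (14), and minimize the distance residuals of (15) with a nonlinear least-squares solver:

```python
import numpy as np
from scipy.optimize import least_squares

def solve_camera_coords(xy, f, dist, z_init=10.0):
    """xy: (4, 2) image coordinates; f: focal length; dist: {(i, j): known 3-D distance d_ij}."""
    def points(z):
        # From (14): X^c = x * Z^c / f, Y^c = y * Z^c / f
        return np.column_stack((xy[:, 0] * z / f, xy[:, 1] * z / f, z))
    def residuals(z):
        P = points(z)
        return [np.linalg.norm(P[i] - P[j]) - d for (i, j), d in dist.items()]
    sol = least_squares(residuals, np.full(len(xy), z_init))   # 6 residuals, 4 unknown depths
    return points(sol.x)
```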
(3) Solve the world-coordinate-system coordinates of the feature points
Let the relative transformation between the camera coordinate system and the world coordinate system be represented by the rotation matrix Rc and the translation vector Tc; Rc and Tc are known from camera calibration during installation before the measurement. Then:
$$W_i^w = Rc\,W_i^c + Tc \qquad (16)$$
(4) Solve the pose parameters of the target
The pose parameters of the target are represented by the rotation matrix R and the translation vector T; the world coordinates and the target coordinates satisfy:
$$W_i^w = R W_i + T \qquad (17)$$
Let the centroids of the 4 feature points in the world coordinate system and in the target coordinate system be, respectively:
$$\bar{P} = \frac{1}{4}\sum_{i=1}^{4} W_i^w, \qquad \bar{Q} = \frac{1}{4}\sum_{i=1}^{4} W_i$$
The new coordinates of the 4 feature points in the coordinate systems that take the centroids as origin are:
$$W_i^{w\prime} = W_i^w - \bar{P}, \qquad W_i' = W_i - \bar{Q}$$
where $W_i^{w\prime} = (x_i^{w\prime}, y_i^{w\prime}, z_i^{w\prime})^T$ and $W_i' = (x_i', y_i', z_i')^T$.
From (17) it follows that:
$$T = \bar{P} - R\bar{Q} \qquad (18)$$
Let
$$S_{xx} = \sum_{i=1}^{4} x_i^{w\prime} x_i', \qquad S_{xy} = \sum_{i=1}^{4} x_i^{w\prime} y_i', \qquad S_{xz} = \sum_{i=1}^{4} x_i^{w\prime} z_i', \qquad \ldots$$
with the remaining S terms defined analogously, and form the matrix
$$N = \begin{bmatrix}
S_{xx}+S_{yy}+S_{zz} & S_{yz}-S_{zy} & S_{zx}-S_{xz} & S_{xy}-S_{yx} \\
S_{yz}-S_{zy} & S_{xx}-S_{yy}-S_{zz} & S_{xy}+S_{yx} & S_{zx}+S_{xz} \\
S_{zx}-S_{xz} & S_{xy}+S_{yx} & -S_{xx}+S_{yy}-S_{zz} & S_{yz}+S_{zy} \\
S_{xy}-S_{yx} & S_{zx}+S_{xz} & S_{yz}+S_{zy} & -S_{xx}-S_{yy}+S_{zz}
\end{bmatrix}$$
The quaternion r corresponding to the rotation matrix R is the eigenvector of N associated with its largest eigenvalue. Let $r = (r_0, r_1, r_2, r_3)^T$; the rotation matrix it represents is:
$$R = \begin{bmatrix}
r_0^2+r_1^2-r_2^2-r_3^2 & 2(r_1 r_2 - r_0 r_3) & 2(r_1 r_3 + r_0 r_2) \\
2(r_1 r_2 + r_0 r_3) & r_0^2-r_1^2+r_2^2-r_3^2 & 2(r_2 r_3 - r_0 r_1) \\
2(r_1 r_3 - r_0 r_2) & 2(r_2 r_3 + r_0 r_1) & r_0^2-r_1^2-r_2^2+r_3^2
\end{bmatrix}$$
R is thus obtained; (α, β, γ) are then solved from formula (2) and T = [T_x, T_y, T_z]^T from formula (18), giving the pose parameters of the target.
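A minimal sketch of the closed-form solve described above (Horn's quaternion method). The correlation terms are accumulated so that the recovered R satisfies W_i^w ≈ R W_i + T as required by (17); apart from that sign/ordering convention, the matrices N and R(r) follow the formulas given in the text, and the function name is illustrative.

```python
import numpy as np

def solve_pose(Ww: np.ndarray, W: np.ndarray):
    """Ww: (4, 3) feature points in the world frame; W: (4, 3) same points in the target frame."""
    P_bar, Q_bar = Ww.mean(axis=0), W.mean(axis=0)          # centroids
    A, B = Ww - P_bar, W - Q_bar                            # centred coordinates
    # S[a, b] accumulates (target-centred)_a * (world-centred)_b, so that the
    # dominant-eigenvector quaternion maps target coordinates onto world coordinates.
    S = B.T @ A
    Sxx, Sxy, Sxz = S[0]; Syx, Syy, Syz = S[1]; Szx, Szy, Szz = S[2]
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz]])
    eigvals, eigvecs = np.linalg.eigh(N)
    r0, r1, r2, r3 = eigvecs[:, np.argmax(eigvals)]         # quaternion of the largest eigenvalue
    R = np.array([
        [r0*r0 + r1*r1 - r2*r2 - r3*r3, 2*(r1*r2 - r0*r3),             2*(r1*r3 + r0*r2)],
        [2*(r1*r2 + r0*r3),             r0*r0 - r1*r1 + r2*r2 - r3*r3, 2*(r2*r3 - r0*r1)],
        [2*(r1*r3 - r0*r2),             2*(r2*r3 + r0*r1),             r0*r0 - r1*r1 - r2*r2 + r3*r3]])
    T = P_bar - R @ Q_bar                                    # formula (18)
    return R, T
```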
It should be understood that the above description is one particular embodiment of the present invention; the invention is not limited to the specific structures illustrated or described above, and the claims cover all variations within the spirit and scope of the invention.

Claims (3)

1. A target pose image measuring method based on a wireless channel, characterized in that:
a transmitting end and a receiving end are provided;
the transmitting end comprises a camera and an image processing unit;
the camera captures high-resolution images of a cooperative flying target;
the image processing unit extracts at least 3 feature points from the high-resolution image captured by the camera, determines the coordinates of the feature points in the image, establishes the matching relationship between the coordinates and the prior features of the cooperative flying target, and generates image feature information data;
the image feature information data of each image frame comprise the coordinates and numbers of the feature points;
the generated image feature information data are transmitted from the transmitting end to the receiving end over a wireless channel;
the receiving end comprises a pose resolving unit;
the pose resolving unit uses the received image feature information data to calculate the pose parameters.
2. The target pose image measuring method based on a wireless channel according to claim 1, characterized in that the image resolution of the camera is higher than 512 × 512 and the frame rate is higher than 50 fps.
3. The target pose image measuring method based on a wireless channel according to claim 1 or 2, characterized in that the feature point extraction, the establishment of the matching relationship between the image feature points and the prior features of the cooperative target, and the generation of the image feature information data are all performed by an image processing unit built around a DSP chip.
CN201310464818.4A 2013-10-08 2013-10-08 A kind of object pose image measuring method based on wireless channel Active CN103528568B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310464818.4A CN103528568B (en) 2013-10-08 2013-10-08 A kind of object pose image measuring method based on wireless channel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310464818.4A CN103528568B (en) 2013-10-08 2013-10-08 A kind of object pose image measuring method based on wireless channel

Publications (2)

Publication Number Publication Date
CN103528568A true CN103528568A (en) 2014-01-22
CN103528568B CN103528568B (en) 2016-08-17

Family

ID=49930771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310464818.4A Active CN103528568B (en) 2013-10-08 2013-10-08 A kind of object pose image measuring method based on wireless channel

Country Status (1)

Country Link
CN (1) CN103528568B (en)


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104933432A (en) * 2014-03-18 2015-09-23 北京思而得科技有限公司 Processing method for finger pulp crease and finger vein images
CN108090931A (en) * 2017-12-13 2018-05-29 中国科学院光电技术研究所 Anti-blocking and anti-interference marker identification and pose measurement method based on combination of circle and cross features
CN108180917B (en) * 2017-12-31 2021-05-14 芜湖哈特机器人产业技术研究院有限公司 Top map construction method based on pose graph optimization
CN108225327B (en) * 2017-12-31 2021-05-14 芜湖哈特机器人产业技术研究院有限公司 Construction and positioning method of top mark map
CN108871314B (en) * 2018-07-18 2021-08-17 江苏实景信息科技有限公司 Positioning and attitude determining method and device
CN108871314A (en) * 2018-07-18 2018-11-23 江苏实景信息科技有限公司 A kind of positioning and orientation method and device
CN109003305A (en) * 2018-07-18 2018-12-14 江苏实景信息科技有限公司 A kind of positioning and orientation method and device
CN109003305B (en) * 2018-07-18 2021-07-20 江苏实景信息科技有限公司 Positioning and attitude determining method and device
CN112789672A (en) * 2018-09-10 2021-05-11 感知机器人有限公司 Control and navigation system, attitude optimization, mapping and positioning technology
US11827351B2 (en) 2018-09-10 2023-11-28 Perceptual Robotics Limited Control and navigation systems
CN112789672B (en) * 2018-09-10 2023-12-12 感知机器人有限公司 Control and navigation system, gesture optimization, mapping and positioning techniques
US11886189B2 (en) 2018-09-10 2024-01-30 Perceptual Robotics Limited Control and navigation systems, pose optimization, mapping, and localization techniques
CN114509089A (en) * 2021-12-31 2022-05-17 成都弓网科技有限责任公司 Non-contact rail transit train speed direction mileage detection method and system
CN114268742A (en) * 2022-03-01 2022-04-01 北京瞭望神州科技有限公司 Sky eye chip processing apparatus
CN117710449A (en) * 2024-02-05 2024-03-15 中国空气动力研究与发展中心高速空气动力研究所 NUMA-based real-time pose video measurement assembly line model optimization method
CN117710449B (en) * 2024-02-05 2024-04-16 中国空气动力研究与发展中心高速空气动力研究所 NUMA-based real-time pose video measurement assembly line model optimization method

Also Published As

Publication number Publication date
CN103528568B (en) 2016-08-17

Similar Documents

Publication Publication Date Title
CN103528568A (en) Wireless channel based target pose image measuring method
CN107392963B (en) Eagle eye-imitated moving target positioning method for soft autonomous aerial refueling
CN110926474B (en) Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
CN110473260B (en) Wave video measuring device and method
US8259993B2 (en) Building shape change detecting method, and building shape change detecting system
CN107830846A (en) One kind utilizes unmanned plane and convolutional neural networks measurement communication tower aerial angle method
CN104764440A (en) Rolling object monocular pose measurement method based on color image
US11948344B2 (en) Method, system, medium, equipment and terminal for inland vessel identification and depth estimation for smart maritime
CN103793907B (en) Water body information extracting method and device
CN104463778B (en) A kind of Panoramagram generation method
CN106228579B (en) A kind of video image dynamic water table information extracting method based on geographical space-time scene
CN103925927B (en) A kind of traffic mark localization method based on Vehicular video
CN102855628B (en) Automatic matching method for multisource multi-temporal high-resolution satellite remote sensing image
CN103500449B (en) Visible remote sensing image cloud detection method of optic on a kind of star
CN105427284A (en) Fixed target marking method based on airborne android platform
CN105740856A (en) Method for reading readings of pointer instrument based on machine vision
CN104778695A (en) Water sky line detection method based on gradient saliency
CN104537646A (en) Multi-angle automatic MTF estimation method of remote sensing image
CN110569861A (en) Image matching positioning method based on point feature and contour feature fusion
CN103852060A (en) Visible light image distance measuring method based on monocular vision
CN105243653A (en) Fast mosaic technology of remote sensing image of unmanned aerial vehicle on the basis of dynamic matching
CN111709994B (en) Autonomous unmanned aerial vehicle visual detection and guidance system and method
Wang et al. Building heights estimation using ZY3 data—A case study of Shanghai, China
CN118260606A (en) Automatic registration method for ocean remote sensing geographic coordinates of low-orbit satellite
CN112381942B (en) Building three-dimensional temperature model building method based on unmanned aerial vehicle infrared image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant