CN102410831A - Design and positioning method of multi-stripe scan imaging model - Google Patents

Design and positioning method of multi-stripe scan imaging model

Info

Publication number
CN102410831A
CN102410831A, CN2010102919402A, CN201010291940A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2010102919402A
Other languages
Chinese (zh)
Other versions
CN102410831B (en)
Inventor
闫利
邓非
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201010291940.2A priority Critical patent/CN102410831B/en
Publication of CN102410831A publication Critical patent/CN102410831A/en
Application granted granted Critical
Publication of CN102410831B publication Critical patent/CN102410831B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Processing (AREA)

Abstract

Aiming at the defects of three-linear-array stereoscopic imaging, the invention provides a multi-stripe scan stereoscopic imaging method to reduce the excessive demands that high-resolution linear-array imaging places on satellite stability, and further proposes a corresponding high-precision geometric positioning theory and method.

Description

Design and positioning method of a multi-stripe scan imaging model
Technical field
The present invention relates to an imaging model and its geometric positioning method, applicable to the field of surveying and mapping with space remote sensing satellites.
Background technology
The development of remote sensing satellite imaging technology and its stereoscopic imaging capability has made satellite surveying and mapping one of the main means of acquiring and continuously updating geospatial information. Three-linear-array CCD mapping cameras have been developed both internationally and domestically, and have proven advantages that single-line-array and dual-line-array CCD mapping cameras cannot match. In particular, since spaceborne large-area-array stereo mapping cameras are not yet feasible, this system has become the mainstream stereoscopic imaging mode of current mapping satellites. However, the geometric positioning accuracy of multi-linear-array stereoscopic imaging depends strongly on the stability of the satellite platform and on the attitude measurement accuracy; at the same time, the exterior orientation elements of the individual scan lines are strongly correlated, and the complexity of three-linear-array stereoscopic imaging also makes it difficult to apply.
Linear-array stereoscopic imaging differs from frame imaging: the latter obeys central projection, so a rigorous sensor model can be built from the collinearity equations; the former obeys parallel projection in the along-track direction and central projection in the cross-track direction, and the exterior orientation elements of each scan line change with time and are strongly correlated. Research on the photogrammetric algorithms for MOMS found that in bundle adjustment of dynamic imagery there is correlation among the exterior orientation elements: the accuracy of single-strip bundle adjustment is poor, and when the strip length is less than 4 single baselines (the baseline formed by the nadir camera and the forward- and backward-looking cameras) no reliable solution can be obtained; navigation data (exposure station coordinates and the attitude record values φ, ω, κ) must be added and a combined adjustment carried out. The strong correlation among exterior orientation elements makes the processing of high-resolution linear-array stereo imagery difficult. Generalized imaging models are independent of the specific sensor and directly adopt mathematical functions, such as polynomials, the direct linear transformation, the affine transformation model, the parallel-projection model and the rational polynomial model (RPC), to describe the geometric relationship between ground points and the corresponding image points; for example, the affine transformation model has been used to process linear-array push-broom satellite imagery. The successful launch of the IKONOS satellite promoted comprehensive research on the rational function model (RPC); RPC is a widely applicable representation of sensor imaging models, suitable for various sensors including the latest aerospace sensors.
Summary of the invention
Based on an in-depth study of the stereoscopic imaging mechanism of high-resolution mapping satellites, and aiming at the defects of three-linear-array stereoscopic imaging, the present invention proposes the design concept of a multi-stripe scanning stereoscopic imaging mode, which lowers the excessive demands that high-resolution linear-array imaging places on satellite stability, and explores the corresponding high-precision geometric positioning theory and method.
The multi-stripe scan imaging model design method of the first scheme of the present invention is characterized in that it comprises:
a step of obtaining multi-stripe images by mounting forward-looking, nadir-looking and backward-looking stripe push-broom CCD cameras on the satellite,
a step of designing a rigorous imaging model for said multi-stripe scan images, and
a step of positioning said multi-stripe scan images.
The multi-stripe scan imaging model design method of the second scheme of the present invention, based on the first scheme above, is characterized in that the step of obtaining multi-stripe images comprises:
a step of extracting, from the three linear-array images taken by each stripe camera, the image of the same scan period of each linear-array sensor at intervals of 3 sampling periods, and splicing them to obtain a complete image.
The multi-stripe scan imaging model design method of the third scheme of the present invention, based on the first scheme above, is characterized in that the step of designing a rigorous imaging model for said multi-stripe scan images comprises:
a step of determining the coordinates of the imaging ray corresponding to each CCD unit in the satellite body frame according to the CCD unit size, the focal length, the number of CCD linear arrays per stripe and the position of the CCD unit within the linear array,
a step of calculating the viewing direction of the imaging ray in the orbital coordinate system according to the determined pitch, roll and yaw angles of the satellite, and
a step of transforming the coordinates of the imaging ray from said orbital coordinate system into the Earth-centered Earth-fixed coordinate system in order to simulate the intersection of the TSS imaging ray with the Earth surface.
The multi-stripe scan imaging model design method of the fourth scheme of the present invention, based on the first scheme above, is characterized in that the step of positioning said multi-stripe scan images comprises:
a step of iteratively intersecting the imaging ray with a DEM to progressively determine the coordinates of the intersection of the imaging ray with the ground in the Earth-centered Earth-fixed coordinate system,
a step of resolving the transformation from the sensor coordinate system to the satellite platform coordinate system according to the imaging parameters, and
a step of establishing the mathematical model of TSS image space resection and, for the strong correlation among the orientation parameters, performing decorrelation processing by the spectrum correction method.
Description of drawings
Fig. 1 illustrates the difference between TLS (three-linear-array) and TSS (three-stripe) stereoscopic imaging.
Fig. 2 illustrates the imaging process of the TSS nadir camera.
Fig. 3 is a schematic diagram of the CCD linear-array simulation of the TSS image.
Fig. 4 illustrates the iterative intersection process between the imaging ray and the DEM.
Fig. 5 illustrates the relationship among the local frame-format coordinate system, the image-file coordinate system and the sensor coordinate system of the TSS image.
Fig. 6 illustrates a ground point being imaged on the three CCD linear arrays of the nadir camera.
Embodiment
The present invention designs a three-stripe sensor. The imaging model of the three-stripe scanner (three-strips scanner, TSS) is shown in Fig. 1: it has forward-looking, nadir-looking and backward-looking stripe push-broom cameras, and each viewing sensor obtains an image stripe by push-broom scanning as the satellite flies forward. Compared with three-linear-array CCD camera imaging, the forward, nadir and backward views of the TSS are no longer single CCD linear arrays but each consist of three CCD linear arrays; in one scan period, the forward, nadir and backward cameras each obtain a three-row image (i.e. a stripe). TSS imaging therefore has the characteristic of "stripe central projection": each stripe within one sampling period corresponds to one set of exterior orientation elements, in different sampling periods the exterior orientation elements change in real time, and there is a definite relationship between the exterior orientation elements of adjacent stripes. Stripe push-broom imaging still involves dynamic changes of the exterior orientation element values during the push-broom process, but the number of recorded sets of exterior orientation elements it requires is clearly reduced, which helps to determine the exterior orientation elements, improve the image positioning accuracy and reduce the demands on satellite platform stability; it is a remarkable improvement over TLS.
As shown in Fig. 2, each stripe is formed by 3 CCD linear-array sensors (A, B, C). In each imaging cycle these three linear-array sensors expose simultaneously, similarly to a small frame camera, and obtain one stripe image; with the forward flight of the satellite, continuous coverage of the ground is achieved. Since 3 CCD linear-array sensors are used, the same ground point A is imaged on all three linear-array sensors, as shown by a1, a2 and a3 in Fig. 2. Because a single output image is required, the images obtained by linear-array sensors A, B and C are sampled at intervals of 3Δt to extract the images of the same scan period, and these are finally spliced into one complete image, as sketched in the code below.
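As a concrete illustration of this splicing step, the following minimal Python/NumPy sketch interleaves every third row of the three line-array images; the row ordering within a scan period and the array shapes are assumptions introduced here, not details fixed by the text.

```python
import numpy as np

def splice_band_image(img_a, img_b, img_c):
    """Interleave rows from the three line-array images of one stripe camera.

    img_a, img_b, img_c: push-broom images from line arrays A, B, C, each of
    shape (n_rows, n_cols) sampled every Delta-t.  Every third row of each
    sensor (interval 3*Delta-t) belongs to a distinct scan period; the three
    rows of one period are stacked to rebuild the full-coverage image.
    The A/B/C ordering inside a period is an assumption of this sketch."""
    n_rows, n_cols = img_a.shape
    periods = n_rows // 3                      # complete scan periods
    out = np.empty((3 * periods, n_cols), img_a.dtype)
    for n in range(periods):
        t = 3 * n                              # row index at interval 3*Delta-t
        out[3 * n + 0] = img_a[t]
        out[3 * n + 1] = img_b[t]
        out[3 * n + 2] = img_c[t]
    return out
```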
In the TSS stereoscopic imaging mode designed by the present invention, the coordinates of the imaging ray corresponding to each CCD unit in the satellite body frame are determined according to the CCD unit size, the focal length, the number of CCD linear arrays per stripe and the position of the CCD unit within the linear array.
The satellite body frame is established with the projection centre as the origin O_1; the line from the central CCD unit of the nadir camera's linear array to the projection centre is the Z_1 axis, with upward as positive; the X_1 axis is parallel to the central CCD linear array of the nadir camera; and the Y_1 axis forms a right-handed Cartesian frame with Z_1 and X_1. As shown in Fig. 3, the forward, nadir and backward cameras are arranged in parallel on the focal plane of the TSS. For each CCD unit of the three viewing cameras, its planar coordinates on the focal plane are computed first, and then, according to the camera focal length, the corresponding imaging ray is represented by its three-dimensional coordinates in the satellite body frame.
Suppose the imaging ray corresponding to the CCD unit at row line, column sample of a stripe CCD array has direction $\vec{u}_1'$ in the satellite body frame; then the planar coordinates of this CCD unit on the focal plane are:
Forward-looking camera:
$$x = ((L-1)/2 - \mathrm{sample}) \cdot \mu_X, \qquad y = (\mathrm{line} - (W-1)/2) \cdot \mu_Y + f \cdot \tan(\theta_f) \qquad (1)$$

Nadir camera:
$$x = ((L-1)/2 - \mathrm{sample}) \cdot \mu_X, \qquad y = (\mathrm{line} - (W-1)/2) \cdot \mu_Y \qquad (2)$$

Backward-looking camera:
$$x = ((L-1)/2 - \mathrm{sample}) \cdot \mu_X, \qquad y = (\mathrm{line} - (W-1)/2) \cdot \mu_Y - f \cdot \tan(\theta_b) \qquad (3)$$
The imaging ray corresponding to this CCD unit is then expressed in the satellite body frame as:
$$\vec{u}_1' = (x,\ y,\ -f)^T \qquad (4)$$
where L is the length of the CCD linear array, W is the width of the CCD array, $\mu_X$ and $\mu_Y$ are the CCD unit sizes in the x and y directions respectively, $\theta_f$ and $\theta_b$ are the forward- and backward-looking view angles, and f is the focal length of the nadir camera.
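The focal-plane coordinates (1)-(3) and the body-frame ray (4) can be evaluated, for example, with the following Python sketch; the function name and argument layout are illustrative only.

```python
import numpy as np

def imaging_ray_body(line, sample, camera, L, W, mu_x, mu_y, f,
                     theta_f=0.0, theta_b=0.0):
    """Focal-plane coordinates (eqs. 1-3) and body-frame imaging ray (eq. 4)
    for the CCD unit at (line, sample) of the forward, nadir or backward
    camera.  Angles in radians; mu_x, mu_y, f in the same length unit."""
    x = ((L - 1) / 2.0 - sample) * mu_x
    y = (line - (W - 1) / 2.0) * mu_y
    if camera == 'forward':
        y += f * np.tan(theta_f)
    elif camera == 'backward':
        y -= f * np.tan(theta_b)
    # nadir camera: no offset along y
    return np.array([x, y, -f])        # u1' = (x, y, -f)^T
```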
The orbital coordinate system of the TSS sensor is defined with the satellite centre of mass as the origin O_2; the Z_2 axis points toward the Earth's centre, the X_2 axis is determined by normalization, and the remaining axis follows the right-hand rule. The origin of the TSS orbital coordinate system coincides with that of the body frame, and the rotation between them is determined by the attitude angles measured by the satellite attitude control system, i.e. the pitch angle pitch(t), the roll angle roll(t) and the yaw angle yaw(t).
Therefore, once the three attitude angles of the satellite are determined, the viewing direction of the imaging ray in the orbital coordinate system can be computed as:
$$\vec{u}_2 = \frac{\vec{u}_2'}{\|\vec{u}_2'\|} \qquad (5)$$
$$\vec{u}_2' = M_p \cdot M_r \cdot M_y \cdot \vec{u}_1' \qquad (6)$$
where $M_p$, $M_r$, $M_y$ are the three rotation matrices formed from the attitude angles at time t.
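A possible NumPy realization of equations (5)-(6) is sketched below; the assignment of pitch, roll and yaw to particular rotation axes is an assumption of the sketch, since the text does not spell it out.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def ray_in_orbit_frame(u1, pitch, roll, yaw):
    """Eqs. (5)-(6): rotate the body-frame ray into the orbital frame and
    normalize.  The axis assignment (pitch about Y, roll about X, yaw about Z)
    is an assumption of this sketch."""
    M_p, M_r, M_y = rot_y(pitch), rot_x(roll), rot_z(yaw)
    u2_prime = M_p @ M_r @ M_y @ np.asarray(u1, dtype=float)
    return u2_prime / np.linalg.norm(u2_prime)
```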
To simulate the intersection of the TSS imaging ray with the Earth surface, the imaging ray must be transformed into the Earth-centered Earth-fixed (ECEF) coordinate system. The ECEF frame adopted in the TSS image simulation is the WGS84 coordinate system, so coordinates in the orbital frame are transformed into the ECEF frame by:
$$\vec{u}_3 = \begin{bmatrix} (X_2)_X & (Y_2)_X & (Z_2)_X \\ (X_2)_Y & (Y_2)_Y & (Z_2)_Y \\ (X_2)_Z & (Y_2)_Z & (Z_2)_Z \end{bmatrix} \cdot \vec{u}_2 \qquad (7)$$
where $\vec{u}_3$ is the line-of-sight vector in the ECEF frame, and $X_2$, $Y_2$, $Z_2$ are the three coordinate axes of the orbital coordinate system.
The present invention positions the multi-stripe scan image. In the ECEF frame, at any moment, the imaging ray corresponding to a given CCD unit of the TSS sensor can be expressed by the satellite position $(X_P, Y_P, Z_P)$ and the ray vector $\vec{u}_3$, i.e.
$$\begin{cases} X = X_P + \mu \cdot (u_3)_X \\ Y = Y_P + \mu \cdot (u_3)_Y \\ Z = Z_P + \mu \cdot (u_3)_Z \end{cases} \qquad (8)$$
where $(X_P, Y_P, Z_P)$ is the position of the satellite at time t, $\mu$ is the projection coefficient, and $\vec{u}_3$ is the ray direction in the ECEF frame.
In the actual computation the exact position of the ground point is unknown and its corresponding height value cannot be determined directly, so iterative computation is needed to determine the ground-point coordinates step by step, i.e. the imaging ray and the DEM are intersected iteratively, as shown in Fig. 4 and sketched below.
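The iterative ray-DEM intersection can be sketched as follows; the local tangent frame, the `dem_height` callback and the convergence tolerance are assumptions introduced purely for illustration.

```python
import numpy as np

def intersect_ray_with_dem(sat_pos, ray_dir, dem_height, h0=0.0, tol=0.1, max_iter=20):
    """Iterative intersection of an imaging ray with a DEM (Fig. 4), written in a
    local tangent frame whose third coordinate is height (an assumption of this
    sketch; the text works in ECEF).  dem_height(x, y) is a hypothetical callback
    returning the terrain height at a horizontal location.

    Starting from height h0, the ray is intersected with the level surface at the
    current height (eq. 8 with mu chosen accordingly), the DEM is sampled at the
    footprint, and the height is updated until it converges."""
    sat_pos = np.asarray(sat_pos, dtype=float)
    ray_dir = np.asarray(ray_dir, dtype=float)
    h = h0
    ground = sat_pos
    for _ in range(max_iter):
        mu = (h - sat_pos[2]) / ray_dir[2]     # reach the surface at height h
        ground = sat_pos + mu * ray_dir        # eq. (8)
        h_new = dem_height(ground[0], ground[1])
        if abs(h_new - h) < tol:
            h = h_new
            break
        h = h_new
    return ground
```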
The present invention establishes the rigorous sensor model of the TSS; the nadir camera of the TSS is taken as an example to introduce this rigorous sensor model. The TSS nadir camera consists of N = 3 CCD linear-array units; at each imaging moment the nadir camera obtains an N-row image (i.e. a stripe), and each CCD unit corresponds to one ray vector. For example, the imaging ray corresponding to the CCD unit at row line, column sample of the stripe CCD array is denoted $P_{PS}$ in the satellite navigation frame.
Nadir camera:
$$P_{PS} = \begin{bmatrix} -\tan(\psi_Y) \\ +\tan(\psi_X) \\ -1 \end{bmatrix} = \begin{bmatrix} ((L-1)/2 - \mathrm{sample}) \cdot \mu_X / f \\ (\mathrm{line} - (W-1)/2) \cdot \mu_Y / f \\ -1 \end{bmatrix} \qquad (9)$$

Forward-looking camera:
$$P_{PS} = \begin{bmatrix} -\tan(\psi_Y) \\ +\tan(\psi_X) \\ -1 \end{bmatrix} = \begin{bmatrix} ((L-1)/2 - \mathrm{sample}) \cdot \mu_X / f \\ ((\mathrm{line} - (W-1)/2) \cdot \mu_Y + f \cdot \tan(\theta)) / f \\ -1 \end{bmatrix} \qquad (10)$$

Backward-looking camera:
$$P_{PS} = \begin{bmatrix} -\tan(\psi_Y) \\ +\tan(\psi_X) \\ -1 \end{bmatrix} = \begin{bmatrix} ((L-1)/2 - \mathrm{sample}) \cdot \mu_X / f \\ ((\mathrm{line} - (W-1)/2) \cdot \mu_Y - f \cdot \tan(\theta)) / f \\ -1 \end{bmatrix} \qquad (11)$$
where L is the length of the CCD array, W is the width of the CCD array (i.e. the number of scan lines in the stripe), $\mu_X$ and $\mu_Y$ are the CCD unit sizes in the two directions, and f is the focal length of the nadir camera.
The line-of-sight vector used in the TSS image simulation, $P_{PS}$, is defined in the satellite navigation frame. Between the navigation frame and the satellite platform frame there is only a rotation: rotating the satellite platform frame by 90° about the $Z_P$ axis gives the navigation frame, so the coordinates of the imaging ray of the TSS image in the platform frame are $P_P$:
$$P_P = R_T \cdot P_{PS} = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \cdot P_{PS} \qquad (12)$$
The relation between the ECEF frame of the TSS image and the satellite platform frame is then:
$$P_{ECS} = S(t) + \lambda \cdot R_{OS}(t) \cdot R_{PS}(t) \cdot R_T^T \cdot P_P \qquad (13)$$
$R_0$ is a matrix that does not change with time for a given scene, corresponding to the three axes of the orbital coordinate system $O\text{-}X_oY_oZ_o$ of the central scan row. $R_{OS}(t)$ varies with scan time and corresponds to the three axes of the orbital coordinate system $O_2\text{-}X_2Y_2Z_2$ at the imaging moment of each stripe. In essence, $R_0$ is the value of $R_{OS}(t)$ at $t = t_c$; both are computed from the satellite position and velocity, but $R_0$ uses the satellite position and velocity of the central scan row (or stripe), while $R_{OS}(t)$ is computed from the satellite position and velocity of the current scan line (or scan stripe), as sketched below.
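A possible construction of $R_{OS}(t)$ (and hence $R_0$, by evaluating it for the central scan line) from the satellite position and velocity is sketched below; the exact axis convention beyond what the text states is an assumption.

```python
import numpy as np

def orbit_to_ecef_rotation(sat_pos, sat_vel):
    """Columns of R_OS(t): the orbital-frame axes X2, Y2, Z2 expressed in the
    ECEF frame, built from the satellite position and velocity at time t.
    Only 'Z2 toward the geocentre, right-handed triad' is taken from the text;
    the remaining convention is an assumption of this sketch."""
    sat_pos = np.asarray(sat_pos, dtype=float)
    sat_vel = np.asarray(sat_vel, dtype=float)
    z2 = -sat_pos / np.linalg.norm(sat_pos)        # toward the Earth's centre
    y2 = np.cross(z2, sat_vel)
    y2 /= np.linalg.norm(y2)
    x2 = np.cross(y2, z2)                          # completes the right-handed triad
    return np.column_stack((x2, y2, z2))           # maps orbital -> ECEF coordinates
```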
Suppose the width of the CCD array is W = 3, i.e. the number of scan lines in the nadir-camera stripe is 3, and the image-file coordinates of the image point a corresponding to ground point A on the TSS image are $(x_I, y_I)$. Then this image point was imaged in the n-th scan period and its row number within the current scan stripe is line, as shown in Fig. 5:
$$n = \mathrm{INT}\!\left(\frac{y_I - 1}{W}\right) + 1, \qquad \mathrm{line} = \mathrm{mod}(y_I - 1,\ W) \qquad (14)$$
where INT denotes the rounding-down operation and mod denotes the remainder; the value of line lies in {0, 1, ..., W-1}. The coordinates of image point p in the local frame-format coordinate system are then $p_F$:
$$p_F = \begin{bmatrix} x_F \\ y_F \\ z_F \end{bmatrix} = \begin{bmatrix} x_I \\ (\mathrm{line} - (W-1)/2) \cdot \mu_Y \\ 0 \end{bmatrix} \qquad (15)$$
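Equations (14)-(15) translate directly into a few lines of Python, for example:

```python
def local_frame_coords(x_I, y_I, W, mu_y):
    """Eqs. (14)-(15): scan period n, row index 'line' within the stripe, and
    the image point's coordinates in the local frame-format coordinate system."""
    n = int((y_I - 1) // W) + 1          # scan period containing the point
    line = (y_I - 1) % W                 # row within the stripe, 0..W-1
    p_F = (x_I, (line - (W - 1) / 2.0) * mu_y, 0.0)
    return n, line, p_F
```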
As shown in Fig. 5, the origin of the platform coordinate system of the TSS image is at the projection centre C, so between the sensor coordinate system and the satellite platform coordinate system there is only a rotation and no translation, i.e. $C_M = 0$. Suppose the satellite platform frame is rotated first by angle φ about the $X_F$ axis, then by angle ω about the $Y_F$ axis, and finally by angle κ about the $Z_F$ axis to reach the sensor coordinate system, and let the rotation matrix be $R_M$:
$$R_M = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix} \qquad (16)$$
Then, ignoring systematic errors (i.e. δx = 0), we obtain:
$$\begin{aligned} x_F &= -f \cdot \frac{a_1 \tan(\psi_x) + b_1 \tan(\psi_y) + c_1}{a_3 \tan(\psi_x) + b_3 \tan(\psi_y) + c_3} + x_F^c \\ y_F &= -f \cdot \frac{a_2 \tan(\psi_x) + b_2 \tan(\psi_y) + c_2}{a_3 \tan(\psi_x) + b_3 \tan(\psi_y) + c_3} + y_F^c \end{aligned} \qquad (17)$$
For each CCD detector, $p_F$ and $P_P$ are known, so least-squares adjustment can be used to determine $R_M$ and $c_F$. The above solution process is called the interior orientation of the TSS image.
The rigorous imaging model of the TSS image based on the satellite parameters is then:
$$\begin{bmatrix} x_F - x_F^c \\ y_F - y_F^c \\ -f \end{bmatrix} = \mu \cdot R_M^T \cdot R_T \cdot R_{PS}^T(t) \cdot R_{OS}^T(t) \cdot \begin{bmatrix} X_{ECS} - X_S(t) \\ Y_{ECS} - Y_S(t) \\ Z_{ECS} - Z_S(t) \end{bmatrix} \qquad (18)$$
Let $R = (R_{ij}) = R_M^T \cdot R_T \cdot R_{PS}^T(t) \cdot R_{OS}^T(t)$; eliminating μ yields the class-collinearity condition equations of the TSS:
$$\begin{aligned} x_F - x_F^c &= -f \cdot \frac{R_{11}(X_{ECS} - X_S(t)) + R_{12}(Y_{ECS} - Y_S(t)) + R_{13}(Z_{ECS} - Z_S(t))}{R_{31}(X_{ECS} - X_S(t)) + R_{32}(Y_{ECS} - Y_S(t)) + R_{33}(Z_{ECS} - Z_S(t))} \\ y_F - y_F^c &= -f \cdot \frac{R_{21}(X_{ECS} - X_S(t)) + R_{22}(Y_{ECS} - Y_S(t)) + R_{23}(Z_{ECS} - Z_S(t))}{R_{31}(X_{ECS} - X_S(t)) + R_{32}(Y_{ECS} - Y_S(t)) + R_{33}(Z_{ECS} - Z_S(t))} \end{aligned} \qquad (19)$$
where $x_F$, $y_F$ are the coordinates of the image point in the local frame-format coordinate system of the TSS; $X_{ECS}$, $Y_{ECS}$, $Z_{ECS}$ are the coordinates of the ground point in the ECEF frame; and $X_S$, $Y_S$, $Z_S$ are the coordinates of the satellite in the ECEF frame at the imaging moment of the ground point. For the forward- and backward-looking images the sign in front of f is positive, and for the nadir image it is negative. The main applications of the rigorous imaging model of the TSS image include: space resection of single or multiple images, space intersection, block adjustment, computing the basic error equations of simulated data, differential rectification, single-image mapping, and so on.
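For illustration, the class-collinearity equations (19) can be evaluated as follows once the combined rotation R of equation (18) and the satellite position at the imaging time are known; the sign convention used here is the nadir-camera one, and the function layout is an assumption.

```python
import numpy as np

def tss_collinearity(ground, sat_pos, R, f, x_Fc=0.0, y_Fc=0.0):
    """Class-collinearity equations (19): project an ECEF ground point into
    local frame-format coordinates, given the combined rotation R (the product
    R_M^T R_T R_PS^T(t) R_OS^T(t) from eq. 18) and the satellite position at
    the imaging time."""
    R = np.asarray(R, dtype=float)
    d = np.asarray(ground, dtype=float) - np.asarray(sat_pos, dtype=float)
    num_x = R[0] @ d                       # Xbar
    num_y = R[1] @ d                       # Ybar
    den = R[2] @ d                         # Zbar
    x_F = x_Fc - f * num_x / den
    y_F = y_Fc - f * num_y / den
    return x_F, y_F
```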
Establishment of the constraint conditions:
Since each viewing sensor of the TSS uses W = 3 CCD linear-array sensors, as shown in Fig. 6, ground point A is imaged three times on the 3 linear-array CCD sensors of the nadir camera, corresponding to image points a1, a2 and a3. Although the nadir view finally outputs a single image, from the row and column numbers of the image point a corresponding to ground point A on the nadir image we can determine by which CCD linear-array sensor point a was imaged, as well as its row number line within the current scan stripe, and from the scan period n in which point a lies we can obtain the image points a1 (or a2, a3) of ground point A on the other two linear-array sensors of the nadir view.
Suppose image point a was obtained by linear-array sensor C of the nadir view, i.e. point a corresponds to a3; then the imaging moment of a1 and its coordinates in the local frame-format coordinate system can be obtained as follows:
Let the column number of point a1 be $x_I^1$; then the coordinates of a1 in the local frame-format coordinate system are $p_F^1 = (x_F^1, y_F^1, z_F^1)^T$, with $x_F^1 = x_I^1 = x_I$.
If line ≤ W/2: Δy = W - line - 1, $y_I^1 = y_I + \Delta y$, $n_1 = \mathrm{fix}(y_I^1 / W)$, $\mathrm{line}_1 = \mathrm{mod}(y_I^1, W)$, $y_F^1 = \mathrm{line}_1 - (W-1)/2$, $t_1 = n_1 \cdot W - \Delta y$;
otherwise: Δy = line, $y_I^1 = y_I - \Delta y$, $n_1 = \mathrm{fix}(y_I^1 / W)$, $\mathrm{line}_1 = \mathrm{mod}(y_I^1, W)$, $y_F^1 = \mathrm{line}_1 - (W-1)/2$, $t_1 = n_1 \cdot W + \Delta y$.
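The two cases above can be coded directly, for example:

```python
def conjugate_point(y_I, line, W=3):
    """Predict the image-file row y_I1, scan period n1, in-stripe row line1,
    local-frame coordinate y_F1 and imaging time index t1 of the conjugate
    point a1 on another CCD line of the same camera, following the two cases
    given in the text."""
    if line <= W / 2:
        dy = W - line - 1
        y_I1 = y_I + dy
        t_sign = -1
    else:
        dy = line
        y_I1 = y_I - dy
        t_sign = +1
    n1 = int(y_I1 // W)                  # fix()
    line1 = y_I1 % W                     # mod()
    y_F1 = line1 - (W - 1) / 2.0
    t1 = n1 * W + t_sign * dy
    return y_I1, n1, line1, y_F1, t1
```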
After a1, a2 and a3 are determined from image point a on the nadir image, the image points a1', a2', a3' and a1'', a2'', a3'' formed by ground point A in the forward and backward views are determined in the same way. When the spatial position of ground point A is determined, the intersection of three rays from three linear arrays therefore becomes an intersection of 9 rays, which increases the redundant observations; this is used as a constraint condition.
The present invention designs the space resection of the TSS image and its robust solution method. Different orbit and attitude error correction models can be selected, corresponding to different combinations of orientation parameters; taking the quadratic model as an example, expanding by Taylor series and keeping the first-order terms gives the error equations:
$$\begin{aligned} V_{x_F} = {} & \frac{\partial x_F}{\partial c_{x,0}} dc_{x,0} + \frac{\partial x_F}{\partial c_{x,1}} dc_{x,1} + \frac{\partial x_F}{\partial c_{x,2}} dc_{x,2} + \frac{\partial x_F}{\partial c_{y,0}} dc_{y,0} + \frac{\partial x_F}{\partial c_{y,1}} dc_{y,1} + \frac{\partial x_F}{\partial c_{y,2}} dc_{y,2} + \frac{\partial x_F}{\partial c_{z,0}} dc_{z,0} \\ & + \frac{\partial x_F}{\partial c_{z,1}} dc_{z,1} + \frac{\partial x_F}{\partial c_{z,2}} dc_{z,2} + \frac{\partial x_F}{\partial e_{r,0}} de_{r,0} + \frac{\partial x_F}{\partial e_{r,1}} de_{r,1} + \frac{\partial x_F}{\partial e_{r,2}} de_{r,2} + \frac{\partial x_F}{\partial e_{p,0}} de_{p,0} + \frac{\partial x_F}{\partial e_{p,1}} de_{p,1} \\ & + \frac{\partial x_F}{\partial e_{p,2}} de_{p,2} + \frac{\partial x_F}{\partial e_{y,0}} de_{y,0} + \frac{\partial x_F}{\partial e_{y,1}} de_{y,1} + \frac{\partial x_F}{\partial e_{y,2}} de_{y,2} - l_{x_F} \\[4pt] V_{y_F} = {} & \frac{\partial y_F}{\partial c_{x,0}} dc_{x,0} + \frac{\partial y_F}{\partial c_{x,1}} dc_{x,1} + \cdots + \frac{\partial y_F}{\partial e_{y,1}} de_{y,1} + \frac{\partial y_F}{\partial e_{y,2}} de_{y,2} - l_{y_F} \end{aligned}$$
where $dc_{x,0}, dc_{x,1}, \ldots, de_{y,2}$ are the corrections to the orientation parameters, and their coefficients are the partial derivatives of the functions.
For convenience of writing, the numerators and the denominator of the rigorous sensor model are denoted as follows:
$$\begin{aligned} \bar{X} &= R_{11}(X_{ECS} - X_S(t)) + R_{12}(Y_{ECS} - Y_S(t)) + R_{13}(Z_{ECS} - Z_S(t)) \\ \bar{Y} &= R_{21}(X_{ECS} - X_S(t)) + R_{22}(Y_{ECS} - Y_S(t)) + R_{23}(Z_{ECS} - Z_S(t)) \\ \bar{Z} &= R_{31}(X_{ECS} - X_S(t)) + R_{32}(Y_{ECS} - Y_S(t)) + R_{33}(Z_{ECS} - Z_S(t)) \end{aligned} \qquad (20)$$

$$\frac{\partial x_F}{\partial c_{x,0}} = \frac{1}{\bar{Z}}\left[R_{11} f + (x_F - x_F^c) R_{31}\right],\quad \frac{\partial x_F}{\partial c_{x,1}} = \bar{t}\,\frac{\partial x_F}{\partial c_{x,0}},\quad \frac{\partial x_F}{\partial c_{x,2}} = \bar{t}^2\,\frac{\partial x_F}{\partial c_{x,0}}$$

$$\frac{\partial x_F}{\partial c_{y,0}} = \frac{1}{\bar{Z}}\left[R_{12} f + (x_F - x_F^c) R_{32}\right],\quad \frac{\partial x_F}{\partial c_{y,1}} = \bar{t}\,\frac{\partial x_F}{\partial c_{y,0}},\quad \frac{\partial x_F}{\partial c_{y,2}} = \bar{t}^2\,\frac{\partial x_F}{\partial c_{y,0}}$$

$$\frac{\partial x_F}{\partial c_{z,0}} = \frac{1}{\bar{Z}}\left[R_{13} f + (x_F - x_F^c) R_{33}\right],\quad \frac{\partial x_F}{\partial c_{z,1}} = \bar{t}\,\frac{\partial x_F}{\partial c_{z,0}},\quad \frac{\partial x_F}{\partial c_{z,2}} = \bar{t}^2\,\frac{\partial x_F}{\partial c_{z,0}}$$

$$\frac{\partial y_F}{\partial c_{x,0}} = \frac{1}{\bar{Z}}\left[R_{21} f + (y_F - y_F^c) R_{31}\right],\quad \frac{\partial y_F}{\partial c_{x,1}} = \bar{t}\,\frac{\partial y_F}{\partial c_{x,0}},\quad \frac{\partial y_F}{\partial c_{x,2}} = \bar{t}^2\,\frac{\partial y_F}{\partial c_{x,0}}$$

$$\frac{\partial y_F}{\partial c_{y,0}} = \frac{1}{\bar{Z}}\left[R_{22} f + (y_F - y_F^c) R_{32}\right],\quad \frac{\partial y_F}{\partial c_{y,1}} = \bar{t}\,\frac{\partial y_F}{\partial c_{y,0}},\quad \frac{\partial y_F}{\partial c_{y,2}} = \bar{t}^2\,\frac{\partial y_F}{\partial c_{y,0}}$$

$$\frac{\partial y_F}{\partial c_{z,0}} = \frac{1}{\bar{Z}}\left[R_{23} f + (y_F - y_F^c) R_{33}\right],\quad \frac{\partial y_F}{\partial c_{z,1}} = \bar{t}\,\frac{\partial y_F}{\partial c_{z,0}},\quad \frac{\partial y_F}{\partial c_{z,2}} = \bar{t}^2\,\frac{\partial y_F}{\partial c_{z,0}}$$
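The pattern "zero-order partial times 1, $\bar{t}$ and $\bar{t}^2$" for the nine position coefficients can be computed compactly, for example with the NumPy sketch below; the argument names are illustrative.

```python
import numpy as np

def position_partials(R, Zbar, f, x_F, y_F, x_Fc, y_Fc, tbar):
    """Partial derivatives of x_F and y_F with respect to the quadratic
    orbit-error coefficients c_{x,0..2}, c_{y,0..2}, c_{z,0..2}, following the
    expressions above: each zero-order partial is scaled by (1, tbar, tbar^2)."""
    R = np.asarray(R, dtype=float)
    dx0 = np.array([(R[0, k] * f + (x_F - x_Fc) * R[2, k]) / Zbar for k in range(3)])
    dy0 = np.array([(R[1, k] * f + (y_F - y_Fc) * R[2, k]) / Zbar for k in range(3)])
    scale = np.array([1.0, tbar, tbar ** 2])
    # Rows: derivative wrt c_x, c_y, c_z; columns: constant, linear, quadratic term.
    return np.outer(dx0, scale), np.outer(dy0, scale)
```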
Let $h_{ij}$ denote the elements of $R_{PS}^T(t)$, expressed as functions of the attitude angles roll, pitch and yaw; then:
$$\begin{aligned} \frac{\partial h_{11}}{\partial \mathrm{roll}} &= -\cos(\mathrm{yaw})\sin(\mathrm{roll}); & \frac{\partial h_{12}}{\partial \mathrm{roll}} &= \cos(\mathrm{yaw})\cos(\mathrm{roll})\sin(\mathrm{pitch}); & \frac{\partial h_{13}}{\partial \mathrm{roll}} &= \cos(\mathrm{yaw})\cos(\mathrm{roll})\cos(\mathrm{pitch}); \\ \frac{\partial h_{21}}{\partial \mathrm{roll}} &= \sin(\mathrm{yaw})\sin(\mathrm{roll}); & \frac{\partial h_{22}}{\partial \mathrm{roll}} &= -\sin(\mathrm{yaw})\cos(\mathrm{roll})\sin(\mathrm{pitch}); & \frac{\partial h_{23}}{\partial \mathrm{roll}} &= -\sin(\mathrm{yaw})\cos(\mathrm{roll})\cos(\mathrm{pitch}); \\ \frac{\partial h_{31}}{\partial \mathrm{roll}} &= -\cos(\mathrm{roll}); & \frac{\partial h_{32}}{\partial \mathrm{roll}} &= -\sin(\mathrm{roll})\sin(\mathrm{pitch}); & \frac{\partial h_{33}}{\partial \mathrm{roll}} &= -\sin(\mathrm{roll})\cos(\mathrm{pitch}); \end{aligned}$$

$$\begin{aligned} \frac{\partial h_{11}}{\partial \mathrm{pitch}} &= 0; & \frac{\partial h_{12}}{\partial \mathrm{pitch}} &= -\sin(\mathrm{yaw})\sin(\mathrm{pitch}) + \cos(\mathrm{yaw})\sin(\mathrm{roll})\cos(\mathrm{pitch}); \\ \frac{\partial h_{21}}{\partial \mathrm{pitch}} &= 0; & \frac{\partial h_{13}}{\partial \mathrm{pitch}} &= -\sin(\mathrm{yaw})\cos(\mathrm{pitch}) - \cos(\mathrm{yaw})\sin(\mathrm{roll})\sin(\mathrm{pitch}); \\ \frac{\partial h_{31}}{\partial \mathrm{pitch}} &= 0; & \frac{\partial h_{22}}{\partial \mathrm{pitch}} &= -\cos(\mathrm{yaw})\sin(\mathrm{pitch}) - \sin(\mathrm{yaw})\sin(\mathrm{roll})\cos(\mathrm{pitch}); \\ \frac{\partial h_{32}}{\partial \mathrm{pitch}} &= \cos(\mathrm{roll})\cos(\mathrm{pitch}); & \frac{\partial h_{23}}{\partial \mathrm{pitch}} &= \sin(\mathrm{yaw})\sin(\mathrm{roll})\sin(\mathrm{pitch}) - \cos(\mathrm{yaw})\cos(\mathrm{pitch}); \\ \frac{\partial h_{33}}{\partial \mathrm{pitch}} &= -\cos(\mathrm{roll})\sin(\mathrm{pitch}); \end{aligned}$$

$$\begin{aligned} \frac{\partial h_{11}}{\partial \mathrm{yaw}} &= -\sin(\mathrm{yaw})\cos(\mathrm{roll}); & \frac{\partial h_{12}}{\partial \mathrm{yaw}} &= \cos(\mathrm{yaw})\cos(\mathrm{pitch}) - \sin(\mathrm{yaw})\sin(\mathrm{roll})\sin(\mathrm{pitch}); \\ \frac{\partial h_{21}}{\partial \mathrm{yaw}} &= -\cos(\mathrm{yaw})\cos(\mathrm{roll}); & \frac{\partial h_{13}}{\partial \mathrm{yaw}} &= -\cos(\mathrm{yaw})\sin(\mathrm{pitch}) - \sin(\mathrm{yaw})\sin(\mathrm{roll})\cos(\mathrm{pitch}); \\ & & \frac{\partial h_{22}}{\partial \mathrm{yaw}} &= -\sin(\mathrm{yaw})\cos(\mathrm{pitch}) - \cos(\mathrm{yaw})\sin(\mathrm{roll})\sin(\mathrm{pitch}); \\ \frac{\partial h_{31}}{\partial \mathrm{yaw}} &= \frac{\partial h_{32}}{\partial \mathrm{yaw}} = \frac{\partial h_{33}}{\partial \mathrm{yaw}} = 0; & \frac{\partial h_{23}}{\partial \mathrm{yaw}} &= \sin(\mathrm{yaw})\sin(\mathrm{pitch}) - \cos(\mathrm{yaw})\sin(\mathrm{roll})\cos(\mathrm{pitch}). \end{aligned}$$

$$\frac{\partial}{\partial \mathrm{yaw}}\begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix} = \begin{bmatrix} b_1 & -a_1 & c_1 \\ b_2 & -a_2 & c_2 \\ b_3 & -a_3 & c_3 \end{bmatrix} \cdot \left(\frac{\partial h_{ij}}{\partial \mathrm{yaw}}\right)_{3\times3} \cdot R_{OS}^T(t) \qquad (21)$$

$$\frac{\partial}{\partial \mathrm{roll}}\begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix} = \begin{bmatrix} b_1 & -a_1 & c_1 \\ b_2 & -a_2 & c_2 \\ b_3 & -a_3 & c_3 \end{bmatrix} \cdot \left(\frac{\partial h_{ij}}{\partial \mathrm{roll}}\right)_{3\times3} \cdot R_{OS}^T(t) \qquad (22)$$

$$\frac{\partial}{\partial \mathrm{pitch}}\begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix} = \begin{bmatrix} b_1 & -a_1 & c_1 \\ b_2 & -a_2 & c_2 \\ b_3 & -a_3 & c_3 \end{bmatrix} \cdot \left(\frac{\partial h_{ij}}{\partial \mathrm{pitch}}\right)_{3\times3} \cdot R_{OS}^T(t) \qquad (23)$$
$$\frac{\partial \bar{X}}{\partial \mathrm{roll}} = (X_{ECS} - X_S(\bar{t}))\frac{\partial R_{11}}{\partial \mathrm{roll}} + (Y_{ECS} - Y_S(\bar{t}))\frac{\partial R_{12}}{\partial \mathrm{roll}} + (Z_{ECS} - Z_S(\bar{t}))\frac{\partial R_{13}}{\partial \mathrm{roll}}$$

$$\frac{\partial \bar{Z}}{\partial \mathrm{roll}} = (X_{ECS} - X_S(\bar{t}))\frac{\partial R_{31}}{\partial \mathrm{roll}} + (Y_{ECS} - Y_S(\bar{t}))\frac{\partial R_{32}}{\partial \mathrm{roll}} + (Z_{ECS} - Z_S(\bar{t}))\frac{\partial R_{33}}{\partial \mathrm{roll}}$$

$$\frac{\partial x_F}{\partial \mathrm{roll}} = -\frac{1}{\bar{Z}}\left[\frac{\partial \bar{X}}{\partial \mathrm{roll}}\, f + (x_F - x_F^c)\,\frac{\partial \bar{Z}}{\partial \mathrm{roll}}\right] \;\Rightarrow\; \frac{\partial x_F}{\partial e_{r,0}} = \frac{\partial x_F}{\partial \mathrm{roll}},\quad \frac{\partial x_F}{\partial e_{r,1}} = \bar{t}\,\frac{\partial x_F}{\partial \mathrm{roll}},\quad \frac{\partial x_F}{\partial e_{r,2}} = \bar{t}^2\,\frac{\partial x_F}{\partial \mathrm{roll}}$$

The partial derivatives with respect to pitch and yaw follow the same pattern, with roll replaced by pitch or yaw in the derivatives of $\bar{X}$ and $\bar{Z}$, giving $\partial x_F/\partial e_{p,i}$ and $\partial x_F/\partial e_{y,i}$ ($i = 0, 1, 2$). Likewise, the partial derivatives of $y_F$ with respect to the attitude coefficients $e_{r,i}$, $e_{p,i}$, $e_{y,i}$ are obtained by forming $\partial \bar{Y}/\partial(\cdot)$ from $R_{21}$, $R_{22}$, $R_{23}$ and using
$$\frac{\partial y_F}{\partial (\cdot)} = -\frac{1}{\bar{Z}}\left[\frac{\partial \bar{Y}}{\partial (\cdot)}\, f + (y_F - y_F^c)\,\frac{\partial \bar{Z}}{\partial (\cdot)}\right],$$
each zero-order partial again being multiplied by $1$, $\bar{t}$ and $\bar{t}^2$ for the constant, linear and quadratic coefficients.
Writing the coefficient terms of the above error equations as the matrix A, the constant term as $L = [l_x, l_y]^T$, the residual vector as $V = [V_{x_F}, V_{y_F}]^T$, and the corrections to the orientation parameters as $\hat{X}$, the error equation can be expressed as:
$$V = A\hat{X} - L \qquad (24)$$
There are 18 unknown parameters in the error equation, so at least 9 ground control points are needed to solve for these orientation parameters. Of course, if a constant or linear error correction model is adopted, only 6 or 12 orientation parameters need to be solved, and then at least 3 or 6 control points are needed to solve for the corresponding orientation parameters.
A prominent feature of spaceborne CCD sensors is a long focal length and a narrow field of view. A large number of experiments show that this imaging geometry causes strong correlation among the orientation parameters of the sensor, so the column vectors of the error equation coefficient matrix become approximately linearly dependent (multicollinearity); the normal equation is then seriously ill-conditioned or even singular. The present invention adopts spectrum correction iteration to overcome the correlation among the orientation parameters of the TSS image.
The normal equation can be written as:
$$A^T P A \hat{X} - A^T P L = 0 \qquad (25)$$
Adding $\hat{X}$ to both sides of the above equation gives:
$$(A^T P A + I)\hat{X} = A^T P L + \hat{X} \qquad (26)$$
where I is the identity matrix. The solution is obtained by iteration, with the iterative formula:
$$\hat{X}^{(k)} = (A^T P A + I)^{-1}\left(A^T P L + \hat{X}^{(k-1)}\right) \qquad (27)$$
From the derivation of the above formula it can be seen that the spectrum correction iteration algorithm improves the ill-conditioning of the normal equation and maintains numerical stability during its solution, without changing the equivalence relation of the equation, thus guaranteeing the unbiasedness of the estimate.
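A minimal sketch of the spectrum correction iteration (25)-(27), assuming the design matrix A, the constant vector L and an optional weight matrix P have already been assembled, is:

```python
import numpy as np

def spectrum_corrected_solution(A, L, P=None, x0=None, n_iter=50, tol=1e-10):
    """Spectrum correction iteration (eqs. 25-27) for an ill-conditioned
    normal equation: x_k = (A^T P A + I)^-1 (A^T P L + x_{k-1}).
    The starting value x0 defaults to the zero vector."""
    A = np.asarray(A, dtype=float)
    L = np.asarray(L, dtype=float)
    m = A.shape[1]
    if P is None:
        P = np.eye(A.shape[0])
    N = A.T @ P @ A + np.eye(m)            # regularized normal matrix
    b = A.T @ P @ L
    x = np.zeros(m) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x_new = np.linalg.solve(N, b + x)  # eq. (27)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x
```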

Claims (4)

1. A multi-stripe scan imaging model design method, characterized in that it comprises:
a step of obtaining multi-stripe images by mounting forward-looking, nadir-looking and backward-looking stripe push-broom CCD cameras on the satellite,
a step of designing a rigorous imaging model for said multi-stripe scan images, and
a step of positioning said multi-stripe scan images.
2. The multi-stripe scan imaging model design method according to claim 1, characterized in that the step of obtaining multi-stripe images comprises:
a step of extracting, from the three linear-array images taken by each stripe camera, the image of the same scan period of each linear-array sensor at intervals of 3 sampling periods, and splicing them to obtain a complete image.
3. The multi-stripe scan imaging model design method according to claim 1, characterized in that the step of designing a rigorous imaging model for said multi-stripe scan images comprises:
a step of determining the coordinates of the imaging ray corresponding to each CCD unit in the satellite body frame according to the CCD unit size, the focal length, the number of CCD linear arrays per stripe and the position of the CCD unit within the linear array,
a step of calculating the viewing direction of the imaging ray in the orbital coordinate system according to the determined pitch, roll and yaw angles of the satellite, and
a step of transforming the coordinates of the imaging ray from said orbital coordinate system into the Earth-centered Earth-fixed coordinate system in order to simulate the intersection of the TSS imaging ray with the Earth surface.
4. The multi-stripe scan imaging model design method according to claim 1, characterized in that the step of positioning said multi-stripe scan images comprises:
a step of iteratively intersecting the imaging ray with a DEM to progressively determine the coordinates of the intersection of the imaging ray with the ground in the Earth-centered Earth-fixed coordinate system,
a step of resolving the transformation from the sensor coordinate system to the satellite platform coordinate system according to the imaging parameters, and
a step of establishing the mathematical model of TSS image space resection and, for the strong correlation among the orientation parameters, performing decorrelation processing by the spectrum correction method.
CN201010291940.2A 2010-09-26 2010-09-26 Design and positioning method of multi-stripe scan imaging model Expired - Fee Related CN102410831B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010291940.2A CN102410831B (en) 2010-09-26 2010-09-26 Design and positioning method of multi-stripe scan imaging model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010291940.2A CN102410831B (en) 2010-09-26 2010-09-26 Design and positioning method of multi-stripe scan imaging model

Publications (2)

Publication Number Publication Date
CN102410831A true CN102410831A (en) 2012-04-11
CN102410831B CN102410831B (en) 2015-06-24

Family

ID=45913047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010291940.2A Expired - Fee Related CN102410831B (en) 2010-09-26 2010-09-26 Design and positioning method of multi-stripe scan imaging model

Country Status (1)

Country Link
CN (1) CN102410831B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100498246C (en) * 2003-12-30 2009-06-10 中国科学院上海技术物理研究所 Machine-carried broom pushing type multidimension imaging device
US20050259890A1 (en) * 2004-05-18 2005-11-24 Lutes James A Estimation of coefficients for a rational polynomial camera model
CN1273936C (en) * 2004-10-21 2006-09-06 武汉大学 Correspond rediation correction method for push-scanning satellite images CCD
CN101776451A (en) * 2009-01-08 2010-07-14 中国科学院西安光学精密机械研究所 CCD mapping camera capable of self-stabilizing and self-correcting motion distortion

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102891713B (en) * 2012-09-27 2014-12-03 哈尔滨工程大学 Low-orbit microsatellite formation system suitable for medium/high-latitude region coverage
CN102891713A (en) * 2012-09-27 2013-01-23 哈尔滨工程大学 Low-orbit microsatellite formation system suitable for medium/high-latitude region coverage
CN103134492A (en) * 2013-02-01 2013-06-05 哈尔滨工业大学 Agile imaging satellite linear scan strip pre-generating method based on point target and satellite three-axis gesture rapid determining method
CN103134492B (en) * 2013-02-01 2015-04-29 哈尔滨工业大学 Agile imaging satellite linear scan strip pre-generating method based on point target and satellite three-axis gesture rapid determining method
CN105247595A (en) * 2013-05-27 2016-01-13 卡尔蔡司工业测量技术有限公司 Method and device for generating at least one virtual image of a measurement object
CN106595598A (en) * 2016-12-21 2017-04-26 上海航天控制技术研究所 Wide-field-of-view constant-diyuan optical remote sensing imaging method
CN106595598B (en) * 2016-12-21 2019-03-19 上海航天控制技术研究所 A kind of first optical remote sensing imaging method in permanent ground of wide visual field
CN106603927A (en) * 2017-01-10 2017-04-26 吉林大学 Space-borne APS camera digital domain TDI (Time delayed integration) plus complex line array imaging method
CN106603927B (en) * 2017-01-10 2019-05-28 吉林大学 A kind of spaceborne APS camera numeric field TDI adds multiple line battle array imaging method
CN107421503B (en) * 2017-07-28 2020-04-10 长光卫星技术有限公司 Single-detector three-linear-array three-dimensional mapping imaging method and system
CN107421503A (en) * 2017-07-28 2017-12-01 长光卫星技术有限公司 Simple detector three-linear array stereo mapping imaging method and system
CN108932384A (en) * 2018-07-04 2018-12-04 北京市遥感信息研究所 Based on the maximized imaging satellite regional aim covering method of a covering tape
CN109827526A (en) * 2019-03-13 2019-05-31 中国十七冶集团有限公司 One kind being based on photogrammetric planar smoothness detection method and its flow chart of data processing
CN110986886A (en) * 2019-12-18 2020-04-10 中国科学院长春光学精密机械与物理研究所 Double-camera dynamic rotation scanning three-dimensional imaging simulation device
CN111695013A (en) * 2020-04-24 2020-09-22 中国资源卫星应用中心 Method and device for monitoring satellite orbit and imaging strip resources
CN111695013B (en) * 2020-04-24 2023-05-30 中国资源卫星应用中心 Method and device for monitoring satellite orbit and imaging strip resources
CN115014294A (en) * 2022-05-18 2022-09-06 河海大学 River channel agile imaging method for east-west satellite

Also Published As

Publication number Publication date
CN102410831B (en) 2015-06-24

Similar Documents

Publication Publication Date Title
CN102410831A (en) Design and positioning method of multi-stripe scan imaging model
Hu et al. Understanding the rational function model: methods and applications
CN102735216B (en) CCD stereoscopic camera three-line imagery data adjustment processing method
US7778534B2 (en) Method and apparatus of correcting geometry of an image
CN109631876B (en) Inspection detector positioning method based on single-camera navigation image
CN102519433B (en) Method for inverting geometric calibrating parameter of satellite-borne linear array sensor by using RPC (Remote Position Control)
CN104764443A (en) Optical remote sensing satellite rigorous imaging geometrical model building method
CN102519436A (en) Chang'e-1 (CE-1) stereo camera and laser altimeter data combined adjustment method
CN113538595B (en) Method for improving geometric precision of remote sensing stereo image by using laser height measurement data in auxiliary manner
CN110986888A (en) Aerial photography integrated method
CN103411587A (en) Positioning and attitude-determining method and system
CN105466400A (en) Method for probing multi-source satellite image corresponding image point imaging intersection angles
CN108225282B (en) Remote sensing camera stereo mapping method and system based on multivariate data fusion
Liu et al. A new approach to fast mosaic UAV images
Zhao et al. Direct georeferencing of oblique and vertical imagery in different coordinate systems
CN112461204B (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
Wang et al. Geometric calibration for the aerial line scanning camera GFXJ
Tao et al. On-orbit geometric calibration of the panchromatic/multispectral camera of the ZY-1 02C satellite based on public geographic data
CN104019800B (en) The method of big side-sway line array CCD remote sensing images positioning for ground
CN111044076B (en) Geometric calibration method for high-resolution first-number B satellite based on reference base map
Oh et al. A test result on positional accuracy of Kompsat-2 pan imagery
Lee et al. Automatic building reconstruction with satellite images and digital maps
Grejner-Brzezinska Direct platform orientation with tightly integrated GPS/INS in airborne applications
CN112541964B (en) Method for estimating satellite imaging direction angle based on rational function model
Seo et al. Merge of sub-images from two PAN CCD lines of Kompsat-3 AEISS

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150624

Termination date: 20160926
