CN102982199A - Simulation method and device of optical lens - Google Patents


Publication number
CN102982199A
Authority
CN
China
Prior art keywords
target
optical lens
coordinate
visible target
parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012104339542A
Other languages
Chinese (zh)
Other versions
CN102982199B (en)
Inventor
孟红
孙勇
李文伟
薛姬荣
黄丹
李广运
李增路
易中凯
李军
唐锐
李俊杰
侯德林
孙旭光
符蓓蓓
杨蔚青
杨建�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ordnance Science and Research Academy of China
Original Assignee
Ordnance Science and Research Academy of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ordnance Science and Research Academy of China filed Critical Ordnance Science and Research Academy of China
Priority to CN201210433954.2A priority Critical patent/CN102982199B/en
Publication of CN102982199A publication Critical patent/CN102982199A/en
Application granted granted Critical
Publication of CN102982199B publication Critical patent/CN102982199B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a simulation method and device for an optical lens, intended to solve the high complexity and poor operability of optical lens simulation systems in the prior art. In the method, the visibility parameter of the optical lens with respect to each target, the coordinates of each visible target, the attribute identification parameter of each visible target, the measured speed of each visible target, and the search time for each visible target are obtained from the position parameters of the target and of the optical lens; an optical lens simulation model is built from the visibility parameter, the coordinates of the visible targets, their attribute identification parameters, the search times, and the measured speeds; simulation parameters are preset in the model, and the search result for the targets is output. With this technical scheme, the search result for the targets can be obtained from the position parameters of the optical lens and the targets alone, so the process is convenient and fast and offers good operability.

Description

Simulation method and device for an optical lens
Technical field
The present invention relates to the field of simulation technology, and in particular to a simulation method and device for an optical lens.
Background
An optical lens is an optical instrument that changes the direction of light: after passing through the lens, light no longer propagates along its original path. An optical lens may be a lens proper or a plane mirror.
Because of this ability to redirect light, optical lenses are widely used in medicine, the military, industry, and other fields. In the military field, searching for targets, measuring target parameters, and determining the position and attributes of every target in the search area with accuracy require a large reconnaissance system. The optical lens simulation model is a component of such a system: it implements target search and localization, which are indispensable functions of the reconnaissance system, and therefore plays a vital role in it.
As the operational environment grows more complex, the reconnaissance capability of such a system becomes a key factor in the outcome of an engagement. In a contested operational environment, the accuracy of the optical lens simulation model depends not only on its own precision but also on terrain, environmental factors (such as weather and the electromagnetic environment), and target properties (such as the degree of camouflage). How to improve the accuracy of the optical lens simulation model is therefore a key issue limiting the accuracy of the whole simulation system.
In the prior art, the search behavior of optical lens simulation models takes only a single form, such as azimuth-only or elevation-only search, and simulation accuracy is low. Current methods for improving accuracy generally model the optical lens itself, concentrating on theoretical refinements to obtain more precise results. Although this improves the accuracy of the lens simulation, the lens is not coupled to the simulation system it belongs to, so the complexity of the optical lens simulation model remains high; the precision of the system output improves, but the whole simulation system becomes overly complicated and its operability is reduced.
Summary of the invention
Embodiments of the invention provide a simulation method and device for an optical lens, in order to solve the high complexity and poor operability of optical lens simulation systems in the prior art.
The specific technical scheme provided by the embodiments of the invention is as follows:
A simulation method for an optical lens comprises:
calculating the visibility parameter of the optical lens with respect to each target from the projected area of the target on the viewing plane and the distance between the optical lens and the target, and selecting the visible targets from among the targets;
calculating the coordinates of each visible target from the coordinates of the optical lens and the position parameters between the optical lens and the visible target;
calculating the attribute identification parameter of each visible target from its projected area on the viewing plane;
calculating the search time of the optical lens for each visible target from the processing time and tracking time for that target, and calculating the measured speed of the visible target from its actual speed;
building an optical lens simulation model from the visibility parameter of the optical lens with respect to the targets, the coordinates of the visible targets, their attribute identification parameters, the search times, and the measured speeds;
presetting simulation parameters in the optical lens simulation model and outputting the search result of the optical lens for the targets.
A simulation device for an optical lens comprises:
a first computing unit, configured to calculate the visibility parameter of the optical lens with respect to each target from the projected area of the target on the viewing plane and the distance between the optical lens and the target, and to select the visible targets from among the targets;
a second computing unit, configured to calculate the coordinates of each visible target from the coordinates of the optical lens and the position parameters between the optical lens and the visible target;
a third computing unit, configured to calculate the attribute identification parameter of each visible target from its projected area on the viewing plane;
a fourth computing unit, configured to calculate the search time of the optical lens for each visible target from the processing time and tracking time for that target, and to calculate the measured speed of the visible target from its actual speed;
a model building unit, configured to build an optical lens simulation model from the visibility parameter, the coordinates of the visible targets, their attribute identification parameters, the search times, and the measured speeds; and
a result output unit, configured to preset simulation parameters in the optical lens simulation model and to output the search result of the optical lens for the targets.
In the embodiments of the invention, the visibility parameter of the optical lens with respect to each target, the coordinates of the visible targets, their attribute identification parameters, their measured speeds, and their search times are obtained from the position parameters of the targets and of the optical lens; an optical lens simulation model is built from these quantities; simulation parameters are preset in the model; and the search result of the optical lens for the targets is output. With this technical scheme, the search result can be obtained from the position parameters of the optical lens and the targets alone, so the implementation process is convenient and fast, and operability is good.
Description of drawings
Fig. 1 is a structural diagram of the optical lens simulation device in an embodiment of the invention;
Fig. 2 is a detailed flowchart of optical lens simulation in an embodiment of the invention;
Fig. 3 is a flowchart of calculating the visibility parameter of a target in the ground reconnaissance system in an embodiment of the invention;
Fig. 4 is a schematic diagram of the optical lens portion of the reconnaissance system in an embodiment of the invention;
Fig. 5 is a schematic diagram of the target search process of the optical lens in an embodiment of the invention;
Fig. 6 is a flowchart of calculating the visibility parameter of a target in the aerial reconnaissance system in an embodiment of the invention;
Fig. 7 is a schematic diagram of target and optical lens imaging in an embodiment of the invention;
Fig. 8 is a schematic diagram of the positional relationship between the optical lens and a target in an embodiment of the invention;
Fig. 9 is a schematic diagram of searching for a moving target in an embodiment of the invention.
Embodiment
To solve the high complexity and poor operability of optical lens simulation systems in the prior art, in the embodiments of the invention the visibility parameter of the optical lens with respect to each target, the coordinates of the visible targets, their attribute identification parameters, their measured speeds, and their search times are obtained from the position parameters of the targets and of the optical lens; an optical lens simulation model is built from these quantities; simulation parameters are preset in the model; and the search result of the optical lens for the targets is output. With this technical scheme, the search result can be obtained from the position parameters of the optical lens and the targets alone, so the implementation process is convenient and fast, and operability is good.
The preferred embodiments of the invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, in an embodiment of the invention the optical lens simulation device comprises a first computing unit 10, a second computing unit 11, a third computing unit 12, a fourth computing unit 13, a model building unit 14, and a result output unit 15, wherein:
the first computing unit 10 is configured to calculate the visibility parameter of the optical lens with respect to each target from the projected area of the target on the viewing plane and the distance between the optical lens and the target;
the second computing unit 11 is configured to calculate the coordinates of each visible target from the coordinates of the optical lens and the position parameters between the optical lens and the visible target;
the third computing unit 12 is configured to calculate the attribute identification parameter of each visible target from its projected area on the viewing plane;
the fourth computing unit 13 is configured to calculate the search time of the optical lens for each visible target from the processing time and tracking time for that target, and to calculate the measured speed of the visible target from its actual speed;
the model building unit 14 is configured to build an optical lens simulation model from the visibility parameter, the coordinates of the visible targets, their attribute identification parameters, the search times, and the measured speeds; and
the result output unit 15 is configured to preset simulation parameters in the optical lens simulation model and to output the search result of the optical lens for the visible targets.
Embodiments of the invention cover the simulation of an optical lens in two application scenarios: a ground reconnaissance system and an aerial reconnaissance system. The ground reconnaissance system uses geodetic coordinates, while the aerial reconnaissance system uses its own system coordinate frame; therefore, when the aerial system searches for and localizes targets, the system coordinate frame must be converted to geodetic coordinates.
Based on the above technical scheme, and as shown in Fig. 2, the detailed optical lens simulation process for the two application scenarios is:
Step 200: calculate the visibility parameter of the optical lens with respect to each target from the projected area of the target on the viewing plane and the distance between the optical lens and the target.
When the application scenario of the simulation is the ground reconnaissance system, the visibility parameter of a target is calculated as shown in Fig. 3:
Step a1: calculate the projected area of the target on the viewing plane.
Let the projected area of the target on the viewing plane be S. In the embodiments of the invention, the plane in which the optical lens lies is the viewing plane.
In the ground reconnaissance system, geodetic coordinates are used, and the terrain data (x, y) are arranged approximately on a grid with side length L. Let the coordinates of the reconnaissance system be (X1, Y1, Z1) and the height of the optical lens above the ground be H1, so that the coordinates of the lens in geodetic coordinates are (X1, Y1, Z1 + H1). Let the coordinates of the target be (X2, Y2, Z2) and its attitude angles be (φ, θ, γ), where φ is the heading angle, θ the pitch angle, and γ the roll angle (positive when leaning left). The target is approximated here as a rectangular box with side lengths L1, L2, L3; relative to the optical lens, its front-view area S1, side-view area S2, and top-view area S3 are:
S1 = L2 × L3
S2 = L1 × L3    (1)
S3 = L1 × L2
Fig. 4 shows the optical lens portion of the reconnaissance system, where ε1 is the angle between the horizontal plane and the line from the optical lens to the lowest point of the target, and ε2 is the angle between the horizontal plane and the line from the optical lens to the highest point of the target. From the positional relationship between the optical lens and the target shown in Fig. 4, the coordinates of any point on the sight lines from the lens to the lowest and highest points of the target are:
x_k = X1 + d·cos ψ
y_k = Y1 + d·sin ψ
z_k1 = Z1 + H1 − d·tan ε1    (2)
z_k2 = Z1 + H1 − d·tan ε2
where ψ is the angle between the horizontal projection of the sight line and the x axis, and d is the horizontal distance from the reconnaissance system to the point in question. The angle parameters are obtained from:
tan ψ = (Y2 − Y1) / (X2 − X1)
tan ε1 = (Z1 + H1 − Z2) / sqrt((X2 − X1)² + (Y2 − Y1)²)    (3)
tan ε2 = (Z1 + H1 − (Z2 + L3·cos|θ|·cos|γ|)) / sqrt((X2 − X1)² + (Y2 − Y1)²)
In formula (3), the terrain height at the point (x_k, y_k) is obtained by linear interpolation.
In Fig. 4, the computation from the reconnaissance system (X1, Y1, Z1) to the target (X2, Y2, Z2) can be simplified as follows: visibility is judged at equal horizontal intervals Δd along the sight line, where the value of Δd is chosen in view of the density of the terrain data, the relief of the terrain, and the computational load. Preferably, when the grid side length L is large, Δd = L, so that the terrain data are fully used; when L is small, Δd may be given a larger value according to the actual situation, to reduce the amount of computation without affecting precision.
If some point k on the sight line satisfies z_k ≥ z_k2, the optical lens and the target are not intervisible; that is, the target is not visible to the lens. If every point k on the sight line satisfies z_k < z_k1, the lens and the target are fully intervisible; that is, the target is completely visible. If some points k satisfy z_k < z_k2 while others satisfy z_k ≥ z_k1, the lens and the target are partially intervisible: the target is partly visible to the lens, and the partially effective area must be calculated.
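The three-way intervisibility test above can be sketched as follows (an illustrative Python sketch; the function and variable names are our own, and the terrain interpolation of formula (3) is abstracted into a callable):

```python
import math

def classify_intervisibility(lens_pos, target_base, target_top, terrain_height, dd=1.0):
    """Classify the line of sight between an optical lens and a target over terrain.

    lens_pos:       (X1, Y1, Z1 + H1), lens position in geodetic coordinates
    target_base:    (X2, Y2, Z2), lowest point of the target
    target_top:     highest point of the target
    terrain_height: callable (x, y) -> interpolated ground elevation z_k
    dd:             horizontal sampling interval (the Delta-d of the text)

    Returns "full", "partial", or "none".
    """
    x1, y1, z1 = lens_pos
    dx, dy = target_base[0] - x1, target_base[1] - y1
    d_total = math.hypot(dx, dy)
    psi = math.atan2(dy, dx)                      # azimuth of the sight line
    tan_e1 = (z1 - target_base[2]) / d_total      # depression to the lowest point
    tan_e2 = (z1 - target_top[2]) / d_total       # depression to the highest point

    blocked_low = blocked_high = False
    d = dd
    while d < d_total:
        xk, yk = x1 + d * math.cos(psi), y1 + d * math.sin(psi)
        zk = terrain_height(xk, yk)
        if zk >= z1 - d * tan_e2:                 # terrain above ray to target top
            blocked_high = True
        if zk >= z1 - d * tan_e1:                 # terrain above ray to target bottom
            blocked_low = True
        d += dd
    if blocked_high:
        return "none"
    return "partial" if blocked_low else "full"
```

Flat terrain yields "full"; a ridge rising above the ray to the target's highest point yields "none"; a ridge between the two rays yields "partial".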
Let Ω be the set of points k with z_k ≥ z_k1, and compute the angular altitude ε_i of each point k in Ω with respect to the optical lens. Let:
ε_min = min over i ∈ Ω of {ε_i}    (4)
The terrain at ε_min determines the visibility of the target; that is, the highest obstructing point directly affects target visibility. Let:
hh = L3 − D·(tan ε1 − tan ε_min) / (cos|θ|·cos|γ|)    (5)
where D = sqrt((X2 − X1)² + (Y2 − Y1)² + (Z2 − Z1 − H1)²) is the straight-line distance between the target and the optical lens, and hh is the length of the portion of the target visible to the lens.
From the formulas above, the front-view, side-view, and top-view areas of the portion of the target visible to the optical lens are:
S′1 = L2 × hh
S′2 = L1 × hh    (6)
S′3 = L1 × L2
In particular, let the reconnaissance system be located at the origin of the geodetic coordinates, and let the unit vector from the optical lens to the target be:
I_R = (X2 − X1, Y2 − Y1, Z2 − (Z1 + H1)) / sqrt((X2 − X1)² + (Y2 − Y1)² + (Z2 − Z1 − H1)²)    (7)
(n1, n2, n3) = Lz(φ)·Lx(θ)·Ly(γ)
             = | cos φ  −sin φ  0 |   | 1  0      0      |   | cos γ  0  −sin γ |
               | sin φ   cos φ  0 | · | 0  cos θ  −sin θ | · | 0      1  0      |
               | 0       0      1 |   | 0  sin θ   cos θ |   | sin γ  0   cos γ |
where n1, n2, n3 are the normal vectors of the corresponding faces of the target. The projected area of the target on the viewing plane is then:
S = |I_R · n1|·S′1 + |I_R · n2|·S′2 + |I_R · n3|·S′3    (8)
In the embodiments of the invention, the larger the projected area of the target on the viewing plane, the higher the visibility of the target.
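The projected-area computation of Eqs. (7)-(8) can be sketched in Python with NumPy (an illustrative sketch under the box-target assumption; function and parameter names are our own, and `faces` stands for the visible-portion areas S′1, S′2, S′3 of Eq. (6)):

```python
import numpy as np

def projected_area(lens_pos, target_pos, attitude, faces):
    """Projected area of a box-shaped target on the viewing plane, per Eq. (8).

    attitude = (phi, theta, gamma): heading, pitch, roll of the target.
    faces = (S1, S2, S3): front-, side-, and top-view face areas.
    The face normals are the columns of Lz(phi) @ Lx(theta) @ Ly(gamma).
    """
    phi, theta, gamma = attitude
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Lz = np.array([[cf, -sf, 0], [sf, cf, 0], [0, 0, 1]])
    Lx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    Ly = np.array([[cg, 0, -sg], [0, 1, 0], [sg, 0, cg]])
    normals = Lz @ Lx @ Ly                      # columns are n1, n2, n3

    ir = np.asarray(target_pos, float) - np.asarray(lens_pos, float)
    ir /= np.linalg.norm(ir)                    # unit sight-line vector, Eq. (7)
    return sum(abs(ir @ normals[:, i]) * faces[i] for i in range(3))
```

With zero attitude and the target straight ahead along the x axis, only the front face contributes, so S equals the front-view area.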
Step a2: measure the distance between the target and the optical lens.
In the embodiments of the invention, the distance between the target and the optical lens is the straight-line distance. Because the target and the lens are far apart, both can be approximated as points, and the straight-line distance measured between the two points is denoted D.
Step a3: obtain the visibility parameter of the target from its projected area on the viewing plane and the distance between the target and the optical lens.
Specifically:
While the optical lens is searching for and localizing targets, the position corresponding to the crosshair at the center of the lens is taken as the current search position on the display screen of the reconnaissance system; that is, the position of the lens crosshair is the localization position during the search for the target. As shown in Fig. 5, the search range in azimuth is the sector between the maximum and minimum angles of the crosshair with respect to the abscissa, and the search range in distance is the interval between the maximum and minimum distances of the crosshair from the origin. Let the azimuth search range of the crosshair be (θ1, θ2) and the distance search range be (0, D1); in Fig. 5 the crosshair sweeps in azimuth from θ1 to θ2, then back from θ2 to θ1, and so on cyclically. The angular search rate of the crosshair is ω and the field angle is α, where the field angle is the angle covered when the optical lens is at rest.
Let α1 = α/2, and count each pass of the crosshair in azimuth, from θ1 to θ2 or from θ2 to θ1, as one search. For convenience of calculation, the change in target azimuth within one search is ignored; the dwell time of the target in the lens field of view during one search is then:
τ = α/ω,                 β ∈ [θ1 + α1, θ2 − α1]
τ = (β − θ1 + α1)/ω,     β ∈ (θ1 − α1, θ1 + α1)    (9)
τ = (θ2 − β + α1)/ω,     β ∈ (θ2 − α1, θ2 + α1)
where β is the azimuth of the target with respect to the optical lens. To simplify the computation, the target is treated as a point target in formula (9). The three values of τ correspond to the three ranges of β shown above.
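The piecewise dwell time of Eq. (9) can be sketched as follows (an illustrative Python sketch with our own function name; a β outside all three ranges is taken to give zero dwell time):

```python
def dwell_time(beta, theta1, theta2, alpha, omega):
    """Dwell time of the target in the lens field of view during one sweep, Eq. (9).

    beta: target azimuth; (theta1, theta2): azimuth search limits;
    alpha: field angle; omega: angular search rate.
    Returns 0.0 when the target lies outside the swept sector.
    """
    a1 = alpha / 2.0
    if theta1 + a1 <= beta <= theta2 - a1:      # fully inside the sweep
        return alpha / omega
    if theta1 - a1 < beta < theta1 + a1:        # clipped at the theta1 edge
        return (beta - theta1 + a1) / omega
    if theta2 - a1 < beta < theta2 + a1:        # clipped at the theta2 edge
        return (theta2 - beta + a1) / omega
    return 0.0
```

For example, with (θ1, θ2) = (0, 1) rad, α = 0.2 rad, and ω = 0.1 rad/s, a target at β = 0.5 stays in view for α/ω = 2 s, while a target at β = 0.05 near the θ1 edge stays in view for only 1.5 s.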
Let SUMT be the total time the crosshair spends tracking, identifying, and extracting a found target, and let the crosshair start searching at time T; the azimuth of the crosshair at any time t can then be derived. Let the sweep period of the crosshair be T_c = (θ2 − θ1)/ω and the sweep count be TT = ⌊(t − T)/T_c⌋, where ⌊·⌋ denotes rounding down. From T and TT, the azimuth of the crosshair at time t is:
θ(t) = θ1 + ω·(t − T − TT·T_c)  when TT is even
θ(t) = θ2 − ω·(t − T − TT·T_c)  when TT is odd
that is, when TT is even the optical lens is sweeping from point A toward point B, and when TT is odd it is sweeping from point B toward point A.
Let the simulation step during the search be ΔT, let the number of targets be N, and let target i have coordinates (x_i, y_i, z_i) in geodetic coordinates, where the subscript i indexes the targets. As shown in Fig. 5, suppose the crosshair sweeps from point A to point B, reaching A at time T and B at time T + ΔT with angular search rate ω, while the target moves from C to D within this interval. Obtain the horizontal distance d of the target from the optical lens; when d ∈ (0, D1), compute the azimuth difference θ1 between the crosshair and the target at time T, and then the azimuth difference θ2 at time T + ΔT. If the target azimuth lies between the two crosshair azimuths, i.e. θ1 and θ2 differ in sign, the crosshair and the target may meet. Let the angular velocity of the target be V_m and the meeting time be t; t is obtained from the crossing of the crosshair azimuth and the target azimuth within the step. If t < T + ΔT, the target and the crosshair meet at time t; otherwise they do not meet. In this process, if any one of the conditions is not satisfied, the crosshair cannot meet the target.
During the search, the dwell time of the target in the lens field of view is τ, the distance between the optical lens and the target is D, the projected area of the target on the viewing plane is S, and the camouflage degree of the target is m, m ∈ [0, 1]. The visibility parameter of the target can then be expressed as its detection probability:
p = 1 − e^(−K·(1 − m)·S·τ/D²)    (11)
where K is a constant. A random number r uniformly distributed on (0, 1) is drawn; if r < p, the target is visible; otherwise the target is invisible to the optical lens.
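The detection-probability model of Eq. (11) and the random visibility draw can be sketched as follows (an illustrative Python sketch with our own function names; K = 1 is an assumed default, since the patent only states that K is a constant):

```python
import math
import random

def detection_probability(S, tau, D, m, K=1.0):
    """Detection probability of Eq. (11): p = 1 - exp(-K*(1-m)*S*tau/D**2).

    S: projected area; tau: dwell time; D: lens-target distance;
    m: camouflage degree in [0, 1]; K: model constant (assumed here).
    """
    return 1.0 - math.exp(-K * (1.0 - m) * S * tau / D ** 2)

def is_visible(S, tau, D, m, K=1.0, rng=random):
    """Draw r uniform on [0, 1); the target counts as visible when r < p."""
    return rng.random() < detection_probability(S, tau, D, m, K)
```

A fully camouflaged target (m = 1) has p = 0 and is never declared visible, while p grows with projected area and dwell time and shrinks with distance squared.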
When the application scenario of the simulation is the aerial reconnaissance system, the visibility parameter of a target is calculated as shown in Fig. 6:
Step b1: convert geodetic coordinates to the reconnaissance-system frame.
In the aerial reconnaissance system the observation angle of the optical lens is large, so for convenience of calculation the lens is treated as stationary. When the aerial system searches for and localizes targets, geodetic coordinates must be converted to the system frame. The geodetic frame (the n frame) is O0-x0y0z0 (east-north-up). The system frame (the b frame) O-x_b y_b z_b takes the optical center of the lens as its origin; the optical axis of the lens is the y axis (pointing forward); the x axis is the direction perpendicular to the y axis in the plane perpendicular to the horizontal through the lens plane; and the z axis is perpendicular to the plane formed by the x and y axes. The conversion between the geodetic frame and the system frame is then:
Let a point q have coordinates (x1, y1, z1) in the geodetic frame, let the reconnaissance system be located at (x0, y0, z0), and let the attitude angles be (φ, θ, γ). The coordinates of the point (x1, y1, z1) in the system frame are:
[x_b; y_b; z_b] = C_n^b · [x1 − x0; y1 − y0; z1 − z0]    (12)
where C_n^b = L_y(γ)·L_x(θ)·L_z(φ) is the transformation matrix, with:
L_y(γ) = |  cos γ  0  sin γ |
         |  0      1  0     |
         | −sin γ  0  cos γ |
L_x(θ) = | 1  0       0     |
         | 0  cos θ   sin θ |
         | 0  −sin θ  cos θ |
L_z(φ) = |  cos φ  sin φ  0 |
         | −sin φ  cos φ  0 |
         |  0      0      1 |
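The frame conversion of Eq. (12) can be sketched as follows (an illustrative Python sketch assuming the L_y(γ)·L_x(θ)·L_z(φ) factor order given above; the function name is our own):

```python
import numpy as np

def earth_to_system(point, origin, attitude):
    """Convert a geodetic-frame point into the reconnaissance-system frame, Eq. (12).

    point:    (x1, y1, z1) in the geodetic frame
    origin:   (x0, y0, z0) position of the reconnaissance system
    attitude: (phi, theta, gamma) attitude angles
    """
    phi, theta, gamma = attitude
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Lz = np.array([[cf, sf, 0], [-sf, cf, 0], [0, 0, 1]])
    Lx = np.array([[1, 0, 0], [0, ct, st], [0, -st, ct]])
    Ly = np.array([[cg, 0, sg], [0, 1, 0], [-sg, 0, cg]])
    Cnb = Ly @ Lx @ Lz                      # C_n^b of Eq. (12)
    return Cnb @ (np.asarray(point, float) - np.asarray(origin, float))
```

With zero attitude the transform reduces to a translation; a 90-degree heading rotates the offset vector in the horizontal plane.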
Step b2: calculate the visibility parameter of the target.
In actual simulation with the aerial reconnaissance system, the imaging process itself is not simulated and no simulated image is produced; instead, the imaged region is computed from the position and attitude of the imaging device, and a target whose actual position lies inside that region is deemed to be a target on the image.
Given the geodetic coordinates (x0, y0, z0) of the reconnaissance system, its attitude angles (φ, θ, γ), its pitch field angle α_y and azimuth field angle α_x, and the geodetic coordinates (X_m, Y_m, Z_m) of the target, the coordinates of the target in the system frame are:
[x_b; y_b; z_b] = C_n^b · [X_m − x0; Y_m − y0; Z_m − z0]    (13)
From these coordinates, the pitch angle α_my and azimuth angle α_mx of the target in the system frame are:
α_my = arctan(z_b / y_b),  α_mx = arctan(x_b / y_b)
When the conditions α_mx ∈ (−α_x/2, α_x/2), α_my ∈ (−α_y/2, α_y/2), and y_b > 0 are satisfied simultaneously, the target is inside the reconnaissance zone; otherwise it is not.
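The field-of-view membership test above can be sketched as follows (an illustrative Python sketch with our own function name, taking system-frame coordinates as produced by Eq. (13); atan2 is used so the y_b > 0 check is explicit):

```python
import math

def in_reconnaissance_zone(xb, yb, zb, alpha_x, alpha_y):
    """Test whether a system-frame point lies inside the reconnaissance zone:
    the point must be in front of the lens (yb > 0) and its azimuth and pitch
    view angles must fall within half the azimuth/pitch field angles."""
    if yb <= 0:
        return False
    a_mx = math.atan2(xb, yb)          # azimuth view angle alpha_mx
    a_my = math.atan2(zb, yb)          # pitch view angle alpha_my
    return abs(a_mx) < alpha_x / 2 and abs(a_my) < alpha_y / 2
```

A point on the optical axis is always inside the zone; a point behind the lens or far off-axis is rejected.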
Let the resolution of the reconnaissance system be M × N pixels and the dimensions of the target be L, K, H. The distance R between the target and the reconnaissance system is then:
R = sqrt(X_m² + Y_m² + Z_m²)
In the embodiments of the invention, the projected area on the viewing plane could be computed exactly from the attitude angles of the target, but the computation is complicated; after simplification, the front-view, side-view, and top-view areas of the target are S1, S2, S3. Let:
R⃗ = (X_m, Y_m, Z_m)
where (X_m, Y_m, Z_m) are the coordinates of the target in geodetic coordinates; R⃗ is the vector whose magnitude is the distance between the target and the reconnaissance system carrying the optical lens and whose direction is the line from the system to the target. Further, let:
I_R = R⃗ / R
n1 = (cos θ_m·cos φ_m, cos θ_m·sin φ_m, sin θ_m)ᵀ = (v_x, v_y, v_z)ᵀ
n2′ = (−v_y, v_x, 0)ᵀ / sqrt(v_x² + v_y²)
n3′ = n1 × n2′ = (−v_x·v_z, −v_y·v_z, v_x² + v_y²)ᵀ / sqrt(v_x² + v_y²)
where n2′ and n3′ are intermediate conversion variables, and:
n2 = cos γ_m·n2′ + sin γ_m·n3′
n3 = −sin γ_m·n2′ + cos γ_m·n3′
where (φ_m, θ_m, γ_m) are the attitude angles of the target in geodetic coordinates. The projected area S of the target on the viewing plane is then:
S = |I_R · n1|·S1 + |I_R · n2|·S2 + |I_R · n3|·S3    (14)
Let the length and width of the display screen that shows the target state in the reconnaissance system be L_x and L_y respectively. Intercept a sectional view, along a generatrix, of the cone formed by the scanning of the optical lens. Referring to Figure 7, taking the horizontal plane as the dividing line, half of this sectional view is shown, in which O is the optical lens, OA is the distance D between the optical lens and the target, AB is the equivalent circle radius r_s obtained when the target is equated to a circle, AG is one half of the maximum radius covered by the search region of the optical lens, EC is one half of the length corresponding to the long edge of the screen, i.e. L_x/2, and ED is the equivalent circle radius r of the image the target presents on the screen. The equivalent circle radius of the target can be obtained from S as:

r_s = √(S/π)

Thus the equivalent circle radius r on the photograph is:

r = r_s L_x / (2D · tan(0.5 α_y))

Then, in the aerial reconnaissance system, the visibility parameter can be approximated by the number of pixels:

n = (r / L_x) · M

When n ≥ δ_n, the target is visible with respect to the optical lens; otherwise the target is not a visible target. δ_n is a value obtained from experimental data.
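The visibility test above reduces to a few lines. In this sketch M is assumed to be the pixel count along the long screen edge L_x and alpha_y the field-of-view angle, since the patent does not define them explicitly; all names are illustrative:

```python
import math

def is_visible(S, D, Lx, alpha_y, M, delta_n):
    """Sketch of the pixel-count visibility test for the aerial system.

    S: projected area on the viewing plane; D: lens-target distance;
    Lx: long edge of the screen; alpha_y: field-of-view angle (assumed);
    M: pixels along the long edge (assumed); delta_n: empirical threshold.
    Returns (visible?, pixel count n)."""
    r_s = math.sqrt(S / math.pi)                        # equivalent circle radius
    r = r_s * Lx / (2.0 * D * math.tan(0.5 * alpha_y))  # radius on the photograph
    n = (r / Lx) * M                                    # approximate pixel count
    return n >= delta_n, n
```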
In the embodiment of the invention, the following steps are the simulation process carried out on the basis of the target visibility judgment result; therefore, the targets in the following steps are visible targets.

Step 210: calculate the coordinate of the visible target according to the coordinate of the optical lens and the location parameters between the optical lens and the visible target.

In the embodiment of the invention, the location parameters between the optical lens and the visible target include the attitude angle of the target with respect to the optical lens, the azimuth angle and elevation angle of the target with respect to the optical lens, the distance between the optical lens and the target, and so on.
When the application scenario of the optical lens simulation is a ground reconnaissance system, the positioning process for the target is:

Let the coordinate of the reconnaissance system be (X_1, Y_1, Z_1), the height of the optical lens above the ground be H_1, and the coordinate of the target be (X_2, Y_2, Z_2). The coordinate of the optical lens is corrected according to the first error of the optical lens. This first error is a value determined comprehensively from the mean and the median error of the positioning error of the reconnaissance system and the conversion coefficient between the median error and the mean square deviation. The mean of the positioning error of the reconnaissance system is 0, the median error is σ_d, and the conversion coefficient ρ between the median error and the mean square deviation has the value 1.4826. Then the position of the reconnaissance system is:

X = X_1 + ρ·σ_d·gauss
Y = Y_1 + ρ·σ_d·gauss        (15)
Z = F(X, Y)

where Z is obtained from known elevation data according to X and Y, and gauss is a sample value of white Gaussian noise with mean 0 and mean square deviation 1. The output coordinate of the optical lens is:

X_g = X
Y_g = Y        (16)
Z_g = Z + H_1
Suppose the range error, bearing measurement error and height measurement error of the reconnaissance system with respect to the target obey normal distributions with means m_1, m_2, m_3 and mean square deviations σ_1, σ_2, σ_3 respectively. Then the measured values of the distance D, azimuth angle β and elevation angle ε are:

D = √((X_2 − X_1)² + (Y_2 − Y_1)² + (Z_2 − Z_1 − H_1)²) + m_1 + σ_1·gauss

β = tan⁻¹((Y_2 − Y_1) / (X_2 − X_1)) + m_2 + σ_2·gauss

ε = tan⁻¹((Z_2 − Z_1 − H_1) / √((X_2 − X_1)² + (Y_2 − Y_1)²)) + m_3 + σ_3·gauss

From X_g, Y_g, Z_g, D, β and ε, the scouted position of the target is obtained as:

X_m = X_g + D·cos ε·cos β
Y_m = Y_g + D·cos ε·sin β
Z_m = Z_g + D·sin ε
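The ground positioning chain — Eqs. (15) and (16), the noisy D/β/ε measurements, and the scouted position — can be sketched as follows. The flat `terrain` default stands in for the elevation function F(X, Y), and all names are illustrative, not from the patent:

```python
import math
import random

def ground_position(recon, H1, target, errs, terrain=lambda x, y: 0.0):
    """Sketch of the ground reconnaissance positioning process.

    recon = (X1, Y1, Z1); target = (X2, Y2, Z2);
    errs = (sigma_d, m1, s1, m2, s2, m3, s3): median positioning error,
    then mean/std of the range, bearing and height measurement errors."""
    X1, Y1, Z1 = recon
    X2, Y2, Z2 = target
    sigma_d, m1, s1, m2, s2, m3, s3 = errs
    rho = 1.4826                          # median-error to std conversion
    g = lambda: random.gauss(0, 1)
    # Eq. (15): perturbed reconnaissance-system position
    X = X1 + rho * sigma_d * g()
    Y = Y1 + rho * sigma_d * g()
    Z = terrain(X, Y)
    # Eq. (16): output coordinates of the optical lens
    Xg, Yg, Zg = X, Y, Z + H1
    # measured range D, azimuth beta and elevation eps with normal errors
    dx, dy, dz = X2 - X1, Y2 - Y1, Z2 - Z1 - H1
    D = math.sqrt(dx * dx + dy * dy + dz * dz) + m1 + s1 * g()
    beta = math.atan2(dy, dx) + m2 + s2 * g()
    eps = math.atan2(dz, math.hypot(dx, dy)) + m3 + s3 * g()
    # scouted target position
    return (Xg + D * math.cos(eps) * math.cos(beta),
            Yg + D * math.cos(eps) * math.sin(beta),
            Zg + D * math.sin(eps))
```

With all error parameters set to zero the chain recovers the true target position exactly, which is a convenient sanity check.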
With the above technical scheme, error values are introduced into the positioning process for the target, which effectively improves the positioning accuracy of the reconnaissance system with respect to the target.
When the application scenario of the optical lens simulation is an aerial reconnaissance system, the positioning process for the target is:

From the coordinate of the target under the reconnaissance system coordinate system and the attitude angle of the reconnaissance system, the coordinate of the target under the earth coordinate system can be obtained. The above attitude angle is corrected according to the second error value of the optical lens. This second error is a value determined comprehensively from the mean and median error of the positioning error of the reconnaissance system, the mean and mean square deviation of the height measurement error of the altimeter (arranged in the reconnaissance system), and the conversion coefficient between the median error and the mean square deviation. The mean of the positioning error of the reconnaissance system is 0 and its median error is σ_d; the mean of the altimeter height measurement error is 0 and its mean square deviation is σ_z; ρ is the conversion coefficient between the median error and the mean square deviation, with value 1.4826.
The coordinate of the target under the reconnaissance system coordinate system can be obtained from formula (13). Suppose the errors of the attitude angles output by the reconnaissance system obey white Gaussian noise with mean 0 and mean square deviations σ_φ, σ_θ, σ_γ. Then, in the aerial reconnaissance system, the actual attitude angles of the target satisfy:

φ_u = φ + σ_φ·gauss
θ_u = θ + σ_θ·gauss
γ_u = γ + σ_γ·gauss
Thereby the coordinate of the target under the earth coordinate system is obtained as:

(x_m, y_m, z_m)^T = C_b^n · (x_b, y_b, z_b)^T + (x_d′, y_d′, z_d′)^T

where C_b^n is the coordinate transformation matrix determined by the corrected attitude angles (φ_u, θ_u, γ_u). Then, in the aerial reconnaissance system, the coordinate offset (x_d′, y_d′, z_d′) is:

(x_d′, y_d′, z_d′)^T = (x_0 + ρ·σ_d·gauss, y_0 + ρ·σ_d·gauss, z_0 + σ_z·gauss)^T        (17)
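A sketch of the aerial positioning step and Eq. (17). The patent does not spell out C_b^n, so a standard Z-Y-X Euler direction-cosine matrix is assumed here; the function and parameter names are likewise illustrative:

```python
import math
import random

def aerial_target_position(body_xyz, att, base_xyz, errs, rng=random):
    """Sketch: perturb the attitude angles with Gaussian noise, build a
    transformation matrix from them, and add the perturbed base position
    (x_0, y_0, z_0) as in Eq. (17).  C_b^n is ASSUMED to be a Z-Y-X Euler
    direction-cosine matrix; the patent leaves it unspecified."""
    phi, theta, gamma = att
    s_phi, s_theta, s_gamma, sigma_d, sigma_z = errs
    rho = 1.4826
    g = lambda: rng.gauss(0, 1)
    # corrected attitude angles (phi_u, theta_u, gamma_u)
    p = phi + s_phi * g()
    t = theta + s_theta * g()
    r = gamma + s_gamma * g()
    cp, sp = math.cos(p), math.sin(p)
    ct, st = math.cos(t), math.sin(t)
    cr, sr = math.cos(r), math.sin(r)
    # assumed C_b^n (Z-Y-X Euler convention)
    C = [[ct * cp, sr * st * cp - cr * sp, cr * st * cp + sr * sp],
         [ct * sp, sr * st * sp + cr * cp, cr * st * sp - sr * cp],
         [-st, sr * ct, cr * ct]]
    # Eq. (17): perturbed base position (x_d', y_d', z_d')
    x0, y0, z0 = base_xyz
    xd = (x0 + rho * sigma_d * g(), y0 + rho * sigma_d * g(), z0 + sigma_z * g())
    return tuple(sum(C[i][j] * body_xyz[j] for j in range(3)) + xd[i]
                 for i in range(3))
```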
The above process is the calculation of the target coordinate values when the target is regarded as a three-dimensional object with length, width and height. In particular, in the aerial reconnaissance system, since the distance between the optical lens and the target is relatively large, for convenience of calculation the target can be regarded as an area target or a point target. The detailed process is:

First, suppose the extraction errors obey normal distributions with mean 0 and mean square deviations σ_D1, σ_L, σ_L1 and σ_β respectively. The coordinate of a point target after extraction is:

x′ = x_m + σ_D1·gauss
y′ = y_m + σ_D1·gauss

Let l_0, l_1 and β be the true values of the long edge and the short edge of a certain area target, and of the target's azimuth with respect to the optical lens. The values after extraction are respectively:

l′_0 = l_0 + σ_L·gauss
l′_1 = l_1 + σ_L1·gauss        (18)
β′ = β + σ_β·gauss
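The extraction step of Eq. (18) amounts to adding independent zero-mean Gaussian errors to the true values; a minimal sketch, with illustrative names:

```python
import random

def extract_area_target(l0, l1, beta, sigmas, rng=random):
    """Sketch of Eq. (18): draw the observed long edge, short edge and
    azimuth of an area target from the true values, assuming independent
    zero-mean normal extraction errors (sigma_L, sigma_L1, sigma_beta)."""
    sigma_L, sigma_L1, sigma_beta = sigmas
    return (l0 + sigma_L * rng.gauss(0, 1),
            l1 + sigma_L1 * rng.gauss(0, 1),
            beta + sigma_beta * rng.gauss(0, 1))
```

The point-target case is identical in shape, with σ_D1 applied to each coordinate.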
In the above process, values with higher precision are obtained for the extracted coordinate values and azimuth of the area targets and point targets under the aerial reconnaissance system, which effectively improves the precision and reliability of the reconnaissance system.
Step 220: calculate the attribute identification parameter of the visible target according to the projected area of the visible target on the viewing plane.

The attribute identification of a target includes friend-or-foe attribute identification, e.g. whether the target is an enemy target or one of our own targets; and target type identification, e.g. whether the target type is a vehicle, a person, and so on.

When the application scenario of the optical lens simulation is a ground reconnaissance system, the procedure for calculating the attribute identification parameter of the target is:

The projected area of the target on the viewing plane is known to be S; then the equivalent projection radius R_1 of the target on the optical lens is:

R_1 = √(S/π)        (19)

Referring to Figure 8, let O be the observation point and CG be the optical lens with radius r; the distance between the observation point O and the optical lens is D′, and the straight-line distance between the optical lens and the target is D; EB is the equivalent circle radius R_1 of the target, and FH is the equivalent radius R′ of the target behind the optical lens, calculated as:

R′ = R_1·D′ / (D + D′)        (20)

In the embodiment of the invention, the identification probability P of the scouting personnel for the target type and the friend-or-foe attribute is preset as:

P = R′ / r        (21)
In the embodiment of the invention, the decision process for the target type and target attribute adopts the method of drawing a uniform random number on the interval [0, 1], comparing the random number with the above probability, and deriving the target type and target attribute from the comparison result. With this technical scheme, the target type and target attribute can be obtained conveniently, quickly and accurately. The concrete method is:

Draw a uniform random number P′_1 on the interval [0, 1]. If P′_1 ≤ P, take the true type of the target as the scouted target type; otherwise, take the target type closest in shape to the target as the scouted target type. Preferably, in the embodiment of the invention, target type parameters and the target type corresponding to each target type parameter are preset in the reconnaissance system, and the target type is obtained directly from the recorded target type parameter.

Draw a uniform random number P′_2 on the interval [0, 1]. If P′_2 ≤ P, take the friend-or-foe attribute of the target as the scouted friend-or-foe attribute; otherwise, take the attribute opposite to the true friend-or-foe attribute of the target as the scouted friend-or-foe attribute.
When the application scenario of the optical lens simulation is an aerial reconnaissance system, the procedure for calculating the attribute identification parameter of the target is:

From repeated experiments, the identification probability P_1 of the scouting personnel for the target type, and the identification probability P_2 for the friend-or-foe attribute, can be obtained. In the aerial reconnaissance system, the same method of drawing a uniform random number on the interval [0, 1] is adopted: the random number is compared with the above probability, and the target type and target attribute are derived from the comparison result. The detailed process is:

Draw a uniform random number P′_1 on the interval [0, 1]. If P′_1 ≤ P_1, take the true type of the target as the scouted target type; otherwise, take the target type closest in shape to the target as the scouted target type.

Draw a uniform random number P′_2 on the interval [0, 1]. If P′_2 ≤ P_2, take the friend-or-foe attribute of the target as the scouted friend-or-foe attribute; otherwise, take the attribute opposite to the true friend-or-foe attribute of the target as the scouted friend-or-foe attribute.
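Both identification procedures reduce to comparing uniform random draws with the preset probabilities. A sketch covering the ground case (a single probability P) and the aerial case (separate P_1 and P_2); `nearest_type` stands in for "the target type closest in shape", and all names are assumptions:

```python
import random

def identify(true_type, nearest_type, true_side, p_type, p_side=None, rng=random):
    """Sketch of the Monte-Carlo identification step: draw uniform random
    numbers on [0, 1] and compare them with the identification
    probabilities.  Ground case: pass one probability P (p_side defaults
    to p_type).  Aerial case: pass P_1 and P_2 separately."""
    if p_side is None:                    # ground case: one probability for both
        p_side = p_type
    observed_type = true_type if rng.random() <= p_type else nearest_type
    flipped = {'friend': 'foe', 'foe': 'friend'}
    observed_side = true_side if rng.random() <= p_side else flipped[true_side]
    return observed_type, observed_side
```

With P = 1 the true type and attribute are always reported; with P near 0 the nearest-shaped type and the opposite attribute are reported almost surely.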
Step 230: calculate the search time of the optical lens for the target according to the processing time and tracking time for the visible target; and calculate the measured speed of the target according to the actual speed of the visible target.

Suppose the search range of the optical lens contains M static targets and N moving targets, the processing time for each static target is t_1i, the processing time for each moving target is t_2i, and the tracking time for each moving target is t_3i. Referring to Figure 9, the central cross line of the optical lens searches counterclockwise from point A to point B with search angular speed ω; a moving target moves from point C to point D and meets the central cross line of the optical lens at point E, the meeting moment being t; after the meeting, the central cross line of the optical lens tracks the target to point D, and upon reaching point D the speed of the moving target is estimated and other relevant attributes of the target are extracted. Then the search time of the optical lens for the targets is:

SUMT = Σ_{i=1}^{M} t_1i + Σ_{i=1}^{N} t_2i + Σ_{i=1}^{N} t_3i        (22)

Let the true velocity of the target be (v_x, v_y, v_z), and suppose the measurement error of the speed obeys a normal distribution with mean m_v and mean square deviation σ_v. Then the measured speed of the target is obtained as:

v_mx = v_x + m_v + σ_v·gauss
v_my = v_y + m_v + σ_v·gauss        (23)
v_mz = v_z + m_v + σ_v·gauss
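Eqs. (22) and (23) can be sketched directly; the function names are illustrative:

```python
import random

def search_time(t1, t2, t3):
    """Sketch of Eq. (22): total search time over M static targets
    (processing times t1) and N moving targets (processing times t2,
    tracking times t3)."""
    return sum(t1) + sum(t2) + sum(t3)

def measured_speed(v, m_v, sigma_v, rng=random):
    """Sketch of Eq. (23): measured velocity components obtained from the
    true velocity plus a normal measurement error (mean m_v, std sigma_v)."""
    return tuple(vi + m_v + sigma_v * rng.gauss(0, 1) for vi in v)
```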
In the embodiment of the invention, step 230 may also be placed after step 200 or after step 210; the embodiment of the invention is not limited to calculating the search time and the measured speed of the target only after target positioning and target attribute and type identification have been completed.

Step 240: establish the optical lens simulation model according to the visibility parameter of the optical lens for the visible target, the coordinate of the visible target, the attribute identification parameter of the visible target, the search time of the optical lens for the visible target, and the measured speed of the visible target.

In the embodiment of the invention, the optical lens simulation model can be established from the visibility parameter of the optical lens for the visible target, the coordinate of the visible target, the attribute identification parameter of the visible target, the search time of the optical lens for the visible target, and the measured speed of the visible target. Error values are introduced in the modeling process, and the relevant parameters of the target are updated according to the error values, which effectively improves the precision of the reconnaissance system.

Step 250: preset the optical lens simulation parameters in the optical lens simulation model, and output the search result of the optical lens for the visible target.

In the embodiment of the invention, by inputting into the optical lens simulation model the coordinate of the optical lens, the coordinate of the target, the shape parameters of the target, the azimuth angle of the target with respect to the optical lens, the distance between the optical lens and the target, the attitude angle of the target with respect to the optical lens, and the radius of the optical lens, the search result of the optical lens for the visible target can be output.

In the embodiment of the invention, the visibility parameter of the optical lens for the target is calculated according to the projected area of the target on the viewing plane and the distance between the optical lens and the target; visible targets are obtained from the above targets; the coordinate of the visible target is calculated according to the coordinate of the optical lens and the location parameters between the optical lens and the visible target; the attribute identification parameter of the visible target is calculated according to the projected area of the visible target on the viewing plane; the search time of the optical lens for the visible target is calculated according to the processing time and tracking time for the visible target; the measured speed of the visible target is calculated according to the actual speed of the visible target; the optical lens simulation model is established according to these parameters; and the optical lens simulation parameters are preset in the optical lens simulation model, after which the search result of the optical lens for the visible target is output. With the technical scheme of the invention, the search result of the optical lens for the target can be obtained based on the location parameter of the optical lens and the location parameter of the target; the implementation process is convenient and fast, and the operability is good.
Those skilled in the art should understand that the embodiments of the invention may be provided as a method, a system or a computer program product. Therefore, the invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk memory, CD-ROM, optical memory, etc.) containing computer-usable program code.

The invention is described with reference to flowcharts and/or block diagrams of the method, the device (system) and the computer program product according to the embodiments of the invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the invention have been described, those skilled in the art can make further changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the invention.

Obviously, those skilled in the art can make various changes and modifications to the embodiments of the invention without departing from the spirit and scope of the embodiments of the invention. Thus, if these modifications and variations of the embodiments of the invention fall within the scope of the claims of the invention and their equivalent technologies, the invention is also intended to include these changes and modifications.

Claims (16)

1. An optical lens simulation method, characterized by comprising:
calculating the visibility parameter of the optical lens for targets according to the projected area of the targets on the viewing plane and the distance between the optical lens and the targets, and obtaining visible targets from said targets;
calculating the coordinate of said visible target according to the coordinate of the optical lens and the location parameters between the optical lens and said visible target;
calculating the attribute identification parameter of said visible target according to the projected area of said visible target on the viewing plane;
calculating the search time of the optical lens for said visible target according to the processing time and tracking time for said visible target, and calculating the measured speed of the visible target according to the actual speed of said visible target;
establishing an optical lens simulation model according to the visibility parameter of said optical lens for the target, the coordinate of said visible target, the attribute identification parameter of said visible target, the search time of said optical lens for the visible target, and the measured speed of said visible target;
presetting optical lens simulation parameters in said optical lens simulation model, and outputting the search result of the optical lens for the target.
2. the method for claim 1 is characterized in that, further comprises:
Described optical frames realistic model is the ground Reconnaissance system; Perhaps,
Described optical frames realistic model is the aerial reconnaissance system.
3. The method of claim 1 or 2, characterized in that the projected area of the target on the viewing plane is determined according to the following steps:
when the optical lens simulation model is a ground reconnaissance system, calculating the straight-line distance between the target and the optical lens according to the coordinate of the optical lens and the coordinate of the target; calculating the visible height value of the target with respect to the optical lens according to said straight-line distance and the location parameters between the optical lens and the target; calculating the front-view area, side-view area and top-view area of the visible portion of the target according to said visible height value and the length, width and height of the target; calculating the normal vectors corresponding to the target through matrix transformation according to the coordinate of the target and the coordinate of the optical lens; and determining the projected area of the target on the viewing plane according to the front-view area, side-view area and top-view area of the visible portion of the target and the normal vectors corresponding to the target;
when the optical lens simulation model is an aerial reconnaissance system, calculating the front-view area, side-view area and top-view area of the target according to the dimensions of the target; transforming the coordinate of the target and the coordinate of the optical lens into coordinates under the coordinate system corresponding to the aerial reconnaissance system according to the coordinate system conversion coefficients; calculating the normal vectors corresponding to the target through matrix transformation according to the transformed coordinate of the target and coordinate of the optical lens; and determining the projected area of the target on the viewing plane according to the normal vectors corresponding to the target, the front-view area, side-view area and top-view area of the target, and the attitude angle of the target with respect to the optical lens.
4. the method for claim 1 is characterized in that, when the optical frames realistic model was the aerial reconnaissance system, the calculating optical mirror comprised the visibility parameter of target:
According to the projected area of target on plane of vision, obtain the radius of equivalent circle of target projected area on plane of vision;
According to described radius of equivalent circle, determine optical frames to the visibility parameter of target, wherein, described visibility parameter and radius of equivalent circle are positive correlation.
5. the method for claim 1 is characterized in that, according to the coordinate of optical frames, the location parameter between optical frames and the described visible object calculates the coordinate of described visible object, comprising:
When the optical frames realistic model is the ground reconnaissance system, according to the first error amount of optical frames, the coordinate of optical frames is revised; According to described revised coordinate, the distance between optical frames and the described visible object, and target is calculated the coordinate of described visible object with respect to the position angle of optical frames;
When the optical frames realistic model is the aerial reconnaissance system,, with respect to the attitude angle of optical frames and the second error amount of optical frames described attitude angle is revised according to target; According to revised attitude angle, and the coordinate of optical frames, calculate the coordinate of described visible object.
6. the method for claim 1 is characterized in that, according to the projected area of described visible object on plane of vision, calculates the Attribute Recognition parameter of described visible object, comprising:
According to the projected area of described visible object on plane of vision, obtain the equivalent redius of target behind optical frames;
According to described equivalent redius, the distance between optical frames and the target, and the radius of optical frames are obtained the Attribute Recognition parameter of described visible object.
7. The method of claim 1 or 6, characterized in that, when the optical lens simulation model is an aerial reconnaissance system, after calculating the coordinate of the visible target and before calculating the attribute identification parameter of the visible target, the method further comprises:
extracting error values according to the coordinate of said visible target, the shape parameters of the target, and the location parameters of the target and the optical lens;
updating the coordinate of said target according to said error values.
8. the method for claim 1 is characterized in that, according to the actual speed of described visible object, calculates the measuring speed of visible object, comprising:
According to the actual speed of described visible object, and the error amount of described visible object speed, obtain the measuring speed of visible object.
9. An optical lens simulation device, characterized by comprising:
a first computing unit, configured to calculate the visibility parameter of the optical lens for targets according to the projected area of the targets on the viewing plane and the distance between the optical lens and the targets, and to obtain visible targets from said targets;
a second computing unit, configured to calculate the coordinate of said visible target according to the coordinate of the optical lens and the location parameters between the optical lens and said visible target;
a third computing unit, configured to calculate the attribute identification parameter of said visible target according to the projected area of said visible target on the viewing plane;
a fourth computing unit, configured to calculate the search time of the optical lens for said visible target according to the processing time and tracking time for said visible target, and to calculate the measured speed of the visible target according to the actual speed of said visible target;
a model establishing unit, configured to establish an optical lens simulation model according to the visibility parameter of said optical lens for the target, the coordinate of said visible target, the attribute identification parameter of said visible target, the search time of said optical lens for the visible target, and the measured speed of said visible target;
an output unit, configured to preset optical lens simulation parameters in said optical lens simulation model and to output the search result of the optical lens for the target.
10. The device of claim 9, characterized by further comprising:
said optical lens simulation model is a ground reconnaissance system; or,
said optical lens simulation model is an aerial reconnaissance system.
11. The device of claim 9 or 10, characterized in that said first computing unit is specifically configured to:
when the optical lens simulation model is a ground reconnaissance system, calculate the straight-line distance between the target and the optical lens according to the coordinate of the optical lens and the coordinate of the target; calculate the visible height value of the target with respect to the optical lens according to said straight-line distance and the location parameters between the optical lens and the target; calculate the front-view area, side-view area and top-view area of the visible portion of the target according to said visible height value and the length, width and height of the target; calculate the normal vectors corresponding to the target through matrix transformation according to the coordinate of the target and the coordinate of the optical lens; and determine the projected area of the target on the viewing plane according to the front-view area, side-view area and top-view area of the visible portion of the target and the normal vectors corresponding to the target;
when the optical lens simulation model is an aerial reconnaissance system, calculate the front-view area, side-view area and top-view area of the target according to the dimensions of the target; transform the coordinate of the target and the coordinate of the optical lens into coordinates under the coordinate system corresponding to the aerial reconnaissance system according to the coordinate system conversion coefficients; calculate the normal vectors corresponding to the target through matrix transformation according to the transformed coordinate of the target and coordinate of the optical lens; and determine the projected area of the target on the viewing plane according to the normal vectors corresponding to the target, the front-view area, side-view area and top-view area of the target, and the attitude angle of the target with respect to the optical lens.
12. The device of claim 11, characterized in that said first computing unit is further configured to:
obtain the equivalent circle radius of the projected area of the target on the viewing plane according to the projected area of the target on the viewing plane;
determine the visibility parameter of the optical lens for the target according to said equivalent circle radius, wherein said visibility parameter is positively correlated with the equivalent circle radius.
13. The device of claim 9, characterized in that said second computing unit is specifically configured to:
when the optical lens simulation model is a ground reconnaissance system, correct the coordinate of the optical lens according to the first error value of the optical lens; and calculate the coordinate of said visible target according to said corrected coordinate, the distance between the optical lens and said visible target, and the azimuth angle of the target with respect to the optical lens;
when the optical lens simulation model is an aerial reconnaissance system, correct the attitude angle of the target with respect to the optical lens according to said attitude angle and the second error value of the optical lens; and calculate the coordinate of said visible target according to the corrected attitude angle and the coordinate of the optical lens.
14. The device of claim 9, characterized in that said third computing unit is specifically configured to:
obtain the equivalent radius of the target behind the optical lens according to the projected area of said visible target on the viewing plane;
obtain the attribute identification parameter of said visible target according to said equivalent radius, the distance between the optical lens and the target, and the radius of the optical lens.
15. The device of claim 9 or 14, characterized in that said third computing unit is further configured to:
extract error values according to the coordinate of said visible target, the shape parameters of the target, and the location parameters of the target and the optical lens;
update the coordinate of said target according to said error values.
16. The device of claim 9, characterized in that said fourth computing unit is specifically configured to:
obtain the measured speed of the visible target according to the actual speed of said visible target and the error value of the speed of said visible target.
CN201210433954.2A 2012-11-02 2012-11-02 Simulation method and device of optical lens Expired - Fee Related CN102982199B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210433954.2A CN102982199B (en) 2012-11-02 2012-11-02 Simulation method and device of optical lens


Publications (2)

Publication Number Publication Date
CN102982199A true CN102982199A (en) 2013-03-20
CN102982199B CN102982199B (en) 2015-07-15

Family

ID=47856214


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI ZENGLU: "Simulation of Target Search by a Reconnaissance System", Journal of System Simulation *
HU XIAOYUN: "Research on a Simulation Model of the Reconnaissance Function of Visible-Light Reconnaissance Equipment", Command Control & Simulation *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105792532A (en) * 2016-05-06 2016-07-20 Inspur Electronic Information Industry Co., Ltd. Teardrop selection method and PCB
CN105792532B (en) * 2016-05-06 2018-05-08 Inspur Electronic Information Industry Co., Ltd. Teardrop selection method and PCB

Also Published As

Publication number Publication date
CN102982199B (en) 2015-07-15

Similar Documents

Publication Publication Date Title
CN112347840B (en) Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN105469405B (en) Positioning and map constructing method while view-based access control model ranging
RU2487419C1 (en) System for complex processing of information of radio navigation and self-contained navigation equipment for determining real values of aircraft navigation parameters
CN104833354A (en) Multibasic multi-module network integration indoor personnel navigation positioning system and implementation method thereof
CN103983263A (en) Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN112305559A (en) Power transmission line distance measuring method, device and system based on ground fixed-point laser radar scanning and electronic equipment
CN102496181A (en) True-orthophotomap making method oriented to large-scale production
Karpenko et al. Visual navigation of the UAVs on the basis of 3D natural landmarks
CN111862215B (en) Computer equipment positioning method and device, computer equipment and storage medium
Li et al. Indoor multi-sensor fusion positioning based on federated filtering
US11361502B2 (en) Methods and systems for obtaining aerial imagery for use in geospatial surveying
Chen et al. Low cost and efficient 3D indoor mapping using multiple consumer RGB-D cameras
CN102982199A (en) Simulation method and device of optical lens
Deng et al. Entropy flow-aided navigation
Ma et al. Low‐Altitude Photogrammetry and Remote Sensing in UAV for Improving Mapping Accuracy
CN115830116A (en) Robust visual odometer method
CN115578417A (en) Monocular vision inertial odometer method based on feature point depth
RU2406071C1 (en) Method of mobile object navigation
Liu et al. A tightly-coupled method of lidar-inertial based on complementary filtering
Xiaochen et al. Evaluation of Lucas-Kanade based optical flow algorithm
Shang et al. Research on the rapid 3D measurement of satellite antenna reflectors using stereo tracking technique
Gao et al. Fragment Perforation Spatial Localization Measurement Method and Calculation Analysis by Using Photogrammetry Technology
Chatziparaschis et al. Real-time unmanned aerial vehicle surveying using spatial criteria: a simulated study
CN118408553B (en) Unmanned aerial vehicle navigation method for environment three-dimensional reconstruction and recognition
Zhou et al. Object detection and spatial location method for monocular camera based on 3D virtual geographical scene

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150715

Termination date: 20161102