KR100763973B1 - Method for real-time range estimation of unmanned air vehicle - Google Patents

Method for real-time range estimation of unmanned air vehicle

Info

Publication number
KR100763973B1
Authority
KR
South Korea
Prior art keywords
relative distance
error
estimate
calculated
equation
Prior art date
Application number
KR1020060028891A
Other languages
Korean (ko)
Inventor
황익호
나원상
Original Assignee
국방과학연구소 (Agency for Defense Development)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 국방과학연구소 (Agency for Defense Development)
Priority to KR1020060028891A
Application granted
Publication of KR100763973B1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/26 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with fixed angles and a base of variable length, at, near, or formed by the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0205 Details
    • G01S5/021 Calibration, monitoring or correction
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0205 Details
    • G01S5/0244 Accuracy or reliability of position solution or of measurements contributing thereto
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

A method for estimating the relative distance of an unmanned air vehicle in real time is provided, which secures fast and stable convergence when estimating the relative distance between a target and an unmanned air vehicle carrying a passive sensor. The method comprises: a step of computing a nominal weighted least-squares relative-distance estimate from the velocity component perpendicular to the line of sight, the line-of-sight (LOS) rate measurement, the closing speed, and a weight (1); a step of deriving the scale-factor error and the bias error contained in that estimate (2); a step of computing estimates of the bias error and the scale-factor error using the statistical characteristics of the LOS-rate measurement (3); and a step of obtaining the final relative-distance estimate by compensating the nominal weighted least-squares estimate with the scale-factor-error and bias-error estimates (4).

Description

Real-time relative-distance estimation method for an unmanned air vehicle {METHOD FOR REAL-TIME RANGE ESTIMATION OF UNMANNED AIR VEHICLE}

Fig. 1 is a block diagram showing the real-time relative-distance estimation method for an unmanned air vehicle according to the present invention.

Fig. 2 shows the geometric model of relative motion used to design the filter that estimates the relative distance between a target and an unmanned air vehicle equipped with a passive sensor.

The present invention relates to a method for estimating, in real time, the relative distance of an unmanned air vehicle equipped with a passive sensor. In particular, it concerns a linear relative-distance estimation filter algorithm that directly uses the principle that the product of the relative distance and the line-of-sight (LOS) rate equals the component of the relative velocity perpendicular to the LOS vector, so that the relative distance between the vehicle and a target can be estimated in real time with fast and stable convergence.

Sensors mounted on unmanned air vehicles such as guided missiles to obtain information about a target can be divided into active sensors, which emit radio waves or the like toward the target and use the returned signal, and passive sensors, which use signals emitted naturally by the target, such as infrared radiation.

With an active sensor, the propagation time between sensor and target is known, so relative-distance information can easily be obtained or estimated. A passive sensor, however, provides no information directly related to the relative distance; it yields only the line-of-sight direction from the sensor to the target and its rate of change. An unmanned air vehicle equipped with a passive sensor therefore estimates the relative distance by filtering the sensor measurements in view of the sensor's motion relative to the target; such a filter is called a relative-distance estimation filter (or passive ranging filter).
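To make the passive-ranging problem concrete (an illustration, not part of the patent text): a single pair of measurements already determines an instantaneous range through the relation v_perp = r × λ̇, but inverting one noisy LOS-rate sample amplifies the error badly, which is why a filter over many samples is needed. A minimal sketch, with all names hypothetical:

```python
def range_from_los_rate(v_perp, los_rate):
    """Instantaneous range from the relation v_perp = r * los_rate.

    v_perp   : relative-velocity component perpendicular to the LOS [m/s]
    los_rate : line-of-sight rate [rad/s]
    """
    if los_rate == 0.0:
        raise ValueError("a zero LOS rate carries no range information")
    return v_perp / los_rate

# 10 m/s perpendicular velocity and a 0.02 rad/s LOS rate imply a
# relative distance of 500 m.
print(range_from_los_rate(10.0, 0.02))    # 500.0

# A 10 % rate error (0.022 rad/s) shifts the estimate by roughly 45 m,
# illustrating why single-sample inversion is too noisy in practice.
print(range_from_los_rate(10.0, 0.022))
```
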

Existing passive-ranging filter algorithms have generally modeled the relative position and relative velocity between target and sensor dynamically and built an extended Kalman filter, or some variant of it, around LOS-angle or LOS-rate measurements.

In that approach the system order required for the filter grows, and because the relative motion between target and sensor is defined only indirectly through nonlinear equations of motion, fast convergence is hard to achieve. Moreover, owing to the nature of nonlinear filters, reliability problems have been pointed out repeatedly, for example performance varying greatly with how the filter is initialized.

The present invention was made in view of these problems. Its object is to estimate the relative distance of an unmanned air vehicle in real time, with fast and stable convergence, by means of a linear relative-distance estimation filter algorithm that directly uses the principle that the product of the relative distance and the LOS rate equals the relative-velocity component perpendicular to the LOS vector.

To achieve this object, the real-time relative-distance estimation method according to the present invention computes a nominal weighted least-squares (WLS) relative-distance estimate from the relative-velocity component perpendicular to the LOS, the LOS-rate measurement, and the closing speed; derives, using the LOS-rate measurement, the form of the bias error and the scale-factor error contained in that estimate; and obtains the final relative-distance estimate by compensating the nominal WLS estimate with bias-error and scale-factor-error estimates computed from the statistical characteristics of the LOS-rate measurement noise.

According to the invention, when the error covariance matrix of the LOS-rate measurement is not known in advance, a real-time covariance estimator can be run to estimate the error covariance from the LOS-rate measurements received at each time step, so that the bias-error and scale-factor-error estimates are obtained in real time.

Further, denote the relative-velocity component perpendicular to the line of sight by

Figure 112007034749840-pat00030

the closing speed by

Figure 112007034749840-pat00031

the LOS-rate measurement by

Figure 112007034749840-pat00032

and the weight reflecting the reliability of the measurements by q. The relative-distance estimate

Figure 112007034749840-pat00117

is then obtained from the equation

Figure 112007034749840-pat00118

Here

Figure 112006502675780-pat00034

is defined by the recursion

Figure 112006502675780-pat00035

and is computed from the equation

Figure 112006502675780-pat00036

The scale-factor error

Figure 112007034749840-pat00037

is computed from the equation

Figure 112006502675780-pat00002

Letting

Figure 112007034749840-pat00038

denote the distance traveled by the vehicle, the bias error

Figure 112007034749840-pat00039

can be computed from the equation

Figure 112006502675780-pat00003

Further, according to the present invention, the bias error and the scale-factor error are estimated using the statistical characteristics of the measurement noise.

That is, letting

Figure 112007034749840-pat00041

denote the error covariance matrix of the measurement noise contained in the LOS-rate measurement

Figure 112007034749840-pat00040

the estimate

Figure 112007034749840-pat00043

of the scale-factor error

Figure 112007034749840-pat00042

is computed from the equation

Figure 112006502675780-pat00004

and the estimate

Figure 112007034749840-pat00045

of the bias error

Figure 112007034749840-pat00044

is computed from the equation

Figure 112006502675780-pat00046

The final relative-distance estimate

Figure 112007034749840-pat00145

can then be computed from the bias-error and scale-factor-error estimates as

Figure 112007034749840-pat00146

Further, the relative-distance estimation filter that produces the final estimate

Figure 112007034749840-pat00147

in real time can be implemented using the following equations for the scale-factor-error and bias-error estimates:

Figure 112006502675780-pat00049

and

Figure 112006502675780-pat00050

The features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Before that, the terms and words used in this specification and in the claims should be interpreted with meanings and concepts that accord with the technical idea of the invention, on the principle that an inventor may define the concepts of terms appropriately in order to describe his or her own invention in the best way.

In the real-time relative-distance estimation method according to the present invention, a nominal WLS relative-distance estimate is first obtained using the fact that the relative velocity perpendicular to the LOS vector is the product of the relative distance and the LOS rate; bias-error and scale-factor-error estimates are then computed in real time from the error characteristics of the nominal WLS estimate and of the LOS-rate measurement; and these estimates are applied to the nominal WLS estimate to give the final relative-distance estimate.

Because the method uses the algebraic, direct relationship between the relative distance and the LOS-rate measurement to obtain a minimum-order WLS relative-distance estimate with the errors removed, its computational load is very small and it provides fast, stable convergence.

In other words, the method derives the form of the bias error and the scale-factor error contained in the nominal WLS relative-distance estimate obtained from the relative-velocity component perpendicular to the LOS, the LOS-rate measurement, and the closing speed, and then obtains the final relative-distance estimate by compensating the nominal WLS estimate with approximate bias-error and scale-factor-error estimates based on the statistical characteristics of the LOS-rate measurement.
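The patent's actual Equations 1 through 12 are preserved here only as images, so the following is a hedged reconstruction of the general idea rather than the claimed method: over a window of samples, v_perp[k] ≈ (r0 − d[k])·λ̇[k] lets the initial range r0 be solved by weighted least squares, and because the measured LOS rate equals the true rate plus zero-mean noise, the squared-rate sums are biased upward by the noise variance, which a compensation step can subtract. All names, the forgetting-factor weighting, and the compensation form below are assumptions, not the patent's notation.

```python
import numpy as np

def wls_range_estimate(v_perp, los_rate, dist_traveled, noise_var, q=0.99):
    """Hypothetical sketch of a noise-compensated WLS range estimator.

    Model: v_perp[k] = (r0 - dist_traveled[k]) * true_los_rate[k], where the
    measured LOS rate is the true rate plus zero-mean noise of variance
    noise_var.  Returns the current relative distance.
    """
    v = np.asarray(v_perp, dtype=float)
    lam = np.asarray(los_rate, dtype=float)
    d = np.asarray(dist_traveled, dtype=float)
    # exponential forgetting: the newest sample gets weight 1
    w = q ** np.arange(len(v) - 1, -1, -1)

    # raw WLS solution of  v + d*lam = r0*lam  for r0
    num = np.sum(w * lam * (v + d * lam))
    den = np.sum(w * lam ** 2)

    # E[lam_meas^2] = lam_true^2 + noise_var, so both sums are biased
    # upward; subtracting the expected noise contribution plays the role
    # of the patent's scale-factor / bias error compensation.
    num -= np.sum(w * d) * noise_var
    den -= np.sum(w) * noise_var

    r0_hat = num / den
    return r0_hat - d[-1]      # current range = initial range - travel

# Noiseless sanity check: r0 = 1000 m, the vehicle advances 5 m per sample,
# so the current range after the last sample should be 985 m.
d = np.array([0.0, 5.0, 10.0, 15.0])
lam = np.array([0.010, 0.012, 0.015, 0.020])
v = (1000.0 - d) * lam
print(wls_range_estimate(v, lam, d, noise_var=0.0))
```

In the noisy case this compensation only removes the expected bias; the patent additionally estimates the measurement-noise covariance online (block [4] of Fig. 1) when it is not known in advance.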

This process is described in detail below with reference to Figs. 1 and 2.

Fig. 1 is a block diagram of the real-time relative-distance estimation method according to the present invention, and Fig. 2 shows the geometric model of relative motion, in a two-dimensional plane, used to design the filter that estimates the relative distance between a target and an unmanned air vehicle equipped with a passive sensor.

Before designing the relative-distance estimation algorithm, the relative-velocity vector is approximated by the vehicle's own velocity vector, under the assumption that the target's speed is very small compared with the vehicle's. In Fig. 2, the velocity component perpendicular to the LOS vector is defined as

Figure 112006502675780-pat00051

and the velocity component along the LOS vector, i.e. the closing speed, as

Figure 112006502675780-pat00052

The relative-distance rate and the closing speed are related by Equation 1 below.

[Equation 1]

Figure 112006502675780-pat00008

where

Figure 112006502675780-pat00053

denotes the sampling period of the relative-distance estimation filter.

The distance traveled by the vehicle from time

Figure 112006502675780-pat00054

to time

Figure 112006502675780-pat00055

is defined by Equation 2 below.

[Equation 2]

Figure 112006502675780-pat00056
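The contents of Equations 1 and 2 survive only as images here, so the snippet below is an illustrative reconstruction under the usual conventions (the range shrinks by closing speed times sampling period, and the traveled distance accumulates speed times sampling period), not the patent's exact notation; the function names are hypothetical.

```python
def propagate_range(r_k, closing_speed, T_s):
    """One discrete step of the range / closing-speed relation:
    the relative distance shrinks by closing_speed * T_s per sample."""
    return r_k - closing_speed * T_s

def distance_traveled(speeds, T_s):
    """Distance covered by the vehicle over several sampling periods,
    the quantity that Equation 2 accumulates."""
    return sum(v * T_s for v in speeds)

r = 1000.0
for v_c in (50.0, 50.0, 48.0):           # closing speeds over three steps
    r = propagate_range(r, v_c, 0.1)     # T_s = 0.1 s
print(round(r, 6))                       # 985.2
print(round(distance_traveled((50.0, 50.0, 48.0), 0.1), 6))  # 14.8
```
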

Letting

Figure 112006502675780-pat00058

denote the true value of the LOS-rate measurement at time

Figure 112006502675780-pat00057

the relative-velocity component perpendicular to the LOS is expressed by Equation 3 below.

[Equation 3]

Figure 112006502675780-pat00059

Using the given measurements and a weight q that reflects their reliability, the least-squares performance index

Figure 112006502675780-pat00060

is defined by Equation 4 below.

[Equation 4]

Figure 112006502675780-pat00061

In Equation 4,

Figure 112006502675780-pat00062

and

the measurements

Figure 112006502675780-pat00063
Figure 112006502675780-pat00064
Figure 112006502675780-pat00065
Figure 112006502675780-pat00066

are assumed to be expressed as the sums of the true values of the corresponding variables,

Figure 112006502675780-pat00067
Figure 112006502675780-pat00068
Figure 112006502675780-pat00069
Figure 112006502675780-pat00070

and mutually independent zero-mean measurement noises,

Figure 112006502675780-pat00071
Figure 112006502675780-pat00072
Figure 112006502675780-pat00073
Figure 112006502675780-pat00074

as follows:

Figure 112006502675780-pat00012

The nominal weighted least-squares relative-distance estimate

Figure 112006502675780-pat00076

that minimizes the performance index

Figure 112006502675780-pat00075

of Equation 4 is the output of the nominal WLS filter, block [1] of Fig. 1, and is computed by Equation 5 below.

[Equation 5]

Figure 112007034749840-pat00120

In Equation 5,

Figure 112006502675780-pat00077

is defined as

Figure 112006502675780-pat00078

and can be computed using Equation 6 below.

[Equation 6]

Figure 112006502675780-pat00079

The nominal WLS relative-distance estimate of Equation 5 can be rewritten as Equation 7 below, in a form that exposes the scale-factor error

Figure 112007034749840-pat00080

and the bias error

Figure 112007034749840-pat00081

[Equation 7]

Figure 112006502675780-pat00015

In Equation 7, the scale-factor error

Figure 112007034749840-pat00082

is

Figure 112006502675780-pat00083

and the bias error

Figure 112007034749840-pat00084

is

Figure 112006502675780-pat00085

According to the present invention, the bias error and the scale-factor error can be estimated using the statistical characteristics of the measurement noise; in that case the scale-factor-error estimate

Figure 112007034749840-pat00086

and the bias-error estimate

Figure 112007034749840-pat00087

can be approximated by Equations 8 and 9 below.

[Equation 8]

Figure 112006502675780-pat00016

[Equation 9]

Figure 112006502675780-pat00017

In Equations 8 and 9,

Figure 112006502675780-pat00088

denotes the error covariance matrix of the measurement noise contained in the LOS-rate measurement

Figure 112006502675780-pat00089

The nominal WLS relative-distance estimate is then compensated, as in Equation 10 below, with the scale-factor-error estimate

Figure 112007034749840-pat00090

and the bias-error estimate

Figure 112007034749840-pat00091

derived from Equations 8 and 9, and the result is defined as the final relative-distance estimate

Figure 112007034749840-pat00148

This process is shown in the block diagram of Fig. 1.

[Equation 10]

Figure 112007034749840-pat00149

To compute the bias-error and scale-factor-error estimates of the nominal WLS relative-distance estimate sequentially, the measurement-noise error covariance

Figure 112007034749840-pat00094

of the LOS-rate measurement

Figure 112007034749840-pat00093

must be estimated in real time. For the measurement-noise variance, methods presented in a number of open publications can be used (block [4] of Fig. 1). The distance estimation filter that produces the final relative-distance estimate

Figure 112007034749840-pat00150

in real time can thus be implemented with Equations 11 and 12 below, concerning the scale-factor-error and bias-error estimates.

[Equation 11]

Figure 112006502675780-pat00096

and

[Equation 12]

Figure 112006502675780-pat00097

The correspondence between the above equations and the block steps of the block diagram in Fig. 1 is summarized in Table 1 below.

[Table 1]

Figure 112006502675780-pat00020

With the real-time relative-distance estimation method for an unmanned air vehicle according to the present invention configured as described above, the relative distance of the vehicle can be estimated in real time by a linear relative-distance estimation filter algorithm that directly uses the principle that the product of the relative distance and the LOS rate equals the relative-velocity component perpendicular to the LOS vector. Because the filter order is low, the method requires less computation than existing techniques and guarantees fast and stable convergence; moreover, since the bias error and the scale-factor error contained in the relative-distance estimate computed by the nominal WLS filter are effectively removed, the accuracy of the vehicle-to-target relative-distance estimate is improved.

Claims (6)

[Claim 1] A method for estimating the relative distance of an unmanned air vehicle in real time, comprising:

computing a nominal weighted least-squares relative-distance estimate

Figure 112007034749840-pat00125

from the relative-velocity component perpendicular to the line-of-sight vector

Figure 112007034749840-pat00122

the LOS-rate measurement

Figure 112007034749840-pat00123

the closing speed

Figure 112007034749840-pat00124

and a weight q reflecting the reliability of the measurements, by the following equation (1):

Figure 112007034749840-pat00126 ... (1)

(where

Figure 112007034749840-pat00127

is defined by the recursion

Figure 112007034749840-pat00128

and is computed from the equation

Figure 112007034749840-pat00129

);

computing the forms of the scale-factor error

Figure 112007034749840-pat00131

and the bias error

Figure 112007034749840-pat00132

contained in the nominal weighted least-squares relative-distance estimate

Figure 112007034749840-pat00130

by the following equations (2) and (3):

Figure 112007034749840-pat00133 ... (2)

Figure 112007034749840-pat00134 ... (3)

(where

Figure 112007034749840-pat00135

is the distance traveled by the unmanned air vehicle);

obtaining a bias-error estimate and a scale-factor-error estimate using the statistical characteristics of the LOS-rate measurement

Figure 112007034749840-pat00136

; and

obtaining a final relative-distance estimate by compensating the nominal weighted least-squares relative-distance estimate

Figure 112007034749840-pat00137

with the scale-factor-error estimate and the bias-error estimate.
Claim 2: deleted.
Claim 3: deleted.
The method of claim 1, wherein the scale-factor error
Figure 112007034749840-pat00108
and the bias error
Figure 112007034749840-pat00109
are estimated using the statistical characteristics of the measurement noise; when the error covariance matrix of the measurement noise contained in the line-of-sight rate measurement
Figure 112007034749840-pat00110
is denoted
Figure 112007034749840-pat00111
, the scale-factor-error estimate
Figure 112007034749840-pat00138
is calculated by the following equation (4),
Figure 112007034749840-pat00024
... (4)
the bias-error estimate
Figure 112007034749840-pat00113
is calculated by the following equation (5),
Figure 112007034749840-pat00025
... (5)
and the final relative distance estimate
Figure 112007034749840-pat00151
is calculated by the equation
Figure 112007034749840-pat00152
... (6)
characterizing the real-time relative distance estimation method for an unmanned aerial vehicle.
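Equations (4) through (6) are image placeholders here, so the exact correction law is not recoverable. As one plausible reading of a scale-factor-plus-bias error model (an assumption, not the patent's equation (6)), the nominal estimate can be inverted as follows; the affine error model and function name are hypothetical.

```python
def compensate_range(r_nominal, scale_err_est, bias_err_est):
    """Apply scale-factor and bias corrections to the nominal WLS range.

    Hypothetical affine error model (illustrative only):
        r_nominal ~= (1 + k) * r_true + b
    Inverting it gives the compensated (final) relative distance estimate.
    """
    return (r_nominal - bias_err_est) / (1.0 + scale_err_est)
```

For example, a nominal estimate of 570 m produced by a true range of 500 m with scale-factor error 0.1 and bias 20 m is restored to 500 m by the correction.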
The method of claim 4, wherein the relative distance estimation filter for computing the final relative distance estimate
Figure 112007034749840-pat00153
in real time is implemented using the following equations (7) and (8), which involve the scale-factor-error estimate
Figure 112007034749840-pat00140
and the bias-error estimate
Figure 112007034749840-pat00141
:
Figure 112007034749840-pat00116
... (7)
Figure 112007034749840-pat00027
... (8)
The method of claim 4 or 5, wherein, when the error covariance matrix
Figure 112007034749840-pat00143
of the line-of-sight rate measurement
Figure 112007034749840-pat00142
is not known in advance, a real-time covariance estimator is driven with the line-of-sight rate measurement
Figure 112007034749840-pat00144
applied at each time step to estimate the error covariance matrix, and the bias-error estimate and the scale-factor-error estimate are obtained in real time.
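The claim does not disclose which covariance estimator is used (its details sit behind the image placeholders). A standard stand-in for such a real-time estimator is a Welford-style running sample covariance over the incoming line-of-sight-rate measurements; the class below is a generic sketch under that assumption, not the patented construction.

```python
import numpy as np

class OnlineCovariance:
    """Running estimate of the measurement-noise covariance matrix R.

    Welford-style recursive update: each call to update() folds one new
    line-of-sight-rate measurement vector into the running mean and the
    sum of outer products, so cov is available at every time step.
    """
    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.m2 = np.zeros((dim, dim))  # running sum of outer products

    def update(self, z):
        z = np.asarray(z, dtype=float)
        self.n += 1
        delta = z - self.mean
        self.mean += delta / self.n
        self.m2 += np.outer(delta, z - self.mean)  # uses the updated mean

    @property
    def cov(self):
        # Unbiased sample covariance (ddof = 1); zero until two samples arrive.
        return self.m2 / (self.n - 1) if self.n > 1 else np.zeros_like(self.m2)
```

Feeding the measurements in one at a time reproduces the batch sample covariance, which is what lets the bias-error and scale-factor-error estimates of claim 4 be refreshed in real time when R is not known in advance.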
KR1020060028891A 2006-03-30 2006-03-30 Method for real-time range estimation of unmanned air vehicle KR100763973B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020060028891A KR100763973B1 (en) 2006-03-30 2006-03-30 Method for real-time range estimation of unmanned air vehicle


Publications (1)

Publication Number Publication Date
KR100763973B1 true KR100763973B1 (en) 2007-10-05

Family

ID=39419269

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020060028891A KR100763973B1 (en) 2006-03-30 2006-03-30 Method for real-time range estimation of unmanned air vehicle

Country Status (1)

Country Link
KR (1) KR100763973B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101751647B1 (en) 2016-09-30 2017-06-28 국방과학연구소 The rejection of body coupling signal from guidance signal in a seeker with stabilization loop

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100310841B1 (en) 1997-12-11 2001-11-22 윌리엄 이. 갈라스 A system for tracking and locking a high energy laser beam onto a moving target
KR100377357B1 (en) 2000-08-25 2003-03-26 삼성전자주식회사 System for tracking target using image feedback and Method therefor
KR100418345B1 (en) 2001-10-16 2004-02-11 박상래 System and method which have 3-dimensional position tracking style of target image


Similar Documents

Publication Publication Date Title
US10885640B2 (en) Enhanced object detection and motion estimation for a vehicle environment detection system
US10386462B1 (en) Systems and methods for stereo radar tracking
CN101661104B (en) Target tracking method based on radar/infrared measurement data coordinate conversion
US9891306B2 (en) Geolocating a remote emitter
EP3117235B1 (en) Robust dual-radar-beam systems and methods for traffic monitoring
CN107728138B (en) Maneuvering target tracking method based on current statistical model
CN107063245B (en) SINS/DVL combined navigation filtering method based on 5-order SSRCKF
CN104035083B (en) A kind of radar target tracking method based on measurement conversion
CN106443661A (en) Maneuvering extended target tracking method based on unscented Kalman filter
KR101909953B1 (en) Method for vehicle pose estimation using LiDAR
WO2015162873A1 (en) Position and orientation estimation device, image processing device, and position and orientation estimation method
CN105136145A (en) Kalman filtering based quadrotor unmanned aerial vehicle attitude data fusion method
CN105824003A (en) Indoor moving target positioning method based on trajectory smoothing
CN113466890B (en) Light laser radar inertial combination positioning method and system based on key feature extraction
CN109324315B (en) Space-time adaptive radar clutter suppression method based on double-layer block sparsity
CN103808316A (en) Indoor-flying intelligent body inertial system and laser range finder combination type navigation improving method
CN106403940A (en) Anti-atmospheric parameter drift unmanned aerial vehicle flight navigation system altitude information fusion method
RU2564380C1 (en) Correction method of strap-down inertial navigation system
CN110572139B (en) Fusion filtering implementation method and device for vehicle state estimation, storage medium and vehicle
CN112328959B (en) Multi-target tracking method based on adaptive extended Kalman probability hypothesis density filter
US8831906B1 (en) Technique for determining closest point of approach
KR100763973B1 (en) Method for real-time range estimation of unmanned air vehicle
JP2015197403A (en) target tracking device
CN110736459B (en) Angular deformation measurement error evaluation method for inertial quantity matching alignment
CN107270890B (en) Ranging method and system of TOF (time of flight) distance sensor on unmanned aerial vehicle

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20120830

Year of fee payment: 6

FPAY Annual fee payment

Payment date: 20130729

Year of fee payment: 7

FPAY Annual fee payment

Payment date: 20140901

Year of fee payment: 8

FPAY Annual fee payment

Payment date: 20150901

Year of fee payment: 9

FPAY Annual fee payment

Payment date: 20160901

Year of fee payment: 10

FPAY Annual fee payment

Payment date: 20170904

Year of fee payment: 11

FPAY Annual fee payment

Payment date: 20180904

Year of fee payment: 12

FPAY Annual fee payment

Payment date: 20190903

Year of fee payment: 13