CN105716625A - Method and system for automatically detecting a misalignment during operation of a monitoring sensor of an aircraft - Google Patents

Method and system for automatically detecting a misalignment during operation of a monitoring sensor of an aircraft

Info

Publication number
CN105716625A
CN105716625A
Authority
CN
China
Prior art keywords
aircraft
image
displacement
monitoring sensor
angular difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201511036038.5A
Other languages
Chinese (zh)
Other versions
CN105716625B (en)
Inventor
S·戈捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Airbus Operations SAS
Original Assignee
Airbus Operations SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Airbus Operations SAS filed Critical Airbus Operations SAS
Publication of CN105716625A publication Critical patent/CN105716625A/en
Application granted granted Critical
Publication of CN105716625B publication Critical patent/CN105716625B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20228Disparity calculation for image-based rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Geometry (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses a method and system for automatically detecting a misalignment during operation of a monitoring sensor of an aircraft. The detection system (1) comprises an analysis unit (4) for analyzing successive images generated by a monitoring sensor (2) to determine in an image a position of an axis representative of the displacement of the aircraft between two successive images, the so-called real position, a computation unit for computing, on the basis of data relating to a displacement of an aircraft between the generations of these two images and of the positioning of the monitoring sensor (2) on the aircraft, a position in an image of an axis representative of the displacement of the aircraft, the theoretical position, an estimation unit (13) for estimating the angular disparity between the real position and the theoretical position, and a comparison unit (16) for comparing this angular disparity with at least one predefined tolerance value so as to detect a misalignment when this angular disparity is greater than this tolerance value.

Description

Method and system for automatically detecting a misalignment, during operation, of a monitoring sensor of an aircraft
Technical field
The present invention relates to a method and a system for automatically detecting, during operation, a misalignment of a monitoring sensor of an aircraft, in particular of a transport airplane.
Background art
It is known that modern aircraft are equipped with imaging systems (or imagers, or sensors) which provide the pilot with data derived from monitoring the environment of the aircraft, referred to hereinafter as "monitoring sensors". Such a monitoring sensor may in particular be:
-a radar, which provides an image of the obstacles present in front of the aircraft or of the environmental situation;
-an enhanced vision system, EVS, comprising an infrared camera which provides the pilot of the aircraft, under degraded visual conditions, with an enhanced image of the region in front of the aircraft; and
-a taxiing camera, which supplies the pilot with an external view ahead of the aircraft in order to assist the pilot while taxiing on the ground.
It is known that the reliability of the information provided by such a monitoring sensor is directly related to its alignment with respect to a predetermined position. In particular:
-the image of the EVS enhanced vision system, which is projected onto a head-up display HUD, must be superimposed exactly on the real image seen through this transparent head-up display;
-the radar must perform its detection along the axis of the aircraft; and
-the taxiing camera must be aligned exactly with the axis of the aircraft.
The alignment of the monitoring sensor along a predetermined axis associated with the aircraft is set when the monitoring sensor is installed on the final assembly line of the aircraft.
However, during the life of the aircraft, an event such as a collision with an external object may cause, directly or indirectly (for example via a displacement of the structure or of a structural part carrying the monitoring sensor), a misalignment of the monitoring sensor.
A misalignment means an offset of the line of sight of the monitoring sensor (the line of sight along which the monitoring sensor generates the images) with respect to a predetermined direction considered as the reference.
It is therefore essential, or at least particularly useful, to be able to detect on board the aircraft, during operation, a misalignment of the monitoring sensor, so as not to be misled and so as to have reliable information.
Summary of the invention
The present invention relates to a method for automatically detecting, during operation, a misalignment of a monitoring sensor built into an aircraft, said monitoring sensor being able to generate images of the external environment of the aircraft, which makes it possible in particular to perform a reliable and effective detection.
According to the invention, said method comprises successive steps consisting, during operation, in:
a) analyzing a plurality of successive images generated by the monitoring sensor so as to determine, in at least a first one of said images, the position of an axis representative of the displacement of the aircraft between said first image and a second image, the so-called real position;
b) obtaining, from a movement system of the aircraft, data relating to the displacement of the aircraft between the instant at which said first image is generated and the instant at which said second image is generated;
c) computing, on the basis of these data and of the positioning of the monitoring sensor on the aircraft, the position in said first image of an axis representative of the displacement of the aircraft, the so-called theoretical position;
d) estimating the angular difference between the real position and the theoretical position of the axis representative of the displacement of the aircraft; and
e) comparing this angular difference with at least one predefined value so as to detect a misalignment when this angular difference is greater than said predefined value. An illustrative sketch of steps d) and e) is given just after this list.
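By way of illustration only, and without forming part of the claimed method, the following Python sketch shows one way in which steps d) and e) could be expressed for a pinhole-type monitoring sensor. The focal length, the image coordinates of the two positions and the tolerance value are assumptions of the sketch; steps a) to c) are illustrated further below.

    import numpy as np

    def angular_difference(real_pos, theoretical_pos, focal_px):
        # Step d): angle between the viewing directions passing through the two
        # image points (pinhole model, coordinates centred on the principal point).
        r = np.array([real_pos[0], real_pos[1], focal_px], dtype=float)
        t = np.array([theoretical_pos[0], theoretical_pos[1], focal_px], dtype=float)
        r /= np.linalg.norm(r)
        t /= np.linalg.norm(t)
        return float(np.arccos(np.clip(np.dot(r, t), -1.0, 1.0)))

    def detect_misalignment(real_pos, theoretical_pos, focal_px, tolerance_rad):
        # Step e): a misalignment is reported only when the angular difference
        # exceeds the predefined tolerance value.
        diff = angular_difference(real_pos, theoretical_pos, focal_px)
        return diff, diff > tolerance_rad

For example, with an assumed focal length of 1000 pixels, a real position of (25, 0) and a theoretical position of (0, 0), this sketch returns an angular difference of about 0.025 rad (roughly 1.4 degrees), which would be flagged as a misalignment if the tolerance were, say, 0.5 degrees.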
Thus, thanks to the invention, an effective method is obtained for automatically detecting, during operation (that is to say when the aircraft is operating, in flight or while travelling on the ground), a misalignment of the monitoring sensor, on the basis of an analysis of the images generated by the monitoring sensor and of the data provided by movement systems (in particular systems measuring the displacement and the attitude of the aircraft), as specified below.
Thanks to the invention, the crew of the aircraft can therefore be informed of any misalignment of the monitoring sensor, and thus know whether the information provided by the monitoring sensor is accurate and trustworthy.
It should be noted that, within the framework of the invention, said first image (in particular the one on which the real position is computed) is generally generated chronologically after the other image (the so-called second image). However, it is also conceivable for said first image to be generated before said second image.
In a preferred embodiment, said method also comprises a step of transmitting the angular difference estimated in step d) and/or an item of information relating to the misalignment detected in step e) to at least one user system, this angular difference making it possible to quantify the misalignment of the monitoring sensor.
Moreover, advantageously, step a) consists in:
-analyzing the images generated by the monitoring sensor so as to select characteristic points;
-analyzing the images for each of the selected characteristic points so as to determine the motion of each of these characteristic points;
-separating, for each motion of a characteristic point, a first component of the motion caused by the displacement of the aircraft from a second component of the motion caused by the own movement of the element to which the characteristic point belongs; and
-determining, by means of the first components of the motions of said characteristic points, said real position of the axis representative of the displacement of the aircraft (a tracking sketch is given just after this list).
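A minimal sketch of such a feature selection and tracking, assuming the OpenCV library and greyscale frames coming from the monitoring sensor, is given below. The separation of the component due to the aircraft from the component due to independently moving objects is handled here only by the convergence filtering described later in this text, and all parameter values are illustrative assumptions.

    import cv2
    import numpy as np

    def track_characteristic_points(img1, img2, roi_mask=None, max_points=200):
        # Select characteristic points in the first image (optionally restricted
        # to a region of interest) and track them into the second image.
        pts1 = cv2.goodFeaturesToTrack(img1, maxCorners=max_points,
                                       qualityLevel=0.01, minDistance=10,
                                       mask=roi_mask)
        if pts1 is None:
            return np.empty((0, 2)), np.empty((0, 2))
        pts2, status, _err = cv2.calcOpticalFlowPyrLK(img1, img2, pts1, None)
        ok = status.ravel() == 1
        p1 = pts1.reshape(-1, 2)[ok]
        p2 = pts2.reshape(-1, 2)[ok]
        return p1, p2 - p1   # tracked points and their motion between the two images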
According to different embodiments of the invention, which may be adopted together or separately:
-the analysis of each image carried out in step a) is restricted to at least one so-called region of interest of said image;
-the analysis of the images carried out in step a) is performed only for every n successive images, n being an integer greater than 1;
-step a) consists in determining the so-called real position of a focus of expansion illustrating said real position of the axis representative of the displacement of the aircraft, step c) consists in determining the so-called theoretical position of a focus of expansion illustrating said theoretical position of the axis representative of the displacement of the aircraft, and step d) consists in estimating the difference between the real position of the focus of expansion and the theoretical position of the focus of expansion;
-said method comprises a step of determining a confidence level of the angular difference estimated in step d);
-said method comprises a step of temporal filtering of the angular difference estimated in step d).
The invention also relates to a system for automatically detecting, during operation, a misalignment of a monitoring sensor built into an aircraft, said monitoring sensor being able to generate images of the external environment of the aircraft.
According to the invention, said detection system comprises:
-an analysis unit configured to analyze a plurality of successive images generated by the monitoring sensor so as to determine, in at least a first one of said images, the position of an axis representative of the displacement of the aircraft between said first image and a second image, the so-called real position;
-an acquisition unit configured to obtain, from a movement system of the aircraft, data relating to the displacement of the aircraft between the instant at which said first image is generated and the instant at which said second image is generated;
-a computation unit configured to compute, on the basis of these data and of the positioning of the monitoring sensor on the aircraft, the position in said first image of an axis representative of the displacement of the aircraft, the so-called theoretical position;
-an estimation unit configured to estimate the angular difference between the real position and the theoretical position of the axis representative of the displacement of the aircraft; and
-a comparison unit configured to compare this angular difference with at least one predefined value so as to detect a misalignment when this angular difference is greater than said predefined value.
Advantageously, the detection system also comprises a transmission unit configured to transmit the angular difference estimated by the estimation unit and/or an item of information relating to the misalignment detected by the comparison unit to at least one user system, for example a system for correcting the angular difference.
The invention also relates to an aircraft, in particular a transport airplane, provided with a detection system such as the one described above.
Brief description of the drawings
The appended figures will give a clear understanding of how the invention can be embodied. In these figures, identical references denote similar elements.
Fig. 1 is a block diagram of a particular embodiment of a system for automatically detecting, during operation, a misalignment of a monitoring sensor of an aircraft.
Fig. 2 shows an aircraft (namely a transport airplane) provided with such a detection system.
Fig. 3 schematically illustrates an example of the determination, by means of focuses of expansion, of an angular difference revealing a misalignment.
Fig. 4 is a block diagram of an example of the processing carried out by the detection system.
Detailed description of the invention
The system 1 shown schematically in Fig. 1, which makes it possible to illustrate the invention, is a system for automatically detecting, during operation, a misalignment of a monitoring sensor 2 ("sensor 2", hereinafter "monitoring sensor 2") of an aircraft AC, for example a transport airplane.
The monitoring sensor 2 installed on the aircraft AC is able to generate images of the external environment of the aircraft AC, as shown schematically in Fig. 2 for the example of a taxiing camera. This taxiing camera provides the pilot with successive images (or a video sequence) of the external scene in front of the aircraft AC travelling on the ground S, which makes it possible in particular to assist the pilot while taxiing. The monitoring sensor 2 generates the images along a viewing axis AV.
The monitoring sensor may also be, for example:
-a radar providing an image of the obstacles or of the environmental situation present in front of the aircraft;
-an enhanced vision system, EVS, comprising an infrared camera and providing the pilot of the aircraft, under degraded visual conditions, with an enhanced image of the region in front of the aircraft; or
-any imaging system providing the pilot with data derived from monitoring the environment of the aircraft.
As shown in Fig. 1, the detection system 1 comprises a central unit 3, which comprises:
-an analysis unit 4 configured to analyze a plurality of successive images generated by the monitoring sensor 2, the aim being to determine, in at least a first one of said images, the position of an axis representative of the displacement of the aircraft (hereinafter the "displacement axis") between said first image and a second image, the so-called real position, as specified below. To this end, the analysis unit 4 is either connected directly to the monitoring sensor 2 via a link 5 (as shown in Fig. 1), or connected to a storage device or to a user device which receives the images from the monitoring sensor 2;
-an acquisition unit ("acquisition") 6 (hereinafter "acquisition unit 6") configured to obtain, via a link 8, from a set 7 of movement system(s) ("systems") of the aircraft, data relating to the displacement of the aircraft between the instant at which said first image is generated and the instant at which said second image is generated;
-a computation unit ("computation") 9 (hereinafter "computation unit 9") configured to compute, on the basis of these data received via a link 10 from the acquisition unit 6 and of the known positioning of the monitoring sensor 2 on the aircraft (received via a link 11 from a database ("database") 12, hereinafter "database 12"), the position in said first image of the axis representative of the displacement of the aircraft (hereinafter the "displacement axis"), the so-called theoretical position. The known positioning of the monitoring sensor on the aircraft comprises the position and the orientation of the viewing axis of the monitoring sensor with respect to a reference frame of the aircraft, this position and this orientation being measured, for example, when the monitoring sensor is installed on the aircraft; and
-an estimation unit ("estimation") 13 (hereinafter "estimation unit 13") configured to estimate the angular difference, if any, between the real position of the displacement axis of the aircraft (received via a link 14 from the analysis unit 4) and the theoretical position of the displacement axis of the aircraft (received via a link 15 from the computation unit 9). When this angular difference is not zero, it reveals an alignment defect of the monitoring sensor 2.
Said first image is generally generated chronologically after said second image. However, it is also conceivable for said first image to be generated chronologically before said second image.
Moreover, "the instant at which an image is generated" means the moment (or instant) at which the monitoring sensor generates (or acquires) this image.
The central unit 3 of the detection system 1 also comprises a comparison unit ("comparison") 16 (hereinafter "comparison unit 16") configured to compare this angular difference (received via a link 17 from the estimation unit 13) with at least one predefined value representing a tolerance (received via a link 26 from a database ("database") 25, hereinafter "database 25"), so as to detect a misalignment when this angular difference is greater than said predefined value. As long as the angular difference remains less than or equal to this predefined value, the alignment defect, if any, is considered to be within the tolerance, and no misalignment is detected by the detection system 1.
When a misalignment is detected, a corresponding item of information can be transmitted via a link 18 to at least one system 19 ("alarm") of the aircraft, in particular an alerting device (for example of the audible and/or visual type) which warns the crew of the misalignment.
The detection system 1 is therefore able, during operation (that is to say when the aircraft is operating, in flight or while travelling on the ground S, as shown in Fig. 2), to detect a misalignment of the monitoring sensor 2 effectively and automatically, on the basis of an analysis of the images generated by this monitoring sensor 2 and of the data provided by one or more movement systems of the set 7, and to warn the crew of the aircraft when such a detection is made.
The set 7, which makes it possible to provide the acquisition unit 6 with the data relating to the displacement of the aircraft, comprises, for example, at least one of the following elements or systems of the aircraft:
-an air data and inertial reference system, ADIRS;
-a positioning system, for example of the GPS ("Global Positioning System") type;
-any on-board device or computer using data relating to the displacement of the aircraft.
The acquisition unit 6 can also receive the information to be processed from some of the aforementioned systems or elements forming part of the set 7.
The acquisition unit 6 is thus configured to obtain data relating to the displacement of the aircraft. This displacement of the aircraft comprises the displacement of the aircraft along its trajectory and the changes in attitude of the aircraft (rotations about its three axes).
Furthermore, in certain embodiments, as considered in the description below, the displacement of the aircraft is measured with respect to the ground. In other embodiments, it is also possible to measure the displacement of the aircraft with respect to another fixed reference, in particular with respect to the air mass surrounding the aircraft, for example when the monitoring sensor is a weather radar.
In a preferred embodiment, the detection system 1 comprises a transmission unit (link 20) for transmitting the angular difference estimated by the estimation unit 13 (or provided by the comparison unit 16) to at least one user system ("user") 21 (hereinafter "user system 21"). This angular difference makes it possible, where relevant, to quantify the misalignment of the monitoring sensor 2.
The quantification of the misalignment thus obtained can be used to automatically realign the monitoring sensor 2 (if the latter is provided with an electronic pointing system, automatic or manual), via a suitable realignment device representing the user system 21.
The detection of a misalignment of the monitoring sensor is obtained by a comparison between:
-the displacement of the aircraft deduced from an analysis of the motion between the successive images provided by the monitoring sensor 2 (whose alignment is to be checked); and
-the displacement of the aircraft provided by a system for positioning the aircraft (for example a system of the ADIRS type), taking into account the relative position (recorded in the database 12) of the monitoring sensor with respect to the center of the displacement of the aircraft (generally the center of gravity).
In a preferred embodiment, the analysis unit 4 comprises:
-a motion detection element ("detection") 22 (hereinafter "motion detection element 22") comprising the following integrated processing elements:
● a first processing element for analyzing the images generated by the monitoring sensor 2 (and received via the link 5) so as to select characteristic points;
● a second processing element for analyzing the images for each characteristic point so as to determine the motion of each of these characteristic points; and
● a third processing element for separating, for each motion of a characteristic point, a first component of the motion caused by the displacement of the aircraft from a second component of the motion caused by the own movement of the element to which the characteristic point belongs; and
-a computation element ("computation") 23 (hereinafter "computation element 23"), which is connected by a link 24 to the motion detection element 22 and is configured to determine the real position of the displacement axis of the aircraft by means of the first components of the motions of said characteristic points received via the link 24 from the third processing element. This real position is transmitted over the link 14.
Within the framework of the invention, the motion detection element 22 can implement various standard solutions for extracting motion from a video sequence (or series of successive images). By way of illustration, the following types of extraction (or algorithms) may be cited: Markov models, Markov fields, temporal averaging, statistical operators, matching (or "block matching"), Lucas-Kanade tracking, etc.
The motion detection element 22 can also use standard optimizations of these various algorithms in order to obtain good real-time performance.
Moreover, in order to limit the number of computations required, and thus to optimize the CPU (central processing unit) load, the motion detection element 22 is configured to perform the motion analysis not on the whole surface of the image but on one or more regions of interest of the image, a region of interest representing a part of the image. Thus, by way of illustration, in the example of a monitoring sensor corresponding to an enhanced vision system of the EVS type, it is not necessary to analyze the sky, and the region of interest can in this case correspond to the whole of the viewed image except the sky.
Furthermore, in order to limit the number of computations required, and thus to optimize the CPU load, provision may be made to:
-adapt the granularity of the motion detection (contours, clusters, SIFT points of interest, Harris points of interest, etc.) to the type of image generated by the monitoring sensor (color, infrared, radar) and to the environment seen by the monitoring sensor;
-perform the motion analysis, where possible, on images stored in black and white; and
-reduce the frequency at which the misalignment is computed. Depending on the function of the monitoring sensor, performing the detection between every pair of successive images may be unnecessary or useless. The analysis of the images carried out by the analysis unit 4 is then performed only for every n successive images, in other words for image 1, then for image n+1, then for image 2n+1, and so on, n being an integer greater than 1 and equal to 2, 3, ... for example (a short sketch combining these measures follows this list).
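Purely by way of illustration, and assuming the OpenCV library and color (BGR) input frames, the short sketch below combines the last measures: greyscale conversion, restriction to a region of interest and processing only every n-th image. The parameter values and the mask are assumptions.

    import cv2

    def frames_to_analyse(frames, n=3, roi_mask=None):
        # Yield images 1, n+1, 2n+1, ... converted to greyscale and, if a mask is
        # supplied (e.g. one excluding the sky for an EVS), restricted to the
        # region(s) of interest F.
        for index, frame in enumerate(frames):
            if index % n:
                continue                  # reduce the misalignment computation rate
            grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if roi_mask is not None:
                grey = cv2.bitwise_and(grey, grey, mask=roi_mask)
            yield grey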
As indicated above, the motion detection element 22 can use one of the various standard algorithms for extracting the motion of the aircraft.
Thus, in certain embodiments, this motion detection element 22 uses an extraction based on the focus of expansion of the image, as shown in Fig. 3.
It is known that, in a static scene, during a displacement of an imager (such as the monitoring sensor 2), the directions of the velocities of the points projected onto the image plane all converge towards a point referred to as the "focus of expansion" (FOE).
The central unit 3 (Fig. 1) therefore determines, in at least one image, the so-called real position of a focus of expansion (or real focus of expansion) illustrating said real position of the displacement axis of the aircraft between two successive images, computes, on the basis of the data relating to the displacement of the aircraft between the capture of the two successive images and of the positioning of the monitoring sensor on the aircraft, the so-called theoretical position of a focus of expansion (or theoretical focus of expansion) illustrating said theoretical position of the displacement axis of the aircraft, and estimates the difference between the real position of the focus of expansion and the theoretical position of the focus of expansion.
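As one possible illustration of this principle, the real focus of expansion can be estimated as the least-squares intersection of the lines carried by the motion vectors of the tracked points. The sketch below assumes NumPy arrays of point positions and of their motions; the function name and the outlier handling are assumptions, not the claimed implementation.

    import numpy as np

    def estimate_real_foe(points, flows):
        # Each tracked point p with motion v constrains the focus of expansion f
        # to the line through p along v:  v_y*(f_x - p_x) - v_x*(f_y - p_y) = 0.
        a = np.column_stack((flows[:, 1], -flows[:, 0]))
        b = flows[:, 1] * points[:, 0] - flows[:, 0] * points[:, 1]
        foe, residuals, _, _ = np.linalg.lstsq(a, b, rcond=None)
        # Points whose line passes far from the estimate (objects with a movement
        # of their own, such as the cloud 29 below) can be discarded and the
        # estimate recomputed, which corresponds to the filtering described later.
        return foe, residuals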
In the example of Fig. 3, two successive images I1 and I2 are considered. By way of illustration, these images represent the scene SC captured in front of the aircraft. This scene SC comprises, in particular, the landing runway 27 on which the aircraft is travelling, a particular element 28 such as a sign, a cloud 29 and a schematically indicated relief 30 of the terrain.
The measurement of the angular difference on the basis of the images I1 and I2 comprises the following steps:
-determining an image IA in which the displacement vectors V1 to V4 of the characteristic points P1 to P4 considered are highlighted;
-determining the real focus of expansion 31 on the basis of the vectors V1, V2 and V3, as illustrated by the image IB of Fig. 3. This real focus of expansion 31 represents the displacement axis of the aircraft as seen by the monitoring sensor. The vector V4 relates to the cloud 29, which is also subject to a movement of its own. In order to take the information relating to the vector V4 into account while removing the own movement of the cloud 29, a filtering such as that specified below is carried out;
-determining, on the basis of the information produced by one or more movement systems of the aircraft (in this example the ADIRS system), the displacement of the aircraft AC with respect to the ground between the respective instants at which the images I1 and I2 are generated;
-determining, by means of this displacement of the aircraft AC with respect to the ground, of an adjustment tolerance TOL and of the predetermined angular position Pα of the monitoring sensor, the theoretical focus of expansion 32, that is to say the position at which the displacement axis of the aircraft should be located. This position is illustrated by the circle 32, which takes the tolerance TOL into account, in the image IC of Fig. 3; and
-determining the difference between the two displacement axes, highlighted by the arrow 33 in the image IC (a projection sketch follows this list).
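By way of illustration, the theoretical focus of expansion 32 can be obtained by projecting the ground-referenced displacement of the aircraft into the image plane. The sketch below assumes a pinhole model and a rotation matrix, derived from the attitude of the aircraft and from the recorded angular position Pα of the monitoring sensor, that maps ground-frame vectors into the sensor frame; all parameter names are assumptions of the sketch.

    import numpy as np

    def theoretical_foe(displacement_ground, r_ground_to_sensor, focal_px, principal_point):
        # Express the displacement of the aircraft in the sensor frame, then
        # project it: the focus of expansion is the image of the translation
        # direction under the pinhole model.
        t = r_ground_to_sensor @ np.asarray(displacement_ground, dtype=float)
        if abs(t[2]) < 1e-9:
            raise ValueError("displacement parallel to the image plane: no focus of expansion")
        u = principal_point[0] + focal_px * t[0] / t[2]
        v = principal_point[1] + focal_px * t[1] / t[2]
        return np.array([u, v])

The difference highlighted by the arrow 33 is then the angular difference between this theoretical point and the real focus of expansion 31, which can be computed, for example, as in the sketch given after the summary of the invention and compared with the tolerance TOL.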
In addition, a filtering is carried out so as to bring the external scene back to a static scene, that is to say to remove the velocities other than that due to the displacement of the monitoring sensor. Within the framework of a method based on the focus of expansion FOE, this filtering can consist in discarding the points whose velocity directions do not converge, a simple solution for a monitoring sensor of the EVS type or when the own movements of the objects can be neglected.
The central unit 3 can include a filter for performing a temporal filtering. This filter can, for example, be integrated into the estimation unit 13. This temporal filtering makes it possible, in particular, to separate a constant misalignment from the dynamic misalignments caused by the movements of the structure of the aircraft, in particular by the flexing and the vibrations of the structure (which must be taken into account). This temporal filtering also makes it possible to take into account the measurement errors of the positioning sensors of the aircraft forming part of the set 7 (Fig. 1).
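A minimal sketch of such a temporal filtering, assuming a simple exponential low-pass filter (the filter type and the coefficient are assumptions, not the claimed filtering), is:

    def filter_angular_difference(raw_differences, alpha=0.05):
        # Low-pass filtering of the successive angular differences: the dynamic
        # part due to flexing and vibration of the structure averages out, while
        # a constant misalignment (e.g. after an impact) remains visible.
        state = None
        filtered = []
        for value in raw_differences:
            state = value if state is None else (1.0 - alpha) * state + alpha * value
            filtered.append(state)
        return filtered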
Moreover, the resolution of the monitoring sensor and the distance to the imaged objects have an influence on the measurement accuracy, and they are taken into account in the various computations carried out by the central unit 3.
During operation of the aircraft (that is to say in flight or while travelling on the ground), the detection system 1 (such as that described above) carries out the following steps, shown in Fig. 4:
E1/ via the analysis unit 4, analyzing the plurality of successive images I1, I2, ... generated by the monitoring sensor so as to determine the position in these images of the displacement axis of the aircraft, i.e. the so-called real position;
E2/ via the acquisition unit 6, obtaining ("acquisition"), from at least one movement system of the aircraft (of the set 7), data relating to the displacement of the aircraft with respect to the ground;
E3/ via the computation unit 9, computing ("computation"), on the basis of these data and of the positioning of the monitoring sensor on the aircraft obtained from the database 12, the position of the displacement axis of the aircraft in the image, i.e. the so-called theoretical position;
E4/ via the estimation unit 13, estimating ("estimation") the angular difference between the real position and the theoretical position of the displacement axis of the aircraft; and
E5/ via the comparison unit 16, comparing ("comparison") this angular difference with at least one predefined value (or tolerance) received from the database 25, so as to detect a misalignment when this angular difference is greater than said predefined value.
The method carried out by the detection system 1 also includes a filtering step E6 ("filtering") between the steps E4 and E5, in order to confirm the presence of a constant misalignment.
Moreover, the analysis step E1 comprises the following sub-steps A1 to A5:
A1/ defining one or more regions of interest F (dynamic or static) of the images I1, I2;
A2/ selecting characteristic points (or points of interest) in the region(s) of interest F;
A3/ matching the characteristic points between the images;
A4/ computing the trajectories of the points of interest (image IA); and
A5/ computing the real position of the focus of expansion 31 (image IB).
An additional sub-step A6 of removing the trajectories which are not due to the aircraft (and which diverge from the average) may also be provided.
The method carried out by the detection system 1 can also comprise:
-a step E7 of correcting ("correction") the misalignment, by rectifying the acquired images by means of a suitable user system 21 (Fig. 1); and
-a step E8 of computing a confidence level ("confidence level") as a function of the number of convergent points and of the evolution of the error.
Accordingly, the central unit 3 can include a computation element (integrated, for example, into the estimation unit 13) for determining a confidence level for the estimated value of the angular difference. This confidence level can be deduced from:
-the evolution of the misalignment over time, this evolution having to be constant after the filtering of the dynamic deformations of the structure (and progressing only at the moment of an event, such as an impact, causing it); and
-the number of points useful for computing the displacement of the aircraft: the higher the number of convergent points, the better the confidence level and the accuracy (an illustrative sketch follows this list).
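By way of illustration only, such a confidence level could be computed as in the sketch below, which combines the two criteria above; the thresholds, the score range and the weighting are assumptions of the sketch.

    import numpy as np

    def confidence_level(n_convergent_points, misalignment_history,
                         min_points=20, max_drift_rad=1e-3):
        # More convergent points -> better accuracy; a filtered misalignment that
        # stays constant over time -> a more credible estimate.
        points_score = min(1.0, n_convergent_points / float(min_points))
        drift = float(np.ptp(np.asarray(misalignment_history, dtype=float)))
        stability_score = 1.0 if drift <= max_drift_rad else max_drift_rad / drift
        return points_score * stability_score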
A detection system 1 such as that described above makes it possible, when implemented, to perform the automatic detection of a misalignment of the monitoring sensor by software, without requiring any special additional hardware, by providing suitable algorithms as described above. The detection system 1 can therefore be implemented in a system already present on board the aircraft and having sufficient availability in terms of interfaces and processor resources.

Claims (11)

1. A method for automatically detecting, during operation, a misalignment of a monitoring sensor built into an aircraft, said monitoring sensor being able to generate images of the external environment of the aircraft, said method comprising successive steps consisting, during operation, in:
a) analyzing a plurality of successive images generated by the monitoring sensor so as to determine, in at least a first one of said images, the position of an axis representative of the displacement of the aircraft between said first image and a second image, the so-called real position;
b) obtaining, from at least one movement system of the aircraft, data relating to the displacement of the aircraft between the instant at which said first image is generated and the instant at which said second image is generated;
c) computing, on the basis of these data and of the positioning of the monitoring sensor on the aircraft, the position in said first image of an axis representative of the displacement of the aircraft, the so-called theoretical position;
d) estimating the angular difference between the real position and the theoretical position of the axis representative of the displacement of the aircraft;
e) comparing this angular difference with at least one predefined value so as to detect a misalignment when this angular difference is greater than said predefined value; and
determining a confidence level of the angular difference estimated in step d).
2. The method as claimed in claim 1,
wherein said method comprises a step of transmitting the angular difference estimated in step d) and/or an item of information relating to the misalignment detected in step e) to at least one user system.
3. The method as claimed in claim 1 or 2,
wherein step a) consists in:
-analyzing the images generated by the monitoring sensor so as to select characteristic points;
-analyzing the images for each of the selected characteristic points so as to determine the motion of each of these characteristic points;
-separating, for each motion of a characteristic point, a first component of the motion caused by the displacement of the aircraft from a second component of the motion caused by the own movement of the element to which the characteristic point belongs; and
-determining, by means of the first components of the motions of said characteristic points, said real position of the axis representative of the displacement of the aircraft.
4. The method as claimed in any one of the preceding claims,
wherein the analysis of each image carried out in step a) is restricted to at least one so-called region of interest of said image.
5. The method as claimed in any one of the preceding claims,
wherein the analysis of the images carried out in step a) is performed only for every n successive images, n being an integer greater than 1.
6. The method as claimed in any one of the preceding claims,
wherein step a) consists in determining the so-called real position of a focus of expansion illustrating said real position of the axis representative of the displacement of the aircraft, step c) consists in determining the so-called theoretical position of a focus of expansion illustrating said theoretical position of the axis representative of the displacement of the aircraft, and step d) consists in estimating the difference between the real position of the focus of expansion and the theoretical position of the focus of expansion.
7. The method as claimed in any one of the preceding claims,
wherein the step of determining the confidence level of the angular difference consists in deducing the confidence level from:
-the evolution of the misalignment over time; and
-the number of points useful for computing the displacement of the aircraft.
8. The method as claimed in any one of the preceding claims,
wherein said method comprises a step of temporal filtering of the angular difference estimated in step d).
9. A system for automatically detecting, during operation, a misalignment of a monitoring sensor built into an aircraft, said monitoring sensor being able to generate images of the external environment of the aircraft, said system comprising:
-an analysis unit configured to analyze a plurality of successive images generated by the monitoring sensor so as to determine, in at least a first one of said images, the position of an axis representative of the displacement of the aircraft between said first image and a second image, the so-called real position;
-an acquisition unit configured to obtain, from a movement system of the aircraft, data relating to the displacement of the aircraft between the instant at which said first image is generated and the instant at which said second image is generated;
-a computation unit configured to compute, on the basis of these data and of the positioning of the monitoring sensor on the aircraft, the position in said first image of an axis representative of the displacement of the aircraft, the so-called theoretical position;
-an estimation unit configured to estimate the angular difference between the real position and the theoretical position of the axis representative of the displacement of the aircraft; and
-a comparison unit configured to compare this angular difference with at least one predefined value so as to detect a misalignment when this angular difference is greater than said predefined value.
10. The system as claimed in claim 9,
wherein said system comprises a transmission unit configured to transmit the angular difference estimated by the estimation unit and/or an item of information relating to the misalignment detected by the comparison unit to at least one user system.
11. An aircraft, wherein said aircraft comprises a system as claimed in claim 9 or 10.
CN201511036038.5A 2014-12-12 2015-12-11 Method and system for automatically detecting misalignment of monitoring sensors of an aircraft Active CN105716625B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1462316A FR3030091B1 (en) 2014-12-12 2014-12-12 METHOD AND SYSTEM FOR AUTOMATICALLY DETECTING A MISALIGNMENT IN OPERATION OF A MONITORING SENSOR OF AN AIRCRAFT.
FR1462316 2014-12-12

Publications (2)

Publication Number Publication Date
CN105716625A true CN105716625A (en) 2016-06-29
CN105716625B CN105716625B (en) 2021-01-22

Family

ID=52988173

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511036038.5A Active CN105716625B (en) 2014-12-12 2015-12-11 Method and system for automatically detecting misalignment of monitoring sensors of an aircraft

Country Status (3)

Country Link
US (1) US10417520B2 (en)
CN (1) CN105716625B (en)
FR (1) FR3030091B1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107883916A (en) * 2016-09-29 2018-04-06 波音公司 Method and apparatus for sense aircraft areal deformation
CN108931258A (en) * 2017-05-23 2018-12-04 空中客车运营简化股份公司 Method and apparatus for monitoring and estimating the relevant parameter of flight to aircraft
CN111099037A (en) * 2019-12-16 2020-05-05 中国航空工业集团公司洛阳电光设备研究所 Method for monitoring security of display picture of civil aircraft head-up display
CN112240785A (en) * 2019-07-18 2021-01-19 金鸡滑雪具公司 Analysis system for in-use performance of skateboard

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9538334B2 (en) 2015-01-15 2017-01-03 GEOTAB Incorporated Telematics furtherance visualization system
US10788316B1 (en) 2016-09-21 2020-09-29 Apple Inc. Multi-sensor real-time alignment and calibration
CN117689859B (en) * 2024-02-02 2024-05-10 深圳市双翌光电科技有限公司 High-precision visual alignment method, device, equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10115043A1 (en) * 2000-04-25 2002-04-04 Iteris Inc Calibration method for vehicle-mounted camera system uses evaluation of reference body image provided by camera for correcting misalignment of camera axis
EP1710749A1 (en) * 2005-04-01 2006-10-11 Audi Ag Correction of yaw angle measuring errors for driving lane detection sensors
CN1894557A (en) * 2003-12-16 2007-01-10 特里伯耶拿有限公司 Calibration of a surveying instrument
CN103328928A (en) * 2011-01-11 2013-09-25 高通股份有限公司 Camera-based inertial sensor alignment for personal navigation device
US20130307982A1 (en) * 2012-05-15 2013-11-21 Toshiba Alpine Automotive Technology Corporation Onboard camera automatic calibration apparatus
CN103733234A (en) * 2011-02-21 2014-04-16 斯特拉特克系统有限公司 A surveillance system and a method for detecting a foreign object, debris, or damage in an airfield
US20140300736A1 (en) * 2013-04-09 2014-10-09 Microsoft Corporation Multi-sensor camera recalibration

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5995681A (en) * 1997-06-03 1999-11-30 Harris Corporation Adjustment of sensor geometry model parameters using digital imagery co-registration process to reduce errors in digital imagery geolocation data
US6735348B2 (en) * 2001-05-01 2004-05-11 Space Imaging, Llc Apparatuses and methods for mapping image coordinates to ground coordinates
US20030048357A1 (en) * 2001-08-29 2003-03-13 Geovantage, Inc. Digital imaging system for airborne applications
US20040257441A1 (en) * 2001-08-29 2004-12-23 Geovantage, Inc. Digital imaging system for airborne applications
US6653650B2 (en) * 2001-11-26 2003-11-25 Northrop Grumman Corporation Streamlined method and apparatus for aligning a sensor to an aircraft
US7209161B2 (en) * 2002-07-15 2007-04-24 The Boeing Company Method and apparatus for aligning a pair of digital cameras forming a three dimensional image to compensate for a physical misalignment of cameras
US6792369B2 (en) * 2002-08-09 2004-09-14 Raytheon Company System and method for automatically calibrating an alignment reference source
US7602415B2 (en) * 2003-01-17 2009-10-13 Insitu, Inc. Compensation for overflight velocity when stabilizing an airborne camera
US7605774B1 (en) * 2004-07-02 2009-10-20 Rockwell Collins, Inc. Enhanced vision system (EVS) processing window tied to flight path
US7337650B1 (en) * 2004-11-09 2008-03-04 Medius Inc. System and method for aligning sensors on a vehicle
JP4681856B2 (en) * 2004-11-24 2011-05-11 アイシン精機株式会社 Camera calibration method and camera calibration apparatus
US7379619B2 (en) * 2005-03-09 2008-05-27 Texas Instruments Incorporated System and method for two-dimensional keystone correction for aerial imaging
US7382448B1 (en) * 2005-03-16 2008-06-03 Celestron Acquisition, Llc Alignment system for observation instruments
JP4820221B2 (en) * 2006-06-29 2011-11-24 日立オートモティブシステムズ株式会社 Car camera calibration device and program
IL181889A (en) * 2007-03-13 2010-11-30 Israel Aerospace Ind Ltd Method and system for providing a known reference point for an airborne imaging platform
RU2460187C2 (en) * 2008-02-01 2012-08-27 Рокстек Аб Transition frame with inbuilt pressing device
US8497905B2 (en) * 2008-04-11 2013-07-30 nearmap australia pty ltd. Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features
CA2723225A1 (en) * 2008-05-02 2009-11-05 Eyeic, Inc. System for using image alignment to map objects across disparate images
JP4555876B2 (en) * 2008-05-30 2010-10-06 株式会社日本自動車部品総合研究所 Car camera calibration method
JP4690476B2 (en) * 2009-03-31 2011-06-01 アイシン精機株式会社 Car camera calibration system
US20110010026A1 (en) * 2009-07-13 2011-01-13 Utah State University Calibration Method for Aerial Vehicles
JP5299231B2 (en) * 2009-11-17 2013-09-25 富士通株式会社 Calibration device
FR2954494B1 (en) * 2009-12-18 2012-07-27 Thales Sa METHOD OF CALIBRATING A MEASURING INSTRUMENT OF AN OPTRONIC SYSTEM
US20120224058A1 (en) * 2011-03-02 2012-09-06 Rosemount Aerospace Inc. Airplane cockpit video system
CA2835290C (en) * 2011-06-10 2020-09-08 Pictometry International Corp. System and method for forming a video stream containing gis data in real-time
CN102829762B (en) * 2011-06-17 2015-02-11 刘正千 Unmanned aerial vehicle image processing system and method
US9215383B2 (en) * 2011-08-05 2015-12-15 Sportsvision, Inc. System for enhancing video from a mobile camera
US9051695B2 (en) * 2011-10-18 2015-06-09 Herzog Railroad Services, Inc. Automated track surveying and ballast replacement
IL220815A (en) * 2012-07-08 2016-12-29 Israel Aerospace Ind Ltd Calibration systems and methods for sensor payloads
KR101282718B1 (en) * 2012-12-28 2013-07-05 한국항공우주연구원 Absolute misalignment calibration method between attitude sensors and linear array image sensor
US9025825B2 (en) * 2013-05-10 2015-05-05 Palo Alto Research Center Incorporated System and method for visual motion based object segmentation and tracking
US9394059B2 (en) * 2013-08-15 2016-07-19 Borealis Technical Limited Method for monitoring autonomous accelerated aircraft pushback

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10115043A1 (en) * 2000-04-25 2002-04-04 Iteris Inc Calibration method for vehicle-mounted camera system uses evaluation of reference body image provided by camera for correcting misalignment of camera axis
CN1894557A (en) * 2003-12-16 2007-01-10 特里伯耶拿有限公司 Calibration of a surveying instrument
EP1710749A1 (en) * 2005-04-01 2006-10-11 Audi Ag Correction of yaw angle measuring errors for driving lane detection sensors
CN103328928A (en) * 2011-01-11 2013-09-25 高通股份有限公司 Camera-based inertial sensor alignment for personal navigation device
CN103733234A (en) * 2011-02-21 2014-04-16 斯特拉特克系统有限公司 A surveillance system and a method for detecting a foreign object, debris, or damage in an airfield
US20130307982A1 (en) * 2012-05-15 2013-11-21 Toshiba Alpine Automotive Technology Corporation Onboard camera automatic calibration apparatus
US20140300736A1 (en) * 2013-04-09 2014-10-09 Microsoft Corporation Multi-sensor camera recalibration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHRISTIAN HERDTWECK ET AL: "Monocular Heading Estimation in Non-stationary Urban Environment", 2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107883916A (en) * 2016-09-29 2018-04-06 波音公司 Method and apparatus for sense aircraft areal deformation
CN108931258A (en) * 2017-05-23 2018-12-04 空中客车运营简化股份公司 Method and apparatus for monitoring and estimating the relevant parameter of flight to aircraft
CN108931258B (en) * 2017-05-23 2024-02-20 空中客车运营简化股份公司 Method and device for monitoring and estimating parameters relating to the flight of an aircraft
CN112240785A (en) * 2019-07-18 2021-01-19 金鸡滑雪具公司 Analysis system for in-use performance of skateboard
CN111099037A (en) * 2019-12-16 2020-05-05 中国航空工业集团公司洛阳电光设备研究所 Method for monitoring security of display picture of civil aircraft head-up display
CN111099037B (en) * 2019-12-16 2023-02-10 中国航空工业集团公司洛阳电光设备研究所 Method for monitoring security of display picture of civil aircraft head-up display

Also Published As

Publication number Publication date
FR3030091A1 (en) 2016-06-17
FR3030091B1 (en) 2018-01-26
US10417520B2 (en) 2019-09-17
CN105716625B (en) 2021-01-22
US20160171700A1 (en) 2016-06-16

Similar Documents

Publication Publication Date Title
CN105716625A (en) Method and system for automatically detecting a misalignment during operation of a monitoring sensor of an aircraft
US20210012520A1 (en) Distance measuring method and device
US10884110B2 (en) Calibration of laser and vision sensors
US8970401B2 (en) Using image sensor and tracking filter time-to-go to avoid mid-air collisions
KR102400452B1 (en) Context-aware object detection in aerial photographs/videos using travel path metadata
KR101758735B1 (en) Method for acquiring horizontal distance between camera and target, camera and surveillance system adopting the method
CN107144839A (en) Pass through the long object of sensor fusion detection
US20180075614A1 (en) Method of Depth Estimation Using a Camera and Inertial Sensor
EP3155369B1 (en) System and method for measuring a displacement of a mobile platform
US20160165191A1 (en) Time-of-approach rule
US20050238220A1 (en) Method and device for inspecting linear infrastructures
US20130093880A1 (en) Height Measurement Apparatus And Method
Kakillioglu et al. 3D sensor-based UAV localization for bridge inspection
KR20160125803A (en) Apparatus for defining an area in interest, apparatus for detecting object in an area in interest and method for defining an area in interest
Brunner et al. Combining multiple sensor modalities for a localisation robust to smoke
KR100751096B1 (en) Velocity measuring apparatus and method using optical flow
Doehler et al. Image-based drift and height estimation for helicopter landings in brownout
KR101183645B1 (en) System for measuring attitude of aircraft using camera and method therefor
US10698111B2 (en) Adaptive point cloud window selection
Bourzeix et al. Speed estimation using stereoscopic effect
Dolph et al. Monocular Ranging for Small Unmanned Aerial Systems in the Far-Field
Roberts et al. Inertial navigation sensor integrated motion analysis for obstacle detection
JP2023141392A (en) Gas monitoring method, gas monitoring device, gas monitoring system and gas monitoring program
Christensen et al. Assessing Sequential Monoscopic Images for Height Estimation of Fixed-Wing Drones.
KR20220082202A (en) Stereo image rectification method for detecting 3D objects of unmanned aerial vehicles

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant