WO2011159288A2 - Gimbal positioning with target velocity compensation - Google Patents

Gimbal positioning with target velocity compensation

Info

Publication number
WO2011159288A2
WO2011159288A2 (PCT/US2010/038738)
Authority
WO
WIPO (PCT)
Prior art keywords
target
velocity
gimbal
change
user input
Prior art date
Application number
PCT/US2010/038738
Other languages
English (en)
Other versions
WO2011159288A3 (fr)
Inventor
Stewart W. Evans
Anca G. Williams
Original Assignee
Flir Systems, Inc.
Priority date
Filing date
Publication date
Application filed by Flir Systems, Inc. filed Critical Flir Systems, Inc.
Priority to PCT/US2010/038738
Publication of WO2011159288A2
Publication of WO2011159288A3


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00 Equipment not otherwise provided for
    • B64D47/08 Arrangements of cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/22 Aiming or laying means for vehicle-borne armament, e.g. on aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782 Systems for determining direction or deviation from predetermined direction
    • G01S3/785 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/64 Imaging systems using optical elements for stabilisation of the lateral and angular position of the image
    • G02B27/644 Imaging systems using optical elements for stabilisation of the lateral and angular position of the image compensating for large deviations, e.g. maintaining a fixed line of sight while a vehicle on which the system is mounted changes course

Definitions

  • Imaging systems, light sources, weapons, and other devices can be mounted and used on a variety of supports.
  • Moving vehicles, including various aircraft, watercraft, and ground vehicles, can provide versatile supports capable of transporting such devices.
  • Many devices benefit from being easily and accurately pointed at a desired target.
  • Gimbal systems can be used, alone or with gyroscopic stabilization, to point such devices easily and accurately, without necessarily having to reorient the supports to which the devices are mounted.
  • Gimbal systems are any device-mounting mechanisms that include at least two different, typically mutually perpendicular, axes of rotation, thus providing angular movement in at least two directions (e.g., pan and tilt, among others).
  • A gimbal system can include one or more constituent gimbals, each of which can rotate relative to one or more other constituent gimbals and/or a supported payload.
  • A gimbal system also can include corresponding motors for rotating the various gimbals, control systems for controlling the various motors and/or payload components, gyroscopes for stabilizing the payload, and/or any other components used to aim and/or otherwise control the payload.
  • One use of gimbal systems is tracking a target located at some position in three-dimensional space, such as an object on the surface of the ground or in the air, with a sensor or designator mounted within a gimbal.
  • A gimbal mounted to a moving platform, such as an aircraft, may be configured to maintain its line of sight toward a particular target position once that position has been determined.
  • However, keeping the gimbal pointing toward a target object becomes more complicated when the target itself is moving. Accordingly, improved gimbal tracking systems are needed to track moving targets effectively.

Summary
  • the present disclosure provides a gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight ("los") of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input information regarding a target velocity into a gimbal controller.
  • Figure 1 is a view of an exemplary gimbal system with target velocity compensation mounted to an exterior of a support platform (namely, a helicopter), in accordance with aspects of the present disclosure.
  • Figure 2 is a schematic view of selected aspects of the gimbal system of Figure 1, including a user interface, in accordance with aspects of the present disclosure.
  • Figure 3 is a schematic view of a gimbal system with target velocity compensation being used to track a target, in accordance with aspects of the present disclosure.
  • Figure 4 illustrates an earth-centered earth-fixed (ECEF) ("e") coordinate system, in accordance with aspects of the present disclosure.
  • Figure 5 illustrates a navigation ("n") coordinate system, in accordance with aspects of the present disclosure.
  • Figure 6 illustrates an aircraft body ("b") coordinate system, in the context of a gimbal system mounted on a helicopter, in accordance with aspects of the present disclosure.
  • Figure 7 illustrates gimbal mount ("gm"), gimbal payload ("gp"), and optical ("o") coordinate systems, in accordance with aspects of the present disclosure.
  • Figure 8 is a vector diagram depicting vectors used to determine the line-of-sight-to-target vector, in accordance with aspects of the present disclosure.
  • Figure 9 is a flow chart depicting an exemplary method of tracking a moving target with a gimbal-mounted sensor, in accordance with aspects of the present disclosure.
  • the present disclosure provides a gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight ("los") of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input a target velocity into a gimbal controller.
  • FIG. 1 shows an exemplary gimbal system 10 with target velocity compensation, in accordance with aspects of the present disclosure.
  • Gimbal system 10 may include a turret unit 12 (also termed a gimbal apparatus) mounted on a support platform 14; the turret unit supports and orients a payload that may include a tracking device.
  • In the depicted embodiment, the support platform is a helicopter, with the turret unit mounted on the exterior of the vehicle.
  • A user interface unit and portions of an associated controller may be located inside the vehicle or, in some cases, remotely (e.g., in a command center).
  • The turret unit may have a compact, aerodynamic configuration, with sensitive components, such as electronics and/or the payload, enclosed for protection from ambient (external) air, to minimize exposure to moisture, salt, particulates, etc.
  • The position of the support platform, if movable, may be determined as a function of time using any suitable mechanism(s), such as a global positioning system (GPS) device and/or an inertial navigation system (INS) associated with the platform, among others.
  • The tracking device in the gimbal payload may, for example, be a camera, a laser, a sensor, and/or any other device capable of sensing, marking, illuminating, or otherwise distinguishing or identifying a desired target.
  • The target may be stationary or moving with a nonzero velocity, and may be disposed on the surface of the earth or at some other position within the line of sight of the tracking device.
  • The system may include one or more controllers.
  • A controller, as used herein, is any device that operates the gimbal system, and components thereof, including tracking, generally automatically.
  • The controller may include a local controller, such as the mount/gimbal controller described below, and/or a remote computing device.
  • The controller may include a processor that can perform arithmetic and/or logical operations on data, such as position and velocity data, and generate commands, such as servo commands, that control the gimbal assembly and reorient the associated payload.
  • The controller may include memory to store instructions and/or data.
  • The controller may be connected with user input and output devices, and portions of the controller may be connected with each other, using any suitable connections, including wires, optical conduits, and/or wireless connections.
  • Turret unit 12 may include a mounting portion 18 (also or alternatively termed a control portion), a gimbal assembly 20, and a payload 22.
  • Mounting portion 18 may be fixed to support platform 14, such that the mounting portion is at least substantially stationary with respect to the support platform.
  • Gimbal assembly 20 may be connected to and supported pivotably by mounting portion 18.
  • The mounting portion thus may function as a connecting bridge between support platform 14 and gimbal assembly 20.
  • Payload 22, in turn, may be connected to and supported by gimbal assembly 20, and may be oriented controllably with respect to the mounting portion (and the support platform) by driven motion (e.g., motor-driven motion) of gimbal assembly 20.
  • Mounting portion 18, alone or collectively with gimbal assembly 20, may form a chamber 24 in which internal components of the mounting portion are housed.
  • The chamber may provide a separate internal air space, which may be desiccated and isolated physically (although not necessarily sealed hermetically), to protect the internal components from water vapor, dust, rain, snow, debris, insects, etc.
  • These internal components may include electronic components, generally termed electronics 26 (e.g., one or more circuit boards). Electronics 26 may form at least a portion of a controller 27 of the turret unit.
  • The controller may be in communication with a user interface unit 28, which may permit a user to communicate with the controller, such as by inputting commands to the controller and/or receiving data (e.g., image and/or video data) from the controller.
  • The user interface unit may include a joystick 30 or other user input device(s), for inputting commands, and a display 32 or other user output device(s), for receiving and presenting data.
  • The display may include a reference mark 34, such as crosshairs, with which an image 36 of the target may be aligned or otherwise referenced.
  • The mounting portion may further comprise a drive assembly 40 to drive pivotal motion of the gimbal assembly with respect to the mounting portion about a first axis 42 (e.g., an azimuthal axis).
  • The drive assembly may include a drive motor, one or more gears, and the like.
  • The mounting portion also may comprise a pointing sensor 43 (e.g., an encoder), a bearing, and the like.
  • Turret unit 12 may be utilized to aim payload 22, such as a camera or marker, with respect to support platform 14.
  • The turret unit may aim the payload by controlled pivotal movement of constituent gimbals 44-50 of gimbal assembly 20 relative to support platform 14, based on direct input from a user (e.g., via a joystick) and/or via an automatic tracking system (e.g., a target velocity compensation system).
  • The angular orientation of the payload may be adjusted horizontally and vertically via the gimbals without changing the orientation of the support platform.
  • The angular orientation of the payload also may be adjusted to compensate for changes in the orientation and/or position of the support platform and/or motion of the target.
  • The turret unit thus may allow one or more fixed and/or moving objects/scenes to be detected over time from a fixed and/or moving support platform 14.
  • Gimbal assembly 20 may include, among others, an outer gimbal 44 (also termed an azimuthal gimbal) and an inner gimbal 46 (also termed an elevational gimbal). Gimbal assembly 20 may be coupled pivotably to mounting portion 18 via outer gimbal 44, for controlled, driven pivotal movement of the entire gimbal assembly (and the outer gimbal) about first axis 42 (e.g., a generally vertical axis). Inner gimbal 46 may be pivotably coupled to mounting portion 18 through outer gimbal 44, such that the outer gimbal carries the inner gimbal.
  • Inner gimbal 46 may undergo pivotal movement about a second axis 52 (e.g., a generally horizontal axis, also termed an elevational axis) that is nonparallel (e.g., transverse and/or orthogonal) to first axis 42.
  • Payload 22 may be connected to mounting portion 18 via inner gimbal 46. Accordingly, pivotal movement of outer gimbal 44 and/or inner gimbal 46 may aim the payload in relation to first and second axes 42, 52.
  • The payload may be connected pivotably to inner gimbal 46 via one or more additional gimbals 48, 50 that pivot about one or more additional axes 54, 56.
  • Controlled pivotal movement of outer and inner gimbals 44, 46 may provide coarser adjustments to the orientation of payload 22, and controlled pivotal movement of additional gimbals 48, 50 (minor gimbals) may provide finer adjustments to the orientation (or vice versa).
  • Turret unit 12 may include a thermal regulation system 58 that operates to provide feedback-regulated control of temperature within the turret unit, and particularly within mounting portion 18.
  • An exemplary thermal regulation system is described in U.S. Provisional Patent Application Serial No. 61/296,336, filed January 19, 2010.
  • Turret unit 12 may include a position control system 62.
  • System 62 may operate to provide controlled positioning of the payload.
  • The position control system may include controller 27, a driver(s) (e.g., a driver for each gimbal), and an encoder(s) (e.g., an encoder for each gimbal).
  • The driver(s) may drive pivotal movement of each gimbal under the control of the controller.
  • The encoder(s) may measure the position of each gimbal before, during, and/or after movement by the driver(s), to provide feedback information to the controller for further operation of the driver(s) and/or to allow the current position of a gimbal to be recorded and/or reported.
  • The position control system further may include one or more gyroscopes to stabilize the position of the gimbals and/or the payload.
  • Turret unit 12 may include a power supply 64.
  • The power supply may include any mechanism for supplying power, such as electrical power, to thermal regulation system 58, position control system 62, and electronics 26, among others. Suitable power supplies may generate, condition, and/or deliver power, including AC and/or DC power, in continuous and/or pulsed modes. Exemplary power supplies may include batteries, AC-to-DC converters, DC-to-AC converters, and so on.
  • FIG. 3 schematically depicts a tracking system, generally indicated at 100, configured to maintain its line of sight toward a target.
  • Tracking system 100 may be mounted on a suitable platform 102, which may be an aircraft, among others.
  • The line of sight of the tracking device, which is generally indicated at 104, may be directed initially toward a desired target 106 using any suitable mechanism.
  • The initial target position may be acquired manually.
  • Line of sight 104 may be adjusted with a user-operated joystick or other input device connected to a gimbal controller until the tracking device acquires the target.
  • If the target position is known in some coordinate system, the target may be acquired by manually entering target coordinates into the gimbal controller.
  • Alternatively, a target position may be pre-programmed or sensed automatically (for example, through shape, size, color, pattern, and/or other forms of automated recognition), in which case user input may not be needed to acquire the target initially.
  • Tracking involves providing the targeting system with instructions that allow it to remain pointed at the target, despite movements of the platform and target.
  • The line of sight toward target 106 at a first time is indicated in Figure 3 at 108, and the line of sight toward target 106 at a second, later time is indicated at 110.
  • The movements of platform 102 that affect pointing may include both translations (i.e., overall center-of-mass movement of the platform), indicated by platform velocity vector 112, and rotations (e.g., pitching, rolling, and yawing of the platform), indicated by platform angular velocity vector 114.
  • The movements of the target that affect pointing generally include only translations, indicated by target velocity vector 116, because rotation of the target will not affect whether it remains in view.
  • The instructions for pointing may involve continuously or periodically specifying a line of sight vector, and changes thereof, from the tracking system, such as a tracking device supported by a gimbal system, to the target.
  • The determination of the line of sight vector, and any changes thereto, may involve transformations between different coordinate systems and/or rotations within a given coordinate system, as described below.
  • Positions, such as platform position and target position, may be described using three-dimensional (3D) vectors.
  • Vectors may be represented as a set of coordinates, corresponding to a magnitude and direction, in some suitable reference frame or coordinate system.
  • A displacement vector extending from a point i, such as a tracking device, to a point j, such as a target, may be represented in a coordinate system f using the notation ^fX_ij or simply ^fX, where it is understood that the denoted quantity is a vector in either case.
  • Similarly, a velocity vector pointing in the direction from point i to point j in coordinate system f may be represented as ^fV_ij or, for simplicity, just ^fV.
  • This section describes various coordinate systems that may be used to describe displacement and velocity vectors and thus the relative positions and motions of objects.
  • The coordinate systems described below with respect to aircraft and gimbals may, more generally, be described with reference to any suitable object(s).
  • Figure 4 depicts an "earth-centered earth-fixed (ECEF) frame" coordinate system (denoted herein by a superscript or subscript "e"), generally indicated at 120.
  • The ECEF frame measures position with respect to an origin 122 (O_e) at the center of the earth.
  • Positions in the ECEF frame may be provided in Cartesian coordinates (x, y, z) or geodetic coordinates (latitude, longitude, altitude), among others.
  • In Cartesian coordinates, the x-axis is typically taken to point toward the prime meridian (0° longitude), the y-axis toward 90 degrees east longitude, and the z-axis toward geographic north (the north pole).
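The relationship between the geodetic and Cartesian ECEF representations can be sketched as follows. This is a standard conversion, shown here assuming the WGS-84 ellipsoid (the disclosure does not name a particular datum), and the function name is illustrative:

```python
import math

# WGS-84 ellipsoid constants (an assumption; the disclosure names no datum)
A = 6378137.0            # semi-major axis, meters
F = 1.0 / 298.257223563  # flattening
E2 = F * (2.0 - F)       # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert geodetic coordinates (latitude, longitude, altitude) into
    ECEF Cartesian coordinates (x, y, z), in meters."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + alt_m) * math.sin(lat)
    return x, y, z
```

A point on the equator at the prime meridian, for example, maps to (A, 0, 0), i.e., one semi-major axis along the x-axis.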
  • Figure 5 depicts a "navigation frame" coordinate system (denoted herein by a superscript or subscript "n”), generally indicated at 130.
  • The navigation frame is a local level frame that travels along with the platform with which it is associated, with its origin 132 (O_n) at some predetermined position (such as the platform center of mass) determined by the platform.
  • The navigation frame has its x-axis pointing local north (defined by the local meridian), its y-axis pointing east, and its z-axis pointing straight down, toward the center of the earth, regardless of the orientation of the aircraft.
  • Figure 6 depicts a "body frame" coordinate system (denoted herein by a superscript or subscript "b"), generally indicated at 140.
  • The body frame also moves along with the platform with which it is associated, with its x-axis pointing aircraft (or platform) forward, its y-axis pointing out the right wing (or the equivalent), and its z-axis pointing down, through the bottom of the platform, to form a right-handed coordinate system.
  • The body frame is related to the navigation frame through pitch, roll, and yaw rotations that transform from the local level frame to the actual orientation of the aircraft or other platform. Accordingly, the body frame typically has its origin 142 (O_b) at the same location as the origin of the navigation frame, such as the center of mass of the platform.
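The pitch, roll, and yaw rotations relating these two frames can be sketched as a direction cosine matrix built from the standard aerospace Z-Y-X (yaw, pitch, roll) rotation sequence. This is an assumed construction, since the patent's explicit matrices are not reproduced in this text, and the function names are illustrative:

```python
import math

def dcm_n_to_b(yaw, pitch, roll):
    """Direction cosine matrix rotating a vector from the navigation (NED)
    frame into the body frame, via yaw about z, then pitch about the
    intermediate y, then roll about the intermediate x. Angles in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cp * cy,                cp * sy,               -sp],
        [sr * sp * cy - cr * sy, sr * sp * sy + cr * cy, sr * cp],
        [cr * sp * cy + sr * sy, cr * sp * sy - sr * cy, cr * cp],
    ]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]
```

For example, with a 90-degree yaw (nose east) and zero pitch and roll, a north-pointing navigation vector ends up pointing out the left side of the body (negative body y).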
  • Figure 7 depicts three additional coordinate systems, referenced with respect to a gimbal system, and the relationships between them: (1) the "gimbal mount frame" coordinate system (denoted herein by a superscript or subscript "gm" and generally indicated at 150), (2) the "gimbal payload frame" coordinate system (denoted herein by a superscript or subscript "gp" and generally indicated at 160), and (3) the "optical frame" coordinate system (denoted herein by a superscript or subscript "o" and generally indicated at 170).
  • The gimbal mount frame 150 defines the physical mounting point of the gimbal to the aircraft and its unrotated orientation. It has its origin 152 (O_gm) at the top of the gimbal bolt pattern, and when the gimbal is at zero azimuth and elevation (i.e., unrotated) relative to the mount, the x-axis of the gimbal mount frame points along the line of sight of the active sensor, the y-axis points out the right side of the gimbal, and the z-axis points down through the bottom of the gimbal.
  • The gimbal payload frame 160 defines the orientation of the gimbal relative to the gimbal mount frame. It has its origin 162 (O_gp) at the center of rotation of the gimbal (like the optical frame described below), its x-axis pointing out from the "ear" of the gimbal, its z-axis pointing along the payload bore-sight, and its y-axis finishing a right-handed coordinate system. Because, as Figure 7 depicts, the gimbal payload frame is related to the optical frame by a fixed (constant) rotation, it is assumed in the remainder of this description that the gimbal payload frame 160 will be calibrated to the optical frame 170.
  • The optical frame 170, which is closely related to the two other gimbal frames of reference, has its origin 172 (O_o) at the center of rotation of the gimbal and is defined by the direction of the true line of sight of the active gimbal sensor.
  • A gimbal controller typically sends rotation commands to a gimbal in the form of azimuth, elevation, and/or roll angles relative to the existing line of sight of this active sensor, thus supplying commands in the optical frame.
  • Tracking systems make use of the known, calculable, and/or estimated positions and motions of the tracking system and target.
  • Typically, the position, orientation, and motion of the tracking system will be known and may be referred to herein as the "navigation solution."
  • In some cases, the position and motion of the target will also be known.
  • In other cases, the position and motion of the target may be calculated or estimated based on known system parameters and/or user input.
  • The movement of the platform may be described using a flight path vector, such as a flight path trajectory vector or a flight path velocity vector. These vectors may be represented with respect to any coordinate system.
  • The flight path trajectory vector is a position vector of known length (such as unit length) that points in the instantaneous direction of the platform motion.
  • The flight path velocity vector is a function of position and time (because velocity is defined as a change in position with respect to time, for example, in meters per second) that also points in the instantaneous direction of the platform motion.
  • The symbol used herein for the direction of the flight path is "F".
  • Using this notation, a trajectory vector in the direction of the flight path, referenced to the navigation frame, would be denoted as ^nX_OF, where the "O" indicates that the vector starts at the origin of the navigation frame, the "F" indicates that the vector points in the direction of the flight path, and the "n" indicates that the vector is given in the navigation coordinate system.
  • Similarly, a velocity vector in the direction of the flight path would be expressed in the navigation frame as ^nV_OF.
  • The symbol that may be used herein in a similar manner to denote the direction of flight path acceleration is "FA".
  • The position of the target may be described using a target position vector. This vector may be represented with respect to any coordinate system.
  • The symbol used herein for the target is "T". Using this notation, a target position expressed in the ECEF coordinate system would be denoted by ^eX_eT.
  • The movement of the target (i.e., the change in the position of the target over time) may be described using a target velocity vector, denoted by the symbol V.
  • This vector also may be represented with respect to any coordinate system.
  • The velocity vector represents the discrete-time derivative of the target position vector. Using this notation, a target velocity vector expressed in the ECEF frame would be denoted by ^eV_eT.
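The disclosure's central position update, in which a subsequent target position is computed from a previous position by integrating the estimated target velocity over time, can be sketched minimally as follows (a forward-Euler step; the function name is illustrative):

```python
def propagate_target_position(x_et, v_et, dt):
    """Estimate the next target position in the ECEF frame by integrating a
    constant estimated target velocity over one time step dt, i.e.
    x(t + dt) = x(t) + v * dt.
    x_et: current target position ^eX_eT (meters)
    v_et: estimated target velocity ^eV_eT (meters/second)
    dt:   time step (seconds)"""
    return [x + v * dt for x, v in zip(x_et, v_et)]
```

Repeating this step at each control cycle, with the velocity estimate refreshed from user input or a tracking filter, yields the sequence of target positions used to steer the gimbal.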
  • Vectors, such as a line-of-sight displacement vector between a tracking device and a target, are independent of frame or coordinate system. However, the representation of a vector in one frame or coordinate system typically will differ from the representation of the same vector in another coordinate system.
  • This section describes exemplary mathematical methods, such as the use of rotation matrices, for transforming vectors from one coordinate system to another. Such transformations may be effected, for example, using a suitable direction cosine matrix (DCM).
  • The notation C_x^y will be used to denote a matrix that transforms a vector from coordinate system x to coordinate system y, where the transformation is accomplished by multiplying the original vector by the DCM to obtain a transformed vector, according to the ordinary methods of linear algebra.
  • For example, a DCM from the ECEF frame to the navigation frame can be constructed from three successive rotations, the first being a longitude rotation about the z-axis. (The explicit rotation matrices are not reproduced in this text.)
  • For the inverse transformation, the transpose of the above matrix may be used.
  • This transpose matrix can be found through standard methods of linear algebra.
  • C_gm^o: Gimbal Mount Frame to Optical Frame. This transformation may be accomplished with a DCM derived in similar fashion to the previous transformation matrices, in which El stands for the elevation angle and Az for the azimuth angle (both of the line of sight with respect to the gimbal mount), and s and c stand for sine and cosine, as before. (The matrix itself is not reproduced in this text.)
  • For the inverse transformation, the transpose of the above DCM may be used.
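One standard way to build such a mount-to-optical DCM from the azimuth and elevation angles, assuming the frame conventions described above (x forward along the zero line of sight, y right, z down) and an azimuth-then-elevation rotation order, is sketched below. This is an assumed reconstruction, since the patent's matrix is not reproduced in this text:

```python
import math

def dcm_gm_to_o(az, el):
    """DCM rotating a vector from the gimbal mount frame into the optical
    frame: an azimuth rotation about the mount z-axis, followed by an
    elevation rotation about the intermediate y-axis. Angles in radians;
    with z pointing down, positive elevation tilts the line of sight up."""
    ca, sa = math.cos(az), math.sin(az)
    ce, se = math.cos(el), math.sin(el)
    return [
        [ce * ca,  ce * sa, -se],
        [-sa,      ca,      0.0],
        [se * ca,  se * sa, ce],
    ]
```

As a consistency check, the unit line-of-sight direction in mount coordinates, (cos El cos Az, cos El sin Az, -sin El), maps to (1, 0, 0) in the optical frame, i.e., straight along the sensor boresight.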
  • Optical Frame to Azimuth, Elevation. This DCM is used to rotate a vector from the optical frame (o) to the gimbal azimuth yoke frame. It is typically followed by a transformation that extracts the gimbal azimuth and elevation angles, denoted "gmb_los2azel" below.
  • The gimbal angle inputs are in spherical coordinates for azimuth, elevation, and roll. (The matrix itself is not reproduced in this text.)
  • The purpose of a tracking system according to the present disclosure is, in brief, to determine the line of sight vector from the tracking system to a target and then to adjust that line of sight, as needed, to compensate for movements of the platform and/or target.
  • Figure 8 is a vector diagram, generally indicated at 200, that depicts how to determine the line-of-sight-to- target vector 202 from the respective position vectors 204, 206 of the gimbal rotation center (GRC) 208 and the target 210 in the ECEF coordinate system, at any given instant, in accordance with aspects of the present disclosure.
  • The GRC is the center of the gimbal payload, where the rotation axes for azimuth and elevation intersect.
  • The positions of GRC 208 and target 210 in ECEF coordinates can be determined using GPS position data, INS position data, terrestrial navigation data, or the like, or a combination thereof.
  • The GRC position may be determined from the INS position using the orientation of the platform and the known translational offset between the INS and the GRC.
  • Example 1 As described above, transforming platform and target positions into gimbal rotation commands that result in pointing a sensor at a target generally may be accomplished through the application of various vector rotations and coordinate transformations. For example, using the previously introduced notation, an exemplary transformation procedure would include the following steps, where it is assumed that the gimbal and target positions are known in the ECEF coordinate system.
  • the gimbal to target displacement vector may be determined in the ECEF frame by vector subtracti n :
  • the target displacement vector then may be determined in the navigation frame and the optical frame through successive application of the appropriate transformation matrices:
  • the target displacement vector then may be rotated from the optical frame into the gimbal azimuth yoke frame:
  • the azimuth and elevation "errors" (i.e., corrections) for the gimbal may be determined from the target displacement vector in the gimbal azimuth yoke frame:
  • [AZ_E, EL_E] = gmb_los2azel(gmbX′), where gmbX′ is the target displacement vector in the gimbal azimuth yoke frame.
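The steps of Example 1 can be sketched in miniature. The frame convention (x forward, y right, z down), the function names, and the collapsing of the intermediate frame rotations (ECEF to navigation to optical to yoke) to the identity are illustrative assumptions, not the disclosure's own implementation:

```python
import math

def los_to_azel(x, y, z):
    """Azimuth/elevation (radians) of a displacement vector expressed in
    the gimbal azimuth-yoke frame. Hypothetical convention: x forward,
    y right, z down; azimuth positive right, elevation positive up."""
    az = math.atan2(y, x)
    el = math.atan2(-z, math.hypot(x, y))
    return az, el

def gimbal_errors(p_gimbal, p_target):
    """Example 1 in miniature: displacement by vector subtraction in
    ECEF, then azimuth/elevation extraction. The chain of frame
    rotations is collapsed to the identity for illustration."""
    dx = [t - g for t, g in zip(p_target, p_gimbal)]
    return los_to_azel(*dx)
```

For instance, a target directly ahead yields zero azimuth and elevation errors, while a target offset equally forward and to the side yields a 45-degree azimuth error.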
  • FIG. 3 depicts the relationship between flight path trajectory and target position vectors at two different times (t and t + dt), in accordance with aspects of the present disclosure.
  • determining the line of sight vector from a moving platform to a moving target at various times involves determining the position of the target in the optical frame as a function of time, based on (i) a known or calculable change in position of the platform, and (ii) a known, calculable, or user-provided change in position of the target. This can generally be done in conjunction with the previously described techniques for determining a line of sight vector and gimbal rotation corrections at a single instant of time, by including transformations that compensate for the platform and target motions.
  • the rates of change of the gimbal azimuth and elevation angles to keep the gimbal line of sight pointed toward the target may be determined from (i) an initially determined target displacement vector in the navigation frame, (ii) initially determined azimuth and elevation angles to point the gimbal toward the target, and (iii) the known or estimated velocities of the gimbal platform and the target, as described below.
  • a new target displacement vector may be determined from the previous target displacement vector by subtracting the change in position of the platform and adding the change in position of the target: X(t + Δt) = X(t) − ΔX_platform + ΔX_target.
  • the new target displacement vector then may be rotated into the optical frame, and then into the gimbal azimuth yoke frame, by applying the appropriate transformation matrices.
  • AZ_rate = [AZ_E(t + Δt) − AZ_E(t)] / Δt, and similarly EL_rate = [EL_E(t + Δt) − EL_E(t)] / Δt
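The propagation and finite-difference rate computation described above can be sketched as follows; the frame convention and function names are illustrative assumptions:

```python
import math

def propagate(x, v_platform, v_target, dt):
    """X(t+dt) = X(t) - V_platform*dt + V_target*dt, navigation frame."""
    return [xi - vp * dt + vt * dt
            for xi, vp, vt in zip(x, v_platform, v_target)]

def azel(x, y, z):
    # hypothetical x-forward / y-right / z-down convention
    return math.atan2(y, x), math.atan2(-z, math.hypot(x, y))

def azel_rates(x, v_platform, v_target, dt=1e-3):
    """Finite differences: AZ_rate = [AZ(t+dt) - AZ(t)] / dt, and
    likewise for elevation."""
    az0, el0 = azel(*x)
    az1, el1 = azel(*propagate(x, v_platform, v_target, dt))
    return (az1 - az0) / dt, (el1 - el0) / dt
```

A platform sliding sideways past a stationary target 100 m ahead at 10 m/s produces an azimuth rate of about −0.1 rad/s, while a target matching the platform's velocity produces zero commanded rates.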
  • the disclosed techniques may be generalized to include the possibility of constant acceleration or even variable acceleration of the platform and/or the target.
  • the main effect of such complications is to increase the amount of data processing required to carry out the calculations.
  • E. User Input Although techniques for retaining a gimbal line of sight toward a target have already been described above, in some cases it may be desirable to provide additional features that allow a user to adjust a gimbal orientation manually, for example to compensate for unpredictable changes in target velocity. In such cases, the target velocity as a function of time may not be a known or predictable quantity, so that manual user input, possibly in combination with one or more tracking algorithms, may be the best method of tracking the moving target. User input may be accomplished, for example, in conjunction with an input device such as a joystick or similar directional indicator, as described below.
  • Figure 9 is a flow chart depicting a method, generally indicated at 250, of tracking a moving target with a gimbal-mounted sensor, in accordance with aspects of the present disclosure.
  • the method of Figure 9 generally includes combining one or more tracking algorithms, such as those described above, with user input indicating a change in target velocity.
  • To correctly calibrate a user's input relating to target velocity it is helpful to determine an initial target velocity, as indicated at step 252. This can be accomplished using the "navigation solution" for the platform (i.e., the collection of known platform motions) in combination with the target altitude and position at two different times.
  • the navigation solution for the platform includes the line of sight velocity in the navigation frame (i.e., the platform linear velocity), the line of sight acceleration in the navigation frame (the platform linear acceleration), the line of sight attitude with respect to the navigation frame, and the rotation rate of the platform with respect to the navigation frame.
  • the target velocity then may be calculated as follows. Let x̂_los be a unit vector in the direction of the line of sight, i.e., pointing along the x-axis of the optical frame.
  • the attitude of the line of sight at time t and after a time dt can be computed in any desired coordinate system.
  • In the navigation frame, for example, the attitude of the line of sight at the two times may be differenced to yield the rotation rate of the line of sight.
  • the change in attitude of the line of sight toward the target may be sensed and used to calculate the rotation rate of the line of sight in at least two ways. If an internal inertial navigation system is used (e.g., an inertial measurement unit disposed inside the gimbal payload), it may sense the attitude of the line of sight directly. If an external inertial navigation system is used (e.g., an inertial measurement unit mounted elsewhere on the platform body), it may sense the attitude of the platform, which can then be rotated by the gimbal angles to compute the attitude of the line of sight at the relevant times.
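For the external-INS case, rotating the sensed platform attitude by the measured gimbal angles can be sketched as a composition of rotation matrices. The rotation order and sign conventions here are illustrative assumptions:

```python
import math

def rot_z(a):
    """Rotation matrix for the azimuth (yaw) angle about the z axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(e):
    """Rotation matrix for the elevation (pitch) angle about the y axis."""
    c, s = math.cos(e), math.sin(e)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def los_attitude(R_nav_to_body, az, el):
    """External-INS case: the platform attitude sensed by the INS is
    rotated by the measured gimbal azimuth and elevation to yield the
    line-of-sight attitude at the relevant time."""
    return matmul(rot_y(el), matmul(rot_z(az), R_nav_to_body))
```

With zero gimbal angles the line-of-sight attitude equals the platform attitude, and the result remains an orthonormal rotation matrix for any angles.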
  • the platform position is also known as a function of time, for example from GPS data. However, in some cases the platform position may be computed as a function of time using known or measured platform velocity and/or acceleration. In either case, determining the target position is a matter of finding the intersection of the line of sight vector from the known position of the gimbal with the ellipsoid surface of the earth.
  • This calculation can be repeated as a function of time so that the velocity of the target can be computed:
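The line-of-sight/ellipsoid intersection and the resulting finite-difference target velocity can be sketched as a simple quadratic ray-ellipsoid intersection in ECEF coordinates; the WGS-84 semi-axes are used for concreteness, and all names are illustrative:

```python
import math

A_E, B_E = 6378137.0, 6356752.314245  # WGS-84 semi-axes, meters

def ellipsoid_intersection(p, u):
    """First crossing (t > 0) of the ray p + t*u with the earth
    ellipsoid x^2/A^2 + y^2/A^2 + z^2/B^2 = 1 (ECEF coordinates).
    Returns None if the line of sight misses the earth."""
    s = (1.0 / A_E**2, 1.0 / A_E**2, 1.0 / B_E**2)
    a = sum(si * ui * ui for si, ui in zip(s, u))
    b = 2.0 * sum(si * pi * ui for si, pi, ui in zip(s, p, u))
    c = sum(si * pi * pi for si, pi in zip(s, p)) - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer root = first surface hit
    return [pi + t * ui for pi, ui in zip(p, u)] if t > 0.0 else None

def target_velocity(x0, x1, dt):
    """Finite-difference target velocity from two successive
    ellipsoid-intersection points."""
    return [(b - a) / dt for a, b in zip(x0, x1)]
```

For example, looking straight down the ECEF x-axis from twice the equatorial radius intersects the surface at the equator, and repeating the intersection at two times yields the target's ground velocity.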
  • the gimbal controller may be programmed to follow a point in space that moves with a constant velocity equal to the initially determined velocity.
  • if the target does in fact move with constant velocity, this allows the system to correctly track the moving target with no additional operator input.
  • the target velocity will generally not be exactly constant due to the curvature of the earth, even if the speed of the target is assumed constant. Therefore, tracking a target moving on the ground involves following the motion of the target as it moves on the curved ellipsoid of the earth's surface. This may be accomplished by finding the ellipsoid intersection at each iterated target position, i.e., by recomputing the intersection of the line of sight with the ellipsoid at each time step.
  • a nonzero joystick input may be transformed by a processor into a change in target velocity, and added to the previous target velocity by a velocity integrator. The new velocity then may be assumed constant (taking into account the curvature of the earth for a ground-based target, as described above) until further user input regarding velocity is provided.
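The joystick-to-velocity integration described above can be sketched as a minimal velocity integrator; the class, method, and parameter names are hypothetical:

```python
class VelocityIntegrator:
    """Accumulates user-commanded changes in target velocity by vector
    addition, holding the result constant between inputs. The gain
    parameter (an assumption here) scales how much of the joystick
    deflection is interpreted as a change in velocity."""

    def __init__(self, v0=(0.0, 0.0, 0.0), gain=1.0):
        self.v = list(v0)   # current target velocity estimate
        self.gain = gain    # sensitivity of the input device

    def apply_input(self, dv):
        """Vector-add a joystick-derived change in target velocity and
        return the new (held-constant) velocity."""
        self.v = [v + self.gain * d for v, d in zip(self.v, dv)]
        return tuple(self.v)
```

Between calls to `apply_input`, the held velocity is simply treated as constant, matching the behavior described above.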
  • this input is passed through a transducer configured to convert a signal received from the user input device into a corresponding change in gimbal orientation, as indicated at step 256.
  • the target velocity corresponding to the user-induced gimbal motion is determined. This can be accomplished, for example, by comparing the user-induced gimbal motion to the preexisting gimbal tracking motion, which has already been associated with a target velocity as described previously.
  • a velocity integrator combines the change in target velocity associated with the user input with the previous value of the target velocity, through vector addition.
  • Steps 254, 256, 258 and 260 may be repeated any number of times, depending on the user input. In other words, the user may have any desired effect on the target velocity communicated to the gimbal controller.
  • the fact that the target velocity is integrated each time means that user input may be used primarily for target velocity changes and corrections, since the target velocity will be treated as constant (or constant along the ellipsoid) in the absence of user velocity input.
  • a non-integrated mode may be provided in which the velocity integrator is switched off. In that case, user input might be required to maintain any target velocity at all, or at least any target velocity other than a constant target velocity.
  • the velocity integrator may include various features to add convenience when providing user input relating to target velocity.
  • the integrator may include an adjustable gain feature that can be used to adjust the percentage of the user input to the transducer that is interpreted as a change in velocity, thus effectively allowing an adjustment to the sensitivity of the transducer to user input. For instance, a less sensitive setting may be more convenient in urban environments, where target velocities are generally expected to be smaller, whereas a more sensitive setting may be more convenient when viewing a highway, where target velocities are generally expected to be relatively large.
  • the velocity integrator may include an optional time out decay mode, in which the target velocity decays toward zero after some predetermined amount of time without user input.
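The time-out decay mode can be sketched as follows. The per-step decay law and all names are illustrative assumptions; the disclosure specifies only that the target velocity decays toward zero after a predetermined amount of time without user input:

```python
class DecayingVelocityIntegrator:
    """Velocity integrator with an optional time-out decay mode: once
    `timeout` seconds pass without user input, the held target velocity
    is attenuated toward zero each control step."""

    def __init__(self, timeout=5.0, decay_rate=0.5):
        self.v = [0.0, 0.0, 0.0]
        self.timeout = timeout        # seconds without input before decay
        self.decay_rate = decay_rate  # fraction of velocity shed per second
        self.last_input = 0.0         # time of most recent user input

    def apply_input(self, dv, now):
        """Vector-add a user-commanded change in target velocity."""
        self.last_input = now
        self.v = [v + d for v, d in zip(self.v, dv)]

    def current_velocity(self, now, dt):
        """Velocity handed to the gimbal controller for this dt step;
        decays toward zero once the time-out has elapsed."""
        if now - self.last_input > self.timeout:
            f = max(0.0, 1.0 - self.decay_rate * dt)
            self.v = [f * v for v in self.v]
        return tuple(self.v)
```

Within the time-out window the velocity is held constant; after it, each step shrinks the velocity until the gimbal effectively stops tracking.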
  • the target position may be propagated forward as a function of time, as indicated at step 262, based on the integrated value of target velocity.
  • the updated target position vector in the optical frame is determined, for instance by the methods described previously.
  • the target position vector is transformed into a gimbal line of sight correction, also as described previously, and at step 268, the calculated correction is communicated to the gimbal controller in the form of rates of change in azimuth and/or elevation.
  • the gimbal rotates in response to instructions from the gimbal controller.
  • the resulting gimbal motion is responsive to the integrated target velocity, including any initially determined target velocity plus all user-supplied changes, while compensating for platform motions at the same time.
  • F. Input Filters Various filters may be applied to the user input. For example, a combination of low-pass and high-pass filters may be applied so that brief user input, such as a tap on a joystick, is interpreted as a slight change or "nudge" in the position of the target rather than a change in target velocity. On the other hand, longer user input, such as a continuous push of a minimum duration on a joystick, may be interpreted as a change in the velocity of the target. Both types of joystick input may be processed sufficiently for a modified position and/or velocity of the target to be determined and transformed into gimbal tracking commands. Other similar filters may be provided to allow joystick input having a particular duration, strength, or other qualities to be interpreted as changes in the target position and/or velocity. Any such filters may be configured to be selectively turned on and off by the user. VI. Mounting/Control Portions
  • a mounting or control portion may be any part of a gimbal apparatus that connects a gimbal assembly to a support platform and/or that carries electronics providing one or more aspects of gimbal apparatus control and/or data processing.
  • the mounting/control portion may form an end region of a turret unit. Also, this portion may be unstabilized and may be termed a "skillet."
  • the mounting/control portion may support a gimbal assembly and may be connected directly to at least one gimbal and connected indirectly to one or more additional gimbals of the gimbal assembly.
  • the mounting/control portion may be attached to a support platform (see Section IX).
  • the mounting/control portion may be mounted to a support platform via any suitable mechanism, with any suitable orientation.
  • such mounting may be static or dynamic, for example, involving additional gimbal(s) to provide dynamic mounting.
  • the mounting/control portion may carry and/or contain any suitable components of a turret unit, including a controller(s), power supply, electrical conduits or other electrical circuitry, a fan(s), and/or the like. Details of the mounting mechanism, including orientation and offsets, may be important in determining the various coordinate systems and coordinate transformations required to convert information regarding platform and target positions into line-of-sight pointing directions for the gimbal system.
  • a gimbal assembly is a hierarchical arrangement of two or more pivotable members (gimbals).
  • a gimbal assembly may include a higher-order gimbal pivotally coupled directly to a mounting portion.
  • the gimbal assembly also may include a lower-order gimbal pivotally coupled directly to the higher-order gimbal and indirectly to the mounting portion, such that the lower-order gimbal is carried by the higher-order gimbal.
  • pivotal motion of the higher-order gimbal in relation to the mounting portion results in collective pivotal motion of both gimbals, whereas pivotal motion of the lower-order gimbal may be independent of the higher-order gimbal.
  • the gimbal assembly further may include any suitable number of additional lower-order gimbals that are pivotally coupled directly to a relatively higher-order gimbal and/or that carry an even lower-order gimbal.
  • a gimbal assembly may be configured to rotate a payload about any suitable or desired number of axes, including 2, 3, 4, 5, 6, or more axes.
  • some of the axes of rotation may be collinear or coplanar.
  • the axes of rotation typically are either orthogonal to one another or parallel to (including collinear with) one another, although this is not required.
  • parallel axes of rotation, or substantially parallel axes, can be used to provide increased precision, with a first level of rotation about a first axis providing coarser, large-magnitude adjustments and a second level of rotation about a second axis parallel or substantially parallel to the first axis providing finer, small-magnitude adjustments.
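The coarse/fine division of labor between two parallel axes can be illustrated with a simple quantization sketch; the step size and names are hypothetical:

```python
def split_coarse_fine(angle, coarse_step=0.01):
    """Split a commanded rotation (radians) into a coarse command,
    quantized to the coarse gimbal's step size, and a fine residual
    for the second, parallel axis. Purely illustrative of the
    coarse/fine adjustment idea."""
    coarse = round(angle / coarse_step) * coarse_step
    fine = angle - coarse
    return coarse, fine
```

The two commands always recombine to the requested angle, with the fine axis never asked to move more than half a coarse step.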
  • Each gimbal of a gimbal assembly may be capable of any suitable pivotal motion.
  • the pivotal motion may be a complete revolution (360 degrees) or less than a complete revolution.
  • the gimbal assembly may include a hierarchical arrangement of major and minor gimbal pairs.
  • the major gimbal pair may be a pair of gimbals having a relatively larger range of angular motion (such as greater than about 90 degrees).
  • the minor gimbal pair may be a pair of gimbals that are pivotally coupled to the major gimbal pair (and indirectly to the mounting portion) and having a relatively smaller range of angular motion (such as less than about 90 degrees).
  • Each gimbal of a gimbal assembly may be driven controllably by a driver.
  • An exemplary driver that may be suitable is described in U.S. Patent No. 7,561,784, issued July 14, 2009, which is incorporated herein by reference.
  • the driver(s) may be controlled, at least in part, by the target velocity compensation system, to facilitate tracking a target such as a moving target.
  • a payload is any device that is carried and aimed by a gimbal assembly.
  • the payload may include one or more detectors and/or emitters, among others.
  • a detector generally comprises any mechanism for detecting a suitable or desired signal, such as electromagnetic radiation, an electric field, a magnetic field, a pressure or pressure difference (e.g., sonic energy), a temperature or temperature difference (e.g., thermal energy), a particle or particles (e.g., high energy particles), movement (e.g., an inertial measurement device), and/or the like.
  • An emitter generally comprises any mechanism for emitting a suitable or desired signal, such as electromagnetic radiation (e.g., via a laser or radar), sonic energy, and/or the like.
  • the payload generally is in communication with a controller that sends signals to and/or receives signals from the payload.
  • the payload may be coupled (generally via a controller) to a display such that signals from the payload may be formatted into a visual form for viewing on the display.
  • the payload also may be coupled (again generally via a controller) to the target velocity compensation system, so that information about a target pertinent to tracking the target can be gathered, presented, and/or assessed.
  • the payload may form a detection portion (or all) of an imaging system.
  • An imaging system generally comprises any device or assembly of devices configured to generate an image, or an image signal, based on received energy, such as electromagnetic radiation.
  • an imaging system detects spatially distributed imaging energy (e.g., visible light and/or infrared radiation, among others) and converts it to a representative signal.
  • Imaging may involve optically forming a duplicate, counterpart, and/or other representative reproduction of an object or scene, especially using a mirror and/or lens. Detecting may involve recording such a duplicate, counterpart, and/or other representative reproduction, in analog or digital formats, especially using film and/or digital recording mechanisms.
  • an imaging system may include an analog camera that receives radiation (e.g., optical radiation) and exposes film based on the received radiation, thus producing an image on the film.
  • an imaging system may include a digital camera that receives radiation (e.g., optical radiation) and generates a digital image signal that includes information that can be used to generate an image that visually portrays the received radiation.
  • an imaging system may include an active component such as a laser to illuminate a scene and form an image from one or more reflections and/or emissions induced by the laser.
  • Imaging energy may include any type of energy, particularly electromagnetic energy, from which an image can be generated, including but not limited to ultraviolet radiation, visible light, and infrared radiation.
  • Suitable detectors for an imaging system may include (1) array detectors, such as charge-coupled devices (CCDs), charge-injection devices (CIDs), complementary metal-oxide semiconductor (CMOS) arrays, photodiode arrays, and the like, and/or (2) arrays of point detectors, such as photomultiplier tubes (PMTs), photodiodes, pin photodiodes, avalanche photodiodes, photocells, phototubes, and the like.
  • Detectors may be sensitive to the intensity, wavelength, polarization, and/or coherence of the detected imaging energy, among other properties, as well as spatial and/or temporal variations thereof.
  • Special-purpose detectors may include millimeter-wave (MMW) imagers, light detection and ranging (LIDAR) imagers, and mine-detection sensors, among others.
  • the imaging system also may include optics (i.e., one or more optical elements).
  • exemplary optical elements may include (1) reflective elements (such as mirrors), (2) refractive elements (such as lenses), (3) transmissive or conductive elements (such as fiber optics or light guides), (4) diffractive elements (such as gratings), and/or (5) subtractive elements (such as filters), among others.
  • the imaging system also may contain gyroscopes and/or other elements arranged to form an inertial measurement unit (IMU) on an optical bench.
  • the IMU may be used to assess the pointing angle of the line-of-sight, as well as geo-location, geo-referencing, geo-pointing, and/or geo-tracking in earth coordinates.
  • the imaging system may be capable of generating image signals based on reflection from a self-contained laser and/or other light or radiation source.
  • the generated image may or may not contain range information.
  • Such imagers may generate large amounts of heat.
  • the present disclosure may enable the use and incorporation of light detection and ranging (LIDAR) systems, such as 3-D LIDAR systems, into gimbal systems in which the large amounts of associated heat would otherwise prevent their use.
  • an imaging system may be capable of generating image signals based on two or more different types or wavebands of imaging energy.
  • the imaging system may be configured to generate a first image signal representative of visible light and a second image signal representative of infrared radiation.
  • Visible light and infrared radiation are both types of electromagnetic radiation (see Definitions); however, they are characterized by different wavebands of electromagnetic radiation that may contain or reflect different information that may be used for different purposes.
  • visible light may be used to generate an image signal that in turn may be used to create a photograph or movie showing how a scene appears to a human observer.
  • infrared radiation may be used to generate an image signal that in turn may be used to create a heat profile showing heat intensity information for a scene.
  • the imaging system may be used with any suitable set of first and second (or first, second, and third (and so on)) image signals, using any suitable wavelength bands.
  • These suitable image signals may include first and second visible wavebands, first and second infrared wavebands, mixtures of visible, infrared, and/or ultraviolet wavebands, and so on, depending on the application.
  • the imaging system may be configured to generate a first image signal representative of infrared radiation in a first waveband (e.g., short-wavelength infrared (SWIR)) and a second image signal representative of infrared radiation in a second waveband (e.g., long-wavelength infrared (LWIR)).
  • an imaging system may form composite images.
  • the composite images may be straight combinations of two or more other images. However, in some cases, one or both of the images may be processed prior to or during the process of combining the images.
  • Composite images may be formed for use in firefighting, aeronautics, surveillance, and/or the like, for example, by superimposing infrared images of hot spots, runway lights, persons, and/or the like on visible images.
  • the payload alternatively, or in addition, may include non-imaging components, such as laser rangefinders, laser designators, laser illuminators, laser communication devices, polarimeters, hyperspectral sensors, inertial measurement units (IMUs), and/or the like.
  • the gimbal system of the present disclosure may include a turret unit supported by a support platform.
  • a support platform generally refers to any mechanism for holding, bearing, and/or presenting a turret unit and its payload.
  • the support platform may be moving, movable but stationary, or fixed in relation to the earth, and may be disposed on the ground, in the air or space, or on and/or in water, among others. In any case, the support platform may be selected to complement the function of the turret unit and particularly its payload.
  • the support platform may be movable, such as a vehicle.
  • exemplary vehicles include an aircraft or airborne device (e.g., a fixed-wing piloted aircraft, pilotless remote-controlled aircraft, helicopter, drone, missile, dirigible, aerostat balloon, rocket, etc.), a ground vehicle (e.g., a car, truck, motorcycle, tank, etc.), a watercraft (e.g., a boat, submarine, carrier, etc.), or the like.
  • target velocity compensation may need to account for both target velocity and platform velocity.
  • the support platform may be fixed in position.
  • Exemplary fixed support platforms may include a building, an observation tower, and/or an observation platform, among others.
  • the support platform may be a temporarily stationary movable support, such as a hovering helicopter and/or a parked car, truck, or motorcycle, among others. In this case, target velocity compensation may only need to account for target velocity.
  • a gimbal system with a moving, temporarily stationary, or fixed support platform may be used for any suitable application(s).
  • Exemplary applications for a gimbal system include navigation, targeting, search and rescue, law enforcement, firefighting, and/or surveillance, among others.
  • the wavelength ranges identified in these meanings are exemplary, not limiting, and may overlap slightly, depending on source or context.
  • the wavelength ranges lying between about 1 nm and about 1 mm, which include ultraviolet, visible, and infrared radiation, and which are bracketed by x-ray radiation and microwave radiation, may collectively be termed optical radiation.
  • the wavelength ranges lying above about 1 mm, which include microwave radiation and radio waves, may collectively be termed the radio spectrum.
  • Ultraviolet radiation Electromagnetic radiation invisible to the human eye and having wavelengths from about 100 nm, just longer than x-ray radiation, to about 400 nm, just shorter than violet light in the visible spectrum.
  • Ultraviolet radiation includes (A) UV-C (from about 100 nm to about 280 or 290 nm), (B) UV-B (from about 280 or 290 nm to about 315 or 320 nm), and (C) UV-A (from about 315 or 320 nm to about 400 nm).
  • Visible light Electromagnetic radiation visible to the normal human eye and having wavelengths from about 400 nm, just longer than ultraviolet radiation, to about 700 nm, just shorter than infrared radiation.
  • Infrared radiation Electromagnetic radiation invisible to the human eye and having wavelengths from about 700 nanometers, just longer than red light in the visible spectrum, to about 1 millimeter, just shorter than microwave radiation.
  • Infrared radiation includes (A) IR-A (from about 700 nm to about 1,400 nm), (B) IR-B (from about 1,400 nm to about 3,000 nm), and (C) IR-C (from about 3,000 nm to about 1 mm).
  • IR radiation, particularly IR-C, may be caused or produced by heat and may be emitted by an object in proportion to its temperature and emissivity.
  • Portions of the infrared having wavelengths between about 3,000 and 5,000 nm (i.e., 3 and 5 μm) and between about 7,000 or 8,000 and 14,000 nm (i.e., 7 or 8 and 14 μm) may be especially useful in thermal imaging, because they correspond to minima in atmospheric absorption and thus are more easily detected (particularly at a distance).
  • Infrared radiation also may be categorized as near infrared (NIR), short-wave infrared (SWIR), mid-wave infrared (MWIR), long-wave infrared (LWIR), and/or very long-wave infrared (VLWIR), among others.
  • Microwave Radiation Electromagnetic radiation invisible to the human eye and having wavelengths from about 1 millimeter, just longer than infrared radiation, to about 1 meter, just shorter than radio waves.
  • Radio Waves Electromagnetic radiation invisible to the human eye and having wavelengths greater than about 1 meter, just longer than microwave radiation. In practice, radio waves typically have wavelengths less than about 100,000 kilometers, which corresponds to extremely low frequency waves.
  • A1. An optical system for tracking a moving target comprising (1) a pointing device attachable to a support platform; (2) a sensor, supported by the pointing device, and pivotably orientable with respect to the support platform about a pair of nonparallel axes by controlled driven motion of the pointing device, to provide pan and tilt movement of the sensor, such that the sensor can be pointed at the target; (3) a user input device configured to allow a user to provide information regarding position and velocity of the target; and (4) a controller programmed to receive the information provided by the user and, based on that information, to prepare and transmit instructions to the pointing device to orient and maintain its line of sight toward the target, thereby allowing the sensor to track the target.
  • A2. The system of paragraph A1, wherein a position of the support platform is determined by at least one of a global positioning system and an inertial navigation system, and wherein the controller is configured to calculate the position of the target using the position of the platform and the determined direction from the pointing device to the target.
  • A3. The system of paragraph A1 , wherein the target is moving on the surface of the Earth, and wherein the controller is configured to calculate the position of the target by calculating the intersection of a line extending from the pointing device toward the target with an ellipsoid representing the surface of the Earth.
  • A4. The system of paragraph A1, wherein the information provided by the user includes information used to determine a first direction from the pointing device toward the target at a first instant of time and a second direction from the pointing device toward the target at a second instant of time, and wherein the controller is configured to calculate a corresponding target velocity using the first and second determined directions.
  • A5. The system of paragraph A1, wherein the controller is configured to compute a vector sum of (i) a change in target velocity associated with the information provided by the user and (ii) a previously determined target velocity, and to cause the pointing device to maintain its line of sight toward a position moving at a velocity corresponding to the vector sum.
  • A6. The system of paragraph A5, wherein the controller is configured to cause the pointing device to maintain its line of sight toward a position moving at a speed corresponding to a magnitude of the vector sum, in a direction along an ellipsoid representing the surface of the Earth.
  • A7. The system of paragraph A1, wherein the controller is configured to interpret a first type of user input as a change in target velocity and a second type of user input as a change in target position.
  • A8. The system of paragraph A7, wherein the first type of user input is a continuous nonzero joystick input for a time greater than a predetermined minimum time, and the second type of user input is a nonzero joystick input for a time less than the predetermined minimum time.
  • An optical device for tracking a moving target comprising (1) a gimbal system attachable to a support platform; (2) an imaging system, supported by the gimbal system, and pivotably orientable with respect to the support platform about a pair of nonparallel axes by controlled driven motion of the gimbal system, to provide pan and tilt movement of the imaging system, such that a line of sight of the imaging system can be pointed at the target; (3) a display configured to present images of the target collected by the imaging system; (4) a user input device configured to allow a user to input information regarding successive positions of the target, based on images of the target presented on the display; and (5) a controller programmed to receive information from the user input device and, based on that information, to prepare and transmit instructions to the gimbal system to orient and maintain the line of sight of the imaging system toward the target while the target moves between two positions, thereby allowing the imaging system to track the target.
  • the user input device is configured to allow a user to input information regarding a change in target velocity
  • the controller is configured to compute a vector sum of (i) the change in target velocity associated with the information provided by the user and (ii) a previously determined target velocity, and to cause the imaging system to maintain its line of sight toward a position moving at a velocity corresponding to the vector sum.
  • controller is configured to interpret a first type of user input as a change in target velocity and a second type of user input as a change in target position.
  • a method of tracking a moving target comprising (1 ) bringing the target into a field of view of an imaging system; (2) calculating an initial velocity of the target based on orientation of a line of sight of the imaging system toward the target at two different times; and (3) keeping the target in the field of view while the target is moving by either (i) causing the field of view to track a point moving with velocity corresponding to the initial velocity of the target, or (ii) receiving user input relating to a change in target velocity, using the user input to determine the change in target velocity, and causing the field of view to track a point moving with a velocity corresponding to the vector sum of the initial velocity and the change in target velocity.
  • a method of tracking a moving target comprising (1) receiving information regarding an initial position of the target; (2) directing a gimbal to point a tracking device toward the initial position of the target; (3) receiving information regarding a subsequent position of the target; (4) directing the gimbal to point the tracking device toward the subsequent position of the target; (5) calculating an initial target velocity using a rotation rate of a line of sight from the tracking device to the target as the line of sight rotates between the initial position of the target and the subsequent position of the target; and (6) directing the gimbal to point the tracking device to follow a point moving with a velocity corresponding to the initial target velocity, thereby tracking the target.
  • determining which user input relates to a change in target position and which user input relates to a change in target velocity includes interpreting nonzero user input having a continuous duration less than a predetermined minimum as a change in target position, and interpreting nonzero user input having a continuous duration greater than a predetermined minimum as a change in target velocity.
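The bullets above describe how the controller distinguishes the two input types by duration and vector-sums a velocity change with the previously determined target velocity. The Python sketch below is one illustrative reading of that logic, not code from the application: the class name, the 0.5 s threshold, the 2-D state, and the unity gains are all assumptions.

```python
# Illustrative sketch of the claimed input interpretation: nonzero input held
# for less than a threshold is treated as a change in target *position*;
# input held longer is treated as a change in target *velocity*, which is
# vector-summed with the current velocity estimate. All constants are assumed.
MIN_VELOCITY_HOLD_S = 0.5  # assumed duration separating the two input types


class TrackController:
    def __init__(self, position, velocity):
        self.position = list(position)  # (x, y) tracked point
        self.velocity = list(velocity)  # (vx, vy) estimated target velocity

    def handle_input(self, stick, hold_duration_s, dt):
        """stick: (x, y) deflection; hold_duration_s: continuous nonzero time;
        dt: elapsed time since the last update, in seconds."""
        if hold_duration_s < MIN_VELOCITY_HOLD_S:
            # Brief input: nudge the tracked position directly.
            self.position = [p + s for p, s in zip(self.position, stick)]
        else:
            # Sustained input: vector sum of the commanded change in velocity
            # and the previously determined target velocity.
            self.velocity = [v + s for v, s in zip(self.velocity, stick)]
        # Propagate the tracked point at the (possibly updated) velocity, so
        # the line of sight follows a point moving at that velocity.
        self.position = [p + v * dt for p, v in zip(self.position, self.velocity)]
        return self.position
```

A brief tap therefore corrects where the gimbal looks, while a held deflection changes how fast the aim point drifts, matching the duration-based interpretation in the last bullet.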

Abstract

A gimbal system, including components and methods of use, configured to track moving targets. In particular, the system may be configured to orient and maintain the line of sight ("LOS") of the gimbal system toward a target while the target, and in some cases the platform supporting the gimbal system, are moving. The system may also be configured to calculate an estimated target velocity based on user input, and to calculate subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to allow a user to enter information regarding a target velocity into a gimbal controller.
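The dead-reckoning step the abstract describes, calculating subsequent target positions from previous ones by integrating the estimated target velocity over time, can be sketched as follows. The function name, the fixed-step Euler integration, and the 2-D coordinates are illustrative assumptions, not details taken from the application.

```python
def propagate_target(position, velocity, dt, steps):
    """Project subsequent target positions from a previous position by
    integrating an estimated target velocity over time (simple Euler steps).

    position, velocity: (x, y) tuples; dt: timestep in seconds.
    Returns the list of projected (x, y) positions, one per step.
    """
    x, y = position
    vx, vy = velocity
    track = []
    for _ in range(steps):
        x += vx * dt  # integrate velocity to advance the estimated position
        y += vy * dt
        track.append((x, y))
    return track
```

In use, the gimbal controller would repoint the line of sight at each projected position, so the field of view follows a point moving at the estimated velocity even between user inputs.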
PCT/US2010/038738 2010-06-15 2010-06-15 Gimbal positioning with target velocity compensation WO2011159288A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2010/038738 WO2011159288A2 (fr) 2010-06-15 2010-06-15 Gimbal positioning with target velocity compensation


Publications (2)

Publication Number Publication Date
WO2011159288A2 true WO2011159288A2 (fr) 2011-12-22
WO2011159288A3 WO2011159288A3 (fr) 2012-05-10

Family

ID=45348781

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2010/038738 WO2011159288A2 (fr) 2010-06-15 2010-06-15 Gimbal positioning with target velocity compensation

Country Status (1)

Country Link
WO (1) WO2011159288A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014077788A2 (fr) * 2012-11-16 2014-05-22 Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş. Target acquisition system and method
WO2014077788A3 (fr) * 2012-11-16 2014-09-04 Tusaş - Türk Havacilik Ve Uzay Sanayii A.Ş. Target acquisition system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6172747B1 (en) * 1996-04-22 2001-01-09 The United States Of America As Represented By The Secretary Of The Navy Airborne video tracking system
US20040118622A1 (en) * 1994-05-27 2004-06-24 Morrell John B. Speed limiting for a balancing transporter accounting for variations in system capability
US20040207727A1 (en) * 2003-01-17 2004-10-21 Von Flotow Andreas H Compensation for overflight velocity when stabilizing an airborne camera
US20080055413A1 (en) * 2006-09-01 2008-03-06 Canon Kabushiki Kaisha Automatic-tracking camera apparatus



Also Published As

Publication number Publication date
WO2011159288A3 (fr) 2012-05-10

Similar Documents

Publication Publication Date Title
US9681065B2 (en) Gimbal positioning with target velocity compensation
US20110304737A1 (en) Gimbal positioning with target velocity compensation
US9531928B2 (en) Gimbal system with imbalance compensation
US8844896B2 (en) Gimbal system with linear mount
US11263761B2 (en) Systems and methods for visual target tracking
JP6596745B2 (ja) System for imaging a target object
US10057509B2 (en) Multiple-sensor imaging system
CA2642655C (fr) Gimbal system with airflow
ES2375935T3 (es) Overflight velocity compensation system for stabilization of an airborne camera.
US20170175948A1 (en) Gimbal system having preloaded isolation
Thurrowgood et al. A biologically inspired, vision-based guidance system for automatic landing of a fixed-wing aircraft
JP6282275B2 (ja) Infrastructure mapping system and method
US10375311B2 (en) Anti-rotation mount
US20130101276A1 (en) Single axis gimbal optical stabilization system
US9671616B2 (en) Optics system with magnetic backlash reduction
US10800344B2 (en) Aerial photogrammetric device and aerial photogrammetric method
CN111226154B (zh) Autofocus camera and system
JP7011908B2 (ja) Optical information processing device, optical information processing method, and optical information processing program
WO2012170673A1 (fr) Gimbal system with translational mount
CA2977822C (fr) Anti-rotation mount
US9699392B2 (en) Imaging system for an aircraft
JP2019050007A (ja) Method and apparatus for determining the position of a moving object, and computer-readable medium
US20110221934A1 (en) Ground-Based Instrumentation Operating with Airborne Wave Reflectors
WO2011159288A2 (fr) Gimbal positioning with target velocity compensation
CN110209199A (zh) Design of an unmanned aerial vehicle system for farmland fire-source monitoring

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10853351

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10853351

Country of ref document: EP

Kind code of ref document: A2