WO2022106824A1 - Method for operating a tracking system and tracking system - Google Patents

Method for operating a tracking system and tracking system Download PDF

Info

Publication number
WO2022106824A1
WO2022106824A1 · PCT/GB2021/052988 · GB2021052988W
Authority
WO
WIPO (PCT)
Prior art keywords
frame
point
fixed
line
helmet
Prior art date
Application number
PCT/GB2021/052988
Other languages
French (fr)
Inventor
Colin Richard Mills
Simon TRYTHALL
John Richard Williams
Original Assignee
Bae Systems Plc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB2018167.3A (GB2601138A)
Priority claimed from EP20275170.7A (EP4001952A1)
Application filed by Bae Systems Plc
Publication of WO2022106824A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163Determination of attitude

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for determining a position of an object in a platform frame of reference is disclosed. The method comprises determining a first line of sight vector (250) and a second line of sight vector (260) in a first frame of reference from a third point (230) to first and second points (210, 220), wherein the second line of sight vector is related to the first line of sight vector according to the determined position of the second point relative to the determined position of the first point, the first point and second point being fixed in position relative to each other; receiving information defining the position of each of the two points relative to each other in a second frame of reference; and receiving information defining an orientation of the first frame of reference relative to the second frame of reference, thereby to determine the position of the object in the platform frame of reference.

Description

METHOD FOR OPERATING A TRACKING SYSTEM AND TRACKING SYSTEM
Field of the invention
The present invention relates to a method for operating a tracking system, in particular to determine the position of an object, and to a tracking system.
Background
In a known optical helmet tracker system for use in an aircraft, one or more emitters are positioned in a known configuration on an outer shell of a helmet, to be energised in a controlled but rapid sequence. A camera is positioned at a known location in the aircraft with a view of the helmet such that when emitters are energised they may be detected in images captured by the camera. The relative positions of at least four different emitters in the generated images may be determined and, given the known relative positions of the emitters on the helmet, used to calculate the position and orientation of the helmet in an aircraft frame of reference.
Brief description of the drawings
Example embodiments of the invention will now be described in more detail with reference to the accompanying drawings.
Figure 1 shows, schematically, components of a hybrid tracker system according to the present disclosure.
Figure 2 illustrates a hybrid tracker system according to some examples.
Figure 3 illustrates a method according to some examples.
Figure 4 illustrates a tracker system according to some examples.
Detailed description
Example embodiments of the present invention will be described in the context of an extension to the position tracking functionality of a known hybrid helmet tracker system, for example a hybrid tracker system as described in a patent application by the present Applicant published as WO 2017/042578 A1, for use in an aircraft. In that example, a helmet tracker system comprises an optical helmet tracking component operable in combination with an inertial helmet tracking component. However, although the method is described with reference to a helmet on an aircraft, it may be applied to track any object within a region of operation. The region of operation may be stationary or moving relative to an inertial reference frame associated with the Earth. The region of operation may also be referred to as a platform. The platform may comprise a vehicle, such as an aircraft. Alternatively, the platform may comprise a simulator. The object may comprise a helmet, a head mounted display, a head worn display or any other suitable object where it is desirable to track the position and orientation of the object.
In this particular example of a hybrid helmet tracker, the inertial tracking component may be configured as the principal source of data for determining changes in orientation of the helmet. The inertial tracking component may comprise an arrangement of miniature inertial sensors attached to the helmet for sensing a rate of change in orientation of the helmet. A tracker processor is configured to calculate orientation of the helmet based upon rate data output by the inertial sensors. The tracker processor is also configured to apply corrections to the rate data sensed by the inertial sensors using rate data output by the optical tracker component, to correct for changing response (drift) of the inertial sensors or any misalignment of the inertial sensors.
The optical tracker component is configured both to sense changes in orientation of the helmet, for example for the purposes of providing corrections in the inertial tracker component, and to determine the position of the helmet within a region of operation, e.g. within the aircraft cockpit. The optical helmet tracking component may comprise an arrangement of controllable emitters integrated within an outer shell of a helmet and at least one camera mounted at a fixed, known position within an aircraft cockpit. The camera position is chosen so that it may have a direct line of sight to at least some of the helmet emitters, typically to any group of at least four emitters, in normal use. An optical tracker processor is configured to control the illumination of the emitters and to process image data output by the camera, thereby to sense movement of the helmet, as will now be described in more detail.
The positions of the emitters in a helmet frame of reference, and hence their relative positions on the helmet, are known from initial manufacture or configuration of the helmet. The position of the camera in an aircraft frame of reference is also known for any particular application of the tracker system. It is also expected that the orientation of the aircraft in an inertial reference frame associated with the Earth is also known from aircraft sensors.
The optical tracker component, in one known example, operates cyclically to determine position and orientation of the helmet in the aircraft frame of reference, typically up to 200 times per second. During each cycle, the emitters are activated. If an emitter is visible within a particular captured image, then the identity of the emitter is known using suitable methods and hence its position in the helmet frame of reference is known.
The position of the emitter in the respective captured image, after correcting for the optical properties of a lens of the camera, indicates the direction of a line of sight from the emitter to the fixed camera in the aircraft frame of reference. The relative positions of emitters appearing in an image during a given cycle, the known relative positions of the identified emitters in the helmet frame of reference and the respective lines of sight from the camera to each emitter enable the position and orientation of the helmet to be calculated in the aircraft frame of reference if four emitters are visible in images captured by the camera. If only three emitters are visible to the camera, this yields six equations, two for each emitter imaged by the camera, with six unknowns: three for position and three for orientation. Thus, if only three emitters are visible to the camera, it is possible for the optical tracking component to determine the position and orientation of the helmet in particular circumstances, but not in general. If only two emitters are visible, this yields four equations and thus it is not possible to determine the position and orientation of a helmet using the optical tracking component alone.
In an example to be described with reference to Figure 1 , a hybrid tracker system, as defined above, may be configured to determine the position of a helmet in the aircraft frame of reference when only two emitters of the optical tracker component are visible to the camera during any given cycle. In a conventional hybrid tracker, if only two emitters are visible in the optical tracking component, the hybrid tracker operates only as an inertial tracker determining orientation of the helmet only, until such time as more emitters become visible to the camera.
Referring to Figure 1, a schematic representation is shown of sensor components in an example hybrid helmet tracker system 5 as may be used to track the position and orientation of a helmet 10 in an aircraft frame of reference 15. The aircraft frame of reference 15 will be referred to hereafter as ‘aircraft axes’ 15. The hybrid helmet tracker system 5 comprises sensors of an optical helmet tracking component and sensors of an inertial helmet tracking component.
For the optical helmet tracking component, the helmet 10 is provided with at least four emitters mounted within an outer shell of the helmet 10, including a first emitter 20 and a second emitter 25. The emitters may be illuminated as discussed above, under the control of an optical tracking processor (not shown in Figure 1). The position of each emitter 20, 25 is defined in a helmet frame of reference 50, to be referred to hereafter as ‘helmet axes’ 50. The optical tracking component also comprises a camera 30 mounted at a known fixed position and orientation relative to the aircraft axes 15, directed towards the helmet 10. It is intended that the camera 30 is orientated to capture images of different combinations of illuminated emitters mounted on the helmet 10 as the helmet 10 moves and to supply image data of the captured images to the optical tracking processor. The camera 30 comprises a lens 35 and an image sensor 40.
For the inertial tracking component, an inertial sensor block 45 is mounted on or within the helmet 10. In one example, the inertial sensor block 45 comprises three orthogonally-oriented Micro-Electromechanical System (MEMS) gyroscopic sensors. The inertial sensors 45 are able to sense a rate of change in orientation of the helmet 10 in three dimensions in an inertial reference frame associated with the Earth, to be referred to hereafter as ‘Earth axes’ (not shown in Figure 1). The rate of change data output by the inertial sensors 45 are received and processed by a tracker system processor (not shown in Figure 1) to determine the orientation of the helmet 10 in Earth axes.
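By way of illustration only, the integration of gyro rate data into an orientation estimate might be implemented as in the following minimal sketch, which composes each rate sample into an incremental rotation using Rodrigues' formula. The function names and the fixed-step integration scheme are assumptions of this illustration, not taken from the disclosure, and a practical implementation would also apply the drift corrections discussed above.

```python
import numpy as np

def rate_to_rotation(omega, dt):
    """Incremental rotation matrix for a body-rate sample omega (rad/s, 3-vector)
    held constant over dt seconds, using Rodrigues' formula."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.eye(3)
    k = omega / np.linalg.norm(omega)        # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])       # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def propagate_orientation(R_helmet_earth, omega_samples, dt):
    """Compose successive gyro increments (rates sensed in helmet axes)
    onto a helmet-to-Earth rotation matrix."""
    for omega in omega_samples:
        R_helmet_earth = R_helmet_earth @ rate_to_rotation(omega, dt)
    return R_helmet_earth
```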
Sensors provided on the aircraft, typically as part of an aircraft navigation system, are able to provide data defining the orientation of the aircraft in Earth axes. The orientation of the helmet 10 in Earth axes, as calculated by the tracker system processor from rate data output by the inertial sensors 45, may therefore be transformed, using orientation data from the navigation system, into an orientation of the helmet 10 in the aircraft axes 15.
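If both orientations are held as rotation matrices mapping the respective body axes into Earth axes (a convention assumed for this illustration, not stated in the disclosure), the straightforward transformation reduces to a single matrix product:

```python
def helmet_in_aircraft_axes(R_helmet_earth, R_aircraft_earth):
    """Helmet orientation expressed in aircraft axes, given helmet and aircraft
    orientations in Earth axes. Rotation matrices are orthonormal, so the
    inverse of R_aircraft_earth is simply its transpose."""
    return R_aircraft_earth.T @ R_helmet_earth
```

The result corresponds to the rotation matrix $[HA]$ used in the equations below.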
The position of the helmet 10 in the aircraft axes 15 is calculated by the optical tracking processor of the hybrid tracker system 5, but determining helmet position using the optical tracking processor usually requires four of the helmet-mounted emitters to be visible to the camera 30.
In this example, it is assumed that only the first emitter 20 and the second emitter 25 are visible to the camera 30. If the first emitter 20 appears illuminated in a respective image captured by the camera's image sensor 40, the position 65 in the image at which the emission from the emitter 20 is received at the image sensor 40 defines a first line of sight in aircraft axes 15 to the emitter 20 from the lens 35 of the camera 30. The position of the emitter 20 in the aircraft axes 15 lies an unknown distance along that first line of sight. The position of the emitter 20 is therefore defined mathematically in terms of a known line of sight vector 55 in aircraft axes, defining the direction of the defined line of sight from camera lens 35 to the first emitter 20, and an unknown distance from the camera lens 35 to the first emitter 20. It may be convenient to consider the unknown distance as a distance from the image sensor 40 to the emitter 20, having adjusted for the optical properties of the camera lens 35. The line of sight vector may be a unit vector (or a normalised vector), and may be multiplied by a scaling factor determined from the unknown distance from the camera lens 35 to the first emitter 20.
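Assuming a simple pinhole camera model (the text requires only that the optical properties of the lens are corrected for, so this model is an illustrative assumption), the mapping from the image position 65 to the line of sight vector 55 might look like the following sketch, where fx, fy, cx and cy are hypothetical camera intrinsic parameters.

```python
import numpy as np

def line_of_sight_vector(u, v, fx, fy, cx, cy, R_camera_aircraft):
    """Unit line-of-sight vector in aircraft axes for a detection at pixel (u, v).

    fx, fy: focal lengths in pixels; cx, cy: principal point;
    R_camera_aircraft: the known, fixed orientation of the camera in aircraft axes.
    """
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])  # ray direction in camera axes
    d /= np.linalg.norm(d)                             # normalise to a unit vector
    return R_camera_aircraft @ d                       # rotate into aircraft axes
```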
The orientation of the helmet 10 and the helmet axes 50 in aircraft axes 15 are known from the inertial tracking component. Furthermore, the position of the first emitter 20 in the helmet axes 50 is also known. The unknown position of the helmet 10 (origin of the helmet axes 50) in aircraft axes 15 may therefore be related mathematically by:
$V_C^A + K_{e1} \times V_{e1}^A = [HA] \times V_{e1}^H + V^A$ (eq 1)
Where:
$V_C^A$ is the known position of the camera lens 35 or image sensor 40 in aircraft axes 15;
$K_{e1}$ is the unknown distance from the camera lens 35 or image sensor 40 to the first emitter 20;
$V_{e1}^A$ is the known line of sight vector 55 in aircraft axes 15;
$[HA]$ is a rotation matrix representing the known orientation of the helmet 10 in aircraft axes 15, as generated from the inertial block 45;
$V_{e1}^H$ is the known position of the first emitter 20 in the helmet axes 50; and
$V^A$ is the unknown position of the helmet 10 relative to the camera in the aircraft axes 15.
The above formula generates two independent equations (one for each of the x and y coordinate measurements on the plane of the camera image sensor 40 representing the line of sight 55), with four unknown quantities: the three coordinates of the helmet 10 in aircraft axes 15 and the distance from the camera lens 35 or the image sensor 40 to the first emitter 20.
Considering the second emitter 25, which also appears in the image frame captured by the camera 30: the position 70 at which the light from the illuminated second emitter 25 is received at the image sensor 40 defines a second line of sight in aircraft axes 15 from the lens 35 of the camera 30 or the image sensor 40 to the emitter 25. The position of the emitter 25 in the aircraft axes 15 lies an unknown distance along that second line of sight, in a direction defined by a second line of sight vector 60. As for the first emitter 20, the unknown coordinates of the helmet 10 (origin of the helmet axes 50) in aircraft axes 15 may therefore be related mathematically by:
$V_C^A + K_{e2} \times V_{e2}^A = [HA] \times V_{e2}^H + V^A$ (eq 2)
Where:
$K_{e2}$ is the unknown distance from the camera lens 35 or image sensor 40 to the second emitter 25;
$V_{e2}^A$ is the known line of sight vector 60 in aircraft axes.
Noting that $K_{e1}$ and $K_{e2}$ and the position of the helmet 10 in aircraft axes 15 are not independent and are related by equations 3 and 4, this leads to four equations and three independent unknown quantities: the three coordinates of the helmet 10 in aircraft axes 15.
$K_{e1} \times V_{e1}^A - K_{e2} \times V_{e2}^A = [HA] \times (V_{e1}^H - V_{e2}^H)$ (eq 3)
$|K_{e1} \times V_{e1}^A - K_{e2} \times V_{e2}^A| = |V_{e1}^H - V_{e2}^H|$ (eq 4)
Where:
$|v|$ is the magnitude of vector $v$.
The above equations (eq 1-4) can be solved by geometry: solving equations 3 and 4 yields the distances $K_{e1}$ and $K_{e2}$, and back-substitution into equation 1 then gives $V^A$, the unknown position of the helmet 10 in the aircraft axes 15:

$V^A = V_C^A + K_{e1} \times V_{e1}^A - [HA] \times V_{e1}^H$
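Numerically, the geometric solution can be carried out by noting that subtracting equation 2 from equation 1 eliminates $V^A$ and leaves equation 3: three scalar equations in the two range unknowns $K_{e1}$ and $K_{e2}$. The sketch below solves that small system by least squares, which also absorbs measurement noise in the redundant third equation; the function and variable names are illustrative, not taken from the disclosure.

```python
import numpy as np

def solve_helmet_position(V_C_A, V_e1_A, V_e2_A, HA, V_e1_H, V_e2_H):
    """Recover the helmet origin V^A in aircraft axes from two lines of sight.

    Equation 3: K_e1 * V_e1_A - K_e2 * V_e2_A = HA @ (V_e1_H - V_e2_H),
    i.e. three scalar equations in the two unknown ranges K_e1, K_e2.
    """
    A = np.column_stack((V_e1_A, -V_e2_A))   # 3x2 coefficient matrix
    b = HA @ (V_e1_H - V_e2_H)               # known emitter baseline in aircraft axes
    (K_e1, K_e2), *_ = np.linalg.lstsq(A, b, rcond=None)
    # Back-substitute into equation 1: V^A = V_C^A + K_e1 * V_e1_A - HA @ V_e1_H
    return V_C_A + K_e1 * V_e1_A - HA @ V_e1_H
```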
By the technique described above, in the event that only two emitters are visible to the camera 30, full hybrid tracking operation may continue, determining both position and orientation of the helmet 10, with benefits of greater tracking accuracy than may typically be achieved by an inertial tracking component operating alone.
The technique described above with reference to Figure 1 makes use of emitters fixed to the helmet frame of reference and a camera fixed to the aircraft axes 15. However, it is not necessary to use a camera and emitter: any suitable arrangement to determine the line of sight vectors may be used. For example, a position sensor may be used in place of the camera. The emitter may also be a passive device, such that it reflects light from an external source that may be detected by the position sensor.
As an alternative to fixing a camera to the aircraft axes 15 and emitters to the helmet axes 50, the camera may be fixed to the helmet and the emitters to the aircraft axes 15.
The technique described above with reference to Figure 1 describes a helmet on an aircraft. However, any object on any platform may be tracked using the same method. For example, a head worn display, head mounted display, or any other type of display may be tracked on a platform. The platform may be stationary relative to the Earth axes, or moving relative to the Earth axes. The platform may comprise a simulator or a vehicle. The vehicle may comprise a motor vehicle, an aircraft, or a naval or maritime vessel.
A more general example is described with relation to Figure 2. Figure 2 comprises a region of interest 200. A first point 210, a second point 220 and a third point 230 are located in the region of interest 200. The region of interest is the region in which the object may be tracked. The first point 210 and the second point 220 are fixed in location relative to a first frame of reference 240, such that the first point 210 and the second point 220 move with the first frame of reference. The relative position of the first point 210 and the second point 220 is fixed.
Sensing means may be used to determine a first line of sight vector 250 from the third point 230 to the first point 210. Sensing means may be used to determine a second line of sight vector 260 from the third point 230 to the second point 220. The line of sight vectors 250 and 260 may be unit vectors (or normalised vectors), and as such define a direction from the third point 230 to the first point 210 and the second point 220. This is because the distance between the third point 230 and the first point 210 (and the second point 220) is unknown.
The sensing means may be associated with the third point 230. In some examples the sensing means may comprise a sensor located at the third point 230 configured to determine the direction from the third point 230 to the first point 210 and the second point 220. In some examples the sensor may comprise a camera.
The second line of sight vector 260 is related to the first line of sight vector 250 based on equations 1 and 2. The first point 210 and second point 220 are fixed in position relative to each other.
The position of each of the first point 210 and the second point 220 relative to each other in a second frame of reference may be determined. The orientation of the first frame of reference 240 in the second frame of reference is known.
The first frame of reference 240 may be fixed in the object frame of reference, such that the first point 210 and the second point 220 are fixed in position relative to the object, as described in relation to Figure 1. If the first frame of reference 240 is fixed in the object frame of reference, then the second frame of reference may be fixed relative to the frame of reference of the region of interest 200, i.e. a platform frame of reference. This is consistent with some examples where a camera or sensing means is in a fixed location in the region of interest 200 and the first point 210 and the second point 220 are fixed on the object (such as a helmet).
In contrast to Figure 1, the first frame of reference 240 may be fixed in the frame of reference of the region of interest 200, i.e. a platform frame of reference, such that the first point 210 and the second point 220 are fixed in position relative to the region of interest 200. If the first frame of reference 240 is fixed in the frame of reference of the region of interest 200, then the second frame of reference may be fixed relative to the object frame of reference. This is consistent with some examples where a camera or sensing means is in a fixed location on the object and the first point 210 and the second point 220 are fixed in the region of interest.
Figure 3 illustrates a method to determine a position of an object; the method is indicated by the reference sign 300.
The first line of sight vector is determined 310. The second line of sight vector is determined 320. The second line of sight vector is related to the first line of sight vector as the first point and second point are fixed in position relative to each other.
Information defining the position of each of the two points relative to each other in a second frame of reference is received 330. This information is known because the two points are fixed relative to each other.
Information defining an orientation of the first frame of reference in the second frame of reference is received 340. This orientation may be determined by an inertial tracker.
Using the first and second line of sight vectors defined at 310 and 320, the information received at 330, the information received at 340 and the calculated first and second distances from the third point to the first and second points, the position of the object in the platform frame of reference may be determined 350.
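Composing the earlier sketches, steps 310 to 350 might be driven end-to-end as follows for the Figure 1 configuration. This is illustrative only: `line_of_sight_vector` and `solve_helmet_position` are the hypothetical helpers sketched above, and the argument names are assumptions of this illustration.

```python
def method_300(px1, px2, intrinsics, R_camera_platform, V_C,
               V_p1_obj, V_p2_obj, R_obj_platform):
    """Determine the object position from two detected points px1 = (u1, v1)
    and px2 = (u2, v2); intrinsics = (fx, fy, cx, cy)."""
    v1 = line_of_sight_vector(*px1, *intrinsics, R_camera_platform)   # step 310
    v2 = line_of_sight_vector(*px2, *intrinsics, R_camera_platform)   # step 320
    # Steps 330 and 340: the fixed relative point positions (V_p1_obj, V_p2_obj)
    # and the inertially derived orientation R_obj_platform arrive as inputs.
    return solve_helmet_position(V_C, v1, v2, R_obj_platform,
                                 V_p1_obj, V_p2_obj)                  # step 350
```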
The second frame of reference may be fixed in the platform frame of reference and the first frame of reference may be fixed in an object frame of reference, or the first frame of reference may be fixed in the platform frame of reference and the second frame of reference may be fixed in the object frame of reference.
Figure 4 illustrates a hybrid tracker system 400 in accordance with some examples. Hybrid tracker system 400 comprises an optical tracking system 410, an inertial tracking system 420 and a processing means 430. Hybrid tracker system may be suitable to work with the techniques described with relation to Figures 1 -3.
It will be understood that processing components described above with reference to Figures 1-4 may in practice be implemented by a single chip or integrated circuit or plural chips or integrated circuits, optionally provided as a chipset, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a digital signal processor (DSP), a graphics processing unit (GPU), etc. The chip or chips may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor or processors, a digital signal processor or processors, baseband circuitry and radio frequency circuitry, which are configurable so as to operate in accordance with the exemplary embodiments. In this regard, the exemplary embodiments may be implemented at least in part by computer software stored in (non-transitory) memory and executable by the processor, or by hardware, or by a combination of tangibly stored software and hardware (and tangibly stored firmware).
Although at least some aspects of the embodiments described herein with reference to the drawings comprise computer processes performed in processing systems or processors, the invention also extends to computer programs, particularly computer programs on or in a carrier, adapted for putting the invention into practice. The program may be in the form of non-transitory source code, object code, a code intermediate source and object code such as in partially compiled form, or in any other non-transitory form suitable for use in the implementation of processes according to the invention. The carrier may be any entity or device capable of carrying the program. For example, the carrier may comprise a storage medium, such as a solid-state drive (SSD) or other semiconductor-based RAM; a ROM, for example a CD ROM or a semiconductor ROM; a magnetic recording medium; optical memory devices in general; etc.
The examples described herein are to be understood as illustrative examples of embodiments of the invention. Any methods described are not limited to the order in which they are described, and the steps of the method may happen in any order. Further embodiments and examples are envisaged. Any feature described in relation to any one example or embodiment may be used alone or in combination with other features. In addition, any feature described in relation to any one example or embodiment may also be used in combination with one or more features of any other of the examples or embodiments, or any combination of any other of the examples or embodiments. Furthermore, variations and modifications not described herein, as would be apparent to a person of ordinary skill in the relevant art, may also be employed within the scope of the invention, which is defined in the claims.

Claims

1. A method for determining a position of an object in a platform frame of reference, comprising:
(i) determining a first line of sight vector and a second line of sight vector in a first frame of reference from a third point to first and second points, wherein the second line of sight vector is related to the first line of sight vector according to the determined position of the second point relative to the determined position of the first point, the first point and second point being fixed in position relative to each other;
(ii) receiving information defining the position of each of the two points relative to each other in a second frame of reference;
(iii) receiving information defining an orientation of the first frame of reference in the second frame of reference;
(iv) using the first and second line of sight vectors defined at (i), the information received at (ii), the information received at (iii) and the first and second distances from the third point to the first and second points, thereby to determine the position of the object in the platform frame of reference; and wherein the first frame of reference is fixed in the platform frame of reference and the second frame of reference is fixed in an object frame of reference, or the second frame of reference is fixed in the platform frame of reference and the first frame of reference is fixed in the object frame of reference.
2. The method according to claim 1, wherein (iv) comprises defining four independent equations having four independent unknown quantities, the independent unknown quantities comprising the three coordinates of the object in the first frame of reference and the distance to the first point from the third point.
3. The method according to claim 1, wherein (iv) comprises defining four independent equations having four independent unknown quantities, the independent unknown quantities comprising the three coordinates of the object in the second frame of reference and the distance to the first point from the third point.
4. The method according to any preceding claim, wherein determining a position of a first and second point in the first frame of reference comprises detecting a first and second sensor at the first and second point respectively using a third sensor.
5. The method according to claim 4, wherein the first and second sensors are passive sensors.
6. The method according to claim 4, wherein the first and second sensors are active sensors.
7. The method according to claim 6, wherein the first and second sensors comprise light emitting diodes.
8. The method according to claim 5, wherein the first and second sensors comprise reflectors configured to reflect light towards the third sensor.
9. The method according to any of claims 4-8, wherein the third sensor comprises a camera configured to record an image of the first and second points.
10. The method according to claim 9, wherein the camera is configured to record an image.
11. The method according to claim 10, wherein the first point is visible in a first image captured by the camera at a first time and the second point is visible in a second image captured by the camera at a second time, and the time interval between the first time and the second time is sufficiently short for the object to be assumed stationary during the time interval.
12. A computer-readable medium comprising instructions that cause a processing means to perform the method of any of claims 1-11.
13. A tracker system, comprising an optical tracker component and an inertial tracker component, wherein the inertial tracker component is configured to determine an orientation of an object in a platform frame of reference and the optical tracker component comprises: an optical tracker processor, configured:
(i) to determine the position of a first and second point in a first frame of reference, the determined positions defining first and second line of sight vectors in the first frame of reference from a third point to the first and second points, wherein the second line of sight vector is related to the first line of sight vector according to the determined position of the second point relative to the determined position of the first point, the first point and second point are fixed in position relative to each other;
(ii) to receive information defining the position of each of the two points relative to each other in a second frame of reference;
(iii) to receive information defining an orientation of the first frame of reference in the second frame of reference;
(iv) to use the first and second line of sight vectors defined at (i), the information received at (ii), the information received at (iii) and the first and second distances from the third point to the first and second points, thereby to determine the position of the object in the platform frame of reference; and wherein the first frame of reference is fixed in the platform frame of reference and the second frame of reference is fixed in an object frame of reference, or the second frame of reference is fixed in the platform frame of reference and the first frame of reference is fixed in the object frame of reference.
14. A tracker system configured to perform the method of any of claims 1-11.
15. A head worn display comprising a tracker system according to any of claims 13 or 14.
PCT/GB2021/052988 2020-11-19 2021-11-18 Method for operating a tracking system and tracking system WO2022106824A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB2018167.3A (GB2601138A) 2020-11-19 2020-11-19 Method for operating a tracking system and tracking system
EP20275170.7A (EP4001952A1) 2020-11-19 2020-11-19 Method for operating a tracking system and tracking system

Publications (1)

Publication Number Publication Date
WO2022106824A1

Family

ID=78725522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2021/052988 WO2022106824A1 (en) 2020-11-19 2021-11-18 Method for operating a tracking system and tracking system

Country Status (1)

Country Link
WO (1) WO2022106824A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5085507A (en) * 1989-12-27 1992-02-04 Texas Instruments Incorporated Device for three dimensional tracking of an object
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
US20110079703A1 (en) * 2009-10-02 2011-04-07 Teledyne Scientific & Imaging, Llc Object tracking system
WO2017042578A1 (en) 2015-09-11 2017-03-16 Bae Systems Plc Helmet tracker
RU2720076C1 (en) * 2019-05-29 2020-04-23 Российская Федерация, от имени которой выступает Министерство обороны Российской Федерации Method of angular and spatial coordinates estimation of objects in reference points in optical-electronic positioning system

Similar Documents

Publication Publication Date Title
US11181737B2 (en) Head-up display device for displaying display items having movement attribute or fixed attribute, display control method, and control program
CN103502876B (en) For the method and apparatus correcting the projection arrangement of vehicle
CN108917746B (en) Helmet posture measuring method, measuring device and measuring system
EP3460756B1 (en) Tracking system and method thereof
US20130215270A1 (en) Object detection apparatus
US11450040B2 (en) Display control device and display system
JP2020125033A (en) Display control device and display control program
CN102211523A (en) Method and apparatus for tracking a position of an object marker
US20200098115A1 (en) Image processing device
JP6988873B2 (en) Position estimation device and computer program for position estimation
JP5716273B2 (en) Search target position specifying device, search target position specifying method and program
CA3152294A1 (en) Method and system of vehicle driving assistance
CN115176124A (en) Posture/position detecting device for detector
EP1584896A1 (en) Passive measurement of terrain parameters
EP4001952A1 (en) Method for operating a tracking system and tracking system
EP3795952A1 (en) Estimation device, estimation method, and computer program product
WO2022106824A1 (en) Method for operating a tracking system and tracking system
GB2601138A (en) Method for operating a tracking system and tracking system
US20230079899A1 (en) Determination of an absolute initial position of a vehicle
CN116892949A (en) Ground object detection device, ground object detection method, and computer program for ground object detection
JP2021017073A (en) Position estimation apparatus
JP2017182564A (en) Positioning device, positioning method, and positioning computer program
JP4656017B2 (en) Head motion tracker device
JP2020076714A (en) Position attitude estimation device
US20220051449A1 (en) Method and system of providing virtual environment during movement and related non-transitory computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21811446; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21811446; Country of ref document: EP; Kind code of ref document: A1)