EP3210038A1 - Viewing and tracking of target objects - Google Patents

Viewing and tracking of target objects

Info

Publication number
EP3210038A1
Authority
EP
European Patent Office
Prior art keywords
target object
radiation
scattering surface
pixel
operable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15794620.3A
Other languages
German (de)
French (fr)
Inventor
Genevieve GARIEPY
Francesco TONOLINI
Jonathan LEACH
Daniele Faccio
Robert Henderson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heriot Watt University
Original Assignee
Heriot Watt University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Heriot Watt University filed Critical Heriot Watt University
Publication of EP3210038A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/66Tracking systems using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808Evaluating distance, position or velocity data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

Abstract

Disclosed are methods and apparatuses for obtaining positional information relating to a target object, particularly where the target object is hidden from view and not in the line of sight of the apparatus. The apparatus comprises an illumination source operable to illuminate a scattering surface, the scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by the scattering surface. The apparatus also comprises a detection device operable to detect reflected radiation, the reflected radiation being the scattered radiation which has reflected off the target object to within the field of view of the detection device, and a processor operable to calculate the positional information from the detected reflected radiation.

Description

Viewing and Tracking of Target Objects
The present invention relates to methods and apparatuses for the illumination and viewing of objects that may be hidden from an observer.
The methods and apparatuses have particular applicability to optical sensing and imaging, especially where direct line of sight of a target object or objects of interest to an observer is not possible.
Optical imaging and image analysis are very important within any number of situations including but not limited to photography, astronomy, security, surveying, military operations, aerial and non-invasive inspection, livestock and crop management etc.
When the human eye looks at an object, it requires a direct "line of sight" of the object within its field of view so that any light incident on the object, and then reflected from it, is captured by the eye and processed by the brain as a recognisable image of the object. If the same object is hidden from the observer's field of view, for example behind a wall, light incident on it will not reach the eye and the object is not visible to the observer.
To aid human observers, there exist imaging devices such as the periscope, where mirrors are used to manipulate the path of reflected light from an object to the eye of the observer. These can be used to observe an object behind a wall. Periscopes are quite common at sporting events or concerts to help people see over the crowd, and also find use when an observer wishes to remain hidden from sight, for example in submarines or in warfare, where a soldier may wish to remain hidden behind a fortification giving protection while watching out for an enemy or target.
The human eye has evolved to operate at its peak efficiency during daylight hours and is most sensitive to wavelengths that lie within the visible part of the electromagnetic spectrum, typically 400 to 700 nanometres (nm). Outwith this spectral range, the human eye is less sensitive (e.g., to longer wavelengths) or can even be physically damaged (e.g., by shorter wavelengths) and a number of solid state devices and optical systems have been developed to allow imaging at these longer or shorter wavelengths within the electromagnetic spectrum.
Depending on the wavelength, these devices can include Charge Coupled Devices (CCD), photo-multiplier tubes, infra-red detectors, terahertz detectors, microwave or radio antenna, gamma-ray or x-ray detectors or photographic film sensitised to the wavelength of interest etc.
Although these detectors offer a broad range of wavelengths of detection, the mode of operation is generally limited to "line of sight" operation; the exception being where the energy or wavelength is such that it can travel through solid objects to the detector.
It would therefore be desirable to obtain a wavelength-independent method for viewing or tracking objects that are not within the direct line of sight of an observer.
SUMMARY OF INVENTION
In a first aspect of the invention there is provided an apparatus for obtaining positional information relating to a target object, the apparatus comprising:
an illumination source operable to illuminate a scattering surface, said scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by said scattering surface;
a detection device operable to detect reflected radiation, said reflected radiation being said scattered radiation which has reflected off said target object to within the field of view of said detection device; and
a processor operable to calculate said positional information from the detected reflected radiation.
In a second aspect of the invention, there is provided a method for obtaining positional information relating to a target object, the method comprising:
illuminating a scattering surface, said scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by said scattering surface;
detecting reflected radiation within an imaged area, said reflected radiation being said scattered radiation which has reflected off said target object to said imaged area; and
calculating said positional information from the detected reflected radiation. Other optional features are as described in the dependent claims.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will now be described, by way of example only, by reference to the accompanying drawings, in which:
Figure 1 is a side view diagram showing schematically an arrangement for imaging and tracking an object hidden from view according to a first embodiment of the invention; Figure 2 is a top view diagram of the arrangement of Figure 1 showing (a) the radiation from the illumination source scattering from the floor and (b) the scattered radiation reflecting from the target object;
Figure 3 is a top view diagram of the arrangement of Figures 1 and 2 illustrating the ensemble of possible locations for the object; and
Figure 4 is a top view diagram of the arrangement of Figures 1 and 2 illustrating the step of combining the probability distributions to estimate target object location.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The ability to detect motion and track a still or moving object hidden around a corner or behind a wall has many uses and may, for example, provide a crucial advantage when physically going around the obstacle is impossible or dangerous. Previous methods have demonstrated that it is possible to reconstruct the shape of an object hidden from view. However, these methods do not enable the tracking of movement in real time.

A compact non-line-of-sight laser ranging technology is disclosed which relies upon the ability to send light around an obstacle using a scattering surface (e.g., floor or wall) and to detect the return signal from a target object with only a few seconds' acquisition time. By detecting this signal with a single-photon avalanche diode (SPAD) camera, it is possible to follow the movement of an object located a distance away from the camera with centimetre precision. The disclosed imaging technique, in combination with analysis of the data collected, can reveal both the presence of a target object (or objects) in a scene and indicate the size, shape and direction of movement of that object (or objects), where the object(s) is/are outside the normal field of view of the observer. This is beneficial over current imaging systems, such as those using Light Detection and Ranging (LIDAR) approaches, where there is a need for a direct "line of sight" of an object within the field of view.

The object being tracked may be entirely hidden from the field of view of the observer. The imaging system is such that it does not require a direct line of sight between the object of interest and the detector of the imaging system. The target object(s) of interest may be stationary or may be moving in space and time. Where the object(s) of interest is non-stationary, there is disclosed a method of tracking the movement of the object (or objects). The object(s) may be tracked in real time. Within a scene containing multiple objects of interest to an observer, a portion of the objects of interest may be stationary and a further portion may be non-stationary.

The imaging system may comprise an illumination source, a detection device, a means of processing the data from the detection device and an output device. The wavelength of operation of the illumination source may be such that it is within the electromagnetic spectrum of radiation and can be detected by the detection device. The illumination source of the invention provides illumination to an area within a scene; the scene containing an object, or objects, of interest to the observer.
The illumination source may provide a continuous or discontinuous mode of illumination. In discontinuous mode, the illumination source may be operated at a certain, known frequency. The frequency of operation or number of cycles per second (Hertz, Hz) of the source may be optimised to provide a balance between the need to provide illumination and the ability to detect the illumination at a detection device. The frequency of operation of the illumination source can be slow (a few Hz) or can be fast (many thousands of Hz or greater). The frequency of operation of the illumination source can be between 0 and 1000 Hz (1 kHz), between 1 kHz and 1 million Hz (1 MHz), between 1 MHz and 1000 MHz (1 GHz) or in some cases beyond 1 GHz.
Any illumination source could be used provided that a detector can be matched to it. One example of an illumination source of the invention is a laser source. Laser sources cover a broad and useful part of the electromagnetic spectrum of radiation (~200 nm to well beyond 1 micron in wavelength) and a number of commercial detectors are available to match the illumination wavelength of the laser. Laser sources provide a high power of illumination and, depending on their design parameters, can operate in a continuous mode of operation or a discontinuous mode of operation, making them suitable for the applications disclosed herein.
Depending on their design, laser sources can provide different laser pulse widths. These pulses are typically very short in duration and can be in the millisecond, nanosecond, picosecond or femtosecond range. The detection device may have a wavelength of detection within the output spectrum of the illumination source and ideally has a maximum sensitivity at the same region of the electromagnetic spectrum as the illumination source output. The detector of the imaging system may be chosen such that it is optimised to the operation of the illumination source. The detection device is capable of capturing the illumination data under conditions of either continuous illumination or discontinuous illumination.
The detector may collect illumination data over a short period of time or over a long period of time. Illumination data collected by the detector may be transferred from the detection device to an image processing system. The illumination data collected by the detector is processed in such a manner that the relative movement of the object(s) hidden from view can be determined. When data from the detector is processed with respect to time, additional information can be gathered on the speed and/or the direction of travel of the object(s).
The illumination data from the detector can be processed instantaneously, in "real time", or it can be recorded and stored for processing at a future time. The output device may be a presentation means or any other media to visually represent the data collected by the imaging system to the observer. The illumination source and the detector are located at a distance, X, from the object of interest to the observer. The distance, X, can be small or can be large. The illumination source and the detector may be located at an angle, Y, to the object of interest to the observer. The angle can be small or can be large depending on the size of the object of interest and the distance, X.
Figure 1 shows schematically a system for imaging and tracking an object hidden from view (target object), from a side view. Figure 2 shows the same arrangement in a top-down view, (a) illustrating the radiation from the illumination source scattering from the floor and (b) illustrating the scattered radiation reflecting from the target object. In Figure 1, a person behind a wall 100 represents the target object 110, but it could be anything and could encompass very different scenarios, e.g. a car behind a corner, a stranded person in a room that is on fire, an object in an underwater wreckage with limited access from outside, etc.
The system comprises an illumination source 120 and detection device 130. Control of these may be performed using processor 145. The illumination source may comprise, for example, a high repetition rate, pulsed laser. Specifically, the laser may comprise an 800 nm wavelength femtosecond oscillator which emits pulses of 10 nJ energy and 10 fs duration at a 67 MHz repetition rate (0.67 W average power).
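As a consistency check on these example figures, the quoted average power follows directly from the pulse energy and repetition rate:

\[ P_{avg} = E_{pulse} \times f_{rep} = 10\,\mathrm{nJ} \times 67\,\mathrm{MHz} \approx 0.67\,\mathrm{W} \]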
The system may be arranged to generate a synchronisation signal to synchronise the acquisition to the propagation of the radiation pulses from the illumination source 120. In an embodiment, a small portion of the illumination source 120 output may be sent to an optical constant fraction discriminator (OCF) which then generates the synchronisation signal (e.g., a TTL signal) which is sent to the detection device 130 and/or the processor 145 for synchronisation. The laser pulses 135 are directed towards a scattering surface 140 that lies beyond the obstacle that obscures or limits the direct line of sight to the object (e.g. the wall 100 in Figure 1). This scattering surface 140 can be any surface within line of sight of the target object 110, and may be another wall, a ceiling or roof, an open door, the surface of another object or, as shown in this example, the floor. When a laser pulse hits the scattering surface 140 (e.g., floor) at the laser point 150, it will scatter into a spherical wave 160 (Figure 2(a)).
This spherical wave 160 will then propagate outwards in all directions, including behind the wall 100 and therefore reach the target object 110 to be detected and tracked. The radiation in the spherical wave will then in turn be reflected from the target object 110, with some of the reflected radiation 170 (Figure 2(b)) being reflected towards the imaged area 180, which is within the direct line of sight of the detection device 130.
In Figures 1 and 2, the detection device is shown to be imaging an imaged area 180 on the floor, just beyond the edge of the obscuring wall. It should be understood that each scattering event results in the emission of a spherical wave. In the case of a complicated object, there will probably be many spherical waves of reflected radiation 170 originating from different parts of the object. These will enable determination of the actual shape of the target object 110. In other cases, e.g. a car imaged from a distance or with a lower resolution, it may be that only a single spherical wave emitted from the object is imaged. This spherical wave will appear as a section of a circle on the imaged area 180 as the spherical wave of reflected radiation 170 intersects the imaged area 180. In Figure 2(b), an example of real data 190 is shown, illustrating the image of the spherical wave of reflected radiation 170 within the imaged area 180 as captured by the detection device 130. This data shows the spherical wave of reflected radiation 170 at a precise time instant: the picosecond temporal resolution of exemplary detection devices allows the capture of the propagation of the spherical wave of reflected radiation 170 as it traverses the imaged area 180. The total data cube acquired by the detection device 130 will provide a video in which (part of) the spherical wave of reflected radiation 170 is seen to propagate from left to right in the image.

It is the combination of the temporal information, i.e. how the spherical wave of reflected radiation 170 moves over time, together with the spatial information, i.e. the actual shape of the spherical wave of reflected radiation 170, that allows the exact location of the target object 110 to be determined. This determination may be performed by processor 145. The location of the target object 110 is retrieved by utilising the fact that: (i) the time it takes for the radiation to propagate from the illumination source 120 to the target object 110 and back, similarly to a LIDAR system, gives information about the target object's distance; and (ii) the curvature and direction with which the spherical wavefront of reflected radiation 170 propagates across the imaged area 180 provide information on the target object's position.

The detection device may comprise single photon counting technology using a single-photon avalanche diode (SPAD) camera. A high temporal resolution of the detection device can be obtained by operating it in a Time-Correlated Single-Photon Counting (TCSPC) mode. The arrival times of single photons may be measured with a resolution of the order of picoseconds, e.g., with a time bin of between 10 and 100 ps, or between 30 and 80 ps. As such, the system may use millions of laser pulses in order to properly reconstruct the fully animated video. The acquisition time depends on the repetition rate of the laser: the higher the repetition rate, the faster the acquisition time. SPAD detectors, originally developed as single pixel elements, are gradually becoming widely available as focal plane arrays. The single photon sensitivity and picosecond temporal resolution make them good candidates for real-time non-line-of-sight ranging of a moving target. Specifically, in this example, the detection device 130 may comprise a 32x32-pixel array of Si CMOS SPADs.
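To make the data handling concrete, the acquisition can be pictured as a photon-count cube with two pixel indices and one time-bin index. The short Python sketch below uses illustrative array names, sizes and placeholder data (32x32 pixels, picosecond-scale time bins) to show how a single time slice corresponds to one frame of the "video" and how a coarse per-pixel arrival-time map can be read off the histogram peaks; it is a sketch under these assumptions, not the acquisition code of the actual system.

```python
import numpy as np

# Hypothetical TCSPC data cube: counts[y, x, t] accumulated over many laser
# pulses; 32x32 SPAD pixels and 1024 time bins of ~45 ps (illustrative values).
n_y, n_x, n_t = 32, 32, 1024
bin_width_s = 45.5e-12
counts = np.random.poisson(lam=0.2, size=(n_y, n_x, n_t))  # placeholder data

# One time slice is a 32x32 frame; across successive slices the reflected
# spherical wave appears as an arc sweeping across the imaged area.
frame = counts[:, :, 500]

# Coarse per-pixel arrival time: index of the histogram peak, in seconds.
peak_bin = counts.argmax(axis=2)          # shape (32, 32)
arrival_time_s = peak_bin * bin_width_s   # arrival-time map over the imaged area
```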
A SPAD is based on a p-n junction device biased beyond its breakdown region. The high reverse bias voltage generates a sufficient magnitude of electric field such that a single charge carrier introduced into the depletion layer of the device can cause a self-sustaining avalanche via impact ionisation. The avalanche is quenched, either actively or passively, to allow the device to be "reset" to detect further photons. The initiating charge carrier can be photo-electrically generated by means of a single incident photon striking the high field region. It is this feature which gives rise to the name 'Single Photon Avalanche Diode'. This single photon detection mode of operation is often referred to as 'Geiger Mode'. The high sensitivity of the detection device 130 allows extremely short acquisition times, which in turn allows one to locate target objects on timescales sufficiently short to be able to track their movement. Locating the position of an object hidden behind a wall with centimetre precision is possible, without the need for pre-acquiring a background in the absence of the object. It can also be shown that real-time acquisition is possible for an object moving at a few centimetres per second. The detection device 130 may comprise an array of SPADs individually operable in a time-correlated single-photon counting (TCSPC) mode: every time a photon is detected by a pixel, the time difference between its arrival and the arrival of the synchronisation signal (e.g., the TTL trigger from the OCF) is measured and stored in a time histogram. Each histogram comprises a number of time pixels and a time bin of a certain duration. Specifically, each histogram may comprise 1024 time pixels with a time bin of 45.5 ps. The time resolution is limited by the electronic jitter of the system, which may be approximately 110 ps (measured at full-width-half-maximum), for example. This impulse response corresponds to a spatial (depth) resolution of 1.65 cm, allowing the approximation of the back scattering as a single spherical wave originating from the target.
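The quoted depth resolution can be checked from the impulse response: a temporal spread of about 110 ps corresponds to a round-trip path spread of c × 110 ps, giving (with the usual two-way factor of one half) a depth resolution of roughly

\[ \Delta z \approx \frac{c\,\Delta t}{2} = \frac{3\times10^{8}\,\mathrm{m/s} \times 110\,\mathrm{ps}}{2} \approx 1.65\,\mathrm{cm} \]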
The target object-position retrieval algorithm may rely on both the temporal and spatial information recorded by the detection device (SPAD camera). Every pixel i of the (e.g., 32x32-pixel) camera, corresponding to a position r_i = (x_i, y_i) in the imaged area, records a histogram of photon arrival times. These histograms give a probability distribution of arrival times for photons scattered by a hidden target. This time distribution can be mapped into a probability density in space for the target position. A spatial probability density is calculated for each pixel, and the product of all probabilities constitutes the joint probability density of finding the target at a specific position in space.
First, the signal of interest coming from the target object alone is isolated from the signal coming from unwanted sources in the environment such as the walls and the ceiling. This can be achieved by simply acquiring a background signal in the absence of the target object; however, this may not be a practical solution if we are interested in tracking non-cooperative moving target objects. Instead, by acquiring data with the target object at different positions, it is possible to distinguish the signal that is not changing at each acquisition (generated by the static sources) from the signal that is changing (generated by the target object). An average of the temporal histograms for each pixel proves to be a very good approximation of the background signal and allows the signal generated by the target object alone to be effectively isolated. Background subtraction is discussed in more detail later.
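A minimal sketch of this background estimation, assuming a stack of per-acquisition histograms for one pixel is available (array names, sizes and data are illustrative only): the static contribution is approximated by the per-bin median over acquisitions and subtracted, leaving the part of the signal that changes because the target moves.

```python
import numpy as np

# histograms[k, t]: photon-count histogram of one pixel for acquisition k
# (placeholder data; in practice there is one such stack per SPAD pixel).
n_acquisitions, n_bins = 10, 1024
histograms = np.random.poisson(lam=1.0, size=(n_acquisitions, n_bins)).astype(float)

# Static sources (walls, ceiling) contribute the same signal to every
# acquisition, so the per-bin median over acquisitions approximates the
# background, as described above.
background = np.median(histograms, axis=0)

# Target-only signal for one acquisition: subtract the background and clip.
target_signal = np.clip(histograms[0] - background, 0.0, None)
```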
Once the target object signal is isolated, the processing proceeds to time-of-flight measurements and fitting of a Gaussian function to the temporal histograms. For each pixel i, the peak position of the Gaussian fit ⟨t⟩_i is a measure of the average total photon flight time, with an uncertainty that is taken to be the Gaussian standard deviation σ_{t_i}. The arrival time t_i is a measure of the light travel time from the moment the laser hits the ground, scatters to a target object at a point r_0 = (x_0, y_0) and scatters back to the specific point r_i in the field of view (imaged area) of the camera. This is illustrated in Figure 3. There is an ensemble of locations that satisfy this condition, thus forming a three-dimensional ellipsoid which collapses to a two-dimensional ellipse 300 on a plane parallel to the floor (scattering surface), defined by the target object's height. This ellipse 300 is defined by:
\[ |r_0 - r_l| + |r_0 - r_i| = t_i \times c \]
where |r_0 - r_l| and |r_0 - r_i| are the distances from the laser point r_l (labelled 150) on the floor to the target object and from the target object to the point r_i (labelled 310) respectively, and c is the speed of light (e.g., relevant to the medium). Solving this equation for the object position r_0 gives infinitely many solutions lying on the surface of an ellipsoid defined by foci r_l and r_i: the possible positions in space that can generate a signal at pixel i at time t_i lie on an ellipsoidal surface with evenly distributed probability. By way of example, one possible photon path 320, not corresponding to the actual location of the object 110, is shown. In the absence of any uncertainties in the measured signals, the resulting pixel probability density of the object's location P_i^ellipse(r_0) calculated from the data collected by pixel i is therefore given by:
\[ P_i^{ellipse}(r_0) \propto \begin{cases} 1 & \text{if } |r_0 - r_l| + |r_0 - r_i| = t_i \times c \\ 0 & \text{otherwise} \end{cases} \]
The function above can be represented in a simpler form by using ellipsoidal coordinates with foci r_l and r_i:
\[ P_i^{ellipse}(r_0) \propto \delta(\varepsilon - c\,t_i) \]
where ε = |r_0 - r_l| + |r_0 - r_i|. In any real implementation, the histogram h_i(t) recorded by any given pixel i will contain an uncertainty on the arrival time of the signal. As a result, the probability density P_i^ellipse(r_0) will no longer be a uniform ellipsoid, and the uncertainty in the time histogram can be mapped onto the spatial probability density as:
\[ P_i^{ellipse}(r_0) \propto \int_{-\infty}^{\infty} \delta(\varepsilon - c\,t)\, h_i(t)\, dt = h_i\!\left(\frac{\varepsilon}{c}\right) \]
The signal recorded in histogram h_i(t) has a Gaussian form with a standard deviation of σ_{t_i}. The uncertainty originates from different sources, for example the jitter of the system and the finite size of the target. As a result of the Gaussian form of the recorded signal, the general expression of the pixel probability density P_i^ellipse(r_0) becomes:
\[ P_i^{ellipse}(r_0) \propto \exp\!\left( -\frac{\left(\varepsilon - c\,\langle t \rangle_i\right)^2}{2\left(c\,\sigma_{t_i}\right)^2} \right) \]
This ellipse 300 represents a probability distribution for the position of the target object with uncertainty σ_{t_i}. The uncertainty σ_{t_i} is represented here by the line thickness of ellipse 300. Here ⟨t⟩_i is the mean arrival time registered by pixel i, and σ_{t_i} its standard deviation.
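The pixel probability density above can be evaluated numerically on a two-dimensional search grid. The following Python sketch, with illustrative coordinates and parameter values, computes ε for every candidate position r_0 and applies the Gaussian expression; it is an assumption-laden illustration of the formula rather than the implementation of the system.

```python
import numpy as np

C = 3.0e8  # speed of light in the medium (m/s); vacuum value used here

def pixel_ellipse_density(grid_x, grid_y, r_laser, r_pixel, t_mean, t_sigma):
    """Gaussian 'thick ellipse' probability density for a single pixel.

    grid_x, grid_y : 2D arrays of candidate target positions r0 = (x0, y0)
    r_laser        : (x, y) of the laser spot on the scattering surface
    r_pixel        : (x, y) of the pixel footprint on the scattering surface
    t_mean, t_sigma: fitted mean arrival time <t>_i and its std sigma_t_i (s)
    """
    # epsilon = |r0 - r_l| + |r0 - r_i|
    eps = (np.hypot(grid_x - r_laser[0], grid_y - r_laser[1])
           + np.hypot(grid_x - r_pixel[0], grid_y - r_pixel[1]))
    # P_i^ellipse(r0) ~ exp(-(eps - c<t>_i)^2 / (2 (c sigma_t_i)^2))
    return np.exp(-((eps - C * t_mean) ** 2) / (2.0 * (C * t_sigma) ** 2))

# Illustrative 2D search space (metres) and example parameter values.
xs, ys = np.meshgrid(np.linspace(-2.0, 2.0, 200), np.linspace(0.0, 4.0, 200))
p_i = pixel_ellipse_density(xs, ys, r_laser=(0.0, 0.0), r_pixel=(0.3, 0.1),
                            t_mean=1.2e-8, t_sigma=1.1e-10)
```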
In the described arrangement, the aim is to locate the object in x and y, on the same plane as the scattering surface (i.e., the plane of the floor in this example), making the assumption that such an object is not moving considerably in the vertical direction. A two-dimensional search space at a given height close to ground level is therefore defined, and r_0 takes the form r_0 = (x_0, y_0). In a real-life implementation, the height can be appropriately estimated based on the type of target being tracked or located. An error Δz in estimating this height will result, in the worst case, in an error Δr_0 of the same order in the determination of the target's r_0 coordinates, although it will typically be much smaller and decreases rapidly for objects that are further away: Δr_0 ≈ (z_0/|r_0|)Δz.
As mentioned above, the calculated pixel probability densities P_i^ellipse(r_0) of each pixel are multiplied to obtain the joint probability density P(r_0). However, there is a risk that a given pixel i will yield an insignificant fit of the signal and that the values of ⟨t⟩_i and σ_{t_i} will be unreliable. To prevent these unreliable P_i^ellipse(r_0) from affecting the joint probability density by multiplying relevant densities by zero, the probability P_i(r_0) associated with pixel i can be taken to be a linear combination of the pixel probability density P_i^ellipse(r_0) and a uniform probability density P^uniform that prevents any point in space from being multiplied by zero:
\[ P_i(r_0) = a_i\, P_i^{ellipse}(r_0) + (1 - a_i)\, P^{uniform} \]
Here a_i is a coefficient between 0 and 1, related to the reliability of the probability density P_i(r_0). The choice of a_i for each pixel depends on how well the probability density P_i(r_0) overlaps with the space in which the object is being sought. More precisely, a_i may be set as A_i/A, where A_i is the area contained in the search space where the probability density P_i(r_0) is over a certain threshold (half its maximum) and A is the area of the search space.
Pixel probability distributions P_i^ellipse(r_0) are calculated for every pixel i in the imaged area. In order to retrieve the target object's position, the joint probability density is calculated by multiplying the probability distributions from all X (e.g., X = 1024) camera pixels:
\[ P(r_0) = N \prod_{i=1}^{X} P_i(r_0) \]
P(r_0) determines the overall probability distribution of the location of the target, and N is a normalisation constant.
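Continuing the sketch above, the joint density can be assembled by mixing each pixel density with a uniform density using a reliability weight a_i, multiplying over all pixels (here in log space for numerical stability) and normalising; the most probable target position is then the maximum of the resulting grid. Function and variable names are illustrative.

```python
import numpy as np

def reliability(p_i):
    """a_i = A_i / A: fraction of the search space where P_i exceeds
    half of its maximum, as described above (uniform grid assumed)."""
    return (p_i > 0.5 * p_i.max()).mean()

def combine_pixel_densities(pixel_densities):
    """Joint probability density P(r0) from the per-pixel densities."""
    n_points = pixel_densities[0].size
    uniform = 1.0 / n_points                       # P^uniform over the search grid
    log_joint = np.zeros_like(pixel_densities[0])
    for p_i in pixel_densities:
        a_i = reliability(p_i)
        p_i = p_i / p_i.sum()                      # normalise P_i^ellipse on the grid
        mixed = a_i * p_i + (1.0 - a_i) * uniform  # P_i(r0)
        log_joint += np.log(mixed + 1e-300)        # product of all pixel densities
    joint = np.exp(log_joint - log_joint.max())
    return joint / joint.sum()                     # equivalent to applying N

# joint = combine_pixel_densities([...]) over all camera pixels;
# np.unravel_index(joint.argmax(), joint.shape) gives the most probable position.
```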
Figure 4 illustrates this step graphically. Ellipses 300 calculated from different pixels 310 give slightly displaced probability distributions that intersect at a given point. Here four ellipses 300 are shown, each corresponding to one of the four pixels 310 highlighted. The area where the ellipses overlap indicates the region of highest probability for the target location. In a real example, there will be many more pixels and therefore the same number of ellipses (of the order of magnitude of 1000, for example). Multiplying these probability distributions provides an estimate of the location of target object 110.
As already mentioned, in a case where an object-free background is impractical or impossible to acquire, the average of histograms recorded at different times can be used to estimate the background. In an embodiment, the average may comprise a median of the histograms recorded at different times. The moving target results in signal differences between the recorded histograms. It can be shown that the average of the histograms is very similar to the background signal. In a real-life implementation, the object-present background can be calculated with the first few acquisitions. If a target object is present, but is not moving, this algorithm will fail to detect the target; but as soon as the target starts moving, it will take around 15 seconds (about 5 acquisitions) to record a background and begin to accurately locate the target. The best approximation of a background will come from a target that is moving in both the x and y directions. However, it should be appreciated that during this initialisation period of 15 seconds, the camera is acquiring data which does provide information regarding the movement of the target: this data may be less accurate with respect to the data acquired at later times, but nonetheless indicates target movement. For stationary targets, other methods of adjusting for background noise can be performed, as known in the art.

To retrieve the position of the target object, the position r_i of pixel i on the floor is also required. The camera is looking down at the floor, but still records a (e.g., 32x32 pixel) square image. This image actually corresponds to a field of view that is trapezoidal and stretched both spatially and temporally, with respect to a square field of view perpendicular to the line of sight of the camera. To correctly retrieve the positions of a target object, the actual position r_i = (x_i, y_i) of each pixel of the camera is determined. To do so, the dimensions of the imaged area and its distance to the camera are measured, to reconstruct the actual shape of the imaged area on the scattering surface (e.g., floor). In a real-life application, knowing the height and angle of the camera with respect to the scattering surface will be sufficient to determine the geometry of its field of view and therefore know the positions (x_i, y_i) of each pixel in the field of view. Temporal distortion of the recorded data may also be corrected for: because the imaged area is not perpendicular to the camera's line of sight, photons recorded at the top of the field of view take longer to reach the camera than those coming from the bottom. Again, knowing the geometry of the imaging system, the measured ⟨t⟩_i can be corrected accordingly. The values of r_i, ⟨t⟩_i and σ_{t_i} can then be used to retrieve the position of a target object, as explained.
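A minimal sketch of these geometric and temporal corrections, assuming the camera height, downward tilt and field of view are known (all names and values below are illustrative): it computes where each pixel's line of sight meets the floor, giving r_i = (x_i, y_i), and the return travel time from that footprint to the camera, which can be subtracted from the measured ⟨t⟩_i.

```python
import numpy as np

C = 3.0e8  # speed of light (m/s)

def pixel_floor_geometry(n_pix=32, cam_height=1.0, tilt_deg=30.0, fov_deg=20.0):
    """Map camera pixels to floor positions and return-path travel times.

    cam_height : camera height above the scattering surface (m), assumed known
    tilt_deg   : downward tilt of the optical axis from the horizontal
    fov_deg    : full field of view, assumed equal in both directions
    """
    half_fov = np.deg2rad(fov_deg) / 2.0
    offsets = np.linspace(-half_fov, half_fov, n_pix)
    elev = np.deg2rad(tilt_deg) + offsets[:, None]   # vertical angle per image row
    azim = offsets[None, :]                          # horizontal angle per column

    # Intersection of each pixel ray with the floor (camera at the origin,
    # x measured forward along the ground, y sideways): the square sensor
    # image maps to a trapezoidal footprint on the floor.
    ground_range = cam_height / np.tan(elev)
    x_i = ground_range * np.cos(azim)
    y_i = ground_range * np.sin(azim)

    # Photons from the far edge of the footprint travel further back to the
    # camera, so their measured arrival times carry an extra delay that can
    # be subtracted from <t>_i.
    return_delay = np.hypot(ground_range, cam_height) / C
    return x_i, y_i, return_delay

x_i, y_i, return_delay = pixel_floor_geometry()
```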
When trying to track target objects that are far away from the detector, the error in the position determination tends to increase due to the fact that the curvature of the scattered waves decreases (spherical waves at a large distance from the source look like plane waves) meaning that their areas of overlap become less well defined. This problem can be offset by repeating the measurement with the illumination source pointing to a slightly different position and/or using multiple detection devices, each looking at slightly different positions (different imaged areas). In general, this generalisation to multiple illumination points and/or detection devices will increase the tracking resolution of the system.
The methods described herein can be extended to tracking multiple target objects, provided that the signals originating from the objects do not significantly overlap. As a proof of principle, the signals from two targets separated by 45 cm were recorded. Once the background is retrieved from the recorded signal, it is possible to distinguish the two signals coming from the two distinct targets, as they produce backscattered spherical waves that can be distinguished both in time and in space. For this proof-of-principle experiment, the multiple peaks in each histogram were located and the retrieval algorithm individually applied to each of the two separate signals. The retrieved probability densities are in good agreement with the positions retrieved from single-target measurements. More precise tracking of multiple targets may be enhanced by some relatively straightforward solutions, such as increasing the field of view of the system by using large-area arrayed detectors or decreasing the temporal response of the system. Large-format SPAD array cameras with these properties are in development.
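For the multiple-target case, the separate returns in a background-subtracted histogram can be located by simple peak finding before running the single-target retrieval on each peak individually; a brief sketch using SciPy follows, with an illustrative threshold and minimum separation rather than values from the original experiment.

```python
import numpy as np
from scipy.signal import find_peaks

def split_target_returns(histogram, bin_width_s=45.5e-12, min_separation_s=1.0e-9):
    """Return the arrival times (s) of distinct peaks in a background-subtracted
    histogram; each peak is then fed to the single-target retrieval on its own."""
    min_distance_bins = max(1, int(min_separation_s / bin_width_s))
    peaks, _ = find_peaks(histogram,
                          height=0.3 * histogram.max(),  # illustrative threshold
                          distance=min_distance_bins)
    return peaks * bin_width_s
```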
The motion tracking will be more precise if the object moves by less than its physical dimension during the sub-second acquisition time. For a person, this amounts to maximum speeds of the order of a few metres per second, i.e. to a person walking at a relatively fast pace (4 km/h). For a car, it would be of the order of 20-30 km/h. If the object is moving faster than this, then it would still be possible to track its motion correctly, but the object would appear to be larger than in reality due to blurring effects. This can be offset by adopting detectors which have faster acquisition times and/or achieve all of the data processing described above directly on-board. In this case, the time limitations mentioned here (that are due mainly to data download times) can be reduced by factors of 10x or even 100x, thus allowing tracking of very fast moving objects.
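As a rough worked example of this limit, assuming an effective object dimension of about 0.5 m and a sub-second acquisition of about 0.5 s, the maximum speed before blurring sets in is of the order of

\[ v_{max} \approx \frac{d_{object}}{t_{acq}} \approx \frac{0.5\,\mathrm{m}}{0.5\,\mathrm{s}} = 1\,\mathrm{m/s} \approx 3.6\,\mathrm{km/h} \]

which is consistent with the walking-pace figure quoted above.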
Nevertheless, for many applications, localisation of an object, or simply information regarding whether it is moving or not, is sufficient. In these cases, blurring effects are of no consequence. Moreover, it should be underlined that this is just a first step to tracking target objects using lasers and cameras. New, faster lasers and cameras are already being developed that will allow the specifications of this first demonstration to be improved even further.
It should be appreciated that the above description is for illustration only and other embodiments and variations may be envisaged without departing from the spirit and scope of the invention.

Claims

Claims
1. An apparatus for obtaining positional information relating to a target object, the apparatus comprising:
an illumination source operable to illuminate a scattering surface, said scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by said scattering surface;
a detection device operable to detect reflected radiation, said reflected radiation being said scattered radiation which has reflected off said target object to within the field of view of said detection device; and
a processor operable to calculate said positional information from the detected reflected radiation.
2. An apparatus as claimed in claim 1 wherein the target object is not within the line of sight of the illumination source and detection device.
3. An apparatus as claimed in claim 1 or 2 wherein the scattering surface comprises a floor, wall or ceiling.
4. An apparatus as claimed in any preceding claim wherein said detection device is operable to image an imaged area within its field of view.
5. An apparatus as claimed in claim 4 wherein said imaged area is coplanar with the scattering surface.
6. An apparatus as claimed in claim 4 or 5 wherein the processor is operable to determine the position of each pixel within the imaged area from the relative positions of the illumination source, scattering surface and detection device.
7. An apparatus as claimed in claim 4, 5 or 6 wherein said detection device is operable to capture a plurality of successive images of said reflected radiation as it propagates through said imaged area.
8. An apparatus as claimed in claim 7 wherein said reflected radiation takes the form of a spherical wave and said positional information comprises its distance and position, said distance being calculated from the time taken for a photon of said illumination radiation to travel from the illumination source to the imaged area via the scattering surface and target object; and the position is calculated from the curvature and direction of said spherical wave as determined from said images.
9. An apparatus as claimed in any preceding claim wherein the detection device comprises an array of single-photon avalanche diodes.
10. An apparatus as claimed in claim 9 wherein the detection device operates in a Time-Correlated-Single-Photon-Counting mode, wherein arrival times of single photons are measured relative to a synchronisation signal corresponding to the illumination source emitting a pulse of radiation to illuminate the scattering surface.
11. An apparatus as claimed in claim 10 wherein the processor is operable to determine an arrival time for each of said pixels, said arrival time corresponding to the time duration for a photon of said illumination radiation to travel from the illumination source to the corresponding pixel within the field of view of said detection device via said scattering surface and target object.
12. An apparatus as claimed in claim 11 wherein the processor is operable to determine:
a histogram for each pulse emitted from said illumination source, and each pixel of said detection device, wherein each histogram counts the number of photons received within each of a plurality of bins, each bin corresponding to a different subdivision of a time period between emission of said pulse and a later time; and from each histogram, determine an arrival time corresponding to a peak in said histogram.
13. An apparatus as claimed in claim 12 wherein the target object is moving and said processor is operable to determine an average of the histograms for a plurality of successive radiation pulses, and taking this average as a measure of the background effects.
14. An apparatus as claimed in claim 13 wherein the average is a median.
15. An apparatus as claimed in any of claims 10 to 14 wherein the processor is operable to determine from each determined arrival time, a pixel probability distribution or pixel probability density describing the probability of the location of the target object based on said arrival time.
16. An apparatus as claimed in claim 15 wherein said processor further determines a measure of uncertainty in said arrival time, and said determination of the pixel probability distribution or pixel probability density uses said measure of uncertainty.
17. An apparatus as claimed in claim 15 or 16 wherein said pixel probability distribution or pixel probability density form an ellipse and said processor is operable to determine an overall probability distribution or probability density of the location of the target by combining the pixel probability distributions or pixel probability densities corresponding to each of said pixels to find their area of overlap.
18. An apparatus as claimed in claim 15, 16 or 17 wherein said processor is operable to determine an overall probability distribution or probability density of the location of the target by multiplying all the pixel probability distributions or pixel probability densities corresponding to each of said pixels.
19. An apparatus as claimed in any preceding claim wherein the target object is moving and the processor is operable to track the movement of the target object over time.
20. An apparatus as claimed in claim 19 wherein the processor is operable to separately determine the position of the target object from radiation corresponding to each of a plurality of successive pulses from said illumination device, and to track movement of said target object from the differences in the determined positions corresponding to the plurality of successive pulses.
21. An apparatus as claimed in any preceding claim wherein the apparatus is operable to calculate said position information from plural sets of measurements, each set of measurements corresponding to a different position being illuminated by said illumination device on said scattering surface or another scattering surface, and/or to a different imaged area imaged by said detection device.
22. An apparatus as claimed in claim 21 comprising a plurality of detection devices, each one imaging a different imaged area.
23. An apparatus as claimed in any preceding claim wherein there are a plurality of target objects and said processor is operable to distinguish the reflected radiation from each of said target objects temporally and spatially.
24. A method for obtaining positional information relating to a target object, the method comprising:
illuminating a scattering surface, said scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by said scattering surface;
detecting reflected radiation within an imaged area, said reflected radiation being said scattered radiation which has reflected off said target object to said imaged area; and
calculating said positional information from the detected reflected radiation.
25. A method as claimed in claim 24 wherein the target object is not within the line of sight of the illumination source and detection device.
26. A method as claimed in claim 24 or 25 wherein the scattering surface comprises a floor, wall or ceiling.
27. A method as claimed in any of claims 24 to 26 wherein said imaged area is coplanar with the scattering surface.
28. A method as claimed in any of claims 24 to 27 comprising determining the position of each pixel within the imaged area from the relative positions of a source of radiation illuminating the scattering surface, the scattering surface and the location where said reflected radiation is detected.
29. A method as claimed in any of claims 24 to 28 comprising capturing a plurality of successive images of said reflected radiation as it propagates through said imaged area.
30. A method as claimed in claim 29 wherein said reflected radiation takes the form of a spherical wave and said positional information comprises its distance and position, said distance being calculated from the time taken for a photon of said illumination radiation to travel from a source of radiation illuminating the scattering surface to the imaged area via the scattering surface and target object; and the position is calculated from the curvature and direction of said spherical wave as determined from said images.
31. A method as claimed in any of claims 24 to 30 wherein said detecting of the reflected radiation is performed using an array of single-photon avalanche diodes.
32. A method as claimed in claim 31 wherein the detecting of the reflected radiation is performed by Time-Correlated-Single-Photon-Counting, wherein arrival times of single photons are measured relative to a synchronisation signal corresponding to the emission of a pulse of radiation to illuminate the scattering surface.
33. A method as claimed in claim 32 comprising determining an arrival time for each of said pixels, said arrival time corresponding to the time duration for a photon of said illumination radiation to travel from a source of radiation illuminating the scattering surface to the corresponding pixel within said imaged area via said scattering surface and target object.
34. A method as claimed in claim 33 wherein the processor is operable to determine:
a histogram for each pulse emitted from said illumination source, and each pixel of said detection device, wherein each histogram counts the number of photons received within each of a plurality of bins, each bin corresponding to a different subdivision of a time period between emission of said pulse and a later time; and from each histogram, determine an arrival time corresponding to a peak in said histogram.
35. A method as claimed in claim 34 wherein the target object is moving, said method comprising determining an average of the histograms for a plurality of successive radiation pulses, and taking this average as a measure of the background effects.
36. A method as claimed in claim 35 wherein the average is a median.
37. A method as claimed in any of claims 32 to 36 comprising determining from each determined arrival time, a pixel probability distribution or pixel probability density describing the probability of the location of the target object based on said arrival time.
38. A method as claimed in claim 37 comprising determining a measure of uncertainty in said arrival time, and said determination of the pixel probability distribution or pixel probability density uses said measure of uncertainty.
39. A method as claimed in claim 37 or 38 wherein said pixel probability distribution or pixel probability density form an ellipse and said method comprises determining an overall probability distribution or probability density of the location of the target by combining the pixel probability distributions or pixel probability densities corresponding to each of said pixels to find their area of overlap.
40. A method as claimed in claim 37, 38 or 39 comprising determining an overall probability distribution or probability density of the location of the target by multiplying all the pixel probability distributions or pixel probability densities corresponding to each of said pixels.
41. A method as claimed in any of claims 24 to 40 wherein the target object is moving, said method comprising tracking the movement of the target object over time.
42. A method as claimed in claim 41 comprising determining the position of the target object from radiation corresponding to each of a plurality of successive pulses from said illumination device, and tracking movement of said target object from the differences in the determined positions corresponding to the plurality of successive pulses.
43. A method as claimed in any of claims 24 to 42 comprising calculating said position information from plural sets of measurements, each set of measurements corresponding to a different position being illuminated on said scattering surface or another scattering surface, and/or to a different imaged area imaged by said detection device.
44. A method as claimed in any of claims 24 to 43 wherein there are a plurality of target objects and said method comprises distinguishing the reflected radiation from each of said target objects temporally and spatially.
EP15794620.3A 2014-10-20 2015-10-20 Viewing and tracking of target objects Withdrawn EP3210038A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1418731.4A GB201418731D0 (en) 2014-10-20 2014-10-20 Viewing and tracking of hidden objects in a scene
PCT/GB2015/053116 WO2016063028A1 (en) 2014-10-20 2015-10-20 Viewing and tracking of target objects

Publications (1)

Publication Number Publication Date
EP3210038A1 true EP3210038A1 (en) 2017-08-30

Family

ID=52013380

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15794620.3A Withdrawn EP3210038A1 (en) 2014-10-20 2015-10-20 Viewing and tracking of target objects

Country Status (3)

Country Link
EP (1) EP3210038A1 (en)
GB (1) GB201418731D0 (en)
WO (1) WO2016063028A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109844563A (en) 2016-08-12 2019-06-04 法斯特里3D公司 The method and apparatus for measuring range-to-go in a multi-user environment by least one detector
CN106546981A (en) * 2016-10-24 2017-03-29 复旦大学 Movement human safety check imaging system and method
CN110114246B (en) * 2016-12-07 2022-03-01 乔伊森安全系统收购有限责任公司 3D time-of-flight active reflection sensing systems and methods
CN110537124B (en) * 2017-03-01 2021-12-07 奥斯特公司 Accurate photodetector measurement for LIDAR
US11105925B2 (en) 2017-03-01 2021-08-31 Ouster, Inc. Accurate photo detector measurements for LIDAR
CN107576969B (en) * 2017-08-08 2019-06-28 中国科学院西安光学精密机械研究所 Large scene based on GPU parallel computation hides target imaging System and method for
DE102017220774B4 (en) * 2017-11-21 2022-10-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for determining a distance to an object
DE102017222258A1 (en) * 2017-12-08 2019-06-13 Robert Bosch Gmbh Method for a LIDAR device for detecting a hidden object
US11978754B2 (en) 2018-02-13 2024-05-07 Sense Photonics, Inc. High quantum efficiency Geiger-mode avalanche diodes including high sensitivity photon mixing structures and arrays thereof
CN111868556A (en) * 2018-02-13 2020-10-30 感应光子公司 Method and system for high resolution remote flash LIDAR
JP7159340B2 (en) * 2018-10-29 2022-10-24 古野電気株式会社 Target measuring device and target measuring method
CN113874259B (en) 2019-04-04 2023-11-03 乔伊森安全系统收购有限责任公司 Detection and monitoring of active optical retroreflectors
CN111880194B (en) * 2020-08-10 2023-11-28 中国科学技术大学 Non-field-of-view imaging apparatus and method
CN112986903B (en) * 2021-04-29 2021-10-15 香港中文大学(深圳) Intelligent reflection plane assisted wireless sensing method and device
CN114578376B (en) * 2022-05-05 2022-08-19 中国科学院西安光学精密机械研究所 Single photon imaging simulation method based on ocean turbulence

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8749619B2 (en) * 2010-09-29 2014-06-10 Massachusetts Institute Of Technology Methods and apparatus for transient light imaging
US9146317B2 (en) * 2011-05-23 2015-09-29 Massachusetts Institute Of Technology Methods and apparatus for estimation of motion and size of non-line-of-sight objects
US9148649B2 (en) * 2011-10-07 2015-09-29 Massachusetts Institute Of Technology Methods and apparatus for imaging of occluded objects from scattered light

Also Published As

Publication number Publication date
WO2016063028A1 (en) 2016-04-28
GB201418731D0 (en) 2014-12-03

Similar Documents

Publication Publication Date Title
EP3210038A1 (en) Viewing and tracking of target objects
KR102609223B1 (en) Accurate photodetector measurements for lidar
US10823825B2 (en) System and method for wide-area surveillance
Stilla et al. Waveform analysis for small-footprint pulsed laser systems
EP3195042B1 (en) Linear mode computational sensing ladar
Wallace et al. Full waveform analysis for long-range 3D imaging laser radar
US7834985B2 (en) Surface profile measurement
US20200341144A1 (en) Independent per-pixel integration registers for lidar measurements
US8749619B2 (en) Methods and apparatus for transient light imaging
EP2336805B1 (en) Textured pattern sensing and detection, and using a charge-scavenging photodiode array for the same
US20110242285A1 (en) Imaging system and method using partial-coherence speckle interference tomography
Steinvall et al. Laser range profiling for small target recognition
Henriksson et al. Photon-counting panoramic three-dimensional imaging using a Geiger-mode avalanche photodiode array
Laurenzis et al. Non-line-of-sight active imaging of scattered photons
McDonald Jr et al. Range-gated imaging experiments using gated intensifiers
Göhler et al. Range accuracy of a gated-viewing system as a function of the number of averaged images
Henriksson et al. Panoramic single‑photon counting 3D lidar
WO2020249359A1 (en) Method and apparatus for three-dimensional imaging
Andersen et al. Submillimeter 3-D laser radar for space shuttle tile inspection
Henderson et al. Proximity-based sensor fusion of depth cameras and isotropic rad-detectors
Gariepy et al. Tracking hidden objects with a single-photon camera
US20220091236A1 (en) Techniques for detecting and mitigating interference among multiple lidar sensors
CN114660571A (en) Non-vision field target multi-angle detection combined positioning device and method
Golitsyn et al. The implementation of gated-viewing system based on CCD image sensor
JPS58223079A (en) Measurement collation system for remote object

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170413

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20171209