EP3210038A1 - Viewing and tracking of target objects - Google Patents

Viewing and tracking of target objects

Info

Publication number
EP3210038A1
Authority
EP
European Patent Office
Prior art keywords
target object
radiation
scattering surface
pixel
operable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15794620.3A
Other languages
English (en)
French (fr)
Inventor
Genevieve GARIEPY
Francesco TONOLINI
Jonathan LEACH
Daniele Faccio
Robert Henderson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heriot Watt University
Original Assignee
Heriot Watt University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Heriot Watt University filed Critical Heriot Watt University
Publication of EP3210038A1 (de)
Current legal status: Withdrawn

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only
    • G01S17/10: Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/66: Tracking systems using electromagnetic waves other than radio waves
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808: Evaluating distance, position or velocity data
    • G01S7/483: Details of pulse systems
    • G01S7/486: Receivers
    • G01S7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak

Definitions

  • the present invention relates to methods and apparatuses for the illumination and viewing of objects that may be hidden from an observer.
  • the methods and apparatuses have particular applicability to optical sensing and imaging, especially where direct line of sight of a target object or objects of interest to an observer is not possible.
  • Optical imaging and image analysis are very important within any number of situations including but not limited to photography, astronomy, security, surveying, military operations, aerial and non-invasive inspection, livestock and crop management etc.
  • When the human eye looks at an object, it requires a direct "line of sight" to the object within its field of view, so that any light incident on the object, and then reflected from it, is captured by the eye and processed by the brain as a recognisable image of the object. If the same object is hidden from the observer's field of view, for example behind a wall, light incident on it will not reach the eye and the object is not visible to the observer.
  • Imaging devices such as the periscope are known, in which mirrors are used to manipulate the path of reflected light from an object to the eye of the observer. These can be used to observe an object behind a wall.
  • Periscopes are quite common at sporting events or concerts to help people see over the crowd, and also find use when an observer wishes to remain hidden from sight, for example in submarines or warfare where a soldier may wish to remain hidden behind a fortification giving protection while watching out for an enemy or target.
  • the human eye has evolved to operate at its peak efficiency during daylight hours and is most sensitive to wavelengths that lie within the visible part of the electromagnetic spectrum, typically 400 to 700 nanometres (nm). Outwith this spectral range, the human eye is less sensitive (e.g., to longer wavelengths) or can even be physically damaged (e.g., by shorter wavelengths) and a number of solid state devices and optical systems have been developed to allow imaging at these longer or shorter wavelengths within the electromagnetic spectrum.
  • these devices can include Charge Coupled Devices (CCD), photo-multiplier tubes, infra-red detectors, terahertz detectors, microwave or radio antenna, gamma-ray or x-ray detectors or photographic film sensitised to the wavelength of interest etc.
  • the mode of operation is generally limited to "line of sight" operation; the exception being where the energy or wavelength is such that it can travel through solid objects to the detector.
  • an apparatus for obtaining positional information relating to a target object comprising:
  • an illumination source operable to illuminate a scattering surface, said scattering surface being within the line of sight of the target object, such that scattered radiation is scattered by said scattering surface;
  • a detection device operable to detect reflected radiation, said reflected radiation being said scattered radiation which has reflected off said target object to within the field of view of said detection device;
  • a processor operable to calculate said positional information from the detected reflected radiation.
  • a method for obtaining positional information relating to a target object comprising:
  • Figure 1 is a side view diagram showing schematically an arrangement for imaging and tracking an object hidden from view according to a first embodiment of the invention;
  • Figure 2 is a top view diagram of the arrangement of Figure 1 showing (a) the radiation from the illumination source scattering from the floor and (b) the scattered radiation reflecting from the target object;
  • Figure 3 is a top view diagram of the arrangement of Figures 1 and 2 illustrating the ensemble of possible locations for the object.
  • Figure 4 is a top view diagram of the arrangement of Figures 1 and 2 illustrating the step of combining the probability distributions to estimate target object location.
  • a compact non-line-of-sight laser ranging technology is disclosed which relies upon the ability to send light around an obstacle using a scattering surface (e.g., floor or wall) and to detect the return signal from a target object with only a few seconds' acquisition time.
  • the disclosed imaging technique in combination with analysis of the data collected can reveal both the presence of a target object (or objects) in a scene and indicate size, shape and direction of movement of that object (or objects), where the object(s) is/are outside the normal field of view of the observer. This is beneficial over current imaging systems such as those using Light Detection and Ranging (or LIDAR) approaches where there is a need for direct "line of sight" of an object within the field of view.
  • the imaging system is such that it does not require a direct line of sight between the object of interest and the detector of the imaging system.
  • the target object(s) of interest may be stationary or may be moving in space and time. Where the object(s) of interest is non-stationary, there is disclosed a method of tracking the movement of the object (or objects). The object(s) may be tracked in real time. Within a scene containing multiple objects of interest to an observer a portion of the objects of interest may be stationary and a further portion may be non-stationary.
  • the imaging system may comprise an illumination source, a detection device, a means of processing the data from the detection device and an output device.
  • the wavelength of operation of the illumination source may be such that it is within the electromagnetic spectrum of radiation and can be detected by the detection device.
  • the illumination source of the invention provides illumination to an area within a scene; the scene containing an object, or objects, of interest to the observer.
  • the illumination source may provide a continuous or discontinuous mode of illumination.
  • the illumination source may be operated at a certain, known frequency.
  • the frequency of operation or number of cycles per second (Hertz, Hz) of the source may be optimised to provide a balance between the need to provide illumination and the ability to detect the illumination at a detection device.
  • the frequency of operation of the illumination source can be slow (few Hz) or can be fast (many thousands of Hz or greater).
  • the frequency of operation of the illumination source can be between 0 and 1000 Hz (1 kHz), between 1 kHz and 1 MHz, between 1 MHz and 1000 MHz (1 GHz), or in some cases beyond 1 GHz.
  • any illumination source could be used provided that a detector can be matched to it.
  • an illumination source of the invention is a laser source.
  • Laser sources cover a broad and useful part of the electromagnetic spectrum of radiation (~200 nm to well beyond 1 micron in wavelength) and a number of commercial detectors are available to match the illumination wavelength of the laser.
  • Laser sources provide a high power of illumination and depending on their design parameters can operate in a continuous mode of operation or a discontinuous mode of operation making them suitable for applications disclosed herein.
  • Laser sources can provide different laser pulse widths. These pulses are typically very short in duration and can be in the millisecond, nanosecond, picosecond or femtosecond range.
  • the detection device may have a wavelength of detection being within the output spectrum of the illumination source and ideally having a maximum sensitivity at the same output region of the electromagnetic spectrum as the illumination source.
  • the detector of the imaging system may be chosen such that it is optimised to the operation of the illumination source.
  • the detection device is capable of capturing the illumination data under conditions of either continuous illumination or discontinuous illumination.
  • the detector may collect illumination data over a short period of time or over a long period of time. Illumination data collected by the detector may be transferred from the detection device to an image processing system. The illumination data collected by the detector is processed in such a manner that the relative movement of the object(s) hidden from view can be determined. When data from the detector is processed with respect to time, additional information can be gathered on the speed and/or the direction of travel of the object(s).
  • the illumination data from the detector can be processed instantaneously, in "real time", or it can be recorded and stored for processing at a future time.
  • the output device may be a presentation means or any other media to visually represent the data collected by the imaging system to the observer.
  • the illumination source and the detector are located at a distance, X, from the object of interest to the observer.
  • the distance, X can be small or can be large.
  • the illumination source and the detector may be located at an angle, Y, to the object of interest to the observer. The angle can be small or can be large depending on the size of the object of interest and the distance, X.
  • Figure 1 shows schematically, from a side view, a system for imaging and tracking an object hidden from view (target object).
  • Figure 2 shows the same arrangement in top down view (a) illustrating the radiation from the illumination source scattering from the floor and (b) illustrating the scattered radiation reflecting from the target object.
  • a person behind a wall 100 represents the target object 110, but it could be anything and could encompass very different scenarios, e.g. a car behind a corner, a stranded person in a room that is on fire, an object in an underwater wreckage with limited access from outside etc.
  • the system comprises an illumination source 120 and detection device 130. Control of these may be performed using processor 145.
  • the illumination source may comprise, for example, a high repetition rate, pulsed laser.
  • the laser may comprise an 800 nm wavelength femtosecond oscillator which emits pulses of 10 nJ energy and 10 fs duration at a 67 MHz repetition rate (0.67 W average power).
  • the system may be arranged to generate a synchronisation signal to synchronise the acquisition to the propagation of the radiation pulses from the illumination source 120.
  • a small portion of the illumination source 120 output may be sent to an optical constant fraction discriminator (OCF) which then generates the synchronisation signal (e.g., TTL signal) which is sent to the detection device 130 and/or the processor 145 for synchronisation.
  • the laser pulses 135 are directed towards a scattering surface 140 that lies beyond the obstacle that obscures or limits the direct line of sight to the object (e.g. the wall 100 in Figure 1).
  • This scattering surface 140 can be any surface within line of sight of the target object 110, and may be another wall, a ceiling or roof, an open door, the surface of another object or, as shown in this example, the floor.
  • When a laser pulse 150 hits the scattering surface 140 (e.g., the floor), it will scatter into a spherical wave 160 (Figure 2(a)).
  • This spherical wave 160 will then propagate outwards in all directions, including behind the wall 100 and therefore reach the target object 110 to be detected and tracked.
  • the radiation in the spherical wave will then in turn be reflected from the target object 110, with some of the reflected radiation 170 ( Figure 2(b)) being reflected towards the imaged area 180, which is within the direct line of sight of the detection device 130.
  • the detection device is shown to be imaging an imaged area 180 on the floor, just beyond the edge of the obscuring wall. It should be understood that each scattering event results in the emission of a spherical wave. In the case of a complicated object, there will probably be many spherical waves of reflected radiation 170 originating from different parts of the object. These will enable determination of the actual shape of the target object 110. In other cases, e.g. a car imaged from a distance or with a lower resolution, it may be that only a single spherical wave emitted from the object is imaged.
  • This spherical wave will appear as a section of circle on the imaged area 180 as the spherical wave of reflected radiation 170 intersects the imaged area 180.
  • In Figure 2(b) an example of real data 190 is shown, illustrating the image of the spherical wave of reflected radiation 170 within the imaged area 180 as captured by the detection device 130.
  • This data shows the spherical wave of reflected radiation 170 at a precise time instant: the picosecond temporal resolution of exemplary detection devices allows the capture of the propagation of the spherical wave of reflected radiation 170 as it traverses the imaged area 180.
  • the total data cube acquired by the detection device 130 will provide a video in which (part of) the spherical wave of reflected radiation 170 is seen to propagate from left to right in the image. It is the combination of the temporal information, i.e. how the spherical wave of reflected radiation 170 moves over time, together with the spatial information, i.e. the actual shape of the spherical wave of reflected radiation 170 that allows the exact location of the target object 110 to be determined. This determination may be performed by processor 145.
  • the location of the target object 110 is retrieved by utilising the fact that: (i) the time it takes for the radiation to propagate from the illumination source 120 to the target object 110 and back, similarly to a LIDAR system, gives information about the target object's distance and (ii) the curvature and direction with which the spherical wavefront of reflected radiation 170 propagates across the imaged area 180 provides information on the target object's position.
  • the detection device may comprise single photon counting technology using a single-photon avalanche diode (SPAD) camera. A high temporal resolution of the detection device can be obtained by operating it in a Time-Correlated-Single- Photon-Counting (TCSPC) mode.
  • the arrival times of single photons may be measured with a resolution of picoseconds, e.g., with a time bin between 10 and 100 ps, or between 30 and 80 ps.
  • the system may use millions of laser pulses in order to properly reconstruct the full animated video.
  • the acquisition time depends on the repetition rate of the laser: the higher the repetition rate, the faster the acquisition time.
  • SPAD detectors, originally developed as single-pixel elements, are gradually becoming widely available as focal plane arrays. The single photon sensitivity and picosecond temporal resolution make them good candidates for real-time non-line-of-sight ranging of a moving target.
  • the detection device 130 may comprise a 32x32-pixel array of Si CMOS SPADs.
  • a SPAD is based on a p-n junction device biased beyond its breakdown region.
  • the high reverse bias voltage generates a sufficient magnitude of electric field such that a single charge carrier introduced into the depletion layer of the device can cause a self-sustaining avalanche via impact ionisation.
  • the avalanche is quenched, either actively or passively, to allow the device to be "reset" to detect further photons.
  • the initiating charge carrier can be photo-electrically generated by means of a single incident photon striking the high field region. It is this feature which gives rise to the name 'Single Photon Avalanche Diode'. This single photon detection mode of operation is often referred to as 'Geiger Mode'.
  • the high sensitivity of the detection device 130 allows extremely short acquisition times, which in turn allows one to locate target objects on timescales sufficiently short to be able to track their movement. Locating the position of an object hidden behind a wall with centimetre precision is possible, without the need for pre-acquiring a background in the absence of the object. It can also be shown that real-time acquisition is possible for an object moving at a few centimetres per second.
  • the detection device 130 may comprise an array of SPADs individually operable in a time-correlated single-photon counting (TCSPC) mode: every time a photon is detected by a pixel, the time difference between its arrival and the arrival of the synchronisation signal (e.g., TTL trigger from the OCF) is measured and stored in a time histogram.
  • Each histogram comprises a number of time pixels with a time bin of a certain duration. Specifically, a histogram may comprise 1024 time pixels with a time bin of 45.5 ps.
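As an illustration of the TCSPC histogramming described above, the following sketch shows how photon events could be accumulated into per-pixel histograms (a minimal Python example with illustrative names; the patent does not specify an implementation):

```python
import numpy as np

# Histogram parameters from the text: 1024 time bins of 45.5 ps each
# (given in the patent as one example).
N_BINS = 1024
BIN_WIDTH_PS = 45.5

def accumulate_tcspc(events, histograms):
    """Accumulate photon events into per-pixel TCSPC histograms.

    events: iterable of (pixel_index, delay_ps) pairs, where delay_ps is
            the photon arrival time relative to the synchronisation signal.
    histograms: (n_pixels, N_BINS) integer array, updated in place.
    """
    for pixel, delay_ps in events:
        bin_idx = int(delay_ps // BIN_WIDTH_PS)
        if 0 <= bin_idx < N_BINS:
            histograms[pixel, bin_idx] += 1
    return histograms

# Example: a 32x32 SPAD array flattened to 1024 pixels.
hists = np.zeros((32 * 32, N_BINS), dtype=np.int64)
accumulate_tcspc([(0, 2300.0), (0, 2345.5), (5, 910.0)], hists)
```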
  • the time resolution is limited by the electronic jitter of the system, which may be approximately 110 ps (measured at full-width-half-maximum), for example.
  • This impulse response corresponds to a spatial (depth) resolution of 1.65 cm, allowing the approximation of the back scattering as a single spherical wave originating from the target.
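For reference, the quoted depth resolution follows from the standard round-trip time-of-flight conversion (a step not spelled out in the text):

$$\Delta d = \frac{c\,\Delta t}{2} = \frac{(3 \times 10^{8}\ \mathrm{m/s}) \times (110\ \mathrm{ps})}{2} \approx 1.65\ \mathrm{cm}$$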
  • the signal of interest coming from the target object alone is isolated from the signal coming from unwanted sources in the environment such as the walls and the ceiling.
  • This can be achieved by simply acquiring a background signal in the absence of the target object; however, this may not be a practical solution if we are interested in tracking non-cooperative moving target objects. Instead, by acquiring data with the target object at different positions, it is possible to distinguish the signal that is not changing at each acquisition (generated by the static sources) and the signal that is changing (generated by the target object).
  • An average of the temporal histograms for each pixel proves to be a very good approximation of the background signal and allows the signal generated by the target object alone to be isolated effectively. Background subtraction is discussed in more detail later.
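A minimal sketch of this background-estimation step, assuming the histograms from successive acquisitions are stacked in a NumPy array (variable names are illustrative, not from the patent):

```python
import numpy as np

def subtract_background(acquisitions):
    """Isolate the moving-target signal from the static background.

    acquisitions: (n_acquisitions, n_pixels, n_bins) array of TCSPC
                  histograms recorded with the target at different positions.
    Returns background-subtracted histograms of the same shape.
    """
    # Static sources (walls, ceiling) contribute the same signal in every
    # acquisition, so the per-pixel median over acquisitions approximates
    # the background; the moving target's contribution averages out.
    background = np.median(acquisitions, axis=0)
    signal = acquisitions - background[np.newaxis, ...]
    # Negative residuals are noise left over from the subtraction.
    return np.clip(signal, 0.0, None)
```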
  • the processing proceeds to time-of-flight measurements and fitting of a Gaussian function to the temporal histograms.
  • the peak position of the Gaussian fit $\langle t \rangle_i$ is a measure of the average total photon flight time, with an uncertainty that is taken to be the Gaussian standard deviation $\sigma_{t_i}$.
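One way to extract $\langle t \rangle_i$ and $\sigma_{t_i}$ from each background-subtracted histogram is a least-squares Gaussian fit, for instance with scipy (an illustrative sketch, not the patent's own code):

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amplitude, t_mean, sigma):
    return amplitude * np.exp(-(t - t_mean) ** 2 / (2.0 * sigma ** 2))

def fit_arrival_time(hist, bin_width_ps=45.5):
    """Fit a Gaussian to one pixel's background-subtracted histogram.

    Returns (t_mean, sigma) in picoseconds: the average total photon
    flight time <t>_i and its uncertainty sigma_t_i for this pixel.
    """
    t = np.arange(hist.size) * bin_width_ps
    peak = int(np.argmax(hist))
    p0 = (hist[peak], t[peak], 3.0 * bin_width_ps)  # initial guess
    (_, t_mean, sigma), _ = curve_fit(gaussian, t, hist.astype(float), p0=p0)
    return t_mean, abs(sigma)
```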
  • This ellipse 300 is defined by the locus of points $\mathbf{r}$ for which the total photon path length matches the measured flight time:

$$|\mathbf{r} - \mathbf{r}_o| + |\mathbf{r}_i - \mathbf{r}| = c\,\langle t \rangle_i - d_0$$

where $\mathbf{r}_o$ is the position of the laser spot on the scattering surface, $\mathbf{r}_i$ the position of pixel $i$ on the imaged area, $c$ the speed of light, and $d_0$ the fixed, known path length from the illumination source to the laser spot plus that from the imaged area to the detection device.
  • the histogram $h_i(t)$ recorded by any given pixel $i$ will contain an uncertainty on the arrival time of the signal.
  • the signal recorded in histogram $h_i(t)$ has a Gaussian form with a standard deviation of $\sigma_{t_i}$; the uncertainty originates from different sources, for example the jitter of the system and the finite size of the target.
  • the probability density $P_i^{\mathrm{ellipse}}(\mathbf{r})$ will therefore no longer be a uniform ellipsoid, and the uncertainty in the time histogram can be mapped onto the spatial probability density as:

$$P_i^{\mathrm{ellipse}}(\mathbf{r}) \propto \exp\!\left(-\frac{\big(t(\mathbf{r}) - \langle t \rangle_i\big)^2}{2\sigma_{t_i}^2}\right), \qquad t(\mathbf{r}) = \frac{|\mathbf{r} - \mathbf{r}_o| + |\mathbf{r}_i - \mathbf{r}| + d_0}{c}$$

  • This ellipse 300 represents a probability distribution for the position of the target object with uncertainty $\sigma_{t_i}$; the uncertainty $\sigma_{t_i}$ is represented here by the line thickness of ellipse 300.
  • $\langle t \rangle_i$ is the mean arrival time registered by the pixel $i$, and $\sigma_{t_i}$ its standard deviation.
  • the height can be appropriately estimated based on the type of target being tracked or located.
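The mapping from one pixel's timing data to a spatial probability density could be evaluated numerically on a grid of candidate target positions, for example as follows (a sketch under the geometry assumptions above; all names are illustrative):

```python
import numpy as np

C_M_PER_PS = 2.998e-4  # speed of light in metres per picosecond

def ellipse_density(grid_xy, r_spot, r_pixel, t_mean_ps, sigma_ps, d0_m=0.0):
    """Spatial probability density P_i^ellipse(r) for one pixel.

    grid_xy: (n_points, 2) array of candidate target positions r.
    r_spot:  (2,) laser spot position on the scattering surface.
    r_pixel: (2,) position of pixel i's patch on the imaged area.
    t_mean_ps, sigma_ps: fitted mean flight time and its uncertainty.
    d0_m: fixed, known path length (source to spot plus patch to camera).
    """
    # Unknown part of the path for a target at r: spot -> target -> patch.
    path = (np.linalg.norm(grid_xy - r_spot, axis=1)
            + np.linalg.norm(grid_xy - r_pixel, axis=1) + d0_m)
    measured = C_M_PER_PS * t_mean_ps   # measured path length
    sigma_d = C_M_PER_PS * sigma_ps     # timing uncertainty as a length
    density = np.exp(-(path - measured) ** 2 / (2.0 * sigma_d ** 2))
    return density / density.sum()      # normalise over the grid
```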
  • the calculated pixel probability densities $P_i^{\mathrm{ellipse}}(\mathbf{r})$ of each pixel are multiplied to obtain the joint probability density $P(\mathbf{r})$.
  • the probability $P_i(\mathbf{r})$ associated with the pixel $i$ can be taken to be a linear combination of the pixel probability density $P_i^{\mathrm{ellipse}}(\mathbf{r})$ and a uniform probability density $P^{\mathrm{uniform}}$ that will prevent any point in space from being multiplied by zero:

$$P_i(\mathbf{r}) = \alpha_i\, P_i^{\mathrm{ellipse}}(\mathbf{r}) + (1 - \alpha_i)\, P^{\mathrm{uniform}}$$

  • $\alpha_i$ is a coefficient between 0 and 1, related to the reliability of the probability density $P_i(\mathbf{r})$.
  • the choice of $\alpha_i$ for each pixel depends on how well the probability density $P_i(\mathbf{r})$ overlaps with the space in which the object is being sought. More precisely, $\alpha_i$ may be set as $A_i/A$, where $A_i$ is the area contained in the search space where the probability density $P_i(\mathbf{r})$ is over a certain threshold (half its maximum) and $A$ is the area of the search space.
  • Pixel probability distributions $P_i^{\mathrm{ellipse}}(\mathbf{r})$ are calculated for every pixel $i$ in the imaged area.
  • the overall probability distribution of the location of the target is then

$$P(\mathbf{r}) = \frac{1}{N} \prod_i P_i(\mathbf{r})$$

where $N$ is a normalisation constant.
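Combining the per-pixel densities then reduces to the weighted mixture and product just described; a sketch under the same illustrative assumptions:

```python
import numpy as np

def reliability(density, search_mask):
    """alpha_i = A_i / A: fraction of the search space where this pixel's
    ellipse density exceeds half of its maximum."""
    above = (density >= 0.5 * density.max()) & search_mask
    return above.sum() / search_mask.sum()

def combine_densities(ellipse_densities, alphas):
    """Joint probability P(r) over candidate positions.

    ellipse_densities: (n_pixels, n_points) array, row i = P_i^ellipse(r).
    alphas: (n_pixels,) reliability coefficients alpha_i in [0, 1].
    """
    n_points = ellipse_densities.shape[1]
    uniform = 1.0 / n_points
    # P_i(r) = alpha_i * P_i^ellipse(r) + (1 - alpha_i) * P^uniform
    per_pixel = (alphas[:, None] * ellipse_densities
                 + (1.0 - alphas[:, None]) * uniform)
    # Multiply in log space for numerical stability; 1/N via normalisation.
    log_joint = np.sum(np.log(per_pixel + 1e-300), axis=0)
    joint = np.exp(log_joint - log_joint.max())
    return joint / joint.sum()
```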
  • Figure 4 illustrates this step graphically.
  • Ellipses 300 calculated from different pixels 310 give slightly displaced probability distributions that intersect at a given point.
  • four ellipses 300 are shown, each corresponding to one of four pixels 310 highlighted.
  • the area where the ellipses overlap indicates the region of highest probability for the target location.
  • the average of histograms recorded at different times can be used to estimate the background.
  • the average may comprise a median of the histograms recorded at different times.
  • the moving target results in signal differences between the recorded histograms. It can be shown that the average of the histograms is very similar to the background signal.
  • a background can thus be calculated from the first few acquisitions even while the object is present. If a target object is present but is not moving, this algorithm will fail to detect the target; but as soon as the target starts moving, it will take around 15 seconds (about 5 acquisitions) to record a background and begin to accurately locate the target.
  • the best approximation of a background will come from a target that is moving in both the x and y directions.
  • during this time the camera is already acquiring data which provides information regarding the movement of the target: this data may be less accurate than the data acquired at later times, but nonetheless indicates target movement.
  • other methods of adjusting for background noise can be performed, as known in the art.
  • To retrieve the position of the target object the position $\mathbf{r}_i$ of pixel $i$ on the floor is also required. The camera is looking down at the floor, but still records a (e.g., 32x32-pixel) square image.
  • This image actually corresponds to a field of view that is trapezoidal and stretched both spatially and temporally, with respect to a square field of view perpendicular to the line of sight of the camera.
  • the dimensions of the imaged area and its distance to the camera are measured, to reconstruct the actual shape of the imaged area on the scattering surface (e.g., floor).
  • Temporal distortion of the recorded data may also be corrected for: because the imaged area is not perpendicular to the camera's line of sight, photons recorded at the top of the field of view take longer to reach the camera than the ones coming from the bottom. Again, knowing the geometry of our imaging system, the measured $\langle t \rangle_i$ can be corrected accordingly. The values of $\mathbf{r}_i$, $\langle t \rangle_i$ and $\sigma_{t_i}$ can then be used to retrieve the position of a target object, as explained.
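A sketch of this temporal correction, assuming the positions of the pixel patches on the floor have already been reconstructed from the measured geometry (all parameters illustrative):

```python
import numpy as np

C_M_PER_PS = 2.998e-4  # speed of light in metres per picosecond

def correct_arrival_time(pixel_patch_xyz, camera_xyz, t_mean_ps):
    """Remove the patch-to-camera leg from a pixel's measured flight time.

    pixel_patch_xyz: (3,) position of pixel i's patch on the floor, found
                     from the measured dimensions of the trapezoidal
                     imaged area.
    camera_xyz: (3,) position of the detection device.
    t_mean_ps: fitted mean arrival time for this pixel.

    Photons from the top of the field of view travel further to the
    camera than those from the bottom; subtracting each patch's own
    leg of flight time makes all pixels comparable.
    """
    patch_to_camera_m = np.linalg.norm(pixel_patch_xyz - camera_xyz)
    return t_mean_ps - patch_to_camera_m / C_M_PER_PS
```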
  • as the distance to the target object increases, the error in the position determination tends to increase, due to the fact that the curvature of the scattered waves decreases (spherical waves at a large distance from the source look like plane waves), meaning that their areas of overlap become less well defined.
  • This problem can be offset by repeating the measurement with the illumination source pointing to a slightly different position and/or using multiple detection devices, each looking at slightly different positions (different imaged areas). In general, this generalisation to multiple illumination points and/or detection devices will increase the tracking resolution of the system.
  • the methods described herein can be extended to tracking multiple target objects, provided that the signals originating from the objects do not significantly overlap.
  • the signals from two targets separated by 45 cm were recorded. Once the background is retrieved from the recorded signal, it is possible to distinguish the two signals coming from the two distinct targets, as they produce backscattered spherical waves that can be distinguished both in time and in space.
  • the multiple peaks in each histogram were located and the retrieval algorithm individually applied to each of the two separate signals.
  • the retrieved probability densities are in good agreement with the positions retrieved from single-target measurements.
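In such a multi-target measurement, the separate returns in each histogram can be located with a standard peak finder and the single-target retrieval applied to each peak individually; for instance (an illustrative sketch using scipy, not the authors' code):

```python
import numpy as np
from scipy.signal import find_peaks

def split_targets(hist, bin_width_ps=45.5, min_separation_ps=500.0):
    """Locate multiple target returns in one pixel's histogram.

    Returns a list of (t_peak_ps, window) pairs, one per detected peak;
    each window can then be passed to the single-target Gaussian fit.
    """
    distance_bins = max(1, int(min_separation_ps / bin_width_ps))
    peaks, _ = find_peaks(hist, height=0.2 * hist.max(),
                          distance=distance_bins)
    results = []
    for p in peaks:
        lo, hi = max(0, p - distance_bins), min(hist.size, p + distance_bins)
        results.append((p * bin_width_ps, hist[lo:hi]))
    return results
```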
  • More precise tracking of multiple targets may be enhanced by some relatively straightforward solutions such as increasing the field of view of the system by using large-area arrayed detectors or decreasing the temporal response of the system. Large-format SPAD array cameras with these properties are in development.
  • the motion tracking will be more precise if the object moves by less than its physical dimension during the sub-second acquisition time. For a person, this amounts to maximum speeds of the order of a few meters/second, i.e. to a person walking at a relatively fast pace (4 km/h). For a car, it would be of the order of 20-30 km/h. If the object is moving faster than this, then we would still be able to track its motion correctly but the object would appear to be larger than in reality due to blurring effects. This can be offset by adopting detectors which have faster acquisition times and/or perform all of the data processing described above directly on-board. In this case, the time limitations mentioned here (that are due mainly to data download times) can be reduced by factors of 10x or even 100x, thus allowing tracking of very fast moving objects.
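As a rough check of these numbers, taking a person of width about 0.5 m and an acquisition time of about 0.5 s (both illustrative values):

$$v_{\max} \approx \frac{\text{object size}}{\text{acquisition time}} \approx \frac{0.5\ \mathrm{m}}{0.5\ \mathrm{s}} = 1\ \mathrm{m/s} \approx 3.6\ \mathrm{km/h}$$

which is consistent with the quoted walking pace of about 4 km/h.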

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
EP15794620.3A 2014-10-20 2015-10-20 Viewing and tracking of target objects Withdrawn EP3210038A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1418731.4A GB201418731D0 (en) 2014-10-20 2014-10-20 Viewing and tracking of hidden objects in a scene
PCT/GB2015/053116 WO2016063028A1 (en) 2014-10-20 2015-10-20 Viewing and tracking of target objects

Publications (1)

Publication Number Publication Date
EP3210038A1 (de) 2017-08-30

Family

ID=52013380

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15794620.3A 2014-10-20 2015-10-20 Viewing and tracking of target objects

Country Status (3)

Country Link
EP (1) EP3210038A1 (de)
GB (1) GB201418731D0 (de)
WO (1) WO2016063028A1 (de)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3497477A1 2016-08-12 2019-06-19 Fastree3D SA Method and device for measuring a distance to a target in a multi-user environment by means of at least one detector
CN106546981A (zh) * 2016-10-24 2017-03-29 Fudan University Moving human body security inspection imaging system and method
CN110114246B (zh) 2016-12-07 2022-03-01 Joyson Safety Systems Acquisition LLC 3D time-of-flight active reflection sensing system and method
CN114114209A (zh) * 2017-03-01 2022-03-01 Ouster, Inc. Accurate photo detector measurements for LIDAR
US11105925B2 2017-03-01 2021-08-31 Ouster, Inc. Accurate photo detector measurements for LIDAR
CN107576969B (zh) * 2017-08-08 2019-06-28 Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences Large-scene hidden target imaging system and method based on GPU parallel computing
DE102017220774B4 (de) * 2017-11-21 2022-10-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for determining a distance to an object
DE102017222258A1 (de) * 2017-12-08 2019-06-13 Robert Bosch Gmbh Method for a LIDAR device for detecting a concealed object
US11978754B2 2018-02-13 2024-05-07 Sense Photonics, Inc. High quantum efficiency Geiger-mode avalanche diodes including high sensitivity photon mixing structures and arrays thereof
US11467286B2 * 2018-02-13 2022-10-11 Sense Photonics, Inc. Methods and systems for high-resolution long-range flash lidar
WO2020090287A1 (ja) * 2018-10-29 2020-05-07 Furuno Electric Co., Ltd. Target measuring device and target measuring method
DE112020001011B4 2019-04-04 2023-07-13 Joyson Safety Systems Acquisition Llc Detection and monitoring of active optical retroreflectors
CN111880194B (zh) * 2020-08-10 2023-11-28 University of Science and Technology of China Non-line-of-sight imaging device and method
CN112986903B (zh) * 2021-04-29 2021-10-15 The Chinese University of Hong Kong, Shenzhen Intelligent reflecting surface-assisted wireless sensing method and device
CN114578376B (zh) * 2022-05-05 2022-08-19 Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences Single-photon imaging simulation method based on ocean turbulence

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8749619B2 (en) * 2010-09-29 2014-06-10 Massachusetts Institute Of Technology Methods and apparatus for transient light imaging
US9146317B2 (en) * 2011-05-23 2015-09-29 Massachusetts Institute Of Technology Methods and apparatus for estimation of motion and size of non-line-of-sight objects
US9148649B2 (en) * 2011-10-07 2015-09-29 Massachusetts Institute Of Technology Methods and apparatus for imaging of occluded objects from scattered light

Also Published As

Publication number Publication date
WO2016063028A1 (en) 2016-04-28
GB201418731D0 (en) 2014-12-03

Similar Documents

Publication Publication Date Title
WO2016063028A1 (en) Viewing and tracking of target objects
KR102609223B1 (ko) Accurate photodetector measurements for LIDAR
US10823825B2 (en) System and method for wide-area surveillance
US20230375704A1 (en) Accurate Photo Detector Measurements For Lidar
Stilla et al. Waveform analysis for small-footprint pulsed laser systems
EP3195042B1 (de) Rechnergestützter abtastender linearmodus-ladar
Wallace et al. Full waveform analysis for long-range 3D imaging laser radar
US7834985B2 (en) Surface profile measurement
US20200341144A1 (en) Independent per-pixel integration registers for lidar measurements
US8749619B2 (en) Methods and apparatus for transient light imaging
US8514284B2 (en) Textured pattern sensing and detection, and using a charge-scavenging photodiode array for the same
US20110242285A1 (en) Imaging system and method using partial-coherence speckle interference tomography
Steinvall et al. Laser range profiling for small target recognition
Henriksson et al. Photon-counting panoramic three-dimensional imaging using a Geiger-mode avalanche photodiode array
Laurenzis et al. Non-line-of-sight active imaging of scattered photons
McDonald Jr et al. Range-gated imaging experiments using gated intensifiers
Göhler et al. Range accuracy of a gated-viewing system as a function of the number of averaged images
Henriksson et al. Panoramic single-photon counting 3D lidar
Andersen et al. Submillimeter 3-D laser radar for space shuttle tile inspection
Henderson et al. Proximity-based sensor fusion of depth cameras and isotropic rad-detectors
Gariepy et al. Tracking hidden objects with a single-photon camera
US20220091236A1 (en) Techniques for detecting and mitigating interference among multiple lidar sensors
CN114660571A (zh) 非视域目标多角度探测联合定位装置及方法
JPS58223079A (ja) Measurement and collation system for objects at a remote location
Kutteruf et al. 1541nm GmAPD LADAR system

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170413

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20171209