WO2014102442A1 - A method and apparatus for de-noising data from a distance sensing camera - Google Patents


Info

Publication number
WO2014102442A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel sensor
phase difference
amplitude
pixel
filtering
Prior art date
Application number
PCT/FI2012/051304
Other languages
French (fr)
Inventor
Mihail GEORGIEV
Atanas GOTCHEV
Original Assignee
Nokia Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to EP12890686.4A priority Critical patent/EP2939049B1/en
Priority to JP2015550122A priority patent/JP6367827B2/en
Priority to KR1020157020397A priority patent/KR101862914B1/en
Priority to US14/655,618 priority patent/US10003757B2/en
Priority to CN201280078236.6A priority patent/CN105026955B/en
Priority to PCT/FI2012/051304 priority patent/WO2014102442A1/en
Publication of WO2014102442A1 publication Critical patent/WO2014102442A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491Details of non-pulse systems
    • G01S7/493Extracting wanted echo signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Definitions

  • the present application relates to time-of-flight camera systems, and in particular the reduction of noise in the distance images from said time of flight camera systems.
  • Time-of-flight (ToF) cameras can sense distance or range to an object by emitting a modulated light signal and then measuring the time delay in the returning wavefront.
  • a ToF camera can comprise a light source such as a bank of light emitting diodes (LED) whereby a continuously modulated harmonic light signal can be emitted.
  • the distance or range of an object from the light source can be determined by measuring the shift in phase (or difference in time) between the emitted and the reflected photons of light.
  • the reflected photons can be sensed by the camera by means of a charge coupled device or the like.
  • phase shift (or phase delay) between the emitted and reflected photons is not measured directly.
  • a ToF camera system may adopt a pixel structure whereby the correlation between the received optical signal and an electrical reference source is performed in order to determine a measure of the phase delay.
  • the resulting distance (or range) map can represent the distance to objects as the relative intensity of pixels within the distance map image.
  • the distance map image can be corrupted with the effect of noise, whether it is random noise as a result of thermal noise in the charge coupled device or noise as a result of systematic errors in the measurement of the distance to the observed object.
  • the operational performance of a ToF camera system can be influenced by internal factors resulting from the operational mode of the camera and external factors caused by characteristics of the sensed scene and sensing environment.
  • internal factors which can limit the capabilities of a ToF camera system may include the physical limitations of the sensors used such as inherent noise and resolution.
  • Other internal factors which can limit the capability of a ToF camera system can include the power of the emitted signal, and the integration time for forming the reflected signal samples.
  • External factors which may limit the performance of a ToF camera system may include the angle of incidence of the illuminating light onto sensed object, the light reflectivity of colours and materials of the sensed objects, the sensing range of the ToF camera system, and the returned light signal being formed by multiple reflections.
  • a method comprising: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the method may further comprise at least one of: de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference may occur prior to combining the amplitude and phase difference; and de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude may occur prior to combining the amplitude and phase difference.
  • the filtering may further comprise filtering with a non-local spatial transform filter.
  • the non-local spatial transform filter can be a non-local means filter.
  • the method may further comprise calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
  • the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
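The combining step described above can be illustrated with a short sketch. The code below forms the complex signal parameter z = A·e^(jφ) for each pixel, applies a simple local mean filter merely as a stand-in for the de-noising filter (the embodiments use a non-local means filter), and recovers the de-noised phase difference. All function names here are illustrative, not taken from the patent.

```python
import numpy as np

def combine(amplitude, phase):
    """Combine per-pixel amplitude A and phase difference phi into
    the complex signal parameter z = A * exp(j * phi)."""
    return amplitude * np.exp(1j * phase)

def denoise_complex(z, kernel=3):
    """Placeholder de-noiser: a simple local mean over the complex
    parameter map (the embodiments use a non-local means filter)."""
    pad = kernel // 2
    zp = np.pad(z, pad, mode="edge")
    out = np.zeros_like(z)
    h, w = z.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = zp[i:i + kernel, j:j + kernel].mean()
    return out

def recover_phase(z_denoised):
    """De-noised phase difference phi_D, wrapped to [0, 2*pi)."""
    return np.angle(z_denoised) % (2 * np.pi)
```

The de-noised phase difference recovered from the filtered complex parameter can then be fed into the usual phase-to-distance calculation.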
  • the image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
  • an apparatus configured to: determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determine an amplitude of the reflected light signal received by the at least one pixel sensor; combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the apparatus may be further configured to at least one of: de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the apparatus may be configured to de-noise the phase difference prior to combining the amplitude and phase difference; and de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the apparatus may be configured to de-noise the amplitude prior to combining the amplitude and phase difference.
  • the filtering may comprise filtering with a non-local spatial transform type filter.
  • the non-local spatial transform type filter can be a non-local means filter.
  • the apparatus may be further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
  • the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
  • the image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
  • an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured with the at least one processor to cause the apparatus at least to: determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determine an amplitude of the reflected light signal received by the at least one pixel sensor; combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the at least one memory and the computer code configured with the at least one processor may be further configured to at least one of: de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the at least one memory and the computer code may also be configured with the at least one processor to de-noise the phase difference prior to combining the amplitude and phase difference; and de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the at least one memory and the computer code may also be configured with the at least one processor to de-noise the amplitude prior to combining the amplitude and phase difference.
  • the filtering may comprise filtering with a non-local spatial transform type filter.
  • the non-local spatial transform filter can be a non-local means filter.
  • the at least one memory and the computer code configured with the at least one processor may be further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
  • the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
  • the image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
  • computer program code which, when executed by a processor, realises: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the computer program code when executed by the processor may further realise at least one of: de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference may occur prior to combining the amplitude and phase difference; and de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude may occur prior to combining the amplitude and phase difference.
  • the computer program code when executed by the processor may further realise filtering with a non-local spatial transform filter.
  • the non-local spatial transform filter can be a non-local means filter.
  • the computer program code when executed by the processor may further realise calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
  • the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
  • the image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
  • Figure 1 shows schematically an apparatus suitable for employing some embodiments;
  • Figure 2 shows schematically a ToF camera system suitable for employing some embodiments;
  • Figure 3 shows schematically a flow diagram illustrating the operation of the ToF camera system of Figure 2; and
  • Figure 4 shows schematically a complex parameter representation for signal parameters of the ToF camera system of Figure 2.
  • Figure 1 shows a schematic block diagram of an exemplary electronic device or apparatus 10, which may incorporate a ToF camera system according to embodiments of the application.
  • the apparatus 10 may for example be a mobile terminal or user equipment of a wireless communication system.
  • the apparatus 10 may be an audio-video device such as a video camera, an audio recorder, or an audio player such as an MP3 recorder/player, a media recorder (also known as an MP4 recorder/player), or any computer suitable for the processing of audio signals.
  • the apparatus 10 may for example be a sub component of a larger computer system, whereby the apparatus 10 is arranged to operate with other electronic components or computer systems.
  • the apparatus 10 may be arranged as an application specific integrated circuit (ASIC) with the functionality to control and interface to a ToF camera module and also process information from the ToF camera module.
  • the apparatus 10 may be arranged as an individual module which can be configured to be integrated into a generic computer such as a personal computer or a laptop.
  • the apparatus 10 in some embodiments may comprise a ToF camera system or module 11, which can be coupled to a processor 21.
  • the processor 21 can be configured to execute various program codes.
  • the implemented program codes in some embodiments can comprise code for processing the depth map image from the ToF camera module 11. In particular the implemented program codes can facilitate noise reduction in the processed distance image.
  • the apparatus further comprises a memory 22.
  • the processor is coupled to memory 22.
  • the memory can be any suitable storage means.
  • the memory 22 comprises a program code section 23 for storing program codes implementable upon the processor 21.
  • the memory 22 can further comprise a stored data section 24 for storing data, for example data which has been retrieved from the ToF camera module 11 for subsequent processing of the distance map image as will be described later.
  • the implemented program code stored within the program code section 23, and the data stored within the stored data section 24 can be retrieved by the processor 21 whenever needed via the memory-processor coupling.
  • the apparatus 10 can comprise a user interface 15.
  • the user interface 15 can be coupled in some embodiments to the processor 21.
  • the processor can control the operation of the user interface and receive inputs from the user interface 15.
  • the user interface 15 can enable a user to input commands to the electronic device or apparatus 10, for example via a keypad, and/or to obtain information from the apparatus 10, for example via a display which is part of the user interface 15.
  • the user interface 15 can in some embodiments comprise a touch screen or touch interface capable of both enabling information to be entered to the apparatus 10 and further displaying information to the user of the apparatus 10.
  • the display which is part of the user interface 15 may be used for displaying the depth map image as such and can be a liquid crystal display (LCD), or a light emitting diode (LED) display, or an organic LED (OLED) display, or a plasma screen.
  • the user interface 15 may also comprise a pointing device, such as a mouse, a trackball, cursor direction keys, or a motion sensor, for controlling a position of a small cursor image presented on the display and for issuing commands associated with graphical elements presented on the display.
  • the apparatus further comprises a transceiver 13, the transceiver in such embodiments can be coupled to the processor and configured to enable a communication with other apparatus or electronic devices, for example via a wireless communications network.
  • the transceiver 13 or any suitable transceiver or transmitter and/or receiver means can in some embodiments be configured to communicate with other electronic devices or apparatus via a wire or wired coupling.
  • the transceiver 13 can communicate with further devices by any suitable known communications protocol, for example in some embodiments the transceiver 13 or transceiver means can use a suitable universal mobile telecommunications system (UMTS) protocol, a wireless local area network (WLAN) protocol such as for example IEEE 802.X, a suitable short-range radio frequency communication protocol such as Bluetooth, or an infrared data communication pathway (IrDA).
  • FIG. 2 shows schematically some components of the ToF camera module 11 according to some embodiments.
  • the ToF camera module 11 may be a Photonic Mixer Device (PMD) camera module.
  • the optical transmitter may be formed from an array of light emitting diodes (LEDs) configured to operate at wavelengths in the near infra-red (NIR) spectral region. Each LED may be paired with a respective optic in order to assist in the emission of light from the LED.
  • the light emitted from the optical transmitter may be amplitude modulated by an electrical reference signal from the modulator 203.
  • the reflected light signal may be received by a receiving optic 205 and channelled to the PMD sensor array 207.
  • the PMD sensor array 207 may comprise an array of pixel sensor elements whereby each pixel sensor element may be in the form of two light-sensitive photogates which are conductive and transparent to the received light.
  • Each pixel sensor may further comprise readout diodes in order to enable an electrical signal to be read via a pixel readout circuitry.
  • a function of the PMD sensor array 207 can be to correlate the received optical signal for each pixel sensor with the electrical reference signal and to send the results of the correlation (via the readout circuitry of each pixel sensor) to an analogue to digital converter 209.
  • the PMD sensor array 207 may also be arranged to receive the electrical reference signal from the oscillator 205.
  • the electrical reference signal used to perform the cross correlation functionality within the PMD sensor may be a phase variant of the original reference signal used to modulate the LED array at the optical transmitter 201 .
  • ToF systems utilise the principle of time of signal propagation in order to determine the range to an object.
  • this principle may be manifested by using continuous wave modulation whereby the phase delay between sent and received light signals corresponds to the ToF and hence the distance to the object.
  • when the transmitted light signal is continuously amplitude modulated with a particular frequency, the received reflected light signal will have the same frequency but may have a different phase and amplitude.
  • the difference in phase, or phase delay, between the transmitted light signal and received light signal can be used to determine the distance to the object.
  • This distance may be expressed as D = c·φ / (4·π·f), where D is the distance to the object, φ is the determined phase delay between received and transmitted light signals, c is the speed of light and f is the modulation frequency.
  • the modulation frequency f may be of the order of 20 MHz.
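The expression above can be checked numerically. The sketch below is a minimal helper (the name phase_to_distance is illustrative) implementing D = c·φ/(4·π·f) with the 20 MHz modulation frequency mentioned in the text.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phi, f_mod=20e6):
    """Convert a phase delay phi (radians) to a distance in metres:
    D = c * phi / (4 * pi * f)."""
    return C * phi / (4 * math.pi * f_mod)
```

For example, at 20 MHz a full 2π phase delay corresponds to roughly 7.5 m, which is the unambiguous measurement range at that modulation frequency.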
  • the phase and the amplitude of the reflected received signal may be determined by cross correlating the reflected received signal with the original modulating signal (electrical reference signal) from the modulator 203.
  • the cross correlation may be performed for each pixel sensor within the PMD sensor array 207.
  • the cross correlation function may be performed at a number of pre-selected phase positions in order to enable the calculation of the phase difference (or delay) between the transmitted light signal and the received reflected light signal.
  • embodiments of the invention may adopt a general approach to determining the cross correlation function C(τ).
  • the general approach may calculate the cross correlation function C(τ) for a number of different phase positions which is greater than four.
  • the cross correlation function C(τ) between the received reflected light signal s(t) and the modulating signal g(t) can be expressed as C(τ) = K + A·cos(ω·τ + φ), where A denotes the modulation amplitude, φ denotes the phase of the received reflected signal, K denotes the modulation offset, and ω is the angular modulation frequency.
  • the output from the PMD device, in other words the cross correlation function C(τ) for each pixel sensor, may then be channelled to the analogue to digital converter 209.
  • the analogue to digital (A/D) converter 209 may convert each cross correlation function signal from an analogue signal to a digital signal in order to enable further processing of the signal. It is to be appreciated that the A/D converter 209 may convert the cross correlation function signal for each phase position τ_n on a pixel wise basis.
  • the digital output from the analogue to digital converter 209 may then be passed to the signal parameter determiner 211.
  • the signal parameter determiner 211 can determine the parameters required to form the distance (or range) map of the illuminated scene on a pixel by pixel basis.
  • the parameters determined by the signal parameter determiner 211 may be the phase delay φ between the modulated signal g(t) and the reflected received signal s(t), the reflected received signal amplitude A, and the modulation offset K.
  • the phase difference φ may be determined by using the following expression: φ = arctan( Σ_{n=0..N−1} C_n·sin(2πn/N) / Σ_{n=0..N−1} C_n·cos(2πn/N) )
  • the amplitude A of the received reflected optical signal may be determined from A = (2/N)·√( (Σ_{n=0..N−1} C_n·sin(2πn/N))² + (Σ_{n=0..N−1} C_n·cos(2πn/N))² )
  • the modulation offset K may be determined from K = (1/N)·Σ_{n=0..N−1} C_n
  • N signifies the number of phase positions over which the cross correlation function is determined, and C_n represents the cross correlation function for each phase position τ_n.
  • in embodiments using four equally spaced phase positions, the phase difference φ may be determined by using the following expression: φ = arctan( (C_3 − C_1) / (C_0 − C_2) )
  • the amplitude A of the received reflected optical signal may be determined from A = √( (C_3 − C_1)² + (C_0 − C_2)² ) / 2
  • the modulation offset K may be determined as in equation (8), in other words K = (C_0 + C_1 + C_2 + C_3) / 4
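For the four-sample case, the parameter recovery can be sketched as follows. This is an illustrative implementation of the four-phase formulas above (not code from the patent), assuming correlation samples taken at phase positions of 0°, 90°, 180° and 270°.

```python
import math

def extract_parameters(c0, c1, c2, c3):
    """Recover phase delay phi, amplitude A and modulation offset K
    from four correlation samples C_0..C_3 taken 90 degrees apart."""
    # atan2 handles all four quadrants; wrap the result to [0, 2*pi)
    phi = math.atan2(c3 - c1, c0 - c2) % (2 * math.pi)
    a = math.sqrt((c3 - c1) ** 2 + (c0 - c2) ** 2) / 2
    k = (c0 + c1 + c2 + c3) / 4
    return phi, a, k
```

With the model C_n = K + A·cos(φ + n·π/2), these expressions recover the original φ, A and K exactly in the noise-free case.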
  • embodiments may comprise means for determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by a pixel light sensor of an array of pixels in an image sensor of the time of flight camera system, wherein the reflected light signal received by the pixel light sensor is reflected from an object illuminated by the transmitted light signal.
  • embodiments may also comprise means for determining the amplitude of the reflected light signal received by the pixel light sensor. It is to be further appreciated that the above parameters can be determined for each pixel in turn.
  • the term signal parameter map may be used to denote collectively the parameters φ, A and K for all pixels of the map image.
  • the output of the signal parameter determiner 211 may be connected to the input of the de-noising processor 213.
  • the de-noising processor 213 may be arranged to provide a filtering framework for de-noising the parameters of the signal parameter map, in other words the filtering framework provides for the de-noising of the parameters φ, A and K associated with each pixel.
  • in Figure 3 there is shown a flow diagram depicting at least in part some of the operations of the signal parameter determiner 211 and the de-noising processor 213 according to embodiments of the invention.
  • de-noising of the signal parameters φ, A and K may be performed by adopting the technique of Non-Local spatial transform filtering.
  • the essence of such an approach is to centre the pixel which is to be filtered within a window of neighbouring pixels.
  • the window of pixels may also be known in the art as a patch, and the patch containing the pixel to be filtered is known as the reference patch.
  • the image is then scanned for other patches which closely resemble the patch containing the pixel to be filtered (in other words the reference patch).
  • De-noising of the pixel may then be performed by determining the average pixel value over all pixel values within the image whose patches are deemed to be similar to the reference patch.
  • the similarity between the reference patch and a further patch within the image may be quantified by utilizing a Euclidean based distance metric.
  • Non-Local spatial transform filtering may be performed on the signal parameters of each pixel on a pixel wise basis.
  • a type of Non-Local spatial transform filtering known as Non-Local means filtering may be used to filter the signal parameters associated with each pixel.
  • Non-Local means filtering may be expressed by NL(x) = (1/N_n)·Σ_{y∈Ω} exp( −(G_a ∗ |u(x + ·) − u(y + ·)|²)(0) / h² )·u(y), where N_n is a filter normaliser, G_a is a Gaussian kernel, Ω denotes the range over which the image is searched, u denotes the values attached to the pixel at a position, x is the index of the pixel being filtered and is the centre pixel of the reference patch, y is the index of the pixel at the centre of a further patch (a patch which can be similar to the reference patch), h can be a filter parameter tuned in relation with the noise variance, (∗ ·)(0) can denote a centred convolution operator evaluated at zero, and (+ ·) can denote the pixel indices around the centre pixel of a corresponding patch.
  • the signal parameters φ and A may each be individually filtered by using the above Non-Local means filter in order to produce de-noised signal parameters φ_D and A_D.
  • De-noising the parameters ⁇ and A may be performed for each pixel position in turn.
  • the above Non-Local Means Filter may be applied to each signal parameter φ and A in turn, where the variable u in the above equation can represent either the parameter φ or A. It is to be understood that the output from the Non-Local Means Filter NL(x) is the de-noised parameter φ_D or A_D for a pixel position x.
  • the output from the Non-Local Means Filter NL(x) when the input variable u represents the phase difference signal parameter φ at pixel position x is the de-noised phase delay signal parameter φ_D.
  • the output from the Non-Local Means Filter NL(x) when the input variable u represents the amplitude signal parameter A is the de-noised amplitude parameter A_D.
  • the step of individually processing the signal parameters ⁇ and A with the above Non-Local Means Filter to produce the individual de-noised parameters ⁇ ⁇ and A D may be viewed as an optional pre-filtering step.
  • the step of individually processing the signal parameters ⁇ and A may not be performed in some modes of operation.
  • the decision to perform the above pre-filtering step may be configured as a mode of operation of the de-noising processor 213.
  • Figure 3 depicts the option of including the above pre-filtering step into the mode of operation of the de-noising processor as the decision step 303.
  • the above pre-filtering step can have the advantage of reducing the effects of objects whose surfaces may have poor light reflectivity; or objects being illuminated by a light signal with a small angle of incidence; or the effect of multipath reflection in the returned light signal.
  • the aforementioned effects can result in a wrapping of the phase delay φ, whereby objects at a range near the limit imposed by the wavelength of the incident light may have the phase delay of the reflected light wrapped into the next waveform period. This wrapping effect may happen when noise influences the value of the phase delay φ while the phase delay φ is near the wrapping boundary.
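The wrapping boundary can be illustrated numerically as follows; the 20 MHz modulation frequency is an assumed example value, not taken from the description:

```python
import numpy as np

f_mod = 20e6                     # assumed example modulation frequency (Hz)
c = 299_792_458.0                # speed of light (m/s)
d_max = c / (2 * f_mod)          # unambiguous range, about 7.5 m

true_phase = 6.2                 # radians, just below the 2*pi wrap boundary
noise = 0.2                      # a small noise term pushes the phase over the boundary
measured = (true_phase + noise) % (2 * np.pi)
# measured is now about 0.12 rad: the pixel appears close to the camera
# even though the object is near the far end of the unambiguous range
```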
  • Pre-filtering by de-noising the signal parameters A and φ on an individual basis may have the effect of improving the confidence of the similarity weights for searched patches during the operation of subsequent de-noising filter stages.
  • pre-filtering by the de-noising filter NL(x) may be performed on just the amplitude signal parameter A or just the phase delay signal parameter φ. This has the effect of reducing the computational complexity of the pre-filtering stage when compared to applying the de-noising filter individually to each signal parameter in turn.
  • the de-noising filter NL(x) may be applied solely to the amplitude signal parameter A to give the de-noised amplitude signal parameter A D . This can have the advantage in embodiments of improving the de-noising and edge preservation in the eventual distance map, however the effects due to structural artefacts such as those listed above may still be retained.
  • the de-noising filter NL(x) may be applied solely to the phase delay signal parameter ⁇ . This can have the advantage of suppressing artefacts such as those listed above however edge preservation may not be so pronounced in the eventual distance image.
  • embodiments may comprise at least one of: means for de-noising a phase difference between a reflected light signal received by a pixel light sensor and a transmitted light signal, the de-noising of the phase difference being a pre-filtering step; and means for de-noising the amplitude of the reflected light signal received by the pixel light sensor, the de-noising of the amplitude also being a pre-filtering step.
  • the step of performing de-noising for each signal parameter φ and A of each pixel is shown as processing step 305 in Figure 3.
  • the phase difference ⁇ and amplitude A of the received reflected light signal for each pixel may be combined into a single combined parameter.
  • phase difference ⁇ and amplitude A of the received reflected light signal for each pixel sensor may be combined into a single complex parameter Z.
  • the complex signal parameter for each pixel may be expressed as Z = A·e^(jφ).
  • graph 401 depicts the amplitude A for the signal parameter map
  • graph 403 depicts the phase difference ⁇ for the same signal parameter map.
  • graph 405 depicts the complex signal parameter Z for each pixel position for the signal parameter map. It can be seen from graph 405 that the complex signal parameter for each pixel position is a vector bearing information relating to both the amplitude A and the phase difference φ of the received reflected signal. In other words, embodiments may comprise means for combining the amplitude and phase difference for each pixel light sensor into a combined signal parameter for each pixel light sensor. The step of determining the complex signal parameter for each pixel is shown as processing step 307 in Figure 3.
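A minimal sketch of this combination, using NumPy's complex arithmetic on small made-up example maps, might read:

```python
import numpy as np

# made-up per-pixel maps of amplitude A and phase difference phi (radians)
A = np.array([[1.0, 2.0],
              [0.5, 1.5]])
phi = np.array([[0.2, 1.0],
                [2.5, 0.7]])

# combine into the complex signal parameter Z bearing both quantities
Z = A * np.exp(1j * phi)

# amplitude and phase are recovered as the magnitude and argument of Z
A_back = np.abs(Z)
phi_back = np.angle(Z)
```

The round trip is exact (up to floating-point precision) as long as the phase values lie within one period.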
  • the de-noising filter may be applied to the complex signal Z for each pixel within the signal parameter map.
  • the complex signal parameter Z for each pixel may be de-noised by applying the above Non-Local Means Filter.
  • the above Non-local Means Filter may be modified in terms of the complex signal parameter Z.
  • the Non-Local Means Filter may be expressed in terms of Z as NL_cmplx(x) = (1/N_n) ∫_Ω e^(−(G_a ∗ |Z(x+.) − Z(y+.)|²)(0)/h²) Z(y) dy.
  • the output from the above Non-Local Means Filter NL_cmplx(x) may be the de-noised complex signal parameter Z_DeN at the pixel position x.
  • embodiments may comprise means for de-noising the combined signal parameter for a pixel light sensor by filtering the combined parameter.
  • the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the pixel light sensor.
  • the step of de-noising the complex signal parameter for each pixel is shown as processing step 309 in Figure 3.
  • the de-noised signal parameters A_DeN and φ_DeN for each pixel position x may be obtained from the de-noised complex signal parameter Z_DeN.
  • the de-noised amplitude A_DeN and phase difference φ_DeN can be obtained from A_DeN = |Z_DeN| and φ_DeN = arg(Z_DeN).
  • the step of obtaining the de-noised signal parameters A_DeN and φ_DeN for each pixel at position x from the de-noised complex signal parameter Z_DeN is shown as processing step 311 in Figure 3.
  • the processing steps 303 to 311 may be repeated for a number of iterations.
  • the de-noised signal parameters A_DeN and φ_DeN for each pixel at position x may form the input to a further iteration of processing step 305, whereby the signal parameters A_DeN and φ_DeN may then be individually applied to the above Non-Local means filter NL(x).
  • Further de-noised signal parameters, where the signal parameters A_DeN and φ_DeN form the input to the Non-Local means filter NL(x), may be denoted as A_D2 and φ_D2, where "2" denotes a second or further iteration.
  • the output of the Non-Local means filter NL(x), that is the individually de-noised amplitude and phase difference signal parameters A_D2 and φ_D2 for each pixel position x, may then be combined into a single combined parameter.
  • the single combined parameter may be a single complex parameter Z_2.
  • the single complex parameter Z_2 may then form the input of the complex modified form of the Non-Local Means Filter NL_cmplx(x).
  • the output of the complex modified form of the Non-Local Means Filter NL_cmplx(x) may then be a further de-noised complex parameter Z_DeN2 for a pixel position x.
  • the further de-noised signal parameters may then be obtained as A_DeN2 = |Z_DeN2| and φ_DeN2 = arg(Z_DeN2).
  • embodiments may comprise means for determining the de-noised phase difference for a pixel light sensor from the de-noised combined signal parameter.
  • The step of determining whether to iterate processing steps 303 to 311 is shown as decision step 313 in Figure 3.
  • parameters of the Non-Local spatial transform filter may be adjusted for each processing loop.
  • the filter parameter h in the Non-local means filter can be adapted for each iteration of the processing loop.
  • the step of adjusting the Non-Local spatial transform filter parameter for further iterations of the processing loop is depicted in Figure 3 as the processing step 315 in the return branch 313a.
  • the loop back branch 313a returns to the decision step 303. Accordingly, the decision of whether to include the pre-filtering step 305 may be taken for each iteration of the processing loop.
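The processing loop of Figure 3 can be sketched as follows. The 3x3 box-average `smooth` is a hypothetical stand-in for the Non-Local Means filters (a real implementation would use the patch-similarity weighting described above), and the per-iteration adaptation of the filter parameter h (step 315) is omitted from this stand-in:

```python
import numpy as np

def smooth(u):
    """Hypothetical stand-in for a Non-Local Means filter: a 3x3 box
    average with edge padding. A real implementation would weight
    pixels by patch similarity as described in the text."""
    p = np.pad(u, 1, mode='edge')
    h, w = u.shape
    return sum(p[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0

def iterative_denoise(A, phi, iterations=2, pre_filter=True):
    """Sketch of the processing loop of Figure 3 (steps 303 to 311)."""
    for _ in range(iterations):
        if pre_filter:                    # decision step 303 / pre-filter step 305
            A, phi = smooth(A), smooth(phi)
        Z = A * np.exp(1j * phi)          # combine into complex parameter (step 307)
        Z = smooth(Z)                     # complex-domain filtering (step 309)
        A, phi = np.abs(Z), np.angle(Z)   # recover amplitude and phase (step 311)
    return A, phi
```

On a constant map the loop is an identity; on noisy data each pass smooths the complex parameter before amplitude and phase are recovered for the next iteration.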
  • the de-noising processor 213 may then determine the distance value D to the illuminated object for each pixel position.
  • the distance value D may be determined by taking the de-noised phase difference signal parameter φ_DeN as provided by processing step 311 and applying the value to equation (1) above.
  • the distance value D may be determined on a pixel wise basis in accordance with the phase delay for each pixel position within the map.
  • embodiments may comprise means for calculating the distance range to an object for a pixel light sensor using the de-noised phase delay calculated for the pixel light sensor.
  • The step of determining the distance value D to the illuminated object for each pixel is shown as processing step 317 in Figure 3.
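Equation (1) is not reproduced in this excerpt; assuming it is the standard continuous-wave ToF relation D = c·φ/(4π·f_mod), the per-pixel distance computation of step 317 might be sketched as:

```python
import numpy as np

C = 299_792_458.0   # speed of light in m/s

def distance_from_phase(phi_denoised, f_mod=20e6):
    """Per-pixel distance from the de-noised phase difference phi_DeN.

    Assumes the standard CW-ToF relation D = c*phi/(4*pi*f_mod); the
    20 MHz modulation frequency is an assumed example value, not taken
    from the description."""
    return C * np.asarray(phi_denoised) / (4.0 * np.pi * f_mod)
```

A full phase turn of 2π then maps to the unambiguous range c/(2·f_mod), about 7.5 m at the assumed 20 MHz.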
  • the output from the ToF camera system 11 may be the distance value D for each pixel within the PMD sensor array 207.
  • the distance value D for each pixel may then be passed to the processor 21 for further processing.
  • the distance values D for each pixel position may be displayed as a distance or depth map image in the UI 15.
  • de-noising has been described in terms of the Non-Local Means Filter.
  • other embodiments may use other forms of de-noising such as Gaussian smoothing, Bi(Multi)-lateral filtering, wavelet shrinkage, sliding local transform filtering (e.g. sliding-DCT) and Block-Matching 3D collaborative transform-domain filtering (BM3D).
  • other embodiments may adopt other ToF-style sensing systems such as radars, sonars and lidars.
  • embodiments may adopt different distance measuring systems, where the measurement data can be interpreted in the complex domain, such as structured-light based systems, systems based on structure-from-stereo, and thermographic cameras.
  • the various embodiments of the invention may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto.
  • While various aspects of the invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • At least some embodiments may be an apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the embodiments of this invention may be implemented by computer software executable by a data processor of the mobile device, such as in the processor entity, or by hardware, or by a combination of software and hardware. Further in this regard it should be noted that any blocks of the logic flow as in the Figures may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions.
  • an embodiment may be a non-transitory computer-readable storage medium having stored thereon computer readable instructions which, when executed by a computing apparatus, cause the computing apparatus to perform a method comprising: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
  • the memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, in other words a non-transitory computer-readable storage medium.
  • the data processors may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASIC), gate level circuits and processors based on multi-core processor architecture, as non-limiting examples.
  • Embodiments of the inventions may be practiced in various components such as integrated circuit modules.
  • the design of integrated circuits is by and large a highly automated process.
  • Complex and powerful software tools are available for converting a logic level design into a semiconductor circuit design ready to be etched and formed on a semiconductor substrate.
  • Programs such as those provided by Synopsys, Inc. of Mountain View, California and Cadence Design, of San Jose, California automatically route conductors and locate components on a semiconductor chip using well established rules of design as well as libraries of pre-stored design modules.
  • the resultant design in a standardized electronic format (e.g., Opus, GDSII, or the like) may be transmitted to a semiconductor fabrication facility or "fab" for fabrication.
  • As used in this application, the term 'circuitry' refers to all of the following:
  • (a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry);
  • (b) combinations of circuits and software (and/or firmware), such as: (i) a combination of processor(s) or (ii) portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and
  • (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term in this application, including any claims.
  • the term 'circuitry' would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware.
  • the term 'circuitry' would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone, or a similar integrated circuit in a server, a cellular network device, or other network device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

It is inter alia disclosed to determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal (301); determine an amplitude of the reflected light signal received by the at least one pixel sensor (301); combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor (307); and de-noise the combined signal parameter for the at least one pixel sensor by filtering with a filter the combined parameter for the at least one pixel sensor (309).

Description

A Method and Apparatus for de-noising data from a distance sensing camera
Field of the Application

The present application relates to time-of-flight camera systems, and in particular the reduction of noise in the distance images from said time-of-flight camera systems.
Background of the Application
Time-of-flight (ToF) cameras can sense distance or range to an object by emitting a pulse of modulated light and then measuring the time differential in the returning wave front. A ToF camera can comprise a light source such as a bank of light emitting diodes (LED) whereby a continuously modulated harmonic light signal can be emitted. The distance or range of an object from the light source can be determined by measuring the shift in phase (or difference in time) between the emitted and the reflected photons of light. The reflected photons can be sensed by the camera by means of a charge coupled device or the like.
The phase shift (or phase delay) between the emitted and reflected photons is not measured directly. Instead a ToF camera system may adopt a pixel structure whereby the correlation between the received optical signal and an electrical reference source is performed in order to determine a measure of the phase delay.
The resulting distance (or range) map can represent the distance to objects as the relative intensity of pixels within the distance map image. However, the distance map image can be corrupted with the effect of noise, whether it is random noise as a result of thermal noise in the charge coupled device or noise as a result of systematic errors in the measurement of the distance to the observed object. In particular the operational performance of a ToF camera system can be influenced by internal factors resulting from the operational mode of the camera and external factors caused by characteristics of the sensed scene and sensing environment.
For example, internal factors which can limit the capabilities of a ToF camera system may include the physical limitations of the sensors used such as inherent noise and resolution. Other internal factors which can limit the capability of a ToF camera system can include the power of the emitted signal, and the integration time for forming the reflected signal samples.
External factors which may limit the performance of a ToF camera system may include the angle of incidence of the illuminating light onto sensed object, the light reflectivity of colours and materials of the sensed objects, the sensing range of the ToF camera system, and the returned light signal being formed by multiple reflections.
These factors can seriously impact the precision of distance (or range) measurements and the operational efficiency of a ToF camera system. In particular, for low-power ToF system devices the effect of noise can limit the distance sensing capability.
Summary of Some Embodiments

The following embodiments aim to address the above problem.
There is provided according to an aspect of the application a method comprising: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The method may further comprise at least one of: de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference may occur prior to combining the amplitude and phase difference; and de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude may occur prior to combining the amplitude and phase difference. The filtering may further comprise filtering with a non-local spatial transform filter.
The non-local spatial transform filter can be a non-local means filter.
The method may further comprise calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
The combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
The image sensor of the time of flight camera system may be based at least in part on a photonic mixer device. According to a further aspect of the application there is provided an apparatus configured to: determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determine an amplitude of the reflected light signal received by the at least one pixel sensor; combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The apparatus may be further configured to at least one of: de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the apparatus may be configured to de-noise the phase difference prior to combining the amplitude and phase difference; and de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the apparatus may be configured to de-noise the amplitude prior to combining the amplitude and phase difference.
The filtering may comprise filtering with a non-local spatial transform type filter. The non-local spatial transform type filter can be a non-local means filter. The apparatus may be further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor. The combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
The image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
According to another aspect of the application there is provided an apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured with the at least one processor to cause the apparatus at least to: determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determine an amplitude of the reflected light signal received by the at least one pixel sensor; combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The at least one memory and the computer code configured with the at least one processor may be further configured to at least one of: de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the at least one memory and the computer code may also be configured with the at least one processor to de-noise the phase difference prior to combining the amplitude and phase difference; and de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the at least one memory and the computer code may also be configured with the at least one processor to de-noise the amplitude prior to combining the amplitude and phase difference. The filtering may comprise filtering with a non-local spatial transform type filter.
The non-local spatial transform filter can be a non-local means filter. The at least one memory and the computer code configured with the at least one processor may be further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor. The combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor. The image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
According to another aspect of the application there is provided computer program code which, when executed by a processor, realises: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The computer program code when executed by the processor may further realise at least one of: de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference may occur prior to combining the amplitude and phase difference; and de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude may occur prior to combining the amplitude and phase difference.
Where the computer program code when executed by the processor realises filtering, the computer program code may further realise filtering with a non-local spatial transform filter.
The non-local spatial transform filter can be a non-local means filter.
The computer program code when executed by the processor may further realise calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by: determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
The combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
The image sensor of the time of flight camera system may be based at least in part on a photonic mixer device.
Brief Description of Drawings
For better understanding of the present invention, reference will now be made by way of example to the accompanying drawings in which:

Figure 1 shows schematically an apparatus suitable for employing some embodiments;
Figure 2 shows schematically a ToF camera system suitable for employing some embodiments;
Figure 3 shows schematically a flow diagram illustrating the operation of the ToF camera system of Figure 2; and
Figure 4 shows schematically a complex parameter representation for signal parameters of the ToF camera system of Figure 2.

Description of Some Embodiments of the Application
The following describes in more detail possible ToF camera systems with the provision for de-noising the distance (or range) map. In this regard reference is first made to Figure 1 which shows a schematic block diagram of an exemplary electronic device or apparatus 10, which may incorporate a ToF camera system according to embodiments of the application.
The apparatus 10 may for example be a mobile terminal or user equipment of a wireless communication system. In other embodiments the apparatus 10 may be an audio-video device such as a video camera, an audio recorder or audio player such as an mp3 recorder/player, a media recorder (also known as an mp4 recorder/player), or any computer suitable for the processing of audio signals.
In other embodiments the apparatus 10 may for example be a sub-component of a larger computer system, whereby the apparatus 10 is arranged to operate with other electronic components or computer systems. In such embodiments the apparatus 10 may be arranged as an application specific integrated circuit (ASIC) with the functionality to control and interface to a ToF camera module and also process information from the ToF camera module. In some embodiments the apparatus 10 may be arranged as an individual module which can be configured to be integrated into a generic computer such as a personal computer or a laptop. The apparatus 10 in some embodiments may comprise a ToF camera system or module 11, which can be coupled to a processor 21. The processor 21 can be configured to execute various program codes. The implemented program codes in some embodiments can comprise code for processing the depth map image from the ToF camera module 11. In particular the implemented program codes can facilitate noise reduction in the processed distance image.
In some embodiments the apparatus further comprises a memory 22. In some embodiments the processor is coupled to the memory 22. The memory can be any suitable storage means. In some embodiments the memory 22 comprises a program code section 23 for storing program codes implementable upon the processor 21. Furthermore in some embodiments the memory 22 can further comprise a stored data section 24 for storing data, for example data which has been retrieved from the ToF camera module 11 for subsequent processing of the distance map image as will be described later. The implemented program code stored within the program code section 23, and the data stored within the stored data section 24, can be retrieved by the processor 21 whenever needed via the memory-processor coupling.
In some further embodiments the apparatus 10 can comprise a user interface 15. The user interface 15 can be coupled in some embodiments to the processor 21 . In some embodiments the processor can control the operation of the user interface and receive inputs from the user interface 15. In some embodiments the user interface 15 can enable a user to input commands to the electronic device or apparatus 10, for example via a keypad, and/or to obtain information from the apparatus 10, for example via a display which is part of the user interface 15. The user interface 15 can in some embodiments comprise a touch screen or touch interface capable of both enabling information to be entered to the apparatus 10 and further displaying information to the user of the apparatus 10.
The display which is part of the user interface 15 may be used for displaying the depth map image as such and can be a liquid crystal display (LCD), or a light emitting diode (LED) display, or an organic LED (OLED) display, or a plasma screen. Furthermore, the user interface 15 may also comprise a pointing device, such as a mouse, a trackball, cursor direction keys, or a motion sensor, for controlling a position of a small cursor image presented on the display and for issuing commands associated with graphical elements presented on the display.
In some embodiments the apparatus further comprises a transceiver 13; the transceiver in such embodiments can be coupled to the processor and configured to enable communication with other apparatus or electronic devices, for example via a wireless communications network. The transceiver 13 or any suitable transceiver or transmitter and/or receiver means can in some embodiments be configured to communicate with other electronic devices or apparatus via a wire or wired coupling. The transceiver 13 can communicate with further devices by any suitable known communications protocol; for example in some embodiments the transceiver 13 or transceiver means can use a suitable universal mobile telecommunications system (UMTS) protocol, a wireless local area network (WLAN) protocol such as for example IEEE 802.X, a suitable short-range radio frequency communication protocol such as Bluetooth, or an infrared data communication pathway (IrDA).
It is to be understood again that the structure of the apparatus 10 could be supplemented and varied in many ways. It would be appreciated that the schematic structures described in Figures 2 and 4 and the flow diagram depicted in Figure 3 represent only a part of the operation of the ToF camera system as exemplarily shown implemented in the apparatus shown in Figure 1.
Figure 2 shows schematically some components of the ToF camera module 11 according to some embodiments.
In some embodiments the ToF camera module 11 may be a Photonic Mixer Device (PMD) camera module. In such embodiments there may be an optical transmitter 201 which may be arranged to emit a cone of modulated light, thereby illuminating a scene for distance detection. In embodiments the optical transmitter may be formed from an array of light emitting diodes (LEDs) configured to operate at wavelengths in the near infra-red (NIR) spectral region. Each LED may be married up with a respective optic in order to assist in the emission of light from the LED.
The light emitted from the optical transmitter may be amplitude modulated by an electrical reference signal from the modulator 203.
The reflected light signal may be received by a receiving optic 205 and channelled to the PMD sensor array 207. The PMD sensor array 207 may comprise an array of pixel sensor elements whereby each pixel sensor element may be in the form of two light-sensitive photo gates which are conductive and transparent to the received light. Each pixel sensor may further comprise readout diodes in order to enable an electrical signal to be read via a pixel readout circuitry.
A function of the PMD sensor array 207 can be to correlate the received optical signal for each pixel sensor with the electrical reference signal and to send the results of the correlation (via the readout circuitry of each pixel sensor) to an analogue to digital converter 209. In order to enable the correlation within each pixel sensor, the PMD sensor array 207 may also be arranged to receive the electrical reference signal from the modulator 203. In embodiments the electrical reference signal used to perform the cross correlation functionality within the PMD sensor may be a phase variant of the original reference signal used to modulate the LED array at the optical transmitter 201.
It is to be appreciated that ToF systems utilise the principle of time of signal propagation in order to determine the range to an object. For PMD type ToF systems this principle may be manifested by using continuous wave modulation whereby the phase delay between sent and received light signals corresponds to the ToF and hence the distance to the object. When the transmitted light signal is continuously amplitude modulated with a particular frequency the received reflected light signal will have the same frequency but may have a different phase and amplitude. The difference in phase, or phase delay, between the transmitted light signal and received light signal can be used to determine the distance to the object. This distance may be expressed by the following expression
D = (c · φ) / (4π · f)  (1)

where D is the distance to the object, φ is the determined phase delay between received and transmitted light signals, c is the speed of light and f is the modulation frequency. In embodiments the modulation frequency f may be of the order of 20 MHz.
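As an illustrative sketch of equation (1), the distance can be evaluated directly from the phase delay; the function name and the 20 MHz default are assumptions for the example, not values prescribed by the embodiments:

```python
import math

C_LIGHT = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phi: float, f_mod: float = 20e6) -> float:
    """Equation (1): D = c * phi / (4 * pi * f), for a phase delay phi in
    radians and a modulation frequency f_mod in Hz."""
    return C_LIGHT * phi / (4.0 * math.pi * f_mod)

# The unambiguous range at 20 MHz is c / (2 * f_mod), about 7.5 m,
# reached as phi approaches 2*pi.
half_range = distance_from_phase(math.pi)  # about 3.75 m
```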
The phase and the amplitude of the reflected received signal may be determined by cross correlating the reflected received signal with the original modulating signal (electrical reference signal) from the modulator 203. As mentioned above the cross correlation may be performed for each pixel sensor within the PMD sensor array 207. In embodiments the cross correlation function may be performed at a number of pre-selected phase positions in order to enable the calculation of the phase difference (or delay) between the transmitted light signal and the received reflected light signal.
In a first group of embodiments the cross correlation function C(τ) between the received reflected light signal s(t) and the modulating signal g(t) may be calculated for four different phase delays, for example at τ0 = 0°, τ1 = 90°, τ2 = 180°, τ3 = 270°.
It is to be understood that other embodiments may adopt a general approach to determining the cross correlation function C(T) . The general approach may calculate the cross correlation function C(T) for a number of different phase positions which is greater than four.
The cross correlation function C(T) between the received reflected light signal s(t) and the modulating signal g(t) can be expressed as
C(τ) = s(t) ⊗ g(t)  (2)
In some embodiments the received reflected light signal s(t) can be expressed in the form of

s(t) = 1 + A cos(ωt − φ),  (3)

and the modulating signal can be expressed in the form of

g(t) = cos(ωt),  (4)

where A denotes the modulation amplitude and φ denotes the phase of the received reflected signal. In some embodiments the cross correlated signal for the four different phase delays τ0 = 0°, τ1 = 90°, τ2 = 180°, τ3 = 270° can be simplified to

C(τ) = K + A cos(φ + τ),  (5)

where K is a modulation offset. It is to be understood that the cross correlation function C(τ) can be determined at a number of different equally spaced phase positions τ by the PMD, that is τ = τ0, τ1, ..., τN−1. In one group of embodiments the cross correlation function C(τ) can be calculated for the following phase positions: τ0 = 0°, τ1 = 90°, τ2 = 180°, τ3 = 270°.
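The simplified correlation model of equation (5) can be sampled at the N equally spaced phase positions; the function name and the example values below are illustrative assumptions:

```python
import math

def correlation_samples(A, phi, K, N=4):
    """Equation (5): C(tau_n) = K + A*cos(phi + tau_n), sampled at the
    equally spaced phase positions tau_n = 2*pi*n/N (i.e. 0, 90, 180 and
    270 degrees when N = 4)."""
    return [K + A * math.cos(phi + 2.0 * math.pi * n / N) for n in range(N)]

# Four correlation samples for an example amplitude, phase and offset
samples = correlation_samples(A=1.0, phi=math.pi / 3, K=2.0)
```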
The output from the PMD device, in other words the cross correlation function C(τ) for each pixel sensor, may then be channelled to the analogue to digital converter 209. The analogue to digital (A/D) converter 209 may convert each cross correlation function signal from an analogue signal to a digital signal in order to enable further processing of the signal. It is to be appreciated that the A/D converter 209 may convert the cross correlation function signal for each phase position τ on a pixel wise basis.
The digital output from the analogue to digital converter 209 may then be passed to the signal parameter determiner 211.
The signal parameter determiner 211 can determine the parameters required to form the distance (or range) map of the illuminated scene on a pixel by pixel basis.
The parameters determined by the signal parameter determiner 211 may be the phase delay φ between the modulated signal g(t) and the reflected received signal s(t), the reflected received signal amplitude A, and the modulation offset K.
In embodiments the phase difference φ may be determined by using the following expression

φ = arctan( −∑n=0…N−1 Cn sin(2πn/N) / ∑n=0…N−1 Cn cos(2πn/N) )  (6)
The amplitude A of the received reflected optical signal may be determined from

A = (2/N) · √( (∑n=0…N−1 Cn sin(2πn/N))² + (∑n=0…N−1 Cn cos(2πn/N))² )  (7)
The modulation offset K may be determined from

K = (1/N) ∑n=0…N−1 Cn  (8)
It is to be appreciated in the above expressions that N signifies the number of phase positions over which the cross correlation function is determined, and Cn represents the cross correlation function for each phase position τn. In the first group of embodiments N is determined to be four in order to account for the four phase positions τ0 = 0°, τ1 = 90°, τ2 = 180°, τ3 = 270°. For this specific embodiment, the phase difference φ may be determined by using the following expression

φ = arctan( (C(τ3) − C(τ1)) / (C(τ0) − C(τ2)) )  (9)
The amplitude A of the received reflected optical signal may be determined from

A = √( (C(τ3) − C(τ1))² + (C(τ0) − C(τ2))² ) / 2  (10)
The modulation offset K may be determined as in equation (8).
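For the four-phase case, equations (8) to (10) can be checked with a short round-trip sketch against the model C(τ) = K + A·cos(φ + τ); the function name and the sample values are illustrative assumptions:

```python
import math

def recover_parameters(C0, C1, C2, C3):
    """Recover phase delay, amplitude and offset from the four correlation
    samples at tau = 0, 90, 180 and 270 degrees (equations 9, 10 and 8)."""
    phi = math.atan2(C3 - C1, C0 - C2)                    # equation (9)
    A = math.sqrt((C3 - C1) ** 2 + (C0 - C2) ** 2) / 2.0  # equation (10)
    K = (C0 + C1 + C2 + C3) / 4.0                         # equation (8)
    return phi, A, K

# Round trip: synthesise samples, then recover the generating parameters
A0, phi0, K0 = 0.8, 0.6, 1.5
C = [K0 + A0 * math.cos(phi0 + math.pi * n / 2) for n in range(4)]
phi, A, K = recover_parameters(*C)  # recovers (0.6, 0.8, 1.5)
```

Using `atan2` rather than a plain arctangent keeps the phase in the correct quadrant for all sign combinations of the two differences.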
In other words there may be in embodiments means for determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by a pixel light sensor of an array of pixels in an image sensor of the time of flight camera system, wherein the reflected light signal received by the pixel light sensor is reflected from an object illuminated by the transmitted light signal. Furthermore, embodiments may also comprise means for determining the amplitude of the reflected light signal received by the pixel light sensor. It is to be further appreciated that the above parameters can be determined for each pixel in turn.
It is to be understood herein that the term signal parameter map may be used to denote collectively the parameters φ, A and K for all pixels of the map image.
The output of the signal parameter determiner 211, in other words the signal parameter map, may be connected to the input of the de-noising processor 213.
The de-noising processor 213 may be arranged to provide a filtering framework for de-noising the parameters of the signal parameter map, in other words the filtering framework provides for the de-noising of the parameters φ, A and K associated with each pixel.
With reference to Figure 3 there is shown a flow diagram depicting at least in part some of the operations of the signal parameter determiner 211 and the de-noising processor 213 according to embodiments of the invention.
Accordingly there is depicted in Figure 3 the processing step 301 of determining the signal parameters (φ, A and K) from the cross correlation function C(T) for each pixel.
In embodiments de-noising of the signal parameters φ, A and K may be performed by adopting the technique of Non-Local spatial transform filtering. The essence of such an approach is to centre the pixel which is to be filtered within a window of neighbouring pixels. It is to be noted that the window of pixels may also be known in the art as a patch, and the patch containing the pixel to be filtered is known as the reference patch. The image is then scanned for other patches which closely resemble the patch containing the pixel to be filtered (in other words the reference patch). De-noising of the pixel may then be performed by determining the average pixel value over all pixel values within the image whose patches are deemed to be similar to the reference patch.
In embodiments the similarity between the reference patch and a further patch within the image may be quantified by utilizing a Euclidean based distance metric.
Non-Local spatial transform filtering may be performed on the signal parameters of each pixel on a pixel wise basis.
In a first group of embodiments a form of Non-Local spatial transform filtering known as Non-Local means filtering may be used to filter the signal parameters associated with each pixel.
Non-Local means filtering may be expressed by

NL(x) = (1/Nn(x)) ∫Ω exp( −( Ga ∗ |u(x+.) − u(y+.)|² )(0) / h² ) u(y) dy  (11)

where Nn is a filter normaliser, Ga is a Gaussian kernel, Ω denotes the range over which the image is searched, u denotes the value attached to the pixel at a given position, x is the index of the pixel being filtered and is the centre pixel of the reference patch, y is the index of the pixel at the centre of a further patch (a patch which can be similar to the reference patch), h can be a filter parameter tuned in relation with the noise variance, ∗(0) can denote a centred convolution operator evaluated at zero, and (+.) can denote the pixel indices around the centre pixel of a corresponding patch.
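A discrete sketch of the Non-Local means filter described above might look as follows; a uniform patch kernel stands in for the Gaussian kernel Ga, and the window sizes and h value are illustrative choices:

```python
import numpy as np

def nl_means(u, patch=3, search=7, h=0.1):
    """Discrete Non-Local means sketch (equation 11): each pixel becomes a
    weighted average of the pixels in a search window whose surrounding
    patches resemble the reference patch; weights decay exponentially with
    the mean squared patch difference."""
    half_p, half_s = patch // 2, search // 2
    pad = half_p + half_s
    up = np.pad(u, pad, mode="reflect")
    out = np.zeros(u.shape, dtype=up.dtype)
    for i in range(u.shape[0]):
        for j in range(u.shape[1]):
            ci, cj = i + pad, j + pad  # centre of the reference patch
            ref = up[ci - half_p:ci + half_p + 1, cj - half_p:cj + half_p + 1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    yi, yj = ci + di, cj + dj
                    cand = up[yi - half_p:yi + half_p + 1,
                              yj - half_p:yj + half_p + 1]
                    d2 = np.mean(np.abs(ref - cand) ** 2)
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(up[yi, yj])
            w = np.asarray(weights)
            out[i, j] = np.dot(w, values) / w.sum()  # normaliser Nn
    return out
```

Because the patch distance is taken through `np.abs`, the same routine applies unchanged to a complex-valued parameter map such as Z in equation (14).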
In a first group of embodiments the signal parameters φ and A may each be individually filtered by using the above Non-Local means filter in order to produce de-noised signal parameters φD and AD. De-noising the parameters φ and A may be performed for each pixel position in turn. In other words the above Non-Local Means Filter may be applied to each signal parameter φ and A in turn, where the variable u in the above equation can represent either the parameter φ or A. It is to be understood that the output from the Non-Local Means Filter NL(x) is the de-noised parameter φD or AD for a pixel position x. In other words the output from the Non-Local Means Filter NL(x) when the input variable u represents the phase difference signal parameter φ at pixel position x is the de-noised phase difference signal parameter φD, and the output from the Non-Local Means Filter NL(x) when the input variable u represents the amplitude signal parameter A is the de-noised amplitude parameter AD.
In some embodiments the step of individually processing the signal parameters φ and A with the above Non-Local Means Filter to produce the individual de-noised parameters φD and AD may be viewed as an optional pre-filtering step.
In other words the step of individually processing the signal parameters φ and A may not be performed in some modes of operation. The decision to perform the above pre-filtering step may be configured as a mode of operation of the de-noising processor 213.
Accordingly, Figure 3 depicts the option of including the above pre-filtering step in the mode of operation of the de-noising processor as the decision step 303. The above pre-filtering step can have the advantage of reducing the effects of objects whose surfaces may have poor light reflectivity; or objects being illuminated by a light signal with a small angle of incidence; or the effect of multi-path reflection in the returned light signal. The aforementioned effects can result in a wrapping of the phase delay φ, whereby objects which have a range near the limit for the wavelength of the incident light may result in the phase delay of the reflected light being wrapped into the next waveform period. This wrapping effect may happen when noise can have an influence on the value of the phase delay φ, when the phase delay φ is near the wrapping boundary. Pre-filtering by de-noising the signal parameters A and φ on an individual basis may have the effect of improving the similarity weights confidence for searched patches during the operation of subsequent de-noising filter stages.
In other embodiments pre-filtering by the de-noising filter NL(x) may be performed on either just the amplitude signal parameter A or the phase delay signal parameter φ. This has the effect of reducing the computational complexity of the pre-filtering stage when compared to applying the de-noising filter individually to each signal parameter in turn. In one variant of the above group of embodiments the de-noising filter NL(x) may be applied solely to the amplitude signal parameter A to give the de-noised amplitude signal parameter AD. This can have the advantage in embodiments of improving the de-noising and edge preservation in the eventual distance map; however the effects due to structural artefacts such as those listed above may still be retained.
In another variant of the above group of embodiments the de-noising filter NL(x) may be applied solely to the phase delay signal parameter φ. This can have the advantage of suppressing artefacts such as those listed above however edge preservation may not be so pronounced in the eventual distance image.
In other words embodiments may comprise at least one of: means for de-noising a phase difference between a reflected light signal received by a pixel light sensor and a transmitted light signal, the de-noising of the phase difference being a pre-filtering step; and means for de-noising the amplitude of the reflected light signal received by the pixel light sensor, the de-noising of the amplitude also being a pre-filtering step. The step of performing de-noising for each signal parameter φ and A of each pixel is shown as processing step 305 in Figure 3. The phase difference φ and amplitude A of the received reflected light signal for each pixel may be combined into a single combined parameter.
In some embodiments the phase difference φ and amplitude A of the received reflected light signal for each pixel sensor may be combined into a single complex parameter Z. The complex signal parameter for each pixel may be expressed as
Z = A e^(jφ),  (12)

where j is the imaginary unit. It is to be understood that in a mode of operation whereby the above pre-filtering step 305 is deployed the complex signal parameter may be expressed as

Z = AD e^(jφD).  (13)
With respect to Figure 4 there is shown a graphical representation of a complex signal parameter map comprising four pixels.
With reference to Figure 4, graph 401 depicts the amplitude A for the signal parameter map, and graph 403 depicts the phase difference φ for the same signal parameter map.
Again with reference to Figure 4, graph 405 depicts the complex signal parameter Z for each pixel position for the signal parameter map. It can be seen from graph 405 that the complex signal parameter for each pixel position is a vector bearing information relating to both the amplitude A and the phase difference φ of the received reflected signal. In other words, embodiments may comprise means for combining the amplitude and phase difference for each pixel light sensor into a combined signal parameter for each pixel light sensor. The step of determining the complex signal parameter for each pixel is shown as processing step 307 in Figure 3.
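The combination into the complex parameter Z = A·e^(jφ) (equation 12) and the later recovery of amplitude and phase can be illustrated as follows; the 2×2 arrays hold arbitrary example values:

```python
import numpy as np

# Example amplitude and phase difference maps (arbitrary 2x2 values)
A = np.array([[1.0, 0.8], [0.6, 0.9]])
phi = np.array([[0.2, 1.1], [0.5, 2.4]])

Z = A * np.exp(1j * phi)   # equation (12): combined complex parameter

# After filtering Z, the two parameters are recovered from magnitude
# and argument of the complex values
A_back = np.abs(Z)
phi_back = np.angle(Z)     # exact while phi lies in (-pi, pi]
```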
In embodiments the de-noising filter may be applied to the complex signal Z for each pixel within the signal parameter map.
In a first group of embodiments the complex signal parameter Z for each pixel may be de-noised by applying the above Non-Local Means Filter. In other words, the above Non-Local Means Filter may be modified in terms of the complex signal parameter Z. The Non-Local Means Filter may be expressed in terms of Z as

NLcmplx(x) = (1/Nn(x)) ∫Ω exp( −( Ga ∗ |Z(x+.) − Z(y+.)|² )(0) / h² ) Z(y) dy  (14)

The output from the above Non-Local means filter NLcmplx(x) may be the de-noised complex signal parameter ZDeN at the pixel position x.
In other words embodiments may comprise means for de-noising the combined signal parameter for a pixel light sensor by filtering the combined parameter. In some embodiments the combined signal parameter may be a complex signal parameter formed from combining the amplitude and phase difference for the pixel light sensor.
The step of de-noising the complex signal parameter for each pixel is shown as processing step 309 in Figure 3. The de-noised signal parameters ADeN and φDeN for each pixel position x may be obtained from the de-noised complex signal parameter ZDeN. In other words the de-noised amplitude ADeN and phase difference φDeN can be obtained from

ADeN = |ZDeN| and φDeN = arg(ZDeN)

respectively.
The step of obtaining the de-noised signal parameters ADeN and φDeN for each pixel at position x from the de-noised complex signal parameter ZDeN is shown as processing step 311 in Figure 3. In some embodiments the processing steps 303 to 311 may be repeated for a number of iterations. In such embodiments the de-noised signal parameters ADeN and φDeN for each pixel at position x may form the input to a further iteration of processing step 305, whereby the signal parameters ADeN and φDeN may then be individually applied to the above Non-Local means filter NL(x).
Further de-noised signal parameters, where the signal parameters ADeN and φDeN form the input to the Non-Local means filter NL(x), may be denoted as AD2 and φD2, where "2" denotes a second or further iteration. As before, the output of the Non-Local means filter NL(x), that is the individually de-noised amplitude and phase difference signal parameters AD2 and φD2 for each pixel position x, may then be combined into a single combined parameter. In a first group of embodiments the single combined parameter may be a single complex parameter Z2.
The single complex parameter Z2 may then form the input of the complex modified form of the Non-Local Means Filter NLcmplx(x).
The output of the complex modified form of the Non-Local Means Filter NLcmplx(x) may then be a further de-noised complex parameter ZDeN2 for a pixel position x. The de-noised amplitude ADeN2 and phase difference φDeN2 can be obtained by further applying the equations ADeN2 = |ZDeN2| and φDeN2 = arg(ZDeN2).
In other words, embodiments may comprise means for determining the de-noised phase difference for a pixel light sensor from the de-noised combined signal parameter.
The step of determining whether to iterate processing steps 303 to 311 is shown as decision step 313 in Figure 3.
With reference to Figure 3, if it is determined at processing step 313 that there should be further iterations of processing steps 303 to 311 then the return branch 313a is taken and the process loops back to commence a new iteration. In embodiments, parameters of the Non-Local spatial transform filter may be adjusted for each processing loop. For instance, in the first group of embodiments the filter parameter h in the Non-Local means filter can be adapted for each iteration of the processing loop. The step of adjusting the Non-Local spatial transform filter parameter for further iterations of the processing loop is depicted in Figure 3 as the processing step 315 in the return branch 313a.
It is to be appreciated in embodiments that the loop back branch 313a returns to the decision branch 303. Accordingly, the decision of whether to include the pre-filtering step 305 may be taken for each iteration of the processing loop.
If it is determined at processing step 313 that there are to be no further iterations then the decision branch 313b is taken and the de-noising processor 213 may then determine the distance value D to the illuminated object for each pixel position. In embodiments the distance value D may be determined by taking the de-noised phase difference signal parameter φDeN as provided by processing step 311 and applying the value to equation (1) above. The distance value D may be determined on a pixel wise basis in accordance with the phase delay for each pixel position within the map.
In other words embodiments may comprise means for calculating the distance range to an object for a pixel light sensor using the de-noised phase delay calculated for the pixel light sensor.
The step of determining the distance value D to the illuminated object for each pixel is shown as processing step 317 in Figure 3. Below is shown a possible pseudo code implementation of the above iterative approach for de-noising signal parameter data.
0. Initialise. Iteration counter N = 0.
0.1. Capture data from PMD. Compute A, φ (Eq. 9, 10).
1. IF pre-filtering?
1.1. YES - Pre-filter A, φ → AD, φD; or
1.2. NO - Skip step.
2. Calculate Z[D] from A[D], φ[D] (Eq. 12/13).
3. Apply de-noising filter on Z[D] (e.g. Eq. 14).
3.1. SAVE output - ZDeN.
3.2. Calculate ADeN, φDeN from ZDeN.
4. IF iterate?
4.1. YES - N = N+1. Adapt filter parameter h.
4.1.0. REPEAT steps 1-4.
4.2. NO - Calculate D from φDeN (Eq. 1).
5. SAVE D. Finish.
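The loop above might be sketched in Python as follows; the identity placeholder filter, the decay factor applied to h, and the 20 MHz modulation frequency are illustrative assumptions rather than values prescribed by the embodiments:

```python
import numpy as np

def denoise_depth_map(A, phi, iterations=2, h0=0.4, pre_filter=True,
                      denoise=lambda u, h: u):
    """Iterative de-noising loop. `denoise(u, h)` stands for any spatial
    filter (e.g. a Non-Local means implementation); the default identity
    keeps this sketch self-contained."""
    h = h0
    for _ in range(iterations):                       # step 4.1.0
        if pre_filter:                                # steps 1-1.2 (optional)
            A, phi = denoise(A, h), denoise(phi, h)
        Z = A * np.exp(1j * phi)                      # step 2 (Eq. 12/13)
        Z_den = denoise(Z, h)                         # step 3 (Eq. 14)
        A, phi = np.abs(Z_den), np.angle(Z_den)       # step 3.2
        h *= 0.7                                      # step 4.1: adapt h
    c, f_mod = 299_792_458.0, 20e6                    # assumed 20 MHz modulation
    D = c * phi / (4.0 * np.pi * f_mod)               # step 4.2 (Eq. 1)
    return D, A, phi
```

With the identity placeholder the amplitude and phase pass through unchanged, which makes it easy to verify the control flow before substituting a real filter such as an NL-means implementation.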
The output from the ToF camera system 11 may be the distance value D for each pixel within the PMD sensor array 207.
The distance value D for each pixel may then be passed to the processor 21 for further processing. In some embodiments the distance values D for each pixel position may be displayed as a distance or depth map image on the UI 15.
It is to be appreciated in the above embodiments that the process of de-noising has been described in terms of the Non-Local Means Filter. However, other embodiments may use other forms of de-noising such as Gaussian smoothing, bi(multi)-lateral filtering, wavelet shrinkage, sliding local transform filtering (e.g. sliding DCT) and block-matching 3D collaborative transform-domain filtering (BM3D).
Furthermore other embodiments may adopt other ToF camera systems such as radars, sonars and lidars.
Furthermore other embodiments may adopt different distance measuring systems, where the measurement data can be interpreted in the complex domain, such as structured-light based systems, systems based on structure-from-stereo, and thermographic cameras.
Although the above examples describe embodiments of the invention operating within an apparatus 10, it would be appreciated that the invention as described above may be implemented as part of any computer or electronic apparatus supporting a ToF camera system.
In general, the various embodiments of the invention may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the invention is not limited thereto. While various aspects of the invention may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof. Thus at least some embodiments may be an apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The embodiments of this invention may be implemented by computer software executable by a data processor of the mobile device, such as in the processor entity, or by hardware, or by a combination of software and hardware. Further in this regard it should be noted that any blocks of the logic flow as in the Figures may represent program steps, or interconnected logic circuits, blocks and functions, or a combination of program steps and logic circuits, blocks and functions.
Thus at least some embodiments of the encoder may be a non-transitory computer-readable storage medium having stored thereon computer readable instructions which, when executed by a computing apparatus, cause the computing apparatus to perform a method comprising: determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal; determining an amplitude of the reflected light signal received by the at least one pixel sensor; combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
The memory may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, in other words a non-transitory computer-readable storage medium. The data processors may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASIC), gate level circuits and processors based on multi-core processor architecture, as non-limiting examples.
Embodiments of the inventions may be practiced in various components such as integrated circuit modules. The design of integrated circuits is by and large a highly automated process. Complex and powerful software tools are available for converting a logic level design into a semiconductor circuit design ready to be etched and formed on a semiconductor substrate.
Programs, such as those provided by Synopsys, Inc. of Mountain View, California and Cadence Design, of San Jose, California automatically route conductors and locate components on a semiconductor chip using well established rules of design as well as libraries of pre-stored design modules. Once the design for a semiconductor circuit has been completed, the resultant design, in a standardized electronic format (e.g., Opus, GDSII, or the like) may be transmitted to a semiconductor fabrication facility or "fab" for fabrication.
As used in this application, the term 'circuitry' refers to all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) to combinations of circuits and software (and/or firmware), such as: (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions and
(c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term in this application, including any claims. As a further example, as used in this application, the term 'circuitry' would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term 'circuitry' would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
The foregoing description has provided by way of exemplary and non-limiting examples a full and informative description of the exemplary embodiment of this invention. However, various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. However, all such and similar modifications of the teachings of this invention will still fall within the scope of this invention as defined in the appended claims.

Claims:
1. A method comprising:
determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal;
determining an amplitude of the reflected light signal received by the at least one pixel sensor;
combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and
de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
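By way of illustration only (not part of the claims), the combining and de-noising steps of Claim 1 may be sketched as follows in Python with NumPy. The amplitude and phase difference are combined into the complex parameter z = A·exp(jφ), and the combined parameter is then filtered. The local mean filter used here is a hypothetical stand-in for the filtering step; the function and parameter names are illustrative, not drawn from the application.

```python
import numpy as np

def combine_and_denoise(amplitude, phase, kernel=3):
    """Combine per-pixel amplitude A and phase difference phi into the
    complex signal parameter z = A * exp(j * phi), then de-noise z by
    filtering.  A simple local mean filter stands in for the filtering
    step here; later claims contemplate a non-local means filter."""
    z = amplitude * np.exp(1j * phase)      # combined signal parameter
    pad = kernel // 2
    zp = np.pad(z, pad, mode='edge')        # replicate borders
    out = np.empty_like(z)
    h, w = z.shape
    for i in range(h):
        for j in range(w):
            # mean over the kernel x kernel window centred on (i, j)
            out[i, j] = zp[i:i + kernel, j:j + kernel].mean()
    return out
```

Filtering the real and imaginary parts jointly in this way de-noises amplitude and phase together, which is the point of forming the combined parameter.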
2. The method as claimed in Claim 1 further comprising at least one of:
de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference occurs prior to combining the amplitude and phase difference; and
de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude occurs prior to combining the amplitude and phase difference.
3. The method as claimed in Claims 1 and 2, wherein the filtering further comprises:
filtering with a non-local spatial transform filter.
4. The method as claimed in Claim 3, wherein the non-local spatial transform filter is a non-local means filter.
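For illustration only (not part of the claims), a minimal non-local means filter of the type recited in Claim 4 may be sketched as follows, assuming NumPy. Each pixel is replaced by a weighted average of pixels in a search window, with weights derived from patch similarity; the patch radius, search radius, and smoothing parameter are illustrative defaults, and in the context of the preceding claims the filter would be applied to the combined signal parameter (e.g. its real and imaginary parts).

```python
import numpy as np

def nlm_filter(img, patch=1, search=2, h=0.1):
    """Minimal non-local means: each output pixel is a weighted average
    of pixels in a (2*search+1)^2 window, weighted by the similarity of
    the (2*patch+1)^2 patches around the reference and candidate pixels."""
    pad = patch + search
    ip = np.pad(img, pad, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = ip[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            wsum, acc = 0.0, 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = ip[ni - patch:ni + patch + 1,
                              nj - patch:nj + patch + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch distance
                    w = np.exp(-d2 / (h * h))         # similarity weight
                    wsum += w
                    acc += w * ip[ni, nj]
            out[i, j] = acc / wsum
    return out
```

Because the weights depend on patch similarity rather than spatial distance alone, the filter averages over self-similar regions and tends to preserve depth edges better than a plain local mean.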
5. The method as claimed in Claims 1 to 4, further comprising calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by:
determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and
calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
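For illustration only (not part of the claims), the range calculation of Claim 5 may be sketched as follows, assuming NumPy and a continuous-wave modulation frequency f_mod (an assumed parameter). The de-noised phase difference is recovered as the argument of the complex combined signal parameter, and the range follows from d = c·φ / (4π·f_mod), the factor of two in the denominator reflecting the round trip of the light.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def denoised_phase(z):
    """De-noised phase difference recovered from the de-noised complex
    combined signal parameter z = A * exp(j * phi)."""
    return np.angle(z)

def phase_to_distance(phase, f_mod):
    """Range from phase difference for a continuous-wave ToF camera:
    d = c * phi / (4 * pi * f_mod); the light travels the range twice."""
    return C * phase / (4.0 * np.pi * f_mod)
```

For example, at a 20 MHz modulation frequency the unambiguous range spanned by a phase difference of 2π is c / (2·f_mod) ≈ 7.5 m.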
6. The method as claimed in Claims 1 to 5, wherein the combined signal parameter is a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
7. The method as claimed in Claims 1 to 6, wherein the image sensor of the time of flight camera system is based at least in part on a photonic mixer device.
8. An apparatus configured to:
determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal;
determine an amplitude of the reflected light signal received by the at least one pixel sensor;
combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and
de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
9. The apparatus as claimed in Claim 8 further configured to at least one of:
de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the apparatus is configured to de-noise the phase difference prior to combining the amplitude and phase difference; and
de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the apparatus is configured to de-noise the amplitude prior to combining the amplitude and phase difference.
10. The apparatus as claimed in Claims 8 and 9, wherein filtering comprises filtering with a non-local spatial transform type filter.
11. The apparatus as claimed in Claim 10, wherein the non-local spatial transform type filter is a non-local means filter.
12. The apparatus as claimed in Claims 8 to 11, further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by:
determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and
calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
13. The apparatus as claimed in Claims 8 to 12, wherein the combined signal parameter is a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
14. The apparatus as claimed in Claims 8 to 13, wherein the image sensor of the time of flight camera system is based at least in part on a photonic mixer device.
15. An apparatus comprising at least one processor and at least one memory including computer code for one or more programs, the at least one memory and the computer code configured with the at least one processor to cause the apparatus at least to:
determine a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal;
determine an amplitude of the reflected light signal received by the at least one pixel sensor;
combine the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and
de-noise the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
16. The apparatus as claimed in Claim 15, wherein the at least one memory and the computer code configured with the at least one processor is further configured to at least one of:
de-noise the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the at least one memory and the computer code is configured with the at least one processor to de-noise the phase difference prior to combining the amplitude and phase difference; and
de-noise the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the at least one memory and the computer code is configured with the at least one processor to de-noise the amplitude prior to combining the amplitude and phase difference.
17. The apparatus as claimed in Claims 15 and 16, wherein filtering comprises filtering with a non-local spatial transform type filter.
18. The apparatus as claimed in Claim 17, wherein the non-local spatial transform type filter is a non-local means filter.
19. The apparatus as claimed in Claims 15 to 18, wherein the at least one memory and the computer code configured with the at least one processor is further configured to calculate the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by:
determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor, and
calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
20. The apparatus as claimed in Claims 15 to 19, wherein the combined signal parameter is a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
21. The apparatus as claimed in Claims 15 to 20, wherein the image sensor of the time of flight camera system is based at least in part on a photonic mixer device.
22. A computer program code when executed by a processor realises:
determining a phase difference between a light signal transmitted by a time of flight camera system and a reflected light signal received by at least one pixel sensor of an array of pixel sensors in an image sensor of the time of flight camera system, wherein the reflected light signal received by the at least one pixel sensor is reflected from an object illuminated by the transmitted light signal;
determining an amplitude of the reflected light signal received by the at least one pixel sensor;
combining the amplitude and phase difference for the at least one pixel sensor into a combined signal parameter for the at least one pixel sensor; and
de-noising the combined signal parameter for the at least one pixel sensor by filtering the combined parameter for the at least one pixel sensor.
23. The computer program code as claimed in Claim 22, wherein the computer program code when executed by the processor further realises at least one of:
de-noising the phase difference for the at least one pixel sensor by filtering the phase difference for the at least one pixel sensor, wherein the de-noising of the phase difference occurs prior to combining the amplitude and phase difference; and
de-noising the amplitude for the at least one pixel sensor by filtering the amplitude for the at least one pixel sensor, wherein the de-noising of the amplitude occurs prior to combining the amplitude and phase difference.
24. The computer program code as claimed in Claims 22 and 23, wherein the filtering realised by the computer program code when executed by the processor further comprises:
filtering with a non-local spatial transform filter.
25. The computer program code as claimed in Claim 24, wherein the non-local spatial transform filter is a non-local means filter.
26. The computer program code as claimed in Claims 22 to 25, wherein the computer program code when executed by the processor further realises calculating the distance range to the object from the de-noised combined signal parameter for the at least one pixel sensor by:
determining the de-noised phase difference for the at least one pixel sensor from the de-noised combined signal parameter for the at least one pixel sensor; and
calculating the distance range to the object for the at least one pixel sensor using the de-noised phase difference for the at least one pixel sensor.
27. The computer program code as claimed in Claims 22 to 26, wherein the combined signal parameter is a complex signal parameter formed from combining the amplitude and phase difference for the at least one pixel sensor.
28. The computer program code as claimed in Claims 22 to 27, wherein the image sensor of the time of flight camera system is based at least in part on a photonic mixer device.
PCT/FI2012/051304 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera WO2014102442A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP12890686.4A EP2939049B1 (en) 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera
JP2015550122A JP6367827B2 (en) 2012-12-28 2012-12-28 Method and apparatus for removing noise from distance sensor / camera
KR1020157020397A KR101862914B1 (en) 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera
US14/655,618 US10003757B2 (en) 2012-12-28 2012-12-28 Method and apparatus for de-noising data from a distance sensing camera
CN201280078236.6A CN105026955B (en) 2012-12-28 2012-12-28 For to the method and apparatus for answering the data of camera to carry out noise reduction from distance perception
PCT/FI2012/051304 WO2014102442A1 (en) 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2012/051304 WO2014102442A1 (en) 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera

Publications (1)

Publication Number Publication Date
WO2014102442A1 true WO2014102442A1 (en) 2014-07-03

Family

ID=51019941

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2012/051304 WO2014102442A1 (en) 2012-12-28 2012-12-28 A method and apparatus for de-noising data from a distance sensing camera

Country Status (6)

Country Link
US (1) US10003757B2 (en)
EP (1) EP2939049B1 (en)
JP (1) JP6367827B2 (en)
KR (1) KR101862914B1 (en)
CN (1) CN105026955B (en)
WO (1) WO2014102442A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105518485A (en) * 2014-01-13 2016-04-20 软动力学传感器公司 Method for driving time-of-flight system
LU92688B1 (en) * 2015-04-01 2016-10-03 Iee Int Electronics & Eng Sa Method and system for real-time motion artifact handling and noise removal for tof sensor images
WO2016186775A1 (en) * 2015-05-17 2016-11-24 Microsoft Technology Licensing, Llc Gated time of flight camera
CN110956657A (en) * 2018-09-26 2020-04-03 Oppo广东移动通信有限公司 Depth image acquisition method and device, electronic equipment and readable storage medium
US10641606B2 (en) 2016-08-30 2020-05-05 Sony Semiconductor Solutions Corporation Distance measuring device and method of controlling distance measuring device
CN111918214A (en) * 2014-08-15 2020-11-10 化文生 Device for determining a distance based on a transmission signal
CN111973195A (en) * 2016-04-14 2020-11-24 威里利生命科学有限责任公司 Tomographic imaging continuous monitoring of tumor hypoxia

Families Citing this family (23)

Publication number Priority date Publication date Assignee Title
US9918017B2 (en) 2012-09-04 2018-03-13 Duelight Llc Image sensor apparatus and method for obtaining multiple exposures with zero interframe time
US9531961B2 (en) 2015-05-01 2016-12-27 Duelight Llc Systems and methods for generating a digital image using separate color and intensity data
US9819849B1 (en) 2016-07-01 2017-11-14 Duelight Llc Systems and methods for capturing digital images
US10558848B2 (en) 2017-10-05 2020-02-11 Duelight Llc System, method, and computer program for capturing an image with correct skin tone exposure
US9807322B2 (en) 2013-03-15 2017-10-31 Duelight Llc Systems and methods for a digital image sensor
US10924688B2 (en) 2014-11-06 2021-02-16 Duelight Llc Image sensor apparatus and method for obtaining low-noise, high-speed captures of a photographic scene
US11463630B2 (en) 2014-11-07 2022-10-04 Duelight Llc Systems and methods for generating a high-dynamic range (HDR) pixel stream
KR101720881B1 (en) * 2015-12-03 2017-03-28 한양대학교 산학협력단 Image denoising method based on non-local means algorithm using principal components analysis and image processing apparatus using the same
WO2017169782A1 (en) * 2016-03-31 2017-10-05 富士フイルム株式会社 Distance image processing device, distance image acquisition device, and distance image processing method
CN114449163A (en) 2016-09-01 2022-05-06 迪尤莱特公司 Apparatus and method for adjusting focus based on focus target information
JP7149941B2 (en) * 2016-12-07 2022-10-07 ソニーセミコンダクタソリューションズ株式会社 Apparatus and method
WO2019044571A1 (en) 2017-09-01 2019-03-07 ソニー株式会社 Image processing device, image processing method, program, and mobile body
CN111417867B (en) * 2017-10-02 2023-10-03 安全堡垒有限责任公司 Detection and prevention of cyber physical attacks against sensors
US11393115B2 (en) * 2018-11-27 2022-07-19 Infineon Technologies Ag Filtering continuous-wave time-of-flight measurements, based on coded modulation images
US11849223B2 (en) 2018-12-21 2023-12-19 Chronoptics Limited Time of flight camera data processing system
US11402477B2 (en) * 2019-03-01 2022-08-02 Beijing Voyager Technology Co., Ltd System and methods for ranging operations using modulated signals
CN110501691B (en) * 2019-08-13 2022-03-08 Oppo广东移动通信有限公司 Noise filtering method of TOF module, TOF module and device
CN110954921B (en) * 2019-12-03 2022-01-04 浙江大学 Laser radar echo signal-to-noise ratio improving method based on block matching 3D collaborative filtering
US12033305B2 (en) * 2019-12-17 2024-07-09 Stmicroelectronics (Grenoble 2) Sas Filtering device, associated system and method
CN111340723B (en) * 2020-02-23 2022-04-15 武汉大学 Terrain-adaptive airborne LiDAR point cloud regularization thin plate spline interpolation filtering method
KR20210113464A (en) * 2020-03-05 2021-09-16 삼성전자주식회사 Imaging device and electronic device including the same
KR102491028B1 (en) * 2020-09-25 2023-01-20 주식회사 포스코 Machine component calibration apparatus
US11927673B1 (en) * 2023-05-16 2024-03-12 Wireless Photonics, Llc Method and system for vehicular lidar and communication utilizing a vehicle head light and/or taillight

Citations (6)

Publication number Priority date Publication date Assignee Title
EP2073035A1 (en) * 2007-12-18 2009-06-24 IEE INTERNATIONAL ELECTRONICS &amp; ENGINEERING S.A. Recording of 3D images of a scene
US20090190007A1 (en) * 2008-01-30 2009-07-30 Mesa Imaging Ag Adaptive Neighborhood Filtering (ANF) System and Method for 3D Time of Flight Cameras
US20100046802A1 (en) * 2008-08-19 2010-02-25 Tatsumi Watanabe Distance estimation apparatus, distance estimation method, storage medium storing program, integrated circuit, and camera
US20120082346A1 (en) * 2010-10-04 2012-04-05 Microsoft Corporation Time-of-flight depth imaging
US20120121162A1 (en) * 2010-11-11 2012-05-17 Samsung Electronics Co., Ltd. Filtering apparatus and method for high precision restoration of depth image
US20120134598A1 (en) * 2010-11-26 2012-05-31 Samsung Electronics Co., Ltd. Depth Sensor, Method Of Reducing Noise In The Same, And Signal Processing System Including The Same

Family Cites Families (13)

Publication number Priority date Publication date Assignee Title
US5311271A (en) * 1992-01-21 1994-05-10 Dme/Golf, Inc. Golf course range finder
US5841522A (en) * 1996-08-01 1998-11-24 Lumen Laboratories, Inc. Phase detector
JP4543904B2 (en) * 2004-11-30 2010-09-15 パナソニック電工株式会社 Distance image sensor
US7994465B1 (en) * 2006-02-06 2011-08-09 Microsoft Corporation Methods and devices for improved charge management for three-dimensional and color sensing
EP2026246A1 (en) 2007-08-03 2009-02-18 Harman/Becker Automotive Systems GmbH Method and apparatus for evaluating an image
EP2238742B1 (en) 2007-12-25 2014-05-14 Medic Vision - Brain Technologies Ltd. Noise reduction of images
JP2009258006A (en) 2008-04-18 2009-11-05 Sokkia Topcon Co Ltd Light wave range finder
EP2148212B1 (en) * 2008-07-24 2019-08-21 Toshiba Medical Systems Corporation Magnetic resonance imaging apparatus for contrast enhancement of images
JP5426344B2 (en) 2009-12-04 2014-02-26 アズビル株式会社 Object detection sensor and object detection method
WO2012014077A2 (en) 2010-07-29 2012-02-02 Waikatolink Limited Apparatus and method for measuring the distance and/or intensity characteristics of objects
WO2014177750A1 (en) * 2013-04-29 2014-11-06 Nokia Corporation A method and apparatus for fusing distance data from a distance sensing camera with an image
KR102061699B1 (en) * 2013-06-19 2020-01-02 삼성전자주식회사 An image sensor, image processing system including the same, and an operating method of the same
CN106603942B (en) * 2016-12-15 2019-12-03 杭州艾芯智能科技有限公司 A kind of TOF camera noise-reduction method


Non-Patent Citations (2)

Title
PARK, J. ET AL.: "High quality depth map upsampling for 3D-TOF cameras", 2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 6 November 2011 (2011-11-06), pages 1623-1630, XP032101376 *
See also references of EP2939049A4 *

Cited By (15)

Publication number Priority date Publication date Assignee Title
US10509126B2 (en) 2014-01-13 2019-12-17 Sony Depthsensing Solutions Sa/Nv Method for driving a time-of-flight system
CN105518485A (en) * 2014-01-13 2016-04-20 软动力学传感器公司 Method for driving time-of-flight system
CN111918214A (en) * 2014-08-15 2020-11-10 化文生 Device for determining a distance based on a transmission signal
CN111918214B (en) * 2014-08-15 2022-11-11 星盟国际有限公司 Device for determining a distance based on a transmission signal
CN107743638A (en) * 2015-04-01 2018-02-27 Iee国际电子工程股份公司 For carrying out the method and system of the processing of real time kinematics artifact and denoising to TOF sensor image
WO2016156308A1 (en) * 2015-04-01 2016-10-06 Iee International Electronics & Engineering S.A. Method and system for real-time motion artifact handling and noise removal for tof sensor images
LU92688B1 (en) * 2015-04-01 2016-10-03 Iee Int Electronics & Eng Sa Method and system for real-time motion artifact handling and noise removal for tof sensor images
US11215700B2 (en) 2015-04-01 2022-01-04 Iee International Electronics & Engineering S.A. Method and system for real-time motion artifact handling and noise removal for ToF sensor images
WO2016186775A1 (en) * 2015-05-17 2016-11-24 Microsoft Technology Licensing, Llc Gated time of flight camera
US9864048B2 (en) 2015-05-17 2018-01-09 Microsoft Technology Licensing, Llc. Gated time of flight camera
CN111973195A (en) * 2016-04-14 2020-11-24 威里利生命科学有限责任公司 Tomographic imaging continuous monitoring of tumor hypoxia
US10641606B2 (en) 2016-08-30 2020-05-05 Sony Semiconductor Solutions Corporation Distance measuring device and method of controlling distance measuring device
US11310411B2 (en) 2016-08-30 2022-04-19 Sony Semiconductor Solutions Corporation Distance measuring device and method of controlling distance measuring device
CN110956657A (en) * 2018-09-26 2020-04-03 Oppo广东移动通信有限公司 Depth image acquisition method and device, electronic equipment and readable storage medium
CN110956657B (en) * 2018-09-26 2023-06-30 Oppo广东移动通信有限公司 Depth image acquisition method and device, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
JP6367827B2 (en) 2018-08-01
EP2939049A1 (en) 2015-11-04
CN105026955B (en) 2018-12-18
US20150334318A1 (en) 2015-11-19
CN105026955A (en) 2015-11-04
KR101862914B1 (en) 2018-05-31
KR20150103154A (en) 2015-09-09
JP2016509208A (en) 2016-03-24
EP2939049A4 (en) 2016-08-10
US10003757B2 (en) 2018-06-19
EP2939049B1 (en) 2018-02-28

Similar Documents

Publication Publication Date Title
EP2939049B1 (en) A method and apparatus for de-noising data from a distance sensing camera
US10110881B2 (en) Model fitting from raw time-of-flight images
US10230934B2 (en) Depth map correction using lookup tables
EP3092509B1 (en) Fast general multipath correction in time-of-flight imaging
US10884109B2 (en) Analytical-adaptive multifrequency error minimization unwrapping
EP2992357A1 (en) A method and apparatus for fusing distance data from a distance sensing camera with an image
US9852495B2 (en) Morphological and geometric edge filters for edge enhancement in depth images
US9903941B2 (en) Time of flight camera device and method of driving the same
US9900581B2 (en) Parametric online calibration and compensation in ToF imaging
US9940701B2 (en) Device and method for depth image dequantization
US20230147186A1 (en) Adaptive processing in time of flight imaging
US20160247286A1 (en) Depth image generation utilizing depth information reconstructed from an amplitude image
CN109242782B (en) Noise processing method and device
US10838487B2 (en) Mixed pixel unwrapping error mitigation by filtering optimization
US20190340776A1 (en) Depth map interpolation using generalized likelihood ratio test parameter estimation of a coded image
WO2023077412A1 (en) Object distance measurement method and device
CN113240604B (en) Iterative optimization method of flight time depth image based on convolutional neural network
US20210390719A1 (en) Determining depth in a depth image
Schönlieb et al. Stray-light mitigation for under-display time-of-flight imagers
US20220224881A1 (en) Method, apparatus, and device for camera calibration, and storage medium

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201280078236.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12890686

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012890686

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14655618

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2015550122

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20157020397

Country of ref document: KR

Kind code of ref document: A