US20150285623A1 - Distance detection device and method including dynamically adjusted frame rate - Google Patents


Info

Publication number: US20150285623A1
Application number: US 14/247,751
Other versions: US9170095B1 (granted)
Authority: US (United States)
Prior art keywords: frame, light, distance, detection device, frame rate
Legal status: Granted; Active
Inventor: Makoto Tachibana
Original assignee: Sony Corp
Current assignee: Sony Corp (assignor: TACHIBANA, MAKOTO)
Application filed by Sony Corp


Classifications

    • G01S 17/10: Systems determining position data of a target for measuring distance only, using transmission of interrupted, pulse-modulated waves
    • G01B 11/14: Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • G01S 7/4865: Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S 7/497: Means for monitoring or calibrating
    • G01S 17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • The present disclosure relates to time-of-flight (TOF) distance measurement devices and methods.
  • A TOF system may include a three-dimensional distance sensor that measures the flight time of light emitted from a light-emitting device and reflected from a target back to a light-receiving device.
  • The distance of the target relative to the emitter/receiver may be calculated based on the time of flight and known properties related to the speed of light.
  • the quantity of emitted light from the light-emitting device greatly influences the precision of the distance measurements performed with such devices. Accordingly, the number of light-emitting elements included in the TOF device, the size of the light-emitting elements, and the power consumption of the light-emitting elements may be optimized based on a given application-specific detection distance.
  • In implementations with tight size and power constraints, such as mobile devices, the aforementioned optimization of light-emitting elements can prove difficult. As a result, distance detection accuracy can worsen at long ranges in such implementations.
  • a distance detection device includes a light emitter configured to transmit light in a series of frames, wherein each frame in the series of frames includes at least one light pulse.
  • the distance detection device may include a light receiver configured to receive, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target.
  • the light receiver may be further configured to generate, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame.
  • The distance detection device may include circuitry configured to: calculate, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver; dynamically control a frame rate for each frame based on the distance calculated in the frame immediately preceding the current frame; and control the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate determined from the distance in the immediately preceding frame.
  • FIG. 1 illustrates a non-limiting exemplary block diagram corresponding to a system including a TOF distance detection device, according to certain embodiments
  • FIG. 2 illustrates a non-limiting exemplary timing diagram for a TOF distance detection device, according to certain embodiments
  • FIG. 3 illustrates a non-limiting exemplary table demonstrating a dynamic change in frame rate for a TOF distance detection device, according to certain embodiments
  • FIGS. 4A and 4B illustrate non-limiting examples of frame rates corresponding to various measured distances determined by a TOF distance detection device, according to certain embodiments
  • FIG. 5 illustrates a non-limiting exemplary waveform diagram demonstrating a dynamic change in frame rate of a TOF distance detection device, according to certain embodiments
  • FIG. 6 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate of a TOF distance detection device, according to certain embodiments
  • FIG. 7 illustrates a non-limiting exemplary waveform diagram demonstrating another dynamic change in frame rate of a TOF distance detection device, according to certain embodiments
  • FIG. 8 illustrates a non-limiting exemplary flowchart corresponding to additional processing for dynamically controlling a frame rate of a TOF distance detection device, according to certain embodiments
  • FIG. 9 illustrates a non-limiting exemplary block diagram corresponding to a terminal device for implementing dynamic frame rate control processing, according to certain embodiments.
  • FIG. 10 illustrates an exemplary target detection using the terminal device of FIG. 9 .
  • FIG. 1 illustrates a non-limiting exemplary block diagram corresponding to a system including a TOF distance detection device, according to certain embodiments.
  • a distance detection device includes a controller 11 , a light emitting device 13 , a light reception device 15 , and an interface 17 .
  • the controller 11 may include one or more central processing units (CPUs), and may control each element in the distance detection device to perform features related to various signal processing, including the processing and algorithms described herein.
  • the light emitting device 13 is a light source which generates a beam of invisible light such as infrared (IR) light.
  • The emitted light may be transmitted toward an object 19, which is the target of the distance detection performed by the distance detection device.
  • the light emitting device 13 may include one or more infrared light emitting diodes (LEDs).
  • the light emitting device 13 generates pulse-form light of a predetermined period (light emission interval) under control of the controller 11 .
  • the light reception device 15 receives light transmitted by the light emitting device 13 and reflected from the object 19 . In response to receiving the reflected light, the light reception device 15 generates a signal indicating a light quantity and light reception time corresponding to the received light.
  • the light reception device 15 may include one or more of a photodiode, a CMOS sensor, a CCD sensor, or the like, for generating the output signal based on the light quantity and the light reception time.
  • the light reception device 15 includes a first optical light sensor 15 a and a second optical light sensor 15 b .
  • Each of the light sensors 15 a and 15 b may respectively output a signal s 1 and s 2 corresponding to a stored charge amount generated in response to receiving light transmitted by the light emitting device 13 and reflected from the object 19 .
  • the signals s 1 and s 2 may correspond to an indication of stored charge accumulated based on the light quantity and the light reception time of the received light.
  • The first and second light sensors 15a and 15b in the example of FIG. 1 may share a single photoelectric element for converting received light into electricity.
  • For example, two electrical storage circuits can be arranged in parallel in the later stage of the light reception device 15 and connected to the single photoelectric element via a switch, forming a structure in which the switch routes the output of the photoelectric element to either of the two storage circuits.
  • the first and second light sensors 15 a and 15 b may be respectively controlled by a gate signal “GATE a” and “GATE b” transmitted by the controller 11 in synchronization with light emission from the light emitting device 13 .
  • the gate signals GATE a and GATE b control the first and second light sensors 15 a and 15 b by operating in a gate window of a predetermined phase difference. Aspects of controlling the first and second light sensors 15 a and 15 b via the gate signals will be discussed in greater detail in later paragraphs.
  • the interface 17 is connected between the controller 11 and a host system.
  • the interface 17 acts as a communication interface between the controller 11 and the host system such that instructions may be passed to and from the host system and the controller 11 . Additionally, the interface 17 provides a communication pathway for transmitting to the host system distance data calculated by the controller 11 .
  • Interface 17 may be a wired communication interface or a wireless communication interface.
  • distance may be measured with respect to a distance detection device by detecting the phase difference of light emitted from the light emitting device 13 and the received light received at the light reception device 15 .
  • a phase difference of the transmitted and received light is not detected based on a time stamp of the input and output light pulse, but instead is detected based on the accumulated amount of electrical charge in the light sensors 15 a and 15 b , which are synchronized in accordance with the light transmission.
  • Directly measuring the elapsed time between a light transmission and a light reception is not an accurate way of performing distance detection in a TOF system, because the temporal resolution required at the speed of light is impractically fine.
  • Determining a distance based on the stored electrical charge generated in response to received light reflected from an object therefore provides improved accuracy compared with simple phase-difference measurements relying on a time difference.
  • The two light sensors provided in the distance detection device according to FIG. 1 may convert the stored electrical charge generated in response to the received light into a voltage, and the resolution of the distance detection may be indirectly raised by scaling this voltage.
  • FIG. 2 illustrates a non-limiting exemplary timing diagram for a TOF distance detection device, according to certain embodiments.
  • the exemplary timing diagram of FIG. 2 will now be utilized to demonstrate the principle of operation of the distance detection apparatus shown in FIG. 1 .
  • the waveform of “LIGHT EMISSION” shows a light pulse that has a pulse width t emitted from the light emitting device 13 .
  • the waveform of “LIGHT RECEPTION” shows a light pulse that has been emitted from the light emitting device 13 and reflected by the object 19 such that it is received by the light reception device 15 .
  • This example assumes that a time t 2 passes between the initial transmission of the light pulse by the light emission device 13 and the reception of the light pulse by the light reception device 15 .
  • the waveforms corresponding to “GATE a” and “GATE b” illustrate gate signals that respectively control gate windows corresponding to the first and second light sensors 15 a and 15 b .
  • The gate signals in this example have a pulse waveform; each gate pulse defines a gate window, i.e., the period during which the corresponding light sensor 15a or 15b is activated.
  • the width of the gate windows corresponding to the gate signals is the same as the width t of the light pulse emitted by the light emission device 13 .
  • the gate window of the light sensor 15 a is generated at a same timing as the light pulse emitted from the light emitting device 13 .
  • the gate window corresponding to the light sensor 15 b is generated at a predetermined time difference (phase difference) relative to the light sensor 15 a . This time difference for transmitting the respective gate signals is assumed in the present example to equal the width t of the light pulse emitted from the light emitting device 13 .
  • a charge amount s 1 is accumulated during the time period in which light is received by the light sensor 15 a within its gate window (i.e., the time t 1 illustrated by the hashed area under the GATE a pulse).
  • the light sensor 15 b operates in the gate window according to the gate signal in GATE b, and the accumulated charge generated in response to receiving light within this gate window results in the stored charge amount s 2 .
  • The gate window corresponding to the light sensor 15b starts at the finish time of the "LIGHT EMISSION" pulse, and the "LIGHT RECEPTION" finishes when the time t2 passes from that point in time. Therefore, the storage time of the electrical charge according to the light reception amount of the light sensor 15b is restricted to the light received during the time t2.
  • The following formula (1) describes the distance D between the distance detection device and an object as a function of the speed of light c, the pulse width t, and the measured stored charges:

    D = (c × t/2) × s2/(s1 + s2)  (1)

  • Because s1 and s2 are proportional to the reception times t1 and t2 within the respective gate windows, the ratio s2/(s1 + s2) equals t2/t, so formula (1) reduces to D = c × t2/2. The distance D can thus be calculated from the known pulse width t and the measured stored electrical charges s1 and s2.
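The two-gate relationship above can be checked numerically. The following sketch is illustrative only; the function names and values are hypothetical, not taken from the patent. It models the charges s1 and s2 accumulated for a given round-trip delay and recovers the distance via formula (1):

```python
# Illustrative model of the two-gate TOF distance measurement described
# above. Names and values are hypothetical, not taken from the patent.
C = 299_792_458.0  # speed of light in meters per second

def simulate_charges(t_pulse, t2, intensity=1.0):
    """Charges s1 and s2 accumulated by light sensors 15a and 15b.

    GATE a spans [0, t_pulse] and the reflected pulse spans
    [t2, t2 + t_pulse], so sensor 15a integrates light for t_pulse - t2;
    GATE b spans [t_pulse, 2 * t_pulse], so sensor 15b integrates for t2.
    """
    assert 0.0 <= t2 <= t_pulse, "round-trip delay must fit in one pulse"
    return intensity * (t_pulse - t2), intensity * t2

def distance_from_charges(t_pulse, s1, s2):
    """Formula (1): D = (c * t / 2) * s2 / (s1 + s2)."""
    return (C * t_pulse / 2.0) * s2 / (s1 + s2)

# An object 1.5 m away produces a round-trip delay t2 = 2 * D / c.
t_pulse = 50e-9                       # 50 ns emission pulse width (assumed)
s1, s2 = simulate_charges(t_pulse, 2 * 1.5 / C)
print(distance_from_charges(t_pulse, s1, s2))  # ~1.5
```

Note that the recovered distance depends only on the charge ratio, so the (unknown) reflected intensity cancels out, which is the point of the two-gate scheme.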
  • A frame refers to one unit of the distance detection process, performed based on at least one light emission (e.g., LED irradiation) and light reception. A frame rate corresponds to the number of frames per unit time (for example, one second). While the above-mentioned countermeasure against decreased accuracy due to decreased electrical charge detection from a light sensor (i.e., increasing the transmission power and/or the transmission time of a light pulse) may be effective in some implementations, this countermeasure becomes impractical and/or unavailable in relatively smaller device implementations such as mobile devices (e.g., smartphones).
  • embodiments of the present disclosure may ensure sufficient distance detection accuracy by dynamically changing a frame rate and a frequency of light emission per frame rate while measuring the distance in a state with fixed emission power, fixed emission time, and fixed light emission interval.
  • changing a frame rate means changing the frequency of light emission (namely, the number of light emission pulses) contained in one frame.
  • A first method determines the frame rate of a current frame based on the distance to an object detected in the immediately preceding frame. For example, suppose the detection range of the distance detection device is 0 to 200 centimeters. Further, suppose that in the range of 0 to 100 centimeters a frame rate of 50 frames per second (FPS) is applied, and in the range of 100 to 200 centimeters a frame rate of 25 FPS is applied. This results in a time-per-frame for the two ranges of 20 milliseconds and 40 milliseconds, respectively.
  • Assuming a fixed light emission interval of 10 milliseconds, the number of light emissions per frame will then be two and four for these ranges, respectively.
  • signals s 1 and s 2 are generated based on the electrical charge accumulated in response to the light reception. Therefore, when the distance of an object is far, the accumulated amount of electrical charge generated by a light sensor may be raised by dynamically adjusting the frame rate and controlling the light received within the frame. As a result, degradation of detection accuracy by the increase in detection distance can be moderated without a corresponding rise in the average electric current value per time when the distance is being measured.
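The arithmetic above can be sketched directly. A minimal example, assuming the fixed 10 millisecond emission interval used elsewhere in this disclosure (the function name is hypothetical):

```python
# Pulses per frame under a fixed light-emission interval: a lower frame
# rate yields a longer frame and therefore more pulses per frame.
EMISSION_INTERVAL_MS = 10.0  # assumed fixed emission interval

def pulses_per_frame(frame_rate_fps):
    frame_time_ms = 1000.0 / frame_rate_fps
    return int(frame_time_ms // EMISSION_INTERVAL_MS)

print(pulses_per_frame(50))  # 2 pulses in a 20 ms frame (0-100 cm range)
print(pulses_per_frame(25))  # 4 pulses in a 40 ms frame (100-200 cm range)
```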
  • FIG. 3 illustrates a schematic example of a dynamically changing frame rate according to a first method.
  • the frame rate of the following frame is determined according to the detected distance (OUTPUT DISTANCE) in the completed frame.
  • frame # 1 of frame rate FR 0 includes n 1 light pulses, and the distance detected in this frame is d 1 .
  • the frame rate of the following frame # 2 is determined based on this distance d 1 .
  • the frame rate in frame # 2 is maintained at frame rate FR 0 . Accordingly, the number of light pulses in frame # 2 remains at n 1 .
  • In this example, the output distance d2 of frame #2 is the same as the distance d1 detected in frame #1. Because the distance remains unchanged, the frame rate of the following frame #3 is also maintained at frame rate FR0.
  • In frame #3, it is assumed that the distance between the object and the distance detection device increases relative to the distance in frame #2, such that the output distance d3 increases above a predetermined threshold distance.
  • Although the detection accuracy of the output distance d3 of frame #3 is low, d3 can still be utilized for coarse far/near determinations of the distance of an object.
  • The frame rate of the following frame #4 is therefore changed from frame rate FR0 to frame rate FR1. That is, when the output distance d3 increases over the predetermined threshold boundary, the controller 11 selects the lower frame rate FR1 and applies it to the subsequent frame #4.
  • FIGS. 4A and 4B illustrate data tables that may be implemented for setting a frame rate in a current frame based on a distance detected in an immediately preceding frame, according to certain embodiments.
  • the data table in FIG. 4A is assumed to be implemented for a distance measurement range of 0 to 2 meters.
  • The total distance detection range in this example is divided into two segments (i.e., 0 to 1 meter, and 1 to 2 meters).
  • Frame rates corresponding to each distance detection range segment are shown in the bottom row of the data table.
  • the distance range segment of 0 to 1 meter has an associated frame rate of 50 frames per second
  • the distance detection range of 1 to 2 meters has a corresponding frame rate of 40 frames per second.
  • If the distance detected within a current frame is within the range of 0 to 1 meter, the distance detection device sets the frame rate in the immediately subsequent frame to 50 frames per second. Similarly, if the distance detected within a current frame is within the range of 1 to 2 meters, the distance detection device sets the frame rate in the immediately subsequent frame to 40 frames per second.
  • FIG. 4B illustrates a case in which the detection range is increased to a detection range of 2 meters or more, and the total range of the distance detection device is divided into three segments.
  • the table in FIG. 4B includes segments corresponding to distance ranges of 0 to 1 meters, 1 to 2 meters, and greater than 2 meters.
  • the bottom row of the table in FIG. 4B includes frame rates corresponding to each distance range segment.
  • the distance range of 0 to 1 meters has a corresponding frame rate of 60 frames per second
  • the distance range of 1 to 2 meters has a corresponding frame rate of 50 frames per second
  • the distance range of greater than 2 meters has a corresponding frame rate of 40 frames per second.
  • a similar determination of dynamically determining a frame rate in a current frame based on a distance detection result in an immediately preceding frame may be implemented using the data in the table of FIG. 4B in a similar manner as the example discussed above for FIG. 4A .
  • While FIG. 4A and FIG. 4B illustrate examples of detection ranges, range segments, and corresponding frame rates that may be applied in certain embodiments, the present disclosure is not limited to any particular distance detection range or number of segments for dynamically controlling the frame rate of a distance detection device.
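The table lookup described for FIGS. 4A and 4B amounts to a simple threshold search. The sketch below uses the segment boundaries and rates of the FIG. 4B example; the function name and table layout are illustrative assumptions:

```python
# Frame rate for the immediately subsequent frame, selected from the
# distance detected in the current frame (segments as in FIG. 4B).
FRAME_RATE_TABLE = [
    (1.0, 60),           # 0 to 1 meter      -> 60 frames per second
    (2.0, 50),           # 1 to 2 meters     -> 50 frames per second
    (float("inf"), 40),  # greater than 2 m  -> 40 frames per second
]

def next_frame_rate(distance_m):
    for upper_bound_m, fps in FRAME_RATE_TABLE:
        if distance_m <= upper_bound_m:
            return fps

print(next_frame_rate(0.5))  # 60
print(next_frame_rate(1.5))  # 50
print(next_frame_rate(3.0))  # 40
```

The final `inf` bound guarantees that every distance maps to some frame rate, mirroring the open-ended "greater than 2 meters" segment.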
  • FIG. 5 illustrates a waveform diagram including various signals for demonstrating the dynamic control of frame rate in a distance detection device in embodiments implementing the first method.
  • the names of the various signals included in the diagram of FIG. 5 were discussed previously with respect to FIG. 2 and therefore, a discussion of what these signals represent will not be repeated here.
  • the example of FIG. 5 further illustrates processing with respect to frames # 3 and # 4 , which were discussed with respect to FIG. 3 .
  • frame # 3 of frame rate FR 0 includes one light pulse, and this example assumes that electrical charges in the amounts of s 1 and s 2 were obtained at the time of completing frame # 3 .
  • the distance D may be calculated for the frame # 3 .
  • the frame rate corresponding to frame # 4 may be dynamically controlled by comparing the calculated output distance with corresponding frame rates stored in a data table, such as the data tables illustrated in FIGS. 4A and 4B .
  • In this example, the output distance D is assumed to correspond to frame rate FR1. As shown in FIG. 5, frame rate FR1 is then applied to frame #4.
  • the output distance D may be calculated in frame # 4 based on the accumulated electrical charge s 1 and s 2 , and the frame rate of frame # 5 may be controlled based on the output distance calculated for frame # 4 .
  • FIG. 6 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate for a TOF distance detection device according to the first method.
  • At step S11, the controller 11 sets an initial frame rate for the distance detection in the first frame.
  • At step S12, the light emitting device 13 emits at least one light pulse at a fixed emission interval.
  • For example, the light emitting device 13 may emit a series of light pulses at a fixed 10 millisecond interval within the initial frame.
  • At step S13, the controller 11 performs a distance measurement for the current frame based on the accumulated charge amounts s1 and s2 generated when the light emitted at step S12 is reflected from the object whose distance is being measured and received by the light reception device 15.
  • At step S14, the controller 11 determines, based on the distance calculated at step S13, whether the frame rate of the frame immediately subsequent to the current frame should be dynamically changed. For example, in certain embodiments, the controller 11 may compare the distance calculated at step S13 with data in a table of corresponding frame rates, such as the data tables shown in FIGS. 4A and 4B. If the controller 11 determines at step S14 that the frame rate in the subsequent frame should not be changed, the process returns to step S12.
  • At step S15, the controller 11 performs processing for dynamically controlling the frame rate such that the frame rate in the subsequent frame changes to the frame rate determined at step S14.
  • the process then returns to step S 12 where the process is repeated for a series of frames.
  • the controller 11 terminates the current frame.
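The flowchart of FIG. 6 can be summarized as a control loop. The sketch below is schematic: the measurement step is stubbed out, and all names are hypothetical rather than taken from the patent.

```python
# Schematic loop for the first method (FIG. 6): each frame's measured
# distance selects the frame rate of the immediately subsequent frame.
def run_first_method(measure_distance, lookup_rate, initial_fps, n_frames):
    """measure_distance() stands in for steps S12-S13 (emit, receive,
    calculate); lookup_rate(d) stands in for step S14 (table lookup)."""
    fps = initial_fps                # step S11: set the initial frame rate
    history = []
    for _ in range(n_frames):
        d = measure_distance()       # steps S12-S13
        history.append((fps, d))
        fps = lookup_rate(d)         # steps S14-S15: dynamic rate change
    return history

# Example: the object drifts from 0.5 m out to 2.5 m across three frames.
distances = iter([0.5, 1.5, 2.5])
rate_table = lambda d: 60 if d <= 1.0 else (50 if d <= 2.0 else 40)
print(run_first_method(lambda: next(distances), rate_table, 60, 3))
# [(60, 0.5), (60, 1.5), (50, 2.5)]
```

Note the one-frame lag that characterizes the first method: the rate drop triggered by the 1.5 m reading only takes effect in the frame after it.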
  • In the first method described above, the frame rate of the frame immediately subsequent to the current frame is determined at the time of finishing the current frame; in other words, the frame rate of a given frame is determined before that frame starts.
  • In a second method, by contrast, a frame rate is not determined before the start of a frame; instead, the frame rate of the current frame is decided based on a result determined at the time of finishing the current frame.
  • The second method has the advantage that it can optimize frame rates to account for light quantity change factors other than the distance of the object from the light sensor (e.g., the reflectance of the object, which may depend on its color).
  • FIG. 7 illustrates a non-limiting exemplary waveform diagram demonstrating processing related to the dynamic control of a frame rate for a TOF distance detection device according to the second method.
  • In the example of FIG. 7, the controller 11 determines that frame #n is not completed at this time, and extends frame #n so that a second light pulse is included within the frame.
  • the light sensors 15 a and 15 b will receive light from the additional pulse, thereby increasing the amount of stored charge amounts s 1 and s 2 .
  • the lower portion of the graph in FIG. 7 shows that the amount of stored charge amounts s 1 and s 2 obtained from the first light pulse does not exceed the level of a predetermined threshold value Th in frame #n.
  • After the second light pulse, the stored charge amount s2 increases above the threshold value Th.
  • Frame #n is completed at this time.
  • The frame rate (here FR1) of frame #n is determined at this time. In the second method of dynamic frame control, even if one of the stored charge amounts s1 and s2 exceeds the threshold value, the current frame is not completed until the other stored charge amount is calculated.
  • If the stored charge amounts had still not exceeded the threshold after the second light pulse, the controller 11 could dynamically extend frame #n further so as to include a third light pulse within the frame.
  • Frame rate control according to the second method may thus be performed sequentially by continuing to extend the current frame while the stored charge amounts s1 and s2 remain below the threshold value Th.
  • Certain implementations may include an upper limit on the time by which the current frame may be extended by the controller 11. Once this upper time limit is reached, the controller 11 may forcibly terminate the current frame. This prevents the current frame from being extended in perpetuity.
  • the frame #n+1 begins at a start time corresponding to the subsequent frame.
  • a frame rate of the frame #n+1 has not been fully determined at this time.
  • the controller 11 determines that the frame #n+1 is finished before the following light pulse is emitted. Additionally, the controller 11 determines the frame rate of the frame #n+1 (here frame rate FR 0 ).
  • FIG. 8 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate for a TOF distance detection device according to the second method.
  • At step S21, the controller 11 drives the light emitting device 13 such that at least one light pulse is generated within the current frame.
  • At step S22, the controller 11 determines the stored charge amounts s1 and s2 generated during receipt of the at least one light pulse from step S21, and determines whether at least one of the stored charge amounts s1 and s2 is greater than a predetermined threshold value.
  • If the controller 11 determines at step S22 that neither of the stored charge amounts s1 and s2 is greater than the predetermined threshold, the process proceeds to step S25. Otherwise, the process proceeds to step S23.
  • At step S23, the controller 11 determines whether a minimum time of one frame has elapsed since the emission of the at least one light pulse from the light emitting device 13 at step S21. If the controller 11 determines at step S23 that this lower time limit has not been exceeded, the process proceeds to step S24. Otherwise, the process proceeds to step S26.
  • At step S24, the controller 11 waits until the current frame is completed (i.e., until the minimum time for one frame elapses) and then proceeds to step S26.
  • At step S25, the controller 11 determines whether an upper limit time threshold has been exceeded.
  • The upper limit time threshold corresponding to step S25 may be suitably selected such that the distance determination and dynamic frame control according to the second method do not continue in perpetuity (e.g., the frame is not extended indefinitely because the stored charge amounts never exceed the threshold value Th). If the controller 11 determines at step S25 that the upper time limit is exceeded, the process proceeds to step S26. Otherwise, the process proceeds to step S27.
  • At step S26, the controller 11 calculates the distance between the distance detection device and the object based on the stored charge amounts s1 and s2.
  • the process of FIG. 8 then returns to step S 21 following the distance calculation.
  • At step S27, the controller 11 waits until the next timing for emitting a subsequent light pulse, based on the fixed emission interval.
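The steps above can be sketched as a single-frame loop. The values below (per-pulse charge increments, threshold, time limits) are illustrative assumptions, and the minimum frame-time wait of steps S23-S24 is omitted for brevity:

```python
# Schematic single-frame loop for the second method (FIG. 8): the frame
# is extended pulse by pulse until a stored charge exceeds the threshold
# (step S22) or the upper time limit is reached (step S25).
def run_frame(charge_per_pulse, threshold, interval_ms, max_frame_ms):
    """Return (elapsed_ms, pulses, s1, s2) for one frame.

    charge_per_pulse: (ds1, ds2) added to the stored charges per pulse.
    """
    s1 = s2 = 0.0
    elapsed_ms = 0.0
    pulses = 0
    while True:
        ds1, ds2 = charge_per_pulse
        s1, s2 = s1 + ds1, s2 + ds2     # step S21: emit and accumulate
        pulses += 1
        elapsed_ms += interval_ms
        if max(s1, s2) > threshold:     # step S22: charge is sufficient
            break                       # -> step S26 (calculate distance)
        if elapsed_ms >= max_frame_ms:  # step S25: upper limit reached
            break                       # forcibly terminate -> step S26
        # step S27: wait until the next emission timing, then repeat
    return elapsed_ms, pulses, s1, s2

# A weak return of 0.4 charge units per pulse against a threshold of 1.0
# needs three pulses, so the frame is extended to 30 ms.
elapsed, pulses, s1, s2 = run_frame((0.4, 0.3), 1.0, 10.0, 100.0)
print(elapsed, pulses)  # 30.0 3
```

The frame length, and hence the effective frame rate, falls out of the loop rather than being chosen in advance, which is the defining difference from the first method.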
  • The explicit values described in the examples provided herein are not limiting; the present disclosure is not limited to any particular initial frame rate, number of frame-rate steps, upper and lower time limits, etc., and arbitrary values may be applied. Further, the light emission interval is not restricted to 10 milliseconds for the purposes of the present disclosure.
  • FIG. 9 illustrates a non-limiting exemplary block diagram corresponding to a terminal device for implementing dynamic frame rate control processing according to certain embodiments.
  • the example illustrated in FIG. 9 includes a display apparatus 10 provided with a three-dimensional (3D) gesture sensing device 107 that implements a dynamic frame rate control method in accordance with the present disclosure.
  • the display apparatus 10 of FIG. 9 may, e.g., be implemented as any arbitrary device that includes a display, such as a television receiver, a personal computer (PC), a tablet, etc.
  • the display apparatus 10 includes a control line 150 and a data line 160 , and is comprised of the following various functional elements that communicate control signals and other data signals across these lines: a controller 101 , a communication processor 102 connected to antenna 103 , a display 104 , an operation unit 105 , a memory 106 , the 3D gesture sensing device 107 , a speaker 110 , a microphone 122 , and a television broadcast receiver 124 connected to antenna 125 .
  • the display apparatus 10 may include various other components not explicitly provided in the example of FIG. 9 . Further, elements shown in FIG. 9 may be optionally omitted in certain implementations.
  • the television broadcast receiver 124 is not an essential element of the display apparatus 10 and/or distance detection processing according to the present disclosure, and therefore this element may be optionally omitted in certain implementations.
  • the controller 101 may include one or more central processing units (CPUs), and may control each element in the display apparatus 10 to perform features related to communication control, audio signal processing, control for the audio signal processing, image processing and control, and other kinds of signal processing.
  • the controller 101 may control elements in the display apparatus 10 based on a process of monitoring and managing the output of the 3D gesture detection apparatus 107 .
  • the controller 101 may perform these features by executing instructions stored in the memory 106 .
  • the features may be executed using instructions stored in an external device accessed on a network or on a non-transitory computer readable medium.
  • the communication processor 102 controls communications performed between the display apparatus 10 and other external devices via the antenna 103 or another connection (e.g., a wired connection). For example, the communication processor 102 may control communication with base stations for cellular telephone communication. Additionally, the communication processor 102 may control wireless communication performed with the other external apparatuses according to protocols such as Bluetooth, IEEE 802.11, and near field communication (NFC); or wired or wireless communication on a network (e.g., the Internet) using, e.g., an Ethernet connection.
  • the antenna 103 transmits/receives electromagnetic wave signals between base stations for performing radio-based communication, such as the various forms of cellular telephone communication.
  • the display 104 may be a liquid crystal display (LCD), an organic electroluminescence display panel, or another display screen technology.
  • the display 104 may display operational inputs, such as numbers or icons, which may be used for control of the display apparatus 10 .
  • the display 104 may additionally display a graphical user interface with which a user may control aspects of the display apparatus 10 .
  • the display 104 may display characters and images received by the display apparatus 10 and/or stored in the memory 106 or accessed from an external device on a network.
  • the display apparatus 10 may access a network such as the Internet and display text and/or images transmitted from a Web server.
  • the operation unit 105 may include one or more buttons similar to external control elements (e.g., power control, volume control, standby control, etc.) for providing an operational interface on which the user may control the display apparatus 10 .
  • the operation unit 105 may generate an operation signal based on a detected input corresponding to one of the buttons, switches, etc.
  • the operation signals generated by the operation unit 105 may be supplied to the controller 101 for performing related processing control of the display apparatus 10 .
  • the operation unit 105 may be integrated as a touch panel display within the display 104 .
  • the memory 106 may include, e.g., Read Only Memory (ROM), Random Access Memory (RAM), or a memory array comprised of a combination of volatile and non-volatile memory units.
  • the memory 106 may be utilized as working memory by the controller 101 while executing the processing and algorithms of the present disclosure, or to store other instructions corresponding to processing performed by the controller 101 (e.g., operating system instructions). Additionally, the memory 106 may be used for long-term storage, e.g., of images and information related thereto.
  • the 3D gesture sensing device 107 includes one or more sensors that can detect 3D motion of detection targets, such as a finger of a user's hand.
  • array-type distance sensors included in the 3D gesture sensing device 107 may detect a distance from the display apparatus 10 relative to an object within a detection range of the sensors.
  • additionally, a stereo image of a target is measurable.
  • the speaker 110 emits an audio signal corresponding to audio/voice data supplied from the display apparatus 10 .
  • the microphone 122 detects surrounding audio, and converts the detected audio into an audio signal. The audio signal may then be output to the controller 101 for further processing.
  • the television broadcasting receiver 124 receives image and audio signal data via the antenna 125 or another wired or wireless connection.
  • the television broadcasting receiver 124 performs signal processing on the received image and audio signal data such that the data may be reproduced on the display apparatus 10 via the display 104 .
  • FIG. 10 illustrates an exemplary target detection using the terminal device of FIG. 9 .
  • FIG. 10 is an explanatory drawing of a user's hand in a detection space of the 3D gesture sensing device 107 .
  • the detection space in this example is set in front of the display 12 (i.e., in the viewing space of the display 12 ).
  • the y-axis of the display 12 is the up-down direction and x-axis is the left-right direction relative to the display 12 surface.
  • a direction perpendicular to the display 12 is the z-axis.
  • a direction perpendicular to the display 12 corresponds to the z-axis direction of an associated detection sensor in the 3D gesture sensing device 107 .
  • feature points of not only the tip of an index finger, but also of another finger or a hand, may be collectively utilized.
  • while the examples discussed herein perform detection processing with respect to a user's hand, one of ordinary skill in the art will appreciate that the gesture detection processes described herein may be adapted to detect gestures and feature points related to other objects.
  • the position of each tip of the fingers 21 and 22 may be detected by 3D gesture sensing device 107 as features points A and B.
  • the 3D gesture sensing device 107 may perform processing for distinguishing between a first instruction operation (e.g., a scroll operation) and a second instruction operation (e.g., a click operation) based on time-series data related to three parameters: the coordinates of the first point A, the coordinates of the second point B, and the distance between points A and point B.
  • a user's gesture may be monitored by the 3D gesture sensing device 107 , and the coordinates of the first point A at the tip of the index finger 21 and the coordinates of the second point B at the tip of the thumb 22 may be acquired periodically. Simultaneously, the distance between the first point A and the second point B may be computed from the determined coordinates of both points.
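The three-parameter monitoring described above (coordinates of the first point A, coordinates of the second point B, and the A-B distance) could, for illustration only, be sketched as follows; the thresholds, function names, and classification rules are hypothetical simplifications, not the device's actual logic:

```python
import math

def pinch_distance(a, b):
    """Euclidean distance between feature points A and B (3D tuples)."""
    return math.dist(a, b)

def classify_gesture(history, pinch_threshold=2.0, move_threshold=1.0):
    """Illustrative sketch: distinguish a 'click' (A-B distance closing
    below a threshold) from a 'scroll' (point A translating) using
    time-series samples [(a, b), ...].  All values are hypothetical."""
    (a0, b0), (a1, b1) = history[0], history[-1]
    # Click: the fingertip gap closed past the pinch threshold.
    if pinch_distance(a1, b1) < pinch_threshold <= pinch_distance(a0, b0):
        return "click"
    # Scroll: the index fingertip itself moved a meaningful distance.
    if math.dist(a0, a1) > move_threshold:
        return "scroll"
    return "none"
```

In practice the 3D gesture sensing device 107 would evaluate many intermediate samples rather than only the first and last, but the same three parameters drive the decision.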
  • a processing circuit includes a programmed processor, as a processor includes circuitry.
  • a processing circuit also includes devices such as an application specific integrated circuit (ASIC) and conventional circuit components arranged to perform the recited functions.
  • the functions and features described herein may also be executed by various distributed components of a system.
  • one or more processors may execute these system functions, wherein the processors are distributed across multiple components communicating in a network.
  • the distributed components may include one or more client and/or server machines, in addition to various human interface and/or communication devices (e.g., display monitors, smart phones, tablets, personal digital assistants (PDAs)).
  • the network may be a private network, such as a LAN or WAN, or may be a public network, such as the Internet.
  • Input to the system may be received via direct user input and/or received remotely either in real-time or as a batch process.
  • some implementations may be performed on modules or hardware not identical to those described. Accordingly, other implementations are within the scope that may be claimed.
  • a distance detection device comprising: a light emitter configured to transmit light in a series of frames, wherein each frame in the series of frames includes at least one light pulse; a light receiver configured to receive, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target, and generate, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; and circuitry configured to calculate, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver, dynamically control a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame, and control the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.
  • the distance detection device of (1) or (2) further comprising a memory that stores one or more frame rates respectively corresponding to distances calculated in the immediately preceding frame.
  • circuitry configured to control the at least two light sensors by outputting a gate signal to each of the at least two light sensors, wherein the gate signal controls a time interval during which light may be detected by the at least two light sensors.
  • the circuitry is further configured to dynamically control the frame rate by extending the current frame so as to include a detection of at least one more light pulse by the light receiver.
  • a method comprising: transmitting, by a light emitter, light in a series of frames, wherein each frame in the series of frames includes at least one light pulse; receiving, by a light receiver, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target; generating, by the light receiver, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; calculating, by circuitry, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver; dynamically controlling, by the circuitry, a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame; and controlling, by the circuitry, the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.

Abstract

A device including an emitter that transmits light in a series of frames, wherein each frame in the series includes at least one pulse. The device includes a receiver that receives, for each frame in the series, the at least one pulse reflected from a target, and generates, in response to receiving the at least one pulse in a current frame, an output for calculating a distance between the target and the device for the current frame. The device includes circuitry that calculates, for each frame in the series, the distance between the target and the device based on the receiver output. The circuitry dynamically controls a frame rate for each frame in the series based on the distance calculated in a frame immediately preceding the current frame, and controls the emitter such that the at least one pulse is emitted in the current frame at the calculated frame rate.

Description

    BACKGROUND
  • 1. Technical Field
  • The present disclosure relates to time-of-flight (TOF) distance measurement devices and methods.
  • 2. Description of Related Art
  • The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventor, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
  • As one type of distance detection technology, there exists a TOF system including a three-dimensional distance sensor that measures the flight time of light emitted from a light-emitting device and reflected from a target to a light-receiving device. The distance of the target relative to the emitter/receiver may be calculated based on the time of flight and known properties related to the speed of light.
  • SUMMARY
  • In distance detection devices implementing TOF techniques, the quantity of emitted light from the light-emitting device greatly influences the precision of the distance measurements performed with such devices. Accordingly, the number of light-emitting elements included in the TOF device, the size of the light-emitting elements, and the power consumption of the light-emitting elements may be optimized based on a given application-specific detection distance. However, in implementations such as mobile terminals (e.g., smartphones, tablets, etc.) in which size reduction and power efficiency are important practical and operational concerns, the aforementioned optimization of light-emitting elements can prove difficult. As a result, distance detection accuracy can worsen at long ranges in such implementations.
  • In one or more embodiments, a distance detection device includes a light emitter configured to transmit light in a series of frames, wherein each frame in the series of frames includes at least one light pulse. The distance detection device may include a light receiver configured to receive, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target. The light receiver may be further configured to generate, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame. The distance detection device may include circuitry configured to calculate, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver, dynamically control a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame, and control the light emitter such that the one or more light pulses are emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.
  • The foregoing general description of the illustrative embodiments and the following detailed description thereof are merely exemplary aspects of the teachings of this disclosure, and are not restrictive.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A more complete appreciation of this disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
  • FIG. 1 illustrates a non-limiting exemplary block diagram corresponding to a system including a TOF distance detection device, according to certain embodiments;
  • FIG. 2 illustrates a non-limiting exemplary timing diagram for a TOF distance detection device, according to certain embodiments;
  • FIG. 3 illustrates a non-limiting exemplary table demonstrating a dynamic change in frame rate for a TOF distance detection device, according to certain embodiments;
  • FIGS. 4A and 4B illustrate non-limiting examples of frame rates corresponding to various measured distances determined by a TOF distance detection device, according to certain embodiments;
  • FIG. 5 illustrates a non-limiting exemplary waveform diagram demonstrating a dynamic change in frame rate of a TOF distance detection device, according to certain embodiments;
  • FIG. 6 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate of a TOF distance detection device, according to certain embodiments;
  • FIG. 7 illustrates a non-limiting exemplary waveform diagram demonstrating another dynamic change in frame rate of a TOF distance detection device, according to certain embodiments;
  • FIG. 8 illustrates a non-limiting exemplary flowchart corresponding to additional processing for dynamically controlling a frame rate of a TOF distance detection device, according to certain embodiments;
  • FIG. 9 illustrates a non-limiting exemplary block diagram corresponding to a terminal device for implementing dynamic frame rate control processing, according to certain embodiments; and
  • FIG. 10 illustrates an exemplary target detection using the terminal device of FIG. 9.
  • DETAILED DESCRIPTION
  • Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views.
  • FIG. 1 illustrates a non-limiting exemplary block diagram corresponding to a system including a TOF distance detection device, according to certain embodiments.
  • A distance detection device according to the example of FIG. 1 includes a controller 11, a light emitting device 13, a light reception device 15, and an interface 17.
  • The controller 11 may include one or more central processing units (CPUs), and may control each element in the distance detection device to perform features related to various signal processing, including the processing and algorithms described herein.
  • The light emitting device 13 is a light source which generates a beam of invisible light such as infrared (IR) light. The emitted light may be transmitted toward an object 19, which is an object of a distance detection performed by the distance detection device. In one or more embodiments, the light emitting device 13 may include one or more infrared light emitting diodes (LEDs). In one or more embodiments, the light emitting device 13 generates pulse-form light of a predetermined period (light emission interval) under control of the controller 11.
  • The light reception device 15 receives light transmitted by the light emitting device 13 and reflected from the object 19. In response to receiving the reflected light, the light reception device 15 generates a signal indicating a light quantity and light reception time corresponding to the received light. In one or more embodiments, the light reception device 15 may include one or more of a photodiode, a CMOS sensor, a CCD sensor, or the like, for generating the output signal based on the light quantity and the light reception time. In the example of FIG. 1, the light reception device 15 includes a first optical light sensor 15 a and a second optical light sensor 15 b. Each of the light sensors 15 a and 15 b may respectively output a signal s1 and s2 corresponding to a stored charge amount generated in response to receiving light transmitted by the light emitting device 13 and reflected from the object 19. As mentioned previously, the signals s1 and s2 may correspond to an indication of stored charge accumulated based on the light quantity and the light reception time of the received light.
  • Distance detection processing described herein is not limited to any structure with respect to the distance detection device, and other structural elements not explicitly shown or discussed herein may be applied in a TOF distance detection system according to the present disclosure. For example, first and second light sensors 15 a and 15 b in the example of FIG. 1 share a single photoelectric element for converting received light into electricity. However, in certain embodiments, two electrical storage circuits can be arranged in parallel in the latter stage of the light reception device 15, and the two electrical storage circuits may be connected by a switch in order to form a structure that switches a single photoelectric element to two electrical storage circuits via the switch.
  • The first and second light sensors 15 a and 15 b may be respectively controlled by a gate signal “GATE a” and “GATE b” transmitted by the controller 11 in synchronization with light emission from the light emitting device 13. The gate signals GATE a and GATE b control the first and second light sensors 15 a and 15 b by operating in a gate window of a predetermined phase difference. Aspects of controlling the first and second light sensors 15 a and 15 b via the gate signals will be discussed in greater detail in later paragraphs.
  • The interface 17 is connected between the controller 11 and a host system. The interface 17 acts as a communication interface between the controller 11 and the host system such that instructions may be passed to and from the host system and the controller 11. Additionally, the interface 17 provides a communication pathway for transmitting to the host system distance data calculated by the controller 11. Interface 17 may be a wired communication interface or a wireless communication interface.
  • In a TOF distance detection system, distance may be measured with respect to a distance detection device by detecting the phase difference of light emitted from the light emitting device 13 and the received light received at the light reception device 15. In the case of the TOF distance detection system according to the example of FIG. 1, a phase difference of the transmitted and received light is not detected based on a time stamp of the input and output light pulse, but instead is detected based on the accumulated amount of electrical charge in the light sensors 15 a and 15 b, which are synchronized in accordance with the light transmission. Measuring a phase difference (time) between a light transmission and light reception is not an accurate method of performing distance detection in a TOF system due to the poor temporal resolution with respect to the speed of light. Accordingly, determining a distance based on stored electrical charge generated in response to received light reflected from an object, as in the present disclosure, provides the benefit of improved accuracy with respect to simple phase difference measurements relying on a time difference. The two light sensors provided in the distance detection device according to FIG. 1 may convert the stored electrical charge generated in response to the received light into a voltage, and the resolution of the distance detection may be indirectly raised by scaling.
  • Next, FIG. 2 illustrates a non-limiting exemplary timing diagram for a TOF distance detection device, according to certain embodiments. The exemplary timing diagram of FIG. 2 will now be utilized to demonstrate the principle of operation of the distance detection apparatus shown in FIG. 1.
  • Referring to FIG. 2, the waveform of “LIGHT EMISSION” shows a light pulse that has a pulse width t emitted from the light emitting device 13. The waveform of “LIGHT RECEPTION” shows a light pulse that has been emitted from the light emitting device 13 and reflected by the object 19 such that it is received by the light reception device 15. This example assumes that a time t2 passes between the initial transmission of the light pulse by the light emission device 13 and the reception of the light pulse by the light reception device 15. The waveforms corresponding to “GATE a” and “GATE b” illustrate gate signals that respectively control gate windows corresponding to the first and second light sensors 15 a and 15 b. The gate signals in this example have a pulse waveform and comprise the gate window which designates the period corresponding to the width of the gate pulse for activating the light sensors 15 a and 15 b. In this example, the width of the gate windows corresponding to the gate signals is the same as the width t of the light pulse emitted by the light emission device 13. In this example, the gate window of the light sensor 15 a is generated at a same timing as the light pulse emitted from the light emitting device 13. Additionally, the gate window corresponding to the light sensor 15 b is generated at a predetermined time difference (phase difference) relative to the light sensor 15 a. This time difference for transmitting the respective gate signals is assumed in the present example to equal the width t of the light pulse emitted from the light emitting device 13.
  • As illustrated in the lower portion of the graph shown in FIG. 2 , because the gate window corresponding to the light sensor 15 a is controlled by the gate signal GATE a and consequently charge is stored in response to receiving light within this gate window, a charge amount s1 is accumulated during the time period in which light is received by the light sensor 15 a within its gate window (i.e., the time t1 illustrated by the hashed area under the GATE a pulse). The light sensor 15 b operates in the gate window according to the gate signal GATE b, and the accumulated charge generated in response to receiving light within this gate window results in the stored charge amount s2. The gate window corresponding to the light sensor 15 b starts at the finish time of the "LIGHT EMISSION" pulse, and the "LIGHT RECEPTION" is finished when time t2 passes from that point in time. Therefore, the storage time of the electrical charge according to the light reception amount of the light sensor 15 b is restricted to the light received in the time t2.
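Under the idealized timing of FIG. 2 (rectangular pulses, with the round-trip delay t2 not exceeding the pulse width t), the split of the received light between the two gate windows might be modeled as follows; `gated_charges` and its unit-intensity assumption are illustrative, not part of the disclosure:

```python
def gated_charges(t, t2, intensity=1.0):
    """Idealized FIG. 2 model: GATE a opens with the emission for width t,
    and GATE b opens at the emission's end, also for width t.  For a
    round-trip delay 0 <= t2 <= t, the reflected pulse overlaps GATE a
    for (t - t2) (the time t1 in FIG. 2) and GATE b for t2."""
    s1 = intensity * (t - t2)  # charge stored by light sensor 15a
    s2 = intensity * t2        # charge stored by light sensor 15b
    return s1, s2
```

Note that the ratio s2/(s1 + s2) recovers t2/t regardless of the received intensity, which is why the charge-ratio formulation below is insensitive to the absolute light quantity.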
  • The following formula (1) may be formed to describe the distance D between the distance detection device and an object as a function of the speed of light c and the time t2.

  • D = (c × t2) / 2  (1)
  • Moreover, as for the relationship between the time t, the time t2, and the charge amounts s1 and s2, the following formula (2) may be formed.

  • t2 = {s2 / (s1 + s2)} × t  (2)
  • The following formula (3) may be derived from the formulas (1) and (2):

  • D = (c × t / 2) × {s2 / (s1 + s2)}  (3)
  • From formula (3), the distance D can be calculated based on the known time t and the amounts s1 and s2 of measured stored electrical charges.
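For example, formula (3) can be evaluated directly; the helper below is a minimal sketch assuming SI units and the two measured charge amounts:

```python
C = 299_792_458.0  # speed of light c in meters per second

def distance_from_charges(t, s1, s2):
    """Formula (3): D = (c * t / 2) * s2 / (s1 + s2), with t the emitted
    pulse width in seconds and s1, s2 the stored charge amounts from the
    two gated light sensors 15a and 15b."""
    return (C * t / 2.0) * s2 / (s1 + s2)

# With equal charges (s1 == s2), formula (2) gives t2 = t / 2, so a
# 100 ns pulse corresponds to D = c * 25 ns, roughly 7.5 meters.
```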
  • As mentioned previously, when the quantity of light received by a light sensor becomes relatively small because the distance between the detected object and the distance detection device becomes large, detection accuracy will fall: the resolution and signal-to-noise ratio of the accumulated charge measurement become insufficient. As a countermeasure against the decrease in accuracy due to a small quantity of light received at the light sensor, it is possible to adjust a parameter such as the power of the light emitting diode that transmits the light pulse. Additionally, the emission time of the light may be adjusted.
  • For the purposes of the present disclosure, a frame refers to a unit of one distance detection process performed based on at least one light emission (e.g., LED irradiation) and light reception. Further, a frame rate corresponds to the number of frames per unit time (for example, one second). While the above-mentioned countermeasure against decreased accuracy due to decreased electrical charge detection from a light sensor (i.e., increasing the transmission power and/or the transmission time of a light pulse for measuring the distance) may be effective in some implementations, this countermeasure becomes impractical and/or unavailable in relatively smaller device implementations such as mobile devices (e.g., smartphones, etc.).
  • As an alternative to counteracting decreased detection accuracy as a result of low electrical charge detection, embodiments of the present disclosure may ensure sufficient distance detection accuracy by dynamically changing a frame rate and a frequency of light emission per frame rate while measuring the distance in a state with fixed emission power, fixed emission time, and fixed light emission interval. When a light emission interval is made constant, changing a frame rate means changing the frequency of light emission (namely, the number of light emission pulses) contained in one frame.
  • Generally, if the frame rate is made low, the frequency of light emission utilized for one distance detection can be increased, and detection accuracy will improve. On the other hand, if the frame rate is made high, it may be necessary to raise the power of each emission so that detection accuracy does not drop, which results in increased power consumption. Thus, frame rate and detection accuracy have a tradeoff relationship with respect to power consumption.
  • In embodiments of the present disclosure, two methods are proposed as control methods for dynamically controlling a frame rate of a distance detection device. A first method is a method of determining a frame rate in a current frame based on a distance relative to an object detected in an immediately preceding frame with a distance detection device. For example, suppose the detection range of an object with a distance detection device is 0 to 200 centimeters. Further, suppose in the range of 0 to 100 centimeters a frame rate of 50 frames per second (FPS) is applied, and in the range of 100 to 200 centimeters a frame rate of 25 FPS is applied. This results in a time-per-frame in the two ranges of 20 milliseconds and 40 milliseconds, respectively. Here, if the light emission interval of the light emitting device is fixed at 10 milliseconds, the frequency of light emission of the light emitting device will be two times and four times for these ranges, respectively. Each time reflected light is received within one frame, signals s1 and s2 are generated based on the electrical charge accumulated in response to the light reception. Therefore, when the distance of an object is far, the accumulated amount of electrical charge generated by a light sensor may be raised by dynamically adjusting the frame rate and controlling the light received within the frame. As a result, degradation of detection accuracy by the increase in detection distance can be moderated without a corresponding rise in the average electric current per unit time while the distance is being measured.
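The numeric example above (0 to 100 centimeters at 50 FPS, 100 to 200 centimeters at 25 FPS, fixed 10 millisecond emission interval) might be captured by a lookup such as the following sketch; the function names and integer arithmetic are assumptions for illustration:

```python
EMISSION_INTERVAL_MS = 10  # fixed light emission interval from the example

def frame_rate_for(distance_cm):
    """Frame rate (FPS) chosen from the previous frame's detected distance:
    50 FPS for 0-100 cm, 25 FPS for 100-200 cm (example values only)."""
    return 50 if distance_cm <= 100 else 25

def pulses_per_frame(distance_cm):
    """Number of fixed-interval light pulses fitting in one frame at the
    chosen rate: 20 ms / 10 ms = 2 pulses near, 40 ms / 10 ms = 4 far."""
    frame_ms = 1000 // frame_rate_for(distance_cm)
    return frame_ms // EMISSION_INTERVAL_MS
```

A real implementation could store such a lookup in the memory described for the claimed device, mapping distance ranges to frame rates.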
  • FIG. 3 illustrates a schematic example of a dynamically changing frame rate according to the first method. In the first method, whenever one frame is completed, the frame rate of the following frame is determined according to the detected distance (OUTPUT DISTANCE) in the completed frame. In the example of FIG. 3, frame #1 of frame rate FR0 includes n1 light pulses, and the distance detected in this frame is d1. At the time of completing frame #1, the frame rate of the following frame #2 is determined based on this distance d1. In this example, the frame rate in frame #2 is maintained at frame rate FR0. Accordingly, the number of light pulses in frame #2 remains at n1. If the distance between the object and the light emitting device does not change between frame #1 and frame #2, the output distance d2 of frame #2 is the same as the distance d1 detected in frame #1. Because the distance d2 remains unchanged, the frame rate in the following frame #3 is also maintained at frame rate FR0.
  • In frame #3, it is assumed that the distance between the object and the distance detection device increases relative to the distance in frame #2. In this example, it may be assumed that the distance in frame #3 increases above a predetermined threshold distance. In such a case, although the detection accuracy of the output distance d3 of frame #3 is low, d3 can still be utilized for far and near determinations of the distance of an object. Based on this output distance d3, the frame rate in the following frame #4 is changed from frame rate FR0 to frame rate FR1. That is, when the output distance d3 increases over the predetermined threshold boundary, the controller 11 selects a lower frame rate FR1 and applies the decreased frame rate FR1 to the subsequent frame #4. While the output distance d4 may not change with respect to the output distance d3 in frame #3, the detection accuracy increases in frame #4 relative to frame #3 due to the change in the frame rate. Assuming the output distance d5 is the same as the output distance d4 from frame #4, the frame rate in frame #5 remains at frame rate FR1.
  • Although the method of determining a frame rate of a current frame based on a distance calculated for a preceding frame is not specifically limited by the present disclosure, one or more embodiments according to the present disclosure may utilize stored data tables containing frame rates corresponding to various distance ranges. For example, FIGS. 4A and 4B illustrate data tables that may be implemented for setting a frame rate in a current frame based on a distance detected in an immediately preceding frame, according to certain embodiments.
  • Referring first to FIG. 4A, the data table in FIG. 4A is assumed to be implemented for a distance measurement range of 0 to 2 meters. The total distance detection range in this example is divided into two segments (i.e., 0 to 1 meter, and 1 to 2 meters). Frame rates corresponding to each distance detection range segment are shown in the bottom row of the data table. Specifically, the distance range segment of 0 to 1 meter has an associated frame rate of 50 frames per second, and the distance detection range of 1 to 2 meters has a corresponding frame rate of 40 frames per second. Applying the data included in the table of FIG. 4A to the dynamic frame rate processing according to the first method of the present disclosure, if a distance in a current frame is determined to be within the range of 0 to 1 meter, the distance detection device sets the frame rate in the immediately subsequent frame to a frame rate of 50 frames per second. Similarly, if the distance detected within a current frame is within the range of 1 to 2 meters, the distance detection device sets the frame rate in the immediately subsequent frame to a frame rate of 40 frames per second.
  • Referring now to FIG. 4B, FIG. 4B illustrates a case in which the detection range is increased to a detection range of 2 meters or more, and the total range of the distance detection device is divided into three segments. Specifically, the table in FIG. 4B includes segments corresponding to distance ranges of 0 to 1 meter, 1 to 2 meters, and greater than 2 meters. The bottom row of the table in FIG. 4B includes frame rates corresponding to each distance range segment. Specifically, the distance range of 0 to 1 meter has a corresponding frame rate of 60 frames per second, the distance range of 1 to 2 meters has a corresponding frame rate of 50 frames per second, and the distance range of greater than 2 meters has a corresponding frame rate of 40 frames per second. A similar process of dynamically determining a frame rate in a current frame based on a distance detection result in an immediately preceding frame may be implemented using the data in the table of FIG. 4B in a similar manner as the example discussed above for FIG. 4A.
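A table lookup of the kind shown in FIGS. 4A and 4B might be sketched as below; the list encoding and the assignment of the exact boundary values (e.g., a distance of exactly 1 meter) are assumptions made for illustration:

```python
# Hypothetical encoding of the FIG. 4B table as (upper bound in meters, FPS).
FRAME_RATE_TABLE = [(1.0, 60), (2.0, 50), (float("inf"), 40)]

def next_frame_rate(distance_m, table=FRAME_RATE_TABLE):
    # Pick the frame rate for the immediately subsequent frame from the
    # distance detected in the current frame.
    for upper_bound, fps in table:
        if distance_m < upper_bound:
            return fps
    return table[-1][1]
```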
  • It is noted that while FIG. 4A and FIG. 4B illustrate examples of detection ranges, detection range segments, and corresponding frame rates that may be applied in certain embodiments, the present disclosure is not limited to any particular distance detection range or number of segments for dynamically controlling a frame rate of a distance detection device.
  • Next, FIG. 5 illustrates a waveform diagram including various signals for demonstrating the dynamic control of frame rate in a distance detection device in embodiments implementing the first method. The names of the various signals included in the diagram of FIG. 5 were discussed previously with respect to FIG. 2 and therefore, a discussion of what these signals represent will not be repeated here. The example of FIG. 5 further illustrates processing with respect to frames #3 and #4, which were discussed with respect to FIG. 3.
  • Referring to FIG. 5, frame #3 of frame rate FR0 includes one light pulse, and this example assumes that electrical charges in the amounts s1 and s2 were obtained at the time of completing frame #3. Based on the electrical charge amounts s1 and s2 determined at the end of frame #3, the distance D may be calculated for frame #3. Based on the calculated distance D for frame #3, the frame rate corresponding to frame #4 may be dynamically controlled by comparing the calculated output distance with corresponding frame rates stored in a data table, such as the data tables illustrated in FIGS. 4A and 4B. In the example of FIG. 5, the output distance D is assumed to correspond to a frame rate FR1. As shown in the graph of FIG. 5, by decreasing the frame rate in frame #4 with respect to the frame rate of frame #3, and assuming that the transmission interval of light pulses remains fixed, the decrease to frame rate FR1 lengthens the frame such that two light pulses are emitted within frame #4. Accordingly, the amounts of stored charge s1 and s2 increase as a result of receiving the light from both the first and second light pulses. Similarly to frame #3, the output distance D may be calculated in frame #4 based on the accumulated electrical charges s1 and s2, and the frame rate of frame #5 may be controlled based on the output distance calculated for frame #4.
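The charge-ratio distance calculation referenced above (the s1/s2 signals of FIG. 2) is not spelled out in this excerpt; a common two-gate TOF formulation, shown here only as an assumed sketch, apportions the round-trip delay in proportion to the charge captured by the second gate. Note that accumulating charge over additional pulses scales s1 and s2 equally, so the ratio (and hence the distance) is unchanged while the signal level, and with it the accuracy, improves:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(s1, s2, pulse_width_s):
    # The fraction of total charge in the second gate is proportional to
    # the round-trip delay; halving the round trip gives one-way distance.
    return (C * pulse_width_s / 2.0) * (s2 / (s1 + s2))
```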
  • Next, FIG. 6 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate for a TOF distance detection device according to the first method.
  • At step S11, the controller 11 sets an initial frame rate for the distance detection in the first frame.
  • At step S12, based on the initial frame rate set at step S11, the light emitting device 13 emits at least one light pulse at a fixed emission interval. For example, the light emitting device 13 may emit a series of light pulses at a fixed interval of 10 milliseconds within the initial frame.
  • At step S13, the controller 11 performs a distance measurement for the current frame based on the accumulated charge amounts s1 and s2 generated when the light emitted at step S12 is reflected from an object from which the distance is being measured and received by the light receiving device 15.
  • At step S14, the controller 11 determines, based on the distance calculated at step S13, whether the frame rate in the immediately subsequent frame with respect to the current frame should be dynamically changed. For example, in certain embodiments, the controller 11 may compare the distance calculated at step S13 with data in a data table including corresponding frame rates, such as the data tables shown in FIGS. 4A and 4B. If the controller 11 at step S14 determines that the frame rate in the subsequent frame should not be changed, the process returns to step S12.
  • Otherwise, at step S15, the controller 11 performs processing for dynamically controlling the frame rate such that the frame rate in the subsequent frame changes to the frame rate determined at step S14. The process then returns to step S12 where the process is repeated for a series of frames.
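The loop of steps S11 through S15 can be sketched as follows; `measure_distance` and `select_rate` are hypothetical callbacks standing in for the measurement of steps S12-S13 and the table comparison of step S14:

```python
def run_first_method(measure_distance, select_rate, initial_fps, n_frames):
    # Per-frame loop of the first method: measure at the current rate,
    # then choose the next frame's rate from the measured distance.
    fps = initial_fps                     # S11: set the initial frame rate
    applied_rates = []
    for _ in range(n_frames):
        applied_rates.append(fps)
        distance = measure_distance(fps)  # S12-S13: emit pulses, measure
        fps = select_rate(distance)       # S14-S15: rate for the next frame
    return applied_rates
```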
  • Next, a second method of dynamically controlling a frame rate in a series of frames including at least one light pulse will be explained. According to the second method of dynamically controlling a frame rate, when the accumulated charge generated by at least one of the light sensors 15a and 15b exceeds a predetermined threshold level, the controller 11 terminates the current frame. According to the first method of dynamically controlling the frame rate, the frame rate of the frame immediately subsequent to the current frame was determined at the time of finishing the current frame. In other words, the frame rate of a certain frame was determined before the start of that frame. In contrast, in the second method a frame rate is not determined before the start of a frame; instead, the frame rate of the current frame is decided based on a result determined at the time of finishing the current frame. The second method has the advantage that it can optimize the frame rate while accounting for factors other than the distance of an object that change the light quantity received by the light sensor (e.g., the reflectance of the object, which depends, for example, on its color).
  • FIG. 7 illustrates a non-limiting exemplary waveform diagram demonstrating processing related to the dynamic control of a frame rate for a TOF distance detection device according to the second method.
  • Referring to FIG. 7, it is assumed for this example that the stored charge amounts s1 and s2 obtained from the first light pulse do not exceed the level of a predetermined threshold value Th in frame #n. As a result, the controller 11 determines that frame #n is not completed at this time, and the controller 11 extends frame #n so that a second light pulse may be included within the frame. As a result, the light sensors 15a and 15b will receive light from the additional pulse, thereby increasing the stored charge amounts s1 and s2. As shown in the lower portion of the graph in FIG. 7, as a result of receiving light and storing charge corresponding to the second light pulse in frame #n, the stored charge amount s2 increases above the threshold value Th. As a result of the stored charge amount s2 exceeding the threshold value Th, frame #n is completed at this time. Additionally, the frame rate (here FR1) of frame #n is determined at this time. In the second method of dynamic frame control, even if one of the stored charge amounts s1 and s2 exceeds the threshold value, the current frame is not completed until the other stored charge amount has been determined.
  • While the example shown in FIG. 7 illustrates a case in which the stored charge amount s2 exceeds the threshold value Th within frame #n after receiving the second light pulse, there may be a case in which neither stored charge amount exceeds the threshold value after receiving the second light pulse. In response, the controller 11 may dynamically control the frame rate of frame #n such that the frame is extended so as to include a third light pulse. In certain embodiments, frame rate control according to the second method may be performed sequentially by repeatedly extending the current frame while the stored charge amounts s1 and s2 remain below the threshold value Th. However, certain implementations may include an upper limit on the time by which the current frame may be extended by the controller 11. Upon reaching the upper time limit for extending the current frame, the controller 11 may forcibly terminate the current frame. This prevents the current frame from being extended in perpetuity.
  • Referring still to FIG. 7, the frame #n+1 begins at a start time immediately following the completion of frame #n. However, the frame rate of frame #n+1 has not yet been determined at this time. In this example, it is assumed that the reflected light amount in frame #n+1 is large relative to the light amount received in frame #n. As a result of the increase in the received amount of light in frame #n+1, the amount of stored charge s2 increases above the threshold value Th following the reception of the first light pulse. In response to determining that the stored charge amount s2 has increased above the threshold value Th, the controller 11 determines that frame #n+1 is finished before the following light pulse is emitted. Additionally, the controller 11 determines the frame rate of frame #n+1 (here frame rate FR0).
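The threshold-driven frame extension of FIG. 7 can be sketched as below; the per-pulse charge list and the simplification of ending the frame as soon as either sum crosses the threshold (the text notes that in practice both gates are read out first) are assumptions for illustration:

```python
def pulses_until_frame_end(pulse_charges, threshold, max_pulses):
    # Accumulate (s1, s2) pulse by pulse; the frame ends once either sum
    # crosses the threshold, or once the upper time limit (modeled here
    # as a pulse budget) forces completion.
    s1 = s2 = 0.0
    for n, (c1, c2) in enumerate(pulse_charges, start=1):
        s1 += c1
        s2 += c2
        if s1 >= threshold or s2 >= threshold or n >= max_pulses:
            return n
    return max_pulses
```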
  • Next, FIG. 8 illustrates a non-limiting exemplary flowchart corresponding to processing for dynamically controlling a frame rate for a TOF distance detection device according to the second method.
  • Referring to FIG. 8, at step S21, the controller 11 drives the light emitting device 13 such that at least one light pulse is generated within the current frame.
  • At step S22, the controller 11 determines the stored charge amounts s1 and s2 generated during receipt of the at least one light pulse from step S21, and determines whether at least one of the stored charge amounts s1 and s2 is greater than a predetermined threshold value.
  • If the controller 11 at step S22 determines that neither of the stored charge amounts s1 and s2 is greater than the predetermined threshold, the process proceeds to step S25. Otherwise, the process proceeds to step S23.
  • At step S23, the controller 11 determines whether a minimum time of one frame has elapsed since the emission of the at least one light pulse from the light emitting device 13 at step S21. If the controller 11 determines at step S23 that the lower time limit has not been exceeded, the process proceeds to step S24. Otherwise, the process proceeds to step S26.
  • At step S24, the controller 11 waits until the current frame is completed (i.e., until the minimum time elapses for one frame) and then proceeds to step S26.
  • At step S25, the controller 11 determines whether an upper limit time threshold has been exceeded. The upper limit time threshold corresponding to step S25 may be suitably selected such that the distance determination and dynamic frame control according to the second method do not continue in perpetuity (e.g., the current frame is not extended indefinitely because the stored charge amount(s) never exceed the threshold value Th). If the controller 11 determines at step S25 that the upper time limit is exceeded, the process progresses to step S26. Otherwise, the process progresses to step S27.
  • At step S26, the controller 11 performs a calculation of the distance between the distance detection device and the object based on the stored charge amounts s1 and s2. The process of FIG. 8 then returns to step S21 following the distance calculation.
  • At step S27, the controller 11 waits until the next timing to emit a subsequent light pulse, based on the fixed emission interval.
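Steps S21 through S27 can be put together as a single-frame state machine; `emit_and_measure` is a hypothetical callback returning the per-pulse charge pair, and both time limits are modeled in units of emission intervals rather than milliseconds:

```python
def second_method_frame(emit_and_measure, threshold, min_pulses, max_pulses):
    # Returns (frame length in emission intervals, (s1, s2)).
    s1 = s2 = 0.0
    n = 0
    while True:
        n += 1                                   # S21: emit one light pulse
        c1, c2 = emit_and_measure(n)
        s1, s2 = s1 + c1, s2 + c2
        if s1 > threshold or s2 > threshold:     # S22: threshold crossed
            return max(n, min_pulses), (s1, s2)  # S23-S24: honor minimum time
        if n >= max_pulses:                      # S25: upper limit reached
            return n, (s1, s2)                   # S26: compute distance anyway
        # S27: otherwise wait for the next emission timing and loop to S21
```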
  • It should be appreciated that for both the first and second methods of dynamic frame control described in the present disclosure, the explicit values described in the examples provided herein are not limiting. The present disclosure is not limited to any particular initial frame rate value, number of steps over which the frame rate changes, upper or lower time limit, etc., and any suitable values may be applied. Further, the light emission interval is not restricted to 10 milliseconds for the purposes of the present disclosure.
  • Next, FIG. 9 illustrates a non-limiting exemplary block diagram corresponding to a terminal device for implementing dynamic frame rate control processing according to certain embodiments. Specifically, the example illustrated in FIG. 9 includes a display apparatus 10 provided with a three-dimensional (3D) gesture sensing device 107 that implements a dynamic frame rate control method in accordance with the present disclosure. The display apparatus 10 of FIG. 9 may, e.g., be implemented as any arbitrary device that includes a display, such as a television receiver, a personal computer (PC), a tablet, etc.
  • The display apparatus 10 includes a control line 150 and a data line 160, and comprises the following functional elements that communicate control signals and other data signals across these lines: a controller 101, a communication processor 102 connected to antenna 103, a display 104, an operation unit 105, a memory 106, the 3D gesture sensing device 107, a speaker 110, a microphone 122, and a television broadcast receiver 124 connected to antenna 125. Additionally, the display apparatus 10 may include various other components not explicitly provided in the example of FIG. 9. Further, elements shown in FIG. 9 may be optionally omitted in certain implementations. For example, the television broadcast receiver 124 is not an essential element of the display apparatus 10 and/or distance detection processing according to the present disclosure and therefore, this element may be optionally omitted in certain implementations.
  • The controller 101 may include one or more central processing units (CPUs), and may control each element in the display apparatus 10 to perform features related to communication control, audio signal processing, control for the audio signal processing, image processing and control, and other kinds of signal processing. For example, in certain embodiments, the controller 101 may control elements in the display apparatus 10 based on a process of monitoring and managing the output of the 3D gesture sensing device 107. The controller 101 may perform these features by executing instructions stored in the memory 106. Alternatively or in addition to the local storage of the memory 106, the features may be executed using instructions stored in an external device accessed on a network or on a non-transitory computer readable medium.
  • The communication processor 102 controls communications performed between the display apparatus 10 and other external devices via the antenna 103 or another connection (e.g., a wired connection). For example, the communication processor 102 may control communication with base stations for cellular telephone communication. Additionally, the communication processor 102 may control wireless communication performed with other external apparatuses according to protocols such as Bluetooth, IEEE 802.11, and near field communication (NFC); or wired or wireless communication on a network (e.g., the Internet) using, e.g., an Ethernet connection.
  • The antenna 103 transmits/receives electromagnetic wave signals between base stations for performing radio-based communication, such as the various forms of cellular telephone communication.
  • The display 104 may be a liquid crystal display (LCD), an organic electroluminescence display panel, or another display screen technology. In addition to displaying still and moving image data, the display 104 may display operational inputs, such as numbers or icons, which may be used for control of the display apparatus 10. The display 104 may additionally display a graphical user interface with which a user may control aspects of the display apparatus 10. Further, the display 104 may display characters and images received by the display apparatus 10 and/or stored in the memory 106 or accessed from an external device on a network. For example, the display apparatus 10 may access a network such as the Internet and display text and/or images transmitted from a Web server.
  • The operation unit 105 may include one or more buttons or similar external control elements (e.g., power control, volume control, standby control, etc.) providing an operational interface through which the user may control the display apparatus 10. The operation unit 105 may generate an operation signal based on a detected input corresponding to one of the buttons, switches, etc. The operation signals generated by the operation unit 105 may be supplied to the controller 101 for performing related processing control of the display apparatus 10. In certain embodiments, the operation unit 105 may be integrated as a touch panel display within the display 104.
  • The memory 106 may include, e.g., Read Only Memory (ROM), Random Access Memory (RAM), or a memory array comprised of a combination of volatile and non-volatile memory units. The memory 106 may be utilized as working memory by the controller 101 while executing the processing and algorithms of the present disclosure, or to store other instructions corresponding to processing performed by the controller 101 (e.g., operating system instructions). Additionally, the memory 106 may be used for long-term storage, e.g., of images and information related thereto.
  • The 3D gesture sensing device 107 includes one or more sensors that can detect 3D motion of detection targets, such as a finger of a user's hand. For example, array-type distance sensors included in the 3D gesture sensing device 107 may detect the distance from the display apparatus 10 to an object within the detection range of the sensors. In certain embodiments, it is also possible to acquire distance image information by measuring, in real time for every pixel, the time from when light projected from the sensors strikes a target until it returns to an array-like pixel arrangement of a CMOS image sensor. Thus, a stereo image of a target is measurable.
  • The speaker 110 emits an audio signal corresponding to audio/voice data supplied from the display apparatus 10.
  • The microphone 122 detects surrounding audio, and converts the detected audio into an audio signal. The audio signal may then be output to the controller 101 for further processing.
  • The television broadcast receiver 124 receives image and audio signal data via the antenna 125 or another wired or wireless connection. The television broadcast receiver 124 performs signal processing on the received image and audio signal data such that the data may be reproduced on the display apparatus 10 via the display 104.
  • Next, FIG. 10 illustrates an exemplary target detection using the terminal device of FIG. 9. Specifically, FIG. 10 is an explanatory drawing of a user's hand in a detection space of the 3D gesture sensing device 107.
  • As shown in FIG. 10, the detection space in this example is set in front of the display 104 (i.e., in the viewing space). For the purposes of this example, the y-axis is the up-down direction and the x-axis is the left-right direction relative to the surface of the display 104. A direction perpendicular to the display 104 is the z-axis, and corresponds to the z-axis direction of an associated detection sensor in the 3D gesture sensing device 107.
  • For the purposes of the present disclosure, when a gesture involves using the user's index finger, not only the feature point at the tip of the index finger but also feature points of another finger or the hand may be collectively utilized. Further, while the examples discussed herein perform detection processing with respect to a user's hand, one of ordinary skill in the art will appreciate that the gesture detection processes described herein may be adapted to detect gestures and feature points related to other objects.
  • The position of each tip of the fingers 21 and 22 may be detected by the 3D gesture sensing device 107 as feature points A and B. In certain embodiments, the 3D gesture sensing device 107 may perform processing for distinguishing between a first instruction operation (e.g., a scroll operation) and a second instruction operation (e.g., a click operation) based on time-series data related to three parameters: the coordinates of the first point A, the coordinates of the second point B, and the distance between point A and point B. For example, a user's gesture may be monitored by the 3D gesture sensing device 107, and the coordinates of the first point A at the tip of the index finger 21 and the coordinates of the second point B at the tip of the thumb 22 may be acquired periodically. Simultaneously, the distance between the first point A and the second point B may be computed from the determined coordinates of both points.
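The A-B distance computation, together with a toy rule for telling a pinch-style click from a scroll, might look as follows; the threshold and the classification rule are purely illustrative (the actual distinguishing logic is described in the incorporated application):

```python
import math

def feature_point_distance(point_a, point_b):
    # Euclidean distance between the two fingertip feature points
    # ((x, y, z) coordinates from the 3D gesture sensing device).
    return math.dist(point_a, point_b)

def classify_motion(previous_d, current_d, pinch_threshold=2.0):
    # Toy rule: a sharp drop in the A-B distance between samples reads
    # as a click (pinch); any other motion is treated as a scroll.
    return "click" if previous_d - current_d > pinch_threshold else "scroll"
```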
  • Processing related to distinguishing between, and executing processing related to, detected instruction operations based on inputs received by the 3D gesture sensing device 107 is further described in U.S. application Ser. No. 14/183,171, the contents of which are incorporated herein by reference.
  • Obviously, numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein. For example, advantageous results may be achieved if the steps of the disclosed techniques were performed in a different sequence, if components in the disclosed systems were combined in a different manner, or if the components were replaced or supplemented by other components. The functions, processes and algorithms described herein may be performed in hardware or software executed by hardware, including computer processors and/or programmable processing circuits configured to execute program code and/or computer instructions to execute the functions, processes and algorithms described herein. A processing circuit includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC) and conventional circuit components arranged to perform the recited functions.
  • The functions and features described herein may also be executed by various distributed components of a system. For example, one or more processors may execute these system functions, wherein the processors are distributed across multiple components communicating in a network. The distributed components may include one or more client and/or server machines, in addition to various human interface and/or communication devices (e.g., display monitors, smart phones, tablets, personal digital assistants (PDAs)). The network may be a private network, such as a LAN or WAN, or may be a public network, such as the Internet. Input to the system may be received via direct user input and/or received remotely either in real-time or as a batch process. Additionally, some implementations may be performed on modules or hardware not identical to those described. Accordingly, other implementations are within the scope that may be claimed.
  • It must be noted that, as used in the specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
  • The above disclosure also encompasses the embodiments noted below.
  • (1) A distance detection device comprising: a light emitter configured to transmit light in a series of frames, wherein each frame in the series of frames includes at least one light pulse; a light receiver configured to receive, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target, and generate, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; and circuitry configured to calculate, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver, dynamically control a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame, and control the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.
  • (2) The distance detection device of (1), wherein the circuitry is configured to decrease the frame rate with increasing distance between the target and the distance detection device.
  • (3) The distance detection device of (1) or (2), further comprising a memory that stores one or more frame rates respectively corresponding to distances calculated in the immediately preceding frame.
  • (4) The distance detection device of any one of (1) to (3), wherein the circuitry is configured to dynamically control the frame rate by matching the distance calculated in the immediately preceding frame with a corresponding frame rate of the one or more frame rates stored in the memory.
  • (5) The distance detection device of any one of (1) to (4), wherein the light receiver includes at least two light sensors that detect the at least one light pulse.
  • (6) The distance detection device of any one of (1) to (5), wherein the circuitry is configured to control the at least two light sensors by outputting a gate signal to each of the at least two light sensors, wherein the gate signal controls a time interval during which light may be detected by the at least two light sensors.
  • (7) The distance detection device of any one of (1) to (6), wherein the gate signals output by the circuitry are offset in phase with respect to each other.
  • (8) The distance detection device of any one of (1) to (7), wherein the light receiver output represents an amount of electrical charge generated as a result of each of the at least two light sensors detecting the at least one light pulse during the time interval defined by the gate signal.
  • (9) The distance detection device of any one of (1) to (8), wherein the light receiver output includes an indication of the electrical charge represented by each of the at least two light sensors.
  • (10) The distance detection device of any one of (1) to (9), wherein the amount of the electrical charge is a function of one or more of an intensity and a detection duration of the at least one light pulse detected by each of the at least two sensors during the time interval defined by the gate signal.
  • (11) The distance detection device of any one of (1) to (10), wherein the circuitry is configured to calculate the distance between the target and the distance detection device based on the amount of the electrical charge.
  • (12) The distance detection device of any one of (1) to (11), wherein the light receiver output represents an amount of electrical charge generated by each of the at least two sensors as a result of each of the at least two light sensors detecting the at least one light pulse during the time interval defined by the gate signal.
  • (13) The distance detection device of any one of (1) to (12), wherein the circuitry is configured to determine whether one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds a predetermined threshold level.
  • (14) The distance detection device of any one of (1) to (13), wherein when neither of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level, the circuitry is further configured to dynamically control the frame rate by extending the current frame so as to include a detection of at least one more light pulse by the light receiver.
  • (15) The distance detection device of any one of (1) to (14), wherein the circuitry is configured to extend the current frame until one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level.
  • (16) The distance detection device of any one of (1) to (15), wherein the circuitry is configured to extend the current frame until one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level or until a predetermined time threshold is exceeded.
  • (17) The distance detection device of any one of (1) to (16), wherein the circuitry is configured to determine the frame rate of the current frame when one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level.
  • (18) The distance detection device of any one of (1) to (17), wherein the circuitry is configured to initiate a next frame following the determination of the frame rate corresponding to the current frame such that the frame rate of the current frame is determined as a result of ending the current frame.
  • (19) A method comprising: transmitting, by a light emitter, light in a series of frames, wherein each frame in the series of frames includes at least one light pulse; receiving, by a light receiver, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target; generating, by the light receiver, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; calculating, by circuitry, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver; dynamically controlling, by the circuitry, a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame; and controlling, by the circuitry, the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.
  • (20) A non-transitory computer readable medium having instructions stored therein that when executed by one or more processors cause a distance detection device to perform a method of dynamically controlling a frame rate corresponding to the distance detection device, wherein the distance detection device includes circuitry, a light emitter, and a light receiver, and the method comprises: transmitting, by the light emitter, light in a series of frames, wherein each frame in the series of frames includes at least one light pulse; receiving, by the light receiver, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target; generating, by the light receiver, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; calculating, by the circuitry, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver; dynamically controlling, by the circuitry, a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame; and controlling, by the circuitry, the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame.
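The frame-rate control recited in clauses (1) to (4) — selecting the current frame's rate from the distance calculated in the immediately preceding frame, optionally via a stored table of rates — can be sketched as follows. This is an illustrative sketch only: the band-to-rate table layout and every name below are assumptions for exposition, not taken from the patent.

```python
# Illustrative sketch of clauses (1)-(4): the rate for the current frame is
# looked up from the distance measured in the immediately preceding frame.
# The table layout and all names are assumptions, not patent language.

def select_frame_rate(prev_distance_m, rate_table):
    """Return the frame rate (Hz) for the band containing prev_distance_m.

    rate_table maps an upper distance bound (meters) to a frame rate.
    Consistent with clause (2)/claim 2, farther targets map to higher rates.
    """
    for upper_bound_m, rate_hz in sorted(rate_table.items()):
        if prev_distance_m <= upper_bound_m:
            return rate_hz
    # Beyond the largest stored bound, fall back to the fastest stored rate.
    return max(rate_table.values())

# Hypothetical table: near targets sampled at 10 Hz, far targets at 60 Hz.
rates = {1.0: 10, 5.0: 30, 20.0: 60}
```

Under this hypothetical table, a target measured at 1.5 m in the preceding frame would give `select_frame_rate(1.5, rates) == 30`.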

Claims (13)

1. A distance detection device comprising:
a light emitter configured to transmit light in a series of frames, wherein each frame in the series of frames includes at least one light pulse;
a light receiver configured to
receive, for each frame in the series of frames, the at least one light pulse when the at least one light pulse is reflected from a target, and
generate, in response to receiving the at least one light pulse in a current frame of the series of frames, an output for calculating a distance between the target and the distance detection device for the current frame; and
circuitry configured to
calculate, for each frame in the series of frames, the distance between the target and the distance detection device based on the output of the light receiver,
dynamically control a frame rate for each frame in the series of frames based on the distance calculated in a frame immediately preceding the current frame, and
control the light emitter such that the at least one light pulse is emitted in the current frame at the frame rate calculated based on the distance in the immediately preceding frame, wherein
the light receiver includes at least two light sensors that detect the at least one light pulse,
the circuitry is configured to control the at least two light sensors by outputting a gate signal to each of the at least two light sensors, wherein the gate signal controls a time interval during which light may be detected by the at least two light sensors,
the gate signals output by the circuitry are offset in phase with respect to each other,
the light receiver output represents an amount of electrical charge generated by each of the at least two sensors as a result of each of the at least two light sensors detecting the at least one light pulse during the time interval defined by the gate signal,
the circuitry is configured to determine whether one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds a predetermined threshold level, and
when neither of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level, the circuitry is further configured to dynamically control the frame rate by extending the current frame so as to include a detection of at least one more light pulse by the light receiver.
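The gated two-sensor measurement in claim 1 is structurally consistent with the common pulsed indirect time-of-flight readout, in which two phase-offset gates split the reflected pulse's charge and their ratio encodes the round-trip delay. The sketch below assumes that specific scheme (gate 2 opens exactly as gate 1 closes); the function name, parameters, and timing assumption are hypothetical and not language from the claims.

```python
# Minimal sketch of one common pulsed indirect time-of-flight readout
# matching claim 1's structure: two sensors gated by phase-offset gate
# signals, distance recovered from the ratio of their charges. The two-gate
# timing assumed here is an assumption, not taken from the claims.

C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def distance_from_charges(q1, q2, pulse_width_s):
    """Estimate target distance from the charges on two phase-offset gates.

    With gate 1 aligned to the emitted pulse and gate 2 immediately after,
    the fraction q2 / (q1 + q2) of the reflected pulse falling into gate 2
    encodes the round-trip delay within one pulse width.
    """
    total = q1 + q2
    if total <= 0.0:
        raise ValueError("no reflected light detected in either gate")
    round_trip_s = pulse_width_s * (q2 / total)
    return C_M_PER_S * round_trip_s / 2.0
```

For a 20 ns pulse split evenly between the two gates, this estimate comes to roughly 1.5 m.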
2. The distance detection device of claim 1, wherein
the circuitry is configured to increase the frame rate with increasing distance between the target and the distance detection device.
3. The distance detection device of claim 1, further comprising
a memory that stores one or more frame rates respectively corresponding to distances calculated in the immediately preceding frame.
4. The distance detection device of claim 3, wherein
the circuitry is configured to dynamically control the frame rate by matching the distance calculated in the immediately preceding frame with a corresponding frame rate of the one or more frame rates stored in the memory.

5-9. (canceled)
10. The distance detection device of claim 1, wherein
the amount of the electrical charge is a function of one or more of an intensity and a detection duration of the at least one light pulse detected by each of the at least two sensors during the time interval defined by the gate signal.
11. The distance detection device of claim 1, wherein
the circuitry is configured to calculate the distance between the target and the distance detection device based on the amount of the electrical charge.
12-14. (canceled)
15. The distance detection device of claim 1, wherein
the circuitry is configured to extend the current frame until one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level.
16. The distance detection device of claim 1, wherein
the circuitry is configured to extend the current frame until one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level or until a predetermined time threshold is exceeded.
17. The distance detection device of claim 1, wherein
the circuitry is configured to determine the frame rate of the current frame when one or more of the amounts of electrical charge generated by each of the at least two sensors exceeds the predetermined threshold level.
18. The distance detection device of claim 17, wherein
the circuitry is configured to initiate a next frame following the determination of the frame rate corresponding to the current frame such that the frame rate of the current frame is determined as a result of ending the current frame.
19-20. (canceled)
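The frame-extension behavior of claims 15 and 16 — extending the current frame, one additional light pulse at a time, until a sensor's accumulated charge exceeds the predetermined threshold or a limit is reached — can be illustrated with a hedged sketch. Here a pulse-count cap stands in for the claim's "predetermined time threshold"; `read_pulse` and every other name below are illustrative assumptions, not part of the claimed device.

```python
# Hedged sketch of claims 15-16: accumulate per-pulse charge until either
# sensor crosses the threshold, or a cap (standing in for the predetermined
# time threshold) is hit. read_pulse and all names are assumptions.

def extend_frame(read_pulse, charge_threshold, max_pulses):
    """Accumulate per-pulse charges until a sensor crosses the threshold.

    read_pulse() returns the (q1, q2) charge contribution of one more pulse.
    Returns the accumulated charges and the number of pulses used.
    """
    q1 = q2 = 0.0
    pulses = 0
    while pulses < max_pulses:
        dq1, dq2 = read_pulse()
        q1 += dq1
        q2 += dq2
        pulses += 1
        if q1 > charge_threshold or q2 > charge_threshold:
            break  # enough signal: end the frame, which fixes its frame rate
    return q1, q2, pulses
```

For example, with a dim target contributing charges (0.4, 0.1) per pulse and a threshold of 1.0, the frame extends to three pulses before the first sensor crosses the threshold.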
US14/247,751 2014-04-08 2014-04-08 Distance detection device and method including dynamically adjusted frame rate Active US9170095B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/247,751 US9170095B1 (en) 2014-04-08 2014-04-08 Distance detection device and method including dynamically adjusted frame rate

Publications (2)

Publication Number Publication Date
US20150285623A1 (en) 2015-10-08
US9170095B1 US9170095B1 (en) 2015-10-27

Family

ID=54209501

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/247,751 Active US9170095B1 (en) 2014-04-08 2014-04-08 Distance detection device and method including dynamically adjusted frame rate

Country Status (1)

Country Link
US (1) US9170095B1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10605920B2 (en) 2016-01-13 2020-03-31 Ams Sensors Singapore Pte. Ltd. Power savings through refresh control for distance sensing devices

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5828407A (en) * 1995-06-30 1998-10-27 Canon Kabushiki Kaisha Method of controlling a solid-state image sensing device and image sensing apparatus adopting the method
US20090079960A1 (en) * 2007-09-24 2009-03-26 Laser Technology, Inc. Integrated still image, motion video and speed measurement system
US20110051119A1 (en) * 2009-09-01 2011-03-03 Dong Ki Min Delay compensation in modulated optical time-of-flight phase estimation

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4380412B2 (en) 2004-05-10 2009-12-09 株式会社デンソー Imaging control apparatus and program
JP5017989B2 (en) 2006-09-27 2012-09-05 ソニー株式会社 Imaging apparatus and imaging method

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11362657B2 (en) 2014-06-02 2022-06-14 Xyz Interactive Technologies Inc. Touch-less switching
US10749525B2 (en) * 2014-06-02 2020-08-18 Xyz Interactive Technologies Inc. Touch-less switching
US9633240B2 (en) * 2014-10-20 2017-04-25 Schneider Electric Industries Sas RFID reader/writer device
US20160110570A1 (en) * 2014-10-20 2016-04-21 Schneider Electric Industries Sas Rfid reader/writer device
US10794695B2 (en) * 2015-06-24 2020-10-06 Murata Manufacturing Co., Ltd. Range sensor
US20180045513A1 (en) * 2015-06-24 2018-02-15 Murata Manufacturing Co., Ltd. Range sensor
EP3361283A4 (en) * 2015-10-09 2018-10-24 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus and solid-state image element used for same
US10686994B2 (en) 2015-10-09 2020-06-16 Panasonic Intellectual Property Management Co., Ltd. Imaging device, and solid-state imaging element used for same
CN108139482A (en) * 2015-10-09 2018-06-08 松下知识产权经营株式会社 Photographic device and the solid-state imager used wherein
WO2017089159A1 (en) * 2015-11-23 2017-06-01 Ams Ag Sensor arrangement and method for determining time-of-flight
US10761197B2 (en) 2015-11-23 2020-09-01 Ams Ag Sensor arrangement and method for determining time-of-flight
US10557940B2 (en) 2015-11-30 2020-02-11 Luminar Technologies, Inc. Lidar system
US10520602B2 (en) 2015-11-30 2019-12-31 Luminar Technologies, Inc. Pulsed laser for lidar system
US11022689B2 (en) 2015-11-30 2021-06-01 Luminar, Llc Pulsed laser for lidar system
US10591600B2 (en) 2015-11-30 2020-03-17 Luminar Technologies, Inc. Lidar system with distributed laser and multiple sensor heads
US9958545B2 (en) * 2015-11-30 2018-05-01 Luminar Technologies, Inc. Lidar system
US10175067B2 (en) * 2015-12-09 2019-01-08 Pixart Imaging (Penang) Sdn. Bhd. Scheme for interrupt-based motion reporting
US20170167898A1 (en) * 2015-12-09 2017-06-15 Pixart Imaging (Penang) Sdn. Bhd. Scheme for interrupt-based motion reporting
CN106855751A (en) * 2015-12-09 2017-06-16 原相科技(槟城)有限公司 Obtain method, optical sensor and the optical mice of electronic installation movable information
US20180376045A1 (en) * 2015-12-16 2018-12-27 Gopro, Inc. Dynamic Synchronization of Frame Rate to a Detected Cadence in a Time Lapse Image Sequence
US10638047B2 (en) * 2015-12-16 2020-04-28 Gopro, Inc. Dynamic synchronization of frame rate to a detected cadence in a time lapse image sequence
US11560091B2 (en) * 2016-06-13 2023-01-24 Lg Electronics Inc. Night vision display device
US20190187290A1 (en) * 2016-06-13 2019-06-20 Lg Electronics Inc. Night vision display device
EP3471399A4 (en) * 2016-06-13 2020-02-19 LG Electronics Inc. -1- Night vision display device
US20170363740A1 (en) * 2016-06-17 2017-12-21 Kabushiki Kaisha Toshiba Distance measuring device
US10739456B2 (en) * 2016-06-17 2020-08-11 Kabushiki Kaisha Toshiba Distance measuring device
WO2018110183A1 (en) * 2016-12-14 2018-06-21 パナソニックIpマネジメント株式会社 Image capture control device, image capture control method, program, and recording medium
CN106908156A (en) * 2017-03-09 2017-06-30 郑州艾斯亚生物科技有限公司 A kind of high-speed pulse counting method and device
US10267898B2 (en) 2017-03-22 2019-04-23 Luminar Technologies, Inc. Scan patterns for lidar systems
US11686821B2 (en) 2017-03-22 2023-06-27 Luminar, Llc Scan patterns for lidar systems
WO2019012756A1 (en) * 2017-07-11 2019-01-17 ソニーセミコンダクタソリューションズ株式会社 Electronic device and method for controlling electronic device
US11585920B2 (en) * 2017-12-28 2023-02-21 Intel Corporation Vehicle sensor fusion
WO2021085125A1 (en) * 2019-10-28 2021-05-06 ソニーセミコンダクタソリューションズ株式会社 Ranging system, drive method, and electronic device
WO2021145212A1 (en) * 2020-01-14 2021-07-22 ソニーセミコンダクタソリューションズ株式会社 Distance measurement sensor, distance measurement system, and electronic apparatus
US20220382378A1 (en) * 2021-05-28 2022-12-01 Pixart Imaging Inc. Method for identifying object, optical sensing apparatus and system
US11662828B2 (en) * 2021-05-28 2023-05-30 Pixart Imaging Inc. Method for identifying object, optical sensing apparatus and system

Also Published As

Publication number Publication date
US9170095B1 (en) 2015-10-27

Similar Documents

Publication Publication Date Title
US9170095B1 (en) Distance detection device and method including dynamically adjusted frame rate
US10313570B2 (en) Dynamic conservation of imaging power
US11448757B2 (en) Distance measuring device
CN109791195B (en) Adaptive transmit power control for optical access
EP3182156A1 (en) Ranging apparatus
JP2018136123A (en) Distance sensor and user interface apparatus
CN113556528A (en) Method and system for capturing video imagery under low light conditions using light emission by a depth sensing camera
US9435646B2 (en) Displacement detection device and operating method thereof
CN109196662A (en) Optical detection device and electronic equipment
US20130335576A1 (en) Dynamic adaptation of imaging parameters
WO2020095603A1 (en) Light emission control device, and light emission control method
US9791935B2 (en) Method for gesture detection, optical sensor circuit, in particular an optical sensor circuit for gesture detection, and optical sensor arrangement for gesture detection
EP2982940A1 (en) Angle detection device and survey instrument including the same
US20210013257A1 (en) Pixel circuit and method of operating the same in an always-on mode
US9313376B1 (en) Dynamic depth power equalization
JP2017190978A (en) Distance sensor, electronic device, and manufacturing method for electronic device
CN110266394A (en) Adjusting method, terminal and computer readable storage medium
US9727148B2 (en) Navigation device and image display system with inertial mode
US10628951B2 (en) Distance measurement system applicable to different reflecting surfaces and computer system
US20130175429A1 (en) Image sensor, image sensing method, and image capturing apparatus including the image sensor
US11579291B2 (en) Optical ranging system having multi-mode operation using short and long pulses
WO2022242348A1 (en) Dtof depth image acquisition method and apparatus, electronic device, and medium
US10036810B2 (en) Non-contact optical sensing device and method for sensing depth of an object in three-dimensional space
CN115201840A (en) Low power LiDAR system with intelligent laser interrogation
US20210271334A1 (en) Navigation system, navigation device and frame rate adjusting method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TACHIBANA, MAKOTO;REEL/FRAME:032993/0923

Effective date: 20140512

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8