CN106537184B - Apparatus, system and method for real-time tracking of objects - Google Patents


Info

Publication number
CN106537184B
CN106537184B (application CN201580038606.7A)
Authority
CN
China
Prior art keywords
distance
points
time
laser
rotation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201580038606.7A
Other languages
Chinese (zh)
Other versions
CN106537184A (en)
Inventor
Richard Sebastian
Kendall Belsley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
DSCG Solutions Inc
Original Assignee
DSCG Solutions Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DSCG Solutions Inc filed Critical DSCG Solutions Inc
Publication of CN106537184A publication Critical patent/CN106537184A/en
Application granted granted Critical
Publication of CN106537184B publication Critical patent/CN106537184B/en

Classifications

    • G01S 17/08: Systems determining position data of a target; for measuring distance only
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/58: Velocity or trajectory determination systems; sense-of-movement determination systems
    • G01S 17/875: Combinations of systems using electromagnetic waves other than radio waves, for determining attitude
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 7/4913: Receivers; circuits for detection, sampling, integration or read-out
    • G06V 40/166: Human face detection, localisation or normalisation using acquisition arrangements
    • G06V 40/167: Human face detection, localisation or normalisation using comparisons between temporally consecutive images
    • G01S 17/66: Tracking systems using electromagnetic waves other than radio waves

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

In one general aspect, a system for determining motion of an object includes a laser system configured to generate distance and velocity measurements of a plurality of points on the object and a processor. The processor is configured to determine a rotation of the object from the distance and velocity measurements of the plurality of points on the object. In some aspects, the processor is also configured to determine a distance moved by the object between a first time and a second time from the distance and velocity measurements of the plurality of points on the object and the rotation of the object.

Description

Apparatus, system and method for real-time tracking of objects
Related application
This application claims priority to, and is a continuation of, U.S. Application No. 14/716,467, entitled "Devices, Systems, and Methods for Real Time Tracking of an Object," filed May 19, 2015, which claims priority to and the benefit of U.S. Provisional Application No. 62/001,544, entitled "Devices, Systems, and Methods for Tracking an Object," filed May 21, 2014, and U.S. Provisional Application No. 62/030,988, entitled "Devices, Systems, and Methods for Real-Time Tracking of an Object," filed July 30, 2014, the entire disclosures of which are incorporated herein by reference.
Technical Field
This description relates to systems and methods for tracking objects.
Background
In some known systems, a light detection and ranging (LIDAR) system may be used in conjunction with a video system to track objects. Some such known systems may be complex and difficult to use. Additionally, in some such known systems, the video system may require light to detect the object to be tracked. Accordingly, there is a need for systems, methods, and apparatus that address the deficiencies of the current technology and provide other novel and innovative features.
Disclosure of Invention
In one general aspect, a system for determining motion of an object includes a laser system configured to generate distance and velocity measurements of a plurality of points on the object and a processor. The processor is configured to determine a rotation of the object from the distance and velocity measurements of the plurality of points on the object. In some aspects, the processor is also configured to determine a distance and a direction moved by the object between a first time and a second time from the distance and velocity measurements of the plurality of points on the object and the rotation of the object. In some implementations, the processor is configured to determine, from the distance and velocity measurements of the plurality of points on the object, a distance and a direction moved by the object between a first time and a second time in a direction orthogonal to a laser beam emitted by the laser system. In some embodiments, the object is an object having a rigid body.
In another general aspect, a non-transitory computer-readable storage medium stores instructions that, when executed, cause one or more processors to perform a process, the process comprising: generating distance and velocity measurements of a plurality of points on an object; and determining a rotation of the object from the distance and velocity measurements of the plurality of points on the object. In some implementations, the process includes determining a distance and a direction moved by the object between the first time and the second time.
In another general aspect, a method includes: generating distance and velocity measurements of a plurality of points on an object; and determining a rotation of the object from the distance and velocity measurements of the plurality of points on the object. In some implementations, the method includes determining a distance and a direction moved by the object between the first time and the second time.
Drawings
Fig. 1 is a schematic diagram illustrating a LIDAR system, according to an embodiment.
Fig. 2 is a schematic diagram illustrating an example processor, according to an embodiment.
Fig. 3 and 4 illustrate objects that may be tracked according to an implementation.
Fig. 5-7 illustrate another object that may be tracked according to an embodiment.
Fig. 8 is a flow diagram illustrating a method of tracking an object according to an embodiment.
Detailed Description
Fig. 1 is a diagram schematically illustrating a LIDAR system 100, according to an aspect of the present invention. The LIDAR system 100 includes a laser system 105 and an analysis module 130. The laser system 105 includes a laser group 110 and a receiver group 120.
The LIDAR system 100 is configured to track an object 10. For example, in some implementations, the LIDAR system 100 is configured to track the object 10 from time T1 to time T2. In some implementations, the LIDAR system 100 is configured to determine a rotation of the object 10 about an axis between time T1 and time T2. In some implementations, the LIDAR system 100 is configured to determine rotations of the object 10 about at least two different axes. For example, in some implementations, the LIDAR system 100 is configured to determine rotations of the object between time T1 and time T2 about at least two different axes that are perpendicular or orthogonal to each other. In some implementations, the LIDAR system 100 is configured to determine a rotation of an object in the dark (or when the object is not illuminated by a light source such that it would be visible to the human eye). For example, in some implementations, the LIDAR system 100 is configured to determine a rotation of an object when the object is in a light field of less than 10 lumens.
In some implementations, the LIDAR system 100 is also configured to determine movement of the object 10 between time T1 and time T2. For example, in some implementations, the LIDAR system 100 is configured to determine movement of the object 10 within a plane, such as the x-y plane, between time T1 and time T2. In some implementations, the LIDAR system 100 is configured to determine this movement in the dark (or when the object is not illuminated by a light source such that it would be visible to the human eye).
The object 10 may have any shape or form. For example, in some embodiments, object 10 is a rigid solid object. In some embodiments, object 10 is a human subject or individual or a portion of a body of a human subject or individual (e.g., a head or face of the human subject or individual). In some embodiments, object 10 may be referred to as a target or as a target object.
The LIDAR system 100 is configured to generate or measure a distance (or distance estimate) and/or a velocity (or velocity estimate) of an object 10 (which may be stationary or moving relative to the LIDAR system 100) using the laser system 105 and the analysis module 130. For example, in some embodiments, the generated or measured velocity is a velocity in the direction of the beam of radiation (as described in more detail below). In other words, the measured velocity is the velocity of the object toward or away from the LIDAR system 100. In some implementations, the distance may be a distance estimate and the velocity may be a velocity estimate. In some implementations, the distance may be an accurate distance estimate and the velocity may be an accurate velocity estimate. In some implementations, the LIDAR system 100 is configured to produce accurate distance estimates and/or accurate velocity estimates despite the presence of, for example, multipath effects associated with electromagnetic radiation from the lasers of laser group 110 and/or other interference that may occur during measurements.
In some implementations, the LIDAR system 100 is configured to generate or measure distances and/or velocities for a plurality of different points on the object 10 using the laser system 105 and the analysis module 130. For example, in the illustrated implementation, the LIDAR system 100 is configured to generate or measure distances and/or velocities of five points (or positions) 11, 12, 13, 14, and 15 on the object 10. In other implementations, the LIDAR system 100 is configured to generate or measure distances and/or velocities of more than five points on the object at any given time. For example, the LIDAR system 100 may be configured to generate or measure distances and/or velocities of sixteen points or more than sixteen points on an object.
The laser system 105 of the LIDAR system 100 includes a group of lasers 110. In the illustrated implementation, the set of lasers 110 is configured to emit or direct laser beams 111A, 112A, 113A, 114A, and 115A. In other implementations, the set of lasers 110 is configured to emit or direct fewer than 5 laser beams. For example, in one implementation, the set of lasers 110 is configured to emit or direct 4 laser beams. In yet other implementations, the set of lasers 110 is configured to emit between 4 and 16 laser beams. In further implementations, the set of lasers is configured to emit or direct more than 16 laser beams.
In the illustrated implementation, the set of lasers 110 includes lasers 111, 112, 113, 114, and 115 to emit or direct the laser beams. In other implementations, a single laser may be used to emit or direct the laser beams 111A, 112A, 113A, 114A, and 115A. In other embodiments, the laser group 110 includes more or fewer than five lasers. For example, in some embodiments, the laser group 110 includes at least 5 lasers. In other embodiments, the laser group 110 includes at least 4 lasers. In other implementations, the laser group 110 includes between 5 and 16 lasers. In other implementations, the laser group 110 includes between 4 and 16 lasers. In yet other implementations, the group 110 includes more than 16 lasers.
Each of lasers 111, 112, 113, 114, and 115 is configured to emit (e.g., generate, propagate) electromagnetic radiation (which may be, for example, a coherent light emission (e.g., a monochromatic light emission) or a light beam) at one or more frequencies. In some implementations, a laser can be configured to emit (e.g., generate, propagate) a plurality of coherent light emissions (e.g., monochromatic light emissions) or light beams. The emission from the laser may be referred to as electromagnetic radiation emission, as emitted electromagnetic radiation or as transmitted electromagnetic radiation.
In particular, each of the lasers of laser system 105 is configured to emit (e.g., generate, propagate) a coherent light emission (e.g., a monochromatic light emission) or beam of light from LIDAR system 100 toward a point on object 10. In some implementations, each of the lasers of the laser system 105 is configured to emit a beam toward a different point on the object 10. In some implementations, the lasers of the laser system 105 are configured to emit or direct more than one beam of light toward the object 10. For example, a single laser may be used to emit or direct multiple (e.g., 4, 5, or more than 5) beams of light toward different points on object 10.
In the illustrated implementation, the laser 111 is configured to emit a beam of light or electromagnetic radiation 111A toward a point 11 on the object 10. Laser 112 is configured to emit a beam of light or electromagnetic radiation 112A toward a point 12 on object 10. The laser 113 is configured to emit a beam of light or electromagnetic radiation 113A toward a point 13 on the object 10. Laser 114 is configured to emit a beam of light or electromagnetic radiation 114A toward a point 14 on object 10. Laser 115 is configured to emit a beam 115A of light or electromagnetic radiation toward a point 15 on object 10.
The LIDAR system 100 may be any type of system configured to detect the distance and velocity of an object.
The laser system 105 of the LIDAR system 100 includes a receiver group 120. In the illustrated implementation, receiver group 120 includes receivers 121, 122, 123, 124, and 125. In other implementations, the receiver group 120 includes more or fewer than five receivers. For example, in some embodiments, the receiver group 120 includes at least 5 receivers. In other implementations, the receiver group 120 includes between 5 and 16 receivers. In yet other implementations, the receiver group 120 includes more than 16 receivers. In some implementations, the receiver group 120 includes a receiver for each laser in the laser group 110. In some implementations, the receiver group 120 includes a receiver for each laser beam emitted by the laser group 110. In some implementations, the receiver group 120 includes a receiver for each laser beam emitted by each laser of the laser group 110. In some implementations, the receiver group 120 includes a receiver positioned for each point or measurement on the object 10 being observed.
Each of receivers 121, 122, 123, 124, and 125 is configured to receive electromagnetic radiation reflected from object 10 (which may also be referred to as reflected electromagnetic radiation) in response to electromagnetic radiation emitted from the laser toward object 10. For example, in the illustrated implementation, receiver 121 is configured to receive electromagnetic radiation 111B reflected from point 11 of object 10. Receiver 122 is configured to receive beam of electromagnetic radiation 112B reflected from spot 12 of object 10. The receiver 123 is configured to receive the beam of electromagnetic radiation 113B reflected from the point 13 of the object 10. Receiver 124 is configured to receive beam of electromagnetic radiation 114B reflected from point 14 of object 10. Receiver 125 is configured to receive beam of electromagnetic radiation 115B reflected from point 15 of object 10.
The analysis module 130 of the LIDAR system 100 is configured to analyze a combination of emitted electromagnetic radiation (e.g., the emitted electromagnetic radiation beams 111A-115A) from each of the lasers and reflected electromagnetic radiation (e.g., the reflected electromagnetic radiation 111B-115B) received by each of the receivers. The emitted electromagnetic radiation may be emitted according to a pattern that includes an up-chirp followed by a down-chirp (or a down-chirp followed by an up-chirp). The combination of the frequency of the emitted electromagnetic radiation from each of the lasers and the frequency of the reflected electromagnetic radiation received by the receiver may be analyzed by the analysis module 130 to determine the distance (distance from the LIDAR system) and velocity of each observed point of the object 10. Specifically, in the illustrated implementation, the LIDAR system 100 is configured to determine a distance and/or a velocity of each of the points 11, 12, 13, 14, and 15 of the object 10 from a first time T1 to a second time T2.
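For illustration only (this sketch is not part of the original disclosure; the chirp rate, wavelength, beat frequencies, and sign conventions are assumed example values), the up-chirp/down-chirp arithmetic can be written out in a few lines: averaging the two beat frequencies cancels the Doppler term and isolates range, while differencing them isolates radial velocity.

```python
# Hypothetical FMCW up/down-chirp processing sketch (assumed parameters).
C = 3.0e8             # speed of light, m/s
WAVELENGTH = 1.55e-6  # assumed laser wavelength, m
CHIRP_RATE = 1.0e12   # assumed chirp slope, Hz/s

def range_and_velocity(f_beat_up: float, f_beat_down: float):
    """Recover distance (m) and radial velocity (m/s) from beat frequencies (Hz)."""
    f_range = 0.5 * (f_beat_up + f_beat_down)    # Doppler term cancels
    f_doppler = 0.5 * (f_beat_down - f_beat_up)  # range term cancels
    distance = C * f_range / (2.0 * CHIRP_RATE)
    velocity = f_doppler * WAVELENGTH / 2.0      # sign convention assumed
    return distance, velocity

# Example: beat frequencies for a target about 15 m away, slowly approaching.
d, v = range_and_velocity(f_beat_up=99.0e3, f_beat_down=101.0e3)
print(f"distance = {d:.2f} m, radial velocity = {v:.6f} m/s")
```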
In some implementations, the LIDAR system 100 is configured to track, observe, or otherwise monitor each point 11, 12, 13, 14, and 15 on the object 10 approximately 100 times per second. In such embodiments, the time difference between T1 and T2 is about 0.01 seconds. In other implementations, the LIDAR system 100 is configured to track or observe each point more frequently than 100 times per second (e.g., 1000 times per second or more). In some implementations, the LIDAR system 100 is configured to track or observe each point less than 100 times per second.
As will be discussed in more detail below, the analysis module 130 is also configured to determine the rotation of the object and the distance and direction moved by the object between time T1 and time T2.
Fig. 2 is a schematic diagram of the analysis module 130. The analysis module 130 includes an image module 132, a comparison module 134, a rotation module 136, and a distance module 138. The image module 132 is configured to acquire a three-dimensional image of the object 10. In some cases, object 10 is a known object or subject, such as a known human. In such cases, the object 10 has a known three-dimensional structure, and the image module 132 may retrieve the structure from a database, memory, or from any other storage or memory device 139. In some implementations, the three-dimensional structure of the object can be provided to the image module 132 from a database, memory, or other storage or memory device 139. In some embodiments, the database, memory, or other storage is local to the analysis module 130. In other implementations, the three-dimensional structure may be received by the analysis module 130 from a remote storage device (e.g., via the internet or an internal network).
In some cases, object 10 does not have a known three-dimensional structure. In such cases, the image module 132 is configured to generate a three-dimensional structure of the object using data received from the laser system 105. For example, image module 132 may generate a three-dimensional structure of object 10 (or a three-dimensional structure of a portion of object 10) using distance data generated by, for example, laser system 105.
The comparison module 134 is configured to determine the distance and/or velocity of the object 10. More specifically, the comparison module 134 is configured to determine distances and/or velocities of a plurality of points (e.g., 11, 12, 13, 14, and 15) of the object 10. As described above, in one implementation, comparison module 134 is configured to analyze a combination of emitted electromagnetic radiation from each of the lasers and reflected electromagnetic radiation received by each of the receivers to determine a distance and/or velocity of points 11, 12, 13, 14, and 15 of object 10 from first time T1 to second time T2.
The rotation module 136 is configured to determine a rotation of the object 10. In some implementations, rotation module 136 is configured to determine rotation of object 10 about more than one axis. For example, in some implementations, rotation module 136 is configured to determine rotation of object 10 about two axes that are not parallel (e.g., orthogonal) to each other. For example, in one implementation, the laser system is configured to emit radiation along an axis (Z-axis) toward the object 10, and the rotation module 136 is configured to determine rotation of the object about a first axis (X-axis) orthogonal to the Z-axis and a second axis (Y-axis) orthogonal to the Z-axis. In some implementations, the rotation module 136 is configured to determine an amount of rotation of the object between the first time T1 and the second time T2.
In some embodiments, for a rigid solid object, each velocity-field component in a Cartesian direction varies linearly in the spatial coordinates orthogonal to that direction. In addition, a component does not vary along its own spatial direction. For example, consider the velocity component in the z direction, Vz. At any given time there can be no change in Vz along the z direction; otherwise the object would stretch, violating the definition of a rigid solid body. If a trigonometric/vector analysis of the z motion caused by the rotational components Wx, Wy, and Wz is carried out, it can be seen that, for each rotational component, the motion Vz can be described by a linear equation:
Vz(x, y) = A*x + B*y + C,
where A, B, and C are constants at a given time, with:
A = -Wy,
B = Wx, and
C depending on the location of the origin of the coordinate system.
Wz does not impart a z-component of velocity.
Thus, at a given time, if the velocity Vz is measured at several (x, y) locations (e.g., several points on the object 10), the values Wx and Wy and the constant translational velocity C = Vz0 may be solved for using a set of linear equations. In some embodiments, there are sufficiently many spatial (x, y) points that the linear system is substantially overdetermined.
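To make the overdetermined system concrete, here is a minimal least-squares sketch (illustrative only; function names and values are not from the patent):

```python
import numpy as np

def solve_rotation_rates(x, y, vz):
    """Fit Vz(x, y) = A*x + B*y + C and return (Wx, Wy, Vz0).

    Uses A = -Wy, B = Wx, C = Vz0 from the text above. With more than
    three (x, y) points the system is overdetermined and least squares
    averages down measurement noise.
    """
    M = np.column_stack([x, y, np.ones_like(x)])
    (A, B, C), *_ = np.linalg.lstsq(M, vz, rcond=None)
    return B, -A, C

# Synthetic check with Wx = 0.02 rad/s, Wy = -0.01 rad/s, Vz0 = 0.5 m/s:
rng = np.random.default_rng(0)
x = rng.uniform(-0.1, 0.1, 16)
y = rng.uniform(-0.1, 0.1, 16)
vz = 0.01 * x + 0.02 * y + 0.5 + rng.normal(0.0, 1e-4, 16)
print(solve_rotation_rates(x, y, vz))  # approximately (0.02, -0.01, 0.5)
```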
The distance module 138 is configured to determine the distance the object 10 has traveled in the x-y plane. For example, in some implementations, the distance module 138 is configured to determine the distance that the object 10 has traveled in an x-y plane orthogonal to the z-axis (the axis of the radiation beam of the laser system 105) between time T1 and time T2.
In some implementations, where the orientation of the object is known, the slopes dz/dx and dz/dy are considered to vary as a function of (x, y) position on the object. An array of LIDAR distance values (as determined by the laser system 105) may be used to determine the slope pairs (dz/dx, dz/dy) at several points (x, y). For example, in some implementations, the slope and/or curvature of the surface of the object may be determined in each of the x-direction and the y-direction to obtain a slope and/or curvature gradient. For some surfaces, the object orientation information plus the slope pairs uniquely determine the location on the surface of the object. For example, for a complex surface (e.g., a person or individual's face), the slope pairs will likely determine a unique location in a local region (even though the same slope pairs may be found more than once across the face). In some implementations, multiple slope pairs will redundantly estimate position and may be used to reduce error in position estimation from noisy distance data.
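As an illustrative sketch only (a regular grid of beam positions is assumed, which the patent does not require), slope pairs can be computed from an array of distance values:

```python
import numpy as np

def slope_pairs(z, dx, dy):
    """Return (dz/dx, dz/dy) at each sample of a distance grid z[iy, ix].

    dx, dy are beam spacings in meters. np.gradient uses central
    differences in the interior, so closely spaced beams give better slopes.
    """
    dz_dy, dz_dx = np.gradient(z, dy, dx)
    return dz_dx, dz_dy

# Example: a tilted plane z = 0.3*x - 0.1*y recovers its slopes exactly.
x = np.arange(5) * 0.01   # 1 cm beam spacing in x
y = np.arange(4) * 0.01   # 1 cm beam spacing in y
X, Y = np.meshgrid(x, y)
sx, sy = slope_pairs(0.3 * X - 0.1 * Y, dx=0.01, dy=0.01)
print(sx[0, 0], sy[0, 0])  # 0.3, -0.1
```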
Where the absolute LIDAR beam position on the object is estimated from the slope pair and the current rotated object model is available, the distance module 138 may determine a change in position of the object 10. For example, the rotation (as determined or calculated) of the object may be reversed, removed, or undone (such that the object returns to its original orientation). A determination may also be made of the change in beam position required to restore the beam to its desired position. The position location data for the object may then be used to determine a translation speed (dx/dt, dy/dt). In some implementations, with the target rotation and translation speeds known, beam repositioning can be done so that beam motion is smooth and beam positioning approaches a desired position at a future point in time.
In some implementations, a relatively small number of beam spots may be required to maintain position without scanning. In some implementations, the position may be maintained without scanning and the vibration may be monitored using a LIDAR system.
Components (e.g., modules, processors (e.g., processors defined within a substrate such as a silicon substrate)) (e.g., analysis module 130) of LIDAR system 100 may be configured to operate based on one or more platforms (e.g., one or more similar or different platforms) that may include one or more types of hardware, software, firmware, operating systems, runtime libraries, and/or the like. In some implementations, the components of the LIDAR system 100 may be configured to operate within a cluster of devices (e.g., a server cluster).
In some implementations, one or more portions of the components shown in the LIDAR system 100 in fig. 1 and/or 2 may be or may include a hardware-based module (e.g., a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a memory), a firmware module, and/or a software-based module (e.g., a module of computer program code, a set of computer-readable instructions executable at a computer). For example, in some implementations, one or more portions of the LIDAR system 100 may be or may include software modules configured for execution by at least one processor (not shown). In some implementations, the functionality of the components may be included in different modules and/or components than those shown in fig. 1 and/or 2.
In some implementations, one or more of the components of the LIDAR system 100 may be or may include a processor configured to process instructions stored in a memory. For example, analysis module 130 (and/or portions thereof) may be a combination of a processor and memory configured to execute instructions related to a process to perform one or more functions.
Although not shown, in some implementations, components of the LIDAR system 100 (or portions thereof) may be configured to operate within, for example, a data center (e.g., a cloud computing environment), a computer system, one or more server/host devices, and/or the like. In some implementations, components of the LIDAR system 100 (or portions thereof) may be configured to operate within a network. Accordingly, LIDAR system 100 (or portions thereof) may be configured to operate within multiple types of network environments that may include one or more devices and/or one or more server devices. For example, the network may be or include a Local Area Network (LAN), a Wide Area Network (WAN) and/or the like. The network may be or may include a wireless network and/or a wireless network implemented using, for example, gateway devices, bridges, switches, and/or the like. The network may include one or more segments and/or may have portions based on multiple protocols, such as Internet Protocol (IP) and/or proprietary protocols. The network may comprise at least part of the internet.
In some implementations, the LIDAR system 100 may include a memory. The memory may be any type of memory such as random access memory, disk drive memory, flash memory, and/or the like. In some implementations, the memory can be implemented as more than one memory component (e.g., more than one RAM component or disk drive memory) associated with components of the LIDAR system 100.
As best illustrated in fig. 3 and 4, in one implementation, an object 20 (e.g., a target) may be observed by the LIDAR system 100. The object 20 may have any shape, but is represented as a circle in fig. 3 and 4. In fig. 3, at time T1, a point 22 on the object 20 is observed by the LIDAR system 100. At time T1, point 22 is located at (3,3) in the x-y plane. As illustrated in fig. 4, at time T2, point 22 is located at (4,3) in the x-y plane. The movement of the point may be the result of different types of movement of the object 20. For example, object 20 may have moved from one location to another (translational motion), or object 20 may have rotated (e.g., about an axis parallel to the y-axis of the x-y plane).
As illustrated in fig. 5, 6, and 7, an individual's head or face 30 may be tracked or observed by the LIDAR system 100. Specifically, a point or location 32 on the head or face 30 may be observed. As illustrated in fig. 5, at time T1, point 32 is located at (3,2) in the x-y plane. At time T2, point 32 can be observed to be at (4,2). The movement of the point may be the result of different types of motion. For example, the person or individual may have rotated their head (e.g., about an axis parallel to the y-axis), as illustrated in fig. 6. Alternatively, the person or individual may have moved their head (without any rotation), as illustrated in fig. 7.
As described above, the rotation module 136 is configured to determine the rotation of the object by observing the distance and speed of a number of points on the object. Once the rotation of the object is known, the distance module 138 is configured to determine the distance that the object has moved in the x-y plane, as described above. Thus, in one implementation, the LIDAR system 100 is configured to determine whether the face or head of the person is oriented as illustrated in fig. 6 or as illustrated in fig. 7 at time T2.
Fig. 8 is a flow diagram of a method 800 according to an embodiment of the invention. The method 800 may be used to determine the rotation and/or motion (or distance moved) of an object between a first time T1 and a second time T2. The method 800 may be used multiple times in succession to determine the rotation or movement of an object over a long period of time. For example, the method may be used hundreds of times per second to track or monitor objects.
At 810, a three-dimensional image of an object is acquired. For example, in some embodiments, an image module (e.g., image module 132) may obtain a three-dimensional image of an object (if the object is a known object) from a database or other resource. In other embodiments, the image module 132 may acquire a three-dimensional image of the object (or portion of the object) by using the scanning information from the laser system 105 to develop a three-dimensional image of the object (or portion of the object).
At 820, a plurality of points on the object are observed at a first time T1. For example, in some implementations, the laser system 105 can observe and detect the distance and/or velocity of each of a plurality of points on the object. For example, in some embodiments, five points on the object are observed at any given time. In other embodiments, more than five points are observed. For example, between 5 and 16 points may be observed. In other embodiments, more than 16 points are observed.
As discussed above, the laser system 105 can observe the plurality of points on the object by emitting a beam of radiation and receiving a reflection of such radiation by each of the plurality of points. As discussed above, the comparison of the emitted radiation and the reflected radiation may provide the distance and/or velocity of the object in the z-direction (the direction of the radiation beam).
At 830, the plurality of points (or points located substantially at the same location as the plurality of points) are observed at a second time T2. In some embodiments, the same point on the object is identified by analysis or comparison of three-dimensional images (known or system developed three-dimensional images). The second time T2 is different from the first time T1. In some embodiments, the second time T2 is later in time than the first time T1. The plurality of points may be observed by the laser system 105, as described above.
At 840, a rotation of the object between time T1 and time T2 is determined. For example, in some implementations, the rotation module 136 may be configured to analyze distance and speed information and determine a rotation of the object, as described above. In some embodiments, the rotation of the object about one axis is determined. In some embodiments, the rotation of the object about at least two axes orthogonal to each other is determined. In some embodiments, the rotation of the object about two axes orthogonal to the z-axis (the direction or axis of the radiation beam) is determined.
At 850, a distance and/or direction moved by the object between time T1 and time T2 is determined. For example, in some embodiments, the motion of an object in an x-y plane orthogonal to the z-axis (the axis of the radiation beam) is determined. As described above, the distance module 138 may determine the motion or distance moved by the object in the x-y plane by rotation determination and the slope of the portion of the object being viewed. In particular, unique portions of the observed object may be identified by the slope or slope analysis of the object. The location of the portion of the object may be specified. The location of the point or location along with the rotation data may result in a determination of the motion of the object in the x-y plane. In some implementations, the determined or observed rotation can be reversed or removed such that the object is disposed at T2 in the same orientation as it was at T1. Then, unique identification points on the object can be identified and the distance such points have moved in the x-y plane (e.g., distance in the x-direction and distance in the y-direction) can be determined. In some embodiments, closely spaced LIDAR beams with repeated scanning cycles may be used to detect target motion normal to the beam direction. If the beams are close enough, the distance to the surface can be expressed approximately as a linear function of the distance between the beams. As discussed in detail above, the motion due to Vx, Vy, and Wz can be determined using multi-point laser light detection and ranging information to correct the position of the object or target 10 for Vz (velocity in the z direction), Wx (rotation about the x-axis), and Wy (rotation about the y-axis) motion. As discussed in detail below, each of the motions may be determined separately.
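The derotate-then-difference step can be sketched as follows (illustrative only; it assumes matched points across frames, small rotation angles, and a coordinate origin at the rotation center, none of which the patent specifies):

```python
import numpy as np

def rot_xy(wx, wy, dt):
    """Rotation matrix for rates wx, wy (rad/s) applied over time dt (s)."""
    ax, ay = wx * dt, wy * dt
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(ax), -np.sin(ax)],
                   [0, np.sin(ax), np.cos(ax)]])
    Ry = np.array([[np.cos(ay), 0, np.sin(ay)],
                   [0, 1, 0],
                   [-np.sin(ay), 0, np.cos(ay)]])
    return Ry @ Rx

def translation_in_xy(pts_t1, pts_t2, wx, wy, dt):
    """Undo the measured rotation of pts_t2, then read off the x-y shift."""
    R = rot_xy(wx, wy, dt)
    # For row-vector points p, the forward rotation is p @ R.T, so p @ R undoes it.
    derotated = pts_t2 @ R
    shift = derotated.mean(axis=0) - pts_t1.mean(axis=0)
    return shift[0], shift[1]

# Synthetic check: rotate 25 points by (wx, wy) = (0.2, -0.1) rad/s for 10 ms,
# translate by (0.004, -0.002) m, and recover the translation.
rng = np.random.default_rng(2)
p1 = rng.normal(scale=0.05, size=(25, 3))
p2 = p1 @ rot_xy(0.2, -0.1, 0.01).T + np.array([0.004, -0.002, 0.0])
print(translation_in_xy(p1, p2, 0.2, -0.1, 0.01))  # about (0.004, -0.002)
```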
In some implementations, the LIDAR system includes a laser system that includes a laser or laser beam configured to move in a pattern or patterns relative to the tracked object. For example, in some implementations, the laser system 105 of the LIDAR system 100 includes multiple lasers or beams configured to move in a pattern or patterns relative to the tracked object.
For example, in some implementations, the LIDAR system 100 may have one mode in which the laser beam is fixed or stationary and a second mode in which the laser beam moves in a certain pattern or patterns, such as a shape. In some implementations, when the LIDAR system is in the second mode, the two or more laser beams move in a certain pattern or patterns. In some embodiments, different laser beams may be moved independently in different modes.
In other implementations, the LIDAR system 100 includes some lasers, or generates some laser beams, that are stationary, and some laser beams that are configured to move in a certain pattern (or patterns) or shape.
The laser or light beam may be moved in any pattern or shape. For example, in some implementations, the laser or beam is configured to move in an elliptical shape. In other implementations, the laser or light beam is configured to move in a line, circle, square, rectangle, triangle, or any other shape. In some embodiments, the shape or pattern in which the laser or light beam moves is indicated or determined by the tracked object. For example, in some embodiments, the pattern or shape of the laser movement may be similar to the shape of the tracked object. For example, an elliptical shape or pattern may be used when tracking an individual's face because the shape of the individual's face is substantially elliptical. Additionally, in some implementations, the laser or beam is configured to move with or follow the tracked object. In such implementations, the laser or light beam may be directed in a slightly different direction to follow or account for the movement of the tracked object.
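As an illustrative sketch (parameter values are assumed, not from the patent), an elliptical pattern can be parameterized so that the segments of the cycle where beam motion is mostly in y (used below to estimate vx) and mostly in x (used to estimate vy) are explicit:

```python
import numpy as np

def elliptical_scan(t, cx, cy, rx, ry, cycle_s):
    """Beam pointing (x, y) at time t for an elliptical scan pattern.

    (cx, cy): pattern center, e.g. the tracked face's estimated center.
    rx, ry: semi-axes; an ellipse roughly matches a face's outline.
    cycle_s: duration of one full scan cycle (Tcycle in the text below).
    """
    phase = 2.0 * np.pi * t / cycle_s
    return cx + rx * np.cos(phase), cy + ry * np.sin(phase)

# One 10 ms cycle sampled at 100 points. Near phase 0 and pi the beam motion
# is mostly in y; near pi/2 and 3*pi/2 it is mostly in x.
t = np.linspace(0.0, 0.01, 100)
xs, ys = elliptical_scan(t, cx=0.0, cy=0.0, rx=0.08, ry=0.11, cycle_s=0.01)
```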
In some implementations, the analysis module 130 (e.g., the distance module 138 of the analysis module 130) is configured to determine a distance and/or velocity of movement of the object 10. For example, analysis module 130 may determine or calculate a distance moved by the object in a direction normal or orthogonal to the direction of laser beam motion.
In an implementation, while the laser beam moves in one direction along its pattern or shape, the analysis module 130 is configured to detect motion (distance and/or velocity) of the object 10 in a direction parallel to the direction of laser beam movement or a direction perpendicular to the direction of beam movement. In some implementations, the analysis module 130 is also configured to detect or determine rotation of the object about an axis parallel to the laser beam (Z direction).
In some implementations, the laser beams moving along the pattern or shape are disposed on or hit the target at a very small distance from each other. In other words, the beams are closely spaced. In some embodiments, the beams are less than a few centimeters from each other. In other embodiments, the beams are less than a few millimeters from each other.
The velocity of the object 10 in the x-direction (Vx, also written vx) can be determined as follows. The multiple points measured during the time period are used to calculate and remove object motion in z, Wx, and Wy, as described above. A pair of LIDAR (or laser) beams move in a repetitive scan pattern (e.g., an ellipse), with scan cycles indexed by (k); there will be a portion of the scan cycle where the motion of the beams is primarily in the y-direction. In some implementations, both beams share the same pointing device, so they maintain approximately the same lateral separation Δx at a given distance and have approximately the same vertical y position at a given time (y1 ≈ y2 ≈ y). The index (j) is used to sample the distance to the target and the lateral position of the beams. For each of the two beams, the series of measurement points (x1(j), y1(j), z1(j)) and (x2(j), y2(j), z2(j)) samples the surface z(x, y). If the slope of the target surface is approximately linear between the beams, then the slope dz/dx measured at each y is:
dz/dx(j) = dz/dx(y(j)) = (z2(j) - z1(j)) / (x2(j) - x1(j)) = (z2(j) - z1(j)) / Δx.
In a subsequent scan cycle, the beam revisits each y-position at approximately the same delay Tcycle after its previous visit to that y-position. Between repeated scan cycles, the target may have moved in the x-direction. If the target moves a distance δx = vx * Tcycle during the cycle, then:
z1(y, k+1) = z1(y, k) - dz/dx(y, k) * δx, or
[z1(y, k+1) - z1(y, k)] * [x2(y, k) - x1(y, k)] = -[z2(y, k) - z1(y, k)] * δx.
At each delay Tcycle, there is an estimated error ex for each sample (j) in cycle (k):
ex(j) = [z1(y(m), k+1) - z1(y(j), k)] * [x2(y(j), k) - x1(y(j), k)] + [z2(y(j), k) - z1(y(j), k)] * δx,
where y(m) is the sample in cycle (k+1) having the y value closest to y(j) in cycle (k). In some embodiments, it is desirable to minimize the total error Ex = sum(ex(j)). Thus, in some cases, the value of δx corresponding to or associated with a minimum amount of error may be selected and used.
Alternatively, z1(y, k+1) may be an interpolated value computed from z1(y(m), k+1) and z1(y(m ± 1), k+1). If Tcycle is assumed to be approximately constant over the set of paired samples j and m, and there is no significant acceleration (change in velocity) in the x direction (e.g., because Tcycle is a very short period of time), then δx is constant over the multiple j and m pairs, and a standard least-squares solution may be implemented for δx.
This leads to the following solution:
Ax = δx/Δx = sum([z1(y(m), k+1) - z1(y(j), k)] * [z2(y(j), k) - z1(y(j), k)]) / sum([z2(y(j), k) - z1(y(j), k)]^2)
Then, vx = δx/Tcycle = Ax * Δx/Tcycle.
An approximation can also be made that the x-component vx of the object or target velocity is constant over the scan cycle. If this approximation does not hold, an acceleration term ax can be introduced such that
vx = vx(0) + ax * Tcycle,
and both vx(0) and ax solved for.
If the beam moves by an x-distance ΔXbeam during a scan cycle, this beam offset can be corrected by adjusting the z values in the subsequent scan cycle for the beam position change, to obtain the measurement that would have been made at the (x(j), y(j)) position of the previous scan cycle:
z1_adjusted(y(m), k+1) = z1(y(m), k+1) - dz/dx(y(j), k) * ΔXbeam.
With this adjustment made, the least-squares solution for vx proceeds as before.
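A numerical sketch of this least-squares estimate (illustrative only; it assumes samples already matched on y across cycles and the beam-offset adjustment already applied; the sign follows z1(y, k+1) = z1(y, k) - dz/dx * δx above):

```python
import numpy as np

def estimate_vx(z1_k, z2_k, z1_k1, dx_beams, t_cycle):
    """Least-squares x-velocity from two closely spaced beams over two cycles.

    z1_k, z2_k: beam-1 and beam-2 distances in cycle k, matched on y.
    z1_k1: beam-1 distances at approximately the same y values in cycle k+1.
    dx_beams: lateral beam separation Δx (m); t_cycle: revisit delay (s).
    """
    dz_cycle = z1_k1 - z1_k        # change at fixed y between cycles
    dz_beams = z2_k - z1_k         # local slope proxy: dz/dx * Δx
    ratio = np.sum(dz_cycle * dz_beams) / np.sum(dz_beams * dz_beams)
    dx_target = -ratio * dx_beams  # δx; sign per z1(k+1) = z1(k) - dz/dx*δx
    return dx_target / t_cycle

# Synthetic check: surface z = 0.2*x plus y-structure; true vx = 0.5 m/s.
dx_b, tc, vx_true = 0.002, 0.01, 0.5
yv = np.linspace(-0.05, 0.05, 50)
z1 = 0.05 * np.sin(40 * yv)                    # beam 1 at x = 0
z2 = 0.2 * dx_b + 0.05 * np.sin(40 * yv)       # beam 2 at x = Δx
z1_next = -0.2 * vx_true * tc + 0.05 * np.sin(40 * yv)
print(estimate_vx(z1, z2, z1_next, dx_b, tc))  # approximately 0.5
```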
Similarly, the velocity in the y-direction (Vy, also written vy) can be determined as follows. There will be a portion of the scan cycle where the motion of the beam is primarily in the x-direction. During this segment of the scan cycle, the error
Ey = sum(ey(j))
can be minimized, where
ey(j) = z1(y(j) - vy * Tcycle, k) - z1(y(m), k+1).
In some cases, this approach works even if there is a progressive line scan due to the similarity of the surface shapes of closely spaced scan lines.
The rotation about the z-axis (Wz, also written ωz) can be determined because it introduces a linear gradient into the observed values of vx and vy for each scan cycle. A non-zero value of ωz results in a linear gradient in which vx varies as a function of y and vy varies as a function of x. Additional terms may be added to the least-squares solutions for vx and vy to also obtain ωz. Additionally, multiple beams may be used in the determination. For example, the solution for vy can be determined at different values of x using the method above; the gradient of vy with x then yields ωz:
vy2 = vy1 + ωz * (x2 - x1), so
ωz = (vy2 - vy1) / (x2 - x1).
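Numerically, once vy has been estimated at two beam x-positions (as in the sketch above), ωz follows directly (illustrative sketch; a right-handed frame with counterclockwise ωz positive is assumed):

```python
def omega_z(vy1, x1, vy2, x2):
    """Rotation rate about z from the gradient of vy across x: vy = vy0 + wz*x."""
    return (vy2 - vy1) / (x2 - x1)

# Example: vy = 0.06 m/s at x = -0.05 m and vy = 0.10 m/s at x = +0.05 m
# gives wz = 0.4 rad/s.
print(omega_z(0.06, -0.05, 0.10, 0.05))  # 0.4
```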
In some embodiments, a calculation or determination may be made when Tcycle is not constant. Additionally, in some cases, a calculation or determination may be made when the beam spacing is not constant or consistent.
Implementations of the various techniques described herein may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Embodiments may be implemented as a computer program product, i.e., tangibly embodied in an information carrier, e.g., in a machine-readable storage device (computer-readable medium, non-transitory computer-readable storage medium, tangible computer-readable storage medium) or in a propagated signal, for processing by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers. A computer program, such as the computer program(s) described above, can be written in any form of programming language, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be processed on a computer or on multiple computers at a site or distributed across multiple sites and interconnected by a communication network.
Method steps may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method steps also can be performed by, and an apparatus can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the processing of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. Elements of a computer may include at least one processor for executing instructions and one or more memory devices for storing instructions and data. Generally speaking, a computer may also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices); magnetic disks (e.g., internal hard disks or removable disks); magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, implementations may be implemented on a computer having a display device (e.g., a liquid crystal display (LCD) or light-emitting diode (LED) monitor, or a touch-screen display) for displaying information to the user, and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
Implementations may be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation), or any combination of such back-end, middleware, or front-end components. The components may be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a Local Area Network (LAN) and a Wide Area Network (WAN), such as the Internet.
In some embodiments, the LIDAR system may achieve millimeter distance-accuracy performance on the moving face of a subject or individual. However, in some embodiments, solid-object velocity estimation requires processing of multiple samples to remove significant velocity components from speech and other biological motion. A 500 Hz vibration with an amplitude of 0.05 mm (50 microns) has a maximum velocity of about 16 cm/sec (2π × 500 × 5×10^-5 ≈ 0.157 m/sec). While the amplitude of the vibration is a negligible change in distance for the process of tracking the face of a subject or individual, the instantaneous velocity may be significant, and the vibration velocity may need to be removed. In some implementations, removing the vibration velocity may require processing velocity data samples over a span significantly longer than the period of the vibration to be removed, and care must be taken to avoid noise or bias. For example, noise in the velocity (e.g., velocity in the z direction) may affect or degrade the ability to detect or determine the rotation of an object or the z velocity of an object. In some implementations, the vibration or velocity noise is relatively small and may be averaged out to remove its effect.
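As a simple illustration (sample rate and window length are assumed, not from the patent), averaging the velocity samples over a whole number of vibration periods suppresses the 500 Hz component while preserving the slower rigid-body velocity:

```python
import numpy as np

# Assumed example matching the text: 500 Hz vibration, 50-micron amplitude
# (peak velocity about 0.157 m/s), on top of a 0.05 m/s body velocity.
fs = 100_000                     # sample rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
v_body = 0.05
v = v_body + 2 * np.pi * 500 * 5e-5 * np.cos(2 * np.pi * 500 * t)

# Average over exactly 10 vibration periods (20 ms): the sinusoid sums to
# zero over whole periods, leaving the rigid-body velocity.
win = int(fs * 10 / 500)         # 2000 samples
v_smooth = np.convolve(v, np.ones(win) / win, mode="valid")
print(v_smooth.mean())           # approximately 0.05 m/s
```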
While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the embodiments. It is understood that they have been presented by way of example only, and not limitation, and that various changes in form and detail may be made. Any portions of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. Implementations described herein may include multiple combinations and/or subcombinations of the functions, components and/or features of the different implementations described.

Claims (20)

1. A system for determining motion of an object, comprising:
a laser system comprising a set of emitters configured to emit at least one laser beam onto a plurality of points of the object and a set of receivers configured to receive electromagnetic radiation reflected from the plurality of points of the object, the laser system configured to generate at least a distance measurement and a velocity measurement for each of the plurality of points on the object based on the electromagnetic radiation reflected from the plurality of points of the object and received by the set of receivers; and
an analysis module configured to:
determining a rotation of the object between a first time and a second time from the distance and velocity measurements of the plurality of points on the object,
determining, from the rotation of the object, a distance moved by the object between the first time and the second time in a direction orthogonal to laser beams emitted by the set of emitters of the laser system.
2. The system of claim 1, wherein the object is a face of an individual.
3. The system of claim 1, wherein the laser system is configured to generate distance and velocity measurements of more than four points on the object.
4. The system of claim 1, wherein the laser system is configured to generate simultaneous distance and velocity measurements of more than four points on the object.
5. The system of claim 1, wherein the laser system is configured to emit more than four laser beams.
6. The system of claim 1, wherein the analysis module is configured to determine, from the distance and velocity measurements of the plurality of points on the object, a rotation of the object about an axis orthogonal to a laser beam emitted by the laser system.
7. The system of claim 1, wherein the analysis module configured to determine, from the distance and velocity measurements of the plurality of points on the object, a distance that the object moved in a direction orthogonal to a direction of a laser beam emitted by the laser system between the first time and the second time is also configured to generate a slope of the object, the slope of the object representing a ratio of a change in position of the object in a direction along the laser beam to a change in position of the object in a direction orthogonal to the laser beam.
8. A non-transitory computer-readable storage medium storing instructions that, when executed, cause one or more processors to perform a process, the process comprising:
generating distance and velocity measurements for a plurality of points on an object based on electromagnetic radiation reflected from the plurality of points and received by each of a receiver set of a laser system;
determining a rotation of the object between a first time and a second time from the distance and velocity measurements of the plurality of points on the object;
determining, from the rotation of the object, a distance that the object moves between the first time and the second time in a direction orthogonal to laser beams emitted by a group of emitters of the laser system.
9. The non-transitory computer-readable storage medium of claim 8, wherein the object is a face of an individual.
10. The non-transitory computer-readable storage medium of claim 8, wherein the generating includes emitting at least four laser beams from the laser system.
11. The non-transitory computer-readable storage medium of claim 8, wherein the generating comprises generating distance and velocity measurements for at least four points on the object.
12. The non-transitory computer-readable storage medium of claim 8, wherein the determining includes determining a rotation of the object about a first axis and a rotation of the object about a second axis from the distance and velocity measurements of the plurality of points on the object.
13. The non-transitory computer-readable storage medium of claim 8, wherein the process further comprises:
determining, using the laser system configured to emit at least five laser beams, a distance and a direction moved by the object between the first time and the second time from the distance and velocity measurements of the plurality of points on the object and the rotation of the object.
14. A method for determining motion of an object, comprising:
generating distance and velocity measurements for a plurality of points on the object based on electromagnetic radiation reflected from the plurality of points of the object and received by each receiver of a set of receivers of a laser system;
determining a rotation of the object between a first time and a second time from the distance and velocity measurements of the plurality of points on the object; and
determining, from the rotation of the object, a distance that the object moves between the first time and the second time in a direction orthogonal to the laser beams emitted by a set of emitters of the laser system.
15. The method of claim 14, wherein the generating comprises generating the distance and velocity measurements of the plurality of points on the object using a laser system configured to emit at least four laser beams.
16. The method of claim 14, wherein the generating comprises generating distance and velocity measurements for at least four points on the object.
17. The method of claim 14, further comprising:
determining a shape of at least a portion of the object.
18. The method of claim 14, further comprising acquiring a three-dimensional image of the object.
19. The method of claim 14, wherein the object is a face of an individual.
20. The method of claim 14, further comprising determining a direction moved by the object between the first time and the second time.
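
By way of example only, and not limitation, the following sketch illustrates one possible computation of the quantities recited in independent claims 1, 8, and 14: per-beam distance and velocity measurements are combined into a linear least-squares estimate of the rigid-body translational and angular velocity, from which the rotation between the first time and the second time, the distance moved orthogonal to the laser beams, and the slope of claim 7 follow. The sketch is written in Python with NumPy; all function and variable names are illustrative assumptions, and the least-squares formulation is one way such a system could be realized, not necessarily the implementation of the claims.

import numpy as np

def estimate_motion(beam_dirs, distances, radial_velocities, dt):
    """Estimate rigid-body motion from one multi-beam LIDAR measurement.

    beam_dirs         -- (N, 3) unit vectors along the emitted laser beams
    distances         -- (N,) distance measurement for each beam
    radial_velocities -- (N,) line-of-sight velocity for each beam
    dt                -- elapsed time between the first time and the second time
    """
    beam_dirs = np.asarray(beam_dirs, dtype=float)
    # 3-D position of each measured point: beam direction scaled by distance.
    points = beam_dirs * np.asarray(distances, dtype=float)[:, None]
    # Rigid-body kinematics: the velocity of point r_i is v + w x r_i, so the
    # measured radial velocity is b_i . v + (r_i x b_i) . w, linear in (v, w).
    # Six unknowns, so six or more beams give a fully determined system; with
    # five beams (cf. claim 13) lstsq returns the minimum-norm solution.
    A = np.hstack([beam_dirs, np.cross(points, beam_dirs)])  # (N, 6)
    x, *_ = np.linalg.lstsq(A, np.asarray(radial_velocities, dtype=float), rcond=None)
    v, w = x[:3], x[3:]
    rotation = w * dt  # small-angle rotation vector between the two times
    # Split the displacement into components along and orthogonal to the
    # (mean) beam direction.
    mean_beam = beam_dirs.mean(axis=0)
    mean_beam /= np.linalg.norm(mean_beam)
    displacement = v * dt
    along = float(displacement @ mean_beam)
    orthogonal = displacement - along * mean_beam
    transverse_distance = float(np.linalg.norm(orthogonal))
    # Slope in the sense of claim 7: change in position along the beam per
    # unit change in position orthogonal to the beam.
    slope = along / transverse_distance if transverse_distance > 0.0 else float("nan")
    return rotation, transverse_distance, slope

In use, beam_dirs would come from the pointing of the set of emitters, distances and radial_velocities from the set of receivers, and dt from the interval between the first time and the second time; repeating the estimate over successive intervals yields a real-time track of the object.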
CN201580038606.7A 2014-05-21 2015-05-20 Apparatus, system and method for real-time tracking of objects Active CN106537184B (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201462001544P 2014-05-21 2014-05-21
US62/001,544 2014-05-21
US201462030988P 2014-07-30 2014-07-30
US62/030,988 2014-07-30
US14/716,467 US10012734B2 (en) 2014-05-21 2015-05-19 Devices, systems, and methods for real time tracking of an object
US14/716,467 2015-05-19
PCT/US2015/031772 WO2015179515A1 (en) 2014-05-21 2015-05-20 Devices, systems, and methods for real time tracking of an object

Publications (2)

Publication Number Publication Date
CN106537184A (en) 2017-03-22
CN106537184B (en) 2020-06-16

Family

ID=53284603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580038606.7A Active CN106537184B (en) 2014-05-21 2015-05-20 Apparatus, system and method for real-time tracking of objects

Country Status (6)

Country Link
US (2) US10012734B2 (en)
EP (1) EP3146361B1 (en)
JP (1) JP2017516110A (en)
CN (1) CN106537184B (en)
TW (2) TWI659220B (en)
WO (1) WO2015179515A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10012734B2 (en) 2014-05-21 2018-07-03 DSCG Solutions, Inc. Devices, systems, and methods for real time tracking of an object
WO2016174659A1 (en) 2015-04-27 2016-11-03 Snapaid Ltd. Estimating and using relative head pose and camera field-of-view
US10557942B2 (en) * 2016-06-07 2020-02-11 DSCG Solutions, Inc. Estimation of motion using LIDAR
US20190324144A1 (en) * 2016-10-13 2019-10-24 Troy A. Reynolds Apparatus for remote measurement of an object
US10634794B2 (en) * 2017-02-28 2020-04-28 Stmicroelectronics, Inc. Vehicle dynamic obstacle compensation system
TWI620946B (en) * 2017-05-19 2018-04-11 Displacement record generation method and portable electronic device
US20190086544A1 (en) * 2017-09-19 2019-03-21 Honeywell International Inc. Lidar air data system with decoupled lines of sight
KR101931592B1 (en) * 2017-12-12 2019-03-13 Golfzon Co., Ltd. Device for sensing a moving ball and method for computing parameters of moving ball using the same
US10906536B2 (en) 2018-04-11 2021-02-02 Aurora Innovation, Inc. Control of autonomous vehicle based on determined yaw parameter(s) of additional vehicle
DE102018220088A1 (en) * 2018-11-22 2020-05-28 Carl Zeiss Industrielle Messtechnik Gmbh Method and device for determining at least one spatial position and orientation of at least one measurement object
CN111587381A (en) * 2018-12-17 2020-08-25 SZ DJI Technology Co., Ltd. Method for adjusting motion speed of scanning element, distance measuring device and mobile platform
CN110706258B (en) 2019-10-10 2022-10-04 Beijing Baidu Netcom Science and Technology Co., Ltd. Object tracking method and device
CN111028272B (en) * 2019-12-11 2023-06-20 Beijing Baidu Netcom Science and Technology Co., Ltd. Object tracking method and device
CN113924505A (en) * 2020-05-09 2022-01-11 SZ DJI Technology Co., Ltd. Distance measuring device, distance measuring method and movable platform
TWI751003B (en) * 2021-01-22 2021-12-21 友達光電股份有限公司 Light source positioning method and system
DE102021208483A1 (en) * 2021-08-05 2023-02-09 Robert Bosch Gesellschaft mit beschränkter Haftung Method and device for operating a laser unit depending on a detected state of an object and laser device

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62280677A (en) * 1986-05-28 1987-12-05 Nec Corp Laser radar system
DE19528676C2 (en) * 1995-08-04 1997-05-22 Zeiss Carl Jena Gmbh Interferometer arrangement for absolute distance measurement
US6169966B1 (en) * 1996-12-27 2001-01-02 Kabushiki Kaisha Toshiba Apparatus for detecting a moving state of an object
CN1078703C * 1999-07-02 2002-01-30 Tsinghua University Target space position and attitude laser tracking-measuring system and method
US6446468B1 (en) * 2000-08-01 2002-09-10 Fitel Usa Corp. Process for fabricating optical fiber involving overcladding during sintering
AU2005286872B2 (en) * 2004-09-21 2012-03-08 Digital Signal Corporation System and method for remotely monitoring physiological functions
EP1783517A1 (en) * 2005-11-04 2007-05-09 AGELLIS Group AB Multi-dimensional imaging method and apparatus
JP4839827B2 (en) * 2005-12-26 2011-12-21 Konica Minolta Sensing, Inc. 3D measuring device
DE112009001652T5 (en) * 2008-07-08 2012-01-12 Chiaro Technologies, Inc. Multichannel recording
AU2010257107B2 (en) * 2009-02-20 2015-07-09 Digital Signal Corporation System and method for generating three dimensional images using lidar and video measurements
US8524852B2 (en) * 2010-04-19 2013-09-03 Acushnet Company Thermoset polyurethanes based on moisture-resistance polyols for use in golf balls
US8537338B1 (en) * 2011-08-24 2013-09-17 Hrl Laboratories, Llc Street curb and median detection using LIDAR data
US9188676B2 (en) * 2012-08-15 2015-11-17 Digital Signal Corporation System and method for detecting a face contour using a three-dimensional measurement system
US8948497B2 (en) 2012-09-04 2015-02-03 Digital Signal Corporation System and method for increasing resolution of images obtained from a three-dimensional measurement system
US10012734B2 (en) 2014-05-21 2018-07-03 DSCG Solutions, Inc. Devices, systems, and methods for real time tracking of an object

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101437440A (en) * 2005-12-14 2009-05-20 Digital Signal Corporation System and method for tracking eyeball motion
CN101435870A (en) * 2007-11-12 2009-05-20 Denso Wave Incorporated Laser radar apparatus that measures direction and distance of an object
CN101533529A (en) * 2009-01-23 2009-09-16 Beijing Institute of Civil Engineering and Architecture Range image-based 3D spatial data processing method and device

Also Published As

Publication number Publication date
US20150338518A1 (en) 2015-11-26
TWI659220B (en) 2019-05-11
EP3146361B1 (en) 2024-07-10
JP2017516110A (en) 2017-06-15
WO2015179515A1 (en) 2015-11-26
TW201928395A (en) 2019-07-16
TW201546475A (en) 2015-12-16
US10571573B2 (en) 2020-02-25
CN106537184A (en) 2017-03-22
US20180299555A1 (en) 2018-10-18
TWI707154B (en) 2020-10-11
EP3146361A1 (en) 2017-03-29
US10012734B2 (en) 2018-07-03

Similar Documents

Publication Publication Date Title
CN106537184B (en) Apparatus, system and method for real-time tracking of objects
CN108431548B (en) Motion estimation in six degrees of freedom (6DOF) using LIDAR
JP2010519552A (en) System and method for position detection by a single sensor
US20200150273A1 (en) Estimation of motion using lidar
JPWO2020236819A5 (en)
US11513229B2 (en) Multi-beam processing of lidar vibration signals
US20190377066A1 (en) Tracking objects using video images and lidar

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant