WO2017023202A1 - Time-of-flight monitoring system - Google Patents

Time-of-flight monitoring system

Info

Publication number: WO2017023202A1
Authority: WO (WIPO/PCT)
Application number: PCT/SG2015/050246
Other languages: French (fr)
Inventors: Sergey SHINKEVICH, Dmitri TIKHOSTOUP, Denis Aleksandrovich VRAZHNOV, Aron DIK
Original assignee: Vadaro Pte Ltd
Application filed by Vadaro Pte Ltd
Priority to PCT/SG2015/050246
Publication of WO2017023202A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G06T1/0007: Image acquisition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/10: Image acquisition
    • G06V10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143: Sensing or illuminating at different wavelengths
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103: Static body considered as a whole, e.g. static pedestrian or occupant recognition

Definitions

  • TITLE: TIME-OF-FLIGHT MONITORING SYSTEM
  • The subject disclosure is directed to monitoring a field of view with a time-of-flight monitoring system and analyzing time-of-flight and related data.
  • Monitoring equipment is regularly employed by retail businesses, manufacturers, workplaces, government entities, transportation facilities, homeowners, and numerous other individuals and entities. These varied users employ such equipment for numerous applications, including analytics of shopping patterns, security, crime prevention, employee monitoring, and efficiency studies, among others.
  • The monitoring equipment may be capable of observing a field of view at a location, and it may be desired to obtain information about objects or people within the field of view. For example, desired information may include movement patterns, demographic information, physical characteristics, the identification of particular individuals, and more complex statistics that may be determined based on a combination of this information and other similar information. This information can also be compiled over time, to be used for even more complex analytics that may aid in long-term decision making for an individual or entity.
  • Some monitoring equipment may utilize a video sensor that provides a video image signal such as a video image stream or a sequence of still images.
  • Video-based systems may provide a large amount of raw information about the observed field of view. It may be necessary to employ significant resources in order to extract useful information from this large amount of raw image-based information.
  • For example, a person may be required to physically observe the video image stream or still images, and for more complex analytics, may need to physically observe and record information about individuals within the field of view.
  • Such personal observation is subject to numerous deficiencies, such as fatigue, variation between different observers, and expense.
  • An image may also be automatically processed by a computing device such as a computer or remote server.
  • Image-based monitoring methods are dependent on ambient or environmental conditions at or near the monitoring device, such as the amount, direction, and types of lighting and any objects or conditions that interfere with the lens of the imaging device.
  • Other types of monitoring equipment may include thermal sensors, acoustic sensors, or other similar technologies. Unlike image sensors, systems employing these technologies may allow for simpler processing techniques, or may operate better under challenging ambient or environmental conditions. However, existing sensors of these types may lack the ability to identify many types of desired information or to capture desired information over a large enough field of view, or may have a significant material cost that renders them less useful for many applications.
  • The present disclosure provides a method of identifying a plurality of objects within a field of view, comprising transmitting a light signal from a time-of-flight monitoring unit to illuminate a region including the field of view and receiving a reflected signal of the light signal at a time-of-flight camera of the time-of-flight monitoring unit, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal.
  • The method also comprises identifying the plurality of objects based on the depth information, determining an area of interest within the field of view, and generating one or more statistics for one or more of the identified objects within the area of interest.
  • The present disclosure also provides a time-of-flight monitoring unit comprising an illumination system configured to transmit a light signal to illuminate a region including a field of view and a time-of-flight camera configured to receive a reflected signal of the light signal, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal.
  • The time-of-flight monitoring unit also comprises a processing module configured to identify a plurality of objects based on the depth information, determine an area of interest within the field of view, and generate one or more statistics for one or more of the identified objects within the area of interest.
  • FIG. 1 depicts an exemplary monitoring environment for an exemplary time-of-flight monitoring system in accordance with some embodiments of the present disclosure
  • FIG. 2 depicts a block diagram of an exemplary time-of-flight monitoring unit in accordance with some embodiments of the present disclosure
  • FIG. 3 depicts an exemplary time-of-flight monitoring unit in accordance with some embodiments of the present disclosure
  • FIG. 4 depicts steps for the configuration of an exemplary time-of-flight monitoring unit in accordance with embodiments of the present disclosure
  • FIG. 5 depicts an exemplary time-of-flight configuration interface in accordance with some embodiments of the present disclosure.
  • FIG. 6 depicts steps for tilt and height setup in accordance with some embodiments of the present disclosure
  • FIG. 7 depicts a plurality of exemplary time-of-flight monitoring units in a monitoring system in accordance with some embodiments of the present disclosure
  • FIG. 8 depicts steps for identifying objects based on depth information in accordance with some embodiments of the present disclosure
  • FIGS. 9A-9F depict raw and processed depth map images in accordance with some embodiments of the present disclosure
  • FIGS. 10A-10B depict a Gaussian distribution and correlation matrix in accordance with some embodiments of the present disclosure
  • FIG. 11 depicts steps for analyzing video image signals in accordance with some embodiments of the present disclosure
  • FIG. 12 depicts an exemplary monitoring environment for an exemplary time-of-flight monitoring system in accordance with embodiments of the present disclosure
  • FIG. 13 depicts steps for determining identifying characteristics using machine learning in accordance with some embodiments of the present disclosure
  • FIG. 14 depicts steps for determining privacy settings in accordance with some embodiments of the present disclosure
  • FIG. 15 depicts steps for determining statistics in accordance with some embodiments of the present disclosure.
  • FIG. 16 depicts steps for identifying a person of interest based on a beacon signal in accordance with some embodiments of the present disclosure.
  • FIG. 1 depicts an exemplary monitoring environment for an exemplary time-of-flight monitoring system in accordance with embodiments of the present disclosure.
  • An exemplary application may be a monitoring application for a retail location 1 as depicted in FIG. 1.
  • Although a retail location is provided as an exemplary application, a person having ordinary skill in the art will understand that the systems and methods described herein may be implemented in numerous other monitoring applications, such as manufacturing facilities, shipping facilities, agricultural facilities, office workplaces, government buildings, security queues, private residences, and other applications. While the exemplary embodiment described with respect to FIG. 1 may focus on the monitoring of customers and employees, numerous other objects may be monitored such as crates, boxes, vehicles, livestock, etc.
  • Exemplary retail location 1 may include a plurality of customers 10, 11, and 12, a plurality of employees 20 and 21, a checkout area 30, a display area 40, and a time-of-flight monitoring unit 50.
  • Time-of-flight monitoring unit 50 may have optical characteristics and be located in a manner such that time-of-flight data can be observed for a wide field of view, such as the portion of the retail location depicted in FIG. 1.
  • A light source (not depicted) illuminates the field of view with a light signal such as non-visible infra-red (IR) light.
  • Time-of-flight monitoring unit 50 may include a time-of-flight camera (not depicted) for observing IR light that is reflected from the field of view.
  • A processor of a processing module (not depicted) of time-of-flight monitoring unit 50 may process the reflected light to create depth information, which in turn may be used to identify objects of interest, such as employees or customers, within the field of view.
  • The processor may perform additional analysis of the identified objects to provide relevant data or statistics for the particular application, for example, by tracking the objects of interest within the field of view. This can be used for numerous applications such as people counting, zone analysis, queue analysis, or checkout analysis.
  • Time-of-flight monitoring unit 50 may also be in communication with other devices such as other time-of-flight monitoring units, mobile devices, tablet computers, servers, personal computers, etc.
  • FIG. 2 depicts a block diagram of an exemplary time-of-flight monitoring unit in accordance with some embodiments of the present disclosure.
  • Time-of-flight monitoring unit 50 may include a processing module 52, time-of-flight camera 54, illumination system 56, secondary sensor 58, power module 60, and communication module 62.
  • Processing module 52 may include a processor having processing capability necessary to perform the processing functions described herein, including but not limited to hardware logic, computer readable instructions running on a processor, or any combination thereof.
  • Processing module 52 may be a system on a module (SOM) that is able to perform a variety of processing, communication, and memory operations that are useful for time-of-flight processing applications.
  • Processing module 52 may run software to perform the operations described herein, including software accessed in machine readable form on a tangible non-transitory computer readable storage medium, as well as software that describes the configuration of hardware such as hardware description language (HDL) software used for designing chips.
  • Examples of tangible (or non-transitory) storage media include disks, thumb drives, and memory; they do not include propagated signals.
  • Tangible computer readable storage media include volatile and non-volatile, removable and non-removable media used to store information such as computer readable instructions, data structures, program modules, or other data. Examples of such media include RAM, ROM, EPROM, EEPROM, flash memory, disks or optical storage, magnetic storage, or any other non-transitory medium that stores information that is accessed by a processor or computing device.
  • One or more computer readable storage media may be integrated with processing module 52, may be part of one or more other components of time-of-flight monitoring unit 50, may be located on another device, may be located at a remote location, or any combination thereof.
  • Illumination system 56 may provide a light source that emits a light signal for illuminating a field of view, and in some embodiments, a wide field of view. This emitted light is then reflected by objects within the field of view and captured by the time-of-flight camera 54.
  • In an exemplary embodiment, the light source of illumination system 56 may include one or more IR light emitting diodes (LEDs).
  • IR LEDs may be well suited for illuminating a field of view because the IR light is generally not perceptible to people within the field of view, and one or more IR LEDs are able to be precisely controlled to provide light in a manner that facilitates integration with time-of-flight camera 54.
  • Any suitable number of IR LEDs may be integrated within illumination system 56 of time-of-flight monitoring unit 50; in an exemplary embodiment, ten IR LEDs may be arranged in a plurality of illumination arrays positioned about the time-of-flight monitoring unit 50 in a manner that facilitates illumination of a wide field of view.
  • The processing module 52 may provide one or more modulation signals, each of which is associated with one or more IR LEDs. Modulation signals may be analog, digital, or any combination thereof.
  • In some embodiments the modulation signals may directly drive the IR LEDs; in other embodiments the drive signals may be provided to drive circuitry that conditions the signal that in turn drives the IR LEDs.
  • In some embodiments, the modulation signal may be controlled by the time-of-flight sensor and synchronized with its internal logic.
  • Operating parameters of the IR LEDs may be controlled by the modulation signals. In some embodiments it may not be necessary to operate all of the IR LEDs, for example, if a particular field of view may be illuminated with less than all of the IR LEDs. Using only some of the LEDs may result in less power usage, heat production, etc. In some embodiments, the selective illumination of the IR LEDs may be controlled such that the one or more IR LEDs that are not operating changes over time. During operation, the IR LEDs that are providing illumination may have an illumination cycle that has an illumination period, based on a modulation frequency and duty cycle.
  • The IR LEDs may be operated for an illumination time within the illumination period, such that the IR LEDs are illuminated for only a portion of the illumination period.
  • In some embodiments, the duty cycle may range from 20-40% and the modulation frequency may range from 19-25 MHz.
  • The modulation signal may also provide a modulation waveform for the drive of the IR LEDs.
  • The IR LEDs may be cycled at the modulation frequency.
  • The modulation frequency may be variable or selectable. Having a variety of modulation frequencies may allow for a plurality of time-of-flight monitoring units 50 to be placed in close proximity or in overlapping locations without resulting in interference between the reflected light at each of the time-of-flight monitoring units.
  • In some embodiments, a single time-of-flight monitoring unit 50 may transmit at multiple frequencies.
  • A modulation waveform may be any suitable waveform such as a square wave, sine wave, or sawtooth wave.
  • The modulation waveform may have a duty cycle that may be varied based on desired characteristics of the transmitted and reflected light.
  • The illumination system 56 may be adjusted for particular environments or applications. For example, in some embodiments it may be desirable to minimize the illumination time or the duty cycle of the modulation waveform in a manner that minimizes power consumption and heat generation, for example if the time-of-flight monitoring unit 50 has a low-power connection or is powered by a battery. In some embodiments, these operating parameters could be modified automatically based on operating conditions, such as characteristics of the power supply or ambient conditions.
  • A decrease in the duty cycle may result in power savings, and the saved power can be used to increase drive currents and thus the optical output power.
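  • The passages above describe the illumination parameters only in general terms. As a minimal illustration (not an implementation from the disclosure), the following Python sketch shows one way the modulation frequency, duty cycle, and set of active LEDs might be represented and checked against the 19-25 MHz and 20-40% ranges mentioned above; all names and the relative-power estimate are assumptions.

```python
from dataclasses import dataclass

# Ranges quoted in the description (hypothetical validation helper).
FREQ_RANGE_MHZ = (19.0, 25.0)
DUTY_RANGE = (0.20, 0.40)

@dataclass
class IlluminationConfig:
    modulation_freq_mhz: float   # cycling frequency of the IR LEDs
    duty_cycle: float            # fraction of each period the LEDs are on
    active_leds: tuple           # indices of the LEDs being driven (0..9)

    def validate(self):
        lo, hi = FREQ_RANGE_MHZ
        if not lo <= self.modulation_freq_mhz <= hi:
            raise ValueError("modulation frequency outside 19-25 MHz")
        lo, hi = DUTY_RANGE
        if not lo <= self.duty_cycle <= hi:
            raise ValueError("duty cycle outside 20-40%")

    def relative_power(self):
        # Average electrical drive scales with duty cycle and LED count;
        # lowering either saves power and heat, as noted in the text
        # (assumes the exemplary ten-LED arrangement).
        return self.duty_cycle * len(self.active_leds) / 10.0

# Example: drive only 6 of the 10 LEDs at 20 MHz with a 30% duty cycle.
cfg = IlluminationConfig(20.0, 0.30, tuple(range(6)))
cfg.validate()
print(cfg.relative_power())
```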
  • Time-of-flight camera 54 may be configured to receive reflected light originating from illumination system 56 and to generate depth information based on the reflected light.
  • The time-of-flight camera may include one or more lenses, a time-of-flight sensor, and sensor circuitry.
  • Although any suitable lenses may be provided for time-of-flight camera 54, in an exemplary embodiment the one or more lenses may be configured to receive reflected light from the field of view, such as a wide field of view. For example, a wide angle lens with a 1.2 mm focal length may provide a 110 degree horizontal field of view.
  • An exemplary time-of-flight sensor may have pixels that receive the reflected light from the field of view via the one or more lenses.
  • In an exemplary embodiment, the time-of-flight sensor may include a CMOS pixel array.
  • Sensor circuitry may control shuttering of the time-of-flight camera 54 (e.g., based on one or more control signals provided by processing module 52), such that the time-of-flight sensor only receives light during a time of interest corresponding to the expected return time of the reflected light.
  • Shuttering may be based on the parameters of the illumination system such as the illumination time, periodic rate, height, tilt angle, any other suitable parameters, or any combination thereof.
  • Depth information may be determined from the reflected light received by the pixels of the time-of-flight sensor in any suitable manner.
  • For example, the time-of-flight camera 54 may determine a phase shift between the illumination and the reflection to determine distance.
  • The distances associated with the pixels of the time-of-flight sensor may be compiled into a depth map that may be used for further processing.
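  • The disclosure does not spell out the phase-to-distance calculation. For reference only, the sketch below shows the standard continuous-wave time-of-flight relation using a common four-phase sampling scheme; the sample names and the 20 MHz default frequency are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def phase_to_depth(a0, a1, a2, a3, mod_freq_hz=20e6):
    """Per-pixel depth from four samples of the reflected signal taken at
    0, 90, 180, and 270 degrees of the modulation period (scalars or arrays).

    Standard continuous-wave time-of-flight relation (sign conventions vary
    between sensors):
        phase = atan2(a3 - a1, a0 - a2)
        depth = c * phase / (4 * pi * f_mod)
    """
    phase = np.arctan2(a3 - a1, a0 - a2) % (2 * np.pi)
    return C * phase / (4 * np.pi * mod_freq_hz)  # unambiguous up to c / (2 f)

# Example: a reflection lagging by a quarter period at 20 MHz
# corresponds to roughly 1.87 m.
print(phase_to_depth(0.0, -1.0, 0.0, 1.0))
```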
  • Time-of-flight monitoring unit 50 may also include a secondary sensor 58.
  • Any suitable sensor may be used as secondary sensor 58, including video sensors (e.g., streaming video, a series of still camera images, etc.), infra-red sensors, image sensors, proximity sensors, stereoscopic sensors, thermal sensors, another time-of-flight camera, any other suitable sensor, or any combination thereof.
  • In an exemplary embodiment, the secondary sensor 58 may be a video sensor that streams video of the field of view, such as an RGB camera.
  • Time-of-flight camera 54, illumination system 56, and secondary sensor 58 may be in communication with processing module 52 in any suitable manner, including wired connections, wireless communications, any other suitable communication method, or any combination thereof.
  • For example, time-of-flight camera 54, illumination system 56, and secondary sensor 58 may be in communication with processing module 52 via a wired internal bus of time-of-flight monitoring unit 50, such as USB, I2C, or SCSI.
  • Although the processing module may communicate any suitable information with time-of-flight camera 54, illumination system 56, and secondary sensor 58, in some embodiments processing module 52 may communicate with time-of-flight camera 54 to monitor and control shuttering and other sensor operations and to receive depth information such as a depth map, may communicate with illumination system 56 to provide modulation signals and to monitor operating conditions of the illumination system 56, and may communicate with secondary sensor 58 to control the operation of secondary sensor 58 and receive sensor signals such as a video image signal.
  • Processing module 52 may perform processing based on the depth information received from time-of-flight camera 54, such as a depth map. As will be described in more detail herein, processing may include identifying a plurality of objects such as people who are located within the field of view. In some embodiments, in addition to identifying objects, processing module 52 may perform analysis of the data and determine statistics, for example, to perform people counting, queue analysis, zone analysis, or checkout analysis. Because these analyses are performed in the first instance based on time-of-flight data, as opposed to other data sources such as the video image signal, it may be possible to perform complex analyses faster and/or with less processing resources than conventional systems. This may allow for processing module 52 to operate with lower power requirements and less processing power, significantly reducing the expense associated with monitoring systems and allowing complex analyses to be performed on the time-of-flight monitoring unit 50, rather than on a remote computer or server.
  • Processing module 52 may also perform processing based on data received from a secondary sensor 58 such as a video sensor. Although any suitable processing may be performed, in an exemplary embodiment the video received from the secondary sensor may be used for identification of individuals, capture of biometric information, identification of adults vs. children or customers vs. associates, any other suitable relevant information, or any combination thereof.
  • Processing of information from secondary sensor 58 may be aided based on information determined by time-of-flight camera 54, as described in more detail herein.
  • For example, processing module 52 may identify the locations of one or more customers within the field of view based on the depth information provided by time-of-flight camera 54. Those customer locations may then be used by processing module 52 to more efficiently determine information from a signal such as a video image signal provided by a secondary sensor 58, by identifying particular portions of the video image signal for additional analysis.
  • Time-of-flight monitoring unit 50 may also include a power module 60.
  • Power module 60 may include any suitable module, chip, device, components, or combination thereof that provides power to the other portions of time-of-flight monitoring unit 50, and in some embodiments, may include a plurality of power modules providing power to different parts of time-of-flight monitoring unit 50.
  • Power module 60 may receive power from any suitable power source, including external power sources such as line-powered power sources or internal power sources such as a battery.
  • In an exemplary embodiment, the components of time-of-flight monitoring unit 50 may be selected and configured to operate with a Power over Ethernet (PoE) power source, permitting both power and data to be accessed via a single Ethernet connection.
  • Power module 60 may convert and regulate the power from the PoE power source so that it is suitable for providing power to the other components of time-of-flight monitoring unit 50.
  • Time-of-flight monitoring unit 50 may also include a communication module 62.
  • Communication module 62 may include any suitable module, chip, device, components, or combination thereof that allows processing module 52 or any other components of time-of-flight monitoring unit 50 to communicate with external components, devices, processors, systems, servers, computers, or any other external systems. Although communication module 62 may provide for communications in any suitable manner (e.g., electrical, optical, etc.), in an exemplary embodiment communication module 62 may facilitate digital electronic communications, for example over an Ethernet network.
  • FIG. 3 depicts an exemplary time-of-flight monitoring unit 50 in accordance with some embodiments of the present disclosure.
  • the components of time-of-flight monitoring unit 50 may be packaged into a compact physical unit that may be structured to be located in various physical locations to capture a desired field of view at a location.
  • For example, the time-of-flight monitoring unit 50 may be structured in a manner that allows for the unit to be recessed within a ceiling or wall, to have a low profile for discreet attachment on a surface of a ceiling or wall, to be self-supporting on a surface or a supporting stand, to be attached to a structural support such as a rafter via an integral connecting arm, or in any other suitable manner.
  • In an exemplary embodiment, time-of-flight monitoring unit 50 may be adapted to be located in a recessed portion of a ceiling or wall.
  • Time-of-flight monitoring unit 50 may include a housing 64 that encloses the electrical components, IR LEDs, and sensors of the time-of-flight monitoring unit 50.
  • Although housing 64 may be constructed of any suitable material or combination of materials, in an exemplary embodiment housing 64 may be constructed of plastic, aluminum, or other metals.
  • In an exemplary embodiment, housing 64 includes a face 66 and an enclosure 68, which are connected to each other with screws.
  • Face 66 may provide a physical interface to align time-of-flight camera 54, the IR LEDs, and secondary sensor 58 with the field of view.
  • A time-of-flight monitoring unit 50 may be oriented such that face 66 of housing 64 faces downward toward the field of view.
  • The physical structure of face 66 may require that the spacing and location of the time-of-flight camera 54 and IR LEDs 56 are optimized such that uniform illumination is provided over the field of view and such that the time-of-flight camera 54 is able to receive reflected light from the field of view.
  • In an exemplary embodiment, the IR LEDs may be arranged in two opposing semi-circular arrays, each including half (e.g., five) of the IR LEDs. Time-of-flight camera 54 and secondary sensor 58 may be located between the IR LEDs.
  • Enclosure 68 of housing 64 may provide a physical interface with an installation location such as a hole in a ceiling. Enclosure 68 also may include heat sinks that are located and spaced to provide for the dissipation of heat produced by electrical components of time-of-flight monitoring unit 50, such as IR LEDs 56. In some embodiments enclosure 68 may include interfaces for electrical and communication connections, such as Ethernet and USB ports.
  • FIG. 4 depicts steps 100 for the configuration of an exemplary time-of-flight monitoring unit in accordance with embodiments of the present disclosure.
  • The time-of-flight monitoring unit may be provided with unique identification information.
  • The unique identifier may be provided in any suitable manner, e.g., at the manufacturer, assigned by a centralized system upon installation, by a user through a user interface, in any other suitable manner, or any combination thereof.
  • The unique identifier may facilitate communication between remote devices and a particular time-of-flight monitoring unit 50.
  • Any secondary sensor 58 may be configured either manually or automatically.
  • Configuration parameters may include video codecs, video quality, resolution, bitrate, framerate, real time streaming, HTTP live streaming, any other suitable parameters, or any combination thereof.
  • Operation of the time-of-flight camera 54 and IR LEDs 56 may be configured either manually or automatically.
  • Configuration parameters that may be set include the IR channel, the camera tilt angle, the distance of the camera to a target surface (e.g., the floor for a ceiling-mounted unit), a minimum detection height of objects (e.g., if monitoring customers, a height associated with a child or a crouching person), relative location settings (e.g., relative to other units and/or a known centralized location), a boundary line, a boundary area, a checkout area, a queue area, a service zone, any other suitable configuration parameters, or any combination thereof.
  • In some embodiments, configuration may be performed through a configuration interface as depicted in FIG. 5.
  • FIG. 5 depicts an exemplary time-of-flight configuration interface 200 in accordance with some embodiments of the present disclosure.
  • Configuration interface 200 may be provided on any suitable device in communication with time-of-flight monitoring unit 50, or at a time-of-flight monitoring unit 50 if the unit has a display.
  • Although configuration interface 200 may include any suitable configuration parameters, in an exemplary embodiment configuration interface 200 includes IR channel field 202, camera tilt field 204, floor height field 206, minimal height field 208, region field 210, field of view 212, and anchor points 214.
  • Other configuration parameters such as those discussed above may be provided but are not depicted in FIG. 5. Although not depicted, it will be understood that such configuration settings may be provided in any suitable manner.
  • Although the configuration interface is depicted and described as configuring certain parameters manually and certain parameters automatically, it will be understood that any of the parameters may be set either manually or automatically.
  • Although a user may set the IR channel 202 manually, in some embodiments the IR channel 202 may be set automatically.
  • For example, a user may provide basic location information to a device in communication with the units. From this basic information, parameters such as the modulation frequency of the IR LEDs 56 of each of the time-of-flight monitoring units may be set automatically in a manner that avoids the overlap of reflected light having the same or similar modulation frequencies.
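  • One possible automatic assignment of modulation frequencies, sketched here purely for illustration, is to give overlapping units different channels from a small candidate set; the channel values and the greedy strategy are assumptions, not part of the disclosure.

```python
def assign_channels(units, overlaps, channels=(19e6, 21e6, 23e6, 25e6)):
    """Greedy channel assignment sketch.

    units    -- iterable of unit identifiers
    overlaps -- set of frozensets {a, b} whose fields of view overlap
    channels -- candidate modulation frequencies in Hz (values illustrative)
    """
    assigned = {}
    for unit in units:
        # Channels already used by units whose reflected light could mix
        # with this unit's reflected light.
        taken = {assigned[other] for other in assigned
                 if frozenset((unit, other)) in overlaps}
        free = [c for c in channels if c not in taken]
        if not free:
            raise RuntimeError("not enough channels for this layout")
        assigned[unit] = free[0]
    return assigned

# Example: three units in a row, each overlapping only its neighbour.
print(assign_channels(
    ["u1", "u2", "u3"],
    {frozenset(("u1", "u2")), frozenset(("u2", "u3"))}))
```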
  • In an exemplary embodiment, the field of view may be approximately 20 ft. x 14 ft. at a placement height of 13 ft., and approximately 24 ft. x 18 ft. at a placement height of 14 ft.
  • Controls for camera tilt field 204 and floor height field 206 may orient time-of-flight camera 54 and IR LEDs 56 to the field of view in order to accurately measure depth information within the field of view.
  • These parameters may be set manually, e.g., based on known or measured values for the camera tilt and floor height. In some embodiments, this configuration may be performed with a partially or fully automated process, such as by the steps depicted in FIG. 6.
  • FIG. 6 depicts steps 300 for tilt and height setup in accordance with some embodiments of the present disclosure.
  • Anchor points 214 may be virtual points that may be positioned within the depicted field of view 212 of the time-of-flight camera 54.
  • A plurality of virtual anchor points 214 (e.g., three anchor points) may be positioned within the field of view 212 according to one or more configuration rules.
  • Configuration rules may include a distance from the optical center of the camera, a distance from other anchor points 214, a requirement to be located on a common plane (e.g., the floor), avoidance of particular types of surfaces (e.g., non-reflective surfaces such as a dark carpet), any other suitable rules, or any combination thereof.
  • For example, configuration rules may require that each anchor point 214 be positioned at least 10 cm from all other anchor points 214 and from the optical center of the camera, that all anchor points 214 be positioned over a common floor surface, and that no anchor point be positioned over a dark colored carpet surface.
  • In some embodiments, placement of the anchor points 214 may be done manually, for example, by using the device interface (e.g., a mouse or touchscreen) to drag the anchor points 214 to locations that comply with the configuration rules. In other embodiments, this positioning may be done automatically, e.g., by clicking the "Auto" radio button within the camera tilt field 204 on a device running the time-of-flight configuration interface 200.
  • The device and/or the processing module 52 may scan the field of view 212, analyze depth information such as depth maps, and perform any other suitable operations to position the anchor points within the field of view 212.
  • The device and/or the processing module 52 may then calculate the camera height and tilt at step 304.
  • Although height and tilt may be calculated in any suitable manner, in an embodiment the height may be calculated as the mean of the distances to each anchor point and the tilt may be calculated based on the deviation of these distances.
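  • A minimal sketch of that calculation, under one literal reading of the description (height as the mean anchor-point distance, tilt estimated from the spread of those distances), might look as follows; the small-angle approximation and all example numbers are assumptions.

```python
import numpy as np

def estimate_height_and_tilt(distances_m, separation_m):
    """Rough height/tilt estimate from anchor-point measurements.

    distances_m  -- measured distance from the camera to each floor anchor
                    point; their mean is used as the mounting height
    separation_m -- horizontal separation between the two anchor points with
                    the largest distance difference (assumed known from the
                    depth map), used for a small-angle tilt estimate
    """
    distances = np.asarray(distances_m, dtype=float)
    height = distances.mean()
    spread = distances.max() - distances.min()
    tilt_rad = np.arcsin(min(spread / separation_m, 1.0))
    return height, np.degrees(tilt_rad)

# Example: three anchor points on the floor below a unit mounted at ~4 m.
h, tilt = estimate_height_and_tilt([3.95, 4.00, 4.08], separation_m=1.2)
print(round(h, 2), "m,", round(tilt, 1), "deg")
```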
  • The configuration interface 200 may then be updated with the calculated camera tilt field 204 and floor height field 206 values.
  • A user may wish to modify these automatically updated settings and may do so at step 308, for example, by modifying the values of camera tilt field 204 and floor height field 206 through the manual interface for each of those fields.
  • The camera tilt field 204 and floor height field 206 values may be stored in memory of processing module 52 for use in correctly interpreting received depth information.
  • Configuration interface 200 may also include minimal height field 208.
  • The minimal height field 208 may provide information that allows processing module 52 to identify an appropriate window of interest for the received depth information. For example, for monitoring people it may be desirable to set a minimal height that captures adults but ignores shorter objects (e.g., small children, pets, etc.) that are not of interest or might otherwise occupy unnecessary processing resources. Although in an embodiment the minimal height may be set, it will be understood that other criteria could be used to set a window of interest for depth information, such as a height range (e.g., minimum and maximum), an object type (e.g., child, adult, types of animals, physical objects, vehicles, etc.), any other suitable criteria, or any combination thereof.
  • Configuration interface 200 may also include a region field 210.
  • The region field 210 defines the area used for detection of objects.
  • An interface may be provided to select one or more areas of interest for processing, e.g., for generating statistics.
  • Although any suitable area of interest may be provided in any suitable manner, in some embodiments the area of interest may be a boundary line, a boundary region, a checkout area, any other suitable area of interest, or any combination thereof.
  • A boundary line may be a line that is placed on a display of the field of view, for example, using a user input such as a keyboard, mouse, or touch screen.
  • Although a boundary line may be used for any suitable application, in an embodiment the boundary line may be relevant to a people counting application: any object that is identified as a person may be counted as it crosses the line.
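  • As an illustration of such a people counting application (not an implementation from the disclosure), the sketch below counts crossings of a boundary line by tracked positions; the tracker interface and coordinates are hypothetical.

```python
def side_of_line(point, line):
    """Sign of the cross product: which side of the boundary line a point is on."""
    (x1, y1), (x2, y2) = line
    x, y = point
    return (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)

class LineCounter:
    """Counts tracked objects whose position crosses a boundary line.

    Assumes an upstream tracker supplies a stable id and the current
    floor-plane position of each identified person (names hypothetical).
    """
    def __init__(self, line):
        self.line = line
        self.last_side = {}
        self.crossings = 0

    def update(self, object_id, position):
        side = side_of_line(position, self.line)
        prev = self.last_side.get(object_id)
        # Count a crossing when the sign of the side changes between frames.
        if prev is not None and side != 0 and (prev > 0) != (side > 0):
            self.crossings += 1
        self.last_side[object_id] = side

# Example: a person walking across a line drawn along x = 2.
counter = LineCounter(((2.0, 0.0), (2.0, 5.0)))
for x in (1.0, 1.8, 2.4):
    counter.update("person-1", (x, 2.5))
print(counter.crossings)  # 1
```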
  • A boundary region may be a closed shape that is placed on a display of the field of view, for example, using a user input such as a keyboard, mouse, or touch screen.
  • A boundary region may be selected in any suitable manner, such as drawing the boundary region on the field of view, selecting a point within the field of view and selecting the size of the boundary region, any other suitable selection method, or any combination thereof.
  • Although a boundary region may be used for any suitable application, in an embodiment the boundary region may be relevant to a queue analysis or zone analysis application. Statistics may be generated for any object that is identified as a person within the boundary region, as described herein.
  • A checkout area may be an area that is associated with a checkout station, such as a conventional checkout area staffed by an employee, a self-checkout area, or a hybrid checkout area (e.g., a checkout area in which one or more employees monitor and assist with self-checkout).
  • The checkout area may be identified in any suitable manner, for example, using a user input such as a keyboard, mouse, or touch screen, or in some embodiments automatically.
  • A checkout region may be selected in any suitable manner, such as drawing the checkout area on the field of view, selecting a point within the field of view that includes a checkout device and selecting the size of the checkout area, automatic selection based on depth information, any other suitable selection method, or any combination thereof.
  • Although a checkout area may be used for any suitable application, in an embodiment the checkout area may be relevant to a checkout analysis application.
  • Statistics may be generated for any object that is identified as a person within the checkout area, as described herein.
  • FIG. 7 depicts a plurality of time-of-flight monitoring units 50, labeled as time-of-flight monitoring units 402a-402i, in a time-of-flight monitoring system in accordance with some embodiments of the present disclosure.
  • FIG. 7 depicts a location 410 (e.g., a retail store) that is monitored by a plurality of time-of-flight monitoring units 402a-402i.
  • A field of view 404a-404i may be associated with each of the time-of-flight monitoring units 402a-402i.
  • The time-of-flight monitoring units may be adapted to a particular space or location, including a plurality of time-of-flight monitoring units having different optical characteristics, fields of view, etc., in order to efficiently monitor spaces having irregular shapes or environmental conditions.
  • In the depicted embodiment, time-of-flight monitoring units 402a-402i are located such that the edge of each associated field of view 404a-404i is continuous with an edge of another field of view 404a-404i.
  • Although the respective field of view edges are depicted as being aligned in FIG. 7, it will be understood that each edge may overlap the adjacent edges to allow for a degree of redundancy in the handoff of object monitoring between time-of-flight monitoring units 402a-402i.
  • In some embodiments, the field of view may have a different shape (e.g., rectangular, circular, semi-circular, etc.) based on the configuration of the IR LEDs 56 and the time-of-flight camera 54.
  • Objects that move through and between any of the fields of view 404a-404i may be continuously monitored and tracked by the time-of-flight monitoring units 402a-402i, local computing devices 406, remote servers 408, or any combination thereof.
  • In some embodiments, each of time-of-flight monitoring units 402a-402i may be aware of and in communication with adjacent monitoring units, such that one of time-of-flight monitoring units 402a-402i may provide information relating to objects to adjacent units. Such information may include whether an object is moving into an adjacent monitoring unit's field of view, a unique identifier associated with the object, depth information for the object, speed of travel, associated demographic data, history data relating to movement through fields of view associated with other monitoring units, statistics for the object, any other suitable information, or any combination thereof.
  • Time-of-flight monitoring units 402a-402i may communicate this information through any suitable communication medium (wired or wireless) as described herein, although in an exemplary embodiment the time-of-flight monitoring units 402a-402i may all be on a shared Ethernet network.
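  • The disclosure lists the kinds of information exchanged between adjacent units but not a message format. Purely as a sketch, a handoff payload sent over the shared network might look like the following; every field name here is hypothetical.

```python
import json

def handoff_message(unit_id, obj):
    """Illustrative JSON payload one unit might send to an adjacent unit
    when a tracked object approaches the shared field-of-view edge."""
    return json.dumps({
        "from_unit": unit_id,
        "object_id": obj["object_id"],        # unique identifier
        "position_m": obj["position_m"],      # location in shared coordinates
        "velocity_mps": obj["velocity_mps"],  # speed and direction of travel
        "height_m": obj["height_m"],          # depth-derived height
        "history": obj["history"],            # previously visited unit ids
        "statistics": obj["statistics"],      # e.g. dwell time so far
    })

# Example payload for a person leaving unit 402a toward a neighbouring unit.
msg = handoff_message("unit-402a", {
    "object_id": "p-1938",
    "position_m": [4.2, 0.3],
    "velocity_mps": [0.9, 0.1],
    "height_m": 1.74,
    "history": ["unit-402b"],
    "statistics": {"dwell_s": 41.5},
})
print(msg)
```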
  • Time-of-flight monitoring units 402a-402i may also be in communication with local computing devices 406, remote servers 408, any other suitable devices, or any combination thereof. In some embodiments, some or all of the processing relating to adjacent monitoring units may be performed by local computing device 406, remote servers 408, or any combination thereof. Moreover, any of time-of-flight monitoring units 402a-402i, local computing devices 406, remote servers 408, or any other suitable device may be in communication with one another.
  • FIG. 8 depicts steps 500 for identifying objects based on depth information in accordance with some embodiments of the present disclosure. It will be understood that steps 500 may be performed at any suitable computing device, such as on processing module 52 of time-of-flight monitoring unit 50 (e.g., any of time-of-flight monitoring units 402a-402i), local computing devices 406, remote servers 408, or distributed among any combination thereof. However, in an exemplary embodiment described herein, the processing described in steps 500 may be efficient enough to be run in real-time on a single processing module 52 of a time-of-flight monitoring unit 50. Although the steps 500 may be performed to identify any suitable objects as described herein, in an exemplary embodiment the object to be identified may be a person.
  • Processing module 52 may receive a frame of depth information, such as a depth map, from time-of-flight camera 54.
  • The depth map may represent depth information obtained for a field of view in accordance with the embodiments described herein.
  • An example of a raw depth map in accordance with embodiments of the present disclosure is depicted at FIG. 9A.
  • In some embodiments, the steps 500 may be performed for a single frame of a depth map, i.e., without secondary information to assist in identifying people, such as tracking of previous positions of people under observation, identifying information to match particular individuals, information from other sensors, any knowledge of prior frames, or any other secondary information.
  • In other embodiments, secondary information may also be used to assist in the identification of people according to steps 500.
  • Steps 504-510 may modify the depth information according to one or more exemplary object identification operations.
  • Processing module 52 may correct the raw depth map image based on one or more optimization factors. Optimization factors may be based on known error sources, such as lens tilt, image distortion, any other suitable factors, or any combination thereof. These optimization factors may be determined during manufacturing, during setup of the time-of-flight monitoring unit 50, based on an analysis of historical depth information data for a particular monitoring unit, at any other suitable time, or any combination thereof.
  • In an exemplary embodiment, the processing module 52 may correct the raw depth map based on the lens tilt value that is set or determined during setup, and a known distortion value associated with the time-of-flight camera 54.
  • An example of an image that has been corrected in this manner is depicted in FIG. 9B.
  • Processing module 52 may also modify values falling outside of a desired depth window.
  • A value may fall outside of a desired depth window if the depth is greater than a comparison value, such as the distance to the floor, a minimal depth (e.g., as set during configuration of the time-of-flight monitoring unit 50), a measured value based on the current frame or historical frames (e.g., an average, median, depth percentage, or similar value), or any other suitable comparison value.
  • Although depth values that fall outside of the desired depth window may be modified in any suitable manner, in some embodiments the depth values may be set to a constant value, such as the depth value associated with the distance to the floor.
  • An example of an image that has been modified in this manner is depicted in FIG. 9C.
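  • A minimal sketch of this depth-window step, assuming a ceiling-mounted unit where the comparison value is the floor distance and the window is defined by the configured minimal height, might look as follows (all values are illustrative).

```python
import numpy as np

def clip_depth_window(depth_map, floor_dist_m, min_height_m):
    """Replace depth values outside the window of interest with the floor
    distance, as one reading of the step described above.

    Pixels farther away than the floor (noise or reflections) and pixels
    whose implied height above the floor is below the configured minimum
    are both flattened to the floor distance.
    """
    out = np.asarray(depth_map, dtype=float).copy()
    too_far = out > floor_dist_m
    too_short = (floor_dist_m - out) < min_height_m
    out[too_far | too_short] = floor_dist_m
    return out

# Example: 4 m mounting height, ignore anything shorter than 1.2 m.
frame = np.array([[4.3, 4.0, 2.3],
                  [3.5, 2.2, 2.4]])
print(clip_depth_window(frame, floor_dist_m=4.0, min_height_m=1.2))
```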
  • Processing module 52 may also calculate the standard deviation of the image pixels and modify the image based on the standard deviation values.
  • A high standard deviation value may be indicative of sudden changes in depth, for example, at a border region between an object and the floor.
  • The standard deviation values may be compared to a threshold (e.g., a predetermined threshold, a threshold determined based on the configuration of the time-of-flight monitoring unit 50, a threshold based on current and/or historical image frames, etc.), and any pixels associated with a standard deviation that exceeds the threshold may be modified.
  • For example, the depth values may be set to a constant value, such as the depth value associated with the distance to the floor.
  • An example of an image that has depth values modified in this manner is depicted in FIG. 9D.
  • Processing module 52 may also modify the image based on morphology characteristics of the object to be identified (e.g., a person). For example, it may be known that a depth map associated with a person and captured from a ceiling-mounted camera is likely to have certain characteristic shapes and depth profiles, based for example on head, shoulder, and arm locations. The mathematical morphology characteristics may emphasize the shapes of the objects and make them easier to identify and to split in the case of multiple objects. Based on these characteristics, the image may be modified in a manner that is likely to emphasize portions of the image that correspond to the object to be identified. In some embodiments, pixels may be removed or added in regions that conform to sets of rules associated with the morphology characteristics. An example of an image that has depth values modified in this manner is depicted in FIG. 9E.
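  • The standard-deviation and morphology steps above could be sketched roughly as below, working on a height-above-floor image so that people appear as bright blobs; the thresholds, window sizes, and the use of a grey-scale opening are assumptions chosen for illustration, not details from the disclosure.

```python
import numpy as np
from scipy import ndimage

def clean_height_map(depth_map, floor_dist_m, std_threshold=0.15,
                     window=3, opening_size=3):
    """Sketch of the clean-up steps above (all thresholds illustrative).

    1. Pixels whose local standard deviation exceeds a threshold (sharp
       depth changes, e.g. object/floor borders) are zeroed.
    2. Grey-scale morphological opening removes blobs smaller than the
       structuring element, emphasising person-sized shapes.
    """
    height = floor_dist_m - np.asarray(depth_map, dtype=float)
    mean = ndimage.uniform_filter(height, size=window)
    mean_sq = ndimage.uniform_filter(height * height, size=window)
    local_std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

    height = height.copy()
    height[local_std > std_threshold] = 0.0
    return ndimage.grey_opening(height, size=(opening_size, opening_size))

# Example: a 7x7-pixel "person" and a single-pixel noise spike.
frame = np.full((20, 20), 4.0)
frame[5:12, 5:12] = 2.3   # person-like blob, about 1.7 m tall
frame[0, 0] = 2.0         # isolated noise pixel
cleaned = clean_height_map(frame, floor_dist_m=4.0)
print(cleaned.max().round(2), cleaned[0, 0])  # blob survives, noise removed
```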
  • Processing module 52 may then compare the depth map to a reference pattern or may use classification algorithms for identifying learned patterns. Although it will be understood that the comparison may be performed in any suitable manner, and that any suitable reference pattern may be used, in an exemplary embodiment processing module 52 may calculate a correlation matrix between the depth map and a Gaussian distribution, or may apply classification algorithms to the depth map, including cascade classifiers such as Viola-Jones or other classifiers.
  • An exemplary Gaussian distribution is depicted in FIG. 10A.
  • The image that is compared to the Gaussian distribution may be the image depicted in FIG. 9F, wherein the three objects that should be identified as people are marked with crosshairs.
  • An example of a correlation matrix based on the comparison of the image of FIG. 9F to a Gaussian distribution is depicted in FIG. 10B.
  • Processing module 52 may determine identification values for the objects to be identified (e.g., people) based on the correlation matrix.
  • Although identification values may be calculated in any suitable manner, in some embodiments the objects may be identified based on the absolute values, correlations, and standard deviations associated with each of the maxima of the correlation matrix. Additional logic may be applied for more accurate object identification, such as the size or shape of the object, the history of detections over time, any other suitable logic, or any combination thereof.
  • Processing module 52 may identify candidate maxima based on one or more of the identification values. Although any or all of the identification values may be used to identify candidate maxima, in an exemplary embodiment the candidate maxima may be based on the correlation value associated with each of the maxima exceeding a threshold (e.g., a predetermined threshold, a threshold determined based on the configuration of the time-of-flight monitoring unit 50, a threshold based on current and/or historical image frames, etc.). At step 518, processing module 52 may compare the identification values associated with each of the candidate maxima to one or more selection rules to identify the objects (e.g., people) from the depth information. Exemplary rules may include morphology rules.
  • The desired objects (e.g., the three people of FIG. 9F) may be identified at step 520.
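  • As a rough sketch of the correlation-based identification described above (template size, correlation threshold, and neighbourhood size are assumptions), the following correlates a cleaned height map with a Gaussian template and keeps local maxima above a threshold as candidate people.

```python
import numpy as np
from scipy import ndimage

def gaussian_kernel(size=15, sigma=3.0):
    """2-D Gaussian template, roughly the scale of a head/shoulder blob."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def detect_people(height_map, corr_threshold=0.5, neighborhood=11):
    """Correlate the cleaned height map with a Gaussian template and keep
    local maxima whose correlation exceeds a threshold (values illustrative)."""
    corr = ndimage.correlate(height_map, gaussian_kernel(), mode="constant")
    local_max = ndimage.maximum_filter(corr, size=neighborhood)
    peaks = (corr == local_max) & (corr > corr_threshold)
    rows, cols = np.nonzero(peaks)
    return list(zip(rows.tolist(), cols.tolist())), corr

# Example: two synthetic "people" in a 40x40 height map.
hm = np.zeros((40, 40))
hm[8:15, 8:15] = 1.7
hm[25:32, 20:27] = 1.6
candidates, _ = detect_people(hm)
print(candidates)  # approximately [(11, 11), (28, 23)]
```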
  • Processing module 52 may then determine unique data for each of the identified objects.
  • Unique data may be associated with a particular object, and may provide information about the object's movements and activities within the field of view.
  • Unique data may be determined based on the depth information from time-of-flight camera 54, data from secondary sensor 58, other data related to objects stored in a database, any other data source, or any combination thereof, as described herein.
  • Exemplary unique data may include the location within the field of view, elapsed time within the field of view and at a particular location, depth information related to the object, movement within the field of view, calculated statistics related to the object, any other suitable information, or any combination thereof.
  • FIG. 11 depicts steps 600 for analyzing video image signals acquired by a secondary sensor 58 in accordance with some embodiments of the present disclosure.
  • The steps 600 may be performed at any suitable computing device, such as on processing module 52 of a time-of-flight monitoring unit 50 (e.g., any of time-of-flight monitoring units 402a-402i), local computing devices 406, remote servers 408, or distributed among any combination thereof.
  • In an exemplary embodiment, the processing described in steps 600 may be efficient enough to be run in real-time on a single processing module 52 of a time-of-flight monitoring unit 50.
  • Although the steps 600 may be performed to analyze video image signals associated with any suitable objects as described herein, in an exemplary embodiment the objects to be analyzed may be people.
  • Processing module 52 may associate the video image signal with the depth information (e.g., depth map).
  • The video image signal received from a secondary sensor 58, such as an RGB video sensor, may be scaled and oriented to correspond to the depth information (e.g., depth map) received from time-of-flight camera 54.
  • The scaling and/or orientation may be based on information determined during manufacturing, during configuration, or based on real time data (e.g., identifying prominent objects in both a video image signal and depth map and basing the scaling and/or orientation on the locations of the prominent objects).
  • In this way, the video image signal can be effectively used for retrieving information about a person, such as demographic information, color information about a person's clothing (e.g., for identifying employees), any other suitable information, or any combination thereof.
  • In some embodiments, the association of the video image signal and depth information may result in a combined data image having a custom format including both image pixel and depth information (e.g., for an RGB video image signal combined with depth map information, an "RGB-D" signal).
  • Processing module 52 may identify the objects (e.g., people) in the video image signal based on the identified locations determined from the depth information (e.g., based on the process described in steps 500).
  • Processing module 52 may identify regions of interest associated with each of the locations for each of the objects.
  • The identified region may be predetermined, based on depth information (e.g., the borders associated with the object from the processed depth image), based on the video image signal, determined by any other suitable procedure, or any combination thereof.
  • For example, the region of interest may include the object itself as determined from the depth information as well as a buffer region surrounding the object.
  • Processing module 52 may then process the regions of interest of the video image signal. Processing only the regions of interest reduces the processing load and time, and may allow for complex operations to be performed in real-time at processing module 52, even when processing module 52 has limited processing speed or power. Although any suitable processing may be performed, in some embodiments processing module 52 may identify individuals, perform biometric analysis, determine statistics, or determine any other suitable information for the objects that are within the region of interest.
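  • A minimal sketch of this region-of-interest step, assuming detections have already been mapped into video pixel coordinates and using a fixed pixel buffer (both assumptions), might look as follows.

```python
import numpy as np

def extract_regions(rgb_frame, detections, buffer_px=20):
    """Crop a region of interest around each depth-detected person so that
    only those pixels are passed to heavier video analysis.

    detections -- (row, col) pixel locations already mapped into the video
                  frame; buffer_px is an illustrative margin around each one
    """
    h, w = rgb_frame.shape[:2]
    regions = []
    for row, col in detections:
        top = max(row - buffer_px, 0)
        left = max(col - buffer_px, 0)
        bottom = min(row + buffer_px, h)
        right = min(col + buffer_px, w)
        regions.append(rgb_frame[top:bottom, left:right])
    return regions

# Example: two detections in a 480x640 RGB frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
rois = extract_regions(frame, [(120, 200), (300, 500)], buffer_px=40)
print([r.shape for r in rois])  # [(80, 80, 3), (80, 80, 3)]
```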
  • In some embodiments, processing may include identifying a person of interest such as an employee, a customer service specialist, a customer, a manager, or a checkout specialist. For example, it may be known that an employee wears a certain uniform or subset of uniforms, a color scheme, or another identifiable color or pattern. This pattern may be known, and a criterion may be used to determine whether the object is a person of interest.
  • Although a criterion may be set in any suitable manner, in some embodiments the criterion may be set manually, automatically, or any combination thereof.
  • For example, a criterion may specify a color or set of colors and a threshold, e.g., a percentage of the video image signal of the person that must include the color or colors.
  • In other embodiments, persons of interest may be identified within a set of images and a learning algorithm may determine criteria for distinguishing persons of interest from other people or objects.
  • Multiple criteria may be provided to identify multiple types of persons of interest, such as an employee, a customer service specialist, a customer, a manager, or a checkout specialist.
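  • One simple color-fraction criterion of the kind described above might be sketched as follows; the uniform color, tolerance, and threshold are placeholders, and the disclosure also contemplates learning such criteria from labelled examples instead.

```python
import numpy as np

def looks_like_employee(roi_rgb, uniform_rgb=(20, 60, 160),
                        tol=40, min_fraction=0.25):
    """Very simple color-fraction test for a 'person of interest': the
    fraction of region-of-interest pixels close to a known uniform color
    must exceed a threshold (all parameter values are placeholders)."""
    roi = np.asarray(roi_rgb, dtype=int)
    diff = np.abs(roi - np.array(uniform_rgb)).max(axis=-1)
    fraction = float((diff <= tol).mean())
    return fraction >= min_fraction, fraction

# Example: a region where about 30% of pixels match the uniform color.
roi = np.zeros((50, 50, 3), dtype=np.uint8)
roi[:15, :, :] = (25, 65, 155)   # shirt-colored band
print(looks_like_employee(roi))  # (True, 0.3)
```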
  • FIG. 12 depicts an exemplary monitoring environment for an exemplary time-of- flight monitoring system in accordance with embodiments of the present disclosure.
  • An exemplary application for the monitoring environment may be a monitoring application for a retail location 2 as depicted in FIG. 12.
  • Although a retail application is provided as an exemplary application, a person having ordinary skill in the art will understand that the systems and methods described herein may be implemented in numerous other monitoring applications, such as manufacturing facilities, shipping facilities, agricultural facilities, workplaces, government buildings, security queues, private residences, and other applications. While the exemplary embodiment described with respect to FIG. 12 may focus on the monitoring of customers and employees, numerous other objects may be monitored such as crates, boxes, vehicles, livestock, etc.
  • Exemplary retail location 2 may include a plurality of customers 10 and 11, a display area 40, a ceiling-mounted time-of-flight monitoring unit 50a, and a wall-mounted time-of-flight monitoring unit 50b.
  • Both of the time-of-flight monitoring units 50a and 50b may capture information related to the same or similar target area from different angles.
  • In some embodiments, any number of time-of-flight monitoring units 50 may be associated with a single target area.
  • Although the time-of-flight monitoring units are oriented at right angles in this exemplary embodiment, it will be understood that a plurality of monitoring units may be oriented at a variety of relative locations.
  • Time-of-flight monitoring unit 50b may operate in a similar manner to a time-of-flight monitoring unit located on the ceiling (e.g., time-of-flight monitoring unit 50a), with modifications to adjust for the differing location relative to the objects of interest.
  • some or all of the processing of depth information and/or information from a secondary sensor 58 may be processed by a processing module 52 of time- of-flight monitoring unit 50b.
  • time-of-flight monitoring units e.g.
  • time-of-flight monitoring unit 50a or 50b may function as a master device and the other as a slave device.
  • a monitoring unit located at any suitable orientation may be modified by modifying one or more rules or parameters of interest.
  • the configuration of a time-of-flight monitoring unit 50b having a side-facing orientation may focus on a particular region 70 within the field of view of the time-of-flight camera associated with time-of-flight monitoring unit 50b, such as the region 70 near display area 40, as depicted in FIG. 12.
  • a minimum and a maximum depth associated with the region 70 may be determined automatically, based on user inputs, or any combination thereof.
  • this may include customer 10 located in proximity to display area 40, but not include customer 11 located outside of the region 70. A minimal sketch of such region-and-depth filtering follows below.
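The following is a minimal sketch of restricting analysis to a region such as region 70 using image coordinates and a minimum/maximum depth. The data structures and field names are illustrative assumptions; the disclosure does not prescribe a particular representation.

    # Minimal sketch of filtering detected objects to a region of interest.
    # Detection and Region are illustrative structures, not defined in the source.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Detection:
        x: float        # image column of the object's centroid
        y: float        # image row of the object's centroid
        depth_m: float  # distance reported by the time-of-flight camera

    @dataclass
    class Region:
        x_min: float
        x_max: float
        y_min: float
        y_max: float
        depth_min_m: float
        depth_max_m: float

        def contains(self, d: Detection) -> bool:
            return (self.x_min <= d.x <= self.x_max
                    and self.y_min <= d.y <= self.y_max
                    and self.depth_min_m <= d.depth_m <= self.depth_max_m)

    def filter_to_region(detections: List[Detection], region: Region) -> List[Detection]:
        """Keep only detections inside the configured region of interest."""
        return [d for d in detections if region.contains(d)]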
  • FIG. 13 depicts steps 700 for determining identifying characteristics using machine learning in accordance with some embodiments of the present disclosure.
  • the time-of-flight monitoring units 50 described herein may obtain a wealth of information based on the depth information from time-of-flight camera 54, information from one or more secondary sensors 58 (e.g., a video sensor), or a combination thereof.
  • this information may be used to identify numerous identifying characteristics about the objects of interest, such as identifications of an employee, a customer service specialist, a customer, a manager, or a checkout specialist, as described above.
  • exemplary identifying characteristics of a person include clothing (e.g., color of clothing), height, weight, age, age range (e.g., newborn, toddler, child, pre-teen, teenager, adult, senior citizen, etc.), hair color, eye color, biometrics (facial recognition, iris recognition, etc.), gender, race, ethnicity, identity (e.g., matching to a known individual), likely income level (e.g., based on identifying information, shopping history, information about other previously visited locations), employee status, customer status, energy level, emotions, anxiety level, intoxication (e.g., based on irregular movement patterns), customer confusion, likelihood of making a purchase, any other suitable identifying characteristics, or any combination thereof.
  • each of the steps 700 may be performed by any suitable computing device (e.g. , computing devices 406), server (e.g. , servers 408), or any combination thereof, at any suitable location or combination thereof, and in any suitable manner (e.g. , local vs. distributed processing).
  • a data source includes at least one time-of-flight monitoring unit 50, and can further include information from other sensors, user-provided information, existing information databases (e.g., a database including customer information), any other suitable data source, or any combination thereof.
  • user-provided information may associate identifying characteristics with data obtained from time-of-flight monitoring unit 50.
  • although user-provided information may be provided for any identifying characteristic, in an exemplary embodiment users may desire to distinguish between customers and employees based on data from a time-of-flight monitoring unit 50 including a time-of-flight camera 54 and an RGB video sensor as a secondary sensor 58.
  • One or more users may monitor a user interface that allows them to view outputs from time-of-flight monitoring units that identify people in retail locations.
  • when the system identifies an object (e.g., based on depth information) as a person, the users may then manually determine whether that person is a customer, employee, or other (e.g., small children, law enforcement, regulatory personnel, cleaning crews, etc.).
  • the determinations, video, and depth information may be stored, providing a data source that may be used by a machine learning system to provide algorithms for classifying a person as a customer, employee, or other.
  • the data may be stored over any suitable time period and on any suitable geographic scale, including from a single time-of-flight monitoring unit, from a single location, from a plurality of locations, etc.
  • one or more learning criteria may be provided.
  • Learning criteria may be user-provided, may be unsupervised, or any combination thereof.
  • exemplary learning criteria include one or more of classification, regression, clustering, density estimation, and dimensionality reduction.
  • the learning criteria may include three classes of employee, customer, and other.
  • the data from the data source may be analyzed based on the learning criteria according to one or more machine learning algorithms.
  • Exemplary machine learning algorithms may include neural networks, similarity learning methods, Bayesian networks, genetic algorithms, principal component analysis, independent component analysis, cluster analysis, support vector machines, any other suitable learning algorithm, or any combination thereof.
  • Data from the data source may be analyzed by the learning algorithm based on the provided learning criteria, providing as an output one or more algorithms for identifying the desired identification characteristic from real-world data, such as output data from a time-of- flight monitoring unit 50.
  • for example, data from the data source (e.g., stored depth and video information from a time-of-flight monitoring unit with people classified as employee, customer, or other) may be input to a cascade classifier such as Viola-Jones trained on depth representations of objects. A minimal training sketch appears at the end of this figure's discussion below.
  • the resulting identified objects can be further classified based on information retrieved from the video sensor such as color information.
  • the one or more algorithms provided by the learning algorithm may be implemented for use with one or more time-of-flight monitoring units 50.
  • the learning algorithm may be implemented in any suitable manner, such as in software residing on a computer readable medium and running on processing module 52, one or more computing devices 406, one or more servers 408, any other suitable device, or any combination thereof.
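The following is a minimal, illustrative training sketch for the machine-learning step of FIG. 13. It assumes that per-person features (e.g., a height estimate from the depth map and a coarse color histogram from the video crop) and manually reviewed labels are already available; scikit-learn and an SVM are used only as one example of the many learning algorithms mentioned above, and the synthetic data stands in for a real data source.

    # Minimal sketch of learning an employee/customer/other classifier.
    # Features and labels here are synthetic stand-ins for illustration only.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 9))     # stand-in features: e.g., height plus an 8-bin color histogram
    y = rng.integers(0, 3, size=300)  # stand-in labels: 0 = customer, 1 = employee, 2 = other

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))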
  • FIG. 14 depicts steps 800 for establishing privacy settings in accordance with some embodiments of the present disclosure.
  • a time-of-flight monitoring unit 50 includes a time-of-flight camera 54.
  • depth information such as a depth map from a time-of-flight camera 54 is unlikely to include data or images that can be used to identify a particular individual. This is because depth information generally captures only the physical shape of a person from a particular angle. And while it may be possible to identify a particular individual based on such data, as a general matter such information is best suited to distinguishing between general sizes or shapes (e.g., child vs. adult, male vs. female, etc.).
  • a time-of-flight monitoring system may provide for anonymous monitoring coupled with large-scale data acquisition.
  • a time-of-flight monitoring unit 50 (or a network of time-of-flight monitoring units) is useful for numerous applications spanning a broad range of industries. These applications may have differing privacy requirements. For example, legal or regulatory requirements may limit the types of information that can be acquired in public locations, private property that is open to the public, etc. Customers may not wish to have their movements tracked in an identifiable manner at a retail location or similar environment, and may choose not to frequent establishments that store identifiable information. On the other hand, there may be applications such as government buildings, public transportation, workplaces, security queues, etc., where there may not be a similar expectation of privacy.
  • although privacy settings may be set in any suitable manner, in some embodiments privacy settings may be set by the manufacturer, on the monitoring units, at a connected computing device or server, or any combination thereof. Privacy settings may be user configurable or may be set automatically (e.g., based on legal requirements associated with the jurisdiction where the monitoring units will be used).
  • display privacy settings may be set.
  • Display settings may relate to the information that is displayed to a user (e.g. , at a computing device 406) on a display.
  • Display privacy settings may include settings for statistics, depth information, object information, video image signal data, any other suitable data or information, or any combination thereof.
  • an exemplary setting for displaying statistics may only display statistics relating to a particular location that is being monitored (e.g. , queue size, wait time, dwell, customer counts, etc.).
  • An exemplary setting including statistics and depth information may include an image of the depth map along with the statistics, which may allow an observer to view the location of people in a manner that makes it difficult to identify any particular individual.
  • An exemplary setting that includes object information may permit a user to view information about a person in the field of view, such as demographic information, identifying information, history information, any other suitable information, or any combination thereof.
  • An exemplary setting that includes video image signal data may permit a user to view information such as a live video feed.
  • Storage settings may relate to the information that is stored (e.g. , at time-of-flight monitoring unit 50, computing devices 406, and/or servers 408).
  • Storage privacy settings may include settings for statistics, depth information, object information, video image signal data, any other suitable data or information, or any combination thereof, as described above with respect to display privacy settings.
  • the display and storage settings may be applied. Display and storage settings may be applied at one or more of time-of-flight monitoring units 50 (e.g. , time- of-flight monitoring units 402a-402i at locations 410 and 412a-412c), computing devices 406, and servers 408. In some embodiments, applying the display and storage settings may include disabling functionality of the time-of-flight monitoring units 50, such that any information that is not to be displayed or stored is not determined or acquired in the first instance. In some embodiments, applying the display and storage settings may include disabling the transmission of certain data or information from time-of-flight monitoring units 50 to one or more of computing devices 406 and/or servers 408. In some embodiments, applying the display and storage settings may include the computing devices 406 and/or servers 408 not displaying or storing particular data or information.
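The following is a minimal sketch of applying display and storage privacy settings to a per-frame record before it is displayed, stored, or transmitted. The setting names and the dictionary-based record are illustrative assumptions rather than a format defined by this disclosure.

    # Minimal sketch of applying display/storage privacy settings.
    # PrivacySettings fields and record keys are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        show_statistics: bool = True
        show_depth: bool = True
        show_object_info: bool = False
        show_video: bool = False   # e.g., disabled for anonymous monitoring

    def apply_privacy(record: dict, settings: PrivacySettings) -> dict:
        """Strip fields that the privacy settings do not allow to be displayed or stored."""
        allowed = {
            "statistics": settings.show_statistics,
            "depth_map": settings.show_depth,
            "object_info": settings.show_object_info,
            "video_frame": settings.show_video,
        }
        return {k: v for k, v in record.items() if allowed.get(k, False)}

    # usage: filtered = apply_privacy(frame_record, PrivacySettings(show_video=False))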
  • FIG. 15 depicts steps 900 for determining statistics in accordance with some embodiments of the present disclosure.
  • time-of-flight monitoring unit 50 may have applications monitoring a variety of different object types in a variety of location types. Multiple units may be located in a single location at a plurality of different views and perspectives, and data from multiple locations may be stored and processed together. Accordingly, it will be understood that the time-of-flight monitoring units 50 described herein may be used to determine numerous types of statistics, whether at the level of the time-of-flight monitoring unit 50, computing devices 406, servers 408, any other suitable device, or any combination thereof.
  • a variety of types of information may be determined based on the depth information from the time-of-flight camera 54, and this information may also be integrated with information from a secondary sensor 58 such as a video sensor, and other data sources such as customer databases.
  • Statistics may relate to any such information, such as the previously described identifying characteristics including height, weight, age, age range (e.g., newborn, toddler, child, pre-teen, teenager, adult, senior citizen, etc.), hair color, eye color, biometrics (facial recognition, iris recognition, etc.), gender, race, ethnicity, identity (e.g., matching to a known individual), and likely income level (e.g., based on identifying information, shopping history, or information about other previously visited locations), among others.
  • steps 900 may be used for determining any other suitable statistics that may be determined based on the present disclosure.
  • for people counting, the system may identify people based on depth information and determine when they enter a location based on a criterion such as a boundary line, e.g., a line that marks the entry to a store or a department within a store.
  • the people count may be incremented whenever a person crosses the boundary line, the first time a person crosses the boundary line, in any other suitable manner, or any combination thereof. In some embodiments, only particular people (e.g., non-employee customers) may be counted. A minimal sketch of boundary-line counting follows below.
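The following is a minimal sketch of boundary-line people counting. It assumes an upstream tracker that assigns a track identifier and a per-frame image coordinate to each person, and counts a track the first time it crosses a horizontal boundary line in the entry direction; these details are assumptions for illustration.

    # Minimal sketch of counting people who cross a boundary line.
    # BOUNDARY_Y and the tracker interface are illustrative assumptions.
    from typing import Dict

    BOUNDARY_Y = 240.0  # hypothetical image row of the boundary line

    class PeopleCounter:
        def __init__(self) -> None:
            self.count = 0
            self._last_y: Dict[int, float] = {}  # track_id -> previous y position
            self._counted = set()                # track_ids already counted once

        def update(self, track_id: int, y: float) -> None:
            prev = self._last_y.get(track_id)
            if (prev is not None
                    and track_id not in self._counted
                    and prev < BOUNDARY_Y <= y):  # crossed the line in the entry direction
                self.count += 1
                self._counted.add(track_id)
            self._last_y[track_id] = y

    # usage: counter.update(track_id, y) once per tracked person per frame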
  • for zone analysis, it may be desired to determine statistics about a particular portion of the field of view, e.g., a defined zone. People may be identified based on depth information and their activity may be tracked within a zone, the zone being defined in any suitable manner, e.g., by a boundary region. Although a boundary region may be defined in any suitable manner, in some embodiments the boundary region may be a defined area within the field of view.
  • although the boundary region may be defined in any suitable manner, in some embodiments the boundary region may be based on a user input, e.g., drawing an area on a user interface with a touchscreen or mouse, defining a central location of an area and defining a size and/or shape of the area, any other suitable manner, or any combination thereof.
  • Statistics may then be determined for the zone.
  • exemplary statistics include zone count, dwell time, customer location, customer engagement, and staff-to-customer ratio.
  • for queue analysis, it may be desired to determine statistics about a portion of the field of view associated with a customer queue. People may be identified based on depth information and their activity may be tracked within the queue, the queue being defined in any suitable manner, e.g., based on a boundary region or the location of a checkout area. Although a boundary region may be depicted and defined in any suitable manner, in some embodiments the boundary region may be depicted and defined as described above, for example for a queue area associated with a checkout area. Although a checkout area may be defined in any suitable manner, in some embodiments the checkout area may be defined manually, automatically, or any suitable combination thereof.
  • a user may use a user input to select the location of one or more checkout locations and define a region around the checkout location as the checkout area.
  • checkout locations may be identified automatically, e.g., based on depth data or a beacon.
  • the statistics may be modified based on whether the queue is structured or unstructured.
  • An exemplary structured queue may have predictable positions for customers, checkout locations, employees, items for purchase, etc. Examples may include supermarket checkouts, retail locations, banks, airport security, and other similar locations.
  • An exemplary unstructured queue may have varied locations for customers, checkout locations, employees, items for purchase, etc. Examples may include self-checkout areas, ATMs, and other similar locations.
  • Statistics may then be determined for the queue. Although it will be understood that any suitable statistics may be determined for the queue, exemplary statistics include waiting time, queue count, and abandonment rate.
  • the statistics may be modified based on whether the checkout area is operated by an employee, is a self-checkout area, or is a hybrid checkout area.
  • in a self-checkout area, multiple checkout systems may be served by a smaller number of employees who assist customers when they have problems or issues with self-checkout.
  • at a hybrid checkout location, it may be possible to switch between a self-checkout mode and a standard cashier-operated mode.
  • Statistics may then be determined for the checkout area.
  • exemplary statistics include checkout time, wait time, and staff support level.
  • a statistic may be selected or created.
  • one or more statistics may be pre-configured, such that a user may select from known statistics.
  • statistics may be created through a user interface.
  • An exemplary user interface may allow a user to choose from among available data sources (e.g., data from time-of-flight monitoring unit 50, computing devices 406, and/or servers 408) to create a statistic.
  • a user may then identify how the data may be used or combined to create the statistic (e.g. , combining data relating to whether a person is an employee with data relating to the person's location within a store to provide statistics regarding employee activity).
  • the statistic may be configured.
  • a user interface may allow for a user to set parameters that may be used to configure the statistic.
  • it may be desired to determine how many people enter a location.
  • Configuring a people counting statistic may include setting a threshold location for counting (e.g. , drawing a line within the field of view via a user interface), setting a direction for counting (e.g., entry, exit, or both), selecting whether to limit the count to unique users (e.g., based on user tracking or identifying information), any other suitable parameters, or any combination thereof.
  • for zone analysis or queue analysis, it may be desired to determine statistics about people's activities within a particular area of a location.
  • Configuring a statistic may include setting an area for analysis (e.g. , drawing a shape within the field of view via a user interface that defines the zone), selecting relevant people to count (e.g. , customers, employees, and/or other), identifying checkout locations, any other suitable parameters, or any combination thereof.
  • the statistic may be compiled. Although statistics may be compiled in any suitable manner, in some embodiments the statistic may be compiled by one or more time-of-flight monitoring units 50, computing devices 406, servers 408, any other suitable device, or any combination thereof, based on data from one or more time-of-flight monitoring units 50 and any other suitable data sources. This information may be transmitted to a computing device 406 and/or server 408 for display or storage.
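The following is a minimal sketch of configuring and compiling a people-counting statistic using the parameters described above (boundary line, counting direction, unique-person counting, and which people to count). The configuration format and the fields of the crossing events are illustrative assumptions, not a format defined by this disclosure.

    # Minimal sketch of configuring and compiling a people-counting statistic.
    # PeopleCountConfig fields and the event dictionaries are illustrative only.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class PeopleCountConfig:
        boundary: Tuple[Tuple[float, float], Tuple[float, float]]  # two endpoints of the line
        direction: str = "entry"          # "entry", "exit", or "both"
        unique_only: bool = True          # count each tracked person at most once
        include_employees: bool = False   # e.g., count only non-employee customers

    @dataclass
    class CompiledStatistic:
        name: str
        value: int
        sources: List[str] = field(default_factory=list)  # unit IDs that contributed data

    def compile_people_count(config: PeopleCountConfig, crossings: List[dict]) -> CompiledStatistic:
        """Compile a count from crossing events reported by one or more monitoring units."""
        counted = set()
        total = 0
        for event in crossings:  # event keys: "track_id", "direction", "is_employee", "unit_id"
            if config.direction != "both" and event["direction"] != config.direction:
                continue
            if not config.include_employees and event.get("is_employee", False):
                continue
            if config.unique_only:
                if event["track_id"] in counted:
                    continue
                counted.add(event["track_id"])
            total += 1
        return CompiledStatistic(name="people_count", value=total,
                                 sources=sorted({e["unit_id"] for e in crossings}))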
  • FIG. 16 depicts steps 1000 for identifying a person of interest based on a beacon signal in accordance with some embodiments of the present disclosure. As described herein, it may be desired to identify persons of interest, for example, in order to generate statistics relating to a location. Although in some embodiments a person of interest may be identified based on depth information, a video image signal, or a combination thereof, in other embodiments a person of interest may be identified based on a beacon signal, for example, an infra-red beacon carried by an employee.
  • the time-of-flight monitoring unit may provide a wake-up signal for the beacons.
  • a wake-up signal may be provided so that the beacons do not have to be powered and transmitting at all times, and instead transmit in response to the wake-up signal.
  • any suitable wake-up signal may be provided that can be sensed by circuitry of the beacon, such as an infra-red light pattern, an ultrasonic signal, or a modulated communication signal.
  • the time-of-flight monitoring unit may include additional hardware for providing the wake-up signal, e.g., to provide an ultrasonic signal or a modulated communication signal.
  • although the wake-up signal may be provided in any suitable manner, in an embodiment the wake-up signal may be provided on a periodic basis.
  • the time-of-flight monitoring unit may receive an infra-red beacon signal.
  • An infra-red beacon signal may be provided in any suitable manner that allows the time-of-flight monitoring unit to distinguish the profile of the beacon signal (e.g., signal intensity, wavelength, modulation frequency, etc.) from other infrared light received at the time-of-flight sensor of the time-of-flight monitoring unit.
  • the system may then identify precisely the coordinates of the beacon signal source in the field of view of the time-of-flight camera and associate this signal source with a person to be detected and tracked by the system.
  • the time-of-flight monitoring unit may associate the received beacon signal with an object that has been identified by the time-of-flight monitoring unit. Based on this association, the identified object may be identified as a person of interest, such as an employee.
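The following is a minimal sketch of associating a detected infra-red beacon with the nearest tracked object, as described for FIG. 16. The nearest-centroid rule and the pixel-distance threshold are illustrative assumptions; the disclosure does not specify a particular association method.

    # Minimal sketch of associating a beacon source with a tracked object.
    # The TrackedObject structure and max_distance_px are illustrative assumptions.
    import math
    from typing import List, Optional, Tuple

    class TrackedObject:
        def __init__(self, track_id: int, x: float, y: float) -> None:
            self.track_id = track_id
            self.x = x
            self.y = y
            self.is_person_of_interest = False

    def associate_beacon(beacon_xy: Tuple[float, float], objects: List[TrackedObject],
                         max_distance_px: float = 40.0) -> Optional[TrackedObject]:
        """Mark the object closest to the beacon source as a person of interest."""
        best, best_dist = None, max_distance_px
        for obj in objects:
            dist = math.hypot(obj.x - beacon_xy[0], obj.y - beacon_xy[1])
            if dist < best_dist:
                best, best_dist = obj, dist
        if best is not None:
            best.is_person_of_interest = True  # e.g., flag the tracked person as an employee
        return best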

Abstract

A time-of-flight monitoring unit includes a processing module, an illumination system, a time-of-flight camera, and a secondary sensor. The illumination system illuminates a field of view, providing light that is received by the time-of-flight camera. The time-of-flight camera generates depth information that may be used by the processing module to identify a plurality of objects within the field of view. The processing module may also analyze data from the secondary sensor based on the depth information.

Description

TITLE: TIME-OF-FLIGHT MONITORING SYSTEM
FIELD OF THE INVENTION
[0001] The subject disclosure is directed to monitoring a field of view with a time-of- flight monitoring system and analyzing time-of-flight and related data.
BACKGROUND OF THE INVENTION
[0002] Monitoring equipment is regularly employed by retail businesses, manufacturers, workplaces, government entities, transportation facilities, homeowners, and numerous other individuals and entities. These varied users employ such equipment for numerous applications, including analytics of shopping patterns, security, crime prevention, employee monitoring, and efficiency studies, among others. The monitoring equipment may be capable of observing a field of view at a location, and it may be desired to obtain information about objects or people within the field of view. For example, desired information may include movement patterns, demographic information, physical characteristics, the identification of particular individuals, and more complex statistics that may be determined based on a combination of this information and other similar information. This information can also be compiled over time, to be used for even more complex analytics that may aid in long-term decision making for an individual or entity.
[0003] Some monitoring equipment may utilize a video sensor that provides a video image signal such as video image stream or a sequence of still images. Under ideal conditions, video-based systems may provide a large amount of raw information about the observed field of view. It may be necessary to employ significant resources in order to extract useful information from this large amount of raw image -based information. For example, in many circumstances a person may be required to physically observe the video image stream or still images, and for more complex analytics, may need to physically observe and record information about individuals within the field of view. Such personal observation is subject to numerous deficiencies, such as fatigue, variation between different observers, and expense. In other circumstances, an image may be automatically processed by a computing device such as a computer or remote server. Because of the difficulty of extracting desired information from the large amount of raw image-based information, such systems may require a large amount of computing power or computing time. Image-based monitoring methods are dependent on ambient or environmental conditions at or near the monitoring device, such as the amount, direction, and types of lighting and any objects or conditions that interfere with the lens of the imaging device.
[0004] Other types of monitoring equipment may include thermal sensors, acoustic sensors, or other similar technologies. Unlike image sensors, systems employing these technologies may allow for simpler processing techniques, or may operate better under challenging ambient or environmental conditions. However, existing sensors of these types may lack the ability to identify many types of desired information or the ability to capture desired information over a large enough field of view, or may have a significant material cost that renders them less useful for many applications.
[0005] The above-described deficiencies of today's monitoring solutions are merely intended to provide an overview of some of the problems of conventional systems, and are not intended to be exhaustive. Other problems with conventional systems and corresponding benefits of the various non-limiting embodiments described herein may become further apparent upon review of the following description.
SUMMARY OF THE INVENTION
[0006] The following presents a simplified summary of the specification to provide a basic understanding of some aspects of the specification. This summary is not an extensive overview of the specification. It is intended to neither identify key or critical elements of the specification nor delineate any scope particular to any embodiments of the specification, or any scope of the claims. Its sole purpose is to present some concepts of the specification in a simplified form as a prelude to the more detailed description that is presented later.
[0007] In various embodiments, the present disclosure provides a method of identifying a plurality of objects within a field of view, comprising transmitting a light signal from a time- of-flight monitoring unit to illuminate a region including the field of view and receiving a reflected signal of the light signal at a time-of-flight camera of the time-of-flight monitoring unit, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal. The method also comprises identifying the plurality of objects based on the depth information, determining an area of interest within the field-of-view, and generating one or more statistics for one or more of the identified objects within the area of interest.
[0008] In various embodiments, the present disclosure provides a time-of-flight monitoring unit comprising an illumination system configured to transmit a light signal to illuminate a region including a field of view and a time of flight camera configured to receive a reflected signal of the light signal, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal. The time-of-flight monitoring unit also comprises a processing module configured to identify the plurality of objects based on the depth information, determine an area of interest within the field-of-view, and generate one or more statistics for one or more of the identified objects within the area of interest.
[0009] In addition, various other modifications, alternative embodiments, advantages of the disclosed subject matter, and improvements over conventional monitoring units are described. These and other additional features of the disclosed subject matter are described in more detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The above and other features of the present disclosure, its nature and various advantages will be more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings in which:
[0011] FIG. 1 depicts an exemplary monitoring environment for an exemplary time-of- flight monitoring system in accordance with some embodiments of the present disclosure;
[0012] FIG. 2 depicts a block diagram of an exemplary time-of-flight monitoring unit in accordance with some embodiments of the present disclosure;
[0013] FIG. 3 depicts an exemplary time-of-flight monitoring unit in accordance with some embodiments of the present disclosure;
[0014] FIG. 4 depicts steps for the configuration of an exemplary time-of-flight monitoring unit in accordance with embodiments of the present disclosure;
[0015] FIG. 5 depicts an exemplary time-of-flight configuration interface in accordance with some embodiments of the present disclosure.
[0016] FIG. 6 depicts steps for tilt and height setup in accordance with some embodiments of the present disclosure;
[0017] FIG. 7 depicts a plurality of exemplary time-of-flight monitoring units in a monitoring system in accordance with some embodiments of the present disclosure;
[0018] FIG. 8 depicts steps for identifying objects based on depth information in accordance with some embodiments of the present disclosure;
[0019] FIGS. 9A-9F depict raw and processed depth map images in accordance with some embodiments of the present disclosure;
[0020] FIGS. 10A-10B depict a Gaussian distribution and correlation matrix in accordance with some embodiments of the present disclosure;
[0021] FIG. 11 depicts steps for analyzing video image signals in accordance with some embodiments of the present disclosure;
[0022] FIG. 12 depicts an exemplary monitoring environment for an exemplary time-of- flight monitoring system in accordance with embodiments of the present disclosure;
[0023] FIG. 13 depicts steps for determining identifying characteristics using machine learning in accordance with some embodiments of the present disclosure;
[0024] FIG. 14 depicts steps for determining privacy settings in accordance with some embodiments of the present disclosure;
[0025] FIG. 15 depicts steps for determining statistics in accordance with some embodiments of the present disclosure; and
[0026] FIG. 16 depicts steps for identifying a person of interest based on a beacon signal in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
[0027] Exemplary time-of-flight monitoring systems are described herein for the purposes of illustration and not limitation. For example, one skilled in the art can appreciate that the illustrative embodiments can have application with respect to other technologies.
[0028] FIG. 1 depicts an exemplary monitoring environment for an exemplary time-of- flight monitoring system in accordance with embodiments of the present disclosure. Although a variety of system configurations and applications are described herein, a person having ordinary skill in the art will understand that numerous additional system configurations and applications are contemplated by the present disclosure. An exemplary application may be a monitoring application for a retail location 1 as depicted in FIG. 1. Although a retail location is provided as an exemplary application, a person having ordinary skill in the art will understand that the systems and methods described herein may be implemented in numerous other monitoring applications, such as manufacturing facilities, shipping facilities, agricultural facilities, office workplaces, government buildings, security queues, private residences, and other applications. While the exemplary embodiment described with respect to FIG. 1 may focus on the monitoring of customers and employees, numerous other objects may be monitored such as crates, boxes, vehicles, livestock, etc.
[0029] Exemplary retail location 1 may include a plurality of customers 10, 11, and 12, a plurality of employees 20 and 21, a checkout area 30, a display area 40, and a time-of-flight monitoring unit 50. Time-of-flight monitoring unit 50 may have optical characteristics and be located in a manner such that time-of-flight data can be observed for a wide field of view, such as the portion of the retail location depicted in FIG. 1. A light source (not depicted) illuminates the field of view with a light signal such as non-visible infra-red (IR) light. Time-of-flight monitoring unit 50 may include a time-of-flight camera (not depicted) for observing IR light that is reflected from the field of view. A processor of a processing module (not depicted) of time-of-flight monitoring unit 50 may process the reflected light to create depth information, which in turn may be used to identify objects of interest such as employees or customers within the field of view. In some embodiments, the processor may perform additional analysis of the identified objects to provide relevant data or statistics for the particular application, for example, by tracking the objects of interest within the field of view. This can be used for numerous applications such as people counting, zone analysis, queue analysis, or checkout analysis. Time-of-flight monitoring unit 50 may also be in communication with other devices such as other time-of-flight monitoring units, mobile devices, tablet computers, servers, personal computers, etc.
[0030] FIG. 2 depicts a block diagram of an exemplary time-of-flight monitoring unit
50 in accordance with some embodiments of the present disclosure. In some embodiments, time-of-flight monitoring unit 50 may include a processing module 52, time-of-flight camera 54, illumination system 56, secondary sensor 58, power module 60, and communication module 62.
[0031] Processing module 52 may include a processor having processing capability necessary to perform the processing functions described herein, including but not limited to hardware logic, computer readable instructions running on a processor, or any combination thereof. In an exemplary embodiment, processing module 52 may be a system on a module (SOM) that is able to perform a variety of processing, communication, and memory operations that are useful for time-of-flight processing applications. Processing module 52 may run software to perform the operations described herein, including software accessed in machine readable form on a tangible non-transitory computer readable storage medium, as well as software that describes the configuration of hardware such as hardware description language (HDL) software used for designing chips. Examples of tangible (or non-transitory) storage medium include disks, thumb drives, and memory, etc., but does not include propagated signals. Tangible computer readable storage medium include volatile and non- volatile, removable and non-removable media, such as computer readable instructions, data structures, program modules or other data. Examples of such media include RAM, ROM, EPROM, EEPROM, flash memory, disks or optical storage, magnetic storage, or any other non-transitory medium that stores information that is accessed by a processor or computing device. In exemplary embodiments, one or more computer readable storage media may be integrated with processing module 52, may be one or more other components of time-of-flight monitoring unit 50, may be located on another device, may be located at a remote location, or any combination thereof.
[0032] Illumination system 56 may provide a light source that emits a light signal for illuminating a field of view, and in some embodiments, a wide field of view. This emitted light is then reflected by objects within the field of view and captured by the time-of-flight camera 54. Although it will be understood that any suitable light source may be used to illuminate the field of view, in an exemplary embodiment IR light emitting diodes (LEDs) may be used as the light source. IR LEDs may be well suited for illuminating a field of view because the IR light is generally not perceptible to people within the field of view, and one or more IR LEDs are able to be precisely controlled to provide light in a manner that facilitates integration with time- of-flight camera 54.
[0033] Although any suitable number and configuration of IR LEDs may be integrated within an illumination system 56 of time-of-flight monitoring unit 50, in an exemplary embodiment ten IR LEDs may be arranged in a plurality of illumination arrays having a physical location about the time-of-flight monitoring unit 50 that facilitates illumination of a wide field of view. Although the IR LEDs may be driven in any suitable manner, in an embodiment the processing module 52 may provide one or more modulation signals, each of which is associated with one or more IR LEDs. Modulation signals may be analog, digital, or any combination thereof. Although in some embodiments the modulation signals may directly drive the IR LEDs, in other embodiments the drive signals may be provided to drive circuitry that drives and conditions the signal that in turn drives the IR LEDs. In an embodiment, the modulation signal may be controlled by time-of-flight sensor and synchronized with its internal logic.
[0034] Operating parameters of the IR LEDs may be controlled by the modulation signals. In some embodiments it may not be necessary to operate all of the IR LEDs, for example, if a particular field of view may be illuminated with less than all of the IR LEDs. Using only some of the LEDs may result in less power usage, heat production, etc. In some embodiments, the selective illumination of the IR LEDs may be controlled such that the one or more IR LEDs that are not operating changes over time. [0035] During operation, the IR LEDs that are providing illumination may have an illumination cycle that has an illumination period, based on a modulation frequency and duty cycle. During the illumination cycle the IR LEDs may be operated for an illumination time of the illumination period, such that the IR LEDs are illuminated for only a portion of the illumination period. In an exemplary embodiment, the duty cycle may range from 20-40% and the modulation frequency may range from 19-25 MHz.
[0036] The modulation signal may also provide a modulation waveform for the drive of the IR LEDs. During the illumination time, the IR LEDs may be cycled at the modulation frequency. In some embodiments it may be desirable to provide a modulation frequency for the IR LEDs to assist in identifying the reflected light that is captured by the time-of-flight camera 54, for example, using filters that pass the modulation frequency to processing circuitry. The modulation frequency may be variable or selectable. Having a variety of modulation frequencies may allow for a plurality of time-of-flight monitoring units 50 to be placed in close proximity or in overlapping locations without resulting in interference between the reflected light at each of the time-of-flight monitoring units. In some embodiments, a single time-of- flight monitoring unit 50 may transmit at multiple frequencies. A modulation waveform may be any suitable waveform such as a square wave, sine wave, sawtooth wave, etc. In addition, the modulation waveform may have a duty cycle that may be varied based on desired characteristics of the transmitted and reflected light.
[0037] By modifying the illumination time, the periodic rate of the illumination cycle, the duty cycle, the modulation frequency, and/or the modulation waveform, the illumination system 56 may be adjusted for particular environments or applications. For example, in some embodiments it may be desirable to minimize the illumination time or the duty cycle of the modulation waveform in a manner that minimizes power consumption and heat generation, for example if the time-of-flight monitoring unit 50 has a low power connection or is powered by a battery. In some embodiments, these operating parameters could automatically be modified based on operating conditions, such as characteristics of the power supply, ambient
temperature, temperature of the monitoring unit, current draw of one or more IR LEDs, voltage of one or more IR LEDs, temperature of one or more IR LEDs, any other suitable condition, or any combination thereof. For example, a decrease of the duty cycle may result in power saving, and the saved power can be used for increasing currents and thus the optic output power.
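As a worked example of the timing relationships described above, the sketch below computes the modulation period and the LED on-time per cycle for a modulation frequency and duty cycle chosen from within the exemplary ranges; the specific values are illustrative, not parameters specified by this disclosure.

    # Worked example of modulation timing: at 20 MHz and a 30% duty cycle
    # (within the exemplary 19-25 MHz and 20-40% ranges), each modulation
    # cycle lasts 50 ns and the LEDs are driven for 15 ns of it.
    MODULATION_FREQUENCY_HZ = 20e6  # illustrative value within the exemplary range
    DUTY_CYCLE = 0.30               # illustrative value within the exemplary range

    modulation_period_s = 1.0 / MODULATION_FREQUENCY_HZ  # 50 ns per cycle
    led_on_time_s = DUTY_CYCLE * modulation_period_s     # 15 ns driven per cycle

    print(f"modulation period: {modulation_period_s * 1e9:.1f} ns")
    print(f"LED on-time per cycle: {led_on_time_s * 1e9:.1f} ns")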
[0038] Time-of-flight camera 54 may be configured to receive reflected light originating from illumination system 56 and to generate depth information based on the reflected light. In an exemplary embodiment the time-of-flight camera may include one or more lenses, a time-of-flight sensor, and sensor circuitry. Although any suitable lenses may be provided for the time-of-flight camera 54, in an exemplary embodiment the one or more lenses may be configured to receive reflected light from the field of view, such as a wide field of view. For example, a wide angle lens with a 1.2mm focal length may provide a 110 degree horizontal field of view.
[0039] An exemplary time-of-flight sensor may have pixels that receive the reflected light from the field of view via the one or more lenses. Although it will be understood that any suitable time-of-flight sensor may be used in accordance with the present embodiments, in an exemplary embodiment the time-of-flight sensor may include a CMOS pixel array as the time- of-flight sensor. Sensor circuitry may control shuttering of the time-of-flight camera 54 (e.g. , based on one or more control signals provided by processing module 52), such that the time-of- flight sensor only receives light during a time of interest corresponding to the expected return time of the reflected light. Shuttering may be based on the parameters of the illumination system such as the illumination time, periodic rate, height, tilt angle, any other suitable parameters, or any combination thereof.
[0040] It will be understood that depth information may be determined from the reflected light received by the pixels of the time-of-flight sensor in any suitable manner. In an exemplary embodiment the time-of-flight camera 54 may determine a phase shift between the illumination and reflection to determine distance. The distances associated with the pixels of the time-of-flight sensor may be compiled into a depth map that may be used for further processing.
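As a brief illustration of the phase-shift approach described above, the sketch below applies the standard continuous-wave time-of-flight relationship d = c·Δφ/(4π·f_mod) and the corresponding unambiguous range c/(2·f_mod). The 20 MHz value is an illustrative choice within the exemplary modulation range, not a parameter specified here.

    # Minimal sketch of converting a measured phase shift into a distance.
    import math

    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def distance_from_phase(phase_shift_rad: float, modulation_freq_hz: float) -> float:
        """Distance in metres for a given phase shift (radians) and modulation frequency."""
        return SPEED_OF_LIGHT * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)

    def unambiguous_range(modulation_freq_hz: float) -> float:
        """Maximum distance measurable without phase wrap-around."""
        return SPEED_OF_LIGHT / (2.0 * modulation_freq_hz)

    # example: a phase shift of pi/2 at 20 MHz corresponds to about 1.87 m,
    # and the unambiguous range at 20 MHz is about 7.5 m
    print(distance_from_phase(math.pi / 2, 20e6))
    print(unambiguous_range(20e6))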
[0041] In some embodiments, time-of-flight monitoring unit 50 may also include a secondary sensor 58. Although a single secondary sensor 58 is depicted for purposes of illustration, it will also be understood that any number of secondary sensors 58, including no secondary sensor 58, may be provided in accordance with the present disclosure. Any suitable sensor may be used as a secondary sensor 58, including video sensors (e.g., streaming video, a series of still camera images, etc.), infra-red sensors, image sensors, proximity sensors, stereoscopic sensors, thermal sensors, another time-of-flight camera, any other suitable sensor, or any combination thereof. In an embodiment, the secondary sensor 58 may be a video sensor that streams video within the field of view, such as an RGB camera. As will be described herein, a signal such as a video image signal from one or more secondary sensors 58 may be processed based on the depth information from time-of-flight camera 54. A signal from a secondary sensor 58 may also be processed independent of the depth information. [0042] Time-of-flight camera 54, illumination system 56, and secondary sensor 58 may be in communication with processing module 52 in any suitable manner, including wired connections, wireless communications, any other suitable communication method, or any combination thereof. In an exemplary embodiment time-of-flight camera 54, illumination system 56, and secondary sensor 58 may be in communication with processing module 52 via a wired internal bus of time-of-flight monitoring unit 50, such as USB, I C, and SCSI. Although it will be understood that processing module may communicate any suitable information with time-of-flight camera 54, illumination system 56, and secondary sensor 58, in some
embodiments processing module 52 may communicate with time-of-flight camera 54 to monitor and control shuttering and other sensor operations and to receive depth information such as a depth map, may communicate with illumination system 56 to provide modulation signals and to monitor operating conditions of the illumination system 56, and may
communicate with secondary sensor 58 to control the operation of secondary sensor 58 and receive sensor signals such as a video image signal.
[0043] Processing module 52 may perform processing based on the depth information received from time-of-flight camera 54, such as a depth map. As will be described in more detail herein, processing may include identifying a plurality of objects such as people who are located within the field of view. In some embodiments, in addition to identifying objects, processing module 52 may perform analysis of the data and determine statistics, for example, to perform people counting, queue analysis, zone analysis, or checkout analysis. Because these analyses are performed in the first instance based on time-of-flight data, as opposed to other data sources such as the video image signal, it may be possible to perform complex analyses faster and/or with less processing resources than conventional systems. This may allow for processing module 52 to operate with lower power requirements and less processing power, significantly reducing the expense associated with monitoring systems and allowing complex analyses to be performed on the time-of-flight monitoring unit 50, rather than on a remote computer or server.
[0044] Processing module 52 may also perform processing based on data received from a secondary sensor 58 such as a video sensor. Although any suitable processing may be performed, in an exemplary embodiment the video received from the secondary sensor may be used for identification of individuals, capture of biometric information, identification of adults vs. children, customer vs. associates, any other suitable relevant information, or any
combination thereof. In some embodiments, the processing of information from secondary sensor 58 may be aided based on information determined by time-of-flight camera 54, as described in more detail herein. For example, processing module 52 may identify the locations of one or more customers within the field of view based on the depth information provided by time-of-flight camera 54. Those customer locations may then be used for processing module 52 to more efficiently determine information from a signal such as a video image signal provided by a secondary sensor 58, by identifying particular portions of the video image signal for additional analysis. This may significantly reduce the processing requirements for analysis of information from the secondary sensor, allowing processing to be performed by a processing module 52 having lower power requirements and less processing power, significantly reducing the expense associated with monitoring systems and allowing complex analyses of video image signals to be performed on the time-of-flight monitoring unit 50, rather than on a remote computer or server.
[0045] Time-of-flight monitoring unit 50 may also include a power module 60. Power module 60 may include any suitable module, chip, device, components, or combination thereof that provides power to the other portions of time-of-flight monitoring unit 50, and in some embodiments, may include a plurality of power modules providing power to different parts of time-of-flight monitoring unit 50. Power module 60 may receive power from any suitable power source, including external power sources such as line-powered power sources or internal power sources such as a battery. In an exemplary embodiment, the components of time-of- flight monitoring unit 50 may be selected and configured to operate with a Power over Ethernet (PoE) power source, permitting both power and data to be accessed via a single Ethernet connection. In such embodiments, power module 60 may convert and regulate the power from the PoE power source such that it is suitable for providing power to the other components of time-of-flight monitoring unit 50.
[0046] Time-of-flight monitoring unit 50 may also include a communication module 62.
Communication module 62 may include any suitable module, chip, device, components, or combination thereof that allows processing module 52 or any other components of time-of- flight monitoring unit 50 to communicate with external components, devices, processors, systems, servers, computers, or any other external systems. Although communication module 62 may provide for communications in any suitable manner (e.g. , electrical, optical, etc.), in an exemplary embodiment communication module may facilitate digital electronic
communications, including wired (e.g., Ethernet, RS-232, FireWire, USB, Thunderbolt, etc.) and/or wireless (e.g., cellular, WiFi, Bluetooth, etc.). In some embodiments, a plurality of communication modules 62 may provide for communication via a plurality of different methods, such as Ethernet, USB, WiFi, and cellular communications.
[0047] FIG. 3 depicts an exemplary time-of-flight monitoring unit 50 in accordance with some embodiments of the present disclosure. As described herein, the components of time-of-flight monitoring unit 50 may be packaged into a compact physical unit that may be structured to be located in various physical locations to capture a desired field of view at a location. In some embodiments, the time-of-flight monitoring unit 50 may be structured in a manner that allows for the unit to be recessed within a ceiling or wall, to have a low profile for discreet attachment on a surface of a ceiling or wall, to be self-supporting on a surface or a supporting stand, to be attached to structural support such as a rafter via an integral connecting arm, or in any other suitable manner. In the exemplary embodiment depicted in FIG. 3, time-of-flight monitoring unit 50 may be adapted to be located in a recessed portion of a ceiling or wall.
[0048] Time-of-flight monitoring unit 50 may include a housing 64 that encloses the electrical components, IR LEDs, and sensors of the time-of-flight monitoring unit 50. Although housing 64 may be constructed of any suitable material or combination of materials, in an exemplary embodiment housing 64 may be constructed of a plastic, aluminum, or other metals. In the embodiment of FIG. 3, housing 64 includes a face 66 and an enclosure 68, which are connected to each other with screws.
[0049] Face 66 may provide a physical interface to align the time-of-flight camera 54,
IR LEDs 56, and secondary sensor 58 with respect to the field of view. For example, in the embodiment of FIG. 1, a time-of-flight monitoring unit 50 may be oriented such that a face 66 of housing 64 is facing downward toward the field of view. The physical structure of face 66 may require that the spacing and the location of the time-of-flight camera 54 and IR LEDs 56 are optimized such that a uniform illumination is provided over the field of view and such that the time-of-flight camera 54 is able to receive reflected light from the field of view. In an exemplary embodiment IR LEDs may be arranged in two opposite semi-circular arrays, each including half (e.g. , five) of the IR LEDs. Time-of-flight camera 54 and secondary sensor 58 may be located between the IR LEDs.
[0050] Enclosure 68 of housing 64 may provide a physical interface with an installation location such as a hole in a ceiling. Enclosure 68 also may include heat sinks that are located and spaced to provide for the dissipation of heat produced by electrical components of time-of- flight monitoring unit 50, such as IR LEDs 56. In some embodiments enclosure 68 may include interfaces for electrical and communication connections, such as Ethernet and USB
connections.
[0051] FIG. 4 depicts steps 100 for the configuration of time-of-flight monitoring unit
50 in accordance with embodiments of the present disclosure. While, for purposes of simplicity of explanation, the steps of methods described in this disclosure are shown and described as a series of blocks, it is to be understood and appreciated that such illustrations or corresponding descriptions are not limited by the order of the blocks, as some blocks may occur in different orders and/or concurrently with other blocks from what is depicted and described herein. Any non-sequential, or branched, flow illustrated via a flowchart should be understood to indicate that various other branches, flow paths, and orders of the blocks, can be implemented which achieve the same or a similar result. Moreover, not all illustrated blocks may be required to implement the methods described hereinafter.
[0052] Referring again to FIG. 4, at step 102 the time-of-flight monitoring unit may be provided with unique identification information. It will be understood that the unique identifier may be provided in any suitable manner, e.g. , at the manufacturer, assigned by a centralized system upon installation, by a user through a user interface, in any other suitable manner, or any combination thereof. The unique identifier may facilitate communication between remote devices and a particular time-of-flight monitoring unit 50.
[0053] At step 104, any secondary sensor 58 may be configured either manually or automatically. In an exemplary embodiment where secondary sensor 58 is a video sensor, configuration parameters may include video codecs, video quality, resolution, bitrate, framerate, real time streaming, http live streaming, any other suitable parameters, or any combination thereof.
[0054] At step 106, the time-of-flight camera 54 and IR LEDs 56 operation may be configured either manually or automatically. Configuration parameters that may be set include the IR channel, the camera tilt angle, the distance of the camera to a target surface (e.g. , the floor for a ceiling mounted unit), a minimum detection height of objects (e.g. , if monitoring customers, a height associated with a child or a crouching person), relative location settings (e.g. , relative to other units and/or known centralized location), a boundary line, a boundary area, a checkout area, queue area, service zone, any other suitable configuration parameters, or any combination thereof. In some embodiments, configuration may be performed through a configuration interface as depicted in FIG. 5.
[0055] FIG. 5 depicts an exemplary time-of-flight configuration interface 200 in accordance with some embodiments of the present disclosure. Configuration interface 200 may be provided on any suitable device in communication with time-of-flight monitoring unit 50, or at a time-of-flight monitoring unit 50 if the unit has a display. Although configuration interface 200 may include any suitable configuration parameters, in an exemplary embodiment configuration interface 200 includes IR channel field 202, camera tilt field 204, floor height field 206, minimal height field 208, region field 210, field of view 212, and anchor points 214. Other configuration parameters such as those discussed above may be provided but are not depicted in FIG. 5. Although not depicted, it will be understood that such configuration settings may be provided in any suitable manner.
[0056] Although configuration interface is depicted and described as configuring certain parameters manually and certain parameters automatically, it will be understood that any of the parameters may be set either manually or automatically. For example, while in the exemplary embodiment of FIG. 5 a user may set the IR channel 202 manually, in some embodiments the IR channel 202 may be set automatically. For example, if a plurality of time-of-flight monitoring units 50 are installed within a single retail establishment, a user may provide basic location information to a device in communication with the units. From this basic information, parameters such as the modulation frequency of the IR LEDs 56 of each of the time-of-flight monitoring units may be set automatically in a manner that avoids the overlap of reflected light having the same or similar modulation frequencies.
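The following is a minimal sketch of automatically assigning IR channels so that nearby units do not share the same or similar modulation frequencies, as described above. The channel list, interference radius, and greedy assignment strategy are illustrative assumptions; the disclosure only states that overlap should be avoided.

    # Minimal sketch of assigning IR channels (modulation frequencies) to units.
    # CHANNELS_MHZ and INTERFERENCE_RADIUS_M are hypothetical values.
    import math
    from typing import Dict, Tuple

    CHANNELS_MHZ = [19.0, 21.0, 23.0, 25.0]  # hypothetical selectable channels
    INTERFERENCE_RADIUS_M = 15.0             # hypothetical "nearby" threshold

    def assign_channels(units: Dict[str, Tuple[float, float]]) -> Dict[str, float]:
        """Greedy assignment: give each unit a channel not used by any nearby unit."""
        assignment: Dict[str, float] = {}
        for unit_id, (x, y) in units.items():
            used_nearby = {
                assignment[other]
                for other, (ox, oy) in units.items()
                if other in assignment and math.hypot(x - ox, y - oy) < INTERFERENCE_RADIUS_M
            }
            free = [ch for ch in CHANNELS_MHZ if ch not in used_nearby]
            # fall back to the first channel if all channels are taken nearby
            assignment[unit_id] = free[0] if free else CHANNELS_MHZ[0]
        return assignment

    # usage: assign_channels({"unit_a": (0.0, 0.0), "unit_b": (5.0, 3.0), "unit_c": (40.0, 0.0)})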
[0057] Although a variety of sizes of the field of view may be observed by time-of-flight monitoring unit 50 based on the height, location, lenses, IR LEDs, and other parameters of the unit, in an exemplary embodiment a field of view may be in the range of 20 ft. x 14 ft., at an exemplary placement height of 13 ft., and at 24 ft x 18 ft. for a placement height of 14 ft. Controls for camera tilt field 204 and floor height field 206 may orient time-of-flight camera 54 and IR LEDs 56 to the field of view in order to accurately measure depth information within the field of view. In some embodiments, these parameters may be set manually, e.g., based on known or measured values for the camera tilt and floor height. In some embodiments, this configuration may be performed with a partially or fully automated process, such as by the steps depicted in FIG. 6.
[0058] FIG. 6 depicts steps 300 for tilt and height setup in accordance with some embodiments of the present disclosure. In an embodiment, anchor points 214 may be virtual points that may be positioned within the depicted field of view 212 of the time-of-flight camera 54. In step 302, a plurality of virtual anchor points 214 (e.g., three anchor points) may be positioned within the depicted field of view 212 according to one or more configuration rules. Configuration rules may include a distance from the optical center of the camera, a distance from other anchor points 214, a requirement to be located on a common plane (e.g., the floor), avoidance of particular types of surfaces (e.g., non-reflective surfaces such as a dark carpet), any other suitable rules, or any combination thereof. In an exemplary embodiment, the
configuration rules may require that each anchor point 214 be positioned at least 10 cm from all other anchor points 214 and from the optical center of the camera, that all anchor points 214 be positioned over a common surface of a floor, and that no anchor points be positioned over a dark colored carpet surface.
[0059] In some embodiments, placement of the anchor points 214 may be done manually, for example, by using the device interface (e.g., using a mouse or touchscreen) to drag the anchor points 214 to locations that comply with the configuration rules. In other embodiments, this positioning may be done automatically, e.g., by clicking the "Auto" radio button within the camera tilt field 204 on a device running the time-of-flight configuration interface 200. To position anchoring points 214 automatically, the device and/or the processing module 52 may scan the field of view 212, analyze depth information such as depth maps, and perform any other suitable operations to position the anchoring points within the field of view 212.
[0060] Once the anchoring points 214 are placed, the device and/or the processing module 52 may calculate the camera height and tilt at step 304. Although height and tilt may be calculated in any suitable manner, in an embodiment the height may be calculated as a mean value of distances to each point and the tilt may be calculated based on deviation of these distances.
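A minimal sketch of the calculation in step 304 is shown below, assuming the anchor points 214 have been converted to 3-D coordinates in the camera frame from the depth map. The plane-fitting approach shown is only one possibility; the simpler mean-and-deviation computation described above would serve equally well, and all coordinate values are hypothetical.

```python
import numpy as np

def height_and_tilt(anchor_points_xyz):
    """Estimate camera height and tilt from anchor points on the floor.

    anchor_points_xyz: (N, 3) array of anchor point positions in the
    camera coordinate frame (metres), obtained from the depth map.
    Returns (height_m, tilt_deg).
    """
    pts = np.asarray(anchor_points_xyz, dtype=float)
    # Fit the floor plane through the anchor points (least-squares normal).
    centroid = pts.mean(axis=0)
    _, _, vh = np.linalg.svd(pts - centroid)
    normal = vh[-1]                      # smallest singular vector = plane normal
    normal /= np.linalg.norm(normal)
    # Height: perpendicular distance from the camera origin to the floor plane.
    height = abs(np.dot(centroid, normal))
    # Tilt: angle between the optical axis (0, 0, 1) and the floor normal.
    cos_tilt = abs(normal[2])
    tilt = np.degrees(np.arccos(np.clip(cos_tilt, -1.0, 1.0)))
    return height, tilt

if __name__ == "__main__":
    # Three hypothetical anchor points roughly 4 m below a slightly tilted camera.
    anchors = [(0.5, 0.2, 4.02), (-0.4, 0.3, 3.98), (0.1, -0.5, 4.05)]
    print(height_and_tilt(anchors))
```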
[0061] At step 306, the configuration interface 200 may be updated with the calculated camera tilt field 204 and floor height field 206 values. In some embodiments, a user may wish to modify these automatically updated settings and may do so at step 308, for example, by modifying the values for camera tilt field 204 and floor height field 206 through the manual interface for each of those fields. Once the camera settings have been finalized, at step 310 the camera tilt field 204 and floor height field 206 values may be stored in memory of processing module 52 for use in correctly interpreting received depth information.
[0062] Referring again to FIG. 5, configuration interface 200 may also include minimal height field 208. The minimal height field 208 may provide information that allows processing module 52 to identify an appropriate window of interest for the received depth information. For example, for monitoring people it may be desirable to set a minimal height that captures adults, but ignores shorter objects (e.g., small children, pets, etc.) that are not of interest or might otherwise occupy unnecessary processing resources. Although in an embodiment the minimal height may be set, it will be understood that other criteria could be used to set a window of interest for depth information, such as a height range (e.g., minimum and maximum), an object type (e.g., child, adult, types of animals, physical objects, vehicles, etc.), any other suitable criteria, or any combination thereof.
[0063] Configuration interface 200 may also include a region field 210. The region field 210 defines the area used for detection of objects. In some embodiments, in addition to the region field 210 (but not depicted), an interface may be provided to select one or more areas of interest for processing, e.g., for generating statistics. Although it will be understood that any suitable area of interest may be provided in any suitable manner, in some embodiments the area of interest may be a boundary line, a boundary region, a checkout area, any other suitable area of interest, or any combination thereof.
[0064] In an embodiment, a boundary line may be a line that is placed on a display of the field of view, for example, using a user input such as a keyboard, mouse, or touch screen. Although a boundary line may be used for any suitable application, in an embodiment the boundary line may be relevant to a people counting application. Any object that is identified as a person may be counted as it crosses the line.
[0065] In another embodiment, a boundary region may be a closed shape that is placed on a display of the field of view, for example, using a user input such as a keyboard, mouse, or touch screen. A boundary region may be selected in any suitable manner, such as drawing the boundary region on the field of view, selecting a point within the field of view and selecting the size of the boundary region, any other suitable selection method, or any combination thereof. Although a boundary region may be used for any suitable application, in an embodiment the boundary region may be relevant to a queue analysis or zone analysis application. Statistics may be generated for any object that is identified as a person within the boundary region, as described herein.
[0066] In another embodiment, a checkout area may be an area that is associated with a checkout location, such as a conventional checkout area with an employee, a self-checkout area, or a hybrid checkout area (e.g., a checkout area in which one or more employees monitor and assist with self-checkout). The checkout area may be identified in any suitable manner, for example, using a user input such as a keyboard, mouse or touch screen, or in some embodiments, may be identified based on depth information. A checkout area may be selected in any suitable manner, such as drawing the checkout area on the field of view, selecting a point within the field of view that includes a checkout device and selecting the size of the checkout area, automatic selection based on depth information, any other suitable selection method, or any combination thereof. Although a checkout area may be used for any suitable application, in an embodiment the checkout area may be relevant to a checkout analysis application. Statistics may be generated for any object that is identified as a person within the checkout area, as described herein.
[0067] FIG. 7 depicts a plurality of time-of-flight monitoring units 50, labeled as time-of-flight monitoring units 402a-402i, in a time-of-flight monitoring system in accordance with some embodiments of the present disclosure. FIG. 7 depicts a location 410 (e.g., a retail store) that is monitored by a plurality of time-of-flight monitoring units 402a-402i. A field of view 404a-404i may be associated with each of the time-of-flight monitoring units 402a-402i.
Although a particular quantity and layout of time-of-flight monitoring units is depicted in FIG. 7, it will be understood that any suitable quantity and layout of time-of-flight monitoring units may be adapted to a particular space or location, including a plurality of time-of-flight monitoring units having different optical characteristics, fields of view, etc., in order to efficiently monitor spaces having irregular shapes or environmental conditions.
[0068] In the embodiment of FIG. 7, time-of-flight monitoring units 402a-402i are located such that the edge of each associated field of view 404a-404i is continuous with an edge of another field of view 404a-404i. Although the respective field of view edges are depicted as being aligned in FIG. 7, it will be understood that each edge may overlap the adjacent edges to allow for a degree of redundancy in the handoff of object monitoring between time-of-flight monitoring units 402a-402i. It will be understood that the field of view may have a different shape (e.g., rectangular, circular, semi-circular, etc.) based on the configuration of the IR LEDs 56 and the time-of-flight camera 54. Objects that move through and between any of the fields of view 404a-404i may be continuously monitored and tracked by the time-of-flight monitoring units 402a-402i, local computing devices 406, remote servers 408, or any combination thereof.
[0069] In some embodiments, each of time-of-flight monitoring units 402a-402i may be aware of and in communication with adjacent monitoring units, such that one of time-of-flight monitoring units 402a-402i may provide information relating to objects to adjacent units. Such information may include whether an object is moving into an adjacent monitoring unit's field of view, a unique identifier associated with the object, depth information for the object, speed of travel, associated demographic data, history data relating to movement through fields of view associated with other monitoring units, statistics for the object, any other suitable information, or any combination thereof. Time-of-flight monitoring units 402a-402i may communicate this information through any suitable communication medium (wired or wireless) as described herein, although in an exemplary embodiment the time-of-flight monitoring units 402a-402i may all be on a shared Ethernet network.
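The following sketch illustrates, under stated assumptions, how a handoff between adjacent units might be represented in software: a small payload describing the tracked object and a check on whether the object is about to cross the shared field-of-view edge. The field names, coordinate conventions, and prediction horizon are hypothetical and not prescribed by the disclosure.

```python
import json
import time

def build_handoff_message(obj_id, position_xy, velocity_xy, stats=None):
    """Assemble an illustrative handoff payload for an adjacent unit.

    Field names are hypothetical; the disclosure only requires that object
    identity, depth/track data, and related statistics can be shared.
    """
    return {
        "object_id": obj_id,              # unique identifier for the tracked object
        "timestamp": time.time(),
        "position_m": list(position_xy),  # location within this unit's field of view
        "velocity_mps": list(velocity_xy),
        "statistics": stats or {},
    }

def exits_toward_neighbour(position_xy, velocity_xy, fov_edge_x, horizon_s=1.0):
    """Predict whether the object will cross the shared field-of-view edge
    (modelled here as a vertical line x = fov_edge_x) within horizon_s seconds."""
    x, _ = position_xy
    vx, _ = velocity_xy
    return vx != 0 and 0 < (fov_edge_x - x) / vx <= horizon_s

if __name__ == "__main__":
    pos, vel = (4.5, 2.0), (0.8, 0.0)
    if exits_toward_neighbour(pos, vel, fov_edge_x=5.0):
        print(json.dumps(build_handoff_message("person-17", pos, vel,
                                               {"dwell_s": 12.4}), indent=2))
```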
[0070] Time-of-flight monitoring units 402a-402i may also be in communication with local computing devices 406, remote servers 408, any other suitable devices, or any combination thereof. In some embodiments, some or all of the processing relating to adjacent monitoring units may be performed by local computing device 406, remote servers 408, or any combination thereof. Moreover, any of time-of-flight monitoring units 402a-402i, local computing devices 406, remote servers 408, or any other suitable device, may be in
communication with monitoring units or computing devices associated with one or more additional locations 412a-412c. In this manner, data from numerous time-of-flight monitoring units at numerous locations may be stored and analyzed at one or more central processing locations, and information from the one or more central processing locations may be accessed by the time-of-flight monitoring units and/or local computing devices.
[0071] FIG. 8 depicts steps 500 for identifying objects based on depth information in accordance with some embodiments of the present disclosure. It will be understood that steps 500 may be performed at any suitable computing device, such as on processing module 52 of time-of-flight monitoring unit 50 (e.g., any of time-of-flight monitoring units 402a-402i), local computing devices 406, remote servers 408, or distributed among any combination thereof. However, in an exemplary embodiment described herein, the processing described in steps 500 may be efficient enough to be run in real-time on a single processing module 52 of a time-of-flight monitoring unit 50. Although the steps 500 may be performed to identify any suitable objects as described herein, in an exemplary embodiment the object to be identified may be a person.
[0072] At step 502, processing module 52 may receive a frame of depth information such as a depth map from time-of-flight camera 54. The depth map may represent depth information obtained for a field of view in accordance with the embodiments described herein. An example of a raw depth map in accordance with embodiments of the present disclosure is depicted at FIG. 9A.
[0073] In the embodiment described herein, the steps 500 may be performed for a single frame of a depth map, i.e. , without secondary information to assist in identifying people, such as tracking of previous positions of people under observation, identifying information to match particular individuals, information from other sensors, any knowledge of prior frames, or any other secondary information. However, it will be understood that such secondary information may also be used to assist in the identification of people according to steps 500. Steps 504 - 510 may modify the depth information according to one or more exemplary object
identification rules. It will be understood that additional object identification rules may be used in accordance with the present disclosure, that object identification rules may be substituted or omitted, or that the order of object identification rules may be changed.
[0074] At step 504, processing module 52 may correct the raw depth map image based on one or more optimization factors. Optimization factors may be based on known error sources, such as lens tilt, image distortion, any other suitable factors, or any combination thereof. These optimization factors may be determined during manufacturing, during setup of the time-of-flight monitoring unit 50, based on an analysis of historical depth information data for a particular monitoring unit, at any other suitable time, or any combination thereof. In an exemplary embodiment, at step 504 the processing module 52 may correct the raw depth map based on the lens tilt value that is set or determined during setup, and a known distortion value associated with the time-of-flight camera 54. An example of an image that has been corrected in this manner is depicted in FIG. 9B.
[0075] At step 506, processing module 52 may modify values falling outside of a desired depth window. In some embodiments, a value may fall outside of a desired depth window if the depth is greater than a comparison value, such as the distance to the floor, a minimal depth (e.g., as set during configuration of the time-of-flight monitoring unit 50), a measured value based on the current frame or historical frames (e.g., an average, median, depth percentage, or similar value), or any other suitable comparison value. Although depth values that fall outside of the desired depth window (e.g., by being greater than the comparison value) may be modified in any suitable manner, in some embodiments the depth values may be set to a constant value, such as the depth value associated with the distance to the floor. An example of an image that has depth values modified in this manner is depicted in FIG. 9C.
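A minimal sketch of the depth-window modification of step 506 follows, assuming a ceiling-mounted unit where the floor distance and a minimal object height define the window. The maximum-height bound and the specific numeric values are assumptions for illustration.

```python
import numpy as np

def clamp_to_depth_window(depth_map, floor_depth_m, min_height_m, max_height_m=2.2):
    """Suppress depth pixels outside the window of interest (step 506).

    Pixels deeper than (floor_depth - min_height) correspond to the floor or to
    objects shorter than the configured minimal height; pixels shallower than
    (floor_depth - max_height) are unrealistically tall returns. Both are reset
    to the floor depth. The max_height default is an assumption.
    """
    out = np.array(depth_map, dtype=float, copy=True)
    near_limit = floor_depth_m - max_height_m
    far_limit = floor_depth_m - min_height_m
    outside = (out > far_limit) | (out < near_limit)
    out[outside] = floor_depth_m
    return out

if __name__ == "__main__":
    floor = 4.0
    frame = np.array([[4.0, 3.9, 2.4],    # floor, low object, adult head
                      [2.5, 3.2, 0.3]])   # adult, child, spurious return
    print(clamp_to_depth_window(frame, floor, min_height_m=1.2))
```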
[0076] At step 508, processing module 52 may calculate the standard deviation of the image pixels and modify the image based on the standard deviation values. A high standard deviation value may be indicative of sudden changes in depth, for example, at a border region between an object and the floor. The standard deviation values may be compared to a threshold (e.g., a predetermined threshold, a threshold determined based on the configuration of the time-of-flight monitoring unit 50, a threshold based on current and/or historical image frames, etc.) and any pixels associated with a standard deviation that exceeds the threshold may be modified. Although pixels may be modified in any suitable manner, in some embodiments the depth values may be set to a constant value, such as the depth value associated with the distance to the floor. An example of an image that has depth values modified in this manner is depicted in FIG. 9D.
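The local standard deviation comparison of step 508 could be realized, for example, as in the following sketch, which estimates a per-pixel standard deviation over a small window and resets high-deviation pixels to the floor depth. The window size and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def suppress_high_deviation(depth_map, floor_depth_m, window=5, threshold_m=0.25):
    """Reset pixels whose local standard deviation exceeds a threshold (step 508).

    High local deviation typically marks abrupt depth changes such as the border
    between an object and the floor. Window size and threshold are illustrative;
    the disclosure allows predetermined or adaptive thresholds.
    """
    d = np.asarray(depth_map, dtype=float)
    mean = uniform_filter(d, size=window)
    mean_sq = uniform_filter(d * d, size=window)
    local_std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
    out = d.copy()
    out[local_std > threshold_m] = floor_depth_m
    return out

if __name__ == "__main__":
    frame = np.full((20, 20), 4.0)     # floor at 4 m
    frame[8:12, 8:12] = 2.3            # a head-and-shoulders blob
    cleaned = suppress_high_deviation(frame, floor_depth_m=4.0)
    print(cleaned[6:14, 6:14].round(1))
```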
[0077] At step 510, processing module 52 may modify the image based on morphology characteristics of the object to be identified (e.g. , a person). For example, it may be known that a depth map associated with a person and captured from a camera oriented on the ceiling is likely to have certain characteristic shapes and depth profiles, based for example on head, shoulder, and arm locations. The mathematical morphology characteristics may emphasize shapes of the objects and make them easier to identify and to split in case of multiple objects. Based on these characteristics the image may be modified in a manner that is likely to emphasize portions of the image that correspond to the object to be identified. In some embodiments, pixels may be removed or added in regions that conform to sets of rules associated with the morphology characteristics. An example of an image that has depth values modified in this manner is depicted in FIG. 9E.
[0078] At step 512, processing module 52 may compare the depth map to a reference pattern or may use classification algorithms for identifying learned patterns. Although it will be understood that the comparison may be performed in any suitable manner, and that any suitable reference pattern may be used, in an exemplary embodiment processing module 52 may calculate a correlation matrix between the depth map and a Gaussian distribution, or a comparable result may be obtained by applying classification algorithms to the depth map, including cascade classifiers such as Viola-Jones or other classifiers. An example of a Gaussian distribution is depicted in FIG. 10A. In the exemplary embodiment described in the steps 500 of FIG. 8, the image that is compared to the Gaussian distribution may be the image depicted in FIG. 9F, wherein the three objects that should be identified as people are identified with crosshairs. An example of a correlation matrix based on the comparison of the image of FIG. 9F to a Gaussian distribution is depicted in FIG. 10B.
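As one non-limiting illustration of step 512, the sketch below correlates a height map (floor depth minus the processed depth map) with a Gaussian template and returns the resulting correlation matrix. The template size, the normalization, and the brute-force implementation are assumptions chosen for clarity rather than speed.

```python
import numpy as np

def gaussian_template(size=15, sigma=3.0):
    """A 2-D Gaussian used as a simple head-and-shoulders reference pattern."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return g / g.sum()

def correlation_matrix(height_map, template):
    """Normalised cross-correlation of the height map with the template.

    height_map: 2-D array where larger values mean taller objects
    (e.g., floor depth minus the processed depth map).
    """
    h, w = template.shape
    pad_y, pad_x = h // 2, w // 2
    padded = np.pad(height_map, ((pad_y, pad_y), (pad_x, pad_x)), mode="edge")
    t = (template - template.mean()) / (template.std() + 1e-9)
    out = np.zeros_like(height_map, dtype=float)
    for i in range(height_map.shape[0]):
        for j in range(height_map.shape[1]):
            patch = padded[i:i + h, j:j + w]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[i, j] = float((p * t).mean())
    return out

if __name__ == "__main__":
    g = gaussian_template(15, 4.0)
    heights = np.zeros((40, 40))
    heights[10:25, 10:25] = 1.7 * g / g.max()     # one person-like bump
    corr = correlation_matrix(heights, gaussian_template())
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    print("peak correlation at", peak)
```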
[0079] At step 514, processing module 52 may determine identification values for the objects to be identified (e.g., people) based on the correlation matrix. Although identification values may be calculated in any suitable manner, in some embodiments the objects may be identified based on absolute values, correlations, and standard deviations associated with each of the maxima of the correlation matrix. Additional logic can be applied for more accurate object identification, such as the size or shape of the object, the history of detections over time, any other suitable logic, or any combination thereof.
[0080] At step 516, processing module 52 may identify candidate maxima based on one or more of the identification values. Although any or all of the identification values may be used to identify candidate maxima, in an exemplary embodiment the candidate maxima may be based on the correlation value associated with each of the maxima exceeding a threshold (e.g., a predetermined threshold, a threshold determined based on the configuration of the time-of-flight monitoring unit 50, a threshold based on current and/or historical image frames, etc.).
[0081] At step 518, processing module 52 may compare the identification values associated with each of the candidate maxima to one or more selection rules to identify the objects (e.g., people) from the depth information. Exemplary rules may include morphology rules (e.g., related to expected identification values for people), false detection rules (e.g., related to common errors such as raised hands adjacent to a person), logical filters (e.g., based on size or position of the object), any other suitable rules, or any combination thereof. Based on the result of the selection rules, the desired objects (e.g., the three people of FIG. 9F) may be identified at step 520.
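Steps 516-520 might be sketched as follows, with candidate maxima taken from the correlation matrix and pruned by simple selection rules. The thresholds, the minimum-separation rule, and the use of peak height as a morphology proxy are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def candidate_maxima(corr, corr_threshold=0.4, neighbourhood=9):
    """Local maxima of the correlation matrix exceeding a threshold (step 516)."""
    local_max = (corr == maximum_filter(corr, size=neighbourhood))
    ys, xs = np.nonzero(local_max & (corr > corr_threshold))
    return list(zip(ys.tolist(), xs.tolist()))

def select_people(candidates, height_map, min_height_m=1.2, min_separation_px=8):
    """Apply simple selection rules (step 518): a plausible height at the peak
    and a minimum separation between accepted detections."""
    accepted = []
    for y, x in sorted(candidates,
                       key=lambda p: height_map[p[0], p[1]], reverse=True):
        if height_map[y, x] < min_height_m:
            continue                                # morphology/size rule
        if any((y - ay) ** 2 + (x - ax) ** 2 < min_separation_px ** 2
               for ay, ax in accepted):
            continue                                # false-detection rule
        accepted.append((y, x))
    return accepted

if __name__ == "__main__":
    heights = np.zeros((30, 30))
    heights[10, 10] = 1.8
    heights[10, 12] = 1.75
    corr = heights / 2.0                            # stand-in correlation matrix
    print(select_people(candidate_maxima(corr), heights))
```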
[0082] At step 522, processing module 52 may determine unique data for each of the identified objects. Unique data may be associated with a particular object, and may provide information about the object's movements and activities within the field of view. Unique data may be determined based on the depth information from time-of-flight camera 54, data from secondary sensor 58, other data related to objects stored in a database, any other data source, or any combination thereof, as described herein. Exemplary unique data may include the location within the field of view, elapsed time within the field of view and at a particular location, depth information related to the object, movement within the field of view, calculated statistics related to the object, any other suitable information, or any combination thereof.
[0083] FIG. 11 depicts steps 600 for analyzing video image signals acquired by a secondary sensor 58 in accordance with some embodiments of the present disclosure. It will be understood that steps 600 may be performed at any suitable computing device, such as on processing module 52 of a time-of-flight monitoring unit 50 (e.g., any of time-of-flight monitoring units 402a-402i), local computing devices 406, remote servers 408, or distributed among any combination thereof. However, in an exemplary embodiment described herein, the processing described in steps 600 may be efficient enough to be run in real-time on a single processing module 52 of a time-of-flight monitoring unit 50. Although the steps 600 may be performed to analyze video image signals associated with any suitable objects as described herein, in an exemplary embodiment the objects to be analyzed may be people.
[0084] At step 602, processing module 52 may associate the video image signal with the depth information (e.g., depth map). In some embodiments, the video image signal received from a secondary sensor 58 such as an RGB video sensor may be scaled and oriented to correspond to the depth information (e.g., depth map) received from time-of-flight camera 54. The scaling and/or orientation may be based on information determined during manufacturing, during configuration, or based on real-time data (e.g., identifying prominent objects in both a video image signal and depth map and basing the scaling and/or orientation on the locations of the prominent objects). Given an accurate position and shape of an object obtained using the depth map, the video image signal can be effectively used for retrieving information about a person, such as demographic information, color information about a person's clothing (e.g., for identifying employees), any other suitable information, or any combination thereof. In some embodiments, the association of the video image signal and depth information may result in a combined data image having a custom format including both image pixel and depth information (e.g., for an RGB video image signal combined with depth map information, an "RGB-D" signal).
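A minimal sketch of the association in step 602 follows, assuming the calibration reduces to a scale and offset between the two sensors and using nearest-neighbour resampling for brevity; a deployed system could use any suitable registration method.

```python
import numpy as np

def align_video_to_depth(rgb_frame, depth_map, scale_xy=(1.0, 1.0), offset_xy=(0, 0)):
    """Resample an RGB frame onto the depth-map grid and stack them as RGB-D.

    scale_xy and offset_xy stand in for the calibration determined during
    manufacturing or configuration; nearest-neighbour resampling is used
    purely for brevity.
    """
    dh, dw = depth_map.shape
    ys = np.clip((np.arange(dh) * scale_xy[1] + offset_xy[1]).astype(int),
                 0, rgb_frame.shape[0] - 1)
    xs = np.clip((np.arange(dw) * scale_xy[0] + offset_xy[0]).astype(int),
                 0, rgb_frame.shape[1] - 1)
    resampled = rgb_frame[np.ix_(ys, xs)]                    # (dh, dw, 3)
    rgbd = np.dstack([resampled.astype(float), depth_map])   # (dh, dw, 4)
    return rgbd

if __name__ == "__main__":
    rgb = np.random.randint(0, 255, size=(480, 640, 3))
    depth = np.full((240, 320), 4.0)
    print(align_video_to_depth(rgb, depth, scale_xy=(2.0, 2.0)).shape)  # (240, 320, 4)
```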
[0085] At step 604, processing module 52 may identify the objects (e.g., people) in the video image signal based on the identified locations determined from the depth information (e.g. , based on the process described in steps 500).
[0086] At step 606, processing module 52 may identify regions of interest associated with each of the locations for each of the objects. The identified region may be predetermined, based on depth information (e.g. , the borders associated with the object from the processed depth image), based on the video image signal, determined by any other suitable procedure, or any combination thereof. In an exemplary embodiment, the region of interest may include the object itself as determined from the depth information as well as a buffer region surrounding the object.
[0087] At step 608, processing module 52 may process the regions of interest of the video image signal. Processing only the regions of interest reduces the processing load and time, and may allow for complex operations to be performed in real-time at the processing module 52, even when processing module 52 has limited processing speed or power. Although any suitable processing may be performed, in some embodiments processing module 52 may identify individuals, perform biometric analysis, determine statistics, or determine any other suitable information, for the objects that are within the region of interest.
[0088] In an exemplary embodiment, processing may include identifying a person of interest such as an employee, a customer service specialist, a customer, a manager, or a checkout specialist. For example, it may be known that an employee wears a certain uniform or subset of uniforms, color scheme, or other identifiable color or pattern. This pattern may be known and criteria may be used to determine whether the object is a person of interest. Although the criteria may be set in any suitable manner, in some embodiments the criteria may be set manually, automatically, or any combination thereof. In an embodiment of manual criteria, a color (or set of colors) may be identified as well as a threshold (e.g., a percentage of the video image signal of the person that must include the color or colors). In an embodiment of automatic criteria, persons of interest may be identified within a set of images and a learning algorithm may determine criteria for distinguishing persons of interest from other people or objects. In some embodiments, multiple criteria may be provided to identify multiple types of persons of interest, such as an employee, a customer service specialist, a customer, a manager, and a checkout specialist.
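The manual colour criterion described above might be sketched as follows, where the fraction of region-of-interest pixels close to a known uniform colour is compared against a threshold. The colour values, tolerance, and threshold are hypothetical.

```python
import numpy as np

def fraction_matching_colour(roi_rgb, target_rgb, tolerance=30):
    """Fraction of ROI pixels within `tolerance` of a target uniform colour."""
    diff = np.abs(roi_rgb.astype(int) - np.asarray(target_rgb, dtype=int))
    matches = np.all(diff <= tolerance, axis=-1)
    return float(matches.mean())

def is_person_of_interest(roi_rgb, uniform_colours, min_fraction=0.25):
    """Manual criterion: the person is flagged if enough of the region of
    interest matches one of the known uniform colours. Values are illustrative."""
    return any(fraction_matching_colour(roi_rgb, c) >= min_fraction
               for c in uniform_colours)

if __name__ == "__main__":
    # Hypothetical ROI: mostly a blue uniform (RGB ~ (20, 40, 160)).
    roi = np.tile(np.array([20, 40, 160], dtype=np.uint8), (50, 30, 1))
    roi[:10] = (200, 200, 200)                            # some background pixels
    print(is_person_of_interest(roi, [(25, 45, 155)]))    # True
```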
[0089] FIG. 12 depicts an exemplary monitoring environment for an exemplary time-of-flight monitoring system in accordance with embodiments of the present disclosure. An exemplary application for the monitoring environment may be a monitoring application for a retail location 2 as depicted in FIG. 12. Although a retail application is provided as an exemplary application, a person having ordinary skill in the art will understand that the systems and methods described herein may be implemented in numerous other monitoring applications, such as manufacturing facilities, shipping facilities, agricultural facilities, workplaces, government buildings, security queues, private residences, and other applications. While the exemplary embodiment described with respect to FIG. 12 may focus on the monitoring of customers and employees, numerous other objects may be monitored such as crates, boxes, vehicles, livestock, etc.
[0090] Exemplary retail location 2 may include a plurality of customers 10 and 11, a display area 40, a ceiling-mounted time-of-flight monitoring unit 50a, and a wall-mounted time-of-flight monitoring unit 50b. In the exemplary embodiment of FIG. 12, both of the time-of-flight monitoring units 50a and 50b may capture information related to the same or similar target area from different angles. Although two time-of-flight monitoring units are depicted, any number of time-of-flight monitoring units 50 may be associated with a single target area. And while the time-of-flight monitoring units are oriented at right angles in this exemplary embodiment, it will be understood that a plurality of monitoring units may be oriented at a variety of relative locations.
[0091] Processing for time-of-flight monitoring unit 50b may be performed in a similar manner to a time-of-flight monitoring unit located on the ceiling (e.g., time-of-flight monitoring unit 50a), with modifications to adjust for the differing location relative to the objects of interest. As described above for a ceiling-located time-of-flight monitoring unit (e.g., time-of-flight monitoring units 50 and 50a), some or all of the depth information and/or information from a secondary sensor 58 may be processed by a processing module 52 of time-of-flight monitoring unit 50b. In some embodiments, time-of-flight monitoring units (e.g., 50a and 50b) having coverage over the same target area may communicate raw depth information, processed depth information, identified object information, or any other suitable information, for co-processing of the information related to the target area. In some embodiments, one of time-of-flight monitoring unit 50a or 50b may function as a master device and the other as a slave device.
[0092] Some or all of the processing steps described above with respect to FIGS. 6, 8 and 11 may be performed for a monitoring unit located at any suitable orientation (e.g., time-of-flight monitoring unit 50b) by modifying one or more rules or parameters of interest. For example, the configuration of a time-of-flight monitoring unit 50b having a side-facing orientation may focus on a particular region 70 within the field of view of the time-of-flight camera associated with time-of-flight monitoring unit 50b, such as the region 70 near display area 40, as depicted in FIG. 12. During configuration, in addition to the typical tilt and distance parameters necessary to calibrate the camera, a minimum and a maximum depth associated with the region 70 may be set. These parameters may be determined automatically, based on user inputs, or any combination thereof. In an exemplary embodiment of a retail location 2, the region 70 may include customer 10 located in proximity to display area 40, but not customer 11 located outside of the region 70.
[0093] FIG. 13 depicts steps 700 for determining identifying characteristics using machine learning in accordance with some embodiments of the present disclosure. The time-of-flight monitoring units 50 described herein may obtain a wealth of information based on the depth information from time-of-flight camera 54, information from one or more secondary sensors 58 (e.g., a video sensor), or a combination thereof. In addition to identifying and providing tracking for objects of interest, this information may be used to identify numerous identifying characteristics about the objects of interest, such as identifications of an employee, a customer service specialist, a customer, a manager, or a checkout specialist, as described above. In the exemplary embodiment where the object is a person, exemplary identifying
characteristics include clothing (e.g., color of clothing), height, weight, age, age range (e.g., newborn, toddler, child, pre-teen, teenager, adult, senior citizen, etc.), hair color, eye color, biometrics (facial recognition, iris recognition, etc.), gender, race, ethnicity, identity (e.g. , matching to a known individual), likely income level (e.g. , based on identifying information, shopping history, information about other previously visited locations), employee status, customer status, energy level, emotions, anxiety level, intoxication (e.g., based on irregular movement patterns), customer confusion, likelihood of making a purchase, any other suitable identifying characteristics, or any combination thereof. Similar identifying characteristics may be used for other objects, such as crates, boxes, vehicles, livestock, etc. It will be understood that each of the steps 700 may be performed by any suitable computing device (e.g. , computing devices 406), server (e.g. , servers 408), or any combination thereof, at any suitable location or combination thereof, and in any suitable manner (e.g. , local vs. distributed processing).
[0094] At step 702, one or more data sources may be identified. A data source includes at least one time-of-flight monitoring unit 50, and can further include information from other sensors, user-provided information, existing information databases (e.g., a database including customer information), any other suitable data source, or any combination thereof. In some embodiments, user-provided information may associate identifying characteristics with data obtained from time-of-flight monitoring unit 50. Although it will be understood that user-provided information may be provided for any identifying characteristic, in an exemplary embodiment users may desire to distinguish between customers and employees based on data from a time-of-flight monitoring unit 50 including a time-of-flight camera 54 and an RGB video sensor as a secondary sensor 58. One or more users may monitor a user interface that allows them to view outputs from time-of-flight monitoring units that identify people in retail locations. When an object is identified as a person (e.g., based on depth information), the users may then manually determine whether that person is a customer, employee, or other (e.g., small children, law enforcement, regulatory personnel, cleaning crews, etc.). The determinations, video, and depth information may be stored, providing a data source that may be used by a machine learning system to provide algorithms for classifying a person as a customer, employee, or other. Whatever the data source, the data may be stored over any suitable time period and on any suitable geographic scale, including from a single time-of-flight monitoring unit, from a single location, from a plurality of locations, etc.
[0095] At step 704, one or more learning criteria may be provided. Learning criteria may be user-provided, may be unsupervised, or any combination thereof. Although any suitable learning criteria may be provided, exemplary learning criteria include one or more of classification, regression, clustering, density estimation, and dimensionality reduction. For example, in an exemplary embodiment the learning criteria may include three classes of employee, customer, and other.
[0096] At step 706, the data from the data source may be analyzed based on the learning criteria according to one or more machine learning algorithms. Exemplary machine learning algorithms may include neural networks, similarity learning methods, Bayesian networks, genetic algorithms, principal component analysis, independent component analysis, cluster analysis, support vector machines, any other suitable learning algorithm, or any combination thereof. Data from the data source may be analyzed by the learning algorithm based on the provided learning criteria, providing as an output one or more algorithms for identifying the desired identification characteristic from real-world data, such as output data from a time-of-flight monitoring unit 50. In an exemplary embodiment, data from the data source (e.g., stored depth and video information from a time-of-flight monitoring unit with people classified as employee, customer, or other) may be input to a cascade classifier such as Viola-Jones trained on depth representations of objects. The resulting identified objects can be further classified based on information retrieved from the video sensor such as color information.
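As a purely illustrative sketch of step 706, the example below trains a support vector machine (one of the learning algorithms listed above) on synthetic features standing in for height and uniform-colour measurements; it is not the cascade-classifier pipeline itself, and the feature choices, labels, and values are assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic training data standing in for features extracted from stored depth
# and video frames (e.g., estimated height in metres and the fraction of ROI
# pixels matching an employee uniform colour). Labels: 0 = customer,
# 1 = employee, 2 = other. All values are illustrative.
rng = np.random.default_rng(0)
customers = np.column_stack([rng.normal(1.7, 0.1, 200), rng.uniform(0.0, 0.1, 200)])
employees = np.column_stack([rng.normal(1.7, 0.1, 200), rng.uniform(0.3, 0.9, 200)])
others    = np.column_stack([rng.normal(1.0, 0.2, 200), rng.uniform(0.0, 0.2, 200)])
X = np.vstack([customers, employees, others])
y = np.array([0] * 200 + [1] * 200 + [2] * 200)

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Classify two new detections: an adult with a strong uniform-colour match,
# and a short object with no uniform colour.
print(clf.predict([[1.72, 0.55], [0.95, 0.05]]))    # expected: [1, 2]
```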
[0097] At step 708, the one or more algorithms provided by the learning algorithm may be implemented for use with one or more time-of-flight monitoring units 50. The learning algorithm may be implemented in any suitable manner, such as in software residing on a computer readable medium and running on processing module 52, one or more computing devices 406, one or more servers 408, any other suitable device, or any combination thereof.
[0098] FIG. 14 depicts steps 800 for establishing privacy settings in accordance with some embodiments of the present disclosure. As described herein, a time-of-flight monitoring unit 50 includes a time-of-flight camera 54. Unlike video image signals from a video sensor or other similar monitoring device, depth information such as a depth map from a time-of-flight camera 54 is unlikely to include data or images that can be used to identify a particular individual. This is because depth information generally captures only the physical shape of a person from a particular angle. And while it may be possible to identify a particular individual based on such data, as a general matter such information is best suited to distinguishing between general sizes or shapes (e.g., child vs. adult, male vs. female, etc.). Absent a specific algorithm for matching captured depth information with known depth information associated with a particular individual, it is highly unlikely that the depth information will provide any information that could be used to identify the individual. This is in contrast to a video image signal from a conventional monitoring system, which may for example provide a clear depiction of a person's face, clothing, etc. Accordingly, a time-of-flight monitoring system may provide for anonymous monitoring coupled with large-scale data acquisition.
[0099] As described herein, a time-of-flight monitoring unit 50 (or a network of time-of-flight monitoring units) is useful for numerous applications spanning a broad range of industries. These applications may have differing privacy requirements. For example, legal or regulatory requirements may limit the types of information that can be acquired in public locations, private property that is open to the public, etc. Customers may not wish to have their movements tracked in an identifiable manner at a retail location or similar environment, and may choose not to frequent establishments that store identifiable information. On the other hand, there may be applications such as government buildings, public transportation, workplaces, security queues, etc., where there may not be a similar expectation of privacy. Accordingly, it may be desirable to allow for different privacy settings for a time-of-flight monitoring unit 50 or systems incorporating such monitoring units. Although it will be understood that privacy settings may be set in any suitable manner, in some embodiments privacy settings may be set by the manufacturer, on the monitoring units, at a connected computing device or server, or any combination thereof. Privacy settings may be user configurable or may be set automatically (e.g., based on legal requirements associated with the jurisdiction where the monitoring units will be used).
[00100] At step 802, display privacy settings may be set. Display settings may relate to the information that is displayed to a user (e.g., at a computing device 406) on a display.
Display privacy settings may include settings for statistics, depth information, object information, video image signal data, any other suitable data or information, or any
combination thereof. For example, an exemplary setting for displaying statistics may only display statistics relating to a particular location that is being monitored (e.g. , queue size, wait time, dwell, customer counts, etc.). An exemplary setting including statistics and depth information may include an image of the depth map along with the statistics, which may allow an observer to view the location of people in a manner that makes it difficult to identify any particular individual. An exemplary setting that includes object information may permit a user to view information about a person in the field of view, such as demographic information, identifying information, history information, any other suitable information, or any combination thereof. An exemplary setting that includes video image signal data may permit a user to view information such as a live video feed.
[00101] At step 804 the storage settings may be set. Storage settings may relate to the information that is stored (e.g. , at time-of-flight monitoring unit 50, computing devices 406, and/or servers 408). Storage privacy settings may include settings for statistics, depth information, object information, video image signal data, any other suitable data or information, or any combination thereof, as described above with respect to display privacy settings.
[00102] At step 806 the display and storage settings may be applied. Display and storage settings may be applied at one or more of time-of-flight monitoring units 50 (e.g. , time- of-flight monitoring units 402a-402i at locations 410 and 412a-412c), computing devices 406, and servers 408. In some embodiments, applying the display and storage settings may include disabling functionality of the time-of-flight monitoring units 50, such that any information that is not to be displayed or stored is not determined or acquired in the first instance. In some embodiments, applying the display and storage settings may include disabling the transmission of certain data or information from time-of-flight monitoring units 50 to one or more of computing devices 406 and/or servers 408. In some embodiments, applying the display and storage settings may include the computing devices 406 and/or servers 408 not displaying or storing particular data or information.
[00103] FIG. 15 depicts steps 900 for determining statistics in accordance with some embodiments of the present disclosure. As has been described herein, time-of-flight monitoring unit 50 may have applications monitoring a variety of different object types in a variety of location types. Multiple units may be located in a single location at a plurality of different views and perspectives, and data from multiple locations may be stored and processed together. Accordingly, it will be understood that the time-of-flight monitoring units 50 described herein may be used to determine numerous types of statistics, whether at the level of the time-of-flight monitoring unit 50, computing devices 406, servers 408, any other suitable device, or any combination thereof.
[00104] A variety of types of information may be determined based on the depth information from the time-of-flight camera 54, and this information may also be integrated with information from a secondary sensor 58 such as a video sensor, and other data sources such as customer databases. Statistics may relate to any such information, such as the previously described identifying characteristics including height, weight, age, age range (e.g. , newborn, toddler, child, pre-teen, teenager, adult, senior citizen, etc.), hair color, eye color, biometrics (facial recognition, iris recognition, etc.), gender, race, ethnicity, identity (e.g., matching to a known individual), likely income level (e.g. , based on identifying information, shopping history, information about other previously visited locations), employee status, customer status, energy level, emotions, anxiety level, intoxication (e.g. , based on irregular movement patterns), customer engagement, associates to customer ratio, customer confusion, likelihood of making a purchase, any other suitable identifying characteristics, or any combination thereof. Although exemplary statistics of people counting, zone analysis, queue analysis, and checkout analysis will be described in accordance with steps 900, it will be understood that steps 900 may be used for determining any other suitable statistics that may be determined based on the present disclosure.
[00105] In an exemplary embodiment of people counting, it may be desired to determine statistics about the number of people entering a location or a region of the location. The system may identify people based on depth information, and determine when they enter a location based on a criterion such as a boundary line, e.g., a line that sets a boundary at an entry to a store or a department within a store. The people count may be incremented whenever a person crosses the boundary line, the first time a person crosses the boundary line, in any other suitable manner, or any combination thereof. In some embodiments, only particular people (e.g., non-employee customers) may be counted.
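A minimal sketch of boundary-line counting, assuming tracked 2-D positions for each identified person, is shown below; the line coordinates and the decision to count every crossing are illustrative choices.

```python
def side_of_line(point, line_start, line_end):
    """Signed side of the boundary line for a 2-D point (cross-product sign)."""
    (x, y), (x1, y1), (x2, y2) = point, line_start, line_end
    return (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)

def count_crossings(track, line_start, line_end):
    """Count how many times a tracked person crosses the boundary line.

    track: chronological list of (x, y) positions for one identified person.
    A people-counting application might instead count only the first crossing
    or only crossings in the entry direction.
    """
    crossings = 0
    sides = [side_of_line(p, line_start, line_end) for p in track]
    for a, b in zip(sides, sides[1:]):
        if a * b < 0:                      # sign change -> the line was crossed
            crossings += 1
    return crossings

if __name__ == "__main__":
    entrance = ((0.0, 2.0), (6.0, 2.0))    # hypothetical entry line at y = 2 m
    path = [(3.0, 0.5), (3.1, 1.5), (3.2, 2.5), (3.3, 3.5)]
    print(count_crossings(path, *entrance))   # 1
```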
[00106] In an exemplary embodiment of zone analysis, it may be desired to determine statistics about a particular portion of the field of view, e.g., a defined zone. People may be identified based on depth information and their activity may be tracked within a zone, the zone being defined in any suitable manner, e.g., a boundary region. Although a boundary region may be defined in any suitable manner, in some embodiments the boundary region may be a defined area within the field of view that is based on a user input, e.g., drawing an area on a user interface with a touchscreen or mouse, defining a central location of an area and defining a size and/or shape of the area, any other suitable manner, or any combination thereof. Statistics may then be determined for the zone. Although it will be understood that any suitable statistics may be determined for the zone, exemplary statistics include zone count, dwell time, customer location, customer engagement, and staff-to-customer ratio.
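The zone statistics described above might be computed as in the following sketch, which derives dwell time and peak occupancy for a rectangular boundary region from per-person tracks. The rectangular zone, frame period, and returned statistics are assumptions for illustration.

```python
def zone_statistics(tracks, zone, frame_period_s=0.1):
    """Per-zone dwell time and peak occupancy for a rectangular boundary region.

    tracks: dict of person id -> chronological list of (x, y) positions.
    zone: (xmin, ymin, xmax, ymax) in the same coordinates as the tracks.
    Rectangular zones, the frame period, and the returned statistics are
    illustrative choices.
    """
    xmin, ymin, xmax, ymax = zone
    inside = lambda p: xmin <= p[0] <= xmax and ymin <= p[1] <= ymax
    dwell = {pid: sum(frame_period_s for p in pts if inside(p))
             for pid, pts in tracks.items()}
    n_frames = max(len(pts) for pts in tracks.values())
    occupancy = [sum(1 for pts in tracks.values()
                     if len(pts) > f and inside(pts[f])) for f in range(n_frames)]
    return {"dwell_s": dwell, "peak_occupancy": max(occupancy)}

if __name__ == "__main__":
    display_zone = (2.0, 2.0, 5.0, 5.0)
    tracks = {"p1": [(1.0, 1.0), (3.0, 3.0), (3.5, 3.5), (6.0, 6.0)],
              "p2": [(4.0, 4.0), (4.0, 4.1), (4.0, 4.2), (4.0, 4.3)]}
    print(zone_statistics(tracks, display_zone))
```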
[00107] In an exemplary embodiment of queue analysis, it may be desired to determine statistics about a portion of the field of view associated with a customer queue. People may be identified based on depth information and their activity may be tracked within the queue, the queue being defined in any suitable manner, e.g., based on a boundary region or the location of a checkout area. Although a boundary region may be depicted and defined in any suitable manner, in some embodiments the boundary region may be depicted and defined as described above, for example for a queue area associated with a checkout area. Although a checkout area may be defined in any suitable manner, in some embodiments the checkout area may be defined manually, automatically, or any suitable combination thereof. In some embodiments, a user may use a user input to select the location of one or more checkout locations and define a region around the checkout location as the checkout area. In some embodiments, checkout locations may be identified automatically, e.g., based on depth data or a beacon. In some embodiments, the statistics may be modified based on whether the queue is structured or unstructured. An exemplary structured queue may have predictable positions for customers, checkout locations, employees, items for purchase, etc. Examples may include supermarket checkouts, retail locations, banks, airport security, and other similar locations. An exemplary unstructured queue may have varied locations for customers, checkout locations, employees, items for purchase, etc. Examples may include self-checkout areas, ATMs, and other similar locations. Statistics may then be determined for the queue. Although it will be understood that any suitable statistics may be determined for the queue, exemplary statistics include waiting time, queue count, and abandonment rate.
[00108] In an exemplary embodiment of checkout analysis, it may be desired to determine statistics about a portion of the field of view associated with the checkout process. People may be identified based on depth information and their activity may be tracked at the checkout area, the checkout area being defined in any suitable manner as described above. In some embodiments, the statistics may be modified based on whether the checkout area is operated by an employee, is a self-checkout area, or is a hybrid checkout area. In an exemplary self-checkout area, multiple checkout systems may be supported by a smaller number of employees who assist customers when they have problems or issues with self-checkout. In an exemplary hybrid checkout location, it may be possible to switch between a self-checkout mode and a standard cashier-operated mode. Statistics may then be determined for the checkout area. Although it will be understood that any suitable statistics may be determined for the checkout area, exemplary statistics include checkout time, wait time, and staff support level.
[00109] Referring again to steps 900 of FIG. 15, at step 902, a statistic may be selected or created. In some embodiments, one or more statistics may be pre-configured, such that a user may select from known statistics. In some embodiments statistics may be created through a user interface. An exemplary user interface may allow a user to choose from among available data sources (e.g., data from time-of-flight monitoring unit 50, computing devices 406, and/or servers 408) to create a statistic. A user may then identify how the data may be used or combined to create the statistic (e.g., combining data relating to whether a person is an employee with data relating to the person's location within a store to provide statistics regarding employee activity).
[00110] At step 904 the statistic may be configured. Although statistics may be configured in any suitable manner, in some embodiments a user interface may allow for a user to set parameters that may be used to configure the statistic. In an exemplary embodiment of people counting, it may be desired to determine how many people enter a location. Configuring a people counting statistic may include setting a threshold location for counting (e.g. , drawing a line within the field of view via a user interface), setting a direction for counting (e.g., entry, exit, or both), selecting whether to limit the count to unique users (e.g., based on user tracking or identifying information), any other suitable parameters, or any combination thereof. In an exemplary embodiment of zone analysis or queue analysis, it may be desired to determine statistics about people's activities within a particular area of a location. Configuring a statistic may include setting an area for analysis (e.g. , drawing a shape within the field of view via a user interface that defines the zone), selecting relevant people to count (e.g. , customers, employees, and/or other), identifying checkout locations, any other suitable parameters, or any combination thereof.
[00111] At step 906 the statistic may be compiled. Although statistics may be compiled in any suitable manner, in some embodiments the statistic may be compiled by one or more time-of-flight monitoring units 50, computing devices 406, servers 408, any other suitable device, or any combination thereof, based on data from one or more time-of-flight monitoring units 50 and any other suitable data sources. This information may be transmitted to a computing device 406 and/or server 408 for display or storage.
[00112] FIG. 16 depicts steps 1000 for identifying a person of interest based on a beacon signal in accordance with some embodiments of the present disclosure. As described herein, it may be desired to identify persons of interest, for example, in order to generate statistics relating to a location. Although in some embodiments a person of interest may be identified based on depth information, a video image signal, or a combination thereof, in other
embodiments persons of interest such as employees may wear beacons that facilitate
identification. Moreover, different persons of interest (e.g., employee types) may have different beacons.
[00113] At step 1002, the time-of-flight monitoring unit may provide a wake-up signal for the beacons. A wake-up signal may be provided so that the beacons do not have to be powered and transmitting at all times, and instead transmit in response to the wake-up signal. In an embodiment, any suitable wake-up signal may be provided that can be sensed by circuitry of the beacon, such as an infra-red light pattern, an ultrasonic signal, or a modulated communication signal. In some embodiments, the time-of-flight monitoring unit may include additional hardware for providing the wake-up signal, e.g., to provide an ultrasonic signal or a modulated communication signal. Although the wake-up signal may be provided in any suitable manner, in an embodiment the wake-up signal may be provided on a periodic basis.
[00114] At step 1004, the time-of-flight monitoring unit may receive an infra-red beacon signal. An infra-red beacon signal may be provided in any suitable manner that allows the time-of-flight monitoring unit to distinguish the profile of the beacon signal (e.g., signal intensity, wavelength, modulation frequency, etc.) from other infrared light received at the time-of-flight sensor of the time-of-flight monitoring unit. The system may then identify precisely the coordinates of the beacon signal source in the field of view of the time-of-flight camera and associate this signal source with a person to be detected and tracked by the system.
[00115] At step 1006, the time-of-flight monitoring unit may associate the received beacon signal with an object that has been identified by the time-of-flight monitoring unit. Based on this association, the identified object may be identified as a person of interest, such as an employee.
[00116] The foregoing is merely illustrative of the principles of this disclosure and various modifications may be made by those skilled in the art without departing from the scope of this disclosure. The above described embodiments are presented for purposes of illustration and not of limitation. The present disclosure also can take many forms other than those explicitly described herein. Accordingly, it is emphasized that this disclosure is not limited to the explicitly disclosed methods, systems, and apparatuses, but is intended to include variations to and modifications thereof, which are within the spirit of the following claims.
[00117] As a further example, variations of apparatus or process parameters (e.g., dimensions, configurations, components, process step order, etc.) may be made to further optimize the provided structures, devices and methods, as shown and described herein. In any event, the structures and devices, as well as the associated methods, described herein have many applications. Therefore, the disclosed subject matter should not be limited to any single embodiment described herein, but rather should be construed in breadth and scope in accordance with the appended claims.

Claims

What is claimed is:
1. A method of identifying a plurality of objects within a field of view, comprising:
transmitting a light signal from a time-of-flight monitoring unit to illuminate a region including the field of view;
receiving a reflected signal of the light signal at a time-of-flight camera of the time-of-flight monitoring unit, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal;
identifying the plurality of objects based on the depth information;
determining an area of interest within the field of view; and
generating one or more statistics for one or more of the identified objects within the area of interest.
2. The method of claim 1, wherein determining the area of interest within the field of view comprises receiving a user input defining the area of interest.
3. The method of claim 2, wherein the user input comprises selecting a boundary region.
4. The method of claim 2, wherein the user input comprises selecting a boundary line.
5. The method of claim 2, wherein the user input comprises identifying a checkout area.
6. The method of claim 1, wherein determining the area of interest within the field of view comprises identifying a checkout area based on the depth information.
7. The method of claim 6, wherein the checkout area is a customer checkout device.
8. The method of claim 1, wherein the one or more statistics comprise one or more queue analysis statistics.
9. The method of claim 8, wherein the one or more queue analysis statistics comprise waiting time, queue count, and abandonment rate.
10. The method of claim 8, wherein the one or more queue analysis statistics are determined for an unstructured queue.
11. The method of claim 1, wherein the one or more statistics comprise one or more zone analysis statistics.
12. The method of claim 11, wherein the one or more zone analysis statistics comprise zone count, dwell time, customer location, customer engagement, and staff-to-customer ratio.
13. The method of claim 1, wherein the one or more statistics comprise one or more checkout analysis statistics.
14. The method of claim 13, wherein the one or more checkout analysis statistics comprise checkout time, wait time, and staff support level.
15. The method of claim 13, wherein the one or more checkout analysis statistics are determined for a self-checkout area.
16. The method of claim 13, wherein the one or more checkout analysis statistics are determined for a hybrid checkout area.
17. The method of claim 1, further comprising:
classifying one or more of the identified objects; and
generating one or more statistics based on the classification.
18. The method of claim 17, wherein classifying the identified objects comprises determining whether each of the identified objects is a person of interest.
19. The method of claim 18, wherein the person of interest comprises one or more of an employee, a customer service specialist, a customer, a manager, and a checkout specialist.
20. The method of claim 18, wherein determining whether each of the identified objects is a person of interest comprises:
generating a video image signal for the field of view with a video sensor of the time-of-flight monitoring unit;
associating the video image signal with the depth information;
identifying one or more regions of interest associated with the identified objects based on the associating; and
determining whether each of the identified objects is a person of interest based on the video image signal within the one or more identified regions of interest.
21. The method of claim 20, wherein determining whether each of the identified objects is a person of interest based on the video image signal comprises analyzing a color pattern for each identified object.
22. The method of claim 18, wherein determining whether each of the identified objects is a person of interest comprises:
receiving one or more infra-red beacon signals at the time-of-flight camera of the time- of-flight monitoring unit;
associating each beacon signal with one or more of the identified objects; and
identifying the one or more of the identified objects associated with the beacon signal as a person of interest.
23. The method of claim 22, further comprising providing a wake-up signal, wherein the wake-up signal is associated with one or more beacons.
24. The method of claim 23, wherein the wake-up signal comprises an infra-red light pattern.
25. The method of claim 1, further comprising:
receiving information from a customer checkout device;
analyzing interactions between one or more of the identified objects and the customer checkout device based on the received information and depth information associated with the one or more identified objects.
26. The method of claim 1, further comprising:
generating a video image signal for the field of view with a video sensor of the time-of-flight monitoring unit;
associating the video image signal with the depth information;
identifying a region of interest associated with each of the plurality of objects based on the associating; and
processing the portion of the video image signal associated with the region of interest.
27. The method of claim 26, further comprising identifying one or more characteristics based on the processed video image signal.
28. A time-of-flight monitoring unit, comprising:
an illumination system configured to transmit a light signal to illuminate a region including a field of view;
a time-of-flight camera configured to receive a reflected signal of the light signal, wherein the time-of-flight camera generates depth information for the field of view based on the reflected signal; and
a processing module configured to identify a plurality of objects based on the depth information, determine an area of interest within the field of view, and generate one or more statistics for one or more of the identified objects within the area of interest.
29. The time-of-flight monitoring unit of claim 28, wherein the processing module is configured to receive a user input defining the area of interest and determine the area of interest based on the user input.
30. The time-of-flight monitoring unit of claim 29, wherein the user input comprises a boundary region.
31. The time-of-flight monitoring unit of claim 29, wherein the user input comprises a boundary line.
32. The time-of-flight monitoring unit of claim 29, wherein the user input comprises a checkout area.
33. The time-of-flight monitoring unit of claim 28, wherein the processing module is configured to identify a checkout area based on the depth information and determine the area of interest based on the checkout area.
34. The time-of-flight monitoring unit of claim 33, wherein the checkout area is a customer checkout device.
35. The time-of-flight monitoring unit of claim 28, wherein the one or more statistics comprise one or more queue analysis statistics.
36. The time-of-flight monitoring unit of claim 35, wherein the one or more queue analysis statistics comprise waiting time, queue count, and abandonment rate.
37. The time-of-flight monitoring unit of claim 35, wherein the one or more queue analysis statistics are determined for an unstructured queue.
38. The time-of-flight monitoring unit of claim 28, wherein the one or more statistics comprise one or more zone analysis statistics.
39. The time-of-flight monitoring unit of claim 38, wherein the one or more zone analysis statistics comprise zone count, dwell time, customer location, customer engagement, and staff-to-customer ratio.
40. The time-of-flight monitoring unit of claim 28, wherein the one or more statistics comprise one or more checkout analysis statistics.
41. The time-of-flight monitoring unit of claim 40, wherein the one or more checkout analysis statistics comprise checkout time, wait time, and staff support level.
42. The time-of-flight monitoring unit of claim 40, wherein the one or more checkout analysis statistics are determined for a self-checkout area.
43. The time-of-flight monitoring unit of claim 40, wherein the one or more checkout analysis statistics are determined for a hybrid checkout area.
44. The time-of-flight monitoring unit of claim 28, wherein the processing module is configured to classify one or more of the identified objects and generate one or more statistics based on the classification.
45. The time-of-flight monitoring unit of claim 44, wherein the processing module is configured to determine whether each of the identified objects is a person of interest based on the classification.
46. The time-of-flight monitoring unit of claim 45, wherein the person of interest comprises one or more of an employee, a customer service specialist, a customer, a manager, and a checkout specialist.
47. The time-of-flight monitoring unit of claim 45, further comprising a video sensor configured to generate a video image signal for the field of view, wherein the processing module is configured to associate the video image signal with the depth information, identify one or more regions of interest associated with the identified objects based on the associating, and determine whether each of the identified objects is a person of interest based on the video image signal within the one or more identified regions of interest.
48. The time-of-flight monitoring unit of claim 47, wherein the processing module is configured to analyze a color pattern for each identified object to determine whether each of the identified objects is a person of interest.
49. The time-of-flight monitoring unit of claim 45, wherein the time-of-flight camera is configured to receive one or more infra-red beacon signals, and wherein the processing module is configured to associate each beacon signal with one or more of the identified objects and identify the one or more of the identified objects associated with the beacon signal as a person of interest.
50. The time-of-flight monitoring unit of claim 49, wherein the illumination system is configured to provide a wake-up signal, wherein the wake-up signal is associated with one or more beacons.
51. The time-of-flight monitoring unit of claim 50, wherein the wake-up signal comprises an infra-red light pattern.
52. The time-of-flight monitoring unit of claim 28, wherein the processing module is configured to receive information from a customer checkout device and analyze interactions between one or more of the identified objects and the customer checkout device based on the received information and depth information associated with the one or more identified objects.
53. The time-of-flight monitoring unit of claim 28, further comprising a video sensor configured to generate a video image signal for the field of view, wherein the processing module is configured to associate the video image signal with the depth information, identify a region of interest associated with each of the plurality of objects based on the associating, and process the portion of the video image signal associated with the region of interest.
54. The time-of-flight monitoring unit of claim 53, wherein the processing module is configured to identify one or more characteristics based on the processed video image signal.
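The sketches below are illustrative only: they are not the implementation disclosed in the specification, and every function name, parameter value, and data layout in them is an assumption introduced for clarity. This first Python sketch outlines the method of claim 1 under the assumption that a single depth frame from a ceiling-mounted time-of-flight monitoring unit is available as a 2-D NumPy array of distances in metres: objects are segmented from the depth information, an area of interest is applied, and simple statistics are generated for the objects inside it. A real deployment would run this per frame and track objects over time; the sketch deliberately ignores tracking.

import numpy as np
from scipy import ndimage

def identify_objects(depth_frame, floor_distance, min_height_m=1.0, min_pixels=50):
    # Apparent height above the floor for each pixel; sufficiently tall regions are foreground.
    height_map = floor_distance - depth_frame
    foreground = height_map > min_height_m
    labels, count = ndimage.label(foreground)        # connected-component labelling
    objects = []
    for idx in range(1, count + 1):
        ys, xs = np.nonzero(labels == idx)
        if ys.size >= min_pixels:                    # discard small noise blobs
            objects.append({
                "centroid": (float(ys.mean()), float(xs.mean())),
                "height_m": float(height_map[ys, xs].max()),
            })
    return objects

def in_area_of_interest(obj, aoi):
    # Area of interest modelled as an axis-aligned pixel rectangle (y0, x0, y1, x1).
    y, x = obj["centroid"]
    y0, x0, y1, x1 = aoi
    return y0 <= y <= y1 and x0 <= x <= x1

def generate_statistics(objects, aoi):
    inside = [o for o in objects if in_area_of_interest(o, aoi)]
    heights = [o["height_m"] for o in inside]
    return {
        "count_in_area": len(inside),
        "mean_height_m": float(np.mean(heights)) if heights else 0.0,
    }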
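The queue-analysis statistics named in claims 8 and 9 (waiting time, queue count, abandonment rate) could be derived from per-object visits to the queue area once tracking is in place. The visit record below and its fields are assumptions made for illustration, not structures taken from the specification.

from dataclasses import dataclass
from statistics import mean

@dataclass
class QueueVisit:
    entered_at: float        # seconds since epoch when the object entered the queue area
    exited_at: float         # seconds since epoch when the object left the queue area
    reached_checkout: bool   # False means the visit was abandoned

def queue_statistics(visits, now):
    completed = [v for v in visits if v.exited_at <= now]
    waits = [v.exited_at - v.entered_at for v in completed]
    abandoned = [v for v in completed if not v.reached_checkout]
    return {
        "queue_count": sum(1 for v in visits if v.entered_at <= now < v.exited_at),
        "average_wait_s": mean(waits) if waits else 0.0,
        "abandonment_rate": len(abandoned) / len(completed) if completed else 0.0,
    }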
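For the colour-pattern test of claims 20 and 21, one plausible reading is that the video pixels inside each depth-derived region of interest are compared against a reference uniform colour, for example staff vests. The HSV thresholds below are placeholder values, not values from the specification, and would need per-site calibration.

import numpy as np
import cv2

def is_person_of_interest(video_frame_bgr, region, hue_range=(100, 130), min_fraction=0.25):
    # region is (y0, x0, y1, x1) in pixel coordinates of the video frame.
    y0, x0, y1, x1 = region
    patch = video_frame_bgr[y0:y1, x0:x1]
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    lower = np.array([hue_range[0], 80, 80], dtype=np.uint8)     # saturated, bright pixels only
    upper = np.array([hue_range[1], 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    fraction = float(np.count_nonzero(mask)) / mask.size
    return fraction >= min_fraction      # enough uniform-coloured pixels -> flag as person of interest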
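Claims 22 and 49 associate infra-red beacon signals with identified objects. A minimal association step, assuming each beacon detection has already been reduced to a pixel position in the time-of-flight image, is to assign it to the nearest object centroid within a distance threshold; the threshold and the data shapes are illustrative assumptions, not details from the specification.

import math

def associate_beacons(objects, beacon_positions, max_distance_px=40):
    # objects: list of dicts with a "centroid" (y, x); beacon_positions: list of (y, x).
    for by, bx in beacon_positions:
        nearest = min(objects,
                      key=lambda o: math.hypot(o["centroid"][0] - by, o["centroid"][1] - bx),
                      default=None)
        if nearest is not None and math.hypot(nearest["centroid"][0] - by,
                                              nearest["centroid"][1] - bx) <= max_distance_px:
            nearest["person_of_interest"] = True    # flag the matched object as a person of interest
    return objects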
PCT/SG2015/050246 2015-07-31 2015-07-31 Time-of-flight monitoring system WO2017023202A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/SG2015/050246 WO2017023202A1 (en) 2015-07-31 2015-07-31 Time-of-flight monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/SG2015/050246 WO2017023202A1 (en) 2015-07-31 2015-07-31 Time-of-flight monitoring system

Publications (1)

Publication Number Publication Date
WO2017023202A1 true WO2017023202A1 (en) 2017-02-09

Family

ID=57943400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2015/050246 WO2017023202A1 (en) 2015-07-31 2015-07-31 Time-of-flight monitoring system

Country Status (1)

Country Link
WO (1) WO2017023202A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8369575B2 (en) * 2009-05-14 2013-02-05 Samsung Electronics Co., Ltd. 3D image processing method and apparatus for improving accuracy of depth measurement of an object in a region of interest
WO2011054971A2 (en) * 2009-11-09 2011-05-12 Alpha Vision Design Research Ltd. Method and system for detecting the movement of objects
WO2013121267A1 (en) * 2012-02-15 2013-08-22 Mesa Imaging Ag Time of flight camera with stripe illumination
WO2014013012A1 (en) * 2012-07-18 2014-01-23 Ats Group (Ip Holdings) Limited Image processing to derive movement characteristics for a plurality of queue objects
WO2014028276A1 (en) * 2012-08-14 2014-02-20 Microsoft Corporation Wide angle depth detection

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3629307A1 (en) * 2018-09-27 2020-04-01 Aereco Device and method for counting people
FR3086782A1 (en) * 2018-09-27 2020-04-03 Aereco DEVICE AND METHOD FOR COUNTING PEOPLE
WO2021045511A1 (en) * 2019-09-05 2021-03-11 Samsung Electronics Co., Ltd. Apparatus and methods for camera selection in a multi-camera

Similar Documents

Publication Publication Date Title
Sun et al. A review of building occupancy measurement systems
US10869003B2 (en) Using a scene illuminating infrared emitter array in a video monitoring camera for depth determination
US20230316762A1 (en) Object detection in edge devices for barrier operation and parcel delivery
US10389954B2 (en) Using images of a monitored scene to identify windows
JP6579450B2 (en) Smart lighting system, method for controlling lighting and lighting control system
Shih A robust occupancy detection and tracking algorithm for the automatic monitoring and commissioning of a building
Benezeth et al. Towards a sensor for detecting human presence and characterizing activity
US9554064B2 (en) Using a depth map of a monitored scene to identify floors, walls, and ceilings
Mikkilineni et al. A novel occupancy detection solution using low-power IR-FPA based wireless occupancy sensor
US9626849B2 (en) Using scene information from a security camera to reduce false security alerts
WO2019083739A1 (en) Intelligent content displays
CN106127292B (en) Flow method of counting and equipment
Sruthi Iot based real time people counting system for smart buildings
US9886620B2 (en) Using a scene illuminating infrared emitter array in a video monitoring camera to estimate the position of the camera
WO2012137046A1 (en) Adaptive illumination
CN104539874B (en) Fusion pyroelectricity sensing mixes monitoring system and method with the human body of video camera
US10769909B1 (en) Using sensor data to detect events
US11830274B2 (en) Detection and identification systems for humans or objects
Chun et al. Real-time smart lighting control using human motion tracking from depth camera
Bhattacharya et al. Arrays of single pixel time-of-flight sensors for privacy preserving tracking and coarse pose estimation
WO2017023202A1 (en) Time-of-flight monitoring system
CN114972727A (en) System and method for multi-modal neural symbol scene understanding
US11423762B1 (en) Providing device power-level notifications
Bouma et al. WPSS: Watching people security services
Khan et al. Occupancy Prediction in Buildings: State of the Art and Future Directions

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 15900502

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN EP: Public notification in the EP Bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30/05/2018)

122 EP: PCT application non-entry in European phase

Ref document number: 15900502

Country of ref document: EP

Kind code of ref document: A1