WO2016053886A1 - Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor - Google Patents
- Publication number
- WO2016053886A1 WO2016053886A1 PCT/US2015/052684 US2015052684W WO2016053886A1 WO 2016053886 A1 WO2016053886 A1 WO 2016053886A1 US 2015052684 W US2015052684 W US 2015052684W WO 2016053886 A1 WO2016053886 A1 WO 2016053886A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- hardware
- dedicated
- processing unit
- power
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
- G06V10/955—Hardware or software architectures specially adapted for image or video understanding using specific electronic processors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/651—Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/467—Encoded features or binary features, e.g. local binary patterns [LBP]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- a smart sensor for sensing dynamic scene-based occurrences can comprise dedicated computer vision (CV) computation hardware configured to receive sensor data from a sensor array comprising more than one sensor pixel and capable of computing one or more CV features using readings from neighboring sensor pixels of the sensor array, and a first processing unit
- CV computer vision
- the apparatus may comprise peripheral circuitry configured to provide at least one of a timing operation, a focusing operation, an auto-exposure correction operation, object detection, object recognition, storing a scanning window, an event-queuing and/or processing operation, analog processing, analog-to-digital conversion, an integration operation, CV feature computation, a cascade-classifier-based classification, a histogram-based classification, or memory buffering, or any combination thereof.
- the apparatus may comprise the second processing unit, and the first processing unit may be further configured to communicate the face-detected event to the second processing unit while the second processing unit is operating in a low-power mode.
- FIG. 2B is a block diagram of a sensor system with a sensor array unit, microprocessor, and example peripheral circuitry 214, according to one embodiment.
- FIG. 6 is a simplified illustration of example configurations of the sensor array of FIG. 5.
- the sensor system processes the information retrieved from the camera using the included embedded processor and sends "events" (or indications that one or more reference occurrences have occurred) to the main processor only when needed or as defined and configured by the application.
- This allows the general-purpose microprocessor (which is typically relatively high-speed and high-power to support a variety of applications) to stay in a low-power mode (e.g., a sleep mode) most of the time, becoming active only when events are received from the sensor system.
- a smart sensor capable of performing object detection, recognition, etc., can be useful in a variety of applications including internet of things (IoT) applications.
- FIG. 2A is a block diagram that illustrates how a sensor system 210 (also referred to herein as a "smart sensor”) can be configured to enable high-level sensing operations while a main processor 220 can be operating in a low-power (e.g., "sleep" or "stand-by") mode, according to one embodiment.
- Components of FIG. 2A can be incorporated into a larger electronic device.
- An example of a mobile device in which a sensor system 210 may be incorporated is described below, with regard to FIG. 5.
- CV features can then be computed or extracted by the dedicated CV computation hardware using readings from neighboring sensor pixels of the sensor array, providing outputs such as a computed HSG and/or an LBP feature, label, or descriptor.
- no image signal processing circuitry may be disposed between the sensor array unit 212 and the dedicated CV computation hardware.
- dedicated CV computation hardware may receive raw sensor data from the sensor array unit 212 before any image signal processing is performed on the raw sensor data.
- Other CV computations are also possible based on other CV computation algorithms including edge detection, corner detection, scale-invariant feature transform (SIFT), speeded up robust features (SURF), histogram of oriented gradients (HOG), local ternary patterns (LTP), etc., as well as extensions of any of the above.
- an event can be an indication that one or more reference occurrences have occurred.
- events can include data related to a reference occurrence.
- the data included in an event can be indicative of a detected reference object, location information related to the reference object, a number of reference objects, movement associated with a detected reference object, and the like. This data may be conveyed in any of a variety of ways. For example, in the case of object detection, an event can be a simple binary output where "0" means the reference object has not been detected, and "1" means the reference object has been detected.
- the signals received by the CV computation hardware 242 from the sensor array unit 212 have not undergone ISP, for example, the signals have not undergone one or more of defect correction, white balancing, color balancing, auto focus, lens roll off, demosaicing, debayering, or image sharpening, or any combination thereof.
- some processing may occur, such as focusing or auto-exposure correction.
- Such signals that have not undergone ISP may be referred to as raw signals or raw sensor readings or raw sensor data.
- the block of one or more subject sensor elements can include a block of m by n sensor elements, for example, 11 by 11 sensor elements. It is also understood that a pixel-level LBP computation may also be made, where the block of one or more subject sensor elements for which the localized CV feature is computed is a single subject sensor element.
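The pixel-level LBP computation mentioned above can be sketched in software. This is a minimal illustration, not the patent's hardware implementation; the 3x3 neighborhood, the clockwise bit ordering, and the `lbp_pixel` name are assumptions made for the example.

```python
import numpy as np

def lbp_pixel(patch):
    """Compute an 8-bit LBP label for the center pixel of a 3x3 patch.

    Each of the 8 neighbors contributes one bit: 1 if the neighbor's
    reading is >= the center pixel's reading, else 0.
    """
    center = patch[1, 1]
    # Clockwise neighbor order starting at the top-left corner.
    neighbors = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
                 patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    label = 0
    for bit, value in enumerate(neighbors):
        if value >= center:
            label |= 1 << bit
    return label

patch = np.array([[5, 9, 1],
                  [4, 6, 7],
                  [2, 8, 3]])
print(lbp_pixel(patch))  # bits set for neighbors 9, 7, 8 -> 2 + 8 + 32 = 42
```

A block-based LBP (e.g., over an 11 by 11 block) would apply the same comparison to block sums rather than single pixel readings.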
- Although dedicated CV computation hardware 312 is illustrated as separate from the dedicated microprocessor 320, it is understood that in some implementations, dedicated CV computation hardware 312 may be implemented in hardware within the dedicated microprocessor 320.
- microprocessor 216 can provide control signals to the line buffer(s) 230, ADC 234, two-dimensional integration hardware 236, hardware scanning window array 238, and CV computation hardware 242.
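Two-dimensional integration hardware of the kind referenced above typically maintains an integral image (summed-area table), which lets any rectangular block of sensor readings be summed with just four table lookups. A software sketch of that idea follows; the function names are illustrative, not taken from the patent.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero border: ii[y, x] = sum of img[:y, :x]."""
    ii = np.cumsum(np.cumsum(np.asarray(img, dtype=np.int64), axis=0), axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def block_sum(ii, top, left, h, w):
    """Sum over img[top:top+h, left:left+w] using only four table lookups."""
    return (ii[top + h, left + w] - ii[top, left + w]
            - ii[top + h, left] + ii[top, left])

img = np.arange(16).reshape(4, 4)       # values 0..15
ii = integral_image(img)
print(block_sum(ii, 1, 1, 2, 2))        # 5 + 6 + 9 + 10 = 30
```

This constant-time block sum is what makes block-based (multi-block) LBP features cheap to compute over a scanning window.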
- the microprocessor 216 may use a cascade classifier algorithm to perform object-class detection, for example face detection.
- further power savings are possible by implementing the cascade classifier in hardware, to further reduce the computational burden on the microprocessor 216.
- the optional cascade classifier hardware 244 includes a hardware
- each LBP feature (LBP11, ..., LBP1m) will be multiplied by a given weight (w11, ..., w1m), each of which can be different.
- the first weighted scalar sum value is then compared to a first threshold.
- the cascade classifier hardware 244 moves to the next stage.
- the cascade classifier hardware again requests the CV computation hardware 242 to provide LBP features for m subject sensor elements at locations {(x21, y21), ..., (x2m, y2m)} stored in the hardware scanning window array 238.
- the cascade classifier hardware 244 performs another summation of a dot product of each of the LBP features with one or more weights (w21, ..., w2m) to generate a second weighted scalar sum value.
- the second weighted scalar sum value is then compared to a second threshold.
- the cascade classifier hardware 244 can then indicate to the microprocessor 216 that the reference object has been detected, and may further optionally indicate the location of the portion of the image in which the reference object, or portion of reference object, was detected.
- the cascade classifier hardware 244 can be configured to send an indication to the microprocessor 216 that the reference object was detected along with data associated with the reference object, such as all or some of the CV features computed in the process of detecting the reference object, the location within the image of those CV features, or any other data associated with the computations or operations performed by the CV computation hardware 242 and/or the cascade classifier hardware 244.
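The staged evaluation described in the preceding passages — a weighted sum of LBP features compared against a per-stage threshold, with early rejection and detection only after all stages pass — can be sketched as follows. This is a hedged software analogue of the cascade classifier hardware; the `features_for` callback, the toy stage data, and the thresholds are invented for illustration.

```python
def cascade_classify(features_for, stages):
    """Evaluate one scanning window against a cascade of stages.

    `features_for(locations)` returns the LBP features at the requested
    sensor-element locations; each stage is (locations, weights, threshold).
    The window is rejected at the first stage whose weighted scalar sum
    falls below its threshold; passing every stage signals a detection.
    """
    for locations, weights, threshold in stages:
        feats = features_for(locations)
        score = sum(w * f for w, f in zip(weights, feats))
        if score < threshold:
            return False          # early exit: no reference object here
    return True                   # all stages passed -> object detected

# Toy window: pretend the feature at (x, y) is simply x + y.
features_for = lambda locs: [x + y for x, y in locs]
stages = [
    ([(1, 2), (3, 4)], [0.5, 1.0], 5.0),   # stage 1: 0.5*3 + 1.0*7 = 8.5
    ([(0, 1), (2, 2)], [2.0, 1.0], 4.0),   # stage 2: 2.0*1 + 1.0*4 = 6.0
]
print(cascade_classify(features_for, stages))  # True
```

The early-exit structure is the source of the power savings: most windows fail an early, cheap stage, so the full feature set is computed only for promising windows.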
- the microprocessor 216 and the main processor 220 of FIG. 2A. As illustrated in FIG. 2B, the microprocessor 216 includes an interface 246 for communications with the second microprocessor. Additionally or alternatively, the microprocessor 216 might track a position of the detected reference object over time (e.g., over multiple images) to determine movement for gesture recognition, risk of collision, danger, and/or other events, for example.
- microprocessor 216 can, in some implementations, still provide control signals to sensor array unit 212, line buffer(s) 230, etc., or, alternatively or additionally, such control signals may be provided by lower-power control logic.
- a cascade classifier may be run as a software algorithm on the microprocessor 216.
- other software algorithms may be run on the microprocessor in the place of the cascade classifier. For example, reference object detection may be performed using histograms, as described in FIG. 11C.
- one or more of the line buffer(s) 230, the ADC 234, the two-dimensional integration hardware 236, the hardware scanning window array 238, the CV computation hardware 242, the cascade classifier hardware 244, or any combination thereof may be considered peripheral circuitry, that is, circuitry that is peripheral to the sensor array unit 212 and may correspond to peripheral circuitry 214 of FIG. 2A. It is also understood that the various components just listed, or any combination thereof, may be implemented instead as in-pixel circuitry within the sensor array unit 212.
- peripheral circuitry 214 is coupled with a plurality of sensor cell outputs of a sensor array unit 212.
- the sensor array unit 212 and/or peripheral circuitry 214 include dedicated CV computation hardware to perform a feature detection computation using at least a subset of the plurality of sensor cell outputs, where the subset of the plurality of sensor cell outputs correspond to a region of the sensor array unit 212 (e.g., an image array) comprising neighboring sensor cells or pixels.
- the output of the peripheral circuitry 214 is based (at least partially) on the feature detection computation.
- the first processing unit 217 processes signals received from the one or more outputs of the smart image array to detect a reference occurrence.
- the first processing unit 217 then generates an event, indicating the reference occurrence, to be received by a second processing unit (e.g., the main processor 220 of FIG. 2A).
- an event for a second processing unit is generated, where the event is indicative of the reference occurrence.
- the term "event" describes information provided to a processing unit, indicative of a reference occurrence.
- the event is provided to a second processing unit.
- the event may simply include an indication that a reference occurrence has happened.
- the event may further include an indication of the type of reference occurrence that was detected.
- the event may be generated by the first processing unit and sent to the second processing unit. In some embodiments, there may be intervening circuitry between the first and second processing units.
- FIG. 5 shows a simplified illustration of the sensor array unit 212 of FIG. 2A.
- pixels 510 are arranged in rows and columns and placed in the focal plane of receiving optics to provide image capture. (For clarity, only a few pixels 510 in FIG. 5 have numerical labels.)
- features of the sensor array unit such as pixel size, aspect ratio, resolution, and the like can vary depending on desired functionality.
- the simplified illustration of FIG. 5 shows a 10x10 pixel array, but embodiments may have hundreds, thousands, or millions of pixels (or more).
- the sensor system 210 may be configured to detect one or more reference occurrences and generate one or more corresponding events while it is performing in a lower-power operation.
- the sensor system 210 may be incorporated into a mobile phone and configured to detect a reference occurrence when a sensed value for the single pixel 510 indicates a significant increase in an amount of light detected by the sensor system 210. Such a change in the amount of detected light may indicate that the mobile phone has been retrieved from a user's pocket or has been picked up from a table or nightstand.
- the sensor system 210 can determine, while in lower-power operation, that this reference occurrence happened and generate an event indicative of the reference occurrence for the main processor 220.
- the sensor system 210 can further activate dedicated CV computation hardware to enable higher-power operation to perform different types of CV operations, such as face detection and face recognition.
- while performing the lower-power operation, the sensor system 210 detects a reference occurrence.
- In one example in which the sensor system 210 is configured to perform an ALS function, the sensor system 210 generates at least one lower-power optical sensor reading, which may be used to detect the reference occurrence.
- a lower-power optical sensor reading may indicate a change in an amount of ambient light, and the sensor system 210 may detect a reference occurrence based on the lower-power optical sensor reading when a sensed level of light changes at a rate above a reference threshold, or changes color at a rate above a reference threshold.
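A rate-of-change check of the kind just described might look like this in software. This is only a sketch: the function name, the sample values, and the units (light level per second) are assumptions, not the patent's specification.

```python
def als_event(readings, dt, rate_threshold):
    """Flag a reference occurrence when ambient light changes faster than
    `rate_threshold` (light-level units per second) between samples taken
    `dt` seconds apart."""
    for prev, cur in zip(readings, readings[1:]):
        if abs(cur - prev) / dt > rate_threshold:
            return True
    return False

# Phone pulled from a pocket: dim, dim, then a sudden jump in sensed light.
print(als_event([2, 3, 80], dt=0.1, rate_threshold=200))  # True
```

A color-change variant would apply the same comparison per color channel instead of to a single intensity value.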
- the sensor system 210 can be configured to perform a MD function.
- the sensor system 210 configures the sensor array unit 212 to have a reduced resolution greater than a 2x2 pixel resolution, but less than a maximum resolution of pixels in the sensor array unit 212.
- the sensor system 210 is configured to detect relative changes in sensed light at different effective pixels.
- the sensor system 210 analyzes an amount of light sensed at each of the effective pixels (e.g., subgroups 610 as shown in FIG. 6) and determines a first set of differences between the amount of light sensed at each effective pixel relative to at least one other effective pixel.
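The MD scheme sketched above (effective pixels formed from subgroups, with inter-pixel differences compared across readings) could be approximated as follows. The block size, the threshold, and the use of a frame-mean reference are illustrative choices for the example, not the patent's exact method.

```python
import numpy as np

def effective_pixels(frame, block):
    """Reduce resolution by summing non-overlapping block x block subgroups."""
    h, w = frame.shape
    return frame.reshape(h // block, block, w // block, block).sum(axis=(1, 3))

def motion_detected(frame_a, frame_b, block=5, threshold=50):
    """Compare inter-effective-pixel differences across two readings; a large
    change in those differences suggests movement in the scene."""
    ea = effective_pixels(frame_a, block)
    eb = effective_pixels(frame_b, block)
    # Difference of each effective pixel relative to its frame's mean level,
    # so a uniform global lighting change alone does not trigger detection.
    da, db = ea - ea.mean(), eb - eb.mean()
    return np.abs(db - da).sum() > threshold

a = np.zeros((10, 10), dtype=int)
b = a.copy()
b[0:5, 0:5] = 4                 # a bright object enters one corner
print(motion_detected(a, b))    # True
```

With `block=5`, the 10x10 array above collapses to 2x2 effective pixels, matching the patent's idea of a reduced resolution greater than 2x2 but well below the full array.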
- the sensor system 210 provides a parameter for lower-power operation. In one embodiment, a higher-power operation may detect an object near the sensor system 210, and in some example systems may also determine an estimated distance to the object.
- the sensor system 210 may provide an event comprising a parameter to the lower-power operation indicating the presence of the object, or may also (or instead) provide a parameter indicating a distance to the object.
- a parameter may be employed by the lower-power operation to assist with or enhance a PD function.
- the PD function may be able to more accurately detect an object near the sensor based on the parameter, such as by establishing or adjusting a threshold intensity level.
- the sensor system 210 initiates a lower-power operation using the parameter.
- the sensor system 210 may initiate a lower-power operation as described above with respect to FIGS. 7 and 8.
- the lower-power operation, after initiation, is configured to use the parameter.
- a PD function may be able to more accurately detect an object near the sensor based on the parameter, such as by establishing or adjusting a threshold intensity level.
- the parameter may assist or enhance the lower-power operation, such as by assisting with an ALS function by providing information associated with a threshold for detecting changes in ambient lighting.
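As one hedged illustration of a higher-power operation handing a parameter down to a lower-power PD function, the sketch below adjusts a detection threshold from a hypothetical event dictionary. The field names `object_present` and `distance_m`, and the scaling rule, are invented for the example and do not appear in the patent.

```python
def tuned_pd_threshold(base, event):
    """Adjust a proximity-detection intensity threshold using a parameter
    passed down from a higher-power operation.

    `event` is a hypothetical parameter payload; when a nearby object was
    already detected, raise the threshold so only a genuinely closer
    approach triggers a new reference occurrence.
    """
    if event.get("object_present"):
        distance = max(event.get("distance_m", 1.0), 0.1)  # avoid divide-by-zero
        return base * (1.0 + 1.0 / distance)
    return base

print(tuned_pd_threshold(100.0, {"object_present": True, "distance_m": 0.5}))
print(tuned_pd_threshold(100.0, {}))  # no parameter: keep the base threshold
```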
- Some embodiments may repeatedly execute the method 900. For example, after performing a higher-power operation, the sensor system 210 may restart the method 900 and initiate a lower-power operation at block 910.
- FIG. 10A shows an example state diagram for computer-vision computations and lower-power optical sensor readings, which may be performed by the sensor system 210.
- FIG. 10A includes two states, a lower-power operation(s) state 1010 and a higher- power operation(s) state 1020.
- In a lower-power operation(s) state 1010, the sensor system 210 is configured to perform one or more lower-power operations and may obtain one or more sensor readings.
- In a higher-power operation(s) state 1020, the sensor system 210 is configured to perform one or more higher-power operations, such as computer-vision computations and operations, and may obtain one or more sensor readings.
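The two-state behavior of FIG. 10A can be summarized as a small state machine. This sketch is schematic only: the state names and the `cv_done` signal are assumptions standing in for whatever transition conditions a given embodiment uses.

```python
class SensorSystem:
    """Two-state sketch of FIG. 10A: stay in lower-power operation until a
    reference occurrence is detected, then switch to higher-power CV work."""

    def __init__(self):
        self.state = "lower_power"

    def step(self, reference_occurrence_detected, cv_done=False):
        if self.state == "lower_power" and reference_occurrence_detected:
            self.state = "higher_power"   # e.g., enable CV computation hardware
        elif self.state == "higher_power" and cv_done:
            self.state = "lower_power"    # fall back once the event is handled
        return self.state

s = SensorSystem()
print(s.step(False))                 # lower_power
print(s.step(True))                  # higher_power
print(s.step(False, cv_done=True))   # lower_power
```

A higher-power operation can pass a parameter back on the transition to the lower-power state, as in the PD and ALS examples described above.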
- FIG. 12 illustrates an embodiment of a mobile device 105, which can utilize the sensor system 210 as described above. It should be noted that FIG. 12 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. It can be noted that, in some instances, components illustrated by FIG. 12 can be localized to a single physical device and/or distributed among various networked devices, which may be disposed at different physical locations.
- the mobile device 105 can further include sensor(s) 1240.
- sensors can include, without limitation, one or more accelerometer(s), gyroscope(s), camera(s), magnetometer(s), altimeter(s), microphone(s), proximity sensor(s), light sensor(s), and the like.
- the sensor(s) 1240 may include the sensor system 210 of FIGS. 2A or 2B and/or similar electronic sensors.
- a first processor e.g., microprocessor 216 in FIGS. 2A or 2B
- a first processing unit of sensor(s) 1240 can determine, from one or more signals received from the one or more outputs of an image array (e.g., sensor array unit 212 of FIGS. 2A or 2B), that a face has been detected, and in response to the determination, generate a face-detection event, for a second processing unit (e.g., processing unit(s) 1210 of FIG. 12).
- an image array e.g., sensor array unit 212 of FIGS. 2A or 2B
- a second processing unit e.g., processing unit(s) 1210 of FIG. 12
- Embodiments of the mobile device may also include a Satellite Positioning System (SPS) receiver 1280 capable of receiving signals 1284 from one or more SPS satellites using an SPS antenna 1282.
- the SPS receiver 1280 can extract a position of the mobile device, using conventional techniques, from satellites of an SPS system, such as a global navigation satellite system (GNSS) (e.g., Global Positioning System (GPS)), Galileo, Glonass, Compass, Quasi-Zenith Satellite System (QZSS) over Japan, Indian Regional Navigational Satellite System (IRNSS) over India, Beidou over China, and/or the like.
- GNSS global navigation satellite system
- GPS Global Positioning System
- Galileo Galileo
- Glonass Glonass
- Compass Compass
- QZSS Quasi-Zenith Satellite System
- IRNSS Indian Regional Navigational Satellite System
- Beidou Beidou over China
- the mobile device 105 may further include and/or be in communication with a memory 1260.
- the memory 1260 can include, without limitation, local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”), and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
- RAM random access memory
- ROM read-only memory
- Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
- the memory 218 of FIG. 2A, which can include any of the memory types previously listed, may be included in the memory 1260 or may be distinct from memory 1260, depending on desired functionality.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Software Systems (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Biodiversity & Conservation Biology (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Data Mining & Analysis (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Image Analysis (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Geophysics And Detection Of Objects (AREA)
- Burglar Alarm Systems (AREA)
Priority Applications (8)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202511988764.0A CN121617133A (zh) | 2014-09-30 | 2015-09-28 | 使用基于事件的视觉传感器进行低功率始终接通脸部检测、跟踪、辨识及/或分析 |
| CN201580052384.4A CN107077601A (zh) | 2014-09-30 | 2015-09-28 | 使用基于事件的视觉传感器进行低功率始终接通脸部检测、跟踪、辨识及/或分析 |
| CA2959549A CA2959549C (en) | 2014-09-30 | 2015-09-28 | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
| KR1020177008531A KR102474008B1 (ko) | 2014-09-30 | 2015-09-28 | 이벤트들-기반 시각 센서를 이용한 저-전력 상시-온 얼굴 검출, 추적, 인식 및/또는 분석 |
| JP2017516988A JP6737777B2 (ja) | 2014-09-30 | 2015-09-28 | イベントベースのビジョンセンサを使用した低電力で常時オンの顔の検出、追跡、認識および/または分析 |
| BR112017006399-9A BR112017006399B1 (pt) | 2014-09-30 | 2015-09-28 | Detecção, rastreamento e/ou análise facial sempre ligado de baixo consumo de energia usando um sensor de visão baseado em eventos |
| KR1020227042166A KR102633876B1 (ko) | 2014-09-30 | 2015-09-28 | 이벤트들-기반 시각 센서를 이용한 저-전력 상시-온 얼굴 검출, 추적, 인식 및/또는 분석 |
| EP15781806.3A EP3201831B1 (en) | 2014-09-30 | 2015-09-28 | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
Applications Claiming Priority (8)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462057972P | 2014-09-30 | 2014-09-30 | |
| US201462058009P | 2014-09-30 | 2014-09-30 | |
| US201462057800P | 2014-09-30 | 2014-09-30 | |
| US62/058,009 | 2014-09-30 | ||
| US62/057,800 | 2014-09-30 | ||
| US62/057,972 | 2014-09-30 | ||
| US14/866,549 | 2015-09-25 | ||
| US14/866,549 US9554100B2 (en) | 2014-09-30 | 2015-09-25 | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016053886A1 true WO2016053886A1 (en) | 2016-04-07 |
Family
ID=55585883
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2015/052684 Ceased WO2016053886A1 (en) | 2014-09-30 | 2015-09-28 | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
Country Status (7)
| Country | Link |
|---|---|
| US (3) | US9554100B2 (en) |
| EP (1) | EP3201831B1 (en) |
| JP (2) | JP6737777B2 (en) |
| KR (2) | KR102633876B1 (en) |
| CN (2) | CN121617133A (en) |
| CA (1) | CA2959549C (en) |
| WO (1) | WO2016053886A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108600659A (zh) * | 2017-03-08 | 2018-09-28 | 三星电子株式会社 | 事件检测装置 |
| JP2020532806A (ja) * | 2017-05-25 | 2020-11-12 | サムスン エレクトロニクス カンパニー リミテッド | 危険状況を感知する方法及びそのシステム |
| US12134316B2 (en) | 2019-03-27 | 2024-11-05 | Sony Group Corporation | State detection device, state detection system, and state detection method |
Families Citing this family (71)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10979674B2 (en) * | 2013-07-22 | 2021-04-13 | Intellivision | Cloud-based segregated video storage and retrieval for improved network scalability and throughput |
| US11601620B2 (en) * | 2013-07-22 | 2023-03-07 | Intellivision Technologies Corp. | Cloud-based segregated video storage and retrieval for improved network scalability and throughput |
| US9554100B2 (en) * | 2014-09-30 | 2017-01-24 | Qualcomm Incorporated | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
| US20170132466A1 (en) * | 2014-09-30 | 2017-05-11 | Qualcomm Incorporated | Low-power iris scan initialization |
| US9940533B2 (en) | 2014-09-30 | 2018-04-10 | Qualcomm Incorporated | Scanning window for isolating pixel values in hardware for computer vision operations |
| US9838635B2 (en) | 2014-09-30 | 2017-12-05 | Qualcomm Incorporated | Feature computation in a sensor element array |
| US10515284B2 (en) | 2014-09-30 | 2019-12-24 | Qualcomm Incorporated | Single-processor computer vision hardware control and application execution |
| KR101751020B1 (ko) * | 2015-12-08 | 2017-07-11 | Korea Institute of Ocean Science and Technology | Method and apparatus for continuous detection of hazardous and noxious substances based on multiple satellites |
| US10062151B2 (en) * | 2016-01-21 | 2018-08-28 | Samsung Electronics Co., Ltd. | Image deblurring method and apparatus |
| CN105759935B (zh) * | 2016-01-29 | 2019-01-18 | Huawei Technologies Co., Ltd. | Terminal control method and terminal |
| KR102586962B1 (ko) * | 2016-04-07 | 2023-10-10 | Hanwha Vision Co., Ltd. | Surveillance system and control method thereof |
| KR101904453B1 (ko) * | 2016-05-25 | 2018-10-04 | Kim Seon Pil | Method for operating an artificial intelligence transparent display, and artificial intelligence transparent display |
| US9977994B2 (en) | 2016-06-30 | 2018-05-22 | Apple Inc. | Configurable histogram-of-oriented gradients (HOG) processor |
| US10375317B2 (en) | 2016-07-07 | 2019-08-06 | Qualcomm Incorporated | Low complexity auto-exposure control for computer vision and imaging systems |
| KR102720734B1 (ko) * | 2016-08-02 | 2024-10-22 | Samsung Electronics Co., Ltd. | Event signal processing method and apparatus |
| US11507389B2 (en) | 2016-09-29 | 2022-11-22 | Hewlett-Packard Development Company, L.P. | Adjusting settings on computing devices based on location |
| US10984235B2 (en) | 2016-12-16 | 2021-04-20 | Qualcomm Incorporated | Low power data generation for iris-related detection and authentication |
| US10614332B2 (en) | 2016-12-16 | 2020-04-07 | Qualcomm Incorporated | Light source modulation for iris size adjustment |
| US10860841B2 (en) * | 2016-12-29 | 2020-12-08 | Samsung Electronics Co., Ltd. | Facial expression image processing method and apparatus |
| CN110168565B (zh) * | 2017-01-23 | 2024-01-05 | Qualcomm Incorporated | Low-power iris scan initialization |
| EP4307242A1 (en) * | 2017-01-23 | 2024-01-17 | QUALCOMM Incorporated | Single-processor computer vision hardware control and application execution |
| US10516841B2 (en) | 2017-03-08 | 2019-12-24 | Samsung Electronics Co., Ltd. | Pixel, pixel driving circuit, and vision sensor including the same |
| US10237481B2 (en) * | 2017-04-18 | 2019-03-19 | Facebook Technologies, Llc | Event camera for generation of event-based images |
| EP3393122A1 (en) * | 2017-04-18 | 2018-10-24 | Oculus VR, LLC | Event camera |
| CN107358175B (zh) * | 2017-06-26 | 2020-11-24 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Iris acquisition method and electronic device |
| US10466779B1 (en) * | 2017-07-24 | 2019-11-05 | Facebook Technologies, Llc | Event camera for eye tracking |
| KR20190017280A (ko) * | 2017-08-10 | 2019-02-20 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
| US10838505B2 (en) | 2017-08-25 | 2020-11-17 | Qualcomm Incorporated | System and method for gesture recognition |
| KR102320310B1 (ko) * | 2017-09-11 | 2021-11-02 | Korea Electronics Technology Institute | Binary-tree-based high-speed multi-class classification circuit device |
| US10484530B2 (en) * | 2017-11-07 | 2019-11-19 | Google Llc | Sensor based component activation |
| WO2019099337A1 (en) | 2017-11-14 | 2019-05-23 | Kaban Technologies Llc | Event camera-based deformable object tracking |
| CN107908289B (zh) * | 2017-12-01 | 2021-06-11 | Ningbo Hi-tech Zone Jinzhong Information Technology Co., Ltd. | Head-based robot face recognition interaction system |
| JP2019134271A (ja) * | 2018-01-31 | 2019-08-08 | Sony Semiconductor Solutions Corporation | Solid-state imaging element, imaging device, and method for controlling a solid-state imaging element |
| KR102731683B1 (ko) | 2018-03-08 | 2024-11-19 | Samsung Electronics Co., Ltd. | Electronic device including an interface connected to an image sensor and an interface connected between a plurality of processors |
| US11176490B2 (en) * | 2018-03-09 | 2021-11-16 | Qualcomm Incorporated | Accumulate across stages in machine learning object detection |
| KR102138657B1 (ko) * | 2018-04-12 | 2020-07-28 | Gachon University Industry-Academic Cooperation Foundation | Apparatus and method for robust face recognition via hierarchical collaborative representation-based classification |
| US10691926B2 (en) | 2018-05-03 | 2020-06-23 | Analog Devices, Inc. | Single-pixel sensor |
| CN110517034A (zh) * | 2018-05-22 | 2019-11-29 | Vivo Mobile Communication Co., Ltd. | Object recognition method and mobile terminal |
| JP7100826B2 (ja) * | 2018-11-22 | 2022-07-14 | Omron Corporation | Detection device |
| WO2020120782A1 (en) * | 2018-12-13 | 2020-06-18 | Prophesee | Method of tracking objects in a scene |
| TWI682329B (zh) * | 2018-12-14 | 2020-01-11 | Giga-Byte Technology Co., Ltd. | Face recognition method, device, and computer-readable medium |
| US10832543B2 (en) | 2019-01-11 | 2020-11-10 | The Chamberlain Group, Inc. | Activity sensor |
| CN110191319A (zh) * | 2019-05-28 | 2019-08-30 | Inventec Technology Co., Ltd. | Power supply and image processing system, and method thereof |
| CN110276314A (zh) * | 2019-06-26 | 2019-09-24 | Suzhou Wandianzhang Network Technology Co., Ltd. | Face recognition method and face recognition camera |
| CN112311964B (zh) | 2019-07-26 | 2022-06-07 | Huawei Technologies Co., Ltd. | Pixel acquisition circuit, dynamic vision sensor, and image acquisition device |
| US11588987B2 (en) * | 2019-10-02 | 2023-02-21 | Sensors Unlimited, Inc. | Neuromorphic vision with frame-rate imaging for target detection and tracking |
| CN114341924A (zh) | 2019-12-17 | 2022-04-12 | Google LLC | Low-power visual sensing |
| US11340696B2 (en) * | 2020-01-13 | 2022-05-24 | Sony Interactive Entertainment Inc. | Event driven sensor (EDS) tracking of light emitting diode (LED) array |
| US11233956B2 (en) | 2020-03-31 | 2022-01-25 | Western Digital Technologies, Inc. | Sensor system with low power sensor devices and high power sensor devices |
| US11620476B2 (en) * | 2020-05-14 | 2023-04-04 | Micron Technology, Inc. | Methods and apparatus for performing analytics on image data |
| EP4165543A1 (en) * | 2020-06-11 | 2023-04-19 | Telefonaktiebolaget LM ERICSSON (PUBL) | Authorization using an optical sensor |
| WO2021263193A1 (en) * | 2020-06-27 | 2021-12-30 | Unicorn Labs Llc | Smart sensor |
| KR102883594B1 (ko) | 2020-08-07 | 2025-11-11 | Samsung Electronics Co., Ltd. | Processor for object detection in standby mode, electronic device including the same, and operating method thereof |
| JP7524681B2 (ja) * | 2020-08-31 | 2024-07-30 | Ricoh Co., Ltd. | Image processing device, image processing method, and program |
| WO2022081142A1 (en) | 2020-10-13 | 2022-04-21 | Google Llc | Distributed sensor data processing using multiple classifiers on multiple devices |
| US12057126B2 (en) | 2020-10-13 | 2024-08-06 | Google Llc | Distributed sensor data processing using multiple classifiers on multiple devices |
| US12238413B2 (en) | 2020-11-19 | 2025-02-25 | Telefonaktiebolaget Lm Ericsson (Publ) | Low-power always-on image sensor and pattern recognizer |
| CN112417184B (zh) * | 2020-11-23 | 2021-05-25 | Shanghai Dianze Intelligent Technology Co., Ltd. | Multi-scene feature information storage structure, and comparison method, device, and storage medium therefor |
| CN116530092B (zh) * | 2020-12-31 | 2026-03-17 | Huawei Technologies Co., Ltd. | Vision sensor chip, method for operating a vision sensor chip, and device |
| US12432466B2 (en) * | 2021-04-19 | 2025-09-30 | Chengdu SynSense Technology Co., Ltd. | Event-driven integrated circuit having interface system |
| JP7706936B2 (ja) * | 2021-05-28 | 2025-07-14 | Canon Inc. | Information processing device, control method of information processing device, and program |
| EP4102256A1 (en) * | 2021-06-11 | 2022-12-14 | Infineon Technologies AG | Sensor devices, electronic devices, method for performing object detection by a sensor device and method for performing object detection by an electronic device |
| US20250209770A1 (en) * | 2022-03-29 | 2025-06-26 | Sony Interactive Entertainment Inc. | Computer system, method, and program |
| US12001261B2 (en) * | 2022-06-27 | 2024-06-04 | Qualcomm Incorporated | Power optimization for smartwatch |
| US11756274B1 (en) * | 2022-07-07 | 2023-09-12 | Snap Inc. | Low-power architecture for augmented reality device |
| US12436594B2 (en) | 2022-09-06 | 2025-10-07 | Apple Inc. | Predictive display power control systems and methods |
| US12436593B2 (en) | 2022-09-06 | 2025-10-07 | Apple Inc. | Sensor-based display power control systems and methods |
| CN115767261A (zh) * | 2022-10-11 | 2023-03-07 | Zhejiang Dahua Technology Co., Ltd. | Camera system and camera system control method |
| CN116301363B (zh) * | 2023-02-27 | 2024-02-27 | Honor Device Co., Ltd. | Air gesture recognition method, electronic device, and storage medium |
| US20240406550A1 (en) * | 2023-05-31 | 2024-12-05 | Microsoft Technology Licensing, Llc | Front-end image preprocessing |
| US12474760B2 (en) | 2024-03-08 | 2025-11-18 | Htc Corporation | Face tracking device, system, and method |
Family Cites Families (83)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4641349A (en) | 1985-02-20 | 1987-02-03 | Leonard Flom | Iris recognition system |
| US5324958A (en) | 1991-02-19 | 1994-06-28 | Synaptics, Incorporated | Integrating imaging system having wide dynamic range with sample/hold circuits |
| US5543590A (en) | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature |
| US5877897A (en) | 1993-02-26 | 1999-03-02 | Donnelly Corporation | Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array |
| US6822563B2 (en) | 1997-09-22 | 2004-11-23 | Donnelly Corporation | Vehicle imaging system with accessory control |
| JPH0799646A (ja) | 1993-05-31 | 1995-04-11 | Sony Corp | Hierarchical coding and decoding apparatus for digital image signals |
| JP3212874B2 (ja) | 1996-04-19 | 2001-09-25 | NEC Corporation | Bolometer-type infrared imaging device |
| JP2002516045A (ja) | 1996-11-08 | 2002-05-28 | National Computer Systems, Inc. | Optical scanning with calibrated pixel output |
| US20020012459A1 (en) | 2000-06-22 | 2002-01-31 | Chips Brain Co. Ltd. | Method and apparatus for detecting stereo disparity in sequential parallel processing mode |
| US7154548B2 (en) | 2001-01-29 | 2006-12-26 | Valley Oak Semiconductor | Multiplexed and pipelined column buffer for use with an array of photo sensors |
| US7151844B2 (en) | 2001-12-06 | 2006-12-19 | General Motors Corporation | Image sensor method and apparatus having hardware implemented edge detection processing |
| EP1593258B1 (en) | 2003-02-03 | 2009-04-22 | Goodrich Corporation | Random access imaging sensor |
| US8902971B2 (en) | 2004-07-30 | 2014-12-02 | Euclid Discoveries, Llc | Video compression repository and model reuse |
| US7377643B1 (en) | 2004-08-13 | 2008-05-27 | Q Step Technologies, Inc. | Method and apparatus for eye imaging with position registration and constant pupil size |
| US7038185B1 (en) | 2004-12-01 | 2006-05-02 | Mitsubishi Electric Research Laboratories, Inc. | Camera for directly generating a gradient image |
| US7744216B1 (en) | 2006-01-06 | 2010-06-29 | Lockheed Martin Corporation | Display system intensity adjustment based on pupil dilation |
| DE102006023611A1 (de) | 2006-05-19 | 2007-11-22 | Siemens Ag | Method and device for pixel-synchronous image evaluation of a camera-based system |
| CN100468385C (zh) * | 2006-06-30 | 2009-03-11 | Datang Mobile Communications Equipment Co., Ltd. | Coprocessor and method for serial signal processing |
| JP2008109477A (ja) * | 2006-10-26 | 2008-05-08 | Fuji Electric Holdings Co Ltd | Image generation device and image generation method |
| US20100226495A1 (en) | 2007-10-29 | 2010-09-09 | Michael Kelly | Digital readout method and apparatus |
| WO2008060124A2 (en) * | 2006-11-17 | 2008-05-22 | Silicon Communications Technology Co., Ltd. | Low power image sensor adjusting reference voltage automatically and optical pointing device comprising the same |
| JP2008131407A (ja) | 2006-11-22 | 2008-06-05 | Matsushita Electric Ind Co Ltd | Solid-state imaging element and imaging device using the same |
| AT504582B1 (de) | 2006-11-23 | 2008-12-15 | Arc Austrian Res Centers Gmbh | Method for generating an image in electronic form, picture element for an image sensor for generating an image, and image sensor |
| US20090020612A1 (en) | 2007-06-28 | 2009-01-22 | Symbol Technologies, Inc. | Imaging dual window scanner with presentation scanning |
| US8031970B2 (en) * | 2007-08-27 | 2011-10-04 | Arcsoft, Inc. | Method of restoring closed-eye portrait photo |
| US9117119B2 (en) | 2007-09-01 | 2015-08-25 | Eyelock, Inc. | Mobile identity platform |
| JP4948379B2 (ja) | 2007-12-18 | 2012-06-06 | Canon Inc. | Pattern classifier generation method, information processing device, program, and storage medium |
| US8462996B2 (en) | 2008-05-19 | 2013-06-11 | Videomining Corporation | Method and system for measuring human response to visual stimulus based on changes in facial expression |
| US8213782B2 (en) | 2008-08-07 | 2012-07-03 | Honeywell International Inc. | Predictive autofocusing system |
| DE102008052930B4 (de) | 2008-10-23 | 2011-04-07 | Leuze Electronic Gmbh & Co Kg | Image-processing sensor |
| CN101754389A (zh) * | 2008-12-03 | 2010-06-23 | Datang Mobile Communications Equipment Co., Ltd. | Resource control method, apparatus, and system |
| FR2939919A1 (fr) | 2008-12-16 | 2010-06-18 | New Imaging Technologies Sas | Matrix sensor |
| US8886206B2 (en) | 2009-05-01 | 2014-11-11 | Digimarc Corporation | Methods and systems for content processing |
| US20100295782A1 (en) | 2009-05-21 | 2010-11-25 | Yehuda Binder | System and method for control based on face or hand gesture detection |
| GB2471647B (en) * | 2009-06-16 | 2016-03-23 | Aptina Imaging Corp | Use of Z-order data in an image sensor |
| JP5721994B2 (ja) | 2009-11-27 | 2015-05-20 | Japan Display Inc. | Radiation imaging device |
| US8947448B2 (en) | 2009-12-24 | 2015-02-03 | Sony Corporation | Image processing device, image data generation device, image processing method, image data generation method, and data structure of image file |
| WO2011090225A1 (ko) | 2010-01-22 | 2011-07-28 | IriTech, Inc. | Iris recognition apparatus and method using multiple iris images having different iris sizes |
| SG185500A1 (en) | 2010-05-12 | 2012-12-28 | Pelican Imaging Corp | Architectures for imager arrays and array cameras |
| WO2012093381A1 (en) | 2011-01-03 | 2012-07-12 | Vitaly Sheraizin | Camera assembly with an integrated content analyzer |
| US8805018B2 (en) | 2011-04-11 | 2014-08-12 | Intel Corporation | Method of detecting facial attributes |
| JP2013003787A (ja) | 2011-06-15 | 2013-01-07 | Panasonic Corp | Object detection device |
| US9444547B2 (en) | 2011-07-26 | 2016-09-13 | Abl Ip Holding Llc | Self-identifying one-way authentication method using optical signals |
| JP5789751B2 (ja) | 2011-08-11 | 2015-10-07 | Panasonic IP Management Co., Ltd. | Feature extraction device, feature extraction method, feature extraction program, and image processing device |
| KR101824413B1 (ko) | 2011-08-30 | 2018-02-02 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling the operation mode of a portable terminal |
| US9299036B2 (en) | 2011-08-31 | 2016-03-29 | Striiv, Inc. | Life pattern detection |
| JP5983616B2 (ja) | 2011-09-13 | 2016-09-06 | NEC Corporation | Imaging device, imaging method, and program |
| US8748828B2 (en) | 2011-09-21 | 2014-06-10 | Kla-Tencor Corporation | Interposer based imaging sensor for high-speed image acquisition and inspection systems |
| US9824296B2 (en) | 2011-11-10 | 2017-11-21 | Canon Kabushiki Kaisha | Event detection apparatus and event detection method |
| JP6143190B2 (ja) | 2011-11-16 | 2017-06-07 | Shizuoka University | Ramp signal generation circuit and CMOS image sensor |
| CN103135889B (zh) * | 2011-12-05 | 2017-06-23 | LG Electronics Inc. | Mobile terminal and 3D image control method thereof |
| FR2985065B1 (fr) | 2011-12-21 | 2014-01-10 | Univ Paris Curie | Method for estimating optical flow from an asynchronous light sensor |
| CN102663409B (zh) | 2012-02-28 | 2015-04-22 | Xidian University | Pedestrian tracking method based on HOG-LBP description |
| JP5655806B2 (ja) * | 2012-03-23 | 2015-01-21 | Yokogawa Electric Corporation | Synchronization device and field instrument |
| EP2665257B1 (en) | 2012-05-16 | 2014-09-10 | Harvest Imaging bvba | Image sensor and method for power efficient readout of sub-picture |
| US9332239B2 (en) | 2012-05-31 | 2016-05-03 | Apple Inc. | Systems and methods for RGB image processing |
| KR101896666B1 (ko) * | 2012-07-05 | 2018-09-07 | Samsung Electronics Co., Ltd. | Image sensor chip, operating method thereof, and system including the same |
| US9389229B2 (en) | 2012-07-18 | 2016-07-12 | Theranos, Inc. | Methods for detecting and measuring aggregation |
| EP2709066A1 (en) | 2012-09-17 | 2014-03-19 | Lakeside Labs GmbH | Concept for detecting a motion of a moving object |
| US9936132B2 (en) * | 2012-10-26 | 2018-04-03 | The Regents Of The University Of Michigan | CMOS image sensors with feature extraction |
| US9081571B2 (en) | 2012-11-29 | 2015-07-14 | Amazon Technologies, Inc. | Gesture detection management for an electronic device |
| US20140169663A1 (en) | 2012-12-19 | 2014-06-19 | Futurewei Technologies, Inc. | System and Method for Video Detection and Tracking |
| US9760966B2 (en) | 2013-01-08 | 2017-09-12 | Nvidia Corporation | Parallel processor with integrated correlation and convolution engine |
| US10373470B2 (en) | 2013-04-29 | 2019-08-06 | Intelliview Technologies, Inc. | Object detection |
| US10694106B2 (en) | 2013-06-14 | 2020-06-23 | Qualcomm Incorporated | Computer vision application processing |
| KR102081087B1 (ko) | 2013-06-17 | 2020-02-25 | Samsung Electronics Co., Ltd. | Image registration device and image sensor for synchronous and asynchronous images |
| US20140368423A1 (en) | 2013-06-17 | 2014-12-18 | Nvidia Corporation | Method and system for low power gesture recognition for waking up mobile devices |
| US20150036942A1 (en) | 2013-07-31 | 2015-02-05 | Lsi Corporation | Object recognition and tracking using a classifier comprising cascaded stages of multiple decision trees |
| US20150311977A1 (en) | 2013-12-16 | 2015-10-29 | Qualcomm Incorporated | Methods and apparatus for configuring an image sensor for decoding high frequency visible light communication signals |
| KR101569268B1 (ko) | 2014-01-02 | 2015-11-13 | IriTech, Inc. | Apparatus and method for acquiring images for iris recognition using facial component distances |
| WO2015131045A1 (en) | 2014-02-28 | 2015-09-03 | The Board Of Trustees Of The Leland Stanford Junior University | Imaging providing ratio pixel intensity |
| US10075234B2 (en) | 2014-03-25 | 2018-09-11 | Osram Sylvania Inc. | Techniques for emitting position information from luminaires |
| US9983663B2 (en) | 2014-05-16 | 2018-05-29 | Qualcomm Incorporated | Imaging arrangement for object motion detection and characterization |
| US20170091550A1 (en) | 2014-07-15 | 2017-03-30 | Qualcomm Incorporated | Multispectral eye analysis for identity authentication |
| US9554100B2 (en) | 2014-09-30 | 2017-01-24 | Qualcomm Incorporated | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor |
| US9838635B2 (en) | 2014-09-30 | 2017-12-05 | Qualcomm Incorporated | Feature computation in a sensor element array |
| US9762834B2 (en) | 2014-09-30 | 2017-09-12 | Qualcomm Incorporated | Configurable hardware for computing computer vision features |
| US9940533B2 (en) | 2014-09-30 | 2018-04-10 | Qualcomm Incorporated | Scanning window for isolating pixel values in hardware for computer vision operations |
| US20170132466A1 (en) | 2014-09-30 | 2017-05-11 | Qualcomm Incorporated | Low-power iris scan initialization |
| US10515284B2 (en) | 2014-09-30 | 2019-12-24 | Qualcomm Incorporated | Single-processor computer vision hardware control and application execution |
| US9767358B2 (en) | 2014-10-22 | 2017-09-19 | Veridium Ip Limited | Systems and methods for performing iris identification and verification using mobile devices |
| US20160275348A1 (en) | 2015-03-17 | 2016-09-22 | Motorola Mobility Llc | Low-power iris authentication alignment |
| US20160283789A1 (en) | 2015-03-25 | 2016-09-29 | Motorola Mobility Llc | Power-saving illumination for iris authentication |
- 2015
- 2015-09-25 US US14/866,549 patent/US9554100B2/en active Active
- 2015-09-28 CA CA2959549A patent/CA2959549C/en active Active
- 2015-09-28 KR KR1020227042166A patent/KR102633876B1/ko active Active
- 2015-09-28 WO PCT/US2015/052684 patent/WO2016053886A1/en not_active Ceased
- 2015-09-28 JP JP2017516988A patent/JP6737777B2/ja active Active
- 2015-09-28 EP EP15781806.3A patent/EP3201831B1/en active Active
- 2015-09-28 KR KR1020177008531A patent/KR102474008B1/ko active Active
- 2015-09-28 CN CN202511988764.0A patent/CN121617133A/zh active Pending
- 2015-09-28 CN CN201580052384.4A patent/CN107077601A/zh active Pending
- 2017
- 2017-01-03 US US15/397,566 patent/US9870506B2/en active Active
- 2017-01-06 US US15/400,914 patent/US9986211B2/en active Active
- 2020
- 2020-05-14 JP JP2020085260A patent/JP7009553B2/ja active Active
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130229508A1 (en) * | 2012-03-01 | 2013-09-05 | Qualcomm Incorporated | Gesture Detection Based on Information from Multiple Types of Sensors |
Non-Patent Citations (9)
| Title |
|---|
| "Myriad 2 Vision Processor Bringing Computational Imaging and Visual Awareness to Mobile, Wearable, and Embedded Markets Product Brief", 1 August 2014 (2014-08-01), XP055230944, Retrieved from the Internet <URL:http://uploads.movidius.com/1441734401-Myriad-2-product-brief.pdf> [retrieved on 20151124] * |
| ANONYMOUS: "opencv - Haar Cascades vs. LBP Cascades in Face Detection - Stack Overflow", STACK OVERFLOW, 9 January 2012 (2012-01-09), XP055230385, Retrieved from the Internet <URL:http://stackoverflow.com/questions/8791178/haar-cascades-vs-lbp-cascades-in-face-detection> [retrieved on 20151123] * |
| DAVID MOLONEY ET AL: "Myriad 2: Eye of the Computational Vision Storm", HOT CHIPS, 12 August 2014 (2014-08-12), XP055230946, Retrieved from the Internet <URL:http://www.hotchips.org/wp-content/uploads/hc_archives/hc26/HC26-12-day2-epub/HC26.12-6-HP-ASICs-epub/HC26.12.620-Myriad2-Eye-Moloney-Movidius-provided.pdf> [retrieved on 20151124] * |
| JORGE FERNÁNDEZ-BERNI ET AL: "Bottom-up performance analysis of focal-plane mixed-signal hardware for Viola-Jones early vision tasks", INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, vol. 43, no. 8, 16 April 2014 (2014-04-16), pages 1063 - 1079, XP055230917, ISSN: 0098-9886, DOI: 10.1002/cta.1996 * |
| JUNGUK CHO ET AL: "Fpga-based face detection system using Haar classifiers", FIELD PROGRAMMABLE GATE ARRAYS; 20090222 - 20090224, 22 February 2009 (2009-02-22), pages 103 - 112, XP058022870, ISBN: 978-1-60558-410-2, DOI: 10.1145/1508128.1508144 * |
| LAHDENOJA O ET AL: "Extracting Local Binary Patterns with MIPA4k vision processor", CELLULAR NANOSCALE NETWORKS AND THEIR APPLICATIONS (CNNA), 2010 12TH INTERNATIONAL WORKSHOP ON, IEEE, PISCATAWAY, NJ, USA, 3 February 2010 (2010-02-03), pages 1 - 5, XP031648245, ISBN: 978-1-4244-6679-5 * |
| MANUEL SUAREZ ET AL: "CMOS-3D Smart Imager Architectures for Feature Detection", IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS, IEEE, PISCATAWAY, NJ, USA, vol. 2, no. 4, 1 December 2012 (2012-12-01), pages 723 - 736, XP011479510, ISSN: 2156-3357, DOI: 10.1109/JETCAS.2012.2223552 * |
| TOBI DELBRUCK ET AL: "Activity-driven, event-based vision sensors", IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS. ISCAS 2010 - 30 MAY-2 JUNE 2010 - PARIS, FRANCE, IEEE, US, 30 May 2010 (2010-05-30), pages 2426 - 2429, XP031724396, ISBN: 978-1-4244-5308-5 * |
| YANG MINHAO ET AL: "Comparison of spike encoding schemes in asynchronous vision sensors: Modeling and design", 2014 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), IEEE, 1 June 2014 (2014-06-01), pages 2632 - 2635, XP032624686, DOI: 10.1109/ISCAS.2014.6865713 * |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108600659A (zh) * | 2017-03-08 | 2018-09-28 | Samsung Electronics Co., Ltd. | Event detection device |
| CN108600659B (zh) * | 2017-03-08 | 2022-04-08 | Samsung Electronics Co., Ltd. | Event detection device |
| US12250467B2 (en) | 2017-03-08 | 2025-03-11 | Samsung Electronics Co., Ltd. | Event detecting device including an event signal generator and an output signal generator configured to generate an output signal by combining event signals |
| JP2020532806A (ja) * | 2017-05-25 | 2020-11-12 | Samsung Electronics Co., Ltd. | Method and system for detecting a dangerous situation |
| US11080891B2 (en) | 2017-05-25 | 2021-08-03 | Samsung Electronics Co., Ltd. | Method and system for detecting dangerous situation |
| JP7102510B2 (ja) | 2017-05-25 | 2022-07-19 | Samsung Electronics Co., Ltd. | Method and system for detecting a dangerous situation |
| US12134316B2 (en) | 2019-03-27 | 2024-11-05 | Sony Group Corporation | State detection device, state detection system, and state detection method |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20230001014A (ko) | 2023-01-03 |
| JP6737777B2 (ja) | 2020-08-12 |
| JP2020145714A (ja) | 2020-09-10 |
| EP3201831C0 (en) | 2024-05-01 |
| CA2959549A1 (en) | 2016-04-07 |
| US20160094814A1 (en) | 2016-03-31 |
| CN121617133A (zh) | 2026-03-06 |
| US9986211B2 (en) | 2018-05-29 |
| BR112017006399A2 (pt) | 2017-12-19 |
| US9870506B2 (en) | 2018-01-16 |
| JP7009553B2 (ja) | 2022-01-25 |
| KR102633876B1 (ko) | 2024-02-20 |
| KR20170063643A (ko) | 2017-06-08 |
| US20170374322A1 (en) | 2017-12-28 |
| KR102474008B1 (ko) | 2022-12-02 |
| EP3201831B1 (en) | 2024-05-01 |
| CA2959549C (en) | 2023-03-14 |
| US20170116478A1 (en) | 2017-04-27 |
| JP2018501531A (ja) | 2018-01-18 |
| EP3201831A1 (en) | 2017-08-09 |
| CN107077601A (zh) | 2017-08-18 |
| US9554100B2 (en) | 2017-01-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9870506B2 (en) | Low-power always-on face detection, tracking, recognition and/or analysis using events-based vision sensor | |
| US11068712B2 (en) | Low-power iris scan initialization | |
| US10515284B2 (en) | Single-processor computer vision hardware control and application execution | |
| US10984235B2 (en) | Low power data generation for iris-related detection and authentication | |
| US10614332B2 (en) | Light source modulation for iris size adjustment | |
| US20180173933A1 (en) | User authentication using iris sector | |
| TWI763769B (zh) | Apparatus and method for single-processor computer vision hardware control and application execution | |
| TWI775801B (zh) | Low-power iris scan initialization | |
| BR112017006399B1 (pt) | Low-power always-on face detection, tracking and/or analysis using an event-based vision sensor | |
| HK40006351A (en) | Single-processor computer vision hardware control and application execution | |
| HK40006351B (en) | Single-processor computer vision hardware control and application execution |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15781806; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2959549; Country of ref document: CA |
| | ENP | Entry into the national phase | Ref document number: 20177008531; Country of ref document: KR; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2017516988; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112017006399 |
| | REEP | Request for entry into the european phase | Ref document number: 2015781806; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 112017006399; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20170328 |