US20190306441A1 - Method And Apparatus Of Adaptive Infrared Projection Control - Google Patents
- Publication number: US20190306441A1
- Application number: US 16/354,552
- Authority: United States
- Prior art keywords: roi, image, projector, light, projecting
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N13/239 — image signal generators using two 2D stereoscopic image sensors having a relative position equal to or related to the interocular distance
- H04N13/243 — image signal generators using three or more 2D image sensors
- H04N13/254 — stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
- H04N13/271 — image signal generators wherein the generated image signals comprise depth maps or disparity maps
- H04N13/296 — synchronisation or control of image signal generators
- H04N23/45 — generating image signals from two or more image sensors of different type or operating in different modes
- H04N23/56 — cameras or camera modules provided with illuminating means
- H04N23/611 — control of cameras or camera modules based on recognised objects including parts of the human body
- H04N23/74 — compensating brightness variation in the scene using illuminating means
- H04N23/80 — camera processing pipelines; components thereof
- H04N9/3141 — projection devices for colour picture display; constructional details thereof
- H04N9/3155 — modulator illumination systems for controlling the light source
- H04N5/33 — transforming infrared radiation
- G06T7/50 — image analysis; depth or shape recovery
- G06V10/141 — control of illumination for image acquisition
- G06V10/25 — determination of region of interest [ROI] or a volume of interest [VOI]
- G06V40/161 — human faces: detection; localisation; normalisation
- G06V40/168 — human faces: feature extraction; face representation
- G06V40/19 — eye characteristics: sensors therefor
- G06K9/00268; G06K9/00604; G06K9/3233; H04N5/2256; H04N5/23229
Description
- the present disclosure is generally related to computer vision and, more particularly, to adaptive infrared (IR) projection control for depth estimation in computer vision.
- Depth estimation in computer vision utilizes images of a given scene captured by two cameras to triangulate and estimate distances. Typically, depth estimation can be achieved with either passive stereo vision or active stereo vision.
- An objective of the present disclosure is to propose schemes, solutions, concepts, designs, methods and apparatuses that enable adaptive IR projection control. It is believed that reduction in power consumption and IR projection time, as well as improved eye safety, may be achieved by implementing various proposed schemes in accordance with the present disclosure.
- a method may involve receiving data of an image based on sensing by one or more image sensors.
- the method may also involve detecting a region of interest (ROI) in the image.
- the method may further involve adaptively controlling a light projector with respect to projecting light toward the ROI.
- an apparatus may include a processor or control circuit which, during operation, may perform operations including: (a) receiving data of an image based on sensing by one or more image sensors; (b) detecting a region of interest (ROI) in the image; and (c) adaptively controlling a light projector with respect to projecting light toward the ROI.
- FIG. 1 is a diagram of an example scenario of IR projection for depth estimation.
- FIG. 2 is a diagram of an example scenario in accordance with an implementation of the present disclosure.
- FIG. 3 is a diagram of an example apparatus in accordance with an implementation of the present disclosure.
- FIG. 4 is a flowchart of an example process in accordance with an implementation of the present disclosure.
- FIG. 1 illustrates an example scenario 100 of IR projection for depth estimation.
- Part (A) of FIG. 1 shows an example module 110 of an apparatus (e.g., smartphone) that may be utilized for active stereo vision.
- Module 110 may include two IR cameras, one red-green-blue (RGB) camera, one IR projector and one IR LED.
- the IR LED may emit IR light as floodlight to illuminate a scene
- the IR projector may project a structured IR light toward the scene
- each of the two IR cameras may capture a respective IR image of the scene
- the RGB camera may capture an RGB image of the scene.
- Part (B) of FIG. 1 shows an example module 120 of an apparatus (e.g., smartphone) that may be utilized for passive stereo vision.
- the IR LED may emit IR light as floodlight to illuminate a scene
- the IR projector may project a structured IR light toward the scene
- the single IR camera may capture an IR image of the scene
- the RGB camera may capture an RGB image of the scene.
- Part (C) of FIG. 1 shows an example of IR sensor readout timing (e.g., with respect to the IR cameras of module 110 and module 120 ).
- Part (C) pertains to a rolling-shutter camera (e.g., an IR camera and/or an RGB camera in module 110/120).
- Each of module 110 and module 120 may support either or both of type I and type II of IR exposure.
- under type I of IR exposure, the IR projector in module 110/120 may be enabled, activated or otherwise turned on for a period of time which is the duration during which there is an overlap among the readout times of all the rows of a given image (e.g., IR image, RGB image or depth map).
- under type II of IR exposure, the IR projector in module 110/120 may be enabled, activated or otherwise turned on for a period of time which is the duration from the beginning of the readout of the first row of a plurality of rows of the given image to the end of the readout of the last row of the plurality of rows.
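The two exposure types above can be sketched as a timing computation. This is an illustrative model, not the patent's implementation: it assumes a rolling-shutter sensor whose rows begin exposing at one fixed line interval per row, and all function names and numbers are invented for the example.

```python
# Illustrative sketch of the type I / type II projector-on windows for a
# rolling-shutter sensor. Times are in microseconds; row 0 starts at t=0.

def projector_window(num_rows, line_time_us, exposure_us, mode):
    """Return (on_us, off_us) for the IR projector.

    "type1": on only while ALL rows are exposing simultaneously.
    "type2": on from the start of the first row's exposure to the end of
             the last row's exposure.
    """
    last_row_start = (num_rows - 1) * line_time_us
    if mode == "type1":
        on, off = last_row_start, exposure_us  # first row ends at exposure_us
        return (on, off) if on < off else None  # None: rows never all overlap
    if mode == "type2":
        return (0, last_row_start + exposure_us)
    raise ValueError(mode)

# 480 rows, 20 us line interval, 12 ms exposure:
print(projector_window(480, 20, 12000, "type1"))  # (9580, 12000)
print(projector_window(480, 20, 12000, "type2"))  # (0, 21580)
```

Note how much shorter the type I window is; this is why the type of exposure matters for projection time and power.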
- the IR projector may be adaptively controlled based on one or more conditions. That is, the IR projector may be adaptively enabled and disabled, or the power of the IR projector may be adaptively adjusted (e.g., reduced), according to whether at least one condition of one or more predefined conditions is met.
- the adaptively-controlled IR projection may result in reduction in power consumption and IR projection time as well as improved eye safety.
- FIG. 2 illustrates an example scenario 200 of adaptive IR projection control in accordance with an implementation of the present disclosure.
- in scenario 200, a region of interest (ROI) may be detected in an image (e.g., IR image, RGB image or depth map).
- an IR projector may be adaptively controlled according to whether at least one condition of one or more predefined conditions is met. As a result, the amount of time the IR projector is turned on to project a structured IR light and/or an amount of power used by the IR projector in projecting the structured IR light may be reduced.
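The "adaptively enabled and disabled, or power-adjusted" logic can be sketched as a set of condition checks, each mapping the current state to a power scale, with the projector driven at the most conservative result. The condition names, state fields and thresholds below are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch: each predefined condition returns a power scale in
# [0, 1]; 0 disables the projector, and the lowest scale wins.

def eye_open(state):
    return 0.0 if state.get("eye_open") else 1.0   # disable while eyes are open

def close_face(state):
    return 0.5 if state.get("face_distance_mm", 1e9) < 300 else 1.0

def low_battery(state):
    return 0.6 if state.get("battery_pct", 100) < 20 else 1.0

CONDITIONS = [eye_open, close_face, low_battery]

def projector_power(state, full_power_mw=150):
    scale = min(cond(state) for cond in CONDITIONS)
    return full_power_mw * scale   # 0 means the projector is disabled

print(projector_power({"eye_open": False, "face_distance_mm": 250}))  # 75.0
print(projector_power({"eye_open": True}))                            # 0.0
```

Taking the minimum over all conditions makes each condition a veto, which matches the "at least one condition of one or more predefined conditions is met" framing above.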
- the face of a person may be a ROI in an image since facial recognition-related applications (e.g., three-dimensional (3D) face unlock, 3D face payment, 3D emoji, and so on) are gaining popularity in terms of usage.
- timing and/or power of IR projection by an IR projector in active stereo vision and/or passive stereo vision may be adaptively controlled based on facial recognition or any other ROI in an image, which may be an IR image, an RGB image or a depth map.
- a face region may be detected (e.g., by a control circuit or processor executing a facial detection/recognition algorithm) in an IR image, an RGB image or a depth map. Then, regardless of whether type I or type II of IR exposure is used, an amount of power used in projecting an IR light and/or an amount of time during which the IR light is projected may be reduced, thereby achieving reduction in power/time as well as improvement in eye safety.
- the timing and/or power in IR projection by an IR projector may be adaptively controlled in one or more of a variety of ways. Description of some examples of adaptive control of IR projection is provided below.
- a face (or another ROI) in an image may be used as a condition in adaptively controlling IR projection.
- the IR projector may be enabled to project an IR light toward a face of a user (or another ROI in an image) during a first portion of a sensor readout time when one or more image sensors (e.g., IR camera and/or RGB camera of module 110/120) read(s) a portion of the image where the face/ROI is located.
- the IR projector may be disabled to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read(s) another portion of the image where the ROI is not located. Accordingly, this may result in power saving.
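With a rolling shutter, the projector-on window for the row-gating idea above can be derived directly from the ROI's row span. A minimal sketch, assuming one fixed readout interval per row; the function name and all numbers are illustrative:

```python
# Illustrative sketch: enable the IR projector only while the sensor reads
# out the rows containing the ROI (e.g., a detected face bounding box).

def roi_projection_window(roi_top_row, roi_bottom_row, line_time_us):
    """Time window (start_us, end_us) during which the projector is on,
    assuming row r is read out during [r*line_time_us, (r+1)*line_time_us)."""
    start = roi_top_row * line_time_us
    end = (roi_bottom_row + 1) * line_time_us
    return start, end

# Face bounding box spans rows 120..360 of a 480-row frame, 20 us per line:
on, off = roi_projection_window(120, 360, 20)
print(on, off)                             # 2400 7220
print(round((off - on) / (480 * 20), 3))   # on for ~0.502 of the frame time
```

In this example the projector runs for roughly half the frame readout, which is where the power saving described above comes from.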
- an eye region in an image may be used as a condition in adaptively controlling IR projection.
- the IR projector may also be disabled to cease projecting the IR light when the one or more sensors read(s) a portion of the image where an eye/both eyes of the face is/are located so as to improve eye safety.
- eye action(s) of the user may be used as a condition in adaptively controlling IR projection.
- the IR projector may be enabled to project an IR light toward the face in response to eye(s) of the face being blinked or otherwise closed.
- the IR projector may also be disabled to cease projecting the IR light toward the face in response to the eye(s) of the face being open. Accordingly, this may improve eye safety for the user.
- pupil size may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light toward the face may be reduced in response to an enlargement of the pupil of the eye(s) of the face. Again, this may improve eye safety for the user.
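The two eye-safety conditions above (eye state and pupil size) can be combined in one gate: project only while the eye is closed or blinking, and cut power if the pupil dilates. The state fields, baseline comparison and the 50% cut are illustrative assumptions:

```python
# Illustrative sketch of eye-state and pupil-size gating of IR projection.

def eye_safe_power(eye_closed, pupil_diameter_mm, baseline_mm,
                   full_power_mw=150):
    if not eye_closed:
        return 0.0                     # eyes open: cease projection
    if pupil_diameter_mm > baseline_mm:
        return full_power_mw * 0.5     # dilated pupil admits more light
    return full_power_mw

print(eye_safe_power(True, 3.0, 4.0))   # 150 -> full power
print(eye_safe_power(True, 5.0, 4.0))   # 75.0 -> reduced for dilated pupil
print(eye_safe_power(False, 3.0, 4.0))  # 0.0 -> disabled while eyes are open
```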
- a distance of the face may be used as a condition in adaptively controlling IR projection.
- IR projection may be adaptively controlled based on a distance of the face/ROI from the one or more sensors (e.g., IR camera(s) and/or RGB camera).
- an amount of power in projecting the IR light toward the face may be reduced in response to the distance of the face/ROI being lower than a distance threshold (e.g., reducing power in projecting the IR light when the face is relatively close). Accordingly, this may improve eye safety for the user.
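One way to obtain the face distance, consistent with the later mention of determining distance from the size of the face in the image, is the pinhole relation between real and pixel face width. The focal length, assumed face width, threshold and power reduction below are all illustrative assumptions:

```python
# Illustrative sketch of the distance condition: estimate face distance
# from its pixel size, then reduce power when the face is close.

def face_distance_mm(face_width_px, focal_length_px=1000,
                     real_face_width_mm=150):
    # Pinhole model: distance = focal_length * real_width / pixel_width.
    return focal_length_px * real_face_width_mm / face_width_px

def distance_scaled_power(face_width_px, threshold_mm=400, full_power_mw=150):
    d = face_distance_mm(face_width_px)
    return full_power_mw * 0.5 if d < threshold_mm else full_power_mw

print(face_distance_mm(300))        # 500.0 -> face about 500 mm away
print(distance_scaled_power(300))   # 150 -> far enough, full power
print(distance_scaled_power(600))   # 75.0 -> 250 mm, power halved
```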
- environmental brightness may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light may be reduced in response to the brightness level of an ambient light being lower than a brightness threshold.
- a confidence level of a depth map may be used as a condition in adaptively controlling IR projection.
- the IR projector may be enabled to project an IR light toward the ROI for each of one or more regions of a plurality of regions of a depth map having a respective confidence level lower than a confidence threshold.
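The per-region confidence gating above can be sketched by tiling the depth map and flagging tiles whose confidence falls below the threshold; only those tiles would receive IR projection. Plain Python lists stand in for real depth/confidence maps, and the tile size and threshold are illustrative:

```python
# Illustrative sketch: flag low-confidence depth-map regions for projection.

def low_confidence_regions(confidence, tile, threshold):
    """Yield (row0, col0) of each tile whose mean confidence < threshold."""
    h, w = len(confidence), len(confidence[0])
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            vals = [confidence[i][j]
                    for i in range(r, min(r + tile, h))
                    for j in range(c, min(c + tile, w))]
            if sum(vals) / len(vals) < threshold:
                yield (r, c)

conf_map = [[0.9, 0.9, 0.2, 0.1],
            [0.9, 0.8, 0.3, 0.2],
            [0.9, 0.9, 0.9, 0.9],
            [0.9, 0.9, 0.9, 0.9]]
print(list(low_confidence_regions(conf_map, 2, 0.5)))  # [(0, 2)]
```

Here only the top-right tile is uncertain, so only that region of the ROI would be illuminated on the next capture.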
- an image texture of a ROI may be used as a condition in adaptively controlling IR projection.
- the IR projector may be enabled to project an IR light toward the ROI for each of one or more regions of a plurality of regions of the ROI the respective amount of texture of which is lower than a texture threshold.
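One common way to quantify "amount of texture" is intensity variance: flat, low-variance regions are where passive stereo matching struggles and structured IR helps most. The variance measure and threshold below are illustrative assumptions, not the disclosure's definition:

```python
# Illustrative sketch of the texture condition: a region's texture is
# measured as the variance of its pixel intensities.

def needs_projection(region_pixels, texture_threshold=100.0):
    n = len(region_pixels)
    mean = sum(region_pixels) / n
    variance = sum((p - mean) ** 2 for p in region_pixels) / n
    return variance < texture_threshold   # flat region -> project IR here

flat_wall = [128, 129, 128, 130, 129, 128]  # nearly uniform intensities
textured  = [10, 200, 55, 240, 5, 180]      # high-contrast intensities
print(needs_projection(flat_wall))  # True
print(needs_projection(textured))   # False
```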
- a power level of a battery that powers the IR projector may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light may be reduced in response to the power level of the battery being lower than a power threshold.
- FIG. 3 illustrates an example apparatus 300 in accordance with an implementation of the present disclosure.
- Apparatus 300 may perform various functions to implement procedures, schemes, techniques, processes and methods described herein pertaining to adaptive IR projection control for depth estimation in computer vision, including the various procedures, scenarios, schemes, solutions, concepts and techniques described above with respect to FIG. 1 and FIG. 2 as well as process 400 described below.
- Apparatus 300 may be a part of an electronic apparatus, a portable or mobile apparatus, a wearable apparatus, a wireless communication apparatus or a computing apparatus.
- apparatus 300 may be implemented in a smartphone, a smartwatch, a personal digital assistant, a digital camera, or a computing equipment such as a tablet computer, a laptop computer or a notebook computer.
- apparatus 300 may also be a part of a machine type apparatus, which may be an IoT or NB-IoT apparatus such as an immobile or a stationary apparatus, a home apparatus, a wire communication apparatus or a computing apparatus.
- apparatus 300 may be implemented in a smart thermostat, a smart fridge, a smart door lock, a wireless speaker or a home control center.
- apparatus 300 may be implemented in the form of a system-on-chip (SoC) or one or more integrated-circuit (IC) chips such as, for example and without limitation, one or more single-core processors, one or more multi-core processors, one or more reduced-instruction-set-computing (RISC) processors or one or more complex-instruction-set-computing (CISC) processors.
- SoC system-on-chip
- IC integrated-circuit
- Apparatus 300 may include at least some of those components shown in FIG. 3 such as a control circuit 310 , a light projector 320 , a first sensor 330 , a second sensor 340 and a floodlight emitter 360 .
- apparatus 300 may also include a third sensor 350 .
- Apparatus 300 may further include one or more other components not pertinent to the proposed scheme of the present disclosure (e.g., internal power supply, memory device and/or user interface device), and, thus, such component(s) of apparatus 300 are neither shown in FIG. 3 nor described below in the interest of simplicity and brevity.
- control circuit 310 may be implemented in the form of an electronic circuit comprising various electronic components.
- control circuit 310 may be implemented as part of or in the form of one or more single-core processors, one or more multi-core processors, one or more RISC processors, or one or more CISC processors. That is, even though a singular term “a processor” is used herein to refer to control circuit 310 , control circuit 310 may include multiple processors in some implementations and a single processor in other implementations in accordance with the present disclosure.
- control circuit 310 may be implemented in the form of hardware (and, optionally, firmware) with electronic components including, for example and without limitation, one or more transistors, one or more diodes, one or more capacitors, one or more resistors, one or more inductors, one or more memristors and/or one or more varactors that are configured and arranged to achieve specific purposes in accordance with the present disclosure.
- control circuit 310 is a special-purpose machine specifically designed, arranged and configured to perform specific tasks pertaining to adaptive IR projection control for depth estimation in computer vision in accordance with various implementations of the present disclosure.
- control circuit 310 may include an electronic circuit with hardware components implementing one or more of the various proposed schemes in accordance with the present disclosure.
- control circuit 310 may also utilize software codes and/or instructions in addition to hardware components to implement adaptive IR projection control for depth estimation in computer vision in accordance with various implementations of the present disclosure.
- control circuit 310 may receive data of an image based on sensing by one or more of first sensor 330 , second sensor 340 and third sensor 350 . Additionally, control circuit 310 may detect a region of interest (ROI) in the image. Moreover, control circuit 310 may adaptively control light projector 320 with respect to projecting light toward the ROI.
- in receiving the data of the image, control circuit 310 may receive data of an IR image, an RGB image or a depth map.
- in detecting the ROI in the image, control circuit 310 may detect a face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may enable IR projector 320 to project an IR light toward the face during a first portion of a sensor readout time when the one or more image sensors read a portion of the image where the ROI is located. Moreover, control circuit 310 may disable IR projector 320 to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read another portion of the image where the ROI is not located.
- in enabling IR projector 320 to project the IR light toward the face during the first portion of the sensor readout time when the one or more image sensors read the portion of the image where the ROI is located, control circuit 310 may disable IR projector 320 to cease projecting the IR light when the one or more image sensors read a portion of the image where an eye of the face is located.
- in detecting the ROI in the image, control circuit 310 may detect a face in the image and detect an eye on the face. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may enable IR projector 320 to project an IR light toward the face responsive to the eye being closed or blinked. Additionally, control circuit 310 may disable IR projector 320 to cease projecting the IR light toward the face responsive to the eye being open.
- in detecting the ROI in the image, control circuit 310 may detect a face in the image, detect an eye on the face, and monitor a pupil of the eye. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to an enlargement of the pupil.
- in detecting the ROI in the image, control circuit 310 may detect a face in the image and determine a distance of the face from the one or more image sensors based on a size of the face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to the distance of the face being lower than a distance threshold.
- control circuit 310 may detect a brightness level of an ambient light (e.g., based on data received from any of first sensor 330, second sensor 340 and third sensor 350, or by receiving sensing data from a light sensor (not shown)). Moreover, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light responsive to the brightness level of the ambient light being lower than a brightness threshold.
- in receiving the data of the image, control circuit 310 may receive data of a depth map of the ROI. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may determine a respective confidence level for each of a plurality of regions of the depth map. Furthermore, control circuit 310 may enable IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the depth map the respective confidence level of which is lower than a confidence threshold.
- control circuit 310 may perform some operations. For instance, control circuit 310 may determine a respective amount of texture for each of a plurality of regions of the ROI in the image. Moreover, control circuit 310 may enable IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the ROI the respective amount of texture of which is lower than a texture threshold.
- control circuit 310 may perform some other operations. For instance, control circuit 310 may detect a power level of a battery that powers the light projector. Additionally, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the ROI responsive to the power level of the battery being lower than a power threshold.
- FIG. 4 illustrates an example process 400 in accordance with an implementation of the present disclosure.
- Process 400 may be an example implementation of the various procedures, scenarios, schemes, solutions, concepts and techniques, or a combination thereof, whether partially or completely, with respect to adaptive IR projection control for depth estimation in computer vision in accordance with the present disclosure.
- Process 400 may represent an aspect of implementation of features of apparatus 300 .
- Process 400 may include one or more operations, actions, or functions as illustrated by one or more of blocks 410, 420 and 430. Although illustrated as discrete blocks, various blocks of process 400 may be divided into additional blocks, combined into fewer blocks, or eliminated, depending on the desired implementation. Moreover, the blocks of process 400 may be executed in the order shown in FIG. 4 or, alternatively, in a different order.
- Process 400 may be implemented by apparatus 300 or any variation thereof. Solely for illustrative purposes and without limitation, process 400 is described below in the context of apparatus 300 .
- Process 400 may begin at block 410 .
- process 400 may involve control circuit 310 receiving data of an image based on sensing by one or more of first sensor 330 , second sensor 340 and third sensor 350 .
- Process 400 may proceed from 410 to 420 .
- process 400 may involve control circuit 310 detecting a region of interest (ROI) in the image.
- process 400 may involve control circuit 310 adaptively controlling light projector 320 with respect to projecting light toward the ROI.
- process 400 may involve control circuit 310 receiving data of an IR image, a RGB image or a depth map.
- in detecting the ROI in the image, process 400 may involve control circuit 310 detecting a face in the image.
- process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the face during a first portion of a sensor readout time when the one or more image sensors read a portion of the image where the ROI is located.
- process 400 may involve control circuit 310 disabling IR projector 320 to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read another portion of the image where the ROI is not located.
- process 400 may involve control circuit 310 disabling the light projector to cease projecting the IR light when the one or more image sensors read a portion of the image where an eye of the face is located.
- in detecting the ROI in the image, process 400 may involve control circuit 310 detecting a face in the image and detecting an eye on the face. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the face responsive to the eye being closed or blinked. Additionally, process 400 may involve control circuit 310 disabling IR projector 320 to cease projecting the IR light toward the face responsive to the eye being open.
- in detecting the ROI in the image, process 400 may involve control circuit 310 detecting a face in the image, detecting an eye on the face, and monitoring a pupil of the eye. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to an enlargement of the pupil.
- in detecting the ROI in the image, process 400 may involve control circuit 310 detecting a face in the image and determining a distance of the face from the one or more image sensors based on a size of the face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to the distance of the face being lower than a distance threshold.
- process 400 may involve control circuit 310 detecting a brightness level of an ambient light. Moreover, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light responsive to the brightness level of the ambient light being lower than a brightness threshold.
- in receiving the data of the image, process 400 may involve control circuit 310 receiving data of a depth map of the ROI. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 determining a respective confidence level for each of a plurality of regions of the depth map. Furthermore, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the depth map the respective confidence level of which is lower than a confidence threshold.
- process 400 may involve control circuit 310 performing some operations. For instance, process 400 may involve control circuit 310 determining a respective amount of texture for each of a plurality of regions of the ROI in the image. Moreover, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the ROI the respective amount of texture of which being lower than a texture threshold.
- process 400 may involve control circuit 310 performing some other operations. For instance, process 400 may involve control circuit 310 detecting a power level of a battery that powers the light projector. Additionally, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the ROI responsive to the power level of the battery being lower than a power threshold.
- any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality.
- operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
Description
- The present disclosure is part of a non-provisional application claiming the priority benefit of U.S. Patent Application No. 62/651,815, filed on 3 Apr. 2018. The content of the aforementioned application is incorporated by reference in its entirety.
- The present disclosure is generally related to computer vision and, more particularly, to adaptive infrared (IR) projection control for depth estimation in computer vision.
- Unless otherwise indicated herein, approaches described in this section are not prior art to the claims listed below and are not admitted as prior art by inclusion in this section.
- Depth estimation in computer vision utilizes images of a given scene captured by two cameras to triangulate and estimate distances. Typically, depth estimation can be achieved with either passive stereo vision or active stereo vision.
- The following summary is illustrative only and is not intended to be limiting in any way. That is, the following summary is provided to introduce concepts, highlights, benefits and advantages of the novel and non-obvious techniques described herein. Select implementations are further described below in the detailed description. Thus, the following summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
- An objective of the present disclosure is to propose schemes, solutions, concepts, designs, methods and apparatuses that enable adaptive IR projection control. It is believed that reduction in power consumption and IR projection time, as well as improved eye safety, may be achieved by implementing various proposed schemes in accordance with the present disclosure.
- In one aspect, a method may involve receiving data of an image based on sensing by one or more image sensors. The method may also involve detecting a region of interest (ROI) in the image. The method may further involve adaptively controlling a light projector with respect to projecting light toward the ROI.
- In one aspect, an apparatus may include a processor or control circuit which, during operation, may perform operations including: (a) receiving data of an image based on sensing by one or more image sensors; (b) detecting a region of interest (ROI) in the image; and (c) adaptively controlling a light projector with respect to projecting light toward the ROI.
- It is noteworthy that, although description provided herein may be in the context of certain EM wave spectra and light-emitting topologies such as IR, the proposed concepts, schemes and any variation(s)/derivative(s) thereof may be implemented in, for and by other EM wave spectra and/or light-emitting technologies such as, for example and without limitation, light-emitting diode (LED), laser, light detection and ranging (LiDAR) and time-of-flight (TOF). Thus, the scope of the present disclosure is not limited to the examples described herein.
- The accompanying drawings are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of the present disclosure. The drawings illustrate implementations of the disclosure and, together with the description, serve to explain the principles of the disclosure. It is appreciable that the drawings are not necessarily to scale, as some components may be shown out of proportion to their size in actual implementation in order to clearly illustrate the concept of the present disclosure.
-
FIG. 1 is a diagram of an example scenario of IR projection for depth estimation. -
FIG. 2 is a diagram of an example scenario in accordance with an implementation of the present disclosure. -
FIG. 3 is a diagram of an example apparatus in accordance with an implementation of the present disclosure. -
FIG. 4 is a flowchart of an example process in accordance with an implementation of the present disclosure. - Detailed embodiments and implementations of the claimed subject matters are disclosed herein. However, it shall be understood that the disclosed embodiments and implementations are merely illustrative of the claimed subject matters which may be embodied in various forms. The present disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments and implementations set forth herein. Rather, these exemplary embodiments and implementations are provided so that description of the present disclosure is thorough and complete and will fully convey the scope of the present disclosure to those skilled in the art. In the description below, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments and implementations.
- For passive stereo vision, an IR projector projects a structured IR light toward a scene and a single IR camera is utilized to capture an image of the scene. For active stereo vision, two IR cameras are utilized to capture images of the scene.
FIG. 1 illustrates an example scenario 100 of IR projection for depth estimation. Part (A) of FIG. 1 shows an example module 110 of an apparatus (e.g., smartphone) that may be utilized for active stereo vision. Module 110 may include two IR cameras, one red-green-blue (RGB) camera, one IR projector and one IR LED. During operation of module 110, the IR LED may emit IR light as floodlight to illuminate a scene, the IR projector may project a structured IR light toward the scene, each of the two IR cameras may capture a respective IR image of the scene, and the RGB camera may capture an RGB image of the scene. Part (B) of FIG. 1 shows an example module 120 of an apparatus (e.g., smartphone) that may be utilized for passive stereo vision. During operation of module 120, the IR LED may emit IR light as floodlight to illuminate a scene, the IR projector may project a structured IR light toward the scene, the single IR camera may capture an IR image of the scene, and the RGB camera may capture an RGB image of the scene. - Part (C) of
FIG. 1 shows an example of IR sensor readout timing (e.g., with respect to the IR cameras of module 110 and module 120). As shown in FIG. 1, typically a rolling-shutter camera (e.g., an IR camera and/or an RGB camera in module 110/120) reads out an image in a row-by-row fashion in such a way that, a certain time after the readout of a given row has begun, the readout of a next row begins. Each of module 110 and module 120 may support either or both of type I and type II of IR exposure. Under type I, the IR projector in module 110/120 may be enabled, activated or otherwise turned on for a period of time which is the duration during which there is an overlap among the readout times of all the rows of a given image (e.g., IR image, RGB image or depth map). Under type II, the IR projector in module 110/120 may be enabled, activated or otherwise turned on for a period of time which is the duration from the beginning of the readout of the first row of a plurality of rows of the given image to the end of the readout of the last row of the plurality of rows. - Under a proposed scheme in accordance with the present disclosure, the IR projector may be adaptively controlled based on one or more conditions. That is, the IR projector may be adaptively enabled and disabled, or the power of the IR projector may be adaptively adjusted (e.g., reduced), according to whether at least one condition of one or more predefined conditions is met. Advantageously, compared to conventional approaches for type I and type II of IR exposure, the adaptively-controlled IR projection may result in reduction in power consumption and IR projection time as well as improved eye safety.
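- The type I and type II windows above reduce to simple interval arithmetic over the rolling-shutter row timing. The sketch below is illustrative only; the function name and the row-timing parameters (start-to-start delay and per-row readout duration) are assumptions, not values from the disclosure.

```python
def projection_windows(num_rows, t_row_start_delta, t_row_duration):
    """Projector-on windows for the two IR exposure types.

    Row i is read out over the interval
    [i * t_row_start_delta, i * t_row_start_delta + t_row_duration].

    Type I:  the overlap of ALL rows' readout intervals (None when the
             rows never all overlap at once).
    Type II: from the start of the first row's readout to the end of
             the last row's readout.
    """
    first_row_end = t_row_duration
    last_row_start = (num_rows - 1) * t_row_start_delta
    last_row_end = last_row_start + t_row_duration

    type_ii = (0.0, last_row_end)
    type_i = (last_row_start, first_row_end) if last_row_start < first_row_end else None
    return type_i, type_ii
```

For example, with 4 rows, a 1 ms start-to-start delay and a 10 ms per-row readout, type I keeps the projector on only during the 3-10 ms overlap, while type II keeps it on over the full 0-13 ms span, which is why type I consumes less power.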
-
FIG. 2 illustrates an example scenario 200 of adaptive IR projection control in accordance with an implementation of the present disclosure. In scenario 200, an image (e.g., IR image, RGB image or depth map) having a region of interest (ROI) may be read out as described above. Moreover, under the proposed scheme, an IR projector may be adaptively controlled according to whether at least one condition of one or more predefined conditions is met. As a result, the amount of time the IR projector is turned on to project a structured IR light and/or an amount of power used by the IR projector in projecting the structured IR light may be reduced. - For illustrative purposes and without limiting the scope of the present disclosure, the face of a person may be a ROI in an image since facial recognition-related applications (e.g., three-dimensional (3D) face unlock, 3D face payment, 3D emoji, and so on) are gaining popularity in terms of usage. Accordingly, under the proposed scheme, timing and/or power of IR projection by an IR projector in active stereo vision and/or passive stereo vision may be adaptively controlled based on facial recognition or any ROI region in an image, which may be an IR image, RGB image or depth map. For instance, a face region in the image may be detected (e.g., by a control circuit or processor executing a facial detection/recognition algorithm) in an IR image, RGB image or depth map. Then, regardless of type I or type II of IR exposure, an amount of power used in projecting an IR light and/or an amount of time during which the IR light is projected may be reduced, thereby achieving reduction in power/time as well as improvement in eye safety.
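- At a high level, the scenario reduces to three steps: receive the image data, detect the ROI, and drive the projector from the result. A minimal sketch of that control flow, in which `detect_roi` and `control_projector` are hypothetical callables standing in for the detector and the projector driver:

```python
def process_frame(image, detect_roi, control_projector):
    """One adaptive-control pass: detect the ROI in the received image,
    then enable the projector only when an ROI is present.  Both
    callables are placeholders; the disclosure leaves the detector and
    the projector driver implementation-specific."""
    roi = detect_roi(image)
    if roi is None:
        # No ROI detected: keep the structured IR light off to save power.
        control_projector(enabled=False, target=None)
    else:
        control_projector(enabled=True, target=roi)
    return roi
```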
- Under the proposed scheme, the timing and/or power in IR projection by an IR projector (e.g., the IR projector in
module 110 and/or module 120) may be adaptively controlled in one or more of a variety of ways. Description of some examples of adaptive control of IR projection is provided below. - In one example, a face (or another ROI) in an image may be used as a condition in adaptively controlling IR projection. Specifically, the IR projector may be enabled to project an IR light toward a face of a user (or another ROI in an image) during a first portion of a sensor readout time when one or more image sensors (e.g., IR camera and/or RGB camera of
module 110/120) read(s) a portion of the image where the face/ROI is located. Moreover, the IR projector may be disabled to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read(s) another portion of the image where the ROI is not located. Accordingly, this may result in power saving. Furthermore, an eye region in an image may be used as a condition in adaptively controlling IR projection. Specifically, during the first portion of the sensor readout time, the IR projector may also be disabled to cease projecting the IR light when the one or more sensors read(s) a portion of the image where an eye/both eyes of the face is/are located so as to improve eye safety. - In one example, eye action(s) of the user may be used as a condition in adaptively controlling IR projection. Specifically, the IR projector may be enabled to project an IR light toward the face in response to eye(s) of the face being blinked or otherwise closed. Moreover, the IR projector may also be disabled to cease projecting the IR light toward the face in response to the eye(s) of the face being open. Accordingly, this may improve eye safety for the user.
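- The row-gated behavior described above can be expressed as a per-row predicate evaluated as the rolling shutter advances. A sketch, assuming a detector supplies the ROI and eye regions as inclusive (start_row, end_row) pairs:

```python
def projector_enabled_for_row(row, roi_rows, eye_rows=()):
    """True when the projector should be on while this sensor row is
    read out: on only inside the ROI's rows (power saving), and off
    again for any row covering an eye (eye safety)."""
    in_roi = roi_rows[0] <= row <= roi_rows[1]
    in_eye = any(start <= row <= end for start, end in eye_rows)
    return in_roi and not in_eye
```

With a face spanning rows 100-300 and eyes at rows 140-170, the projector would be on for rows 100-139 and 171-300 and off everywhere else.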
- In one example, pupil size may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light toward the face may be reduced in response to an enlargement of the pupil of the eye(s) of the face. Again, this may improve eye safety for the user.
- In one example, a distance of the face (or another ROI) may be used as a condition in adaptively controlling IR projection. Specifically, IR projection may be adaptively controlled based on a distance of the face/ROI from the one or more sensors (e.g., IR camera(s) and/or RGB camera). That is, as the distance of the face/ROI is inversely proportional to the size of the face/ROI in the image (i.e., larger size means the face/ROI is closer to the one or more sensors, and vice versa), an amount of power in projecting the IR light toward the face may be reduced in response to the distance of the face/ROI being lower than a distance threshold (e.g., reducing power in projecting the IR light when the face is relatively close). Accordingly, this may improve eye safety for the user.
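- The distance condition follows from the pinhole camera model, under which the apparent size of the face scales inversely with its distance. A sketch, in which the focal length, the assumed real face width of 0.16 m, the 0.3 m threshold and the linear power ramp are all illustrative assumptions rather than values from the disclosure:

```python
def estimate_face_distance(face_width_px, focal_length_px, real_face_width_m=0.16):
    """Pinhole model: distance = f * W_real / w_image, so a larger face
    in the image means a shorter distance, and vice versa."""
    return focal_length_px * real_face_width_m / face_width_px

def projection_power(distance_m, full_power_mw, distance_threshold_m=0.3):
    """Full power beyond the threshold; ramp power down linearly as the
    face gets closer than the threshold (one possible policy)."""
    if distance_m >= distance_threshold_m:
        return full_power_mw
    return full_power_mw * (distance_m / distance_threshold_m)
```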
- In one example, environmental brightness (lux) may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light may be reduced in response to the brightness level of an ambient light being lower than a brightness threshold.
- In one example, a confidence level of a depth map may be used as a condition in adaptively controlling IR projection. Specifically, the IR projector may be enabled to project an IR light toward the ROI for each of one or more regions of a plurality of regions of a depth map having a respective confidence level lower than a confidence threshold.
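- The confidence-driven behavior can be sketched by tiling the depth map's per-pixel confidence and selecting the tiles that fall below the threshold; the tile size and threshold are illustrative assumptions:

```python
import numpy as np

def low_confidence_regions(conf_map, tile, threshold):
    """Return (row, col) tile indices of the depth-confidence map whose
    mean confidence is below the threshold -- the regions for which the
    IR projector would be enabled to add structured light."""
    height, width = conf_map.shape
    needy = []
    for r in range(0, height, tile):
        for c in range(0, width, tile):
            if conf_map[r:r + tile, c:c + tile].mean() < threshold:
                needy.append((r // tile, c // tile))
    return needy
```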
- In one example, an image texture of a ROI may be used as a condition in adaptively controlling IR projection. Specifically, the IR projector may be enabled to project an IR light toward the ROI for each of one or more regions of a plurality of regions of the ROI the respective amount of texture of which is lower than a texture threshold.
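- A texture measure can be as simple as the mean absolute intensity gradient of each region; textureless regions are where passive stereo matching struggles and structured IR light helps most. The gradient-based measure below is one common proxy, not the disclosure's definition:

```python
import numpy as np

def texture_amount(patch):
    """Mean absolute intensity gradient: near zero on flat patches,
    large on detailed ones."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(np.abs(gx) + np.abs(gy)))

def low_texture_regions(image, tile, threshold):
    """(row, col) tile indices whose texture falls below the threshold,
    i.e. the regions for which the IR projector would be enabled."""
    height, width = image.shape
    return [(r // tile, c // tile)
            for r in range(0, height, tile)
            for c in range(0, width, tile)
            if texture_amount(image[r:r + tile, c:c + tile]) < threshold]
```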
- In one example, a power level of a battery that powers the IR projector may be used as a condition in adaptively controlling IR projection. Specifically, an amount of power in projecting the IR light may be reduced in response to the power level of the battery being lower than a power threshold.
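- The scalar conditions above (ambient brightness, battery level, and likewise distance or pupil size) all fit one pattern: scale the projection power down when a reading crosses its threshold. A combined sketch; the thresholds and the 50% reduction factor are illustrative assumptions:

```python
def adjusted_power(full_power_mw, ambient_lux, battery_pct,
                   lux_threshold=50.0, battery_threshold=20.0,
                   reduction=0.5):
    """Reduce projection power when the ambient light is dim (the IR
    pattern needs less power to stand out) and again when the battery
    is low (to extend runtime)."""
    power = full_power_mw
    if ambient_lux < lux_threshold:
        power *= reduction
    if battery_pct < battery_threshold:
        power *= reduction
    return power
```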
-
FIG. 3 illustrates an example apparatus 300 in accordance with an implementation of the present disclosure. Apparatus 300 may perform various functions to implement procedures, schemes, techniques, processes and methods described herein pertaining to adaptive IR projection control for depth estimation in computer vision, including the various procedures, scenarios, schemes, solutions, concepts and techniques described above with respect to FIG. 1 and FIG. 2 as well as process 400 described below. -
Apparatus 300 may be a part of an electronic apparatus, a portable or mobile apparatus, a wearable apparatus, a wireless communication apparatus or a computing apparatus. For instance, apparatus 300 may be implemented in a smartphone, a smartwatch, a personal digital assistant, a digital camera, or a computing equipment such as a tablet computer, a laptop computer or a notebook computer. Moreover, apparatus 300 may also be a part of a machine type apparatus, which may be an IoT or NB-IoT apparatus such as an immobile or a stationary apparatus, a home apparatus, a wire communication apparatus or a computing apparatus. For instance, apparatus 300 may be implemented in a smart thermostat, a smart fridge, a smart door lock, a wireless speaker or a home control center. Alternatively, apparatus 300 may be implemented in the form of a system-on-chip (SoC) or one or more integrated-circuit (IC) chips such as, for example and without limitation, one or more single-core processors, one or more multi-core processors, one or more reduced-instruction-set-computing (RISC) processors or one or more complex-instruction-set-computing (CISC) processors. -
Apparatus 300 may include at least some of those components shown in FIG. 3 such as a control circuit 310, a light projector 320, a first sensor 330, a second sensor 340 and a floodlight emitter 360. Optionally, apparatus 300 may also include a third sensor 350. Apparatus 300 may further include one or more other components not pertinent to the proposed scheme of the present disclosure (e.g., internal power supply, memory device and/or user interface device), and, thus, such component(s) of apparatus 300 are neither shown in FIG. 3 nor described below in the interest of simplicity and brevity. - In one aspect,
control circuit 310 may be implemented in the form of an electronic circuit comprising various electronic components. Alternatively, control circuit 310 may be implemented as part of or in the form of one or more single-core processors, one or more multi-core processors, one or more RISC processors, or one or more CISC processors. That is, even though a singular term "a processor" is used herein to refer to control circuit 310, control circuit 310 may include multiple processors in some implementations and a single processor in other implementations in accordance with the present disclosure. In another aspect, control circuit 310 may be implemented in the form of hardware (and, optionally, firmware) with electronic components including, for example and without limitation, one or more transistors, one or more diodes, one or more capacitors, one or more resistors, one or more inductors, one or more memristors and/or one or more varactors that are configured and arranged to achieve specific purposes in accordance with the present disclosure. In other words, in at least some implementations, control circuit 310 is a special-purpose machine specifically designed, arranged and configured to perform specific tasks pertaining to adaptive IR projection control for depth estimation in computer vision in accordance with various implementations of the present disclosure. In some implementations, control circuit 310 may include an electronic circuit with hardware components implementing one or more of the various proposed schemes in accordance with the present disclosure. Alternatively, other than hardware components, control circuit 310 may also utilize software codes and/or instructions in addition to hardware components to implement adaptive IR projection control for depth estimation in computer vision in accordance with various implementations of the present disclosure. - Under various proposed schemes in accordance with the present disclosure, during operation,
control circuit 310 may receive data of an image based on sensing by one or more of first sensor 330, second sensor 340 and third sensor 350. Additionally, control circuit 310 may detect a region of interest (ROI) in the image. Moreover, control circuit 310 may adaptively control light projector 320 with respect to projecting light toward the ROI. - In some implementations, in receiving the data of the image,
control circuit 310 may receive data of an IR image, an RGB image or a depth map. - In some implementations, in detecting the ROI in the image,
control circuit 310 may detect a face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may enable IR projector 320 to project an IR light toward the face during a first portion of a sensor readout time when the one or more image sensors read a portion of the image where the ROI is located. Moreover, control circuit 310 may disable IR projector 320 to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read another portion of the image where the ROI is not located. In some implementations, in enabling IR projector 320 to project the IR light toward the face during the first portion of the sensor readout time when the one or more image sensors read the portion of the image where the ROI is located, control circuit 310 may disable the light projector to cease projecting the IR light when the one or more image sensors read a portion of the image where an eye of the face is located. - In some implementations, in detecting the ROI in the image,
control circuit 310 may detect a face in the image and detect an eye on the face. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may enable IR projector 320 to project an IR light toward the face responsive to the eye being closed or blinked. Additionally, control circuit 310 may disable IR projector 320 to cease projecting the IR light toward the face responsive to the eye being open. - In some implementations, in detecting the ROI in the image,
control circuit 310 may detect a face in the image, detect an eye on the face, and monitor a pupil of the eye. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to an enlargement of the pupil. - In some implementations, in detecting the ROI in the image,
control circuit 310 may detect a face in the image and determine a distance of the face from the one or more image sensors based on a size of the face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to the distance of the face being lower than a distance threshold. - In some implementations,
control circuit 310 may detect a brightness level of an ambient light (e.g., based on data received from any of first sensor 330, second sensor 340 and third sensor 350, or by receiving sensing data from a light sensor (not shown)). Moreover, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light responsive to the brightness level of the ambient light being lower than a brightness threshold. - In some implementations, in receiving the data of the image,
control circuit 310 may receive data of a depth map of the ROI. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, control circuit 310 may determine a respective confidence level for each of a plurality of regions of the depth map. Furthermore, control circuit 310 may enable IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the depth map the respective confidence level of which is lower than a confidence threshold. - In some implementations, in adaptively controlling the light projector with respect to projecting the light toward the ROI,
control circuit 310 may perform some operations. For instance, control circuit 310 may determine a respective amount of texture for each of a plurality of regions of the ROI in the image. Moreover, control circuit 310 may enable IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the ROI the respective amount of texture of which is lower than a texture threshold. - In some implementations, in adaptively controlling the light projector with respect to projecting the light toward the ROI,
control circuit 310 may perform some other operations. For instance, control circuit 310 may detect a power level of a battery that powers the light projector. Additionally, control circuit 310 may control IR projector 320 to reduce an amount of power in projecting an IR light toward the ROI responsive to the power level of the battery being lower than a power threshold. -
FIG. 4 illustrates an example process 400 in accordance with an implementation of the present disclosure. Process 400 may be an example implementation of the various procedures, scenarios, schemes, solutions, concepts and techniques, or a combination thereof, whether partially or completely, with respect to adaptive IR projection control for depth estimation in computer vision in accordance with the present disclosure. Process 400 may represent an aspect of implementation of features of apparatus 300. Process 400 may include one or more operations, actions, or functions as illustrated by one or more of blocks 410, 420 and 430. Although illustrated as discrete blocks, various blocks of process 400 may be divided into additional blocks, combined into fewer blocks, or eliminated, depending on the desired implementation. Moreover, the blocks of process 400 may be executed in the order shown in FIG. 4 or, alternatively, in a different order. Furthermore, one or more of the blocks of process 400 may be repeated one or more times. Process 400 may be implemented by apparatus 300 or any variation thereof. Solely for illustrative purposes and without limitation, process 400 is described below in the context of apparatus 300. Process 400 may begin at block 410. - At 410,
process 400 may involve control circuit 310 receiving data of an image based on sensing by one or more of first sensor 330, second sensor 340 and third sensor 350. Process 400 may proceed from 410 to 420. - At 420,
process 400 may involve control circuit 310 detecting a region of interest (ROI) in the image. Process 400 may proceed from 420 to 430. - At 430,
process 400 may involve control circuit 310 adaptively controlling light projector 320 with respect to projecting light toward the ROI. - In some implementations, in receiving the data of the image,
process 400 may involve control circuit 310 receiving data of an IR image, an RGB image or a depth map. - In some implementations, in detecting the ROI in the image,
process 400 may involve control circuit 310 detecting a face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the face during a first portion of a sensor readout time when the one or more image sensors read a portion of the image where the ROI is located. Moreover, process 400 may involve control circuit 310 disabling IR projector 320 to cease projecting the IR light toward the face during a second portion of the sensor readout time when the one or more image sensors read another portion of the image where the ROI is not located. In some implementations, in enabling IR projector 320 to project the IR light toward the face during the first portion of the sensor readout time when the one or more image sensors read the portion of the image where the ROI is located, process 400 may involve control circuit 310 disabling the light projector to cease projecting the IR light when the one or more image sensors read a portion of the image where an eye of the face is located. - In some implementations, in detecting the ROI in the image,
process 400 may involve control circuit 310 detecting a face in the image and detecting an eye on the face. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the face responsive to the eye being closed or blinked. Additionally, process 400 may involve control circuit 310 disabling IR projector 320 to cease projecting the IR light toward the face responsive to the eye being open. - In some implementations, in detecting the ROI in the image,
process 400 may involve control circuit 310 detecting a face in the image, detecting an eye on the face, and monitoring a pupil of the eye. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to an enlargement of the pupil. - In some implementations, in detecting the ROI in the image,
process 400 may involve control circuit 310 detecting a face in the image and determining a distance of the face from the one or more image sensors based on a size of the face in the image. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the face responsive to the distance of the face being lower than a distance threshold. - In some implementations,
process 400 may involve control circuit 310 detecting a brightness level of an ambient light. Moreover, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light responsive to the brightness level of the ambient light being lower than a brightness threshold. - In some implementations, in receiving the data of the image,
process 400 may involve control circuit 310 receiving data of a depth map of the ROI. In such cases, in adaptively controlling the light projector with respect to projecting the light toward the ROI, process 400 may involve control circuit 310 determining a respective confidence level for each of a plurality of regions of the depth map. Furthermore, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the depth map the respective confidence level of which is lower than a confidence threshold. - In some implementations, in adaptively controlling the light projector with respect to projecting the light toward the ROI,
process 400 may involve control circuit 310 performing some operations. For instance, process 400 may involve control circuit 310 determining a respective amount of texture for each of a plurality of regions of the ROI in the image. Moreover, process 400 may involve control circuit 310 enabling IR projector 320 to project an IR light toward the ROI for each of one or more regions of the plurality of regions of the ROI the respective amount of texture of which is lower than a texture threshold. - In some implementations, in adaptively controlling the light projector with respect to projecting the light toward the ROI,
process 400 may involve control circuit 310 performing some other operations. For instance, process 400 may involve control circuit 310 detecting a power level of a battery that powers the light projector. Additionally, process 400 may involve control circuit 310 controlling IR projector 320 to reduce an amount of power in projecting an IR light toward the ROI responsive to the power level of the battery being lower than a power threshold. - The herein-described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
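The adaptive-control conditions described above (face proximity, ambient brightness, battery level, and per-region depth-map confidence or texture) can be summarized as a decision procedure. The sketch below is purely illustrative, not the patented implementation: all function names, threshold values, and power levels are assumptions chosen for the example, since the disclosure does not specify concrete values.

```python
# Illustrative thresholds and power levels; the disclosure does not
# specify concrete values, so these are assumptions for the sketch.
DISTANCE_THRESHOLD_CM = 30.0   # face closer than this -> reduce power
BRIGHTNESS_THRESHOLD = 50      # ambient light dimmer than this -> reduce power
CONFIDENCE_THRESHOLD = 0.6     # depth-map confidence below this -> project IR
TEXTURE_THRESHOLD = 0.2        # region texture below this -> project IR
BATTERY_THRESHOLD = 0.15       # battery fraction below this -> reduce power

FULL_POWER = 1.0
REDUCED_POWER = 0.4            # assumed reduced-power level


def projector_power(face_distance_cm, ambient_brightness, battery_level):
    """Return a power scale for the IR projector (1.0 = full power).

    Power is reduced when a detected face is closer than the distance
    threshold, when the ambient light is dimmer than the brightness
    threshold, or when the battery level is below the power threshold,
    mirroring the three power-reduction conditions in the description.
    """
    power = FULL_POWER
    if face_distance_cm is not None and face_distance_cm < DISTANCE_THRESHOLD_CM:
        power = min(power, REDUCED_POWER)
    if ambient_brightness < BRIGHTNESS_THRESHOLD:
        power = min(power, REDUCED_POWER)
    if battery_level < BATTERY_THRESHOLD:
        power = min(power, REDUCED_POWER)
    return power


def regions_needing_ir(region_confidences, region_textures):
    """Select ROI regions for which IR projection should be enabled.

    A region qualifies when its depth-map confidence level or its
    amount of texture falls below the respective threshold, i.e. where
    passive depth estimation is likely to be unreliable.
    """
    return [
        idx
        for idx, (conf, tex) in enumerate(zip(region_confidences, region_textures))
        if conf < CONFIDENCE_THRESHOLD or tex < TEXTURE_THRESHOLD
    ]
```

For example, a face detected 20 cm from the sensors yields the reduced power level, while a distant face under bright ambient light with a full battery keeps the projector at full power; only the low-confidence or low-texture regions of the depth map are selected for projection.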
- Further, with respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
- Moreover, it will be understood by those skilled in the art that, in general, terms used herein, and especially in the appended claims, e.g., bodies of the appended claims, are generally intended as “open” terms, e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc. It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to implementations containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an,” e.g., “a” and/or “an” should be interpreted to mean “at least one” or “one or more;” the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number, e.g., the bare recitation of “two recitations,” without other modifiers, means at least two recitations, or two or more recitations. 
Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention, e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc. In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention, e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc. It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
- From the foregoing, it will be appreciated that various implementations of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made without departing from the scope and spirit of the present disclosure. Accordingly, the various implementations disclosed herein are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Claims (20)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/354,552 US20190306441A1 (en) | 2018-04-03 | 2019-03-15 | Method And Apparatus Of Adaptive Infrared Projection Control |
CN201910262312.2A CN110351543A (en) | 2018-04-03 | 2019-04-02 | The method and device of the infrared line projection's control of adaptability |
TW108111654A TWI735858B (en) | 2018-04-03 | 2019-04-02 | Method and apparatus of adaptive infrared projection control |
US17/384,963 US11570381B2 (en) | 2018-04-03 | 2021-07-26 | Method and apparatus of adaptive infrared projection control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862651815P | 2018-04-03 | 2018-04-03 | |
US16/354,552 US20190306441A1 (en) | 2018-04-03 | 2019-03-15 | Method And Apparatus Of Adaptive Infrared Projection Control |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/384,963 Division US11570381B2 (en) | 2018-04-03 | 2021-07-26 | Method and apparatus of adaptive infrared projection control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190306441A1 true US20190306441A1 (en) | 2019-10-03 |
Family
ID=68057444
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/354,552 Abandoned US20190306441A1 (en) | 2018-04-03 | 2019-03-15 | Method And Apparatus Of Adaptive Infrared Projection Control |
US17/384,963 Active US11570381B2 (en) | 2018-04-03 | 2021-07-26 | Method and apparatus of adaptive infrared projection control |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/384,963 Active US11570381B2 (en) | 2018-04-03 | 2021-07-26 | Method and apparatus of adaptive infrared projection control |
Country Status (3)
Country | Link |
---|---|
US (2) | US20190306441A1 (en) |
CN (1) | CN110351543A (en) |
TW (1) | TWI735858B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113711229A (en) * | 2019-05-31 | 2021-11-26 | Oppo广东移动通信有限公司 | Control method of electronic device, and computer-readable storage medium |
US20220114743A1 (en) * | 2019-06-24 | 2022-04-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method and apparatus, and computer-readable non-transitory storage medium |
EP4109871A4 (en) * | 2020-02-21 | 2023-03-29 | NEC Corporation | Biometric authentication device, biometric authentication method, and computer-readable medium storing program therefor |
EP4206605A1 (en) * | 2021-12-28 | 2023-07-05 | Datalogic IP Tech S.r.l. | Controllable laser pattern for eye safety and reduced power consumption for image capture devices |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022006739A1 (en) * | 2020-07-07 | 2022-01-13 | 深圳市锐明技术股份有限公司 | Control method, control apparatus, and infrared camera |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160057365A1 (en) * | 2014-08-21 | 2016-02-25 | Nintendo Co., Ltd. | Information processing apparatus, information processing system, computer-readable non-transitory storage medium having stored therein information processing program, and information processing method |
US20160088241A1 (en) * | 2014-09-24 | 2016-03-24 | Samsung Electronics Co., Ltd. | Method for performing user authentication and electronic device thereof |
US20170132466A1 (en) * | 2014-09-30 | 2017-05-11 | Qualcomm Incorporated | Low-power iris scan initialization |
US20180032813A1 (en) * | 2016-07-29 | 2018-02-01 | Samsung Electronics Co., Ltd. | Electronic device including iris camera |
US20190089939A1 (en) * | 2017-09-18 | 2019-03-21 | Intel Corporation | Depth sensor optimization based on detected distance |
US20190279398A1 (en) * | 2018-02-27 | 2019-09-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control method, control device, terminal and computer device |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7430365B2 (en) * | 2005-03-31 | 2008-09-30 | Avago Technologies Ecbu (Singapore) Pte Ltd. | Safe eye detection |
US7568802B2 (en) * | 2007-05-09 | 2009-08-04 | Honeywell International Inc. | Eye-safe near infra-red imaging illumination method and system |
CN102103259B (en) * | 2009-12-18 | 2012-11-21 | 深圳市巨龙科教高技术股份有限公司 | Device and method for preventing strong light irradiation of projector |
JP5212927B2 (en) * | 2011-01-25 | 2013-06-19 | 株式会社デンソー | Face shooting system |
TW201329508A (en) * | 2012-01-04 | 2013-07-16 | Walsin Lihwa Corp | Device and method for protecting eyes |
CN105765558A (en) * | 2013-09-03 | 2016-07-13 | 醒眸行有限公司 | Low power eye tracking system and method |
US20160275348A1 (en) * | 2015-03-17 | 2016-09-22 | Motorola Mobility Llc | Low-power iris authentication alignment |
US20160283789A1 (en) * | 2015-03-25 | 2016-09-29 | Motorola Mobility Llc | Power-saving illumination for iris authentication |
US20170061210A1 (en) * | 2015-08-26 | 2017-03-02 | Intel Corporation | Infrared lamp control for use with iris recognition authentication |
US20180189547A1 (en) * | 2016-12-30 | 2018-07-05 | Intel Corporation | Biometric identification system |
CN106803284B (en) * | 2017-01-11 | 2021-03-23 | 北京旷视科技有限公司 | Method and device for constructing three-dimensional image of face |
CN107358175B (en) * | 2017-06-26 | 2020-11-24 | Oppo广东移动通信有限公司 | Iris collection method and electronic device |
CN107463877A (en) * | 2017-07-05 | 2017-12-12 | 广东欧珀移动通信有限公司 | Method for collecting iris, electronic installation and computer-readable recording medium |
CN107451561A (en) * | 2017-07-31 | 2017-12-08 | 广东欧珀移动通信有限公司 | Iris recognition light compensation method and device |
-
2019
- 2019-03-15 US US16/354,552 patent/US20190306441A1/en not_active Abandoned
- 2019-04-02 CN CN201910262312.2A patent/CN110351543A/en not_active Withdrawn
- 2019-04-02 TW TW108111654A patent/TWI735858B/en active
-
2021
- 2021-07-26 US US17/384,963 patent/US11570381B2/en active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160057365A1 (en) * | 2014-08-21 | 2016-02-25 | Nintendo Co., Ltd. | Information processing apparatus, information processing system, computer-readable non-transitory storage medium having stored therein information processing program, and information processing method |
US9826173B2 (en) * | 2014-08-21 | 2017-11-21 | Nintendo Co., Ltd. | Information processing apparatus, information processing system, computer-readable non-transitory storage medium having stored therein information processing program, and information processing method |
US20160088241A1 (en) * | 2014-09-24 | 2016-03-24 | Samsung Electronics Co., Ltd. | Method for performing user authentication and electronic device thereof |
US9875551B2 (en) * | 2014-09-24 | 2018-01-23 | Samsung Electronics Co., Ltd | Method for performing user authentication and electronic device thereof |
US20170132466A1 (en) * | 2014-09-30 | 2017-05-11 | Qualcomm Incorporated | Low-power iris scan initialization |
US20180032813A1 (en) * | 2016-07-29 | 2018-02-01 | Samsung Electronics Co., Ltd. | Electronic device including iris camera |
US10430651B2 (en) * | 2016-07-29 | 2019-10-01 | Samsung Electronics Co., Ltd. | Electronic device including iris camera |
US20190089939A1 (en) * | 2017-09-18 | 2019-03-21 | Intel Corporation | Depth sensor optimization based on detected distance |
US20190279398A1 (en) * | 2018-02-27 | 2019-09-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control method, control device, terminal and computer device |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113711229A (en) * | 2019-05-31 | 2021-11-26 | Oppo广东移动通信有限公司 | Control method of electronic device, and computer-readable storage medium |
US20220148214A1 (en) * | 2019-05-31 | 2022-05-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control Method for Electronic Device, Electronic Device and Computer Readable Storage Medium |
US11836956B2 (en) * | 2019-05-31 | 2023-12-05 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control method for electronic device, electronic device and computer readable storage medium |
US20220114743A1 (en) * | 2019-06-24 | 2022-04-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method and apparatus, and computer-readable non-transitory storage medium |
EP4109871A4 (en) * | 2020-02-21 | 2023-03-29 | NEC Corporation | Biometric authentication device, biometric authentication method, and computer-readable medium storing program therefor |
EP4206605A1 (en) * | 2021-12-28 | 2023-07-05 | Datalogic IP Tech S.r.l. | Controllable laser pattern for eye safety and reduced power consumption for image capture devices |
US11869206B2 (en) | 2021-12-28 | 2024-01-09 | Datalogic Ip Tech S.R.L. | Controllable laser pattern for eye safety and reduced power consumption for image capture devices |
Also Published As
Publication number | Publication date |
---|---|
TWI735858B (en) | 2021-08-11 |
TW201942797A (en) | 2019-11-01 |
US20210352227A1 (en) | 2021-11-11 |
US11570381B2 (en) | 2023-01-31 |
CN110351543A (en) | 2019-10-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11570381B2 (en) | Method and apparatus of adaptive infrared projection control | |
US10956736B2 (en) | Methods and apparatus for power-efficient iris recognition | |
US9736373B2 (en) | Dynamic optimization of light source power | |
US9413939B2 (en) | Apparatus and method for controlling a camera and infrared illuminator in an electronic device | |
WO2017161867A1 (en) | Screen brightness adjustment method and apparatus, and intelligent terminal | |
US9864436B2 (en) | Method for recognizing motion gesture commands | |
EP3490240A1 (en) | Smart flash lamp control method and mobile terminal | |
US11290643B1 (en) | Efficient digital camera image acquisition and analysis | |
US10154198B2 (en) | Power saving techniques for an image capture device | |
US10771766B2 (en) | Method and apparatus for active stereo vision | |
CN111601373B (en) | Backlight brightness control method and device, mobile terminal and storage medium | |
EP3609175B1 (en) | Apparatus and method for generating moving image data including multiple section images in electronic device | |
US20130308835A1 (en) | Mobile Communication Device with Image Recognition and Method of Operation Therefor | |
US10371577B2 (en) | Apparatus and method for measuring temperature in electronic device | |
US11146747B1 (en) | Dynamic driver mechanism for rolling shutter sensor to acquire the structured light pattern | |
EP3854069A1 (en) | Automated camera mode selection | |
WO2023045626A1 (en) | Image acquisition method and apparatus, terminal, computer-readable storage medium and computer program product | |
KR20190001067A (en) | Method and apparatus for speech recognition | |
US20180084178A1 (en) | Smart camera flash system | |
US20200151428A1 (en) | Data Processing Method, Electronic Device and Computer-Readable Storage Medium | |
CN116363722A (en) | Target recognition method, device and storage medium | |
US20210176385A1 (en) | Camera Configuration For Active Stereo Without Image Quality Degradation | |
TWI739041B (en) | Electronic device and control method thereof | |
CA2794067C (en) | Apparatus and method for controlling a camera and infrared illuminator in an electronic device | |
Inoue et al. | Situation-based dynamic frame-rate control for on-line object tracking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MEDIATEK INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHANG, TE-HAO;WANG, CHI-HUI;JU, CHI-CHENG;AND OTHERS;SIGNING DATES FROM 20190329 TO 20190625;REEL/FRAME:049676/0706 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING RESPONSE FOR INFORMALITY, FEE DEFICIENCY OR CRF ACTION |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |