WO2023094572A1 - Illumination device - Google Patents

Illumination device

Info

Publication number
WO2023094572A1
Authority
WO
WIPO (PCT)
Prior art keywords
zone
led
leds
processing circuitry
zones
Application number
PCT/EP2022/083222
Other languages
French (fr)
Inventor
Mario Manninger
Original Assignee
Ams International Ag
Application filed by Ams International Ag
Publication of WO2023094572A1

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/80: Camera processing pipelines; Components thereof
            • H04N 23/60: Control of cameras or camera modules
              • H04N 23/61: Control of cameras or camera modules based on recognised objects
            • H04N 23/70: Circuitry for compensating brightness variation in the scene
              • H04N 23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
            • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
              • H04N 23/958: Computational photography systems for extended depth of field imaging
                • H04N 23/959: Computational photography systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Definitions

Figure 2 depicts use of a device 200 according to an embodiment of the disclosure. The device 200 comprises a multi-zone time-of-flight sensor 205 configured to provide depth information for each zone of a plurality of zones 220-1, 220-2, 220-3. The device 200 also comprises a plurality of LEDs 210-1, 210-2, 210-3, wherein: a first LED 210-1 is configurable to illuminate a first zone 220-1; a second LED 210-2 is configurable to illuminate a second zone 220-2; and a third LED 210-3 is configurable to illuminate a third zone 220-3. The device 200 further comprises processing circuitry 215 configured to control each LED 210-1, 210-2, 210-3 in response to depth information for a corresponding zone 220-1, 220-2, 220-3. Features of the device 200 generally correspond to those of the device 100, and therefore are not described in further detail.

A target 225 is depicted which, for purposes of illustration, is a face. The processing circuitry 215 is configured to determine a depth map of a scene comprising each zone 220-1, 220-2, 220-3 and to control the LEDs 210-1, 210-2, 210-3 based on the determined depth map.

The target 225 is in the first zone 220-1 at a first distance 240-1 from the multi-zone time-of-flight sensor 205. A background may be disposed at a further distance, e.g. at a second distance 240-2 or further from the multi-zone time-of-flight sensor 205. As such, the depth information provided by the time-of-flight sensor 205 would indicate that an object, e.g. the target 225, in the first zone 220-1 is closer, e.g. at the first distance 240-1, than a background which is, for purposes of example only, all at the second distance 240-2. Any background objects are detected in the second zone 220-2 and third zone 220-3 at at least the second distance 240-2 from the multi-zone time-of-flight sensor 205. Alternatively, no object may be detected in the second zone 220-2 and third zone 220-3 if, for example, any background objects are out of a range of operation of the multi-zone time-of-flight sensor 205.

The processing circuitry 215 may determine that only the first zone 220-1 comprises an object within a defined distance, e.g. defined by a threshold, from the multi-zone time-of-flight sensor 205, and may therefore enable only the first LED 210-1 and disable the second LED 210-2 and the third LED 210-3, to predominantly or only illuminate the target 225.

In some embodiments, the processing circuitry 215 may be configured to run a person-detection and/or face-detection algorithm based, at least in part, on the depth information for one or more of the plurality of zones 220-1, 220-2, 220-3. Control of the LEDs 210-1, 210-2, 210-3 may be based upon an output of the person-detection and/or face-detection algorithm, e.g. the first LED 210-1 may only be enabled if it is determined that the object at the first distance 240-1 is a person or a face.

A current or duty cycle provided to the first LED 210-1 may be controlled and/or adjusted as required, e.g. by the processing circuitry 215. For example, a color of the ambient light may be determined, and an appropriate selection of the LEDs 210-1, 210-2, 210-3 and/or of a power supplied to the LEDs 210-1, 210-2, 210-3 may be made to white-balance the scene and/or avoid overexposure. A current or duty cycle provided to the first LED 210-1 may additionally or alternatively be controlled and/or adjusted based, at least in part, on the depth information from the time-of-flight sensor 205. For example, if the first distance 240-1 is determined to be relatively small, then a power provided to the first LED 210-1 may be relatively low, and if the first distance 240-1 is determined to be relatively large, then a power provided to the first LED 210-1 may be relatively high. As such, a desired level of exposure of a scene may be achieved, as sketched below.
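By way of illustration only, the following Python sketch shows one possible realisation of the zone-selection logic described above for the three-zone example of Figure 2. The threshold value, the full-scale current and the linear current scaling are assumptions chosen for the example, not taken from the disclosure.

```python
# Minimal sketch of the Figure 2 logic: enable only LEDs whose zone
# contains an object closer than a threshold, and scale the drive
# current with distance. Threshold and scaling rule are assumptions.

THRESHOLD_M = 1.5       # assumed cut-off separating target from background
MAX_CURRENT_MA = 100.0  # assumed full-scale LED drive current

def led_currents(zone_depths_m):
    """Per-zone LED drive current in mA; entries are metres or None."""
    currents = []
    for depth in zone_depths_m:
        if depth is None or depth >= THRESHOLD_M:
            currents.append(0.0)  # background zone: keep its LED disabled
        else:
            # Nearer targets need less light, so scale current with distance.
            currents.append(MAX_CURRENT_MA * depth / THRESHOLD_M)
    return currents

# Figure 2 scenario: face at 0.5 m in zone 1, background at 3 m elsewhere.
print(led_currents([0.5, 3.0, 3.0]))  # only the first LED is driven
```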
Figure 3a depicts a device 300 according to a further embodiment of the disclosure. The device 300 comprises a multi-zone time-of-flight sensor 305 configured to provide depth information for each zone of a plurality of zones 320-1, 320-2, 320-3. The device 300 comprises a plurality of LEDs 310-1, 310-2, 310-3, wherein: a first LED 310-1 is configurable to illuminate a first zone 320-1; a second LED 310-2 is configurable to illuminate a second zone 320-2; and a third LED 310-3 is configurable to illuminate a third zone 320-3. The device 300 also comprises processing circuitry 315 configured to control each LED 310-1, 310-2, 310-3 in response to depth information for a corresponding zone 320-1, 320-2, 320-3. Features of the device 300 generally correspond to those of the device 200 and therefore are not described in further detail.

The device 300 also comprises an image sensor 350. The image sensor 350 may be, for example, a charge-coupled device or an active pixel sensor, known in the art as a CMOS image sensor. The image sensor 350 is communicably coupled to the processing circuitry 315, which is configured to synchronise operation of the image sensor 350 with the plurality of LEDs 310-1, 310-2, 310-3 to sufficiently expose a scene, target or object for which the image sensor 350 captures an image. As such, the device 300 may be configured as a camera.

The processing circuitry 315 may be configured to identify one or more objects in a scene based, at least in part, on the depth information for one or more of the plurality of zones 320-1, 320-2, 320-3. For example, the processing circuitry 315 may be configured to identify and/or select an object or target based on its determined proximity to the multi-zone time-of-flight sensor 305 and/or based on one or more images captured by the image sensor 350. The processing circuitry 315 may also be configured to execute a person-detection and/or face-detection algorithm that may be based upon one or more images captured by the image sensor 350.

The image sensor 350 may be configured to capture video, and the processing circuitry 315 may be configured to control the plurality of LEDs 310-1, 310-2, 310-3 to substantially only illuminate one or more selected targets in the captured video. This is depicted in Figures 3a, 3b and 3c, which correspond to use of the device 300 at a first, second and third time in a video sequence respectively.

In Figure 3a, the processing circuitry 315 determines, based on depth map information from the time-of-flight sensor 305, that an object 325 is in the first zone 320-1, and therefore that the first zone 320-1 is to be illuminated. For example, the depth map information may indicate that the object 325 stands out from a background, e.g. is relatively close at a first distance 340-1 compared to a relatively distant background at or beyond a second distance 340-2. The processing circuitry 315 may perform other or additional algorithms on the depth map information, such as person or facial recognition algorithms or other object detection algorithms.

In Figure 3b, the object 325 has moved relative to the device 300, and now resides in the second zone 320-2. The processing circuitry 315 determines that the object 325 is in the second zone 320-2 and therefore that only the second zone 320-2 is to be illuminated, by the second LED 310-2. In Figure 3c, the object 325 has moved again relative to the device 300, and now resides in the third zone 320-3. The processing circuitry 315 likewise determines that only the third zone 320-3 is to be illuminated.

In this manner, the plurality of LEDs 310-1, 310-2, 310-3 may be configured to illuminate substantially only the object 325. That is, movement of the object 325 relative to the device 300 may be effectively tracked using the multi-zone time-of-flight sensor 305, and only LEDs 310-1, 310-2, 310-3 corresponding to the zones 320-1, 320-2, 320-3 in which the object 325 is detected in any given frame of the video sequence may be illuminated, as sketched below.
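A minimal sketch of such frame-by-frame tracking is given below, assuming a hypothetical per-frame depth read-out. The threshold, the 0.2 m grouping margin and the nearest-zone rule are illustrative assumptions, not part of the disclosure.

```python
# Illustrative per-frame tracking loop for the Figures 3a-3c scenario:
# in each video frame, only the LED for the zone containing the nearest
# object is enabled, so the moving object 325 stays illuminated.

TRACK_THRESHOLD_M = 1.5  # assumed foreground cut-off

def led_mask_for_frame(zone_depths_m):
    """Return a per-zone on/off mask, enabling only the nearest zone(s)."""
    candidates = [d for d in zone_depths_m
                  if d is not None and d < TRACK_THRESHOLD_M]
    if not candidates:
        return [False] * len(zone_depths_m)  # nothing in range: all LEDs off
    nearest = min(candidates)
    # Enable every zone close to the nearest reading (object may span zones).
    return [d is not None and d < nearest + 0.2 for d in zone_depths_m]

# Three successive frames: object moves from zone 1 to zone 2 to zone 3.
for frame in ([0.6, 2.8, 3.0], [2.9, 0.7, 3.0], [3.0, 2.9, 0.8]):
    print(led_mask_for_frame(frame))
# -> [True, False, False], then [False, True, False], then [False, False, True]
```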
Figure 4 depicts a device 400 implemented in a portable apparatus 455, according to an embodiment of the disclosure. Features of the device 400 generally correspond to those of the device 300 of Figures 3a to 3c. The device 400 comprises a multi-zone time-of-flight sensor 405 configured to provide depth information for each zone of a plurality of zones 420-1, 420-2, 420-3, and a plurality of LEDs, wherein: a first LED 410-1 is configurable to illuminate a first zone 420-1; a second LED 410-2 is configurable to illuminate a second zone 420-2; and a third LED 410-3 is configurable to illuminate a third zone 420-3. The device 400 further comprises processing circuitry 415 configured to control each LED 410-1, 410-2, 410-3 in response to depth information for a corresponding zone 420-1, 420-2, 420-3, and an image sensor 450.

The portable apparatus 455 may be a battery-operated apparatus, and therefore may advantageously benefit from the extended battery life that the disclosed low-power illumination device 400 may enable. The portable apparatus 455 may, for example, be any of: a smartphone; a tablet device; a laptop computer; a games system; a digital camera; or any other portable communications device.

The device 400 comprises an optical element 460 configured to direct illumination from each LED 410-1, 410-2, 410-3 towards a corresponding zone 420-1, 420-2, 420-3 of the plurality of zones. The optical element 460 may be configured to increase a field of illumination of each LED 410-1, 410-2, 410-3. In some embodiments, the optical element 460 may comprise an optical apparatus comprising a plurality of optical elements, and the device 400 may be provided as an optical module comprising the one or more optical elements. The optical element 460 may comprise, for example: a concave or convex lens; a Fresnel lens; a microlens array; and/or a metalens.
Figure 5 is a high-level diagram depicting use of a device according to an embodiment of the disclosure, which will be described with reference to the method 600 of Figure 6.

A multi-zone time-of-flight sensor 505 is depicted. The time-of-flight sensor 505 is configured to capture depth information, which may correspond to a depth map, or a 3-dimensional image, of a scene. The multi-zone time-of-flight sensor 505 is communicably coupled to processing circuitry (not shown in Figure 5), e.g. the processing circuitry 115, 215, 315, 415 of Figures 1 to 4.

The processing circuitry is configured to identify one or more objects in a scene based on the depth information from the time-of-flight sensor 505. In the depicted example, a person 525 is identified as being distinct from a background, e.g. the tree 530. The person 525 may be identified by the processing circuitry as being distinct from the tree 530 because the depth information indicates that the person 525 is substantially closer to the time-of-flight sensor 505 than the tree 530.

The processing circuitry is configured to control an LED matrix 510. In the example of Figure 5, an LED driver IC 515 is depicted; the processing circuitry may control the LED driver IC 515, thereby controlling the LED matrix 510. The LED matrix 510 is selectively driven to provide a flash illumination, and an image sensor (not shown), such as a CMOS imager, captures an image. That is, a subset of the available LEDs of the LED matrix 510 is driven to selectively illuminate the person 525, while not illuminating the background tree 530. The processing circuitry may be configured to post-process the captured image. Such post-processing may comprise adaptation or correction of the captured image, or other modifications such as white-balancing of the captured image.

The example LED matrix 510 of Figure 5 comprises a 12 x 10 array of LEDs. Only a small subset 550 of the LEDs, e.g. a 4 x 4 array, is enabled to illuminate the person 525, and the remaining LEDs remain disabled to save power. The plurality of LEDs forming the LED matrix 510 may be individually addressable, or addressable in groups, such that LEDs corresponding to zones sensed by the multi-zone time-of-flight sensor 505 may be enabled or disabled independently.

At least two LEDs may be configurable to illuminate each zone, and each at least two LEDs may comprise a relatively warm white LED and a relatively cool white LED. The example LED matrix 510 of Figure 5 comprises a distribution of warm white LEDs 540 and cool white LEDs 545. The processing circuitry may be configured to determine a number of warm white LEDs 540 and cool white LEDs 545 to enable for each zone, and/or a current or duty-cycle to provide to the LEDs 540, 545. In some embodiments, an ambient light sensor and/or an image sensor, e.g. the image sensor 450, may sense a color of the ambient light, and a number of relatively warm white LEDs 540 and relatively cool white LEDs 545 may be selected to white-balance a scene.

The device further comprises an optical element 560 configured to direct illumination from each LED of the LED matrix 510 towards corresponding zones sensed by the multi-zone time-of-flight sensor 505.
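Purely as an illustration of how per-zone decisions could drive such a matrix through a driver IC, the sketch below expands a foreground mask over an assumed 3 x 3 zone grid into an enable pattern for the 12 x 10 matrix. The `LedDriver` wrapper, the zone grid size and the block mapping are hypothetical, not part of the disclosure.

```python
# Illustrative expansion of a per-zone foreground mask into a 12 x 10
# LED-matrix enable pattern, written to a hypothetical driver IC wrapper.

LED_ROWS, LED_COLS = 10, 12   # the example matrix 510 of Figure 5
ZONE_ROWS, ZONE_COLS = 3, 3   # assumed time-of-flight zone grid

def zone_mask_to_led_bitmap(zone_mask):
    """zone_mask: ZONE_ROWS x ZONE_COLS booleans -> LED_ROWS x LED_COLS."""
    return [[zone_mask[r * ZONE_ROWS // LED_ROWS][c * ZONE_COLS // LED_COLS]
             for c in range(LED_COLS)]
            for r in range(LED_ROWS)]

class LedDriver:                     # hypothetical driver-IC abstraction
    def write_bitmap(self, bitmap):  # e.g. one enable bit per matrix LED
        for row in bitmap:
            print("".join("#" if on else "." for on in row))

# Only the centre zone, containing the person 525, is foreground.
mask = [[False, False, False],
        [False, True,  False],
        [False, False, False]]
LedDriver().write_bitmap(zone_mask_to_led_bitmap(mask))
```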
Figure 7 depicts a method 700 of illuminating a target or object within a scene. The method 700 comprises configuring a multi-zone time-of-flight sensor to provide depth information for each zone of a plurality of zones. The multi-zone time-of-flight sensor may correspond to the multi-zone time-of-flight sensor 105, 205, 305, 405, 505 as described above with reference to Figures 1, 2, 3a to 3c, 4 and 5. The method 700 further comprises controlling a plurality of LEDs to illuminate one or more of the zones in response to the corresponding depth information. The plurality of LEDs may correspond to the plurality of LEDs 110-1, 110-2, 110-3, 210-1, 210-2, 210-3, 310-1, 310-2, 310-3, 410-1, 410-2, 410-3, 510 as described above with reference to Figures 1, 2, 3a to 3c, 4 and 5.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A device (100, 200, 300, 400) is disclosed, wherein the device comprises a multi-zone time-of-flight sensor (105, 205, 305, 405, 505) configured to provide depth information for each zone of a plurality of zones (120, 220, 320, 420). The device also comprises a plurality of LEDs (110, 210, 310, 410), wherein at least one LED is configurable to illuminate each zone. The device also comprises processing circuitry (115, 215, 315, 415) configured to control each at least one LED in response to depth information for a corresponding zone. The device is suitable for illuminating a target to be imaged by an imaging device, in particular on a portable apparatus (455) such as a smartphone. A method (700) of illuminating a target or object within a scene is also disclosed.

Description

ILLUMINATION DEVICE
TECHNICAL FIELD OF THE DISCLOSURE
The present disclosure is in the field of devices for illuminating a target to be imaged by an imaging device, e.g. a camera. The disclosure relates, in particular, to operation of such a device in battery-operated portable apparatus, such as a smartphone, laptop computer, games system or tablet device.
BACKGROUND
Imaging devices such as cameras, and in particular portable devices comprising such imaging devices, e.g. smartphones, tablet devices, games systems and laptop computers, may comprise devices for illuminating a scene.
For example, some cameras are known to comprise an illumination device, known in the art as a ‘flash’, for illuminating a scene, particularly in poor lighting conditions. A flash may for example comprise one or more bulbs or Light Emitting Diodes (LEDs) and may be synchronised with the imaging device to illuminate a scene at a precise time of image capture. Operation of a flash may depend upon ambient lighting conditions. In low ambient light conditions, the flash may be enabled during an image capture. In high ambient light conditions, the flash may remain disabled during an image capture to avoid overexposure of a scene.
Portable devices, and in particular smartphones, are increasingly used for photographic and video purposes. Components of such portable devices may have stringent power consumption requirements, to optimise or maximise a battery life of the portable device. However, operation of an illumination device, e.g. a flash, may consume a substantial amount of power. Prolonged or repeated use of an illumination device may accelerate depletion of charge in a battery in a portable device, thereby adversely affecting a user-experience of the portable device.
Furthermore, while an illumination device may provide additional illumination for capturing images in low lighting conditions, such a device may also affect an image, for example by casting shadows attributable to the illumination device rather than to ambient lighting or any other light source. Such shadows may adversely affect image quality of a background to a target being imaged. It is therefore desirable to provide a low-power means to sufficiently illuminate a scene for an imaging device to ensure adequate image quality. It is desirable that such means are suitable for use in a portable battery-operated apparatus such as a smartphone.
It is therefore an aim of at least one embodiment of at least one aspect of the present disclosure to obviate or at least mitigate at least one of the above identified shortcomings of the prior art.
SUMMARY
The present disclosure is in the field of devices for illuminating a target to be imaged by an imaging device, e.g. a camera. The disclosure relates, in particular, to operation of such a device in battery-operated portable apparatus, e.g. a smartphone, laptop computer, tablet device, games system or the like.
According to a first aspect of the disclosure, there is provided a device comprising: a multi-zone time-of-flight sensor configured to provide depth information for each zone of a plurality of zones; a plurality of LEDs, wherein at least one LED is configurable to illuminate each zone; and processing circuitry configured to control each at least one LED in response to depth information for a corresponding zone.
Advantageously, by controlling each at least one LED in response to, i.e. based on, depth information for the corresponding zone, a selection may be readily made of which zones to illuminate using the plurality of LEDs. By illuminating only selected zones, an overall power consumption of the plurality of LEDs may be reduced, or minimised.
For example, the disclosed device may be implemented in an imaging device such as a camera on a smartphone, which may be used to photograph a target, e.g. a person in a self-portrait photograph. The processing circuitry of the disclosed device may detect which of the plurality of zones the target resides within based on the depth information for each zone, and thereby only enable LEDs corresponding to the zones within which the target resides. As such, other zones are not illuminated, thereby reducing an overall power consumption of the plurality of LEDs. That is, only the target being photographed may be illuminated, thereby avoiding wasting power unnecessarily illuminating a background.
The processing circuitry may be configured to determine a depth map, e.g. a 3-dimensional representation, of a scene based on the depth information for each zone of the plurality of zones. The processing circuitry may be configured to control each at least one LED based on the determined depth map.
Advantageously, a multi-zone time-of-flight sensor may provide an efficient, low-cost and compact means to associate a zone within a scene to one or more corresponding LEDs. Advantageously, use of a multi-zone time-of-flight sensor may provide an efficient mapping between LEDs and zones, thereby simplifying software overhead.
Advantageously, a resolution of the multi-zone time-of-flight sensor may correspond to a resolution of the plurality of LEDs, e.g. of a matrix of LEDs, thereby providing a direct mapping between a zone of the time-of-flight sensor and one or more of the plurality of LEDs. In an example, a 1:1 mapping between zones of the time-of-flight sensor and LEDs may be implemented. In other examples, a plurality of LEDs may be configured to illuminate each zone.
The term “multi-zone time-of-flight sensor” refers to a sensor configurable to sense a distance to an object, e.g. a proximity, in each of a plurality of zones. The “multi-zone time-of-flight sensor” may be known in the art as a ‘multi-zone sensor’, a ‘multi-zone proximity sensor’, a ‘multizone sensor’, or the like.
In some examples, the multi-zone time-of-flight sensor may be capable of multi-object detection in each zone, e.g. capable of producing a corresponding histogram with a plurality of peaks in each zone.
As an example, a multi-zone time-of-flight sensor may sense in an array of zones such as, for example, a 2x2, 3x3, 4x4 or an even larger array of zones, such as 30x20 or larger. Each zone of the plurality of zones may define an area or field sensed by the time-of-flight sensor. Each area or field may be distinct, or may at least partially overlap an area or field of a zone defined by at least one adjacent zone.
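As a purely illustrative sketch of such a mapping (the grid sizes are assumptions chosen for the example), the snippet below maps a zone of an R x C time-of-flight grid either one-to-one to a single LED or to a rectangular block of LEDs in a larger matrix:

```python
# Illustrative mapping from time-of-flight zones to LEDs. With equal
# resolutions the mapping is 1:1; with a larger LED matrix each zone
# maps to a rectangular block of LEDs.

def leds_for_zone(zr, zc, zone_shape, led_shape):
    """Return (row, col) indices of all LEDs illuminating zone (zr, zc)."""
    zrows, zcols = zone_shape
    lrows, lcols = led_shape
    return [(r, c)
            for r in range(zr * lrows // zrows, (zr + 1) * lrows // zrows)
            for c in range(zc * lcols // zcols, (zc + 1) * lcols // zcols)]

print(leds_for_zone(1, 1, (3, 3), (3, 3)))  # 1:1 mapping -> [(1, 1)]
print(leds_for_zone(1, 1, (3, 3), (9, 9)))  # block mapping -> a 3x3 LED block
```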
Controlling each at least one LED may comprise controlling a driver circuit, or discrete driver IC (Integrated Circuit) to enable/disable each at least one LED.
Controlling each at least one LED may comprise controlling and/or adjusting a current and/or duty cycle of a signal to each at least one LED. The processing circuitry may be configured to identify one or more objects in a scene based, at least in part, on the depth information for one or more of the plurality of zones.
Advantageously, the processing circuitry may identify one or more objects or targets using a depth map that is determined based upon the depth information for one or more of the plurality of zones. For example, the processing circuitry may run one or more algorithms for detection and/or identification and/or determination of an object or target.
The processing circuitry may be configured to identify and/or select an object or target based on its determined proximity to the multi-zone time-of-flight sensor and/or based on one or more images captured by an image sensor.
The processing circuitry may be configured to control each at least one LED to predominantly or only illuminate the one or more objects.
The processing circuitry may be configured to control each at least one LED to only illuminate zones corresponding to the one or more objects.
Advantageously, the processing circuitry may be configured to distinguish objects or targets that are relatively close to the multi-zone time-of-flight sensor from a background in a scene that may be relatively far from the multi-zone time-of-flight sensor.
Advantageously, the processing circuitry may be configured to control the plurality of LEDs to illuminate only the one or more objects, e.g. enable LEDs within zones in which the one or more objects are detected, thereby saving power compared to enabling all of the plurality of LEDs. Furthermore, the processing circuitry may be configured to adjust a power of each LED based on the depth information, e.g. a depth map, to avoid overexposure of a target or object. That is, even within LEDs that are enabled to illuminate a target or object, a power to each LED may be controlled based on a depth map to more accurately control exposure of the object.
The processing circuitry may be configured to run a person-detection and/or face-detection algorithm based, at least in part, on the depth information for one or more of the plurality of zones.
In some examples, such a person-detection and/or face-detection algorithm may be additionally based upon one or more images captured by an image sensor.
Advantageously, by performing such person or face detection, the processing circuitry may be configured to control the plurality of LEDs to illuminate only a detected person or face, and avoid wasting power illuminating a background or other portions of a scene.
The processing circuitry may be configured to control each at least one LED to predominantly or only illuminate one or more detected persons or faces.
The processing circuitry may be configured to control each at least one LED to only illuminate zones corresponding to the one or more detected persons or faces.
That is, advantageously, the processing circuitry may be configured to control the plurality of LEDs to illuminate only a detected person or face, e.g. enable LEDs within zones in which the person or face is detected, thereby saving power compared to enabling all of the plurality of LEDs.
The processing circuitry may be configured to control the plurality of LEDs by enabling each at least one LED when depth information for the corresponding zone(s) indicates a distance to a target is below a threshold.
The processing circuitry may be configured to control the plurality of LEDs by disabling each at least one LED when depth information for the corresponding zone(s) indicates a distance to a target is at or above the threshold.
Advantageously, by implementing such a threshold, the plurality of LEDs may be controlled to illuminate only zones comprising a target relatively close to the multi-zone time-of-flight sensor, while avoiding illuminating zones relatively far from the multi-zone time-of-flight sensor, thereby saving power by only enabling a subset of the plurality of LEDs.
The threshold may depend upon a detected ambient light level.
In some examples, the ambient light level may be sensed by one or more ambient light sensors. In some examples, the ambient light level may be sensed by an image sensor. The one or more ambient light sensors may be communicably coupled to the processing circuitry.
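One way such an ambient-light-dependent threshold might be realised is sketched below; the lux break-points and threshold distances are invented for illustration and would need tuning for a real device.

```python
# Illustrative choice of the enable/disable distance threshold as a
# function of measured ambient light (brighter scenes need less flash
# reach, so the threshold can shrink). Break-points are assumptions.

def distance_threshold_m(ambient_lux):
    if ambient_lux < 10:      # near darkness: illuminate more distant targets
        return 3.0
    elif ambient_lux < 200:   # dim indoor lighting
        return 1.5
    else:                     # bright scene: flash rarely needed
        return 0.5

# Target at 1.2 m in dim indoor light: below the 1.5 m threshold, so enable.
led_on = 1.2 < distance_threshold_m(ambient_lux=50)  # -> True
```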
The device may comprise an image sensor configured to capture one or more images corresponding to the plurality of zones.
The device may be, or may comprise, a camera.
The processing circuitry may be configured to control each at least one LED to substantially only illuminate one or more selected targets in each image, or each set of images.
Advantageously, the imaging device may capture an image wherein only the one or more selected targets are illuminated by the plurality of LEDs. That is, LEDs configured to illuminate zones corresponding to the one or more selected targets may be enabled.
The image sensor may be configured to capture video and the processing circuitry may be configured to control each at least one LED to substantially only illuminate one or more selected targets in the captured video.
Advantageously, even in the case of movement of a target or object within a scene relative to the device, the LEDs may be configured to illuminate substantially only the target or object. That is, relative movement of the object or target may be effectively tracked using the multi-zone time-of-flight sensor, and only LEDs corresponding to the zones in which the object or target is detected in any given frame of the video may be illuminated.
The device may comprise one or more optical elements configured to direct illumination from each at least one LED towards a corresponding zone of the plurality of zones.
The device may be provided as an optical module comprising the one or more optical elements.
The one or more optical elements may be configured to increase a field of illumination of the plurality of LEDs.
The one or more optical elements may comprise: a concave or convex lens; a Fresnel lens; a microlens array; and/or a metalens.
Advantageously, the one or more optical elements may directly map a field of illumination of one or more LEDs of the plurality of LEDs to each zone of the multi-zone time-of-flight sensor.
The plurality of LEDs may be provided as a matrix of LEDs. Each LED may be individually addressable. Each at least one LED corresponding to a zone may be addressable.
The term ‘addressable’ refers to addressable by the processing circuitry, e.g. able to be enabled or disabled by the processing circuitry.
At least two LEDs may be configurable to illuminate each zone. Each at least two LEDs may comprise a relatively warm white LED and a relatively cool white LED.
Advantageously, selection of a number of relatively warm white LEDs and relatively cool white LEDs to be enabled in a given zone may enable control over a white-balance of a scene, or of an object or target to be imaged with the image sensor. Likewise, selection of a drive strength provided to each relatively warm white LED and relatively cool white LED in a given zone may enable such white-balance control.

In some embodiments, an ambient light sensor and/or an image sensor may sense a color of the ambient light, and a number of relatively warm white LEDs and relatively cool white LEDs may be selected and enabled to white-balance a scene.
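By way of example only, a very simplified mixing rule is sketched below: the warm and cool drive levels are blended linearly to approximate a target correlated colour temperature (CCT). Real white-balance control would be more involved; the LED CCT values and the linear blend are assumptions for illustration.

```python
# Illustrative warm/cool blend: linearly interpolate drive duty cycles so
# the mix approximates a target correlated colour temperature (CCT).
# Assumed LED CCTs: warm white 2700 K, cool white 6500 K.

WARM_CCT_K, COOL_CCT_K = 2700.0, 6500.0

def warm_cool_duty(target_cct_k):
    """Return (warm_duty, cool_duty) in [0, 1], summing to 1."""
    t = (target_cct_k - WARM_CCT_K) / (COOL_CCT_K - WARM_CCT_K)
    t = min(max(t, 0.0), 1.0)  # clamp to the achievable CCT range
    return (1.0 - t, t)

print(warm_cool_duty(4600))  # -> (0.5, 0.5): equal warm and cool drive
```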
The processing circuitry may be configured to control an intensity of each at least one LED in response to depth information for the corresponding zone.
Advantageously, LEDs configured to illuminate one or more portions of an object or target that are relatively far from the multi-zone time-of-flight sensor may be provided with a higher power than LEDs configured to illuminate one or more portions of the object or target that are relatively close to the multi-zone time-of-flight sensor, thereby providing a relatively even distribution of illumination across the target or object.
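Since irradiance from a small source falls off roughly with the square of distance, one plausible compensation, sketched below under that assumption, scales each zone's LED power with the squared zone depth, normalised to the farthest illuminated zone:

```python
# Illustrative inverse-square compensation: scale per-zone LED power with
# squared zone depth so near and far parts of a target are lit evenly.

def per_zone_power(zone_depths_m, max_power=1.0):
    """Relative drive power per zone; the farthest zone gets max_power."""
    farthest = max(d for d in zone_depths_m if d is not None)
    return [None if d is None else max_power * (d / farthest) ** 2
            for d in zone_depths_m]

# Target portions at 1 m and 2 m; the third zone holds nothing to light.
print(per_zone_power([1.0, 2.0, None]))  # -> [0.25, 1.0, None]
```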
According to a second aspect of the disclosure, there is provided a battery- operated portable apparatus comprising the device according to the first aspect.
The portable apparatus may, for example, be a smartphone, a tablet device, a laptop computer, a games system, or the like.
The portable apparatus may comprise a camera.
The portable apparatus may comprise a communications device.
The camera, e.g. an image sensor within the camera, may be synchronised with the device according to the first aspect.
According to a third aspect of the disclosure, there is provided a method of illuminating a target or object within a scene, the method comprising: configuring a multi-zone time-of-flight sensor to provide depth information for each zone of a plurality of zones; and controlling a plurality of LEDs to illuminate one or more of the zones in response to the corresponding depth information.
The method may comprise identifying one or more objects in a scene based, at least in part, on the depth information for one or more of the plurality of zones.
The method may comprise identifying and/or selecting an object or target based on its determined proximity to the multi-zone time-of-flight sensor and/or based on one or more images captured by an image sensor.
The method may comprise controlling the plurality of LEDs to predominantly or only illuminate the one or more objects. The method may comprise configuring processing circuitry to run a person-detection and/or face-detection algorithm based, at least in part, on the depth information for one or more of the plurality of zones. In some examples, such a person-detection and/or face-detection algorithm may be additionally based on one or more images captured by an image sensor.
The method may comprise controlling the plurality of LEDs to predominantly or only illuminate one or more detected persons or faces.
The method may comprise controlling the plurality of LEDs to substantially only illuminate one or more selected targets in each image, or each set of images captured by an image sensor.
The method may comprise controlling the plurality of LEDs to substantially only illuminate one or more selected targets in video captured by an image sensor.
According to a further aspect of the disclosure, there is provided a camera comprising the device according to the first aspect.
The above summary is intended to be merely exemplary and non-limiting. The disclosure includes one or more corresponding aspects, embodiments or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation. It should be understood that features defined above in accordance with any aspect of the present disclosure or below relating to any specific embodiment of the disclosure may be utilized, either alone or in combination with any other defined feature, in any other aspect or embodiment or to form a further aspect or embodiment of the disclosure.
BRIEF DESCRIPTION OF THE PREFERRED EMBODIMENTS
These and other aspects of the present disclosure will now be described, by way of example only, with reference to the accompanying drawings, wherein:
Figure 1 depicts a device according to an embodiment of the disclosure;
Figure 2 depicts use of a device according to an embodiment of the disclosure;
Figure 3a depicts use of a device according to a further embodiment of the disclosure for illuminating an object at a first time in a video sequence;
Figure 3b depicts use of the device of Figure 3a for illuminating the object at a second time in the video sequence;

Figure 3c depicts use of the device of Figure 3a for illuminating the object at a third time in the video sequence;
Figure 4 depicts a device implemented in a portable apparatus, according to an embodiment of the disclosure;
Figure 5 depicts use of a device according to an embodiment of the disclosure;
Figure 6 depicts a method of capturing an image, according to an embodiment of the disclosure; and
Figure 7 depicts a method of illuminating a target or object within a scene, according to an embodiment of the disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Figure 1 depicts a device 100 according to an embodiment of the disclosure. The device 100 comprises a multi-zone time-of-flight sensor 105. The multi-zone time-of-flight sensor 105 is configured to provide depth information for each zone 120-1, 120-2, 120-3 of a plurality of zones.
Generally, such a time-of-flight sensor 105 may use infrared light to determine depth information. The sensor emits an infrared signal 130, which may be reflected from a target or object and returned to the sensor 105. The round-trip time of the infrared signal may be measured to provide an indication of depth, e.g. a distance to the target or object.
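For example, with a direct time-of-flight measurement, the distance d to a target follows from the measured round-trip time t as d = (c × t) / 2, where c is the speed of light; a round trip of approximately 6.7 ns would therefore indicate a target at approximately 1 m. This worked figure is provided for illustration only.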
For purposes of simplicity of illustration only, the multi-zone time-of-flight sensor 105 is depicted as providing depth information for three zones: a first zone 120-1; a second zone 120-2; and a third zone 120-3.
The zones 120-1, 120-2, 120-3 define a scene sensed by the multi-zone time-of-flight sensor 105.
It will be appreciated that in example embodiments, the multi-zone time-of-flight sensor 105 may sense in an array of zones such as, for example, a 2x2, 3x3, 4x4 or an even larger array of zones, such as 30x20 or larger.
The device 100 comprises a plurality of LEDs 110-1, 110-2, 110-3, wherein at least one LED 110-1, 110-2, 110-3 is configurable to illuminate each zone 120-1, 120-2, 120-3 with visible radiation 135.
In the example of Figure 1: a first LED 110-1 is configured to illuminate the first zone 120-1; a second LED 110-2 is configured to illuminate the second zone 120-2; and a third LED 110-3 is configured to illuminate the third zone 120-3. For purposes of illustration only, only a first LED 110-1, a second LED 110-2 and a third LED 110-3 are depicted. It will be appreciated that in embodiments of the invention, a plurality of LEDs comprising more than three LEDs may be provided. For example, the plurality of LEDs may be provided as a matrix. In embodiments, at least one LED corresponds to each zone 120-1, 120-2, 120-3. Further details of arrangements of the LEDs in embodiments of the disclosure are provided below with reference to Figure 5.
The device 100 also comprises processing circuitry 115. The processing circuitry is communicably coupled to the multi-zone time-of-flight sensor 105 and the plurality of LEDs 110-1, 110-2, 110-3.
In some embodiments, the processing circuitry 115 may comprise any of: a logic circuit; an Arithmetic Logic Unit (ALU); a central processing unit (CPU); a combinatorial digital circuit; and/or a state machine. In yet further embodiments, processing may be offloaded to an external device, e.g. an external processor or further processing circuitry such as cloud- or server-based processing circuitry or other networked processing circuitry (not shown in Figure 1).
The processing circuitry 115 is configured to control the plurality of LEDs 110-1, 110-2, 110-3 in response to depth information for a corresponding zone 120-1, 120-2, 120-3, wherein the depth information is provided by the time-of-flight sensor 105.
By controlling each LED 110-1, 110-2, 110-3 in response to, i.e. based on, depth information for the corresponding zone 120-1, 120-2, 120-3, a selection may be readily made of which zones 120-1, 120-2, 120-3 to illuminate using the plurality of LEDs 110-1, 110-2, 110-3. By illuminating only selected zones, an overall power consumption of the plurality of LEDs 110-1, 110-2, 110-3 may be reduced, or minimised, as described in more detail below with reference to Figure 2.
Figure 2 depicts use of a device 200 according to an embodiment of the disclosure. The device 200 comprises: a multi-zone time-of-flight sensor 205 configured to provide depth information for each zone of a plurality of zones 220-1, 220-2, 220-3. The device 200 also comprises a plurality of LEDs 210-1, 210-2, 210-3, wherein: a first LED 210-1 is configurable to illuminate a first zone 220-1; a second LED 210-2 is configurable to illuminate a second zone 220-2; and a third LED 210-3 is configurable to illuminate a third zone 220-3. The device 200 comprises processing circuitry 215 configured to control each LED 210-1, 210-2, 210-3 in response to depth information for a corresponding zone 220-1, 220-2, 220-3. Features of the device 200 generally correspond to those of device 100, and therefore are not described in further detail.
In the example of Figure 2, a target 225 is depicted. The target 225 is, for purposes of illustration, a face.
The processing circuitry 215 is configured to determine a depth map of a scene comprising each zone 220-1, 220-2, 220-3 and to control the LEDs 210-1, 210-2, 210-3 based on the determined depth map.
In the example, the target 225 is in the first zone 220-1 at a first distance 240-1 from the multi-zone time-of-flight sensor 205. A background may be disposed at a further distance, e.g. at a second distance 240-2 or further from the multi-zone time-of-flight sensor 205. Thus, the depth information provided by the time-of-flight sensor 205 would indicate that an object in the first zone 220-1 is closer, e.g. at the first distance 240-1, than a background which is, for purposes of example only, all at the second distance 240-2.
That is, in the example, an object, e.g. the target 225, is detected in the first zone 220-1 at a first distance 240-1, and any background objects are detected in the second zone 220-2 and third zone 220-3 at no less than a second distance 240-2 from the multi-zone time-of-flight sensor 205. In other examples, no object may be detected in the second zone 220-2 and third zone 220-3 if, for example, any background objects are out of a range of operation of the multi-zone time-of-flight sensor 205.
In this example, the processing circuitry 215 may determine that only the first zone 220-1 comprises an object within a defined distance, e.g. defined by a threshold, from the multi-zone time-of-flight sensor 205, and therefore the processing circuitry 215 may be configured to enable only the first LED 210-1 and to disable the second and third LEDs 210-2, 210-3 to predominantly or only illuminate the target 225.
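By way of illustration only, such threshold-based zone selection might be sketched as follows. This is a minimal sketch, not an implementation of the present disclosure: the functions read_zone_distances() and set_led(), and the threshold value, are hypothetical placeholders.

```python
# Illustrative sketch only: enable each LED whose zone contains an
# object closer than a threshold distance. read_zone_distances() and
# set_led() are hypothetical placeholder interfaces.

THRESHOLD_MM = 1500  # example proximity threshold (assumption)

def update_illumination(read_zone_distances, set_led):
    # read_zone_distances() is assumed to return a dict mapping a zone
    # index to a measured distance in millimetres, or None if no
    # return signal was detected for that zone
    distances = read_zone_distances()
    for zone, distance in distances.items():
        within_range = distance is not None and distance < THRESHOLD_MM
        # Enable the LED(s) for zones containing a nearby object;
        # disable all others to reduce power consumption
        set_led(zone, enabled=within_range)
```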
In some examples, the processing circuitry 215 may be configured to run a person-detection and/or face-detection algorithm based, at least in part, on the depth information for one or more of the plurality of zones 220-1, 220-2, 220-3. Control of the LEDs 210-1, 210-2, 210-3 may be based upon an output of the person-detection and/or face-detection algorithm, e.g. the first LED 210-1 may only be enabled if it is determined that the object at the first distance 240-1 is a person or a face.
Furthermore, in addition to enabling the first LED 210-1, a current or duty cycle provided to the first LED 210-1 may be controlled and/or adjusted as required, e.g. by the processing circuitry 215. For example, in devices comprising an image sensor or ambient light sensor, a color of the ambient light may be determined, and an appropriate selection of the LEDs 210-1, 210-2, 210-3 and/or a power supplied to the LEDs 210-1, 210-2, 210-3 may be made to white-balance the scene and/or avoid overexposure.
A current or duty cycle provided to the first LED 210-1 may additionally or alternatively be controlled and/or adjusted based, at least in part, on the depth information from the time-of-flight sensor 205. For example, if the first distance 240-1 is determined to be relatively small, then a power provided to the first LED 210-1 may be relatively low, and if the first distance 240-1 is determined to be relatively large, then a power provided to the first LED 210-1 may be relatively high. As such, a desired level of exposure of a scene may be achieved.
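Purely as an illustrative sketch, a distance-dependent drive level might be derived as follows under a simple inverse-square assumption. The square-law scaling and all constants here are assumptions for illustration; a practical device might instead use a calibrated look-up table.

```python
# Illustrative sketch only: scale LED drive with target distance so
# nearer targets receive less optical power. The square-law scaling
# and the constants below are assumptions for illustration.

MIN_DUTY, MAX_DUTY = 0.05, 1.0
REFERENCE_MM = 2000  # distance at which full drive is applied (assumption)

def duty_cycle_for_distance(distance_mm):
    # Irradiance from a small source falls off roughly with the square
    # of distance, so drive is scaled by (d / d_ref)^2 and clamped
    scale = (distance_mm / REFERENCE_MM) ** 2
    return max(MIN_DUTY, min(MAX_DUTY, scale))
```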
Figure 3a depicts a device 300 according to a further embodiment of the disclosure. The device 300 comprises a multi-zone time-of-flight sensor 305 configured to provide depth information for each zone of a plurality of zones 320-1, 320-2, 320-3. The device 300 comprises a plurality of LEDs 310-1, 310-2, 310-3, wherein: a first LED 310-1 is configurable to illuminate a first zone 320-1; a second LED 310-2 is configurable to illuminate a second zone 320-2; and a third LED 310-3 is configurable to illuminate a third zone 320-3. Processing circuitry 315 is configured to control each LED 310-1, 310-2, 310-3 in response to depth information for a corresponding zone 320-1, 320-2, 320-3. Features of the device 300 generally correspond to those of device 200 and therefore are not described in further detail.
The device 300 also comprises an image sensor 350. The image sensor 350 may be, for example, a charge-coupled device or an active pixel sensor, known in the art as a CMOS image sensor. The image sensor 350 is communicably coupled to the processing circuitry 315. In the example embodiment, the processing circuitry 315 is configured to synchronise operation of the image sensor 350 with the plurality of LEDs 310-1, 310-2, 310-3, to sufficiently expose a scene, target or object for which the image sensor 350 captures an image.
The device 300 may be configured as a camera.
As described above with reference to the embodiment of Figure 2, the processing circuitry 315 may be configured to identify one or more objects in a scene based, at least in part, on the depth information for one or more of the plurality of zones 320-1, 320-2, 320-3.
In some embodiments, the processing circuitry 315 may be configured to identify and/or select an object or target based on its determined proximity to the multi-zone time-of-flight sensor 305 and/or based on one or more images captured by the image sensor 350. For example, the processing circuitry 315 may be configured to execute a person-detection and/or face-detection algorithm that may be based upon one or more images captured by the image sensor 350.
In some embodiments, the image sensor 350 may be configured to capture video and the processing circuitry 315 may be configured to control the plurality of LEDs 310-1, 310-2, 310-3 to substantially only illuminate one or more selected targets in the captured video. This is depicted with reference to Figures 3a, 3b and 3c, which correspond to use of the device 300 at a first, second and third time in a video sequence respectively.
At a first time in the video sequence, the processing circuitry 315 determines that an object 325 is in the first zone 320-1, and therefore the first zone 320-1 is to be illuminated, based on depth map information from the time-of-flight sensor 305. As described above, the depth map information may indicate that the object 325 stands out from a background, e.g. is relatively close at a first distance 340-1 compared to a relatively distant background at or beyond a second distance 340-2. The processing circuitry 315 may perform other or additional algorithms on the depth map information, such as person or facial recognition algorithms or other object detection algorithms.
At a second time in the video sequence shown in Figure 3b, the object 325 has moved relative to the device 300, and now resides in the second zone 320-2. The processing circuitry 315 determines that the object 325 is in the second zone 320-2 and therefore only the second zone 320-2 is to be illuminated by the second LED 310-2, based on depth map information from the time-of-flight sensor 305.
At a third time in the video sequence shown in Figure 3c, the object 325 has moved again relative to the device 300, and now resides in the third zone 320-3. The processing circuitry 315 determines that the object 325 is in the third zone 320-3 and therefore only the third zone 320-3 is to be illuminated, based on depth map information from the time-of-flight sensor 305.
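By way of illustration only, the per-frame selection behaviour of Figures 3a to 3c might be sketched as follows, assuming the same hypothetical read_zone_distances() and set_led() interfaces as in the earlier sketch; this is a minimal sketch, not an implementation of the present disclosure.

```python
# Illustrative sketch only: per-frame zone tracking as in Figures 3a
# to 3c. In each frame, only the zone in which the nearest detected
# object resides is illuminated. All interfaces are hypothetical.

def track_and_illuminate(frames, read_zone_distances, set_led):
    for _frame in frames:
        distances = read_zone_distances()
        valid = {z: d for z, d in distances.items() if d is not None}
        # Follow the nearest object: illuminate only the zone in which
        # it currently resides, disabling all other LEDs
        target_zone = min(valid, key=valid.get) if valid else None
        for zone in distances:
            set_led(zone, enabled=(zone == target_zone))
```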
In this way, the plurality of LEDs 310-1, 310-2, 310-3 may be configured to illuminate substantially only the object 325. That is, movement of the object 325 relative to the device 300 may be effectively tracked using the multi-zone time-of-flight sensor 305, and only the LEDs 310-1, 310-2, 310-3 corresponding to the zones 320-1, 320-2, 320-3 in which the object 325 is detected in any given frame of the video sequence may be enabled.
Figure 4 depicts a device 400 implemented in a portable apparatus 455, according to an embodiment of the disclosure. Features of the device generally correspond to those of the device 300 of Figures 3a to 3c.
The device 400 comprises a multi-zone time-of-flight sensor 405 configured to provide depth information for each zone of a plurality of zones 420-1, 420-2, 420-3. The device 400 comprises: a first LED 410-1 configurable to illuminate a first zone 420-1; a second LED 410-2 configurable to illuminate a second zone 420-2; and a third LED 410-3 configurable to illuminate a third zone 420-3. Processing circuitry 415 is configured to control each LED 410-1, 410-2, 410-3 in response to depth information for a corresponding zone 420-1, 420-2, 420-3. The device 400 also comprises an image sensor 450.
The portable apparatus 455 may be a battery-operated apparatus, and therefore may advantageously benefit from the extended battery life that the disclosed low-power illumination device 400 may enable.
The portable apparatus 455 may, for example, be any of a smartphone, a tablet device, a laptop computer, a games system, a digital camera, or any other portable communications device.
The device 400 comprises an optical element 460 configured to direct illumination from each LED 410-1, 410-2, 410-3 towards a corresponding zone 420-1, 420-2, 420-3 of the plurality of zones. In some embodiments, the optical element 460 may be configured to increase a field of illumination of each LED 410-1, 410-2, 410-3.
Although only a single optical element 460 is depicted, other embodiments may comprise an optical apparatus comprising a plurality of optical elements. The device 400 may be provided as an optical module comprising the one or more optical elements.
The optical element 460 may comprise, for example: a concave or convex lens; a Fresnel lens; a microlens array; and/or a metalens.
Figure 5 is a high-level diagram depicting use of a device according to an embodiment of the disclosure, which will be described with reference to the method 600 of Figure 6.
A multi-zone time-of-flight sensor 505 is depicted. In a first step 610 of the method 600, the time-of-flight sensor 505 is configured to capture depth information. The depth information may correspond to a depth map, or a 3-dimensional image of a scene. The multi-zone time-of-flight sensor 505 is communicably coupled to processing circuitry (not shown in Figure 5), e.g. the processing circuitry 115, 215, 315, 415 of Figures 1 to 4.
In a second step 620 of the method 600, the processing circuitry is configured to identify one or more objects in a scene based on the depth information from the time-of-flight sensor 505. In the example of Figure 5, a person 525 is identified as being distinct from a background, e.g. the tree 530.
The person 525 may be identified by the processing circuitry as being distinct from the tree 530 because the depth information indicates that the person 525 is substantially closer to the time-of-flight sensor 505 than the tree 530.
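Purely as an illustrative sketch, such a depth-based foreground/background separation might be expressed as follows; the median-based background estimate and the factor of 0.6 are assumptions for illustration, not features of the disclosure.

```python
# Illustrative sketch only: separate foreground from background by
# depth. Zones whose measured distance is much smaller than the
# median scene distance are treated as foreground. The factor is an
# arbitrary assumption for illustration.

from statistics import median

def foreground_zones(distances_mm, factor=0.6):
    valid = [d for d in distances_mm.values() if d is not None]
    if not valid:
        return set()
    background_estimate = median(valid)
    return {zone for zone, d in distances_mm.items()
            if d is not None and d < factor * background_estimate}
```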
The processing circuitry is configured to control an LED matrix 510. In the example of Figure 5, an LED driver IC 515 is depicted. The processing circuitry may control the LED driver IC 515, thereby controlling the LED matrix 510.
At a third step 630 in the method 600, the LED matrix 510 is selectively driven to provide a flash illumination and an image sensor (not shown) such as a CMOS imager captures an image.
That is, as described above with reference to Figures 2, 3a to 3c and 4, a subset of the available LEDs of the LED matrix 510 is driven to selectively illuminate the person 525, while not illuminating the background tree 530.
In a fourth step 640 of the method 600, the processing circuitry may be configured to post-process the captured image. Such post-processing may comprise adaptation or correction of the captured image, or other modifications such as white-balancing of the captured image.
The example LED matrix 510 of Figure 5 comprises a 12 x 10 array of LEDs. Only a small subset 550 of the LEDs, e.g. a 4 x 4 array, is enabled to illuminate the person 525, and the remaining LEDs remain disabled to save power.
The plurality of LEDs forming the LED matrix 510 may be individually addressable, or addressable in groups such that LEDs corresponding to zones sensed by the multi-zone time-of-flight sensor 505 may be enabled or disabled independently.
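By way of illustration only, a possible mapping from time-of-flight zones to addressable groups of LEDs might be sketched as follows. The matrix dimensions follow the example of Figure 5, while the zone layout and the notion of a (column, row) LED address are assumptions for illustration.

```python
# Illustrative sketch only: map each time-of-flight zone to a
# rectangular group of LEDs in a 12 x 10 matrix (as in Figure 5) so
# that groups can be enabled or disabled independently. The zone
# layout below is an assumption for illustration.

MATRIX_COLS, MATRIX_ROWS = 12, 10
ZONE_COLS, ZONE_ROWS = 3, 1  # e.g. three side-by-side zones (assumption)

def leds_for_zone(zone_col, zone_row):
    # Return the (column, row) addresses of all LEDs in the group
    # corresponding to the given zone
    w = MATRIX_COLS // ZONE_COLS
    h = MATRIX_ROWS // ZONE_ROWS
    return [(c, r)
            for c in range(zone_col * w, (zone_col + 1) * w)
            for r in range(zone_row * h, (zone_row + 1) * h)]
```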
In some embodiments, at least two LEDs may be configurable to illuminate each zone. In some embodiments, each at least two LEDs may comprise a relatively warm white LED and a relatively cool white LED.
The example LED matrix 510 of Figure 5 comprises a distribution of warm white LEDs 540 and cool white LEDs 545. The processing circuitry may be configured to determine a number of warm white LEDs 540 and cool white LEDs 545 to enable for each zone and/or a current or duty cycle to provide to the LEDs 540, 545. In some embodiments, an ambient light sensor and/or an image sensor, e.g. image sensor 450, may sense a color of the ambient light, and select a number of relatively warm white LEDs 540 and relatively cool white LEDs 545 to enable to white-balance a scene; a sketch of one possible mixing rule is given below.
Also depicted in Figure 5 is an optical element 560 configured to direct illumination from each LED of the LED matrix 510 towards corresponding zones sensed by the multi-zone time-of-flight sensor 505.
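Returning to the selection of warm white and cool white LEDs described above, one possible mixing rule may be sketched as follows, by way of illustration only. The assumed color temperature values and the linear mixing in CCT are simplifications; a practical implementation might instead mix in a color space such as CIE 1931 xy.

```python
# Illustrative sketch only: choose how many warm-white and cool-white
# LEDs to enable in a zone so the mixed output approximates a target
# correlated color temperature (CCT). Linear mixing in CCT and the
# CCT values below are simplifying assumptions for illustration.

WARM_CCT_K = 2700  # assumed warm-white CCT
COOL_CCT_K = 6500  # assumed cool-white CCT

def warm_cool_split(target_cct_k, leds_per_zone):
    # Fraction of cool LEDs under a linear-in-CCT approximation,
    # clamped to the achievable range
    frac_cool = (target_cct_k - WARM_CCT_K) / (COOL_CCT_K - WARM_CCT_K)
    frac_cool = max(0.0, min(1.0, frac_cool))
    n_cool = round(frac_cool * leds_per_zone)
    return leds_per_zone - n_cool, n_cool  # (warm, cool)
```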
Figure 7 depicts a method 700 of illuminating a target or object within a scene. In a first step 710, the method 700 comprises configuring a multi-zone time-of-flight sensor to provide depth information for each zone of a plurality of zones.
The multi-zone time-of-flight sensor may correspond to the multi-zone time-of-flight sensor 105, 205, 305, 405, 505 as described above with reference to Figures 1, 2, 3a to 3c, 4 and 5.
In a second step 720, the method 700 comprises controlling a plurality of LEDs to illuminate one or more of the zones in response to the corresponding depth information.
The plurality of LEDs may correspond to the plurality of LEDs 110-1, 110-2, 110-3, 210-1, 210-2, 210-3, 310-1, 310-2, 310-3, 410-1, 410-2, 410-3, 510 as described above with reference to Figures 1, 2, 3a to 3c, 4 and 5.
Although the disclosure has been described in terms of particular embodiments as set forth above, it should be understood that these embodiments are illustrative only and that the claims are not limited to those embodiments. Those skilled in the art will be able to make modifications and alternatives in view of the disclosure, which are contemplated as falling within the scope of the appended claims. Each feature disclosed or illustrated in the present specification may be incorporated in any embodiments, whether alone or in any appropriate combination with any other feature disclosed or illustrated herein.
List of Reference Numerals
100 device
105 multi-zone time-of-flight sensor
110-1 first LED
110-2 second LED
110-3 third LED
115 processing circuitry
120-1 first zone
120-2 second zone
120-3 third zone
130 infrared signal
135 visible radiation
200 device
205 multi-zone time-of-flight sensor
210-1 first LED
210-2 second LED
210-3 third LED
215 processing circuitry
220-1 first zone
220-2 second zone
220-3 third zone
225 target
240-1 first distance
240-2 second distance
300 device
305 multi-zone time-of-flight sensor
310-1 first LED
310-2 second LED
310-3 third LED
315 processing circuitry
320-1 first zone
320-2 second zone
320-3 third zone
325 object
340-1 first distance
340-2 second distance
350 image sensor
400 device
405 multi-zone time-of-flight sensor
410-1 first LED
410-2 second LED
410-3 third LED
415 processing circuitry
420-1 first zone
420-2 second zone
420-3 third zone
425 object
440-1 first distance
440-2 second distance
450 image sensor
455 portable apparatus
460 optical element
505 multi-zone time-of-flight sensor
510 LED matrix
515 driver IC
525 person
530 tree
540 warm white LEDs
545 cool white LEDs
550 subset
560 optical element
600 method
610 first step
620 second step
630 third step
640 fourth step
700 method
710 first step
720 second step

Claims

1. A device (100, 200, 300, 400) comprising: a multi-zone time-of-flight sensor (105, 205, 305, 405, 505) configured to provide depth information for each zone of a plurality of zones (120, 220, 320, 420); a plurality of LEDs (110, 210, 310, 410), wherein at least one LED is configurable to illuminate each zone with visible radiation; and processing circuitry (115, 215, 315, 415) configured to control each at least one LED in response to depth information for a corresponding zone.
2. The device (100, 200, 300, 400) of claim 1, wherein the processing circuitry (115, 215, 315, 415) is configured to: identify one or more objects in a scene based, at least in part, on the depth information for one or more of the plurality of zones (120, 220, 320, 420); and control each at least one LED (110, 210, 310, 410) to predominantly or only illuminate the one or more objects.
3. The device (100, 200, 300, 400) of claim 1 or 2, wherein the processing circuitry (115, 215, 315, 415) is configured to: run a person-detection and/or face-detection algorithm based, at least in part, on the depth information for one or more of the plurality of zones (120, 220, 320, 420); and control each at least one LED (110, 210, 310, 410) to predominantly or only illuminate one or more detected persons or faces.
4. The device (100, 200, 300, 400) of any of claims 1 to 3, wherein the processing circuitry (115, 215, 315, 415) is configured to control the plurality of LEDs by: enabling each at least one LED when depth information for the corresponding zone(s) (120, 220, 320, 420) indicates a distance to a target is below a threshold; and/or disabling each at least one LED when depth information for the corresponding zone(s) (120, 220, 320, 420) indicates a distance to a target is at or above the threshold.
5. The device (100, 200, 300, 400) of claim 4, wherein the threshold depends upon a detected ambient light level.
6. The device (100, 200, 300, 400) of any of claims 1 to 5, comprising an image sensor (350, 450) configured to capture one or more images corresponding to the plurality of zones (120, 220, 320, 420).
7. The device (100, 200, 300, 400) of claim 6, wherein the processing circuitry (115, 215, 315, 415) is configured to control each at least one LED to substantially only illuminate one or more selected targets in each image, or each set of images.
8. The device (100, 200, 300, 400) of claim 6 or 7, wherein the image sensor (350, 450) is configured to capture video and the processing circuitry (115, 215, 315, 415) is configured to control each at least one LED to substantially only illuminate one or more selected targets in the captured video.
9. The device (100, 200, 300, 400) of any of claims 1 to 8, comprising one or more optical elements (460, 560) configured to direct illumination from each at least one LED towards a corresponding zone of the plurality of zones (120, 220, 320, 420).
10. The device (100, 200, 300, 400) of claim 9, wherein the one or more optical elements (460, 560) comprises: a concave or convex lens; a Fresnel lens; a microlens array; and/or a metalens.
11. The device (100, 200, 300, 400) of any of claims 1 to 10, wherein the plurality of LEDs (110, 210, 310, 410) is provided as a matrix of LEDs, wherein each LED is individually addressable or each at least one LED corresponding to a zone (120, 220, 320, 420) is addressable.
12. The device (100, 200, 300, 400) of any of claims 1 to 11, wherein at least two LEDs are configurable to illuminate each zone (120, 220, 320, 420), and wherein each at least two LEDs comprises a relatively warm white LED (540) and a relatively cool white LED (545).
13. The device (100, 200, 300, 400) of any of claims 1 to 12, wherein the processing circuitry (115, 215, 315, 415) is configured to control an intensity of each at least one LED (110, 210, 310, 410) in response to depth information for the corresponding zone (120, 220, 320, 420).
14. The device (100, 200, 300, 400) of any of claims 1 to 13, wherein the multi-zone time-of-flight sensor (105, 205, 305, 405, 505) is a direct time-of-flight sensor.
15. A battery-operated portable apparatus (455) comprising the device (100, 200, 300, 400) of any of claims 1 to 14.
16. A method of illuminating a target or object within a scene, the method comprising: configuring a multi-zone time-of-flight sensor (105, 205, 305, 405, 505) to provide depth information for each zone of a plurality of zones (120, 220, 320, 420); and controlling a plurality of LEDs to illuminate one or more of the zones in response to the corresponding depth information.
Kind code of ref document: A1