WO2022125087A1 - Light intensity determination - Google Patents
Light intensity determination
- Publication number
- WO2022125087A1 (PCT/US2020/063963)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subregion
- light
- subregions
- area
- light sources
- Prior art date
- Legal status
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/22—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
- G09G3/30—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
- G09G3/32—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
Definitions
- Imaging devices may be used to capture images from environments under lighting conditions. Imaging devices may be used along with light sources capable of modifying the lighting conditions of such environments. The application of light with the light sources may enhance the lighting conditions of the environment, thereby improving a quality of the images captured by the imaging devices.
- FIG. 1 shows a method to determine a plurality of light intensities for a plurality of light sources, according to an example of the present disclosure
- FIG. 2 shows a determination of an area of interest for an image captured by an imaging device, according to an example of the present disclosure
- FIG. 3 shows a division of an area of interest into a plurality of subregions, according to an example of the present disclosure
- FIG. 4 shows a device comprising an imaging device and a plurality of light sources, according to an example of the present disclosure
- FIG. 5 shows a system comprising an imaging device and a device, according to an example of the present disclosure
- FIG. 6 shows a set of instructions readable by a processor, according to an example of the present disclosure
- FIG. 7 shows a schematic representing a process for determining a plurality of light intensities, according to an example of the present disclosure.
- the terms “a” and “an” are intended to denote at least one of a particular element.
- the term “includes” means includes but not limited to, the term “including” means including but not limited to.
- the term “based on” means based at least in part on.
- Users may use imaging devices to capture images from a portion of an environment.
- the lighting conditions of such a portion may not be appropriate (e.g. high levels of darkness, shadows, color saturation, or high contrast, amongst others)
- the captured image may not fulfill a minimum image quality standard.
- users may use external devices such as lighting devices to modify the lighting conditions of the portion of the environment that is to be captured by the imaging device.
- the subsequent images captured by the imaging device may have a greater image quality when being compared to the quality obtained during the original lighting conditions.
- all the image quality defects present in the original image may be corrected.
- the lighting conditions of the subsequent images captured by the imaging device may still be far away from the desired lighting conditions and may still include shadows, high contrasts, color saturation, amongst others.
- the term “imaging device” will be used to refer to a device that can capture or record visual images of a portion of an environment.
- the imaging device can include a camera or similar device to capture images.
- the imaging device can be a video camera to record a plurality of images that can be in a video format.
- additional imaging devices may be used to capture images that are to be used to determine the lighting conditions of a portion of an environment.
- video cameras are utilized as examples herein, the disclosure is not so limited.
- computing devices can instruct imaging devices to capture images that can be transmitted to other computing devices.
- a computing device may instruct an imaging device such as a video camera to capture a frame (or an image) to be transmitted to a remote computing device to be subsequently displayed at the remote computing device.
- an imaging device such as a video camera
- the frame, or the image, captured by the imaging device may have a poor quality that may result in lower audience viewership.
- computing devices can instruct external imaging devices to periodically capture an image to determine the lighting conditions of the environment.
- computing device will be used to refer to electronic systems having processor resources and memory resources.
- Examples of computing devices comprise a laptop computer, a notebook computer, a desktop computer, networking devices such as routers or switches, and mobile devices such as smartphones, tablets, personal digital assistants, smart glasses, and wrist-worn devices.
- the term “lighting source” refers to a device capable of generating light.
- lighting sources comprise light-emitting diodes (LEDs), incandescent lamps, fluorescent lamps, organic light-emitting diodes (OLEDs), among other types of light-generating devices.
- the light source can be altered to provide a range of colors between relatively cold colors and relatively warm colors.
- the light sources may be red, green, blue (RGB) light sources whose color output can be changed.
- the light source may be individually controlled to provide different colors, intensities or color temperatures.
- the image captured by the imaging device includes multiple regions containing objects positioned at different distances within the environment, the contribution of the light sources to the modification of the light conditions may not be the same in each region of the captured image.
- modifying an overall light intensity applied by the light sources may result in a subsequent image with an image quality which still does not fulfill the user’s expectations.
- the presence of external light sources (e.g. windows letting in light, floor lamps, spotlights, and desk lamps) may also alter the lighting conditions.
- imaging devices may be used to capture images in which subjects appear. Such subjects may be positioned in a foreground position within the environment, i.e. closer to the imaging device.
- other elements may be present within the images captured by the imaging device. Some of these other elements may be located in a background position within the environment, i.e., further away from the imaging device.
- a region of the image in which a subject is present may be determined with a subject identification application.
- the subject identification application may identify different objects and select an object that takes up the largest quantity of pixels within the image.
- the subject identification application may be capable of matching a human face from the image captured against a database of faces.
- the subject identification application may comprise pinpointing and measuring facial features from a given image in order to determine a region of an image in which a face of the subject is present.
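The selection step described above can be sketched in Python. This is a minimal illustration, assuming a hypothetical detector that returns bounding boxes as `(x, y, width, height)` tuples; the function name and box format are not part of the disclosure.

```python
# Sketch: choose the area of interest as the detected object that takes up
# the largest quantity of pixels within the image. Boxes are assumed to be
# (x, y, width, height) tuples from a hypothetical detector.

def select_area_of_interest(detections):
    """Return the bounding box covering the most pixels, or None."""
    if not detections:
        return None
    # Pixel count of an axis-aligned box is simply width * height.
    return max(detections, key=lambda box: box[2] * box[3])

# Example: a face box (120x160 px) versus a smaller background object.
boxes = [(40, 30, 120, 160), (300, 200, 50, 60)]
print(select_area_of_interest(boxes))  # -> (40, 30, 120, 160)
```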
- method 100 determines light intensities in accordance with the lighting condition of an area of interest.
- method 100 comprises capturing an image with an imaging device.
- the imaging device may be part of a computer system or may be an external device used to capture an image that will be used to determine the lighting conditions. In some examples, the image can be periodically captured.
- method 100 comprises identifying an area of interest within the image.
- the area of interest within the image is determined to be the area where a face is present.
- the area of interest (or region of interest), may be determined by a subject identification application, wherein the subject identification application determines the region within the image in which the face is present.
- method 100 comprises dividing the area of interest into a plurality of subregions, wherein the subregions may be obtained by applying a filter to the image, e.g. a fixed mask or a dynamic mask.
- the fixed mask is a filter that provides a division of the area of interest into a plurality of predefined regions taking into account the dimensions of the area of interest.
- the dynamic mask is an image filter that provides a division of the area of interest based on physical features of the area of interest, for instance, facial features when the area of interest comprises a face.
- Examples of dynamic masks comprise midsagittal masks, transverse masks, or a combination thereof.
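The midsagittal/transverse division can be sketched as follows. This is an illustration only: a real dynamic mask would place the split lines on detected facial features, whereas here they are assumed to fall at the centre of the bounding box.

```python
# Sketch: divide an area of interest (x, y, w, h) into four subregions with
# a midsagittal (vertical) and a transverse (horizontal) split. Placing the
# splits at the box centre is an assumption made for illustration.

def divide_area(x, y, w, h):
    mid_x = x + w // 2   # midsagittal split position (assumed: centre)
    mid_y = y + h // 2   # transverse split position (assumed: centre)
    upper_left  = (x, y, mid_x - x, mid_y - y)
    upper_right = (mid_x, y, x + w - mid_x, mid_y - y)
    lower_left  = (x, mid_y, mid_x - x, y + h - mid_y)
    lower_right = (mid_x, mid_y, x + w - mid_x, y + h - mid_y)
    return [upper_left, upper_right, lower_left, lower_right]

print(divide_area(0, 0, 100, 80))
```

The four returned boxes correspond to the overlaps of the left/right and upper/bottom subregions described for FIG. 3.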
- method 100 comprises determining optical parameter values for each subregion of the plurality of subregions.
- optical parameters comprise a brightness level, a contrast level, and a color temperature.
- the optical parameter values may be determined by a computation unit (for instance, a controller), wherein the computation unit outputs optical parameter values of a selected region by analyzing the values and/or the distribution of the pixels included in that region.
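As a concrete (and assumed) example of one such optical parameter, a brightness level for a subregion can be taken as the mean of its 8-bit luminance values; the disclosure does not fix a particular formula.

```python
# Sketch: one optical parameter (a brightness level) for a subregion,
# computed here as the mean of its 8-bit luminance values (0-255). Other
# parameters (contrast, color temperature) would be derived similarly from
# the pixel distribution.

def brightness_level(pixels):
    """Mean luminance of a flat list of 8-bit pixel values."""
    return sum(pixels) / len(pixels)

subregion = [30, 40, 50, 60]        # a dark 2x2 patch
print(brightness_level(subregion))  # -> 45.0
```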
- method 100 comprises determining a plurality of light intensities associated with the plurality of subregions based on the optical parameter values. In an example, the light intensities are calculated with a function of the optical parameter values.
- the function may further comprise a plurality of correction factors based on a relative location of the area of interest within the image or a relative location of the subregions within the image.
- the light intensities are calculated based on a comparison of the optical parameter values with a plurality of threshold values.
- method 100 comprises setting the plurality of light intensities in the plurality of light sources. The application of the plurality of light intensities with the plurality of light sources will modify the lighting conditions of the image, and more specifically, the lighting conditions of the area of interest.
- method 100 is executed periodically in order to provide a diagnostic of the lighting conditions.
- a user may use an imaging device to do a video call.
- the user may have a plurality of light sources to modify the lighting conditions of a portion of the environment capturable by the imaging device.
- method 100 may be executed every minute in order to check that the lighting conditions meet a threshold level.
- method 100 may be executed every twenty seconds.
- the user can set time intervals for the execution of method 100 while using the imaging device.
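The periodic execution described above amounts to checking whether a user-set interval has elapsed since the last run. A minimal sketch, with the function name and default interval as assumptions:

```python
# Sketch: decide whether the lighting-condition check (method 100) should
# run again, given a user-configurable interval in seconds.

def should_run(last_run, now, interval_seconds=60):
    """Return True when the configured interval has elapsed."""
    return now - last_run >= interval_seconds

print(should_run(last_run=0, now=61))  # -> True  (a minute has passed)
print(should_run(last_run=0, now=20))  # -> False (not yet)
```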
- a computer system may comprise the plurality of lighting sources and the imaging device.
- a user may set a customized minimum lighting condition for the imaging device.
- the computation unit may determine the plurality of light intensities associated with the plurality of subregions based on the optical parameter values and a threshold parameter value (or values in case of having more than one) determined by the user or by a predefined configuration.
- the method 100 may further comprise comparing the optical parameter values with at least a threshold value.
- determining a plurality of light intensities associated with the plurality of subregions based on the optical parameter values further comprises comparing the optical parameter values with a first threshold value and with a second threshold value. If the optical parameter value of the subregion is between the first threshold value and the second threshold value, the light intensity associated with the subregion is not modified. In further examples, if the optical parameter value of the subregion is lower than the first threshold value, the light intensity associated with the subregion is increased. Similarly, if the optical parameter value of the subregion is greater than the second threshold value, the light intensity associated with the subregion is decreased.
- the optical parameter value is a brightness level and a first threshold value and a second threshold value are defined for the brightness level.
- the subregion may be considered as being underexposed (if the brightness level is lower than the first threshold value), normal exposed (if the brightness level is between the first threshold value and the second threshold value), or over-exposed (if the brightness level is greater than the second threshold value).
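The two-threshold comparison above can be sketched directly. The threshold values (85 and 170) and the adjustment step are illustrative assumptions, not values from the disclosure.

```python
# Sketch: classify a subregion's exposure against two brightness thresholds
# and adjust its associated light intensity accordingly. Threshold values
# and step size are assumptions chosen for illustration.

FIRST_THRESHOLD = 85    # below this: underexposed
SECOND_THRESHOLD = 170  # above this: overexposed

def classify_exposure(brightness):
    if brightness < FIRST_THRESHOLD:
        return "underexposed"
    if brightness > SECOND_THRESHOLD:
        return "overexposed"
    return "normal"

def adjust_intensity(intensity, brightness, step=0.1):
    """Raise intensity for underexposed subregions, lower it for
    overexposed ones, and leave it unchanged otherwise."""
    category = classify_exposure(brightness)
    if category == "underexposed":
        return intensity + step
    if category == "overexposed":
        return intensity - step
    return intensity

print(classify_exposure(40))      # -> underexposed
print(adjust_intensity(0.5, 40))  # -> 0.6
```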
- the light intensity of the plurality of light sources may be modified based on a determined category, a polynomial function comprising the optical parameter values, or a function for the category.
- Image 201 may be a frame that has been captured by an imaging device (not shown in FIG. 2), wherein the imaging device may be part of a computing system or an external device.
- image 201 may result from the block 110 of method 100.
- Image 201 includes a subject 202 and the area of interest 203, wherein the area of interest 203 may be determined with a subject identification application, as described above.
- the identification application may identify (block 120 of the method 100) an area or region in which a face of the subject 202 is present by analyzing the image 201 captured by the imaging device. Once the area of interest 203 is determined, the area may be divided into a plurality of subregions (block 130 of the method 100). However, for simplicity, in FIG. 2 the area of interest 203 is shown as a single region, i.e. not subdivided.
- the determination 200 comprises calculating (block 140 of method 100) optical parameter values 205 for the area of interest 203.
- the calculation may be performed with a computation unit such as a processor.
- the area of interest 203 is divided into a plurality of subregions and the optical parameter values 205 comprise parameters associated with each of the subregions.
- masks may be applied to an area in order to divide it into a plurality of smaller areas.
- the term “mask” will be used to refer to an image filter that, when used in a filtering operation, divides an area into a plurality of areas.
- different types of division may be possible, such as dividing areas into pluralities of arbitrary areas (for instance when using fixed masks), dividing areas into pluralities of areas associated with light ranges of the light sources (for instance when using fixed masks), dividing areas into pluralities of areas based on facial features contained within the areas (when using dynamic masks), amongst others.
- Examples of masks comprise fixed masks, dynamic masks, or a combination thereof.
- the example division 300 comprises applying a dynamic mask (i.e. an image filter) comprising a midsagittal mask 310 and a transverse mask 320 to the area of interest so that a plurality of subregions is obtained.
- the application of the midsagittal mask 310, which detects facial features within the area of interest, divides the region of interest into a left subregion 311 and a right subregion 312.
- the application of the transverse mask 320 divides the area of interest into an upper subregion 321 and a bottom subregion 322.
- the area of interest is divided into a plurality of subregions comprising a first subregion 331 , a second subregion 332, a third subregion 333, and a fourth subregion 334.
- in the first subregion 331, the left subregion 311 and the upper subregion 321 overlap.
- in the second subregion 332, the right subregion 312 and the upper subregion 321 overlap.
- in the third subregion 333, the left subregion 311 and the bottom subregion 322 overlap.
- in the fourth subregion 334, the right subregion 312 and the bottom subregion 322 overlap.
- the area of interest may be divided into a plurality of subregions by applying a fixed mask, wherein the fixed mask divides the area of interest based on the number of light sources, wherein each of the light sources is associated with at least one subregion of the plurality of subregions.
- the area of interest may be divided into a plurality of areas based on a light range of the light sources.
- the division into a plurality of subregions is based on the location of the light sources relative to the subject. Hence, the area of interest may be divided into subregions based on the number of light sources and/or their location.
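A fixed mask keyed to the number of light sources can be sketched as a division of the area into equal strips, one per source. Equal-width vertical strips are an assumption; a real fixed mask could also account for each source's light range and location.

```python
# Sketch: fixed mask dividing an area (x, y, w, h) into as many vertical
# strips as there are light sources, so each strip can be associated with
# one source. Equal widths are an illustrative assumption.

def fixed_mask(x, y, w, h, n_sources):
    strip_w = w // n_sources
    strips = []
    for i in range(n_sources):
        sx = x + i * strip_w
        # Give any remainder pixels to the last strip.
        sw = strip_w if i < n_sources - 1 else w - i * strip_w
        strips.append((sx, y, sw, h))
    return strips

print(fixed_mask(0, 0, 90, 60, 4))
```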
- a computation unit may determine optical parameter values in each of the subregions.
- if the face of the subject tilts, the midsagittal mask 310 and the transverse mask 320 may recalculate the subregions in accordance with the tilting.
- the plurality of intensities applied with the light sources are determined based on the optical parameter values for each of the subregions.
- the determination of the plurality of light intensities further comprises calculating a plurality of correction factors based on a relative location of the area of interest within the image or, in case of having a plurality of subregions, the relative location of each subregion of the plurality of subregions.
- the plurality of correction factors may be applied to the plurality of light intensities associated with the plurality of subregions.
- the correction factors may take into account a relative location of the light source and the area of interest, i.e. a number of light sources and/or their location.
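One way such a location-based correction factor could look is sketched below. The linear falloff from the frame centre, and the maximum boost, are assumptions; the disclosure only states that the factor depends on relative location.

```python
# Sketch: correction factor based on a subregion's relative location within
# the frame. Subregions further from the frame centre receive a larger
# factor, under the (assumed) premise that edge-mounted light sources
# contribute less light there.

def correction_factor(cx, cy, frame_w, frame_h, max_boost=0.5):
    """cx, cy: centre of the subregion in pixels."""
    # Normalised distance from frame centre: 0.0 at centre, 1.0 at a corner.
    dx = abs(cx - frame_w / 2) / (frame_w / 2)
    dy = abs(cy - frame_h / 2) / (frame_h / 2)
    dist = max(dx, dy)
    return 1.0 + max_boost * dist

print(correction_factor(320, 240, 640, 480))  # centre -> 1.0
print(correction_factor(0, 0, 640, 480))      # corner -> 1.5
```

The factor would then multiply the light intensity computed for that subregion.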
- the division 300 of the area of interest into a plurality of subregions may be performed based on predefined subregions instead of a dynamic division.
- the predefined subregions may be obtained, for instance, by applying image filters such as a fixed mask that may divide the area of interest into a fixed number of subregions, for instance four.
- the number of subregions may be defined in accordance with a location of the light sources, e.g. distribution of the light sources with respect to the area of interest of the image.
- the device 400 comprising an imaging device 410 and a plurality of light sources 420 is shown.
- the device 400 may be, for instance, a computing system.
- the device 400 comprises the imaging device 410, the plurality of light sources 420, a processor 430, and a computer-readable medium 440 comprising instructions 450.
- Examples of computer-readable mediums include a hard drive, a random-access memory (RAM), a read-only memory (ROM), memory cards and sticks and other portable storage devices.
- the instructions 450 when executed by the processor 430, may cause the device 400 to: capture a frame with the imaging device 410, detect a face within the captured frame, divide a region of the frame including the face into a plurality of subregions associated with the plurality of light sources 420, calculate a plurality of light intensities for the plurality of light sources 420, and control the plurality of light sources 420 to emit the plurality of light intensities.
- the plurality of light intensities is calculated based on an optical condition for each subregion of the plurality of subregions. Examples of optical conditions comprise the optical parameter values previously described, for instance a brightness level, a contrast level, and a color temperature.
- the detection of the face may be performed with a subject identification application.
- the subject identification application may be an application executable by the processor 430.
- the division of the region of the frame including the face into a plurality of subregions may be performed as previously explained in FIG. 3.
- the imaging device 410 of the device 400 is to capture a new frame once a period of time has expired, wherein the processor 430 is to calculate a plurality of light intensities for the plurality of light sources 420 based on the new frame.
- the computer-readable medium 440 comprises further instructions 450 to cause the device 400 to: calculate a luminance histogram for each subregion of the plurality of subregions, and determine the optical condition for each subregion of the plurality of subregions based on a distribution of each of the luminance histograms.
- the luminance histogram represents the pixels within a subregion based on their brightness levels.
- an X-axis of the luminance histogram represents brightness levels in a range starting at zero (dark) and finishing at 255 (pure white) and the Y-axis of the luminance histogram represents the number of pixels of each brightness level within the subregion. Based on parameters of the distribution, such as an average value or a standard deviation, the optical condition of each subregion is determined.
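The histogram construction and the distribution parameters just described can be sketched as:

```python
# Sketch: build the 256-bin luminance histogram for a subregion, then
# compute the distribution parameters (mean and standard deviation) used
# to judge the subregion's optical condition.

import math

def luminance_histogram(pixels):
    """X-axis: brightness 0 (dark) to 255 (pure white); Y-axis: pixel count."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    return hist

def histogram_stats(hist):
    total = sum(hist)
    mean = sum(level * count for level, count in enumerate(hist)) / total
    var = sum(count * (level - mean) ** 2
              for level, count in enumerate(hist)) / total
    return mean, math.sqrt(var)

hist = luminance_histogram([10, 10, 200, 200])
print(histogram_stats(hist))  # -> (105.0, 95.0)
```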
- the plurality of light sources 420 are positioned remotely to the imaging device 410 of the device 400.
- the device 400 may remotely control the plurality of light sources 420 so as to emit the plurality of light intensities that have been calculated.
- the light sources 420 are distributed in the environment.
- a light device may comprise the plurality of light sources 420, the light device being, for instance, a ring light.
- a system 500 comprising an imaging device 510 and a device 520 is shown.
- the system 500 further comprises a subject 502 positioned at a portion of an environment located in front of both the imaging device 510 and the device 520.
- the device 520 includes light sources, wherein the light sources comprise a first light source 521, a second light source 522, a third light source 523, and a fourth light source 524.
- the device 520 further comprises a screen 525 to display visual information.
- the device 520 is a computing device (or computing system) and the subject 502 uses the imaging device 510 to capture images that are subsequently transmitted to other computing devices.
- the imaging device 510 may be used to record images that are stored in a memory (not shown in FIG. 5) of the device 520.
- the device 520 may further comprise the imaging device 510.
- a processor (not shown in FIG. 5) of the device 520 executes instructions to capture, with the imaging device 510, a frame of the environment in which the subject 502 is present. Once the frame is captured, the processor executes instructions to detect a face within the captured frame. As previously described in FIGs. 2 and 3, a subject identification application may be used to determine a region 503 of the frame in which the face is present. Then, the processor may execute instructions to divide the region 503 including the face into a plurality of subregions associated with the plurality of light sources, i.e. the first light source 521, the second light source 522, the third light source 523, and the fourth light source 524.
- the light sources may comprise subdivisions that allow different light intensities to be provided based on a position within the light source.
- a plurality of light intensities may be set to the plurality of light sources, i.e. the light intensity of each light source can be selectively controlled.
- the processor may determine an optical condition for each subregion.
- the optical conditions may be used, for instance, to calculate the plurality of light intensities for the plurality of light sources.
- the plurality of light intensities is calculated based on the optical conditions.
- the calculation of the plurality of light intensities for the plurality of light sources comprises calculating a correction factor for the region 503, wherein the correction factor is a function based on a relative position of the region 503 within the frame. In further examples, the function may be based on a relative position of the subregions within the frame.
- the first light source 521, the second light source 522, the third light source 523, and the fourth light source 524 are a plurality of light-emitting diode strips distributed along the perimeter of the screen 525.
- the plurality of light sources may be a plurality of lighting areas within screen 525, wherein the lighting areas are displayed adjacent to the visual information that is being deployed on screen 525.
- the set of instructions 650 may correspond, for instance, to the instructions 450 stored within the computer-readable medium 440 previously described in FIG. 4.
- a computer-readable medium may comprise the set of instructions 650 which, when executed by a processor, cause a system to execute the blocks 651, 652, 653, 654, and 655.
- the set of instructions 650 causes the system to capture a frame with a capturing device.
- the capturing device may correspond to an example of the imaging devices that have been previously described.
- the set of instructions 650 causes the system to identify a subject within the frame. In order to identify the subject, a subject identification application may be used.
- the set of instructions 650 causes the system to divide the region of the frame comprising the face of the subject into a plurality of areas.
- the division into areas may be performed as previously explained with reference to FIG. 3, i.e. by using image filters such as fixed masks or dynamic masks.
- the set of instructions 650 causes the system to determine a set of optical parameters for each area of the plurality of areas resulting from the division of the region.
- the set of optical parameters comprise a brightness level and a color temperature.
- the set of optical parameters comprises at least one of a brightness level, a contrast level and a color temperature.
- the set of instructions 650 causes the system to set a plurality of light intensities in a plurality of light sources based on the set of optical parameters.
- the light intensities are calculated with a function that outputs a light intensity for a light source based on an optical parameter of an area.
- the set of instructions 650 may comprise further instructions to cause the system to: calculate a plurality of correction factors associated with the plurality of areas based on relative locations of the areas within the frame, and apply the plurality of correction factors to the plurality of light intensities.
- the set of instructions 650 may comprise further instructions to cause the system to calculate a luminance histogram for each area of the plurality of areas and determine the set of optical parameters for each area of the plurality of areas based on a distribution of each of the luminance histograms.
- the set of instructions 650 may comprise further instructions to capture a new frame once a period of time has expired, wherein a second plurality of intensities are calculated based on the new frame.
- FIG. 7 a schematic 700 representing a process for determining a plurality of light intensities for a plurality of light sources is shown.
- the schematic 700 represents how a plurality of light intensities for a plurality of light sources are determined based on a frame captured by an imaging device.
- the imaging device may capture an image 701 from an environment, wherein the image 701 may comprise a subject 702.
- by a processor that executes identification instructions 753, a region 703 of the image 701 in which a face of the subject 702 is present is divided into a plurality of areas, as previously explained with reference to FIGs. 2 and 3.
- the processor executes computation instructions 754 to determine optical parameters 705 for each area of the plurality of areas.
- the optical parameters 705 comprise at least one of a brightness level, a contrast level, and a color temperature.
- the optical parameters 705 comprise a luminance histogram for each area of the plurality of areas, wherein an X-axis of the luminance histogram represents brightness levels in a range starting at zero (dark) and finishing at 255 (pure white) and the Y-axis of the luminance histogram represents the number of pixels of each brightness level within the area.
- other types of histograms may be used for other types of optical parameters 705.
- the threshold values 706 define a plurality of values to be compared with the optical parameter 705, and based on a result of the comparison, a plurality of light intensities is determined.
- the threshold values 706 may be different for each area of the plurality of areas, wherein users may customize the values based on their preferences.
- the brightness level is compared with the first and the second threshold value. If the brightness level determined for the area falls within a range from the first threshold value to the second threshold value, the brightness level is considered sufficient. If the brightness level determined for the area is greater than both the first threshold value and the second threshold value, the brightness level is considered excessive. If the brightness level determined for the area is lower than both the first threshold value and the second threshold value, the brightness level is considered deficient. Based on the condition (deficient, sufficient, excessive), the plurality of light intensities for the plurality of light sources is determined. In some other examples, the threshold values 706 may comprise different first and second threshold values for each area of the plurality of areas. Hence, for instance, areas within an upper region of the face may have a higher threshold value when compared to areas within a bottom region of the face.
- the plurality of light intensities is determined with a function of the optical parameters 705.
- the processor executes setting instructions 755 to set the plurality of light intensities to a plurality of light sources 720.
- the plurality of intensities is set to the plurality of light sources 720 based on a relative location of the plurality of light sources 720 with respect to the area. For instance, with a device having a light distribution for the light sources, such as the device 520 of FIG. 5, in case of determining that a top-right area of the plurality of areas is under a deficient condition, the light intensities of the light sources associated with such area, e.g. the first light source 521 and the second light source 522, may be modified in accordance with such condition.
- the setting instructions 755 further comprise instructions to apply a plurality of correction factors to the plurality of light intensities.
- the correction factors may be calculated based on a relative location of the region 703 within the image 701. In case the face of the subject 702 is determined to be on the right side of the image 701, the plurality of correction factors may correct such distribution by increasing the light intensities associated with the light sources that are further away from the subject and decreasing the light intensities associated with the light sources that are closer to the subject.
Abstract
According to an example, a method comprises capturing an image with an imaging device, identifying an area of interest within the image, dividing the area of interest into a plurality of subregions, determining an optical parameter value for each subregion, determining a plurality of light intensities associated with the subregions and setting the plurality of light intensities in a plurality of light sources.
Description
LIGHT INTENSITY DETERMINATION
BACKGROUND
[0001] Imaging devices may be used to capture images from environments under lighting conditions. Imaging devices may be used along with light sources capable of modifying the lighting conditions of such environments. The application of light with the light sources may enhance the lighting conditions of the environment, thereby improving a quality of the images captured by the imaging devices.
BRIEF DESCRIPTION OF DRAWINGS
[0002] Features of the present disclosure are illustrated by way of example and are not limited in the following figure(s), in which like numerals indicate like elements, in which:
[0003] FIG. 1 shows a method to determine a plurality of light intensities for a plurality of light sources, according to an example of the present disclosure;
[0004] FIG. 2 shows a determination of an area of interest for an image captured by an imaging device, according to an example of the present disclosure;
[0005] FIG. 3 shows a division of an area of interest into a plurality of subregions, according to an example of the present disclosure;
[0006] FIG. 4 shows a device comprising an imaging device and a plurality of light sources, according to an example of the present disclosure;
[0007] FIG. 5 shows a system comprising an imaging device and a device, according to an example of the present disclosure;
[0008] FIG. 6 shows a set of instructions readable by a processor, according to an example of the present disclosure;
[0009] FIG. 7 shows a schematic representing a process for determining a plurality of light intensities, according to an example of the present disclosure.
DETAILED DESCRIPTION
[0010] For simplicity and illustrative purposes, the present disclosure is described by referring mainly to examples. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent, however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure.
[0011] Throughout the present disclosure, the terms "a" and "an" are intended to denote at least one of a particular element. As used herein, the term "includes" means includes but not limited to, the term "including" means including but not limited to. The term "based on" means based at least in part on.
[0012] Users may use imaging devices to capture images from a portion of an environment. However, since the lighting conditions of such a portion may not be appropriate (e.g. high levels of darkness, shadows, color saturation, high contrasts, amongst others), the captured image may not fulfill a minimum image quality standard. In order to increase the overall quality, users may use external devices such as lighting devices to modify the lighting conditions of the portion of the environment that is to be captured by the imaging device. By modifying the original lighting conditions, the subsequent images captured by the imaging device may have a greater image quality when compared to the quality obtained under the original lighting conditions. Eventually, all the image quality defects present in the original image may be corrected. However, even if such a modification of the lighting conditions is attempted, the lighting conditions of the subsequent images captured by the imaging device may still be far from the desired lighting conditions and may still include shadows, high contrasts, color saturation, amongst others.
[0013] As used herein, the term “imaging device” will be used to refer to a device that can capture or record visual images of a portion of an environment. In some examples, the imaging device can include a camera or similar device to capture
images. For example, the imaging device can be a video camera to record a plurality of images that can be in a video format. In other examples, additional imaging devices may be used to capture images that are to be used to determine the lighting conditions of a portion of an environment. Although video cameras are utilized as examples herein, the disclosure is not so limited.
[0014] In some examples, computing devices can instruct imaging devices to capture images that can be transmitted to other computing devices. For example, a computing device may instruct an imaging device such as a video camera to capture a frame (or an image) to be transmitted to a remote computing device to be subsequently displayed at the remote computing device. Depending on the lighting conditions of the environment, the frame, or the image captured by the imaging device may have a poor quality that may result in a lower viewership of the audience. In other examples, computing devices can instruct external imaging devices to periodically capture an image to determine the lighting conditions of the environment.
[0015] As used herein, the term “computing device” will be used to refer to electronic systems having processor resources and memory resources. Examples of computing devices comprise a laptop computer, a notebook computer, a desktop computer, networking devices such as routers or switches, and mobile devices such as smartphones, tablets, personal digital assistants, smart glasses, and wrist-worn devices.
[0016] In order to improve the lighting conditions of an environment, users may use lighting sources along with the imaging devices, thereby enabling the user to increase the image quality of the subsequent images. As used herein, the term “lighting source” refers to a device capable of generating light. Examples of lighting sources comprise light-emitting diodes (LEDs), incandescent lamps, fluorescent lamps, organic light-emitting diodes (OLEDs), among other types of light-generating devices. In some examples, the light source can be altered to provide a range of colors between relatively cold colors and relatively warm colors. In other examples, the light sources may be red green blue (RGB) light sources that can change
between different colors. In some other examples, the light sources may be individually controlled to provide different colors, intensities or color temperatures.
[0017] However, since the image captured by the imaging device includes multiple regions containing objects positioned at different distances within the environment, the contribution of the light sources to the modification of the light conditions may not be the same in each region of the captured image. Hence, although a user aims to obtain an overall image quality improvement for the subsequent images, modifying an overall light intensity applied by the light sources may result in a subsequent image with an image quality which still does not fulfill the user’s expectations. In addition, in some examples, the presence of external light sources (e.g. windows with luminosity, floor lamps, spotlights, and desk lamps) within the environment leads to a disruption of the lighting conditions of a specific region of the captured image.
[0018] In order to effectively improve the quality of the images captured by imaging devices, methods to selectively apply light intensities with light sources may be used. In the same way, systems and devices may be used so as to provide an enhanced quality of lighting conditions.
[0019] According to some examples, imaging devices may be used to capture images in which subjects appear. Such subjects may be positioned in a foreground position within the environment, i.e. closer to the imaging device. In addition to the subject, other elements may be present within the images captured by the imaging device. Some of these other elements may be located in a background position within the environment, i.e., further away from the imaging device.
[0020] Since subjects may seek a better illumination within the region of the image in which they appear, the improvement of the lighting conditions may be limited to the area of the image in which the subject is present. In order to obtain an improved image quality around the subject, the light sources may be used to selectively apply a light intensity to improve the light conditions in a region of the image that includes the subject.
[0021] According to some other examples, a region of the image in which a subject is present may be determined with a subject identification application. For example, the subject identification application may identify different objects and select an object that takes up the largest quantity of pixels within the image. In other examples, the subject identification application may be capable of matching a human face from the image captured against a database of faces. In some other examples, the subject identification application may comprise pinpointing and measuring facial features from a given image in order to determine a region of an image in which a face of the subject is present.
[0022] Referring now to FIG. 1, a method 100 to determine a plurality of light intensities for a plurality of light sources is shown. Instead of setting the same light intensities to the plurality of light sources in order to improve the lighting conditions of the portion of the environment that is being captured by the imaging device, method 100 determines light intensities in accordance with the lighting condition of an area of interest. At block 110, method 100 comprises capturing an image with an imaging device. As described above, the imaging device may be part of a computer system or may be an external device used to capture an image that will be used to determine the lighting conditions. In some examples, the image can be periodically captured. At block 120, method 100 comprises identifying an area of interest within the image. In some examples, the area of interest within the image is determined to be the area where a face is present. The area of interest (or region of interest) may be determined by a subject identification application, wherein the subject identification application determines the region within the image in which the face is present. At block 130, method 100 comprises dividing the area of interest into a plurality of subregions, wherein the subregions may be obtained by applying a filter to the image, e.g. a fixed mask or a dynamic mask. In an example, the fixed mask is a filter that provides a division of the area of interest into a plurality of predefined regions taking into account the dimensions of the area of interest. In other examples, the dynamic mask is an image filter that provides a division of the area of interest based on physical features of the area of interest, for instance, facial features when
the area of interest comprises a face. Examples of dynamic masks comprise midsagittal masks, transverse masks, or a combination thereof.
[0023] At block 140, method 100 comprises determining optical parameter values for each subregion of the plurality of subregions. Examples of optical parameters comprise a brightness level, a contrast level, and a color temperature. The determination of the optical parameters may be determined by a computation unit (for instance a controller), wherein the computation unit outputs optical parameter values of a selected region by analyzing the values and/or the distribution of the pixels that are included in such region. At block 150, method 100 comprises determining a plurality of light intensities associated with the plurality of subregions based on the optical parameter values. In an example, the light intensities are calculated with a function of the optical parameter values. In other examples, the function may further comprise a plurality of correction factors based on a relative location of the area of interest within the image or a relative location of the subregions within the image. In some other examples, the light intensities are calculated based on a comparison of the optical parameter values with a plurality of threshold values. At block 160, once the plurality of light intensities for the plurality of light sources has been determined by the computation unit, method 100 comprises setting the plurality of light intensities in the plurality of light sources. The application of the plurality of light intensities with the plurality of light sources will modify the lighting conditions of the image, and more specifically, the lighting conditions of the area of interest.
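The flow of blocks 110 to 160 can be sketched in code. The following Python sketch is illustrative only: the grayscale frame representation, the 2x2 fixed mask, the threshold values (85 and 170) and the intensity levels are assumptions chosen for the example, not values taken from the disclosure.

```python
# Illustrative sketch of method 100 (blocks 130-150). The frame is a 2D
# list of 8-bit luminance values; the area of interest is a bounding box
# (top, left, bottom, right). All names and constants are hypothetical.

def mean_brightness(frame, top, left, bottom, right):
    """Block 140: one optical parameter value (mean luminance) for a subregion."""
    pixels = [frame[r][c] for r in range(top, bottom) for c in range(left, right)]
    return sum(pixels) / len(pixels)

def split_into_quadrants(top, left, bottom, right):
    """Block 130: divide the area of interest with a fixed 2x2 mask."""
    mid_r, mid_c = (top + bottom) // 2, (left + right) // 2
    return [(top, left, mid_r, mid_c), (top, mid_c, mid_r, right),
            (mid_r, left, bottom, mid_c), (mid_r, mid_c, bottom, right)]

def light_intensities(frame, area, lo=85, hi=170):
    """Blocks 140-150: one intensity (0.0-1.0) per subregion/light source."""
    intensities = []
    for sub in split_into_quadrants(*area):
        b = mean_brightness(frame, *sub)
        if b < lo:            # deficient: raise the associated light source
            intensities.append(1.0)
        elif b > hi:          # excessive: lower it
            intensities.append(0.2)
        else:                 # sufficient: keep a nominal level
            intensities.append(0.6)
    return intensities
```

For instance, a frame that is dark only in its upper-left quadrant yields a raised intensity for the first subregion and nominal intensities for the remaining three.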
[0024] In some examples, method 100 is executed periodically in order to provide a diagnostic of the lighting conditions. In an example, a user may use an imaging device to make a video call. In addition to the imaging device, the user may have a plurality of light sources to modify the lighting conditions of a portion of the environment capturable by the imaging device. During the video call, method 100 may be executed every minute in order to check that the lighting conditions meet a threshold level. In other examples, method 100 may be executed every twenty seconds. In further examples, the user can set time intervals
for the execution of method 100 while using the imaging device. In some examples, a computer system may comprise the plurality of lighting sources and the imaging device.
[0025] In other examples, a user may set a customized minimum lighting condition for the imaging device. For example, at block 150 of the method 100, the computation unit may determine the plurality of light intensities associated with the plurality of subregions based on the optical parameter values and a threshold parameter value (or values in case of having more than one) determined by the user or by a predefined configuration.
[0026] In some other examples, the method 100 may further comprise comparing the optical parameter values with at least a threshold value. In an example, determining a plurality of light intensities associated with the plurality of subregions based on the optical parameter values further comprises comparing the optical parameter values with a first threshold value and comparing the optical parameter values with a second threshold value. If the optical parameter value of the subregion is between the first threshold value and the second threshold value, the light intensity associated with the subregion is not modified. In further examples, if the optical parameter value of the subregion is greater than the second threshold value, the light intensity associated with the subregion is decreased. Similarly, if the optical parameter value of the subregion is lower than the first threshold value, the light intensity associated with the subregion is increased.
[0027] According to some examples, the optical parameter value is a brightness level and a first threshold value and a second threshold value are defined for the brightness level. Depending on the brightness level of the subregion, the subregion may be considered as being underexposed (if the brightness level is lower than the first threshold value), normally exposed (if the brightness level is between the first threshold value and the second threshold value), or overexposed (if the brightness level is greater than the second threshold value). The light intensity of the plurality of light sources may be modified based on a determined category, a polynomial function comprising the optical parameter values, or a function for the category.
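The three exposure categories of this example may be expressed as a small classification step. In the sketch below, the per-category update rule (a fixed step up or down) is an assumption standing in for the category function mentioned above, and the threshold values are likewise illustrative.

```python
def exposure_category(brightness, t1, t2):
    """Classify a subregion's brightness level against two thresholds;
    t1 < t2 are assumed to be on the 0-255 luminance scale."""
    if brightness < t1:
        return "underexposed"
    if brightness > t2:
        return "overexposed"
    return "normally exposed"

def adjusted_intensity(current, category, step=0.1):
    """One possible per-category update rule (an assumption, not the
    disclosed function): nudge the light source up or down by a fixed
    step, clamped to the 0.0-1.0 intensity range."""
    if category == "underexposed":
        return min(1.0, current + step)
    if category == "overexposed":
        return max(0.0, current - step)
    return current
```

Running the classification on each new frame lets the intensity converge gradually rather than jumping, which is one reasonable design choice when the lighting is re-evaluated periodically.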
[0028] Referring now to FIG. 2, a determination 200 of an area of interest 203 for an image 201 is shown. Image 201 may be a frame that has been captured by an imaging device (not shown in FIG. 2), wherein the imaging device may be part of a computing system or an external device. In an example, image 201 may result from the block 110 of method 100. Image 201 includes a subject 202 and the area of interest 203, wherein the area of interest 203 may be determined with a subject identification application, as described above. The identification application may identify (block 120 of the method 100) an area or region in which a face of the subject 202 is present by analyzing the image 201 captured by the imaging device. Once the area of interest 203 is determined, the area may be divided into a plurality of subregions (block 130 of the method 100). However, for simplicity reasons, in FIG. 2 the area of interest 203 is shown as a single region, i.e. not subdivided. In addition to identifying the area of interest 203, the determination 200 comprises calculating (block 140 of method 100) optical parameter values 205 for the area of interest 203. As described above, the calculation may be performed with a computation unit such as a processor. In other examples, the area of interest 203 is divided into a plurality of subregions and the optical parameter values 205 comprise parameters associated with each of the subregions.
[0029] According to some examples, masks may be applied to an area in order to obtain a plurality of areas from the area. As used herein, the term “mask” will be used to refer to an image filter that, when used in a filtering operation, divides the area into a plurality of areas. When performing filtering operations, different types of division may be possible, such as dividing areas into pluralities of arbitrary areas (for instance when using fixed masks), dividing areas into pluralities of areas associated with light ranges of the light sources (for instance when using fixed masks), dividing areas into pluralities of areas based on facial features contained within the areas (when using dynamic masks), amongst others. Examples of masks comprise fixed masks, dynamic masks, or a combination thereof.
[0030] Referring now to FIG. 3, a division 300 of an area of interest into a plurality of subregions is shown. The area of interest may be, for instance, the area of interest
203 previously described in reference with FIG. 2. The example division 300, which may correspond to the block 130 of method 100, comprises applying a dynamic mask (i.e. an image filter) comprising a midsagittal mask 310 and a transverse mask 320 to the area of interest so that a plurality of subregions is obtained. The application of the midsagittal mask 310, which detects facial features within the area of interest, divides the region of interest into a left subregion 311 and a right subregion 312. Similarly, the application of the transverse mask 320 divides the area of interest into an upper subregion 321 and a bottom subregion 322. As a result, the area of interest is divided into a plurality of subregions comprising a first subregion 331, a second subregion 332, a third subregion 333, and a fourth subregion 334. In the first subregion 331, the left subregion 311 and the upper subregion 321 overlap. In the second subregion 332, the right subregion 312 and the upper subregion 321 overlap. In the third subregion 333, the left subregion 311 and the bottom subregion 322 overlap. In the fourth subregion 334, the right subregion 312 and the bottom subregion 322 overlap. In other examples, the area of interest may be divided into a plurality of subregions by applying a fixed mask, wherein the fixed mask divides the area of interest based on the number of light sources, wherein each of the light sources is associated with at least one subregion of the plurality of subregions. In some other examples, the area of interest may be divided into a plurality of areas based on a light range of the light sources. In further examples, the division into a plurality of subregions is based on the location of the light sources relative to the subject. Hence, the area of interest may be divided into subregions based on the number of light sources and/or their location.
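A minimal sketch of such a landmark-driven division follows, assuming a facial-landmark detector supplies a split point (e.g. the nose tip) in pixel coordinates; the function names and the clamping behaviour are illustrative, not part of the disclosure.

```python
def dynamic_quadrants(face_box, landmark):
    """Divide a face bounding box into the four subregions 331-334 of
    FIG. 3, splitting at a facial landmark (e.g. the nose tip) rather
    than at the geometric centre, so the division follows the face as
    it moves. `landmark` is an assumed (row, col) point from a landmark
    detector; `face_box` is (top, left, bottom, right)."""
    top, left, bottom, right = face_box
    mid_r = min(max(landmark[0], top + 1), bottom - 1)   # transverse mask 320
    mid_c = min(max(landmark[1], left + 1), right - 1)   # midsagittal mask 310
    return {
        "upper_left":  (top, left, mid_r, mid_c),        # first subregion 331
        "upper_right": (top, mid_c, mid_r, right),       # second subregion 332
        "lower_left":  (mid_r, left, bottom, mid_c),     # third subregion 333
        "lower_right": (mid_r, mid_c, bottom, right),    # fourth subregion 334
    }
```

When the landmark moves (for instance when the face tilts), recomputing the quadrants from the new landmark position reproduces the recalculation behaviour described above.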
[0031] Once the area of interest is divided into subregions, a computation unit may determine optical parameter values in each of the subregions. By having a dynamic division of the region of interest, if the subject tilts their face, the midsagittal mask 310 and the transverse mask 320 may recalculate the subregions in accordance with the tilting. In an example, the plurality of intensities applied with the light sources is determined based on the optical parameter values for each of the subregions.
[0032] In other examples, the determination of the plurality of light intensities further comprises calculating a plurality of correction factors based on a relative location of the area of interest within the image or, in case of having a plurality of subregions, the relative location of each subregion of the plurality of subregions. The plurality of correction factors may be applied to the plurality of light intensities associated with the plurality of subregions. In other examples, the correction factors may take into account a relative location of the light source and the area of interest, i.e. a number of light sources and/or their location.
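One possible correction-factor scheme is sketched below, under the assumption of two columns of light sources (left and right) and a linear dependence on the horizontal offset of the area of interest within the frame; the disclosure does not specify the actual function.

```python
def correction_factors(face_center_x, frame_width, gain=0.5):
    """A hypothetical correction-factor rule: the horizontal offset of
    the area of interest from the frame centre, normalised to [-1, 1],
    raises the factor of the light-source column on the far side and
    lowers the one on the near side. Returns [left_factor, right_factor]."""
    offset = (face_center_x - frame_width / 2) / (frame_width / 2)
    # Face on the right (offset > 0): boost the left (far) sources,
    # attenuate the right (near) sources, and vice versa.
    return [1.0 + gain * offset, 1.0 - gain * offset]
```

Multiplying each computed light intensity by the factor of its source's column then implements the correction described for a subject positioned off-centre.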
[0033] As previously described, in other examples the division 300 of the area of interest into a plurality of subregions may be performed based on predefined subregions instead of a dynamic division. The predefined subregions may be obtained, for instance, by applying image filters such as a fixed mask that may divide the area of interest into a fixed number of subregions, for instance four. In some examples, the number of subregions may be defined in accordance with a location of the light sources, e.g. distribution of the light sources with respect to the area of interest of the image.
[0034] Referring now to FIG. 4, a device 400 comprising an imaging device 410 and a plurality of light sources 420 is shown. The device 400 may be, for instance, a computing system. The device 400 comprises the imaging device 410, the plurality of light sources 420, a processor 430, and a computer-readable medium 440 comprising instructions 450. Examples of computer-readable mediums include a hard drive, a random-access memory (RAM), a read-only memory (ROM), memory cards and sticks and other portable storage devices. The instructions 450, when executed by the processor 430, may cause the device 400 to: capture a frame with the imaging device 410, detect a face within the captured frame, divide a region of the frame including the face into a plurality of subregions associated with the plurality of light sources 420, calculate a plurality of light intensities for the plurality of light sources 420, and control the plurality of light sources 420 to emit the plurality of light intensities. In an example, the plurality of light intensities is calculated based on an optical condition for each subregion of the plurality of subregions. Examples of
optical conditions comprise the optical parameter values previously described, for instance a brightness level, a contrast level, and a color temperature.
[0035] As previously explained in reference with FIG. 2, the detection of the face may be performed with a subject identification application. The subject identification application may be an application executable by the processor 430. In the same way, the division of the region of the frame including the face into a plurality of subregions may be performed as previously explained in FIG. 3.
[0036] According to some examples, the imaging device 410 of the device 400 is to capture a new frame once a period of time has expired, wherein the processor 430 is to calculate a plurality of light intensities for the plurality of light sources 420 based on the new frame. In some other examples, the computer-readable medium 440 comprises further instructions 450 to cause the device 400 to: calculate a luminance histogram for each subregion of the plurality of subregions, and determine the optical condition for each subregion of the plurality of subregions based on a distribution of each of the luminance histograms. In an example, the luminance histogram represents the pixels within a subregion based on their brightness levels. For instance, an X-axis of the luminance histogram represents brightness levels in a range starting at zero (dark) and finishing at 255 (pure white) and the Y-axis of the luminance histogram represents the number of pixels of each brightness level within the subregion. Based on parameters of the distribution, such as an average value or a standard deviation, the optical condition of each subregion is determined.
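The luminance histogram and its distribution parameters can be computed as follows; this is a plain-Python sketch for 8-bit luminance values, with function names chosen for the example.

```python
def luminance_histogram(pixels):
    """256-bin histogram: X-axis 0 (dark) to 255 (pure white), Y-axis
    the number of pixels at each brightness level within the subregion."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    return hist

def histogram_stats(hist):
    """Mean and standard deviation of the brightness distribution, from
    which an optical condition (e.g. underexposed) can be derived."""
    n = sum(hist)
    mean = sum(level * count for level, count in enumerate(hist)) / n
    var = sum(count * (level - mean) ** 2 for level, count in enumerate(hist)) / n
    return mean, var ** 0.5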
[0037] According to some other examples, the plurality of light sources 420 are positioned remotely to the imaging device 410 of the device 400. The device 400 may remotely control the plurality of light sources 420 so as to emit the plurality of light intensities that have been calculated. In an example, the light sources 420 are distributed in the environment. In other examples, a light device may comprise the plurality of light sources 420, the light device being, for instance, a ring light.
[0038] Referring now to FIG. 5, a system 500 comprising an imaging device 510 and a device 520 is shown. The system 500 further comprises a subject 502 positioned at a portion of an environment located in front of both the imaging device 510 and
the device 520. The device 520 includes light sources, wherein the light sources comprise a first light source 521 , a second light source 522, a third light source 523, and a fourth light source 524. In addition, the device 520 further comprises a screen 525 to display visual information. In an example, the device 520 is a computing device (or computing system) and the subject 502 uses the imaging device 510 to capture images that are subsequently transmitted to other computing devices. In other examples, the imaging device 510 may be used to record images that are stored in a memory (not shown in FIG. 5) of the device 520. In other examples, the device 520 may further comprise the imaging device 510.
[0039] In the example of FIG. 5, a processor (not shown in FIG. 5) of the device 520 executes instructions to capture, with the imaging device 510, a frame of the environment in which the subject 502 is present. Once the frame is captured, the processor executes instructions to detect a face within the captured frame. As previously described in FIGs. 2 and 3, a subject identification application may be used to determine a region 503 of the frame in which the face is present. Then, the processor may execute instructions to divide the region 503 including the face into a plurality of subregions associated with the plurality of light sources, i.e. the first light source 521, the second light source 522, the third light source 523, and the fourth light source 524. In some examples, the light sources may comprise subdivisions that allow providing different light intensities based on a position of the light source. As described above, a plurality of light intensities may be set to the plurality of light sources, i.e. the light intensity of each light source can be selectively controlled. Upon division of the region 503 into a plurality of subregions, the processor may determine an optical condition for each subregion. The optical conditions may be used, for instance, to calculate the plurality of light intensities for the plurality of light sources. In an example, the plurality of light intensities is calculated based on the optical conditions. In other examples, the calculation of the plurality of light intensities for the plurality of light sources comprises calculating a correction factor for the region 503, wherein the correction factor is a function based on a relative position of the region 503 within the frame. In further examples, the function may be based on a relative position of the subregions within the frame.
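The association between subregions and the light sources 521-524 can be sketched as a lookup. The particular quadrant-to-source mapping, the nominal level and the intensity step below are assumptions for illustration; the device may associate subregions and sources differently.

```python
# Hypothetical mapping from face subregions to the light sources 521-524
# of FIG. 5 (an assumption for illustration, not taken from the figure).
SUBREGION_TO_SOURCE = {
    "upper_left": 521, "upper_right": 522,
    "lower_left": 523, "lower_right": 524,
}

def set_sources(conditions, nominal=0.6, step=0.3):
    """Map per-subregion conditions ('deficient', 'sufficient',
    'excessive') to a per-source intensity in the range 0.0-1.0."""
    delta = {"deficient": +step, "sufficient": 0.0, "excessive": -step}
    return {SUBREGION_TO_SOURCE[sub]: round(nominal + delta[cond], 3)
            for sub, cond in conditions.items()}
```

For example, a deficient upper-left subregion raises the intensity of light source 521 while the others stay at their nominal or reduced levels, mirroring the selective control described above.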
[0040] According to some examples, the first light source 521, the second light source 522, the third light source 523, and the fourth light source 524 are a plurality of light-emitting diode strips distributed along the perimeter of the screen 525. However, in other examples, the plurality of light sources may be a plurality of lighting areas within the screen 525, wherein the lighting areas are displayed adjacent to the visual information that is being displayed on the screen 525.
[0041] Referring now to FIG. 6, a set of instructions 650 is shown. The set of instructions 650 may correspond, for instance, to the instructions 450 stored within the computer-readable medium 440 previously described in FIG. 4. In other examples, a computer-readable medium may comprise the set of instructions 650 that, when executed by a processor, cause a system to execute the blocks 651, 652, 653, 654, and 655. At block 651, the set of instructions 650 causes the system to capture a frame with a capturing device. The capturing device may correspond to an example of the imaging devices that have been previously described. At block 652, the set of instructions 650 causes the system to identify a subject within the frame. In order to identify the subject, a subject identification application may be used. At block 653, the set of instructions 650 causes the system to divide the region of the frame comprising the face of the subject into a plurality of areas. The division into areas may be performed as previously explained in reference with FIG. 3, i.e. by using image filters such as fixed masks or dynamic masks. At block 654, the set of instructions 650 causes the system to determine a set of optical parameters for each area of the plurality of areas resulting from the division of the region. In some examples, the set of optical parameters comprises a brightness level and a color temperature. In other examples, the set of optical parameters comprises at least one of a brightness level, a contrast level and a color temperature. At block 655, the set of instructions 650 causes the system to set a plurality of light intensities in a plurality of light sources based on the set of optical parameters. In an example, the light intensities are calculated with a function that outputs a light intensity for a light source based on an optical parameter of an area.
[0042] According to some examples, the set of instructions 650 may comprise further instructions to cause the system to: calculate a plurality of correction factors associated with the plurality of areas based on relative locations of the areas within the frame, and apply the plurality of correction factors to the plurality of light intensities.
[0043] According to some other examples, the set of instructions 650 may comprise further instructions to cause the system to calculate a luminance histogram for each area of the plurality of areas and determine the set of optical parameters for each area of the plurality of areas based on a distribution of each of the luminance histograms. In other examples, the set of instructions 650 may comprise further instructions to capture a new frame once a period of time has expired, wherein a second plurality of intensities is calculated based on the new frame.
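A per-area luminance histogram, and a parameter derived from its distribution, can be sketched as follows. The choice of the median brightness as the derived parameter is an assumption for illustration; the document leaves the distribution-based parameter open.

```python
# Sketch: a 256-bin luminance histogram for an area, and one parameter
# derived from its distribution (here the median brightness level).
# The area is a list of pixel rows with values 0 (dark) to 255 (white).

def luminance_histogram(area):
    hist = [0] * 256
    for row in area:
        for p in row:
            hist[p] += 1
    return hist

def median_brightness(hist):
    """Brightness level below which at least half of the pixels fall."""
    total = sum(hist)
    cumulative = 0
    for level, count in enumerate(hist):
        cumulative += count
        if cumulative * 2 >= total:
            return level
    return 255

area = [[10, 10, 200], [10, 200, 200]]
hist = luminance_histogram(area)
print(hist[10], hist[200])      # 3 pixels at level 10, 3 at level 200
print(median_brightness(hist))  # 10
```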
[0044] Referring now to FIG. 7, a schematic 700 representing a process for determining a plurality of light intensities for a plurality of light sources is shown. The schematic 700 represents how a plurality of light intensities for a plurality of light sources is determined based on a frame captured by an imaging device. As described above, the imaging device may capture an image 701 from an environment, wherein the image 701 may comprise a subject 702. By using a processor that executes identification instructions 753, a region 703 of the image 701 in which a face of the subject 702 is present is divided into a plurality of areas, as previously explained with reference to FIGS. 2 and 3. Upon division of the region 703 into the plurality of areas, the processor executes computation instructions 754 to determine optical parameters 705 for each area of the plurality of areas. In an example, the optical parameters 705 comprise at least one of a brightness level, a contrast level, and a color temperature. In other examples, the optical parameters 705 comprise a luminance histogram for each area of the plurality of areas, wherein an X-axis of the luminance histogram represents brightness levels in a range starting at zero (dark) and finishing at 255 (pure white) and a Y-axis of the luminance histogram represents the number of pixels of each brightness level within the area. However, other types of histograms may be used for other types of optical parameters 705.
[0045] When the optical parameters 705 have been determined, a comparison with threshold values 706 is carried out. The threshold values 706 define a plurality of values to be compared with the optical parameters 705, and based on a result of the comparison, a plurality of light intensities is determined. In some examples, the threshold values 706 may be different for each area of the plurality of areas, wherein users may customize the values based on their preferences.
[0046] In an example in which the optical parameters 705 comprise a brightness level and the threshold values 706 comprise a first threshold value and a second threshold value, the brightness level is compared with the first and the second threshold values. If the brightness level determined for the area falls within a range from the first threshold value to the second threshold value, the brightness level is considered sufficient. If the brightness level determined for the area is greater than both the first threshold value and the second threshold value, the brightness level is considered excessive. If the brightness level determined for the area is lower than both the first threshold value and the second threshold value, the brightness level is considered deficient. Based on the condition (deficient, sufficient, excessive), the plurality of light intensities for the plurality of light sources is determined. In some other examples, the threshold values 706 may comprise different first and second threshold values for each area of the plurality of areas. Hence, for instance, areas within an upper region of the face may have a higher threshold value when compared to areas within a bottom region of the face.
[0047] In other examples, the plurality of light intensities is determined with a function of the optical parameters 705. When the light intensities have been determined, the processor executes setting instructions 755 to set the plurality of light intensities to a plurality of light sources 720. According to an example, the plurality of intensities is set to the plurality of light sources 720 based on a relative location of the plurality of light sources 720 with respect to the area. For instance, for a device having a light distribution for the light sources, such as the device 520 of FIG. 5, in case of determining that a top-right area of the plurality of areas is under a deficient condition, the light intensities of the light sources associated with such area, e.g. the first light source 521 and the second light source 522, may be modified in accordance with such condition.
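The association between face areas and perimeter light sources, and the resulting adjustment, can be sketched as below. The document only states that the top-right area is associated with light sources 521 and 522; the remaining area-to-source mapping and the fixed ±10% step are assumptions for illustration.

```python
# Sketch: each face area is associated with the perimeter light sources
# nearest to it (only the top_right pairing comes from the description;
# the rest of the mapping and the step size are assumptions). A
# deficient area raises its sources' intensity, an excessive one lowers it.

AREA_TO_SOURCES = {
    "top_right": [521, 522],
    "top_left": [522, 523],
    "bottom_left": [523, 524],
    "bottom_right": [524, 521],
}
STEP = {"deficient": +10, "sufficient": 0, "excessive": -10}

def adjust(intensities, conditions):
    out = dict(intensities)
    for area, condition in conditions.items():
        for source in AREA_TO_SOURCES[area]:
            out[source] = max(0, min(100, out[source] + STEP[condition]))
    return out

start = {521: 50, 522: 50, 523: 50, 524: 50}
print(adjust(start, {"top_right": "deficient", "top_left": "sufficient",
                     "bottom_left": "excessive", "bottom_right": "sufficient"}))
# -> {521: 60, 522: 60, 523: 40, 524: 40}
```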
[0048] In other examples, the setting instructions 755 further comprise instructions to apply a plurality of correction factors to the plurality of light intensities. The correction factors may be calculated based on a relative location of the region 703 within the image 701. In case the face of the subject 702 is determined to be on the right side of the image 701, the plurality of correction factors may correct such distribution by increasing the light intensities associated with the light sources that are further away from the subject and decreasing the light intensities associated with the light sources that are closer to the subject.
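One way to realize such distance-dependent correction factors is sketched below. The normalized source positions and the linear falloff rule are assumptions; the document only requires that farther sources are boosted and nearer sources attenuated.

```python
# Sketch of paragraph [0048]: sources farther from the face get a
# correction factor > 1, nearer sources a factor < 1. Positions are
# normalized to 0..1 across the frame width; the linear rule and the
# strength value are illustrative assumptions.

def correction_factors(face_x, source_xs, strength=0.5):
    """Factor is 1 when a source is at median distance (0.5) from the face."""
    return {s: 1 + strength * (abs(x - face_x) - 0.5) * 2
            for s, x in source_xs.items()}

# Face on the right side of the frame; two sources on each side edge.
factors = correction_factors(0.8, {"left_a": 0.0, "left_b": 0.1,
                                   "right_a": 0.9, "right_b": 1.0})
corrected = {s: round(50 * f, 1) for s, f in factors.items()}
print(corrected)
# -> {'left_a': 65.0, 'left_b': 60.0, 'right_a': 30.0, 'right_b': 35.0}
```

With a base intensity of 50, the far (left) sources are increased and the near (right) sources decreased, as the paragraph describes.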
[0049] What has been described and illustrated herein are examples of the disclosure along with some variations. The terms, descriptions, and figures used herein are set forth by way of illustration only and are not meant as limitations. Many variations are possible within the scope of the disclosure, which is intended to be defined by the following claims (and their equivalents) in which all terms are meant in their broadest reasonable sense unless otherwise indicated.
Claims
1. A method comprising: capturing an image with an imaging device; identifying an area of interest within the image in which a face is present; dividing the area of interest into a plurality of subregions; determining an optical parameter value for each subregion of the plurality of subregions; determining a plurality of light intensities associated with the plurality of subregions based on the optical parameter values; and setting the plurality of light intensities in a plurality of light sources.
2. The method of claim 1, wherein dividing the area of interest into the plurality of subregions comprises: applying a midsagittal mask that divides the area of interest into a left subregion and a right subregion; and applying a transverse mask that divides the area of interest into a top subregion and a bottom subregion.
3. The method of claim 1, wherein determining the plurality of light intensities further comprises: calculating a plurality of correction factors based on a relative location of the area of interest within the image; and applying the plurality of correction factors to the plurality of light intensities associated with the plurality of subregions.
4. The method of claim 1, wherein the optical parameter value is a brightness level.
5. The method of claim 1, wherein determining the plurality of light intensities associated with the plurality of subregions based on the optical parameter values further comprises: comparing the optical parameter values with a first threshold value; and comparing the optical parameter values with a second threshold value, wherein if the optical parameter value of the subregion is between the first threshold value and the second threshold value, the light intensity associated with the subregion is not modified.
6. The method of claim 5, wherein: if the optical parameter value of the subregion is greater than the first threshold value, the light intensity associated with the subregion is increased; and if the optical parameter value of the subregion is lower than the second threshold value, the light intensity associated with the subregion is decreased.
7. A device comprising: an imaging device; a plurality of light sources; a processor; and a computer-readable medium comprising instructions that, when executed by the processor, cause the device to: capture a frame with the imaging device; detect a face within the captured frame; divide a region of the frame including the face into a plurality of subregions associated with the plurality of light sources; calculate a plurality of light intensities for the plurality of light sources, wherein the plurality of light intensities is calculated based on an optical condition for each subregion of the plurality of subregions; and control the plurality of light sources to emit the plurality of light intensities.
8. The device of claim 7, wherein the imaging device is to capture a new frame once a period of time has expired, wherein the processor is to calculate a plurality of light intensities for the plurality of light sources based on the new frame.
9. The device of claim 7, wherein the computer-readable medium comprises further instructions to cause the device to: calculate a luminance histogram for each subregion of the plurality of subregions; and determine the optical condition for each subregion of the plurality of subregions based on a distribution of each of the luminance histograms.
10. The device of claim 7 further comprising a screen to display visual information.
11. The device of claim 10, wherein the plurality of light sources is positioned adjacent to a perimeter of the screen.
12. The device of claim 11, wherein the plurality of light sources is: a plurality of light-emitting diode strips distributed along the perimeter of the screen; or a plurality of lighting areas within the screen, wherein the lighting areas are displayed adjacent to the visual information.
13. A computer-readable medium comprising instructions that, when executed by a processor, cause a system to: capture a frame with a capturing device; identify a subject within the frame; divide a region of the frame comprising the subject into a plurality of areas; determine a set of optical parameters for each area of the plurality of areas; and set a plurality of light intensities in a plurality of light sources of the system based on the set of optical parameters.

14. The computer-readable medium of claim 13, further comprising instructions to cause the system to: calculate a plurality of correction factors associated with the plurality of areas based on relative locations of the areas within the frame; and apply the plurality of correction factors to the plurality of light intensities.

15. The computer-readable medium of claim 14, wherein the set of optical parameters comprises: a brightness level; and a color temperature.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2020/063963 WO2022125087A1 (en) | 2020-12-09 | 2020-12-09 | Light intensity determination |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022125087A1 (en) | 2022-06-16 |
Family
ID=81974656
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2020/063963 Ceased WO2022125087A1 (en) | 2020-12-09 | 2020-12-09 | Light intensity determination |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022125087A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5016039A (en) * | 1988-05-07 | 1991-05-14 | Nikon Corporation | Camera system |
| WO2004029861A1 (en) * | 2002-09-24 | 2004-04-08 | Biometix Pty Ltd | Illumination for face recognition |
| US20060018641A1 (en) * | 2004-07-07 | 2006-01-26 | Tomoyuki Goto | Vehicle cabin lighting apparatus |
| US20130128073A1 (en) * | 2011-11-22 | 2013-05-23 | Samsung Electronics Co. Ltd. | Apparatus and method for adjusting white balance |
| US20180198964A1 (en) * | 2011-01-28 | 2018-07-12 | Windy Place, Inc. | Lighting and power devices and modules |
| CN109348138A (en) * | 2018-10-12 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Light irradiation regulating method, device, equipment and storage medium |
- 2020-12-09: PCT/US2020/063963 filed; published as WO2022125087A1 (not active, Ceased)
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10523856B2 (en) | Method and electronic device for producing composite image | |
| US11689817B2 (en) | Method and apparatus for automatically detecting and suppressing fringes, electronic device and computer-readable storage medium | |
| US9936141B2 (en) | Image processing apparatus for performing correction processing for effecting virtual light source and method executed by image processing apparatus | |
| US8106958B2 (en) | Image processing apparatus and image processing method and image capturing apparatus | |
| EP3158833B1 (en) | High-dynamic-range coded light detection | |
| US20160110846A1 (en) | Automatic display image enhancement based on user's visual perception model | |
| KR102491544B1 (en) | Imaging processing device and Imaging processing method | |
| US9497433B2 (en) | Imaging device with color adjustment function, imaging method using the same, and non-transitory storage medium in which imaging program is stored | |
| US8285133B2 (en) | Dynamic lighting control in hybrid camera-projector device | |
| US11019254B2 (en) | Image processing apparatus, control method for image processing apparatus, and storage medium having correction of effect of virtual light source | |
| CN110809120A (en) | Light supplementing method for shot picture, smart television and computer readable storage medium | |
| CN111315071B (en) | LED intelligent control method and system | |
| CN110784701A (en) | Display apparatus and image processing method thereof | |
| CN113140197A (en) | Display picture adjusting method and device, electronic equipment and readable storage medium | |
| CN111654643A (en) | Exposure parameter determination method and device, unmanned aerial vehicle and computer readable storage medium | |
| CN113709949A (en) | Control method and device of lighting equipment, electronic equipment and storage medium | |
| US11670255B2 (en) | Signal light display determination device, signal light display determination method, and non-transitory computer-readable recording medium | |
| US20230239559A1 (en) | Activating light sources for output image | |
| TWI819672B (en) | Method for determining ambient light luminance, host, and computer readable storage medium | |
| US8878957B2 (en) | Method, system and computer program product for enhancing white balance of an image | |
| WO2022271161A1 (en) | Light compensations for virtual backgrounds | |
| WO2022125087A1 (en) | Light intensity determination | |
| CN113099191B (en) | Image processing method and device | |
| JP2013132065A (en) | Imaging apparatus and flash control method | |
| US20250239198A1 (en) | Display control device, display control method, image processing system, and storage medium |
Legal Events
| Code | Title | Description |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20965273; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 20965273; Country of ref document: EP; Kind code of ref document: A1 |