WO2023001373A1 - Device and method for processing image data - Google Patents

Device and method for processing image data Download PDF

Info

Publication number
WO2023001373A1
WO2023001373A1 (PCT/EP2021/070464)
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
image data
flickering
image
image sensor
Prior art date
Application number
PCT/EP2021/070464
Other languages
French (fr)
Inventor
Mikko Muukki
Radu Ciprian Bilcu
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to PCT/EP2021/070464 priority Critical patent/WO2023001373A1/en
Priority to CN202180098736.5A priority patent/CN117426104A/en
Publication of WO2023001373A1 publication Critical patent/WO2023001373A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/745Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination

Definitions

  • the present disclosure relates generally to the field of processing image data, and, more particularly, to a device and a method for processing image data.
  • a device and a method may obtain image data from an image sensor, and may detect flickering in the image data of the image sensor.
  • a rolling shutter is used in a mobile camera for capturing pictures or image frames.
  • a scene includes a light source or a target that flickers, this may cause artifacts in image frames captured with the use of the rolling shutter camera. These artifacts may depend on the exposure time used by the rolling shutter camera and the frequency of the flickering.
  • the flickering may be due to a varying brightness, such as caused by an Alternating current (AC) power or Pulse-width modulation (PWM) powered light source.
  • the flickering may become visible, for example, if an exposure time of a captured image is shorter than 1/f, where f is the flickering frequency.
  • the flickering may also be visible if the exposure time is not an integer multiple of 1/f, for example, when the exposure time is longer than 1/f.
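The two visibility conditions above (exposure shorter than 1/f, or not an integer multiple of 1/f) can be captured in a small sketch. The function name and tolerance are illustrative assumptions, not part of the disclosure.

```python
def flicker_visible(exposure_time_s: float, flicker_freq_hz: float,
                    tolerance: float = 1e-6) -> bool:
    """Return True if banding may appear: the exposure time is shorter
    than one flicker period 1/f, or not an integer multiple of it."""
    period = 1.0 / flicker_freq_hz
    multiple = exposure_time_s / period
    # round(multiple) == 0 covers exposures shorter than half a period
    return abs(multiple - round(multiple)) > tolerance or round(multiple) == 0

# A 5 ms exposure under 100 Hz flicker (10 ms period) may show banding:
print(flicker_visible(0.005, 100.0))  # True
# A 20 ms exposure is exactly two flicker periods, so no banding:
print(flicker_visible(0.020, 100.0))  # False
```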
  • a conventional device uses an active rolling shutter camera for detecting flickering. However, an issue of this conventional device is that it generates visible artifacts in order to make the flickering visible for analysis. Therefore, the flickering becomes visible to the user, or appears in recorded videos.
  • Another conventional device uses a separate image sensor for detecting flickering.
  • an assistant rolling shutter camera may be used to detect flickering at 50 Hz to 60 Hz.
  • an issue of using such an additional image sensor is that the cost of production of the conventional device is high.
  • Another conventional device is based on integrating a flicker detection component into another sensor, for example, by adding some extra functionality into the component.
  • an issue of such a conventional device is that the complexity of the other sensor is high, for example, because some sensors may not support a functionality for detecting flickering.
  • embodiments of the present disclosure aim to improve the conventional devices and methods for processing image data in view of flickering.
  • An objective is to provide a device and method that may use a set of pixels (for example, event pixels) for detecting flickering. Thereby, the device and method should be able to reliably reduce flickering in image data.
  • a first aspect of the present disclosure provides a device for processing image data, the device being configured to obtain first image data of a scene, wherein the first image data is obtained with a first set of pixels of at least one image sensor, estimate flickering information based on the first image data, estimate an exposure time for a second set of pixels of the at least one image sensor based on the estimated flickering information, and obtain second image data of the scene, wherein the second image data is obtained with the estimated exposure time and the second set of pixels of the at least one image sensor.
  • the device may be, or may be incorporated in, a digital camera, a digital video recorder, a smart phone, an augmented reality device, a virtual reality device, or the like.
  • the at least one image sensor may be or may comprise an event sensor, an event camera, or the like.
  • the at least one image sensor e.g., the event sensor, may respond to changes in the incoming light intensity.
  • the at least one image sensor when being an event sensor, may generate image data, which may include event data.
  • the event data may be generated when changes in the intensity of the incoming light are detected, for example, when a global or a local illuminance change occurs.
  • a flickering of a light source may cause event pixels to generate data.
  • a change in brightness of an object in a scene such as flickering of a light source, may also generate event data.
  • the at least one image sensor may be based on an asynchronous sensor, in which the pixels are activated only when they sense a change in the intensity of the incoming light.
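The asynchronous behavior described above can be modeled roughly as follows. The contrast threshold of 0.2 and the function interface are illustrative assumptions only; the disclosure does not specify a pixel model.

```python
import math

def maybe_emit_event(prev_log: float, intensity: float,
                     threshold: float = 0.2):
    """Emit +1/-1 when the log-intensity change exceeds a contrast
    threshold (illustrative model of an asynchronous event pixel)."""
    cur_log = math.log(intensity)
    if cur_log - prev_log > threshold:
        return +1, cur_log   # brightness increased enough -> positive event
    if prev_log - cur_log > threshold:
        return -1, cur_log   # brightness decreased enough -> negative event
    return 0, prev_log       # no event; reference level is kept

polarity, _ = maybe_emit_event(math.log(100.0), 150.0)
print(polarity)  # 1, since log(150/100) ≈ 0.41 > 0.2
```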
  • an output data rate of the at least one image sensor may be variable.
  • the device may be a multi camera system, wherein the camera system includes at least one image sensor.
  • the first set of pixels may be used for detecting flickering regions on the first image data.
  • the device may further estimate flickering information based on the first image data.
  • the flickering information may be or may comprise a flickering frequency.
  • the device may further estimate the exposure time for the second set of pixels based on the estimated flickering information. For example, the device may obtain an exposure time for an active rolling shutter image sensor comprising the second set of pixels, to capture an image without flickering. For instance, the exposure time of the second set of pixels may be a multiple of 1/flickering frequency. Furthermore, the device may obtain the second image data with the second set of pixels with the estimated exposure time.
  • the first set of pixels, e.g., the event pixels, may be in a separate event camera or in-array of a rolling shutter camera, e.g., in the same rolling shutter sensor as the active rolling shutter sensor, or in another rolling shutter sensor of the device.
  • the pixels of the at least one image sensor may capture the intensity of the light reflected from the moving object, and may further generate an event. The generated events may be due to changes in the intensity of incoming light.
  • the pixels of the at least one image sensor may capture a logarithm of the incoming light intensity.
  • the first image data may comprise a generated event.
  • the generated events may comprise +1s signaling, for the case when the incoming light intensity increased.
  • the generated events may comprise -1s signaling, for the case when the incoming light intensity decreased.
  • the generated events may further comprise information related to pixels that generated the event, and the time information of the event.
  • the generated events may comprise the pixel’s coordinates, and a time stamp when the event occurred.
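A minimal sketch of such an event record, combining polarity, pixel coordinates, and a timestamp; the field names and the microsecond unit are illustrative, since the disclosure does not prescribe a data layout.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One generated event: pixel coordinates, a timestamp, and a
    polarity of +1 (intensity increased) or -1 (intensity decreased)."""
    x: int
    y: int
    timestamp_us: int
    polarity: int  # +1 or -1

ev = Event(x=120, y=48, timestamp_us=1_500_000, polarity=+1)
print(ev)  # Event(x=120, y=48, timestamp_us=1500000, polarity=1)
```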
  • the first set of pixels may also be used for detecting motion in the first image data of the scene.
  • the first set of pixels, e.g., the event pixels, may operate in a logarithmic domain so that ambient light may be intrinsically eliminated when calculating the difference between two instances of the incoming light.
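The ambient-light cancellation in the logarithmic domain can be verified with a small numeric example: a common multiplicative ambient level cancels in the difference of logarithms. The helper function is illustrative only.

```python
import math

def log_difference(i1: float, i2: float) -> float:
    """Difference of log-intensities; equals log(i2 / i1), so a common
    multiplicative ambient factor cancels out."""
    return math.log(i2) - math.log(i1)

# The same 1.5x reflectance change under dim (x10) and bright (x1000) ambient:
dim = log_difference(10 * 1.0, 10 * 1.5)
bright = log_difference(1000 * 1.0, 1000 * 1.5)
print(abs(dim - bright) < 1e-12)  # True: the ambient factor cancels
```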
  • the first set of pixels may also be used to obtain information about the reflectance of the objects of the scene.
  • the device may comprise an event camera module with optical elements and an event sensor.
  • the at least one image sensor, e.g., the event sensor, may be connected to a main System on Chip (SoC) with an interface.
  • the device may further comprise a frame camera module having optical elements and an image sensor, e.g., having a Bayer color pattern.
  • the main SoC may include, for example, a central processing unit (CPU), a graphics processing unit (GPU), a neural processing unit (NPU) and an image signal processor or processing (ISP).
  • a memory such as a Random-access memory (RAM) or a non-volatile memory may also be connected to the main SoC.
  • the device may comprise circuitry.
  • the circuitry may comprise hardware and software.
  • the hardware may comprise analog or digital circuitry, or both analog and digital circuitry.
  • the circuitry comprises one or more processors and a non-volatile memory connected to the one or more processors.
  • the non-volatile memory may carry executable program code which, when executed by the one or more processors, causes the device to perform the operations or methods described herein.
  • the device is further configured to determine one or more flickering frequencies based on the first image data, and estimate the exposure time for the second set of pixels based on at least one flickering frequency of the one or more flickering frequencies.
  • a flickering frequency can be determined and the exposure time can be estimated based on the flickering frequency. Moreover, it is possible to adjust the exposure time such that it causes no flickering in the second image data or reduces (e.g., minimizes) the flickering in the second image data.
  • the exposure time is estimated based on an inverse of the at least one flickering frequency.
  • the exposure time may be “1/flickering frequency” or a multiple of “1/flickering frequency”.
  • estimating flickering information comprises one or more of: estimating global flickering information, estimating local flickering information, determining one or more flickering frequencies, determining a flicker area size in a field of view of the at least one image sensor, determining flicker area locations in the field of view of the at least one image sensor.
  • the local flickering information and/or the global flickering information can be estimated and can be separated from each other. Moreover, by estimating the flickering information based on the local flickering information and/or the global flickering information, it is possible to reduce flickering as desired from the image data. For example, the device may reduce local flickering from the image data. Furthermore, the device may reduce global flickering from the image data. Also, the device may reduce the local and/or the global flickering in the image data.
  • the at least one image sensor comprises an event sensor, the event sensor comprising the first set of pixels, and wherein the device is further configured to activate the first set of pixels to generate the first image data by acquiring a change in light intensity at one or more time intervals.
  • the device may activate the event sensor comprising the first set of pixels which may be an assistant camera.
  • the device may activate the event sensor to detect flickering or to estimate flickering information.
  • the flickering information may include at least one of global flickering information, local flickering information, and one or more flickering frequencies.
  • the device may activate the first set of pixels, e.g., when the camera system starts, when a framing of the device changes, or when the intensity of the incoming light changes.
  • the device may use the flickering information to estimate the exposure time for the second set of pixels.
  • the exposure time may be estimated such that when capturing image data with the estimated exposure time, no flickering is caused, or at least the flickering in the second image data is significantly reduced.
  • the exposure time may be estimated depending on at least one of main objects in the scene, the flicker area size, and locations in the field of view of active camera.
  • activating the first set of pixels comprises one or more of: disabling a flicker removal filter, selecting a setting for flicker detection, using a filter for removing one or more frequencies of light.
  • the device may comprise an assistant camera which may comprise the event sensor including the first set of pixels.
  • the assistant camera is not specifically implemented to only detect a flicker. So, the production cost of manufacturing hardware that is built “just for” flicker detection may be reduced.
  • the at least one image sensor may comprise a filter to remove certain frequencies (e.g., 100 Hz), for example, to avoid events that occur due to a flickering of a light source.
  • the filter may be turned off, for example, if the flicker detection process is activated (assuming that a flicker detection program is running on the device).
  • the at least one image sensor may comprise a filter to remove certain frequencies.
  • the device may detect a flickering, may determine a flickering frequency, and may further filter the determined flickering frequency.
  • the first image data is obtained based on a plurality of local windows located on top of the event sensor.
  • because the plurality of local windows may be located at different positions on the event sensor, it is possible to define the locations of the local windows based on a specific application of the image sensor or other information. For example, it is possible to automatically select a specific local window for receiving light at a specific time on the event sensor.
  • the at least one image sensor comprises an array of rolling shutter image sensors, the array comprising the first set of pixels and the second set of pixels.
  • the device may have an image sensor that includes an active sensor and an assistant sensor.
  • a first set of pixels and the second set of pixels may be integrated into same sensor.
  • the image sensor may have two separate sets of pixels, or the image sensor may use a time sharing of the photodiodes for a specific operation such as generating the first image data, generating the second image data, or obtaining event data.
  • a manufacturing cost of producing an additional image sensor for detecting the flickering may be reduced.
  • the first set of pixels comprises a plurality of optical black pixels located at predefined locations in the array of rolling shutter image sensors, in particular pixels located at a border of the array of rolling shutter image sensors are optical black pixels.
  • the location of such optical black pixels can be selected such that a flickering may be detected and may further be reduced.
  • the location of the optical black pixels can be on the border of the array of rolling shutter image sensors.
  • the location of the optical black pixels can be predefined in a specific area of the array of the rolling shutter image sensor. Moreover, a flickering area may be detected and flickering may be reduced.
  • the first set of pixels comprises a plurality of event pixels located at predefined locations in the array of rolling shutter image sensors, in particular every 16th pixel of the array of rolling shutter image sensors is an event pixel.
  • the events pixels can be used for detecting a flickering region in the array of rolling shutter image sensors.
  • flickering information such as a flickering frequency can be estimated based on the event pixels that are in the array of rolling shutter image sensors.
  • a flickering in the image data may be reduced.
  • the first set of pixels comprises a plurality of event pixels distributed pseudo-randomly in the array of rolling shutter image sensors.
  • an example of providing event pixels in the array of rolling shutter image sensors is to pseudo-randomly distribute the event pixels in the array of rolling shutter image sensors. Moreover, by pseudo-randomly distributing the plurality of event pixels in the array of rolling shutter image sensors, it is possible to detect flickering at the corresponding pseudo-random locations in the array.
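A pseudo-random placement of event pixels could be sketched as follows. The fraction of 1/16 and the fixed seed are illustrative assumptions; a real layout would be fixed at sensor design time, not computed at runtime.

```python
import random

def event_pixel_mask(width: int, height: int, fraction: float = 1 / 16,
                     seed: int = 42) -> list[list[bool]]:
    """Pseudo-randomly mark roughly `fraction` of the sensor positions
    as event pixels (illustrative sketch of a pseudo-random layout)."""
    rng = random.Random(seed)  # fixed seed -> reproducible layout
    return [[rng.random() < fraction for _ in range(width)]
            for _ in range(height)]

mask = event_pixel_mask(64, 64)
n_event = sum(cell for row in mask for cell in row)
print(0 < n_event < 64 * 64)  # True: some, but not all, pixels are event pixels
```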
  • the device is further configured to obtain third image data of the scene, wherein the third image data is obtained with the second set of pixels of the at least one image sensor, and estimate the flickering information based further on the third image data.
  • the third image data may be used for estimating flickering information.
  • a bright object, a display, or a screen which may cause a flickering may be identified, for example, from the third image data, and information related to the region of interest may be used for estimating the flickering.
  • the brightness information of objects may also be used for estimating flickering information.
  • the device may use this information to change a setting of the first set of pixels, the second set of pixels, an assistant camera, or for analyzing flickering from these objects. For instance, a sensitivity of the first set of pixels of the image sensor (e.g., the event sensor) may be increased to detect flickering more reliably.
  • the third image data indicates one or more of: brightness information of an object in the scene, brightness information of a display screen producing a flicker, a region of interest, ROI, in the scene.
  • the brightness information of objects may be used to detect the flickering by the event sensor.
  • the above information can be used to adjust a setting of the event sensor.
  • the sensitivity of the event sensor may be adjusted or increased to detect flickering more reliably.
  • the device is further configured to adjust a setting of the at least one image sensor based on the third image data, in order to estimate the flickering information from the first image data.
  • a second aspect of the disclosure provides a method for processing image data, the method comprising: obtaining first image data of a scene, wherein the first image data is obtained with a first set of pixels of at least one image sensor, estimating flickering information based on the first image data, estimating an exposure time for a second set of pixels of the at least one image sensor based on the estimated flickering information, and obtaining second image data of the scene, wherein the second image data is obtained with the estimated exposure time and the second set of pixels of the at least one image sensor.
  • the method further comprises determining one or more flickering frequencies based on the first image data, and estimating the exposure time for the second set of pixels based on at least one flickering frequency of the one or more flickering frequencies.
  • the method further comprises estimating the exposure time based on an inverse of the at least one flickering frequency.
  • estimating flickering information comprises one or more of: estimating global flickering information, estimating local flickering information, determining one or more flickering frequencies, determining a flicker area size in a field of view of the at least one image sensor, determining flicker area locations in the field of view of the at least one image sensor.
  • the at least one image sensor comprises an event sensor, the event sensor comprising the first set of pixels, and wherein the method further comprises activating the first set of pixels to generate the first image data by acquiring a change in light intensity at one or more time intervals.
  • the activating the first set of pixels comprises one or more of: disabling a flicker removal filter, selecting a setting for flicker detection, using a filter for removing one or more frequencies of light.
  • the method further comprises obtaining the first image data based on a plurality of local windows located on top of the event sensor.
  • the at least one image sensor comprises an array of rolling shutter image sensors, the array comprising the first set of pixels and the second set of pixels.
  • the first set of pixels comprises a plurality of optical black pixels located at predefined locations in the array of rolling shutter image sensors, in particular pixels located at a border of the array of rolling shutter image sensors are optical black pixels.
  • the first set of pixels comprises a plurality of event pixels located at predefined locations in the array of rolling shutter image sensors, in particular every 16th pixel of the array of rolling shutter image sensors is an event pixel.
  • the first set of pixels comprises a plurality of event pixels distributed pseudo-randomly in the array of rolling shutter image sensors.
  • the method further comprises obtaining third image data of the scene, wherein the third image data is obtained with the second set of pixels of the at least one image sensor, and estimating the flickering information based further on the third image data.
  • the third image data indicates one or more of: brightness information of an object in the scene, brightness information of a display screen producing a flicker, a region of interest, ROI, in the scene.
  • the method further comprises adjusting a setting of the at least one image sensor based on the third image data, in order to estimate the flickering information from the first image data.
  • a third aspect of the present disclosure provides a computer program comprising a program code for performing the method according to the second aspect or any of its implementation forms.
  • a fourth aspect of the present disclosure provides a non-transitory storage medium storing executable program code which, when executed by a processor, causes the method according to the second aspect or any of its implementation forms to be performed.
  • the devices, elements, units and means described in the present application could be implemented in software or hardware elements or any kind of combination thereof.
  • the steps which are performed by the various entities described in the present application, as well as the functionalities described to be performed by the various entities, are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof.
  • FIG. 1 depicts a schematic view of a device for processing image data, according to an example of the disclosure
  • FIG. 2 depicts a diagram illustrating an example of a test setup for detecting a flickering of a display of a mobile device
  • FIG. 3 depicts a diagram illustrating an example of a binarized image obtained after accumulating events
  • FIG. 4 depicts a diagram illustrating an example of a flickering spectrum calculated from a time varying signal
  • FIG. 5 depicts a schematic view of a diagram illustrating an example of an arrangement of local windows on top of an event sensor
  • FIG. 6 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of optical black pixels
  • FIG. 7 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of event pixels
  • FIG. 8 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of event pixels distributed pseudo-randomly in the image sensor.
  • FIG. 9 depicts a flowchart of a method for processing image data, according to an example of the disclosure.
  • FIG. 1 shows a schematic view of a device 100 for processing image data, according to an embodiment of the disclosure.
  • the device 100 may be, or may be included in, an electronic device such as a digital camera.
  • the device 100 is configured to obtain first image data 101 of a scene, wherein the first image data 101 is obtained with a first set of pixels 111 of at least one image sensor 110.
  • the first set of pixels 111 may comprise event pixels that may be used for detecting flickering regions on the first image data 101.
  • the device 100 is further configured to estimate flickering information 102 based on the first image data 101. For example, the device 100 may calculate a flickering frequency.
  • the device 100 is further configured to estimate an exposure time 103 for a second set of pixels 112 of the at least one image sensor 110 based on the estimated flickering information.
  • the device 100 is further configured to obtain second image data 104 of the scene, wherein the second image data 104 is obtained with the estimated exposure time 103 and the second set of pixels 112 of the at least one image sensor 110.
  • the second set of pixels 112 may comprise active rolling shutter pixels.
  • the exposure time may be determined such that, when obtaining the second image data 104 with that exposure time, flickering may be removed.
  • the first set of pixels may be in a separate event camera, or they may be in-array of a rolling shutter camera (the same rolling shutter camera as the active rolling shutter camera, or another rolling shutter camera).
  • the device 100 may comprise processing circuitry (not shown) configured to perform, conduct or initiate the various operations of the device 100 described herein.
  • the processing circuitry may comprise hardware and software.
  • the hardware may comprise analog circuitry or digital circuitry, or both analog and digital circuitry.
  • the digital circuitry may comprise components such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), or multi-purpose processors.
  • the processing circuitry comprises one or more processors and a non-transitory memory connected to the one or more processors.
  • the non-transitory memory may carry executable program code which, when executed by the one or more processors, causes the device 100 to perform, conduct or initiate the operations or methods described herein.
  • FIG. 2 depicts a diagram illustrating an example of a test setup 200 for detecting the display flicker of a mobile device 201.
  • the device 100 is configured to detect flickering of objects in the scene.
  • the display of the mobile device 201 is one example of an object that causes a flickering in the scene.
  • the device 100 may detect the flickering of the display of the mobile device 201.
  • the device 100 may collect several of such grayscale images I(i), e.g., for several adjacent time intervals having a length of T. Furthermore, the device 100 may process the image data such that for each of the collected images I(i) the data may be binarized. For example, the device 100 may binarize the image data such that negative values, due to accumulation of negative events, are set to zero, and positive values, due to accumulation of positive events, are set to 255. Furthermore, for the pixels which did not generate events, or for which the sum of their obtained data is zero, the device 100 may set the values to 127.
  • the device 100 may use other known binarization methods. For instance, if only negative or only positive sums are retained, those pixels can be set to 255 and the rest of the pixels can be set to zero. An example of such a binarized image is shown in FIG. 3.
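The binarization rule described above (negative sums to 0, positive sums to 255, pixels without events to 127) can be sketched as follows; the function name is illustrative.

```python
import numpy as np

def binarize_accumulated_events(acc: np.ndarray) -> np.ndarray:
    """Binarize an accumulated event image: negative sums -> 0,
    positive sums -> 255, pixels without events (sum zero) -> 127."""
    out = np.full(acc.shape, 127, dtype=np.uint8)
    out[acc < 0] = 0
    out[acc > 0] = 255
    return out

acc = np.array([[-3, 0, 2],
                [ 1, -1, 0]])
print(binarize_accumulated_events(acc))
# [[  0 127 255]
#  [255   0 127]]
```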
  • the flickering area 302, which appears as a black band, is detected.
  • the flickering area 302 is caused by flickering of the display of the mobile device 201.
  • the white band 301 is also caused by flickering of the display of the mobile device 201.
  • This image is obtained by capturing event data from the mobile device 201 shown in FIG. 2, whose display was showing a white image.
  • the display of the mobile device 201 is depicted in FIG. 2.
  • the device 100 may select a small rectangular local area having a size of NxN centered at position (xc, yc), from each of the collected images I(i). Moreover, the device 100 may calculate the local averages of the pixels inside these small regions, e.g., for all the binary images.
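The windowed averaging step might be sketched as follows, assuming an odd window size N and a window that lies fully inside the image; boundary handling is not specified in the text.

```python
import numpy as np

def local_average(img: np.ndarray, xc: int, yc: int, n: int = 5) -> float:
    """Mean of the N x N region centered at (xc, yc); N is assumed odd
    and the window is assumed to fit inside the image."""
    h = n // 2
    window = img[yc - h:yc + h + 1, xc - h:xc + h + 1]
    return float(window.mean())

img = np.zeros((32, 32))
img[14:19, 14:19] = 255  # a bright 5x5 patch centered at (16, 16)
print(local_average(img, xc=16, yc=16, n=5))  # 255.0
```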
  • FIG. 4 depicts a diagram 400 illustrating an example of a spectrum 401 calculated from a time varying signal.
  • the device 100 may use a signal that is obtained from time variation of local averages in the first image data 101.
  • the time variation of the local averages may be periodic.
  • the device 100 may further estimate a frequency spectrum 401 based on the time variation of the local averages. Moreover, the device 100 may estimate flickering information 102 based on the frequency spectrum 401. For example, the device 100 may estimate at least one flickering frequency. As can be derived from the flickering spectrum of FIG. 4, a fundamental frequency is located around 60 Hz, which is the refresh rate of the display of the mobile device 201 used in the test.
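One plausible way to estimate the fundamental flickering frequency from the time variation of the local averages is a discrete Fourier transform. The 1 ms sampling interval follows the example in the text, while the signal below is synthetic and the function is an illustrative sketch, not the disclosed method.

```python
import numpy as np

def dominant_frequency(signal: np.ndarray, interval_s: float) -> float:
    """Largest non-DC peak of the magnitude spectrum; the mean is
    removed so the DC component does not mask the flicker peak."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=interval_s)
    return float(freqs[np.argmax(spectrum)])

# Synthetic local-average signal sampled every 1 ms from a 60 Hz flicker:
t = np.arange(1000) * 1e-3
signal = np.sin(2 * np.pi * 60.0 * t)
print(dominant_frequency(signal, 1e-3))  # 60.0
```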
  • the present disclosure is not limited to this specific method for detecting the flickering.
  • the time interval T can be different than the 1 ms example used in the description of FIG. 3 and FIG. 4.
  • FIG. 5 depicts a schematic view of a diagram illustrating an example of an arrangement of the local windows 502 over the event sensor 501.
  • the device 100 may obtain the first image data 101 from the event sensor 501 in which several local windows 502 are located at different positions over the event sensor 501.
  • the location of the local windows may be predefined or may be selected automatically based on the application and/or other information.
  • the device 100 may perform the binarization of the accumulated grayscale images by using a different method.
  • the device 100 may calculate the flickering information 102, as described above, also for non-white areas of the scene in which flickering is detected. Furthermore, the size N of the local areas used to compute the averages may also be different than the 5 pixels used in the above example. In some embodiments, the device 100 may estimate flickering information 102, e.g., the flickering frequency, by using a hybrid sensor in which the event pixels are embedded into a standard color camera.
  • the device 100 may detect a periodic time variation of the signal activity in at least one area of the first image data 101 of the scene. Moreover, the device 100 may estimate the flickering information, e.g., the flickering frequency which is related to the flickering in that area of the image.
  • the device 100 may determine an exposure time for the second set of pixels 112 (e.g., active rolling shutter camera) such that the determined exposure time may avoid or reduce the amount of flickering. For instance, the device 100 may determine the exposure time to be an integer multiple of 1/flicker frequency. Furthermore, in a case where the exposure time cannot be set exactly to an integer multiple of 1/flicker frequency, the device 100 may select an exposure time that is close to the integer multiple.
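The exposure-time selection described above might be sketched as follows; clamping to at least one full flicker period is an illustrative assumption for the case where the desired exposure is shorter than 1/f.

```python
def estimate_exposure(flicker_freq_hz: float, target_exposure_s: float) -> float:
    """Closest integer multiple of the flicker period 1/f to the desired
    exposure time; at least one full period is used (sketch)."""
    period = 1.0 / flicker_freq_hz
    n = max(1, round(target_exposure_s / period))
    return n * period

# Under 100 Hz flicker (10 ms period), a desired 24 ms exposure snaps to 20 ms:
print(estimate_exposure(100.0, 0.024))  # 0.02
```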
  • FIG. 6 is a schematic view of a diagram illustrating an image sensor 110 comprising a plurality of optical black pixels 602.
  • the image sensor 110 may comprise an array of rolling shutter image sensors 601. Moreover, the array of rolling shutter image sensors 601 may comprise the first set of pixels 111 and the second set of pixels 112.
  • the first set of pixels 111 may comprise the plurality of optical black pixels 602.
  • the plurality of optical black pixels 602 may be located at predefined locations in the array of rolling shutter image sensors 601.
  • the plurality of optical black pixels 602 may be pixels that are located at a border of the array of rolling shutter image sensors 601.
  • a 12-Mpix image sensor 110 may comprise tens of thousands of OB pixels.
  • the OB pixels may be in a grid pattern.
  • the content of the grid may be pseudo-randomly different.
  • FIG. 7 is a schematic view of the image sensor 110 comprising a plurality of event pixels 702.
  • the image sensor 110 comprises the array of rolling shutter image sensors 601 that includes the first set of pixels 111.
  • the first set of pixels 111 includes the plurality of event pixels 702.
  • the plurality of event pixels 702 are located at predefined locations in the array of rolling shutter image sensors 601. For example, every 16th pixel of the array of rolling shutter image sensors 601 is an event pixel 702.
  • the image sensor 110 may include both visible pixels and event pixels.
  • Event pixels may be arranged in the image sensor in a regular pattern, e.g., every 16th pixel of the image sensor is an event pixel, or in a pseudo-random order.
  • the photodiode of a pixel may be shared by the event pixel circuitry and by the frame camera circuitry. Such sharing may happen in the time domain, i.e., at a certain moment the pixel operates as an event pixel and at some other moment as a frame camera pixel.
  • FIG. 8 is a schematic view of a diagram illustrating an image sensor 110 comprising a plurality of event pixels 802 distributed pseudo-randomly in the image sensor 110.
  • the image sensor 110 comprises the array of rolling shutter image sensors 601 that includes the first set of pixels 111. Moreover, the first set of pixels 111 comprises a plurality of event pixels 802 that are pseudo-randomly distributed in the array of rolling shutter image sensors 601. For example, the device 100 may process the image data 101 such that the image data 101 may be divided into 32x32 pixel blocks. Each block may have one OB pixel.
  • each block of pixels may comprise its own OB pixel location type.
  • the type may be a value between 1 and 7.
  • the value may specify an x-y offset for an OB pixel address inside the block.
  • An example of x-y offset coding is presented in table 1.
  • Table 1 x-y offset coding by type of pixel
  • There may be different design choices for how the OB pixel location type changes for each block.
  • An example may be using a simple up counter for the image sensor 110 depicted in FIG. 8.
  • Another example may be using a simple down counter for the image sensor 110 depicted in FIG. 8.
  • Table 2 presents a per-frame resetting type pattern
  • table 3 presents a per-line resetting type pattern, in particular every 2nd line in reverse order is a reset pattern.
  • Table 2 a per-frame resetting type pattern
  • Table 3 a per-line resetting type pattern. It can be derived that a per-line resetting pattern has a less random behavior than a per-frame resetting pattern.
  • the image sensor 110 discussed with respect to FIG. 8 provides a Video Graphics Array (VGA) image in which the OB pixels are provided with a per-frame resetting type pattern and using a type coding according to table 4.
  • FIG. 9 shows a method 900 according to an embodiment of the disclosure for processing image data.
  • the method 900 may be carried out by the device 100, as described above.
  • the method 900 comprises a step S901 of obtaining first image data 101 of a scene, wherein the first image data 101 is obtained with a first set of pixels 111 of at least one image sensor 110.
  • the method 900 further comprises a step S902 of estimating flickering information 102 based on the first image data 101.
  • the method 900 further comprises a step S903 of estimating an exposure time 103 for a second set of pixels 112 of the at least one image sensor 110 based on the estimated flickering information 102.
  • the method 900 further comprises a step S904 of obtaining second image data 104 of the scene, wherein the second image data 104 is obtained with the estimated exposure time 103 and the second set of pixels 112 of the at least one image sensor 110.
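The block-wise OB pixel addressing described in the bullets above (32x32 pixel blocks, a location type 1 to 7 per block, and an up-counter type pattern) can be sketched as follows. This is a minimal sketch in Python; the offset values and function names are our own illustration, since the contents of Tables 1 to 4 are not reproduced here:

```python
# Hypothetical per-type (x, y) offsets inside a 32x32 block; the actual
# values of Table 1 are not reproduced here, so these are placeholders.
OFFSETS = {1: (0, 0), 2: (8, 4), 3: (16, 8), 4: (24, 12),
           5: (4, 16), 6: (12, 20), 7: (20, 28)}

def ob_pixel_address(block_x, block_y, ob_type, block_size=32):
    """Absolute sensor coordinates of the OB pixel of one block, given
    the block indices and the block's OB pixel location type (1..7)."""
    dx, dy = OFFSETS[ob_type]
    return block_x * block_size + dx, block_y * block_size + dy

def type_for_block(block_index, n_types=7):
    """Simple up counter for the type pattern: the type advances 1..7
    from block to block and wraps around."""
    return (block_index % n_types) + 1
```

A down counter, a per-frame reset, or a per-line reset of the counter would yield the alternative patterns mentioned above.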

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure relates to a device for processing image data. The device may obtain first image data of a scene, wherein the first image data is obtained with a first set of pixels of at least one image sensor. Moreover, the device may estimate flickering information based on the first image data, and may further estimate an exposure time for a second set of pixels of the at least one image sensor based on the estimated flickering information. Moreover, the device may obtain second image data of the scene, wherein the second image data is obtained with the estimated exposure time and the second set of pixels of the at least one image sensor.

Description

DEVICE AND METHOD FOR PROCESSING IMAGE DATA
TECHNICAL FIELD
The present disclosure relates generally to the field of processing image data, and, more particularly, to a device and a method for processing image data.
To this end, a device and a method are provided, which may obtain image data from an image sensor, and may detect flickering in the image data of the image sensor.
BACKGROUND
Typically, a rolling shutter is used in a mobile camera for capturing pictures or image frames. However, if a scene includes a light source or a target that flickers, this may cause artifacts in image frames captured with the use of the rolling shutter camera. These artifacts may depend on the exposure time used by the rolling shutter camera and the frequency of the flickering.
For example, the flickering may be due to a varying brightness, such as caused by an alternating current (AC) or pulse-width modulation (PWM) powered light source. The flickering may become visible, for example, if an exposure time of a captured image is shorter than 1/f, where f is the flickering frequency. Furthermore, the flickering may also be visible if the exposure time is not an integer multiple of 1/f, for example, when the exposure time is longer than 1/f.
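The visibility condition described above can be expressed as a small check; this is a sketch, and the function name and tolerance are our own, not part of the disclosure:

```python
def flicker_visible(exposure_s, flicker_hz, tol=1e-6):
    """Flicker may show up when the exposure time is shorter than one
    flicker period (1/f), or is not an integer multiple of that period."""
    period = 1.0 / flicker_hz
    if exposure_s < period:
        return True
    ratio = exposure_s / period
    return abs(ratio - round(ratio)) > tol
```

For a 100 Hz flicker (10 ms period), a 20 ms exposure is an integer multiple of the period and hides the flicker, whereas a 15 ms or a 5 ms exposure does not.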
A conventional device uses an active rolling shutter camera for detecting flickering; however, an issue of this conventional device is that it generates visible artifacts in order to make the flickering visible for analysis. Therefore, the flickering becomes visible to the user or in recorded videos.
Another conventional device uses a separate image sensor for detecting flickering. For example, in addition to the main active camera of the device, an assistant rolling shutter camera may be used to detect 50 Hz to 60 Hz flickering. However, an issue of using such an additional image sensor is that the cost of production of the conventional device is high.

Another conventional device is based on integrating a flicker detection component into another sensor, for example, by adding some extra functionality into the component. However, an issue of such a conventional device is that the complexity of the other sensor is high, for example, since some sensors may not support a functionality for detecting flickering.
SUMMARY
In view of the above-mentioned problems and disadvantages, embodiments of the present disclosure aim to improve the conventional devices and methods for processing image data in view of flickering.
An objective is to provide a device and method that may use a set of pixels (for example, event pixels) for detecting flickering. Thereby, the device and method should be able to reliably reduce flickering in image data.
These and other objectives are achieved by the embodiments of the disclosure as described in the enclosed independent claims. Advantageous implementations of the embodiments of the disclosure are further defined in the dependent claims.
A first aspect of the present disclosure provides a device for processing image data, the device being configured to obtain first image data of a scene, wherein the first image data is obtained with a first set of pixels of at least one image sensor, estimate flickering information based on the first image data, estimate an exposure time for a second set of pixels of the at least one image sensor based on the estimated flickering information, and obtain second image data of the scene, wherein the second image data is obtained with the estimated exposure time and the second set of pixels of the at least one image sensor.
The device may be, or may be incorporated in, a digital camera, a digital video recorder, a smart phone, an augmented reality device, a virtual reality device, or the like.
The at least one image sensor may be or may comprise an event sensor, an event camera, or the like. The at least one image sensor, e.g., the event sensor, may respond to changes in the incoming light intensity. For example, the at least one image sensor, when being an event sensor, may generate image data, which may include event data. The event data may be generated when changes in the intensity of the incoming light are detected, for example, when a global or a local illuminance change occurs.
Furthermore, a flickering of a light source may cause event pixels to generate data. For example, a change in brightness of an object in a scene, such as flickering of a light source, may also generate event data.
Moreover, the at least one image sensor may be based on an asynchronous sensor, in which the pixels are activated only when they sense a change in the intensity of the incoming light. Besides, an output data rate of the at least one image sensor may be variable.
In some embodiments, the device may be a multi camera system, wherein the camera system includes at least one image sensor. For example, there may be two image sensors, an active camera with a rolling shutter image sensor that may capture images and an assistant camera which may be an event sensor for detecting flickering.
The first set of pixels may be used for detecting flickering regions on the first image data. Moreover, the device may further estimate flickering information based on the first image data. The flickering information may be or may comprise a flickering frequency.
The device may further estimate the exposure time for the second set of pixels based on the estimated flickering information. For example, the device may obtain an exposure time for an active rolling shutter image sensor comprising the second set of pixels, to capture an image without flickering. For instance, the exposure time of the second set of pixels may be a multiple of 1/flickering frequency. Furthermore, the device may obtain the second image data with the second set of pixels with the estimated exposure time.
In some embodiments, the first set of pixels, e.g., the event pixels, may be in a separate event camera or in-array of a rolling shutter camera, e.g., in the same rolling shutter sensor as the active rolling shutter sensor, or in another rolling shutter sensor of the device. Moreover, in case of moving objects in the scene, for instance, the pixels of the at least one image sensor may capture the intensity of the light reflected from the moving object, and may further generate an event. The generated events may be due to changes in the intensity of incoming light.
Furthermore, the pixels of the at least one image sensor may capture a logarithm of the incoming light intensity. For example, the first image data may comprise a generated event. The generated events may comprise +1 signaling, for the case when the incoming light intensity increased. Alternatively, the generated event may comprise -1 signaling, for the case when the incoming light intensity decreased. The generated events may further comprise information related to the pixels that generated the event, and the time information of the event. For example, the generated events may comprise the pixel’s coordinates, and a time stamp when the event occurred.
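The event content described above (pixel coordinates, time stamp, and a polarity of +1 or -1) can be represented, for example, as follows; the class layout is our own illustration, not a particular sensor's output format:

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column that generated the event
    y: int          # pixel row that generated the event
    t_us: int       # time stamp of the event, in microseconds
    polarity: int   # +1: intensity increased, -1: intensity decreased

# One event: pixel (120, 64) saw an intensity increase at t = 1500 us
ev = Event(x=120, y=64, t_us=1500, polarity=+1)
```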
Moreover, since events can be generated with a time rate of 1 µs, the first set of pixels may also be used for detecting motion in the first image data of the scene. Moreover, the first set of pixels (e.g., the event pixels) may operate in a logarithmic domain so that ambient light may be intrinsically eliminated when calculating the difference between two instances of the incoming light. Furthermore, the first set of pixels may also be used to obtain information about the reflectance of the objects of the scene.
The device may comprise an event camera module with optical elements and an event sensor. For example, the at least one image sensor, e.g., the event sensor may be connected to main System on Chip (SoC) with an interface. Furthermore, a frame camera module having optical elements and an image sensor (e.g., having Bayer color pattern) may be connected to the main SoC with another interface. The main SoC may include, for example, a central processing unit (CPU), a graphics processing unit (GPU), a neural processing unit (NPU) and an image signal processor or processing (ISP). Furthermore, a memory such as a Random-access memory (RAM) or a non-volatile memory may also be connected to the main SoC.
The device may comprise a circuitry. The circuitry may comprise hardware and software. The hardware may comprise analog or digital circuitry, or both analog and digital circuitry. In some embodiments, the circuitry comprises one or more processors and a non-volatile memory connected to the one or more processors. The non-volatile memory may carry executable program code which, when executed by the one or more processors, causes the device to perform the operations or methods described herein.

In an implementation form of the first aspect, the device is further configured to determine one or more flickering frequencies based on the first image data, and estimate the exposure time for the second set of pixels based on at least one flickering frequency of the one or more flickering frequencies.
A flickering frequency can be determined and the exposure time can be estimated based on the flickering frequency. Moreover, it is possible to adjust the exposure time such that it causes no flickering in the second image data or reduces (e.g., minimizes) the flickering in the second image data.
In a further implementation form of the first aspect, the exposure time is estimated based on an inverse of the at least one flickering frequency.
For example, the exposure time may be “1/flickering frequency” or a multiple of “1/flickering frequency”. Moreover, by adjusting the exposure time based on the inverse of the flickering frequency, it is possible to avoid flickering in the image data or to reduce flickering in the image data.
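As a sketch (the function name is ours), the exposure time can be snapped to the nearest positive integer multiple of 1/flickering frequency:

```python
def flicker_free_exposure(flicker_hz, requested_s):
    """Round a requested exposure time (seconds) to the nearest positive
    integer multiple of the flicker period, 1/flicker_hz."""
    period = 1.0 / flicker_hz
    n = max(1, round(requested_s / period))
    return n * period
```

For example, with a 100 Hz flicker, a requested 33 ms exposure becomes 30 ms (three flicker periods), and a requested 4 ms exposure is stretched to one full 10 ms period.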
In a further implementation form of the first aspect, estimating flickering information comprises one or more of: estimating global flickering information, estimating local flickering information, determining one or more flickering frequencies, determining a flicker area size in a field of view of the at least one image sensor, determining flicker area locations in the field of view of the at least one image sensor.
The local flickering information and/or the global flickering information can be estimated and can be separated from each other. Moreover, by estimating the flickering information based on the local flickering information and/or the global flickering information, it is possible to reduce flickering as desired from the image data. For example, the device may reduce local flickering from the image data. Furthermore, the device may reduce global flickering from the image data. Also, the device may reduce the local and/or the global flickering in the image data.

In a further implementation form of the first aspect, the at least one image sensor comprises an event sensor, the event sensor comprising the first set of pixels, and wherein the device is further configured to activate the first set of pixels to generate the first image data by acquiring a change in light intensity at one or more time intervals.
For instance, the device may activate the event sensor comprising the first set of pixels which may be an assistant camera. The device may activate the event sensor to detect flickering or to estimate flickering information. The flickering information may include at least one of global flickering information, local flickering information, and one or more flickering frequencies. Moreover, the device may activate the first set of pixels, e.g., when the camera system starts, or when a framing of the device changes, or when the illumination of the incoming light intensity changes.
Furthermore, the device may use the flickering information to estimate the exposure time for the second set of pixels. The exposure time may be estimated such that when capturing image data with the estimated exposure time, no flickering is caused, or at least the flickering in the second image data is significantly reduced. For example, the exposure time may be estimated depending on at least one of the main objects in the scene, the flicker area size, and the flicker area locations in the field of view of the active camera.
In a further implementation form of the first aspect, activating the first set of pixels comprises one or more of: disabling a flicker removal filter, selecting a setting for flicker detection, using a filter for removing one or more frequencies of light.
In some embodiments, the device may comprise an assistant camera which may comprise the event sensor including the first set of pixels. In some embodiments, it is possible to use an assistant camera which is not specifically implemented only to detect a flicker. Thus, a production cost of manufacturing hardware that is built “just for” the flicker detection may be reduced.
In some embodiments, the at least one image sensor may comprise a filter to remove certain frequencies (e.g., 100 Hz), for example, to avoid events that occur due to a flickering of a light source. Moreover, the filter may be turned off, for example, if the flicker detection process is activated (assuming that a flicker detection program is running on the device).
In some embodiments, the at least one image sensor may comprise a filter to remove certain frequencies. For example, the device may detect a flickering, may determine a flickering frequency, and may further filter the determined flickering frequency.
In a further implementation form of the first aspect, the first image data is obtained based on a plurality of local windows located on top of the event sensor.
By using the plurality of local windows that may be, for example, located at different positions on the event sensor, it is possible to define the locations of the local windows based on a specific application of the image sensor or other information. For example, it is possible to automatically select a specific local window for receiving light at a specific time on the event sensor.
In a further implementation form of the first aspect, the at least one image sensor comprises an array of rolling shutter image sensors, the array comprising the first set of pixels and the second set of pixels.
For example, the device may have an image sensor that includes an active sensor and an assistant sensor. For instance, the first set of pixels and the second set of pixels may be integrated into the same sensor. For example, the image sensor may have two separate sets of pixels, or the image sensor may use a time sharing of the photodiodes for a specific operation such as generating the first image data, generating the second image data, or obtaining event data. Moreover, by providing the first set of pixels and the second set of pixels in the same array of rolling shutter image sensors, a manufacturing cost of producing an additional image sensor for detecting the flickering may be reduced.
In a further implementation form of the first aspect, the first set of pixels comprises a plurality of optical black pixels located at predefined locations in the array of rolling shutter image sensors, in particular pixels located at a border of the array of rolling shutter image sensors are optical black pixels. For example, it is possible to provide optical black pixels in the array of the rolling shutter image sensors. Moreover, the location of such optical black pixels can be selected such that a flickering may be detected and may further be reduced. For example, the location of the optical black pixels can be on the border of the array of rolling shutter image sensors. Furthermore, the location of the optical black pixels can be predefined in a specific area of the array of the rolling shutter image sensor. Moreover, a flickering area may be detected and flickering may be reduced.
In a further implementation form of the first aspect, the first set of pixels comprises a plurality of event pixels located at predefined locations in the array of rolling shutter image sensors, in particular every 16th pixel of the array of rolling shutter image sensors is an event pixel.
By providing the event pixels in the array of rolling shutter image sensors, the event pixels can be used for detecting a flickering region in the array of rolling shutter image sensors. Moreover, flickering information such as a flickering frequency can be estimated based on the event pixels that are in the array of rolling shutter image sensors. Moreover, a flickering in the image data may be reduced.
In a further implementation form of the first aspect, the first set of pixels comprises a plurality of event pixels distributed pseudo-randomly in the array of rolling shutter image sensors.
As discussed, it is possible to provide the plurality of event pixels in the array of rolling shutter image sensors. In this case, a cost of providing a separate event sensor can be reduced or omitted. An example of providing event pixels in the array of rolling shutter image sensors is to pseudo-randomly distribute the event pixels in the array of rolling shutter image sensors. Moreover, by pseudo-randomly distributing the plurality of event pixels in the array of rolling shutter image sensors, it is possible to detect flickering at corresponding pseudo-random locations in the array.
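One way to sketch such a pseudo-random placement is one event pixel per tile of the array, at a seeded pseudo-random position inside each tile so the layout is reproducible; the scheme and names here are our own illustration, not the actual sensor design:

```python
import random

def event_pixel_map(width, height, block=32, seed=0):
    """Place one event pixel per block x block tile of the array, at a
    seeded pseudo-random position inside each tile. Returns a set of
    (x, y) sensor coordinates."""
    rng = random.Random(seed)
    pixels = set()
    for by in range(height // block):
        for bx in range(width // block):
            dx, dy = rng.randrange(block), rng.randrange(block)
            pixels.add((bx * block + dx, by * block + dy))
    return pixels
```

For a VGA-sized array (640x480) this yields 20 x 15 = 300 event pixel locations, one per 32x32 block.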
In a further implementation form of the first aspect, the device is further configured to obtain third image data of the scene, wherein the third image data is obtained with the second set of pixels of the at least one image sensor, and estimate the flickering information based further on the third image data. For example, the third image data may be used for estimating flickering information. For example, a bright object in the third image data, a display or a screen which may cause a flickering may be identified, for example, from the third image data, and information related to the region of interest may be used for estimating the flickering.
Moreover, the brightness information of objects may also be used for estimating flickering information. The device may use this information to change a setting of the first set of pixels, the second set of pixels, an assistant camera, or for analyzing flickering from these objects. For instance, a sensitivity of the first set of pixels of the image sensor (e.g., the event sensor) may be increased to detect flickering more reliably.
In a further implementation form of the first aspect, the third image data indicates one or more of: a brightness information of an object in the scene, a brightness information of a display screen producing a flicker, a region of interest, ROI, in the scene.
For example, the brightness information of objects may be used to detect the flickering by the event sensor. Moreover, the above information can be used to adjust a setting of the event sensor. For example, the sensitivity of the event sensor may be adjusted or increased to detect flickering more reliably.
In a further implementation form of the first aspect, the device is further configured to adjust a setting of the at least one image sensor based on the third image data, in order to estimate the flickering information from the first image data.
A second aspect of the disclosure provides a method for processing image data, the method comprising: obtaining first image data of a scene, wherein the first image data is obtained with a first set of pixels of at least one image sensor, estimating flickering information based on the first image data, estimating an exposure time for a second set of pixels of the at least one image sensor based on the estimated flickering information, and obtaining second image data of the scene, wherein the second image data is obtained with the estimated exposure time and the second set of pixels of the at least one image sensor.

In an implementation form of the second aspect, the method further comprises determining one or more flickering frequencies based on the first image data, and estimating the exposure time for the second set of pixels based on at least one flickering frequency of the one or more flickering frequencies.
In a further implementation form of the second aspect, the method further comprises estimating the exposure time based on an inverse of the at least one flickering frequency.
In a further implementation form of the second aspect, estimating flickering information comprises one or more of: estimating global flickering information, estimating local flickering information, determining one or more flickering frequencies, determining a flicker area size in a field of view of the at least one image sensor, determining flicker area locations in the field of view of the at least one image sensor.
In a further implementation form of the second aspect, the at least one image sensor comprises an event sensor, the event sensor comprising the first set of pixels, and wherein the method further comprises activating the first set of pixels to generate the first image data by acquiring a change in light intensity at one or more time intervals.
In a further implementation form of the second aspect, the activating the first set of pixels comprises one or more of: disabling a flicker removal filter, selecting a setting for flicker detection, using a filter for removing one or more frequencies of light.
In a further implementation form of the second aspect, the method further comprises obtaining the first image data based on a plurality of local windows located on top of the event sensor.
In a further implementation form of the second aspect, the at least one image sensor comprises an array of rolling shutter image sensors, the array comprising the first set of pixels and the second set of pixels.

In a further implementation form of the second aspect, the first set of pixels comprises a plurality of optical black pixels located at predefined locations in the array of rolling shutter image sensors, in particular pixels located at a border of the array of rolling shutter image sensors are optical black pixels.
In a further implementation form of the second aspect, the first set of pixels comprises a plurality of event pixels located at predefined locations in the array of rolling shutter image sensors, in particular every 16th pixel of the array of rolling shutter image sensors is an event pixel.
In a further implementation form of the second aspect, the first set of pixels comprises a plurality of event pixels distributed pseudo-randomly in the array of rolling shutter image sensors.
In a further implementation form of the second aspect, the method further comprises obtaining third image data of the scene, wherein the third image data is obtained with the second set of pixels of the at least one image sensor, and estimating the flickering information based further on the third image data.
In a further implementation form of the second aspect, the third image data indicates one or more of: a brightness information of an object in the scene, a brightness information of a display screen producing a flicker, a region of interest, ROI, in the scene.
In a further implementation form of the second aspect, the method further comprises adjusting a setting of the at least one image sensor based on the third image data, in order to estimate the flickering information from the first image data.
The method of the second aspect achieves the advantages and effects described for the device of the first aspect.

A third aspect of the present disclosure provides a computer program comprising a program code for performing the method according to the second aspect or any of its implementation forms.
A fourth aspect of the present disclosure provides a non-transitory storage medium storing executable program code which, when executed by a processor, causes the method according to the second aspect or any of its implementation forms to be performed.
It has to be noted that the devices, elements, units and means described in the present application could be implemented in software or hardware elements or any kind of combination thereof. The steps which are performed by the various entities described in the present application, as well as the functionalities described to be performed by the various entities, are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof.
BRIEF DESCRIPTION OF DRAWINGS
The above described aspects and implementation forms will be explained in the following description of specific embodiments in relation to the enclosed drawings, in which
FIG. 1 depicts a schematic view of a device for processing image data, according to an example of the disclosure;
FIG. 2 depicts a diagram illustrating an example of a test setup for detecting a flickering of a display of a mobile device;
FIG. 3 depicts a diagram illustrating an example of a binarized image obtained after accumulating events;
FIG. 4 depicts a diagram illustrating an example of a flickering spectrum calculated from a time varying signal;
FIG. 5 depicts a schematic view of a diagram illustrating an example of an arrangement of local windows on top of an event sensor;
FIG. 6 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of optical black pixels;
FIG. 7 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of event pixels;
FIG. 8 depicts a schematic view of a diagram illustrating an image sensor comprising a plurality of event pixels distributed pseudo-randomly in the image sensor; and
FIG. 9 depicts a flowchart of a method for processing image data, according to an example of the disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
FIG. 1 shows a schematic view of a device 100 for processing image data, according to an embodiment of the disclosure. The device 100 may be, or may be included in, an electronic device such as a digital camera.
The device 100 is configured to obtain first image data 101 of a scene, wherein the first image data 101 is obtained with a first set of pixels 111 of at least one image sensor 110. For example, the first set of pixels 111 may comprise event pixels that may be used for detecting flickering regions on the first image data 101.
The device 100 is further configured to estimate flickering information 102 based on the first image data 101. For example, the device 100 may calculate a flickering frequency.
The device 100 is further configured to estimate an exposure time 103 for a second set of pixels 112 of the at least one image sensor 110 based on the estimated flickering information.
The device 100 is further configured to obtain second image data 104 of the scene, wherein the second image data 104 is obtained with the estimated exposure time 103 and the second set of pixels 112 of the at least one image sensor 110. For example, the second set of pixels 112 may comprise active rolling shutter pixels. The exposure time may be determined such that flickering is removed or reduced when the second image data 104 is obtained with that exposure time.
Furthermore, the first set of pixels (e.g., the event pixels) may be provided in a separate event camera, or may be embedded in the pixel array of a rolling shutter camera (either the same rolling shutter camera that provides the active rolling shutter pixels, or another rolling shutter camera).
The device 100 may comprise processing circuitry (not shown) configured to perform, conduct or initiate the various operations of the device 100 described herein. The processing circuitry may comprise hardware and software. The hardware may comprise analog circuitry or digital circuitry, or both analog and digital circuitry. The digital circuitry may comprise components such as application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), digital signal processors (DSPs), or multi-purpose processors. In one embodiment, the processing circuitry comprises one or more processors and a non-transitory memory connected to the one or more processors. The non-transitory memory may carry executable program code which, when executed by the one or more processors, causes the device 100 to perform, conduct or initiate the operations or methods described herein.
FIG. 2 depicts a diagram illustrating an example of a test setup 200 for detecting the display flicker of a mobile device 201.
The device 100 is configured to detect flickering of objects in the scene. The display of the mobile device 201 is one example of an object that causes a flickering in the scene. The device 100 may detect the flickering of the display of the mobile device 201.
The device 100 may estimate a flickering frequency. For instance, for each pixel of the first set of pixels, the device 100 may accumulate the generated events for a short period of time T to obtain a grayscale image I. For instance, T=1 ms may be assumed.
Moreover, the device 100 may collect several such grayscale images I(i), e.g., for several adjacent time intervals of length T. Furthermore, the device 100 may process the image data such that each of the collected images I(i) is binarized. For example, the device 100 may binarize the image data such that negative values, due to the accumulation of negative events, are set to zero, and positive values, due to the accumulation of positive events, are set to 255. Furthermore, for the pixels which did not generate events, or whose accumulated data sums to zero, the device 100 may set the values to 127.
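The accumulation-and-binarization step above can be sketched as follows. The per-pixel signed event sums are an assumed input representation (the disclosure does not prescribe one), and the function name is illustrative:

```python
import numpy as np

def binarize_accumulated_events(event_sums):
    """Binarize a grayscale image I obtained by accumulating signed
    events over a short interval T (e.g., T = 1 ms).

    event_sums: 2-D array of per-pixel sums of event polarities
                (negative, zero, or positive).
    """
    out = np.empty(event_sums.shape, dtype=np.uint8)
    out[event_sums < 0] = 0      # accumulated negative events -> 0
    out[event_sums > 0] = 255    # accumulated positive events -> 255
    out[event_sums == 0] = 127   # no events, or zero sum      -> 127
    return out
```

For instance, `binarize_accumulated_events(np.array([[-3, 0, 2]]))` yields the row `[0, 127, 255]`, which corresponds to the black, gray and white regions visible in the binarized image of FIG. 3.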
In some embodiments, the device 100 may use another known binarization method. For instance, if only negative or only positive sums are retained, those pixels can be set to 255 and the rest of the pixels can be set to zero. An example of such a binarized image is shown in FIG. 3.
FIG. 3 depicts a diagram illustrating an example of a binarized image obtained after accumulating events for a time interval T=1 ms.
The flickering area 302, which appears as a black band, is detected. The flickering area 302 is caused by flickering of the display of the mobile device 201. The white band 301 is also caused by flickering of the display of the mobile device 201.
This image is obtained by capturing event data from the mobile device 201 shown in FIG. 2, whose display was showing a white image.
Furthermore, the device 100 may select a small rectangular local area having a size of NxN centered at position (xc, yc), from each of the collected images I(i). Moreover, the device 100 may calculate the local averages of the pixels inside these small regions, e.g., for all the binary images.
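A minimal sketch of the local-average computation over an NxN window centered at (xc, yc); the function name is an assumption, and windows are assumed to lie fully inside the image:

```python
import numpy as np

def local_average(binary_image, xc, yc, N=5):
    """Mean of the N x N window centered at (xc, yc) of a binarized
    image (N is assumed odd; N = 5 in the example of the text).
    The window is assumed to lie fully inside the image."""
    half = N // 2
    window = binary_image[yc - half:yc + half + 1, xc - half:xc + half + 1]
    return float(window.mean())
```

Evaluating this for every collected binary image I(i) at a fixed (xc, yc) produces the time-varying signal used in the next step.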
FIG. 4 depicts a diagram 400 illustrating an example of a spectrum 401 calculated from a time varying signal.
For example, the device 100 may use a signal that is obtained from the time variation of the local averages in the first image data 101. In this example, a local window size of N=5 pixels is used. The time variation of the local averages may be periodic.
The device 100 may further estimate a frequency spectrum 401 based on the time variation of the local averages. Moreover, the device 100 may estimate flickering information 102 based on the frequency spectrum 401. For example, the device 100 may estimate at least one flickering frequency. As can be derived from the flickering spectrum of FIG. 4, the fundamental frequency is located around 60 Hz, which is the refresh rate of the display of the mobile device 201 used in the test.
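The frequency estimation from the periodic time variation of the local averages can be sketched with an FFT. The accumulation interval T = 1 ms matches the example above; the function name and the use of `numpy.fft` are assumptions, not the claimed implementation:

```python
import numpy as np

def estimate_flicker_frequency(local_averages, T=1e-3):
    """Estimate the dominant flicker frequency from a sequence of
    local averages computed over adjacent intervals of length T.

    local_averages: 1-D array, one value per binarized image I(i).
    T: accumulation interval in seconds (1 ms in the example).
    Returns the frequency (Hz) of the strongest non-DC component.
    """
    signal = np.asarray(local_averages, dtype=float)
    signal = signal - signal.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))     # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=T)
    return freqs[np.argmax(spectrum)]
```

Feeding a signal such as the one behind FIG. 4 into this sketch would place the strongest peak near 60 Hz, the display refresh rate in the test.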
Notably, however, the present disclosure is not limited to this specific method for detecting the flickering. For instance, in some embodiments, the time interval T can be different from the 1 ms example used in the description of FIG. 3 and FIG. 4.
FIG. 5 depicts a schematic view of a diagram illustrating an example of an arrangement of the local windows 502 over the event sensor 501.
For example, the device 100 may obtain the first image data 101 from the event sensor 501 in which several local windows 502 are located at different positions over the event sensor 501.
The location of the local windows may be predefined or may be selected automatically based on the application and/or other information. In some embodiments, the device 100 may perform the binarization of the accumulated grayscale images by using a different method.
Moreover, the device 100 may calculate the flickering information 102, as described above, also for non-white areas of the scene in which flickering is detected. Furthermore, the size N of the local areas used to compute the averages may also be different from the 5 pixels used in the above example. In some embodiments, the device 100 may estimate the flickering information 102, e.g., the flickering frequency, by using a hybrid sensor in which the event pixels are embedded into a standard color camera.
The device 100 may detect a periodic time variation of the signal activity in at least one area of the first image data 101 of the scene. Moreover, the device 100 may estimate the flickering information, e.g., the flickering frequency which is related to the flickering in that area of the image.
Furthermore, after estimating the flickering frequency, the device 100 may determine an exposure time for the second set of pixels 112 (e.g., of the active rolling shutter camera) such that the determined exposure time avoids or reduces the amount of flickering. For instance, the device 100 may determine the exposure time to be an integer multiple of the inverse of the flicker frequency. Furthermore, in a case where the exposure time cannot be set to exactly an integer multiple of the inverse of the flicker frequency, the device 100 may select an exposure time that is close to such a multiple.
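The exposure-time selection can be sketched as follows; the function name and the rounding-to-nearest-multiple strategy are assumptions consistent with, but not prescribed by, the text:

```python
def flicker_safe_exposure(target_exposure, flicker_freq):
    """Pick an exposure time close to the desired one that is an
    integer multiple of the flicker period (1 / flicker frequency),
    so that banding from the flickering source is avoided or reduced.

    target_exposure: desired exposure time in seconds.
    flicker_freq: estimated flicker frequency in Hz.
    """
    period = 1.0 / flicker_freq
    # Nearest integer multiple of the period, but at least one full period.
    n = max(1, round(target_exposure / period))
    return n * period
```

For a 60 Hz flicker and a desired exposure of 33 ms, this returns two flicker periods (about 33.3 ms), so each row integrates an integer number of brightness cycles.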
Reference is now made to FIG. 6, which is a schematic view of a diagram illustrating an image sensor 110 comprising a plurality of optical black pixels 602.
The image sensor 110 may comprise an array of rolling shutter image sensors 601. Moreover, the array of rolling shutter image sensors 601 may comprise the first set of pixels 111 and the second set of pixels 112.
Furthermore, the first set of pixels 111 may comprise the plurality of optical black pixels 602. The plurality of optical black pixels 602 may be located at predefined locations in the array of rolling shutter image sensors 601. For example, the plurality of optical black pixels 602 may be pixels that are located at a border of the array of rolling shutter image sensors 601.
For example, a 12-megapixel image sensor 110 may comprise tens of thousands of optical black (OB) pixels. The OB pixels may be arranged in a grid pattern. However, the content of the grid may differ pseudo-randomly.
Reference is now made to FIG. 7, which is a schematic view of the image sensor 110 comprising a plurality of event pixels 702.
The image sensor 110 comprises the array of rolling shutter image sensors 601 that includes the first set of pixels 111. The first set of pixels 111 includes the plurality of event pixels 702. The plurality of event pixels 702 are located at predefined locations in the array of rolling shutter image sensors 601. For example, every 16th pixel of the array of rolling shutter image sensors 601 is an event pixel 702.
For example, the image sensor 110 may include both visible pixels and event pixels. Event pixels may be placed in the image sensor in a regular pattern, e.g., every 16th pixel of the image sensor is an event pixel, or in a pseudo-random order. In one implementation, the photodiode of a pixel is shared by the event pixel circuitry and by the frame camera circuitry. Such sharing may happen in the time domain, i.e., at one moment the pixel operates as an event pixel and at another moment as a frame camera pixel.

Reference is now made to FIG. 8, which is a schematic view of a diagram illustrating an image sensor 110 comprising a plurality of event pixels 802 distributed pseudo-randomly in the image sensor 110.
The image sensor 110 comprises the array of rolling shutter image sensors 601 that includes the first set of pixels 111. Moreover, the first set of pixels 111 comprises a plurality of event pixels 802 that are pseudo-randomly distributed in the array of rolling shutter image sensors 601. For example, the device 100 may process the image data 101 such that the image data 101 is divided into 32x32-pixel blocks. Each block may have one OB pixel.
Furthermore, starting from the beginning of the image sensor 110, e.g., the top left corner, each block of pixels may comprise its own OB pixel location type. The type may be a value between 1 and 7. The value may specify an x-y offset for the OB pixel address inside the block. An example of the x-y offset coding is presented in Table 1.
Table 1: x-y offset coding by type of pixel
Moreover, there may be different design choices for how the OB pixel location type changes for each block. An example may be using a simple up counter for the image sensor 110 depicted in FIG. 8. Another example may be using a simple down counter for the image sensor 110 depicted in FIG. 8.
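The per-block addressing described above can be sketched as follows. The actual type-to-offset mapping of Table 1 is provided as an image in the publication, so the `OFFSETS` table below is purely hypothetical, as is the simple up counter that advances the type from block to block:

```python
# Hypothetical x-y offsets per type value 1..7 (the real mapping is in Table 1).
OFFSETS = {1: (0, 0), 2: (8, 4), 3: (16, 8), 4: (24, 12),
           5: (4, 16), 6: (12, 20), 7: (20, 28)}

def ob_pixel_address(block_x, block_y, blocks_per_row, block_size=32):
    """Return the sensor (x, y) address of the OB pixel inside a
    32x32 block, with the type advanced by a simple up counter over
    the blocks, starting from the top-left corner of the sensor."""
    block_index = block_y * blocks_per_row + block_x
    type_value = (block_index % 7) + 1       # up counter cycling over types 1..7
    dx, dy = OFFSETS[type_value]
    return block_x * block_size + dx, block_y * block_size + dy
```

A down counter, or the per-frame and per-line resetting patterns of Tables 2 and 3, would simply change how `type_value` is derived from the block index.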
Table 2 presents a per-frame resetting type pattern, and Table 3 presents a per-line resetting type pattern in which, in particular, every 2nd line is reset in reverse order.
Table 2: a per-frame resetting type pattern
Table 3: a per-line resetting type pattern

It can be derived that a per-line resetting pattern has less random behavior than a per-frame resetting pattern.
The image sensor 110 discussed with respect to FIG. 8 provides a Video Graphics Array (VGA) image in which the OB pixels are provided with a per-frame resetting type pattern and a type coding according to Table 4.
Table 4: a per-frame resetting type pattern for the image sensor of FIG. 8

FIG. 9 shows a method 900 according to an embodiment of the disclosure for processing image data. The method 900 may be carried out by the device 100, as described above.
The method 900 comprises a step S901 of obtaining first image data 101 of a scene, wherein the first image data 101 is obtained with a first set of pixels 111 of at least one image sensor 110.
The method 900 further comprises a step S902 of estimating flickering information 102 based on the first image data 101.
The method 900 further comprises a step S903 of estimating an exposure time 103 for a second set of pixels 112 of the at least one image sensor 110 based on the estimated flickering information 102.
The method 900 further comprises a step S904 of obtaining second image data 104 of the scene, wherein the second image data 104 is obtained with the estimated exposure time 103 and the second set of pixels 112 of the at least one image sensor 110.
The present disclosure has been described in conjunction with various embodiments as examples as well as implementations. However, other variations can be understood and effected by those persons skilled in the art and practicing the claimed disclosure, from studies of the drawings, this disclosure and the independent claims. In the claims as well as in the description, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single element or other unit may fulfill the functions of several entities or items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used in an advantageous implementation.

Claims

1. A device (100) for processing image data, the device (100) being configured to: obtain first image data (101) of a scene, wherein the first image data (101) is obtained with a first set of pixels (111) of at least one image sensor (110); estimate flickering information (102) based on the first image data (101); estimate an exposure time (103) for a second set of pixels (112) of the at least one image sensor (110) based on the estimated flickering information (102); and obtain second image data (104) of the scene, wherein the second image data (104) is obtained with the estimated exposure time (103) and the second set of pixels (112) of the at least one image sensor (110).
2. The device (100) according to claim 1, further configured to: determine one or more flickering frequencies based on the first image data (101); and estimate the exposure time (103) for the second set of pixels (112) based on at least one flickering frequency of the one or more flickering frequencies.
3. The device (100) according to claim 2, wherein: the exposure time (103) is estimated based on an inverse of the at least one flickering frequency.
4. The device (100) according to one of the claims 1 to 3, wherein: estimating flickering information (102) comprises one or more of: estimating global flickering information, estimating local flickering information, determining one or more flickering frequencies, determining a flicker area size in a field of view of the at least one image sensor (110), determining flicker area locations in the field of view of the at least one image sensor (110).
5. The device (100) according to one of the claims 1 to 4, wherein the at least one image sensor (110) comprises an event sensor (501), the event sensor (501) comprising the first set of pixels (111), and wherein the device (100) is further configured to activate the first set of pixels (111) to generate the first image data (101) by acquiring a change in light intensity at one or more time intervals.
6. The device (100) according to claim 5, wherein: activating the first set of pixels (111) comprises one or more of: disabling a flicker removal filter, selecting a setting for flicker detection, using a filter for removing one or more frequencies of light.
7. The device (100) according to claim 5 or 6, wherein: the first image data (101) is obtained based on a plurality of local windows (502) located on top of the event sensor (501).
8. The device (100) according to one of the claims 1 to 4, wherein: the at least one image sensor (110) comprises an array of rolling shutter image sensors (601), the array comprising the first set of pixels (111) and the second set of pixels (112).
9. The device (100) according to claim 8, wherein: the first set of pixels (111) comprises a plurality of optical black pixels (602) located at predefined locations in the array of rolling shutter image sensors (601), in particular pixels located at a border of the array of rolling shutter image sensors (601) are optical black pixels (602).
10. The device (100) according to claim 8, wherein: the first set of pixels (111) comprises a plurality of event pixels (702) located at predefined locations in the array of rolling shutter image sensors (601), in particular every 16th pixel of the array of rolling shutter image sensors (601) is an event pixel (702).
11. The device (100) according to claim 8, wherein: the first set of pixels (111) comprises a plurality of event pixels (802) distributed pseudo-randomly in the array of rolling shutter image sensors (601).
12. The device (100) according to one of the claims 1 to 11, further configured to: obtain third image data of the scene, wherein the third image data is obtained with the second set of pixels (112) of the at least one image sensor (110); and estimate the flickering information (102) based further on the third image data.
13. The device (100) according to claim 12, wherein: the third image data indicates one or more of: a brightness information of an object in the scene, a brightness information of a display screen producing a flicker, a region of interest, ROI, in the scene.
14. The device (100) according to claim 12 or 13, further configured to: adjust a setting of the at least one image sensor (110) based on the third image data, in order to estimate the flickering information (102) from the first image data (101).
15. A method (900) for processing image data, the method (900) comprising: obtaining (S901) first image data (101) of a scene, wherein the first image data (101) is obtained with a first set of pixels (111) of at least one image sensor (110); estimating (S902) flickering information (102) based on the first image data (101); estimating (S903) an exposure time (103) for a second set of pixels (112) of the at least one image sensor (110) based on the estimated flickering information (102); and obtaining (S904) second image data (104) of the scene, wherein the second image data (104) is obtained with the estimated exposure time (103) and the second set of pixels (112) of the at least one image sensor (110).
16. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method (900) of claim 15.
PCT/EP2021/070464 2021-07-22 2021-07-22 Device and method for processing image data WO2023001373A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/EP2021/070464 WO2023001373A1 (en) 2021-07-22 2021-07-22 Device and method for processing image data
CN202180098736.5A CN117426104A (en) 2021-07-22 2021-07-22 Apparatus and method for processing image data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2021/070464 WO2023001373A1 (en) 2021-07-22 2021-07-22 Device and method for processing image data

Publications (1)

Publication Number Publication Date
WO2023001373A1

Family

ID=77155753

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/070464 WO2023001373A1 (en) 2021-07-22 2021-07-22 Device and method for processing image data

Country Status (2)

Country Link
CN (1) CN117426104A (en)
WO (1) WO2023001373A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117692781A (en) * 2023-05-23 2024-03-12 荣耀终端有限公司 Flicker light source detection method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090002520A1 (en) * 2007-06-29 2009-01-01 Norikatsu Yoshida Imaging apparatus, imaging method, storage medium storing program, and integrated circuit
US20150172529A1 (en) * 2013-12-16 2015-06-18 Olympus Corporation Imaging device and imaging method
EP3217644A1 (en) * 2014-11-06 2017-09-13 Sony Corporation Information processing device
US20210067679A1 (en) * 2019-08-28 2021-03-04 Semiconductor Components Industries, Llc Event sensors with flicker analysis circuitry


Also Published As

Publication number Publication date
CN117426104A (en) 2024-01-19


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202180098736.5

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21748826

Country of ref document: EP

Kind code of ref document: A1