WO2022001615A1 - Method and system for automatically removing glare regions - Google Patents
- Publication number
- WO2022001615A1 (PCT/CN2021/099422)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- glare
- preview frame
- camera preview
- glare region
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
Definitions
- a “processing unit” or “processor” includes one or more processors, wherein processor refers to any logic circuitry for processing instructions.
- a processor may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor, a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits, Field Programmable Gate Array circuits, any other type of integrated circuits, etc.
- the processor may perform signal coding, data processing, input/output processing, and/or any other functionality that enables the working of the system according to the present disclosure. More specifically, the processor or processing unit is a hardware processor.
- storage unit refers to a machine or computer-readable medium including any mechanism for storing information in a form readable by a computer or similar machine.
- a computer-readable medium includes read-only memory (“ROM”), random access memory (“RAM”), magnetic disk storage media, optical storage media, flash memory devices or other types of machine-accessible storage media.
- FIG. 1 illustrates a system [100] for automatically removing glare from the images during image capture, in accordance with exemplary embodiment of the present disclosure.
- the system [100] comprises at least one Camera Unit [102] , at least one Processing Unit [104] and at least one Storage Unit [106] , wherein all the components are assumed to be connected to each other unless otherwise indicated below.
- In Fig. 1, only one Camera Unit [102], only one Processing Unit [104] and only one Storage Unit [106] are shown; however, the System [100] may comprise multiple such units and modules, or any number of such units and modules as may be required to implement the features of the present disclosure.
- the system [100] includes the Camera Unit [102] .
- the Camera Unit [102] is configured to receive at least one camera preview frame comprising at least one glare region.
- the at least one camera preview frame is associated with an aspect ratio and field of view.
- the camera preview frame comprises real-time data with respect to the current scene in the surrounding environment.
- the camera preview frame changes in accordance with the movement of the Camera Unit [102] .
- An image may then be created from this real-time data/scene of the camera preview frame.
- the Camera Unit [102] includes at least one of security camera, night vision camera, Infrared camera, vehicle camera, smartphone camera, Digital Single Lens Reflex (DSLR) camera and the like.
- the user tries to capture a glare-free image of the object present in a particular photograph. While clicking the image of the object, the flash of the camera causes one or more glare regions on the image of the object, due to which the quality of the image gets degraded.
- the present invention facilitates the user in capturing the glare-free image at the time of taking the image.
- the system [100] also includes the Processing Unit [104] .
- the Processing Unit [104] of the present disclosure is connected to the Camera Unit [102] .
- the Processing Unit [104] is configured to detect at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness.
- the threshold level of the brightness is fixed based on the trained classifiers and the historical data such as the average values of brightness used in the past for the detection of the glare.
- the threshold level of the brightness is defined according to the one or more parameters such as image capturing environment (e.g., day, night, sunny, rainy) , camera mode, camera lens and the like.
- the threshold level of the brightness may vary dynamically based on the real-time environment.
- the at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal to or more than the predefined threshold level of the brightness. For example, if a region made of 8 pixels has a brightness level more than a threshold level X, then that region of the camera preview frame is considered a glare region.
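The threshold comparison described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation; it assumes a grayscale preview frame represented as a list of pixel rows, and the function name, frame values and threshold value are all hypothetical.

```python
def glare_mask(frame, threshold):
    """Flag each pixel whose brightness is equal to or greater than the
    predefined threshold as belonging to a potential glare region."""
    return [[pixel >= threshold for pixel in row] for row in frame]

# A toy 4x4 grayscale preview frame (0-255); values near 255 mimic glare.
frame = [
    [40, 50, 245, 250],
    [45, 55, 248,  60],
    [42, 44,  47,  49],
    [41, 43,  46,  48],
]
mask = glare_mask(frame, threshold=240)  # three pixels are flagged
```

In a real pipeline this comparison would run on every preview frame, with the threshold adapted to the capture environment as the disclosure describes.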
- the Processing Unit [104] is configured to identify the cluster of pixels having a brightness value equal to or greater than the threshold level of the brightness and which are connected to each other either via 4-pixel connectivity or 8-pixel connectivity.
- the 4-pixel connectivity refers to the grouping of pixels that are connected to each other along any of their four edges.
- the 8-pixel connectivity refers to the grouping of pixels that are connected to each other along any of their four edges or their four corners.
- the said cluster of pixels connected via 4-pixel connectivity or 8-pixel connectivity facilitates the identification of the at least one glare region or affected glare area.
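The grouping of flagged pixels into glare regions via 4-pixel or 8-pixel connectivity can be sketched with a simple breadth-first flood fill. This is a hypothetical illustration of the connectivity concept, not the disclosed implementation; names are invented for the sketch.

```python
from collections import deque

def label_regions(mask, connectivity=8):
    """Group flagged pixels (True entries in mask) into connected regions.

    connectivity=4 joins pixels sharing an edge; connectivity=8 also joins
    pixels touching diagonally at a corner.
    """
    if connectivity == 4:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        offsets = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0)]
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                # Flood-fill one region starting from this unvisited pixel.
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    cr, cc = queue.popleft()
                    region.append((cr, cc))
                    for dr, dc in offsets:
                        nr, nc = cr + dr, cc + dc
                        if 0 <= nr < h and 0 <= nc < w \
                                and mask[nr][nc] and not seen[nr][nc]:
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                regions.append(region)
    return regions

# Two diagonally touching pixels: one region under 8-connectivity,
# two separate regions under 4-connectivity.
mask = [[True, False],
        [False, True]]
eight = label_regions(mask, connectivity=8)
four = label_regions(mask, connectivity=4)
```

The diagonal example shows why the choice of connectivity matters: 8-connectivity merges glare pixels that only touch at a corner, while 4-connectivity keeps them as distinct regions.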
- the Processing Unit [104] is configured to request for the at least one new camera preview frame in a different orientation from the Camera Unit [102] .
- the Camera Unit [102] is further configured to receive the at least one new camera preview frame in a different orientation after the detection of the at least one glare region in the camera preview frame.
- the at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame.
- if a region (x1, y1) is detected as a glare region in the camera preview frame, then the new camera preview frame comprises a glare-free view of the region (x1, y1), which provides the values of the pixels associated with the detected glare region.
- the Camera Unit [102] receives the at least one new camera preview frame either by changing the orientation of the main camera lens or by using one or more other camera lenses of the camera unit.
- the Camera Unit [102] receives the at least one new camera preview frame to determine the information associated with the at least one glare region. For example, the Camera Unit [102] receives the at least one new camera preview frame in a different orientation to obtain the values of the pixels affected by the glare in the camera preview frame.
- the Camera Unit [102] sends the data associated with the at least one new camera preview frame to the Processing Unit [104].
- the Processing Unit [104] is further configured to remove the at least one glare region from the at least one camera preview frame.
- the Processing Unit [104] removes the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame in real-time. Thus, the processing unit interpolates the region inside the glare in the camera preview frame and fills in the values of the pixels obtained from the at least one new camera preview frame. The interpolation of the region inside the glare with its corresponding values is performed to remove the at least one glare region from the camera preview frame and provide a glare-free image to the user.
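The filling-in step described above can be sketched minimally as replacing each glare pixel with the value at the same coordinates in the glare-free frame. This is a simplified, hypothetical sketch: it assumes the two frames are already registered to the same coordinates (the disclosure's frames come from a different orientation, so a real pipeline would need alignment first), and all names and values are illustrative.

```python
def remove_glare(frame, new_frame, glare_region):
    """Return a copy of frame in which each pixel listed in glare_region
    is replaced by the corresponding pixel from the glare-free new_frame.

    Assumes frame and new_frame are aligned grids of equal size.
    """
    repaired = [row[:] for row in frame]  # copy so the input stays intact
    for r, c in glare_region:
        repaired[r][c] = new_frame[r][c]
    return repaired

# Toy frames: one saturated pixel at (0, 1) is patched from new_frame.
frame = [[10, 255], [12, 13]]
new_frame = [[10, 20], [12, 13]]
repaired = remove_glare(frame, new_frame, [(0, 1)])
```

A production implementation might blend or interpolate around the region boundary rather than copying pixels verbatim, but the core idea — sourcing the missing values from a second, glare-free view — is the same.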
- the Processing Unit [104] is further connected with the Storage Unit [106] .
- the system includes the Storage Unit [106] .
- the Storage Unit [106] is configured to store the at least one glare-free image after detecting and removing the at least one glare region from the at least one camera preview frame.
- the Storage Unit [106] is further configured to store the values of the pixels obtained from the at least one new camera preview frame for the removal of the at least one glare region.
- the Storage Unit [106] is configured to store all the data required for the implementation of the present invention.
- Referring to FIG. 2, an exemplary method flow diagram [200], depicting the method of automatically removing glare regions in accordance with an exemplary embodiment of the present disclosure, is shown. As shown in Fig. 2, the method begins at step [202].
- the method comprises receiving, by a Camera Unit [102] , at least one camera preview frame comprising at least one glare region.
- the camera preview frame provides a real-time scene of the environment. Further, the camera preview frame provides the view or appearance of the object to the user before taking the picture of that object.
- the camera preview frame gets activated after triggering the camera for capturing one or more images. Further, the camera preview frame refers to the frame provided by the camera unit before clicking or capturing the image.
- the user opens the camera to capture an image of a monument.
- the user with the help of a camera preview frame can adjust the position of the camera or the device to capture the best possible image before taking the image of the monument.
- the at least one camera preview frame is received by the camera unit to detect the location of the possible glare regions in the image.
- the data associated with the at least one camera preview frame is passed to the processing unit for the detection of the at least one glare region in the camera preview frame.
- the method comprises detecting, by a Processing Unit [104] , the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness.
- the at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal or more than the predefined threshold level of the brightness.
- the user wants to click a picture of a particular page of the notebook. The user opens the camera of the communication device associated with the user for capturing at least one image of the said page.
- the processing unit detects the two glare regions present in the camera preview frame based on the presence of excess brightness (more than the threshold level of brightness) at those regions in comparison to other regions of the frame.
- the method encompasses receiving, by the camera unit, at least one new camera preview frame in a different orientation after the detection of the at least one glare region.
- the at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame.
- the at least one new camera preview frame in the different orientation refers to the preview frames taken either by changing the orientation of the main camera lens or using one or more other camera lenses of the camera unit.
- the wide camera lens may provide the new wide camera preview frame to the user.
- the at least one new camera preview frame is received to determine the information (such as position, number, pixel values) associated with the at least one glare region and to remove the detected at least one glare region from the frame.
- the at least one camera preview frame and the at least one new camera preview frame is received by at least one of security camera, night vision camera, Infrared camera, vehicle camera, smartphone camera, Digital Single Lens Reflex (DSLR) camera.
- the method comprises removing, by the processing unit, the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frames.
- the method comprises storing, at the storage unit, the at least one glare-free image after detecting and removing the at least one glare region from the at least one camera preview frame. After detecting and removing the at least one glare region, the method terminates at step [212] .
- the present disclosure also facilitates detecting and removing the glare which occurs at the time of driving the vehicle.
- the present disclosure provides a glare-free view to the user in real-time for safe driving of the vehicle.
- the vehicle camera installed in the vehicle provides a glare-free front and rear view to the user to prevent any accident that can occur due to reflections from a light source during the journey.
- the functionality of the present disclosure is not limited to detection and removal of glare during the image capture but can also be used for the detection and removal of glare regions during a video capture.
- the one or more aspect of the present disclosure relates to a user equipment for automatically removing the glare.
- the user equipment includes the system [100] , wherein the system [100] is configured to receive at least one camera preview frame to capture at least one glare-free image.
- the system [100] is further configured to detect at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness.
- the system [100] is configured to receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region.
- the system [100] is configured to remove the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frames.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
A method and system are provided for automatically removing glare regions. The method receives at least one camera preview frame comprising at least one glare region. Next, the method detects the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of brightness. Then, the method receives at least one new camera preview frame in a different orientation after the detection of the at least one glare region. Thereafter, the method removes the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
Description
FIELD OF INVENTION
The present disclosure generally relates to the field of image processing, and more particularly to a method and system for automatically removing glare regions from the images at run-time.
This section is intended to provide information relating to field of the invention and thus, any approach or functionality described below should not be assumed to be qualified as prior art merely by its inclusion in this section.
With the advancement in communication technology, the “communication device” or “user equipment” or “mobile station” with various new/advanced features is used for performing a variety of tasks such as entertainment, communication, capturing media (e.g., pictures, images, videos) and the like. Further, these days, the communication device is widely used for capturing pictures, taking selfies, photos, group pictures, and the like. While clicking a picture or an image, a user often faces the problem of getting glare in the images. The glare refers to the presence of harsh bright light at some places in the image, which degrades or spoils the quality of the picture. Generally, the glare in an image is caused by light reflections from a wide variety of objects. Conventionally, there are systems available to remove the glare that work on images already taken by the camera of the device. These images are then sent for further processing to remove the glare. However, these conventionally available systems are not efficient, as they either require a lot of hardware or take too much time to provide a glare-free image to the user due to post-processing of the image.
Thus, the conventionally available methods and systems used for removing glare are not efficient, and there exists a need for a method and system which can provide an efficient way of removing glare from an image in real-time.
SUMMARY
This section is provided to introduce certain objects and aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
To overcome at least a few problems associated with the known solutions as provided in the previous section, an object of the present disclosure is to provide a novel method and system for automatically removing glare from images during image capture. It is another object of the disclosure to detect at least one glare region in the camera preview frame in real-time. It is yet another object of the present disclosure to remove at least one glare region from the camera preview frame in real-time. It is yet another object of the present disclosure to reduce user efforts of postprocessing the image by providing the method and system for real-time removal of the glare from the image.
To achieve the aforementioned objectives, the present disclosure provides a method and system for automatically removing glare during image capture. One aspect of the present disclosure relates to a method of automatically removing glare during image capture. The method comprises receiving, by a Camera Unit, at least one camera preview frame comprising at least one glare region. Next, the method comprises detecting, by a Processing Unit, the at least one glare region in the field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness. Next, the method comprises receiving, by the Camera Unit, at least one new camera preview frame in a different orientation after the detection of the at least one glare region. The at least one new camera preview frame comprises the glare-free view of the at least one glare region detected in the received at least one camera preview frame. Thereafter, the method comprises removing, by the Processing Unit, the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
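The sequence summarized above — threshold-based detection followed by replacement of the affected pixels with values from a glare-free frame — can be illustrated end-to-end with a short sketch. This is a hypothetical, simplified illustration rather than the disclosed implementation: it assumes grayscale frames that are already aligned, and the function name, threshold and pixel values are invented for the example.

```python
def deglare(frame, new_frame, threshold):
    """Detect pixels at or above the brightness threshold in the preview
    frame and replace them with the corresponding values from the
    glare-free frame taken in a different orientation."""
    # Step 1: detect glare pixels by comparing brightness to the threshold.
    mask = [[pixel >= threshold for pixel in row] for row in frame]
    # Step 2: fill each detected pixel from the glare-free frame.
    repaired = [row[:] for row in frame]
    for r, row in enumerate(mask):
        for c, is_glare in enumerate(row):
            if is_glare:
                repaired[r][c] = new_frame[r][c]
    return repaired

frame     = [[40, 250], [245, 55]]   # two saturated pixels mimic glare
new_frame = [[40,  52], [ 50, 55]]   # glare-free view of the same scene
result = deglare(frame, new_frame, threshold=240)
print(result)  # -> [[40, 52], [50, 55]]
```

The sketch compresses detection and removal into one pass; the disclosure additionally groups glare pixels into connected regions before removal, which a fuller implementation would do between the two steps.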
Another aspect of the present disclosure relates to a system for automatically removing glare during image capture. The system comprises a camera unit, a processing unit and a storage unit. The camera unit is configured to receive at least one camera preview frame comprising at least one glare region. The processing unit is configured to detect the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness. Next, the camera unit is configured to receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region. The at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame. Thereafter, the processing unit is configured to remove the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
Yet another aspect of the present disclosure relates to user equipment for automatically removing glare during image capture. The user equipment comprises a system. The system is configured to receive at least one camera preview frame comprising at least one glare region. Next, the system is configured to detect the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness. Then the system is configured to receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region. The at least one new camera preview frame comprises the glare-free view of the at least one glare region detected in the received at least one camera preview frame. Thereafter, the system is configured to remove the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
The accompanying drawings, which are incorporated herein, and constitute a part of this disclosure, illustrate exemplary embodiments of the disclosed methods and systems in which like reference numerals refer to the same parts throughout the different drawings. Components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Some drawings may indicate the components using block diagrams and may not represent the internal circuitry of each component. It will be appreciated by those skilled in the art that disclosure of such drawings includes disclosure of electrical components, electronic components or circuitry commonly used to implement such components.
FIG. 1 illustrates a block diagram of the system [100] for automatically removing glare regions, in accordance with exemplary embodiment of the present disclosure.
FIG. 2 illustrates an exemplary method [200] for automatically removing glare regions, in accordance with exemplary embodiment of the present disclosure.
The foregoing shall be more apparent from the following more detailed description of the disclosure.
In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address any of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
As discussed in the background section, known solutions process the image after it is captured in order to remove glare. Such post-processing requires the user to wait longer to get a glare-free image. Thus, the conventionally available systems are not able to remove glare caused by a light source already present in the frame, i.e., at the time of taking the image. Further, a few of the conventionally available systems need additional hardware to remove glare from the images, which leads to higher cost.
The present disclosure provides a solution relating to an efficient way of managing the time of the user by providing a method and system for automatically removing the glare from the image at run-time, i.e., at the time of taking the picture. The present disclosure first detects the presence of at least one glare region in the camera preview frame in real-time, i.e., at the time of taking an image. Thereafter, the present disclosure removes the at least one detected glare region in real-time and provides a glare-free image to the user.
As used herein, “user equipment” , “user device” and/or “communication device” , may be any electrical, electronic, electromechanical and computing device or equipment, having one or more transceiver unit installed on it. The communication device may include but is not limited to, a mobile phone, smartphone, laptop, a general-purpose computer, desktop, personal digital assistant, tablet computer, wearable device or any other computing device which is capable of implementing the features of the present disclosure and is obvious to a person skilled in the art.
As used herein, “Camera Unit” refers to a device or sensor configured to capture the visible media such as photos, pictures, videos and the like. For example, the communication device includes camera sensors installed in it to capture and save the moments or pictures of any desired object. Further, the Camera Unit may also refer to the camera lens installed at a fixed location to provide real-time camera preview for performing multiple tasks such as camera installed at the vehicle to provide the user a rearview and a forward view for safe driving.
As used herein, a “processing unit” or “processor” includes one or more processors, wherein processor refers to any logic circuitry for processing instructions. A processor may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor, a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits, Field Programmable Gate Array circuits, any other type of integrated circuits, etc. The processor may perform signal coding, data processing, input/output processing, and/or any other functionality that enables the working of the system according to the present disclosure. More specifically, the processor or processing unit is a hardware processor.
As used herein, “storage unit” refers to a machine or computer-readable medium including any mechanism for storing information in a form readable by a computer or similar machine. For example, a computer-readable medium includes read-only memory ( “ROM” ) , random access memory ( “RAM” ) , magnetic disk storage media, optical storage media, flash memory devices or other types of machine-accessible storage media.
The present disclosure is further explained in detail below with reference to the diagrams. FIG. 1 illustrates a system [100] for automatically removing glare from images during image capture, in accordance with an exemplary embodiment of the present disclosure. As shown in Fig. 1, the system [100] comprises at least one Camera Unit [102], at least one Processing Unit [104] and at least one Storage Unit [106], wherein all the components are assumed to be connected to each other unless otherwise indicated below. Also, in Fig. 1 only one Camera Unit [102], only one Processing Unit [104] and only one Storage Unit [106] are shown; however, the System [100] may comprise multiple such units and modules, or the system may comprise any such number of said units and modules as may be required to implement the features of the present disclosure. Also, there may be one or more sub-units of said units and modules of the system [100], which are not shown in Fig. 1 for clarity.
The system [100] includes the Camera Unit [102]. The Camera Unit [102] is configured to receive at least one camera preview frame comprising at least one glare region. In an embodiment, the at least one camera preview frame is associated with an aspect ratio and a field of view. Further, the camera preview frame comprises real-time data with respect to the current scene in the surrounding environment. The camera preview frame changes in accordance with the movement of the Camera Unit [102]. An image may then be created from this real-time data/scene of the camera preview frame. The Camera Unit [102] includes at least one of a security camera, a night vision camera, an Infrared camera, a vehicle camera, a smartphone camera, a Digital Single Lens Reflex (DSLR) camera, and the like. In an example, the user tries to capture a glare-free image of an object present in a particular photograph. While taking the image of the object, the flash of the camera causes one or more glare regions on the image of the object, due to which the quality of the image gets degraded. Thus, the present invention facilitates the user in capturing a glare-free image at the time of taking the image.
The system [100] also includes the Processing Unit [104]. The Processing Unit [104] of the present disclosure is connected to the Camera Unit [102]. The Processing Unit [104] is configured to detect at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness. In a non-limiting embodiment, the threshold level of the brightness is fixed based on trained classifiers and historical data, such as the average values of brightness used in the past for the detection of glare. In another non-limiting embodiment, the threshold level of the brightness is defined according to one or more parameters, such as the image capturing environment (e.g., day, night, sunny, rainy), camera mode, camera lens and the like. In yet another non-limiting embodiment, the threshold level of the brightness may vary dynamically based on the real-time environment. The at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal to or greater than the predefined threshold level of the brightness.
For example, if a region made up of 8 pixels has a brightness level equal to or greater than a threshold level X, then that region of the camera preview frame is considered a glare region. Further, the Processing Unit [104] is configured to identify clusters of pixels having values equal to or greater than the threshold level of the brightness and which are connected to each other via either 4-pixel connectivity or 8-pixel connectivity. The 4-pixel connectivity refers to the grouping of pixels that are connected with each other on any of their four faces. Further, the 8-pixel connectivity refers to the grouping of pixels that are connected with each other on any of their four faces or their four corners. The said clusters of pixels connected via 4-pixel connectivity or 8-pixel connectivity facilitate the identification of the at least one glare region or affected glare area.
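The detection step described above — thresholding pixel brightness and grouping bright pixels into 8-connected clusters — can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation; the threshold value of 240 and the minimum cluster size of 8 pixels are assumed example values, not taken from the specification.

```python
import numpy as np
from scipy import ndimage

def detect_glare_regions(gray, threshold=240, min_pixels=8):
    """Return bounding-box slices of candidate glare regions in a
    grayscale frame.

    A pixel is a glare candidate when its brightness is equal to or
    greater than `threshold`; candidates are grouped using 8-pixel
    connectivity, and clusters smaller than `min_pixels` are ignored.
    Both parameter defaults are illustrative assumptions.
    """
    mask = gray >= threshold                       # brightness comparison
    eight_conn = np.ones((3, 3), dtype=int)        # faces + corners
    labels, n = ndimage.label(mask, structure=eight_conn)
    if n == 0:
        return []
    # count the pixels belonging to each labelled cluster
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    boxes = ndimage.find_objects(labels)
    return [box for box, size in zip(boxes, sizes) if size >= min_pixels]
```

Passing `np.ones((2, 2))`-style cross structuring elements (the default `structure=None`) instead would give 4-pixel connectivity, matching the alternative grouping the description mentions.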
After detection of the at least one glare region in the at least one camera preview frame, the Processing Unit [104] is configured to request the at least one new camera preview frame in a different orientation from the Camera Unit [102]. Thus, the Camera Unit [102] is further configured to receive the at least one new camera preview frame in a different orientation after the detection of the at least one glare region in the camera preview frame. The at least one new camera preview frame comprises a glare-free view of the at least one glare region detected in the received at least one camera preview frame. In an example, if a region (x1, y1) is detected as a glare region in the camera preview frame, then the new camera preview frame comprises a glare-free view of the region (x1, y1), from which the values of the pixels associated with the detected glare region can be determined.
The Camera Unit [102] receives the at least one new camera preview frame either by changing the orientation of the main camera lens or by using one or more other camera lenses of the camera unit. The Camera Unit [102] receives the at least one new camera preview frame to determine the information associated with the at least one glare region. For example, the Camera Unit [102] receives the at least one new camera preview frame in a different orientation to determine the values of the pixels affected by the glare in the camera preview frame. The Camera Unit [102] sends the data associated with the at least one new camera preview frame. After receiving the at least one new camera preview frame in a different orientation, the Processing Unit [104] is further configured to remove the at least one glare region from the at least one camera preview frame. The Processing Unit [104] removes the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame in real-time. Thus, the processing unit interpolates the region inside the glare in the camera preview frame and fills in the values of the pixels obtained from the at least one new camera preview frame. The interpolation of the region inside the glare with the corresponding values is performed to remove the at least one glare region in the camera preview frame and provide a glare-free image to the user. The Processing Unit [104] is further connected with the Storage Unit [106].
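The fill step above — replacing glare pixels with the corresponding values from the new frame — can be sketched as below. This is a simplified illustration: it assumes the new preview frame has already been geometrically aligned to the original frame (the alignment between two frames taken in different orientations, e.g. via homography estimation, is outside this sketch), and `remove_glare` is a hypothetical helper name, not one used in the specification.

```python
import numpy as np

def remove_glare(frame, new_frame, glare_mask):
    """Replace glare pixels in `frame` with the co-located pixel values
    from `new_frame`.

    `glare_mask` is a boolean array marking the detected glare region;
    `new_frame` is assumed to be pre-aligned to `frame` so that the
    same (row, col) index refers to the same scene point.
    """
    result = frame.copy()                         # leave input untouched
    result[glare_mask] = new_frame[glare_mask]    # fill from glare-free view
    return result
```

Smoothing the seam between filled and original pixels (e.g. feathered blending at the mask boundary) would be a natural refinement but is not shown here.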
The system [100] includes the Storage Unit [106]. The Storage Unit [106] is configured to store the at least one glare-free image after the at least one glare region is detected and removed from the at least one camera preview frame. The Storage Unit [106] is further configured to store the values of the pixels obtained from the at least one new camera preview frame for the removal of the at least one glare region. Further, the Storage Unit [106] is configured to store all the data required for the implementation of the present invention.
Referring to Fig. 2, an exemplary method flow diagram [200] , depicting method of automatically removing glare regions, in accordance with exemplary embodiment of the present disclosure is shown. As shown in Fig. 2, the method begins at step [202] .
At step [204], the method comprises receiving, by a Camera Unit [102], at least one camera preview frame comprising at least one glare region. The camera preview frame provides a real-time scene of the environment. Further, the camera preview frame provides the view or appearance of the object to the user before the picture of that object is taken. The camera preview frame is activated after the camera is triggered for capturing one or more images. Further, the camera preview frame refers to the frame provided by the camera unit before the image is captured. In an example, the user opens the camera to capture an image of a monument. With the help of the camera preview frame, the user can adjust the position of the camera or the device to capture the best possible image before taking the image of the monument. In the present disclosure, the at least one camera preview frame is received by the camera unit to detect the location of possible glare regions in the image. The data associated with the at least one camera preview frame is passed to the processing unit for the detection of the at least one glare region in the camera preview frame.
At step [206], the method comprises detecting, by a Processing Unit [104], the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness. The at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal to or greater than the predefined threshold level of the brightness. In an example, the user wants to take a picture of a particular page of a notebook. The user opens the camera of the communication device associated with the user for capturing at least one image of the said page. While taking the image of the page, the user finds that there are two regions or places in the preview frame where the written text is not clear due to the reflection of light present in that environment. Thus, the processing unit will detect those regions as the two glare regions present in the camera preview frame based on the presence of excess brightness (more than the threshold level of brightness) at those regions in comparison to other regions of the frame.
Thereafter, at step [208], the method encompasses receiving, by the camera unit, at least one new camera preview frame in a different orientation after the detection of the at least one glare region. The at least one new camera preview frame comprises a glare-free view of the at least one glare region detected in the received at least one camera preview frame. The at least one new camera preview frame in the different orientation refers to preview frames taken either by changing the orientation of the main camera lens or by using one or more other camera lenses of the camera unit. In an example, the wide camera lens may provide the new wide camera preview frame to the user. The at least one new camera preview frame is received to determine the information (such as position, number, pixel values) associated with the at least one glare region and to remove the detected at least one glare region from the frame. The at least one camera preview frame and the at least one new camera preview frame are received by at least one of a security camera, a night vision camera, an Infrared camera, a vehicle camera, a smartphone camera, and a Digital Single Lens Reflex (DSLR) camera.
Next, at step [210], the method comprises removing, by the processing unit, the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame. The method further comprises storing, at the storage unit, the at least one glare-free image after detecting and removing the at least one glare region from the at least one camera preview frame. After detecting and removing the at least one glare region, the method terminates at step [212].
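Steps [204] to [210] can be tied together in a short end-to-end sketch. Here `get_preview` is a hypothetical callable standing in for the Camera Unit, and requesting the re-oriented frame is modelled simply by calling it a second time; the brightness threshold of 240 is an assumed value, and frame alignment is again assumed to happen upstream.

```python
import numpy as np

def glare_free_capture(get_preview, threshold=240):
    """Sketch of the method flow [200].

    Step [204]: receive a camera preview frame.
    Step [206]: detect glare by comparing pixel brightness against
                a predefined threshold.
    Step [208]: receive a new preview frame in a different orientation
                (modelled by calling `get_preview` again).
    Step [210]: fill the glare pixels from the new frame.
    """
    frame = get_preview()                       # step [204]
    mask = frame >= threshold                   # step [206]
    if not mask.any():
        return frame                            # no glare to remove
    new_frame = get_preview()                   # step [208]
    out = frame.copy()
    out[mask] = new_frame[mask]                 # step [210]
    return out
```

A production version would loop over preview frames in real-time, keep a per-cluster mask rather than a single global one, and store the resulting glare-free image to the Storage Unit.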
The present disclosure also facilitates detecting and removing the glare which occurs at the time of driving a vehicle. The present disclosure provides a glare-free view to the user in real-time for safe driving of the vehicle. With the help of the present disclosure, a vehicle camera installed in the vehicle provides a glare-free front view and rearview to the user to prevent accidents that can occur due to reflections from a light source during the journey. Further, the functionality of the present disclosure is not limited to the detection and removal of glare during image capture but can also be used for the detection and removal of glare regions during video capture.
Furthermore, one or more aspects of the present disclosure relate to a user equipment for automatically removing the glare. The user equipment includes the system [100], wherein the system [100] is configured to receive at least one camera preview frame to capture at least one glare-free image. The system [100] is further configured to detect at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixels of the at least one camera preview frame with a predefined threshold level of the brightness. Next, the system [100] is configured to receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region. Thereafter, the system [100] is configured to remove the at least one glare region from the at least one camera preview frame by interpolating the pixels of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
While considerable emphasis has been placed herein on the disclosed embodiments, it will be appreciated that many embodiments can be made and that many changes can be made to the embodiments without departing from the principles of the present disclosure. These and other changes in the embodiments of the present disclosure will be apparent to those skilled in the art, whereby it is to be understood that the foregoing descriptive matter to be implemented is illustrative and non-limiting.
Claims (15)
- A method of automatically removing glare regions, the method comprising:receiving, by a Camera Unit [102] , at least one camera preview frame comprising at least one glare region;detecting, by a Processing Unit [104] , the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness;receiving, by the Camera Unit [102] , at least one new camera preview frame in a different orientation after the detection of the at least one glare region, wherein the at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame; andremoving, by the Processing Unit [104] , the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frames.
- The method as claimed in claim 1, further comprises, storing, at the Storage Unit [106] , the at least one glare-free image after detecting and removing the at least one glare region from the at least one camera preview frame.
- The method as claimed in claim 1, wherein the at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal or more than the predefined threshold level of the brightness.
- The method as claimed in claim 1, wherein the at least one glare region is detected and removed in real-time.
- The method as claimed in claim 1, wherein the at least one camera preview frame and the at least one new camera preview frame is received by at least one of security camera, night vision camera, Infrared camera, vehicle camera, smartphone camera, Digital Single Lens Reflex (DSLR) camera.
- A system [100] for automatically removing glare regions, the system comprising:a Camera Unit [102] , wherein the Camera Unit [102] is configured to receive at least one camera preview frame comprising at least one glare region;a Processing Unit [104] , wherein the Processing Unit [104] is configured to detect the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness;the Camera Unit [102] is further configured to receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region, wherein the at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame; andthe Processing Unit [104] is further configured to remove the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
- The system [100] as claimed in claim 6, further comprises a Storage Unit [106] , wherein the Storage Unit [106] is configured to store the at least one glare-free image obtained after detecting and removing the at least one glare region.
- The system [100] as claimed in claim 6, wherein the at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal or more than the predefined threshold level of the brightness.
- The system [100] as claimed in claim 6, wherein the at least one glare region is detected and removed in real-time.
- The system [100] as claimed in claim 6, wherein the Camera Unit [102] comprises at least one of security camera, night vision camera, Infrared camera, vehicle camera, smartphone camera, Digital Single Lens Reflex (DSLR) camera.
- A user equipment for automatically removing glare regions, the user equipment comprises:a system [100] configured to:- receive at least one camera preview frame comprising at least one glare region;- detect the at least one glare region in a field of view of the at least one camera preview frame based on the comparison of the brightness of pixel of the at least one camera preview frame with a predefined threshold level of the brightness;- receive at least one new camera preview frame in a different orientation after the detection of the at least one glare region, wherein the at least one new camera preview frame comprises the glare free view of the at least one glare region detected in the received at least one camera preview frame; and- remove the at least one glare region from the at least one camera preview frame by interpolating the pixel of the at least one glare region with the corresponding values obtained from the at least one new camera preview frame.
- The user equipment as claimed in claim 11, wherein the system [100] is further configured to store the at least one glare-free image obtained after detecting and removing the at least one glare region.
- The user equipment as claimed in claim 11, wherein the at least one glare region is detected in an event the brightness of the pixels associated with the at least one glare region is equal or more than the predefined threshold level of the brightness.
- The user equipment as claimed in claim 11, wherein the at least one glare region is detected and removed in real-time.
- The user equipment as claimed in claim 11, wherein the system [100] is configured to receive at least one camera preview frame and the at least one new camera preview frame by at least one of a security camera, night vision camera, Infrared camera, vehicle camera, smartphone camera, Digital Single Lens Reflex (DSLR) camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202041027483 | 2020-06-29 | ||
IN202041027483 | 2020-06-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022001615A1 true WO2022001615A1 (en) | 2022-01-06 |
Family
ID=79317418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/099422 WO2022001615A1 (en) | 2020-06-29 | 2021-06-10 | Method and system for automatically removing glare regions |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022001615A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008118386A (en) * | 2006-11-02 | 2008-05-22 | Canon Inc | Imaging apparatus, its control method and imaging system |
CN102163328A (en) * | 2011-05-06 | 2011-08-24 | 连云港杰瑞电子有限公司 | Method for detecting and eliminating glare in traffic video image |
US20160073035A1 (en) * | 2013-08-26 | 2016-03-10 | Kabushiki Kaisha Toshiba | Electronic apparatus and notification control method |
CN108616687A (en) * | 2018-03-23 | 2018-10-02 | 维沃移动通信有限公司 | A kind of photographic method, device and mobile terminal |
CN110557575A (en) * | 2019-08-28 | 2019-12-10 | 维沃移动通信有限公司 | method for eliminating glare and electronic equipment |
2021-06-10: PCT/CN2021/099422 filed as WO2022001615A1 (active, Application Filing)
Legal Events
- 121 — Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 21834622; Country of ref document: EP; Kind code of ref document: A1.
- NENP — Non-entry into the national phase. Ref country code: DE.
- 122 — Ep: pct application non-entry in european phase. Ref document number: 21834622; Country of ref document: EP; Kind code of ref document: A1.