CN113660428B - Image acquisition method and device and electronic equipment - Google Patents


Info

Publication number
CN113660428B
CN113660428B (application CN202010366166.0A)
Authority
CN
China
Prior art keywords: image, frames, determining, test images, frame rate
Prior art date
Legal status
Active
Application number
CN202010366166.0A
Other languages
Chinese (zh)
Other versions
CN113660428A (en)
Inventor
陈沭
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010366166.0A
Publication of CN113660428A
Application granted
Publication of CN113660428B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/745 Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to an image acquisition method, an image acquisition device, and electronic equipment, and belongs to the technical field of image acquisition. The method can effectively eliminate light reflection areas in an image, improving image quality and the user experience. Applied to electronic equipment, the method comprises the following steps: acquiring the flicker frequency of ambient light in the shooting environment; acquiring an image to be processed and at least two frames of test images captured at a set frame rate, wherein the set frame rate is greater than the flicker frequency; determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images; and performing reflection elimination processing on the light reflection area of the image to be processed to generate a target image.

Description

Image acquisition method and device and electronic equipment
Technical Field
The disclosure relates to the technical field of image acquisition, and in particular relates to an image acquisition method, an image acquisition device and electronic equipment.
Background
With the development of electronic devices, image capture has become an essential function of devices such as mobile phones. In the related art, images acquired by an electronic device contain a light reflection area in some scenes. For example, when a user shoots an outdoor scene from indoors through glass, the captured image may contain the reflection of an indoor light source on the glass. The reflection area blurs or even erases image details, reduces image quality, and degrades the user experience.
Disclosure of Invention
The disclosure provides an image acquisition method, an image acquisition device, and electronic equipment, which address the defect in the related art that a captured image may contain the reflection of an indoor light source on glass.
In a first aspect, an embodiment of the present disclosure provides an image acquisition method applied to an electronic device, the electronic device comprising a flicker frequency acquisition component and a camera assembly. The method comprises the following steps:
acquiring the flicker frequency of ambient light in a shooting environment;
acquiring an image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
determining a light reflection area in the image to be processed according to the brightness of the at least two frames of test images;
and performing reflection elimination processing on the light reflection area of the image to be processed to generate a target image.
In one embodiment, the determining the light reflection area in the image to be processed according to the brightness of the at least two frames of test images includes:
determining pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images;
and determining the position of the light reflecting area in the image to be processed according to the pixel coordinates.
In one embodiment, the determining the pixel coordinates of the light reflection area according to the brightness difference of the at least two frames of test images includes:
determining at least one group of interframe brightness change areas according to the brightness difference of any two frames of the test images;
determining the group of inter-frame brightness change regions having the largest brightness difference as the light reflection area;
and determining pixel coordinates of the light reflecting area.
In one embodiment, determining at least one group of inter-frame brightness variation regions according to the brightness difference of the test image of any two frames includes:
and determining the area with the absolute value of the brightness difference larger than or equal to a set threshold value in any two frames of the test images as a group of inter-frame brightness change areas.
In one embodiment, the acquiring the image to be processed and the at least two frames of test images acquired at the set frame rate includes:
acquiring at least two frames of test images acquired at the set frame rate; synthesizing the image to be processed according to the acquired at least two frames of the test images;
or,
acquiring the at least two frames of test images acquired at the set frame rate and a first exposure time; and acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
In a second aspect, an embodiment of the present disclosure provides an image acquisition apparatus applied to an electronic device, the electronic device comprising a flicker frequency acquisition component and a camera assembly. The image acquisition apparatus includes:
the first acquisition module is used for acquiring the flicker frequency of the ambient light through the flicker frequency acquisition component;
the second acquisition module is used for acquiring an image to be processed and at least two frames of test images through the camera assembly at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
the determining module is used for determining a light reflecting area in the image to be processed according to the brightness of the at least two frames of test images; and
and the reflection elimination module is used for performing reflection elimination processing on the light reflection area of the image to be processed to generate a target image.
In one embodiment, the determining module includes:
the first determining unit is used for determining pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images; and
and the second determining unit is used for determining the light reflecting area in the image to be processed according to the pixel coordinates.
In one embodiment, the first determining unit includes:
the first determining subunit is used for determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
a second determining subunit configured to determine, as the light reflection area, a group of the inter-frame luminance change areas where the luminance difference is largest; and
and the acquisition subunit is used for acquiring the pixel coordinates of the light reflecting area.
In one embodiment, the first determining subunit is specifically configured to: and taking the area with the absolute value of the brightness difference larger than or equal to the set threshold value in any two frames of the test images as a group of inter-frame brightness change areas.
In one embodiment, the second acquisition module includes:
the first acquisition unit is used for acquiring at least two frames of test images acquired at the set frame rate; and
the synthesis unit is used for synthesizing the image to be processed according to the acquired at least two frames of the test images;
alternatively, the second acquisition module includes:
the second acquisition unit is used for acquiring the at least two frames of test images acquired at the set frame rate and the first exposure time; and
and a third acquisition unit configured to acquire the image to be processed acquired at the set frame rate and a second exposure time, where the second exposure time is longer than the first exposure time.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
In a third aspect, embodiments of the present disclosure provide an electronic device, the device comprising:
a flicker frequency acquisition component;
a camera assembly;
a memory storing executable instructions; and
a processor configured to execute executable instructions in the memory to implement the steps of the image acquisition method provided in the first aspect above.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the steps of the image acquisition method provided in the first aspect.
The image acquisition method, the image acquisition device and the electronic equipment provided by the disclosure have at least the following beneficial effects:
according to the image acquisition method provided by the embodiment of the disclosure, the test images are acquired under different ambient light brightness conditions according to the ambient light flicker frequency, and then the light reflection area is positioned based on brightness change of at least two frames of test images. The method can rapidly and accurately locate the light reflecting area, and further, the light reflecting area is subjected to light reflecting elimination treatment in the image to be treated, and the image without the light reflecting area is obtained. In this way, the user transparent glass can also shoot clear outdoor scene images, improve image quality and optimize user experience.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart illustrating an image acquisition method according to an exemplary embodiment;
FIG. 2 is a flowchart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 3 is a flowchart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 4 is a flowchart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 5 is a flowchart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 6 is a flowchart illustrating an image acquisition method according to another exemplary embodiment;
FIG. 7 is a block diagram of an image acquisition device according to an exemplary embodiment;
fig. 8 is a block diagram of an image acquisition apparatus according to another exemplary embodiment;
fig. 9 is a block diagram of an image acquisition apparatus according to another exemplary embodiment;
fig. 10 is a block diagram of an image acquisition apparatus according to another exemplary embodiment;
fig. 11 is a block diagram of an image acquisition apparatus according to another exemplary embodiment;
fig. 12 is a block diagram of an electronic device, according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. Unless defined otherwise, technical or scientific terms used in this disclosure should be given the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The terms "a" or "an" and the like as used in the description and the claims do not denote a limitation of quantity, but rather denote the presence of at least one. Unless otherwise indicated, the terms "comprises," "comprising," and the like are intended to cover the presence of elements or articles recited as being "comprising" or "including," and equivalents thereof, without excluding other elements or articles. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
As used in this disclosure and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
Based on the defect that an image acquired in the related art has a light reflection area, the embodiment of the disclosure provides an image acquisition method, an image acquisition device and electronic equipment.
Before explaining an image acquisition method provided by an embodiment of the present disclosure, an electronic device to which the image acquisition method is applied is first described.
In one embodiment, the electronic device includes a flicker frequency acquisition component and a camera assembly. The flicker frequency acquisition component collects ambient light and determines the flicker frequency of the ambient light from the collected light. Optionally, the flicker frequency acquisition component is a flicker sensor. The camera assembly comprises a lens and an image sensor: incident light is projected through the lens onto the image sensor, which converts the received light into an output image. In an embodiment of the present disclosure, the camera assembly is a high-frequency camera with a frame rate greater than 120 fps.
In one embodiment, the flicker frequency acquisition component and the camera assembly are external devices paired with the electronic device. Optionally, they are connected to the electronic device wirelessly, e.g., via Bluetooth or Wi-Fi, or through a wired connection such as a USB data cable.
Fig. 1 is a flowchart illustrating an image acquisition method according to an exemplary embodiment. As shown in fig. 1, and in combination with the electronic device described above, the image acquisition method provided in the embodiment of the disclosure includes:
step 101, obtaining the flicker frequency of the ambient light of the shooting environment.
Indoor light sources (e.g., ceiling lights, desk lamps) are powered by alternating current at a frequency of 50 Hz or 60 Hz. The luminous brightness of an indoor light source therefore changes periodically, leaving the light source in a state of rapid flicker.
The flicker frequency of the ambient light is acquired by the flicker frequency acquisition component. When this component is an external device of the electronic device, step 101 specifically includes: receiving the ambient light flicker frequency data sent by the flicker frequency acquisition component. The component can acquire the flicker frequency in the following two alternative ways.
In the first alternative, the flicker frequency acquisition component acquires the flicker frequency of the ambient light in response to a shooting trigger operation. The shooting trigger operation also triggers the camera assembly to operate; accordingly, step 101 is completed during image capture.
In this way, the flicker frequency acquired by the component is that of the actual shooting environment, which improves the accuracy of the subsequent light reflection area determination.
In the second alternative, the flicker frequency acquisition component continuously acquires the flicker frequency of the ambient light at a set period, and the flicker frequency acquired most recently before the shooting trigger operation is taken as the flicker frequency of the shooting environment.
Because the ambient light parameters of the shooting environment are relatively stable, the most recently acquired flicker frequency can represent the flicker frequency of the shooting environment, which likewise improves the accuracy of the subsequent light reflection area determination.
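The patent does not specify how the flicker sensor derives the frequency. As a minimal sketch of one plausible approach (an assumption, not the patent's implementation), the dominant frequency of a sampled ambient-brightness signal can be read off the peak of its FFT magnitude spectrum; the function name `estimate_flicker_frequency` and the synthetic 50 Hz signal are illustrative only:

```python
import numpy as np

def estimate_flicker_frequency(samples, sample_rate_hz):
    """Estimate the dominant flicker frequency (Hz) of an ambient-light
    brightness signal from the peak of its FFT magnitude spectrum."""
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()                 # drop the DC component
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    return float(freqs[np.argmax(spectrum)])

# One second of a 50 Hz brightness ripple sampled at 1 kHz.
t = np.arange(0, 1.0, 1.0 / 1000.0)
signal = 0.5 + 0.5 * np.sin(2 * np.pi * 50.0 * t)
print(estimate_flicker_frequency(signal, 1000.0))      # prints 50.0
```

A real flicker sensor would perform an equivalent measurement in hardware; the point here is only that the flicker frequency is recoverable from brightness samples taken faster than the flicker itself.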
Step 102, acquiring an image to be processed and at least two frames of test images captured at a set frame rate, wherein the set frame rate is greater than the flicker frequency.
In the embodiment of the disclosure, the test image is used for positioning the light reflecting area, and the image to be processed is used for subsequent processing and finally presented on a display screen of the electronic device.
Optionally, the image to be processed and the at least two frames of test images are captured by the camera assembly, triggered, for example, by the shooting trigger operation. When the camera assembly is an external device of the electronic device, step 102 specifically includes: receiving the data of the image to be processed and the at least two frames of test images sent by the camera assembly.
Since the set frame rate is greater than the flicker frequency, the camera assembly captures the test images at different ambient light brightness levels. Preferably, the set frame rate is greater than or equal to twice the flicker frequency. The camera assembly can then capture test images at both the highest and the lowest ambient light brightness, maximizing the brightness difference between test frames caused by the ambient light flicker and thereby facilitating accurate localization of the light reflection area through the test images.
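The effect of the frame-rate choice can be illustrated with a toy model (my own sketch, not from the patent), treating ambient light as a raised cosine at the flicker frequency and each frame as an instantaneous brightness sample; `frame_brightness` is a hypothetical helper:

```python
import numpy as np

def frame_brightness(flicker_hz, frame_rate_hz, n_frames):
    """Brightness seen by each captured frame, modeling ambient light as a
    raised cosine at `flicker_hz` and assuming instantaneous exposure."""
    t = np.arange(n_frames) / frame_rate_hz
    return 0.5 + 0.5 * np.cos(2 * np.pi * flicker_hz * t)

# At a frame rate equal to the flicker frequency, every frame samples the
# same phase of the cycle, so no inter-frame brightness difference appears.
# At twice the flicker frequency, alternate frames sample opposite halves
# of the cycle, maximizing the observable difference.
same = frame_brightness(50.0, 50.0, 4)
alt = frame_brightness(50.0, 100.0, 4)
print(round(float(np.ptp(same)), 6), round(float(np.ptp(alt)), 6))  # prints 0.0 1.0
```

This is the intuition behind requiring the set frame rate to exceed (preferably double) the flicker frequency.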
Figs. 2 and 3 show implementations of step 102 according to different exemplary embodiments.
As an example, as shown in fig. 2, step 102 specifically includes:
step 1021a, acquiring at least two frames of test images acquired at a set frame rate and a first exposure time.
Step 1022a, acquiring an image to be processed acquired at a set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
Because the image to be processed is captured with a longer exposure time, it retains more image detail than the test images. This ensures the quality of the image ultimately presented on the display screen of the electronic device. In steps 1021a and 1022a, image capture is performed by the camera assembly.
As another example, as shown in fig. 3, step 102 specifically includes:
step 1021b, obtaining at least two frames of test images acquired at a set frame rate.
And step 1022b, synthesizing the image to be processed according to the acquired at least two frames of test images.
In this way, the image to be processed is guaranteed to contain rich image detail, optimizing image quality. In steps 1021b and 1022b, image capture is performed by the camera assembly.
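The patent does not fix the synthesis method for step 1022b. One plausible realization (an assumption on my part) is to average the aligned short-exposure test frames, which accumulates detail and suppresses noise much like stacking several short exposures into one longer one; `synthesize_to_be_processed` is a hypothetical name:

```python
import numpy as np

def synthesize_to_be_processed(test_frames):
    """Synthesize the image to be processed from aligned short-exposure test
    frames by pixel-wise averaging (one possible synthesis strategy)."""
    stack = np.stack([f.astype(np.float64) for f in test_frames])
    return stack.mean(axis=0)

# Two toy test frames whose brightness differs due to ambient flicker.
frames = [np.full((3, 3), 90.0), np.full((3, 3), 110.0)]
print(synthesize_to_be_processed(frames)[0, 0])   # prints 100.0
```

Averaging also partially cancels the flicker-induced brightness swing between frames, which is convenient for the image that will actually be displayed.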
With continued reference to fig. 1, step 103 is performed after step 102, specifically as follows:
and step 103, determining a light reflection area in the image to be processed according to the brightness of at least two frames of test images.
Because the brightness of the indoor light source varies, the parts of the image lit by indoor light differ in brightness across the test frames; that is, the brightness of the light reflection area differs between test images. The brightness of the image outside the reflection area, by contrast, is barely affected by the ambient light flicker, so the area whose brightness changes can be extracted by subtracting the brightness of two test frames.
Fig. 4 is a flow diagram illustrating step 103 according to an exemplary embodiment. In one embodiment, as shown in fig. 4, step 103 specifically includes:
and 1031, determining pixel coordinates of the light reflecting area according to the brightness difference of at least two frames of test images.
Different frames among the at least two test images correspond to different ambient light brightness levels, so the light reflection area can be located in the test images from their brightness changes. The pixel coordinates of the light reflection area are the pixel coordinates of its boundary points. The embodiments of the present disclosure do not limit the number of boundary points; naturally, the more boundary points, the more accurately the light reflection area is delimited.
Step 1032, determining the position of the light reflection area in the image to be processed according to the pixel coordinates of the light reflection area determined in the test image.
The electronic device is relatively stationary while shooting, so in the embodiments of the present disclosure the test images and the image to be processed are considered to depict the same shooting region. In other words, the test images and the image to be processed share the same coordinate system and the same shooting region; in that case, the pixel coordinates of the light reflection area in a test image equal those of the light reflection area in the image to be processed.
In this way, the light reflection area is accurately located in the image to be processed based on the brightness changes during ambient light flicker, which reduces the difficulty of the subsequent reflection elimination processing and improves both its quality and its speed.
In one embodiment, fig. 5 is a flow diagram illustrating step 1031, according to an example embodiment. As shown in fig. 5, step 1031 includes:
step 1031a, determining at least one group of inter-frame brightness change regions according to the brightness difference of any two frames of test images.
Optionally, the region in which the absolute value of the brightness difference of the same pixel between any two test frames is greater than or equal to a set threshold is taken as a group of inter-frame brightness change regions. In this case, a group of inter-frame brightness change regions includes at least one brightness change sub-region. Take, for example, a scene shot from indoors through glass: the indoor light source forms multiple reflections on the glass, so multiple light reflection areas appear in the test images captured by the electronic device, and their brightness differs between test frames under the influence of the ambient light flicker. The inter-frame brightness change region obtained from the brightness difference of two test frames therefore includes multiple brightness change sub-regions.
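The thresholding described above can be sketched in a few lines. This is a minimal illustration under the assumption of perfectly aligned frames; the function name and the toy 2×3 frames are mine, not the patent's:

```python
import numpy as np

def brightness_change_region(frame_a, frame_b, threshold):
    """Boolean mask of the inter-frame brightness change region: pixels whose
    absolute luminance difference between two test frames meets `threshold`."""
    diff = np.abs(frame_a.astype(np.int32) - frame_b.astype(np.int32))
    return diff >= threshold

a = np.array([[10, 10, 10],
              [10, 200, 10]], dtype=np.uint8)   # reflection bright in frame a
b = np.array([[12, 9, 11],
              [10, 40, 10]], dtype=np.uint8)    # reflection dim in frame b
mask = brightness_change_region(a, b, threshold=30)
print(mask)   # only the reflective pixel at (1, 1) is flagged
```

Small sensor-noise differences (here a few gray levels) fall below the threshold, so only the flicker-driven reflection survives in the mask.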
It should be further noted that when the camera assembly captures two test frames, one group of inter-frame brightness change regions is determined from the brightness difference between them. When it captures three or more test frames, every pair of test frames is compared, yielding at least three groups of inter-frame brightness change regions.
Step 1031b, determining a group of inter-frame brightness change regions with the largest brightness difference as the light reflection regions.
When only one group of inter-frame brightness change regions is obtained in step 1031a, that group is determined to be the light reflection area. When step 1031a yields multiple groups, the group with the largest brightness difference is used as the light reflection area: the largest difference indicates that the fluctuation of the ambient light was greatest between the two test frames involved, and a pronounced ambient brightness change allows the light reflection area to be located accurately.
Further, when step 1031a yields multiple groups of inter-frame brightness change regions and a group includes multiple brightness change sub-regions, the light reflection area is determined from the brightness difference of a specified region within each group. For example, the specified region is the brightness change sub-region with the largest area in the group (i.e., the sub-region covering the most pixels).
Step 1031c, obtaining pixel coordinates of the light reflection area.
With continued reference to fig. 1, after determining the light reflection area in the image to be processed by step 103, the image acquisition method further includes step 104, specifically:
and 104, carrying out reflection elimination processing on the reflection area of the image to be processed to generate a target image.
The target image is the image to be processed after its reflection has been eliminated. During the reflection elimination processing, the image inside the reflection area is repaired from the image surrounding it, so as to restore the image details lost to the reflection.
Optionally, an image restoration (inpainting) algorithm is used to perform reflection elimination on the light reflection area in the image to be processed.
Illustratively, a partial differential equation (PDE) method is used to repair the light reflection area in the image to be processed. During processing, based on the light reflection area determined in step 103, information from outside the area is diffused inward along the normal direction of the area's contour toward the pixels to be repaired.
Illustratively, exemplar-based inpainting is used to restore the light reflection area. During processing, a patch is chosen at the boundary of the light reflection area, so that it straddles the region's edge. The image outside the light reflection area is divided into parts of the patch's size, the part most similar to the patch is selected as the reference part, and the reflection-area content inside the patch is repaired using that reference part.
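A crude stand-in for the PDE approach (my own sketch, far simpler than production inpainting) iteratively replaces each masked pixel with the mean of its four neighbours, diffusing information from outside the light reflection area inward. Real implementations would use an established routine such as OpenCV's `cv2.inpaint` with the `INPAINT_NS` (Navier-Stokes/PDE) or `INPAINT_TELEA` flags:

```python
import numpy as np

def diffuse_inpaint(image, mask, iterations=200):
    """Minimal PDE-style repair sketch: iteratively set each masked pixel to
    the mean of its 4-neighbours, diffusing outside information inward."""
    img = image.astype(np.float64).copy()
    for _ in range(iterations):
        padded = np.pad(img, 1, mode="edge")
        neighbours = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        img[mask] = neighbours[mask]        # only masked pixels are rewritten
    return img

img = np.full((5, 5), 100.0)
img[2, 2] = 255.0                           # a bright reflective spot
mask = np.zeros((5, 5), dtype=bool)
mask[2, 2] = True
repaired = diffuse_inpaint(img, mask)
print(repaired[2, 2])                       # prints 100.0
```

Pixels outside the mask are never modified, so the repair only fills the located reflection area, as step 104 requires.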
Optionally, a reflection elimination model is built with a deep learning algorithm, and the light reflection area determined in step 103 is repaired by this model.
During reflection elimination, the test frame with the darkest or smallest light reflection area is used as the reference image for the processing: a darker or smaller reflection area indicates that the test image was less affected by the ambient light. Using a more faithful reference image improves both the speed of the reflection elimination processing and the accuracy with which the original content of the reflection area is restored.
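Selecting that reference frame can be sketched as picking the test frame with the lowest mean brightness inside the located reflection mask (one reading of "darker reflection area"; the function name is hypothetical):

```python
import numpy as np

def pick_reference_frame(test_frames, reflection_mask):
    """Index of the test frame whose light reflection region is darkest,
    i.e. the frame least affected by the ambient light."""
    means = [frame.astype(np.float64)[reflection_mask].mean()
             for frame in test_frames]
    return int(np.argmin(means))

mask = np.zeros((2, 2), dtype=bool)
mask[0, 0] = True                            # the located reflection pixel
bright = np.array([[200, 10], [10, 10]], dtype=np.uint8)
dim = np.array([[40, 10], [10, 10]], dtype=np.uint8)
print(pick_reference_frame([bright, dim], mask))   # prints 1
```

The "smallest reflection area" criterion mentioned in the text could be implemented analogously by counting above-threshold pixels inside the mask per frame instead of averaging brightness.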
The image acquisition method provided by the embodiment of the present disclosure is described below with reference to specific application examples. Fig. 6 is a flowchart illustrating an image acquisition method according to another exemplary embodiment. As shown in fig. 6, the acquisition method includes:
step 610, acquiring the ambient light flicker frequency by a flicker frequency acquisition component.
Step 602, in response to a shooting trigger operation, acquiring an image to be processed and at least two frames of test images through the camera assembly at a frame rate twice the flicker frequency.
Step 603, determining the light reflection area in the image to be processed according to the brightness of the at least two frames of test images: the area whose brightness alternates between bright and dark across the test frames is the light reflection area.
Step 604, performing reflection elimination processing on a reflection area in the image to be processed to generate a target image. And optionally adopting an inpainting algorithm or a deep learning algorithm to eliminate the reflection area in the image to be processed.
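The detection in steps 602-603 can be sketched as follows. Two frames captured at twice the flicker frequency fall in opposite brightness phases of the ambient light; the threshold value here is an illustrative assumption:

```python
import numpy as np

def detect_reflection(frame_a, frame_b, threshold=30):
    """Locate the reflective area as the set of pixels whose brightness
    changes sharply between two test frames captured half a flicker
    period apart. Returns the boolean mask and the pixel coordinates."""
    diff = np.abs(frame_a.astype(np.int32) - frame_b.astype(np.int32))
    mask = diff >= threshold
    coords = np.argwhere(mask)  # (row, col) coordinates of the area
    return mask, coords
```

Scene content that does not reflect the flickering light source stays roughly constant between the two frames and is excluded by the threshold.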
According to the image acquisition method provided by the embodiments of the present disclosure, test images are acquired under different ambient light brightness conditions according to the ambient light flicker frequency, and the reflective area is located based on the brightness variation across the at least two frames of test images. The method can quickly and accurately locate the reflective area; the reflective area is then subjected to reflection elimination processing in the image to be processed, yielding an image free of reflections. In this way, the user can capture clear images of outdoor scenes even when shooting through transparent glass, which improves image quality and optimizes the user experience.
The embodiments of the present disclosure also provide an image acquisition apparatus. The image acquisition apparatus is applied to the electronic device provided above, which includes a flicker frequency acquisition component and a camera assembly. Fig. 7 is a block diagram of an image acquisition apparatus according to an exemplary embodiment. As shown in fig. 7, the image acquisition apparatus includes: a first acquisition module 701, a second acquisition module 702, a determination module 703, and a reflection elimination module 704.
The first acquisition module 701 is configured to acquire a flicker frequency of ambient light of a shooting environment;
the second obtaining module 702 is configured to obtain an image to be processed and at least two frames of test images acquired at a set frame rate, where the set frame rate is greater than the flicker frequency;
the determining module 703 is configured to determine a light reflection area in the image to be processed according to the brightness of at least two frames of test images; and
the reflection elimination module 704 is configured to perform reflection elimination processing on the reflective area of the image to be processed, so as to generate a target image.
In one embodiment, the set frame rate is greater than or equal to twice the flicker frequency.
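As a numeric illustration of this constraint (assuming mains-powered lighting, which flickers at twice the mains frequency):

```python
def minimum_frame_rate(flicker_hz, factor=2):
    """Set frame rate >= factor x flicker frequency, so that consecutive
    test frames sample different brightness phases of the ambient light."""
    return factor * flicker_hz

# 50 Hz mains -> lamps flicker at 100 Hz -> capture at >= 200 fps
# 60 Hz mains -> lamps flicker at 120 Hz -> capture at >= 240 fps
```

At exactly twice the flicker frequency, successive frames land half a flicker period apart, maximizing the brightness difference used to detect the reflective area.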
In one embodiment, fig. 8 is a block diagram of the determination module 703 according to an exemplary embodiment. As shown in fig. 8, the determination module 703 includes a first determination unit 7031 and a second determination unit 7032.
The first determining unit 7031 is configured to determine pixel coordinates of a light reflection region according to a luminance difference of at least two frames of test images.
The second determining unit 7032 is configured to determine a light reflection area in the image to be processed according to the pixel coordinates.
Fig. 9 is a block diagram of the first determination unit according to an exemplary embodiment. As shown in fig. 9, the first determination unit 7031 includes: a first determination subunit 7031a, a second determination subunit 7031b, and a third determination subunit 7031c.
The first determining subunit 7031a is configured to determine at least one group of inter-frame brightness change regions according to the brightness difference between any two frames of the test images.
The second determining subunit 7031b is configured to determine a group of inter-frame luminance change regions having the largest luminance difference as the light reflection region.
The third determination subunit 7031c is configured to acquire pixel coordinates of the light reflection region.
In one embodiment, the first determining subunit 7031a is specifically configured to: take the region in which the absolute value of the brightness difference between any two frames of test images is greater than or equal to the set threshold as a group of inter-frame brightness change regions.
In one embodiment, FIG. 10 is a block diagram of the second acquisition module according to an exemplary embodiment. As shown in fig. 10, the second acquisition module 702 includes: a first acquisition unit 7021a and a synthesis unit 7022a.
The first acquisition unit 7021a is configured to acquire at least two frames of test images acquired at the set frame rate.
The synthesizing unit 7022a is configured to synthesize the image to be processed from the acquired at least two frames of test images.
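The synthesis method is not pinned down by the disclosure; one plausible sketch is to average the test frames, which evens out the flicker-induced brightness alternation between frames (the averaging choice is an assumption):

```python
import numpy as np

def synthesize_image(test_frames):
    """Synthesize the image to be processed from at least two test
    frames by averaging. Frames captured in opposite brightness phases
    cancel each other's flicker-induced brightness alternation."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in test_frames])
    return stack.mean(axis=0)
```

This variant avoids a separate long-exposure capture, since the image to be processed is derived directly from the test frames.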
In one embodiment, FIG. 11 is a block diagram of a second acquisition module shown in accordance with another exemplary embodiment. As shown in fig. 11, the second acquisition module 702 includes a second acquisition unit 7021b and a third acquisition unit 7022b.
The second acquisition unit 7021b is configured to acquire the at least two frames of test images acquired at the set frame rate and a first exposure time.
The third acquisition unit 7022b is configured to acquire the image to be processed acquired at the set frame rate and a second exposure time, where the second exposure time is longer than the first exposure time.
According to the image acquisition apparatus provided by the embodiments of the present disclosure, test images are acquired under different ambient light brightness conditions according to the ambient light flicker frequency, and the reflective area is located based on the brightness variation across the at least two frames of test images. The apparatus can quickly and accurately locate the reflective area; the reflective area is then subjected to reflection elimination processing in the image to be processed, yielding an image free of reflections. In this way, the user can capture clear images of outdoor scenes even when shooting through transparent glass, which improves image quality and optimizes the user experience.
The embodiments of the present disclosure also provide an electronic device which, in addition to the flicker frequency acquisition component and the camera assembly, includes a memory and a processor. The memory stores executable instructions, and the processor is configured to execute the executable instructions in the memory to implement the steps of the image acquisition method provided above.
Fig. 12 is a block diagram of a terminal device provided according to an exemplary embodiment. As shown in fig. 12, the terminal device 1200 may include one or more of the following components: a processing component 1202, a memory 1204, a power component 1206, a multimedia component 1208, an audio component 1210, an input/output (I/O) interface 1212, a sensor component 1214, a communications component 1216, and an image acquisition component.
The processing component 1202 generally performs overall operations of the terminal device 1200, such as operations associated with display, telephone call, data communication, camera operations, and recording operations. The processing component 1202 may include one or more processors 1220 to execute instructions. Further, the processing component 1202 may include one or more modules that facilitate interactions between the processing component 1202 and other components. For example, the processing component 1202 may include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operations at the terminal device 1200. Examples of such data include instructions for any application or method operating on terminal device 1200, contact data, phonebook data, messages, pictures, videos, and the like. The memory 1204 may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply assembly 1206 provides power to the various components of the terminal device 1200. Power supply component 1206 can include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal device 1200.
The multimedia component 1208 includes a screen that provides an output interface between the terminal device 1200 and the target object. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a target object. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation.
The audio component 1210 is configured to output and/or input audio signals. For example, the audio component 1210 includes a Microphone (MIC) configured to receive external audio signals when the terminal device 1200 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 1204 or transmitted via the communications component 1216. In some embodiments, audio assembly 1210 further includes a speaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc.
The sensor assembly 1214 includes one or more sensors for providing status assessment of various aspects of the terminal device 1200. For example, the sensor assembly 1214 may detect an on/off state of the terminal device 1200, a relative positioning of the assemblies, such as a display and keypad of the terminal device 1200, a change in position of the terminal device 1200 or one of the assemblies, the presence or absence of a target object in contact with the terminal device 1200, an orientation or acceleration/deceleration of the terminal device 1200, and a change in temperature of the terminal device 1200. As another example, the sensor assembly 1214 also includes a light sensor disposed below the OLED display screen.
The communications component 1216 is configured to facilitate wired or wireless communication between the terminal device 1200 and other devices. The terminal device 1200 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1216 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communications component 1216 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the terminal device 1200 can be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements.
In an exemplary embodiment, the disclosed embodiments also provide a readable storage medium storing executable instructions. The executable instructions may be executed by a processor of the terminal device to implement the steps of the image acquisition method provided above. The readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (14)

1. An image acquisition method, wherein the method is applied to an electronic device, the method comprising:
acquiring the flicker frequency of ambient light in a shooting environment;
acquiring an image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is greater than the flicker frequency;
determining a light reflection area in the image to be processed according to brightness change between the at least two frames of test images;
and performing reflection elimination processing on the light reflection area of the image to be processed to generate a target image.
2. The method of claim 1, wherein determining the light reflection area in the image to be processed according to the brightness change between the at least two frames of test images comprises:
determining pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images;
and determining the position of the light reflection area in the image to be processed according to the pixel coordinates.
3. The method of claim 2, wherein determining the pixel coordinates of the light reflection area according to the brightness difference of the at least two frames of test images comprises:
determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
determining the group of inter-frame brightness change regions having the largest brightness difference as the light reflection area;
and determining the pixel coordinates of the light reflection area.
4. The method according to claim 3, wherein determining at least one group of inter-frame brightness change regions according to the brightness difference of any two frames of the test images comprises:
determining the region in any two frames of the test images in which the absolute value of the brightness difference is greater than or equal to a set threshold as a group of inter-frame brightness change regions.
5. The method of claim 1, wherein the acquiring the image to be processed and the at least two frames of test images acquired at the set frame rate comprises:
acquiring the at least two frames of test images acquired at the set frame rate, and synthesizing the image to be processed according to the acquired at least two frames of test images;
or,
acquiring the at least two frames of test images acquired at the set frame rate and the first exposure time; and acquiring the image to be processed acquired at the set frame rate and a second exposure time, wherein the second exposure time is longer than the first exposure time.
6. The method of claim 1, wherein the set frame rate is greater than or equal to twice the flicker frequency.
7. An image acquisition apparatus, wherein the image acquisition apparatus is applied to an electronic device, the electronic device further comprising: a flicker frequency acquisition component and a camera assembly; the image acquisition apparatus comprising:
the first acquisition module is used for acquiring the flicker frequency of the ambient light;
the second acquisition module is used for acquiring the image to be processed and at least two frames of test images acquired at a set frame rate, wherein the set frame rate is larger than the flicker frequency;
the determining module is used for determining a light reflecting area in the image to be processed according to the brightness change between the at least two frames of test images; and
and the reflection elimination module is configured to perform reflection elimination processing on the light reflection area of the image to be processed to generate a target image.
8. The apparatus of claim 7, wherein the means for determining comprises:
the first determining unit is used for determining pixel coordinates of the light reflecting area according to the brightness difference of the at least two frames of test images; and
and the second determining unit is used for determining the light reflecting area in the image to be processed according to the pixel coordinates.
9. The apparatus of claim 8, wherein the first determining unit comprises:
the first determining subunit is used for determining at least one group of inter-frame brightness change areas according to the brightness difference of any two frames of the test images;
a second determining subunit configured to determine, as the light reflection area, a group of the inter-frame luminance change areas where the luminance difference is largest; and
and a third determining subunit, configured to determine pixel coordinates of the light reflection area.
10. The apparatus of claim 9, wherein the first determination subunit is specifically configured to: take the region in any two frames of the test images in which the absolute value of the brightness difference is greater than or equal to the set threshold as a group of inter-frame brightness change regions.
11. The apparatus of claim 7, wherein the second acquisition module comprises:
the first acquisition unit is used for acquiring at least two frames of test images acquired at the set frame rate; and
a synthesizing unit, configured to synthesize the image to be processed according to the acquired at least two frames of test images;
or,
the second acquisition module includes:
the second acquisition unit is configured to acquire the at least two frames of test images acquired at the set frame rate and a first exposure time; and
and a third acquisition unit configured to acquire the image to be processed acquired at the set frame rate and a second exposure time, where the second exposure time is longer than the first exposure time.
12. The apparatus of claim 7, wherein the set frame rate is greater than or equal to twice the flicker frequency.
13. An electronic device, the device comprising:
a flicker frequency acquisition component;
a camera assembly;
a memory storing executable instructions; and
a processor configured to execute executable instructions in the memory to implement the method of any one of claims 1 to 6.
14. A readable storage medium having stored thereon executable instructions, which when executed by a processor implement the method of any of claims 1 to 6.
CN202010366166.0A 2020-04-30 2020-04-30 Image acquisition method and device and electronic equipment Active CN113660428B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010366166.0A CN113660428B (en) 2020-04-30 2020-04-30 Image acquisition method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010366166.0A CN113660428B (en) 2020-04-30 2020-04-30 Image acquisition method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN113660428A CN113660428A (en) 2021-11-16
CN113660428B true CN113660428B (en) 2023-07-04

Family

ID=78488637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010366166.0A Active CN113660428B (en) 2020-04-30 2020-04-30 Image acquisition method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113660428B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419076B (en) * 2022-03-28 2022-08-05 杭州微影软件有限公司 Data processing method and device

Citations (2)

Publication number Priority date Publication date Assignee Title
CN108573215A (en) * 2018-03-16 2018-09-25 海信集团有限公司 Reflective road method for detecting area, device and terminal
CN110263805A (en) * 2019-05-07 2019-09-20 阿里巴巴集团控股有限公司 Certificate verification, auth method, device and equipment

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP3273619B2 (en) * 1991-08-22 2002-04-08 オリンパス光学工業株式会社 Electronic imaging device
CN109151255B (en) * 2018-08-31 2021-05-25 惠州华阳通用电子有限公司 Camera flicker elimination method and device based on photoelectric detection
CN110855889B (en) * 2019-11-21 2021-04-13 重庆金山医疗技术研究院有限公司 Image processing method, image processing apparatus, image processing device, and storage medium

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN108573215A (en) * 2018-03-16 2018-09-25 海信集团有限公司 Reflective road method for detecting area, device and terminal
CN110263805A (en) * 2019-05-07 2019-09-20 阿里巴巴集团控股有限公司 Certificate verification, auth method, device and equipment

Also Published As

Publication number Publication date
CN113660428A (en) 2021-11-16

Similar Documents

Publication Publication Date Title
CN108986199B (en) Virtual model processing method and device, electronic equipment and storage medium
US9674395B2 (en) Methods and apparatuses for generating photograph
CN109360261B (en) Image processing method, image processing device, electronic equipment and storage medium
CN104092948B (en) Process method and the device of image
CN108810422B (en) Light supplementing method and device for shooting environment and computer readable storage medium
CN108122195B (en) Picture processing method and device
CN108040204B (en) Image shooting method and device based on multiple cameras and storage medium
CN106484356A (en) Adjust the method and device of brightness of image
CN105407295B (en) Mobile terminal filming apparatus and method
CN113660428B (en) Image acquisition method and device and electronic equipment
CN111835941B (en) Image generation method and device, electronic equipment and computer readable storage medium
CN111340690A (en) Image processing method, image processing device, electronic equipment and storage medium
CN108156381B (en) Photographing method and device
CN112565625A (en) Video processing method, apparatus and medium
US11617023B2 (en) Method for brightness enhancement of preview image, apparatus, and medium
CN114187874B (en) Brightness adjusting method, device and storage medium
CN106775246B (en) Screen brightness adjusting method and device
CN111586280A (en) Shooting method, shooting device, terminal and readable storage medium
CN111277754B (en) Mobile terminal shooting method and device
CN113452813B (en) Image acquisition device, terminal device, method, processing device, and medium
CN114338956A (en) Image processing method, image processing apparatus, and storage medium
CN112312034A (en) Exposure method and device of image acquisition module, terminal equipment and storage medium
CN112052844A (en) Light supplementing method and device, electronic equipment and storage medium
CN113315903A (en) Image acquisition method and device, electronic equipment and storage medium
CN111314550A (en) Display control method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant