WO2022267762A1 - Shooting method under multiple artificial light sources and related device - Google Patents

Shooting method under multiple artificial light sources and related device

Info

Publication number
WO2022267762A1
WO2022267762A1 (application PCT/CN2022/093644, CN2022093644W)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
artificial light
frequency
light source
blinking period
Prior art date
Application number
PCT/CN2022/093644
Other languages
English (en)
French (fr)
Inventor
冯寒予
Original Assignee
荣耀终端有限公司
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司
Priority to US18/266,222 (published as US20230388658A1)
Priority to EP22827260.5A (published as EP4243403A1)
Publication of WO2022267762A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/745 - Detection of flicker frequency or suppression of flicker wherein the flicker is caused by illumination, e.g. due to fluorescent tube illumination or pulsed LED illumination
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 - Special procedures for taking photographs; Apparatus therefor
    • G03B 15/02 - Illuminating scene
    • G03B 15/03 - Combinations of cameras with lighting apparatus; Flash units
    • G03B 15/035 - Combinations of cameras with incandescent lamps
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/72 - Combination of two or more compensation controls
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 2215/00 - Special procedures for taking photographs; Apparatus therefor
    • G03B 2215/05 - Combinations of cameras with electronic flash units
    • G03B 2215/0514 - Separate unit
    • G03B 2215/0557 - Multiple units, e.g. slave-unit

Definitions

  • the present application relates to the field of terminal technologies, and in particular to a shooting method and related devices under multiple artificial light sources.
  • the light and dark stripes in the picture can be eliminated by adjusting the exposure time to an integer multiple of the flickering period of the artificial light source.
  • this method of adjusting the exposure time is only applicable to shooting scenes with a single artificial light source flickering at one frequency, and is not suitable for scenes with multiple artificial light sources. Therefore, if there are multiple artificial light sources in the shooting environment, the rolling stripes in the captured picture cannot be eliminated, which greatly affects the user experience.
  • the present application provides a shooting method and a related device under multiple artificial light sources.
  • the electronic device can determine the flickering frequencies of multiple artificial light sources, and select two of the flickering frequencies, which are denoted as F1 and F2 respectively.
  • the electronic device determines the exposure time and frame interval used when the electronic device acquires images next according to the two flicker frequencies.
  • the method can reduce the banding phenomenon caused by other artificial light sources while eliminating the banding phenomenon caused by a certain artificial light source, so that rolling light and dark stripes no longer appear on the picture on the electronic device.
  • the present application provides a shooting method under multiple artificial light sources.
  • the method may include: the electronic device may determine a first flickering period and a second flickering period; the first flickering period may be the flickering period of the first artificial light source in the shooting environment, and the second flickering period may be the flickering period of the second artificial light source in the shooting environment; the electronic device may determine a first exposure time and a first frame interval; if k1 times the first flickering period does not exceed a first range, the first exposure time is k1 times the first flickering period, and the first frame interval is k2 times the second flickering period; the electronic device may shoot with the first exposure time and the first frame interval; the first frame interval is the interval between two adjacent frames of images collected by the camera during shooting.
  • the electronic device can determine the flickering periods of two artificial light sources in the shooting environment, namely the first flickering period and the second flickering period, and capture images with the first exposure time and the first frame interval. If k1 times the first flickering period does not exceed the first range, the first exposure time of the electronic device is k1 times the first flickering period, and the first frame interval is k2 times the second flickering period.
  • This solution can eliminate the banding phenomenon caused by the first artificial light source, and weaken the banding phenomenon caused by the second artificial light source.
  • the rolling light and dark stripes caused by the first artificial light source will no longer appear on the display screen of the electronic device, and the light and dark stripes caused by the second artificial light source will no longer scroll, which improves the user experience.
  • the illumination intensity of the first artificial light source may be greater than the illumination intensity of the second artificial light source.
  • the illumination intensity of the first artificial light source may be greater than the illumination intensity of the second artificial light source. That is to say, if conditions permit (k1 times the first flickering period does not exceed the first range), the banding phenomenon caused by the artificial light source with the higher light intensity is eliminated preferentially. Because the banding caused by the artificial light source with the higher light intensity is more obvious, eliminating it first minimizes the overall banding and improves the user experience.
  • the first exposure time is k2 times the second flickering period, and the first frame interval is k1 times the first flickering period.
  • the exposure time can be adjusted to an integer multiple of the second flickering period, so that the light and dark stripes caused by at least one artificial light source can be eliminated and the impact of banding on the displayed picture can be minimized.
  • the first frame interval is k1 times the first flickering period.
  • the electronic device can also adjust only the frame interval to an integer multiple of the flickering period of one artificial light source in the shooting environment, and either continue to use the exposure time adjusted by the automatic exposure system or directly use the exposure time of the most recently acquired frame to acquire the next frame of image. It can be understood that the electronic device can also adjust the exposure time in other ways, which is not limited in this application.
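The decision rule described in the bullets above can be summarized in a short illustrative sketch. This is only a reading of the text, not the claimed implementation: the "first range" is modeled here as a simple minimum/maximum window on the exposure time, k1 and k2 are obtained by rounding the auto-exposure (AE) value, and all names and numbers are hypothetical.

```python
# Minimal sketch of the selection rule described above (not the patented
# implementation). T1/T2 are the flicker periods of the stronger and weaker
# artificial light sources; the "first range" is modeled as an
# [min_exposure, max_exposure] window. All names and values are illustrative.

def choose_exposure_and_frame_interval(T1, T2, ae_exposure, exposure_range):
    lo, hi = exposure_range

    def nearest_multiple_in_range(period):
        # Round the AE-suggested exposure to the nearest integer multiple
        # of the flicker period and check it against the allowed range.
        k = max(1, round(ae_exposure / period))
        candidate = k * period
        return candidate if lo <= candidate <= hi else None

    exposure = nearest_multiple_in_range(T1)
    if exposure is not None:
        # Banding from light source 1 is eliminated by the exposure time;
        # banding from light source 2 is "frozen" by the frame interval.
        return exposure, T2   # frame interval = k2 * T2 (k2 = 1 here)

    exposure = nearest_multiple_in_range(T2)
    if exposure is not None:
        return exposure, T1   # frame interval = k1 * T1

    # Neither period fits: keep the AE exposure, only fix the frame interval.
    return ae_exposure, T1


# Example: 100 Hz and 120 Hz flicker, AE suggests a 12 ms exposure.
print(choose_exposure_and_frame_interval(1/100, 1/120, 0.012, (0.001, 0.03)))
```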
  • the electronic device determines the first flickering period and the second flickering period, specifically including: the electronic device can acquire a first time sequence; the first time sequence includes ambient brightness information and time information; the electronic device can convert the first time sequence into a first frequency spectrum; according to the first frequency spectrum, the electronic device can determine the frequency of a first sine wave as a first flickering frequency and the frequency of a second sine wave as a second flickering frequency; the electronic device can determine the first flickering period according to the first flickering frequency, and determine the second flickering period according to the second flickering frequency; wherein, the difference between the amplitude of the first sine wave and a first average value is greater than a first preset threshold, and the difference between the amplitude of the second sine wave and a second average value is greater than a second preset threshold; the first average value is the average value of the amplitudes of the sine waves other than the first sine wave within a frequency search range in the first frequency spectrum; the second average value is the average value of the amplitudes of the sine waves other than the first sine wave and the second sine wave within the frequency search range in the first frequency spectrum.
  • the electronic device can obtain a time series by collecting ambient brightness information and time information, and then convert the time series into a frequency spectrum according to the Fourier principle. From this spectrum, the electronic device can determine the flickering period of the artificial light source. It can be understood that the electronic device determines the flickering period of the sine wave with the largest amplitude and the sine wave with the second largest amplitude in the frequency spectrum. That is to say, the flicker period determined by the electronic device is the flicker period of the artificial light source with the highest light intensity and the second largest light intensity in the shooting environment. After the electronic device adjusts the exposure time and the frame interval according to the flicker cycle, the banding phenomenon caused by the artificial light source can be minimized.
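As a rough illustration of the spectrum analysis just described, the following sketch converts a brightness-versus-time trace into a spectrum with a fast Fourier transform and picks the two strongest components within a search range. The sample rate, search range, and threshold are assumptions made for the example, not values from this application; NumPy is assumed to be available.

```python
# Illustrative sketch of the spectrum analysis described above, assuming the
# ambient-brightness samples are taken at a fixed, known sample rate (here
# 2 kHz). Thresholds and the search range are made-up values.
import numpy as np

def detect_two_flicker_frequencies(samples, sample_rate=2000.0,
                                   search_range=(80.0, 250.0), margin=3.0):
    spectrum = np.abs(np.fft.rfft(samples - np.mean(samples)))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    # Restrict to the frequency search range.
    mask = (freqs >= search_range[0]) & (freqs <= search_range[1])
    band_freqs, band_amps = freqs[mask], spectrum[mask]

    order = np.argsort(band_amps)[::-1]          # strongest component first
    first, second = order[0], order[1]

    # Compare each candidate amplitude against the mean of the remaining bins,
    # mirroring the "amplitude minus average > threshold" test in the text.
    rest1 = np.delete(band_amps, [first])
    rest2 = np.delete(band_amps, [first, second])
    f1 = band_freqs[first] if band_amps[first] - rest1.mean() > margin else None
    f2 = band_freqs[second] if band_amps[second] - rest2.mean() > margin else None
    return f1, f2

# Example: synthetic brightness trace containing 100 Hz and 120 Hz flicker.
t = np.arange(0, 0.5, 1 / 2000.0)
trace = 2.0 + np.abs(np.sin(2 * np.pi * 50 * t)) + 0.5 * np.abs(np.sin(2 * np.pi * 60 * t))
print(detect_two_flicker_frequencies(trace))
```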
  • the first artificial light source is the artificial light source with the highest light intensity among the two or more artificial light sources, and the second artificial light source is the artificial light source with the second highest light intensity among the two or more artificial light sources.
  • the electronic device can adjust the exposure time and frame interval according to the flickering periods of the artificial light sources with the highest and second largest light intensity in the shooting environment, so as to minimize the banding phenomenon caused by artificial light sources.
  • the method may further include: the electronic device determines a third flickering period; the third flickering period is the flickering period of the third artificial light source; if k1 times the first flickering period exceeds the first range, k2 times the second flickering period does not exceed the first range, and k3 times the third flickering period does not exceed the first range, the first exposure time is k3 times the third flickering period, and the first frame interval is k1 times the first flickering period.
  • the electronic device judges whether the exposure time can be adjusted according to the flickering period of the artificial light source with the greater illumination intensity, which makes it possible to eliminate, as far as possible, the light and dark stripes caused by at least one artificial light source in the shooting environment.
  • the present application provides an electronic device.
  • the electronic device includes a camera, one or more memories, and one or more processors; the one or more processors are coupled with the camera and the one or more memories; the one or more memories are used to store computer program code, and the computer program code includes computer instructions.
  • the processor can be used to determine the first flickering period and the second flickering period; the first flickering period is the flickering period of the first artificial light source in the shooting environment, and the second flickering period is the flickering period of the second artificial light source in the shooting environment.
  • the processor can also be used to determine the first exposure time and the first frame interval; if k1 times the first flickering period does not exceed the first range, the first exposure time is k1 times the first flickering period, and the first frame interval is k2 times the second flickering period; both k1 and k2 are positive integers; the camera can be used to take pictures with the first exposure time and the first frame interval; the first frame interval is the interval time between collecting two adjacent frames of images during shooting.
  • the illumination intensity of the first artificial light source is greater than the illumination intensity of the second artificial light source.
  • the first exposure time is k2 times the second flickering period, and the first frame interval is k1 times the first flickering period.
  • the first frame interval is k1 times the first flickering period.
  • when the processor is used to determine the first flickering period and the second flickering period, it is specifically used to: obtain a first time sequence, the first time sequence including ambient brightness information and time information; convert the first time sequence into a first frequency spectrum; determine, according to the first frequency spectrum, the frequency of a first sine wave as a first flickering frequency and the frequency of a second sine wave as a second flickering frequency; determine the first flickering period according to the first flickering frequency, and determine the second flickering period according to the second flickering frequency; wherein, the difference between the amplitude of the first sine wave and a first average value is greater than a first preset threshold, and the difference between the amplitude of the second sine wave and a second average value is greater than a second preset threshold; the first average value is the average value of the amplitudes of the sine waves other than the first sine wave within a frequency search range in the first frequency spectrum; the second average value is the average value of the amplitudes of the sine waves other than the first sine wave and the second sine wave within the frequency search range in the first frequency spectrum.
  • the first artificial light source is the artificial light source with the highest light intensity among the two or more artificial light sources, and the second artificial light source is the artificial light source with the second highest light intensity among the two or more artificial light sources.
  • the processor is further configured to: determine a third flickering period; the third flickering period is the flickering period of the third artificial light source. If k1 times the first flickering period exceeds the first range, k2 times the second flickering period does not exceed the first range, and k3 times the third flickering period does not exceed the first range, the first exposure time is k3 times the third flickering period, and the first frame interval is k1 times the first flickering period.
  • the present application provides a computer-readable storage medium, including instructions, which, when the instructions are run on an electronic device, cause the electronic device to execute any possible implementation manner in the first aspect above.
  • an embodiment of the present application provides a chip; the chip is applied to an electronic device and includes one or more processors, and the processor is used to invoke computer instructions so that the electronic device executes any possible implementation manner of the first aspect above.
  • an embodiment of the present application provides a computer program product including instructions, which, when run on an electronic device, causes the electronic device to execute any possible implementation manner of the first aspect above.
  • the electronic device provided in the second aspect above, the computer-readable storage medium provided in the third aspect, the chip provided in the fourth aspect, and the computer program product provided in the fifth aspect are all used to execute the method provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
  • FIG. 1 is a schematic diagram of the banding phenomenon provided by the embodiment of the present application.
  • FIG. 2A is a waveform diagram of an alternating current provided in the embodiment of the present application.
  • FIG. 2B is another waveform diagram of alternating current provided by the embodiment of the present application.
  • FIG. 3A is a waveform diagram of an optical signal provided by an embodiment of the present application.
  • FIG. 3B is a waveform diagram of another optical signal provided by the embodiment of the present application.
  • FIG. 4 is an exposure principle diagram of a sensor provided in an embodiment of the present application.
  • FIG. 5 is a waveform diagram of another optical signal provided by the embodiment of the present application.
  • FIG. 6 is a schematic diagram of an artificial multi-light source scene provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a hardware structure of an electronic device 100 provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a software structure of an electronic device 100 provided in an embodiment of the present application.
  • FIGS. 9A-9D are schematic diagrams of a set of user interfaces provided by the embodiment of the present application.
  • FIG. 10 is a spectrum diagram provided by the embodiment of the present application.
  • FIG. 11 is a flow chart of a shooting method under multiple artificial light sources provided by an embodiment of the present application.
  • This application relates to the field of photography. In order to facilitate the understanding of the method provided in this application, some terms in the field of photography are introduced below.
  • Exposure reflects the amount of light energy captured by the photosensitive element when capturing an image, which affects how light or dark the final captured image is. The greater the exposure used to capture a frame of image, the brighter that frame appears.
  • Exposure is determined by three factors: exposure time, light-passing area, and ambient light intensity. Among them, the shutter speed determines the exposure time. The size of the aperture determines the light passing area.
  • In film photography, ISO was used to reflect the sensitivity of the film to light. It can be considered that ISO affects the ambient light intensity obtained by the photosensitive element.
  • the photosensitive elements of electronic devices such as digital cameras and mobile phones remain unchanged after packaging. For these electronic devices, ISO no longer represents the sensitivity of the photosensitive element to light, but the electronic signal amplification gain value. When the ISO increases, the amplification gain of the electronic signal increases, the original signal is amplified, and the image becomes brighter.
  • the aperture size of electronic devices such as digital cameras and mobile phones is fixed, and the electronic devices can adjust the brightness of images by adjusting exposure time and ISO.
  • the exposure intensity is used to represent the image brightness in the subsequent embodiments of the present application. The larger the exposure intensity, the brighter the image, and the smaller the exposure intensity, the darker the image.
  • the electronic device can adjust the exposure intensity by adjusting the exposure time and ISO.
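The trade-off between exposure time and ISO can be illustrated with a toy calculation (a simplification, not a formula from the application): if image brightness is treated as proportional to the product of exposure time and ISO gain, forcing the exposure time to a flicker-friendly value can be compensated by scaling the ISO so that the product stays constant.

```python
# Rough illustration only: model "exposure intensity" (image brightness) as
# proportional to exposure_time * iso_gain. Keeping the product constant keeps
# the picture brightness roughly constant while the exposure time changes.
def iso_for_target_brightness(target_product, exposure_time):
    return target_product / exposure_time

# AE originally chose 12 ms at ISO 400; forcing the exposure to 10 ms
# (one 100 Hz flicker period) needs roughly ISO 480 to keep the brightness.
print(iso_for_target_brightness(0.012 * 400, 0.010))  # -> 480.0
```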
  • Figure 1 is an image acquired by an electronic device when there is an artificial light source in the shooting environment. It can be seen that the brightness in Figure 1 is uneven and consists of bright and dark stripes.
  • FIG. 2A exemplarily shows a waveform diagram of an alternating current with a power supply frequency of 60 hertz (Hz).
  • FIG. 2B exemplarily shows a waveform diagram of an AC power supply with a frequency of 50 Hz.
  • When the artificial light source is connected to alternating current, it converts the electrical signal into an optical signal. Since the electrical signal is a periodic signal with a certain frequency, the converted optical signal is also a periodic signal with a certain frequency. In other words, the light emitted by the artificial light source changes periodically over time at a certain frequency; this is the stroboscopic (flicker) phenomenon.
  • the stroboscopic phenomenon is caused by the design of the power supply and the characteristics of the artificial light source itself. Therefore, no artificial light source is truly free of stroboscopic flicker.
  • the operating current must fluctuate with the fluctuation of the input voltage, which directly leads to stroboscopic flicker due to the fluctuation of light output.
  • the light energy emitted by an artificial light source has no polarity (it cannot be negative), so the waveform of the optical signal is no longer a sinusoidal waveform but an envelope with a frequency of 100 Hz or 120 Hz.
  • the waveform of the optical signal converted by the artificial light source is a periodically changing envelope with a frequency of 120 Hz.
  • the waveform of the optical signal converted by the artificial light source is a periodically changing envelope with a frequency of 100 Hz.
  • the flicker frequency of the artificial light source is usually twice the frequency of the alternating current to which the artificial light source is connected.
  • the embodiment of the present application does not limit the flicker frequency of the artificial light source.
  • if the frequency of the alternating current connected to the artificial light source is a frequency other than 50 Hz or 60 Hz, the flicker frequency of the artificial light source may be a value other than 100 Hz or 120 Hz.
  • the sensor starts to expose the first row of pixels of a frame of image, and starts to expose the second row of pixels after one row period.
  • after the pixels in the (N-1)-th row start to be exposed, the pixels in the N-th row start to be exposed one row period later. That is to say, the time difference between the moment one row of pixels starts to be exposed and the moment the next row starts to be exposed is one row period. Therefore, each row of pixels starts its exposure at a different time.
  • exposure time refers to the time required by an electronic device to expose a row of pixels of a frame of image.
  • the exposure time of pixels in different rows of the same frame image is the same.
  • the row period may be determined by the capabilities of the sensor.
  • the line period of different sensors may be different, so the line period of different electronic devices may also be different.
  • the embodiment of the present application does not limit the value of the above row period.
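The rolling-shutter timing described above can be expressed as a tiny helper: row N begins its exposure N row periods after row 0, and every row is exposed for the same exposure time. The row period and exposure values below are arbitrary examples, not properties of any particular sensor.

```python
# Simple sketch of the rolling-shutter timing described above (row period and
# exposure time values are arbitrary examples, not sensor specifications).
def row_exposure_window(row_index, row_period, exposure_time, first_row_start=0.0):
    start = first_row_start + row_index * row_period  # each row starts one row period later
    return start, start + exposure_time

# With a 10 us row period and a 10 ms exposure, rows 0 and 1 overlap heavily
# but start at different points of the light source's flicker cycle.
print(row_exposure_window(0, 10e-6, 0.010))
print(row_exposure_window(1, 10e-6, 0.010))
```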
  • the area enclosed by the envelope and the X-axis within a period of time is the light energy emitted by the artificial light source during this period, that is, the light energy received by the sensor during this period.
  • the light energy received by the sensor during the period T1 - T2 affects the brightness of the pixels in the Nth row of the finally displayed image.
  • the more light energy the sensor receives during T1-T2, the brighter the pixels in the N-th row of the finally displayed image will be; the less light energy the sensor receives during T1-T2, the darker the pixels in the N-th row of the finally displayed image will be.
  • the following takes an artificial light source connected to a 50Hz AC current in the shooting environment as an example for illustration.
  • the waveform of the optical signal converted by the artificial light source is a periodically changing envelope with a frequency of 100 Hz.
  • Denote the flickering period of the artificial light source as T; then T = 1/100 s.
  • t = T + t1. That is, t is not an integer multiple of the flickering period T. When the sensor exposes the i-th row of pixels of the image, the light energy received by that row of pixels is S + S1.
  • S may represent the light energy received by the row of pixels within time T.
  • S1 may represent the light energy received by the row of pixels within time t1.
  • t4 + t2 + t3 = t.
  • S2 + S3 + S4 is the light energy received by the pixels in the (i+1)-th row.
  • S2 + S3 = S. Therefore, the light energy received by the pixels in the (i+1)-th row is slightly more than that received by the pixels in the i-th row. That is to say, the brightness of the i-th row of pixels and the (i+1)-th row of pixels of the finally displayed image is different.
  • the pixels in row i+1 are brighter than the pixels in row i.
  • the light signal converted by the artificial light source is a periodic signal
  • the light energy received by each row of pixels within its exposure time is the same, and the brightness of different rows of the finally displayed image is the same.
  • bright and dark stripes will not appear in the image displayed by the electronic device. If the exposure time is not an integer multiple of the flickering period of the artificial light source, bright and dark stripes will appear in the image displayed by the electronic device. Since the positions of the light and dark stripes may change from image to image, scrolling light and dark stripes may appear in the preview picture of the electronic device or in the picture during video recording; this is the banding phenomenon.
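The banding condition can also be checked numerically. The sketch below models the optical signal as an ideal rectified 50 Hz sine (flicker period 1/100 s) and integrates it over the exposure windows of two rows that start at different times; the row period and exposure times are illustrative only.

```python
# Numeric illustration of the banding condition described above, assuming the
# optical signal is an ideal rectified sine with a 1/100 s flicker period.
import math

def light_energy(start, exposure_time, flicker_freq=100.0, steps=20000):
    # Numerically integrate |sin(pi * flicker_freq * t)|, a rectified sine
    # whose flicker period is 1/flicker_freq, over the exposure window.
    dt = exposure_time / steps
    return sum(abs(math.sin(math.pi * flicker_freq * (start + i * dt))) * dt
               for i in range(steps))

row_period = 30e-6
for exposure in (0.010, 0.012):        # 10 ms = one flicker period, 12 ms is not
    e0 = light_energy(0 * row_period, exposure)
    e5 = light_energy(500 * row_period, exposure)
    print(exposure, round(e0, 6), round(e5, 6))
# With a 10 ms exposure every row collects the same energy (no stripes);
# with 12 ms the energy differs from row to row, producing light/dark stripes.
```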
  • the phases of the light signals corresponding to the exposure time of the first row of different images are the same, light and dark stripes will still appear in the preview images of electronic devices such as digital cameras and mobile phones.
  • the first line of each frame of image is exposed at the same time, when the sensor in these electronic devices exposes the image, the relationship between the received light energy between different lines of each frame of image is the same. Therefore, the light energy received by the Nth row of each frame of image may be different from each other. That is, the brightness of the Nth row of each frame of image may be different from each other, but their brightness relative to other rows in the image is constant.
  • the banding phenomenon can be avoided by adjusting the exposure time to an integer multiple of the flickering periods of these artificial light sources.
  • the exposure time may be adjusted to an integer multiple of the flicker period of one of the artificial light sources.
  • the stroboscopic phenomenon of other artificial light sources will still cause the banding phenomenon. That is, rolling light and dark stripes will still appear on the shooting screen.
  • the chandelier is connected to 50 Hz AC, and its flickering period is 1/100 s; the desk lamp is powered through a Universal Serial Bus (USB) interface and, when fully charged, its flickering frequency is 60 Hz, so its flickering period is 1/60 s.
  • the flickering periods of the two artificial light sources are different. Adjusting the exposure time can only solve the banding phenomenon caused by the stroboscopic flicker of one artificial light source; the banding phenomenon caused by the stroboscopic flicker of the other artificial light source still cannot be solved, so rolling light and dark stripes will still appear in the captured picture.
  • the present application provides a shooting method and a related device under multiple artificial light sources.
  • the electronic device can determine the flickering frequencies of multiple artificial light sources, and select two of the flickering frequencies, which are denoted as F1 and F2 respectively.
  • the flicker periods corresponding to these two flicker frequencies are denoted as T1 and T2 respectively. If the exposure time can be set to an integer multiple of T1 while the corresponding ISO remains within the preset range, the exposure time is adjusted to an integer multiple of T1 and the frame interval is adjusted according to F2.
  • If the corresponding ISO is not within the preset range when the exposure time is an integer multiple of T1, the electronic device judges whether it can adjust the exposure time to an integer multiple of T2 with the ISO within the preset range; if so, the exposure time is adjusted to an integer multiple of T2 and the frame interval is adjusted according to F1; otherwise, the electronic device does not adjust the exposure time and only adjusts the frame interval according to F1.
  • the above method can reduce the banding phenomenon caused by other artificial light sources while eliminating the banding phenomenon caused by a certain artificial light source, so that rolling light and dark stripes no longer appear on the screen on the electronic device.
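Putting the pieces together, the overall flow just summarized might look like the following hedged sketch, which combines the two flicker periods with the ISO constraint using the simplified brightness model from the earlier note (exposure time times ISO). The ISO limits, rounding rule, and return format are assumptions made for the example, not values from this application.

```python
# Hedged sketch of the overall flow just described, combining the flicker
# periods T1/T2 with an ISO constraint. Brightness is modeled as
# exposure_time * iso; the ISO limits are illustrative.
def plan_next_frame(T1, T2, ae_exposure, ae_iso, iso_range=(100, 6400)):
    target = ae_exposure * ae_iso            # brightness the AE system wanted

    def try_period(period):
        k = max(1, round(ae_exposure / period))
        exposure = k * period
        iso = target / exposure              # ISO needed to keep the brightness
        return (exposure, iso) if iso_range[0] <= iso <= iso_range[1] else None

    plan = try_period(T1)
    if plan is not None:
        return {"exposure": plan[0], "iso": plan[1], "frame_interval": T2}
    plan = try_period(T2)
    if plan is not None:
        return {"exposure": plan[0], "iso": plan[1], "frame_interval": T1}
    # Exposure stays with AE; only the frame interval is locked to T1.
    return {"exposure": ae_exposure, "iso": ae_iso, "frame_interval": T1}

# Example: 100 Hz chandelier, 60 Hz desk lamp, AE chose 8 ms at ISO 800.
print(plan_next_frame(1/100, 1/60, 0.008, 800))
```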
  • FIG. 7 is a schematic diagram of a hardware structure of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (Universal Serial Bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, and an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone jack 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber Identification Module (Subscriber Identification Module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a memory, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural network processor (Neural-network Processing Unit, NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • the processor 110 may also include an AE system.
  • the AE system can be specifically set in the ISP.
  • the AE system can be used to realize automatic adjustment of exposure parameters.
  • the AE system may also be integrated in other processor chips. This embodiment of the present application does not limit it.
  • the electronic device 100 may execute the method for adjusting the exposure intensity through the processor 110 .
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices 100, such as AR devices.
  • the charging management module 140 is configured to receive a charging input from a charger. While the charging management module 140 is charging the battery 142 , it can also supply power to the electronic device 100 through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (Low Noise Amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the wireless communication module 160 can provide wireless local area network (Wireless Local Area Networks, WLAN) (such as wireless fidelity (Wireless Fidelity, Wi-Fi) network), bluetooth (Bluetooth, BT), global navigation satellite System (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field communication technology (Near Field Communication, NFC), infrared technology (Infrared, IR) and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can adopt a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), an active-matrix organic light-emitting diode (Active-Matrix Organic Light Emitting Diode, AMOLED), a flexible light-emitting diode (Flex Light-Emitting Diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, quantum dot light-emitting diodes (Quantum Dot Light Emitting Diodes, QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 may realize the acquisition function through an ISP, a camera 193 , a video codec, a GPU, a display screen 194 , and an application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image or video visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element can be a charge coupled device (Charge Coupled Device, CCD) or a complementary metal oxide semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP for conversion into a digital image or video signal.
  • ISP outputs digital image or video signal to DSP for processing.
  • DSP converts digital images or video signals into standard RGB, YUV and other formats of images or video signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the electronic device 100 can use N cameras 193 to acquire images with multiple exposure coefficients, and then, in video post-processing, the electronic device 100 can synthesize an HDR image from the images with multiple exposure coefficients using high-dynamic-range (HDR) technology.
  • Digital signal processors are used to process digital signals. In addition to digital image or video signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: Moving Picture Experts Group (Moving Picture Experts Group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (Neural-Network, NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as: image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. Such as saving music, video and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image and video playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • The microphone 170C is used to convert sound signals into electrical signals.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the earphone interface 170D is used for connecting wired earphones.
  • the sensor module 180 may include one or more sensors, which may be of the same type or of different types. It can be understood that the sensor module 180 shown in FIG. 7 is only an exemplary division manner, and there may be other division manners, which are not limited in the present application.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions.
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device 100, and can be applied to applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the fingerprint sensor 180H is used to acquire fingerprints.
  • the temperature sensor 180J is used to detect temperature.
  • Touch sensor 180K, also known as a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the user needs to acquire a series of images for time-lapse photography or continuous shooting by using the electronic device 100 .
  • the electronic device 100 may adopt the AE mode. That is, the electronic device 100 automatically adjusts the AE value.
  • the touch AE mode may be triggered.
  • the electronic device 100 can meter the area of the display screen touched by the user with a high weight. Therefore, when calculating the average brightness of the picture, the weight of the area touched by the user is significantly higher than that of other areas, and the calculated average brightness of the picture is closer to the average brightness of the area touched by the user.
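As an illustration of this weighted metering idea (the weights, region handling, and data layout are made up for the example and are not the device's actual metering algorithm), the average brightness can be computed with a higher weight on the touched region:

```python
# Illustrative sketch of touch-weighted metering; all values are made up.
import numpy as np

def touch_weighted_brightness(luma, touch_region, touch_weight=4.0):
    # luma: 2-D array of pixel brightness; touch_region: boolean mask of the
    # area around the user's touch point.
    weights = np.ones_like(luma, dtype=float)
    weights[touch_region] = touch_weight          # the touched area counts more
    return float((luma * weights).sum() / weights.sum())

frame = np.full((4, 6), 100.0)
frame[1:3, 2:4] = 200.0                           # brighter patch under the finger
mask = np.zeros_like(frame, dtype=bool)
mask[1:3, 2:4] = True
print(touch_weighted_brightness(frame, mask))     # pulled toward 200 vs the plain mean
```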
  • the bone conduction sensor 180M can acquire vibration signals.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • FIG. 8 is a schematic diagram of a software structure of an electronic device 100 provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the runtime (Runtime) and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs (also called applications) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (Application Programming Interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog interface. For example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, and flashing the indicator light, etc.
  • Runtime includes the core library and virtual machine. Runtime is responsible for the scheduling and management of the system.
  • the core library includes two parts: one part is the function function that the programming language (for example, java language) needs to call, and the other part is the core library of the system.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes programming files (for example, java files) of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: Surface Manager (Surface Manager), Media Library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem, and provides fusion of two-dimensional (2-Dimensional, 2D) and three-dimensional (3-Dimensional, 3D) layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, and a virtual card driver.
  • the workflow of the software and hardware of the electronic device 100 will be exemplarily described below in conjunction with capturing and photographing scenes.
  • when the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation). Raw input events are stored in the kernel layer.
  • the application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the case where the touch operation is a tap and the tapped control is the camera application icon as an example:
  • the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • Camera 193 captures still images or video.
  • FIGS. 9A-9D exemplarily show some user interfaces when the electronic device 100 is shooting.
  • FIG. 9A exemplarily shows a user interface 910 on the electronic device 100 .
  • the user interface 910 shows a recording interface, which may include a preview interface 911 , a shutter control 912 , an end recording control 913 , a recording time control 914 and a recording pause control 915 .
  • the preview area 911 can be used to display preview images.
  • the preview image is an image captured by the electronic device 100 through the camera in real time.
  • the electronic device can refresh the display content in the preview area 911 in real time, so that the user can preview the image currently captured by the camera.
  • the shutter control 912 can be used to trigger taking pictures, that is, the user can trigger the shutter control 912 to take pictures during video recording.
  • the end recording control 913 can be used to end recording video.
  • the recording time control 914 can indicate the time length of the currently recorded video.
  • Pause recording control 915 can be used to temporarily stop recording video.
  • the recording time control 914 displays 00:00:01, which means that the current video has been recorded for 1 second (s).
  • image 1 is displayed in the preview area 911 of the user interface 910 , and there are obvious light and dark stripes on the image 1 .
  • FIG. 9B exemplarily shows a user interface 920 on the electronic device 100 .
  • the controls in user interface 920 are substantially the same as the controls in user interface 910 .
  • 00:00:02 is displayed in the recording time control 914 in FIG. 9B , which means that the current video has been recorded for 2 seconds (s).
  • image 2 is displayed in the preview area 911 of the user interface 920 , and there are obvious light and dark stripes on the image 2 .
  • the positions of the light and dark stripes in the two frames of image 1 and image 2 have changed. That is to say, in the process of recording the video from the first second to the second, scrolling light and dark stripes are displayed on the screen in the preview area 911 .
  • the electronic device 100 may determine flicker frequencies of multiple artificial light sources, and select two artificial light sources with the largest amplitudes.
  • the flickering frequencies corresponding to the two artificial light sources are denoted as F1 and F2 respectively.
  • the flicker periods corresponding to these two flicker frequencies are denoted as T1 and T2 respectively.
  • If the exposure time can be set to an integer multiple of T1 while the corresponding ISO is within the preset range, the exposure time is adjusted so that it is an integer multiple of T1, and the frame interval is adjusted according to F2. If, when the exposure time is an integer multiple of T1, the corresponding ISO is not within the preset range, the electronic device judges whether it can adjust the exposure time to an integer multiple of T2 with the ISO within the preset range; if so, the exposure time is adjusted to an integer multiple of T2 and the frame interval is adjusted according to F1; otherwise, the electronic device 100 does not adjust the exposure time and only adjusts the frame interval according to F1.
  • FIG. 9C and FIG. 9D exemplarily show the interface displayed after the electronic device 100 performs the above operations.
  • FIG. 9C exemplarily shows a user interface 930 on the electronic device 100 .
  • the controls in user interface 930 are substantially the same as the controls in user interface 910 .
  • 00:00:05 is displayed in the recording time control 914 in FIG. 9C , which means that the current video has been recorded for 5 seconds (s).
  • the image 3 is displayed in the preview area 911 of the user interface 930 , and there are obvious light and dark stripes on the image 3 .
  • FIG. 9D exemplarily shows a user interface 940 on the electronic device 100 .
  • the controls in user interface 940 are substantially the same as the controls in user interface 910. 00:00:06 is displayed in the recording time control 914 in FIG. 9D , which means that the current video has been recorded for 6 seconds (s). At this moment, the image 4 is displayed in the preview area 911 of the user interface 940 , and there are obvious light and dark stripes on the image 4 .
  • It can be understood that the positions of the light and dark stripes in image 3 and image 4 have not changed; that is to say, from the 5th second to the 6th second of recording, the light and dark stripes displayed in the preview area 911 are fixed.
  • Therefore, after the electronic device 100 adjusts the exposure time and/or the frame interval according to the above method, the light and dark stripes are fixed in the picture and the banding phenomenon in the preview area 911 of the electronic device 100 is weakened. That is to say, rolling light and dark stripes no longer appear on the screen of the electronic device 100, which improves the user's shooting experience.
  • the following specifically introduces the method of adjusting the frame interval according to the flickering frequency of the artificial light source when there is an artificial light source in the shooting environment.
  • For photosensitive sensors such as charge-coupled devices (Charge Coupled Device, CCD) and complementary metal-oxide semiconductor (Complementary Metal-Oxide Semiconductor, CMOS) sensors, tiny photosensitive elements, i.e. pixels, are implanted on the photosensitive surface, and the sensor converts the light image on each pixel of the photosensitive surface into an electrical signal.
  • the anti-flicker sensor (Flicker Sensor) will also convert the light image on each pixel on the photosensitive surface into an electrical signal.
  • the Flicker Sensor has only one pixel and no light filtering, so the electrical signal output by the Flicker Sensor is the electrical signal converted from the light image on the only pixel. It can be understood that the electrical signal output by the Flicker Sensor can be used to represent the current ambient brightness, that is, the electrical signal output by the Flicker Sensor can be considered as the current ambient brightness.
  • after the user triggers the shooting function of the electronic device 100, the Flicker Sensor in the electronic device 100 starts to sample the ambient light and outputs the time of each sampling together with the corresponding electrical signal.
  • the output of the Flicker Sensor is a time series (first time series) of ambient brightness.
  • the time series of ambient brightness is a one-dimensional time series.
  • the first time sequence includes ambient brightness information and time information.
  • the ambient brightness information mentioned here is the ambient brightness mentioned above, that is, the ambient brightness of each sampling.
  • the time information mentioned here is the time of each sampling mentioned above.
  • sampling frequency of the Flicker Sensor can be set according to actual needs, which is not limited in this application.
  • the sampling frequency of the Flicker Sensor is 2kHz, that is, the Flicker Sensor samples every 0.5 milliseconds (ms).
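  • As an illustration only (not part of the original application), the first time series can be pictured as a list of brightness samples taken every 0.5 ms. The Python sketch below simulates such a Flicker Sensor output for a single 100 Hz light source; the function name, the rectified-sinusoid waveform, and the noise level are all assumptions made for this example.

```python
import numpy as np

def simulate_flicker_sensor(duration_s=0.5, fs_hz=2000.0, light_freq_hz=100.0):
    """Simulate Flicker Sensor output: brightness sampled at fs_hz (one value every 0.5 ms).

    The brightness of a mains-powered lamp is modelled as a non-negative envelope
    at light_freq_hz (a rectified sinusoid) plus a small amount of noise; this
    waveform is an assumption for illustration only.
    """
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)               # sampling times
    brightness = np.abs(np.sin(np.pi * light_freq_hz * t))    # envelope repeats light_freq_hz times per second
    brightness += 0.02 * np.random.randn(t.size)              # ambient interference
    return t, brightness                                       # the "first time series"

if __name__ == "__main__":
    t, b = simulate_flicker_sensor()
    print(len(t), "samples spaced", (t[1] - t[0]) * 1000, "ms apart")
```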
  • any continuous measurement sequence or signal can be expressed as an infinite superposition of sine wave signals of different frequencies.
  • after the electronic device performs a Fourier transform or fast Fourier transform (FFT) on the time series of ambient brightness to convert it from the time domain to the frequency domain, the obtained spectrum (Spectrum1, the first spectrum) is composed of multiple sine waves; the abscissa of the spectrum is frequency and the ordinate is amplitude, which represents brightness.
  • the frequency corresponding to the sine wave with the largest amplitude is the frequency of the artificial light source.
  • Other sine waves with smaller amplitudes are interference signals in the shooting environment.
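  • A minimal sketch of this spectral analysis (assumed helper names; not the device's actual implementation): take the discrete Fourier transform of the brightness time series and read off the frequency of the strongest sine wave.

```python
import numpy as np

def brightness_spectrum(brightness, fs_hz=2000.0):
    """Convert a 1-D ambient-brightness time series into (frequencies, amplitudes)."""
    samples = np.asarray(brightness, dtype=float)
    samples = samples - samples.mean()                  # remove the DC component
    amplitudes = np.abs(np.fft.rfft(samples))           # one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(samples.size, d=1.0 / fs_hz)
    return freqs, amplitudes

# Usage (with the simulated series from the previous sketch): the strongest
# component should sit at the light source's flicker frequency.
# freqs, amp = brightness_spectrum(b)
# print("dominant frequency:", freqs[np.argmax(amp)], "Hz")
```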
  • the method for determining the flicker frequency of multiple artificial light sources is exemplarily given below.
  • the electronic device 100 determines the frequency of the first sine wave as the first flicker frequency according to Spectrum1 (first spectrum), and determines the frequency of the second sine wave as the second flicker frequency.
  • the electronic device determines the first blinking period according to the first blinking frequency, and determines the second blinking period according to the second blinking frequency.
  • the difference between the amplitude of the first sine wave and the first average value is greater than a first preset threshold.
  • the difference between the amplitude of the second sine wave and the second average value is greater than a second preset threshold.
  • the first average value is the average value of the amplitudes of the sine waves other than the first sine wave within the frequency search range in the first spectrum.
  • the second average value is the average value of the amplitudes of the sine waves other than the first sine wave and the second sine wave within the frequency search range in the first spectrum; the frequency search range is used to determine the frequency range in which the first sine wave and the second sine wave are searched for.
  • the first sine wave corresponds to the first artificial light source
  • the second sine wave corresponds to the second artificial light source. It can be understood that the amplitude of the first sine wave is greater than the amplitude of the second sine wave.
  • the amplitude in Spectrum1 represents the brightness, which can be understood as representing the light intensity of the artificial light source. That is to say, the illumination intensity of the first artificial light source is greater than the illumination intensity of the second artificial light source.
  • the electronic device 100 sets a frequency search range, for example 20Hz-2000Hz. Within the frequency search range in Spectrum1, it selects the largest sine wave peak, records this peak as A1, records the frequency corresponding to this sine wave as F1, and records the sine wave corresponding to this peak as sine wave1.
  • the electronic device 100 may also calculate the average value of other sine wave peaks in the spectrum Spectrum1 except sine wave1 , and record the average value as A avr1 .
  • the electronic device 100 calculates the difference B1 = A1 - Aavr1. If B1 is greater than the first preset threshold, the electronic device 100 determines that there is an artificial light source in the current shooting environment and that its flicker frequency is F1; otherwise, the electronic device 100 determines that there is no artificial light source in the current shooting environment.
  • the electronic device 100 can also determine the flicker frequency of other artificial light sources through the above method.
  • Similarly, within the frequency search range in Spectrum1, the electronic device 100 selects the second largest sine wave peak, records it as A2, records its frequency as F2, and records the corresponding sine wave as sine wave2. The electronic device 100 may also calculate the average value of the other sine wave peaks in Spectrum1 except sine wave1 and sine wave2, and record that average value as Aavr2.
  • the electronic device 100 calculates the difference B2 = A2 - Aavr2. If B2 is greater than the second preset threshold, the electronic device 100 determines that another artificial light source exists in the current shooting environment and that its flicker frequency is F2; otherwise, the electronic device 100 determines that only the artificial light source with flicker frequency F1 exists in the current shooting environment.
  • the first preset threshold and the second preset threshold may be set according to actual needs, which is not limited in the present application. It should be noted that, in some embodiments of the present application, the flickering frequencies of multiple artificial light sources may be determined according to the same preset threshold. That is to say, the first preset threshold and the second preset threshold may be the same or different.
  • both the first preset threshold and the second preset threshold are 100.
  • FIG. 10 is a spectrum diagram provided by an embodiment of the present application. The electronic device sets a frequency search range of 20 Hz-200 Hz and, within that range in the spectrum of FIG. 10, finds the largest sine wave peak A1 = 200 at F1 = 50 Hz and the second largest peak A2 = 180 at F2 = 75 Hz. The differences B1 = A1 - Aavr1 and B2 = A2 - Aavr2 both exceed the preset thresholds, so the electronic device 100 determines that two artificial light sources exist in the current shooting environment, with flicker frequencies of 50 Hz and 75 Hz respectively.
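  • The two-peak decision rule above can be sketched as follows. This is only an illustration: it assumes the spectrum is already available as frequency/amplitude arrays, treats every bin in the search range as a candidate peak, and reuses the 20-200 Hz range and a threshold of 100 from the example; all function and variable names are invented here.

```python
import numpy as np

def detect_flicker_frequencies(freqs, amplitudes, f_lo=20.0, f_hi=200.0,
                               thr1=100.0, thr2=100.0):
    """Return the flicker frequencies of up to two artificial light sources."""
    mask = (freqs >= f_lo) & (freqs <= f_hi)             # frequency search range
    f, a = freqs[mask], amplitudes[mask]

    order = np.argsort(a)[::-1]                           # amplitudes, largest first
    i1, i2 = order[0], order[1]
    A1, F1 = a[i1], f[i1]                                 # largest sine wave peak
    A2, F2 = a[i2], f[i2]                                 # second largest peak

    A_avr1 = np.delete(a, i1).mean()                      # mean of the others (excluding A1)
    A_avr2 = np.delete(a, [i1, i2]).mean()                # mean excluding A1 and A2

    sources = []
    if A1 - A_avr1 > thr1:                                # B1 exceeds the first threshold
        sources.append(float(F1))
        if A2 - A_avr2 > thr2:                            # B2 exceeds the second threshold
            sources.append(float(F2))
    return sources                                        # [] means no artificial light source found
```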
  • the electronic device 100 can set the frame interval to an integer multiple of the blinking period, so that the initial phase at which the sensor starts to expose the first row of pixels is the same for different images, and therefore the magnitude relationship of the received light energy between different rows of pixels is the same for every frame of image.
  • the frame interval is the interval time for the sensor to expose images.
  • the time of starting to expose the first row of pixels of a frame of image is recorded as ET1
  • the time of starting to expose the first row of pixels of the next frame of the image is recorded as ET2
  • the frame interval is: ET2-ET1.
  • If the rows of one frame of image are sorted by the amount of light energy they receive, from largest to smallest or from smallest to largest, this sorting is recorded as X.
  • X can be regarded as a two-dimensional array containing the sequence numbers of the ranking and the corresponding row numbers, for example {(1, M1), (2, M2), (3, M3), ..., (N, MN)}, where 1-N are the sequence numbers arranged from largest to smallest (or from smallest to largest) and M1-MN are the pixel row numbers corresponding to those sequence numbers. Although the light energy received by the Nth row of pixels may differ between images, X is the same for different images.
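  • The ordering X can be made concrete with a short sketch (names assumed for illustration): rank the rows of one frame by their received light energy, here approximated by each row's mean brightness.

```python
def row_order(row_brightness):
    """row_brightness[m] = mean brightness of pixel row m in one frame.

    Returns X = [(1, M1), (2, M2), ...]: rank numbers paired with row numbers,
    sorted from the brightest row to the darkest one.
    """
    ranked_rows = sorted(range(len(row_brightness)),
                         key=lambda m: row_brightness[m], reverse=True)
    return [(rank + 1, row) for rank, row in enumerate(ranked_rows)]

# Once the frame interval is an integer multiple of the blinking period, X computed
# for consecutive frames should be identical even if the absolute brightness differs.
```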
  • the light energy received by pixels in different rows of the image is reflected in the brightness of those rows. That is to say, the brightness of the Nth row may differ from frame to frame, but its brightness relative to the other rows in the same image is constant. In other words, after the electronic device 100 sets the frame interval to an integer multiple of the blinking period, the light and dark stripes no longer scroll, thereby reducing the influence of the banding phenomenon on the brightness of the image.
  • the frame interval may be set according to Table 1.
  • Table 1 (flicker frequency → frame interval): 60 Hz → 33 ms; 80 Hz → 37 ms; 90 Hz → 33 ms; 100 Hz → 30 ms; 120 Hz → 33 ms; 150 Hz → 33 ms.
  • Determining the frame interval according to Table 1 above can not only weaken the banding phenomenon (that is, when there is an artificial light source in the shooting environment, the light and dark stripes appearing on the display screen of the electronic device 100 will not scroll), but also meet the playback requirement of 30 FPS, that is, 30 frames transmitted per second.
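  • One way to read Table 1 (an assumption about how it was derived, not a statement from the original text) is: pick the integer multiple of the blinking period that is closest to the roughly 33 ms spacing needed for 30 FPS playback. The sketch below reproduces the table's values under that assumption.

```python
def frame_interval_ms(flicker_hz, target_ms=1000.0 / 30.0):
    """Frame interval as an integer multiple of the blinking period, near ~33 ms."""
    period_ms = 1000.0 / flicker_hz             # blinking period of the light source
    k = max(1, round(target_ms / period_ms))    # nearest positive integer multiple
    return k * period_ms

# for f in (60, 80, 90, 100, 120, 150):
#     print(f, "Hz ->", int(frame_interval_ms(f)), "ms")
# prints 33, 37, 33, 30, 33, 33 ms, matching Table 1 after truncation.
```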
  • the electronic device 100 may adjust the exposure time and the frame interval according to the frequencies corresponding to the two sine waves with larger amplitudes. It can be seen from the above that the ordinate in the frequency spectrum represents the amplitude, and the amplitude represents the brightness. It is understandable that the brighter artificial light source in the shooting environment has a greater impact on the banding phenomenon. Therefore, when there are more than two artificial light sources in the shooting environment, the electronic device 100 may select the artificial light source with the largest amplitude and the second largest value, and adjust the exposure time and frame interval according to its frequency.
  • a shooting method under multiple artificial light sources provided by the embodiment of the present application will be described in detail below with reference to FIG. 11 .
  • the shooting method under multiple artificial light sources shown in FIG. 11 can be applied to a scene in which artificial light sources with two flickering frequencies exist in the shooting environment (as shown in FIG. 6).
  • S1101 The electronic device 100 acquires a time series of ambient brightness.
  • the electronic device 100 may obtain the time series of ambient brightness through a Flicker Sensor or other sensors similar thereto.
  • the method for acquiring the time series of ambient brightness has been described in the above embodiments, and will not be repeated here.
  • S1102 The electronic device 100 converts the acquired time series of ambient brightness into a frequency spectrum.
  • the electronic device 100 may convert the time series of ambient brightness from the time domain to the frequency domain through Fourier transform or fast Fourier transform, so as to obtain a frequency spectrum.
  • S1103 The electronic device 100 determines flicker frequencies F1 and F2, where F1 is the flicker frequency corresponding to the artificial light source with the largest amplitude and F2 is the flicker frequency corresponding to the artificial light source with the second largest amplitude.
  • the electronic device 100 can determine the flickering frequency of multiple artificial light sources in the current shooting environment, and the specific method can refer to the above-mentioned embodiments, which will not be repeated here.
  • the frequencies corresponding to the largest and second largest sine wave peaks are flicker frequencies of artificial light sources in the current shooting environment. That is to say, the electronic device 100 may determine that there are artificial light sources with two flickering frequencies in the current shooting environment. These two flicker frequencies are denoted as F 1 and F 2 , respectively.
  • F 1 is the frequency corresponding to the largest sine wave peak value
  • F 2 is the frequency corresponding to the second largest sine wave peak value.
  • For example, F1 = 100 Hz and F2 = 120 Hz.
  • S1104 The electronic device 100 determines exposure times ET1 and ET2. Among them, ET1 is the exposure time required to eliminate the banding phenomenon caused by the artificial light source with a flicker frequency of F1, and ET2 is the exposure time required to eliminate the banding phenomenon caused by the artificial light source with a flicker frequency of F2.
  • the electronic device 100 may determine the blinking periods of the multiple artificial light sources according to the blinking frequencies of the multiple artificial light sources determined in step S1102. If the exposure time satisfies an integral multiple of the flickering period of the artificial light source, the artificial light source corresponding to the flickering period will not cause banding.
  • the flicker periods corresponding to the artificial light sources with flicker frequencies F 1 and F 2 are denoted as t1 and t2 respectively.
  • where t1 = 1/F1 and t2 = 1/F2. If the exposure time is ET1 = M*t1 (M is a positive integer), the artificial light source with flicker frequency F1 will not cause banding; if the exposure time is ET2 = M*t2, the artificial light source with flicker frequency F2 will not cause banding.
  • For example, if F1 = 100 Hz and F2 = 120 Hz, then t1 = 1/100 s = 10 ms and t2 = 1/120 s ≈ 8.3 ms. With an exposure time of ET1 = 10M ms, the artificial light source with a flicker frequency of 100 Hz will not cause banding; with an exposure time of ET2 ≈ 8.3M ms, the artificial light source with a flicker frequency of 120 Hz will not cause banding.
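  • A small sketch of how ET1 and ET2 could be chosen (the "snap to the nearest multiple" rule is an assumption; the original text only requires an integer multiple M of the blinking period):

```python
def snap_exposure_s(flicker_hz, current_exposure_s):
    """Nearest positive integer multiple of the blinking period 1 / flicker_hz."""
    period_s = 1.0 / flicker_hz
    m = max(1, round(current_exposure_s / period_s))
    return m * period_s

# With a current exposure time of 30 ms:
#   snap_exposure_s(100, 0.030) -> 0.030    (ET1 = 3 x 10 ms)
#   snap_exposure_s(120, 0.030) -> ~0.0333  (ET2 = 4 x 8.3 ms)
```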
  • S1105 The electronic device 100 determines ISO1 and ISO2. Here, ISO1 is the ISO corresponding to the exposure time ET1, and ISO2 is the ISO corresponding to the exposure time ET2.
  • the electronic device 100 may determine the exposure intensity according to the exposure time and the corresponding ISO used when acquiring the latest frame of image. Since the exposure intensity remains unchanged (exposure intensity = exposure time × ISO), it follows that exposure intensity = ET1 × ISO1 = ET2 × ISO2, from which ISO1 and ISO2 are determined.
  • the exposure time and ISO taken when the electronic device obtains the latest frame of image can be read in the sensor, or, when exposure parameters such as the exposure time and ISO are stored in a specified memory address, the electronic device 100 can access the specified memory address to Get exposure time and ISO.
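  • A sketch of step S1105, under the assumption used above that exposure intensity = exposure time × ISO is held constant. The example numbers mirror the 100 Hz / 120 Hz example in this description; everything else (names, the 20 ms / ISO 500 starting point) is illustrative.

```python
def iso_for_exposure(latest_exposure_s, latest_iso, new_exposure_s):
    """Keep exposure intensity constant: ISO_new = (ET_latest * ISO_latest) / ET_new."""
    exposure_intensity = latest_exposure_s * latest_iso
    return exposure_intensity / new_exposure_s

# Latest frame: 20 ms at ISO 500 -> exposure intensity = 0.020 * 500 = 10
#   iso_for_exposure(0.020, 500, 0.0100) -> 1000.0   (ISO1 for ET1 = 10 ms)
#   iso_for_exposure(0.020, 500, 0.0083) -> ~1200    (ISO2 for ET2 ≈ 8.3 ms)
```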
  • S1106 The electronic device 100 determines whether the ISO1 is within a preset range.
  • ISO represents the amplification gain of the electronic signal.
  • when the ISO is increased, not only the useful signal but also the noise is amplified. Therefore, the higher the ISO, the more noise appears in the image acquired by the electronic device 100 and the worse the image quality. In the actual shooting process it is therefore often necessary to set a reasonable range for the ISO, so that the useful signal is amplified while the amplified noise remains within the noise-reduction capability of the electronic device 100.
  • the electronic device 100 may set a preset range for judging whether the adopted ISO is within an appropriate range.
  • the electronic device 100 sets the preset range as ISO min ⁇ ISO ⁇ ISO max .
  • ISO min is the minimum value of ISO that the electronic device 100 can adopt
  • ISO max is the maximum value of ISO that the electronic device 100 can adopt.
  • the preset range adopted by the electronic device 100 may also have other forms, which are not limited in this application.
  • the electronic device 100 judges whether ISO1 is within a set preset range. If ISO1 is within the set preset range, the electronic device 100 adjusts the exposure time to ET1 (step S1107 ). If ISO1 is not within the preset range, the electronic device 100 determines whether ISO2 is within the preset range (step S1109 ).
  • For example, ISOmin = 400 and ISOmax = 1000, which gives 400 ≤ ISO1 ≤ 1000; that is, ISO1 is within the preset range.
  • S1107 The electronic device 100 adjusts the exposure time to ET1.
  • the electronic device 100 may adjust the exposure time to ET1.
  • S1108 The electronic device 100 sets the frame interval as an integer multiple of the flickering period of the artificial light source corresponding to F2.
  • the electronic device 100 may set the frame interval as an integer multiple of the flicker period of the artificial light source corresponding to F2, that is, set the frame interval as an integer multiple of t2 .
  • the method for setting the frame interval has been described in the above embodiments, and will not be repeated here.
  • the electronic device 100 may first adjust the frame interval according to the flicker frequency F2. After adjusting the frame interval, the electronic device 100 may further adjust the exposure time to ET1. Optionally, the electronic device 100 may also adjust the exposure time and the frame interval at the same time.
  • For example, F2 = 120 Hz; according to Table 1, the frame interval can be set to 33 ms.
  • S1109 The electronic device 100 determines whether ISO2 is within a preset range.
  • the electronic device 100 determines whether the ISO2 is within a set preset range. If ISO2 is within the preset range, the electronic device 100 adjusts the exposure time to ET2 (step S1110 ). If the ISO2 is not within the preset range, the electronic device 100 adjusts the frame interval according to the flicker frequency F1 (step S1111).
  • step S1106 for the related description of judging whether ISO2 is within the preset range, reference may be made to step S1106, which will not be repeated here.
  • For example, ISOmin = 400 and ISOmax = 1000, and ISO2 > 1000; that is, ISO2 is not within the preset range.
  • S1110 The electronic device 100 adjusts the exposure time to ET2.
  • the electronic device 100 may adjust the exposure time to ET2.
  • S1111 The electronic device 100 sets the frame interval to an integer multiple of the flickering period of the artificial light source corresponding to F1.
  • the electronic device 100 may set the frame interval as an integer multiple of the flicker period of the artificial light source corresponding to F1, that is, set the frame interval as an integer multiple of t1 .
  • the method for setting the frame interval has been described in the above embodiments, and will not be repeated here.
  • the electronic device 100 may first adjust the frame interval according to the flicker frequency F1. After adjusting the frame interval, the electronic device 100 may further adjust the exposure time to ET2. Optionally, the electronic device 100 may also adjust the exposure time and the frame interval at the same time.
  • For example, F1 = 100 Hz; according to Table 1, the frame interval can be set to 30 ms.
  • the electronic device 100 may record the exposure time and frame interval adopted when taking an image after performing the above method as the first exposure time and the first frame interval.
  • in general, the electronic device 100 shoots using the adjusted exposure time and frame interval. That is to say, the interval at which the electronic device 100 subsequently captures adjacent images is the adjusted frame interval (the first frame interval).
  • the exposure time adopted by the electronic device 100 when capturing an image next is the exposure time adjusted by the above method (first exposure time). It can be understood that the number of images captured by the electronic device using the adjusted exposure time is not necessarily 1, which is not limited in the present application.
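  • Putting steps S1103-S1111 together, the decision flow of FIG. 11 can be sketched as one function. This is a reading of the flow, not the device's actual code: the ISO window, the way the integer multiple is chosen, and all names are assumptions for illustration.

```python
def adjust_exposure_and_frame_interval(f1_hz, f2_hz, latest_exposure_s, latest_iso,
                                       iso_min=400, iso_max=1000,
                                       target_interval_s=1.0 / 30.0):
    """f1_hz: flicker frequency with the largest amplitude, f2_hz: second largest.
    Returns (first_exposure_time_s, first_frame_interval_s)."""

    def snap(period_s, value_s):
        # nearest positive integer multiple of period_s
        return max(1, round(value_s / period_s)) * period_s

    t1, t2 = 1.0 / f1_hz, 1.0 / f2_hz
    intensity = latest_exposure_s * latest_iso        # exposure intensity stays constant

    et1 = snap(t1, latest_exposure_s)                 # S1104: integer multiple of t1
    et2 = snap(t2, latest_exposure_s)                 #        integer multiple of t2
    iso1, iso2 = intensity / et1, intensity / et2     # S1105

    if iso_min <= iso1 <= iso_max:                    # S1106: ISO1 in the preset range
        return et1, snap(t2, target_interval_s)       # S1107 + S1108
    if iso_min <= iso2 <= iso_max:                    # S1109: ISO2 in the preset range
        return et2, snap(t1, target_interval_s)       # S1110 + S1111
    # Neither ISO fits: keep the current exposure time and only lock the frame
    # interval to the strongest light source (S1111).
    return latest_exposure_s, snap(t1, target_interval_s)

# Example with assumed values: f1 = 100 Hz, f2 = 120 Hz, latest frame 20 ms @ ISO 500
# print(adjust_exposure_and_frame_interval(100, 120, 0.020, 500))
```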
  • the electronic device may directly determine whether ET1 and ET2 are within the first range without determining whether ISO1 and ISO2 are within the preset range. It can be understood that the first range may be set according to actual requirements, which is not limited in the present application.
  • in some embodiments, the electronic device 100 may change the order in which it determines whether ISO1 and ISO2 are within the preset range. That is to say, the electronic device 100 may first determine whether ISO2 is within the preset range, i.e., it may execute step S1109 first and then step S1106.
  • the electronic device 100 may also simultaneously determine whether ISO1 and ISO2 are within a preset range, and then perform subsequent steps according to the determination result.
  • the electronic device 100 may select the artificial light source with the largest amplitude and the artificial light source with the second largest amplitude, and execute the method shown in FIG. 11 . That is to say, the electronic device 100 may select the artificial light source with the highest illumination intensity and the artificial light source with the second highest illumination intensity, and execute the method shown in FIG. 11 . If both ISO1 and ISO2 are not within the preset range, the electronic device 100 can select the artificial light source with the third largest amplitude, and can refer to step S1104 and step S1105 to determine the corresponding ET3 and ISO3. The electronic device 100 may also determine whether ISO3 is within a preset range (step S1106 and step S1109).
  • If ISO3 is within the preset range, the electronic device 100 adjusts the exposure time to ET3. If ISO3 is not within the preset range, the electronic device 100 may select the artificial light source with the fourth largest amplitude and perform the above steps. By analogy, if there are artificial light sources with N flickering frequencies in the shooting environment, the electronic device 100 can determine the corresponding exposure time and ISO for each of them in descending order of amplitude, judge whether that ISO is within the preset range, and then adjust the exposure time and the frame interval according to the judgment result.
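  • The extension to N flicker frequencies described above can be sketched as a loop over the light sources in descending order of amplitude (again an illustration with assumed names; when no candidate ISO fits, only the frame interval is adjusted, as in the two-source case).

```python
def pick_exposure_for_n_sources(flicker_hz_sorted_by_amplitude,
                                latest_exposure_s, latest_iso,
                                iso_min=400, iso_max=1000):
    """Return the first exposure time whose derived ISO lies in the preset range,
    walking the flicker frequencies from largest amplitude to smallest, or None."""
    intensity = latest_exposure_s * latest_iso
    for f_hz in flicker_hz_sorted_by_amplitude:
        period_s = 1.0 / f_hz
        et = max(1, round(latest_exposure_s / period_s)) * period_s
        iso = intensity / et
        if iso_min <= iso <= iso_max:
            return et             # exposure time locked to this light source
    return None                   # no candidate fits; only adjust the frame interval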
  • the electronic device 100 may also directly adopt the photographing method shown in FIG. 11 . That is to say, the electronic device 100 may determine the flicker frequency of the artificial light source with the largest amplitude and the second largest amplitude, and determine the exposure time and the frame interval according to the flicker frequency. Because the artificial light source with a large amplitude has a greater impact on the banding phenomenon, if the banding phenomenon caused by the artificial light source with a large amplitude can be eliminated, the impact of the banding phenomenon on the shooting can be greatly reduced.
  • the electronic device 100 may determine a third blinking period, which is the blinking period of the third artificial light source. If k1 times the first blinking period exceeds the first range, k2 times the second blinking period does not exceed the first range, and k3 times the third blinking period does not exceed the first range, the first exposure time is k3 times the third blinking period and the first frame interval is k1 times the first blinking period.
  • when there are artificial light sources with more than two flickering frequencies in the shooting environment, the electronic device 100 may also determine the flickering frequencies of any two of the artificial light sources and determine the exposure time and the frame interval according to these two flickering frequencies (refer to the method shown in FIG. 11).
  • t1 may be the first blinking period, and at this time t2 is the second blinking period.
  • alternatively, t1 may be the second blinking period, and in that case t2 is the first blinking period.
  • the electronic device mentioned in the claims may be the electronic device 100 in the embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

本申请提供了一种多人工光源下的拍摄方法及相关装置。其中,电子设备可以确定多个人工光源的闪烁频率,并选取其中两个,分别记为F1和F2。将F1和F2对应的闪烁周期分别记为T1和T2。若T1的整数倍在第一范围内,将曝光时间调节为T1的整数倍,并根据F2调节帧间隔。若T1的整数倍不在第一范围内,但T2的整数倍在第一范围内,电子设备将曝光时间调节为T2的整数倍,并根据F1调节帧间隔,否则,电子设备仅根据F1调节帧间隔。上述方法可以消除单个人工光源造成的banding现象,并减弱其他人工光源造成的banding现象,使得电子设备上的画面不再出现滚动的明暗条纹。

Description

一种多人工光源下的拍摄方法及相关装置
本申请要求于2021年06月24日提交中国专利局、申请号为202110708031.2、申请名称为“一种多人工光源下的拍摄方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种多人工光源下的拍摄方法及相关装置。
背景技术
若拍摄环境中存在人工光源,可能会因为人工光源的频闪现象导致拍摄画面中出现滚动的明暗条纹。一般来说,可以通过将曝光时间调节为人工光源闪烁周期的整数倍来消除画面中的明暗条纹。但是,这种调节曝光时间的方法只适用于存在一种闪烁频率的人工光源的拍摄场景,并不适用于人工多光源的场景。所以,若拍摄环境中存在多个人工光源,拍摄画面中的滚动条纹现象无法解决,非常影响用户体验。
因此,如何在人工多光源的场景下减弱拍摄画面中出现的明暗条纹是目前亟待解决的问题。
发明内容
本申请提供了一种多人工光源下的拍摄方法及相关装置。电子设备可以确定多个人工光源的闪烁频率,并选取其中两个闪烁频率,分别记为F1和F2。电子设备根据这两个闪烁频率来确定电子设备接下来获取图像时采用的曝光时间和帧间隔。该方法可以在消除某个人工光源造成的banding现象的情况下,减弱其他人工光源造成的banding现象,使得电子设备上的画面不再出现滚动的明暗条纹。
第一方面,本申请提供了一种多人工光源下的拍摄方法。该方法可以包括:电子设备可以确定第一闪烁周期和第二闪烁周期;第一闪烁周期可以是拍摄环境中第一人工光源的闪烁周期;第二闪烁周期可以是拍摄环境中第二人工光源的闪烁周期;电子设备可以确定第一曝光时间和第一帧间隔;若第一闪烁周期的k1倍未超出第一范围,第一曝光时间为第一闪烁周期的k1倍,第一帧间隔为第二闪烁周期的k2倍;电子设备可以以第一曝光时间和第一帧间隔拍摄;第一帧间隔为拍摄的过程中,通过摄像头采集的相邻两帧图像的间隔时间。
在本申请提供的方案中,电子设备可以确定拍摄环境中存在的两个的人工光源的闪烁周期——第一闪烁周期和第二闪烁周期,并以第一曝光时间和第一帧间隔拍摄图像。若第一闪烁周期的k1倍未超出第一范围,电子设备第一曝光时间为第一闪烁周期的k1倍,第一帧间隔为第二闪烁周期的k2倍。该方案可以使得由于第一人工光源造成的banding现象得以消除,而由于第二人工光源造成的banding现象得以减弱。也就是说,按照上述方式调整曝光时间和帧间隔后,电子设备显示屏上不再显示因第一人工光源造成的滚动的明暗条纹,也可以使得第二人工光源造成的滚动的明暗条纹不再滚动,提升了用户体验。
结合第一方面,在第一方面的一种可能的实现方式中,第一人工光源的光照强度可以大于第二人工光源的光照强度。
在本申请提供的方案中,第一人工光源的光照强度可以大于第二人工光源的光照强度。也就是说,在条件允许的情况下(第一闪烁周期的k1倍未超出第一范围),可以优先消除因光照强度较大的人工光源造成的banding现象。因为光照强度较大的人工光源造成的banding 现象更明显,所以优先消除因光照强度较大的人工光源造成的banding现象,可以使得banding现象最大程度减弱,提升了用户体验。
结合第一方面,在第一方面的一种可能的实现方式中,若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍未超出第一范围,第一曝光时间为第二闪烁周期的k2倍,第一帧间隔为所述第一闪烁周期的k1倍。
在本申请提供的方案中,若无法将曝光时间调整为第一闪烁周期的整数倍,可以将曝光时间调整为第二闪烁周期的整数倍,这样至少可以消除一个人工光源造成的明暗条纹,可以尽可能减弱banding现象对画面显示的影响。
结合第一方面,在第一方面的一种可能的实现方式中,若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍也超出第一范围,第一帧间隔为第一闪烁周期的k1倍。
在本申请提供的方案中,若无法调整曝光时间使得消除拍摄环境中的一个人工光源造成的明暗条纹,可以只调节帧间隔,这样也可以减弱banding现象。
在一些实施例中,电子设备可以只将帧间隔调节为拍摄环境中的一个人工光源的闪烁周期的整数倍,而继续采用由自动曝光系统调节的曝光时间,或者,也可以直接采用获取最新一帧图像时的曝光时间来获取下一帧图像。可理解,电子设备还可以通过其他方式调整曝光时间,本申请对此不作限制。
结合第一方面,在第一方面的一种可能的实现方式中,电子设备确定第一闪烁周期和第二闪烁周期,具体包括:电子设备可以获取第一时间序列;第一时间序列包括环境亮度信息和时间信息;电子设备可以将第一时间序列转换为第一频谱;电子设备可以根据第一频谱确定第一正弦波的频率为第一闪烁频率,确定第二正弦波的频率为第二闪烁频率;电子设备可以根据第一闪烁频率确定第一闪烁周期,根据第二闪烁频率确定第二闪烁周期;其中,第一正弦波的幅值与第一平均值的差值大于第一预设阈值;第二正弦波的幅值与第二平均值的差值大于第二预设阈值;第一平均值为在第一频谱中的频率查找范围内,除第一正弦波外的其他正弦波的幅值的平均值;第二平均值为在第一频谱中的频率查找范围内,除第一正弦波和第二正弦波外的其他正弦波的幅值的平均值;频率查找范围用于确定查找第一正弦波和第二正弦波的频率范围。
在本申请提供的方案中,电子设备可以通过采集环境亮度信息和时间信息得到时间序列,然后根据傅里叶原理,将时间序列转换为频谱。电子设备可以根据该频谱确定人工光源的闪烁周期。可理解,电子设备确定的是频谱中幅值最大的正弦波和幅值第二大的正弦波的闪烁周期。也就是说,电子设备确定的闪烁周期为拍摄环境中光照强度最大的和第二大的人工光源的闪烁周期。电子设备根据该闪烁周期调节曝光时间和帧间隔后,可以最大程度减弱人工光源造成的banding现象。
结合第一方面,在第一方面的一种可能的实现方式中,拍摄环境中存在两个以上的人工光源;第一人工光源为两个以上的人工光源中光照强度最大的人工光源;第二人工光源为两个以上的人工光源中光照强度第二大的人工光源。
在本申请提供的方案中,电子设备可以根据拍摄环境中光照强度最大的和第二大的人工光源的闪烁周期来调节曝光时间和帧间隔后,可以最大程度减弱人工光源造成的banding现象。
结合第一方面,在第一方面的一种可能的实现方式中,拍摄环境中存在第三人工光源。该方法还可以包括:电子设备确定第三闪烁周期;第三闪烁周期为第三人工光源的闪烁周期;若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍未超出第一范围,且第三闪 烁周期的k3倍未超出第一范围,第一曝光时间为第三闪烁周期的k3倍,第一帧间隔为第一闪烁周期的k1倍。
在本申请提供的方案中,若拍摄环境中存在两个以上的人工光源,电子设备判断是否可以根据光照较大的人工光源的闪烁周期来调节曝光时间,可以尽可能消除拍摄环境中的一个人工光源造成的明暗条纹。
第二方面,本申请提供一种电子设备。该电子设备包括摄像头、一个或多个存储器、一个或多个处理器,一个或多个处理器与摄像头、一个或多个存储器耦合,一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令。处理器,可以用于确定第一闪烁周期和第二闪烁周期;第一闪烁周期是拍摄环境中第一人工光源的闪烁周期;第二闪烁周期是拍摄环境中第二人工光源的闪烁周期;处理器,还可以用于确定第一曝光时间和第一帧间隔;若第一闪烁周期的k1倍未超出第一范围,第一曝光时间为第一闪烁周期的k1倍,第一帧间隔为第二闪烁周期的k2倍;k1和k2均为正整数;摄像头,可以用于以第一曝光时间和第一帧间隔拍摄;第一帧间隔为拍摄的过程中,采集相邻两帧图像的间隔时间。
结合第二方面,在第二方面的一种可能的实现方式中,第一人工光源的光照强度大于第二人工光源的光照强度。
结合第二方面,在第二方面的一种可能的实现方式中,若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍未超出第一范围,第一曝光时间为第二闪烁周期的k2倍,第一帧间隔为第一闪烁周期的k1倍。
结合第二方面,在第二方面的一种可能的实现方式中,若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍也超出第一范围,第一帧间隔为第一闪烁周期的k1倍。
结合第二方面,在第二方面的一种可能的实现方式中,处理器用于确定第一闪烁周期和第二闪烁周期时,具体用于:可以获取第一时间序列;第一时间序列包括环境亮度信息和时间信息;将第一时间序列转换为第一频谱;根据第一频谱确定第一正弦波的频率为第一闪烁频率,确定第二正弦波的频率为第二闪烁频率;根据第一闪烁频率确定第一闪烁周期,根据第二闪烁频率确定第二闪烁周期;其中,第一正弦波的幅值与第一平均值的差值大于第一预设阈值;第二正弦波的幅值与第二平均值的差值大于第二预设阈值;第一平均值为在第一频谱中的频率查找范围内,除第一正弦波外的其他正弦波的幅值的平均值;第二平均值为在第一频谱中的频率查找范围内,除第一正弦波和第二正弦波外的其他正弦波的幅值的平均值;频率查找范围用于确定查找第一正弦波和第二正弦波的频率范围。
结合第二方面,在第二方面的一种可能的实现方式中,拍摄环境中存在两个以上的人工光源;第一人工光源为两个以上的人工光源中光照强度最大的人工光源;第二人工光源为两个以上的人工光源中光照强度第二大的人工光源。
结合第二方面,在第二方面的一种可能的实现方式中,拍摄环境中存在第三人工光源;处理器,还用于:确定第三闪烁周期;第三闪烁周期为第三人工光源的闪烁周期。若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍未超出第一范围,且第三闪烁周期的k3倍未超出第一范围,第一曝光时间为第三闪烁周期的k3倍,第一帧间隔为第一闪烁周期的k1倍。
第三方面,本申请提供一种计算机可读存储介质,包括指令,当上述指令在电子设备上运行时,使得上述电子设备执行上述第一方面中任一种可能的实现方式。
第四方面,本申请实施例提供一种芯片,该芯片应用于电子设备,该芯片包括一个或多 个处理器,该处理器用于调用计算机指令以使得该电子设备执行上述第一方面中任一种可能的实现方式。
第五方面,本申请实施例提供一种包含指令的计算机程序产品,当上述计算机程序产品在设备上运行时,使得上述电子设备执行上述第一方面中任一种可能的实现方式。
可以理解地,上述第二方面提供的电子设备、第三方面提供的计算机可读存储介质、第四方面提供的芯片、第五方面提供的计算机程序产品均用于执行本申请实施例所提供的方法。因此,其所能达到的有益效果可参考对应方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的banding现象的示意图;
图2A为本申请实施例提供的一种交流电的波形图;
图2B为本申请实施例提供的又一种交流电的波形图;
图3A为本申请实施例提供的一种光信号的波形图;
图3B为本申请实施例提供的又一种光信号的波形图;
图4为本申请实施例提供的一种传感器的曝光原理图;
图5为本申请实施例提供的又一种光信号的波形图;
图6为本申请实施例提供的一种人工多光源的场景示意图;
图7为本申请实施例提供的一种电子设备100的硬件结构示意图;
图8为本申请实施例提供的一种电子设备100的软件结构示意图;
图9A-图9D为本申请实施例提供的一组用户界面示意图;
图10为本申请实施例提供的一种频谱图;
图11为本申请实施例提供的一种人工多光源下的拍摄方法的流程图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
应当理解,本申请的说明书和权利要求书及附图中的术语“第一”、“第二”等是用于区别不同对象,而不是用于描述特定顺序。此外,术语“包括”和“具有”以及它们任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其它步骤或单元。
在本申请中提及“实施例”意味着,结合实施例描述的特定特征、结构或特性可以包含在本申请的至少一个实施例中。在说明书中的各个位置出现该短语并不一定均是指相同的实施例,也不是与其它实施例互斥的独立的或备选的实施例。本领域技术人员显式地和隐式地理解的是,本申请所描述的实施例可以与其它实施例相结合。
本申请涉及拍摄领域,为了便于理解本申请提供的方法,下面对拍摄领域的一些术语进行介绍。
1、曝光量
曝光量反映了获取图像时感光元件获取到的光能量的多少,影响最终获取的图像的明暗。其中,拍摄一帧图像的曝光量越大,这一帧图像的亮度越亮。
曝光量由曝光时间、通光面积以及环境光强度这三大因素决定。其中,快门速度决定曝光时间。光圈大小决定通光面积。在胶片时代,用感光度ISO来反映胶片对光线的敏感度,可以认为ISO影响感光元件获取的环境光强度。而数码相机、手机等电子设备的感光元件在封装之后就不变了,对这些电子设备来说,ISO表示的不再是感光元件对光线的敏感度,而是电子信号放大增益值。当ISO增大时,电子信号的放大增益增大,原有信号被放大,图像会变亮。
综上所述,曝光时间、光圈大小和ISO是影响图像亮度的三大因素。
2、曝光强度
在一些实施例中,数码相机、手机等电子设备的光圈大小是固定的,电子设备可以通过调节曝光时间和ISO来调节图像的亮度。为了便于理解和计算,本申请后续实施例中用曝光强度来表征图像亮度。曝光强度越大,图像越亮,曝光强度越小,图像越暗。其中,电子设备可以通过调节曝光时间和ISO来调节曝光强度。具体的,曝光强度、曝光时间和ISO可以具有如下关系:曝光强度=曝光时间*ISO。
3、滚动条纹现象
当拍摄场景中存在人工光源时,拍摄画面可能会出现滚动条纹线条(简称banding现象)。即相机、手机等电子设备上的预览画面可能出现一条条的滚动的明暗条纹。如图1所示,图1为电子设备在拍摄环境中存在人工光源的情况下获取的图像,可见,图1的亮度不均匀,由一条条明暗条纹组成。
下面介绍banding现象的形成原因。
一方面,从人工光源的角度看:
日常生活中的用电一般是正弦波形的交流电。图2A示例性示出了电源的频率为60赫兹(Hz)的交流电的波形图。图2B示例性示出了电源的频率为50Hz的交流电的波形图。
当人工光源接交流电时,人工光源将电信号转换为光信号,由于电信号为一定频率的周期信号,所以转换所得的光信号也为一定频率的周期信号,可以理解为:人工光源发出的光随着时间呈现出一定频率、周期的变化,即出现频闪现象。
可理解,频闪现象是由电源的设计和人工光源本身的特性造成的,因此,没有真正意义上的无频闪。对于很多照明灯具来说,其工作电流必然随着输入电压的波动而波动,直接导致光输出的波动产生频闪。
然而,人工光源发出的光能量没有方向性,所以光信号的波形不再是正弦波形,而是频率为100Hz或120Hz的包络。具体地,如图3A所示,当人工光源接60Hz的交流电时,人工光源转换所得的光信号的波形是频率为120Hz的周期性变化的包络。如图3B所示,当人工光源接50Hz的交流电时,人工光源转换所得的光信号的波形是频率为100Hz的周期性变化的包络。
可以看出,人工光源的闪烁频率通常是该人工光源所接交流电频率的两倍。本申请实施例对人工光源的闪烁频率不作限定。例如,人工光源所接交流电的频率为50Hz或60Hz以外的频率,人工光源的闪烁频率可以为100Hz或120Hz以外的取值。
另一方面,从用于拍摄的电子设备的角度看:
目前,数码相机、手机等电子设备多采用卷帘快门(Rolling Shutter),采用卷帘快门的曝 光方式为逐行曝光。
具体地,如图4所示,传感器(例如,CMOS图像传感器)从一帧图像的第一行像素开始曝光,间隔一个行周期后开始曝光第二行像素。以此类推,第N-1行像素开始曝光后,间隔一个行周期第N行像素才开始曝光。也就是说,每一行像素开始曝光的时间与该行的下一行像素开始曝光的时间的时间差为一个行周期。因此,每一行像素开始曝光的时间都不相同。
在本申请中,曝光时间指的是电子设备曝光一帧图像的一行像素所需的时间。一般情况下,同一帧图像不同行像素的曝光时间是相同的。
可理解,行周期可以由传感器的能力确定。不同传感器的行周期可能不同,所以不同电子设备的行周期也可能不相同。本申请实施例对上述行周期的取值不作限定。
可理解,在人工光源转换所得的光信号的示意图(例如,图3A或图3B)中,一段时间内包络与X轴围成的面积(包络对应的函数在这一段时间的定积分)为人工光源在这段时间发出的光能量,即传感器在这一段时间内接收的光能量。
若第N行像素从T 1时开始曝光,并在T 2时结束曝光,T 1-T 2这段时间内传感器接收的光能量影响最终显示的图像的第N行像素的亮度。其中,T 1-T 2内传感器接收的光能量越多,最终显示的图像的第N行像素越亮。T 1-T 2内传感器接收的光能量越少,最终显示的图像的第N行像素越暗。
下面以拍摄环境中存在接50Hz交流电的人工光源为例进行说明。
如图5所示,当人工光源接50Hz的交流电时,人工光源转换所得的光信号的波形是频率为100Hz的周期性变化的包络。将人工光源的闪烁周期记为T,则有:T=1/100s。此时的曝光时间为t,即T 2-T 1=t。也就是说,传感器曝光某一帧图像每一行所需的时间都为t。由图5可知,t=T+t1。即t不为闪烁周期T的整数倍,传感器曝光该图像的第i行像素时,该行像素接收的光能量为:S+S1。其中,S可以表示该行像素在时间T内接收的光能量,S1可以表示该行像素在时间t1内接收的光能量。由图5可知,t4+t2+t3=t。传感器曝光该图像的第i+1行像素时,第i+1一行像素接收的光能量为:S2+S3+S4。又因为t1=t2=t3,可得:S1=S3<S4。明显地,S2+S3=S。因此,第i+1行像素接收的光能量比第i行像素要更多一些。也就是说,最终显示的图像的第i行像素和第i+1行像素的亮度是不同的。第i+1行像素较第i行像素要更亮一些。
可理解,由于人工光源转换所得的光信号为周期信号,在任意起始时间点,当T 2-T 1=M*T(M为正整数)且M相同时,传感器在T 1-T 2时间内接收的光能量是相同的,最终显示的图像的不同行的亮度是一样。在任意起始时间点,当T 2-T 1=M*T(M不为正整数)且M相同时,传感器在T 1-T 2时间内接收的光能量不一定相同(如图5所示),最终显示的图像的不同行的亮度不一定相同。即最终显示的图像可能出现明暗条纹。
综上,若曝光时间满足人工光源闪烁周期的整数倍,电子设备显示的图像不会出现明暗条纹。若曝光时间不满足人工光源闪烁周期的整数倍,电子设备显示的图像则会出现明暗条纹。由于不同图像的明暗条纹的位置可能发生变化,所以电子设备的预览画面或者录像时的画面中可能出现滚动的明暗条纹,即banding现象。
需要说明的是,若不同图像第一行开始曝光的时间所对应的光信号的相位相同,数码相机、手机等电子设备上的预览画面仍会出现明暗条纹。但是由于每一帧图像第一行开始曝光的时间的相位是相同的,这些电子设备中的传感器对图像进行曝光时,每一帧图像的不同行之间接收光能量的大小关系是一样的,所以每一帧图像的第N行接收的光能量可能互不相同。 即每一帧图像的第N行的亮度可能互不相同,但是它们相对于图像中其他行的明暗程度是不变的。
若拍摄环境中存在多个人工光源(例如,两个人工光源),且这些人工光源的闪烁频率一致,可以通过将曝光时间调节为这些人工光源闪烁周期的整数倍来避免出现banding现象。然而,若拍摄环境中的多个人工光源(例如,两个人工光源)的闪烁频率不相同,可以将曝光时间调节为其中一个人工光源的闪烁周期的整数倍。在这种情况下,其他人工光源的频闪现象还是会导致banding现象。即拍摄画面上仍会出现滚动的明暗条纹。
示例性的,在如图6所示的拍摄场景下,存在两个人工光源——吊灯和台灯。其中,吊灯接50Hz交流电,其闪烁周期为1/100s;台灯是通过通用串行总线(Universal Serial Bus,USB)接口充电,在其充满电的情况下,频率为60Hz,可得:其闪烁周期为1/60s。这两个人工光源的闪烁周期不同,调节曝光时间只能解决其中一个人工光源的频闪所带来的banding现象,另一个人工光源的频闪所带来的banding现象仍然无法解决,所以拍摄画面中还是会出现滚动的明暗条纹。
本申请提供了一种多人工光源下的拍摄方法及相关装置。其中,电子设备可以确定多个人工光源的闪烁频率,并选取其中两个闪烁频率,分别记为F1和F2。将这两个闪烁频率对应的闪烁周期分别记为T1和T2。若在曝光时间满足T1的整数倍的情况下,相应的ISO也在预设范围内,则调节曝光时间,使得曝光时间满足T1的整数倍,并且根据F2调节帧间隔。若在曝光时间满足T1的整数倍的情况下,相应的ISO不在预设范围内,则判断当ISO在预设范围内时,电子设备是否可以将曝光时间调节为T2的整数倍,若可以,则将曝光时间调节为T2的整数倍,并且根据F1调节帧间隔,否则,电子设备不调节曝光时间,仅根据F1调节帧间隔。上述方法可以在消除某个人工光源造成的banding现象的情况下,减弱其他人工光源造成的banding现象,使得电子设备上的画面不再出现滚动的明暗条纹。
下面介绍本申请实施例涉及的装置。
图7为本申请实施例提供的一种电子设备100的硬件结构示意图。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(Universal Serial Bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(Subscriber Identification Module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器 (Application Processor,AP),调制解调处理器,图形处理器(Graphics Processing unit,GPU),图像信号处理器(Image Signal Processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(Digital Signal Processor,DSP),基带处理器,和/或神经网络处理器(Neural-network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
可理解,处理器110中还可以包括AE系统。AE系统可以具体设置在ISP中。AE系统可用于实现曝光参数的自动调整。可选的,AE系统还可以集成在其它处理器芯片中。本申请实施例对此不作限定。
在本申请提供的实施例中,电子设备100可以通过处理器110执行所述曝光强度调节方法。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备100,例如AR设备等。
充电管理模块140用于从充电器接收充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备100供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(Low Noise Amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(Wireless Local Area  Networks,WLAN)(如无线保真(Wireless Fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星系统(Global Navigation Satellite System,GNSS),调频(Frequency Modulation,FM),近距离无线通信技术(Near Field Communication,NFC),红外技术(Infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(Liquid Crystal Display,LCD),有机发光二极管(Organic Light-Emitting Diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(Active-Matrix Organic Light Emitting Diode的,AMOLED),柔性发光二极管(Flex Light-Emitting Diode,FLED),Mini LED,Micro LED,Micro-OLED,量子点发光二极管(Quantum Dot Light Emitting Diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现获取功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像或视频。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(Charge Coupled Device,CCD)或互补金属氧化物半导体(Complementary Metal-Oxide-Semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像或视频信号。ISP将数字图像或视频信号输出到DSP加工处理。DSP将数字图像或视频信号转换成标准的RGB,YUV等格式的图像或视频信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。例如,在一些实施例中,电子设备100可以利用N个摄像头193获取多个曝光系数的图像,进而,在视频后处理中,电子设备100可以根据多个曝光系数的图像,通过HDR技术合成HDR图像。
数字信号处理器用于处理数字信号,除了可以处理数字图像或视频信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(Moving Picture Experts Group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(Neural-Network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可 以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像视频播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。电子设备100可以设置至少一个麦克风170C。
耳机接口170D用于连接有线耳机。
传感器模块180可以包括1个或多个传感器,这些传感器可以为相同类型或不同类型。可理解,图1所示的传感器模块180仅为一种示例性的划分方式,还可能有其他划分方式,本申请对此不作限制。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备100姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设 备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。
环境光传感器180L用于感知环境光亮度。
指纹传感器180H用于获取指纹。
温度传感器180J用于检测温度。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
在本申请的一个实施例中,用户利用电子设备100进行延时摄影或连拍,需要获取一系列图像。在延时摄像或连拍的场景中,电子设备100可以采取AE模式。即电子设备100自动调整AE值,在预览这一系列图像的过程中,若用户有触摸操作作用于显示屏194,可能触发touchAE模式。在touchAE模式下,电子设备100可以调整用户触摸显示屏的相应位置的亮度,并进行高权重测光。使得计算画面平均亮度的时候,用户触摸区域的权重明显高于其他区域,最终计算所得的画面平均亮度更加靠近用户触摸区域的平均亮度。
骨传导传感器180M可以获取振动信号。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
图8为本申请实施例提供的一种电子设备100的软件结构示意图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将系统分为四层,从上至下分别为应用程序层,应用程序框架层,运行时(Runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图8所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序(也可以称为应用)。
应用程序框架层为应用程序层的应用程序提供应用编程接口(Application Programming Interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图8所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话界面形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
运行时(Runtime)包括核心库和虚拟机。Runtime负责系统的调度和管理。
核心库包含两部分:一部分是编程语言(例如,java语言)需要调用的功能函数,另一部分是系统的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的编程文件(例如,java文件)执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(Surface Manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),二维图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了二维(2-Dimensional,2D)和三维(3-Dimensional,3D)图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现3D图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动,虚拟卡驱动。
下面结合捕获拍照场景,示例性说明电子设备100软件以及硬件的工作流程。
当触摸传感器180K接收到触摸操作,相应的硬件中断被发给内核层。内核层将触摸操作加工成原始输入事件(包括触摸坐标,触摸操作的时间戳等信息)。原始输入事件被存储在内核层。应用程序框架层从内核层获取原始输入事件,识别该输入事件所对应的控件。以该触摸操作是触摸单击操作,该单击操作所对应的控件为相机应用图标的控件为例,相机应用 调用应用框架层的接口,启动相机应用,进而通过调用内核层启动摄像头驱动,通过摄像头193捕获静态图像或视频。
下面介绍本申请提供的一些拍摄的场景。
图9A-图9D示例性示出了电子设备100拍摄时的一些用户界面。
本申请的说明书和权利要求书及附图中的术语“用户界面”,是应用程序或操作系统与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、Widget等可视的界面元素。
图9A示例性示出了电子设备100上的用户界面910。用户界面910显示了一个录像界面,该界面可包括预览界面911、快门控件912、结束录像控件913、录像时间控件914和暂停录像控件915。其中,预览区域911可用于显示预览图像。该预览图像为电子设备100通过摄像头实时采集的图像。电子设备可以实时刷新预览区域911中的显示内容,以便于用户预览摄像头当前采集的图像。快门控件912可用于触发拍照,即在录像的过程中用户可触发快门控件912来拍照。结束录像控件913可用于结束录制视频。录像时间控件914可以指示当前录制视频的时间长度。暂停录像控件915可用于暂时停止录制视频。
可理解,用户界面910中还可以包含更多或更少的控件,本申请实施例对此不作限定。
如图9A所示,录像时间控件914中显示00:00:01,这表示当前视频已录制1秒(s)。此时,用户界面910中的预览区域911中显示的是图像1,图像1上有明显的明暗条纹。
图9B示例性示出了电子设备100上的用户界面920。用户界面920中的控件与用户界面910中的控件基本一致。图9B中的录像时间控件914中显示00:00:02,这表示当前视频已录制2秒(s)。此时,用户界面920中的预览区域911中显示的是图像2,图像2上有明显的明暗条纹。可理解,图像1和图像2这两帧图像中的明暗条纹的位置发生了变化。也就是说,在录制视频的第1秒-第2秒的过程中,预览区域911中的画面上显示有滚动的明暗条纹。电子设备100可以确定多个人工光源的闪烁频率,并选取其中幅值最大的两个人工光源。将这两个人工光源对应的闪烁频率,分别记为F1和F2。将这两个闪烁频率对应的闪烁周期分别记为T1和T2。若在曝光时间满足T1的整数倍的情况下,相应的ISO也在预设范围内,则调节曝光时间,使得曝光时间满足T1的整数倍,并且根据F2调节帧间隔;若在曝光时间满足T1的整数倍的情况下,相应的ISO不在预设范围内,则判断当ISO在预设范围内时,电子设备是否可以将曝光时间调节为T2的整数倍,若可以,则将曝光时间调节为T2的整数倍,并且根据F1调节帧间隔,否则,电子设备100不调节曝光时间,仅根据F1调节帧间隔。
图9C和图9D示例性示出了电子设备100执行上述操作后所显示的界面。
图9C示例性示出了电子设备100上的用户界面930。用户界面930中的控件与用户界面910中的控件基本一致。图9C中的录像时间控件914中显示00:00:05,这表示当前视频已录制5秒(s)。此时,用户界面930中的预览区域911中显示的是图像3,图像3上有明显的明暗条纹。
图9D示例性示出了电子设备100上的用户界面940。用户界面940中的控件与用户界面 910中的控件基本一致。图9D中的录像时间控件914中显示00:00:06,这表示当前视频已录制6秒(s)。此时,用户界面940中的预览区域911中显示的是图像4,图像4上有明显的明暗条纹。
可理解,图像3和图像4这两帧图像中的明暗条纹的位置未发生变化。也就是说,在录制视频的第5秒-第6秒的过程中,预览区域911中的画面上显示的明暗条纹是固定的。
因此,电子设备100按照上述方法调节曝光时间和/或帧间隔后,可以使得明暗条纹固定在画面中,减弱了电子设备100的预览区域911中的banding现象。也就是说,电子设备100的画面中不再出现滚动的明暗条纹,提升了用户的拍摄体验。
下面具体介绍拍摄环境存在人工光源时,根据人工光源的闪烁频率调节帧间隔的方法。
1、获取环境亮度的时间序列
对于电荷耦合元件(Charge Coupled Device,CCD)、金属氧化物半导体元件(Complementary Metal-Oxide Semiconductor,CMOS)等感光传感器来说,其感光面上植入有微小光敏物质,即像素,感光传感器会将其感光面上各个像素上的光像转换成电信号。
与上述感光传感器类似,防闪烁传感器(Flicker Sensor)也会将感光面上各个像素上的光像转化为电信号。但是,Flicker Sensor仅有一个像素且没有滤光,所以Flicker Sensor输出的电信号即为该仅有像素上的光像转换成的电信号。可理解,可以用Flicker Sensor输出的电信号来代表当前环境亮度,也就是说,可以认为Flicker Sensor输出的电信号为当前的环境亮度。
当用户触发电子设备100的拍摄功能后,电子设备100中的Flicker Sensor开始对环境光进行采样,并输出每一次采样的时间和对应的电信号。可理解,Flicker Sensor输出的是环境亮度的时间序列(第一时间序列)。可知,环境亮度的时间序列是一个一维的时间序列。可理解,第一时间序列包括环境亮度信息和时间信息。这里提到的环境亮度信息为上文提到的环境亮度,即每一次采样的环境亮度。这里提到的时间信息为上文提到的每一次采样的时间。
可理解,Flicker Sensor的采样频率可以根据实际需要进行设置,本申请对此不作限制。例如,Flicker Sensor的采样频率为2kHz,即Flicker Sensor每隔0.5毫秒(ms)采样一次。
2、确定多个人工光源的闪烁频率
对环境亮度的时间序列做傅里叶变换或快速傅里叶变换(fast Fourier transform,FFT),将环境亮度的时间序列从时域转换到频域,可以得到一个频谱,将该频谱记为Spectrum1(第一频谱)。可理解,频谱的横坐标为频率,纵坐标为幅值(信号的振幅强度)。其中,幅值表示的是亮度。
根据傅里叶原理,任何连续测量的时序或信号,都可以表示为不同频率的正弦波信号的无限叠加。在本申请提供的实施例中,将环境亮度的时间序列转换到频域后,得到的频谱(Spectrum1)由多个正弦波组成。而在这多个正弦波中,幅值最大的正弦波所对应的频率为人工光源的频率。其他幅值比较小的正弦波则为拍摄环境中的干扰信号。下面示例性给出确定多个人工光源的闪烁频率的方法。
具体的,电子设备100根据Spectrum1(第一频谱)确定第一正弦波的频率为第一闪烁频率,确定第二正弦波的频率为第二闪烁频率。电子设备根据第一闪烁频率确定第一闪烁周期,根据第二闪烁频率确定第二闪烁周期。其中,第一正弦波的幅值与第一平均值的差值大于第一预设阈值。第二正弦波的幅值与第二平均值的差值大于第二预设阈值。第一平均值为在第一频谱中的频率查找范围内,除第一正弦波外的其他正弦波的幅值的平均值。第二平均值为在第一频谱中的频率查找范围内,除第一正弦波和第二正弦波外的其他正弦波的幅值的平均值;频率查找范围用于确定查找第一正弦波和第二正弦波的频率范围。
在一些实施例中,第一正弦波对应的是第一人工光源,第二正弦波对应的是第二人工光源。可理解,第一正弦波的幅值大于第二正弦波的幅值。而Spectrum1中幅值表示亮度,可以理解为表示人工光源的光照强度。也就是说,第一人工光源的光照强度大于第二人工光源的光照强度。
电子设备100设置一个频率查找范围,例如,20Hz-2000Hz。在频谱Spectrum1中的频率查找范围内,选取最大的正弦波峰值,并将该正弦波峰值记为A 1,将该正弦波对应的频率记为F 1,将该正弦波峰值对应的正弦波记为sine wave1。电子设备100还可以计算频谱Spectrum1中除sine wave1外其他正弦波峰值的平均值,将该平均值记为A avr1。电子设备100计算A 1和A avr1的差值,并将该差值记为B 1,即B 1=A 1-A avr1。若B 1大于第一预设阈值,则电子设备100确定当前拍摄环境中存在人工光源,且该人工光源的闪烁频率为F 1;否则,电子设备100确定当前拍摄环境中不存在人工光源。
可理解,电子设备100还可以通过上述方法来确定其他人工光源的闪烁频率。
类似的,在频谱Spectrum1中的频率查找范围内,选取第二大的正弦波峰值,并将该正弦波峰值记为A 2,将该正弦波对应的频率记为F 2,将该正弦波峰值对应的正弦波记为sine wave2。电子设备100还可以计算频谱Spectrum1中除sine wave1和sine wave2外其他正弦波峰值的平均值,将该平均值记为A avr2。电子设备100计算A 2和A avr2的差值,并将该差值记为B 2,即B 2=A 2-A avr2。若B 2大于第二预设阈值,则电子设备100确定当前拍摄环境中存在其他的人工光源,且该人工光源的闪烁频率为F 2;否则,电子设备100确定当前拍摄环境中只存在闪烁频率为F 1的人工光源。
可理解,第一预设阈值和第二预设阈值可以根据实际需要进行设置,本申请对此不作限制。需要说明的是,在本申请的一些实施例中,可以根据同一个预设阈值来确定多个人工光源的闪烁频率。也就是说,第一预设阈值和第二预设阈值可以相同,也可以不同。
示例性的,第一预设阈值和第二预设阈值都为100。如图10所示,图10为本申请实施例提供的一个频谱图,电子设备设置一个频率查找范围,该频率查找范围为20Hz-200Hz。在图10所示的频谱中查找20Hz-200Hz范围内最大和第二大的正弦波峰值,以及它们对应的频率。可得,最大的正弦波峰值为:A 1=200,该正弦波峰值对应的频率为:F 1=50Hz;第二大的正弦波峰值为:A 2=180,该正弦波峰值对应的频率为:F 2=75Hz。根据图10还可以得到:
（公式图像：A avr1 与 A avr2 的计算结果，数值见下文）
所以,A 1和A avr1的差值为:B 1=A 1-A avr1=200-69.6=131.4;A 2和A avr2的差值为:B 2=A 2-A avr2=180-42=138。明显的,B 1大于第一预设阈值,B 2大于第二预设阈值,电子设备100确定当前拍摄环境中存在两个人工光源,且这两个人工光源的闪烁频率分别为50Hz和75Hz。
需要说明的是,在上述内容的基础上,还可以根据其他的正弦波峰值来判断是否还存在其他频率的人工光源,判断方法可参考上述内容,在此不再赘述。
3、根据人工光源的频率确定帧间隔
当环境光源存在闪烁时,需要减弱banding现象。电子设备100可以将帧间隔设置为闪烁周期的整数倍,使得传感器开始曝光不同图像的第一行像素时的初始相位一致,从而使得 每一帧图像的不同行像素之间接收光能量的大小关系是一样的。
可理解,帧间隔为传感器曝光图像的间隔时间。将开始曝光某一帧图像的第一行像素的时间记为ET1,将开始曝光该帧图像的后面一帧图像的第一行像素的时间记为ET2,帧间隔为:ET2-ET1。
若依据同一帧图像的不同行像素接收的光能量大小由大到小或由小到大进行排序,并将该排序记为X。可将X视为二维数组,其中包含排列的序号和相应的行数。例如,{(1,M 1),(2,M 2),(3,M 3),…,(N,M N)},1-N表示由大到小或由小到大排列的序号,M 1-M N表示的是前面的序号对应的像素行数。虽然不同图像的第N行像素接收的光能量可能不同,但是不同图像的X是一致的。
可理解,图像上的不同行像素接收的光能量可以由图像上不同行像素的亮度体现。也就是说,每一帧图像的第N行的亮度可能互不相同,但是它们相对于图像中其他行的明暗程度是不变的。也就是说,电子设备100将帧间隔设置为闪烁周期的整数倍后,可以使得明暗条纹不再滚动。从而减弱banding现象对图像亮度的影响。
在本申请的一个实施例中,可以按照表1设置帧间隔。
表1
闪烁频率 60Hz 80Hz 90Hz 100Hz 120Hz 150Hz
帧间隔 33ms 37ms 33ms 30ms 33ms 33ms
根据上述表1确定帧间隔,不仅可以减弱banding现象,即当拍摄环境中存在人工光源时,电子设备100显示屏上出现的明暗条纹不会滚动,而且还可以满足30FPS的播放需求,即画面每秒传输30帧。
在本申请的一些实施例中,电子设备100可以根据幅值较大的两个正弦波所对应的频率,来调整曝光时间和帧间隔。由上述内容可知,频谱中的纵坐标表示的是幅值,而幅值表示的是亮度。可理解,拍摄环境中亮度较大的人工光源对于banding现象的影响较大。因此,当拍摄环境中存在两个以上的人工光源时,电子设备100可以选取其中幅值最大的以及第二大的人工光源,并根据其频率来调整曝光时间和帧间隔。
下面结合图11具体介绍本申请实施例提供的一种多人工光源下的拍摄方法。
需要说明的是,图11所示的人工多光源下的拍摄方法可以应用于拍摄环境中存在两种闪烁频率的人工光源的场景(如图6所示)。
S1101:电子设备100获取环境亮度的时间序列。
可理解,电子设备100可以通过Flicker Sensor或与其类似的其他传感器获得环境亮度的时间序列。获取环境亮度的时间序列的方法已在上述实施例中说明,在此不再赘述。
S1102:电子设备100将获取的环境亮度的时间序列转化为频谱。
可理解,电子设备100可以通过傅里叶变换或快速傅里叶变换,将环境亮度的时间序列从时域转换到频域,从而得到一个频谱。
S1103:电子设备100确定闪烁频率F 1和F 2。其中,F 1为幅值最大的人工光源所对应的闪烁频率,F 2为幅值第二大的人工光源所对应的闪烁频率。
可理解,电子设备100可以确定当前拍摄环境中多个人工光源的闪烁频率,具体方法可 参考上述实施例,在此不再赘述。
在本申请的一个实施例中,在频谱Spectrum1中的频率查找范围内,最大的和第二大的正弦波峰值所对应的频率为当前拍摄环境中人工光源的闪烁频率。也就是说,电子设备100可以确定当前拍摄环境中存在两种闪烁频率的人工光源。将这两个闪烁频率分别记为F 1和F 2。其中,F 1为最大的正弦波峰值对应的频率,F 2为第二大的正弦波峰值对应的频率。示例性的,F 1=100Hz,F 2=120Hz。S1104:电子设备100确定曝光时间ET1和ET2。其中,ET1为消除闪烁频率为F 1的人工光源造成的banding现象所需的曝光时间,ET2为消除闪烁频率为F 2的人工光源造成的banding现象所需的曝光时间。
由于闪烁周期与闪烁频率的乘积为1,电子设备100可以根据步骤S1102中确定的多个人工光源的闪烁频率,来确定多个人工光源的闪烁周期。若曝光时间满足人工光源的闪烁周期的整数倍,则该闪烁周期对应的人工光源不会造成banding现象。
具体地,将闪烁频率分别为F 1和F 2的人工光源所对应的闪烁周期分别记为t1和t2。其中,t1=1/F 1,t2=1/F 2。若曝光时间为:ET1=M*t1(M为正整数),闪烁频率为F 1的人工光源不会造成banding现象。若曝光时间为:ET2=M*t2(M为正整数),闪烁频率为F 2的人工光源不会造成banding现象。也就意味着,ET1为消除闪烁频率为F 1的人工光源造成的banding现象所需的曝光时间,ET2为消除闪烁频率为F 2的人工光源造成的banding现象所需的曝光时间。
示例性的,F 1=100Hz,F 2=120Hz。则有:
t1=1/F 1=1/100s=10ms
t2=1/F 2=1/120s≈8.3ms
可理解,若曝光时间为ET1=M*t1=10M毫秒(M为正整数),闪烁频率为100Hz的人工光源不会造成banding现象。若曝光时间为:ET2=M*t2=8.3M毫秒(M为正整数),闪烁频率为120Hz的人工光源不会造成banding现象。
S1105:电子设备100确定ISO1和ISO2。其中,ISO1为曝光时间为ET1时所对应的ISO,ISO2为曝光时间为ET2时所对应的ISO。
具体的,在曝光强度不变的情况下,电子设备100可以根据曝光强度、曝光时间和ISO这三者的关系(曝光强度=曝光时间*ISO),确定曝光时间为ET1和ET2时对应的ISO,即确定ISO1和ISO2。在本申请的一个实施例中,电子设备100可以根据获取的最新一帧图像所采取的曝光强度来确定ISO1和ISO2。也就是说,电子设备100获取最新一帧图像时采取的曝光强度=ET1*ISO1=ET2*ISO2。
示例性的,电子设备100获取最新一帧图像时采取的曝光强度为10。则有:ISO1=10/ET1=1000,ISO2=10/ET2=1200。
可理解,电子设备100可以根据获取最新一帧图像时采取的曝光时间和相应的ISO来确定曝光强度。电子设备获取最新一帧图像时采取的曝光时间和ISO可以在传感器内读取,或者,当曝光时间和ISO等曝光参数存储在指定内存地址中时,电子设备100可以通过访问该指定内存地址以获取曝光时间和ISO。
S1106:电子设备100判断ISO1是否在预设范围内。
根据上述内容,ISO表示的是电子信号的放大增益。增大ISO时,不仅放大了有用信号,也放大了噪声。因此,ISO越高,电子设备100获取的图像的噪点越多,该图像的画质越差。所以在实际拍摄过程中,往往需要对ISO设定一个合理的范围,使得有用信号得以放大,同时,放大的噪声也在电子设备100的降噪能力范围内。
在本申请中,电子设备100可以设置一个预设范围,用来判断采用的ISO是否在合适范围内。例如,电子设备100设置预设范围为ISO min≤ISO≤ISO max。其中,ISO min为电子设备100可以采取的ISO的最小值,ISO max为电子设备100可以采取的ISO的最大值。当然,电子设备100采取的预设范围还可以有其他形式,本申请对此不做限制。
具体的,电子设备100判断ISO1是否在设置的预设范围内。若ISO1在设置的预设范围内,电子设备100将曝光时间调节为ET1(步骤S1107)。若ISO1不在设置的预设范围内,电子设备100判断ISO2是否在设置的预设范围内(步骤S1109)。
示例性的,ISO min=400,ISO max=1000。可得:400≤ISO1≤1000。即ISO1在预设范围内。
S1107:电子设备100将曝光时间调节为ET1。
可理解,若ISO1在设置的预设范围内,电子设备100可以将曝光时间调节为ET1。
S1108:电子设备100将帧间隔设置为F 2对应的人工光源的闪烁周期的整数倍。
可理解,电子设备100可以将帧间隔设置为F 2对应的人工光源的闪烁周期的整数倍,即将帧间隔设置为t2的整数倍。设置帧间隔的方法已在上述实施例中说明,在此不再赘述。
本申请实施例对上述步骤S1107和步骤S1108的执行顺序不作限定。在一些实施例中,电子设备100可以先根据闪烁频率F 2调节帧间隔。在调节帧间隔之后,电子设备100可以进一步将曝光时间调节为ET1。可选的,电子设备100也可以同时调节曝光时间和帧间隔。
示例性的,F 2=120Hz,根据表1,可以将帧间隔设置为33ms。
S1109:电子设备100判断ISO2是否在预设范围内。
具体的,电子设备100判断ISO2是否在设置的预设范围内。若ISO2在设置的预设范围内,电子设备100将曝光时间调节为ET2(步骤S1110)。若ISO2不在设置的预设范围内,电子设备100根据闪烁频率F 1调节帧间隔(步骤S1111)。
可理解,关于判断ISO2是否在预设范围内的相关描述可以参考步骤S1106,在此不再赘述。
示例性的,ISO min=400,ISO max=1000。可得:ISO2>1000。即ISO2不在预设范围内。S1110:电子设备100将曝光时间调节为ET2。
可理解,若ISO1不在设置的预设范围内,而ISO2在设置的预设范围内,电子设备100可以将曝光时间调节为ET2。
S1111:电子设备100将帧间隔设置为F 1对应的人工光源的闪烁周期的整数倍。
可理解,电子设备100可以将帧间隔设置为F 1对应的人工光源的闪烁周期的整数倍,即将帧间隔设置为t1的整数倍。设置帧间隔的方法已在上述实施例中说明,在此不再赘述。
本申请实施例对上述步骤S1110和步骤S1111的执行顺序不作限定。在一些实施例中,电子设备100可以先根据闪烁频率F 1调节帧间隔。在调节帧间隔之后,电子设备100可以进一步将曝光时间调节为ET2。可选的,电子设备100也可以同时调节曝光时间和帧间隔。
示例性的,F 1=100Hz,根据表1,可以将帧间隔设置为30ms。
需要说明的是,电子设备100可以将执行上述方法后再拍摄图像时采用的曝光时间和帧间隔记为第一曝光时间和第一帧间隔。一般来说,电子设备100可以采样调整后的曝光时间和帧间隔来进行拍摄。也就是说,电子设备100接下来拍摄相邻图像的间隔时间为调整后的帧间隔(第一帧间隔)。电子设备100接下来拍摄图像时采用的曝光时间为经过上述方法调整后的曝光时间(第一曝光时间)。可理解,电子设备采用该调整后的曝光时间拍摄得到的图像的数量不一定为1,本申请对此不作限制。
在本申请的一些实施例中,电子设备可以直接判断ET1和ET2是否在第一范围内,而不用判断ISO1和ISO2是否在预设范围内。可理解,第一范围可以根据实际需求进行设置,本申请对此不作限制。
在本申请的一些实施例中,电子设备100可以首先调整判断ISO1和ISO2是否在预设范围内的判断顺序。也就是说,电子设备100可以先判断ISO2是否在预设范围内,即电子设备100可以先执行步骤S1109,后执行步骤S1106。
在本申请的一些实施例中,电子设备100还可以同时判断ISO1和ISO2是否在预设范围内,然后根据判断结果来执行后续步骤。
在本申请的一些实施例中,存在2种以上闪烁频率的人工光源,电子设备100可以选取其中幅值最大的人工光源和幅值第二大的人工光源,并执行图11所示的方法。也就是说,电子设备100可以选取光照强度最大的人工光源和光照强度第二大的人工光源,并执行图11所示的方法。若ISO1和ISO2均不在预设范围内,电子设备100可以选取幅值第三大的人工光源,并且可以参考步骤S1104和步骤S1105来确定相应的ET3和ISO3。电子设备100还可以判断ISO3是否在预设范围内(步骤S1106和步骤S1109)。若ISO3在预设范围内,电子设备100将曝光时间调整为ET1。若ISO3不在预设范围内,电子设备100可以选取幅值第四大的人工光源,并执行上述步骤。以此类推,若拍摄环境中存在N个闪烁频率的人工光源,电子设备100可以根据其幅值大小的排序来依次确定相应的曝光时间和ISO,并判断ISO是否在预设范围内,然后根据判断结果来调整曝光时间和帧间隔。
在本申请的一些实施例中,存在2种以上闪烁频率的人工光源,电子设备100也可以直接采取图11所示的拍摄方法。也就是说,电子设备100可以确定幅值最大和第二大的人工光源的闪烁频率,并根据该闪烁频率确定曝光时间和帧间隔。因为幅值较大的人工光源对于banding现象的影响较大,若能消除幅值较大的人工光源造成的banding现象,就可以大幅度降低banding现象对于拍摄的影响。
示例性的,若拍摄环境中存在第三人工光源;电子设备100可以确定第三闪烁周期;第三闪烁周期为第三人工光源的闪烁周期;若第一闪烁周期的k1倍超出第一范围,且第二闪烁周期的k2倍未超出第一范围,且第三闪烁周期的k3倍未超出第一范围,第一曝光时间为第三闪烁周期的k3倍,第一帧间隔为第一闪烁周期的k1倍。
在本申请的一些实施例中,存在超过2种以上闪烁频率的人工光源,电子设备100还可以确定其中任意2种人工光源的闪烁频率,并根据这两种闪烁频率确定曝光时间和帧间隔(可参考图11所示的方法)。
可理解,t1可以为第一闪烁周期,此时t2为第二闪烁周期。或者,t1可以为第二闪烁周期,此时t1为第一闪烁周期。类似的,ET1和ET2可以分别为:ET1=k1*t1(k1为正整数),ET2=k2*t2(k2为正整数)。或者,ET1和ET2可以分别为:ET1=k2*t1(k2为正整数),ET2=k1*t2(k1为正整数)。
需要说明的是,权利要求书中所提及的电子设备可以为本申请实施例中的电子设备100。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (15)

  1. A photographing method under a plurality of artificial light sources, wherein the method comprises:
    determining, by an electronic device, a first blinking period and a second blinking period, wherein the first blinking period is a blinking period of a first artificial light source in a shooting environment, and the second blinking period is a blinking period of a second artificial light source in the shooting environment;
    determining, by the electronic device, a first exposure time and a first frame interval;
    wherein, if k1 times the first blinking period does not exceed a first range, the first exposure time is k1 times the first blinking period, and the first frame interval is k2 times the second blinking period, k1 and k2 both being positive integers; and
    photographing, by the electronic device, with the first exposure time and the first frame interval, wherein the first frame interval is an interval between two adjacent image frames captured by a camera during the photographing.
  2. The method according to claim 1, wherein an illumination intensity of the first artificial light source is greater than an illumination intensity of the second artificial light source.
  3. The method according to claim 1 or 2, wherein the method further comprises:
    if k1 times the first blinking period exceeds the first range and k2 times the second blinking period does not exceed the first range, the first exposure time is k2 times the second blinking period, and the first frame interval is k1 times the first blinking period.
  4. The method according to claim 1 or 2, wherein the method further comprises:
    if k1 times the first blinking period exceeds the first range and k2 times the second blinking period also exceeds the first range, the first frame interval is k1 times the first blinking period.
  5. The method according to any one of claims 1 to 4, wherein the determining, by the electronic device, a first blinking period and a second blinking period specifically comprises:
    obtaining, by the electronic device, a first time sequence, wherein the first time sequence comprises ambient brightness information and time information;
    converting, by the electronic device, the first time sequence into a first frequency spectrum;
    determining, by the electronic device based on the first frequency spectrum, a frequency of a first sine wave as a first blinking frequency and a frequency of a second sine wave as a second blinking frequency; and
    determining, by the electronic device, the first blinking period based on the first blinking frequency and the second blinking period based on the second blinking frequency;
    wherein a difference between an amplitude of the first sine wave and a first average value is greater than a first preset threshold, and a difference between an amplitude of the second sine wave and a second average value is greater than a second preset threshold; the first average value is an average value of amplitudes of sine waves other than the first sine wave within a frequency search range of the first frequency spectrum; the second average value is an average value of amplitudes of sine waves other than the first sine wave and the second sine wave within the frequency search range of the first frequency spectrum; and the frequency search range is used to determine a frequency range within which the first sine wave and the second sine wave are searched for.
  6. The method according to any one of claims 1 to 5, wherein more than two artificial light sources are present in the shooting environment, the first artificial light source is the artificial light source with the highest illumination intensity among the more than two artificial light sources, and the second artificial light source is the artificial light source with the second-highest illumination intensity among the more than two artificial light sources.
  7. The method according to any one of claims 1 to 6, wherein a third artificial light source is present in the shooting environment, and the method further comprises:
    determining, by the electronic device, a third blinking period, wherein the third blinking period is a blinking period of the third artificial light source;
    wherein, if k1 times the first blinking period exceeds the first range, k2 times the second blinking period does not exceed the first range, and k3 times the third blinking period does not exceed the first range, the first exposure time is k3 times the third blinking period, and the first frame interval is k1 times the first blinking period.
  8. An electronic device, comprising a camera, one or more memories, and one or more processors, wherein the one or more processors are coupled to the camera and the one or more memories, the one or more memories are configured to store computer program code, and the computer program code comprises computer instructions;
    the processor is configured to determine a first blinking period and a second blinking period, wherein the first blinking period is a blinking period of a first artificial light source in a shooting environment, and the second blinking period is a blinking period of a second artificial light source in the shooting environment;
    the processor is further configured to determine a first exposure time and a first frame interval, wherein, if k1 times the first blinking period does not exceed a first range, the first exposure time is k1 times the first blinking period, and the first frame interval is k2 times the second blinking period, k1 and k2 both being positive integers; and
    the camera is configured to photograph with the first exposure time and the first frame interval, wherein the first frame interval is an interval between two adjacent image frames captured during the photographing.
  9. The electronic device according to claim 8, wherein an illumination intensity of the first artificial light source is greater than an illumination intensity of the second artificial light source.
  10. The electronic device according to claim 8 or 9, wherein, if k1 times the first blinking period exceeds the first range and k2 times the second blinking period does not exceed the first range, the first exposure time is k2 times the second blinking period, and the first frame interval is k1 times the first blinking period.
  11. The electronic device according to claim 8 or 9, wherein, if k1 times the first blinking period exceeds the first range and k2 times the second blinking period also exceeds the first range, the first frame interval is k1 times the first blinking period.
  12. The electronic device according to any one of claims 8 to 11, wherein, when determining the first blinking period and the second blinking period, the processor is specifically configured to:
    obtain a first time sequence, wherein the first time sequence comprises ambient brightness information and time information;
    convert the first time sequence into a first frequency spectrum;
    determine, based on the first frequency spectrum, a frequency of a first sine wave as a first blinking frequency and a frequency of a second sine wave as a second blinking frequency; and
    determine the first blinking period based on the first blinking frequency and the second blinking period based on the second blinking frequency;
    wherein a difference between an amplitude of the first sine wave and a first average value is greater than a first preset threshold, and a difference between an amplitude of the second sine wave and a second average value is greater than a second preset threshold; the first average value is an average value of amplitudes of sine waves other than the first sine wave within a frequency search range of the first frequency spectrum; the second average value is an average value of amplitudes of sine waves other than the first sine wave and the second sine wave within the frequency search range of the first frequency spectrum; and the frequency search range is used to determine a frequency range within which the first sine wave and the second sine wave are searched for.
  13. The electronic device according to any one of claims 8 to 12, wherein more than two artificial light sources are present in the shooting environment, the first artificial light source is the artificial light source with the highest illumination intensity among the more than two artificial light sources, and the second artificial light source is the artificial light source with the second-highest illumination intensity among the more than two artificial light sources.
  14. The electronic device according to any one of claims 8 to 13, wherein a third artificial light source is present in the shooting environment, and the processor is further configured to determine a third blinking period, wherein the third blinking period is a blinking period of the third artificial light source;
    wherein, if k1 times the first blinking period exceeds the first range, k2 times the second blinking period does not exceed the first range, and k3 times the third blinking period does not exceed the first range, the first exposure time is k3 times the third blinking period, and the first frame interval is k1 times the first blinking period.
  15. A computer-readable storage medium, comprising computer instructions, wherein, when the computer instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 7.
PCT/CN2022/093644 2021-06-24 2022-05-18 一种多人工光源下的拍摄方法及相关装置 WO2022267762A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/266,222 US20230388658A1 (en) 2021-06-24 2022-05-18 Photographing method from plurality of artificial light sources and related apparatus
EP22827260.5A EP4243403A1 (en) 2021-06-24 2022-05-18 Method for photographing under multiple artificial light sources, and related apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110708031.2 2021-06-24
CN202110708031.2A CN115529419B (zh) 2021-06-24 2021-06-24 一种多人工光源下的拍摄方法及相关装置

Publications (1)

Publication Number Publication Date
WO2022267762A1 true WO2022267762A1 (zh) 2022-12-29

Family

ID=84544087

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/093644 WO2022267762A1 (zh) 2021-06-24 2022-05-18 一种多人工光源下的拍摄方法及相关装置

Country Status (4)

Country Link
US (1) US20230388658A1 (zh)
EP (1) EP4243403A1 (zh)
CN (1) CN115529419B (zh)
WO (1) WO2022267762A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117714662A (zh) * 2023-05-25 2024-03-15 荣耀终端有限公司 频闪评测装置
CN116996777B (zh) * 2023-09-26 2024-04-05 荣耀终端有限公司 一种拍摄方法、电子设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012094956A (ja) * 2010-10-25 2012-05-17 Nikon Corp 撮像装置
US20130342726A1 (en) * 2012-06-20 2013-12-26 Hitachi, Ltd. Imaging apparatus
US20170094148A1 (en) * 2015-09-24 2017-03-30 Casio Computer Co., Ltd. Imaging apparatus with a flicker detection function
CN108111767A (zh) * 2018-01-24 2018-06-01 努比亚技术有限公司 一种拍摄方法、终端及计算机可读存储介质
CN110248110A (zh) * 2019-06-28 2019-09-17 Oppo广东移动通信有限公司 拍摄参数设置方法、设置装置、终端设备及可读存储介质

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3370979B2 (ja) * 2000-09-08 2003-01-27 三菱電機株式会社 撮像装置および自動レベル調整方法
JP2003198932A (ja) * 2001-12-27 2003-07-11 Sharp Corp フリッカ補正装置およびフリッカ補正方法、並びにフリッカ補正プログラムを記録した記録媒体
JP4207926B2 (ja) * 2005-05-13 2009-01-14 ソニー株式会社 フリッカ補正方法、フリッカ補正装置及び撮像装置
JP2009077057A (ja) * 2007-09-19 2009-04-09 Olympus Imaging Corp 撮像装置、撮像装置の制御方法
JP4626689B2 (ja) * 2008-08-26 2011-02-09 ソニー株式会社 撮像装置、補正回路および補正方法
CN101567977B (zh) * 2009-06-09 2013-09-18 北京中星微电子有限公司 一种闪烁检测方法及其装置
JP2013238479A (ja) * 2012-05-15 2013-11-28 Seiko Epson Corp 検出装置
US9344640B2 (en) * 2013-06-14 2016-05-17 Panasonic Intellectual Property Corporation Of America Imaging device, integrated circuit, and flicker reduction method
JP6356400B2 (ja) * 2013-09-13 2018-07-11 学校法人明治大学 周波数検出装置
KR20150109177A (ko) * 2014-03-19 2015-10-01 삼성전자주식회사 촬영 장치, 그 제어 방법, 및 컴퓨터 판독가능 기록매체
CN109151256B (zh) * 2018-08-31 2020-12-08 惠州华阳通用电子有限公司 一种基于传感器检测的摄像头闪烁消除方法及装置
JP7222745B2 (ja) * 2019-02-08 2023-02-15 ローム株式会社 フリッカ検出装置
CN110381276B (zh) * 2019-05-06 2021-08-13 华为技术有限公司 一种视频拍摄方法及电子设备
US11032486B2 (en) * 2019-10-11 2021-06-08 Google Llc Reducing a flicker effect of multiple light sources in an image
CN111355864B (zh) * 2020-04-16 2022-06-14 浙江大华技术股份有限公司 一种图像闪烁消除方法及装置

Also Published As

Publication number Publication date
CN115529419A (zh) 2022-12-27
US20230388658A1 (en) 2023-11-30
EP4243403A1 (en) 2023-09-13
CN115529419B (zh) 2024-04-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22827260

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022827260

Country of ref document: EP

Effective date: 20230605

NENP Non-entry into the national phase

Ref country code: DE