WO2024041006A1 - A method for controlling camera frame rate and electronic device - Google Patents
A method for controlling camera frame rate and electronic device
- Publication number
- WO2024041006A1 (PCT/CN2023/089867, CN2023089867W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- frame rate
- output
- image
- images
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/65—Control of camera operation in relation to power supply
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
Definitions
- the present application relates to the field of terminals, and in particular to a method and electronic device for controlling the frame rate of a camera.
- an electronic device may be provided with multiple cameras (also called lenses).
- multiple cameras can be combined for imaging to improve shooting quality in different scenarios.
- one of the multiple cameras serves as the primary camera (referred to as the main camera), and one or more of the remaining cameras serve as auxiliary cameras (referred to as the secondary camera).
- the images output by the main camera and the secondary camera are combined according to set rules, so that the desired shooting effects in different scenes can be achieved; however, outputting images consumes power. How to reduce power consumption while ensuring the shooting effect is a problem that needs to be solved.
- This application provides a method and electronic device for controlling the frame rate of a camera, which can reduce the power loss when multiple cameras are combined for imaging without affecting the shooting effect.
- In a first aspect, an electronic device is provided, including: a processor, a display screen, a first camera and a second camera; an operating system of the electronic device runs on the processor, and a camera application is installed in the operating system; the processor is configured to receive a first message indicating that the user has initiated an operation to start the camera application, and, in response to receiving the first message, to start the first camera and the second camera; the first camera is configured to output images to the processor at a first frame rate; the second camera is configured to output images to the processor at a second frame rate, the second frame rate being less than the first frame rate; the processor is further configured to transmit the images output by the first camera to the camera application; and the display screen is configured to display the preview interface of the camera application, where the preview interface includes the images output by the first camera.
- in this way, the image output frame rates of the first camera and the second camera are controlled separately.
- during the photo preview stage, only the images collected by the main camera are used for display in the preview interface.
- the images output by the secondary camera are not used for display.
- the secondary camera outputs images at a lower frame rate than the main camera, which can reduce power loss during photo preview.
- the processor is further configured to, after starting the first camera and the second camera, notify the first camera to collect images at the first frame rate and notify the second camera to collect images at the second frame rate.
- the processor controls the frame rate of images collected by the first camera and the second camera.
- the processor is configured to send a picture request message to the first camera at a first frequency, and to send a picture request message to the second camera at a second frequency; the picture request message is used to request the camera to output a frame of image.
- the first frequency is equal to the first frame rate, and the second frequency is equal to the second frame rate.
- every time the processor sends a picture request message to a camera, it can trigger that camera to transmit a frame of image to the processor.
- the processor sends an image request message to the first camera at a first frequency equal to the first frame rate, which can trigger the first camera to output the collected image to the processor at the first frame rate.
- the processor sends an image request message to the second camera at a second frequency equal to the second frame rate, which can trigger the second camera to output the collected image to the processor at the second frame rate.
- the first camera can output an image to the processor at a first frame rate; the second camera can output an image to the processor at a second frame rate.
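- As an illustrative sketch only (not part of the application), the request-driven control described above can be modeled as two periodic senders whose request periods are derived from the two frame rates; the class and method names below are hypothetical, and the 30 fps / 15 fps values follow the example used later in this description.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: the processor requests one frame per message, so the
// request frequency directly determines each camera's output frame rate.
public class FrameRequestScheduler {
    /** Hypothetical camera abstraction: each request triggers one output frame. */
    public interface Camera {
        void requestOneFrame();
    }

    private final ScheduledExecutorService executor = Executors.newScheduledThreadPool(2);

    /** Sends picture request messages to the camera at the given frame rate (fps). */
    public void start(Camera camera, int frameRateFps) {
        long periodMicros = 1_000_000L / frameRateFps; // one request per frame interval
        executor.scheduleAtFixedRate(camera::requestOneFrame, 0, periodMicros, TimeUnit.MICROSECONDS);
    }

    public void shutdown() {
        executor.shutdownNow();
    }

    public static void main(String[] args) throws InterruptedException {
        FrameRequestScheduler scheduler = new FrameRequestScheduler();
        scheduler.start(() -> System.out.println("main camera frame"), 30);      // first frame rate
        scheduler.start(() -> System.out.println("secondary camera frame"), 15); // second frame rate
        Thread.sleep(200);
        scheduler.shutdown();
    }
}
```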
- the processor is further configured to notify the second camera to collect images at an initial frame rate before notifying the second camera to collect images at the second frame rate; wherein the initial frame rate is the first frame rate.
- the processor is further configured to, after receiving the first message, determine to activate the first camera and the second camera according to the camera configuration of the electronic device and the preset policy.
- the processor is further configured to obtain the camera configuration of the electronic device after receiving the first message. In this way, the latest camera configuration can be obtained each time, and the appropriate cameras can be selected to turn on.
- the processor is further configured to receive a second message, the second message being used to indicate that the user initiates a photographing and imaging operation on the preview interface; the processor is further configured to, in response to receiving the second message, notify the second camera to output images at the first frame rate; the second camera is further configured to output images to the processor at the first frame rate; and the processor is further configured to combine the image output by the first camera and the image output by the second camera into a photo, wherein the first camera outputs images at the first frame rate.
- in this way, when taking pictures and imaging, the secondary camera is raised to the same frame rate as the main camera to output images, so that the output image of the first camera and the output image of the second camera can be combined into a photo to ensure imaging quality.
- the second frame rate is greater than zero.
- the second frame rate is half the first frame rate.
- the second frame rate is greater than 0, that is, the secondary camera keeps outputting images during the photo preview stage rather than stopping; compared with a method in which the secondary camera outputs no images during the preview stage, the secondary camera here keeps outputting images, only at a lower frame rate.
- therefore, when the image output frame rate of the secondary camera is adjusted upward (for example, to the first frame rate), it can respond quickly, avoiding any impact on imaging speed or quality.
- In a second aspect, a method for controlling the frame rate of a camera is provided, applied to an electronic device.
- the electronic device includes a first camera and a second camera.
- the method includes: receiving an operation by a user to start a camera application; in response to the operation of starting the camera application, starting the first camera and the second camera; controlling the first camera to output images at a first frame rate; controlling the second camera to output images at a second frame rate, the second frame rate being smaller than the first frame rate; and displaying the preview interface of the camera application,
- the preview interface includes images output by the first camera.
- in this way, the image output frame rates of the first camera and the second camera are controlled separately.
- during the photo preview stage, only the images collected by the main camera are used for display in the preview interface.
- the images output by the secondary camera are not used for display.
- the secondary camera outputs images at a lower frame rate than the main camera, which can reduce power loss during photo preview.
- before controlling the second camera to output images at the second frame rate, the method further includes: controlling the second camera to output images at an initial frame rate; the initial frame rate is the first frame rate.
- the method further includes: receiving the user's operation of initiating photographing and imaging on the preview interface; in response to the operation of initiating photographing and imaging, controlling the second camera to output images at the first frame rate; and combining the output image of the first camera and the output image of the second camera into a photo; wherein the first camera outputs images at the first frame rate.
- in this way, when taking pictures and imaging, the secondary camera is raised to the same frame rate as the main camera to output images, so that the output image of the first camera and the output image of the second camera can be combined into a photo to ensure imaging quality.
- after the photo imaging is completed, the second camera is again controlled to output images at the second frame rate, while the first camera outputs images at the first frame rate; the preview interface of the camera application is displayed, and the preview interface includes the image output by the first camera.
- in other words, the preview interface continues to be displayed after the photo imaging is completed.
- the preview interface only displays the images collected by the first camera; reducing the output frame rate of the second camera to the second frame rate again can reduce the power consumption in the photo preview stage.
- the second frame rate is greater than zero.
- the second frame rate is half the first frame rate.
- the second frame rate is greater than 0, that is, the secondary camera keeps outputting images during the photo preview stage rather than stopping; compared with a method in which the secondary camera outputs no images during the preview stage, the secondary camera here keeps outputting images, only at a lower frame rate.
- therefore, when the image output frame rate of the secondary camera is adjusted upward (for example, to the first frame rate), it can respond quickly, avoiding any impact on imaging speed or quality.
- In a third aspect, a method for controlling a camera frame rate is provided, which can be applied to a chip system.
- the method includes: receiving a first message, the first message being used to indicate that an operation of the user to start the camera application has been received; in response to receiving the first message, notifying the first camera and the second camera to start; notifying the first camera to collect images at a first frame rate; notifying the second camera to collect images at a second frame rate, the second frame rate being less than the first frame rate; receiving the images output by the first camera at the first frame rate; receiving the images output by the second camera at the second frame rate; and transmitting the images output by the first camera to the camera application, so that the camera application displays a preview interface according to the images output by the first camera.
- in this way, the image output frame rates of the first camera and the second camera are controlled separately.
- during the photo preview stage, only the images collected by the main camera are used for display in the preview interface.
- the images output by the secondary camera are not used for display.
- the secondary camera outputs images at a lower frame rate than the main camera, which can reduce power loss during photo preview.
- the method further includes: sending a picture request message to the first camera at a first frequency, and sending a picture request message to the second camera at a second frequency; wherein the picture request message is used to request the camera to output a frame of image, the first frequency is equal to the first frame rate, and the second frequency is equal to the second frame rate.
- every time the processor sends a picture request message to a camera, it can trigger that camera to transmit a frame of image to the processor.
- the processor sends an image request message to the first camera at a first frequency equal to the first frame rate, which can trigger the first camera to output the collected image to the processor at the first frame rate.
- the processor sends an image request message to the second camera at a second frequency equal to the second frame rate, which can trigger the second camera to output the collected images to the processor at the second frame rate. In this way, the first camera can output images to the processor at the first frame rate, and the second camera can output images to the processor at the second frame rate.
- before notifying the second camera to collect images at the second frame rate, the second camera is also notified to collect images at an initial frame rate; the initial frame rate is the first frame rate.
- the camera configuration is obtained, and it is determined to start the first camera and the second camera according to the camera configuration and the preset policy. In this way, the latest camera configuration can be obtained each time, and the appropriate cameras can be selected to turn on.
- the method further includes: receiving a second message, the second message being used to indicate that the user initiates a photographing and imaging operation on the preview interface; in response to receiving the second message, notifying the second camera to collect images at the first frame rate; receiving the images output by the second camera at the first frame rate; and combining the output image of the first camera and the output image of the second camera into a photo; wherein the first camera outputs images at the first frame rate.
- in this way, when taking pictures and imaging, the secondary camera is raised to the same frame rate as the main camera to output images, so that the output image of the first camera and the output image of the second camera can be combined into a photo to ensure imaging quality.
- notifying the first camera to collect images at the first frame rate and notifying the second camera to collect images at the second frame rate includes: setting an exposure parameter of the first camera to a first value, and setting the exposure parameter of the second camera to a second value, where the second value is greater than the first value.
- the second frame rate is half of the first frame rate, and the second value is twice the first value.
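- Read literally, this ties the exposure parameter inversely to the frame rate. A minimal illustration follows (hypothetical names; it assumes the exposure parameter is the per-frame time budget in microseconds), showing why halving the frame rate doubles the value:

```java
// Hypothetical sketch: derive per-camera exposure parameters from the target
// frame rates, so that halving the frame rate doubles the exposure parameter.
public class ExposureParams {
    /** Per-frame time budget in microseconds for a given frame rate (fps). */
    static long exposureParamMicros(int frameRateFps) {
        return 1_000_000L / frameRateFps;
    }

    public static void main(String[] args) {
        int firstFrameRate = 30;                  // main camera
        int secondFrameRate = firstFrameRate / 2; // secondary camera at half the rate

        long firstValue = exposureParamMicros(firstFrameRate);   // 33333 us
        long secondValue = exposureParamMicros(secondFrameRate); // 66666 us, twice the first value
        System.out.println("first value: " + firstValue + ", second value: " + secondValue);
    }
}
```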
- a fourth aspect provides an electronic device having the function of implementing the method described in the second aspect.
- This function can be implemented by hardware, or it can be implemented by hardware executing corresponding software.
- the hardware or software includes one or more modules corresponding to the above functions.
- a fifth aspect provides an electronic device, including: a processor; the processor is configured to be coupled to a memory, and, after reading instructions in the memory, to execute the method described in any one of the above third aspects according to the instructions.
- In a sixth aspect, a computer-readable storage medium is provided. Instructions are stored in the computer-readable storage medium, and when the instructions are run on a computer, the computer can execute the method described in any one of the above-mentioned second aspects.
- a seventh aspect provides a computer program product containing instructions that, when run on a computer, enable the computer to execute the method described in any one of the above-mentioned second aspects.
- An eighth aspect provides a device (for example, the device may be a chip system).
- the device includes a processor and is used to support an electronic device to implement the functions involved in the third aspect.
- the device further includes a memory, which is used to store necessary program instructions and data of the electronic device.
- when the device is a chip system, it may be composed of a chip, or may include a chip and other discrete components.
- Figure 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- Figure 2 is a schematic diagram of the software architecture of an electronic device provided by an embodiment of the present application.
- FIG. 3 is a schematic diagram of interaction between modules in the method of controlling the camera frame rate provided by the embodiment of the present application;
- FIG. 4 is a schematic diagram of interaction between modules in the method of controlling the camera frame rate provided by the embodiment of the present application;
- Figure 5 is a schematic flowchart of a method for controlling a camera frame rate provided by an embodiment of the present application
- Figure 6 is a schematic flowchart of a method for controlling a camera frame rate provided by an embodiment of the present application
- Figure 7 is a schematic flowchart of a method for controlling the frame rate of a camera provided by an embodiment of the present application.
- FIG. 8 is a schematic diagram of a chip system provided by an embodiment of the present application.
- "A and/or B" can mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B can be singular or plural.
- the character "/" generally indicates that the related objects are in an "or" relationship.
- for example, an RGB camera and a computer vision (CV) camera can be combined to take photos in scenes such as the sky, mountain peaks, green plants, and portraits to improve the photo effects.
- although the images output by the secondary camera are not used for display, they are still output at the same frame rate as the main camera.
- the image output frame rate is the number of images output by the camera per unit time, generally expressed in frames per second (fps), such as 30 fps.
- the image frame rates of the main camera and the secondary camera are the same.
- in this scheme, the image frame rates of the main camera and the secondary camera are bound together, and the control is simple; however, the secondary camera outputs images at a high frame rate that are not transmitted to the camera application, causing unnecessary power loss.
- Embodiments of the present application provide a method for controlling the frame rate of a camera.
- the frame rates of the main camera and the secondary camera are respectively controlled, and the secondary camera is controlled to output images at a lower frame rate to reduce power consumption.
- when photographing and imaging is started, the frame rate of the secondary camera is increased to the same frame rate as the main camera to improve the photography effect in special scenes.
- the method provided by the embodiment of the present application can be applied to electronic equipment including a display screen.
- the above-mentioned electronic devices may include mobile phones, tablet computers, notebook computers, personal computers (PC), ultra-mobile personal computers (UMPC), handheld computers, netbooks, smart home devices (such as smart TVs, smart screens, large screens, smart speakers, smart air conditioners, etc.), personal digital assistants (PDAs), wearable devices (such as smart watches, smart bracelets, etc.), vehicle-mounted devices, virtual reality devices, etc.; this application does not limit the specific form of the electronic device.
- FIG. 1 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
- the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, and a battery 142 , Antenna 1, Antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193 , display screen 194, and subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
- the sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
- the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than illustrated, some components may be combined, some components may be separated, or components may be arranged differently.
- the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units.
- the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
- different processing units can be independent devices or integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 100 .
- the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
- the processor 110 may also be provided with a memory for storing instructions and data.
- the memory in processor 110 is cache memory. This memory may hold instructions or data that have been recently used or recycled by processor 110 . If the processor 110 needs to use the instructions or data again, it can be called directly from the memory. Repeated access is avoided and the waiting time of the processor 110 is reduced, thus improving the efficiency of the system.
- processor 110 may include one or more interfaces.
- Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
- the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos in the external memory card.
- Internal memory 121 may be used to store computer executable program code, which includes instructions.
- the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
- the processor 110 can execute instructions stored in the internal memory 121, and the internal memory 121 can include a program storage area and a data storage area.
- the stored program area can store an operating system, at least one application program required for a function (such as a sound playback function, an image playback function, etc.).
- the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
- the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
- the interface connection relationships between the modules illustrated in this embodiment are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
- the electronic device 100 may also adopt an interface connection manner different from that in the above embodiment, or a combination of multiple interface connection manners.
- the charging management module 140 is used to receive charging input from the charger. While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
- the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
- the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, camera 193, wireless communication module 160, etc.
- the power management module 141 may also be provided in the processor 110 .
- the power management module 141 and the charging management module 140 may also be provided in the same device.
- the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN.
- the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
- the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
- a modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
- the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
- the wireless communication module 160 can provide applications on the electronic device 100 including WLAN (such as wireless fidelity (Wi-Fi) network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS) , frequency modulation (FM), near field communication (NFC), infrared (IR) and other wireless communication solutions.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
- the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
- the GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (quasi) -zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
- the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
- the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. Speaker 170A, also called "loudspeaker", is used to convert audio electrical signals into sound signals. Receiver 170B, also called "earpiece", is used to convert audio electrical signals into sound signals. Microphone 170C, also called "mic", is used to convert sound signals into electrical signals. The headphone interface 170D is used to connect wired headphones.
- the buttons 190 include a power button, a volume button, etc.
- Key 190 may be a mechanical key. It can also be a touch button.
- the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
- the motor 191 can generate vibration prompts.
- the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
- the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
- the SIM card interface 195 is used to connect a SIM card.
- the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
- the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card, etc.
- the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
- the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
- Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
- the display screen 194 is used to display images, videos, etc.
- the display screen 194 includes a display panel.
- the display panel can use a liquid crystal display (LCD), a light-emitting diode (LED), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
- the display screen 194 can be used to display a human-computer interaction interface, a photo preview interface, etc.
- the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
- the ISP is used to process the data fed back by the camera 193.
- Camera 193 is used to capture still images or video.
- Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals.
- Video codecs are used to compress or decompress digital video.
- Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
- the electronic device 100 may include 1 to N cameras 193.
- the electronic device may include a wide-angle camera, an ultra-wide-angle camera, a telephoto camera, a black and white camera, a macro camera, a CV camera, a depth-of-field camera, etc., which are not limited in this application.
- multiple cameras can be combined for imaging according to preset rules in different scenes to improve the shooting quality in different scenes.
- Multiple cameras include one main camera and one or more secondary cameras.
- in the following embodiments, one main camera and one secondary camera are taken as examples for introduction. It can be understood that when more secondary cameras are included, the execution logic of the remaining secondary cameras may refer to that of the secondary camera in the following embodiments.
- the above-mentioned electronic device is an electronic device that can run an operating system and install applications.
- the operating system running on the electronic device may be the Android system, the Hongmeng system, the IOS system, etc.
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- This embodiment of the present invention takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
- the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
- the layers communicate through interfaces.
- the Android system may include an application layer, an application framework layer, an Android runtime and system libraries, a hardware abstraction layer (HAL), and a kernel layer.
- HAL hardware abstraction layer
- the embodiments of the present application are illustrated by taking the Android system as an example. In other operating systems (such as the Hongmeng system, the IOS system, etc.), as long as the functions implemented by each functional module are similar to those of the embodiments of the present application, the solution of the present application can also be implemented.
- the application layer can include a series of application packages.
- the application package can include camera application, gallery, calendar, calling, map, navigation, Wireless local area networks (WLAN), Bluetooth, music, video, text messages, settings and other applications.
- the application layer may also include other application packages, such as payment applications, shopping applications, banking applications or chat applications, which are not limited in this application.
- the camera application is an application with a shooting function and has the functions of taking photos and recording videos.
- when other applications need to use the shooting function, they can also call the camera application to implement the shooting function.
- the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
- the application framework layer includes some predefined functions. For example, it may include an activity manager, a window manager, a content provider, a view system, a resource manager, a notification manager, a camera service (Camera Service), etc., and the embodiments of this application do not impose any restrictions on this.
- Camera Service can be started when the electronic device is turned on.
- the Camera Service can interact with the Camera HAL (Camera HAL) in the Hardware Abstraction Layer (HAL) during operation.
- System libraries can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (such as OpenGL ES), 2D graphics engines (such as SGL), etc.
- the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
- the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- OpenGL ES is used to implement three-dimensional graphics drawing, image rendering, composition, and layer processing.
- SGL is a drawing engine for 2D drawing.
- Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
- the core library contains two parts: one is the functional functions that need to be called by the Java language, and the other is the core library of Android.
- the application layer and application framework layer run in virtual machines.
- the virtual machine converts the java files of the application layer and the application framework layer into binary files and executes them.
- the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
- the HAL layer encapsulates the Linux kernel driver, provides an interface upwards, and shields the implementation details of the underlying hardware.
- the HAL layer can include Wi-Fi HAL, audio HAL, camera HAL, decision-making module, etc.
- the camera HAL is the core software framework of the camera, which is responsible for interacting with the hardware devices (such as cameras) that implement the shooting function in electronic devices.
- the camera HAL hides the implementation details of related hardware devices (such as specific image processing algorithms), and on the other hand, it can provide the Android system with an interface for calling related hardware devices.
- the decision-making module is used to adapt the logic of the initialization phase of opening the camera application (for example, determining the cameras to be started when the camera application is opened, and determining the cameras' output frame rates), the logic of the photo preview phase (for example, determining the output frame rates of the main camera and the secondary camera during photo preview), and the logic of the photo imaging stage (for example, determining the output frame rates of the main camera and the secondary camera when taking pictures and imaging), etc.
- the decision module includes a frame rate decision sub-module for determining the output frame rate of a camera (such as the secondary camera).
- the decision-making module may also be provided in the camera HAL.
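- Purely as an illustration of the division of responsibilities just described (none of these type or method names come from the application), the decision-making module and its frame rate decision sub-module could be sketched as an interface along these lines:

```java
// Illustrative sketch of the decision-making module's responsibilities; all
// type and method names are hypothetical.
public interface DecisionModule {

    /** Initialization phase: which cameras to start and their initial frame rate. */
    CameraPlan onCameraAppOpened(CameraConfig config);

    /** Photo preview phase: output frame rates of the main and secondary cameras. */
    FrameRatePlan onPreviewRequest(CameraPlan plan);

    /** Photo imaging phase: typically raises the secondary camera to the main camera's rate. */
    FrameRatePlan onImagingRequest(FrameRatePlan previewPlan);

    /** Imaging complete: restores the preview-phase frame rates. */
    FrameRatePlan onImagingComplete(FrameRatePlan previewPlan);

    record CameraConfig(int cameraCount) {}
    record CameraPlan(int mainCameraId, int secondaryCameraId, int initialFps) {}
    record FrameRatePlan(int mainFps, int secondaryFps) {}
}
```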
- the kernel layer is the layer between hardware and software.
- the kernel layer can include display drivers, camera drivers, audio drivers, sensor drivers, etc.
- the camera driver is the driver layer of the Camera device and is mainly responsible for interacting with the hardware.
- the hardware layer includes displays, cameras, etc.
- the camera may include a wide-angle camera (also called a wide-angle camera module or a wide-angle lens), a telephoto camera (also called a telephoto camera module or a telephoto lens), an ultra-wide-angle camera (also called an ultra-wide-angle camera module or an ultra-wide-angle lens), a black-and-white camera (also called a black-and-white camera module or a black-and-white lens), a macro camera (also called a macro camera module or a macro lens), a computer vision (CV) camera (also called a CV camera module or a CV lens), etc.
- the desktop of the mobile phone 100 includes shortcut icons for multiple applications.
- the user can click the icon 101 of the "Camera" application to start the Camera application.
- the mobile phone 100 receives the user's click operation on the icon 101 and starts the camera application.
- the camera application starts the phone's camera to collect images and displays the images collected by the camera as a preview interface.
- the mobile phone 100 displays a preview interface 102 .
- the mobile phone 100 performs a series of internal processes.
- the mobile phone 100 receives the user's click operation on the icon 101 and starts the camera application. Further, as shown in (c) of Figure 3, the camera application sends a preview request to the camera HAL.
- the camera HAL sends a preview request to the decision module.
- the decision-making module determines which cameras to turn on based on the camera configuration information and preset policies. Among them, the decision-making module can use the strategy that can be obtained in conventional technology to determine the camera to be turned on. For example, the camera to be turned on can be determined based on the brightness of the light around the phone, the camera mode, the type of camera included in the phone, etc. For example, in some scenarios, the decision-making module determines to turn on a camera.
- in other scenarios, the decision-making module determines to turn on multiple cameras, and the images collected by the multiple cameras are all transmitted to the camera application. In still other scenarios, the decision-making module determines to turn on multiple cameras, including a main camera and a secondary camera.
- in the latter case, during the photo preview stage, the images collected by the main camera are transmitted to the camera application for display; in the photo imaging stage, the images collected by the main camera and the secondary camera are combined to generate photos. In other words, the images collected by the main camera are used during both the photo preview stage and the photo imaging stage; the images collected by the secondary camera are not used during the photo preview stage, but are used during the photo imaging stage.
- transmitting the images collected by a camera to the camera application may include transmitting each frame of image collected by the camera to the camera application, or merging multiple frames of images collected by the camera and then transmitting them to the camera application, or combining images collected by multiple cameras and then transmitting them to the camera application.
- the decision-making module also determines the camera's picture frame rate according to the preset strategy.
- the decision-making module can use strategies that can be obtained in conventional technology to determine the image frame rate of the camera. For example, the decision-making module can determine the camera's image frame rate based on the type of camera enabled, the image frame rate supported by the camera, the size of the system's available resources, etc.
- if the decision-making module determines, according to the preview request, that it is in the photo preview stage, it determines the output frame rate of the main camera as the first frame rate according to the above preset strategy.
- the decision-making module further determines that the frame rate of the secondary shooting is a second frame rate, where the second frame rate is smaller than the first frame rate.
- the frame rate of the main camera is 30fps
- the frame rate of the secondary camera is 15fps.
- the decision-making module can call the frame rate decision sub-module (not shown in Figure 3) to specifically determine the output frame rate of the secondary camera.
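- For instance, one plausible frame-rate policy consistent with the 30 fps / 15 fps example above (hypothetical code, not taken from the application) is to derive the secondary camera's preview rate from the main camera's rate:

```java
// Hypothetical frame-rate decision for the photo preview stage, consistent with
// the 30 fps main camera / 15 fps secondary camera example above.
public class PreviewFrameRateDecision {

    /** Secondary camera previews at half the main camera's rate, but never at 0 fps. */
    static int secondaryPreviewFps(int mainFps) {
        return Math.max(1, mainFps / 2);
    }

    public static void main(String[] args) {
        int firstFrameRate = 30;                                   // main camera
        int secondFrameRate = secondaryPreviewFps(firstFrameRate); // 15 fps
        System.out.println("main: " + firstFrameRate + " fps, secondary: " + secondFrameRate + " fps");
    }
}
```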
- the decision-making module notifies the camera HAL that the image frame rate of the main camera is the first frame rate, and the image frame rate of the secondary camera is the second frame rate.
- the camera HAL controls camera 1 (the main camera) to output images at the first frame rate, and controls camera 2 (the secondary camera) to output images at the second frame rate.
- the camera HAL transmits the image returned by camera 1 (main camera) to the camera application.
- the camera application displays the acquired image, that is, the image collected by camera 1 (main camera) is displayed on the preview interface.
- the mobile phone 100 displays a preview interface 102 that displays images collected by the camera 1 (main camera).
- in this way, during the photo preview stage, the main camera outputs images at the first frame rate, and the images output by the main camera are transmitted to the camera application for display to generate the preview interface; the secondary camera outputs images at the second frame rate, and the images output by the secondary camera are not transmitted to the camera application.
- the first frame rate is greater than the second frame rate.
- the second frame rate is greater than 0, that is, the secondary camera keeps outputting images instead of stopping; compared with a method in which the secondary camera does not output images during the preview stage, in the method provided by the embodiment of the present application the secondary camera is always outputting images, only at a frame rate lower than that of the main camera.
- therefore, functions of the secondary camera such as auto focus (AF), automatic exposure (AE) and automatic white balance (AWB) always remain in a low-power-consumption working state; when the frame rate of the secondary camera is adjusted upward (for example, to the first frame rate) during the imaging stage, it can respond quickly, avoiding any impact on imaging speed or quality.
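- A minimal sketch of the preview-stage routing described above (hypothetical types, not the actual camera HAL interface): frames from the main camera are forwarded to the camera application, while frames from the secondary camera are consumed but never displayed.

```java
import java.util.function.Consumer;

// Illustrative preview-stage routing: only the main camera's frames reach the
// camera application; the secondary camera keeps streaming at a lower rate so
// that AF/AE/AWB stay active, but its frames are not transmitted to the app.
public class PreviewRouter {
    private final Consumer<byte[]> cameraApp;     // hypothetical sink for preview frames
    private volatile byte[] latestSecondaryFrame; // retained internally, never displayed

    public PreviewRouter(Consumer<byte[]> cameraApp) {
        this.cameraApp = cameraApp;
    }

    /** Called for each frame returned by camera 1 (the main camera). */
    public void onMainCameraFrame(byte[] frame) {
        cameraApp.accept(frame); // displayed in the preview interface
    }

    /** Called for each frame returned by camera 2 (the secondary camera). */
    public void onSecondaryCameraFrame(byte[] frame) {
        latestSecondaryFrame = frame; // not transmitted to the camera application
    }

    public byte[] latestSecondaryFrame() {
        return latestSecondaryFrame;
    }
}
```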
- the preview interface 102 includes a control 103 , and the user can click the control 103 to start photographing and imaging.
- after receiving the user's click operation on the control 103, the mobile phone 100 generates a photo based on the images collected by the cameras.
- the mobile phone 100 displays the thumbnail 104 of the photo.
- the mobile phone 100 performs a series of internal processes.
- the mobile phone 100 receives the user's click operation on the control 103 .
- the camera application sends an imaging request to the camera HAL.
- the camera HAL sends an imaging request to the decision module.
- the decision-making module determines the output frame rate of the secondary camera according to the imaging request and the output frame rate of the main camera, that is, the output frame rate of the secondary camera is made equal to the output frame rate of the main camera. For example, if the output frame rate of the main camera is the first frame rate, then the output frame rate of the secondary camera is determined to be the first frame rate.
- the decision-making module can call the frame rate decision sub-module (not shown in Figure 3) to specifically determine the output frame rate of the secondary camera.
- the image output frame rate of the main camera during the photo imaging stage may be different from the image output frame rate during the photo preview stage.
- for example, during the photo preview stage, the decision-making module determines the output frame rate of the main camera as the first frame rate and the output frame rate of the secondary camera as the second frame rate, where the second frame rate is smaller than the first frame rate; during the photo imaging stage, the decision-making module determines the output frame rate of the main camera as a third frame rate, and then determines that the output frame rate of the secondary camera is also the third frame rate.
- the camera HAL controls camera 1 (the main camera) to output images at the first frame rate, and controls camera 2 (the secondary camera) to output images at the first frame rate.
- camera 1 (main camera) returns images to camera HAL at a first frame rate
- camera 2 (sub-camera) returns images to camera HAL at a first frame rate.
- the camera HAL synthesizes the image returned by camera 1 (main camera) and the image returned by camera 2 (secondary camera), and transmits the synthesized image to the camera application.
- the camera application saves the obtained composite image as a photo.
- the mobile phone 100 displays the thumbnail 104 of the photo.
- the main camera and the secondary camera output images at the same frame rate, and photos are generated based on the images output by the main camera and the secondary camera.
- the images output by the main camera and the secondary camera are combined according to the set rules, so that shooting effects in different scenes can be achieved.
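- The imaging stage could be sketched as follows (hypothetical types; the application does not specify a particular fusion algorithm, so the combining step is only a placeholder): the secondary camera is raised to the first frame rate, then one frame from each camera is combined into a photo.

```java
// Illustrative imaging-stage flow with hypothetical interfaces: the secondary
// camera is raised to the main camera's frame rate, then one frame from each
// camera is combined into a single photo.
public class ImagingStage {

    interface CameraControl {
        void setOutputFps(int fps);
        byte[] nextFrame();
    }

    interface Fusion {
        byte[] combine(byte[] mainFrame, byte[] secondaryFrame);
    }

    static byte[] capturePhoto(CameraControl mainCamera, CameraControl secondaryCamera,
                               Fusion fusion, int firstFrameRate) {
        // Raise the secondary camera from the second frame rate to the first frame rate.
        secondaryCamera.setOutputFps(firstFrameRate);

        byte[] mainFrame = mainCamera.nextFrame();
        byte[] secondaryFrame = secondaryCamera.nextFrame();
        return fusion.combine(mainFrame, secondaryFrame); // saved by the camera app as the photo
    }
}
```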
- the camera application sends an imaging completion request to the camera HAL.
- the camera HAL sends an imaging completion request to the decision module.
- the decision-making module (specifically, the frame rate decision sub-module) determines that the output frame rate of the secondary camera is to be restored to the second frame rate.
- the decision-making module notifies the camera HAL that the image frame rate of the main camera is the first frame rate, and the image frame rate of the secondary camera is the second frame rate.
- the camera HAL controls camera 1 (the main camera) to output images at the first frame rate, and controls camera 2 (the secondary camera) to output images at the second frame rate.
- the camera HAL transmits the image returned by camera 1 (main camera) to the camera application.
- the camera application displays the acquired images, that is, the images collected by camera 1 (the main camera) are displayed on the preview interface.
- in this way, after the photo imaging is completed, the main camera outputs images at the first frame rate, and the images output by the main camera are transmitted to the camera application to generate the preview interface; the secondary camera outputs images at the second frame rate, and the images output by the secondary camera are not transmitted to the camera application.
- the main camera outputs images at a normal frame rate to ensure the image quality of the preview interface; the secondary camera outputs images at a second frame rate smaller than the first frame rate, which can reduce the power consumption of the mobile phone.
- in summary, the method for controlling the frame rate of a camera provided by the embodiment of the present application controls the output frame rate of the secondary camera separately from the output frame rate of the main camera: in the photo preview stage, the output frame rate of the secondary camera is smaller than that of the main camera to reduce the power consumption of the mobile phone; when photographing and imaging is started, the frame rate of the secondary camera is raised to the same frame rate as the main camera to improve the shooting effect in special scenes. In this way, the power consumption of multi-camera combined imaging is reduced without affecting the shooting effect.
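- The full frame-rate cycle summarized above might be pictured as a small state holder in the frame rate decision sub-module (a hypothetical sketch, not the application's implementation): the secondary camera previews at the second frame rate, is raised to the first frame rate on an imaging request, and is restored to the second frame rate once imaging completes.

```java
// Illustrative state cycle for the secondary camera's output frame rate:
// preview (second frame rate) -> imaging (first frame rate) -> preview again.
public class SecondaryFrameRateState {
    private final int firstFrameRate;  // main camera's rate, e.g. 30 fps
    private final int secondFrameRate; // lower preview rate, e.g. 15 fps
    private int currentSecondaryFps;

    public SecondaryFrameRateState(int firstFrameRate, int secondFrameRate) {
        this.firstFrameRate = firstFrameRate;
        this.secondFrameRate = secondFrameRate;
        this.currentSecondaryFps = secondFrameRate; // photo preview stage
    }

    /** Imaging request received: raise the secondary camera to the first frame rate. */
    public int onImagingRequest() {
        currentSecondaryFps = firstFrameRate;
        return currentSecondaryFps;
    }

    /** Imaging completed: restore the secondary camera to the second frame rate. */
    public int onImagingComplete() {
        currentSecondaryFps = secondFrameRate;
        return currentSecondaryFps;
    }

    public int currentSecondaryFps() {
        return currentSecondaryFps;
    }
}
```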
- FIG. 5 is a message interaction flow chart of a method for controlling a camera frame rate provided by an embodiment of the present application. As shown in Figure 5, the method may include:
- the camera application receives the user's operation to start the camera application.
- the user can start the camera application by clicking the icon of the camera application.
- users can launch the camera app through other means.
- users can start the camera application through voice, gestures, etc.
- the input device (such as a touch screen, microphone, camera, etc.) of the mobile phone detects the user's operation of starting the camera application and can generate a first message.
- the first message is used to indicate that the user's operation of starting the camera application is received.
- the processor distributes the first message to the camera application.
- the camera application receives the first message, that is, the user's operation to start the camera application is received.
- the camera application sends a preview request message to the camera HAL.
- the camera application receives the operation of starting the camera application and sends a preview request message to the camera HAL.
- the camera HAL sends a preview request message to the decision-making module.
- the decision-making module receives the preview request message, determines to turn on camera one and camera two, and determines the initial frame rate of the camera.
- Camera configuration information includes the number of cameras on the phone, the type of each camera (for example, wide-angle camera, telephoto camera, ultra-wide-angle camera, black-and-white camera, macro camera or CV camera), the hardware parameters of each camera (such as the output frame rates supported by the camera), and so on.
- the decision-making module determines to turn on all cameras on the mobile phone.
- the mobile phone includes two cameras, camera one and camera two.
- the decision module determines to turn on the camera of the phone, that is, it determines to turn on camera one and camera two.
- the implementation logic is simple.
- the camera configuration information is stored in the mobile phone, and the decision-making module determines to turn on camera one and camera two based on the camera configuration information and the preset policy.
- the camera HAL receives the preview request message from the camera application and obtains the camera configuration information of the mobile phone.
- when the camera HAL sends the preview request message to the decision-making module, it also sends the camera configuration information to the decision-making module.
- the decision-making module determines to enable camera one and camera two based on the camera configuration information and preset policy. This implementation method can obtain the configuration of the camera on the mobile phone in real time, enable camera combinations more accurately, and improve the quality of taking pictures.
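A rough idea of what the camera configuration information and a preset policy could look like in code is sketched below. The structures and the wide-angle/CV pairing rule are assumptions for illustration (taken from the example in which camera one is a wide-angle camera and camera two is a CV camera); the real configuration format and policy are not limited to this, and the sketch assumes the phone reports at least two cameras.

```cpp
#include <string>
#include <vector>

// Illustrative structures only; the real configuration format is not
// limited to these fields.
struct CameraInfo {
    int id;
    std::string type;               // e.g. "wide", "tele", "mono", "macro", "cv"
    std::vector<int> supportedFps;  // output frame rates supported by the camera
};

struct CameraCombination {
    int mainId;
    int secondaryId;
};

// One possible preset policy: pair a wide-angle main camera with a CV
// secondary camera when both exist, otherwise fall back to the first two
// entries. Assumes the configuration lists at least two cameras.
CameraCombination selectCameras(const std::vector<CameraInfo>& config) {
    CameraCombination combo{config.at(0).id, config.at(1).id};
    for (const CameraInfo& cam : config) {
        if (cam.type == "wide") combo.mainId = cam.id;
        if (cam.type == "cv")   combo.secondaryId = cam.id;
    }
    return combo;
}
```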
- the decision-making module can use the strategy that can be obtained in conventional technology to determine the camera to be turned on.
- the decision-making module determines to turn on camera one and camera two, where camera one is the main camera and camera two is the secondary camera.
- camera one is a wide-angle camera
- camera two is a CV camera.
- the decision-making module can use a frame rate policy available in conventional technology to determine the initial frame rate of camera one and camera two. For example, based on the output frame rates supported by camera one and camera two respectively, it determines the output frame rates supported by both cameras. If camera one and camera two both support multiple output frame rates, one of them can be selected as the initial frame rate; for example, the largest output frame rate supported by both camera one and camera two is selected; or a default output frame rate is preset, and if both camera one and camera two support that output frame rate, the initial frame rate is determined to be the default output frame rate. For example, the initial frame rate is the first frame rate.
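The selection rule described in the previous paragraph can be sketched as follows. This is an illustration only: chooseInitialFrameRate is a hypothetical helper, and the rule of preferring a preset default and otherwise taking the largest commonly supported rate mirrors the examples above.

```cpp
#include <algorithm>
#include <iterator>
#include <optional>
#include <vector>

// Hypothetical helper: pick the initial output frame rate shared by both
// cameras. Prefer a preset default if both cameras support it; otherwise
// take the largest rate in the intersection of the two supported lists.
std::optional<int> chooseInitialFrameRate(std::vector<int> fpsCam1,
                                          std::vector<int> fpsCam2,
                                          int defaultFps = 30) {
    std::sort(fpsCam1.begin(), fpsCam1.end());
    std::sort(fpsCam2.begin(), fpsCam2.end());
    std::vector<int> common;
    std::set_intersection(fpsCam1.begin(), fpsCam1.end(),
                          fpsCam2.begin(), fpsCam2.end(),
                          std::back_inserter(common));
    if (common.empty()) return std::nullopt;      // no shared rate
    if (std::find(common.begin(), common.end(), defaultFps) != common.end())
        return defaultFps;                        // both support the default
    return common.back();                         // otherwise the largest shared rate
}
```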
- the decision-making module sends information about the camera combination selected to be turned on and the initial frame rate of the camera to the camera HAL.
- optionally, the decision-making module sends the information about the camera combination selected to be turned on and the initial frame rate of the cameras to the camera HAL.
- the camera HAL receives the information about the camera combination selected to be turned on, notifies camera one to start collecting images at the initial frame rate (the first frame rate), and notifies camera two to start collecting images at the initial frame rate (the first frame rate).
- the decision-making module determines the image output frame rate of camera one and camera two.
- in step S504, after the decision-making module determines to turn on camera one and camera two and determines the initial frame rate of the cameras, the decision-making module also determines the output frame rates of camera one and camera two. It should be noted that the embodiments of this application do not limit the order in which S505 and S508 are executed; after the decision-making module determines to turn on camera one and camera two and determines the initial frame rate of the cameras, S505 can be executed first to send the information about the selected camera combination and the initial frame rate of the cameras to the camera HAL, and then S508 can be executed to determine the output frame rates of camera one and camera two; alternatively, S508 can be executed first and S505 afterwards.
- the frame rate decision sub-module in the decision-making module determines the picture frame rates of camera one and camera two.
- the decision-making module sends information about the enabled camera combination to the frame rate decision sub-module; the information about the camera combination includes the identification of a group of cameras (camera one and camera two).
- the decision-making module also sends the initial frame rate of the camera to the frame rate decision sub-module.
- the frame rate decision sub-module receives the information about the camera combination selected to be turned on and the initial frame rate of the cameras, determines that the current stage is the photo preview stage, and then determines the output frame rate of camera one (main camera) to be the initial frame rate (the first frame rate) and the output frame rate of camera two (secondary camera) to be a value smaller than the first frame rate.
- the image output frame rate of the second camera is determined to be the second frame rate, where the second frame rate is smaller than the first frame rate.
- the second frame rate is half the first frame rate.
- the first frame rate is 30fps
- the second frame rate is 15fps.
- the value of the second frame rate is greater than 0 rather than equal to 0; that is, the secondary camera keeps outputting images instead of stopping. Compared with a method in which the secondary camera outputs no images in the preview stage, because the secondary camera keeps outputting images (only at a lower frame rate than the main camera), the AF, AE and AWB functions of the secondary camera remain in a low-power working state; this allows a quick response when the frame rate of the secondary camera is adjusted upward (for example, to the first frame rate) in the imaging stage, avoiding any impact on imaging speed or quality.
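The benefit described here, that a running secondary stream only needs a rate update rather than a cold start, can be illustrated with the toy sketch below. All names are hypothetical and no timing figures are claimed; the point is only that the warm path skips sensor bring-up and 3A re-convergence.

```cpp
// Toy illustration: raising the secondary camera's rate is a warm update
// when its stream has been kept running, and a cold start otherwise.
enum class RampUp { ColdStart, WarmUpdate };

struct SecondaryCameraState {
    bool streaming = false;
    int fps = 0;
};

RampUp raiseToImagingRate(SecondaryCameraState& cam, int imagingFps) {
    const RampUp path = cam.streaming ? RampUp::WarmUpdate : RampUp::ColdStart;
    cam.streaming = true;
    cam.fps = imagingFps;
    // WarmUpdate skips sensor bring-up and AF/AE/AWB re-convergence, which is
    // why the description keeps the second frame rate above zero in preview.
    return path;
}
```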
- the frame rate decision sub-module returns the image output frame rate of camera one (main camera) to the decision-making module as the first frame rate, and returns the image output frame rate of camera two (secondary camera) as the second frame rate.
- the decision-making module sends the image output frame rate of camera 1 to the camera HAL as the first frame rate, and sends the image output frame rate of camera 2 to the camera HAL as the second frame rate.
- optionally, since the decision-making module determines that the output frame rate of camera one is the first frame rate and it has not changed, the output frame rate of camera one does not need to be sent to the camera HAL.
- if the camera HAL does not receive an output frame rate for camera one, it determines that the output frame rate of camera one remains unchanged.
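One way to express the convention that a missing frame-rate value means "unchanged" is with optional fields, as in the sketch below. The message layout is an assumption made for illustration, not the actual format exchanged between the decision-making module and the camera HAL.

```cpp
#include <optional>

// Sketch of the "no value means unchanged" convention.
struct FrameRateUpdate {
    std::optional<int> mainFps;       // empty: main camera rate unchanged
    std::optional<int> secondaryFps;  // empty: secondary camera rate unchanged
};

void applyUpdate(const FrameRateUpdate& update, int& mainFps, int& secondaryFps) {
    if (update.mainFps) mainFps = *update.mainFps;
    if (update.secondaryFps) secondaryFps = *update.secondaryFps;
}
```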
- optionally, in other embodiments, after the decision-making module determines in step S504 to turn on camera one and camera two and determines the initial frame rate of the cameras, it does not yet send the information about the selected camera combination and the initial frame rate of the cameras to the camera HAL; that is, the above steps S505, S506 and S507 are not performed.
- after the decision-making module determines the output frame rates of camera one and camera two in S508, the decision-making module sends to the camera HAL the information about the selected camera combination, the output frame rate of camera one (the first frame rate), and the output frame rate of camera two (the second frame rate).
- the camera HAL notifies camera one to collect images at the first frame rate, and notifies camera two to collect images at the second frame rate.
- the camera HAL sends the first value of the exposure parameter to camera one, and sends the second value of the exposure parameter to camera two; wherein the second value is greater than the first value.
- the exposure parameter is used to indicate the exposure duration. The longer the exposure duration of the camera, the lower the frequency of image collection; the exposure duration is inversely proportional to the frequency of image collection. If the second value of the exposure parameter is twice the first value of the exposure parameter, then the frequency at which camera 2 collects images (second frame rate) is half of the frequency at which camera 1 collects images (first frame rate).
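Under the simplified model stated above, where exposure duration is inversely proportional to the capture frequency, an exposure value can be derived from a target frame rate as in the sketch below. Real sensors also constrain exposure by readout time and lighting, which this sketch ignores; the function name and the microsecond unit are assumptions, and fps is assumed to be positive.

```cpp
// Simplified model from the text: exposure duration is inversely
// proportional to the capture frequency, so doubling the exposure value
// halves the frame rate.
constexpr long kMicrosPerSecond = 1'000'000;

long exposureForFrameRate(int fps) {
    return kMicrosPerSecond / fps;  // 30 fps -> ~33333 us, 15 fps -> ~66666 us
}
// With a first frame rate of 30 fps and a second frame rate of 15 fps, the
// second exposure value is twice the first, matching the 2x relationship above.
```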
- the camera HAL sends an image request message to camera one at the first frequency, and the camera HAL sends an image request message to camera two at the second frequency.
- the image request message is used to request the camera to output an image; each time the camera receives an image request message, it outputs one frame of image to the camera HAL.
- the first frequency is equal to the first frame rate
- the second frequency is equal to the second frame rate.
- Camera one outputs images to camera HAL at the first frequency
- camera two outputs images to camera HAL at the second frequency.
- Camera 1 collects images at the first frame rate and outputs images to the camera HAL at the first frequency, that is, the image output frame rate of camera 1 is the first frame rate.
- Camera two collects images at a second frame rate, and outputs images to the camera HAL at a second frequency, that is, the image output frame rate of camera two is the second frame rate.
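A minimal sketch of this request-driven output model, in which one image request yields one frame and the request frequency therefore sets the output frame rate, might look like the following. The transport behind sendRequest is left abstract, and a simple sleep-based pacing loop stands in for whatever scheduling the camera HAL actually uses.

```cpp
#include <chrono>
#include <functional>
#include <thread>

// Sketch only: issue one image request per frame interval. Each request is
// answered with exactly one frame, so the request frequency fixes the
// camera's output frame rate. fps is assumed to be positive.
void requestLoop(int fps, int totalFrames,
                 const std::function<void()>& sendRequest) {
    const auto interval = std::chrono::microseconds(1'000'000 / fps);
    for (int i = 0; i < totalFrames; ++i) {
        sendRequest();                        // the camera returns one frame per request
        std::this_thread::sleep_for(interval);
    }
}
// Example (placeholder callables): requestLoop(30, 300, askCameraOne) paces
// camera one at 30 fps, while requestLoop(15, 150, askCameraTwo) paces
// camera two at 15 fps.
```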
- the camera HAL transmits the image output by the camera to the camera application.
- the camera HAL receives the images output by camera one and camera two respectively, and transmits the image output by camera one to the camera application.
- the camera application displays the acquired image on the preview interface.
- the camera application displays the acquired image, that is, the image collected by camera 1 (main camera) is displayed on the preview interface.
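The preview-stage routing rule, forwarding only the main camera's frames to the camera application, can be summarized in code as below. The types and the callback are illustrative; what the HAL does internally with the secondary camera's frames (beyond keeping its 3A running) is not specified by this description.

```cpp
#include <functional>

// Illustrative routing rule for the photo preview stage: only the main
// camera's frames are forwarded to the camera application; the secondary
// camera's frames stay inside the HAL.
enum class Source { MainCamera, SecondaryCamera };

struct FrameRef {
    Source source;
    // image payload omitted in this sketch
};

void onFrame(const FrameRef& frame,
             const std::function<void(const FrameRef&)>& sendToCameraApp) {
    if (frame.source == Source::MainCamera) {
        sendToCameraApp(frame);   // shown on the preview interface
    }
    // Secondary-camera frames are not forwarded during preview.
}
```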
- the method for controlling the camera frame rate provided by the embodiment of the present application also includes:
- the camera application receives the user's operation to initiate photography and imaging.
- the user can start taking pictures and imaging.
- the preview interface 102 includes a control 103 , and the user can click the control 103 to start photographing and imaging.
- the camera application receives the user's click operation on the control 103, that is, receives the user's operation to start taking pictures and imaging.
- the user can initiate photographing and imaging through other methods.
- users can start photography and imaging through voice, gestures, etc.
- a second message may be generated.
- the second message is used to indicate that the user's operation of initiating photographing and imaging is received.
- the processor distributes the second message to the camera application.
- the camera application receives the second message, that is, it receives the user's operation of initiating photography and imaging.
- the camera application sends an imaging request message to the camera HAL.
- the camera HAL sends an imaging request message to the decision-making module.
- the decision-making module receives the imaging request message, determines the output frame rate of camera one (main camera) to be the first frame rate, and determines the output frame rate of camera two (secondary camera) to be the first frame rate.
- the frame rate decision sub-module in the decision-making module determines the picture frame rate of camera one (main camera) and camera two (secondary camera). For example, the decision module sends an imaging message to the frame rate decision sub-module.
- the frame rate decision sub-module receives the imaging message and determines that it is currently in the photo imaging stage, and then determines the image frame rate of camera one (main camera) and camera two (secondary camera) as the initial frame rate, that is, the first frame rate.
- the frame rate decision sub-module returns to the decision-making module that the output frame rates of camera one (main camera) and camera two (secondary camera) are the first frame rate.
- restoring the image frame rate of the secondary camera to the same value as the main camera allows the secondary camera to output images at a higher frequency, which can be used for combined imaging and improve photo quality.
- the decision-making module sends the image output frame rate of camera 1 to the camera HAL as the first frame rate, and sends the image output frame rate of camera 2 as the first frame rate to the camera HAL.
- optionally, since the decision-making module determines that the output frame rate of camera one is the first frame rate and it has not changed, the output frame rate of camera one does not need to be sent to the camera HAL.
- if the camera HAL does not receive an output frame rate for camera one, it determines that the output frame rate of camera one remains unchanged.
- the camera HAL notifies camera one that its output frame rate is the first frame rate, and notifies camera two that its output frame rate is the first frame rate.
- the camera HAL sends the same exposure parameters to camera one and camera two, so that the image output frame rates of camera one and camera two can be equal.
- the camera HAL sends image request messages to camera one and camera two respectively at the first frequency.
- the first frequency is equal to the first frame rate.
- Camera one outputs the image to the camera HAL at the first frequency
- camera two outputs the image to the camera HAL at the first frequency
- Camera 1 collects images at the first frame rate and outputs images to the camera HAL at the first frequency, that is, the image output frame rate of camera 1 is the first frame rate.
- Camera 2 collects images at the first frame rate and outputs images to the camera HAL at the first frequency, that is, the image output frame rate of camera 2 is the first frame rate.
- the camera HAL synthesizes the image output by the first camera and the image output by the second camera to generate a composite image.
- the camera HAL can synthesize the image output by the first camera and the image output by the second camera using methods available in conventional technology.
- the camera HAL can generate a composite image from one frame of image output by camera one and one frame of image output by camera two.
- the camera HAL can generate a composite image from M frames of images output by camera one and N frames of images output by camera two, where M>1 and N>1; optionally, M and N may be equal or unequal.
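Because the description leaves the synthesis algorithm to conventional technology, the sketch below uses a plain pixel average purely to show the M-frame/N-frame interface. It assumes non-empty, same-sized, 8-bit grayscale buffers; it is not the fusion method actually used.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Placeholder fusion: a plain pixel average of M main-camera frames and
// N secondary-camera frames. It only demonstrates the M/N interface.
using Frame = std::vector<uint8_t>;

Frame fuse(const std::vector<Frame>& mainFrames,
           const std::vector<Frame>& secondaryFrames) {
    const std::size_t pixels = mainFrames.front().size();
    const std::size_t total = mainFrames.size() + secondaryFrames.size();
    std::vector<uint32_t> acc(pixels, 0);
    for (const Frame& f : mainFrames)
        for (std::size_t i = 0; i < pixels; ++i) acc[i] += f[i];
    for (const Frame& f : secondaryFrames)
        for (std::size_t i = 0; i < pixels; ++i) acc[i] += f[i];
    Frame out(pixels, 0);
    for (std::size_t i = 0; i < pixels; ++i)
        out[i] = static_cast<uint8_t>(acc[i] / total);
    return out;
}
```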
- the camera HAL transmits the composite image to the camera application.
- the camera application receives the composite image and saves the composite image as a photo.
- the camera application receives the composite image and determines that the photo imaging is completed.
- the camera application can display thumbnails of the photos in the preview interface.
- users can also click on a photo's thumbnail to view the resulting photo.
- the method for controlling the camera frame rate provided by the embodiment of the present application also includes:
- the camera application sends an imaging end message to the camera HAL.
- the camera application receives the composite image and sends an imaging end message to the camera HAL.
- the camera HAL sends the imaging end message to the decision-making module.
- the decision-making module receives the imaging end message, determines the output frame rate of camera one (main camera) to be the first frame rate, and determines the output frame rate of camera two (secondary camera) to be the second frame rate.
- the frame rate decision sub-module in the decision-making module determines the picture frame rates of camera one and camera two.
- the decision module sends an imaging end message to the frame rate decision sub-module.
- when the frame rate decision sub-module receives the imaging end message, it determines that the current stage is the photo preview stage, determines the output frame rate of camera one (main camera) to be the first frame rate, and restores the output frame rate of camera two (secondary camera) to the second frame rate.
- the frame rate decision sub-module returns to the decision-making module that the output frame rate of camera one (main camera) is the first frame rate and the output frame rate of camera two (secondary camera) is the second frame rate.
- after imaging is completed, the system returns to the photo preview stage.
- the images output by the secondary camera are not transmitted to the camera application for display.
- the output frame rate of the secondary camera is therefore restored to the second frame rate, which reduces power consumption in the photo preview stage.
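The stage transitions around imaging can be tied together as a small controller, sketched below with the same hypothetical decideFrameRates helper as earlier. The step numbers in the comments refer to S518 and S528 above; the class itself is an illustration, not part of the described implementation.

```cpp
#include <algorithm>

// Self-contained restatement of the earlier sketch, plus a controller for
// the imaging request / imaging end transitions. Names are hypothetical.
enum class Stage { PhotoPreview, PhotoImaging };
struct FrameRatePlan { int mainFps; int secondaryFps; };

FrameRatePlan decideFrameRates(Stage stage, int firstFrameRate) {
    if (stage == Stage::PhotoImaging) return {firstFrameRate, firstFrameRate};
    return {firstFrameRate, std::max(1, firstFrameRate / 2)};
}

struct FrameRateController {
    int firstFrameRate = 30;
    Stage stage = Stage::PhotoPreview;

    // Corresponds to S518: the imaging request raises the secondary camera
    // to the main camera's rate.
    FrameRatePlan onImagingRequest() {
        stage = Stage::PhotoImaging;
        return decideFrameRates(stage, firstFrameRate);
    }
    // Corresponds to S528: the imaging end message restores the secondary
    // camera to the lower preview rate.
    FrameRatePlan onImagingEnd() {
        stage = Stage::PhotoPreview;
        return decideFrameRates(stage, firstFrameRate);
    }
};
```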
- the decision-making module sends the image output frame rate of camera 1 to the camera HAL as the first frame rate, and sends the image output frame rate of camera 2 to the camera HAL as the second frame rate.
- optionally, since the decision-making module determines that the output frame rate of camera one is the first frame rate and it has not changed, the output frame rate of camera one does not need to be sent to the camera HAL.
- if the camera HAL does not receive an output frame rate for camera one, it determines that the output frame rate of camera one remains unchanged.
- the camera HAL notifies camera one to collect images at the first frame rate, and notifies camera two to collect images at the second frame rate.
- the camera HAL sends an image request message to camera one at the first frequency, and the camera HAL sends an image request message to camera two at the second frequency.
- Camera one outputs images to camera HAL at the first frequency, and camera two outputs images to camera HAL at the second frequency.
- the camera HAL transmits the image output by the camera to the camera application.
- the camera HAL receives the images output by camera one and camera two respectively, and transmits the image output by camera one to the camera application.
- the camera application displays the acquired image on the preview interface.
- the embodiments of this application use photographing as an example to describe the method for controlling the camera frame rate provided by this application. It can be understood that the method for controlling the camera frame rate provided by the embodiments of the present application is also applicable to the video recording function; for the specific implementation, reference may be made to the photographing function, and the embodiments of the present application will not describe it in detail.
- Some embodiments of the present application provide an electronic device, which may include a memory, multiple cameras, and one or more processors.
- the camera, memory and processor are coupled.
- the memory is used to store computer program code, which includes computer instructions.
- the processor executes computer instructions, the electronic device may perform various functions or steps performed by the electronic device in the above method embodiments.
- the structure of the electronic device may refer to the structure of the electronic device 100 shown in FIG. 1 .
- an embodiment of the present application also provides a chip system (for example, a system on a chip (SoC)); as shown in FIG. 8, the chip system includes at least one processor 801 and at least one interface circuit 802.
- the processor 801 and the interface circuit 802 may be interconnected by wires.
- interface circuitry 802 may be used to receive signals from other devices, such as memory of an electronic device.
- the interface circuit 802 may be used to send signals to other devices (such as the processor 801 or a touch screen of an electronic device or a camera of an electronic device).
- the interface circuit 802 can read instructions stored in the memory and send the instructions to the processor 801 .
- when the instructions are executed by the processor 801, the chip system can be caused to execute the steps of each module of the HAL layer in the above embodiments.
- the chip system may also include other discrete devices, which are not specifically limited in the embodiments of this application.
- Embodiments of the present application also provide a computer-readable storage medium.
- the computer-readable storage medium includes computer instructions which, when run on the above-mentioned electronic device, cause the electronic device to perform each function or step performed by the electronic device in the above method embodiments.
- An embodiment of the present application also provides a computer program product.
- when the computer program product is run on an electronic device, it causes the electronic device to perform each function or step performed by the electronic device in the above method embodiments.
- the disclosed devices and methods can be implemented in other ways.
- the device embodiments described above are only illustrative.
- the division of modules or units is only a logical function division.
- there may be other division methods in actual implementation; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
- the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
- the units described as separate components may or may not be physically separated.
- the components shown as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed to multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
- the above integrated units can be implemented in the form of hardware or software functional units.
- the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
- the technical solutions of the embodiments of the present application, in essence, or the part contributing to the existing technology, or all or part of the technical solutions, can be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions to cause a device (which can be a microcontroller, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of this application.
- the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
Abstract
The present application provides a method for controlling a camera frame rate and an electronic device, and relates to the field of terminals. It is applied to scenarios in which multiple cameras are combined for imaging, with the output frame rates of the main camera and the secondary camera controlled separately. In the photo preview stage, the secondary camera outputs images at a lower output frame rate than the main camera to reduce power consumption; during photo imaging, the secondary camera is raised to the same output frame rate as the main camera to guarantee imaging quality. Power consumption during combined imaging with multiple cameras is thus reduced without affecting the shooting effect.
Description
本申请要求于2022年08月26日提交国家知识产权局、申请号为202211032087.1、发明名称为“一种控制摄像头帧率的方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
本申请涉及终端领域,尤其涉及一种控制摄像头帧率的方法及电子设备。
随着人们对成像质量要求越来越高,电子设备(例如,手机)上通常会集成多个摄像头(也称为镜头),比如广角镜头、长焦镜头、超广角镜头、黑白镜头、微距镜头、计算机视觉(computer vision,CV)镜头等。多个摄像头可以组合成像,以提升不同场景下的拍摄质量。在多个摄像头组合成像时,多个摄像头中一个为主摄像头(简称主摄),其余一个或多个为副摄像头(简称副摄)。主摄和副摄输出的图像按照设定的规则合成,就可以实现不同场景下的拍摄效果;但是,摄像头输出图像会带来功率损耗。如何在保证拍摄效果的同时降低功率损耗,是一个需要解决的问题。
发明内容
本申请提供一种控制摄像头帧率的方法及电子设备,可以在不影响拍摄效果的条件下降低多个摄像头组合成像时的功率损耗。
为达到上述目的,本申请采用如下技术方案:
第一方面,提供了一种电子设备,包括:处理器,显示屏,第一摄像头和第二摄像头;处理器上运行电子设备的操作系统,操作系统中安装有相机应用;其中,处理器用于接收指示接收到用户启动相机应用操作的第一消息;还用于响应于接收到第一消息,启动第一摄像头和第二摄像头;第一摄像头用于以第一帧率向处理器输出图像;第二摄像头用于以第二帧率向处理器输出图像,第二帧率小于第一帧率;处理器还用于将第一摄像头输出的图像传输至相机应用;显示屏用于显示相机应用的预览界面,预览界面包括第一摄像头输出的图像。
在该方案中,启动多个摄像头(摄像头一和摄像头二)进行组合成像时,分开控制摄像头一和摄像头二的出图帧率。在拍照预览阶段,仅将主摄采集的图像用于预览界面进行显示,副摄输出的图像不用于显示,副摄以比主摄低的出图帧率输出图像,可以降低拍照预览时的功率损耗。
根据第一方面,在一些实施方式中,处理器还用于在启动第一摄像头和第二摄像头之后,通知第一摄像头以第一帧率采集图像,通知第二摄像头以第二帧率采集图像。
在该方法中,处理器控制第一摄像头和第二摄像头采集图像的帧率。
根据第一方面,在一些实施方式中,处理器用于以第一频率向第一摄像头发送出图请求消息;还用于以第二频率向第二摄像头发送出图请求消息;其中,出图请求消息用于请求摄像头输出一帧图像,第一频率与第一帧率相等,第二频率与第二帧率相
等。
在该方法中,处理器每向摄像头发送一次出图请求消息,就可以触发摄像头向摄像头传输一帧图像。处理器以与第一帧率相等的第一频率向第一摄像头发送出图请求消息,就可以触发第一摄像头以第一帧率向处理器输出采集的图像。处理器以与第二帧率相等的第二频率向第二摄像头发送出图请求消息,就可以触发第二摄像头以第二帧率向处理器输出采集的图像。这样就可以实现第一摄像头以第一帧率向处理器输出图像;第二摄像头以第二帧率向处理器输出图像。
根据第一方面,在一些实施方式中,处理器还用于在通知第二摄像头以第二帧率采集图像之前,通知第二摄像头以初始帧率采集图像;其中初始帧率为该第一帧率。
根据第一方面,在一些实施方式中,处理器还用于在接收到第一消息之后,根据电子设备的摄像头配置以及预设策略确定启动第一摄像头和第二摄像头。
根据第一方面,在一些实施方式中,处理器还用于在接收到第一消息之后,获取电子设备的摄像头配置。这样,每次都可以获取最新的摄像头配置,选择合适的摄像头进行开启。
根据第一方面,在一些实施方式中,处理器,还用于接收第二消息,第二消息用于指示接收到用户在预览界面启动拍照成像的操作;处理器还用于响应于接收到第二消息,通知第二摄像头以第一帧率输出图像;第二摄像头还用于以第一帧率向处理器输出图像;处理器还用于将第一摄像头的输出图像和第二摄像头的输出图像合成照片;其中,第一摄像头以第一帧率输出图像。
在该方案中,在拍照成像时,副摄提高至与主摄相同的出图帧率输出图像,可以用于将第一摄像头的输出图像和第二摄像头的输出图像合成照片,保证成像质量。
其中,处理器具体用于:将第一摄像头输出的M帧图像和第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
在一些实施方式中,第二帧率大于0。比如,第二帧率为第一帧率的一半。第二帧率大于0,即在拍照预览阶段副摄保持输出图像,而不是停止输出图像;和副摄在预览阶段不输出图像的方法相比,副摄以较低帧率保持输出图像的方法,由于副摄一直在输出图像,只是出图帧率小于主摄,副摄的AF、AE和AWB等功能一直处于低功耗工作状态;待拍照成像阶段将副摄的出图帧率向上调整(比如调整至第一帧率)时,可以做到快速响应,避免影响成像速度或质量。
第二方面,提供一种控制摄像头帧率的方法,应用于电子设备,电子设备包括第一摄像头和第二摄像头,该方法包括:接收用户启动相机应用的操作;响应于启动相机应用的操作,启动第一摄像头和第二摄像头;控制第一摄像头以第一帧率输出图像;控制第二摄像头以第二帧率输出图像,第二帧率小于第一帧率;显示相机应用的预览界面,该预览界面包括第一摄像头输出的图像。
在该方法中,启动多个摄像头(摄像头一和摄像头二)进行组合成像时,分开控制摄像头一和摄像头二的出图帧率。在拍照预览阶段,仅将主摄采集的图像用于预览界面进行显示,副摄输出的图像不用于显示,副摄以比主摄低的出图帧率输出图像,可以降低拍照预览时的功率损耗。
根据第二方面,在一些实施方式中,在控制第二摄像头以第二帧率输出图像之前,
该方法还包括:控制第二摄像头以初始帧率输出图像;初始帧率为第一帧率。
根据第二方面,在一些实施方式中,该方法还包括:接收用户在预览界面启动拍照成像的操作;响应于启动拍照成像的操作,控制第二摄像头以第一帧率输出图像;将第一摄像头的输出图像和第二摄像头的输出图像合成照片;其中,第一摄像头以第一帧率输出图像。
在该方法中,在拍照成像时,副摄提高至与主摄相同的出图帧率输出图像,可以用于将第一摄像头的输出图像和第二摄像头的输出图像合成照片,保证成像质量。
根据第二方面,在一些实施方式中,将第一摄像头的输出图像和第二摄像头的输出图像合成照片之后,控制第二摄像头以第二帧率输出图像;其中,第一摄像头以第一帧率输出图像;显示相机应用的预览界面,预览界面包括第一摄像头输出的图像。
在该方法中,完成拍照成像后继续显示预览界面,预览界面仅显示第一摄像头采集的图像;将第二摄像头的出图帧率再次降到第二帧率,可以降低拍照预览阶段的功耗。
其中,将第一摄像头的输出图像和第二摄像头的输出图像合成照片,包括:将第一摄像头输出的M帧图像和第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
根据第二方面,在一些实施方式中,第二帧率大于0。比如,第二帧率为第一帧率的一半。第二帧率大于0,即在拍照预览阶段副摄保持输出图像,而不是停止输出图像;和副摄在预览阶段不输出图像的方法相比,副摄以较低帧率保持输出图像的方法,由于副摄一直在输出图像,只是出图帧率小于主摄,副摄的AF、AE和AWB等功能一直处于低功耗工作状态;待拍照成像阶段将副摄的出图帧率向上调整(比如调整至第一帧率)时,可以做到快速响应,避免影响成像速度或质量。
第三方面,提供一种控制摄像头帧率的方法,可以应用于芯片系统,该方法包括:接收第一消息,第一消息用于指示接收到用户启动相机应用的操作;响应于接收到第一消息,通知第一摄像头和第二摄像头启动;通知第一摄像头以第一帧率采集图像;通知第二摄像头以第二帧率采集图像,第二帧率小于第一帧率;接收述第一摄像头以述第一帧率输出的图像;接收第二摄像头以第二帧率输出的图像;将第一摄像头输出的图像传输至相机应用,使得相机应用根据第一摄像头输出的图像显示预览界面。
在该方法中,启动多个摄像头(摄像头一和摄像头二)进行组合成像时,分开控制摄像头一和摄像头二的出图帧率。在拍照预览阶段,仅将主摄采集的图像用于预览界面进行显示,副摄输出的图像不用于显示,副摄以比主摄低的出图帧率输出图像,可以降低拍照预览时的功率损耗。
根据第三方面,在一些实施方式中,在通知第一摄像头以第一帧率采集图像,通知第二摄像头以第二帧率采集图像之后,该方法还包括:以第一频率向第一摄像头发送出图请求消息,以第二频率向第二摄像头发送出图请求消息;其中,出图请求消息用于请求摄像头输出一帧图像,第一频率与第一帧率相等,第二频率与第二帧率相等。
在该方法中,处理器每向摄像头发送一次出图请求消息,就可以触发摄像头向摄像头传输一帧图像。处理器以与第一帧率相等的第一频率向第一摄像头发送出图请求消息,就可以触发第一摄像头以第一帧率向处理器输出采集的图像。处理器以与第二
帧率相等的第二频率向第二摄像头发送出图请求消息,就可以触发第二摄像头以第二帧率向处理器输出采集的图像。这样就可以实现第一摄像头以第一帧率向处理器输出图像;第二摄像头以第二帧率向处理器输出图像。
根据第三方面,在一些实施方式中,在通知第二摄像头以第二帧率采集图像之前,还通知第二摄像头以初始帧率采集图像;初始帧率为第一帧率。
根据第三方面,在一些实施方式中,在接收第一消息之后,获取摄像头配置;根据摄像头配置以及预设策略确定启动第一摄像头和第二摄像头。这样,每次都可以获取最新的摄像头配置,选择合适的摄像头进行开启。
根据第三方面,在一些实施方式中,该方法还包括:接收第二消息,第二消息用于指示接收到用户在预览界面启动拍照成像的操作;响应于接收到第二消息,通知第二摄像头以第一帧率采集图像;接收第二摄像头以第一帧率输出的图像;将第一摄像头的输出图像和第二摄像头的输出图像合成照片;其中,第一摄像头以第一帧率输出图像。
在该方法中,在拍照成像时,副摄提高至与主摄相同的出图帧率输出图像,可以用于将第一摄像头的输出图像和第二摄像头的输出图像合成照片,保证成像质量。
根据第三方面,在一些实施方式中,将第一摄像头的输出图像和第二摄像头的输出图像合成照片,包括:将第一摄像头输出的M帧图像和第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
根据第三方面,在一些实施方式中,通知第一摄像头以第一帧率采集图像,通知第二摄像头以第二帧率采集图像,包括:设置第一摄像头的曝光参数为第一值,设置第二摄像头的曝光参数为第二值,第二值大于第一值。
在一种实现方式中,第二帧率为第一帧率的一半,第二值为第一值的二倍。
第四方面,提供了一种电子设备,该电子设备具有实现上述第二方面所述的方法的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。该硬件或软件包括一个或多个与上述功能相对应的模块。
第五方面,提供了一种电子设备,包括:处理器;所述处理器用于与存储器耦合,并读取存储器中的指令之后,根据所述指令执行如上述第三方面中任一项所述的方法。
第六方面,提供了一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当其在计算机上运行时,使得计算机可以执行上述第二方面中任一项所述的方法。
第七方面,提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机可以执行上述第二方面中任一项所述的方法。
第八方面,提供了一种装置(例如,该装置可以是芯片系统),该装置包括处理器,用于支持电子设备实现上述第三方面中所涉及的功能。在一种可能的设计中,该装置还包括存储器,该存储器,用于保存电子设备必要的程序指令和数据。该装置是芯片系统时,可以由芯片构成,也可以包含芯片和其他分立器件。
其中,第四方面至第八方面中任一种设计方式所带来的技术效果可参见对应的第二方面或第三方面中不同设计方式所带来的技术效果,此处不再赘述。
图1为本申请实施例提供的一种电子设备的结构示意图;
图2为本申请实施例提供的一种电子设备的软件架构示意图;
图3为本申请实施例提供的控制摄像头帧率的方法中模块间交互示意图;
图4为本申请实施例提供的控制摄像头帧率的方法中模块间交互示意图;
图5为本申请实施例提供的控制摄像头帧率的方法的流程示意图;
图6为本申请实施例提供的控制摄像头帧率的方法的流程示意图;
图7为本申请实施例提供的控制摄像头帧率的方法的流程示意图;
图8为本申请实施例提供的一种芯片系统的示意图。
在本申请实施例的描述中,以下实施例中所使用的术语只是为了描述特定实施例的目的,而并非旨在作为对本申请的限制。如在本申请的说明书和所附权利要求书中所使用的那样,单数表达形式“一种”、“所述”、“上述”、“该”和“这一”旨在也包括例如“一个或多个”这种表达形式,除非其上下文中明确地有相反指示。还应当理解,在本申请以下各实施例中,“至少一个”、“一个或多个”是指一个或两个以上(包含两个)。术语“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系;例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A、B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。术语“连接”包括直接连接和间接连接,除非另外说明。“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
对于集成了多个摄像头的电子设备(例如,手机),用户启动相机应用进行拍照时,可以根据预设策略开启多个摄像头组合成像,以提升不同场景下的拍摄质量。比如,可以通过RGB摄像头和计算机视觉(computer view,CV)摄像头双路融合拍照实现在天空、山峰、绿植、人像等场景下的拍照效果提升。
一般来说,考虑到电子设备处理器的处理能力,功耗等方面因素,在拍照预览时,只有主摄输出的图像传输至相机应用进行显示,副摄输出的图像不传输至相机应用。拍照成像时,才将主摄和副摄输出的图像进行合成成像,实现拍照效果提升。
目前的实现方式中,在拍照预览时,副摄输出的图像虽然不用于显示,但是还以与主摄相同的出图帧率输出图像。出图帧率即摄像头单位时长内输出图像的数量,一
般用帧每秒(fps)来表示,比如30fps。这种实现方式,主摄和副摄的出图帧率相同,主摄和副摄的出图帧率是绑定设置的,控制简单;但是副摄以一个较高帧率出图而不传输至相机应用,带来了不必要的功率损耗。
本申请实施例提供一种控制摄像头帧率的方法,在拍照预览时,分别控制主摄和副摄的出图帧率,控制副摄以较低的出图帧率输出图像,降低功率损耗。拍照成像时,提高副摄的出图帧率至与主摄出图帧率相同,实现特殊场景下拍照效果提升。
本申请实施例提供的方法可以应用于包括显示屏的电子设备。上述电子设备可以包括手机、平板电脑、笔记本电脑、个人电脑(personal computer,PC)、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、智能家居设备(比如,智能电视、智慧屏、大屏、智能音箱、智能空调等)、个人数字助理(personal digital assistant,PDA)、可穿戴设备(比如,智能手表、智能手环等)、车载设备、虚拟现实设备等,本申请实施例对此不做任何限制。
图1为本申请实施例提供的一种电子设备100的结构示意图。如图1所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。
其中,传感器模块180可以包括压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,接近光传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
可以理解的是,本实施例示意的结构并不构成对电子设备100的具体限定。在另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路
(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。例如,在本申请实施例中,处理器110可以通过执行存储在内部存储器121中的指令,内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
可以理解的是,本实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。
无线通信模块160可以提供应用在电子设备100上的包括WLAN(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。耳机接口170D用于连接有线耳机。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。
SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。该显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),发光二极管(light-emitting diode,LED),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。本申请实施例中,显示屏194可以用于显示人机交互界面,拍照预览界面等。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。ISP用于处理摄像头193反馈的数据。摄像头193用于捕获静态图像或视频。数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
摄像头193可以包括1~N个。例如电子设备可以包括广角摄像头、超广角摄像头、长焦摄像头、黑白摄像头、微距摄像头、CV摄像头、景深摄像头等,本申请不做限定。在拍照时,多个摄像头可以根据不同场景,按照预设规则组合成像,以提升不同场景下的拍摄质量。多个摄像头中包括一个主摄,包括一个或多个副摄。以下实施例中,均以一个主摄和一个副摄为例进行介绍。可以理解的,当包括更多副摄时,其余副摄的执行逻辑可以参考下述实施例中副摄的执行逻辑。
以下实施例中的方法均可以在具有上述硬件结构的电子设备100中实现。
在本申请实施例中,上述电子设备是可以运行操作系统,安装应用程序的电子设备。可选地,电子设备运行的操作系统可以是系统,系统,系统等。例如,上述电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本发明实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过接口通信。在一些实施例中,Android系统可以包括应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,硬件抽象层(hardware abstraction layer,HAL)以及内核层。需要说明的是,本申请实施例以Android系统举例来说明,在其他操作系统中(例如鸿蒙系统,IOS系统等),只要各个功能模块实现的功能和本申请的实施例类似也能实现本申请的方案。
其中,应用程序层可以包括一系列应用程序包。
如图2所示,应用程序包可以包括相机应用,图库,日历,通话,地图,导航,
无线局域网(wireless local area networks,WLAN),蓝牙,音乐,视频,短信息、设置等应用程序。当然,应用程序层还可以包括其他应用程序包,例如支付应用,购物应用、银行应用或聊天应用等,本申请不做限定。
在本申请实施例中,应用程序层中可以安装具有拍摄功能的应用,例如,相机应用。相机应用具有拍照和摄像的功能。当然,其他应用需要使用拍摄功能时,也可以调用相机应用实现拍摄功能。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。例如可以包括活动管理器、窗口管理器,内容提供器,视图系统,资源管理器,通知管理器和相机服务(Camera Service)等,本申请实施例对此不做任何限制。
其中,Camera Service可以在电子设备开机阶段启动。Camera Service在运行过程中可以与硬件抽象层(HAL)中的相机HAL(Camera HAL)交互。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如OpenGL ES),2D图形引擎(例如SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
OpenGL ES用于实现三维图形绘图,图像渲染,合成,和图层处理等。
SGL是2D绘图的绘图引擎。
安卓运行时(Android Runtime)包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
HAL层是对Linux内核驱动程序的封装,向上提供接口,屏蔽底层硬件的实现细节。
HAL层中可以包括Wi-Fi HAL,音频(audio)HAL,相机HAL(Camera HAL),决策模块等。
其中,相机HAL是摄像头(Camera)的核心软件框架,负责与电子设备中实现拍摄功能的硬件设备(例如摄像头)进行交互。相机HAL一方面隐藏了相关硬件设备的实现细节(例如具体的图像处理算法),另一方面可向Android系统提供调用相关硬件设备的接口。
决策模块用于适配打开相机应用的初始化阶段逻辑(例如,确定打开相机应用时启动的摄像头,确定摄像头的出图帧率)、拍照预览阶段逻辑(例如,确定拍照预览时主摄和副摄的出图帧率)和拍照成像阶段逻辑(例如,确定拍照成像时主摄和副摄的出图帧率)等。在一种示例中,决策模块中包括帧率决策子模块,用于确定摄像头
(比如副摄)的出图帧率。在另一些实施例中,决策模块也可以设置于相机HAL中。
内核层是硬件和软件之间的层。内核层可以包含显示驱动,摄像头驱动,音频驱动,传感器驱动等。其中,摄像头驱动是Camera器件的驱动层,主要负责和硬件的交互。
硬件层包括显示器、摄像头等。摄像头例如可以包括广角摄像头(也可以称为广角摄像头模组或广角镜头)、长焦摄像头(也可以称为长焦摄像头模组或长焦镜头)、超广角摄像头(也可以称为超广角摄像头模组或超广角镜头)、黑白摄像头(也可以称为黑白摄像头模组或黑白镜头)、微距摄像头(也可以称为微距摄像头模组或微距镜头)、计算机视觉(computer vision,CV)摄像头(也可以称为CV摄像头模组或CV镜头)等。
下面以电子设备为手机为例,对本申请实施例提供的控制摄像头帧率的方法进行详细说明。
用户可以通过手机的摄像头来进行拍照或摄像。示例性的,参考图3的(a),手机100的桌面包括多个应用程序的快捷图标。用户可以点击“相机”应用的图标101,启动相机应用。手机100接收到用户对图标101的点击操作,启动相机应用。相机应用启动手机的摄像头采集图像,并将摄像头采集的图像显示为预览界面。如图3的(b)所示,手机100显示预览界面102。
在手机100接收到用户启动相机应用的操作(比如,对图标101的点击操作),至手机100显示预览界面102之间,手机100进行了一系列内部处理。
如图3的(a)所示,手机100接收到用户对图标101的点击操作,启动相机应用。进一步的,如图3的(c)所示,相机应用向相机HAL发送预览请求。相机HAL向决策模块发送预览请求。决策模块根据摄像头配置信息和预设策略确定开启的摄像头。其中,决策模块可以采用常规技术中能够获取到的策略来确定开启的摄像头。比如,可以根据手机周围的光线亮度、拍照模式、手机包括的摄像头类型等确定开启的摄像头。例如,在一些场景,决策模块确定开启一个摄像头。在一些场景(比如人像模式),决策模块确定开启多个摄像头,且多个摄像头采集的图像都传输至相机应用。在一些场景,决策模块确定开启多个摄像头,其中包括主摄和副摄。在拍照预览阶段,主摄采集的图像传输至相机应用进行显示;在拍照成像阶段,主摄和副摄采集的图像合并生成照片。也就是说,主摄采集的图像在拍照预览阶段和拍照成像时都会使用;副摄采集的图像在拍照预览阶段不使用,在拍照成像阶段使用。示例性的,决策模块确定启用黑白融合拍照,则确定主摄为广角摄像头,副摄为黑白摄像头。示例性的,决策模块确定启用CV融合拍照,则确定主摄为广角摄像头,副摄为CV摄像头。需要说明的是,本申请实施例中,摄像头采集的图像传输至相机应用,既可以包括将摄像头采集的每一帧图像都传输至相机应用,还可以包括将摄像头采集的多帧图像合并后传输至相机应用,还可以包括将多个摄像头采集的图像合成后传输至相机应用。
决策模块还根据预设策略确定摄像头的出图帧率。决策模块可以采用常规技术中能够获取到的策略来确定摄像头的出图帧率。比如,决策模块可以根据开启的摄像头类型、摄像头支持的出图帧率,系统可用资源大小等确定摄像头的出图帧率。
在一种实现方式中,决策模块根据预览请求确定处于拍摄预览阶段,则根据上述
预设策略确定主摄的出图帧率为第一帧率。决策模块还确定副摄的出图帧率为第二帧率,其中第二帧率小于第一帧率。比如,主摄的出图帧率为30fps,副摄的出图帧率为15fps。在一种示例中,决策模块可以调用帧率决策子模块(图3中未示出)具体确定副摄的出图帧率。
决策模块通知相机HAL,主摄的出图帧率为第一帧率,副摄的出图帧率为第二帧率。相机HAL控制摄像头1(主摄)以第一帧率出图,控制摄像头2(副摄)以第二帧率出图。
示例性的,如图3的(d)所示,摄像头1(主摄)以第一帧率向相机HAL返回图像,摄像头2(副摄)以第二帧率向相机HAL返回图像。当前处于拍照预览阶段,相机HAL将摄像头1(主摄)返回的图像传输至相机应用。
相机应用显示获取到的图像,即在预览界面显示摄像头1(主摄)采集的图像。示例性的,如图3的(b)所示,手机100显示预览界面102,该预览界面102显示摄像头1(主摄)采集的图像。
本申请实施例提供的控制摄像头帧率的方法,在拍照预览阶段,主摄按照第一帧率输出图像,并将主摄输出的图像传输至相机应用进行显示,生成预览界面;副摄按照第二帧率输出图像,并且副摄输出的图像不传输至相机应用。其中,第一帧率大于第二帧率。这样,主摄按照正常帧率输出图像,保证预览界面的图像质量;副摄输出的图像不传输至相机应用,节约手机处理资源;副摄按照小于第一帧率的第二帧率输出图像,可以降低手机的功率损耗。并且,第二帧率大于0,即副摄保持输出图像,而不是停止输出图像;和副摄在预览阶段不输出图像的方法相比,本申请实施例提供的方法中,由于副摄一直在输出图像,只是出图帧率小于主摄,副摄的自动对焦(auto focus,AF)、自动曝光(automatic exposure,AE)和自动白平衡(automatic white balance,AWB)等功能一直处于低功耗工作状态;待拍照成像阶段将副摄的出图帧率向上调整(比如调整至第一帧率)时,可以做到快速响应,避免影响成像速度或质量。
在拍照预览界面,用户可以启动拍照成像。示例性的,如图4的(a)所示,预览界面102包括控件103,用户可以点击控件103,启动拍照成像。接收到用户对控件103的点击操作,手机100根据摄像头采集的图像生成照片。示例性的,如图4的(b)所示,手机100显示该照片的缩略图104。
在手机100接收到用户启动拍照成像的操作(比如,对控件103的点击操作),至生成照片之间,手机100进行了一系列内部处理。
如图4的(a)所示,手机100接收到用户对控件103的点击操作。进一步的,如图4的(c)所示,相机应用向相机HAL发送成像请求。相机HAL向决策模块发送成像请求。决策模块根据成像请求和主摄的出图帧率确定副摄的出图帧率,即副摄的出图帧率与主摄的出图帧率相等。比如,主摄的出图帧率为第一帧率,则确定副摄的出图帧率为第一帧率。在一种示例中,决策模块可以调用帧率决策子模块(图3中未示出)具体确定副摄的出图帧率。
可选的,在一些实现方式中,主摄在拍照成像阶段的出图帧率可以与拍照预览阶段的出图帧率不同。比如,在拍照预览阶段,决策模块确定主摄的出图帧率为第一帧率,确定副摄的出图帧率为第二帧率,其中第二帧率小于第一帧率;在拍照成像阶段,
决策模块确定主摄的出图帧率为第三帧率,则确定副摄的出图帧率也为第三帧率。
下面继续以主摄在拍照成像阶段的出图帧率为第一帧率为例进行介绍。相机HAL控制摄像头1(主摄)以第一帧率出图,控制摄像头2(副摄)以第一帧率出图。如图4的(d)所示,摄像头1(主摄)以第一帧率向相机HAL返回图像,摄像头2(副摄)以第一帧率向相机HAL返回图像。当前处于拍照成像阶段,相机HAL根据摄像头1(主摄)返回的图像和摄像头2(副摄)返回的图像进行合成,将合成的图像传输至相机应用。相机应用将获取到的合成图像保存为照片。在一种实现方式中,如图4的(b)所示,手机100显示该照片的缩略图104。
本申请实施例提供的控制摄像头帧率的方法,在拍照成像阶段,主摄和副摄以相同的出图帧率输出图像,并根据主摄和副摄输出的图像生成照片。这样,主摄和副摄输出的图像按照设定的规则合成,就可以实现不同场景下的拍摄效果。
在拍摄成像完成后,相机应用向相机HAL发送成像完成请求。相机HAL向决策模块发送成像完成请求。决策模块(具体可由帧率决策子模块执行)确定副摄的出图帧率恢复为第二帧率。决策模块通知相机HAL,主摄的出图帧率为第一帧率,副摄的出图帧率为第二帧率。相机HAL控制摄像头1(主摄)以第一帧率出图,控制摄像头2(副摄)以第二帧率出图。摄像头1(主摄)以第一帧率向相机HAL返回图像,摄像头2(副摄)以第二帧率向相机HAL返回图像。相机HAL将摄像头1(主摄)返回的图像传输至相机应用。相机应用将获取到的图像传输至相机应用进行显示,即在预览界面显示摄像头1(主摄)采集的图像。
也就是说,拍照成像完成之后,恢复到拍照预览阶段,主摄按照第一帧率输出图像,并将主摄输出的图像传输至相机应用,生成预览界面;副摄按照第二帧率输出图像,并且副摄输出的图像不传输至相机应用。主摄按照正常帧率输出图像,保证预览界面的图像质量;副摄按照小于第一帧率的第二帧率输出图像,可以降低手机的功率损耗。
本申请实施例提供的控制摄像头帧率的方法,将副摄的出图帧率与主摄的出图帧率分开控制;在拍照预览阶段,副摄的出图帧率小于主摄的出图帧率,实现降低手机功率损耗;当启动拍照成像时,将副摄的出图帧率提高至与主摄出图帧率相同,实现特殊场景下拍照效果提升;在不影响拍照效果的条件下降低了多个摄像头组合成像时的功率损耗。
示例性的,图5为本申请实施例提供的控制摄像头帧率的方法的一种消息交互流程图。如图5所示,该方法可以包括:
S501、相机应用接收到用户启动相机应用的操作。
示例性的,用户可以通过点击相机应用的图标来启动相机应用。在另一些示例中,用户可以通过其他方式启动相机应用。比如,用户可以通过语音、手势等方式启动相机应用。
手机的输入装置(比如触摸屏、麦克风、摄像头等)检测到用户启动相机应用的操作,可以生成第一消息,第一消息用于指示接收到用户启动相机应用的操作。处理器将第一消息分发至相机应用。相机应用接收到第一消息,即接收到用户启动相机应用的操作。
S502、相机应用向相机HAL发送预览请求消息。
相机应用接收到启动相机应用的操作,向相机HAL发送预览请求消息。
S503、相机HAL向决策模块发送预览请求消息。
S504、决策模块接收到预览请求消息,确定开启摄像头一和摄像头二,并确定摄像头的初始帧率。
手机上可以配置多个摄像头。摄像头配置信息包括手机上的摄像头个数,每个摄像头的类型(比如,广角摄像头、长焦摄像头、超广角摄像头、黑白摄像头、微距摄像头或CV摄像头等),每个摄像头的硬件参数(比如,摄像头支持的出图帧率)等。
在一种实现方式中,决策模块确定开启手机上全部摄像头。示例性的,手机包括两个摄像头,摄像头一和摄像头二,决策模块确定开启手机的摄像头即确定开启摄像头一和摄像头二。该实现方式逻辑简单。
在一种实现方式中,手机内保存了摄像头配置信息,决策模块根据摄像头配置信息和预设策略确定开启摄像头一和摄像头二。
在一种实现方式中,S503中相机HAL从相机应用接收到预览请求消息,则获取手机的摄像头配置信息。相机HAL向决策模块发送预览请求消息时,还向决策模块发送摄像头配置信息。决策模块根据摄像头配置信息和预设策略确定开启摄像头一和摄像头二。该实现方式可以实时获取手机上摄像头的配置情况,可以更准确地开启摄像头组合,提高拍照质量。
其中,决策模块可以采用常规技术中能够获取到的策略来确定开启的摄像头。在一些实施例中,决策模块确定开启摄像头一和摄像头二,其中摄像头一为主摄,摄像头二为副摄。示例性的,摄像头一为广角摄像头,摄像头二为CV摄像头。
决策模块可以采用常规技术中能够获取到的帧率策略来确定摄像头一和摄像头二的初始帧率。比如,根据摄像头一和摄像头二分别支持的出图帧率,确定摄像头一和摄像头二都支持的出图帧率。如果摄像头一和摄像头二都支持的出图帧率有多个,可以在该多个出图帧率中选择一个作为初始帧率。比如,选择摄像头一和摄像头二都支持的出图帧率中最大的出图帧率;比如,预先设定一个默认出图帧率,如果摄像头一和摄像头二都支持该出图帧率,则确定初始帧率为该默认出图帧率。比如,该初始帧率为第一帧率。
S505、决策模块向相机HAL发送选择开启的摄像头组合的信息以及摄像头的初始帧率。
可选的,决策模块向相机HAL发送选择开启的摄像头组合的信息以及摄像头的初始帧率。
S506、相机HAL接收到选择开启的摄像头组合的信息,通知摄像头一开启采集图像,出图帧率为初始帧率(第一帧率);通知摄像头二开启采集图像,出图帧率为初始帧率(第一帧率)。
S507、摄像头一和摄像头二分别以第一帧率采集图像。
S508、决策模块确定摄像头一和摄像头二的出图帧率。
在S504步骤中决策模块确定开启摄像头一和摄像头二,并确定摄像头的初始帧率之后,决策模块还确定摄像头一和摄像头二的出图帧率。需要说明的是,本申请实施
例并不限定S505和S508执行的先后顺序;决策模块确定开启摄像头一和摄像头二,并确定摄像头的初始帧率之后;可以先执行S505,向相机HAL发送选择开启的摄像头组合的信息以及摄像头的初始帧率,再执行S508,确定摄像头一和摄像头二的出图帧率;或者,可以先执行S508,确定摄像头一和摄像头二的出图帧率,再执行S505,向相机HAL发送选择开启的摄像头组合的信息以及摄像头的初始帧率。在一种实现方式中,由决策模块中的帧率决策子模块确定摄像头一和摄像头二的出图帧率。
示例性的,决策模块向帧率决策子模块发送开启的摄像头组合的信息;摄像头组合的信息包括一组摄像头(摄像头一和摄像头二)的标识。决策模块还向帧率决策子模块发送摄像头的初始帧率。帧率决策子模块接收到选择开启的摄像头组合的信息以及摄像头的初始帧率,确定当前为拍照预览阶段,则确定摄像头一(主摄)的出图帧率为初始帧率(第一帧率),确定摄像头二(副摄)的出图帧率为小于第一帧率的值。即,确定摄像头二(副摄)的出图帧率为第二帧率,其中,第二帧率小于第一帧率。在一种示例中,第二帧率为第一帧率的一半。示例性的,第一帧率为30fps,第二帧率为15fps。
由于在拍照预览阶段,副摄输出的图像不会传输至相机应用,将副摄的出图帧率设置的小于主摄出图帧率,可以在不影响预览图像界面的情况下降低功率损耗。并且,第二帧率的值大于0而不是等于0;即副摄保持输出图像,而不是停止输出图像;和副摄在预览阶段不输出图像的方法相比,由于副摄一直在输出图像,副摄的AF、AE和AWB等功能一直处于低功耗工作状态;这样就可以待拍照成像阶段将副摄的出图帧率向上调整(比如调整至第一帧率)时做到快速响应,避免影响成像速度或质量。
进一步的,帧率决策子模块向决策模块返回摄像头一(主摄)的出图帧率为第一帧率,返回摄像头二(副摄)的出图帧率为第二帧率。
S509、决策模块向相机HAL发送摄像头一的出图帧率为第一帧率,向相机HAL发送摄像头二的出图帧率为第二帧率。
可选的,决策模块确定摄像头一的出图帧率为第一帧率,没有发生变化,可以不向相机HAL发送摄像头一的出图帧率。相机HAL没有收到针对摄像头一的出图帧率,则确定摄像头一的出图帧率不变。
可选的,在另一些实施例中,在S504步骤中决策模块确定开启摄像头一和摄像头二,并确定摄像头的初始帧率之后,先不向相机HAL发送选择开启的摄像头组合的信息以及摄像头的初始帧率,即不执行上述步骤S505、S506和S507。在S508决策模块确定摄像头一和摄像头二的出图帧率之后,决策模块向相机HAL发送选择开启的摄像头组合的信息以及摄像头一的出图帧率为第一帧率,摄像头二的出图帧率为第二帧率。
S510、相机HAL通知摄像头一以第一帧率采集图像,通知摄像头二以第二帧率采集图像。
在一种实现方式中,相机HAL向摄像头一发送曝光参数第一值,向摄像头二发送曝光参数第二值;其中,第二值大于第一值。曝光参数用于指示曝光时长,摄像头的曝光时长越长,则采集图像的频率越低;曝光时长与采集图像的频率成反比。曝光参数第二值是曝光参数第一值的二倍,则摄像头二采集图像的频率(第二帧率)是摄像头一采集图像频率(第一帧率)的一半。
S511、相机HAL以第一频率向摄像头一发送出图请求消息,相机HAL以第二频率向摄像头二发送出图请求消息。
出图请求消息用于请求摄像头输出图像;摄像头每次接收到出图请求消息,则向相机HAL输出一帧图像。其中,第一频率与第一帧率相等,第二频率与第二帧率相等。
S512、摄像头一以第一频率向相机HAL输出图像,摄像头二以第二频率向相机HAL输出图像。
摄像头一以第一帧率采集图像,并以第一频率向相机HAL输出图像,即摄像头一的出图帧率为第一帧率。摄像头二以第二帧率采集图像,并以第二频率向相机HAL输出图像,即摄像头二的出图帧率为第二帧率。
S513、相机HAL将摄像头一输出的图像传输至相机应用。
当前处于拍照预览阶段,相机HAL接收到摄像头一和摄像头二分别输出的图像,将摄像头一输出的图像传输至相机应用。
S514、相机应用将获取到的图像显示在预览界面。
相机应用显示获取到的图像,即在预览界面显示摄像头1(主摄)采集的图像。
在预览界面,用户可以启动拍照成像。示例性的,如图6所示,本申请实施例提供的控制摄像头帧率的方法,还包括:
S515、相机应用接收到用户启动拍照成像的操作。
在预览界面,用户可能启动拍照成像。示例性的,如图4的(a)所示,预览界面102包括控件103,用户可以点击控件103,启动拍照成像。相机应用接收到用户对控件103的点击操作,即接收到用户启动拍照成像的操作。
在另一些示例中,用户可以通过其他方式启动拍照成像。比如,用户可以通过语音、手势等方式启动拍照成像。
手机的输入装置(比如触摸屏、麦克风、摄像头等)检测到用户启动拍照成像的操作,可以生成第二消息,第二消息用于指示接收到用户启动拍照成像的操作。处理器将第二消息分发至相机应用。相机应用接收到第二消息,即接收到用户启动拍照成像的操作。
S516、相机应用向相机HAL发送成像请求消息。
S517、相机HAL向决策模块发送成像请求消息。
S518、决策模块接收到成像请求消息,确定摄像头一(主摄)的出图帧率为第一帧率,确定摄像头二(副摄)的出图帧率为第一帧率。
在一种实现方式中,由决策模块中的帧率决策子模块确定摄像头一(主摄)和摄像头二(副摄)的出图帧率。示例性的,决策模块向帧率决策子模块发送成像消息。帧率决策子模块接收到成像消息,确定当前为拍照成像阶段,则确定摄像头一(主摄)和摄像头二(副摄)的出图帧率为初始帧率,即第一帧率。帧率决策子模块向决策模块返回摄像头一(主摄)和摄像头二(副摄)的出图帧率为第一帧率。
在拍照成像阶段,将副摄的出图帧率恢复为与主摄相同的值,可以使得副摄以较高频率输出图像,这样就可以用于组合成像,提高照片质量。
S519、决策模块向相机HAL发送摄像头一的出图帧率为第一帧率,向相机HAL发送摄像头二的出图帧率为第一帧率。
可选的,决策模块确定摄像头一的出图帧率为第一帧率,没有发生变化,可以不向相机HAL发送摄像头一的出图帧率。相机HAL没有收到针对摄像头一的出图帧率,则确定摄像头一的出图帧率不变。
S520、相机HAL通知摄像头一的出图帧率为第一帧率,通知摄像头二的出图帧率为第一帧率。
在一种实现方式中,相机HAL向摄像头一和摄像头二发送相同的曝光参数,这样就可以实现摄像头一和摄像头二的出图帧率相等。
S521、相机HAL以第一频率分别向摄像头一和摄像头二发送出图请求消息。
其中,第一频率与第一帧率相等。
S522、摄像头一以第一频率向相机HAL输出图像,摄像头二以第一频率向相机HAL输出图像。
摄像头一以第一帧率采集图像,并以第一频率向相机HAL输出图像,即摄像头一的出图帧率为第一帧率。摄像头二以第一帧率采集图像,并以第一频率向相机HAL输出图像,即摄像头二的出图帧率为第一帧率。
S523、相机HAL将摄像头一输出的图像和摄像头二输出的图像进行合成,生成合成图像。
相机HAL可以采取常规技术中能够获取到的方法合成摄像头一输出的图像和摄像头二输出的图像。
在一种实现方式中,相机HAL可以将摄像头一输出的一帧图像和摄像头二输出的一帧图像生成一帧合成图像。
在一种实现方式中,相机HAL可以将摄像头一输出的M帧图像和摄像头二输出的N帧图像生成一帧合成图像;其中,M>1,N>1;可选的,M和N可以相等,也可以不相等。
S524、相机HAL向相机应用传输合成图像。
S525、相机应用接收到合成图像,将合成图像保存为照片。
相机应用接收到合成图像,确定拍照成像结束。在一种实现方式中,相机应用可以在预览界面中显示照片的缩略图。在一种示例中,用户还可以点击照片的缩略图,查看生成的照片。
在拍照成像后,恢复为拍照预览阶段,可以将摄像头二的出图帧率恢复为第二帧率,以节省功率。示例性的,如图7所示,本申请实施例提供的控制摄像头帧率的方法,还包括:
S526、相机应用向相机HAL发送成像结束消息。
相机应用接收到合成图像,向相机HAL发送成像结束消息。
S527、相机HAL向决策模块发送成像结束消息。
S528、决策模块接收到成像结束消息,确定摄像头一(主摄)的出图帧率为第一帧率,确定摄像头二(副摄)的出图帧率为第二帧率。
在一种实现方式中,由决策模块中的帧率决策子模块确定摄像头一和摄像头二的出图帧率。
示例性的,决策模块向帧率决策子模块发送成像结束消息。帧率决策子模块接收
到成像结束消息,确定当前为拍照预览阶段,确定摄像头一(主摄)的出图帧率为第一帧率,将摄像头二(副摄)的出图帧率恢复为第二帧率。帧率决策子模块向决策模块返回摄像头一(主摄)的出图帧率为第一帧率,返回摄像头二(副摄)的出图帧率为第二帧率。
成像结束之后,恢复到拍照预览阶段,副摄输出的图像不会传输至相机应用进行显示,就将副摄的出图帧率恢复到第二帧率,这样就可以降低拍照预览阶段的功率损耗。
S529、决策模块向相机HAL发送摄像头一的出图帧率为第一帧率,向相机HAL发送摄像头二的出图帧率为第二帧率。
可选的,决策模块确定摄像头一的出图帧率为第一帧率,没有发生变化,可以不向相机HAL发送摄像头一的出图帧率。相机HAL没有收到针对摄像头一的出图帧率,则确定摄像头一的出图帧率不变。
S530、相机HAL通知摄像头一以第一帧率采集图像,通知摄像头二以第二帧率采集图像。
S531、相机HAL以第一频率向摄像头一发送出图请求消息,相机HAL以第二频率向摄像头二发送出图请求消息。
S532、摄像头一以第一频率向相机HAL输出图像,摄像头二以第二频率向相机HAL输出图像。
S533、相机HAL将摄像头一输出的图像传输至相机应用。
当前处于拍照预览阶段,相机HAL接收到摄像头一和摄像头二分别输出的图像,将摄像头一输出的图像传输至相机应用。
S534、相机应用将获取到的图像显示在预览界面。
需要说明的是,本申请实施例以拍照为例对本申请提供的控制摄像头帧率的方法进行了介绍。可以理解的,本申请实施例提供的控制摄像头帧率的方法,同样适用于录像功能,具体实现方式可以参考拍照功能,本申请实施例不再详细介绍。
本申请一些实施例提供了一种电子设备,该电子设备可以包括:存储器、多个摄像头和一个或多个处理器。该摄像头、存储器和处理器耦合。该存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令。当处理器执行计算机指令时,电子设备可执行上述方法实施例中电子设备执行的各个功能或者步骤。该电子设备的结构可以参考图1所示的电子设备100的结构。
本申请实施例还提供一种芯片系统(例如,片上系统(system on a chip,SoC)),如图8所示,该芯片系统包括至少一个处理器801和至少一个接口电路802。处理器801和接口电路802可通过线路互联。例如,接口电路802可用于从其它装置(例如电子设备的存储器)接收信号。又例如,接口电路802可用于向其它装置(例如处理器801或者电子设备的触摸屏或者电子设备的摄像头)发送信号。示例性的,接口电路802可读取存储器中存储的指令,并将该指令发送给处理器801。当所述指令被处理器801执行时,可使得芯片系统执行上述实施例中HAL层各个模块的步骤。当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质包括计算
机指令,当所述计算机指令在上述电子设备上运行时,使得该电子设备执行上述方法实施例中电子设备执行的各个功能或者步骤。
本申请实施例还提供一种计算机程序产品,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行上述方法实施例中电子设备执行的各个功能或者步骤。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。
Claims (25)
- 一种电子设备,其特征在于,包括:处理器,显示屏,第一摄像头和第二摄像头;所述处理器上运行所述电子设备的操作系统,所述操作系统中安装有相机应用;所述处理器,用于接收第一消息,所述第一消息用于指示接收到用户启动所述相机应用的操作;所述处理器,还用于响应于接收到所述第一消息,启动所述第一摄像头和所述第二摄像头;所述第一摄像头,用于以第一帧率向所述处理器输出图像;所述第二摄像头,用于以第二帧率向所述处理器输出图像,所述第二帧率小于所述第一帧率;所述处理器,还用于将所述第一摄像头输出的图像传输至所述相机应用;所述显示屏,用于显示所述相机应用的预览界面,所述预览界面包括所述第一摄像头采集的图像。
- 根据权利要求1所述的电子设备,其特征在于,所述处理器,还用于在启动所述第一摄像头和所述第二摄像头之后,通知所述第一摄像头以第一帧率采集图像,通知所述第二摄像头以第二帧率采集图像。
- 根据权利要求2所述的电子设备,其特征在于,所述处理器,还用于以第一频率向所述第一摄像头发送出图请求消息;所述处理器,还用于以第二频率向所述第二摄像头发送出图请求消息;其中,所述出图请求消息用于请求摄像头输出一帧图像,所述第一频率与所述第一帧率相等,所述第二频率与所述第二帧率相等。
- 根据权利要求2或3所述的电子设备,其特征在于,所述处理器,还用于在通知所述第二摄像头以第二帧率采集图像之前,通知所述第二摄像头以初始帧率采集图像;所述初始帧率为所述第一帧率。
- 根据权利要求1-4任意一项所述的电子设备,其特征在于,所述处理器,还用于在接收到所述第一消息之后,根据所述电子设备的摄像头配置以及预设策略确定启动所述第一摄像头和所述第二摄像头。
- 根据权利要求5所述的电子设备,其特征在于,所述处理器,还用于在接收到所述第一消息之后,获取所述电子设备的摄像头配置。
- 根据权利要求1-6任意一项所述的电子设备,其特征在于,所述处理器,还用于接收第二消息,所述第二消息用于指示接收到用户在所述预览界面启动拍照成像的操作;所述处理器,还用于响应于接收到所述第二消息,通知所述第二摄像头以所述第一帧率输出图像;所述第二摄像头,还用于以所述第一帧率向所述处理器输出图像;所述处理器,还用于将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片;其中,所述第一摄像头以所述第一帧率输出图像。
- 根据权利要求7所述的电子设备,其特征在于,所述处理器具体用于:将所述第一摄像头输出的M帧图像和所述第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
- 根据权利要求1-8任意一项所述的电子设备,其特征在于,所述第二帧率大于0。
- 一种控制摄像头帧率的方法,应用于电子设备,所述电子设备包括第一摄像头和第二摄像头,其特征在于,所述方法包括:接收用户启动相机应用的操作;响应于所述启动相机应用的操作,启动所述第一摄像头和所述第二摄像头;控制所述第一摄像头以第一帧率输出图像;控制所述第二摄像头以第二帧率输出图像,所述第二帧率小于所述第一帧率;显示所述相机应用的预览界面,所述预览界面包括所述第一摄像头输出的图像。
- 根据权利要求10所述的方法,其特征在于,在所述控制所述第二摄像头以第二帧率输出图像之前,所述方法还包括:控制所述第二摄像头以初始帧率输出图像;所述初始帧率为所述第一帧率。
- 根据权利要求10或11所述的方法,其特征在于,所述方法还包括:接收用户在所述预览界面启动拍照成像的操作;响应于所述启动拍照成像的操作,控制所述第二摄像头以所述第一帧率输出图像;将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片;其中,所述第一摄像头以所述第一帧率输出图像。
- 根据权利要求12所述的方法,其特征在于,所述将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片之后,所述方法还包括:控制所述第二摄像头以所述第二帧率输出图像;其中,所述第一摄像头以所述第一帧率输出图像;显示所述相机应用的预览界面,所述预览界面包括所述第一摄像头输出的图像。
- 根据权利要求12所述的方法,其特征在于,所述将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片,包括:将所述第一摄像头输出的M帧图像和所述第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
- 根据权利要求10-14任意一项所述的方法,其特征在于,所述第二帧率大于0。
- 一种控制摄像头帧率的方法,其特征在于,包括:接收第一消息,所述第一消息用于指示接收到用户启动相机应用的操作;响应于接收到所述第一消息,通知第一摄像头和第二摄像头启动;通知第一摄像头以第一帧率采集图像;通知第二摄像头以第二帧率采集图像,所述第二帧率小于所述第一帧率;接收所述第一摄像头以所述第一帧率输出的图像;接收所述第二摄像头以所述第二帧率输出的图像;将所述第一摄像头输出的图像传输至所述相机应用,使得所述相机应用根据所述第一摄像头输出的图像显示预览界面。
- 根据权利要求16所述的方法,其特征在于,在所述通知第一摄像头以第一帧 率采集图像,通知第二摄像头以第二帧率采集图像之后,所述方法还包括:以第一频率向所述第一摄像头发送出图请求消息,以第二频率向所述第二摄像头发送出图请求消息;其中,所述出图请求消息用于请求摄像头输出一帧图像,所述第一频率与所述第一帧率相等,所述第二频率与所述第二帧率相等。
- 根据权利要求16或17所述的方法,其特征在于,在所述通知第二摄像头以第二帧率采集图像之前,所述方法还包括:通知所述第二摄像头以初始帧率采集图像;所述初始帧率为所述第一帧率。
- 根据权利要求16-18任意一项所述的方法,其特征在于,在所述接收第一消息之后,所述方法还包括:获取摄像头配置;根据所述摄像头配置以及预设策略确定启动所述第一摄像头和所述第二摄像头。
- 根据权利要求16-19任意一项所述的方法,其特征在于,所述方法还包括:接收第二消息,所述第二消息用于指示接收到用户在所述预览界面启动拍照成像的操作;响应于接收到所述第二消息,通知所述第二摄像头以所述第一帧率采集图像;接收所述第二摄像头以所述第一帧率输出的图像;将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片;其中,所述第一摄像头以所述第一帧率输出图像。
- 根据权利要求20所述的方法,其特征在于,所述将所述第一摄像头的输出图像和所述第二摄像头的输出图像合成照片,包括:将所述第一摄像头输出的M帧图像和所述第二摄像头输出的N帧图像合成一张照片;其中,M>=1,N>=1。
- 根据权利要求16-21任意一项所述的方法,其特征在于,所述通知第一摄像头以第一帧率采集图像,通知第二摄像头以第二帧率采集图像,包括:设置所述第一摄像头的曝光参数为第一值,设置所述第二摄像头的曝光参数为第二值,所述第二值大于所述第一值。
- 根据权利要求22所述的方法,其特征在于,所述第二帧率为所述第一帧率的一半,所述第二值为所述第一值的二倍。
- 一种计算机可读存储介质,其特征在于,包括计算机指令;当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求10-15任意一项所述的方法。
- 一种芯片系统,其特征在于,所述芯片系统包括一个或多个接口电路和一个或多个处理器,所述接口电路和所述处理器通过线路互联,所述接口电路用于从电子设备的存储器接收计算机指令,当所述计算机指令被所述处理器执行时,使得所述芯片系统执行如权利要求16-23任意一项所述的方法。
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211032087.1A CN117641116B (zh) | 2022-08-26 | 2022-08-26 | 一种控制摄像头帧率的方法及电子设备 |
CN202211032087.1 | 2022-08-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024041006A1 true WO2024041006A1 (zh) | 2024-02-29 |
Family
ID=90012344
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/089867 WO2024041006A1 (zh) | 2022-08-26 | 2023-04-21 | 一种控制摄像头帧率的方法及电子设备 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN117641116B (zh) |
WO (1) | WO2024041006A1 (zh) |