WO2024093432A1 - Shooting frame rate control method, electronic device, chip system and readable storage medium - Google Patents

Shooting frame rate control method, electronic device, chip system and readable storage medium

Info

Publication number
WO2024093432A1
WO2024093432A1 · PCT/CN2023/112906 · CN2023112906W
Authority
WO
WIPO (PCT)
Prior art keywords
frame rate
strategy
module
shooting
camera
Prior art date
Application number
PCT/CN2023/112906
Other languages
English (en)
French (fr)
Inventor
许集润
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Publication of WO2024093432A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/65: Control of camera operation in relation to power supply
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present application relates to the field of terminal technology, and in particular to a shooting frame rate control method, an electronic device, a chip system and a readable storage medium.
  • the present application provides a shooting frame rate control method, electronic device, chip system and readable storage medium, which decouples the adaptation logic of the shooting frame rates of multiple cameras from the hardware platform, thereby facilitating subsequent functional expansion and maintenance of the shooting frame rate control.
  • a shooting frame rate control method which is applied to an electronic device including a frame rate strategy decision layer and multiple cameras.
  • the method may include:
  • the electronic device detects a first operation on a first control in a display interface; in response to the first operation, the frame rate strategy decision layer can determine a target frame rate strategy based on initial data; the frame rate strategy decision layer can also generate a decision instruction based on the target frame rate strategy, and the decision instruction is used by the electronic device to control the shooting frame rate of multiple cameras.
  • a frame rate strategy decision layer is set in the hardware abstraction layer to decouple the configuration logic of the shooting frame rate control of multiple cameras from the hardware platform, thereby facilitating subsequent function expansion and maintenance.
  • the frame rate strategy decision layer includes an interface adaptation module, a frame rate strategy calculation module, and a frame rate strategy control module connected in sequence; the interface adaptation module is also connected to the frame rate strategy control module; the frame rate strategy decision layer also includes a frame rate strategy parsing module, and the frame rate strategy parsing module is connected to the frame rate strategy calculation module;
  • the method comprises:
  • the interface adaptation module acquires the initial data, where the initial data includes at least one of sensor data and control data;
  • the frame rate strategy parsing module parses the frame rate strategy configuration to obtain multiple frame rate strategies
  • the frame rate strategy calculation module determines, according to the initial data, the target frame rate strategy matching the initial data from among the multiple frame rate strategies, the target frame rate strategy being one of the multiple frame rate strategies;
  • the frame rate strategy control module generates the decision instruction according to the target frame rate strategy
  • the interface adaptation module outputs the decision instruction.
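The claimed pipeline (parse strategies → match initial data → generate a decision instruction → output it) can be sketched as follows. This is an illustrative sketch only: the class and field names (`FrameRateStrategyParser`, `condition`, `cmd`, and so on) are assumptions, since the publication describes the data flow but not any concrete API.

```python
# Illustrative sketch of the frame rate strategy decision layer pipeline:
# interface adaptation -> parsing -> calculation -> control -> output.
# All identifiers are hypothetical, not taken from the publication.

class FrameRateStrategyParser:
    """Parses a frame rate strategy configuration into multiple strategies."""
    def parse(self, config):
        return [dict(name=c["name"], fps=c["fps"], condition=c["condition"])
                for c in config]

class FrameRateStrategyCalc:
    """Matches the initial data (sensor/control data) against the strategies."""
    def decide(self, strategies, initial_data):
        for s in strategies:
            if s["condition"](initial_data):
                return s          # target strategy: one of the parsed strategies
        return strategies[-1]     # fall back to the last (default) strategy

class FrameRateStrategyControl:
    """Generates a decision instruction from the target strategy."""
    def to_instruction(self, strategy):
        return {"cmd": "set_frame_rate", "fps": strategy["fps"]}

def run_decision_layer(config, initial_data):
    """End-to-end decision: what the interface adaptation module would output."""
    strategies = FrameRateStrategyParser().parse(config)
    target = FrameRateStrategyCalc().decide(strategies, initial_data)
    return FrameRateStrategyControl().to_instruction(target)
```

A caller holding a two-strategy configuration (say, 60fps for bright scenes, 30fps otherwise) would pass the current sensor data to `run_decision_layer` and forward the returned instruction to the camera hardware abstraction layer.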
  • the control of the shooting frame rate corresponding to different sensor states in different scenarios can be realized; the configuration logic of the shooting frame rate control of multiple cameras is decoupled from the hardware platform, and the adaptive calculation and output control of the shooting frame rate are realized through multiple modules, which has the advantages of applicability and portability across chip platforms; the frame rate strategy calculation module set at the same time can adjust the shooting frame rate based on sensor data or control data, which can balance the operating power consumption of multiple cameras during the shooting process, thereby extending the use time of electronic equipment.
  • the frame rate strategy decision layer further includes a first conversion module and a second conversion module; the first conversion module is connected between the interface adaptation module and the frame rate strategy calculation module, and the second conversion module is connected between the interface adaptation module and the frame rate strategy control module;
  • the method further includes:
  • the first conversion module converts the initial data into first data
  • the method further includes:
  • the second conversion module converts the decision instruction into second data
  • the interface adaptation module outputs the second data.
  • the frame rate strategy decision layer is set as an independent module, and data conversion is realized through the conversion interface, so that it can be applied to different chip architectures, which makes it easier to expand and maintain functions such as the adaptation logic of subsequent shooting frame rates.
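The two conversion modules described above can be pictured as a pair of adapters between platform-specific data and a platform-neutral format. The field names below (`vendor_lux_index`, `vendor_zoom_q8`, `VENDOR_TAG_FPS_RANGE`) and the Q8 fixed-point encoding are purely hypothetical examples of chip-platform data, not identifiers from the publication.

```python
# Hypothetical sketch of the two conversion modules. The first converts
# chip-platform initial data into a common format for the calculation
# module; the second converts the decision instruction back into a
# platform-specific representation. All field names are illustrative.

def first_conversion(platform_data):
    """Convert platform-specific initial data into platform-neutral first data."""
    return {
        "lv": platform_data.get("vendor_lux_index"),
        # assume the vendor reports zoom as Q8 fixed point (256 == 1.0x)
        "zoom": platform_data.get("vendor_zoom_q8", 256) / 256.0,
    }

def second_conversion(decision_instruction):
    """Convert a platform-neutral decision instruction into second data."""
    fps = decision_instruction["fps"]
    return {"VENDOR_TAG_FPS_RANGE": (fps, fps)}
```

Because only these two adapters touch platform-specific names, porting the decision layer to a different chip architecture means rewriting the adapters rather than the calculation or control modules, which is the portability benefit the text claims.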
  • the sensor data includes at least one of: an ambient brightness value, a dynamic range value, state information, and a scene type; the control data includes a zoom factor;
  • the frame rate strategy decision layer determines the target frame rate strategy according to the initial data, including:
  • the frame rate strategy calculation module calculates the adaptation condition information according to the ambient brightness value and the control data
  • the frame rate strategy calculation module matches the condition information with the parsed frame rate strategy to calculate a frame rate result
  • the frame rate strategy calculation module determines the target frame rate strategy according to the frame rate result and the sensor data.
  • the condition information includes at least one of an ultra wide-angle mode, a night scene mode, a daylight mode, and a telephoto mode; and the method further includes:
  • the frame rate result calculated by the frame rate strategy calculation module is the first frame rate
  • the frame rate result calculated by the frame rate strategy calculation module is a second frame rate.
  • the method further includes:
  • the target frame rate strategy is switched from the frame rate strategy corresponding to the first frame rate to the frame rate strategy corresponding to the second frame rate.
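The mode-dependent frame rate result and the switch from a first to a second frame rate can be illustrated as follows. The concrete fps values in the mapping are assumptions made for illustration only; the publication says the frame rate result changes with the condition information but does not assign numbers to the modes.

```python
# Hypothetical mapping from condition information (shooting mode) to a
# frame rate result. The fps values are illustrative assumptions.
MODE_TO_FPS = {
    "daylight": 60,
    "ultra_wide_angle": 30,
    "night_scene": 30,
    "telephoto": 30,
}

def frame_rate_result(mode):
    """The frame rate result the calculation module yields for a mode."""
    return MODE_TO_FPS[mode]

def on_condition_change(old_mode, new_mode):
    """Return (first_frame_rate, second_frame_rate) across a mode switch,
    i.e. the pair between which the target strategy is switched."""
    return frame_rate_result(old_mode), frame_rate_result(new_mode)
```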
  • the frame rate strategy decision layer determines the target frame rate strategy according to the initial data, including:
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, and the state information is a jitter state, a first strategy among the multiple frame rate strategies is used as the target frame rate strategy;
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information is a static state, and the scene type is a light flickering scene, a second strategy among the multiple frame rate strategies is used as the target frame rate strategy;
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information is a static state, and the scene type is a scene with unchanged lighting, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy.
  • the frame rate strategy decision layer determines the target frame rate strategy according to the initial data, including:
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, and the state information is a jitter state, a third strategy among the multiple frame rate strategies is used as the target frame rate strategy;
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information is a static state, and the scene type is a light flickering scene, a second strategy among the multiple frame rate strategies is used as the target frame rate strategy;
  • if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information is a static state, and the scene type is a scene with unchanged lighting, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy.
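The selection rules in the claims above reduce to one decision function over four inputs. The threshold values below (50 for brightness, 0.2 for dynamic range) are placeholders chosen only for illustration; the publication leaves the actual thresholds unspecified.

```python
# The claimed decision rules as one function. BRIGHTNESS_THRESHOLD and
# DYNAMIC_THRESHOLD are illustrative placeholders, not values from the patent.
BRIGHTNESS_THRESHOLD = 50
DYNAMIC_THRESHOLD = 0.2

def select_strategy(lv, dr, state, scene):
    """Return 'first', 'second', or 'third' per the claimed rules.

    lv    -- ambient brightness value
    dr    -- dynamic range value (overexposed-area proportion)
    state -- 'jitter' or 'static'
    scene -- 'light_flicker' or 'light_unchanged'
    """
    if lv > BRIGHTNESS_THRESHOLD:
        high_dr = dr > DYNAMIC_THRESHOLD
        if state == "jitter":
            return "first" if high_dr else "third"
        if state == "static":
            if scene == "light_flicker":
                return "second"
            if scene == "light_unchanged":
                return "first" if high_dr else "third"
    return "third"  # dim scenes fall back to the third strategy
```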
  • the frame rate strategy decision layer determines the target frame rate strategy according to the initial data, including:
  • a third strategy among the multiple frame rate strategies is used as the target frame rate strategy.
  • the initial data further includes power consumption information
  • the frame rate strategy decision layer determines the target frame rate strategy according to the initial data, including:
  • the frame rate strategy decision layer determines the target frame rate strategy according to the sensor data, the control data, and the power consumption information.
  • the function of flexibly switching the shooting frame rate during the recording process as the conditions change can be provided, thereby achieving a better balance between shooting effect and device power consumption.
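When the initial data also carries power consumption information, the frame rate decided from sensor and control data can be further constrained. The `battery_low` flag and the 30fps cap below are illustrative assumptions; the publication only says the decision additionally takes power consumption information into account.

```python
# Sketch of folding power consumption information into the decision.
# The field name 'battery_low' and the 30fps cap are assumptions.
def apply_power_constraint(decided_fps, power_info, capped_fps=30):
    """Cap the frame rate chosen from sensor/control data when the power
    consumption information indicates the device should save energy."""
    if power_info.get("battery_low", False):
        return min(decided_fps, capped_fps)
    return decided_fps
```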
  • an electronic device comprising a memory and a processor, wherein the memory stores a computer program that can be run on the processor, and when the processor executes the computer program, the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect is implemented.
  • a chip system comprising: a processor for calling and running a computer program from a memory, so that a device equipped with the chip executes a shooting frame rate control method as provided in the first aspect or any possible implementation of the first aspect.
  • a computer-readable storage medium comprising: a computer program stored therein, wherein when the computer program is executed by a processor, the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect is implemented.
  • a computer program product is provided.
  • the computer program product runs on an electronic device, the electronic device executes the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect.
  • FIG1 is a schematic diagram of a scenario to which the shooting frame rate control method provided in an embodiment of the present application is applicable;
  • FIG2 is a schematic diagram of another scenario to which the shooting frame rate control method provided in an embodiment of the present application is applicable;
  • FIG3 is a schematic diagram of the structure of a software system of an electronic device provided in an embodiment of the present application.
  • FIG4 is a schematic diagram of the structure of a hardware abstraction layer provided in an embodiment of the present application.
  • FIG5 is a schematic diagram of the structure of another hardware abstraction layer provided in an embodiment of the present application.
  • FIG6 is a flow chart of a shooting frame rate control method provided in an embodiment of the present application.
  • FIG7 is a flow chart of another shooting frame rate control method provided in an embodiment of the present application.
  • FIG8 is a schematic diagram of dynamic control of frame rate strategy in an application scenario provided by an embodiment of the present application.
  • FIG9 is a schematic diagram of frame rate strategy adaptation in an application scenario provided by an embodiment of the present application.
  • FIG10 is a schematic diagram of an application scenario provided by an embodiment of the present application.
  • FIG11 is a diagram of a hardware system architecture of an electronic device provided in an embodiment of the present application.
  • FIG12 is a schematic diagram of the structure of a chip system provided in an embodiment of the present application.
  • the term “if” may be interpreted as “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context.
  • the phrases “if it is determined” or “if [described condition or event] is detected” may be interpreted as meaning “upon determination” or “in response to determining” or “upon detection of [described condition or event]” or “in response to detecting [described condition or event],” depending on the context.
  • Frame rate is the frequency (rate) at which bitmap images in frames appear continuously on a display, usually expressed in fps (frames per second). The frame rate can reflect, to a certain extent, the smoothness of the captured video.
  • The shooting frame rate is the number of times per second that the electronic device reads frames from the hardware image sensor based on the current shooting environment. It can also be called the recording frame rate or real-time frame rate.
  • the focal length indicates the size of the refractive power. The shorter the focal length, the greater the refractive power.
  • the focal length of the optical lens assembly determines the size of the image generated by the object photographed by the optical lens assembly on the imaging plane. Assuming that the same object is photographed at the same distance, the longer the focal length of the optical lens assembly, the greater the magnification of the image generated by the object on the photosensitive element (charge-coupled device, CCD).
  • Optical zoom mainly refers to the comparison ratio between, and switching among, the different focal lengths in the camera module.
  • the optical zoom factor can be used to represent the ability of optical zoom. The larger the optical zoom factor, the farther the scene can be photographed.
  • the size of the optical zoom factor is related to the physical focal length of the optical lens assembly.
  • the equivalent focal length of the camera module is usually 28mm, corresponding to 1X (i.e. 1x) optical zoom factor.
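From the correspondence just stated (28mm equivalent focal length ↔ 1X optical zoom), an equivalent focal length converts to an optical zoom factor by simple division; the helper below is a sketch of that arithmetic.

```python
# A 28mm equivalent focal length corresponds to a 1x optical zoom factor,
# so a lens's zoom factor is its equivalent focal length divided by 28.
def optical_zoom_factor(equivalent_focal_mm, base_focal_mm=28.0):
    """Optical zoom factor implied by an equivalent focal length in mm."""
    return equivalent_focal_mm / base_focal_mm
```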
  • Field of view is used to indicate the maximum angle range that the camera can capture. If the object to be photographed is within this angle range, the object to be photographed will be captured by the camera. If the object to be photographed is outside this angle range, the object to be photographed will not be captured by the camera.
  • cameras can be divided into main cameras, wide-angle cameras, and telephoto cameras according to their different field of view.
  • the field of view of a wide-angle camera is larger than that of a main camera, and the focal length is smaller, which is suitable for close-up shooting; while the field of view of a telephoto camera is smaller than that of a main camera, and the focal length is longer, which is suitable for long-range shooting.
  • the lighting value (LV) is used to estimate the ambient brightness.
  • the specific calculation formula is as follows:
  • Exposure is the exposure time
  • Aperture is the aperture size
  • ISO is the sensitivity
  • Luma is the average value of Y in the XYZ color space.
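The formula referred to above is not reproduced in this extract. A form commonly used in similar filings, using the variables as just defined, is sketched below; treat it, and in particular the constant 46, as a reconstruction rather than the publication's own equation:

```latex
LV = 10 \cdot \log_2\!\left(\frac{\mathrm{Aperture}^2}{\mathrm{Exposure}}
     \cdot \frac{100}{\mathrm{ISO}} \cdot \frac{\mathrm{Luma}}{46}\right)
```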
  • the dynamic range value is used to indicate the proportion of the overexposed area in the preview image obtained through the camera to the entire image.
  • the embodiment of the present application provides a shooting frame rate control method, which integrates the frame rate adaptation and scheduling of multiple cameras into one module, so that the control of the shooting frame rates of multiple cameras can be decoupled from the hardware platform, no longer limited by the hardware platform, and can be applied to platforms with different chip architectures.
  • the control of the shooting frame rate can be more flexible, and can be dynamically switched according to different shooting environments, which is more friendly and convenient for the subsequent expansion and maintenance of functions for more complex scenes.
  • the shooting frame rate control method provided in the embodiment of the present application can be applied to the shooting field, for example, can be applied to the process of recording a video.
  • the electronic device 100 is a mobile phone.
  • the mobile phone may include multiple cameras. For example, it may have four rear cameras and one front camera.
  • the four rear cameras may be a wide-angle camera, a main camera, a black-and-white camera, and a telephoto camera, respectively.
  • the four rear cameras are used to shoot the same scene to be shot.
  • the front camera is a camera on the same plane as the display screen, and may be used for video recording, multi-lens video recording, and face recognition.
  • the electronic device 100 may also include other cameras, and the type of cameras and the number of each camera may be set as needed, and the embodiment of the present application does not impose any restrictions on this.
  • the four cameras included in the electronic device 100 may also be an ultra-wide-angle camera, a wide-angle camera, a black and white camera, and a telephoto camera.
  • the electronic device can receive a user's click or drag of the zoom ratio control to adjust the focal length of the multiple cameras currently used for shooting.
  • the zoom ratio range of the main camera and the black and white camera is basically the same, while the zoom ratio of the wide-angle camera is relatively smaller than that of the main camera, and the zoom ratio of the telephoto camera is relatively larger than that of the main camera.
  • the zoom ratio refers to the optical zoom capability of the camera.
  • the zoom ratio range for the wide-angle camera is [0.1, 1)
  • the zoom ratio range for the main camera is [1, 3.9)
  • the zoom ratio range for the black and white camera is [1, 2)
  • the zoom ratio range for the telephoto camera is [3.9, 100).
  • 0.1 refers to 0.1 times zoom ratio, that is, 0.1X
  • 1 refers to 1 times zoom ratio, that is, 1X
  • 2 refers to 2 times zoom ratio, that is, 2X
  • 3.9 refers to 3.9 times zoom ratio, that is, 3.9X
  • 100 refers to 100 times zoom ratio, that is, 100X.
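The zoom ratio ranges listed above can be read as half-open intervals [low, high) that determine which camera(s) serve a given zoom factor; note that the black-and-white camera's range [1, 2) overlaps the main camera's, so both cover that span. The lookup below is a sketch of that table, with camera names chosen to match the text.

```python
# The zoom ratio ranges from the text, as half-open intervals [low, high).
ZOOM_RANGES = {
    "wide_angle":      (0.1, 1.0),
    "main":            (1.0, 3.9),
    "black_and_white": (1.0, 2.0),   # overlaps the main camera's range
    "telephoto":       (3.9, 100.0),
}

def cameras_for_zoom(zoom):
    """Return the names of all cameras whose zoom range covers `zoom`."""
    return [name for name, (low, high) in ZOOM_RANGES.items()
            if low <= zoom < high]
```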
  • FIG. 1 is a schematic diagram of a scenario to which the shooting frame rate control method provided in an embodiment of the present application is applicable.
  • the electronic device detects a first operation of a first control in a display interface.
  • the display interface may be a main interface 10
  • the first control may be a camera control of the main interface
  • the first operation may be an operation in which a user clicks on the camera control, such as operation 1; accordingly, the display interface may also be a preview interface 11, i.e., an interface after a camera application is opened, and the first control may also be a zoom control in the preview interface, such as the zoom bar 12 in FIG1;
  • the first operation may also be an operation in which a user clicks or slides the zoom bar, such as operation 2, to change the camera's zoom factor before or during recording.
  • the mobile phone can start the camera application and display a graphical user interface (GUI) as shown in FIG. 1, which can be called a preview interface 11.
  • the preview interface includes a viewfinder, multiple shooting mode options and a zoom bar 12.
  • the viewfinder can be used to display a preview image in real time.
  • the shooting modes include night scene mode, portrait mode, video mode, multi-lens video mode, etc.
  • the preview interface can also include a frame rate control 13 for displaying the shooting frame rate applicable to the current shooting scene.
  • the shooting frame rate displayed by the frame rate control 13 can be dynamically changed with the change of the shooting environment or the user's first operation (such as operation 2). Accordingly, the user can also change the shooting focal length of the camera by clicking the zoom bar 12 in the preview interface 11; the user can select the zoom factor required for the current recording scene in the zoom bar 12, for example, 0.5 times, 2 times or 50 times.
  • the user can increase the zoom factor by sliding, and the object being photographed can be continuously enlarged in the viewfinder window; and can reduce the zoom factor by sliding, and the object being photographed can be continuously reduced in the viewfinder window.
  • the user can adjust the preview image displayed in the viewfinder window by selecting the zoom factor.
  • the shooting frame rate control method provided in the embodiment of the present application is applicable to the frame rate strategy control before the camera application is opened and the shooting starts.
  • the shooting frame rate can be adapted and regulated based on the shooting scene or the zoom factor input by the user and other information.
  • the above-mentioned frame rate control 13 can also be in a non-display state, that is, the control process of the shooting frame rate is not perceived by the user, and the shooting frame rate of the camera's image sensor can also be directly controlled in the background.
  • Figure 1 is only for the convenience of schematically illustrating the process of dynamic control of the shooting frame rate in this application.
  • the program corresponding to the shooting frame rate control method provided in the embodiment of the present application can be used to achieve a better balance between shooting effect and power consumption in various application scenarios.
  • FIG. 2 is a schematic diagram of another scenario to which the shooting frame rate control method provided in an embodiment of the present application is applicable.
  • the user when the user wants to take pictures of grass and trees outdoors, the user opens the camera application, and the default zoom ratio of the preview interface can be 1x, and the camera called by the mobile phone can be the main camera.
  • the corresponding frame rate will also be dynamically adjusted, for example, from the original 60fps to 30fps. Because the darker scene requires relatively complex image algorithms (such as night mode algorithms), the power consumption of the mobile phone increases accordingly; if acquisition continued at a high frame rate, it might affect the operation of the mobile phone or shorten the battery life. Therefore, reducing the shooting frame rate balances the operating power consumption in this scene while ensuring the corresponding shooting effect.
  • the corresponding shooting frame rate when switching from a darker shooting scene to a brighter shooting scene, the corresponding shooting frame rate will also switch back from 30fps to 60fps.
  • a night scene control 14 may also be displayed to indicate that the camera is currently in the night scene shooting mode.
  • the shooting frame rate control method provided in the embodiment of the present application is also applicable to dynamically adjusting the shooting frame rate when the shooting scene changes or the zoom factor input by the user changes during the shooting process after the camera application is opened.
  • the shooting frame rate in the above application scenarios is only for schematic illustration, and may also be other frame rates, such as 24fps or 72fps, or may be other control parameters related to the image output format, which are not specifically limited here.
  • the control of the shooting frame rate corresponding to the shooting scene can also support continuous conversion between two frame rates.
  • the frame rate can be continuously switched from 30fps corresponding to the first shooting scene to 31fps, 32fps or 33fps, etc.
  • the specific span value corresponding to each switch of the shooting frame rate is also not limited.
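The continuous switching described above (e.g. stepping from 30fps through 31, 32, 33fps instead of jumping) can be sketched as a simple ramp; the 1fps default step is illustrative, since the text explicitly leaves the span value per switch open.

```python
# Sketch of continuously stepping the shooting frame rate between two
# values rather than jumping. The default 1fps step is an assumption.
def frame_rate_steps(current_fps, target_fps, step=1):
    """Yield the intermediate frame rates from current to target, inclusive
    of the target, clamping the final step so it lands exactly on target."""
    direction = 1 if target_fps >= current_fps else -1
    fps = current_fps
    while fps != target_fps:
        fps += direction * min(step, abs(target_fps - fps))
        yield fps
```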
  • the software system of the electronic device 100 can adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the operating system (OS) of the electronic device may include but is not limited to: Symbian, Android, iOS, BlackBerry, Hongmeng (HarmonyOS) and other operating systems; this application does not impose any limitation.
  • the embodiment of the present application takes the Android system of layered architecture as an example to exemplarily illustrate the software structure of the electronic device 100.
  • Fig. 3 is a software structure block diagram of the electronic device 100 provided in the embodiment of the present application.
  • the layered architecture divides the software into several layers, each with clear roles and division of labor.
  • the layers communicate with each other through software interfaces.
  • the Android system is divided into five layers, from top to bottom: the application layer (APP) 110, the application framework layer 120, the hardware abstraction layer (HAL) 130, the driver layer 140, and the hardware layer 150.
  • the application layer 110 may include a series of application packages.
  • the application layer 110 may include applications such as a camera and a gallery.
  • the application layer 110 is at the top of the entire framework and is responsible for interacting directly with the user. Once it receives a direct or indirect request from the user, such as taking a photo or recording a video, it sends the request to the application framework layer 120 through the interface and waits for the application framework layer 120 to return the processing result, which includes image data and camera parameters (such as shooting frame rate), etc. The application layer 110 then feeds back the result to the user.
  • the application framework layer 120 is located between the application layer 110 and the hardware abstraction layer 130.
  • the application framework layer 120 provides an application programming interface (API) and a programming framework for the application programs of the application layer 110.
  • the application framework layer 120 includes some predefined functions.
  • the application framework layer 120 is a framework of the application program. Developers can develop some applications based on the application framework layer 120 while following the development principles of the application framework.
  • the application framework layer 120 also includes an access interface corresponding to the camera application.
  • the hardware abstraction layer 130 is used to abstract the hardware and provide a virtual hardware usage platform for the operating system.
  • the hardware abstraction layer 130 may include a camera hardware abstraction layer 131 and the like.
  • the hardware layer 150 may include sensors, image signal processors, and other hardware devices, which are not limited in this application; the sensors may also include ambient light sensors, gyroscope sensors, image sensors, and the like.
  • by calling the camera hardware abstraction layer 131 in the hardware abstraction layer 130, the application layer 110 and the application framework layer 120 above the hardware abstraction layer 130 are connected with the driver layer 140 and the hardware layer 150 below it, realizing camera data transmission and function control.
  • the embodiment of the present application adds a frame rate strategy decision layer 132 in the hardware abstraction layer 130, which is connected to the camera hardware abstraction layer 131, so that the shooting frame rate strategy of different shooting environments can be adapted in the frame rate strategy decision layer 132, and different frame rate control adaptation logics can be customized according to different requirements or actual scene applications.
  • the frame rate strategy decision layer 132 can obtain various parameters through the camera hardware abstraction layer 131, and call various hardware such as sensors and ISP to realize dynamic adjustment of the shooting frame rate for different shooting environments during shooting.
  • FIG. 4 is a schematic diagram of the structure of a hardware abstraction layer 130 provided in an embodiment of the present application
  • FIG. 5 is a schematic diagram of the structure of another hardware abstraction layer 130 provided in an embodiment of the present application.
  • the camera hardware abstraction layer 131 includes a general functional interface layer 1311, and the interface layer 1311 is used to access different operating systems to achieve management and control.
  • the camera hardware abstraction layer 131 also includes a media control layer 1312, a chip platform (Camx), and a public library.
  • the media control layer 1312 is used to connect to the business customization of the chip platform.
  • the chip platform includes a collection of codes of common functional interfaces, such as core, which is used to store the core implementation module of the chip platform.
  • the public library includes various data such as operating system adaptation data and metadata (Metadata).
  • the frame rate policy decision layer 132 includes: a frame rate policy analysis module 1322 , a frame rate policy calculation module 1323 and a frame rate policy control module 1324 which are connected in sequence; the frame rate policy decision layer 132 also includes an interface adaptation module 1321 .
  • the interface adaptation module 1321 is also connected to the frame rate policy calculation module 1323, the frame rate policy control module 1324 and the camera hardware abstraction layer 131.
  • the interface adaptation module 1321 is used to receive initial data from the camera hardware abstraction layer 131 and provide the initial data to the frame rate policy calculation module 1323; the interface adaptation module 1321 is also used to receive the decision instruction provided by the frame rate policy control module 1324 and provide the decision instruction to the camera hardware abstraction layer 131, so that the camera hardware abstraction layer 131 can be controlled according to the decision instruction.
  • the interface adaptation module 1321 is respectively connected to the interface layer 1311 and the media control layer 1312 in the camera hardware abstraction layer 131, and the interface adaptation module 1321 is used to receive initial data from the interface layer 1311 and provide the initial data to the frame rate strategy calculation module 1323; the interface adaptation module 1321 is also used to receive decision instructions provided by the frame rate strategy control module 1324, and provide the decision instructions to the media control layer 1312, so that the media control layer 1312 can be controlled according to the decision instructions.
  • the frame rate policy parsing module 1322 is used to parse the frame rate policy configuration to obtain multiple frame rate policies.
  • the frame rate strategy calculation module 1323 is used to determine a target frame rate strategy that matches the initial data from among multiple frame rate strategies according to the initial data, and the target frame rate strategy is one of the multiple frame rate strategies.
  • the frame rate policy control module 1324 is used to generate a decision instruction according to the target frame rate policy.
  • the interface adaptation module 1321 is used to output the decision instruction.
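  • The module chain described above (parsing module → calculation module → control module → interface adaptation module) can be sketched as follows. This is an illustrative outline only; all class names, field names, and data shapes are assumptions, not the actual implementation of the decision layer.

```python
# Hypothetical sketch of the frame rate strategy decision layer pipeline:
# parsing module -> calculation module -> control module -> adaptation module.
# All names and data shapes below are illustrative, not the patent's code.

class FrameRateStrategyParser:
    """Parses the frame rate strategy configuration into multiple strategies."""
    def parse(self, config):
        # config: mapping of strategy name -> condition set
        return [{"name": name, "conditions": cond} for name, cond in config.items()]

class FrameRateStrategyCalculator:
    """Picks the target strategy whose conditions all match the initial data."""
    def calculate(self, strategies, initial_data):
        for strategy in strategies:
            if all(initial_data.get(k) == v for k, v in strategy["conditions"].items()):
                return strategy
        return strategies[-1]  # fall back to the last (default) strategy

class FrameRateStrategyController:
    """Generates a decision instruction from the target strategy."""
    def decide(self, strategy):
        return {"strategy": strategy["name"]}

class InterfaceAdapter:
    """Bridges the camera HAL and the decision layer in both directions."""
    def output(self, instruction):
        return instruction  # would be handed back to the camera HAL

config = {"high_quality_low_frame": {"brightness": "high", "state": "shaking"},
          "low_quality_low_frame": {}}
strategies = FrameRateStrategyParser().parse(config)
target = FrameRateStrategyCalculator().calculate(
    strategies, {"brightness": "high", "state": "shaking"})
decision = InterfaceAdapter().output(FrameRateStrategyController().decide(target))
```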
  • the initial data may include at least one of: sensing data and control data.
  • the sensor data may include at least one of: ambient brightness value, dynamic range value, status information, and scene type; the control data may include the zoom factor, etc.
  • the sensor data can be set as needed, and the embodiments of the present application do not impose any restrictions on this.
  • the control data can be an instruction generated by the user by clicking or dragging the corresponding zoom factor control.
  • the sensor data in the initial data is used to indicate the type of the current shooting environment of the electronic device; the control data in the initial data is used to indicate the zoom factor that the user can select when recording a video.
  • the zoom factor includes the zoom factor corresponding to before the user performs a zoom operation, referred to herein as a first zoom factor, and the zoom factor after the user performs a zoom operation, referred to herein as a second zoom factor.
  • Zoom switching mode refers to the operation mode used by the user to change the zoom ratio, such as sliding mode or point-to-point mode.
  • the sliding mode refers to the user continuously sliding at a certain position on the zoom bar or display screen to change the zoom ratio;
  • the point-to-point mode refers to the user directly clicking on the zoom ratio value at a certain position on the interface to change the zoom ratio.
  • the point-to-point mode is discontinuous, in contrast to the sliding mode.
  • the ambient brightness value refers to the brightness value collected by the ambient light sensor.
  • its reference indicators can include dark (or low brightness), ordinary illumination, and bright (or high brightness); different reference indicators correspond to different brightness ranges.
  • the dynamic range value is based on the setting of the dynamic threshold, and its reference indicators may include high dynamic and low dynamic.
  • the state information refers to the state of the electronic device during the video recording process, which can be measured by a gyroscope sensor at the hardware layer. Its reference indicators may include shaking scenes and still scenes.
  • the scene type is an indicator determined based on the flicker of lights in the environment, and its reference indicators may include light flicker (or high-frequency light flicker) and light stability; it can be determined from the light flicker collected by the ambient light sensor.
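  • As a sketch, the raw readings described above could be mapped to their reference indicators as follows. The threshold values are made-up placeholders for illustration; the actual thresholds are set per camera hardware, as noted later in the text.

```python
# Hypothetical mapping of raw sensor readings to the reference indicators
# described above. All threshold values are illustrative placeholders.

BRIGHTNESS_DARK_MAX = 50     # assumed boundary for "dark" (low brightness)
BRIGHTNESS_BRIGHT_MIN = 500  # assumed boundary for "bright" (high brightness)
DYNAMIC_THRESHOLD = 0.6      # assumed dynamic range boundary
GYRO_SHAKE_THRESHOLD = 0.2   # assumed gyroscope magnitude boundary

def classify(brightness, dynamic_range, gyro_magnitude, flicker_detected):
    if brightness < BRIGHTNESS_DARK_MAX:
        brightness_level = "dark"
    elif brightness >= BRIGHTNESS_BRIGHT_MIN:
        brightness_level = "bright"
    else:
        brightness_level = "ordinary"
    return {
        "brightness": brightness_level,
        "dynamic": "high" if dynamic_range > DYNAMIC_THRESHOLD else "low",
        "state": "shaking" if gyro_magnitude > GYRO_SHAKE_THRESHOLD else "still",
        "scene": "light_flicker" if flicker_detected else "light_stable",
    }
```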
  • data such as zoom ratio can come from the application layer 110; sensor data can come from the chip platform, from the hardware abstraction layer 130 or from the hardware layer 150, and of course can also come from other layers.
  • the embodiments of the present application do not impose any restrictions on this.
  • the dynamic frame rate configuration may include multiple frame rate strategies according to requirements, for example, may include a first strategy, a second strategy, a third strategy, etc.
  • the corresponding zoom multiples may also be in other ranges, which are not specifically limited.
  • the first strategy can be a high-quality low-frame strategy.
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, and the state information indicates a jitter state, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the first strategy is output to control the shooting frame rate during the video recording process; this adapts to high-brightness, high-dynamic, jittery shooting scenes, such as the concert scene shown in FIG. 10 (b).
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a scene with unchanged lighting, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the first strategy is output to control the shooting frame rate during the video recording process; this adapts to high-brightness, high-dynamic, static, lighting-stable shooting scenes.
  • the second strategy can be a low-quality high-frame strategy.
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a light-flickering scene, the second strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the second strategy is output to control the shooting frame rate during the video recording process; this adapts to high-brightness, high-dynamic, static scenes with high-frequency flickering of lights, such as the marquee scene shown in (a) of FIG. 10.
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the scene type is a light-flickering scene, the second strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the second strategy is output to control the shooting frame rate during the video recording process; this adapts to high-brightness, low-dynamic, static scenes with high-frequency flickering of lights.
  • the third strategy can be a low-quality low-frame strategy.
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, and the state information indicates a jitter state, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the third strategy is invoked to control the shooting frame rate during video recording; this adapts to high-brightness, low-dynamic, jittery shooting scenes.
  • when the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the lighting is unchanged, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy, adapting to high-brightness, low-dynamic, static, lighting-stable scene types.
  • when the ambient brightness value is less than or equal to the brightness threshold, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy, adapting to low-brightness shooting scenes.
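  • The condition combinations above amount to a decision table. The sketch below encodes them directly for illustration, with the understanding that in the described design the mapping comes from the dynamic frame rate configuration rather than hard-coded rules.

```python
# Hypothetical decision function encoding the condition combinations above
# (first = high-quality low-frame, second = low-quality high-frame,
# third = low-quality low-frame). In the described design this mapping is
# read from the dynamic frame rate configuration, not hard-coded.

def select_strategy(brightness, dynamic, state, scene):
    if brightness == "low":
        return "third"                     # low-brightness scenes
    if dynamic == "high":
        if state == "shaking":
            return "first"                 # e.g. the concert scene
        if state == "still" and scene == "light_stable":
            return "first"
        if state == "still" and scene == "light_flicker":
            return "second"                # e.g. the marquee scene
    else:                                  # low dynamic range
        if state == "still" and scene == "light_flicker":
            return "second"
        if state == "shaking":
            return "third"
        if state == "still" and scene == "light_stable":
            return "third"
    return "third"                         # default fallback
```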
  • the final output decision instruction may include sensor module parameters.
  • the sensor module parameters corresponding to high quality and low frame rate may be dual conversion gain control unit (DCG) 30fps.
  • DCG is an image output format for improving the output image quality of the image sensor. It can output images with high dynamic range through high conversion gain and low conversion gain reading modes.
  • the sensor module parameters corresponding to low quality and high frame rate may be Binning 60fps. Binning is an image readout mode that adds the charges induced by adjacent pixels together and reads them out as a single pixel.
  • the sensor module parameters corresponding to low quality and low frame rate may be Binning 30fps.
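  • The strategy-to-sensor-parameter mapping given in the examples above (DCG 30fps, Binning 60fps, Binning 30fps) might be represented as follows; the dictionary encoding and field names are assumptions for illustration.

```python
# Hypothetical mapping from a target frame rate strategy to sensor module
# parameters, per the examples above. Field names are assumptions.

SENSOR_PARAMS = {
    "high_quality_low_frame": {"readout_mode": "DCG", "fps": 30},
    "low_quality_high_frame": {"readout_mode": "Binning", "fps": 60},
    "low_quality_low_frame":  {"readout_mode": "Binning", "fps": 30},
}

def decision_instruction(strategy_name):
    """Build the decision instruction carried back to the camera HAL."""
    return SENSOR_PARAMS[strategy_name]
```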
  • the multiple frame rate strategies in the dynamic strategy configuration can be configured in a file in XML format; of course, they can also be configured in other ways, and the embodiments of the present application do not impose any restrictions on this.
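  • Since the strategies may be configured in XML, a minimal parser sketch might look like this. The element and attribute names below are an assumed schema; the text specifies only that the format can be XML.

```python
# Minimal sketch of parsing a dynamic frame rate configuration from XML.
# The schema (element/attribute names) is assumed for illustration.

import xml.etree.ElementTree as ET

SAMPLE_XML = """
<frameRateStrategies>
  <strategy name="high_quality_low_frame" readoutMode="DCG" fps="30">
    <condition key="brightness" value="high"/>
    <condition key="state" value="shaking"/>
  </strategy>
  <strategy name="low_quality_high_frame" readoutMode="Binning" fps="60">
    <condition key="scene" value="light_flicker"/>
  </strategy>
</frameRateStrategies>
"""

def parse_strategies(xml_text):
    """Parse the configuration into a list of strategy dictionaries."""
    root = ET.fromstring(xml_text)
    strategies = []
    for node in root.findall("strategy"):
        strategies.append({
            "name": node.get("name"),
            "readout_mode": node.get("readoutMode"),
            "fps": int(node.get("fps")),
            "conditions": {c.get("key"): c.get("value")
                           for c in node.findall("condition")},
        })
    return strategies
```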
  • the frame rate policy decision layer 132 further includes: a first conversion module 1325 and a second conversion module 1326 .
  • the first conversion module 1325 is connected between the interface adaptation module 1321 and the frame rate strategy calculation module 1323 , and is used to convert the initial data received by the interface adaptation module 1321 into first data and input the first data into the frame rate strategy calculation module 1323 .
  • the second conversion module 1326 is connected between the frame rate policy control module 1324 and the interface adaptation module 1321 .
  • the second conversion module is used to convert the decision instruction output by the frame rate policy control module 1324 into second data and provide the second data to the interface adaptation module 1321 for output.
  • the format of the first data should be a format that can be recognized and processed by the frame rate strategy calculation module 1323, such as the McxContext format.
  • the format of the second data should be a format that can be recognized and processed by the camera hardware abstraction layer 131. Therefore, by setting different conversion modules, different chip platforms can be adapted, and the platform applicability and portability are relatively good.
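  • The two conversion modules act as format adapters on either side of the decision layer. A sketch follows; the fields of the internal format (standing in for McxContext) and of the HAL-side instruction format are invented for illustration.

```python
# Hypothetical sketch of the two conversion modules. "McxContext" is the
# format named in the text; its contents here, and the HAL-side instruction
# wrapper, are invented for illustration.

def convert_initial_data(raw):
    """First conversion module: HAL initial data -> calculator input."""
    return {"context": dict(raw)}  # stand-in for the McxContext format

def convert_decision(instruction):
    """Second conversion module: decision instruction -> HAL/platform format."""
    return {"tag": "frame_rate_decision", "payload": instruction}

first = convert_initial_data({"brightness": 800, "zoom": 1.0})
second = convert_decision({"readout_mode": "DCG", "fps": 30})
```

By isolating the platform-specific formats in these two modules, only the converters need to change when porting to a different chip platform, which is the portability benefit the text describes.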
  • An embodiment of the present application provides an electronic device. The control of the camera shooting frame rate can be decoupled from the chip platform, and the frame rate policies in the frame rate policy decision layer can be configured and changed more flexibly and conveniently to meet the needs of a growing variety of shooting scenes. The shooting frame rate and power consumption are better balanced, extending the usage time of the electronic device.
  • FIG6 shows a flow chart of a shooting frame rate control method provided by the present application. As shown in FIG6 , the method includes the following S10 to S100.
  • In response to a preset trigger condition, the camera application sends a request to the camera hardware abstraction layer 131.
  • the sensor in the hardware layer 150 reports the real-time data of the camera to the camera hardware abstraction layer 131 .
  • the real-time data of the camera may include data corresponding to various shooting scenes.
  • the camera hardware abstraction layer 131 parses the corresponding initial data according to the sent request and the reported real-time data of the camera.
  • the initial data may include sensor data and control data; the sensor data may include: ambient brightness value, dynamic range value, status information, scene type, etc.; the control data may include: zoom factor.
  • the camera hardware abstraction layer 131 transmits the initial data to the interface adaptation module 1321 in the frame rate strategy decision layer 132, the interface adaptation module 1321 provides the initial data to the first conversion module 1325, and the first conversion module 1325 converts the initial data into first data.
  • the first data may be conditional control information corresponding to the shooting scene after the initial data is converted.
  • the first conversion module 1325 may provide the frame rate strategy calculation module 1323 with condition control information corresponding to the shooting scene.
  • the frame rate strategy calculation module 1323 calculates an adapted target frame rate strategy from a plurality of frame rate strategies according to the received conditional control information.
  • the frame rate strategy parsing module parses the dynamic frame rate configuration to obtain multiple frame rate strategies; the target frame rate strategy is one of the multiple frame rate strategies.
  • the frame rate strategy calculation module 1323 outputs the target frame rate strategy.
  • the target frame rate strategy is a frame rate strategy adapted to different shooting scenes.
  • the frame rate strategy control module 1324 generates a decision instruction according to the target frame rate strategy, and returns it to the second conversion module 1326 .
  • the decision instruction includes sensor module parameters, and the shooting frame rate of the camera is controlled by the sensor module parameters.
  • the second conversion module 1326 converts the decision instruction into second data, outputs it to the camera hardware abstraction layer 131 via the interface adaptation module 1321, and then sends it to the sensor to control the shooting frame rate of multiple cameras according to the decision instruction.
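  • Steps S10 to S100 above amount to one round trip through the decision layer. A compact sketch of that round trip follows, with all names and data shapes assumed for illustration.

```python
# Compact, hypothetical sketch of one S10-S100 round trip: the HAL supplies
# initial data, the decision layer converts it, matches a strategy, and
# returns a converted decision instruction to the HAL.

def run_frame_rate_decision(initial_data, strategies):
    # first conversion: HAL data -> decision-layer format (identity here)
    first_data = dict(initial_data)
    # strategy calculation: first strategy whose conditions all match
    target = next((s for s in strategies
                   if all(first_data.get(k) == v
                          for k, v in s["conditions"].items())),
                  strategies[-1])
    # control: build the decision instruction (sensor module parameters)
    instruction = {"readout_mode": target["readout_mode"],
                   "fps": target["fps"]}
    # second conversion: decision instruction -> HAL format
    return {"payload": instruction}

strategies = [
    {"conditions": {"state": "shaking"}, "readout_mode": "DCG", "fps": 30},
    {"conditions": {}, "readout_mode": "Binning", "fps": 30},
]
result = run_frame_rate_decision({"state": "shaking"}, strategies)
```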
  • An embodiment of the present application provides a shooting frame rate control method. By integrating, in the hardware abstraction layer of the electronic device, a frame rate strategy decision layer containing multiple frame rate strategies adapted to the shooting environment, the control of the shooting frame rate can be decoupled from the chip platform, and the frame rate strategies in the frame rate strategy decision layer can be configured and changed more flexibly and conveniently to meet the needs of a growing variety of shooting scenarios.
  • Figure 7 is a flow chart of a method for controlling a shooting frame rate provided in an embodiment of the present application.
  • the present application implements a calling process between modules in the frame rate strategy decision layer 132 for controlling a shooting frame rate, and the interaction process may include the following S101 to S107.
  • the multi-camera management module initializes the multi-camera strategy management module.
  • the initialized multi-camera management module calls the multi-camera merging strategy-frame rate module, the multi-camera scene strategy module, the multi-camera frame rate strategy module and the multi-camera frame rate calculation module.
  • the camera application in the application layer 110 is displayed on the screen of the electronic device 100 in the form of an icon.
  • when the icon of the camera application is clicked by the user, the electronic device 100 starts to run the camera application.
  • the camera application calls the corresponding camera access interface in the application framework layer 120, which calls the frame rate strategy decision layer 132 through the camera hardware abstraction layer 131.
  • the frame rate strategy decision layer 132 performs relevant initialization, loads relevant dynamic frame rate configuration files, and loads and parses frame rate strategy related files.
  • the camera hardware abstraction layer 131 can control the preview interface to display that the current shooting mode is the video mode, the zoom ratio is 1x, the main camera is called to capture at the calculated shooting frequency, and the captured image is transmitted to the viewfinder for display.
  • the multi-camera frame rate calculation module calls the multi-camera frame rate analysis module.
  • the multi-camera frame rate parsing module parses the frame rate policy configuration to obtain the frame rate policy; and provides the frame rate policy to the multi-camera frame rate calculation module.
  • the multi-camera strategy management module retrieves the request and control information sent by the application layer and transmitted to the multi-camera management module.
  • the control information may include the zoom factor changed by the user by sliding or clicking the zoom bar.
  • the multi-camera strategy management module selects a frame rate strategy and provides it to the multi-camera merging strategy-frame rate module.
  • the multi-camera frame rate strategy module calculates the acquired data based on the provided frame rate strategy, and provides the calculated frame rate result to the multi-camera frame rate calculation module, and the multi-camera frame rate calculation module generates corresponding parameters and outputs them to the multi-camera management module.
  • FIG 8 is a schematic diagram of the dynamic control of the frame rate strategy in the application scenario provided by the embodiment of the present application.
  • the shooting frame rate is fixed at 60 FPS or 30 FPS and is not dynamically adapted as the scene changes.
  • the embodiment of the present application provides a shooting frame rate control method, which may include the following S111 to S118.
  • the camera application in the application layer 110 is displayed in the form of an icon on the screen of the electronic device 100.
  • when the icon of the camera application is clicked by the user, the electronic device 100 starts to run the camera application.
  • the camera application runs on the electronic device 100, the camera application calls the camera access interface corresponding to the camera application in the application framework layer 120, and calls the frame rate policy decision layer 132 through the camera hardware abstraction layer 131.
  • the frame rate policy decision layer 132 performs relevant initialization, loads relevant frame rate policy configuration files, and loads and parses frame rate policy related files.
  • the camera hardware abstraction layer 131 can control the preview interface to display that the current shooting mode is the video recording mode and the zoom ratio is 1x; the main camera is called to capture at a shooting frequency adapted to the current shooting scene, and the captured image is transmitted to the viewfinder for display.
  • the frame rate strategy calculation module calculates the adaptation condition information according to the ambient brightness value and the control data.
  • the frame rate strategy calculation module matches the condition information with the parsed frame rate strategy and calculates the frame rate result.
  • the frame rate strategy calculation module determines the target frame rate strategy according to the frame rate result and the sensor data.
  • the frame rate strategy calculation module can calculate the condition information that the ambient brightness value and the zoom multiple meet based on the ambient brightness value uploaded by the sensor of the hardware layer and the zoom multiple provided by the application layer.
  • the condition information may include: ultra-wide-angle, wide+dark, wide+bright, or telephoto information. The ambient brightness value is evaluated to determine its range (dark or bright), the zoom factor is used to determine the shooting mode to be used (ultra-wide-angle, wide-angle, or telephoto), and the two results are then combined to determine whether the wide+dark or wide+bright condition is met.
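  • A sketch of combining the ambient brightness value and zoom factor into the condition information above; the zoom boundaries and brightness threshold are assumptions for illustration, not values from the application.

```python
# Hypothetical combination of ambient brightness and zoom factor into the
# condition information named above (ultra-wide, wide+dark, wide+bright,
# telephoto). Zoom boundaries and the brightness threshold are assumed.

ULTRA_WIDE_MAX_ZOOM = 1.0   # assumed: below 1x selects the ultra-wide camera
TELE_MIN_ZOOM = 3.0         # assumed: 3x and above selects the telephoto camera
DARK_THRESHOLD = 100        # assumed brightness boundary between dark and bright

def condition_info(brightness, zoom):
    if zoom < ULTRA_WIDE_MAX_ZOOM:
        return "ultra_wide"
    if zoom >= TELE_MIN_ZOOM:
        return "telephoto"
    return "wide+dark" if brightness < DARK_THRESHOLD else "wide+bright"
```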
  • the frame rate result calculated by the frame rate strategy calculation module is the first frame rate; such as 30 FPS as shown in FIG. 8 (b).
  • the frame rate result calculated by the frame rate strategy calculation module is a second frame rate; for example, 60 FPS as shown in FIG. 8 (b).
  • the target frame rate strategy is switched from the frame rate strategy corresponding to the first frame rate to the frame rate strategy corresponding to the second frame rate.
  • the corresponding shooting frame rate is also controlled to be dynamically switched.
  • the frame rate result of the above calculation can also be another frame rate, such as 24 frames, 112 frames, or any frame rate between 30 and 60 frames; it can be set according to the needs of the actual application scenario.
  • the embodiment of the present application only exemplifies the dynamic control of the frame rate; of course, steps can be added or reduced, and the embodiment of the present application does not impose any restrictions on this.
  • the shooting frame rate control method may further include the following S121 to S126.
  • the camera application in the application layer 110 is displayed in the form of an icon on the screen of the electronic device 100.
  • when the icon of the camera application is clicked by the user, the electronic device 100 starts to run the camera application.
  • the camera application runs on the electronic device 100, the camera application calls the camera access interface corresponding to the camera application in the application framework layer 120, and calls the frame rate policy decision layer 132 through the camera hardware abstraction layer 131.
  • the frame rate policy decision layer 132 performs relevant initialization, loads relevant frame rate policy configuration files, and loads and parses frame rate policy related files.
  • the camera hardware abstraction layer 131 can control the preview interface to display that the current shooting mode is the video recording mode and the zoom ratio is 1x; the main camera is called to capture at a shooting frequency adapted to the current shooting scene, and the captured image is transmitted to the viewfinder for display.
  • the camera application process is opened to record the video according to the required zoom ratio and/or shooting frame rate.
  • the camera application schedules the request, and the hardware abstraction layer 130 parses the control information (such as zoom ratio) and scene information issued by the application from the upper layer request.
  • the sensor of the hardware layer 150 reports the latest sensor data of the sensor to the hardware abstraction layer, and performs data conversion on the control information and the sensor data at the interface adaptation layer, and converts them into a data format that can be recognized by the frame rate strategy decision layer 132.
  • the frame rate strategy calculation module 1323 calculates whether the ambient brightness value in the current state data satisfies the condition information according to a preset brightness threshold, for example, whether the current ambient brightness value satisfies the high brightness or low brightness condition information.
  • the brightness threshold may be set based on the hardware foundation of the camera, for example, based on the type of camera.
  • the specific size of the brightness threshold is not limited here.
  • the frame rate strategy calculation module 1323 calculates whether the dynamic range value in the current state data satisfies the condition information according to a preset dynamic threshold, for example, whether the dynamic range value of the current shooting environment satisfies the high dynamic or low dynamic condition information.
  • the frame rate strategy calculation module 1323 calculates whether the condition information is satisfied according to the state uploaded by the sensor. For example, whether the current state of the camera satisfies the condition information of the shaking state or the static state.
  • the frame rate strategy calculation module 1323 determines whether the light status in the scene uploaded by the sensor satisfies the condition information, for example, whether the light in the current shooting scene satisfies the condition information of high-frequency flickering of the light or unchanged light.
  • the frame rate policy parsing module 1322 parses and stores the configured dynamic frame rate configuration XML.
  • the dynamic frame rate configuration XML includes a frame rate strategy corresponding to a shooting scene composed of various condition information or a combination of condition information.
  • the frame rate strategy calculation module 1323 matches and calculates the adapted condition information and the analyzed frame rate strategy of the corresponding shooting scene, integrates the output frame rate result and the camera state, and outputs the final decision result and the target frame rate strategy.
  • the frame rate policy control module generates a decision instruction based on the target frame rate policy, and the interface adaptation module 1321 outputs the decision instruction.
  • the interface adaptation module 1321 outputs to the media control layer 1312 of the camera hardware abstraction layer 131.
  • the media control layer 1312 adds the decision instruction to the request and transmits it to the chip platform, which then sends it to the driver layer 140 to control the shooting frame rate of multiple cameras according to the decision instruction, thereby enabling the camera to dynamically adapt and adjust the shooting frame rate in different shooting scenarios.
  • the frame rate strategy decision layer can also determine the target frame rate strategy based on the sensor data, control data, and power consumption information.
  • the power consumption information can be the state information of the sensor, that is, the algorithm power consumption of the sensor corresponding to the current shooting mode, such as the algorithm power consumption corresponding to the night scene mode.
  • the frame rate strategy decision layer 132 can also adapt and regulate the dynamic frame rate in combination with the power consumption information required by the current shooting scene to achieve a better balance between shooting effect and running power consumption.
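  • A sketch of folding the power consumption information into the decision; the power budget value and the downgrade rule are assumptions chosen purely for illustration of the shooting-effect/power trade-off described above.

```python
# Hypothetical power-aware adjustment: if the algorithm power consumption of
# the current shooting mode exceeds a budget, downgrade a high-frame decision
# to 30fps. The budget value and the downgrade rule are illustrative.

POWER_BUDGET_MW = 800  # assumed power budget for the capture pipeline (mW)

def apply_power_constraint(instruction, algorithm_power_mw):
    """Trade frame rate for running power when over budget."""
    if algorithm_power_mw > POWER_BUDGET_MW and instruction["fps"] > 30:
        adjusted = dict(instruction)
        adjusted["fps"] = 30
        return adjusted
    return instruction
```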
  • condition information is only illustrative, and the shooting scenes corresponding to the individual condition information or the combination of the condition information can be expanded or matched based on the actual application scene, and are not specifically limited here.
  • Fig. 11 shows a hardware system of an electronic device applicable to the present application.
  • the electronic device 100 can be used to implement the shooting frame rate control method described in the above method embodiment.
  • the electronic device 100 can be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, etc.
  • the embodiment of the present application does not impose any limitation on the specific type of the electronic device 100.
  • the electronic device 100 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone jack 270D, a sensor module 280, a button 290, a motor 291, an indicator 292, a camera 293, a display 294, and a fan unit 295.
  • the sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units, for example, the processor 210 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • Different processing units may be independent devices or integrated into one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller may generate an operation control signal according to the instruction operation code and the timing signal to complete the control of fetching and executing instructions.
  • the processor 210 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 210 is a cache memory.
  • the memory may store instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, they can be called directly from the memory, which avoids repeated access, reduces the waiting time of the processor 210, and thus improves the efficiency of the system.
  • the processor 210 can execute displaying a desktop (main interface or preview interface), where the display interface may include a first control (for example, the main interface may include a camera control, and the preview interface may include a zoom control); detecting a first operation on the first control; determining a target frame rate strategy based on initial data in response to the first operation; and then determining a decision instruction based on the target frame rate strategy to dynamically modulate the shooting frame rate of the camera in the shooting scene.
  • connection relationship between the modules shown in Fig. 11 is only a schematic illustration and does not constitute a limitation on the connection relationship between the modules of the electronic device 100.
  • the modules of the electronic device 100 may also adopt a combination of multiple connection modes in the above embodiments.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
  • antenna 1 can be reused as a diversity antenna for a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 250 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
  • the mobile communication module 250 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
  • the mobile communication module 250 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 250 can also amplify the signal modulated by the modulation and demodulation processor, and convert it into electromagnetic waves for radiation through the antenna 1.
  • at least some functional modules of the mobile communication module 250 can be set in the processor 210.
  • at least some functional modules of the mobile communication module 250 can be set in the same device as at least some modules of the processor 210.
  • the electronic device 100 can realize the display function through a GPU, a display screen 294, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 210 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 294 is used to display images, videos, etc.
  • the display screen 294 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 294, where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through ISP, camera 293, video codec, GPU, display screen 294 and application processor.
  • the ISP is used to process the data fed back by the camera 293. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converts it into an image visible to the naked eye.
  • the ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP can be set in the camera 293.
  • the camera 293 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format.
  • the electronic device 100 may include 1 or N cameras 293, where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals, and can process not only digital image signals but also other digital signals. For example, when the electronic device 100 is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the electronic device 100 can implement audio functions such as music playing and recording through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headphone jack 270D, and the application processor.
  • the pressure sensor 280A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • the pressure sensor 280A can be set on the display screen 294.
  • the capacitive pressure sensor may consist of at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the touch operation intensity according to the pressure sensor 280A.
  • the electronic device 100 can also calculate the position of the touch according to the detection signal of the pressure sensor 280A.
  • touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
  • the gyro sensor 280B can be used to determine the motion posture of the electronic device 100.
  • the gyro sensor 280B may be used to determine the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes).
  • the gyro sensor 280B can be used for anti-shake shooting. For example, when the shutter is pressed, the gyro sensor 280B detects the angle of the shaking of the electronic device 100, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 280B can also be used for navigation and somatosensory game scenes.
  • the gyro sensor can also detect the state of the electronic device during the shooting process, such as a moving shooting state or a stationary shooting state.
  • the acceleration sensor 280E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and is applied to applications such as horizontal and vertical screen switching and pedometers.
  • the distance sensor 280F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 280F to measure the distance to achieve fast focusing.
  • the proximity light sensor 280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 280G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 280G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 280L is used to sense the ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 294 according to the perceived ambient light brightness.
  • the ambient light sensor 280L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 280L can also cooperate with the proximity light sensor 280G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photography, fingerprint answering calls, etc.
  • the temperature sensor 280J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 280J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold, the electronic device 100 executes a strategy of reducing the performance of a processor located near the temperature sensor 280J, so as to reduce power consumption and implement thermal protection.
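The threshold-based temperature processing strategy above can be sketched as follows. This is a minimal illustration only; the threshold value, function name, and performance levels are assumptions, not values from this application.

```python
def thermal_strategy(reported_temp_c: float, threshold_c: float = 45.0) -> str:
    """Return the performance level for a processor near the temperature sensor."""
    if reported_temp_c > threshold_c:
        # Over the threshold: lower processor performance to cut power
        # consumption (thermal protection).
        return "reduced"
    return "normal"
```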
  • in some embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 242 to prevent an abnormal shutdown of the electronic device 100 caused by low temperature.
  • in some other embodiments, the electronic device 100 boosts the output voltage of the battery 242 to prevent an abnormal shutdown caused by low temperature.
  • the touch sensor 280K is also called a “touch device”.
  • the touch sensor 280K can be set on the display screen 294.
  • the touch sensor 280K and the display screen 294 form a touch screen, also called a “touch control screen”.
  • the touch sensor 280K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 294.
  • the touch sensor 280K can also be set on the surface of the electronic device 100, which is different from the position of the display screen 294.
  • the structure of the above-mentioned electronic device is only illustrative, and based on different application scenarios, it may also include other physical structures, and the physical structure of the electronic device is not limited here.
  • the embodiment of the present application also provides a computer program product including computer instructions, which, when executed on the electronic device 100, enables the electronic device 100 to execute the shooting frame rate control method shown above.
  • FIG. 12 is a schematic diagram of the structure of a chip system provided in an embodiment of the present application.
  • the chip system 30 shown in FIG. 12 can be a general-purpose processor or a dedicated processor.
  • the chip system 30 includes a processor 301.
  • the processor 301 is used to support the electronic device 100 to execute the technical solution shown above.
  • the chip system further includes a transceiver 302, which operates under the control of the processor 301 and supports the electronic device 100 in executing the technical solution shown above.
  • the chip system shown in FIG. 12 may further include a storage medium 303.
  • the chip system shown in FIG. 12 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGAs), programmable logic devices (PLDs), controllers, state machines, gate logic, discrete hardware components, any other suitable circuits, or any combination of circuits that can perform the various functions described throughout this application.
  • the electronic device, computer storage medium, computer program product, and chip system provided in the above-mentioned embodiments of the present application are all used to execute the methods provided above. Therefore, the beneficial effects that can be achieved can refer to the corresponding beneficial effects of the methods provided above, and will not be repeated here.
  • “pre-setting” and “pre-definition” can be implemented by pre-saving, in a device (for example, an electronic device), corresponding code, tables, or other means that can be used to indicate relevant information; the present application does not limit the specific implementation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

This application relates to the field of terminal technologies and provides a shooting frame rate control method, an electronic device, a chip system, and a readable storage medium. In the shooting frame rate control method of this application, the electronic device can detect a first operation on a first control in a display interface; in response to the first operation, a frame rate strategy decision layer determines a target frame rate strategy based on initial data; the frame rate strategy decision layer then generates a decision instruction based on the target frame rate strategy, where the decision instruction is used to control the shooting frame rates of multiple cameras. By arranging the frame rate strategy decision layer in the hardware abstraction layer, this application decouples the configuration logic for controlling the shooting frame rates of multiple cameras from the hardware platform, which facilitates subsequent extension and maintenance of the shooting frame rate control function.

Description

Shooting frame rate control method, electronic device, chip system, and readable storage medium
This application claims priority to Chinese patent application No. 202211350405.9, filed with the China National Intellectual Property Administration on October 31, 2022 and entitled "Shooting frame rate control method, electronic device, chip system, and readable storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of terminal technologies, and in particular to a shooting frame rate control method, an electronic device, a chip system, and a readable storage medium.
Background
As the functions of electronic devices continue to develop, their shooting capabilities keep improving; by installing multiple cameras on an electronic device, more shooting modes can be provided for users to select and use.
Generally, when shooting conditions are good, a high frame rate can be used to achieve a better shooting effect; otherwise, a low frame rate is used. However, a higher frame rate also increases power consumption. At present, when shooting with an electronic device, users can only shoot at the device's fixed frame rate modes; moreover, the services involving multiple cameras are coupled with the hardware platform of the electronic device. As frame rate adaptation schemes for multiple cameras under different shooting environments and power budgets multiply, the adaptation logic of multiple cameras for different shooting scenes becomes increasingly complex, which brings great difficulties to subsequent function extension and maintenance.
Summary
This application provides a shooting frame rate control method, an electronic device, a chip system, and a readable storage medium. By decoupling the frame rate adaptation logic of multiple cameras from the hardware platform, subsequent extension and maintenance of the shooting frame rate control function are facilitated.
To achieve the above objective, this application adopts the following technical solutions:
According to a first aspect, a shooting frame rate control method is provided, applied to an electronic device including a frame rate strategy decision layer and multiple cameras. The method may include:
the electronic device detects a first operation on a first control in a display interface; in response to the first operation, the frame rate strategy decision layer determines a target frame rate strategy based on initial data; the frame rate strategy decision layer further generates a decision instruction based on the target frame rate strategy, where the decision instruction is used by the electronic device to control the shooting frame rates of the multiple cameras.
In this way, the frame rate strategy decision layer is arranged in the hardware abstraction layer, and the configuration logic for controlling the shooting frame rates of the multiple cameras is decoupled from the hardware platform, which facilitates subsequent function extension and maintenance.
In a possible implementation of the first aspect, the frame rate strategy decision layer includes an interface adaptation module, a frame rate strategy calculation module, and a frame rate strategy control module connected in sequence; the interface adaptation module is also connected to the frame rate strategy control module; the frame rate strategy decision layer further includes a frame rate strategy parsing module connected to the frame rate strategy calculation module.
The method includes:
the interface adaptation module obtains the initial data, where the initial data includes at least one of sensing data and control data;
the frame rate strategy parsing module parses a frame rate strategy configuration to obtain multiple frame rate strategies;
the frame rate strategy calculation module determines, among the multiple frame rate strategies and based on the initial data, the target frame rate strategy matching the initial data, where the target frame rate strategy is one of the multiple frame rate strategies;
the frame rate strategy control module generates the decision instruction based on the target frame rate strategy;
the interface adaptation module outputs the decision instruction.
In this way, based on the multi-camera infrastructure, and through the frame rate strategy parsing module, the frame rate strategy calculation module, and the frame rate strategy control module, the shooting frame rate corresponding to different sensing states can be regulated in different scenes. The configuration logic for controlling the shooting frame rates of the multiple cameras is decoupled from the hardware platform, and the adaptation calculation and output control of the shooting frame rate are implemented through multiple modules, offering cross-chip-platform applicability and portability. Meanwhile, the frame rate strategy calculation module can adjust the shooting frame rate based on the sensing data or the control data, balancing the running power consumption of the multiple cameras during shooting and thereby extending the battery life of the electronic device.
In a possible implementation of the first aspect, the frame rate strategy decision layer further includes a first conversion module and a second conversion module; the first conversion module is connected between the interface adaptation module and the frame rate strategy calculation module, and the second conversion module is connected between the interface adaptation module and the frame rate strategy control module.
After the interface adaptation module obtains the initial data, the method further includes:
the first conversion module converts the initial data into first data;
after the frame rate strategy control module generates the decision instruction based on the target frame rate strategy, the method further includes:
the second conversion module converts the decision instruction into second data;
the interface adaptation module outputs the second data.
In this way, the frame rate strategy decision layer is arranged as an independent module, and data conversion is implemented through conversion interfaces, so that the layer can be adapted to different chip architectures, which further facilitates the extension and maintenance of functions such as the frame rate adaptation logic.
In a possible implementation of the first aspect, the sensing data includes at least one of an ambient brightness value, a dynamic range value, state information, and a scene type, and the control data includes a zoom ratio.
The frame rate strategy decision layer determining the target frame rate strategy based on the initial data includes:
the frame rate strategy calculation module calculates matching condition information based on the ambient brightness value and the control data;
the frame rate strategy calculation module matches the condition information against the parsed frame rate strategies and calculates a frame rate result;
the frame rate strategy calculation module determines the target frame rate strategy based on the frame rate result and the sensing data.
In a possible implementation of the first aspect, the condition information includes at least one of an ultra-wide-angle mode, a night mode, a daylight mode, and a telephoto mode; the method further includes:
if the condition information is the ultra-wide-angle mode, the night mode, or the telephoto mode, the frame rate result calculated by the frame rate strategy calculation module is a first frame rate;
if the condition information is the daylight mode, the frame rate result calculated by the frame rate strategy calculation module is a second frame rate.
In a possible implementation of the first aspect, the method further includes:
when the calculated condition information switches from the ultra-wide-angle mode, the night mode, or the telephoto mode to the daylight mode, the target frame rate strategy switches from the frame rate strategy corresponding to the first frame rate to the frame rate strategy corresponding to the second frame rate.
In this way, by arranging the frame rate strategy decision layer, the shooting frame rate adapted to different shooting scenes can be dynamically controlled based on the various kinds of data in the initial data; this reduces the power consumption that may arise when a demanding shooting scene is shot at a fixed frame rate, balances the power consumption among the shooting scene, the shooting frame rate, and device operation, and extends the battery life of the device.
In a possible implementation of the first aspect, the frame rate strategy decision layer determining the target frame rate strategy based on the initial data includes:
if the ambient brightness value is greater than a brightness threshold, the dynamic range value is greater than a dynamic threshold, and the state information indicates a shaking state, a first strategy among the multiple frame rate strategies is used as the target frame rate strategy;
if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a flickering-light scene, a second strategy among the multiple frame rate strategies is used as the target frame rate strategy;
if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a constant-light scene, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy.
In a possible implementation of the first aspect, the frame rate strategy decision layer determining the target frame rate strategy based on the initial data includes:
if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, and the state information indicates a shaking state, a third strategy among the multiple frame rate strategies is used as the target frame rate strategy;
if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the scene type is a flickering-light scene, the second strategy among the multiple frame rate strategies is used as the target frame rate strategy;
if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the scene type is a constant-light scene, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy.
In a possible implementation of the first aspect, the frame rate strategy decision layer determining the target frame rate strategy based on the initial data includes:
if the ambient brightness value is less than or equal to the brightness threshold, the third strategy among the multiple frame rate strategies is used as the target strategy.
In a possible implementation of the first aspect, the initial data further includes power consumption information;
the frame rate strategy decision layer determining the target frame rate strategy based on the initial data includes:
the frame rate strategy decision layer determines the target frame rate strategy based on the sensing data, the control data, and the power consumption information.
In this way, based on the multi-camera architecture of the device and by extending the dynamic frame rate control function, when recording conditions (such as camera switching, zoom switching, changes in ambient brightness, or photographing algorithms with different power consumption) change dynamically with scene switching, the shooting frame rate can be switched flexibly during recording as conditions change, thereby achieving a better balance between shooting effect and device power consumption.
According to a second aspect, an electronic device is provided. The electronic device includes a memory and a processor, where the memory stores a computer program runnable on the processor, and the processor, when executing the computer program, implements the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect.
According to a third aspect, a chip system is provided, including a processor configured to call and run a computer program from a memory, so that a device equipped with the chip executes the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect.
According to a fourth aspect, a computer-readable storage medium is provided, storing a computer program that, when executed by a processor, implements the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect.
According to a fifth aspect, a computer program product is provided, which, when run on an electronic device, causes the electronic device to execute the shooting frame rate control method provided in the first aspect or any possible implementation of the first aspect.
It can be understood that, for the beneficial effects of the second to fifth aspects, reference may be made to the relevant description in the first aspect, and details are not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a scene to which the shooting frame rate control method provided in an embodiment of this application is applicable;
FIG. 2 is a schematic diagram of another scene to which the shooting frame rate control method provided in an embodiment of this application is applicable;
FIG. 3 is a schematic structural diagram of a software system of an electronic device provided in an embodiment of this application;
FIG. 4 is a schematic structural diagram of a hardware abstraction layer provided in an embodiment of this application;
FIG. 5 is a schematic structural diagram of another hardware abstraction layer provided in an embodiment of this application;
FIG. 6 is a schematic flowchart of a shooting frame rate control method provided in an embodiment of this application;
FIG. 7 is a schematic flowchart of another shooting frame rate control method provided in an embodiment of this application;
FIG. 8 is a schematic diagram of dynamic frame rate strategy control in an application scene provided in an embodiment of this application;
FIG. 9 is a schematic diagram of frame rate strategy adaptation in an application scene provided in an embodiment of this application;
FIG. 10 is a schematic diagram of an application scene provided in an embodiment of this application;
FIG. 11 is a hardware system architecture diagram of an electronic device provided in an embodiment of this application;
FIG. 12 is a schematic structural diagram of a chip system provided in an embodiment of this application.
Detailed Description
In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures and technologies are set forth to provide a thorough understanding of the embodiments of this application. However, it should be clear to those skilled in the art that this application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits, and methods are omitted so that unnecessary details do not obscure the description of this application.
To illustrate the technical solutions of this application, specific embodiments are described below.
It should be understood that, when used in this specification and the appended claims, the term "include" indicates the presence of the described features, wholes, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components, and/or combinations thereof.
It should also be understood that the terms used in this specification are for the purpose of describing particular embodiments only and are not intended to limit this application. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected", or "in response to detecting [the described condition or event]".
First, some terms used in the embodiments of this application are explained to facilitate understanding by those skilled in the art.
1. Frame rate: the frequency (rate) at which bitmap images, in units of frames, appear continuously on a display, usually expressed in fps (frames per second). The frame rate reflects, to some extent, the smoothness of the captured video. The shooting frame rate is the number of refreshes per second at which the electronic device can drive the hardware sensor in the current shooting environment, and may also be called the recording frame rate or real-time frame rate.
2. Focal length: the focal length indicates the refractive power; the shorter the focal length, the greater the refractive power. The focal length of an optical lens assembly determines the size of the image of a subject formed on the imaging plane. Assuming the same subject is shot at the same distance, the longer the focal length of the optical lens assembly, the greater the magnification of the image formed on the photosensitive element (charge-coupled device, CCD).
3. Optical zoom: mainly the comparison and switching among different focal lengths within the camera module. Optical zoom capability can be expressed by the optical zoom ratio; the larger the ratio, the more distant the scene that can be shot. The optical zoom ratio is related to the physical focal length of the optical lens assembly. An equivalent focal length of 28 mm for the camera module usually corresponds to the 1X (i.e., 1x) optical zoom ratio.
4. Field of view (FOV): indicates the maximum angular range that a camera can capture. If the subject to be shot is within this angular range, it will be captured by the camera; if it is outside this angular range, it will not be captured by the camera.
Generally, the larger the field of view of a camera, the larger the shooting range and the shorter the focal length; the smaller the field of view, the smaller the shooting range and the longer the focal length. Therefore, cameras can be divided by field of view into a main camera, a wide-angle camera, and a telephoto camera. The wide-angle camera has a larger field of view and a shorter focal length than the main camera and is suitable for close-range shooting, while the telephoto camera has a smaller field of view and a longer focal length than the main camera and is suitable for long-range shooting.
5. Lighting value (LV): used to estimate the ambient brightness; its specific calculation formula is as follows:
where Exposure is the exposure time, Aperture is the aperture size, ISO is the sensitivity, and Luma is the average value of Y in the XYZ color space.
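The formula itself did not survive extraction in the source text. As a labeled assumption (not taken from this application), a commonly used form of the lighting-value calculation consistent with the variables listed above is:

```latex
LV = 10 \cdot \log_{2}\!\left(\frac{\mathit{Aperture}^{2}}{\mathit{Exposure}} \cdot \frac{100}{\mathit{ISO}} \cdot \frac{\mathit{Luma}}{k}\right)
```

where k is an implementation-specific calibration constant; the exact constants and scaling vary between implementations.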
6. Dynamic range value: indicates the proportion of the overexposed region in the preview image obtained by the camera to the entire image.
The above is a brief introduction to the terms involved in the embodiments of this application; they are not repeated below.
Generally, when shooting conditions are good, an electronic device (for example, a mobile phone) records video at a higher shooting frame rate to achieve a better recording effect; however, shooting at a higher frame rate also increases power consumption, and when power consumption needs to be limited in certain scenes, a lower shooting frame rate is required.
At present, when recording, the user can only manually select one specific frame rate mode among the several frame rates supported by the device hardware, for example 30 fps or 60 fps, which adapts poorly to different shooting environments and may cause unnecessary power consumption, shortening the device's battery life. In addition, the services involving multiple cameras are coupled with the hardware platform of the electronic device, which is inconvenient for subsequent function development and maintenance.
In view of the above drawbacks, an embodiment of this application provides a shooting frame rate control method. By cohering the multi-camera frame rate adaptation and scheduling into a single module, the control of the shooting frame rates of multiple cameras can be decoupled from the hardware platform, is no longer limited by the hardware platform, and can be applied to platforms with different chip architectures. Meanwhile, the control of the shooting frame rate becomes more flexible and can be switched dynamically with different shooting environments, making subsequent function extension and maintenance for more complex scenes friendlier and more convenient.
The shooting frame rate control method provided in the embodiments of this application can be applied to the shooting field, for example, in the process of recording video.
The following introduces the scenes to which the shooting frame rate control method provided in the embodiments of this application is applicable. In one example, the electronic device 100 is a mobile phone.
The mobile phone may include multiple cameras, for example, four rear cameras and one front camera. The four rear cameras may respectively be a wide-angle camera, a main camera, a black-and-white camera, and a telephoto camera, and are used to shoot the same scene to be shot. The front camera is a camera in the same plane as the display screen and can be used for recording, multi-lens recording, face recognition, and the like.
Correspondingly, the electronic device 100 may further include other cameras; the types of cameras and the number of each type can be set as needed, and the embodiments of this application impose no limitation on this. For example, the four cameras included in the electronic device 100 may also respectively be an ultra-wide-angle camera, a wide-angle camera, a black-and-white camera, and a telephoto camera.
It should be understood that different cameras support different zoom ratios, and the electronic device can receive the user's tap or drag on a zoom ratio control to adjust the focal length used by the multiple cameras currently shooting. Usually the zoom ratio ranges of the main camera and the black-and-white camera are basically the same, the zoom ratio of the wide-angle camera is relatively smaller than that of the main camera, and the zoom ratio of the telephoto camera is relatively larger than that of the main camera. The zoom ratio refers to the optical zoom capability of the camera.
For example, the zoom ratio range corresponding to the wide-angle camera is [0.1, 1), that of the main camera is [1, 3.9), that of the black-and-white camera is [1, 2), and that of the telephoto camera is [3.9, 100). Here, 0.1 refers to a 0.1x zoom ratio, i.e., 0.1X; 1 refers to a 1x zoom ratio, i.e., 1X; 2 refers to a 2x zoom ratio, i.e., 2X; 3.9 refers to a 3.9x zoom ratio, i.e., 3.9X; and 100 refers to a 100x zoom ratio, i.e., 100X.
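The zoom-ratio-to-camera mapping above can be sketched as a simple range lookup. This is a minimal illustration, not the application's implementation; the function name and tie-breaking order are assumptions, and the black-and-white camera is omitted because its example range overlaps the main camera's.

```python
# Example half-open zoom ranges [low, high) from the text.
CAMERA_ZOOM_RANGES = [
    ("wide_angle", 0.1, 1.0),
    ("main", 1.0, 3.9),
    ("telephoto", 3.9, 100.0),
]

def select_camera(zoom_ratio: float) -> str:
    """Return the first camera whose [low, high) zoom range contains the ratio."""
    for name, low, high in CAMERA_ZOOM_RANGES:
        if low <= zoom_ratio < high:
            return name
    raise ValueError(f"unsupported zoom ratio: {zoom_ratio}")
```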
Scene 1
Referring to FIG. 1, FIG. 1 is a schematic diagram of a scene to which the shooting frame rate control method provided in an embodiment of this application is applicable.
As shown in FIG. 1, the electronic device detects a first operation on a first control in a display interface. The display interface may be the home screen 10, the first control may be a camera control on the home screen, and the first operation may be the user's tap on the camera control, for example operation 1. Correspondingly, the display interface may also be the preview interface 11, i.e., the interface after the camera application is opened; the first control may also be a zoom control in the preview interface, for example the zoom bar 12 in FIG. 1; and the first operation may also be the user's tap or slide on the zoom bar, for example operation 2, to change the camera's zoom ratio before or during recording.
In response to the user's tap on the camera control on the home screen 10 (such as operation 1), the mobile phone can start the camera application and display the graphical user interface (GUI) shown in FIG. 1, which may be called the preview interface 11. The preview interface includes a viewfinder window, multiple shooting mode options, and the zoom bar 12. The viewfinder window can display the preview image in real time. The shooting modes include night mode, portrait mode, video mode, multi-lens video mode, and the like. In video mode, the preview interface may further include a frame rate control 13 for displaying the shooting frame rate applicable to the current shooting scene.
For example, the shooting frame rate displayed by the frame rate control 13 can change dynamically with changes in the shooting environment or with the user's first operation (for example operation 2). Correspondingly, the user can also tap the zoom bar 12 in the preview interface 11 to change the camera's shooting focal range; the user can select in the zoom bar 12 the zoom ratio needed for the current recording scene, for example 0.5x, 2x, or 50x.
For example, the user can increase the zoom ratio by a slide operation to continuously enlarge the subject in the viewfinder window, and decrease the zoom ratio by a slide operation to continuously shrink the subject in the viewfinder window. Thus, by selecting the zoom ratio, the user can adjust the preview image displayed in the viewfinder window.
It should be noted that the shooting frame rate control method provided in the embodiments of this application is applicable to frame rate strategy control after the camera application is opened and before shooting starts; in this case, the shooting frame rate can be adapted and regulated based on information such as the shooting scene or the zoom ratio input by the user. The frame rate control 13 may also be hidden, i.e., the process of controlling the shooting frame rate is imperceptible to the user, and the shooting frame rate of the camera's image sensor may also be controlled directly in the background; FIG. 1 merely serves to schematically illustrate the process of dynamically controlling the shooting frame rate in this application. After the camera application is opened, whether before video recording starts or during video recording, when the shooting environment changes, the program corresponding to the shooting frame rate control method provided in the embodiments of this application can be run, so as to achieve a better balance between shooting effect and power consumption in various application scenes.
Scene 2
Referring to FIG. 2, FIG. 2 is a schematic diagram of another scene to which the shooting strategy control method provided in an embodiment of this application is applicable.
As shown in FIG. 2, when the user wants to photograph grass and trees outdoors, the user opens the camera application; the default zoom ratio of the preview interface may be 1x, and the camera called by the mobile phone may be the main camera. During video recording, when the phone switches from a brighter shooting scene (as shown in (a) of FIG. 2) to a darker shooting scene (as shown in (b) of FIG. 2), based on the shooting frame rate control method provided in the embodiments of this application, the frame rate is also adjusted dynamically, for example switching from 60 fps to 30 fps. Because some image algorithms adapted to darker scenes (for example, the night mode algorithm) are relatively complex, the phone's power consumption rises accordingly; continuing to capture at a high frame rate could affect the phone's operation or shorten its battery life. Therefore, by lowering the shooting frame rate, the running power consumption in this scene is balanced while the corresponding shooting effect is ensured.
Correspondingly, when switching from the darker shooting scene back to the brighter one, the shooting frame rate also switches back from 30 fps to 60 fps.
For example, when the camera is in night mode, a night mode control 14 may also be displayed to indicate that the current mode is night shooting mode.
It should be noted that the shooting frame rate control method provided in the embodiments of this application is also applicable to dynamically adapting and regulating the shooting frame rate when, after the camera application is opened and during shooting, the shooting scene changes or the zoom ratio input by the user changes. The shooting frame rates in the above application scene are merely illustrative; they may also be other frame rates, such as 24 fps or 72 fps, or other control parameters related to the image output format, which are not specifically limited here. In addition, for application scenes in which the shooting scene switches, the control of the shooting frame rate corresponding to the shooting scene can also support continuous transitions between two frame rates; for example, the frame rate can switch continuously from 30 fps corresponding to the first shooting scene to 31 fps, 32 fps, 33 fps, and so on. The specific step size of each frame rate switch is also not limited.
It should be understood that the above are examples of application scenes and do not limit the application scenes of this application or the direction of shooting frame rate adjustment.
To facilitate understanding of the shooting frame rate control method provided in the embodiments of this application, the software system of the electronic device 100 is first introduced. The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
It should be noted that in the embodiments of this application, the operating system (OS) of the electronic device may include but is not limited to Symbian, Android, iOS, BlackBerry, HarmonyOS, and other operating systems, which is not limited in this application.
The embodiments of this application take an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100. FIG. 3 is a block diagram of a software structure of the electronic device 100 provided in an embodiment of this application.
As shown in FIG. 3, the layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate through software interfaces.
In some embodiments, the Android system is divided into five layers: from top to bottom, the application layer (APP) 110, the application framework layer 120, the hardware abstraction layer (HAL) 130, the driver layer 140, and the hardware layer 150.
As shown in FIG. 3, the application layer 110 may include a series of application packages. For example, the application layer 110 may include applications such as Camera and Gallery.
The application layer 110 sits at the top of the whole framework and is responsible for interacting directly with the user. Once it receives a direct or indirect user request, such as taking a photo or recording a video, it sends the request to the application framework layer 120 through an interface and waits for the application framework layer 120 to return the processing result, which includes image data and camera parameters (for example, the shooting frame rate); the application layer 110 then feeds the result back to the user.
The application framework layer 120 is located between the application layer 110 and the hardware abstraction layer 130, and provides application programming interfaces (APIs) and a programming framework for applications in the application layer 110. The application framework layer 120 includes some predefined functions.
The application framework layer 120 is the framework for applications; developers can develop applications based on the application framework layer 120 while following its development principles. In addition, the application framework layer 120 also includes the access interface corresponding to the camera application, etc.
The hardware abstraction layer 130 is used to abstract the hardware and provide a virtual hardware usage platform for the operating system. In connection with the solution of this application, for example, the hardware abstraction layer 130 may include a camera hardware abstraction layer 131, etc.
The driver layer 140 is used to provide drivers for different hardware devices. For example, the driver layer 140 may include a camera device driver.
The hardware layer 150 may include sensors, an image signal processor, and other hardware devices, which is not limited in this application; the sensors may further include an ambient light sensor, a gyro sensor, an image sensor, and the like.
In this application, by calling the camera hardware abstraction layer 131 in the hardware abstraction layer 130, the application layer 110 and the application framework layer 120 above the hardware abstraction layer 130 can be connected to the driver layer 140 and the hardware layer 150 below, implementing camera data transmission and function control.
On this basis, the embodiments of this application add a frame rate strategy decision layer 132 in the hardware abstraction layer 130. By connecting it to the camera hardware abstraction layer 131, shooting frame rate strategies for different shooting environments can be adapted in the frame rate strategy decision layer 132, and different frame rate control adaptation logic can be customized according to different requirements or actual scene applications. The frame rate strategy decision layer 132 can obtain various parameters through the camera hardware abstraction layer 131, and call various hardware such as sensors and the ISP, to dynamically adjust the shooting frame rate for different shooting environments during shooting.
Referring to FIG. 4 and FIG. 5, FIG. 4 is a schematic structural diagram of a hardware abstraction layer 130 provided in an embodiment of this application; FIG. 5 is a schematic structural diagram of another hardware abstraction layer 130 provided in an embodiment of this application.
As shown in FIG. 4 and FIG. 5, in some embodiments, the camera hardware abstraction layer 131 contains a general functional interface layer 1311, which is used to access different operating systems for management and control.
The camera hardware abstraction layer 131 also contains a media control layer 1312, a chip platform (Camx), and a common library. The media control layer 1312 is used to interface with the business customization of the chip platform. The chip platform contains collections of code for general functional interfaces; for example, core is used to store the core implementation modules of the chip platform. The common library contains various data such as operating system adaptation data and metadata.
As shown in FIG. 4, the frame rate strategy decision layer 132 includes a frame rate strategy parsing module 1322, a frame rate strategy calculation module 1323, and a frame rate strategy control module 1324 connected in sequence; the frame rate strategy decision layer 132 further includes an interface adaptation module 1321.
The interface adaptation module 1321 is also connected to the frame rate strategy calculation module 1323, the frame rate strategy control module 1324, and the camera hardware abstraction layer 131, respectively. The interface adaptation module 1321 is used to receive initial data from the camera hardware abstraction layer 131 and provide the initial data to the frame rate strategy calculation module 1323; the interface adaptation module 1321 is also used to receive the decision instruction provided by the frame rate strategy control module 1324 and provide the decision instruction to the camera hardware abstraction layer 131, so that the camera hardware abstraction layer 131 can perform control according to the decision instruction.
Optionally, the interface adaptation module 1321 is connected to the interface layer 1311 and the media control layer 1312 in the camera hardware abstraction layer 131, respectively. The interface adaptation module 1321 is used to receive initial data from the interface layer 1311 and provide the initial data to the frame rate strategy calculation module 1323; it is also used to receive the decision instruction provided by the frame rate strategy control module 1324 and provide it to the media control layer 1312, so that the media control layer 1312 can perform control according to the decision instruction.
The frame rate strategy parsing module 1322 is used to parse the frame rate strategy configuration to obtain multiple frame rate strategies.
The frame rate strategy calculation module 1323 is used to determine, among the multiple frame rate strategies and based on the initial data, the target frame rate strategy matching the initial data, where the target frame rate strategy is one of the multiple frame rate strategies.
The frame rate strategy control module 1324 is used to generate the decision instruction based on the target frame rate strategy.
The interface adaptation module 1321 is used to output the decision instruction.
In some embodiments, the initial data may include at least one of sensing data and control data.
The sensing data may include at least one of an ambient brightness value, a dynamic range value, state information, and a scene type; the control data may include a zoom ratio, etc. The sensing data can be set as needed, and the embodiments of this application impose no limitation on this. The control data may be an instruction generated by the user tapping or dragging the corresponding zoom ratio control. The sensing data in the initial data indicates the type of the electronic device's current shooting environment; the control data in the initial data indicates the zoom ratio selectable by the user during video recording.
It should be understood that the zoom ratio includes the zoom ratio before the user performs a zoom operation, referred to here as the first zoom ratio, and the zoom ratio after the user performs the zoom operation, referred to here as the second zoom ratio.
The zoom switching mode refers to the operation mode used when the user changes the zoom ratio, for example a slide mode or a tap mode. The slide mode means the user slides continuously on the zoom bar or at a certain position on the display to change the zoom ratio; the tap mode means the user directly taps a zoom ratio value at a certain position on the interface to change the zoom ratio. The tap mode is discontinuous, the opposite of the slide mode.
The ambient brightness value refers to the brightness value collected by the ambient light sensor; based on the setting of the brightness threshold, its reference indicators may include dark (or low brightness), ordinary illuminance, and bright (or high brightness); different reference indicators correspond to different brightness ranges.
The dynamic range value is based on the setting of a dynamic threshold, and its reference indicators may include high dynamic and low dynamic.
The state information refers to the state of the electronic device during video recording, which can be measured by the gyro sensor in the hardware layer; its reference indicators may include a shaking scene and a static scene.
The scene type is an indicator determined based on the flicker of lights in the environment; its reference indicators may include flickering light (or high-frequency flickering light) and constant light, which can be measured by the ambient light sensor from the collected light flicker.
Data such as the zoom ratio may come from the application layer 110; the sensing data may come from the chip platform, from the hardware abstraction layer 130, or from the hardware layer 150, and of course may also come from other layers, which is not limited in the embodiments of this application.
It should be noted that the dynamic frame rate configuration may include multiple frame rate strategies as required, for example a first strategy, a second strategy, a third strategy, etc. The above corresponding zoom ratios may also be in other ranges, which is not specifically limited.
FIG. 9 shows the frame rate strategies corresponding to each shooting scene when the zoom ratio range is [1.0, 4.0). For example, the first strategy may be a high-quality low-frame strategy: when the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, and the state information indicates a shaking state, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the first strategy is output to control the shooting frame rate during video recording, thereby adapting to bright, high-dynamic, shaking shooting scenes, for example shooting a concert scene as shown in (b) of FIG. 10. When the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a constant-light scene, the first strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the first strategy is output to control the shooting frame rate during video recording, thereby adapting to bright, high-dynamic, static, constant-light shooting scenes.
For example, the second strategy may be a low-quality high-frame strategy: when the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information indicates a static state, and the scene type is a flickering-light scene, the second strategy among the multiple frame rate strategies is used as the target frame rate strategy, and the decision instruction corresponding to the second strategy is output to control the shooting frame rate during video recording, thereby adapting to bright, high-dynamic, static scene types with high-frequency flickering light, for example shooting a scene with a marquee light as shown in (a) of FIG. 10. When the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the scene type is a flickering-light scene, the second strategy among the multiple frame rate strategies is likewise used as the target frame rate strategy and its corresponding decision instruction is output, thereby adapting to bright, low-dynamic, static scene types with high-frequency flickering light.
For example, the third strategy may be a low-quality low-frame strategy: when the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, and the state information indicates a shaking state, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy; by calling this strategy, the decision instruction corresponding to the third strategy is output to control the shooting frame rate during video recording, thereby adapting to bright, low-dynamic, shaking shooting scenes. When the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information indicates a static state, and the scene type is a constant-light scene, the third strategy among the multiple frame rate strategies is used as the target frame rate strategy, to adapt to bright, low-dynamic, static, constant-light scene types. When the ambient brightness value is less than or equal to the brightness threshold, the third strategy among the multiple frame rate strategies is used as the target strategy, to adapt to low-brightness shooting scenes.
For example, the decision instruction finally output may contain sensor module parameters. For example, the sensor module parameter corresponding to high-quality low-frame may be dual conversion gain (DCG) at 30 fps; DCG is an image output format that improves the quality of the image output by the image sensor, and can output high-dynamic-range images through high-conversion-gain and low-conversion-gain readout modes. The sensor module parameter corresponding to low-quality high-frame may be Binning at 60 fps; Binning is an image readout mode that adds the charges sensed by adjacent pixels together and reads them out as one pixel. Correspondingly, the sensor module parameter corresponding to low-quality low-frame may be Binning at 30 fps.
The multiple frame rate strategies in the dynamic strategy configuration may be files configured in XML format; of course, they may also be configured in other ways, which is not limited in the embodiments of this application.
It should be understood that the number, content, and format of the above frame rate strategies can be set and changed as needed, which is not limited in the embodiments of this application.
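The condition-to-strategy mapping described above can be sketched as follows. This is a minimal illustration under stated assumptions: the data structure, function names, and thresholds are hypothetical, and cases the text leaves undefined (e.g. shaking plus flickering light) fall through to the dynamic range check.

```python
from dataclasses import dataclass

@dataclass
class SensingData:
    brightness: float      # ambient brightness value
    dynamic_range: float   # overexposed-area ratio of the preview image
    shaking: bool          # state information from the gyro sensor
    flickering: bool       # scene type from the ambient light sensor

# Example sensor-module parameters from the text.
STRATEGIES = {
    "first": "DCG 30fps",       # high-quality low-frame
    "second": "Binning 60fps",  # low-quality high-frame
    "third": "Binning 30fps",   # low-quality low-frame
}

def select_strategy(d: SensingData, lux_thr: float, dr_thr: float) -> str:
    if d.brightness <= lux_thr:
        return "third"       # low-brightness scenes
    if d.flickering and not d.shaking:
        return "second"      # bright, static, high-frequency flickering light
    if d.dynamic_range > dr_thr:
        return "first"       # bright, high-dynamic (shaking or constant light)
    return "third"           # bright, low-dynamic (shaking or constant light)
```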
Optionally, as shown in FIG. 5, the frame rate strategy decision layer 132 further includes a first conversion module 1325 and a second conversion module 1326.
The first conversion module 1325 is connected between the interface adaptation module 1321 and the frame rate strategy calculation module 1323, and is used to convert the initial data received by the interface adaptation module 1321 into first data and input it to the frame rate strategy calculation module 1323.
The second conversion module 1326 is connected between the frame rate strategy control module 1324 and the interface adaptation module 1321, and is used to convert the decision instruction output by the frame rate strategy control module 1324 into second data and provide it to the interface adaptation module 1321 for output.
It should be noted that the format of the first data should be a format that the frame rate strategy calculation module 1323 can recognize and process, for example the McxContext format. The format of the second data should be a format that the camera hardware abstraction layer 131 can recognize and process. Thus, by arranging different conversion modules, different chip platforms can be adapted, with good platform applicability and portability.
An embodiment of this application provides an electronic device. By adding, in the hardware abstraction layer of the electronic device, a frame rate strategy decision layer integrating multiple frame rate strategies, the camera shooting frame rate can be decoupled from the chip platform, and the frame rate strategies in the frame rate strategy decision layer can be configured and changed more flexibly and conveniently, to meet the shooting needs of more and more shooting scenes while better balancing the adaptation between shooting frame rate and power consumption and extending the battery life of the electronic device.
With respect to the above software structure, an embodiment of this application correspondingly provides a shooting frame rate control method. Referring to FIG. 6, FIG. 6 is a schematic flowchart of a shooting frame rate control method provided in an embodiment of this application. As shown in FIG. 6, the method includes the following S10 to S100.
S10. When the icon of the camera application is tapped by the user to trigger it, the electronic device 100 starts running the camera application and calls the frame rate strategy decision layer 132 through the camera hardware abstraction layer 131.
S20. In response to a preset trigger condition, the camera application delivers a request to the camera hardware abstraction layer 131.
S30. A sensor in the hardware layer 150 reports the camera's real-time data to the camera hardware abstraction layer 131.
For example, the camera's real-time data may include data corresponding to various shooting scenes.
S40. The camera hardware abstraction layer 131 parses the corresponding initial data from the delivered request and the reported real-time data of the camera.
For example, the initial data may include sensing data and control data; the sensing data may include an ambient brightness value, a dynamic range value, state information, a scene type, etc.; and the control data may include a zoom ratio.
S50. The camera hardware abstraction layer 131 transmits the initial data to the interface adaptation module 1321 in the frame rate strategy decision layer 132; the interface adaptation module 1321 provides the initial data to the first conversion module 1325, and the first conversion module 1325 converts the initial data into first data.
The first data may be condition control information corresponding to the shooting scene obtained after converting the initial data.
S60. After converting the data into the first data, the first conversion module 1325 can provide the condition control information corresponding to the shooting scene to the frame rate strategy calculation module 1323.
S70. The frame rate strategy calculation module 1323 calculates, based on the received condition control information, the matching target frame rate strategy among the multiple frame rate strategies.
The frame rate strategy parsing module parses the dynamic frame rate configuration to obtain multiple frame rate strategies; the target frame rate strategy is one of the multiple frame rate strategies.
S80. The frame rate strategy calculation module 1323 outputs the target frame rate strategy.
The target frame rate strategy is a frame rate strategy adapted to different shooting scenes.
S90. The frame rate strategy control module 1324 generates a decision instruction based on the target frame rate strategy and returns it to the second conversion module 1326.
The decision instruction includes sensor module parameters, through which the camera's shooting frame rate is controlled.
S100. The second conversion module 1326 converts the decision instruction into second data, which is output to the camera hardware abstraction layer 131 via the interface adaptation module 1321 and then delivered to the sensor, so as to control the shooting frame rates of the multiple cameras according to the decision instruction.
An embodiment of this application provides a shooting frame rate control method. By adding, in the hardware abstraction layer of the electronic device, a frame rate strategy decision layer integrating multiple frame rate strategies adapted to shooting environments, the control of the shooting frame rate can be decoupled from the chip platform, and the frame rate strategies in the frame rate strategy decision layer can be configured and changed more flexibly and conveniently to meet the needs of more and more shooting scenes.
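The S50 to S100 data flow described above can be condensed into a chain of conversions: initial data is converted to "first data", matched to a target strategy, wrapped into a decision instruction, and converted back to "second data" for the HAL. All function names, field names, and the example threshold below are illustrative assumptions, not the application's implementation.

```python
def first_conversion(initial_data: dict) -> dict:
    # S50/S60: convert initial data into condition control information.
    condition = "wide+bright" if initial_data["lux"] > 50 else "wide+dark"
    return {"condition": condition, "zoom": initial_data["zoom"]}

def calculate_strategy(first_data: dict) -> str:
    # S70/S80: pick the target frame rate strategy for the condition.
    return "60fps" if first_data["condition"] == "wide+bright" else "30fps"

def control(strategy: str) -> dict:
    # S90: wrap the strategy into a decision instruction (sensor module params).
    return {"sensor_fps": strategy}

def second_conversion(instruction: dict) -> dict:
    # S100: convert the instruction into HAL-consumable "second data".
    return {"request_update": instruction}

def decide(initial_data: dict) -> dict:
    """End-to-end sketch of the decision pipeline."""
    return second_conversion(control(calculate_strategy(first_conversion(initial_data))))
```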
The working process of the frame rate strategy decision layer 132 is illustrated below with reference to the drawings.
Referring to FIG. 7, FIG. 7 is a schematic flowchart of a shooting frame rate control method provided in an embodiment of this application. The calling process among the internal modules of the frame rate strategy decision layer 132 that implements shooting frame rate control in this embodiment may include the following S101 to S107.
S101. The multi-camera management module initializes the multi-camera strategy management module.
S102. The initialized multi-camera management module calls the multi-camera merge strategy-frame rate module, the multi-camera scene strategy module, the multi-camera frame rate strategy module, and the multi-camera frame rate calculation module.
For example, the camera application in the application layer 110 is displayed as an icon on the screen of the electronic device 100. When the icon of the camera application is tapped by the user to trigger it, the electronic device 100 starts running the camera application. When the camera application runs on the electronic device 100, it calls the camera access interface corresponding to the camera application in the application framework layer 120, and calls the frame rate strategy decision layer 132 through the camera hardware abstraction layer 131. The frame rate strategy decision layer 132 performs related initialization, loads the related dynamic frame rate configuration file, and loads and parses the frame rate strategy related files.
Correspondingly, after the frame rate strategy decision layer 132 is initialized, the camera hardware abstraction layer 131 can control the preview interface to display the current shooting mode as video mode with a zoom ratio of 1x, call the main camera to capture at the calculated shooting frequency, and transmit the captured images to the viewfinder window for display.
S103. The multi-camera frame rate calculation module calls the multi-camera frame rate parsing module.
S104. The multi-camera frame rate parsing module parses the frame rate strategy configuration to obtain frame rate strategies, and provides the frame rate strategies to the multi-camera frame rate calculation module.
S105. The multi-camera strategy management module retrieves the request delivered by the application layer and transmitted to the multi-camera management module, as well as the control information.
The control information may include the zoom ratio changed by the user sliding or tapping the zoom bar.
S106. The multi-camera strategy management module selects a frame rate strategy and provides it to the multi-camera merge strategy-frame rate module.
S107. The multi-camera frame rate strategy module performs calculation on the obtained data based on the provided frame rate strategy, and provides the calculated frame rate result to the multi-camera frame rate calculation module, which generates the corresponding parameters and outputs them to the multi-camera management module.
The above process is only an example; the order can be adjusted as needed, and of course steps can be added or removed, which is not limited in the embodiments of this application.
Referring to FIG. 8, FIG. 8 is a schematic diagram of dynamic frame rate strategy control in an application scene provided in an embodiment of this application. As shown in (a) of FIG. 8, regardless of the scheduling mode or working state of the camera, or different lux conditions, the shooting frame rate is fixed at 60 FPS or 30 FPS and does not adapt dynamically as the scene changes. An embodiment of this application provides a shooting frame rate control method, which may include the following S111 to S118.
S111. The camera application in the application layer 110 is displayed as an icon on the screen of the electronic device 100. When the icon of the camera application is tapped by the user to trigger it, the electronic device 100 starts running the camera application.
S112. When the camera application runs on the electronic device 100, it calls the camera access interface corresponding to the camera application in the application framework layer 120, and calls the frame rate strategy decision layer 132 through the camera hardware abstraction layer 131. The frame rate strategy decision layer 132 performs related initialization, loads the related frame rate strategy configuration file, and loads and parses the frame rate strategy related files.
After the frame rate strategy decision layer 132 is initialized, for example, the camera hardware abstraction layer 131 can control the preview interface to display the current shooting mode as video mode with a zoom ratio of 1x, call the main camera to capture at a shooting frequency adapted to the current shooting scene, and transmit the captured images to the viewfinder window for display.
S113. The frame rate strategy calculation module calculates matching condition information based on the ambient brightness value and the control data.
S114. The frame rate strategy calculation module matches the condition information against the parsed frame rate strategies and calculates a frame rate result.
S115. The frame rate strategy calculation module determines the target frame rate strategy based on the frame rate result and the sensing data.
The frame rate strategy calculation module can calculate, based on the ambient brightness value uploaded by the sensor in the hardware layer and the zoom ratio provided by the application layer, the condition information that the ambient brightness value and the zoom ratio satisfy. As shown in (b) of FIG. 8, the condition information may include: ultra-wide-angle UV, wide+dark, wide+bright, telephoto, and the like. By calculating the ambient brightness value, the range it falls in, dark or bright, is determined; then the shooting mode to be used, such as ultra-wide-angle UV, wide-angle, or telephoto, is determined based on the zoom ratio; the results of the two calculations are then combined to judge whether wide+dark or wide+bright is satisfied.
S116. If the calculated condition information satisfies the ultra-wide-angle mode, the night mode, or the telephoto mode, the frame rate result calculated by the frame rate strategy calculation module is the first frame rate, such as 30 FPS as shown in (b) of FIG. 8.
S117. If the calculated condition information satisfies the daylight mode, the frame rate result calculated by the frame rate strategy calculation module is the second frame rate, such as 60 FPS as shown in (b) of FIG. 8.
S118. When the calculated condition information switches from the ultra-wide-angle mode, the night mode, or the telephoto mode to the daylight mode, the target frame rate strategy switches from the frame rate strategy corresponding to the first frame rate to the frame rate strategy corresponding to the second frame rate.
For example, as shown in (b) of FIG. 8, when the calculated condition information switches from one of these cases to a shooting scene or mode with a different frame rate, the corresponding shooting frame rate is also switched dynamically.
The above process is only an example; the order can be adjusted as needed. For example, the calculated frame rate result may also be another frame rate, such as 24 fps, 112 fps, or any frame rate between 30 fps and 60 fps, and can be set according to the needs of the actual application scene; the embodiments of this application only illustrate the dynamic regulation of the frame rate by example. Of course, steps can also be added or removed, which is not limited in the embodiments of this application.
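The S113 to S117 condition calculation can be sketched as combining a brightness label (dark/bright) with a shooting mode derived from the zoom ratio into condition information such as wide+dark or wide+bright. The thresholds, range boundaries, and function names are assumptions for illustration only.

```python
def brightness_label(lux: float, threshold: float = 50.0) -> str:
    """Classify the ambient brightness value against the brightness threshold."""
    return "bright" if lux > threshold else "dark"

def mode_from_zoom(zoom: float) -> str:
    """Derive the shooting mode from the zoom ratio (assumed boundaries)."""
    if zoom < 1.0:
        return "ultra-wide"
    if zoom < 3.9:
        return "wide"
    return "tele"

def condition_info(lux: float, zoom: float) -> str:
    """Combine the two calculations into condition information (S115)."""
    mode = mode_from_zoom(zoom)
    if mode == "wide":
        return f"wide+{brightness_label(lux)}"
    return mode

def frame_rate_result(condition: str) -> int:
    # S116/S117: ultra-wide, night (wide+dark), and tele map to the first
    # frame rate (30 FPS in FIG. 8); daylight (wide+bright) maps to the
    # second frame rate (60 FPS).
    return 60 if condition == "wide+bright" else 30
```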
在一些实施例中,拍摄帧率控制方法,还可以包括以下S121至S126。
S121、应用层110中的相机应用以图标的方式显示在电子设备100的屏幕上。当相机应用的图标被用户点击以进行触发时,电子设备100开始运行相机应用。
S122、当相机应用运行在电子设备100上时,相机应用调用应用框架层120中的相机应用对应的相机访问接口,并通过相机硬件抽象层131调用帧率策略决策层132。帧率策略决策层132进行相关初始化,加载相关帧率策略配置文件,加载和解析帧率策略相关文件。
帧率策略决策层132初始化后,例如,可以使得相机硬件抽象层131控制预览界面显示当前拍摄模式为录像模式,变焦倍率为1倍,调用主摄摄像头以与挡墙拍摄场景适配的拍摄频率进行采集,并将采集的图像传输至取景窗口进行显示。
在相机应用打开后,根据需要的变焦倍数和/或拍摄帧率,进行录像,相机应用进程打开。相机应用调度下发请求,硬件抽象层130从上层的请求中解析出应用下发的控制信息(例如变焦倍率zoom ratio)和场景信息等。硬件层150的传感器上报传感器的最新传感数据给硬件抽象层,并在接口适配层对控制信息和传感器的传感数据进行数据转换,转换为可以被帧率策略决策层132识别的数据格式。
S123,转换过的控制信息和状态信息传输给帧率策略计算模块1323进行计算。
示例性的,帧率策略计算模块1323根据预设的亮度阈值计算当前状态数据中的环境亮度值是否满足条件信息,例如当前的环境亮度值为是否满足高亮或低亮的条件信息。
示例性的,亮度阈值可以基于相机的硬件基础进行设定,例如基于摄像头的类型进行设定。在此不对亮度阈值的具体大小不做限定。
示例性的,帧率策略计算模块1323根据预设的动态阈值计算当前状态数据中动态范围值是否满足条件信息,例如当前拍摄环境的动态范围值是否满足高动态或低动态的条件信息。
示例性的,帧率策略计算模块1323根据传感器上传的状态计算是否满足条件信息, 例如当前相机的状态是否满足抖动状态或静止状态的条件信息。
示例性的,帧率策略计算模块1323根据传感器上传的场景中灯光状态是否满足条件信息,例如当前拍摄场景中的灯光是否满足灯光高频闪烁或灯光不变的条件信息。
S124,帧率策略解析模块1322对配置好的动态帧率配置XML进行解析和存储。
示例性的,动态帧率配置XML中包括由各种条件信息或各条件信息的组合构成的拍摄场景所对应的帧率策略。
S125,帧率策略计算模块1323将适配好的条件信息和解析后的对应拍摄场景的帧率策略进行匹配计算,将输出的帧率结果和相机状态进行整合,输出最终的决策结果,及目标帧率策略。
S126,帧率策略控制模块基于目标帧率策略生成决策指令,并由接口适配模块1321输出。
接口适配模块1321输出给相机硬件抽象层131的媒体控制层1312,媒体控制层1312将决策指令添加到请求中传输给芯片平台,由芯片平台再下发至驱动层140以根据决策指令对多个摄像头的拍摄帧率进行控制,从而使得相机在不同拍摄场景下实现拍摄帧率的动态适配与调控。
In some embodiments, the frame rate strategy decision layer may also determine the target frame rate strategy based on the sensing data, the control data, and power consumption information. The power consumption information may be the state information of a sensor, i.e., the algorithm power consumption of the sensor in the current shooting mode, for example the algorithm power consumption corresponding to the night mode. The frame rate strategy decision layer 132 may also perform dynamic frame rate adaptation and regulation in combination with the power consumption required by the current shooting scene, so as to achieve a better balance between shooting quality and running power consumption.
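One way such a power-aware balance could work is to step the frame rate down when the current mode's algorithm power consumption exceeds a budget. The budget value and the down-stepping rule below are assumptions used only to illustrate the quality-versus-power trade-off; they are not part of this application.

```python
# Hypothetical sketch of folding power consumption information into the
# frame rate decision. The budget and fallback rule are assumed values.

def decide_with_power(base_fps: int, algo_power_mw: float,
                      power_budget_mw: float = 800.0) -> int:
    """Step the frame rate down when the mode's algorithm power exceeds budget."""
    if algo_power_mw > power_budget_mw and base_fps > 30:
        return 30  # fall back to the lower frame rate to save power
    return base_fps
```

Under this sketch, a 60 FPS daylight decision would be reduced to 30 FPS if the active mode's algorithm power exceeded the budget, while low-power modes keep the frame rate chosen by the scene matching.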
It should be noted that the above condition information is only illustrative; the shooting scenes corresponding to each piece of condition information or combination thereof may be extended or matched based on actual application scenarios, and are not specifically limited here.
The above process is only an example; the order may be adjusted as needed, and steps may of course be added or removed; the embodiments of this application impose no restriction on this.
The shooting frame rate control method provided by the embodiments of this application and the related display interfaces have been described in detail above with reference to FIG. 1 to FIG. 9; the electronic device and the chip provided by the embodiments of this application will be described in detail below with reference to FIG. 11 and FIG. 12. It should be understood that the electronic device and the chip in the embodiments of this application can execute the various shooting frame rate control methods of the foregoing embodiments; that is, for the specific working processes of the following products, reference may be made to the corresponding processes in the foregoing method embodiments.
FIG. 11 shows a hardware system of an electronic device applicable to this application. The electronic device 100 may be used to implement the shooting frame rate control method described in the above method embodiments.
The electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an augmented reality (AR) device, a virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a projector, and so on; the embodiments of this application place no restriction on the specific type of the electronic device 100.
The electronic device 100 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headset jack 270D, a sensor module 280, buttons 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a fan unit 295, and the like. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, a barometric pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and so on.
It can be understood that the structure illustrated in the embodiments of this application does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 100. The controller can generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. This memory can hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instruction or data again, it can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 210, thereby improving the efficiency of the system.
In the embodiments of this application, the processor 210 may display a desktop (a home screen or a preview interface), where the display interface may include a first control (for example, the home screen may include a camera control, and the preview interface may include a zoom control); detect a first operation on the first control; in response to the first operation, determine a target frame rate strategy according to initial data; and then determine a decision instruction according to the target frame rate strategy, so as to dynamically regulate the shooting frame rate of the camera in the shooting scene.
The connection relationships between the modules shown in FIG. 11 are only illustrative and do not constitute a limitation on the connection relationships between the modules of the electronic device 100. Optionally, the modules of the electronic device 100 may also adopt a combination of the various connection modes in the above embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas may also be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 250 can provide solutions for wireless communication, including 2G/3G/4G/5G, applied on the electronic device 100. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and so on. The mobile communication module 250 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 250 can also amplify signals modulated by the modem processor and convert them into electromagnetic waves radiated out via the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 250 may be provided in the processor 210. In some embodiments, at least some functional modules of the mobile communication module 250 may be provided in the same device as at least some modules of the processor 210.
The electronic device 100 implements the display function through the GPU, the display screen 294, the application processor, and the like. The GPU is a microprocessor for image processing and connects the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 294 is used to display images, videos, and the like. The display screen 294 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on. In some embodiments, the electronic device 100 may include 1 or N display screens 294, where N is a positive integer greater than 1.
The electronic device 100 can implement the shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, the application processor, and the like.
The ISP is used to process the data fed back by the camera 293. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the light signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on image noise, brightness, and skin tone, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 293.
The camera 293 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the light signal into an electrical signal and then transmits the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 293, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 performs frequency-point selection, the digital signal processor is used to perform a Fourier transform or the like on the frequency-point energy.
The video codec is used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, for example: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the headset jack 270D, the application processor, and the like.
The pressure sensor 280A is used to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 280A may be provided on the display screen 294. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material. When a force acts on the pressure sensor 280A, the capacitance between the electrodes changes, and the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 294, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 280A; the electronic device 100 can also calculate the touch position from the detection signal of the pressure sensor 280A. In some embodiments, touch operations acting on the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short-message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short-message application icon, an instruction to create a new short message is executed.
The gyroscope sensor 280B can be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 280B. The gyroscope sensor 280B can be used for image stabilization during shooting. Illustratively, when the shutter is pressed, the gyroscope sensor 280B detects the shaking angle of the electronic device 100, calculates from that angle the distance the lens module needs to compensate, and lets the lens counteract the shaking of the electronic device 100 through reverse motion, achieving stabilization. The gyroscope sensor 280B can also be used for navigation and motion-sensing game scenarios, and can also detect the state of the electronic device during shooting, for example a moving shooting state or a stationary shooting state.
The acceleration sensor 280E can detect the magnitude of acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, it can detect the magnitude and direction of gravity. It can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 280F is used to measure distance. The electronic device 100 can measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device 100 can use the distance sensor 280F to measure distance to achieve fast focusing.
The proximity light sensor 280G may include, for example, a light-emitting diode (LED) and a light detector, for example a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 280G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 280G can also be used for automatic unlocking and screen locking in holster mode and pocket mode.
The ambient light sensor 280L is used to sense the ambient light brightness. The electronic device 100 can adaptively adjust the brightness of the display screen 294 according to the perceived ambient light brightness. The ambient light sensor 280L can also be used to automatically adjust the white balance when taking photos, and can cooperate with the proximity light sensor 280G to detect whether the electronic device 100 is in a pocket, to prevent accidental touches.
The fingerprint sensor 280H is used to collect fingerprints. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photography, fingerprint answering of incoming calls, and so on.
The temperature sensor 280J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature handling strategy using the temperature detected by the temperature sensor 280J. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 280J, in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 242 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is below yet another threshold, the electronic device 100 boosts the output voltage of the battery 242 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 280K is also called a "touch device". The touch sensor 280K may be provided on the display screen 294; the touch sensor 280K and the display screen 294 form a touch screen, also called a "touch panel". The touch sensor 280K is used to detect touch operations acting on or near it. The touch sensor can pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation can be provided through the display screen 294. In other embodiments, the touch sensor 280K may also be provided on the surface of the electronic device 100, at a position different from that of the display screen 294.
It should be noted that the structure of the above electronic device is only illustrative; based on different application scenarios, other physical structures may also be included, and the physical structure of the electronic device is not limited here.
The embodiments of this application also provide a computer program product containing computer instructions which, when run on the electronic device 100, enables the electronic device 100 to execute the shooting frame rate control method shown above.
FIG. 12 is a schematic structural diagram of a chip system provided by an embodiment of this application. The chip system 30 shown in FIG. 12 may be a general-purpose processor or a special-purpose processor. The chip system 30 includes a processor 301. The processor 301 is used to support the electronic device 100 in executing the technical solutions shown above.
Optionally, the chip system further includes a transceiver 302, which is controlled by the processor 301 and is used to support the electronic device 100 in executing the technical solutions shown above.
Optionally, the chip system shown in FIG. 12 may further include a storage medium 303.
It should be noted that the chip system shown in FIG. 12 can be implemented using the following circuits or devices: one or more field programmable gate arrays (FPGA), programmable logic devices (PLD), controllers, state machines, gate logic, discrete hardware components, any other suitable circuits, or any combination of circuits capable of performing the various functions described throughout this application.
The electronic device, computer storage medium, computer program product, and chip system provided by the above embodiments of this application are all used to execute the methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects corresponding to the methods provided above, which will not be repeated here.
It should be understood that the above is only intended to help those skilled in the art better understand the embodiments of this application, and is not intended to limit the scope of the embodiments of this application. Based on the examples given above, those skilled in the art can obviously make various equivalent modifications or changes; for example, some steps in the various embodiments of the above detection method may be unnecessary, some new steps may be added, or any two or more of the above embodiments may be combined. Such modified, changed, or combined solutions also fall within the scope of the embodiments of this application.
It should also be understood that the above description of the embodiments of this application focuses on the differences between the embodiments; for the same or similar points not mentioned, reference may be made between the embodiments, and for brevity they will not be repeated here.
It should also be understood that the sequence numbers of the above processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of this application.
It should also be understood that, in the embodiments of this application, "preset" and "predefined" may be implemented by pre-storing, in a device (for example, including the electronic device), corresponding code, tables, or other means that can indicate the relevant information; this application does not limit the specific implementation.
It should also be understood that the division into modes, cases, categories, and embodiments in the embodiments of this application is only for convenience of description and should not constitute a special limitation; the features in the various modes, categories, cases, and embodiments may be combined where not contradictory.
It should also be understood that, in the various embodiments of this application, unless otherwise specified or logically conflicting, the terms and/or descriptions in different embodiments are consistent and may be referenced by one another, and the technical features of different embodiments may be combined according to their internal logical relationships to form new embodiments.
Finally, it should be noted that the above are only specific implementations of this application, but the protection scope of this application is not limited thereto; any change or substitution within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (13)

  1. A shooting frame rate control method, characterized in that it is applied to an electronic device comprising a frame rate strategy decision layer and multiple cameras, the method comprising:
    detecting a first operation on a first control in a display interface of the electronic device;
    in response to the first operation, the frame rate strategy decision layer determining a target frame rate strategy according to initial data;
    the frame rate strategy decision layer generating a decision instruction according to the target frame rate strategy, the decision instruction being used to control the shooting frame rates of the multiple cameras.
  2. The method according to claim 1, characterized in that the frame rate strategy decision layer comprises an interface adaptation module, a frame rate strategy calculation module, and a frame rate strategy control module connected in sequence, the interface adaptation module also being connected to the frame rate strategy control module; the frame rate strategy decision layer further comprises a frame rate strategy parsing module connected to the frame rate strategy calculation module;
    the method comprising:
    the interface adaptation module acquiring the initial data, the initial data comprising at least one of: sensing data and control data;
    the frame rate strategy parsing module parsing a frame rate strategy configuration to obtain multiple frame rate strategies;
    the frame rate strategy calculation module determining, according to the initial data, the target frame rate strategy matching the initial data from among the multiple frame rate strategies, the target frame rate strategy being one of the multiple frame rate strategies;
    the frame rate strategy control module generating the decision instruction according to the target frame rate strategy;
    the interface adaptation module outputting the decision instruction.
  3. The method according to claim 2, characterized in that the frame rate strategy decision layer further comprises a first conversion module and a second conversion module, the first conversion module being connected between the interface adaptation module and the frame rate strategy calculation module, and the second conversion module being connected between the interface adaptation module and the frame rate strategy control module;
    after the interface adaptation module acquires the initial data, the method further comprising:
    the first conversion module converting the initial data into first data;
    after the frame rate strategy control module generates the decision instruction according to the target frame rate strategy, the method further comprising:
    the second conversion module converting the decision instruction into second data;
    the interface adaptation module outputting the second data.
  4. The method according to claim 2 or 3, characterized in that the sensing data comprises at least one of: an ambient brightness value, a dynamic range value, state information, and a scene type, and the control data comprises a zoom ratio;
    the frame rate strategy decision layer determining the target frame rate strategy according to the initial data comprising:
    the frame rate strategy calculation module calculating adapted condition information according to the ambient brightness value and the control data;
    the frame rate strategy calculation module matching the condition information against the parsed frame rate strategies and calculating a frame rate result;
    the frame rate strategy calculation module determining the target frame rate strategy according to the frame rate result and the sensing data.
  5. The method according to claim 4, characterized in that the condition information comprises at least one of: an ultra-wide mode, a night mode, a daylight mode, and a telephoto mode; the method further comprising:
    if the condition information is the ultra-wide mode, the night mode, or the telephoto mode, the frame rate result calculated by the frame rate strategy calculation module being a first frame rate;
    if the condition information is the daylight mode, the frame rate result calculated by the frame rate strategy calculation module being a second frame rate.
  6. The method according to claim 5, characterized in that the method further comprises:
    when the calculated condition information switches from the ultra-wide mode, the night mode, or the telephoto mode to the daylight mode, the target frame rate strategy switching from the frame rate strategy corresponding to the first frame rate to the frame rate strategy corresponding to the second frame rate.
  7. The method according to any one of claims 4 to 6, characterized in that the frame rate strategy decision layer determining the target frame rate strategy according to the initial data comprises:
    if the ambient brightness value is greater than a brightness threshold, the dynamic range value is greater than a dynamic threshold, and the state information is a shaking state, taking a first strategy among the multiple frame rate strategies as the target frame rate strategy;
    if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information is a stationary state, and the scene type is a flickering-light scene, taking a second strategy among the multiple frame rate strategies as the target frame rate strategy;
    if the ambient brightness value is greater than the brightness threshold, the dynamic range value is greater than the dynamic threshold, the state information is a stationary state, and the scene type is a steady-light scene, taking the first strategy among the multiple frame rate strategies as the target frame rate strategy.
  8. The method according to any one of claims 4 to 6, characterized in that the frame rate strategy decision layer determining the target frame rate strategy according to the initial data comprises:
    if the ambient brightness value is greater than a brightness threshold, the dynamic range value is less than or equal to a dynamic threshold, and the state information is a shaking state, taking a third strategy among the multiple frame rate strategies as the target frame rate strategy;
    if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information is a stationary state, and the scene type is a flickering-light scene, taking a second strategy among the multiple frame rate strategies as the target frame rate strategy;
    if the ambient brightness value is greater than the brightness threshold, the dynamic range value is less than or equal to the dynamic threshold, the state information is a stationary state, and the scene type is a steady-light scene, taking the third strategy among the multiple frame rate strategies as the target frame rate strategy.
  9. The method according to any one of claims 4 to 6, characterized in that the frame rate strategy decision layer determining the target frame rate strategy according to the initial data comprises:
    if the ambient brightness value is less than or equal to a brightness threshold, taking a third strategy among the multiple frame rate strategies as the target frame rate strategy.
  10. The method according to any one of claims 2 to 9, characterized in that the initial data further comprises power consumption information;
    the frame rate strategy decision layer determining the target frame rate strategy according to the initial data comprising:
    the frame rate strategy decision layer determining the target frame rate strategy according to the sensing data, the control data, and the power consumption information.
  11. An electronic device, characterized in that it comprises a processor and a memory;
    the memory being used to store a computer program executable on the processor;
    the processor being used to execute the method according to any one of claims 1 to 10.
  12. A chip system, characterized in that the chip system comprises: a processor, configured to call and run a computer program from a memory, so that a device on which the chip is installed executes the method according to any one of claims 1 to 10.
  13. A computer-readable storage medium storing a computer program, characterized in that, when the computer program is executed by a processor, the steps of the method according to any one of claims 1 to 10 are implemented.
PCT/CN2023/112906 2022-10-31 2023-08-14 Shooting frame rate control method, electronic device, chip system, and readable storage medium WO2024093432A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211350405.9A CN116347224B (zh) 2022-10-31 2022-10-31 Shooting frame rate control method, electronic device, chip system, and readable storage medium
CN202211350405.9 2022-10-31

Publications (1)

Publication Number Publication Date
WO2024093432A1 true WO2024093432A1 (zh) 2024-05-10

Family

ID=86886313

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/112906 WO2024093432A1 (zh) 2022-10-31 2023-08-14 Shooting frame rate control method, electronic device, chip system, and readable storage medium

Country Status (2)

Country Link
CN (1) CN116347224B (zh)
WO (1) WO2024093432A1 (zh)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105611139A (zh) * 2015-07-15 2016-05-25 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Image processing method, image processing apparatus, and terminal
CN109327626A (zh) * 2018-12-12 2019-02-12 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image acquisition method and apparatus, electronic device, and computer-readable storage medium
US20200275050A1 (en) * 2017-09-28 2020-08-27 Dolby Laboratories Licensing Corporation Frame rate conversion metadata
CN113411529A (zh) * 2019-02-28 2021-09-17 Huawei Technologies Co., Ltd. Video recording frame rate control method and related apparatus
CN116347224A (zh) * 2022-10-31 2023-06-27 Honor Device Co., Ltd. Shooting frame rate control method, electronic device, chip system, and readable storage medium



Also Published As

Publication number Publication date
CN116347224A (zh) 2023-06-27
CN116347224B (zh) 2023-11-21
