CN117119295A - Camera control method and electronic device - Google Patents


Info

Publication number
CN117119295A
CN117119295A (application CN202311377967.7A)
Authority
CN
China
Prior art keywords
frame rate
service
camera
activity
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311377967.7A
Other languages
Chinese (zh)
Other versions
CN117119295B (en)
Inventor
李俊科
李满
张祎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311377967.7A priority Critical patent/CN117119295B/en
Publication of CN117119295A publication Critical patent/CN117119295A/en
Application granted granted Critical
Publication of CN117119295B publication Critical patent/CN117119295B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/65 Control of camera operation in relation to power supply
    • H04N23/651 Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems


Abstract

The application provides a camera control method and an electronic device. In the method, the electronic device monitors the running state of services within applications. After application 1 creates service 1, the electronic device matches service 1 against a resource configuration file and determines whether service 1 is recorded there. If service 1 is recorded in the resource configuration file, service 1 is a preset service that needs to invoke the camera, and the frame rate of the camera is lowered before service 1 invokes it, so that the camera consumes less power at the reduced frame rate. In some possible cases, images captured by the camera can be displayed in real time within the preset service for preview. By implementing the technical scheme provided by the application, the power consumption of a third-party application when using the camera can be reduced without requiring the third-party application to be redeveloped for that purpose.

Description

Camera control method and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a camera control method and an electronic device.
Background
At the present stage, on an electronic device equipped with a camera (for example, a mobile phone), besides starting the camera to shoot within a camera application, some applications can also invoke the camera to capture images so as to implement services such as video conferencing or live broadcasting.
When these applications include third-party applications, excessive camera use by a third-party application may cause the power consumption of the electronic device to become too high, so that the electronic device overheats during use and enables overheat protection, affecting the use of other functions.
It is therefore worth discussing how to reduce the power consumption of a third-party application when it uses the camera, without requiring the third-party application to be redeveloped for that purpose.
Disclosure of Invention
The application provides a camera control method and an electronic device, which can reduce the power consumption of a third-party application when it uses the camera, without requiring the third-party application to be redeveloped to reduce the camera's power consumption.
In a first aspect, the present application provides a camera control method. The method comprises: the electronic device runs a first application; when the first application creates an activity of a first service, the electronic device determines, based on the identification of the activity, that the first service is a preset service; when the state of the activity is created, the electronic device reduces a first parameter of the camera from a default value to a preset value corresponding to the first service; the electronic device captures an image through the camera using the preset value, and displays the image in the user interface of the first application associated with the activity.
In the above embodiment, the first application may be application 1, application A, or any third-party application that needs to use the camera, as described below; the first service may be service 1 referred to below. Because the first parameter of the camera is turned down when the first application uses the camera, the power consumption of the third-party application when using the camera can be reduced without the third-party application being redeveloped for that purpose.
With reference to the first aspect, in some embodiments, the method further includes: when the first application closes the activity of the first service, the electronic device determines, based on the identification of the activity, that the first service is a preset service; and when the state of the activity is closed, the electronic device restores the first parameter of the camera from the preset value corresponding to the first service to the default value.
In the above embodiment, after the first parameter of the camera has been changed, it is restored to the default value (the default configuration of the first parameter) when the first service ends, so that other applications are not affected by this scheme when they invoke the camera. For example, where the first parameter is the frame rate, the default value may be the frame rate of the default configuration referred to below; where the first parameter is the resolution, the default value may be the camera's default resolution.
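The create/close lifecycle described above can be sketched as a small state machine. This is a minimal illustration in Python, not the patent's actual implementation; all names (`CameraController`, `PRESET_SERVICES`, the activity identifiers, and the numeric values) are invented for illustration.

```python
# Minimal sketch of the set/restore lifecycle: lower the first parameter
# (here, the frame rate) when a preset service's activity is created, and
# restore the default when that activity is closed. All identifiers and
# values are illustrative assumptions, not taken from the patent.

DEFAULT_FRAME_RATE = 30  # default configuration of the first parameter

# Preset services mapped to the preset value of the first parameter.
PRESET_SERVICES = {
    "com.example.meeting/.ConferenceActivity": 15,  # video conference
    "com.example.live/.LiveActivity": 15,           # live broadcast
}

class CameraController:
    def __init__(self):
        self.frame_rate = DEFAULT_FRAME_RATE

    def on_activity_state_changed(self, activity_id, state):
        preset = PRESET_SERVICES.get(activity_id)
        if preset is None:
            return  # not a preset service: leave the camera untouched
        if state == "created":
            # lower the first parameter before the service invokes the camera
            self.frame_rate = preset
        elif state == "closed":
            # restore the default so other applications are unaffected
            self.frame_rate = DEFAULT_FRAME_RATE

cam = CameraController()
cam.on_activity_state_changed("com.example.meeting/.ConferenceActivity", "created")
# frame rate now lowered to the preset value (15)
cam.on_activity_state_changed("com.example.meeting/.ConferenceActivity", "closed")
# frame rate restored to the default (30)
```

Note that an activity not recorded in the preset table leaves the frame rate untouched, which mirrors the requirement that only preset services trigger the adjustment.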
With reference to the first aspect, in some embodiments, when the first parameter is resolution, the preset value corresponding to the first service is a preset resolution corresponding to the first service; the resolution is the number of photosensitive elements utilized by the camera to capture an image.
With reference to the first aspect, in some embodiments, when the first parameter is a frame rate, the preset value corresponding to the first service is a preset frame rate corresponding to the first service; the frame rate is the sampling rate at which the camera captures images in a unit of time.
With reference to the first aspect, in some embodiments, when the preset frame rate corresponding to the first service is a range, the electronic device capturing an image through the camera using the preset value includes: the electronic device determines a frame rate value of the camera within the preset frame rate based on the illumination intensity, the frame rate value being positively correlated with the illumination intensity; and the electronic device captures an image through the camera using the frame rate value.
In the above embodiment, the illumination intensity may be understood as the illumination intensity of the shooting environment referred to in the following embodiments. The stronger the ambient light, the faster the shutter speed at which the camera captures one frame of image, and the more images can be captured per unit time. The weaker the ambient light, the slower the shutter speed must be so that sufficient light enters the camera's image sensor to produce a clear image, and the fewer images can be captured per unit time compared with strong ambient light. Ways of determining the illumination intensity include, but are not limited to, the following: the electronic device may detect the illumination intensity of the shooting environment through an ambient light sensor; or the electronic device may acquire a frame of preview image, compute the average brightness value of all pixels in that frame, and use the average brightness value as the illumination intensity of the shooting scene.
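The positive correlation between illumination intensity and the chosen frame rate value could be realized in many ways; the patent does not specify a formula. The sketch below assumes a simple clamped linear mapping from lux to a value within the preset frame rate range; the lux thresholds (`lux_dark`, `lux_bright`) are invented for illustration.

```python
# Sketch: choose a frame rate value within a preset frame rate range based
# on illumination intensity, positively correlated, as described above.
# The thresholds and the linear mapping are illustrative assumptions only.

def pick_frame_rate(lux, fps_range, lux_dark=50.0, lux_bright=500.0):
    """Map illumination intensity (lux) linearly into fps_range, clamped.

    Below lux_dark the minimum of the range is used (slow shutter, enough
    light per frame); above lux_bright the maximum is used.
    """
    lo, hi = fps_range
    if lux <= lux_dark:
        return lo
    if lux >= lux_bright:
        return hi
    frac = (lux - lux_dark) / (lux_bright - lux_dark)
    return lo + frac * (hi - lo)

# Strong light -> top of the preset range; dim light -> bottom of it.
assert pick_frame_rate(1000, (10, 15)) == 15
assert pick_frame_rate(10, (10, 15)) == 10
```

Any monotonically non-decreasing mapping would satisfy the "positively correlated" requirement equally well; linear interpolation is merely the simplest choice.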
With reference to the first aspect, in some embodiments, before the electronic device decreases the first parameter of the camera from the default value to the preset value corresponding to the first service, the method further includes: the electronic equipment performs matching in a first configuration file based on the identification of the activity, and determines the frame rate type corresponding to the identification of the activity in the first configuration file; and the electronic equipment determines a preset frame rate corresponding to the frame rate type in a second configuration file as the preset frame rate corresponding to the first service.
In the above embodiment, the first configuration file may be regarded as the resource configuration file referred to in the embodiments below, and the second configuration file as the frame rate configuration file referred to in the embodiments below. Here, the resource configuration file contains the identification of the activity corresponding to a service together with the corresponding frame rate type, and the frame rate configuration file contains the frame rate corresponding to each frame rate type. In this way, different frame rates can be set based on the service type: in essence, one frame rate type corresponds to one service type, so services of the same type share the same frame rate.
With reference to the first aspect, in some embodiments, before the electronic device decreases the first parameter of the camera from the default value to the preset value corresponding to the first service, the method further includes: the electronic equipment performs matching in a third configuration file based on the identification of the activity, and determines a preset frame rate corresponding to the identification of the activity in the third configuration file as a preset frame rate corresponding to the first service.
In the above embodiment, the third configuration file may be regarded as the resource configuration file referred to in the embodiments below. Here, the resource configuration file contains the identification of the activity corresponding to a service together with the corresponding frame rate directly, so the electronic device may set different frame rates for different services.
With reference to the first aspect, in some embodiments, the electronic device includes an activity manager and an adaptive power manager APS that records the first configuration file, and determining the frame rate type corresponding to the identification of the activity in the first configuration file specifically includes: after the first application creates the activity corresponding to service 1 through the activity manager, the activity manager sends a first notification to the APS through a first interface, the first notification carrying the identification of the activity and the first state of the activity, where the first state indicates that the activity was created; after the APS determines, based on the first state in the first notification, that the state of the activity is created, it matches the frame rate type corresponding to the identification of the activity in the first configuration file.
With reference to the first aspect, in some embodiments, the electronic device further includes a camera hardware abstraction layer server that records the second configuration file, and after matching the frame rate type corresponding to the identification of the activity in the first configuration file, the method further includes: the APS sends the frame rate type to the camera hardware abstraction layer server. The electronic device determining the preset frame rate corresponding to the frame rate type in the second configuration file as the preset frame rate corresponding to the first service specifically includes: the camera hardware abstraction layer server determines the preset frame rate corresponding to the frame rate type in the second configuration file as the preset frame rate corresponding to the first service.
With reference to the first aspect, in some embodiments, the first service includes a live service or a video conference service.
In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors and memory; the memory is coupled to the one or more processors, the memory for storing computer program code comprising computer instructions that the one or more processors call to cause the electronic device to perform the method as implemented in the first aspect.
In a third aspect, embodiments of the present application provide a computer-readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as implemented in the first aspect.
In a fourth aspect, embodiments of the present application provide a chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform the method as implemented in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method as implemented in the first aspect.
It will be appreciated that the electronic device provided in the second aspect, the computer storage medium provided in the third aspect, the chip system provided in the fourth aspect, and the computer program product provided in the fifth aspect are all configured to perform the method provided by the embodiments of the present application. Therefore, for the advantages they achieve, reference may be made to the advantages of the corresponding method, which are not repeated here.
Drawings
FIG. 1 shows a schematic diagram of controlling a camera frame rate when an electronic device turns on a video conferencing service;
FIG. 2 shows a schematic diagram of controlling a camera frame rate when an electronic device switches cameras;
FIG. 3 illustrates an exemplary software architecture block diagram involved in controlling the frame rate of a camera by an electronic device;
FIG. 4 shows a schematic interaction flow diagram between modules of an electronic device when the frame rate of a camera is reduced in a scenario in which a service is created;
FIG. 5 is a schematic interaction flow chart among the modules when the electronic device restores the frame rate of the camera to the default configured frame rate in a scenario where the service is ended;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates to the contrary. It should also be understood that the term "and/or" as used in this disclosure refers to and encompasses any or all possible combinations of one or more of the listed items.
The terms "first," "second," and the like, are used below for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature, and in the description of embodiments of the application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In order to reduce the power consumption of a third-party application when it uses the camera, without requiring the third-party application to be redeveloped for that purpose, an embodiment of the application provides a camera control method. In the method, the electronic device monitors the running state of services within applications: after application 1 creates service 1, the electronic device matches service 1 against the resource configuration file and determines whether service 1 is recorded there. If service 1 is recorded in the resource configuration file, service 1 is a preset service that needs to invoke the camera, and the frame rate of the camera is lowered before service 1 invokes it, so that the camera consumes less power at the reduced frame rate. In some possible cases, images captured by the camera can be displayed in real time within the preset service for preview.
In some possible cases, the preset services include, but are not limited to, one or more of the following: live broadcast services, video conference services, and so on. In general, when a live broadcast service or a video conference service is turned on, the frame rate of the camera may be reduced to 15fps, or to another value such as 20fps; the embodiment of the present application does not limit this.
Wherein the frame rate of the camera represents the number of images acquired by the camera per unit time (1 s). The frame rate of the camera may also be the sampling rate at which the camera captures images.
The resource configuration file records some or all of the services on the electronic device whose use of the camera can cause excessive power consumption. These services may come from third-party applications, or from system applications in addition to third-party applications; the embodiment of the present application does not limit this.
The resource configuration file also records the frame rate or frame rate type corresponding to each service. The frame rate corresponding to a service is the frame rate of the camera when that service invokes the camera; likewise, the frame rate type corresponding to a service indicates the frame rate type of the camera when that service invokes the camera. A frame rate type in turn corresponds to a frame rate, and the frame rate corresponding to the frame rate type (and thus to the service corresponding to that frame rate type) is the frame rate of the camera when the camera is invoked for that service. The frame rate corresponding to a frame rate type is recorded in a configuration file other than the resource configuration file, which may be referred to as the frame rate configuration file. It should be understood that the frame rate of the camera may be a single value or a range.
Hereinafter, the description takes as an example the case in which the frame rate type corresponding to each service is recorded in the resource configuration file, and the frame rate corresponding to each frame rate type is recorded in the frame rate configuration file. Other cases may refer to this description and are not detailed in the embodiments of the present application.
The frame rates corresponding to different frame rate types differ: one frame rate may be a single value while another is a range, or the values of the frame rates may differ. Different services may correspond to the same frame rate type or to different frame rate types; the embodiment of the present application does not limit this.
The frame rate corresponding to a service is less than the frame rate of the camera's default configuration. When the frame rate corresponding to the service is a value and the default frame rate is a range, this may mean: the frame rate corresponding to the service is less than the minimum of the default frame rate range. When both the service's frame rate and the default frame rate are ranges, it may mean: the minimum of the service's frame rate range is larger than the minimum of the default range, and the maximum of the service's range is smaller than the maximum of the default range. When the service's frame rate is a range and the default frame rate is a value, the maximum of the service's frame rate range is smaller than the default frame rate.
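The value/range comparison rules enumerated above can be written out directly. The sketch below is illustrative only; it represents a range as a `(min, max)` tuple and follows the three enumerated cases as stated, plus a plain comparison for the value-vs-value case the text does not enumerate.

```python
# Sketch of the "service frame rate is less than the default frame rate"
# rules described above. A frame rate is either a single value (number)
# or a range given as a (min, max) tuple.

def is_lower_than_default(service_fr, default_fr):
    service_is_range = isinstance(service_fr, tuple)
    default_is_range = isinstance(default_fr, tuple)
    if not service_is_range and default_is_range:
        # value vs range: below the minimum of the default range
        return service_fr < default_fr[0]
    if service_is_range and default_is_range:
        # range vs range: nested strictly inside the default range
        return service_fr[0] > default_fr[0] and service_fr[1] < default_fr[1]
    if service_is_range and not default_is_range:
        # range vs value: the range's maximum is below the default value
        return service_fr[1] < default_fr
    # value vs value (not enumerated in the text): plain comparison
    return service_fr < default_fr

assert is_lower_than_default(15, (24, 30)) is True   # value vs range
assert is_lower_than_default((10, 20), (7, 30)) is True  # nested ranges
assert is_lower_than_default((10, 15), 30) is True   # range vs value
```

Note that the range-vs-range case as stated requires the service range to sit strictly inside the default range, not merely below it.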
The details of the resource configuration file may refer to the following description of step S104, and the details of the frame rate configuration file may refer to the following description of step S106, which will not be repeated here.
It should be appreciated that in order to reduce the power consumption of the camera, this may be done in other ways than down-regulating the frame rate of the camera. For example, the resolution at which the camera captures images may be reduced: the number of photosensitive elements in the camera that capture images is reduced. The embodiment of the present application is not limited thereto, and the control of the frame rate of the camera is described below as an example, but should not be construed as limiting the embodiment of the present application.
Some or all of the services whose camera use causes excessive power consumption may be video conference services, live broadcast services, and so on. Taking a video conference service as an example, the following describes in detail, in conjunction with fig. 1, how the electronic device reduces the frame rate of the camera when the video conference service is started.
As shown in (1) of fig. 1, the user interface 11 is the desktop of the electronic device. In response to an operation (e.g., a click) on the icon 111 corresponding to application A, the electronic device opens application A. Application A and the camera application are two independent applications; when the video conference service is started, application A can invoke the camera to capture images to realize the video conference. Before application A invokes the camera, the default configuration of the camera is illustrated with a frame rate of 30fps.
As shown in (2) of fig. 1, the user interface 12 is an exemplary user interface (which may also be referred to as a page) of application A before the video conference service is opened. In response to an operation on the start control 1211, the electronic device creates the video conference service and begins to pull up the hardware and software resources involved in conducting the video conference. The hardware resources pulled up by the electronic device include the camera. After creating the video conference service, the electronic device may switch the frame rate of the camera from 30fps to 15fps before turning the camera on.
Referring to (3) in fig. 1, the frame rate of the camera is adjusted from 30fps to 15fps after the video conference service is turned on.
In some possible cases, the frame rate of the camera may also be restored to the default configured frame rate after the electronic device has finished the videoconferencing service.
Referring to the user interface 13 shown in (3) of fig. 1 and the user interface 14 shown in (4) of fig. 1, in response to an operation for the end control 131, the electronic device ends the videoconference service, and the frame rate of the camera is restored from 15fps to 30fps.
In general, the cameras of an electronic device may include front-facing cameras (also referred to as front cameras) and rear-facing cameras (also referred to as rear cameras). In some possible implementations, reducing the frame rate of the camera includes reducing the frame rate of the front camera as well as reducing the frame rate of the rear camera.
In other possible implementations, turning down the frame rate of the camera includes turning down the frame rate of the first camera used (front or rear) after the conference is started, then monitoring whether the camera is switched; when a switch of the camera in use is detected, the frame rate of the camera used after the switch is also reduced. Referring to fig. 2, in response to an operation on the switch control 132, the electronic device switches the camera in use, for example from the front camera to the rear camera, and adjusts the frame rate of the rear camera to 15fps.
It should be understood that 30fps or 15fps means the camera can capture 30 or 15 frames of images, respectively, within 1s. The values 30fps and 15fps are merely illustrative; other values are possible in practice and should not be construed as limiting the embodiments of the application.
Fig. 3 shows an exemplary software architecture block diagram involved in controlling the frame rate of a camera by an electronic device.
The layered architecture divides the software into several layers, each with its own role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the system is divided into four layers: from top to bottom, the application layer, the application framework layer, the hardware abstraction layer, and the kernel layer.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The application layer may include a series of application packages.
As shown in fig. 3, the application package may include an application program (also referred to as an application) such as a launcher (launcher), an application 1, an adaptive power manager (adaptive power saving, APS), and the like.
Application 1 may be a third-party application, such as an entertainment or office application. The launcher and application 1 may together be used to create or close an activity, and to send the state of the activity to the activity manager described below when creating or closing it.
The creation activity may also be understood as creating a page, creating a service, etc. Closing activities may also be understood as closing pages, closing traffic, etc.
The APS may include a resource profile therein.
The APS may be used to register an activity lifecycle status notification interface with the activity manager referred to below, so as to monitor the status of activities (including created or closed states, etc.) in the various applications of the electronic device.
The APS may also be used to match service 1 against the resource configuration file after receiving, from the activity manager described below, a notification that application 1 has created service 1. If service 1 is recorded in the resource configuration file, the adaptive power manager may send the frame rate type corresponding to service 1 to the camera local interface described below.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 3, the application framework layer may include an activity manager (ActivityManager), a camera local interface (postcamera jni), and so on.
The activity manager may be used to manage the status of activities in the electronic device, scheduling the lifecycle of the activities.
The camera local interface may be used to, after receiving the frame rate type corresponding to service 1 sent by the APS, pass the frame rate type on to the next layer, such as the hardware abstraction layer.
The hardware abstraction layer may include a camera hardware abstraction layer server (camera halserver), a hardware configuration server (HwCfgServer), and the like.
The hardware configuration server may be configured to receive the frame rate type sent by the camera local interface and transmit it to a camera hardware abstraction layer server as referred to below.
A frame rate profile may be included in the camera hardware abstraction layer server, which determines, based on the frame rate type, the frame rate corresponding to that frame rate type in the frame rate configuration file, and issues the frame rate to the camera driver.
The kernel layer may include a camera driver (sensor driver) and a display driver.
The camera driver may be used to drive the camera to capture images, and to drive the camera to adjust its frame rate to the frame rate corresponding to service 1 based on the frame rate corresponding to the frame rate type.
It should be understood here that the camera referred to above belongs to a hardware layer, and includes modules such as an image sensor. Adjusting the frame rate of the camera may be seen as adjusting the sampling rate at which the image sensor captures images.
An exemplary process flow for controlling the frame rate of a camera by an electronic device is described in detail below in conjunction with fig. 3.
Step 1, the APS registers an activity period status notification interface with the activity manager. The activity period status notification interface may be invoked by the activity manager to send the status of activities to the APS. Wherein step 1 is circle mark (1) in fig. 3.
Step 2, the application 1 creates an activity corresponding to the service 1 through the initiator and the activity manager. Wherein step 2 is circle mark (2) in fig. 3.
Step 3, the activity manager invokes the activity period status notification interface to send the identifier of the activity created by application 1 to the APS, to notify the APS that service 1 of application 1 is created. Wherein step 3 is circle mark (3) in fig. 3.
Step 4, the APS determines, through the identifier of the activity, the frame rate type corresponding to that identifier in the resource configuration file as the frame rate type corresponding to service 1, and sends the frame rate type to the camera hardware abstraction layer server through the camera local interface and the hardware configuration server. Wherein step 4 is circle mark (4) in fig. 3.
Step 5, the camera hardware abstraction layer server determines the frame rate corresponding to the frame rate type in the frame rate configuration file, and adjusts the frame rate of the camera to that frame rate through the camera driver. Wherein step 5 is circle mark (5) in fig. 3.
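The five steps above can be sketched as follows. This is a minimal Python simulation of the module interactions, not the actual Android implementation; all class names, the activity identifier, and the frame rate values are hypothetical placeholders for the components named in fig. 3.

```python
class CameraDriver:
    """Stands in for the kernel-layer camera driver."""
    def __init__(self, default_fps=30):
        self.fps = default_fps  # default configured frame rate (hypothetical)

    def set_frame_rate(self, fps):
        self.fps = fps


class CameraHalServer:
    """Step 5: map the frame rate type to a frame rate, drive the camera."""
    FRAME_RATE_PROFILE = {"type_low": 10}  # hypothetical frame rate profile

    def __init__(self, driver):
        self.driver = driver

    def apply(self, frame_rate_type):
        self.driver.set_frame_rate(self.FRAME_RATE_PROFILE[frame_rate_type])


class Aps:
    """Step 4: match the activity identifier in the resource profile."""
    RESOURCE_PROFILE = {"com.example.office/VideoConf": "type_low"}  # hypothetical

    def __init__(self, hal):
        self.hal = hal

    def on_activity_state(self, activity_id, state):  # step 3 callback
        if state == "created" and activity_id in self.RESOURCE_PROFILE:
            self.hal.apply(self.RESOURCE_PROFILE[activity_id])


class ActivityManager:
    def __init__(self):
        self.listener = None

    def register(self, listener):          # step 1: APS registers the interface
        self.listener = listener

    def create_activity(self, activity_id):  # step 2: application creates activity
        if self.listener:
            self.listener.on_activity_state(activity_id, "created")


driver = CameraDriver()
am = ActivityManager()
am.register(Aps(CameraHalServer(driver)))
am.create_activity("com.example.office/VideoConf")
print(driver.fps)  # frame rate lowered from the default 30 to 10
```

The hardware configuration server and camera local interface are omitted here since, per the text, they only relay the frame rate type between the APS and the camera hardware abstraction layer server.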
It should be understood herein that the software modules shown in fig. 3 are exemplary, and that more or fewer software modules may be included in practice. For example, in fig. 3, different software modules may be combined into one, and the same software module may be functionally split into two or more software modules. The level of each software module is also illustrated and can be adjusted according to actual situations. And should not be construed as limiting embodiments of the application.
Fig. 4 shows a schematic interaction flow diagram between modules of an electronic device when the frame rate of a camera is reduced in a scenario in which a service is created.
The process by which the electronic device controls the frame rate of the camera is described below in conjunction with fig. 3 and 4. The description of this process may refer to the following descriptions of step S101 to step S110.
S101. The APS registers an activity period status notification interface with the activity manager.
Registering the activity period status (activitylestate) notification interface establishes a connection between the APS and the activity manager. Subsequently, the activity manager may send the status of the activities in each application (including created or closed status, etc.) to the APS by invoking the activity period status notification interface.
It may also be understood that the APS detects the status of activities in the applications of the electronic device through the activity period status notification interface.
S102, the application 1 creates the service 1 through the activity manager.
Creating service 1 may also be understood as creating the activity corresponding to service 1. After application 1 creates service 1 through the activity manager, the electronic device may begin pulling up the hardware resources and software resources needed to open service 1 (for example, to open the video conference), and then open service 1. The hardware resources for opening service 1 include the camera.
Referring to (2) in fig. 1 described above, the application 1 may be the aforementioned application A (an office-type application), and in response to an operation on the start control 1211, the application 1 may create the service 1 through the activity manager.
S103. The activity manager sends a notification 1 that application 1 has created service 1 to the APS, wherein notification 1 carries service information.
In some possible cases, the activity manager invokes the activity period status notification interface to send notification 1 to the APS. The service information in notification 1 may include the identifier of the activity created by application 1. The identifier of the activity may include the package name of application 1 and the page name corresponding to service 1. The service information further includes the state of the activity corresponding to service 1, which here is creation.
S104, the APS matches the frame rate type of the service 1 in the resource configuration file based on the service information carried in the notification 1, and the frame rate type is recorded as the frame rate type 1.
In some possible cases, the resource configuration file records some or all of the services whose use of the camera can cause excessive power consumption in the electronic device, together with the frame rate types corresponding to those services.
In the resource profile, a service is represented by the identifier of the activity created when the service is created. Table 1 shows an exemplary resource profile.
TABLE 1
In table 1, in addition to the services and the frame rate types corresponding to the services, the resource profile may also include other content, such as a validation state. The validation state indicates whether the frame rate of the camera is adjusted when the service invokes the camera. For example, if the validation state corresponding to a service is 1, it indicates that when the service calls the camera, the frame rate of the camera needs to be adjusted. If the validation state corresponding to a service is 0, it indicates that when the service calls the camera, the frame rate of the camera does not need to be adjusted.
The APS matching the frame rate type of service 1 in the resource configuration file based on the service information carried in notification 1 includes: when the APS determines that the state of the activity corresponding to service 1 in the service information is creation, the APS matches, based on the identifier of the activity carried in the service information, whether a service with the same activity identifier exists in the resource configuration file. If so, the frame rate type corresponding to that service is further determined.
It should be noted that, in some possible cases, the content recorded in the resource configuration file in step S104 may not be the services and the frame rate types corresponding to the services, but rather the services and the frame rate type identifiers corresponding to the services. In that case, the APS matching the frame rate type of service 1 in the resource configuration file based on the service information carried in notification 1 includes: the APS matches the frame rate type identifier of service 1 in the resource configuration file based on the service information carried in notification 1, and then determines the frame rate type corresponding to that frame rate type identifier. The correspondence between frame rate type identifiers and frame rate types may also be recorded in the APS. For example, the APS may determine, through the frame rate type identifier 501X, that the corresponding frame rate type is 3011X.
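The indirect lookup described above can be sketched as follows. The profile contents and identifiers are hypothetical (only 501X and 3011X come from the text); this assumes the two-table variant where the resource configuration file stores frame rate type identifiers rather than frame rate types.

```python
# Resource configuration file: activity identifier -> frame rate type identifier.
RESOURCE_PROFILE = {"com.example.office/VideoConf": "501X"}  # hypothetical entry

# Correspondence recorded in the APS: type identifier -> frame rate type.
TYPE_ID_TO_TYPE = {"501X": "3011X"}

def match_frame_rate_type(activity_id, state):
    """Return the frame rate type for a newly created service, or None."""
    if state != "created":
        return None  # only the creation state triggers matching in S104
    type_id = RESOURCE_PROFILE.get(activity_id)
    return TYPE_ID_TO_TYPE.get(type_id) if type_id else None

print(match_frame_rate_type("com.example.office/VideoConf", "created"))  # 3011X
print(match_frame_rate_type("com.example.other/Page", "created"))        # None
```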
S105. The APS sends frame rate type 1 to the camera hardware abstraction layer server.
In some possible cases, referring to the foregoing description of fig. 3, the APS may send frame rate type 1 to the camera hardware abstraction layer server through the camera local interface and the hardware configuration server.
S106, the camera hardware abstraction layer server matches the frame rate corresponding to the frame rate type 1 in the frame rate configuration file and marks the frame rate as the frame rate 1.
In some possible cases, the frame rate profile records frame rate types and the frame rates corresponding to those frame rate types. Different services may correspond to the same frame rate type or to different frame rate types. The frame rates corresponding to different frame rate types may differ: a frame rate may be a single value or a range, and the values themselves may also differ. The relevant description of the frame rate may refer to the foregoing content and will not be repeated here.
Table 2 shows an exemplary frame rate profile.
TABLE 2
As shown in table 2, when the minimum frame rate value is equal to the maximum frame rate value, this indicates that the frame rate is one value.
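A frame rate profile entry of this shape can be sketched as follows. The type names and the numeric values are hypothetical; the only rule taken from table 2 is that equal minimum and maximum values denote a single fixed frame rate, while unequal values denote a range.

```python
# Frame rate profile: frame rate type -> (minimum value, maximum value).
FRAME_RATE_PROFILE = {
    "type_fixed": (10, 10),  # min == max: the frame rate is one value
    "type_range": (5, 15),   # min != max: the frame rate is a range
}

def resolve(frame_rate_type):
    """Return a single frame rate value, or a (min, max) range."""
    lo, hi = FRAME_RATE_PROFILE[frame_rate_type]
    return lo if lo == hi else (lo, hi)

print(resolve("type_fixed"))  # 10
print(resolve("type_range"))  # (5, 15)
```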
S107, the camera hardware abstraction layer server switches the frame rate of the camera from the frame rate 2 to the frame rate 1, and the frame rate 1 is smaller than the frame rate 2.
The frame rate 2 can be understood as the frame rate of the default configuration of the camera referred to previously. For a detailed description of the frame rate 1 being less than the frame rate 2, reference is made to the foregoing related contents, and a detailed description thereof will be omitted herein.
A schematic diagram of switching the frame rate of the camera from the frame rate 2 to the frame rate 1 may refer to the related content shown in fig. 1 (2) and fig. 1 (3) described above.
When the frame rate of the camera is switched to frame rate 1 and frame rate 1 is a range, the camera may determine a frame rate value within frame rate 1 according to parameters such as the illumination intensity of the shooting environment, and collect images at that value. The frame rate value is positively correlated with the illumination intensity of the shooting environment: the stronger the illumination, the larger the frame rate value; the weaker the illumination, the smaller the frame rate value. The reason is as follows: the stronger the illumination of the shooting environment, the faster the shutter speed at which the camera can acquire one frame of image (compared with a weakly lit environment), so more images can be acquired per unit time. The weaker the illumination, the slower the shutter speed must be so that sufficient light enters the image sensor of the camera and a clear image can be presented, so fewer images can be acquired per unit time (compared with a strongly lit environment). The illumination intensity of the shooting environment may be confirmed as follows: the electronic device may detect the illumination intensity of the shooting environment through the ambient light sensor.
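The mapping from illumination to a frame rate value within the range can be sketched as follows. The text only requires a positive correlation, not any particular formula, so the linear mapping and the saturation point (1000 lux) here are assumptions for illustration.

```python
def frame_rate_for_lux(lux, fps_min, fps_max, lux_max=1000.0):
    """Pick a frame rate value in [fps_min, fps_max] from ambient light.

    Stronger illumination -> larger frame rate value; the result is
    clamped so it always stays within the configured range.
    """
    ratio = min(max(lux / lux_max, 0.0), 1.0)  # normalize and clamp to [0, 1]
    return round(fps_min + ratio * (fps_max - fps_min))

print(frame_rate_for_lux(1000, 5, 15))  # bright scene -> 15
print(frame_rate_for_lux(0, 5, 15))     # dark scene   -> 5
```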
S108, the camera controls the sampling rate of the acquired image based on the frame rate 1.
When frame rate 1 is F1, the camera may control the image sensor to capture F1 frames of images per unit time.
S109, the camera sends the image to the application 1.
S110. The application 1 displays the image.
The image collected by the camera can be displayed in real time in the application 1: the user interface in the application 1 associated with the activity corresponding to service 1 may display the image collected by the camera. An example of this process may refer to the content shown in (3) of fig. 1 described above.
In some possible implementations, on the basis of the foregoing steps S101-S110, if the resource configuration file records frame rate types, the identifiers of activities recorded in the resource configuration file include application package names, and the frame rate configuration file records the frame rates corresponding to the frame rate types, then services may be added to the resource configuration file as follows. For an application installed on the electronic device for the first time (denoted as application 2), the electronic device may match in the resource configuration file based on the package name corresponding to application 2. If it is determined that the same package name as application 2 exists in the resource configuration file, the process ends. If it is determined that the same package name as application 2 does not exist in the resource configuration file, the electronic device may monitor whether application 2 uses the camera, and monitor the power consumption of the camera when application 2 uses it. If the power consumption is greater than a preset power consumption threshold, the electronic device records in the resource configuration file the identifier of the activity at the time application 2 calls the camera, and records data such as the corresponding frame rate type for that identifier. The frame rate type of application 2 may be determined based on the service type of the activity in application 2 when the camera is invoked; when the service type of the activity of application 2 cannot be determined, a preset frame rate type may be used instead.
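The profile-update path for a newly installed application can be sketched as follows. The threshold value, package names, and frame rate type names are hypothetical; the logic follows the steps above (package-name match, power-consumption check, fall back to a preset type when the service type is unknown).

```python
POWER_THRESHOLD_MW = 500  # hypothetical preset power consumption threshold

def maybe_record(profile, package, activity_id, camera_power_mw,
                 service_type=None, default_type="type_default"):
    """Add a newly installed app's activity to the resource profile if needed."""
    # Package name already recorded (activity identifiers include it): done.
    if any(key.startswith(package) for key in profile):
        return profile
    # Camera power draw exceeds the threshold: record the activity identifier
    # with the service-derived frame rate type, or a preset default.
    if camera_power_mw > POWER_THRESHOLD_MW:
        profile[activity_id] = service_type or default_type
    return profile

profile = maybe_record({}, "com.example.new", "com.example.new/Live", 800)
print(profile)  # the unknown-type activity is recorded with the preset type
```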
It should be understood here that, in some possible cases, the electronic device may adjust the frame rate of the camera when the preset service is created, so as to reduce the power consumption of the camera while maintaining the proper operation of the electronic device. When the preset service ends, the frame rate of the camera can be restored to the default configured frame rate, so that images are collected at the default configured frame rate when the camera is subsequently used by other applications.
Fig. 5 shows a schematic interaction flow chart between the modules when the electronic device resumes the frame rate of the camera to the default configured frame rate in the scenario of ending the service. The description of this process may refer to the following descriptions of step S201 to step S205.
S201, the application 1 ends the service 1 through the activity manager.
Ending service 1 may also be understood as closing the corresponding activity of service 1.
Referring to the foregoing illustration of (3) in fig. 1, the application 1 may be the aforementioned application A (an office-type application), and in response to an operation on the end control 131, the application 1 may end the service 1 through the activity manager.
S202. The activity manager sends a notification 2 that service 1 of application 1 is closed to the APS, wherein notification 2 carries service information.
In some possible cases, the activity manager invokes the activity period status notification interface to send notification 2 to the APS. The service information in notification 2 may include the identifier of the activity closed by application 1. The identifier of the activity may include the package name of application 1 and the page name corresponding to service 1. The service information further includes the state of the activity corresponding to service 1, which here is closed.
S203. The APS matches service 1 in the resource configuration file based on the service information carried in notification 2.
The APS matching service 1 in the resource configuration file based on the service information carried in notification 2 includes: when the APS determines that the state of the activity corresponding to service 1 in the service information is closed, the APS matches, based on the identifier of the activity carried in the service information, whether a service with the same activity identifier exists in the resource configuration file. If so, the following step S204 may be performed to restore the frame rate of the camera to the default configured frame rate.
S204. The APS sends the default configured frame rate to the camera hardware abstraction layer server.
In some possible cases, referring to the foregoing description of fig. 3, the APS may send a default configured frame rate to the camera hardware abstraction layer server through the camera local interface and the hardware configuration server.
S205, the camera hardware abstraction layer server switches the frame rate of the camera from the frame rate 1 to the frame rate of the default configuration.
A schematic diagram of switching the frame rate of the camera from the frame rate 1 to the frame rate of the default configuration may refer to the related content shown in (3) of fig. 1 and (4) of fig. 1 described above.
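The close path (S201-S205) can be sketched as follows, mirroring the creation path: when the matched activity's state is closed, the default frame rate flows down to the camera. The values and the activity identifier are hypothetical placeholders.

```python
DEFAULT_FPS = 30  # hypothetical default configured frame rate (frame rate 2)
RESOURCE_PROFILE = {"com.example.office/VideoConf"}  # services subject to adjustment

class Camera:
    def __init__(self):
        self.fps = 10  # currently lowered to frame rate 1 for service 1

def on_activity_closed(camera, activity_id):
    """S203-S205: match the closed service, then restore the default frame rate."""
    if activity_id in RESOURCE_PROFILE:
        camera.fps = DEFAULT_FPS  # HAL server drives the camera back to default

cam = Camera()
on_activity_closed(cam, "com.example.office/VideoConf")
print(cam.fps)  # restored from 10 to 30
```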
An exemplary electronic device provided by an embodiment of the present application is described below.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The embodiment will be specifically described below with reference to an electronic device as an example. It should be understood that an electronic device may have more or fewer components than shown in fig. 6, may combine two or more components, or may have a different configuration of components. The various components shown in fig. 6 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can be a neural center and a command center of the electronic device. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The electronic device may implement shooting functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
In some possible cases, the video camera 193 may also be referred to as a camera 193.
In the embodiment of the present application, the processor 110 may call the computer instructions stored in the internal memory 121, so that the terminal executes the camera control method in the embodiment of the present application.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …" depending on the context. Similarly, the phrase "at the time of determination …" or "if detected (a stated condition or event)" may be interpreted to mean "if determined …" or "in response to determination …" or "at the time of detection (a stated condition or event)" or "in response to detection (a stated condition or event)" depending on the context.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that implementing all or part of the above-described method embodiments may be accomplished by a computer program to instruct related hardware, the program may be stored in a computer readable storage medium, and the program may include the above-described method embodiments when executed. And the aforementioned storage medium includes: ROM or random access memory RAM, magnetic or optical disk, etc.

Claims (13)

1. A camera control method, the method comprising:
the electronic equipment runs a first application;
when the first application creates an activity of a first service, the electronic equipment determines that the first service is a preset service based on the identification of the activity;
when the active state is creation, the electronic equipment reduces the first parameter of the camera from a default value to a preset value corresponding to the first service;
the electronic device collects images through the camera by utilizing the preset value, and the images are displayed in the user interface associated with the activities in the first application.
2. The method according to claim 1, wherein the method further comprises:
When the first application closes the activity of the first service, the electronic equipment determines that the first service is a preset service based on the identification of the activity;
and when the active state is closed, the electronic equipment restores the first parameter of the camera from the preset value corresponding to the first service to the default value.
3. The method according to claim 1 or 2, wherein when the first parameter is resolution, the preset value corresponding to the first service is a preset resolution corresponding to the first service; the resolution is the number of photosensitive elements utilized by the camera to capture an image.
4. The method according to claim 1 or 2, wherein when the first parameter is a frame rate, the preset value corresponding to the first service is a preset frame rate corresponding to the first service; the frame rate is the sampling rate at which the camera captures images in a unit time.
5. The method according to claim 4, wherein when the preset frame rate corresponding to the first service is a range, the electronic device acquires an image by using the preset value through the camera, specifically including:
the electronic equipment determines a frame rate value of the camera in the preset frame rate based on illumination intensity; the frame rate value is positively correlated to the illumination intensity;
The electronic device acquires an image through the camera using the frame rate value.
6. The method of claim 4, wherein before the electronic device adjusts the first parameter of the camera from a default value to the preset value corresponding to the first service, the method further comprises:
the electronic equipment performs matching in a first configuration file based on the identification of the activity, and determines a frame rate type corresponding to the identification of the activity in the first configuration file;
and the electronic equipment determines a preset frame rate corresponding to the frame rate type in a second configuration file as the preset frame rate corresponding to the first service.
7. The method of claim 4, wherein before the electronic device adjusts the first parameter of the camera from a default value to the preset value corresponding to the first service, the method further comprises:
and the electronic equipment performs matching in a third configuration file based on the identification of the activity, and determines a preset frame rate corresponding to the identification of the activity in the third configuration file as the preset frame rate corresponding to the first service.
8. The method according to claim 6, wherein the electronic device comprises an activity manager and an adaptive power manager APS that records a first profile, and wherein determining the frame rate type corresponding to the identification of the activity in the first profile comprises:
After the first application creates the activity with the first service through the activity manager, the activity manager sends a first notification to the APS through a first interface; wherein, the first notice carries the mark of the activity and the first state of the activity; the first state indicates that the activity was created;
after the APS determines that the state of the activity is created based on the first state of the activity in the first notification, the frame rate type corresponding to the identifier of the activity is matched in the first configuration file.
9. The method of claim 8, wherein the electronic device further comprises a camera hardware abstraction layer server having a second profile recorded therein, the method further comprising, after matching the frame rate type corresponding to the identification of the activity in the first profile:
the APS sends the frame rate type to the camera hardware abstraction layer server;
and the step in which the electronic device determines the preset frame rate corresponding to the frame rate type in the second configuration file as the preset frame rate corresponding to the first service specifically comprises:
the camera hardware abstraction layer server determines the preset frame rate corresponding to the frame rate type in the second configuration file as the preset frame rate corresponding to the first service.
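The two-stage flow recited in claims 6, 8, and 9 can be sketched as follows: the APS matches the activity identification against the first configuration file to obtain a frame rate type, forwards that type to the camera hardware abstraction layer server, and the server resolves it to a preset frame rate via the second configuration file. The class names, state strings, and table contents are assumptions for illustration, not taken from the patent.

```python
from typing import Optional

# Hypothetical first configuration file (held by the APS):
# activity identification -> frame rate type.
FIRST_CONFIG = {
    "com.example.live/.LiveActivity": "LOW_POWER_VIDEO",
    "com.example.meeting/.ConferenceActivity": "LOW_POWER_VIDEO",
}

# Hypothetical second configuration file (held by the camera HAL server):
# frame rate type -> preset frame rate (fps).
SECOND_CONFIG = {
    "LOW_POWER_VIDEO": 15,
}


class CameraHalServer:
    """Resolves a frame rate type to a preset frame rate (claim 9)."""

    def resolve(self, frame_rate_type: str) -> Optional[int]:
        return SECOND_CONFIG.get(frame_rate_type)


class Aps:
    """Reacts to activity-state notifications from the activity manager."""

    def __init__(self, hal: CameraHalServer) -> None:
        self.hal = hal

    def on_activity_notification(self, activity_id: str, state: str) -> Optional[int]:
        # Only act once the notification's state indicates the activity
        # has been created (claim 8).
        if state != "CREATED":
            return None
        frame_rate_type = FIRST_CONFIG.get(activity_id)
        if frame_rate_type is None:
            return None
        # Forward the matched type to the camera HAL server, which performs
        # the second lookup (claim 9).
        return self.hal.resolve(frame_rate_type)
```

Splitting the mapping across two files lets many activities share one frame rate type while the type-to-frame-rate table stays in the camera HAL layer, which is where the frame rate is ultimately applied.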
10. The method of any one of claims 1, 2, and 5-9, wherein the first service comprises a live streaming service or a video conferencing service.
11. An electronic device, comprising: one or more processors and a memory; wherein the memory is coupled to the one or more processors and is configured to store computer program code comprising computer instructions, and the one or more processors invoke the computer instructions to cause the electronic device to perform the method of any one of claims 1-10.
12. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1-10.
13. A chip system applied to an electronic device, wherein the chip system comprises one or more processors configured to invoke computer instructions to cause the electronic device to perform the method of any one of claims 1-10.
CN202311377967.7A 2023-10-24 2023-10-24 Camera control method and electronic device Active CN117119295B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311377967.7A CN117119295B (en) 2023-10-24 2023-10-24 Camera control method and electronic device

Publications (2)

Publication Number Publication Date
CN117119295A true CN117119295A (en) 2023-11-24
CN117119295B CN117119295B (en) 2024-04-12

Family

ID=88813235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311377967.7A Active CN117119295B (en) 2023-10-24 2023-10-24 Camera control method and electronic device

Country Status (1)

Country Link
CN (1) CN117119295B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107172345A (en) * 2017-04-07 2017-09-15 深圳市金立通信设备有限公司 Image processing method and terminal
CN108881781A (en) * 2018-07-17 2018-11-23 广东小天才科技有限公司 Method and device for determining resolution in video call process
CN109005578A (en) * 2018-08-14 2018-12-14 广东小天才科技有限公司 Method for reducing power consumption of video call and wearable device
CN111372004A (en) * 2019-04-25 2020-07-03 深圳市泰衡诺科技有限公司 Camera control method, mobile terminal and computer-readable storage medium
CN114727004A (en) * 2021-01-05 2022-07-08 北京小米移动软件有限公司 Image acquisition method and device, electronic equipment and storage medium
CN115543061A (en) * 2022-04-12 2022-12-30 荣耀终端有限公司 Power consumption control method and electronic equipment
US20230319395A1 (en) * 2020-08-31 2023-10-05 Huawei Technologies Co., Ltd. Service processing method and device

Also Published As

Publication number Publication date
CN117119295B (en) 2024-04-12

Similar Documents

Publication Publication Date Title
EP3893491A1 (en) Method for photographing the moon and electronic device
CN113885759B (en) Notification message processing method, device, system and computer readable storage medium
CN113810600B (en) Terminal image processing method and device and terminal equipment
EP3893495B1 (en) Method for selecting images based on continuous shooting and electronic device
CN113810601B (en) Terminal image processing method and device and terminal equipment
EP4395294A1 (en) Quick photographing method, electronic device, and computer readable storage medium
EP4160373A1 (en) Screenshot method and electronic device
EP4280586A1 (en) Point light source image detection method and electronic device
CN115567630B (en) Electronic equipment management method, electronic equipment and readable storage medium
WO2023077939A1 (en) Camera switching method and apparatus, and electronic device and storage medium
CN116055897B (en) Photographing method and related equipment thereof
CN116074634B (en) Exposure parameter determination method and device
CN114442970B (en) Screen projection method of application window and electronic equipment
CN115526787A (en) Video processing method and device
US20230335081A1 (en) Display Synchronization Method, Electronic Device, and Readable Storage Medium
CN117119291B (en) Picture mode switching method and electronic equipment
CN115426449B (en) Photographing method and terminal
CN117119295B (en) Camera control method and electronic device
CN117593236A (en) Image display method and device and terminal equipment
CN112929854B (en) Event subscription method and electronic equipment
CN115706869A (en) Terminal image processing method and device and terminal equipment
CN116095509B (en) Method, device, electronic equipment and storage medium for generating video frame
CN116233599B (en) Video mode recommendation method and electronic equipment
CN114520870B (en) Display method and terminal
CN115696067B (en) Image processing method for terminal, terminal device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant