CN113179369B - Shot picture display method, mobile terminal and storage medium


Info

Publication number: CN113179369B
Authority: CN (China)
Prior art keywords: application, picture, camera application, preset data, display parameters
Legal status: Active
Application number: CN202110376895.9A
Other languages: Chinese (zh)
Other versions: CN113179369A
Inventors: 缪威, 许美
Current assignee: Chongqing Chuanyin Communication Technology Co ltd
Original assignee: Chongqing Chuanyin Communication Technology Co ltd
Application filed by Chongqing Chuanyin Communication Technology Co ltd
Priority to CN202110376895.9A
Publication of CN113179369A
Application granted
Publication of CN113179369B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/64 Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Telephone Function (AREA)

Abstract

The application relates to a method for displaying a shot picture, a mobile terminal, and a storage medium. After a camera application is started, picture display parameters adapted to the camera application are obtained from preset data, where the preset data include picture display parameters adapted to different camera applications and the picture display parameters include a frame rate and/or a resolution; the shot picture is then displayed according to the acquired picture display parameters so as to correct the size and/or image quality of the shot picture. In this way, a camera installed on the terminal can be matched with suitable picture display parameters, achieving the effect of correcting the size and/or image quality of the shot picture and avoiding problems such as stretching, blurring, and dim pictures caused by compatibility issues between the terminal and different cameras; the use experience of third-party cameras in particular can be significantly improved.

Description

Shot picture display method, mobile terminal and storage medium
Technical Field
The application relates to the technical field of shooting, in particular to a shot picture display method, a mobile terminal and a storage medium.
Background
With the rapid development of terminal technology, the functions of mobile terminals such as mobile phones and tablet computers continue to improve, and mobile terminals have become one of the common tools in daily life and work; shooting with a camera application installed on the terminal is increasingly common. The camera application may be an application preinstalled on the terminal at the factory or a third-party application downloaded by the user. In the process of conceiving and realizing the present application, the inventors found that, regardless of which application interface is used, a current third-party camera application or preinstalled application obtains the resolution list/frame rate list supported by the bottom layer of the system and then selects one fixed resolution/frame rate from that list to display the picture.
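As a non-limiting illustration of this background, the following Java sketch shows how an application using the legacy android.hardware.Camera interface (referred to below as api1) typically queries the resolution and frame-rate lists exposed by the bottom layer and then fixes one entry from each; the selection logic is an assumption of the example, not part of this application.

```java
import android.hardware.Camera;
import java.util.List;

public class LegacyCameraQuery {
    // Illustrative only: query the bottom-layer lists and pick one fixed entry,
    // regardless of whether it suits the terminal's screen or panel.
    @SuppressWarnings("deprecation") // android.hardware.Camera is the legacy api1 path
    public static void openAndConfigure(int cameraId) {
        Camera camera = Camera.open(cameraId);
        Camera.Parameters params = camera.getParameters();

        // Resolution list supported by the bottom layer of the system.
        List<Camera.Size> sizes = params.getSupportedPreviewSizes();
        // Frame-rate ranges supported by the bottom layer (values are fps * 1000).
        List<int[]> fpsRanges = params.getSupportedPreviewFpsRange();

        // A typical application simply fixes one entry from each list.
        Camera.Size chosen = sizes.get(0);
        int[] chosenFps = fpsRanges.get(0);

        params.setPreviewSize(chosen.width, chosen.height);
        params.setPreviewFpsRange(chosenFps[0], chosenFps[1]);
        camera.setParameters(params);
    }
}
```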
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides a shot picture display method, a mobile terminal, and a storage medium, which can correct the size and/or image quality of a shot picture and significantly improve the use experience of third-party cameras.
In order to solve the above technical problem, the present application provides a shot picture display method, including:
starting a camera application;
acquiring picture display parameters adaptive to the camera application from preset data, wherein the picture display parameters comprise a frame rate and/or a resolution;
and displaying the shooting picture according to the acquired picture display parameters so as to correct the size and/or the picture quality of the shooting picture.
Optionally, the preset data comprise picture display parameters adapted to different camera applications.
Optionally, the acquiring, from preset data, a picture display parameter adapted to the camera application includes:
acquiring application data of the camera application, wherein the application data comprises at least one of an application interface and camera identification information;
matching at least one target parameter set in the preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
and confirming at least one value or value range from the at least one target parameter set according to the current working mode so as to obtain picture display parameters adapted to the camera application.
Optionally, the matching at least one target parameter set in the preset data according to the application data includes:
determining a system architecture layer for calling the picture display parameters according to the application data, wherein the system architecture layer comprises a hardware abstraction layer and/or an application framework layer;
determining the at least one target parameter set based on the preset data associated with the system architecture layer.
Optionally, the determining the at least one target parameter set based on the preset data associated with the system architecture layer includes:
and when the system architecture layer for calling the picture display parameters is an application framework layer, determining the target parameter set from the preset data associated with the application framework layer according to an application interface adapted to the camera application.
Optionally, before the starting the camera application, the method further includes:
and storing preset data associated with each system architecture layer, wherein the preset data corresponding to the frame rate and the preset data of the resolution adapted to a part of the camera applications are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the other part of the camera applications are stored in association with the application framework layer according to the application interface adapted to the camera application.
Optionally, the method further comprises:
when the preset data of the resolution are stored, the value or the numerical value range of the resolution is set to be smaller than or equal to the value or the numerical value range of the resolution corresponding to the preset maximum picture size.
Optionally, the operating mode includes at least one of a photographing mode, a video recording mode or a preview mode, and the determining at least one value or value range from the at least one target parameter set according to the current operating mode includes:
confirming at least one resolution value from the at least one target parameter set according to the current working mode; and/or,
and confirming at least one frame rate value range from the at least one target parameter set according to the current working mode.
Optionally, after the starting the camera application, the method further includes:
determining whether the camera application is a system application;
if not, executing the step of acquiring the picture display parameters adapted to the camera application from preset data; and/or,
and if so, displaying a shooting picture according to the picture display parameters configured by the camera application.
Optionally, the preset data is stored in an xml parsing manner.
The present application also provides a second shot picture display method, including:
starting a camera application;
determining whether the size and/or the image quality of a shooting picture of the camera application is matched with a terminal parameter;
if not, determining picture display parameters for correcting the shot picture, wherein the picture display parameters comprise a frame rate and/or a resolution;
and displaying the shooting picture according to the determined parameter value.
Optionally, in a second method, the determining whether the size and/or the quality of the captured picture of the camera application is adapted to the terminal parameter includes:
judging whether the camera application is a preset camera application or not;
and if so, determining that the size and/or the image quality of the shooting picture of the camera application is not matched with the terminal parameters.
Optionally, in a second method, the determining picture display parameters for correcting the captured picture includes:
acquiring application data of the camera application, wherein the application data comprises at least one of an application interface and identification information;
matching at least one target parameter set in preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
and confirming at least one value or value range from the at least one target parameter set according to the current working mode so as to obtain picture display parameters for correcting the shooting picture.
Optionally, in the second method, the matching, according to the application data, at least one target parameter set in the preset data includes:
determining a system architecture layer for calling the picture display parameters according to the identification information, wherein the system architecture layer comprises a hardware abstraction layer and/or an application framework layer;
determining the at least one target parameter set based on the preset data associated with the system architecture layer.
Optionally, in the second method, before the starting of the camera application, the method further includes:
and storing preset data associated with each system architecture layer, wherein the preset data corresponding to the frame rate and the preset data of the resolution adapted to a part of the camera application are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the other part of the camera application are stored in association with the application framework layer according to the application interface adapted to the camera application.
Optionally, in the second method, the screen display parameters are stored in an xml parsing manner.
The present application further provides a mobile terminal, the mobile terminal including: the display method comprises a memory and a processor, wherein the memory stores a display program of a shooting picture, and the display program of the shooting picture realizes the steps of the display method of the shooting picture when being executed by the processor.
The present application also provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the display method of a captured picture as described above.
As described above, the shot picture display method of the present application is applied to a mobile terminal: after a camera application is started, picture display parameters adapted to the camera application are obtained from preset data, where the preset data include picture display parameters adapted to different camera applications and the picture display parameters include a frame rate and/or a resolution; the shot picture is then displayed according to the acquired picture display parameters so as to correct the size and/or image quality of the shot picture. In this way, a camera installed on the terminal can be matched with suitable picture display parameters, achieving the effect of correcting the size and/or image quality of the shot picture and avoiding problems such as stretching, blurring, and dim pictures caused by compatibility issues between the terminal and different cameras; the use experience of third-party cameras in particular can be significantly improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly described below; those skilled in the art can obtain other drawings based on these drawings without inventive effort.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a display method of a photographed picture according to the first embodiment;
FIG. 4 is a diagram illustrating data setup based on a system architecture implementation according to a first embodiment;
fig. 5 is a flowchart illustrating a display method of a photographed picture according to the second embodiment.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element. In addition, elements having the same designation may or may not have the same meaning in different embodiments of the application; their particular meaning is determined by their interpretation in the specific embodiment or by further reference to the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to a determination," depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or," "and/or," "including at least one of the following," and the like, as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; by way of further example, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition will occur only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time but may be performed at different times and in different orders, and they may be performed alternately or in turns with at least a part of other steps or of the sub-steps or stages of other steps.
The words "if", as used herein may be interpreted as "at \8230; \8230whenor" when 8230; \8230when or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that step numbers such as 310, 320, 330, etc. are used herein for the purpose of more clearly and briefly describing the corresponding content, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform 310 after 320 in the specific implementation, but these should be within the scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for the convenience of description of the present application, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, A/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000 ), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help a user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is understood that the module is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor that may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the gesture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Alternatively, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, and optionally, the program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, etc. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption management functions via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present disclosure. The communication network system is an LTE system of universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are communicatively connected in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Alternatively, the eNodeB2021 may be connected with other eNodeBs 2022 through a backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME2031 is a control node that handles signaling between the UE201 and the EPC203, providing bearer and connection management. The HSS2032 is used to provide registers to manage functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, and so on. All user data may be sent through the SGW2034, the PGW2035 may provide IP address assignment and other functions for the UE201, and the PCRF2036 is the policy and charging control policy decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
Fig. 3 is a flowchart illustrating a display method of a photographed picture according to the first embodiment. As shown in fig. 3, the method for displaying a captured picture of the present embodiment includes:
at step 310, a camera application is launched.
Optionally, the camera application may be a system application preconfigured on the terminal or a third-party application downloaded by the user.
In step 320, picture display parameters adapted to the camera application are obtained from preset data, where the preset data includes picture display parameters adapted to different camera applications, and the picture display parameters include a frame rate and/or a resolution.
Optionally, acquiring the picture display parameters adapted to the camera application from preset data includes:
acquiring application data of a camera application;
matching at least one target parameter set in preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
at least one value or value range is confirmed from at least one target parameter set according to the current working mode so as to obtain picture display parameters adapted to the camera application.
Optionally, the application data includes at least one of an application interface and camera identification information, and the working mode is at least one of preview, photograph, and video. Optionally, the preset data is pre-stored in the terminal, and may be stored in different locations according to configurations of different cameras, so as to ensure stability and reliability of system operation while storing the preset data. Optionally, before starting the camera application, the method further includes:
and storing preset data associated with each system architecture layer, wherein the preset data corresponding to the frame rate and the preset data of the resolution adapted to the camera application are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the camera application are stored in association with the application framework layer according to the application interface adapted to the camera application. Optionally, the preset data are stored in an XML parsing manner. When the preset data of the resolution are stored, the value or the numerical range of the resolution is set to be smaller than the value or the numerical range of the resolution corresponding to the preset maximum picture size.
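As a non-limiting illustration, the following Java sketch shows one way such XML-parsed preset resolution data could be loaded and capped at the preset maximum picture size. The file layout, the tag and attribute names ("resolution", "app", "width", "height"), and the PresetEntry type are assumptions made for the example only.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class PresetDataStore {
    /** Hypothetical record for one resolution entry adapted to one camera application. */
    public static class PresetEntry {
        public final String cameraApp;   // package name or camera identification info
        public final int width, height;  // resolution value
        public PresetEntry(String cameraApp, int width, int height) {
            this.cameraApp = cameraApp;
            this.width = width;
            this.height = height;
        }
    }

    /**
     * Parses a preset-data XML file associated with one system architecture layer
     * and drops any entry larger than the preset maximum picture size, as required
     * when storing resolution preset data.
     */
    public static List<PresetEntry> load(File xmlFile, int maxWidth, int maxHeight) throws Exception {
        List<PresetEntry> entries = new ArrayList<>();
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(xmlFile);
        NodeList nodes = doc.getElementsByTagName("resolution");
        for (int i = 0; i < nodes.getLength(); i++) {
            Element e = (Element) nodes.item(i);
            int w = Integer.parseInt(e.getAttribute("width"));
            int h = Integer.parseInt(e.getAttribute("height"));
            if (w <= maxWidth && h <= maxHeight) { // cap at the preset maximum picture size
                entries.add(new PresetEntry(e.getAttribute("app"), w, h));
            }
        }
        return entries;
    }
}
```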
The preset data are mainly used to solve problems such as stretching, blurring, and dimness of the shot picture. When the preset data are set and stored with these problems in mind, the process is as follows:
(1) To address the problem of picture blur: corresponding resolution data are configured based on the application interface adapted by the camera application. If a camera application is adapted to the api1 interface, it is adapted by configuring preset data 1 of resolution at the application framework layer as shown in fig. 4; if a camera application is adapted to the api2 interface, it is adapted by configuring preset data 2 of resolution at the application framework layer, so that each camera application has resolution data fully compatible with it. When the preset data of resolution are stored, the value or numerical range of the resolution is set to be smaller than the value or numerical range of the resolution corresponding to the preset maximum picture size. In addition, since this part involves many customization requirements, the preset data can be stored in an XML parsing manner;
(2) To address the problem of picture stretching: whether api1 or api2 is used, the resolution data supported by the hardware abstraction layer (HAL layer) are ultimately obtained, so a resolution synthesis strategy can be integrated to customize a suitable resolution for some cameras, especially third-party cameras, yielding preset data 3 of resolution as shown in fig. 4. In preset data 3 of resolution, third-party cameras with compatibility problems are specified to use the customized resolution data, while cameras without such problems use the original resolution data;
(3) To address the problem of dark pictures: whether api1 or api2 is used, the frame rate data supported by the hardware abstraction layer (HAL layer) are ultimately obtained, so preset data of the frame rate are configured at the hardware abstraction layer as shown in fig. 4, allowing each camera application to customize its own frame rate data. In addition, since this part also involves many customization requirements, the preset data can be stored in an XML parsing manner.
In the process of configuring preset data 1 of resolution, preset data 2 of resolution, preset data 3 of resolution, and the preset data of the frame rate, the corresponding camera application is run, and the correction effect on the shot picture is checked while the preset data are adjusted, so as to obtain preset data adapted to the camera application. The preset data are then organized and reasonably distributed to different areas for storage according to factors such as the available storage space and the configuration characteristics of different cameras, ensuring stable and reliable system operation. The preset data can be stored in an XML parsing manner, which enables efficient and dynamic debugging, makes it convenient and quick to optimize and adapt camera applications, and improves the development efficiency of research and development personnel.
Each parameter set is used to store at least one value or one numerical range of one picture display parameter: the parameter sets for resolution are sets of discrete values, while the parameter sets for the frame rate are sets of numerical ranges. In the process of configuring the preset data, an association is formed between the preset data and the application data of the camera, so that the adapted preset data can be found and used according to data such as the application interface of the camera application or the camera identification information. This solves problems such as stretching, blurring, and dimness of the shot picture and improves the user's experience when using a camera, especially a third-party camera.
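The following sketch illustrates one possible in-memory shape for these target parameter sets: a collection of discrete values for resolution and a collection of numerical ranges for the frame rate, keyed by camera application data. All class, field, and method names here are assumptions made for illustration.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TargetParameterSets {
    /** Resolution parameter set: discrete {width, height} values. */
    public static class ResolutionSet {
        public final List<int[]> values = new ArrayList<>();
    }

    /** Frame-rate parameter set: {minFps, maxFps} numerical ranges. */
    public static class FrameRateSet {
        public final List<int[]> ranges = new ArrayList<>();
    }

    // Association between camera application data (e.g. package name, camera id,
    // or adapted application interface) and its target parameter sets.
    private final Map<String, ResolutionSet> resolutionByApp = new HashMap<>();
    private final Map<String, FrameRateSet> frameRateByApp = new HashMap<>();

    public void putResolution(String appKey, int width, int height) {
        resolutionByApp.computeIfAbsent(appKey, k -> new ResolutionSet())
                .values.add(new int[] {width, height});
    }

    public void putFrameRateRange(String appKey, int minFps, int maxFps) {
        frameRateByApp.computeIfAbsent(appKey, k -> new FrameRateSet())
                .ranges.add(new int[] {minFps, maxFps});
    }

    public ResolutionSet resolutionFor(String appKey) { return resolutionByApp.get(appKey); }
    public FrameRateSet frameRateFor(String appKey) { return frameRateByApp.get(appKey); }
}
```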
Based on the storage manner of the preset data, optionally, matching at least one target parameter set in the preset data according to the application data includes: determining a system architecture layer for calling the picture display parameters according to the application data of the camera application, wherein the system architecture layer includes a hardware abstraction layer and/or an application framework layer; and determining at least one target parameter set based on the preset data associated with the system architecture layer. Optionally, the system architecture layer for calling the picture display parameters may be determined according to the identification information of the camera, and then at least one target parameter set may be determined based on the preset data associated with that system architecture layer.
Optionally, determining at least one target parameter set based on the preset data associated with the system architecture layer includes: when the system architecture layer for calling the picture display parameters is the application framework layer, determining the target parameter set from the preset data associated with the application framework layer according to the application interface adapted to the camera application. Optionally, the number of application interfaces is not limited to the number shown in fig. 4, and at the application framework layer at least one parameter set is configured for each application interface.
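A minimal sketch of this routing step is shown below; the layer-selection rule, the api1/api2 markers, and the shape of the framework-layer preset table are assumptions used only to make the example self-contained.

```java
import java.util.List;
import java.util.Map;

public class PresetRouter {
    public enum Layer { HARDWARE_ABSTRACTION, APPLICATION_FRAMEWORK }
    public enum ApiInterface { API1, API2 }

    /**
     * Decide which system architecture layer the picture display parameters are
     * called from. The flag here stands in for whatever is derived from the
     * camera application's data (application interface or identification info).
     */
    public static Layer resolveLayer(boolean resolutionConfiguredAtFramework) {
        return resolutionConfiguredAtFramework ? Layer.APPLICATION_FRAMEWORK
                                               : Layer.HARDWARE_ABSTRACTION;
    }

    /**
     * At the application framework layer, choose the target parameter set stored
     * for the application interface adapted by the camera application, e.g.
     * preset data 1 for api1 and preset data 2 for api2 as in fig. 4.
     * Each entry of the returned list is a {width, height} resolution value.
     */
    public static List<int[]> resolveFrameworkResolutions(
            ApiInterface adaptedInterface,
            Map<ApiInterface, List<int[]>> frameworkPresets) {
        return frameworkPresets.get(adaptedInterface);
    }
}
```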
Optionally, the working mode includes at least one of a photographing mode, a video recording mode, or a preview mode, and confirming at least one value or value range from the at least one target parameter set according to the current working mode includes: confirming at least one resolution value from the at least one target parameter set according to the current working mode; and/or confirming at least one frame rate value range from the at least one target parameter set according to the current working mode. For example, in the preview mode, an adapted resolution value and an adapted frame rate range can be acquired at the same time, so that the picture is clear and bright; in the photographing mode, an adapted resolution value is acquired to display the image, so that the picture size is appropriate and stretching deformation is avoided.
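A sketch of confirming values per working mode follows; the policy shown (preview takes both a resolution and a frame-rate range, photographing takes a resolution only, and the first entry is picked) mirrors the example above but is otherwise an assumption.

```java
import java.util.List;

public class ModeSelector {
    public enum WorkMode { PREVIEW, PHOTOGRAPH, VIDEO }

    /** Result of confirming values from the target parameter sets for one mode. */
    public static class Selection {
        public int[] resolution;     // {width, height}, or null if not confirmed
        public int[] frameRateRange; // {minFps, maxFps}, or null if not confirmed
    }

    /**
     * Confirm at least one resolution value and/or one frame-rate range from the
     * matched target parameter sets according to the current working mode.
     * Picking the first entry is an illustrative placeholder policy.
     */
    public static Selection select(WorkMode mode,
                                   List<int[]> resolutionSet,
                                   List<int[]> frameRateSet) {
        Selection s = new Selection();
        switch (mode) {
            case PREVIEW:
                // Preview: an adapted resolution plus a frame-rate range keeps the
                // picture both correctly sized and bright enough.
                s.resolution = resolutionSet.get(0);
                s.frameRateRange = frameRateSet.get(0);
                break;
            case PHOTOGRAPH:
            case VIDEO:
                // Photographing (and, here, recording): an adapted resolution value
                // keeps the picture size appropriate and avoids stretching.
                s.resolution = resolutionSet.get(0);
                break;
        }
        return s;
    }
}
```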
In some embodiments, after the starting the camera application, further comprising:
determining whether the camera application is a system application;
if not, executing step 320 to obtain picture display parameters adapted to the camera application from preset data; and/or if so, displaying a shooting picture according to picture display parameters configured by the camera application.
Optionally, the system is configured with picture display parameters suitable for system applications, and for third party camera applications, preset data is uniformly configured to provide adapted picture display parameters.
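On Android, one possible way to implement the system-application check, sketched below, is to inspect ApplicationInfo.FLAG_SYSTEM for the camera application's package; treating non-system packages as candidates for the uniformly configured preset data follows the paragraph above, while the class and method names are assumptions.

```java
import android.content.Context;
import android.content.pm.ApplicationInfo;
import android.content.pm.PackageManager;

public class CameraAppChecker {
    /**
     * Returns true if the camera application identified by packageName is a
     * system application. Unknown packages are treated as non-system here,
     * which is an assumption of this sketch.
     */
    public static boolean isSystemApplication(Context context, String packageName) {
        try {
            ApplicationInfo info = context.getPackageManager()
                    .getApplicationInfo(packageName, 0);
            return (info.flags & ApplicationInfo.FLAG_SYSTEM) != 0;
        } catch (PackageManager.NameNotFoundException e) {
            return false;
        }
    }
}
```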
In step 330, the shot picture is displayed according to the acquired picture display parameters to correct the size and/or quality of the shot picture. Displaying the shot picture with the acquired adapted picture display parameters corrects the size and/or image quality of the shot picture and improves the shooting experience.
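For the legacy api1 path, applying the confirmed values before display could reduce to setting the preview size and frame-rate range on the camera parameters, as sketched below; this is one possible way of applying the acquired parameters, not the only one contemplated here.

```java
import android.hardware.Camera;

public class PictureCorrector {
    /**
     * Apply the acquired picture display parameters before starting preview so
     * that the shot picture is displayed at a corrected size and/or image quality.
     * fpsRange values follow the android.hardware.Camera convention (fps * 1000);
     * a preview surface is assumed to have been set by the caller beforehand.
     */
    @SuppressWarnings("deprecation")
    public static void apply(Camera camera, int width, int height, int[] fpsRange) {
        Camera.Parameters params = camera.getParameters();
        if (width > 0 && height > 0) {
            params.setPreviewSize(width, height);                // corrects picture size
        }
        if (fpsRange != null) {
            params.setPreviewFpsRange(fpsRange[0], fpsRange[1]); // corrects brightness/fluency
        }
        camera.setParameters(params);
        camera.startPreview();
    }
}
```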
As described above, the shot picture display method of the present application is applied to a mobile terminal: after a camera application is started, picture display parameters adapted to the camera application are obtained from preset data, where the preset data include picture display parameters adapted to different camera applications and the picture display parameters include a frame rate and/or a resolution; the shot picture is then displayed according to the acquired picture display parameters so as to correct the size and/or image quality of the shot picture. In this way, a camera installed on the terminal can be matched with suitable picture display parameters, achieving the effect of correcting the size and/or image quality of the shot picture and avoiding problems such as stretching, blurring, and dim pictures caused by compatibility issues between the terminal and different cameras; the use experience of third-party cameras in particular can be significantly improved.
Second embodiment
Fig. 5 is a flowchart illustrating a display method of a photographed picture according to the second embodiment. As shown in fig. 5, the method for displaying a captured picture of the present embodiment includes:
step 410, starting a camera application;
optionally, the camera application may be a system application configured by the terminal, and may also be a third-party application downloaded by the user.
Step 420, determining whether the size and/or the image quality of a shooting picture of the camera application is matched with the terminal parameters;
optionally, determining that the size and/or the image quality of a shot picture of the camera application is adapted to the terminal parameter includes:
judging whether the camera application is a preset camera application or not;
and if so, determining that the size and/or the image quality of the shooting picture of the camera application is not matched with the terminal parameters.
Optionally, when the size and/or the image quality of the shot picture are adapted to the terminal parameters, problems such as picture stretching, blurring, or dimness do not occur. In general, these problems tend to occur with third-party cameras, and especially with some of them. Therefore, whether the size and/or the image quality of the shot picture of the camera application is adapted to the terminal parameters can be determined by judging whether the camera application is a preset camera application, for example whether a camera identifier (such as a camera name) is a preset identifier: if it is, it is determined that the size and/or the image quality of the shot picture of the camera application is not adapted to the terminal parameters; otherwise, it is adapted. Alternatively, the picture parameters used by the camera application can be analyzed directly, for example by judging whether the size of the shot picture is consistent with the proportions of the screen, whether the resolution of the shot picture is consistent with the resolution of the terminal, and whether the frame rate of the shot picture falls within the frame rate range of the terminal; whether the size and/or the image quality of the shot picture is adapted to the terminal parameters is then obtained from the analysis result.
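A sketch of the direct-analysis branch, comparing the shot picture's aspect ratio and resolution against the terminal's screen, is given below; the tolerance value, the simplified resolution check, and the method names are assumptions of the example.

```java
import android.content.Context;
import android.util.DisplayMetrics;

public class AdaptationChecker {
    /**
     * Judge whether a shot picture of the given size is adapted to the terminal:
     * its aspect ratio should match the screen's, and its resolution should not
     * exceed what the terminal can display. The 0.01 tolerance is an assumption.
     */
    public static boolean isAdapted(Context context, int pictureWidth, int pictureHeight) {
        DisplayMetrics dm = context.getResources().getDisplayMetrics();
        int screenLong = Math.max(dm.widthPixels, dm.heightPixels);
        int screenShort = Math.min(dm.widthPixels, dm.heightPixels);

        float screenRatio = (float) screenLong / screenShort;
        float pictureRatio = (float) Math.max(pictureWidth, pictureHeight)
                / Math.min(pictureWidth, pictureHeight);

        boolean ratioMatches = Math.abs(screenRatio - pictureRatio) < 0.01f;
        boolean resolutionFits = Math.max(pictureWidth, pictureHeight) <= screenLong
                && Math.min(pictureWidth, pictureHeight) <= screenShort;
        return ratioMatches && resolutionFits;
    }
}
```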
Step 430, if not (that is, the shooting picture is not adapted to the terminal parameters), determining picture display parameters for correcting the shot picture, wherein the picture display parameters comprise a frame rate and/or a resolution;
alternatively, determining picture display parameters for correcting a captured picture includes: acquiring application data of the camera application, wherein the application data comprises at least one of an application interface and identification information; matching at least one target parameter set in preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter; and confirming at least one value or value range from at least one target parameter set according to the current working mode so as to obtain picture display parameters for correcting the shot picture.
Optionally, matching at least one target parameter set in the preset data according to the application data includes: determining a system architecture layer for calling picture display parameters according to the identification information, wherein the system architecture layer comprises a hardware abstraction layer and/or an application framework layer; at least one target parameter set is determined based on preset data associated with a system architecture layer.
Optionally, determining at least one target parameter set based on preset data associated with the system architecture layer includes:
and when the system architecture layer for calling the picture display parameters is the application framework layer, determining a target parameter set from preset data associated with the application framework layer according to an application interface adapted to the camera application.
Optionally, before starting the camera application, the method further includes: storing preset data associated with each system architecture layer, wherein the preset data corresponding to the frame rate and the preset data of the resolution adapted to the camera application are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the camera application are stored in association with the application framework layer according to the application interface adapted to the camera application. Optionally, the picture display parameters are stored in an XML parsing manner. Optionally, when the preset data of the resolution are stored, the value or the numerical range of the resolution is set to be smaller than or equal to the value or the numerical range of the resolution corresponding to the preset maximum picture size.
Optionally, the operation mode includes at least one of a photo mode, a video mode or a preview mode, and the determining at least one value or value range from at least one target parameter set according to the current operation mode includes: confirming at least one resolution value from at least one target parameter set according to the current working mode; and/or, determining at least one frame rate value range from at least one target parameter set according to the current working mode.
The detailed implementation process of the above steps is described in the first embodiment about step 320, and is not described herein again.
In step 440, the shooting picture is displayed according to the determined parameter values. Displaying the shot picture with the acquired adapted parameter values corrects the size and/or image quality of the shot picture and improves the shooting experience.
As described above, the shot picture display method of the present application is applied to a mobile terminal: after a camera application is started, it is determined whether the size and/or the image quality of a shot picture of the camera application is adapted to the terminal parameters; if not, picture display parameters for correcting the shot picture are determined, where the picture display parameters include a frame rate and/or a resolution; the shot picture is then displayed according to the determined parameter values. In this way, a camera installed on the terminal can be matched with suitable picture display parameters, achieving the effect of correcting the size and/or image quality of the shot picture and avoiding problems such as stretching, blurring, and dim pictures caused by compatibility issues between the terminal and different cameras; the use experience of third-party cameras in particular can be significantly improved.
The present application further provides a mobile terminal device, where the terminal device includes a memory and a processor, and the memory stores a display program of a captured image, and the display program of the captured image is executed by the processor to implement the steps of the method for displaying a captured image in any of the above embodiments.
The present application further provides a computer-readable storage medium, in which a display program of a captured image is stored, and when being executed by a processor, the display program of the captured image realizes the steps of the method for displaying the captured image in any one of the above embodiments.
In the embodiments of the mobile terminal and the computer-readable storage medium provided in the present application, all technical features of the embodiments of the display method of the shot image are included, and the expanding and explaining contents of the specification are basically the same as those of the embodiments of the method, and are not described herein again.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the present application, the same or similar term concepts, technical solutions and/or application scenario descriptions will be generally described only in detail at the first occurrence, and when the description is repeated later, the detailed description will not be repeated in general for brevity, and when understanding the technical solutions and the like of the present application, reference may be made to the related detailed description before the description for the same or similar term concepts, technical solutions and/or application scenario descriptions and the like which are not described in detail later.
In the present application, each embodiment is described with an emphasis on the description, and reference may be made to the description of other embodiments for parts that are not described or recited in any embodiment.
The technical features of the technical solution of the present application may be arbitrarily combined, and for brevity of description, all possible combinations of the technical features in the embodiments are not described, however, as long as there is no contradiction between the combinations of the technical features, the scope of the present application should be considered as being described in the present application.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
The above description is only a preferred embodiment of the present application and is not intended to limit the scope of the present application. Any equivalent structural or process modification made using the contents of the specification and drawings of the present application, and any direct or indirect application in other related technical fields, are likewise included within the scope of the present application.

Claims (14)

1. A display method of a shot picture is applied to a mobile terminal, and is characterized by comprising the following steps:
starting a camera application;
acquiring picture display parameters adapted to the camera application from preset data, wherein the picture display parameters comprise a frame rate and/or a resolution;
the acquiring of the picture display parameters adapted to the camera application from the preset data includes:
acquiring application data of the camera application;
matching at least one target parameter set in the preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
confirming at least one value or value range from the at least one target parameter set according to the current working mode to obtain picture display parameters adapted to the camera application;
and displaying the shooting picture according to the acquired picture display parameters so as to correct the size and/or the picture quality of the shooting picture.
2. The method as claimed in claim 1, wherein the matching at least one target parameter set in the preset data according to the application data comprises:
determining a system architecture layer for calling the picture display parameters according to the application data, wherein the system architecture layer comprises a hardware abstraction layer and/or an application framework layer;
determining the at least one target parameter set based on the preset data associated with the system architecture layer.
3. The method as claimed in claim 2, wherein the determining the at least one target parameter set based on the preset data associated with the system architecture layer comprises:
and when the system architecture layer for calling the picture display parameters is an application framework layer, determining the target parameter set from the preset data associated with the application framework layer according to an application interface adapted to the camera application.
4. The method for displaying a captured picture according to claim 2 or 3, wherein before the starting of the camera application, the method further comprises:
and storing preset data associated with each system architecture layer, wherein the preset data of the frame rate and the preset data of the resolution adapted to one part of the camera applications are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the other part of the camera applications are stored in association with the application framework layer according to the application interface adapted to the camera application.
5. The method for displaying a captured picture according to claim 4, further comprising:
when the preset data of the resolution is stored, the value or the numerical range of the resolution is set to be smaller than or equal to the value or the numerical range of the resolution corresponding to the preset maximum picture size.
6. The method as claimed in claim 1, wherein the working mode includes at least one of a photo mode, a video mode or a preview mode, and the confirming at least one value or value range from the at least one target parameter set according to the current working mode comprises:
confirming at least one resolution value from the at least one target parameter set according to the current working mode; and/or,
confirming at least one frame rate value range from the at least one target parameter set according to the current working mode.
7. The method for displaying a captured picture according to claim 1, further comprising, after the starting of the camera application:
determining whether the camera application is a system application;
if not, executing the step of acquiring the picture display parameters adapted to the camera application from the preset data; and/or,
if so, displaying a shooting picture according to the picture display parameters configured by the camera application.
8. A method for displaying a captured picture, comprising:
starting a camera application;
determining whether the size and/or the image quality of a shooting picture of the camera application is matched with a terminal parameter;
if not, determining picture display parameters for correcting the shot picture, and acquiring picture display parameters adapted to the camera application from preset data, wherein the picture display parameters comprise a frame rate and/or a resolution;
the acquiring of the picture display parameters adapted to the camera application from the preset data includes:
acquiring application data of the camera application;
matching at least one target parameter set in the preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
confirming at least one value or value range from the at least one target parameter set according to the current working mode to obtain picture display parameters adapted to the camera application;
and displaying the shooting picture according to the determined parameter value.
9. The method according to claim 8, wherein the determining whether the size and/or the quality of the captured image of the camera application is adapted to the terminal parameter comprises:
judging whether the camera application is a preset camera application or not;
and if so, determining that the size and/or the image quality of the shooting picture of the camera application is not matched with the terminal parameters.
10. The captured picture displaying method according to claim 8, wherein the determining picture display parameters for correcting the captured picture includes:
acquiring application data of the camera application;
matching at least one target parameter set in preset data according to the application data, wherein each target parameter set is respectively used for storing at least one value or a numerical range of one picture display parameter;
and confirming at least one value or value range from the at least one target parameter set according to the current working mode so as to obtain picture display parameters for correcting the shooting picture.
11. The method as claimed in claim 10, wherein the matching at least one target parameter set in the preset data according to the application data comprises:
determining a system architecture layer for calling the picture display parameters according to the application data, wherein the system architecture layer comprises a hardware abstraction layer and/or an application framework layer;
determining the at least one target parameter set based on the preset data associated with the system architecture layer.
12. The method for displaying a captured picture according to claim 11, wherein before the starting of the camera application, the method further comprises:
and storing preset data associated with each system architecture layer, wherein the preset data of the frame rate and the preset data of the resolution adapted to one part of the camera applications are stored in association with the hardware abstraction layer, and the preset data of the resolution adapted to the other part of the camera applications are stored in association with the application framework layer according to the application interface adapted to the camera application.
13. A mobile terminal, characterized in that the mobile terminal comprises a memory and a processor, wherein the memory has stored thereon a display program of a captured picture, and the display program of the captured picture, when executed by the processor, implements the steps of the display method of the captured picture according to any one of claims 1 to 12.
14. A readable storage medium, characterized in that the readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the display method of a captured picture according to any one of claims 1 to 12.
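By way of illustration only, and without limiting the claims above, the layer-based lookup of claims 2 to 4 and 11 to 12 could be organized as in the following Kotlin-style sketch; the names ArchLayer, ParamSet, halPresets, frameworkPresets and targetParamSet are hypothetical and do not describe the patented implementation.

// Hypothetical sketch: preset data stored in association with the system
// architecture layer from which the picture display parameters are called.
enum class ArchLayer { HARDWARE_ABSTRACTION, APPLICATION_FRAMEWORK }

data class ParamSet(
    val resolutions: List<Pair<Int, Int>> = emptyList(),
    val frameRateRanges: List<IntRange> = emptyList()
)

// HAL-side presets keyed by application package name; framework-side presets
// keyed by the application interface the camera application is adapted to.
val halPresets: Map<String, ParamSet> = mapOf(
    "com.example.cameraA" to ParamSet(resolutions = listOf(4000 to 3000), frameRateRanges = listOf(15..30))
)
val frameworkPresets: Map<String, ParamSet> = mapOf(
    "android.hardware.camera2" to ParamSet(resolutions = listOf(1920 to 1080))
)

fun targetParamSet(layer: ArchLayer, key: String): ParamSet? = when (layer) {
    ArchLayer.HARDWARE_ABSTRACTION -> halPresets[key]
    ArchLayer.APPLICATION_FRAMEWORK -> frameworkPresets[key]
}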
CN202110376895.9A 2021-04-08 2021-04-08 Shot picture display method, mobile terminal and storage medium Active CN113179369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110376895.9A CN113179369B (en) 2021-04-08 2021-04-08 Shot picture display method, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN113179369A (en) 2021-07-27
CN113179369B (en) 2023-03-21

Family

ID=76924550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110376895.9A Active CN113179369B (en) 2021-04-08 2021-04-08 Shot picture display method, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113179369B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114727011B (en) * 2022-03-07 2024-06-11 深圳创维-Rgb电子有限公司 Image pickup optimization method, device, electronic equipment and readable storage medium
CN114640887A (en) * 2022-03-22 2022-06-17 深圳创维-Rgb电子有限公司 Display method, device, equipment and computer readable storage medium
CN115914706A (en) * 2022-10-10 2023-04-04 安徽康佳电子有限公司 Camera image quality parameter matching method, storage medium and computer system
CN116528063B (en) * 2023-07-04 2023-11-03 荣耀终端有限公司 Shooting method, readable storage medium and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106899795A (en) * 2015-12-17 2017-06-27 阿里巴巴集团控股有限公司 A kind of camera hardware parameter call, method to set up, device and camera applications system
CN108848309A (en) * 2018-07-13 2018-11-20 维沃移动通信有限公司 A kind of camera programm starting method and mobile terminal

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100738540B1 (en) * 2005-08-30 2007-07-11 삼성전자주식회사 Method and apparatus of interface in multitasking system
CN106657757B (en) * 2015-11-04 2020-06-09 阿里巴巴集团控股有限公司 Image preview method and device for camera application and camera application system
CN105959530A (en) * 2016-04-26 2016-09-21 乐视控股(北京)有限公司 Method and system for invoking a camera function according to an individualized property of an application
CN106502513A (en) * 2016-10-31 2017-03-15 珠海我爱拍科技有限公司 A kind of split screen display available technology based on Android system
CN107172345B (en) * 2017-04-07 2020-02-04 深圳市金立通信设备有限公司 Image processing method and terminal
CN111279679A (en) * 2017-08-31 2020-06-12 深圳传音通讯有限公司 Square cutting photographing method, photographing system and photographing device
CN110753187B (en) * 2019-10-31 2021-06-01 芋头科技(杭州)有限公司 Camera control method and device
CN110995994B (en) * 2019-12-09 2021-09-14 上海瑾盛通信科技有限公司 Image shooting method and related device

Also Published As

Publication number Publication date
CN113179369A (en) 2021-07-27

Similar Documents

Publication Publication Date Title
CN107093418B (en) Screen display method, computer equipment and storage medium
CN113179369B (en) Shot picture display method, mobile terminal and storage medium
CN110784898A (en) Network switching method, mobile terminal and computer readable storage medium
CN107705247B (en) Image saturation adjusting method, terminal and storage medium
CN107172605B (en) Emergency call method, mobile terminal and computer readable storage medium
CN109710159B (en) Flexible screen response method and device and computer readable storage medium
CN109672822A (en) A kind of method for processing video frequency of mobile terminal, mobile terminal and storage medium
CN107896304B (en) Image shooting method and device and computer readable storage medium
CN108234893B (en) Brightness adjusting method, brightness adjusting equipment and computer readable storage medium
CN110058827A (en) A kind of vice screen wallpaper treatment method, mobile terminal and the storage medium of mobile terminal
CN110278481B (en) Picture-in-picture implementation method, terminal and computer readable storage medium
CN112188058A (en) Video shooting method, mobile terminal and computer storage medium
CN108900779A (en) Initial automatic exposure convergence method, mobile terminal and computer readable storage medium
CN109510941B (en) Shooting processing method and device and computer readable storage medium
CN110955397A (en) Method for setting frame rate of game terminal, game terminal and storage medium
CN108282608B (en) Multi-region focusing method, mobile terminal and computer readable storage medium
CN113347372A (en) Shooting light supplement method, mobile terminal and readable storage medium
CN112153305A (en) Camera starting method, mobile terminal and computer storage medium
CN108495033B (en) Photographing regulation and control method and device and computer readable storage medium
CN107844353B (en) Display method, terminal and computer readable storage medium
CN107743204B (en) Exposure processing method, terminal, and computer-readable storage medium
CN112532838B (en) Image processing method, mobile terminal and computer storage medium
CN112040134B (en) Micro-holder shooting control method and device and computer readable storage medium
CN113542605A (en) Camera shooting control method, mobile terminal and storage medium
CN108335301B (en) Photographing method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant