CN116095220B - Parameter determination method and related device
- Publication number: CN116095220B (application number CN202210943024.5A)
- Authority
- CN
- China
- Prior art keywords: images, similarity, image, group, frame
- Legal status: Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72427—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/7243—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
Abstract
The application provides a parameter determination method and a related device, which help reduce jitter and flicker in game images and improve the user experience. The method comprises: determining the similarity of at least one group of images among the first N frames of images of a target application, where each group of images comprises two adjacent frames among the first N frames, and N is an integer greater than or equal to 2; predicting a target similarity based on the similarity of the at least one group of images, where the target similarity represents the similarity between the (N+1)th frame image of the target application and the images included in the at least one group of images; and determining the number of skipped frames based on the target similarity.
Description
Technical Field
The present application relates to the field of terminals, and in particular, to a parameter determination method and a related device.
Background
With the continued expansion of the mobile terminal market, applications on terminals are proliferating, and some of them, such as game applications and video applications, are evolving toward higher image quality and higher frame rates. Taking game applications as an example, a high frame rate brings users a better gaming experience, but it also increases the drawing burden on the graphics processor (graphics processing unit, GPU) and the power consumption of the terminal, causing problems such as terminal heating and performance degradation.
The similarity of two adjacent frames differs across game scenes. For example, in a competitive combat scene the picture changes greatly, so the similarity of two adjacent frames is low; in a scene such as entering the game lobby or a game settings screen, the picture changes little, so the similarity of two adjacent frames is high. In general, the terminal device may employ frame skipping (also referred to as frame dropping) to reduce GPU power consumption: the GPU uses a fixed frame-skip parameter to adjust the frame rate, e.g., it draws only every other frame.
However, this method does not distinguish the similarity of two adjacent frames in different game scenes, and may therefore cause game images to jitter and flicker.
Disclosure of Invention
The application provides a parameter determination method and a related device, which help reduce jitter and flicker in game images and improve the user experience.
In a first aspect, a parameter determination method is provided. The method comprises: determining the similarity of at least one group of images among the first N frames of images of a target application, where each group of images comprises two adjacent frames among the first N frames, and N is an integer greater than or equal to 2; predicting a target similarity based on the similarity of the at least one group of images, where the target similarity represents the similarity between the (N+1)th frame image of the target application and the images included in the at least one group of images; and determining the number of skipped frames based on the target similarity.
In the present application, the Nth frame image is the image currently displayed by the target application, and the first N frames (including the Nth frame) are the history frames. The terminal device can determine the number of skipped frames according to the target similarity, thereby adjusting the frame rate at which it draws images. For example, if the predicted target similarity is low, the terminal device can set a smaller frame-skip parameter (i.e., number of skipped frames), which helps avoid jitter and flicker in the images of the target application (e.g., a game application); if the predicted target similarity is high, the terminal device can set a larger frame-skip parameter, so that it draws fewer images and reduces the power consumption caused by drawing.
With reference to the first aspect, in certain implementations of the first aspect, predicting the target similarity based on the similarity of the at least one group of images includes: calculating the average of the similarities of the at least one group of images; and predicting the target similarity from the average.
With reference to the first aspect, in certain implementations of the first aspect, the target similarity is linear with the average value.
In the application, the target similarity is a linear function of the average, so the calculation is simple, which helps reduce the power consumption of the terminal device.
With reference to the first aspect, in certain implementations of the first aspect, the target similarity and the average value satisfy:

$$S_{\text{target}} = c_1 \cdot \frac{1}{T}\sum_{i=1}^{T} S_i + c_2$$

where $S_{\text{target}}$ represents the target similarity, $S_i$ represents the similarity of the i-th group of images in the at least one group of images, $T$ represents the number of groups in the at least one group of images, $T$ is a positive integer, and $c_1$ and $c_2$ are preset values.
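For illustration only, a minimal Python sketch of this linear prediction follows; the constants c1 and c2 and the sample similarity values are hypothetical placeholders, since the application only states that they are preset:

```python
def predict_target_similarity(similarities, c1=1.0, c2=0.0):
    """Predict S_target as a linear function of the mean of the
    per-group similarities S_i, per the relation above.

    similarities: the S_i values for the T groups of adjacent frames.
    c1, c2: preset constants (the defaults here are made up).
    """
    t = len(similarities)
    average = sum(similarities) / t
    return c1 * average + c2


# Example with made-up similarities for three groups of adjacent frames.
print(predict_target_similarity([0.92, 0.95, 0.94]))  # -> about 0.9367
```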
With reference to the first aspect, in certain implementations of the first aspect, determining the similarity of at least one group of images among the first N frames of images of the target application includes: determining a first gray value of a first image and a second gray value of a second image of the i-th group of images in the at least one group of images; and determining the similarity of the i-th group of images according to the first gray value and the second gray value. The first gray value is determined based on the pixel values of the first image, and the second gray value is determined based on the pixel values of the second image; i = 1, …, T, where T represents the number of groups in the at least one group of images, and i and T are positive integers.
With reference to the first aspect, in certain implementations of the first aspect, the similarity of the i-th group of images is linear in the difference between the first gray value and the second gray value.
In the application, the terminal device fits the similarity of two adjacent frames from their gray values, which helps reduce the computational complexity and the computing cost.
With reference to the first aspect, in certain implementations of the first aspect, the similarity of the i-th group of images and the difference between the first gray value and the second gray value satisfy:

$$S_i = 1 - \alpha (L_1 - L_2)$$

where $S_i$ represents the similarity of the i-th group of images, $\alpha$ is a preset value, $L_1$ represents the first gray value, and $L_2$ represents the second gray value.
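As a sketch only (the value of α is preset and not given in the application; 0.001 below is an assumed placeholder), this relation could be computed as:

```python
def group_similarity(l1, l2, alpha=0.001):
    """Similarity of one group of adjacent frames from their gray
    values, per S_i = 1 - alpha * (L1 - L2).

    The formula is applied as written; a practical implementation
    might take abs(l1 - l2) so that frame order does not matter.
    """
    return 1.0 - alpha * (l1 - l2)


# Example: two adjacent frames with mean gray values 120.0 and 118.5.
print(group_similarity(120.0, 118.5))  # -> 0.9985
```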
With reference to the first aspect, in some implementations of the first aspect, the first gray value is obtained by downsampling the pixel values of the first image and then converting them, and the second gray value is obtained by downsampling the pixel values of the second image and then converting them; alternatively, the first gray value is obtained by converting the pixel values of the first image into gray values and then downsampling, and the second gray value is obtained by converting the pixel values of the second image into gray values and then downsampling.
In the present application, taking the first image as an example: the first image is a color image containing a plurality of pixel values, each of which comprises red (R), green (G), and blue (B) color components. The terminal device may downsample the pixel values of the first image and then convert the downsampled pixel values into the first gray value; or it may first convert the pixel values of the color image drawn by the GPU into corresponding gray values and then downsample those gray values to obtain the first gray value. Downsampling and converting to gray values help reduce the power consumption of similarity detection on the terminal device.
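The two processing orders could look like the following NumPy-based sketch; the stride and the BT.601 luma weights are assumptions, as the application does not specify the downsampling factor or the conversion coefficients:

```python
import numpy as np

BT601 = np.array([0.299, 0.587, 0.114])  # assumed RGB-to-gray weights


def gray_downsample_first(rgb, stride=4):
    """Downsample the pixel values first, then convert to gray and
    average to a single gray value for the frame."""
    small = rgb[::stride, ::stride].astype(np.float64)
    return float((small @ BT601).mean())


def gray_convert_first(rgb, stride=4):
    """Convert every pixel to gray first, then downsample and average."""
    gray = rgb.astype(np.float64) @ BT601
    return float(gray[::stride, ::stride].mean())


# Example on a random 64x64 RGB frame; both orders give close values.
frame = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(gray_downsample_first(frame), gray_convert_first(frame))
```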
With reference to the first aspect, in certain implementations of the first aspect, determining the number of skipped frames based on the target similarity includes: determining the number of skipped frames corresponding to the target similarity based on a predefined correspondence between similarity value ranges and numbers of skipped frames.
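A sketch of such a correspondence follows; the ranges and skip counts are invented for illustration, since the application leaves the predefined mapping unspecified:

```python
# Hypothetical correspondence between similarity ranges and skip counts.
SKIP_TABLE = [
    (0.98, 2),  # target similarity >= 0.98 -> skip two frames
    (0.90, 1),  # target similarity >= 0.90 -> skip one frame
]


def skipped_frames(target_similarity):
    """Look up the number of skipped frames for a target similarity."""
    for lower_bound, skips in SKIP_TABLE:
        if target_similarity >= lower_bound:
            return skips
    return 0  # low similarity: draw every frame


print(skipped_frames(0.95))  # -> 1
```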
With reference to the first aspect, in some implementations of the first aspect, in a case where the determined number of skipped frames is greater than zero, the method further includes: intercepting the drawing of the (N+1)th frame image.
With reference to the first aspect, in certain implementations of the first aspect, before determining the similarity of at least one group of images among the first N frames of images of the target application, the method further includes: detecting whether the target application is in a supported-application list, and detecting whether the frame rate of the target application is greater than or equal to a preset threshold. Determining the similarity of at least one group of images among the first N frames of images of the target application then includes: determining that similarity when the target application is in the supported-application list and its frame rate is greater than or equal to the preset threshold.
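A minimal sketch of this gating check, assuming example list contents and a 60 FPS threshold (both made up):

```python
SUPPORTED_APPS = {"game_app_a", "video_app_b"}  # assumed example list


def similarity_detection_enabled(app_name, frame_rate, min_fps=60):
    """Run similarity detection only when the target application is in
    the supported-application list and its frame rate is at least the
    preset threshold."""
    return app_name in SUPPORTED_APPS and frame_rate >= min_fps


print(similarity_detection_enabled("game_app_a", 90))  # -> True
```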
In a second aspect, a parameter determination apparatus is provided, configured to perform the method in any of the possible implementations of the first aspect. In particular, it includes modules for performing the method in any of the possible implementations of the first aspect.
In a third aspect, there is provided another parameter determination apparatus comprising a processor and a memory, the processor being coupled to the memory, the memory being operable to store a computer program, the processor being operable to invoke and execute the computer program in the memory to implement the method of any of the possible implementations of the first aspect.
In one implementation, the parameter determination apparatus is a terminal device. When the parameter determination apparatus is a terminal device, the communication interface may be a transceiver or an input/output interface.

In another implementation, the parameter determination apparatus is a chip configured in the terminal device. When the parameter determination apparatus is a chip configured in the terminal device, the communication interface may be an input/output interface.
In a fourth aspect, there is provided a processor comprising: input circuit, output circuit and processing circuit. The processing circuitry is configured to receive signals via the input circuitry and to transmit signals via the output circuitry such that the processor performs the method of any one of the possible implementations of the first aspect described above.
In a specific implementation, the processor may be a chip, the input circuit may be an input pin, the output circuit may be an output pin, and the processing circuit may be a transistor, a gate circuit, a flip-flop, various logic circuits, or the like. The input signal received by the input circuit may be, for example and without limitation, received and input by a receiver; the signal output by the output circuit may be, for example and without limitation, output to and transmitted by a transmitter; and the input circuit and the output circuit may be the same circuit, which serves as the input circuit and the output circuit at different times. The specific implementations of the processor and the various circuits are not limited by the present application.
In a fifth aspect, a processing device is provided that includes a processor and a memory. The processor is configured to read instructions stored in the memory and to receive signals via the receiver and to transmit signals via the transmitter to perform the method of any one of the possible implementations of the first aspect.
Optionally, the processor is one or more and the memory is one or more.
Alternatively, the memory may be integrated with the processor or the memory may be separate from the processor.
In a specific implementation, the memory may be a non-transitory memory, for example a read-only memory (ROM); it may be integrated on the same chip as the processor or disposed separately on a different chip.
It should be appreciated that in the related data interaction process, for example, transmitting indication information may be a process of outputting the indication information from the processor, and receiving capability information may be a process of the processor receiving the input capability information. Specifically, data output by the processor may be output to the transmitter, and input data received by the processor may come from the receiver. The transmitter and the receiver may be collectively referred to as a transceiver.
The processing means in the fifth aspect may be a chip, and the processor may be implemented by hardware or by software, and when implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like; when implemented in software, the processor may be a general-purpose processor, implemented by reading software code stored in a memory, which may be integrated in the processor, or may reside outside the processor, and exist separately.
In a sixth aspect, there is provided a computer program product comprising: computer program code which, when run, causes a computer to perform the method of any one of the possible implementations of the first aspect described above.
In a seventh aspect, a computer readable storage medium is provided, the computer readable storage medium storing a computer program which, when executed, causes a computer to perform the method of any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device to which the embodiment of the present application is applicable;
FIG. 2 is a block diagram of a software architecture of a terminal device to which embodiments of the present application are applicable;
FIG. 3 is a schematic flow chart of a parameter determination method provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a framework for adjusting frame rate according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of another parameter determination method provided by an embodiment of the present application;
FIG. 6 is a schematic block diagram of a parameter determination apparatus provided by an embodiment of the present application;
fig. 7 is a schematic block diagram of another parameter determination apparatus provided in an embodiment of the present application.
Detailed Description
The technical scheme of the application will be described below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used in the embodiments of the present application to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will appreciate that the words "first", "second", etc. do not limit quantity or execution order, and do not necessarily indicate a difference.
In the present application, the words "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
Furthermore, "at least one" means one or more, and "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a alone, a and B together, and B alone, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b, and c may represent: a, b, or c, or a and b, or a and c, or b and c, or a, b and c, wherein a, b and c can be single or multiple.
Fig. 1 is a schematic structural diagram of a terminal device to which the embodiment of the present application is applicable. As shown in fig. 1, the terminal device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, antenna 1, antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like. It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the terminal device 100. In other embodiments of the application, the terminal device 100 may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a display processing unit (display process unit, DPU), and/or a neural-network processing unit (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors. In some embodiments, the terminal device 100 may also include one or more processors 110. The processor may be the nerve center and command center of the terminal device 100. The processor can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the terminal device 100.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others. The USB interface 130 is an interface conforming to the USB standard and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the terminal device 100, or to transfer data between the terminal device 100 and a peripheral device. It can also be used to connect earphones and play audio through them.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is illustrated schematically, and does not constitute a structural limitation of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The wireless communication function of the terminal device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the terminal device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the terminal device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN), bluetooth, global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), NFC, infrared technology (IR), etc. applied on the terminal device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 and the mobile communication module 150 of the terminal device 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, such that the terminal device 100 can communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (global positioning system, GPS), the global navigation satellite system (global navigation satellite system, GLONASS), the BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), the quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
The terminal device 100 may implement display functions through the GPU, the display screen 194, the application processor, and the like. The application processor may include an NPU and/or a DPU. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor; by drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, it can rapidly process input information and can also continuously self-learn. Applications such as intelligent cognition of the terminal device 100 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc. The DPU, also referred to as a display sub-system (display sub-system, DSS), is used to adjust the color of the display screen 194, which it may do via a three-dimensional (3D) color look-up table (look-up table, LUT). The DPU can also perform scaling, noise reduction, contrast enhancement, backlight brightness management, HDR processing, Gamma adjustment of display parameters, and the like on the picture.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, or a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED). In some embodiments, the terminal device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The terminal device 100 may implement photographing functions through an ISP, one or more cameras 193, a video codec, a GPU, one or more display screens 194, an application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to realize expansion of the memory capability of the terminal device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, data files such as music, photos, videos, etc. are stored in an external memory card.
The internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the terminal device 100 to execute various functional applications, data processing, and the like by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area can store an operating system; the storage area may also store one or more applications (e.g., gallery, contacts, etc.), and so forth. The storage data area may store data (e.g., photos, contacts, etc.) created during use of the terminal device 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. In some embodiments, the processor 110 may cause the terminal device 100 to perform various functional applications and data processing by executing instructions stored in the internal memory 121, and/or instructions stored in a memory provided in the processor 110.
The terminal device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like. The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also referred to as a "horn", is used to convert an audio electrical signal into a sound signal. The terminal device 100 can play music or conduct hands-free calls through the speaker 170A. The receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the terminal device 100 receives a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear. The microphone 170C, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak close to the microphone 170C to input the sound signal into it. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the terminal device 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, etc. The earphone interface 170D is used to connect wired earphones. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The software system of the terminal device 100 may employ a layered architecture, an event driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android (Android) system with a layered architecture as an example, and illustrates a software structure of the terminal device 100.
Fig. 2 is a block diagram of a software architecture of a terminal device to which an embodiment of the present application is applicable. The layered architecture divides the software system of the terminal device 100 into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into an application layer (APP), an application framework layer (application framework), the Android runtime (Android runtime) and system libraries, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer (kernel). In some embodiments, the terminal device 100 also includes hardware, such as a GPU, a CPU, and a display screen.
The application layer may include a series of application packages that run applications by calling an application program interface (application programming interface, API) provided by the application framework layer. As shown in fig. 2, the application package may include applications for cameras, calendars, maps, phones, music, WLAN, bluetooth, video, social, gallery, navigation, short messages, games, etc.
The application framework layer provides APIs and programming frameworks for application programs of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 2, the application framework layer may include a window manager, a content provider, a resource manager, a notification manager, a view system, a telephony manager, a frame acquisition module, a frame rate adjustment module, an inter-frame similarity detection module, and the like.
The window manager is used to manage window programs. The window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc. The view system includes visual controls, such as controls to display text, controls to display pictures, and the like.
The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the terminal device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of completed downloads, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as notifications from applications running in the background, or in the form of a dialog window on the screen. For example, text information is presented in the status bar, an alert sound is emitted, the terminal device 100 vibrates, or an indicator light blinks.
The frame acquisition module is used to acquire the current frame image of the target application through a virtual display screen. The frame acquisition module may be located at the application framework layer or the hardware abstraction layer, which is not limited in the embodiment of the present application.

The inter-frame similarity detection module is used to process the current frame image acquired by the frame acquisition module to obtain the gray value of the current frame image, calculate the similarity of two adjacent frame images according to the gray values, and predict the similarity between the next frame image and the history frames.

The frame rate adjustment module is used to adjust the frame rate according to the predicted similarity between the next frame and the history frames: it classifies the similarity into different threshold ranges and determines the number of skipped frames according to the different thresholds.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (media libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a two-dimensional (2D) graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications. Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio video encoding formats, such as: MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer is an abstract interface driven by the device kernel, and provides an application program interface for accessing the bottom layer device for a java API framework at a higher level. The hardware abstraction layer may include a plurality of library modules, each of which may implement an interface for a particular type of hardware component, such as a hardware synthesizer (hardware composer). When the framework API requires access to the device hardware, the Android system will load the library module for that hardware component.
The kernel layer is a layer between hardware and software. The kernel layer is used for driving the hardware so that the hardware works. The kernel layer at least includes a display driver, an audio driver, a bluetooth driver, a Wi-Fi driver, and the like, which is not limited in the embodiment of the present application. Illustratively, the kernel layer employs display drivers and audio drivers to drive the display 194 and speakers 170A in the terminal device 100 for video playback.
The terminal device in the embodiment of the present application may be a handheld device, an in-vehicle device, or the like with a wireless connection function; the terminal device may also be referred to as a terminal (terminal), user equipment (user equipment, UE), a mobile station (mobile station, MS), or a mobile terminal (mobile terminal, MT). Currently, examples of terminal devices include: a mobile phone (mobile phone), a tablet computer, a smart TV, a notebook computer, a tablet (Pad), a palmtop computer, a mobile internet device (mobile internet device, MID), a virtual reality (virtual reality, VR) device, an augmented reality (augmented reality, AR) device, a wireless terminal in industrial control (industrial control), a wireless terminal in self driving (self driving), a wireless terminal in remote medical surgery, a wireless terminal in a smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in a smart city (smart city), a wireless terminal in a smart home (smart home), a cellular phone, a cordless phone, a session initiation protocol (session initiation protocol, SIP) phone, a wireless local loop (wireless local loop, WLL) station, a personal digital assistant (personal digital assistant, PDA), a handheld device with a wireless communication function, a computing device or another processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, and a terminal device in a 5G network or in a future evolved network. The embodiment of the present application does not limit the specific form of the terminal device.
By way of example and not limitation, in embodiments of the present application, the terminal device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it can realize powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can implement complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used with other devices such as smartphones, for example various smart bracelets and smart jewelry for physical sign monitoring.
It should be understood that in the embodiment of the present application, the terminal device may be a device for implementing a function of the terminal device, or may be a device capable of supporting the terminal device to implement the function, for example, a chip system, and the device may be installed in the terminal. In the embodiment of the application, the chip system can be composed of chips, and can also comprise chips and other discrete devices.
The terminal device in the embodiment of the present application may also be referred to as: a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user equipment, etc.
Currently, games on terminal devices are evolving toward high image quality and high frame rates; for example, game applications such as Honor of Kings and League of Legends can be configured with high frame rates of 90 frames per second (frames per second, FPS) or 120 FPS. The frame rate is the number of frames transmitted per second; the more frames per second, the smoother the displayed motion. A high frame rate brings users a better gaming experience, but it also increases the GPU drawing burden and the device power consumption, heating the device and degrading its performance.
The similarity of two adjacent frames differs greatly across game scenes. For example, in an intense competitive scene, two adjacent frames change greatly, that is, their similarity is low; in a scene such as entering the game lobby or a game settings screen, two adjacent frames change little, that is, their similarity is high. When consecutive frames are highly similar, the game does not need to be drawn at a high frame rate, and frame skipping can be used to reduce the power consumption of the CPU and the GPU.
Ordinary frame-rate adjustment does not consider the similarity of two adjacent frames: the terminal device applies frame skipping to the game picture with a fixed frame-skip parameter. For example, if the user sets the in-game frame rate to 90 FPS, the GPU is required to draw 90 frames per second, which places high performance demands on the GPU and may cause excessive GPU power consumption. With frame skipping, e.g., the GPU drawing every other frame, an undrawn frame can be displayed by multiplexing a drawn one: the GPU draws the first frame; the second frame is not drawn, so the display reuses the first frame (i.e., the first frame is displayed twice); then the GPU draws the third frame, and so on. The GPU thus draws 45 frames per second, but the display still shows 90 frames per second.
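The draw-count arithmetic of this every-other-frame example can be sketched as follows (illustrative only, not the patented method itself):

```python
def drawn_frame_indices(display_frames, skip=1):
    """Indices of the frames the GPU actually draws when it skips
    `skip` frame(s) after each drawn frame; each undrawn slot reuses
    the previously drawn frame on the display."""
    indices = []
    i = 0
    while i < display_frames:
        indices.append(i)
        i += 1 + skip
    return indices


# At a 90 FPS display rate with skip=1, the GPU draws only 45 frames.
print(len(drawn_frame_indices(90, skip=1)))  # -> 45
```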
However, the existing frame-skipping approach does not distinguish the similarity of two adjacent frames. When two adjacent frames change greatly and their similarity is low, directly applying frame skipping may make the game picture unsmooth and cause jitter and flicker.
In view of this, the embodiment of the present application provides a parameter determination method and a related device. The terminal device calculates the similarity of adjacent frame pairs among the history frames, predicts from those similarities the similarity between the next frame and the history frames, and determines the number of skipped frames according to the predicted similarity, thereby adaptively adjusting the GPU drawing frame rate. This way of adjusting the GPU drawing frame rate is more flexible and effective, and helps avoid jitter and flicker in game images.
It should be understood that the embodiment of the present application adjusts the drawing frame rate of the GPU, that is, the number of images the GPU draws per second, so as to improve the smoothness of the target application's images while reducing the power consumption of the CPU.
Fig. 3 is a schematic flowchart of a parameter determining method 300 according to an embodiment of the present application, where steps of the method 300 may be performed by a terminal device, and the terminal device may have a structure as shown in fig. 1 and/or fig. 2, which is not limited by the embodiment of the present application. The method 300 includes S301 to S303, which specifically include the following steps:
S301, determining the similarity of at least one group of images among the first N frames of images of the target application, where each group of images comprises two adjacent frames among the first N frames, and N is an integer greater than or equal to 2.
The Nth frame image is the image currently displayed by the target application, and the first N frames are the history frames of the target application. The target application is an application in the supported-application list of the terminal device, which may include applications with high frame-rate requirements, such as video applications and/or game applications. The terminal device may apply intelligent frame-rate adjustment to the applications in the supported-application list. The intelligent frame-rate adjustment described in the embodiments of the present application includes determining the number of skipped frames.
Illustratively, with N = 4, the terminal device has drawn and displayed four frames. The first four frames contain three groups of images, each comprising two adjacent frames: the first and second frames form a group, denoted the first group; the second and third frames form a group, denoted the second group; and the third and fourth frames form a group, denoted the third group. The terminal device may determine the similarity of at least one of the first, second, or third groups of images, pairing adjacent frames as in the sketch below.
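The pairing of adjacent frames into groups can be sketched as follows (a hypothetical helper, not named in the application):

```python
def adjacent_groups(frames):
    """Group the first N frames into overlapping adjacent pairs:
    [f1, f2, f3, f4] -> [(f1, f2), (f2, f3), (f3, f4)]."""
    return list(zip(frames, frames[1:]))


print(adjacent_groups(["f1", "f2", "f3", "f4"]))
# -> [('f1', 'f2'), ('f2', 'f3'), ('f3', 'f4')]
```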
S302, predicting a target similarity based on the similarity of the at least one group of images, where the target similarity represents the similarity between the (N+1)th frame image of the target application and the images included in the at least one group of images.
Continuing the example in S301, in one case the terminal device determines the similarity of the first group of images, the similarity of the second group of images, and the similarity of the third group of images. The number of groups of the at least one group of images is then 3, and the three groups together include the first, second, third, and fourth frame images, so the target similarity predicted by the terminal device is the similarity between the (N+1)th frame image and the first through fourth frame images.
Continuing the example in S301, in another case the terminal device determines only the similarity of the second group of images and the similarity of the third group of images. The number of groups of the at least one group of images is then 2, and the two groups together include the second, third, and fourth frame images, so the target similarity predicted by the terminal device is the similarity between the (N+1)th frame image and the second, third, and fourth frame images.
S303, determining the number of skipped frames based on the target similarity.
After obtaining the target similarity, the terminal device determines the number of skipped frames according to it. For example, if the determined number of skipped frames is 0, the terminal device does not skip frames: it draws the (N+1)th frame image and sends it for display. If the determined number of skipped frames is 1, the terminal device skips drawing the (N+1)th frame image and continues to display the previously drawn Nth frame image; that is, the (N+1)th frame of the target application reuses the Nth frame.
In the embodiment of the application, when drawing the images of the target application, the terminal device uses similarity detection to predict the target similarity between the (N+1)th frame and all or some of the historical frames, and flexibly determines the number of skipped frames from the target similarity so as to adjust its drawing frame rate. Compared with a scheme that uses a fixed frame-skip parameter, the embodiment of the application can set a smaller number of skipped frames when the predicted target similarity is low, avoiding jitter and flicker in the picture of the target application, and a larger number of skipped frames when the predicted target similarity is high, reducing drawing operations and thus the power consumption of the terminal device.
The drawing frame rate in the embodiment of the application refers to the number of images the terminal device draws per second. If the user sets the frame rate of the target application to 90 FPS, the terminal device needs to display 90 frames of images per second, but it can still adjust the number of images it actually draws according to the predicted target similarity, thereby reducing the power consumption of the terminal device while preserving the picture fluency of the target application.
Before performing the method 300, the terminal device may detect whether the target application is in the supported application list and whether the frame rate of the target application is greater than or equal to a preset threshold. When the target application is in the supported application list and its frame rate is greater than or equal to the preset threshold, the terminal device starts to execute the method 300.
Illustratively, the supported application list includes applications with a high requirement on picture frame rate, such as video applications and/or game applications. The terminal device may obtain the extensible markup language (XML) file of the target application, check whether the application name in the file is in the supported application list, and check whether the configured frame rate is greater than or equal to the preset threshold. If the target application is in the supported application list and its configured frame rate is greater than or equal to the preset threshold, the terminal device turns on the function switch of this scheme and determines the number of skipped frames through similarity detection.
Illustratively, the preset threshold is 90 FPS, which is not limited by the embodiments of the present application.
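As a hypothetical sketch of this pre-check (the XML tag names, the file layout, and the package names in the supported list below are all assumptions for illustration, not taken from the patent):

```python
# Hypothetical sketch of the pre-check; the tag names ("name", "frameRate"),
# file layout, and package names below are assumptions, not from the patent.
import xml.etree.ElementTree as ET

SUPPORTED_APPS = {"com.example.game", "com.example.video"}  # assumed list
PRESET_THRESHOLD_FPS = 90  # the illustrative threshold from the text

def should_enable_frame_skipping(xml_path: str) -> bool:
    root = ET.parse(xml_path).getroot()
    app_name = root.findtext("name", default="")
    frame_rate = int(root.findtext("frameRate", default="0"))
    # Turn on the function switch only if both conditions hold.
    return app_name in SUPPORTED_APPS and frame_rate >= PRESET_THRESHOLD_FPS
```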
As an alternative embodiment, S302 includes: calculating an average value of the similarity of the at least one group of images; and predicting the target similarity according to the average value.
In one possible design, the target similarity may be linear with the average value, satisfying the following formula:

S_target = c_1 · (S_1 + S_2 + … + S_T) / T + c_2

where S_target represents the target similarity, S_i represents the similarity of the ith group of images in the at least one group of images, T represents the number of groups of the at least one group of images, and c_1 and c_2 are preset values, also referred to as adjustment parameters.
In combination with the example of N = 4 in S301, the first four frame images include three groups of images: the first group includes the first and second frame images, the second group includes the second and third frame images, and the third group includes the third and fourth frame images. The terminal device may determine the target similarity based on the similarity of at least one of the first, second, or third groups of images.
Taking as an example determining the target similarity based on the similarity of the first group of images (denoted S_1), the similarity of the second group of images (S_2), and the similarity of the third group of images (S_3), in this case T = 3. The terminal device calculates the average of the three similarities and obtains the target similarity through the above formula.
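A minimal sketch of this prediction step, assuming the linear form above (the values of c_1 and c_2 are tuning parameters the text leaves open, so the defaults below are placeholders):

```python
# A minimal sketch of the linear prediction above; c1 and c2 are tuning
# parameters whose values the text leaves open, so the defaults are placeholders.
def predict_target_similarity(similarities: list[float],
                              c1: float = 1.0, c2: float = 0.0) -> float:
    # Average of the per-group similarities S_1 .. S_T, with T = len(similarities).
    avg = sum(similarities) / len(similarities)
    # S_target = c_1 * average + c_2
    return c1 * avg + c2

# Example with three groups (T = 3), as in the text:
print(predict_target_similarity([0.97, 0.95, 0.96]))  # -> 0.96 (up to float rounding)
```

For c1 = 1 and c2 = 0 the target similarity is simply the average of the per-group similarities.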
In the embodiment of the application, calculating the target similarity through a linear expression is simple and easy to implement, which helps reduce the power consumption of the terminal device.
Other variations of the above formula for calculating the target similarity are also possible, and any such variation falls within the scope of the embodiments of the present application.
As an alternative embodiment, S301 includes: determining a first gray value of a first image and a second gray value of a second image of an ith group of images in the at least one group of images; and determining the similarity of the ith group of images according to the first gray value and the second gray value. The first gray value is determined based on the pixel values of the first image, and the second gray value is determined based on the pixel values of the second image, where i = 1, …, T and T represents the number of groups of the at least one group of images.
In the embodiment of the application, the similarity S_i of the ith group of images may be determined based on the first gray value of the first image and the second gray value of the second image in the ith group of images.
Taking the first group of images as an example, the group includes the first frame image and the second frame image. Illustratively, both are color (RGB) images, in which each pixel value comprises three color components: red (R), green (G), and blue (B). To facilitate calculating the similarity of the two frame images, the terminal device may convert each of them into a gray map and calculate the similarity from the gray values of the gray maps.
The RGB image may be converted into a gray map by a maximum value method, an average value method, or a weighted average method.
Taking a pixel with value (35, 40, 45) as an example: with the maximum value method, the gray value of the pixel is the maximum of the R, G, and B components, namely 45; with the average value method, it is the average of the three components, namely 40; with the weighted average method, each color component carries a different weight in the conversion, for example 0.299 for red, 0.587 for green, and 0.114 for blue, giving a gray value of approximately 39 for this pixel.
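The three conversion methods can be sketched as follows, checked against the worked pixel (35, 40, 45) from the text (the weights are the ones quoted above):

```python
# A sketch of the three RGB-to-gray conversions described above, checked
# against the example pixel (35, 40, 45); the weights 0.299 / 0.587 / 0.114
# are the ones quoted in the text.
def to_gray(r: int, g: int, b: int, method: str = "weighted") -> float:
    if method == "max":       # maximum value method
        return max(r, g, b)
    if method == "average":   # average value method
        return (r + g + b) / 3
    # Weighted average method.
    return 0.299 * r + 0.587 * g + 0.114 * b

print(to_gray(35, 40, 45, "max"))       # 45
print(to_gray(35, 40, 45, "average"))   # 40.0
print(to_gray(35, 40, 45, "weighted"))  # 39.075, i.e. approximately 39
```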
Since each frame image comprises many pixels, gray conversion of a frame yields one gray value per pixel. After obtaining the per-pixel gray values of the first frame image and of the second frame image, the terminal device numerically processes each set of gray values to obtain the first gray value of the first frame image and the second gray value of the second frame image.
Taking the first frame image as an example of this numerical processing: the terminal device may average the gray values of the first frame image to obtain the first gray value, or take a weighted average of them. The second gray value is obtained from the gray values of the second frame image by a similar method.
Based on the above manner of calculating gray values, the first gray value of the first image and the second gray value of the second image of the ith group of images can be obtained. For convenience of description, the first gray value is denoted L_1 and the second gray value L_2.
After obtaining the first gray value L_1 of the first image and the second gray value L_2 of the second image of the ith group of images, the terminal device determines the similarity S_i of the ith group of images from L_1 and L_2, specifically according to a fitted relation between S_i and the difference between L_1 and L_2.
The similarity S_i of the ith group of images may be linear in the difference between the first gray value L_1 and the second gray value L_2, which may be represented by the following formula:

S_i = 1 - α(L_1 - L_2)

where α is a preset value: a fitting coefficient that maps the gray-value difference to a structural similarity (SSIM) value, which can be determined by offline analysis of adjacent frames of the target application.
The way of calculating the inter-frame similarity provided by the embodiment of the application requires only a simple linear calculation, which helps save the power consumption of the terminal device.
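A minimal sketch of this fitted relation, assuming a placeholder value for α since the text does not fix one:

```python
# A minimal sketch of the fitted relation above; alpha is a fitting
# coefficient the text leaves unspecified, so 0.01 is a placeholder.
# The text writes the plain difference (L_1 - L_2); an absolute difference
# may be intended when the second frame is the brighter one.
def group_similarity(l1: float, l2: float, alpha: float = 0.01) -> float:
    # S_i = 1 - alpha * (L_1 - L_2)
    return 1.0 - alpha * (l1 - l2)

print(group_similarity(40.0, 38.0))  # 0.98
```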
To reduce the amount of calculation, in one implementation the terminal device may downsample the first image of the ith group of images, that is, reduce its number of pixels, to obtain a downsampled first image, and then convert the pixel values of the downsampled first image into gray values to obtain the first gray value. Similarly, the terminal device may downsample the second image of the ith group of images to obtain a downsampled second image, and then convert its pixel values into gray values to obtain the second gray value.
In another implementation, the terminal device may first perform gray conversion on the first image of the ith group of images to obtain the gray map of the first image, then downsample that gray map, and obtain the first gray value from the gray values of the downsampled map. Similarly, the terminal device may perform gray conversion on the second image, downsample the resulting gray map, and obtain the second gray value from the gray values of the downsampled map.
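The two orderings can be sketched as follows, under assumed details: plain strided slicing stands in for whatever downsampling filter the device actually applies, and the plain average stands in for the numerical-processing step.

```python
# A sketch of the two orderings under assumed details: strided slicing
# stands in for the device's actual downsampling filter, and the plain
# average stands in for the numerical-processing step.
import numpy as np

WEIGHTS = np.array([0.299, 0.587, 0.114])  # weighted-average gray conversion

def gray_value_downsample_first(rgb: np.ndarray, step: int = 4) -> float:
    small = rgb[::step, ::step, :]       # reduce the number of pixels first
    gray = small @ WEIGHTS               # then convert pixel values to gray
    return float(gray.mean())            # scalar gray value of the frame

def gray_value_convert_first(rgb: np.ndarray, step: int = 4) -> float:
    gray = rgb @ WEIGHTS                 # convert the full image to a gray map
    return float(gray[::step, ::step].mean())  # then downsample the gray map

frame = np.random.randint(0, 256, size=(1080, 1920, 3)).astype(np.float64)
print(gray_value_downsample_first(frame))
print(gray_value_convert_first(frame))
```

With the purely linear operations in this sketch the two orderings agree exactly; a nonlinear downsampling filter would make them differ.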
Where the amount of calculation is not a concern, the terminal device may also calculate the similarity of two adjacent frames in the previous N frames of images using an existing image-similarity method, for example SSIM, the histogram method, cosine similarity, or mutual information, which is not limited in the embodiments of the present application.
Fig. 4 is a schematic diagram of a framework 400 for adjusting a frame rate according to an embodiment of the present application. As shown in fig. 4, the framework 400 includes the target application, a frame rate adjustment module, an inter-frame similarity detection module, a frame acquisition module, a layer composition module (SurfaceFlinger), a hardware composer, a display screen, and a GPU.
Taking a game application as the target application: during the life cycle in which the game runs, the game application sends graphics drawing instructions to the GPU; the GPU receives the instructions, draws the layers, and sends the drawn layers to the layer composition module; the layer composition module composes the received layers and sends the result to the hardware composer; and the hardware composer synthesizes the contents of multiple buffers using the available hardware and sends them to the display screen, so that the display screen reads the image data from the buffers for display.
In the embodiment of the application, the frame acquisition module acquires the current frame image of the target application through a virtual display screen, the inter-frame similarity detection module detects the similarity, and the frame rate adjustment module determines the frame-skip parameter according to the similarity detection result so as to adjust the drawing frame rate of the GPU.
Based on the framework shown in fig. 4, the following describes the process by which the terminal device determines the frame-skip parameter, taking as an example calculating the similarity between the (N-1)th frame image and the current Nth frame image and then predicting the similarity between the (N+1)th frame image and the historical frames.
Fig. 5 is a schematic flowchart of another parameter determination method 500 provided by an embodiment of the application. The method 500 includes S501 to S517, specifically the following steps:
S501, the target application sends a first drawing instruction to the GPU, where the first drawing instruction instructs the GPU to draw the (N-1)th frame image. Accordingly, the GPU receives the first drawing instruction.
The first drawing instruction comprises open graphics library (OpenGL) calls; OpenGL is a cross-language, cross-platform application programming interface for rendering 2D and 3D vector graphics.
S502, the GPU draws the (N-1)th frame image.
The GPU draws the image by invoking the OpenGL interface, for example to build 3D models, perform graphics transforms, apply texture maps, enhance images, and extend bitmap display.
S503, the GPU sends the (N-1)th frame image to the display screen. Accordingly, the display screen receives the (N-1)th frame image.
It should be noted that the GPU sends the (N-1)th frame image to the display screen through the layer composition module and the hardware composer.
For example, on a mobile phone in portrait orientation, the status bar is at the top, the navigation bar is at the bottom, and the remaining area displays application content. The layer composition module composes the layers of the status bar, the navigation bar, and the other area; the content of each layer resides in a separate buffer, and the hardware composer can send the three buffers to the display screen as they are, instructing the display screen to read the image data of different parts of the screen from the different buffers.
S504, the display screen displays the (N-1)th frame image.
The display screen acquires the data of the (N-1)th frame image from the corresponding buffer of the hardware composer and displays the (N-1)th frame image.
S505, the frame acquisition module acquires the (N-1)th frame image.
After the GPU finishes drawing the (N-1)th frame image, the frame acquisition module may acquire it from the buffer of the hardware composer.
After acquiring the (N-1)th frame image, the frame acquisition module may downsample it and perform gray conversion on the downsampled image to obtain the gray map of the (N-1)th frame image.
S506, the target application sends a second drawing instruction to the GPU, where the second drawing instruction instructs the GPU to draw the Nth frame image. Accordingly, the GPU receives the second drawing instruction.
S507, the GPU draws the Nth frame image.
S508, the GPU sends the Nth frame image to the display screen. Accordingly, the display screen receives the Nth frame image.
It should be noted that the GPU sends the Nth frame image to the display screen through the layer composition module and the hardware composer.
S509, the display screen displays the Nth frame image.
The display screen acquires the data of the Nth frame image from the corresponding buffer of the hardware composer and displays the Nth frame image.
S510, the frame acquisition module acquires the Nth frame image.
After the GPU finishes drawing the Nth frame image, the frame acquisition module may acquire it from the buffer of the hardware composer.
After acquiring the Nth frame image, the frame acquisition module may downsample it and perform gray conversion on the downsampled image to obtain the gray map of the Nth frame image.
S511, the inter-frame similarity detection module acquires the (N-1)th frame image and the Nth frame image.
If the frame acquisition module has performed gray conversion on the two images, the inter-frame similarity detection module obtains the gray map of the (N-1)th frame image and the gray map of the Nth frame image.
The gray conversion of the (N-1)th and Nth frame images may be performed by the frame acquisition module or by the inter-frame similarity detection module, which is not limited in the embodiments of the present application.
S512, the inter-frame similarity detection module calculates the similarity between the (N-1)th frame image and the Nth frame image.
The inter-frame similarity detection module may calculate this similarity based on the formula for the similarity of the ith group of images described above, which is not repeated here.
S513, the inter-frame similarity detection module predicts the target similarity according to the similarity between the (N-1)th frame image and the Nth frame image.
If the inter-frame similarity detection module predicts the target similarity from the similarity of the (N-1)th and Nth frame images alone, that is, considering only one group of images among the previous N frames, then the target similarity represents the similarity between the (N+1)th frame image and the (N-1)th and Nth frame images.
S514, the frame rate adjustment module acquires the target similarity.
S515, the frame rate adjustment module determines the number of skipped frames corresponding to the target similarity according to a predefined correspondence between similarity value ranges and numbers of skipped frames.
Illustratively, the predefined similarity value ranges are "<90%", "≥90% and <95%", and "≥95%". The number of skipped frames corresponding to "<90%" is 0, the number corresponding to "≥90% and <95%" is 1, and the number corresponding to "≥95%" is 2.
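A minimal sketch of this lookup (the decimal thresholds mirror the percentage ranges above; the function name is illustrative):

```python
# A minimal sketch of the predefined lookup above; the decimal thresholds
# mirror the percentage ranges, and the function name is illustrative.
def frames_to_skip(target_similarity: float) -> int:
    if target_similarity >= 0.95:
        return 2   # ">= 95%": skip two frames
    if target_similarity >= 0.90:
        return 1   # ">= 90% and < 95%": skip one frame
    return 0       # "< 90%": draw every frame

print(frames_to_skip(0.98))  # 2, matching the worked example below
```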
Optionally, the method 500 further includes S516: if the determined number of skipped frames is greater than zero, the frame rate adjustment module intercepts the drawing of the (N+1)th frame image.
It should be understood that intercepting the drawing of the (N+1)th frame image does not mean intercepting the drawing instruction for the (N+1)th frame image. The frame rate adjustment module may prevent the GPU from drawing the (N+1)th frame image by modifying parameters in the drawing instruction. For example, in one implementation, the frame rate adjustment module may set the drawing parameter value of the drawing instruction to null and return a null value to the GPU, so that the GPU does not draw the (N+1)th frame image after receiving the instruction.
Optionally, the method 500 further includes S517: the display screen determines that the buffer of the (N+1)th frame is empty and continues to display the Nth frame image.
For example, if the calculated target similarity is 98%, which falls into the range ≥95%, the terminal device determines the number of skipped frames to be 2. That is, after displaying the Nth frame image of the target application, the frame rate adjustment module intercepts the drawing of the (N+1)th and (N+2)th frame images, and the GPU draws neither. When the display screen tries to read the data of the (N+1)th frame image from the corresponding buffer of the hardware composer and finds the buffer empty, it acquires the data of the Nth frame image and displays the Nth frame image again; the same happens when it tries to read the (N+2)th frame image. The display screen thus reuses the Nth frame image twice. After skipping the drawing of the two frames, the GPU continues to draw the (N+3)th frame image of the game application based on the received drawing instruction and sends it for display.
It should be understood that the sequence numbers of the above processes do not imply an execution order; the execution order should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
The parameter determining method according to the embodiment of the present application is described in detail above with reference to fig. 1 to 5, and the parameter determining apparatus according to the embodiment of the present application will be described in detail below with reference to fig. 6 and 7.
Fig. 6 shows a schematic block diagram of a parameter determination apparatus 600 provided by an embodiment of the present application. The apparatus 600 includes an acquisition module 610 and a processing module 620.
The obtaining module 610 is configured to acquire the previous N frames of images of the target application. The processing module 620 is configured to: determine the similarity of at least one group of images in the previous N frames of images of the target application, where each group of images in the at least one group of images includes two adjacent frame images in the previous N frames of images, and N ≥ 2 is an integer; predict a target similarity based on the similarity of the at least one group of images, where the target similarity represents the similarity between the (N+1)th frame image of the target application and the images included in the at least one group of images; and determine the number of skipped frames based on the target similarity.
Optionally, the processing module 620 is configured to: calculate an average value of the similarity of the at least one group of images; and predict the target similarity according to the average value.
Optionally, the target similarity is linear with the average value.
Optionally, the target similarity and the average value satisfy:

S_target = c_1 · (S_1 + S_2 + … + S_T) / T + c_2

where S_target represents the target similarity, S_i represents the similarity of the ith group of images in the at least one group of images, T represents the number of groups of the at least one group of images, T is a positive integer, and c_1 and c_2 are preset values.
Optionally, the processing module 620 is configured to: determine a first gray value of a first image of an ith group of images and a second gray value of a second image of the ith group of images in the at least one group of images, the first gray value being determined based on the pixel values of the first image and the second gray value based on the pixel values of the second image; and determine the similarity of the ith group of images according to the first gray value and the second gray value, where i = 1, …, T, T represents the number of groups of the at least one group of images, and i and T are positive integers.
Optionally, the similarity of the i-th group of images is linear with the difference between the first gray value and the second gray value.
Optionally, the similarity of the ith group of images and the difference between the first gray value and the second gray value satisfy:

S_i = 1 - α(L_1 - L_2)

where S_i represents the similarity of the ith group of images, α is a preset value, L_1 represents the first gray value, and L_2 represents the second gray value.
Optionally, the first gray value is obtained by performing conversion after downsampling the pixel value of the first image, and the second gray value is obtained by performing conversion after downsampling the pixel value of the second image; alternatively, the first gray value is obtained by converting a pixel value of the first image into a gray value and then downsampling, and the second gray value is obtained by converting a pixel value of the second image into a gray value and then downsampling.
Optionally, the processing module 620 is configured to: and determining the number of the skipped frames corresponding to the target similarity based on the corresponding relation between the value range of the predefined similarity and the number of the skipped frames.
Optionally, the processing module 620 is configured to: intercepting the drawing of the n+1st frame image.
Optionally, the processing module 620 is further configured to: detect whether the target application is in the supported application list; detect whether the frame rate of the target application is greater than or equal to a preset threshold; and determine the similarity of at least one group of images in the previous N frames of images of the target application when the target application is in the supported application list and the frame rate of the target application is greater than or equal to the preset threshold.
In an alternative example, it will be appreciated by those skilled in the art that the apparatus 600 may be embodied as a terminal device in the above embodiment, or the functions of the terminal device in the above embodiment may be integrated in the apparatus 600. The above functions may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above. The apparatus 600 may be configured to perform the respective processes and/or steps corresponding to the terminal device in the above-described method embodiment.
It should be appreciated that the apparatus 600 herein is embodied in the form of functional modules. The term module herein may refer to an application specific integrated circuit (application specific integrated circuit, ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an embodiment of the present application, the apparatus 600 may also be a chip or a chip system, for example: system on chip (SoC).
Fig. 7 shows a schematic block diagram of another parameter determination apparatus 700 provided by an embodiment of the present application. The apparatus 700 includes a processor 710, a communication interface 720, and a memory 730. Wherein the processor 710, the communication interface 720 and the memory 730 communicate with each other through an internal connection path, the memory 730 is used for storing instructions, and the processor 710 is used for executing the instructions stored in the memory 730 to control the communication interface 720 to transmit signals and/or receive signals.
It should be understood that the apparatus 700 may be configured to perform the steps and/or flows corresponding to the terminal device in the above-described method embodiments. Alternatively, the memory 730 may include read-only memory and random access memory, and provide instructions and data to the processor. A portion of the memory may also include non-volatile random access memory. For example, the memory may also store information of the device type. The processor 710 may be configured to execute instructions stored in the memory, and when the processor executes the instructions, the processor may perform steps and/or processes corresponding to the terminal device in the above-described method embodiments.
It should be appreciated that in embodiments of the application, the processor 710 may be a central processing unit (central processing unit, CPU), which may also be other general purpose processors, digital signal processors (digital signal processing, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), field programmable gate arrays (field programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The application also provides a computer readable storage medium, wherein computer executable instructions are stored in the computer readable storage medium, and when the computer executable instructions are executed by a processor, the method executed by the terminal device in any method embodiment can be realized.
The embodiment of the application also provides a computer program product, which comprises a computer program, wherein the computer program can realize the method executed by the terminal equipment in any method embodiment when being executed by a processor.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and module may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided by the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific implementation of the present application, but the scope of the embodiments of the present application is not limited thereto, and any person skilled in the art may easily think about changes or substitutions within the technical scope of the embodiments of the present application, and all changes and substitutions are included in the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.
Claims (11)
1. A method of determining parameters, comprising:
determining the similarity of at least one group of images in the previous N frames of images of a target application, wherein each group of images in the at least one group of images comprises two adjacent frames of images in the previous N frames of images, and N is more than or equal to 2 and is an integer;
calculating an average value of the similarity of the at least one set of images;
predicting a target similarity according to the average value, wherein the target similarity represents the similarity between an (N+1)th frame image of the target application and the images included in the at least one group of images;
wherein the target similarity and the average value satisfy:

S_target = c_1 · (S_1 + S_2 + … + S_T) / T + c_2

wherein S_target represents the target similarity, S_i represents the similarity of an ith group of images in the at least one group of images, T represents the number of groups of the at least one group of images, T is a positive integer, and c_1 and c_2 are preset values;
and determining the number of the frame skipping based on the target similarity.
2. The method of claim 1, wherein the determining the similarity of at least one group of images in the previous N frames of images of the target application comprises:
determining a first gray value of a first image and a second gray value of a second image of an ith group of images in the at least one group of images, wherein the first gray value is determined based on the pixel values of the first image and the second gray value is determined based on the pixel values of the second image; and
determining the similarity of the ith group of images according to the first gray value and the second gray value;
wherein i = 1, …, T, T represents the number of groups of the at least one group of images, and i and T are positive integers.
3. The method of claim 2, wherein the similarity of the i-th group of images is linear with the difference between the first gray value and the second gray value.
4. The method according to claim 3, wherein the similarity of the ith group of images and the difference between the first gray value and the second gray value satisfy:

S_i = 1 - α(L_1 - L_2)

wherein S_i represents the similarity of the ith group of images, α is a preset value, L_1 represents the first gray value, and L_2 represents the second gray value.
5. The method according to any one of claims 2 to 4, wherein the first gray value is obtained by downsampling the pixel values of the first image and then converting them, and the second gray value is obtained by downsampling the pixel values of the second image and then converting them; or
the first gray value is obtained by converting the pixel values of the first image into gray values and then downsampling, and the second gray value is obtained by converting the pixel values of the second image into gray values and then downsampling.
6. The method of any of claims 1 to 5, wherein the determining the number of skipped frames based on the target similarity comprises:
and determining the number of the skipped frames corresponding to the target similarity based on the corresponding relation between the value range of the predefined similarity and the number of the skipped frames.
7. The method according to any one of claims 1 to 6, wherein in case the determined number of skipped frames is greater than zero, the method further comprises:
intercepting the drawing of the (N+1)th frame image.
8. The method according to any one of claims 1 to 7, wherein prior to said determining the similarity of at least one set of images in the first N frames of images of the target application, the method further comprises:
detecting whether the target application is in a support application list;
detecting whether the frame rate of the target application is greater than or equal to a preset threshold;
the determining the similarity of at least one group of images in the previous N frames of images of the target application comprises the following steps:
and under the condition that the target application is in the support application list and the frame rate of the target application is greater than or equal to the preset threshold value, determining the similarity of at least one group of images in the previous N frames of images of the target application.
9. A parameter determination apparatus comprising means for performing the method of any one of claims 1 to 8.
10. A parameter determining apparatus, comprising: a processor and a memory, wherein,
the memory is used for storing a computer program;
the processor is configured to invoke and execute the computer program to cause the apparatus to perform the method of any of claims 1 to 8.
11. A computer readable storage medium for storing a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 8.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202210943024.5A | 2022-08-08 | 2022-08-08 | Parameter determination method and related device |
Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| CN116095220A (en) | 2023-05-09 |
| CN116095220B (en) | 2023-10-31 |
Family

ID=86197924

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202210943024.5A (CN116095220B, Active) | Parameter determination method and related device | 2022-08-08 | 2022-08-08 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN (1) | CN116095220B (en) |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |